Screaming Frog SEO Spider (website optimization analysis tool) 10.4 official version with registration code

  • Software size: unknown
  • Updated: 2018-11-08
  • Official website: Lightning download it
  • Software rating: ★★★★★
  • Operating environment: WinXP/Win7/Win8/Win10
Screaming Frog SEO Spider is a website optimization analysis tool, designed especially for testing the links of search-engine-optimized websites. It can crawl a site to find broken links (404s) and server errors, audit content, analyse redirect chains, and review page titles, metadata and more, making it a very practical website optimization and SEO tool. Friends who need it should not miss it!

Installation and crack tutorial

1. Download from this site and unzip to obtain the ScreamingFrogSEOSpider-9.0.exe installer and the Keygen.exe registration machine.

2. Double-click ScreamingFrogSEOSpider-9.0.exe to run it, follow the prompts to complete the installation, then click Close to exit setup.

3. Run the software and, as shown in the picture, click the Accept button.

4. As shown in the picture, click the Licence button on the menu bar.

5. The software's activation dialog appears, as shown in the figure.

6. Run the keygen, then copy the user name and serial number it generates into the software's registration dialog.

7. As shown in the figure, a valid license key has been entered. Restart the software for the changes to take effect.



Find broken links

Crawl a site immediately and find broken links (404s) and server errors. Bulk-export the errors and source URLs to fix them, or send them to a developer.
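The broken-link check above can be sketched in a few lines of standard-library Python: fetch each URL and bucket it by HTTP status code the way the Spider's tabs do. The `classify` and `check_links` helpers are illustrative names, not part of Screaming Frog itself.

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def classify(status: int) -> str:
    """Bucket an HTTP status code into the categories the report uses."""
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    if 400 <= status < 500:
        return "client error (broken link)"
    return "server error"

def check_links(urls):
    """Return {url: category}; unreachable hosts count as broken links."""
    results = {}
    for url in urls:
        try:
            with urlopen(url, timeout=10) as resp:
                results[url] = classify(resp.status)
        except HTTPError as err:      # 4XX / 5XX responses raise HTTPError
            results[url] = classify(err.code)
        except URLError:              # DNS failure, refused connection, ...
            results[url] = "client error (broken link)"
    return results
```

The resulting dictionary can be written out to CSV, matching the bulk-export workflow described above.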

Audit redirects

Find temporary and permanent redirects, identify redirect chains and loops, or upload a list of URLs to audit during a site migration.
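Redirect-chain and loop detection as described above can be modelled as walking a mapping of source URL to target URL; the `redirects` mapping stands in for the Location headers a real crawl would collect and is illustrative data only.

```python
def trace_redirect(url, redirects, max_hops=10):
    """Follow url through the redirect map; return (chain, is_loop)."""
    chain, seen = [url], {url}
    while chain[-1] in redirects and len(chain) <= max_hops:
        nxt = redirects[chain[-1]]
        if nxt in seen:               # revisiting a URL means a redirect loop
            return chain + [nxt], True
        chain.append(nxt)
        seen.add(nxt)
    return chain, False
```

A chain longer than two entries is a redirect chain worth flattening; a loop never resolves and should be fixed at the source.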

Analyse page titles and metadata

Analyse page titles and meta descriptions during a crawl, and identify titles and descriptions that are too long, too short, missing or duplicated across your site.
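A minimal sketch of that title and description audit, using only the standard library. The 60- and 155-character limits below are common SEO rules of thumb, not values taken from Screaming Frog's documentation.

```python
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Collect the <title> text and meta description from a page."""
    def __init__(self):
        super().__init__()
        self.title, self.description = "", ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html, title_max=60, desc_max=155):
    """Return a list of title/description issues found in the page."""
    p = TitleMetaParser()
    p.feed(html)
    issues = []
    if not p.title:
        issues.append("missing title")
    elif len(p.title) > title_max:
        issues.append("title too long")
    if not p.description:
        issues.append("missing description")
    elif len(p.description) > desc_max:
        issues.append("description too long")
    return issues
```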

Find duplicate content

Use an MD5 hash check to discover exactly duplicated URLs, duplicated page titles, descriptions, headings and other elements, and to find low-content pages.
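The MD5 check above amounts to hashing each page's body and grouping URLs whose hashes collide. The page contents in this sketch are illustrative stand-ins for fetched HTML.

```python
import hashlib
from collections import defaultdict

def find_duplicates(pages):
    """pages: {url: html}. Return groups of URLs with identical content."""
    by_hash = defaultdict(list)
    for url, body in pages.items():
        digest = hashlib.md5(body.encode("utf-8")).hexdigest()
        by_hash[digest].append(url)
    # only hashes shared by more than one URL indicate duplication
    return [urls for urls in by_hash.values() if len(urls) > 1]
```

The same grouping trick works on titles or descriptions alone: hash just that element instead of the whole body.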

Extract data with XPath

Use CSS Path, XPath or regex to collect any data from a page's HTML. This may include social meta tags, additional headings, prices, SKUs and more!
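As a sketch of custom extraction with a regex (one of the three methods named above, alongside CSS Path and XPath), the pattern below pulls Open Graph social meta tag values. It is illustrative only; messy real-world HTML usually deserves a proper parser.

```python
import re

# matches e.g. <meta property="og:title" content="Hello">
OG_TAG = re.compile(
    r'<meta\s+property="og:(\w+)"\s+content="([^"]*)"', re.IGNORECASE)

def extract_og(html):
    """Return {og-property: content} for every Open Graph tag found."""
    return {name: content for name, content in OG_TAG.findall(html)}
```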

Review robots and directives

View URLs blocked by robots.txt, meta robots or X-Robots-Tag directives (such as "noindex" or "nofollow"), as well as canonicals and rel="next" and rel="prev" attributes.
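The robots.txt part of that check is available directly in Python's standard library. The robots.txt content below is illustrative; meta robots and X-Robots-Tag directives would additionally require inspecting page HTML and response headers.

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# URLs outside the disallowed path are crawlable; those inside are blocked
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
```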

Generate XML sitemaps

Quickly create XML sitemaps and image XML sitemaps, with advanced configuration of the URLs to include, the last modified date, priority and change frequency.
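A sitemap with the <lastmod>, <changefreq> and <priority> fields mentioned above can be built with the standard library's XML module; the URL entry here is illustrative.

```python
import xml.etree.ElementTree as ET

def build_sitemap(entries):
    """entries: list of (loc, lastmod, changefreq, priority) tuples."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod, changefreq, priority in entries:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
        ET.SubElement(url, "changefreq").text = changefreq
        ET.SubElement(url, "priority").text = priority
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    ("https://example.com/", "2018-11-08", "daily", "1.0"),
])
```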

Integrate with Google Analytics

Connect to the Google Analytics API and fetch user data for landing pages, such as sessions, bounce rate, conversion rate, goals, transactions and revenue.

Installation tutorial

1. Download and install the software.

2. Run Screaming Frog SEO Spider, click register, and enter the following registration information:

User name:

Registration code: 888DE26461-1541776050-1676B6E26A

3. Click OK to complete the activation.

Usage help

I. Crawling
1. Regular crawl
In regular crawl mode, the Spider crawls the subdomain you enter, and by default treats all other subdomains it encounters as external links (shown under the "External" tab). In the licensed version of the software, you can adjust the configuration to crawl all of a site's subdomains. One of the most common uses of the SEO Spider is to find errors on a website, such as broken links, redirects and server errors. For finer control over a crawl, use your site's URI structure together with the SEO Spider's configuration options, such as crawling only HTML (skipping images, CSS, JS and so on), the exclude function, a custom robots.txt, the include function, or switch the SEO Spider to list mode and upload a list of URIs to crawl.
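The internal-versus-external split described above can be sketched by comparing each link's host against the host of the start URL. The helper and host names below are illustrative.

```python
from urllib.parse import urlparse

def split_links(start_url, links):
    """Partition links into (internal, external) relative to start_url's host."""
    host = urlparse(start_url).netloc
    internal = [u for u in links if urlparse(u).netloc == host]
    external = [u for u in links if urlparse(u).netloc != host]
    return internal, external
```

Note that under this rule, sibling subdomains such as www.example.com count as external to blog.example.com, matching the default behaviour described above.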
2. Crawl a subfolder
By default the SEO Spider crawls forward from the folder path you enter, so if you want to crawl a specific subfolder of a site, simply enter that path as the URI. For example, if it is a blog, the path might be /blog/, like our own blog. Enter it directly into the SEO Spider and the crawl will include all URIs under the /blog/ sub-directory.
3. Crawl a list of URLs
As well as crawling a site by entering a URL and clicking "Start", you can switch to list mode and paste or upload a specific list of URLs to crawl. This is especially useful for auditing redirects during a site migration, for example.
II. Configuration
In the licensed version of the tool you can save a default crawl configuration, as well as save configuration profiles that can be loaded when needed.
1. To save the current configuration as the default, select File > Configuration > "Save Current Configuration As Default".
2. To save a configuration profile so it can be loaded later, click File > Configuration > "Save As" and adjust the file name (preferably to something descriptive).
3. To load a configuration profile, click File > Configuration > "Load", then select your profile, or choose File > Configuration > "Load Recent" to pick from a list of recent files.
4. To reset to the original default configuration, select File > Configuration > "Clear Default Configuration".
III. Exporting
The filters and tabs at the top of the window work together with the export functions. So if you apply a filter and then click "Export", the export will contain only the data the filter shows.
There are three main ways to export data:
1. Exporting top window data: simply click the "Export" button in the top left corner of the top window to export the data from the current top window tab.
2. Exporting lower window data (URL Info, In Links, Out Links, Image Info): to export this data, right-click the URL you want the data for in the top window, then click "Export" under "URL Info", "In Links", "Out Links" or "Image Info".
3. Bulk export: found under the top-level menu, this allows data to be exported in bulk. You can export all instances of links found in the crawl via the "All In Links" option, or export all in-links to URLs with a specific status code, such as 2XX, 3XX, 4XX or 5XX responses. For example, selecting the "Client Error 4XX In Links" option exports all links pointing to all error pages (such as 404 error pages). You can also export all image alt text, all images missing alt text, and all anchor text.