Copy whole websites or sections locally for offline browsing

Cyotek WebCopy

Download Cyotek WebCopy 1.8.1 Build 725

 -  100% Safe  -  Freeware
  • Latest Version:

    Cyotek WebCopy 1.8.1 Build 725

  • Requirements:

    Windows Vista / Windows 7 / Windows 8 / Windows 10

  • Author / Product:

    Cyotek Ltd. / Cyotek WebCopy

  • Filename:

    setup-cyowcopy-1.8.1-build-725.exe

  • MD5 Checksum:

    75d1f57fa9782c16bb7c0b3e147e848d

  • Details:

    Cyotek WebCopy 2020 full offline installer setup for PC 32-bit/64-bit
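
After downloading, the MD5 checksum listed above can be checked against the installer. As a minimal sketch (the filename is taken from this listing; the helper name is our own), in Python:

```python
import hashlib

def md5_of_file(path: str, chunk_size: int = 65536) -> str:
    """Return the hex MD5 digest of a file, read in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the checksum published on this page:
# md5_of_file("setup-cyowcopy-1.8.1-build-725.exe") == "75d1f57fa9782c16bb7c0b3e147e848d"
```

If the digests differ, the download is corrupt or has been tampered with and should be discarded.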

Cyotek WebCopy is a free tool for copying full or partial websites locally onto your hard disk for offline viewing. It scans the specified website and downloads its content to your hard disk. Links to resources such as style sheets, images, and other pages on the site are automatically remapped to match the local path. Using its extensive configuration options, you can define which parts of a website will be copied and how. This software may be used free of charge, but as with all free software, there are costs involved in developing and maintaining it.

What can WebCopy do?

WebCopy examines the HTML markup of a website and attempts to discover all linked resources such as other pages, images, videos, and file downloads - anything and everything. It downloads all of these resources and continues to search for more. In this manner, WebCopy can "crawl" an entire website and download everything it sees, in an effort to create a reasonable facsimile of the source website.
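
The link-discovery step described above can be sketched with the Python standard library. This is an illustration of the general technique, not WebCopy's actual implementation; the same-host filter and example URL are assumptions:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collect href/src attribute values from a page's markup."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(value)

def discover_links(base_url: str, html: str) -> list[str]:
    """Resolve every linked resource to an absolute URL, keeping only
    links on the same host - a crawler would queue these for download."""
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    return [urljoin(base_url, link) for link in parser.links
            if urlparse(urljoin(base_url, link)).netloc == host]
```

A real crawler would repeat this on each downloaded page until no new same-host URLs remain, then rewrite the links to local paths.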

What can WebCopy not do?

It does not include a virtual DOM or any form of JavaScript parsing. If a website makes heavy use of JavaScript to operate, WebCopy is unlikely to make a true copy, since it cannot discover all of the website's pages when JavaScript is used to dynamically generate links.

It does not download the raw source code of a website; it can only download what the HTTP server returns. While it will do its best to create an offline copy of a website, advanced data-driven websites may not work as expected once they have been copied.

Features and Highlights

Rules
Rules control the scan behavior, for example excluding a section of the website. Additional options are also available such as downloading a URL to include in the copy, but not crawling it.
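
As a rough illustration of how exclusion rules might behave (the patterns and function below are made up for this sketch, not WebCopy's rule syntax), a crawler can filter its queue against a list of patterns:

```python
import re

# Hypothetical exclusion rules: skip a forum section and any print views.
EXCLUDE_RULES = [re.compile(r"/forum/"), re.compile(r"[?&]print=1")]

def should_crawl(url: str) -> bool:
    """Return False if any exclusion rule matches the URL."""
    return not any(rule.search(url) for rule in EXCLUDE_RULES)
```

URLs rejected here would never be downloaded, while a "download but don't crawl" option, as described above, would fetch the URL without scanning it for further links.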

Forms and Passwords
Before analyzing a website, you can optionally post one or more forms, for example to login to an administration area. HTTP 401 challenge authentication is also supported, so if your website contains protected areas, you can either pre-define user names and passwords or be automatically prompted for credentials while scanning.

Viewing links
After you have analyzed your website, the Link Map Viewer allows you to view all the links found in your website, both internal and external. Filtering allows you to easily view the different links found.

Configurable
There are many settings for configuring how your website will be crawled. In addition to the rules and forms mentioned above, you can also configure domain aliases, user agent strings, default documents, and more.

Reports
After scanning a website, you can view lists of pages, errors, missing pages, media resources, and more.

Regular Expressions
Several configuration options make use of regular expressions. The built-in editor allows you to easily test expressions.
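
The same expressions can also be tried quickly outside the application in any regex-capable environment. For instance, a pattern matching common image extensions (the pattern is only an example, not one of WebCopy's defaults) can be checked in Python:

```python
import re

# Example pattern: match paths ending in common image extensions.
pattern = re.compile(r"\.(?:png|jpe?g|gif)$", re.IGNORECASE)

for candidate in ("logo.PNG", "photo.jpeg", "index.html"):
    print(candidate, "->", bool(pattern.search(candidate)))
```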

Website Diagram
View and customize a visual diagram of your website, which can also be exported to an image.

Note: Requires .NET Framework.

