Sometimes you want to create an offline copy of a site that you can take along and view even without internet access. Using wget, you can make such a copy easily.

wget is a handy command-line tool for downloading resources from the internet.

Command:
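
Putting the options explained below together, the full invocation looks something like this:

    wget \
        --recursive \
        --no-clobber \
        --page-requisites \
        --html-extension \
        --convert-links \
        --restrict-file-names=windows \
        --domains example.org \
        --no-parent \
        http://example.org/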

Here, in place of http://example.org/, give the link you want to download.

Options:

--recursive: Download recursively, traversing parts of the Web (or a single HTTP or FTP server) by following links and the directory structure.

--domains <domain_name>: Limit the recursion to hosts that belong to the listed domains, so wget does not wander off to other sites. It accepts a comma-separated list, as shown in the sketch after this list.

--no-parent: Never ascend to the parent directory, which guarantees that wget stays within the starting hierarchy.

--page-requisites: Download all the files that are necessary to properly display a given HTML page (images, CSS and so on).

--html-extension: Save files with the .html extension (newer wget releases call this --adjust-extension).

--convert-links: After the download finishes, convert the links in the documents so that they point at the local copies, making them suitable for offline viewing.

--restrict-file-names=windows: Escape characters in filenames that Windows does not allow, so the mirror works there as well.

--no-clobber: Don't overwrite any existing files (useful in case the download is interrupted and resumed).
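
A side note on --domains: it takes a comma-separated list, and if a site serves its images or CSS from a second host you also need --span-hosts (-H), since without it wget never leaves the starting host at all. A small sketch, assuming a hypothetical cdn.example.org holds the static assets:

    wget --recursive --page-requisites --convert-links \
        --span-hosts --domains example.org,cdn.example.org \
        --no-parent http://example.org/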

Example:
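
For instance, to take an offline copy of a site's tutorials area (the path here is just illustrative), you would run:

    wget \
        --recursive \
        --no-clobber \
        --page-requisites \
        --html-extension \
        --convert-links \
        --restrict-file-names=windows \
        --domains example.org \
        --no-parent \
        http://example.org/tutorials/html/

Thanks to --no-parent, wget stays inside /tutorials/html/ and never climbs up into the rest of the site. And if the run gets interrupted, you can simply re-run the same command: --no-clobber makes wget skip the files that are already on disk.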



In an upcoming post, I will discuss how to download a website on Windows. So, stay tuned.

Your comments and feedback are always welcome.