Download Whole Website Like httrack via Command Line in Linux

On Oct 28, 2010

When you need to download a whole website, you usually reach for an extra application such as httrack on Windows. So how do you do it on Linux, and what application do you need? The answer is a simple solution that many people miss.

Yes, we can do it from the command line alone, and the command we need is one most people already know: wget. So how does wget download a whole website? Here is the command you need (replace http://example.com/ with the site you want to mirror):

$ wget -r --level=0 --convert-links --page-requisites --no-parent http://example.com/

The wget options explained:
-r  / --recursive          Download recursively.
-l  / --level=<depth>      Use 0 for infinite depth, or a number greater than 0 to limit the depth.
-k  / --convert-links      Modify links inside downloaded files to point to the local copies.
-p  / --page-requisites    Get all images, CSS, and JS files that make up each page.
-np / --no-parent          Don't download contents of the parent directory.
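
For example, the same command using the short options looks like this (example.com is only a placeholder for the site you actually want to mirror, and the optional --wait=1 pauses one second between requests so you don't hammer the server):

$ wget -r -l 0 -k -p -np --wait=1 http://example.com/

When it finishes, wget leaves a local directory named after the host (here example.com) that you can open and browse offline in any web browser.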

That's the command you need to download a whole website, just like httrack. Very simple, isn't it?
