How to download a website using the wget Linux command

The wget utility is a powerful and effective tool that enables you to download web pages, files and images from the web using the Linux command line.

For this guide, we will learn how to download this free one-page template: https://blackrockdigital.github.io/startbootstrap-coming-soon/.

Start by opening a terminal on your Linux machine.

It is worth creating a new folder on your machine using the mkdir command and then moving into it using the cd command.

For example, let’s make a folder named “onepage”:

mkdir onepage
cd onepage
wget https://blackrockdigital.github.io/startbootstrap-coming-soon/ 

The result is a single index.html file. On its own, this file is useless as the content, images and stylesheets are not included.
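
If all you want is for that single page to display properly offline, one option is the -p (--page-requisites) switch, which tells wget to also fetch the images, stylesheets and scripts that the page needs:

wget -p https://blackrockdigital.github.io/startbootstrap-coming-soon/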

To download the full site and all the pages, you can use the following command:

wget -r https://blackrockdigital.github.io/startbootstrap-coming-soon/

This downloads the pages recursively up to a maximum of 5 levels deep.

Five levels deep might not be enough to get everything from the site. You can use the -l switch to set the number of levels you wish to go to as follows:

wget -r -l10 https://blackrockdigital.github.io/startbootstrap-coming-soon/

If you want infinite recursion you can use the following:

wget -r -l inf https://blackrockdigital.github.io/startbootstrap-coming-soon/

You can also replace inf with 0, which means the same thing.
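
For example, the following command is equivalent to the -l inf example above:

wget -r -l 0 https://blackrockdigital.github.io/startbootstrap-coming-soon/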

There is still one more problem. You might have all the pages locally, but the links in those pages still point to their original location on the web, so you cannot click from one local page to another.

You can get around this problem by using the -k switch, which converts all the links on the pages to point to their locally downloaded equivalents, as follows:

wget -r -k https://blackrockdigital.github.io/startbootstrap-coming-soon/

If you want to get a complete mirror of a website, you can simply use the -m switch, which turns on recursion and time-stamping and sets infinite recursion depth, so it takes away the need for the -r and -l switches. It does not convert the links, though, so you may still want to add -k.

wget -m https://blackrockdigital.github.io/startbootstrap-coming-soon/

Therefore, if you have your own website, you can make a complete backup using this one simple command.
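
For example, assuming your own site lives at a placeholder address such as https://www.example.com/, a backup with the links converted for local browsing might look like this:

wget -m -k https://www.example.com/   # www.example.com is a placeholder for your own domain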

Do remember though, not all websites will grant you permission to download all the pages. If you are the legitimate owner, you can still get through by using the following switches to specify the username and password.

wget --user=yourusername --password=yourpassword

Beware: on a multi-user system, if somebody runs the ps command while wget is running, they will be able to see your username and password.
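
One way around this, if your version of wget supports it, is the --ask-password switch, which prompts for the password interactively instead of taking it on the command line:

wget --user=yourusername --ask-password https://www.example.com/   # www.example.com is a placeholder for the site you own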

The wget command has a huge number of options and switches. It is worth reading the manual page for wget by typing the following into your terminal window:

man wget

Enjoy!
