Note: The instructions provided in this article or section require shell access unless otherwise stated.
You can use the PuTTY client on Windows, or SSH on UNIX and UNIX-like systems such as Linux or Mac OS X.
GNU Wget (or just Wget, formerly Geturl) is a program that retrieves content from web servers, and is part of the GNU Project. Its name is derived from the words World Wide Web and get. It supports HTTP, HTTPS, and FTP download protocols.
Using the wget program over SSH at the UNIX shell prompt is a great shortcut for transferring software or other files from a remote server to your DreamHost server. You can avoid the sometimes painful and slow download/upload process and instead download files straight to DreamHost's servers.
Wget is a powerful tool, with a lot of options, but even the basics are useful.
Note: rsync may be a better (faster, less complicated) option for users migrating between two rsync-enabled servers (such as moving from DH to DH PS).
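As a rough sketch only (the hostname and paths below are placeholders), a one-way rsync from an old server to the current one might look like:
rsync -avz username@old-server.example.com:~/example.com/ ~/example.com/
Here -a preserves permissions and timestamps, -v lists the files transferred, and -z compresses data in transit.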
To use wget:
- Create a shell user in your panel.
- Once you create a shell user, log into your server via SSH.
- Type in ‘wget’ followed by the full URL of the file you wish to download. For example, run the following command to download the .tar.gz file for Python version 2.7.7:
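wget https://www.python.org/ftp/python/2.7.7/Python-2.7.7.tgz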
- This downloads the .tgz file to the directory you ran the command in.
- Wget is often used to download compressed files.
- If the file you download is compressed, decompress it using gunzip, unzip, or tar. For example, to unpack the Python archive downloaded above:
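tar -xzvf Python-2.7.7.tgz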
- If you need to pass variables to a script, enclose the URL in single quotes, which prevents the ampersand character from being interpreted as a shell command. For example (using a made-up script URL):
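wget 'http://www.example.com/download.php?file=archive.zip&id=5'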
To create a mirror image of a folder on a different server (with the same structure as the original), you can use wget's FTP support to transfer it:
wget -r 'ftp://username:password@example.org/folder/*'
This command downloads 'folder/' and everything within it, preserving the directory structure, which can save you a lot of time compared to running wget on each file individually.
Now you can simply zip the folder using:
zip -r folder.zip folder
and then clean up by deleting the copy:
rm -rf folder
It's a great way to back up your entire website at once, and it's very helpful when moving large sites across hosts.
For example, use the following command to download the entire contents of example.com (-r enables recursive retrieval, and -l 0 removes the default five-level depth limit):
wget -r -l 0 http://www.example.com/
- Taken from: GNU Wget Manual/Examples – Advanced Usage
Man page info
To view the manual page for wget, run the following in your terminal:
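man wget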