Retrieves files from the Web.
Wget is a utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
It can follow links in HTML pages and create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing so, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can also be instructed to convert the links in downloaded HTML files to point to the local copies, for offline viewing.
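A minimal sketch of the recursive download described above, using standard Wget options (the URL and depth are placeholders to adapt):

```shell
# Fetch a site two levels deep, pull in the images/CSS each page needs,
# and rewrite links in the saved HTML so it browses offline.
wget --recursive --level=2 \
     --convert-links \
     --page-requisites \
     https://www.example.com/
```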
Recursive downloading also works with FTP, where Wget can retrieve a hierarchy of directories and files.
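The FTP case looks much the same; a sketch with a placeholder host and path:

```shell
# Fetch a directory tree over FTP. --no-parent keeps Wget from
# ascending above the starting directory.
wget --recursive --no-parent ftp://ftp.example.org/pub/project/
```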
With both HTTP and FTP, Wget can check whether a remote file has changed on the server since the previous run, and only download the newer files.
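The freshness check is the `--timestamping` (`-N`) option; a sketch with a placeholder URL:

```shell
# Re-download the file only if the server's copy is newer than the
# local one (compares timestamps, and sizes where available).
wget --timestamping https://www.example.com/data/archive.tar.gz
```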
Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports re-getting, it will instruct the server to continue the download from where it left off.
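A sketch of the retry-and-resume behavior, again with a placeholder URL:

```shell
# --tries=0 retries indefinitely, --waitretry caps the wait between
# retries at 10 seconds, and --continue resumes a partial download
# from where it left off (when the server supports range requests).
wget --continue --tries=0 --waitretry=10 \
     https://www.example.com/images/install.iso
```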
Wget supports proxy servers; this can lighten the network load, speed up retrieval, and provide access behind firewalls.
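Proxy use can be configured two ways; a sketch with placeholder proxy host and port:

```shell
# Wget honors the standard proxy environment variables:
export http_proxy=http://proxy.example.com:3128/
export https_proxy=http://proxy.example.com:3128/
wget https://www.example.com/file.txt

# Or per invocation, passing .wgetrc-style commands with -e/--execute:
wget -e use_proxy=yes -e http_proxy=proxy.example.com:3128 \
     https://www.example.com/file.txt
```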
If you have corrections to this entry or questions about it, please contact: mailto:email@example.com
Version 1.20.1 was released on 26 December 2018.
OpenPGP signature: https://ftp.gnu.org/gnu/wget/wget-1.20.1.tar.lz.sig
git clone git://git.sv.gnu.org/wget.git
- Wget Script generator
2 May 2018
Leaders and contributors
Resources and communication
- Debian package tracker: https://tracker.debian.org/pkg/wget2
- VCS repository webview (wget): https://git.savannah.gnu.org/cgit/wget.git/
- VCS repository webview (wget2): https://gitlab.com/gnuwget/wget2/
This entry (in part or in whole) was last reviewed on 19 January 2019.