Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
It can follow links in HTML pages and create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing that, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in downloaded HTML files to the local files for offline viewing.
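The recursive download and link-conversion behavior described above can be requested with flags like the following (the URL is a placeholder, and the depth limit is an illustrative choice):

```shell
# Mirror a site recursively to depth 5, fetching the images and
# stylesheets each page needs (--page-requisites) and rewriting links
# in the downloaded HTML so the copy browses offline (--convert-links).
# The URL is a placeholder.
wget --recursive --level=5 --page-requisites --convert-links \
     https://example.org/
```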
Recursive downloading also works with FTP, where Wget can retrieve a hierarchy of directories and files.
With both HTTP and FTP, Wget can check whether a remote file has changed on the server since the previous run, and only download the newer files.
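This change check is Wget's time-stamping mode. A sketch of a re-run that only fetches files newer than the local copies (placeholder URL):

```shell
# --timestamping (-N) compares the remote file's modification time and
# size against the local copy and skips the download if the local file
# is up to date. The URL is a placeholder.
wget --timestamping https://example.org/file.tar.gz
```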
Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports re-getting, it will instruct the server to continue the download from where it left off.
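The retry and resume behavior can be made explicit on the command line; a sketch, with a placeholder URL:

```shell
# --tries=0 retries indefinitely on transient network failures;
# --continue (-c) asks the server to resume a partial download from
# where it left off, when the server supports byte ranges.
# The URL is a placeholder.
wget --tries=0 --continue https://example.org/big-file.iso
```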
Wget supports proxy servers; this can lighten the network load, speed up retrieval, and provide access behind firewalls.
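Proxy use is typically configured through the standard environment variables that Wget honors; the proxy host and port below are placeholders:

```shell
# Route HTTP and HTTPS requests through a proxy. Wget reads the
# http_proxy and https_proxy environment variables; the proxy address
# and the target URL are placeholders.
export http_proxy=http://proxy.example.com:8080/
export https_proxy=http://proxy.example.com:8080/
wget https://example.org/
```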
Documentation
User reference manual, available in HTML, Info, ASCII, TeX DVI, PostScript, and Texinfo formats, from https://www.gnu.org/software/wget/manual/.
This is a GNU package: wget
Released on 27 October 2014.
License: GPLv3 or later, with exception
Verified by: Kelly Hopkins
Verified on: 24 September 2009
Leaders and contributors
Micah Cowan: maintainer from mid-2007 to mid-2010
Giuseppe Scrivano: current maintainer
Resources and communication
VCS repository webview (Developer): http://git.savannah.gnu.org/cgit/wget.git
Mailing list subscription (Bug tracking, Developer, Help, Support): http://lists.gnu.org/mailman/listinfo/bug-wget
This entry (in part or in whole) was last reviewed on 28 October 2014.