Difference between revisions of "Wget"

From Free Software Directory
(Created page with "{{Entry |Name=Wget |Short description=Retrieves files from the Web |Full description=Wget is a network utility to retrieve files from the Web using http and ftp, the two most wid...")

(7 intermediate revisions by 5 users not shown)
 
{{Entry
|Name=GNU wget
|Short description=Retrieves files from the Web
|Full description='''Wget''' is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.

It can follow links in HTML pages and create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing that, Wget respects the Robot Exclusion Standard (''/robots.txt''). Wget can be instructed to convert the links in downloaded HTML files to the local files for offline viewing.

Recursive downloading also works with FTP, where Wget can retrieve a hierarchy of directories and files.

With both HTTP and FTP, Wget can check whether a remote file has changed on the server since the previous run, and only download the newer files.

Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports ''re-getting'', it will instruct the server to continue the download from where it left off.

Wget supports proxy servers; this can lighten the network load, speed up retrieval, and provide access behind firewalls.
|Homepage URL=https://www.gnu.org/software/wget/
|User level=intermediate
|VCS checkout command=git clone git://git.savannah.gnu.org/wget.git
|Computer languages=C
|Documentation note=User reference manual available in HTML, Info, ASCII, TeX dvi, PostScript, and Texinfo formats from https://www.gnu.org/software/wget/manual/
|Related projects=Checkurls,cURL,fget,Ftpcopy,Kernin,KMAGO,Larbin,MH-E,Sirobot,Sitecopy,Wget4web,Wget_Script_generator,wmget,Gwget
|Keywords=FTP,Internet,download,wget,file management,WWW,mirror
|Version identifier=1.15
|Version date=2014/01/19
|Version status=stable
|Version download=https://ftp.gnu.org/gnu/wget/wget-1.15.tar.xz
|Last review by=Genium
|Last review date=2014/01/25
|Submitted by=Database conversion
|Submitted date=2011-04-01
|Status=
|Is GNU=Yes
|GNU package identifier=wget
|License verified date=2009-09-24
}}
{{Project license
|License=GPLv3orlater with exception
|License verified by=Kelly Hopkins
|License verified date=2009-09-24
}}
{{Person
|Real name=Micah Cowan
|Role=Maintainer from mid-2007 to mid-2010
|Resource URL=
}}
{{Person
|Real name=Giuseppe Scrivano
|Role=Current maintainer
|Email=gscrivano@gnu.org
|Resource URL=
}}
{{Resource
|Resource audience=Developer
|Resource kind=VCS Repository Webview
|Resource URL=http://git.savannah.gnu.org/cgit/wget.git
}}
{{Resource
|Resource audience=Bug Tracking,Developer,Help,Support
|Resource kind=E-mail
|Resource URL=mailto:bug-wget@gnu.org
}}
{{Software category
|Interface=command-line
|Internet-application=tool
|Use=internet-application
}}
{{Featured}}

Latest revision as of 12:29, 25 January 2014


GNU wget

https://www.gnu.org/software/wget/
Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.

It can follow links in HTML pages and create local versions of remote web sites, fully recreating the directory structure of the original site. This is sometimes referred to as "recursive downloading." While doing that, Wget respects the Robot Exclusion Standard (/robots.txt). Wget can be instructed to convert the links in downloaded HTML files to the local files for offline viewing.
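A typical invocation of this recursive mode might look like the following sketch; example.org is a placeholder host, and all flags are standard Wget options (--level caps the recursion depth, --page-requisites also fetches the images and stylesheets each page needs):

```shell
# Recursively download a site and rewrite links for offline viewing.
# example.org is a placeholder; substitute the site you want to mirror.
wget --recursive \
     --level=2 \
     --convert-links \
     --page-requisites \
     --no-parent \
     https://example.org/
```

--no-parent keeps the crawl from wandering above the starting directory, which is usually what you want when mirroring a sub-tree of a site.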

Recursive downloading also works with FTP, where Wget can retrieve a hierarchy of directories and files.

With both HTTP and FTP, Wget can check whether a remote file has changed on the server since the previous run, and only download the newer files.
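This timestamping behavior is enabled with a single flag; the URLs below are placeholders:

```shell
# Fetch the file only if the remote copy is newer than the local one.
wget --timestamping https://example.org/data.tar.gz

# --mirror combines recursion with timestamping
# (shorthand for -r -N -l inf --no-remove-listing).
wget --mirror https://example.org/
```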

Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports re-getting, it will instruct the server to continue the download from where it left off.
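On an unstable link, the retry and re-get behavior described above can be tuned with standard options; the URL is again a placeholder:

```shell
# Resume a partial download where it left off
# (requires server support for byte ranges).
wget --continue https://example.org/big-file.iso

# Retry indefinitely, with a per-attempt timeout and a pause between retries.
wget --tries=0 --timeout=30 --waitretry=10 https://example.org/big-file.iso
```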

Wget supports proxy servers; this can lighten the network load, speed up retrieval, and provide access behind firewalls.
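Proxy support is typically configured through the standard environment variables, or per run with wgetrc commands; proxy.example.com is a hypothetical proxy host:

```shell
# Wget honors the conventional proxy environment variables:
export http_proxy=http://proxy.example.com:8080/
export https_proxy=http://proxy.example.com:8080/
wget https://example.org/

# The same settings can be passed per run as wgetrc commands,
wget -e use_proxy=yes -e http_proxy=http://proxy.example.com:8080/ https://example.org/
# or proxying can be disabled entirely:
wget --no-proxy https://example.org/
```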

Documentation

User reference manual available in HTML, Info, ASCII, TeX dvi, PostScript, and Texinfo formats from https://www.gnu.org/software/wget/manual/

This is a GNU package: wget

Download

Download version 1.15 (stable), released on 19 January 2014

VCS Checkout

git clone git://git.savannah.gnu.org/wget.git

Categories

command-line interface; internet-application (tool)

Related Projects

Checkurls, cURL, fget, Ftpcopy, Kernin, KMAGO, Larbin, MH-E, Sirobot, Sitecopy, Wget4web, Wget_Script_generator, wmget, Gwget




Licensing

License                     | Verified by   | Verified on       | Notes
GPLv3orlater with exception | Kelly Hopkins | 24 September 2009 |



Leaders and contributors

Contact(s)                            | Role
Micah Cowan                           | Maintainer from mid-2007 to mid-2010
Giuseppe Scrivano (gscrivano@gnu.org) | Current maintainer


Resources and communication

Audience                               | Resource type          | URI
Bug Tracking, Developer, Help, Support | E-mail                 | mailto:bug-wget@gnu.org
Developer                              | VCS Repository Webview | http://git.savannah.gnu.org/cgit/wget.git


Software prerequisites

This entry (in part or in whole) was last reviewed on 25 January 2014.



Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.3 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the page “GNU Free Documentation License”.

The copyright and license notices on this page only apply to the text on this page. Any software or copyright-licenses or other similar notices described in this text has its own copyright notice and license, which can usually be found in the distribution or license text itself.

