GNU Wget is a computer program that implements simple and powerful content retrieval from web servers and is part of the GNU Project. Its name is derived from World Wide Web and get, connotative of its primary function. It currently supports downloading via the HTTP, HTTPS, and FTP protocols, the most popular TCP/IP-based protocols used for web browsing.
Its features include recursive download, conversion of links for offline viewing of local HTML, support for proxies, and much more. It appeared in 1996, coinciding with the boom in the Web's popularity, which led to its wide use among Unix users and its distribution with most major Linux distributions. Written in portable C, Wget can be easily installed on any Unix-like system and has been ported to many environments, including Mac OS X, Microsoft Windows, OpenVMS and AmigaOS. Competing tools include cURL.
It has been used as the basis for graphical programs such as Gwget for the GNOME Desktop. Released under the terms of the GNU General Public License v3, Wget is free software.
Wget has been designed for robustness over slow or unstable network connections. If a download does not complete due to a network problem, Wget will automatically try to continue the download from where it left off, and repeat this until the whole file has been retrieved. It was one of the first clients to make use of the then-new Range HTTP header to support this feature.
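For instance, an interrupted transfer can be resumed from the partial local file with the -c (continue) option; the URL here is only a placeholder:

wget -c http://www.example.com/large-file.iso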
Wget can optionally work like a web crawler by extracting resources linked from HTML pages and downloading them in sequence, repeating the process recursively until all the pages have been downloaded or a maximum recursion depth specified by the user has been reached. The downloaded pages are saved in a directory structure resembling that on the remote server. This "recursive download" enables partial or complete mirroring of web sites via HTTP. Links in downloaded HTML pages can be adjusted to point to locally downloaded material for offline viewing. When performing this kind of automatic mirroring of web sites, Wget supports the Robots Exclusion Standard.
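A minimal sketch of this mode, using a placeholder URL: -r enables recursive retrieval, -l 2 caps the recursion depth at two levels, and -k converts the links afterwards for local viewing:

wget -r -l 2 -k http://www.example.com/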
Recursive download works with FTP as well, where Wget issues the LIST command to find which additional files to download, repeating this process for directories and files under the one specified in the top URL. Shell-like wildcards are supported when the download of FTP URLs is requested.
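For example, with a hypothetical FTP host, a quoted glob pattern fetches every matching file in a directory (the quotes keep the local shell from expanding the wildcard, leaving that to Wget):

wget "ftp://ftp.example.com/pub/releases/*.tar.gz"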
When downloading recursively over either HTTP or FTP, Wget can be instructed to inspect the timestamps of local and remote files, and download only the remote files newer than the corresponding local ones. This allows easy mirroring of HTTP and FTP sites, but is considered inefficient and more error-prone than programs designed for mirroring from the ground up, such as rsync. On the other hand, Wget does not require special server-side software for mirroring, so the comparison is not entirely fair.
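A sketch of such a timestamp-based mirror, with a placeholder URL; the --mirror option combines recursion with the -N timestamping option:

wget --mirror http://www.example.com/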
Wget is non-interactive in the sense that, once started, it does not require user interaction and does not need to control a TTY, being able to log its progress to a separate file for later inspection. That way the user can start Wget and log off, leaving the program unattended. By contrast, most graphical or text user interface web browsers require the user to remain logged in and to manually restart failed downloads, which can be a great hindrance when transferring a lot of data.
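As an illustration, the following invocation (with a placeholder URL and log file name) detaches to the background immediately and writes its progress to a log file instead of the terminal:

wget -b -o wget.log http://www.example.com/large-file.iso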
Written in a highly portable style of C with minimal dependencies on third-party libraries, Wget requires little more than a C compiler and a BSD-like interface to TCP/IP networking. Designed as a Unix program invoked from the Unix shell, the program has been ported to numerous Unix-like environments and systems, such as Cygwin and Mac OS X, as well as to Microsoft Windows.
- Wget supports download through proxies, which are widely deployed to provide web access inside company firewalls and to cache and quickly deliver frequently accessed content.
- It makes use of persistent HTTP connections where available.
- IPv6 is supported on systems that include the appropriate interfaces.
- SSL/TLS is supported for encrypted downloads using the OpenSSL library.
- Files larger than 2 GB are supported on 32-bit systems that include the appropriate interfaces.
- Download speed may be throttled to avoid using up all of the available bandwidth, as shown in the example below.
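A short sketch of the last point, with a placeholder URL; --limit-rate accepts suffixes such as k and m:

wget --limit-rate=200k http://www.example.com/large-file.iso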
Typical usage of GNU Wget consists of invoking it from the command line, providing one or more URLs as arguments.
- Download the title page of example.com to a file
- named "index.html".
wget http://www.example.com/

- Download Wget's source code from the GNU ftp site.
wget ftp://ftp.gnu.org/pub/gnu/wget/wget-latest.tar.gz
More complex usage includes automatic download of multiple URLs into a directory hierarchy.
- Download the title page of example.com, along with
- the images and style sheets needed to display the page, and convert the
- URLs inside it to refer to locally available content.
wget -p -k http://www.example.com/
- Download the entire contents of example.com
wget -r -l 0 http://www.example.com/
- Download a mirror of the errata for a book you just purchased.
- Follow all local links recursively and make the files suitable
- for off-line viewing.
- Use a random wait of 0 to 5*2 seconds between files.
- When there is a failure, retry up to 7 times with 14 seconds
- between each retry.
- Set the user agent to Firefox on Windows XP and ignore robot exclusions.
- Collect access results to the local file "myLog.log"
wget -t 7 -w 5 --waitretry=14 --random-wait \
     --user-agent="Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.1) Gecko/20060111 Firefox/1.5.0.1" \
     -m -k -K -e robots=off \
     http://www.oreilly.com/catalog/upt3/errata/ -o ./myLog.log
- Collect only the specific links listed line by line in
- the local file "my_movies.txt"
- Use a random wait of 0 to 33*2 seconds between files.
- When there is a failure, retry up to 22 times with 48 seconds
- between each retry. Send no user-agent at all. Ignore robot exclusions.
- Place all the captured files in the "/movies" directory
- and collect the access results to the local file "my_movies.log"
- Good for just downloading specific known images or other files.
wget -t 22 --waitretry=48 --wait=33 --random-wait --user-agent="" \
     -e robots=off -o ./my_movies.log -P /movies -i ./my_movies.txt
- Using wget to download content protected by referer and cookies.
- 1. get base url and save its cookies in file
- 2. get protected content using stored cookies
wget --cookies=on --keep-session-cookies --save-cookies=cookie.txt http://first_page
wget --referer=http://first_page --cookies=on --load-cookies=cookie.txt --keep-session-cookies --save-cookies=cookie.txt http://second_page
- Mirror website to a static copy for local browsing.
- This means all links will be changed to point to the local files.
- Note: --html-extension appends an .html suffix to pages generated by CGI, ASP or PHP scripts (and to any other downloaded HTML whose name does not already end in .html).
- Place the copy in the local directory "./local-copy" (the -P option requires a directory argument).
wget --mirror -w 2 -p --html-extension --convert-links -P ./local-copy http://www.yourdomain.com
Authors and copyright
GNU Wget was written by Hrvoje Nikšić with contributions by many other people, including Dan Harkless, Ian Abbott, and Mauro Tortonesi. Significant contributions are credited in the AUTHORS file included in the distribution, and all remaining ones are documented in the changelogs, also included with the program. Wget is now maintained by Micah Cowan.
The copyright to Wget belongs to the Free Software Foundation, whose policy is to require copyright assignments for all non-trivial contributions to GNU software.
Wget is the descendant of an earlier program named Geturl by the same author, the development of which commenced in late 1995. The name was changed to Wget after the author became aware of an earlier Amiga program named GetURL, written by James Burton in ARexx.
Wget filled a gap in the web-downloading software available in the mid-1990s. No single program could reliably download files via both HTTP and FTP. Existing programs either supported only FTP (such as NcFTP and dl) or were written in Perl, which was not yet ubiquitous at the time. While Wget was inspired by features of some of the existing programs, it aimed to support both HTTP and FTP and to enable users to build it using only the standard development tools found on every Unix system.
At that time, many Unix users struggled with extremely slow university and dial-up Internet connections, leading to a growing need for a downloading agent that could deal with transient network failures without assistance from a human operator.
The following releases represent notable milestones in Wget's development. Features listed next to each release are edited for brevity and do not constitute comprehensive information about the release, which is available in the NEWS file distributed with Wget.
- Geturl 1.0, released January 1996, was the first publicly available release. The first English-language announcement can be traced to a Usenet news posting that probably refers to Geturl 1.3.4, released in June.
- Wget 1.4.0, released November 1996, was the first version to use the name Wget. It was also the first release distributed under the terms of the GNU GPL, Geturl having been distributed under an ad-hoc no-warranty license.
- Wget 1.4.3, released February 1997, was the first version released as part of the GNU project with the copyright assigned to the FSF.
- Wget 1.5.3, released September 1998, was a milestone in the program's popularity. This version was bundled with many Linux distributions, which exposed the program to a much wider audience.
- Wget 1.6, released December 1999, incorporated many bug fixes for the (by then stale) 1.5.3 release, largely thanks to the effort of Dan Harkless.
- Wget 1.7, released June 2001, introduced SSL support, cookies, and persistent connections.
- Wget 1.8, released December 2001, added bandwidth throttling, new progress indicators, and the breadth-first traversal of the hyperlink graph.
- Wget 1.9, released October 2003, included experimental IPv6 support and the ability to POST data to HTTP servers.
- Wget 1.10, released June 2005, introduced large file support, IPv6 support on dual-family systems, NTLM authorization, and SSL improvements. The maintainership was picked up by Mauro Tortonesi.
- Wget 1.11, released January 2008, moved to version 3 of the GNU General Public License, and added preliminary support for the non-standard but widely used Content-Disposition header, which is often used by CGI scripts to indicate the name of a file for downloading. Security-related improvements were also made to the HTTP authentication code. Micah Cowan took over maintainership of the project.
Development and release cycle
Wget is developed in an open fashion, with most of the design decisions typically being discussed on the public mailing list followed by users and developers. Bug reports are relayed to the same list.
The preferred method of contributing to Wget's code and documentation is through source updates in the form of textual patches generated by the diff utility. Patches intended for inclusion in Wget are submitted to a designated mailing list where they are reviewed by the maintainers. Patches that pass the maintainers' scrutiny are installed in the sources, and all others are rejected. Instructions on patch creation, as well as style guidelines, are outlined in the PATCHES document provided with the distribution, mostly based on the GNU Coding Standards.
Because all changes go through this list, even ones from core developers, the subscribers to the list can track Wget development and provide feedback.
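As a sketch of that workflow (the file names here are hypothetical), a unified diff suitable for posting to the list can be produced with:

diff -u src/http.c.orig src/http.c > http-fix.patch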
The source code can also be tracked via a remote version control repository that hosts revision history beginning with the 1.5.3 release. The repository currently runs Mercurial; it was previously hosted in Subversion, and before that in CVS.
When a sufficient number of features or bug fixes accumulate during development, Wget is released to the general public via the GNU FTP site and its mirrors. Because the project is run entirely by volunteers, there is no external pressure to issue a release, nor are there enforceable release deadlines.
Releases are numbered as versions of the form major.minor[.revision], such as Wget 1.11 or Wget 1.8.2. An increase of the major version number represents large and possibly incompatible changes in Wget's behavior or a radical redesign of the code base. An increase of the minor version number designates addition of new features and bug fixes. A new revision indicates a release that, compared to the previous revision, only contains bug fixes. Revision zero is omitted, meaning that for example Wget 1.11 is the same as 1.11.0. Wget does not use the odd-even release number convention popularized by the Linux kernel.
At any moment there are two branches of development: the trunk, where new features get added, and the stable branch, forked after each minor release, where only the bug fixes are applied. All revision-level releases are built off the stable branch; all minor version releases are built off the trunk.
GNU Wget is distributed under the terms of the GNU General Public License, version 3 or later, with a special exception that allows distribution of binaries linked against the OpenSSL library. The text of the exception follows:
Additional permission under GNU GPL version 3 section 7
If you modify this program, or any covered work, by linking or
combining it with the OpenSSL project's OpenSSL library (or a
modified version of that library), containing parts covered by the
terms of the OpenSSL or SSLeay licenses, the Free Software Foundation
grants you additional permission to convey the resulting work.
Corresponding Source for a non-source form of such a combination
shall include the source code for the parts of OpenSSL used as well
as that of the covered work.
It is expected that the exception clause will be removed once Wget is modified to also link with the GnuTLS library.
Wget's documentation, in the form of a Texinfo reference manual, is distributed under the terms of the GNU Free Documentation License, version 1.2 or later. The man page usually distributed on Unix-like systems is automatically generated from a subset of the Texinfo manual and falls under the terms of the same license.