Recursively downloads Web pages
Sirobot is a Perl script that downloads Web pages recursively. Its main advantage over wget is that it fetches pages concurrently; it can also resume aborted downloads and convert absolute links to relative ones. It uses curses, supports HTTPS, and has a pattern-matching filter to keep you from downloading the whole Internet.
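The combination of features described above (recursive crawling, concurrent fetching, and a pattern filter that bounds the crawl) can be sketched in a few lines. This is not Sirobot's code; it is an illustrative Python sketch that uses a hypothetical in-memory `SITE` map in place of real HTTP requests, so the crawl logic is visible without any network access.

```python
import re
from concurrent.futures import ThreadPoolExecutor

# Hypothetical in-memory "site": URL -> list of linked URLs.
# A real crawler (like Sirobot) would fetch pages over HTTP instead.
SITE = {
    "http://example.org/": ["http://example.org/a", "http://other.net/x"],
    "http://example.org/a": ["http://example.org/b"],
    "http://example.org/b": [],
}

def crawl(start, pattern, workers=4):
    """Recursively collect URLs, fetching each level concurrently.

    `pattern` keeps the crawl from wandering across the whole Web,
    in the spirit of Sirobot's pattern-matching filter.
    """
    allow = re.compile(pattern)
    seen, frontier = set(), [start]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while frontier:
            # Keep only unvisited URLs that pass the filter.
            batch = [u for u in frontier if u not in seen and allow.search(u)]
            seen.update(batch)
            # "Fetch" the whole batch concurrently (here: dict lookups).
            results = pool.map(lambda u: SITE.get(u, []), batch)
            frontier = [link for links in results for link in links]
    return seen

urls = crawl("http://example.org/", r"^http://example\.org/")
# http://other.net/x is linked but filtered out by the pattern
```

The filter is applied before a URL is ever fetched, which is what makes it practical: off-site links are discarded at the frontier instead of after a wasted download.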
Released on 22 May 2004
|License||Verified by||Verified on||Notes|
|GPLv2 or later||Janet Casey||9 April 2003||
Leaders and contributors
Resources and communication
|Required to use||Perl 5 or later|
|Required to use||LWP|
|Required to use||URI|
|Required to use||Digest::MD5|
|Weak prerequisite||SSL library (needed for HTTPS support)|
This entry (in part or in whole) was last reviewed on 15 January 2017.
Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.3 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the page “GNU Free Documentation License”.
The copyright and license notices on this page apply only to the text on this page. Any software, copyright licenses, or other similar notices described in this text have their own copyright notice and license, which can usually be found in the distribution or license text itself.