Recursively downloads Web pages
Sirobot is a Perl script that downloads Web pages recursively. Its main advantage over wget is that it fetches pages concurrently; it can also resume aborted downloads and convert absolute links to relative ones. It uses curses, supports HTTPS, and includes a pattern-matching filter to keep you from downloading the whole Internet.
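The combination of concurrent fetching and a pattern filter can be illustrated with a short sketch. This is not Sirobot's actual code (Sirobot is written in Perl and uses LWP over real HTTP); it is a minimal Python illustration of the idea, using a hypothetical in-memory "web" in place of network fetches:

```python
import re
from concurrent.futures import ThreadPoolExecutor

# Hypothetical in-memory "web" standing in for real HTTP fetches.
PAGES = {
    "http://example.org/":  '<a href="http://example.org/a">',
    "http://example.org/a": '<a href="http://example.org/b"> <a href="http://other.net/x">',
    "http://example.org/b": "",
}

def fetch(url):
    """Stand-in for an HTTP GET; returns the page body (or empty)."""
    return PAGES.get(url, "")

def crawl(start, allow=r"^http://example\.org/", workers=4):
    """Breadth-first recursive download: each level of links is fetched
    concurrently, and only links matching the allow pattern are followed
    (the same role Sirobot's pattern filter plays)."""
    seen, frontier = {start}, [start]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while frontier:
            bodies = pool.map(fetch, frontier)   # concurrent fetches
            frontier = [
                link
                for body in bodies
                for link in re.findall(r'href="([^"]+)"', body)
                if re.match(allow, link) and link not in seen and not seen.add(link)
            ]
    return seen
```

Calling `crawl("http://example.org/")` visits the three `example.org` pages and skips `http://other.net/x`, since it fails the pattern filter.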
Released on 22 May 2004.
| License | Verified by | Verified on | Notes |
|---|---|---|---|
| GPLv2 or later | Janet Casey | 9 April 2003 | |
Leaders and contributors
Resources and communication
| Role | Prerequisite |
|---|---|
| Required to use | Perl 5 or later |
| Required to use | LWP |
| Required to use | URI |
| Required to use | Digest::MD5 |
| Weak prerequisite | SSL library (needed for HTTPS support) |
This entry (in part or in whole) was last reviewed on 15 January 2017.