This is a minor update to SiteSucker, but it is a program that I depend on, so I am posting it.
For the uninitiated, SiteSucker is a program that sucks everything (pages, images, etc.) out of a website and downloads it to your hard drive. I use it because I update my school web page at the beginning of each year, but I don't want to lose the old content. I rarely go back to look at it; I mainly keep it as a backup of the previous year, in case I lose some other record.
The updates are:
- Allowed users to view the download settings while downloading.
- Replaced wildcard support in paths settings with regular expressions.
- Removed “Get Files via Image Links” from the Download Options and added an “Only Follow Image Links” option under the Advanced tab in the download settings.
- Added an option to save log files in ~/Library/Logs/SiteSucker.
- Added a Logs tab in the Download Settings window and reorganized the settings.
- Added scanning of the style attribute in all tags for URLs.
- Replaced URL parameters with a value in local file names.
- Deleted empty folders in the download folder when all downloads are paused.
- Modified the document format to improve performance when analyzing files.
- Fixed an issue where some files failed to download when a download was resumed.
- Fixed some issues with the Open File command.
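The switch from wildcards to regular expressions in the paths settings is a nice bump in flexibility. As a rough sketch (the patterns below are hypothetical examples, not taken from SiteSucker itself), here is how a wildcard path compares to its regex equivalent, and what regexes can do that wildcards can't:

```python
import fnmatch
import re

# Hypothetical patterns for illustration; not SiteSucker's own syntax.
wildcard = "/images/*.jpg"                 # old-style wildcard pattern
regex = re.compile(r"^/images/.*\.jpg$")   # regex equivalent

path = "/images/photo-2008.jpg"

# Both approaches match the same path:
print(fnmatch.fnmatch(path, wildcard))   # True
print(bool(regex.match(path)))           # True

# But a regex can express rules a single wildcard can't,
# e.g. "jpg or png" in one pattern:
flexible = re.compile(r"^/images/.*\.(jpg|png)$")
print(bool(flexible.match("/images/logo.png")))  # True
```

In short, anything the old wildcard settings could match, a regex can match too, and regexes add alternation and anchoring on top.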