Site Ripping

absynthe minded

New Member
I recently started at a new company that wants to start developing their HTML/CSS in house. All of their previous HTML development was done externally; the delivered pages were then turned into Java Struts files and the original HTML was tossed.

So I have to download the entire site from the web to set up an HTML/CSS/image repository. Unfortunately, because of the session IDs and Java logic, the site-ripping software I've tried so far often ends up downloading the same pages over and over again, dozens of times.
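
To illustrate the session-ID problem: every link the server emits carries a fresh session token, so the ripper's "have I already seen this URL?" check never matches and it fetches the same page again. A rough Python sketch of the kind of URL normalization a tool would need (the parameter names here are just guesses at what a Java/Struts site typically emits):

```python
import urllib.parse

# Assumed session parameter names; adjust to whatever the site actually uses.
SESSION_PARAMS = {"jsessionid", "sessionid", "sid"}

def canonical(url):
    """Strip session identifiers so URLs that differ only by session ID compare equal."""
    parsed = urllib.parse.urlsplit(url)
    # Drop the ;jsessionid=... path parameter used by Java servlet containers.
    path = parsed.path.split(";")[0]
    # Drop session-related query parameters.
    query = [(k, v) for k, v in urllib.parse.parse_qsl(parsed.query)
             if k.lower() not in SESSION_PARAMS]
    return urllib.parse.urlunsplit(
        (parsed.scheme, parsed.netloc, path, urllib.parse.urlencode(query), ""))

# Both of these normalize to the same URL, so a crawler that keys its
# "already seen" set on canonical(url) would not fetch the page twice.
assert (canonical("http://example.com/page.do;jsessionid=ABC123?item=5")
        == canonical("http://example.com/page.do;jsessionid=XYZ789?item=5"))
```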

I'd like something that would just let me go through, page by page, and download everything, keeping the directory structure.

Any ideas? Mac or Windows is fine, as I have both.
 

johnscott

New Member
Try http://www.httrack.com/

On a website I was building, I needed to emulate another site, but because I was working on my laptop and didn't always have an internet connection, I used this program to copy all of the site's content locally... so I could actually go in and open the individual HTML files. It also organizes the files, downloads all the CSS needed, etc.
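
For what it's worth, HTTrack can also be run from the command line, which makes it easy to script a grab. A minimal sketch of driving it from Python (the URL and output folder are placeholders, and it assumes the httrack binary is on your PATH):

```python
import subprocess

# Mirror a site into ./mirror using the httrack command-line tool.
# "+*.example.com/*" restricts the crawl to that host; -O sets the output path.
subprocess.run(
    ["httrack", "http://www.example.com/",
     "-O", "./mirror",
     "+*.example.com/*"],
    check=True,
)
```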

There may be other programs out there that do an even better job, but I think this one will work for you.

I hope this helps.
 

absynthe minded

New Member
No, what I really need is to be able to browse to a page, and then save it, with all files, preserving the directory structure. The way the site is designed is hostile to spiders.
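
Something along the lines of this would do it. Just a rough illustration in Python of the page-at-a-time approach I mean (the URL is a placeholder, and it only grabs assets referenced directly from that one page):

```python
import os
import urllib.parse
import urllib.request
from html.parser import HTMLParser

class AssetCollector(HTMLParser):
    """Collects src/href URLs for images, stylesheets, and scripts."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("href"):
            self.assets.append(attrs["href"])

def local_path(url, root="mirror"):
    """Map a URL onto a local path that mirrors the site's directory structure."""
    parsed = urllib.parse.urlparse(url)
    path = parsed.path.lstrip("/")
    if not path or path.endswith("/"):
        path += "index.html"
    return os.path.join(root, parsed.netloc, path)

def save(url, data, root="mirror"):
    dest = local_path(url, root)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    with open(dest, "wb") as f:
        f.write(data)

def rip_page(page_url):
    """Download one page plus the assets it references, keeping directory structure."""
    html = urllib.request.urlopen(page_url).read()
    save(page_url, html)
    parser = AssetCollector()
    parser.feed(html.decode("utf-8", errors="replace"))
    for ref in parser.assets:
        asset_url = urllib.parse.urljoin(page_url, ref)
        try:
            save(asset_url, urllib.request.urlopen(asset_url).read())
        except (OSError, ValueError):
            pass  # skip assets that fail to download

if __name__ == "__main__":
    rip_page("http://www.example.com/somepage.html")  # placeholder URL
```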
 