There are several ways to save a website to your local hard drive, and the right one depends on your needs. If you only want to save textual information, you can simply copy and paste the contents into a local text file on your computer. If you want to preserve the links, you need to save the page in HTML format. Most browsers can save a single page locally, but what if you need more than one page, or want the linked pages as well?
You could open every page and save it individually, but this has some disadvantages. First, there is no link structure between the saved pages: to open page 1, you have to find the index file for page 1, which is different for every page. This works for single pages, but not for entire websites or networks of pages.
Before I start with the solution, I'd like to point out some of the reasons why someone would want to save a website to a local drive:
We will be using the freeware tool HTTrack, which is available for Windows, Mac OS X, and Linux. Download it from the official website, httrack.com.
Every website that you save to your local drive is stored in a project. The first step after you have started HTTrack is to create a new project by clicking Next.
Add some basic information about the project: a name, a category, and the path where you want to save it. I suggest a drive with enough free space for all of the website's files. Please note that you cannot create a new directory from within the program itself.
This is the most important options screen for your project. You select an action and add the URLs on which to perform it. If you want to download an entire website, select Download web site(s) and add its URLs to the web address field.
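HTTrack also ships a command-line client, so the same action can be sketched in a terminal. This is a minimal example; the URL and output path are placeholders, and the exact flags are assumptions based on HTTrack's documented options.

```shell
# Hypothetical sketch of the GUI's "Download web site(s)" action.
# -O sets the project (output) directory; -w selects the mirror action.
httrack "https://www.example.com/" -O "/home/user/websites/example" -w
```

The files end up under the path given to `-O`, which corresponds to the project path chosen in the GUI.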
If you only want to download certain file types, select Get separated files. You specify the file types by clicking Set options and defining scan rules.
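On the command line, the same idea can be expressed with HTTrack's filter patterns, which play the role of the GUI's scan rules. Again a hedged sketch with placeholder URL and path:

```shell
# Hypothetical sketch of "Get separated files": -g fetches files instead
# of mirroring; the +-patterns are scan rules limiting the file types.
httrack "https://www.example.com/" -O "/home/user/websites/example-files" -g "+*.pdf" "+*.jpg"
```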
You can add URLs by typing them into the text field or by clicking Add URL. The Add URL dialog also lets you enter login information for the website you want to download. HTTrack can capture URLs through a proxy as well.
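For reference, login credentials can also be embedded in the URL itself, and a proxy can be supplied with a flag. The credentials, host names, and port below are placeholders, and the `-P` syntax is an assumption from HTTrack's option list:

```shell
# Hypothetical sketch: user:password in the URL provides the login,
# and -P routes the capture through a proxy (host:port).
httrack "https://user:password@www.example.com/" -O "/home/user/websites/example" -P proxy.example.com:8080
```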
Set options opens the project's options page, where you can specify a lot: the depth of the website scan, whether to follow external links, rules to include or exclude files and directories, and much more.
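Several of these options have command-line counterparts. The following is a hedged sketch; the flag spellings (`-r` for depth, `%e` for external depth) are assumptions from HTTrack's option reference, and the URL, path, and patterns are placeholders:

```shell
# Hypothetical sketch of common "Set options" settings:
#   -r3           limit the scan depth to 3 levels of links
#   -%e0          do not follow external links at all (external depth 0)
#   -*/private/*  exclude a directory; +*.css explicitly include stylesheets
httrack "https://www.example.com/" -O "/home/user/websites/example" -r3 "-%e0" "-*/private/*" "+*.css"
```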
The default settings download all internal pages and refuse to follow external websites. That means if you simply want to download a website, try the default settings first and take a look at the result. Note that PHP files will be saved as HTML.
Ghacks is a technology news blog that was founded in 2005 by Martin Brinkmann. It has since become one of the most popular tech news sites on the Internet, with five authors and regular contributions from freelance writers.