As an old-school Linux user, I am quick to reach for the command line before any other tool. One of the tools I use almost daily is wget. Wget is a command-line-only tool that downloads files from remote locations quickly and easily. It is a great tool to have in your toolbox, but for many people it falls short in a couple of key areas.
The first is the lack of a GUI. Yes, I realise for many that is NOT a drawback. But even though wget is about as simple a command as you can get, it is, after all, still a command. The other missing feature is multi-threading. Although wget can do useful things like run in the background (even after you log out), it downloads a file over a single connection.
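For reference, here is what those wget behaviours look like on the command line. The URL below is just a placeholder, not a real download link:

```shell
# A plain wget download uses one connection for the whole file
wget http://example.com/ubuntu-9.10-desktop-i386.iso

# -b sends the download to the background and logs progress to wget-log,
# so it keeps running even after you close the terminal
wget -b http://example.com/ubuntu-9.10-desktop-i386.iso

# -c resumes a partially downloaded file instead of starting over
wget -c http://example.com/ubuntu-9.10-desktop-i386.iso
```

Handy as those options are, there is no flag that splits one download across several connections or mirrors, which is where Multiget comes in.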
For both of these needs there is another tool that works as a pretty impressive front-end for the wget command. That tool is Multiget. Now, before you assume Multiget is a Linux-only tool, it's not. Multiget can be used on Linux, Windows, OS X, and the BSDs. This article, however, is about using Multiget on Linux (in particular, Ubuntu).
Multiget offers the following features:
Fortunately, for Ubuntu users, Multiget can be found in the Universe repository. All you have to do for installation is:
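Assuming the universe repository is enabled in your sources, the installation is the usual apt routine:

```shell
# Refresh the package lists, then install Multiget from Universe
sudo apt-get update
sudo apt-get install multiget
```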
To start up Multiget, go to the Internet sub-menu of the Applications menu, where you will find the Multiget entry. Open that and the main window will appear (see Figure 1). The first step in downloading a file should be fairly obvious: you hit the New Task button (denoted by the "+" symbol). Simple.
When you hit the New Task button, a new window will appear asking for certain information. At least in the Linux release of Multiget, a portion of this window might stump you. As you can see (in Figure 2), the Basic Info section has four bits of information to add. The MAIN URL field is obvious: you paste a URL into it. But here is the catch: if you copy a URL from your web browser by right-clicking it and selecting Copy Link Location, that URL will appear in the MAIN URL field and then be automatically moved down to the Mirrors section. This is actually the only way to get an address into the Mirrors section.
With that in mind, let's use a few mirrors to quickly download the ISO image of the latest release of Ubuntu. To do this, go to the Ubuntu Karmic page. First, on the main download page, right-click the download link and click Copy Link Location. Then go back to the download page, select a mirror, and do the same thing. You can add as many mirrors as you like; generally, the more mirrors you add, the faster the download should go.
Once you have all of your URLs added (make sure you still have a URL in the MAIN URL field), configure the rest of the window (at a minimum, set the SaveTo section) and then click OK.
As your file is downloading, click on the Progress section at the bottom and you will see how the pieces of the download are coming together. As you can see (in Figure 3), I have five threads running to download the Ubuntu 9.10 ISO image. You can also see the overall progress: the green bar directly under the icon toolbar.
During an active session you can increase or decrease the number of threads given to a download, which can speed it up. Naturally, if you give too many threads to a download, you might see the overall performance of your machine or your network connection degrade, in which case you can dial the thread count back down. To change the number of threads, go to the Task menu and select the appropriate entry.
During the download you can click on the Running section to see details about your download. You can also pause the download by clicking the Pause button.
Multiget is an outstanding tool for downloading large files using multiple mirrors and threads. It is also just an overall excellent front end for the wget command.
Ghacks is a technology news blog that was founded in 2005 by Martin Brinkmann. It has since then become one of the most popular tech news sites on the Internet with five authors and regular contributions from freelance writers.