OS X: Using curl instead of wget

Jan 15 2010

OS X does not come with wget, the command-line tool for downloading files from the web. For a while, I grumbled about this. I knew that curl was installed, but I had never actually used it from the command line. Once I tried it out, I realized that for my needs, curl is just as good as wget... and I don't have to install anything extra to get it.

Here's how to use curl to fetch a remote URL:

$ curl -OL http://spine-health.com/index.php
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 34646    0 34646    0     0  15314      0 --:--:--  0:00:02 --:--:-- 17767

This downloads the remote file and stores it locally as index.php. The -O flag tells curl to name the local file after the remote one, and -L tells it to follow any redirects. If you leave off the -O, curl writes the file to standard output (your terminal, usually).
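The progress meter you see above is curl's default terminal output. If you'd rather not see it, the -s (silent) flag suppresses it; for example, the same download without the meter:

$ curl -sOL http://spine-health.com/index.php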

curl: Remote file name has no length!

Fetching a site's root URL behaves differently. If you run the same command as above, but point it at the base URL, you get an error:

$ curl -OL http://spine-health.com/
curl: Remote file name has no length!
curl: try 'curl --help' or 'curl --manual' for more information

What's going on here? The error is not terribly informative. With -O, curl names the local file after the last part of the URL's path, and since this URL ends in a slash, there is no remote file name to use, so curl doesn't know where to write the output. A better command here is something like this:

$ curl -L http://spine-health.com/ > out.html

This time, the retrieved data will be written to the out.html file.
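If you'd rather skip the shell redirect, curl's -o flag does the same job and lets you name the output file directly:

$ curl -L -o out.html http://spine-health.com/

Either way, curl knows exactly where to write the data, so the "no length" complaint goes away.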

Like wget, curl has many options. You can read the man page for details, or you can get a quick summary with the usual curl --help command.
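Two more options worth a quick mention (the man page has the full details): -I fetches just the HTTP headers, and -C - resumes an interrupted download where it left off:

$ curl -I http://spine-health.com/
$ curl -OL -C - http://spine-health.com/index.php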