CLI Magic: Use cURL to measure Web site statistics
cURL is a handy command-line network tool whose name stands for "client for URLs," but think of it as a "copy for URLs" -- it can copy data to or from a given URL over any of ten different protocols.
Although cURL is sometimes mistaken for an updated wget, the two utilities are distinctly different tools that happen to share some features and options: wget is for downloading files from the Web, and is best used to mirror entire sites or parts of sites -- something that cURL alone can't do.
cURL's job is to copy data to or from a given set of URLs; along with HTTP it recognizes the FTP, TFTP, GOPHER, TELNET, DICT, LDAP, FILE, HTTPS, and FTPS protocols. Other features include support for proxies, forms, cookies, SSL, client-side certificates, URL globbing, and very large files. The curl command-line tool also has a counterpart library, libcurl, which lets you use cURL's functionality from within your own programs.
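To make the copy-to and copy-from idea concrete, here is a minimal sketch using the FILE protocol, which needs no network or server; the /tmp paths are just illustrative, and against a real site you would use an http:// or ftp:// URL instead:

```shell
# Create a local file to act as our "remote" resource.
printf 'hello from curl\n' > /tmp/curl_src.txt

# Copy FROM a URL: -o names the local destination file,
# -s silences the progress meter.
curl -s -o /tmp/curl_dest.txt file:///tmp/curl_src.txt
cat /tmp/curl_dest.txt   # -> hello from curl

# Copy TO a URL: -T uploads a local file to the given URL
# (with FTP or HTTP this sends the file to the server).
curl -s -T /tmp/curl_src.txt file:///tmp/curl_upload.txt
cat /tmp/curl_upload.txt
```

The same two options, -o and -T, work across the other protocols curl supports, so a download from an FTP server or an upload over HTTPS looks much the same.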
You can do a lot of neat tricks with curl. Here's a look at how you can copy to and from URLs, and then use cURL's reporting facilities to get simple Web server metrics from your operations.
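As a taste of those reporting facilities, curl's -w ("write-out") option prints per-transfer statistics after a request completes. This sketch again uses a FILE URL so it runs without a server; pointed at a real http:// URL you would also get variables such as %{http_code} and %{time_starttransfer}:

```shell
# Create a small local file to fetch, so the example works offline.
printf 'sample page\n' > /tmp/curl_metrics_demo.txt

# -o /dev/null discards the body; -w prints the metrics we ask for.
curl -s -o /dev/null \
     -w 'bytes downloaded: %{size_download}\ntotal time: %{time_total}s\n' \
     file:///tmp/curl_metrics_demo.txt
```

Run in a loop against your own server, a one-liner like this gives you a quick picture of response sizes and times without any extra tooling.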