Tips and tricks for curl and wget
Flex your command line muscles with these tricks for using curl and wget to interact with remote systems.
The Unix commands curl and wget are useful for accessing URLs without resorting to a browser. Both commands allow you to transfer data from a network server, with curl being the more robust of the two. You can use either of them to automate downloads from various servers.
1. The curl command
The curl command allows you to transfer data from a network server, but it also enables you to move data to a network server. In addition to HTTP, you can use other protocols, including HTTPS, FTP, POP3, SMTP, and Telnet. Administrators commonly rely on curl to interact with APIs using the DELETE, GET, POST, and PUT methods.
The syntax for curl is fairly straightforward at first glance. Here is an example:
$ curl http://url/help.txt
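For an API call, you can add the HTTP method with the -X option and a request body with --data. The endpoint and JSON payload below are hypothetical, included only to sketch the idea:
$ curl -X POST --header "Content-Type: application/json" --data '{"name": "example"}' https://api.example.com/items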
curl Options
You can supply various options to your command syntax:
curl [options] [url]
It is the options that make curl so robust. The following are some of the available curl options, along with examples of their use.
-a, --append
When uploading a file with --upload-file (-T), this option allows you to append to the target file instead of overwriting it (FTP, SFTP).
$ curl --upload-file file.txt --append ftp://ftp.example.com/file.txt
--connect-timeout
The --connect-timeout option sets the maximum time, in seconds, that curl is allowed to spend making its connection to the remote server. This option is handy for controlling how long the command attempts the connection: a larger value keeps a slow connection from being abandoned too quickly, while a smaller value limits how long curl keeps trying.
$ curl --connect-timeout 600 http://www.example.com/
--dns-servers
This option allows you to list the DNS servers curl should use instead of the system default. This list can be handy when troubleshooting DNS issues or if you need to resolve an address against a specific nameserver.
$ curl --dns-servers 8.8.8.8 http://www.example.com/
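If your curl build supports this option, you can also pass a comma-separated list to supply more than one nameserver; the addresses and URL here are only examples:
$ curl --dns-servers 8.8.8.8,8.8.4.4 http://www.example.com/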
--http3
You can specifically tell curl to use the HTTP/3 protocol to connect to the host and port provided in an HTTPS URL. The --http2 and --http1.1 options function in the same way and can be used to verify which protocol versions a web server supports.
$ curl --http3 https://www.example.com:8080/
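As a quick sketch of that verification idea, you could combine --http2 with the --verbose option (covered below) and check the protocol negotiation reported in the output; the URL is a placeholder:
$ curl --http2 --verbose https://www.example.com/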
--output
If you need to retrieve a file from a remote server via a URL, --output is an easy way to save the file locally.
$ curl http://www.example.com/help.txt --output file.txt
--progress-bar
This option displays the progress of the file transfer when combined with the --output option.
$ curl --progress-bar http://www.example.com/help.txt --output file.txt
--sslv2
As with the HTTP version options, you can tell curl to connect using a specific SSL version; in this case, version 2. The --ssl option requests that SSL be used, and --sslv3 specifies SSL version 3. Note: SSLv2 and SSLv3 are considered legacy by the maintainer, though the options are still available.
$ curl --sslv2 https://www.example.com/
--verbose
The --verbose option with curl is useful for debugging and displaying what is going on during the call to the URL.
$ curl --verbose http://www.example.com
2. The wget command
Unlike curl, the wget command is solely for the retrieval of information from a remote server. By default, the information received is saved with the same name as in the provided URL.
Here is an example of the basic wget syntax:
$ wget http://www.example.com/help.txt
wget Options
Like curl, you can supply various options to your wget command syntax:
wget [option] [url]
--dns-servers=ADDRESSES
You can specify one or more DNS servers for wget to use when accessing a remote server. The syntax differs from curl, however, in that the option and the nameserver addresses are joined with an =.
$ wget --dns-servers=8.8.8.8 http://www.example.com
-O
To save a file with a new name when using wget, utilize the --output-document option, or more simply, -O.
$ wget http://www.example.com/help.txt -O file.txt
--progress=type
With wget, you can supply a type (dot or bar) to determine the ASCII visualization of the progress bar. If a type is not specified, it defaults to dot.
$ wget --progress=dot http://www.example.com
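The bar type can also be combined with the -O option shown above to watch a download as it is saved under a new name; the URL and filename are placeholders:
$ wget --progress=bar http://www.example.com/help.txt -O file.txt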
Wrap up
The curl and wget commands can be very useful when added to scripts to automatically download RPM packages or other files. This post only touches on some of the most common features of these commands. Check the related man pages for a complete list of the options available for both curl and wget.
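As a minimal sketch of that scripting idea, assuming a hypothetical urls.txt file with one URL per line, you could loop over it and let curl save each file under its remote name:
$ while read -r url; do curl --output "$(basename "$url")" "$url"; done < urls.txt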