Tag Info

You could do it like this:
*/5 * * * * wget -O /dev/null -o /dev/null example.com
Here -O writes the downloaded file to /dev/null and -o sends wget's log messages to /dev/null instead of stderr, so no shell redirection is needed at all.

You can use the -d option in curl with a @- argument to accept input from a pipe. You will need to construct the key-value pairs yourself. Try this:
echo "time=`uptime`" | curl -d @- http://URL
The backticks (`) denote that the enclosed command (in this case uptime) should be executed, and the backtick-quoted text replaced with the output of the executed ...
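The same substitution in the modern $( ... ) form, which nests more cleanly than backticks (the curl call itself is left commented out since it needs a live endpoint):

```shell
# $(...) is equivalent to backticks: the enclosed command runs and its
# output is substituted in place.
payload="time=$(uptime)"
echo "$payload"
# Piping it on to the server would then look like:
#   echo "$payload" | curl -d @- http://URL
```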

The GNUTLS client tool, gnutls-cli, can also make this easy:
gnutls-cli --print-cert www.example.com \
< /dev/null \
> www.example.com.certs
The program is designed to provide an interactive client to the site, so you need to give it empty input (in this example, from /dev/null) to end the interactive session.

If you want to use the GUI, try clicking Places -> Connect to Server.... For Service Type choose Windows share, and fill out the fields like so:
Server: 192.168.1.66
Share: SharedFolder
Then download your file from the window. If you want to use a command-line interface, smbclient uses an FTP-like interface (get, put, etc.):
~$ smbclient //192.168.1.66/...

ISPs often prioritize traffic to speedtest.net so that they can brag about how fast their connections are, while in reality they don't provide that much bandwidth. They're perfectly aware that most users will only check that site for confirmation.
You also have to keep in mind that transfer speed relies both on the client and the server. In today's world most ...

I'm surprised nobody mentioned the .netrc file. First create the file if it doesn't exist and set safe permissions:
touch ~/.netrc
chmod 600 ~/.netrc
Then you can add the hostname, username and password all on one line:
echo 'machine example.com login casper password CasperPassword' >> ~/.netrc
Then when you do wget https://example.com and the ...
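A quick way to sanity-check the file afterwards (a temp path is used here so a real ~/.netrc is left alone; the hostname and credentials are the example's):

```shell
# Stand-in for ~/.netrc
netrc="$(mktemp)"
chmod 600 "$netrc"
echo 'machine example.com login casper password CasperPassword' >> "$netrc"
# Confirm the entry and the restrictive permissions
grep 'machine example.com' "$netrc"
stat -c '%a' "$netrc"    # → 600 (GNU stat)
```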

wget is not hanging. Your shell is waiting for you to enter another command, and the shell prompt is at the top of the output somewhere...
The problem is: You did not quote the URL, and it contains an ampersand. This character is used to put a process in the background, and importantly, anything following it is treated as another command line to be run.
So ...
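Quoting is the fix: once the URL is inside quotes, the shell passes the ampersand straight through to wget. A quick demonstration with echo and a made-up URL:

```shell
url='http://example.com/download?id=1&token=abc'
# Quoted, the whole URL survives, ampersand included:
echo "$url"
# Unquoted, the shell would instead run
#   echo http://example.com/download?id=1 &
# in the background and then try to execute token=abc as a command.
```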

Use a .wgetrc file (see the GNU manual), in which you can set usernames and passwords for FTP, HTTP, or both.
To use the same credentials for both, specify
user=casper
password=CasperPassword
or individually
ftp_user=casperftp
ftp_password=casperftppass
http_user=casperhttp
http_password=casperhttppass
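A sketch of putting the per-protocol variant into practice, using a temporary file and wget's WGETRC environment variable so a real ~/.wgetrc stays untouched (the credentials are the example's):

```shell
wgetrc="$(mktemp)"
cat > "$wgetrc" <<'EOF'
ftp_user=casperftp
ftp_password=casperftppass
http_user=casperhttp
http_password=casperhttppass
EOF
chmod 600 "$wgetrc"   # the credentials are stored in plain text
# wget reads this file when WGETRC points at it, e.g.:
#   WGETRC="$wgetrc" wget https://example.com/
cat "$wgetrc"
```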

You probably want to buy more disk space, but assuming you don't, you could...
pipe the tarball around rather than downloading it.
newserver# ssh olduser@oldserver "cat /path/to/tarball" | tar xf -
or if you don't have SSH access to your old server
newserver# wget -O - http://oldserver/path/to/tarball | tar xf -
or use rsync like Dennis said.
Be ...
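The streaming idea is easy to try locally; this sketch stands in a plain cat for the ssh/wget source (all paths are throwaway):

```shell
# Build a throwaway tarball, then extract it from a pipe, just as
# `ssh ... "cat /path/to/tarball" | tar xf -` would between servers.
workdir="$(mktemp -d)"
mkdir "$workdir/src" "$workdir/dest"
echo hello > "$workdir/src/file.txt"
tar -C "$workdir" -cf "$workdir/tarball.tar" src
cat "$workdir/tarball.tar" | tar -C "$workdir/dest" -xf -
cat "$workdir/dest/src/file.txt"   # → hello
```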

In addition to the other reasons posted, TCP connections don't work well with large files when the bandwidth-delay product becomes large. Like on an otherwise fast connection to a distant island.
See Wikipedia's entry on TCP tuning.
So Speedtest can dump a small file through the connection at 95 Mbit/s, but wget can only get 10 Mbit/s on a 20 MB file.
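The bandwidth-delay product makes this concrete. Assuming (illustratively) a 95 Mbit/s link with a 200 ms round trip:

```shell
# BDP (bytes) = bandwidth (bits/s) * RTT (s) / 8
echo $(( 95 * 1000000 / 8 * 200 / 1000 ))   # → 2375000 (~2.4 MB)
# A single TCP connection needs a window that big to keep the pipe
# full; with a 64 KiB window the ceiling is window/RTT instead:
echo $(( 65536 * 1000 / 200 ))              # → 327680 bytes/s
```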

I think that the problem is that wget doesn't handle IPv6 addresses well and the DNS server is returning an IPv6 address for that site. Sorry if I misunderstood your question. Try these tests:
hmontoliu@ulises:~$ wget -T10 http://www.fcc-fac.ca
--2011-07-14 16:44:34-- http://www.fcc-fac.ca/
Resolving www.fcc-fac.ca... failed: Connection timed out.
wget: unable to ...

By default wget will check for certificates in the path defined in the OpenSSL config file /etc/pki/tls/openssl.cnf (not sure whether that path is correct for FC8). Please check the OpenSSL configuration file and confirm that the paths are correct. Maybe it is OpenSSL that needs to be corrected.

If you don't care about checking the validity of the certificate just add the --no-check-certificate option on the wget command-line.
Edit:
Not checking the validity of the certificate opens you up to man-in-the-middle attacks (MiTM). Depending on the environment you're working in (over the Internet vs. a private LAN) this could be a major vulnerability. ...

There is no wget-like built-in command in Windows. You can use the .NET Framework via Windows PowerShell, as in this example:
https://superuser.com/questions/362152/native-alternative-to-wget-in-windows-powershell
or, as I do, use wget for Windows:
http://gnuwin32.sourceforge.net/packages/wget.htm

Have a look at man xargs:
-P max-procs --max-procs=max-procs
Run up to max-procs processes at a time; the default is 1. If
max-procs is 0, xargs will run as many processes as possible at
a time.
Solution:
xargs -P 20 -n 1 wget -nv < urls.txt
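The effect is easy to observe offline by substituting echo for wget (the URL list here is generated on the fly):

```shell
# Throwaway list standing in for the urls file
urls="$(mktemp)"
printf 'http://example.com/%d\n' 1 2 3 4 5 > "$urls"
# -P 4: up to four parallel processes; -n 1: one URL per invocation.
# For real downloads, replace `echo` with `wget -nv`.
xargs -P 4 -n 1 echo < "$urls"
```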

wget --reject-regex '(.*)\?(.*)' http://example.com
(--reject-type posix by default). Works only for recent (>=1.14) versions of wget though, according to other comments.
Beware that it seems you can use --reject-regex only once per wget call. That is, you have to use | in a single regex if you want to select on several patterns:
wget --reject-regex '...
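Since --reject-type is posix by default, the pattern can be tried out with grep -E before handing it to wget (the sample URLs are made up):

```shell
# URLs with a query string match the reject pattern...
echo 'http://example.com/page?sort=asc' | grep -E '(.*)\?(.*)'
# ...plain URLs do not, so wget would still fetch them:
echo 'http://example.com/page' | grep -E '(.*)\?(.*)' || echo 'no match'
```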

I spent a week studying this and finally got it working with the help I found on this page.
The command I used to connect is:
wget --ca-cert=/etc/ssl/certs/winhostname.pem --certificate=/etc/ssl/private/linuxhost.pem --private-key=/etc/ssl/private/linuxhost.key https://winhostname.home.net:8443/winhosturl.asmx

It's megabytes. Apart from anything else, if you divide the file length (231997440 bytes) by the time (4.0 s) you get the same answer (give or take).
Edit: if you merely want to change the text output for the rate, so it displays e.g. "MB" instead of just "M", it's free software; you can always recompile it yourself. But it may be worth checking if there's a ...
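The sanity check spelled out:

```shell
# 231997440 bytes over 4.0 seconds:
echo $(( 231997440 / 4 ))            # → 57999360 bytes/s, i.e. ~58 MB/s
# or ~55 MiB/s in 1024-based units -- megabytes either way, not megabits:
echo $(( 231997440 / 4 / 1048576 ))  # → 55
```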