lftp is a sophisticated command-line file transfer program. Supported protocols include FTP, HTTP, SFTP, and FISH. Its multithreaded design lets you issue and execute multiple commands simultaneously or in the background. It also features mirroring capabilities and will reconnect and continue transfers after a disconnection. If you quit the program while transfers are still in progress, it switches to nohup mode and finishes the transfers in the background. Additional protocols supported include FTP over HTTP proxy, HTTPS and FTP over SSL, and the BitTorrent protocol. IPv6 is fully supported. There are many tunable parameters, including rate limiting, a limit on the number of connections, and more.

mulk is a multi-connection command line tool to download Internet sites. It is similar to wget and cURL, but manages up to 50 parallel links. The main features are: recursive fetching, Metalink retrieval, segmented downloading, and image filtering by width and height.

phpWebHacks is an advanced HTTP client written in PHP, a utility for HTTP scripting. It simulates a real Web browser, but you use it with lines of code rather than a mouse and keyboard. It uses pure PHP; no Curl or other fancy dependencies are needed. You can build scripts in minutes: a GMail/Yahoo/MSN contacts grabber, a YouTube video downloader, an HTTP mailer, and many more.

hotpotato (or hptt, for short) is a high-performance, throughput-oriented HTTP client library for Java, with support for HTTP 1.1 pipelining. It was developed mostly for server-side use, where speed and low resource usage are the key factors, but it can be used to build client applications as well. Built on top of Netty, it is designed for high-concurrency scenarios: multiple threads can share the same client instance without any external or internal synchronization, which cuts initialization and preparation time and avoids wasting resources. Among many small optimizations, connections are reused whenever possible, which sharply reduces total request execution time by eliminating connection-establishment overhead.
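HTTP 1.1 pipelining, which hotpotato supports, can be illustrated at the wire level. The sketch below is not hotpotato's API (hotpotato is a Java library on Netty); it is a minimal Python illustration, using a hypothetical helper, of what pipelining means: several requests are written back-to-back on one persistent connection before any response is read, so only a single round of connection setup is paid.

```python
# Illustration of HTTP/1.1 pipelining at the wire level (not hotpotato's
# actual API): multiple requests are serialized into one buffer that a
# pipelining client would send over a single persistent connection, then
# read the responses back in the same order.

def build_pipelined_requests(host, paths):
    """Concatenate HTTP/1.1 GET requests for `paths` into one byte buffer."""
    requests = []
    for path in paths:
        requests.append(
            f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: keep-alive\r\n"
            "\r\n"
        )
    return "".join(requests).encode("ascii")

buffer = build_pipelined_requests("example.org", ["/a", "/b", "/c"])
# The buffer now holds three complete requests; a pipelining client would
# write it with a single socket.sendall() and parse three responses, in
# order, from the same connection.
```

A non-pipelining client would instead wait for each response before sending the next request, paying one network round trip per request on top of the transfer itself.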

Aletheia is a browser-like application for sending raw HTTP requests. It is designed for debugging and finding security issues in Web applications. Filters can be applied to every request and response to modify their content, for example to perform Basic Authentication or OAuth requests. Because every bit of the request can be modified, it is easy to probe Web applications for exploits: you can set custom Cookies or User-Agents, for example, or send file uploads to the server. It uses the Apache HTTP core components library to send HTTP requests. The application also helps you understand how the HTTP protocol works.
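The core idea behind such a tool is that every header of a request is under the caller's control. A minimal sketch of that idea, using the Python standard library rather than Aletheia itself (which is a Java application built on Apache's HTTP components), with an assumed example URL:

```python
# Sketch (Python stdlib, not Aletheia): building a request whose Cookie and
# User-Agent headers are entirely hand-crafted. Nothing is sent until the
# request is passed to urllib.request.urlopen().
from urllib.request import Request

req = Request(
    "http://example.org/login",          # hypothetical target URL
    headers={
        "User-Agent": "CustomAgent/1.0",  # spoofed client identity
        "Cookie": "session=deadbeef",     # hand-crafted session cookie
    },
)
# urllib normalizes header names to capitalized form, so the values can be
# read back with req.get_header("User-agent") and req.get_header("Cookie").
```

A browser fills these headers in for you; a raw HTTP client like Aletheia lets you set each one to an arbitrary value, which is exactly what makes it useful for security testing.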

Higgs.IO is a high-performance, message-oriented network library built for real-time systems. It provides a core extensible framework and libraries built on top of the core. Libraries include a WebSocket server, an HTTP server and client, and Boson, a custom serialization and RMI library.

res is a tiny command-line HTTP client. It lets you make HTTP calls quickly from your terminal. It is built on top of the requests library and is meant to be a command-line wrapper for requests.
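res's actual flags are not documented here, but a thin command-line wrapper over an HTTP library generally amounts to very little code. The hypothetical sketch below (not res itself, and using the standard library instead of requests to stay dependency-free) shows the shape of such a wrapper: parse a URL, method, and headers, then build the request.

```python
# Hypothetical sketch of a tiny command-line HTTP client wrapper
# (res's real options may differ; the flag names here are invented).
import argparse
import urllib.request

def build_parser():
    parser = argparse.ArgumentParser(prog="res", description="tiny HTTP client")
    parser.add_argument("url", help="URL to request")
    parser.add_argument("-X", "--method", default="GET", help="HTTP method")
    parser.add_argument("-H", "--header", action="append", default=[],
                        metavar="NAME:VALUE", help="extra request header")
    return parser

def build_request(args):
    pairs = (h.split(":", 1) for h in args.header)
    headers = {name.strip(): value.strip() for name, value in pairs}
    return urllib.request.Request(args.url, method=args.method, headers=headers)

args = build_parser().parse_args(
    ["http://example.org/", "-X", "HEAD", "-H", "Accept: text/html"]
)
request = build_request(args)  # request.get_method() == "HEAD"
# A real tool would then send it, e.g.:
#   with urllib.request.urlopen(request) as resp:
#       print(resp.status)
```

The appeal of such wrappers is that the HTTP library does all the heavy lifting; the tool only translates command-line arguments into a request object.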

RoboZombie enables easy integration with remote services by letting you replicate an endpoint contract and generate a proxy to access it.
Contracts can be very flexible in terms of the resources they access: these can vary from static HTML content or an RSS feed to a RESTful Web service endpoint.
Each endpoint contract is specified on a single interface using annotations to provide the communication metadata. It is then wired into your code via an annotation, where it is created, cached, and injected at runtime.

Corn Httpclient is an HTTP client designed to make HTTP requests from Java clients to Web servers, as plain requests or form requests. Because authentication is supported, you can make requests to secured Web resources behind proxies. It supports multiple clients making requests simultaneously with different security credentials.

Head-r recursively follows links found in (HTML) Web pages hosted on an HTTP server, and performs HEAD requests on links of interest to the user. The intended use is creating URI lists for later selective mirroring of file hosting sites.
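The two core steps of such a tool can be sketched briefly. This is not Head-r's actual implementation, only a minimal Python illustration of the idea: extract links from an HTML page, then issue HEAD requests for the interesting ones, so the headers (size, content type) are retrieved without downloading the bodies.

```python
# Rough sketch (not Head-r itself): extract links from HTML, then prepare
# HEAD requests so resources can be inventoried without fetching bodies.
from html.parser import HTMLParser
from urllib.request import Request, urlopen
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(base_url, html):
    parser = LinkExtractor()
    parser.feed(html)
    # Resolve relative hrefs against the page's own URL.
    return [urljoin(base_url, href) for href in parser.links]

def head_request(url):
    # A HEAD response carries the same headers as GET (Content-Length,
    # Content-Type, ...) but no body, which is what makes it cheap.
    return urlopen(Request(url, method="HEAD"))

html = '<p><a href="/files/a.iso">a</a> <a href="docs/b.pdf">b</a></p>'
links = extract_links("http://example.org/pub/", html)
# links == ['http://example.org/files/a.iso',
#           'http://example.org/pub/docs/b.pdf']
```

From the collected headers, a URI list can then be filtered (by size, type, or URL pattern) before any actual mirroring takes place.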