Posts Tagged ‘internet connection’

If you've been given a new dedicated server or VPS with Linux from a dedicated-server provider, and you were told that a certain download speed to the server is guaranteed, it is useful to run a simple measurement test from the console after logging in remotely to the server via SSH, just to be sure the Internet connection speed stated by the service provider is correct. Testing the connection from the terminal is useful because, as you probably know, most Linux / UNIX servers don't have a GUI interface and thus it is not possible to test Internet Up / Down bandwidth through speedtest.net.

For the download (Internet) speed test the historical approach was to just try downloading the Linux kernel source code from www.kernel.org with some text browser such as lynx or links, count the seconds until the download completes, and then divide the kernel source archive size by the seconds to get an approximate bandwidth per second. However, as nowadays Internet connection speeds are much higher, it is better to try to download some Linux distribution ISO file instead; you can still use the kernel tar archive, but it completes too fast to give you good (adequate) statistics on download bandwidth.
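A quick way to see the download speed is to fetch a big file straight to /dev/null and watch the transfer rate wget reports at the end; the URL below is only a placeholder, so point it at any large ISO on a well-connected mirror:

root@pcfreak:~# wget -O /dev/null http://some-mirror.example.com/debian-netinst.iso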

If it's a freshly installed Linux server you will probably not have the links / elinks and lynx text Internet browsers installed, so install them depending on the deb / rpm distro with:
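Something along these lines should do it (the package names are the usual ones, though they may vary slightly between distribution versions):

root@pcfreak:~# apt-get install --yes lynx links elinks

or on an rpm based distro:

root@pcfreak:~# yum install -y lynx elinks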

root@pcfreak:~# apt-cache show lftp|grep -i descr -A 12
Description: Sophisticated command-line FTP/HTTP client programs
 Lftp is a file retrieving tool that supports FTP, HTTP, FISH, SFTP, HTTPS and FTPS protocols under both IPv4 and IPv6. Lftp has an amazing set of features, while preserving its interface as simple and easy as possible.
 .
 The main two advantages over other ftp clients are reliability and ability to perform tasks in background. It will reconnect and reget the file being transferred if the connection broke. You can start a transfer in background and continue browsing on the ftp site. It does this all in one process. When you have started background jobs and feel you are done, you can just exit lftp and it automatically moves to nohup mode and completes the transfers. It has also such nice features as reput and mirror. It can also download a file as soon as possible by using several connections at the same time.

root@pcfreak:/root# apt-cache show iperf|grep -i desc -A 2
Description: Internet Protocol bandwidth measuring tool
 Iperf is a modern alternative for measuring TCP and UDP bandwidth performance, allowing the tuning of various parameters and characteristics.

To test the Upload Speed to the Internet, connect to any remote FTP server and upload a file, for example with lftp:
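A minimal example would look like this (the hostname, user and password are placeholders, and the uploaded file can be any reasonably large local file):

root@pcfreak:~# lftp -u myuser,mypassword -e "put /tmp/testfile.bin; bye" ftp.example.com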

Once you have iperf on the server, the easiest way currently to test with it is to use the serverius.net speedtest server – located at the Serverius datacenters, AS50673, running on a 10GE connection with a 5GB cap.
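A client test against it looks roughly like this (the exact speedtest hostname may differ, so check serverius.net for the currently advertised one):

root@pcfreak:~# iperf -c speedtest.serverius.net -P 4 -t 30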

You see, currently my home machine has an uplink of 30.3 Mbit/s. That's pretty nice, since I've ordered 100 Mbit/s from my ISP (unguaranteed bandwidth connection speed), and as you might know it is a standard practice for many Internet Providers to give an uplink speed of 1/4 of the overall provided bandwidth; 1/4 would be 25 Mbit/s, meaning my ISP (Bergon.NET) is doing pretty well, providing me with even more than the promised (ordered) bandwidth.

Iperf is probably the choice of most sysadmins who have to do regular bandwidth speed tests between 2 servers in local networks, or test Internet bandwidth speed on a heterogeneous network with Linux / BSDs / AIX / HP-UX (UNIXes). On HP-UX, AIX and other UNIXes for which iperf doesn't have a port, you have to compile it yourself.
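For a quick two-server test in a local network, the usual pattern is to run iperf in server mode on one box and point the client at it from the other (10.0.0.1 here is just an example address):

server-a:~# iperf -s
server-b:~# iperf -c 10.0.0.1 -t 30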

The fix to this situation is to restart Promsvijazy and restart my notebook. As this fix takes a lot of time, I found another "work around": make Windows flush its DNS servers (forget the old DNS servers and re-assign them again).
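On Windows that boils down to flushing the resolver cache and renewing the DHCP lease with the built-in ipconfig tool from the command prompt:

C:\> ipconfig /flushdns
C:\> ipconfig /release
C:\> ipconfig /renew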

Everyone who has used Linux is probably familiar with wget, or has used this handy console download tool at least a thousand times. Not so many Desktop GNU / Linux users, like Ubuntu and Fedora Linux users, have tried using wget for something more than downloading single files. Actually wget is not as popular as it used to be in the earlier Linux days. I've noticed a tendency for newer Linux users to prefer using curl (I don't know why).

With all that said, I'm sure there are plenty of Linux users curious about how a website mirror can be made through wget. This article will briefly suggest a few ways to do website mirroring on Linux / BSD, as wget is available on both of those free operating systems.

1. Most simple exact mirror copy of a website

The most basic use of wget's mirror capabilities is the -m / --mirror argument:

# wget -m http://website-to-mirror.com/sub-directory/

Creating a mirror like this is not a very good practice, as the links of the mirrored pages will still point to the external URLs. In other words, the link URLs will not point to your local copy, and therefore if you're not connected to the Internet and try to browse random links of the webpage, you will end up with many links that do not open because you don't have an Internet connection.

2. Mirroring with rewriting links to point locally and a delay between page downloads

Making a mirror with wget can put a heavy load on the remote server, as it fetches the files as quickly as the bandwidth allows. On busy servers rapid downloads with wget can significantly degrade the server response time, and on some highly loaded servers it can even cause the server to hang completely. Hence mirroring pages with wget without explicitly setting a delay between each page download could be considered by the remote server as a kind of DoS (denial of service) attack. Some site administrators have already set firewall rules, or have web server modules configured (like Apache mod_security), which filter requests from IPs doing too frequent HTTP GET / POST requests to the web server. To make wget wait 10 seconds between each mirrored page download, use:
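The command for that looks roughly like below (same example URL as before):

# wget -mk -w 10 --random-wait http://website-to-mirror.com/sub-directory/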

The -mk stands for -m / --mirror and -k, the shortcut argument for --convert-links (make links point locally); --random-wait tells wget to make random waits of between 0 and 10 seconds between each page download request.

Some websites have a robots.txt which restricts content downloading by clients like wget and curl, or even prohibits crawlers from downloading their website pages completely.

/robots.txt restrictions are not a problem, as wget has an option to disable robots.txt checking when downloading. Getting around the robots.txt restrictions with wget is possible through the -e robots=off option. For instance, if you want to make a local mirror copy of the whole sub-directory with all links, with a delay of 10 seconds between each consecutive page request, without reading the robots.txt allow/forbid rules at all:
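That would come down to something like (again using the placeholder URL):

# wget -mk -w 10 -e robots=off --random-wait http://website-to-mirror.com/sub-directory/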

Sometimes when you try to use wget to make a mirror copy of an entire site domain subdirectory or the root site domain, you get an error similar to:

Sorry, but the download manager you are using to view this site is not supported. We do not support use of such download managers as flashget, go!zilla, or getright

This message is produced by the dynamic site generation language used (PHP / ASP / JSP etc.), as the website code is written to check the browser UserAgent sent by the client. The default UserAgent wget sends to the remote webserver is: Wget/1.11.4

As this is not a common desktop browser useragent, many webmasters configure their websites to only accept the well known established desktop browser useragents sent by client browsers. Here are a few typical user agents which identify a desktop browser:
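One example is a fairly standard Firefox on Linux string (the exact version numbers don't matter much), which you can then pass to wget with its --user-agent option:

Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0

# wget -mk -w 10 -e robots=off --user-agent="Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0" http://website-to-mirror.com/sub-directory/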

6. Make a static site mirror of dynamic pages, by converting CGI, ASP, PHP etc. to HTML for offline browsing

Often website pages end in .php / .asp / .cgi … extensions. An example of what I mean is for instance the URL http://php.net/manual/en/tutorial.php. You see the URL page is tutorial.php; once mirrored with wget, the local copy will also end up in .php and therefore will not be suitable for local browsing, as the local browser doesn't know how to interpret the .php extension. Therefore, to copy a website with non-html extensions and make it offline browsable as HTML, there is the --html-extension option, e.g.:
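A sketch of such a mirror command, using the php.net manual URL from above purely as an illustration:

# wget -mk -w 10 --html-extension http://php.net/manual/en/tutorial.php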

A good practice in mirror making is to set a download rate limit. Setting such a rate is good for both sides (the local host that is downloading and the remote server). A download limit is also useful when mirroring websites consisting of many enormous files (documentary movies, some music etc.). To set a download limit add the --limit-rate= option. Passing --limit-rate=200K to wget would limit the download speed to 200KB/s.
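Combined with the mirror options from above, that would look something like:

# wget -mk -w 10 --limit-rate=200K http://website-to-mirror.com/sub-directory/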

Another useful thing to assure wget has made an accurate mirror is wget logging. To use it, pass -o ./my_mirror.log to wget.
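For example, logging the same mirror run (placeholder URL again) to a file:

# wget -mk -w 10 -o ./my_mirror.log http://website-to-mirror.com/sub-directory/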

Among the most precious commands I ever learned to use in Linux are ifconfig and route.

They have saved my life in configuring the static IP based internet of numerous Desktop Linux computers & notebooks.

Though their usage is very well known by most of the people who are into Linux, I believe it's likely that newer people who have entered the world of Linux, or some Unix system administrators, still lack the knowledge of how to manually configure their eth0 LAN card, so I thought it might be handy for someone to share it. I know that for most Unix users & admins, especially the advanced ones, this post might look funny, so if you're an advanced administrator just skip the post and don't laugh at it 😉

Now, the universal commands (which work on each and every Linux host) to manually configure a static IP Internet connection on Linux are:
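A minimal sketch of the sequence; the IP address, netmask, gateway and nameserver below are made-up examples, so substitute the values given by your ISP:

root@linux:~# ifconfig eth0 192.168.1.100 netmask 255.255.255.0 up
root@linux:~# route add default gw 192.168.1.1
root@linux:~# echo 'nameserver 8.8.8.8' >> /etc/resolv.conf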

Today I helped my cousin fix his Internet connection on a laptop. The laptop was running Vista. A real nightmare, this OS is really heavy and even messier than Windows XP. What else? I'm trying to cope with life. Life is tough. What can I say….

Also, I started a vsftpd server on a FreeBSD box; it took me some time because of configuration issues.

Right now I’mtrying to run a snort server still unsuccessfully for some reason the snort daemon does not start.

In college everything is going on in the old manner, except we have started studying Marketing II and another subject whose name I forgot; it is supposed to be something like statistics. The day was quiet, with a bit of work.