On the Smashing Magazine site today they've posted a list of powerful CLI tools - six of them, ranging from SSH tunnels to HTTP testing - that every developer should at least know about to help make their lives easier.

Good tools are invaluable in figuring out where problems lie, and can also help to prevent problems from occurring in the first place, or just help you to be more efficient in general. Command line tools are particularly useful because they lend themselves well to automation and scripting, where they can be combined and reused in all sorts of different ways. Here we cover six particularly powerful and versatile tools which can help make your life a little bit easier.

The tools they mention are all things you'd install on a Unix-based system.
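The point in the quote about combining and reusing command line tools is easy to illustrate; this little pipeline counts HTTP status codes from a made-up access-log excerpt (the data here is purely illustrative):

```shell
# Count occurrences of each status code and list the most common first --
# the kind of quick, composable one-liner the article is praising.
printf '200\n404\n200\n500\n200\n' | sort | uniq -c | sort -rn
```

Swap the `printf` for a `cut` or `awk` field extraction against a real access log and the rest of the pipeline stays the same - that reusability is exactly what makes these tools script-friendly.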

On DZone.com's Web Builder Zone today there's a new post from Eric Hogue talking about some of the tools you can use to profile your PHP application and squeeze that much more performance out of it (or maybe just find that pesky, elusive bug).

When developing web applications, we often run into performance issues. People often blame PHP or MySQL for bad performance, but in most cases the answer is not that easy. Blindly trying to optimize random parts of our applications can lead to some uneven results. There are many available tools to profile a PHP application. Learning how to use them can help us pinpoint which parts are slow. With this information we can pick the optimizations that will give us the best results.

This post describes the installation and configuration of some of them. I tested them in an Ubuntu 10.10 virtual machine. If you want to try those tools, don't forget that they can greatly impact the performance of your web server when they are active. Installing a tool like Xdebug on a production server is not recommended.
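As a rough sketch of the kind of Xdebug setup he walks through (the install route, extension path and settings below are illustrative, not taken from his post):

```shell
# Install Xdebug via PECL (one of several possible install routes).
sudo pecl install xdebug

# Then enable it in php.ini; the extension path varies by system,
# and these profiler directives are from the Xdebug 2.x era:
#
#   zend_extension = /usr/lib/php5/20090626/xdebug.so
#   xdebug.profiler_enable_trigger = 1
#   xdebug.profiler_output_dir = /tmp
```

With `profiler_enable_trigger` on, profiling only kicks in when a request includes the trigger parameter, which keeps the overhead off ordinary requests.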

First he looks at benchmarking your application with a tool called Siege, a load testing tool that can be configured to send requests to your application in lots of different ways. He also mentions Xdebug, a handy debugger, and XHProf, a profiling tool to help find the bottlenecks in your code (with XHGui to view its results).
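For reference, a basic siege run looks something like this (assuming siege is installed, and with the URL pointed at your own test box - not a production server):

```shell
# Simulate 10 concurrent users hitting a local page for 30 seconds.
siege -c 10 -t 30S http://localhost/
```

Siege then reports availability, transaction rate and response times for the run.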

Paul Jones has come back around and revisited the benchmarking setup he created, rerunning some of the baselines on a new, clean EC2 instance and posting the results to his blog. These benchmarks were run using Apache's ab, ACME's http_load and JoeDog's siege.

I thought it might be interesting to see what each of them reports for the baseline "index.html" and "index.php" cases on the new Amazon EC2 setup (using a 64-bit OS on an m1.large instance). The results follow (all are at 10 concurrent users, averaged over 5 one-minute runs).
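The "averaged over 5 one-minute runs" part of that setup is easy to script; here's a minimal sketch, with the five requests-per-second figures made up for illustration:

```shell
# Average the req/s figures collected from five benchmark runs.
printf '1210.5\n1198.2\n1225.0\n1190.8\n1204.1\n' \
  | awk '{ sum += $1 } END { printf "%.1f\n", sum / NR }'
```

In practice you'd feed in the numbers parsed out of each tool's report rather than a hard-coded list.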

In his results, he shows quite different numbers for the "absolute" requests per second that each of the tools registered. There were some differences from the previous benchmark runs that could have been caused by the updated Ubuntu version and the switch from a 32-bit to a 64-bit instance on the EC2 setup.

Paul Jones, who is obsessed (is that the right word?) with keeping benchmarks on recent versions of several popular PHP frameworks, has posted another look at a slight change in his testing method - a move away from the Apache ab tool and towards siege (a product of JoeDog Software).

It turns out that the siege tool from JoeDog Software is more accurate in reporting static HTML and PHP responsiveness. This is confirmed by Paul Reinheimer as well, who reported the expected responsiveness on other systems.

The over-reporting from ab means that all my previous reporting on benchmarks is skewed too low when comparing framework responsiveness to PHP's maximum responsiveness. As such, I have re-run all the previously published benchmarks using siege instead of ab.

Included in this new post are the "more correct" numbers as produced by siege. His baselines turned out a bit more rational and the framework statistics jumped a bit because of it, but the percentages when comparing the frameworks are still about the same.
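Those cross-framework percentages are just ratios against the index.php baseline, so they survive a change of benchmarking tool even when the absolute numbers shift. With made-up figures (baseline PHP at 1200 req/s, a framework at 360 req/s):

```shell
# Relative responsiveness: framework req/s as a percentage of the
# plain-PHP baseline. Both figures are hypothetical.
awk 'BEGIN { printf "%.0f%%\n", 360 / 1200 * 100 }'
```

If a tool over-reports both the baseline and the framework by a similar factor, this ratio stays roughly the same - which is why Paul's framework-to-framework comparisons held up even as the absolute numbers moved.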