mod_perl, freetds and environment variables???

I have an application written as CGI scripts that run under mod_perl via ModPerl::Registry (or, if anyone still uses it, Apache::Registry). It also connects to a variety of databases, including Microsoft SQL Server via freetds.

Freetds can be controlled via a freetds.conf config file or environment variables.

I liked the idea of using the distro's freetds libraries (actually, freetds provides the ct library, which is the only one DBD::Sybase cares about), and i thought i could avoid everyone fighting over the freetds.conf file by using environment variables.

Which worked well until it ran in mod_perl

Then it died horribly

Turns out that controlling a C shared library by setting environment variables from perl is tricky under mod_perl: as far as i can tell, %ENV gets tied to Apache's per-request table, so assignments made at request time never reach the real C-level environment that the library reads.
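The workaround that seems safest is to set the variables in httpd's startup environment (the envvars file on most distros), so they're already sitting in the real C environment before mod_perl loads anything. The variable names below are standard freetds ones; the host and log path are placeholders:

```shell
# in Apache's envvars file (or equivalent startup script), exported before httpd runs
export TDSVER=8.0                      # TDS protocol version for SQL Server
export TDSHOST=sqlserver.example.com   # placeholder server name
export TDSPORT=1433
export TDSDUMP=/tmp/freetds.log        # optional protocol dump for debugging
```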

Browser-based testing of your own code

For those of us engaged in developing/supporting a web application, testing seems to be an almighty pain.

The state of the art seems to be: put a massive investment into writing test cases in Tool X, which manages (sort of) Browser Y. But thanks mainly to mozilla and khtml (though really, opera and explorer are just as capable here), we can never be sure someone isn't going to come up with a great new browser.

The moment someone comes out with a new product, the whole testing routine needs to be re-written, usually completely from scratch, b/c i haven't seen a tool that automates konqueror, opera, explorer and mozilla. Certainly not one that runs on linux as well as windows. However, i think this should be possible.

First things first, the automation script should be able to start the browser and control it. X11::GUITest and Win32::GuiTest provide the basics: recognising a window and sending keystrokes and mouse clicks to it, which should be enough.

Secondly, the automation program must be able to recognise the correct page, and the correct element on that page, to send input to, whether by clicking the mouse or by tabbing.

This is where each new browser needs to provide some initialisation data. By using HTTP::Daemon, some sort of ajax mechanism, the onfocus and onclick events, and X11::GUITest/Win32::GuiTest, it should be possible to work out how many tab presses it takes to reach the first form field on the page AND the exact offset that lets you put the mouse over any desired element. Calculating each element's position can be done with offsetLeft, offsetTop and offsetParent, which i think are universally supported now.
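The tab-counting half of that idea is simple once the harness knows the page's field order (which it does, since it served the page). A minimal sketch, with made-up field names:

```perl
use strict;
use warnings;

# Given the page's focusable form fields in document order (which the
# test harness knows, since it served the page itself), work out how
# many Tab presses it takes to land on a named field from the top of
# the page. Field names here are invented for illustration.
sub tab_presses_to {
    my ($target, @fields) = @_;
    for my $i (0 .. $#fields) {
        return $i + 1 if $fields[$i] eq $target;
    }
    return;    # field not on this page
}

my @fields = qw(username password remember_me submit);
printf "Tab x %d to reach remember_me\n",
    tab_presses_to('remember_me', @fields);    # Tab x 3 to reach remember_me
```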

Once this is done, we can load the web application under test into some incarnation of HTTP::Daemon, together with our testing code, and launch the browser pointed at the daemon.

Apologies to those frame haters out there, but i think the testing code needs to create a couple of frames: one containing the web application and one containing the requests/responses from the testing code. B/c all the frames are served from the same origin (the HTTP::Daemon), it should be possible to make cross-frame queries that interrogate the web application window for the position of, say, the "Submit" button or the "Description" text box, allowing the test code to move the mouse over it and click, or just press tab the correct number of times.

Getting at the source of the current web application page is just a matter of interrogating the HTTP::Daemon for what it last served.

At this point i think i have a fair chance of implementing a pretty interesting version of the WWW::Mechanize interface, one that actually tests the css and javascript against all known (or future) browsers.

Devel::Cover and how to test inputs?

I've been experimenting with the awesome Devel::Cover, and while it's great fun to use it to improve my test suite, my code also tries very hard to validate all input sources.

Devel::Cover can tell me that the line

$a == $b ? 1 : 0

has been executed and even that both halves of the condition have been executed, but it can't tell me that i haven't tested the case where $a isn't numeric.
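One partial answer i've been playing with (just a sketch, using the core Scalar::Util module) is to make "not a number at all" an explicit branch, so it at least shows up in the coverage report as a branch that was or wasn't exercised:

```perl
use strict;
use warnings;
use Scalar::Util qw(looks_like_number);

# Compare two inputs numerically, but guard the comparison so that a
# non-numeric input becomes an explicit, coverable branch rather than
# a silent numification.
sub same_number {
    my ($x, $y) = @_;
    return undef unless looks_like_number($x) && looks_like_number($y);
    return $x == $y ? 1 : 0;
}

print same_number(3, 3), "\n";    # 1
print same_number(3, 4), "\n";    # 0
print defined same_number('cat', 3) ? "defined\n" : "undef\n";    # undef
```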

I'm testing as much as i can think of, but i'm unwilling to bet that all inputs have been checked, and i can't think of a way to guarantee it

Which is considerably annoying, as a lot of these inputs wind up in a database and databases are very unforgiving about bad input

My test suite already parses the sql ddl statements, maps them to the correct classes and runs through null,min,max,max+1,min-1,etc tests for each field but can i guarantee that i've caught every instance in a large code base?
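The boundary-value half of that is easy enough to sketch in plain perl. The type table below is a made-up subset, with ranges assuming SQL Server's smallint/int:

```perl
use strict;
use warnings;

# Map a (simplified, assumed) SQL column type to its boundary test
# values: null, min, max, and one step past each end.
my %range = (
    smallint => [ -32768,       32767 ],
    int      => [ -2147483648,  2147483647 ],
);

sub boundary_values {
    my ($type) = @_;
    my $r = $range{$type} or die "unknown type: $type";
    my ($min, $max) = @$r;
    return (undef, $min, $max, $min - 1, $max + 1);
}

my @tests = boundary_values('smallint');
print join(',', map { defined $_ ? $_ : 'NULL' } @tests), "\n";
# NULL,-32768,32767,-32769,32768
```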

Probably not

And to look at it another way, does this matter?

And why the hell am i writing test cases to improve test suite coverage (58% and climbing! woohoo!) when i could be writing more features?

Job Interview oddness

$work has been advertising for a new testing position, and i've been doing the technical-skills assessment of the prospective employees.

i've been a bit surprised when interviewees reach into their bag, pull out a folder of paper and offer to show me documents from their previous/current job.

Part of me wants to enquire "So, how do you think your previous/current employer feels about you showing their internal policies and procedures to external parties, and are you going to show us the same courtesy if/when you leave us?"

I can't quite figure out how to say that without it sounding like "you've got 10 seconds to leave the building before i call the cops you thieving $#@&!!!!!"

The other part of me is thinkin' "Maybe this is just me being paranoid/super-sensitive/dumb about trade secrets, and i should just get over myself"

Which certainly seems the case when i consult my co-workers.... i dunno, it just seems odd to me.

Thursday July 16, 2009

06:41 PM

Net::Ping, Time::HiRes and virus scanners

at $work, we've been timing tcp connections with Time::HiRes and we hit an interesting problem. Time::HiRes was returning ridiculously small times for the tcp ping, approx 1-2 milliseconds, when we knew the time should be approx 30 milliseconds.

After firing up that beautiful piece of software known as (?:ethereal|wireshark), we verified that the traffic actually took approx 30 milliseconds as expected.

turns out that the Web Shield portion of the AVG anti-virus product filters, among other things, tcp port 80, and this filtering process stuffs the Time::HiRes timings up.
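For the record, the timing itself was nothing exotic: wall-clock the connect() with Time::HiRes (both it and IO::Socket::INET are core). The sketch below connects to a local listening socket so it's self-contained; in real life the target was a remote host's port 80, which is exactly the traffic Web Shield intercepts, and presumably it answers the handshake locally, which would explain the impossibly small numbers:

```perl
use strict;
use warnings;
use Time::HiRes qw(gettimeofday tv_interval);
use IO::Socket::INET;

# Stand-in for the remote host: a local listening socket, so the
# example runs anywhere. Real measurements were against remote port 80.
my $server = IO::Socket::INET->new(
    LocalAddr => '127.0.0.1',
    Listen    => 5,
    Proto     => 'tcp',
) or die "listen: $!";

# Time just the tcp connect, the same way we timed the "ping".
my $t0 = [gettimeofday];
my $client = IO::Socket::INET->new(
    PeerAddr => '127.0.0.1',
    PeerPort => $server->sockport,
    Proto    => 'tcp',
) or die "connect: $!";
my $elapsed = tv_interval($t0);

printf "connect took %.6f seconds\n", $elapsed;
```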

Melbourne.pm and lack of sleep

Had a small meeting last night, only about 10 people managed to make it as the weather was a little chilly and damp.

we discussed and approved the idea of having group hacking sessions at a future meeting, to work on perl modules from the fail100 list.

jarich also presented a very interesting talk on preparing abstracts (which seem actually to be "short descriptions" rather than academic abstracts) for conferences.

The talk was also delivered after only 3-4 hours of sleep in the past 36. Personally, i'm fine to work physically in that state, but my mental concentration just goes to pieces, so i was impressed. I did have a bit of a chuckle at the occasional word transposition, though.

Secondly, quoting of data is somewhat different: instead of quoting data like 'foo', it becomes N'foo'.

Now finally, when connecting via DBD::Sybase over the freetds libraries, you need to define "Client Charset = UTF-8" AND use the Encode module to encode your statement as UTF-8 before passing it to DBI->prepare.
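A minimal sketch of that encoding step, minus the live database (the table and column names are made up, and the statement-building is deliberately naive; in real code you'd use placeholders where you can):

```perl
use strict;
use warnings;
use Encode qw(encode);

# Build a statement using SQL Server's N'' national-character quoting,
# then turn the whole character string into UTF-8 octets before handing
# it to DBI->prepare -- which is what the freetds
# "Client Charset = UTF-8" setting expects to see on the wire.
my $description = "caf\x{e9} \x{20ac}5";    # e-acute and a euro sign
(my $quoted = $description) =~ s/'/''/g;    # double any embedded quotes
my $sql = "INSERT INTO widgets (description) VALUES (N'$quoted')";

my $octets = encode('UTF-8', $sql);
printf "%d chars became %d bytes\n", length($sql), length($octets);
```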

Salutations to the DBD::Sybase and freetds team for their excellent work in allowing perl to talk UTF-8 (at least the UCS-2 compat portion of it) to a database that doesn't even support it.

As a side note, i think that when constructing a test suite for a program with a database backend, it's essential to test the database itself for edge cases: in this case, making sure that you can fill a varchar(10) with ten three-byte unicode characters AND retrieve it AND have the retrieved value match what you expect. Cos maybe your definitions are behaving slightly differently than you thought.
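A sketch of the kind of check i mean, minus the actual round-trip (the store and fetch would be DBI calls in real life; here the assertion is just on the data we'd send):

```perl
use strict;
use warnings;
use Encode qw(encode);

# Ten three-byte characters: U+20AC (the euro sign) encodes to 3 bytes
# in UTF-8, so this value is 10 characters but 30 bytes on the wire.
# Whether a varchar(10) holds it depends on whether the column counts
# characters or bytes -- exactly the definition worth testing against
# the real database rather than assuming.
my $value = "\x{20ac}" x 10;
my $bytes = encode('UTF-8', $value);

printf "chars: %d, bytes: %d\n", length($value), length($bytes);
# chars: 10, bytes: 30
```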

Saturday April 04, 2009

06:31 PM

Embedded databases and installable packages

I happen to like the idea of using cross database compatible SQL in my application code.

While this can produce somewhat horrific code (thank you, Sybase, for passing on to Microsoft the notion that only one query per database handle can be active at a time), most of the time i am happy with the extra-planning trade-off that allows customers to use a database they are familiar with.

However, while testing a number of database backed packages of alternative products, it became horribly apparent how vital at least support for an embedded database is.

Because all i wanted to do was experiment with each application and see how it worked and whether it matched my needs.

Oh goody. Thank you for making me find and read the sys. admin. notes on how to reconfigure the application to use my pre-installed and locked down database.

This takes time. It's annoying b/c i have to evaluate a pile of these packages. By the time i am actually looking at the working product, i already hate it for wasting my time. B/c my job is not dealing with this package. The package just helps???? me do my job

Alternatively, if the installable package came with a pre-configured and instantly available SQLite database (possibly even with sample data in it), my first experiments with the package would have left me feeling much happier.

To be fair, any embedded product would probably be fine, including Embedded MySQL (which i've never used), it just needs to not depend on the user having a completely unsecured database installation to work.

Thursday March 19, 2009

12:14 AM

Melbourne.PM and frothing hatred of msi packages

This month we had a talk from Tony Smith on cellular automata, specifically on using Golly 2.0, a new tool for this type of work. My brain melted, but it looked awesomely cool. Little spaceships running around, consumers, generators, chaos, order....

And despite previous vague approvals of the msi native packaging for win32 platforms, i now hate this mechanism. Blind frothing hatred. I do not understand how a packaging system explodes b/c a package has over 1200 files (technically components, but "best practices" is one file per component) in it.

Agreed, it's a big package. But unable to cope with it?? Forcing the user to descend into the registry (/etc for unixers) to fix the corruption manually? No helpful messages even, unless you believe that a popup window saying "Error 2908" naturally leads you to delete the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Installer\UserData\S-1-5-21-1123561945-1935655697-1060284298-1003\AD95649F068525549B26938D7D18FEA7 key.