MMilani has asked for the
wisdom of the Perl Monks concerning the following question:

I have a "SEND" button in an order form that ends a section and submits the order to the server.
Some users do not wait for the action to complete and click this button more than once in quick succession.

How can I prevent a second click on this button within the same section?

Please keep in mind that I can't hide this button, nor use more cookies, nor use a JS solution. It all has to be done in Perl.

If you can provide some sort of unique "transaction identifier" number that the server can use to check for duplicate submissions, you might be able to catch these.

In your CGI handler, for example, you would look to see whether that transaction identifier had been submitted before; if it had, you could either generate an error or simply display the result of the prior transaction.
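A minimal sketch of that check, with the identifier arriving as a form parameter. All names here are made up, and the `%seen` hash stands in for whatever persistent store (DBM file, database table) you would actually use between CGI invocations:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: decide whether a transaction ID has been seen.
# In a real CGI this hash would be tied to a DBM file or backed by a
# database so it survives between requests.
my %seen;

sub handle_submission {
    my ($txn_id) = @_;
    if ( exists $seen{$txn_id} ) {
        return "duplicate";      # error out, or replay the stored result
    }
    $seen{$txn_id} = time();     # remember it, then process the order
    return "processed";
}
```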

The problem with the second approach is that the user might have hit the emergency "Stop" button, changed something, and submitted again. In that case their change will be silently ignored, and that's not good.

What I do in this case is store the parameters of the form, and if I find that I have already received it, I compare the two sets of parameters. If they are identical there is no problem: I do not process the second form and just resend the result. If there is a difference, I send an error message. What you do in that case really depends on your application; you might want to "undo" the first transaction and process the second one.
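The compare-and-decide step above could look something like this (a sketch, not the poster's actual code; the store is again just an in-memory hash standing in for something persistent):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: store each form's parameters keyed by its
# transaction ID; on a repeat submission, compare the two sets.
my %stored;    # txn_id => { param => value, ... }

sub check_form {
    my ( $txn_id, %params ) = @_;
    if ( my $old = $stored{$txn_id} ) {
        # Same keys and same values? Then it is a harmless double-click.
        my $same = keys(%$old) == keys(%params)
            && !grep { !exists $old->{$_} || $old->{$_} ne $params{$_} }
                keys %params;
        return $same ? "resend_result" : "error_changed";
    }
    $stored{$txn_id} = {%params};
    return "process";
}
```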

Update: I forgot to mention that I store the unique identifier in a hidden parameter. This is certainly not completely secure, as the parameter can be changed or intercepted, but I believe it is OK in my case (malicious users changing the parameter would just lose the session, and outside attackers trying to hijack the session would get the error message).

Depending on how you're applying this concept, it could be broken. A user might submit the exact same parameters later in the day (e.g. order "one more" of an item they had previously ordered that day, or suchlike), or maybe even five minutes later.

You have gotten some good answers, but the most obvious, and the recommended first measure wasn't mentioned:

Tell your users to not click twice.

If you just put a little text saying "this will take a little while, please do not click more than once on this button", you will get rid of 90% of the problem, or more.

You still should have some kind of fallback in the code, as suggested and shown, but this is where you start.

I also don't really know what you mean by "nor to use more cookies"; especially the word "more" confuses me. If you can set some cookies (which the word implies), there is nothing stopping you from adding more, is there?

You have moved into a dark place.
It is pitch black. You are likely to be eaten by an ant.

It's a lovely idea, but I'm afraid you overestimate the intelligence of some web users. I used to work for a company that provided credit card processing for other companies, and one of the big problems we had was users pressing the submit button more than once. Yes, even with something as important as their own money, they would ignore explicit instructions to be patient and not submit their credit card multiple times. I kind of felt we should treat it as an idiot tax, but we had to protect the customers from themselves :-)

I have to second this... same experience with my customers. In fact, in one case my customer clicked the submit button as fast as he could just to prove the system could be broken. We ended up setting a hidden form value on the page, and if that variable was set, we would display an alert box saying "Don't do that!"

----
Zak
"There is no room in this country for hyphenated Americanism" ~ Theodore Roosevelt (1915)

Not because it will solve all of the problems. (gratuitous and predictable posts predicting the idiocy of most humanity except for us omitted here for brevity.) But it is true that many people may click twice because they aren't sure if their clicks registered. If you warn them that there might be a delay, those people who can read will exhibit more patience.

You should still implement other technical solutions (several shown here will work nicely), but you can cut down on a lot of the problems with a simple message to your users. Some of them will see it and they are the ones who deserve to be helped the most anyway.

It's okay to abandon the "us and them" mentality that many programmers and developers exhibit. The people who use our stuff are really part of "us" too. Talk to them instead of trying to be incredibly tricky and saving them from themselves silently. (On the other hand, a safety net is a good idea. And a lot of people won't listen to you when you talk to them. But you take reasonable measures to protect this latter category and then don't worry about them too much.)

..except in the case of PDA, WAP, text-mode, and browsers with Javascript disabled (such as High-Security IE, Mozilla's new Javascript function granularity, etc.)

Javascript solutions are only applicable when you can absolutely, positively, 100% verify the client configuration, i.e. locked-down intranet usage. Anything else is just a "best-guess" approach.

Rule 1: Never trust the client (config)

Your solution may work, but it isn't bulletproof. I'm accustomed to having to code my web-based work against a minimum of 11 browsers (IE, Konqueror, Galeon, Opera, Dillo, Mozilla, Netscape, Lynx, Links, Amaya, and w3m). In my case, pages must validate as XHTML 1.0 Transitional and be 100% CSS1 and CSS2 compliant. Very strict requirements, but the end result is near-perfection (and yes, the pages look good too, and I don't have to use tables or Javascript).

Another approach to this is to take a simple submit form, and send the user to a confirm form, which has an expiration set. If the user clicks Back and tries to resubmit, the confirmation form will fail. Instead of submitting directly from the main entry form, submit the values from a "Are these values correct?" type of page.
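That expiring confirmation step could be sketched with a one-shot, time-limited token embedded in the confirmation form as a hidden field. This is an illustration, not the poster's code; the `%tokens` hash stands in for a server-side store, and all names are hypothetical:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: issue a one-shot token with the confirmation
# page; accept the final submit only while the token is still valid.
my %tokens;          # token => issue time (epoch seconds)
my $TTL = 300;       # token lifetime in seconds

sub issue_token {
    my $token = sprintf "%08x%04x", time(), int( rand 0xFFFF );
    $tokens{$token} = time();
    return $token;   # embed this as a hidden field on the page
}

sub redeem_token {
    my ( $token, $now ) = @_;
    $now //= time();
    my $issued = delete $tokens{$token};    # one-shot: gone after first use
    return defined $issued && ( $now - $issued ) <= $TTL;
}
```

A Back-button resubmit fails because the token was already deleted on first use; a stale confirmation fails the TTL check.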

It sounds to me like your problem is that the CGI is not responding quickly enough to the user. I can think of at least 2 ways of presenting a "Please wait" screen to the user immediately, and then handling the expensive processing:

1. merlyn wrote up a nice solution a while ago. He forked a child process to do all the dirty work, whose output would be written to a session-specific file. The page would reload at 5-second intervals until the child process finished.

2. Alternatively, you could use the server push multipart MIME type. By printing a multipart/x-mixed-replace and then your "Please wait" page, you can then print the final resulting page whenever you're ready (from the same process).
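The server-push flow in option 2 could be sketched like this. For clarity the response is built into a string; in a real NPH CGI you would print and flush each part as you go ($| = 1) so the browser renders the waiting page before the slow work starts. All names here are made up:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: build a multipart/x-mixed-replace response that
# shows a "Please wait" page first, then replaces it with the result.
sub build_push_response {
    my ($do_work) = @_;
    my $b    = "EndOfData";
    my $resp = "Content-Type: multipart/x-mixed-replace;boundary=$b\n\n";
    $resp .= "--$b\nContent-Type: text/html\n\n"
           . "<html><body>Please wait...</body></html>\n";
    my $final = $do_work->();    # the expensive processing happens here
    $resp .= "--$b\nContent-Type: text/html\n\n"
           . "<html><body>$final</body></html>\n--$b--\n";
    return $resp;
}
```

Note that server push only works in browsers that understand multipart/x-mixed-replace, so it pairs well with a fallback like the meta-refresh approach in option 1.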

You could submit all parameters to a small script which loads very quickly, stating something like 'Processing your request' (no forms, no strings attached ;-)

Then this little script checks the parameters that come in, and if they seem valid, resubmits them (perhaps using LWP) to Yet Another Perl Script. Once this POST (or even GET) action is complete, you have your 'please hold on there' script reload to another page.
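A sketch of that relay step, validating before forwarding. The URL, parameter names, and validation rules here are all hypothetical, and LWP::UserAgent is only loaded once the parameters pass the check:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: the quick "Processing your request" script
# validates the incoming parameters, then relays them via POST to the
# real (slow) order script.
sub relay_order {
    my ( $url, $params ) = @_;
    # Basic sanity checks before forwarding anything.
    return undef
        unless $params->{item}
        && defined $params->{qty}
        && $params->{qty} =~ /^\d+$/;
    require LWP::UserAgent;
    my $ua  = LWP::UserAgent->new( timeout => 60 );
    my $res = $ua->post( $url, $params );   # re-submit the cleaned form
    return $res->is_success ? $res->decoded_content : undef;
}
```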

----
er formait hyarya.
"Life is a house and the next tornado is never far away"
"lovely by nature"
