TwitteyBot allows you to schedule tweets, and the scheduling part is an important usability element. We could build a complex web user interface that helps the user with all sorts of scheduling, but a simpler solution is possible: letting the user upload CSV files that contain the schedule of the tweets means they can use other tools like MS Excel to schedule more efficiently. We just released this feature with check-in 71, and it is already deployed on the application. The CSV file should have the following columns:

The date when it should be tweeted. This has to be in the format MM/dd/YYYY (example 12/31/2009).

The time, in the format hh:mm (example 13:15). Note that the hour should be in the 24-hour format.

The actual status message. This can be as long as you want and can be clipped using the web UI.

When using MS Excel, you can use the 'autofill' feature for scheduling. If the date or time field is empty, the date and time of the previous row are used. A video showcasing the various features is posted below.
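For example, a schedule following the column layout above might look like this (the status text is illustrative). Note how the second row leaves the date blank, so it inherits 12/31/2009 from the row before it:

```csv
12/31/2009,13:15,First scheduled tweet of the day
,13:45,Second tweet the same day
01/01/2010,09:00,Happy new year!
```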

One of the greasemonkey scripts that I wrote is a utility that modifies the Google search results page. By default, the URL of a search result changes to a Google redirect URL when the result is clicked. Hence, when you visit a website from the search engine, your path is tracked. This is good behavior for Web History, but it does not play well with copying or dragging the link. The rewrite also does not happen when the user navigates using the keyboard.
This post is about an update to that script. Apart from checking automatically for version updates, the script has been simplified a lot. Google's search page has an "onMouseDown" event attached to all search results that changes the URL. However, if the user is dragging the link as a bookmark, or to a chat session to paste it, Google's redirect URL is pasted instead of the actual page location. I am not sure why the rewrite is attached to "mouseDown"; I think it would make more sense in the "click" event.
This script adds an event handler on "mouseDown" that nullifies the link change made by the page's script. It also attaches a "click" handler that restores the link that Google wants, for Web History's sake. I have also added the scriptUpdateChecker that checks for new versions of scripts. There is some discussion about this here, and I will update it once a conclusion is reached. Watch this space for more updates.
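The idea can be sketched roughly as follows. This is illustrative, not the script's actual source: it assumes the rewritten link points at Google's `/url?q=<target>` redirect, and it glosses over the event ordering between the page's own handler and the script's.

```javascript
// Return the real target if href is a Google redirect of the
// form /url?q=<encoded-target>; otherwise return href unchanged.
function directUrl(href) {
  try {
    const u = new URL(href);
    if (u.pathname === '/url' && u.searchParams.has('q')) {
      return u.searchParams.get('q');
    }
  } catch (e) {
    // not an absolute URL; leave it alone
  }
  return href;
}

if (typeof document !== 'undefined') {
  for (const a of document.querySelectorAll('a')) {
    // On mousedown, undo the page's rewrite so that copying or
    // dragging the link yields the actual page location.
    a.addEventListener('mousedown', () => {
      a.dataset.gredirect = a.href;   // remember whatever Google set
      a.href = directUrl(a.href);
    }, true);
    // On an actual click, restore Google's redirect so that
    // Web History keeps working.
    a.addEventListener('click', () => {
      if (a.dataset.gredirect) a.href = a.dataset.gredirect;
    }, true);
  }
}
```

The `directUrl` helper is the core of it: it unwraps the redirect without needing to know anything else about the page.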

One of the ubiquity commands that I have worked on is "bookmark", which saves a page to delicious. The command was using automatically generated tags from a service that is currently dead. I stole a few minutes to quickly change the Tag Generator Yahoo pipe that was feeding the command with tags.
The Tag Generator now picks up the key terms indicated in the Yahoo BOSS search result for the web page. You can take a look at the modified pipe here. Unfortunately, the only part that did not work was the use of delicious tags. For some reason, the delicious XML element in the response disappears when the pipe hits the filter module in the source code.
In my eagerness to test, I headed to delicious. Unfortunately for me, I had just linked my account with my Yahoo ID, rendering the V1 APIs useless. The V2 APIs require OAuth, which is not yet supported directly in ubiquity. I was planning to write an OAuth library for ubiquity, but that is for later. Hence, the bookmark command is broken for now if you are using one of the newer delicious accounts. I am falling back to the built-in share-on-delicious command to get my work done. This command circumvents the requirement for OAuth by picking up the session cookie from the browser using Firefox's native Cookie Manager (Components.classes["@mozilla.org/cookiemanager;1"]). Interestingly, the AJAX call is also made by fetching the XMLHttpRequest object in a native way. Watch this space for more updates on the OAuth utility and my other ubiquity commands.
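The cookie lookup at the heart of that workaround can be sketched as a plain function. This is a hypothetical sketch, not the command's actual code: in the real command the cookie list is enumerated through the native cookie manager (Components.classes["@mozilla.org/cookiemanager;1"]), whereas here it is a plain array so the filtering logic is easy to see.

```javascript
// Find the value of a named cookie for a host (or a parent domain
// of it, since cookies are often set with a leading-dot host like
// ".delicious.com"). Returns null if the user has no such cookie,
// i.e. is not logged in.
function findCookieValue(cookies, host, name) {
  for (const c of cookies) {
    // Strip a leading dot from the cookie's host before matching.
    const cookieHost = c.host.replace(/^\./, '');
    if (c.name === name && host.endsWith(cookieHost)) {
      return c.value;
    }
  }
  return null;
}
```

The value found this way is then attached to the request to delicious, authenticating the call without OAuth.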