
Just spent an hour or so looking through the new book from Packt Publishing called "Classic Web Application Development Techniques", and it's a good read. It goes from A to Z through developing web applications on Domino using the classic form/view/agent approach. The table of contents reads like this:

Chapter 1: Preparation and Habits

Chapter 2: Design and Development Strategies

Chapter 3: Forms and Pages

Chapter 4: Navigation

Chapter 5: Cascading Style Sheets

Chapter 6: JavaScript

Chapter 7: Views

Chapter 8: Agents

Chapter 9: Security and Performance

Chapter 10: Testing and Debugging

If you're starting out in classic Domino web application development (read "non-XPages") today, I would highly recommend the book as a good, solid introduction to the topic.

A question I often get is how to perform an operation synchronously (that is, blocking) instead of asynchronously when developing plugins. The question stems from the fact that most operations are done using the Job framework, where code runs in a background thread. But what if you need the result before continuing, and the code doesn't lend itself to that approach?

I find that the easiest way (without resorting to Job scheduling rules) is to use the java.util.concurrent classes to make the calling thread wait for the Job to complete. This approach works for all job types, including Notes-based operations using NotesSessionJob.

It's been quiet around the blog for the last few months because I have been neck deep in work getting a new product ready. I'm slowly resurfacing, and as blogged about over the last few days, we (OnTime) are now shipping the latest release of the group calendar product, OnTime Group Calendar 2011. We showed off the UIs at Lotusphere 2011, but now we're shipping and ready to go.

Besides having a brand new backend with its own interesting features and performance improvements (see here), the product also ships with a brand new, all-Java Notes UI that runs full screen inside the Notes client. The client is called OnTime Group Calendar 2011 - Notes (or Notes 2011) and is a good showcase of what's possible inside the Notes client, and of why choosing Eclipse as the platform for Notes 8 was important. We no longer have to use separate clients for our UI but can run it inside Notes where it belongs. The screenshot below shows the UI running inside Notes 8.5.2.

Since the group calendar now runs full screen (a perspective, in Eclipse parlance), it's launched from the Open menu in Notes. Once opened it adds its own top-level OnTime menu and loads data using the new OnTime Group Calendar API. One of the cool things about the UI being in Java is that it does away with the traditional Notes view limitations (for instance, one document per row) and allows for some super cool, pixel-level UI drawing. It also lets us read from an API layer that abstracts the actual reading and providing of data away from the application itself, so we can reuse the API in all our UIs (Notes 2011, Discovery 2011, Web 2011, Mobile 2011 and Team-At-A-Glance 2011 (sidebar)).

The UI allows the user to switch between a day view (see above), where the user may choose to see from 1 to 7 days, a week view and a month view. The week view, for instance, gives a very nice overview of the calendars of the people you work with.

In all the views you may filter the people shown using groups and legends. Legends are what we call the types of appointments/meetings being shown. On the server you configure what places an appointment in which legend, based on category, type or a formula you specify. Once you select one or more legends, the viewer is filtered to highlight the appointments/meetings that match. Below I have chosen to only see external meetings.


Besides the cool and slick UI (if I do say so myself), we also provide some nice new functionality. If you have write access to a calendar (your own or a colleague's), you may drag'n'drop appointments in the group calendar. The screenshot below shows me dragging an appointment from Susanne to Saiful.

Notes 2011 also allows for full Lotus Sametime integration and customization using Eclipse-based extension points, but that's a topic for another day.

If you would like to try out OnTime Group Calendar 2011, you may obtain an unrestricted 30-day trial. Simply drop us an e-mail at sales@intravision.dk. We'll even be happy to offer you a 20% discount on all new licenses purchased in May or June as an introductory offer. Just tell us that you learned about OnTime on lekkimworld.com and we'll discount your purchase.

As I've tweeted, I have spent the last couple of days (and the weekend) helping out a customer that exceeded the hard 64 GB database size limit in Lotus Domino. Before discussing how we solved the problem and got the customer back in business, I would like you to think about how situations like this could be avoided. And avoiding it is key, as once you exceed the size you're doomed.

First: how and why would a database platform EVER allow a database to cross a file size that makes it break? Why doesn't Domino start to complain at 50 GB and make the warnings progressively harder to ignore as the database gets closer to 64 GB? Why doesn't it refuse new data once it reaches 60 GB? I find it totally unacceptable that a software product allows a database to exceed a size it knows it cannot handle.

Now I know that there are considerations for such a warning, and that it could be done in application code (e.g. in the database script or a QueryOpen event), but it really isn't something an application developer should have to think about. It would also need to apply to backend logic, and it really doesn't lend itself to a UI computation. I also know that DDM or similar tools could warn about it, but that doesn't change my stance. The 64 GB limit is a hard limit, and catching a database about to reach and exceed it shouldn't depend on me configuring a specific piece of functionality.

Second: having the option of keeping the view index in a location or file separate from the database would have helped. This has been brought up a number of times, including at Lotusphere Ask-The-Developers sessions. One could argue that externalizing the view index from the database would just have postponed the problem, but the view index takes up a substantial amount of disk space for databases of this size.

Now on to how we saved the data.

The bottom line is that the customer was lucky. VERY lucky. The customer uses Cisco IP telephones and keeps a replica of the database in question on a secondary server for phone number lookup using a Java servlet. Due to the way the servlet is written, only a single, very small view was built on the secondary server. This in turn meant that the database that had exceeded 64 GB on the primary server was "only" 55 GB on the secondary server. The database on the primary server was toast and gave out very interesting messages when attempting to access or run fixup on the database:

**** DbMarkCorruptAgain(Both SB copies are corrupt)

Thank God they had the secondary server; otherwise the outcome of the story would have been far less pleasant. Using the secondary server we were able to:

Take the database offline (restrict access using ACL)

Purge all view indexes (using Ytria ViewEZ)

Create a database design only copy to hold archived documents

Delete all views to avoid them accidentally being built

Build a very simple view to prepare for data archiving

Write a LotusScript agent to archive documents (copy then delete) from the database

Use Ytria ScanEZ to delete deletion stubs from the database (this works for them because the database isn't replicated to user workstations or laptops)

Do a compact to reclaim unused space

Make the database available on the primary server

Whew! They are now back in business after rebuilding the views in the database. They were lucky - VERY lucky. If they hadn't had that secondary replica, the data would probably have been lost, to much distress for them and for me.

So what are the main takeaways from this?

UI check -- in the future all databases that I develop will have a database script check on the database size to try and prevent situations like this

DAOS -- enable DAOS for databases to keep attachments out of the database and keep the size down

Monitoring -- monitor databases either using DDM or other tools to try to prevent situations like this

And so concludes a story from the field. Four days later, my hair gray from watching copy/fixup/compact progress indicators, the customer is back in business and happy once again. Whew!!

As I have been tweeting recently, I have finished coding the new OnTime Group Calendar Notes UI and we are now shipping it (OnTime Group Calendar 2011). This release is a brand new, completely rewritten product, and it ships with some very cool features and UIs. Currently we're shipping a standalone UI (Discovery 2011) and the Notes UI (Notes 2011), with a web and mobile UI coming soon (Web 2011 and Mobile 2011). My main contribution is the OnTime Group Calendar Notes 2011 client, a fully Java-based group calendar UI that runs inside the Notes 8.5 Standard client. More on the Notes 2011 client in a separate post.

Part of the OnTime Group Calendar is the backend that runs on the server. Previously we easily scaled to 100,000+ users, but you normally had to run multiple group calendar databases to control access to, and visibility of, calendar data within your organization.

This "restriction" has now been lifted with the new release of OnTime Group Calendar 2011. Now all customers run a single group calendar database, and the OnTime backend takes care of controlling access and visibility, either based on custom configuration or on mail database ACLs.

To get ready for the new release we have done a number of pilot installs, and at one customer we found some very interesting performance numbers, which I'll share below.

Metric                   | OnTime 9.x | OnTime 2011 | % of previous
------------------------ | ---------- | ----------- | -------------
Users                    | 8,900      | 8,900       | -
Storage need             | 17 GB      | 170 MB      | 1%
Number of views required | 300        | 20          | 6.6%
Document count           | 300,000    | 27,000      | 9%
At another customer we went from a group calendar database of 1.3 GB to 22 MB. That's also a reduction to less than 2% of the previous disk usage (about 1.7%).

As you can see from the above, the disk savings are massive. Smaller databases lead to less I/O, which leads to major improvements in performance. Domino as a backend screams with this kind of solution. So cool.

When introducing developers to XPages as I did yesterday the question on where to learn (more) about CSS always comes up. Therefore I was happy to see an article this morning in the developerWorks newsletter. The article is titled "Get started with CSS" and gives a good introduction to syntax and how to use and write CSS. Highly recommended article.

Continuing my XPages theme from yesterday, I also wrote an XAgent base class in JavaScript to make XAgents easier to build. Part of the base class is that it allows me to pass in a JSON object to configure the XAgent as needed. Since the information I pass in should override the built-in defaults, I needed an easy way to let the user-supplied values win. The easiest way to do this would be to use dojo.mixin.

"dojo.mixin is a simple utility function for mixing objects
together. Mixin combines two objects from right to left,
overwriting the left-most object, and returning the newly
mixed object for use."

Unfortunately Dojo isn't available in SSJS, so I needed to do it myself. Luckily it turned out to be surprisingly easy, as the code below illustrates.
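The listing itself didn't survive in this copy of the post, so here is a minimal sketch along the lines described: a mixin where right-hand properties overwrite left-hand ones, with illustrative names and defaults.

```javascript
// Minimal stand-in for dojo.mixin: copies the properties of "source"
// onto "target", overwriting existing values, and returns target.
// A sketch of the approach described above - not the original listing.
function mixin(target, source) {
  for (var key in source) {
    if (Object.prototype.hasOwnProperty.call(source, key)) {
      target[key] = source[key];
    }
  }
  return target;
}

// Defaults first, then user-supplied arguments override them.
var args = mixin({ contentType: "text/plain", status: 200 }, { status: 404 });
// args.contentType === "text/plain", args.status === 404
```

Note that properties not supplied by the caller keep their default values, which is exactly the behaviour the base class needs.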

What's super neat is that in my run-method I can now access the this._args object, and all the values that the user supplied have overridden the default values because we used the mixin function. It's an easy and flexible way to have default values while allowing the caller to override them. You also know that the variables you need have been defined, so no extra checks are necessary.

Calling my XAgent class is very easy and lets me override the defaults. The run-method accepts a function which is called once the XAgent plumbing has been set up, supplying the Writer and the URL parameters as a JSON object.
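The base class itself isn't shown in the post, so the sketch below is a hypothetical reconstruction of the calling pattern described above; the class, its defaults and the parameter handling are illustrative only, with plain stand-ins for the Domino Writer and URL parameters.

```javascript
// Hypothetical reconstruction of the calling pattern described above.
function XAgent(userArgs) {
  // Built-in defaults, overridden by whatever the caller passes in.
  var defaults = { contentType: "application/json", status: 200 };
  this._args = defaults;
  for (var key in userArgs) { this._args[key] = userArgs[key]; }
}
XAgent.prototype.run = function(callback) {
  // In a real XAgent the writer would come from facesContext and the
  // parameters from the request URL; plain stand-ins are used here.
  var output = [];
  var writer = { write: function(s) { output.push(s); } };
  var params = { format: "json" };   // pretend these came from the URL
  callback(writer, params);
  return output.join("");
};

// The caller overrides a default and supplies the function to run.
var agent = new XAgent({ status: 404 });
var body = agent.run(function(writer, params) {
  writer.write("format=" + params.format);
});
// body === "format=json", agent._args.status === 404
```

The point of the pattern is that the caller only ever writes the callback; the boilerplate (response setup, parameter parsing) lives in the base class.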

Yesterday and today I've been spending some time doing XPages coding for a customer project, and after spending quite some time on Java plugin development I'm amazed at how easy XPages are. Don't get me wrong - there is still work to do for the XPages team - but it's a joy to work with XPages, and coding in JavaScript is just - well - flexible and fun.

One of the real joys of JavaScript is its dynamic nature, which allows one to really cut down on boilerplate and repeated code. For one, I used to spend a lot of keystrokes getting field values from backend documents in server-side JavaScript. Instead of repeating the same ol' Document.getItemValueString(String) over and over again, I made a neat little shortcut. Since I needed all the items from a document, I created a utility function to JSONify a document and make it easier to access. The method is below.
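The method itself didn't survive in this copy, so here is a sketch along the lines described, assuming the standard Domino API (Document.getItems(), Item.getName(), Item.getValueString()). A small mock document is included so the function can be demonstrated outside Domino; the field names are illustrative.

```javascript
// Sketch of a document "JSONify": copy every item on the document
// into a plain JavaScript object keyed by item name.
function jsonifyDocument(doc) {
  var result = {};
  var items = doc.getItems();           // in Domino: a java.util.Vector of Item
  for (var i = 0; i < items.size(); i++) {
    var item = items.elementAt(i);
    result[item.getName()] = item.getValueString();
  }
  return result;
}

// Tiny mock document so the function can run outside Domino (illustrative).
function mockItem(name, value) {
  return { getName: function() { return name; },
           getValueString: function() { return value; } };
}
var mockDoc = {
  getItems: function() {
    var v = [mockItem("Form", "Person"), mockItem("LastName", "Heisterberg")];
    v.size = function() { return v.length; };      // mimic Vector.size()
    v.elementAt = function(i) { return v[i]; };    // mimic Vector.elementAt()
    return v;
  }
};
var json = jsonifyDocument(mockDoc);
// json.Form === "Person", json.LastName === "Heisterberg"
```

After the call, every field is available as a plain property (json.Form instead of doc.getItemValueString("Form")), which is what cuts down on the repeated calls.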

Very happy this morning to see that LotusLive Engage has been updated over the weekend. For me the most noticeable difference is that it's now possible to require a password for a meeting before hosting it. A pleasant, although loooong overdue, change.

I often explain plugin installation in Lotus Notes to clients and customers at meetings, conferences and user groups. To make it easier to understand, I've created the presentation below. Comments are more than welcome.

You should take 10 minutes out of your day to take a look at the IBM Collaboration Assessment Tool. It's a web-based diagnostic tool designed specifically to help you identify your organization's personalized path to gaining maximum value from your on-line collaboration practices. The assessment will allow you to see how your organization stacks up against industry peers. At the end of the assessment you will receive a customized report which will help you obtain actionable best practice recommendations for your collaboration strategy.