Performance optimization is somewhat like housekeeping. If you don’t take care of it regularly, the shit piles up. My history of our web performance goes back almost five years now: what started as a blazing 94 on desktop is now a 76, while mobile, which was never really good, dropped from the mid-seventies to the high-fifties. And that even though I optimize all images and write as little code as possible.

I came across these horrifying numbers because one of our main competitors relaunched recently, and while pointing at them and laughing at their numbers (52 desktop, 51 mobile) I had to admit that ours aren’t really great either.

Running the Erzgebirge-Palace through PageSpeed Insights revealed two obvious issues: the long-avoided above-the-fold render-blocking stuff and the trust seal. But that will be a different story; we will focus on the render-blocking CSS for now.

The core layout for Erzgebirge-Palace is ten years old now, and while its clean markup made responsive retrofitting quite easy, there is no build process or anything close to it. So the CSS is a little bit messy. Identifying the above-the-fold styles manually was not an option, but hey, Smashing Magazine tackled this issue two years ago already. I came across that article after I unsuccessfully tried to run a simple gulp task with the critical plugin by Addy Osmani.

While gulp just spat errors into my console, the grunt task by Ben Zörb worked right away and returned a pretty good result. I had to add some more code, as things happen on the page the plugin cannot be aware of, such as different headers during the holiday season and for different languages. But in the end it was less work than I expected, and the result now gives me 92 on desktop and 91 on mobile (though I also resolved the trust seal issue).
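For reference, a minimal Gruntfile for the grunt-critical task could look like the sketch below. The file paths and viewport dimensions are assumptions for illustration, not the values actually used for the shop:

```javascript
// Gruntfile.js — minimal grunt-critical setup (paths/dimensions are examples)
module.exports = function (grunt) {
  grunt.initConfig({
    critical: {
      front: {
        options: {
          base: './',      // resolve assets relative to the project root
          width: 1300,     // viewport considered "above the fold"
          height: 900
        },
        src: 'index.html',          // page to analyze
        dest: 'index-critical.html' // output with critical CSS inlined
      }
    }
  });

  grunt.loadNpmTasks('grunt-critical');
  grunt.registerTask('default', ['critical']);
};
```

The extra code for the seasonal and per-language headers then goes on top of whatever this task extracts.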

All numbers given are from Google PageSpeed Insights. While this tool is OK for a quick check on your site’s performance, you should rely on other tools when actually optimizing your website; my tool of choice here is WebPagetest.

// alt is added to the img tag even if it is null to prevent browsers from outputting
// the image filename as default
$image = '<img src="' . xtc_parse_input_field_data($src, array('"' => '&quot;')) . '" alt="' . xtc_parse_input_field_data($alt, array('"' => '&quot;')) . '"';

// If the filename carries a "__" size suffix, derive the path of the original file
if (preg_match("/__/", $src)) {
    $aImage = pathinfo($src);
    $sSrc = preg_replace("/(.*)__(.*)/", "$1", $aImage['filename']);
    $sRealSrc = $aImage['dirname'] . "/" . $sSrc . "." . $aImage['extension'];
}

// Pick one of five CDN hostnames based on the file size, so the same image
// always ends up on the same subdomain
if (file_exists($src)) {
    $iCDN = (filesize($src) % 5);
} elseif (isset($sRealSrc) && file_exists($sRealSrc)) {
    $iCDN = (filesize($sRealSrc) % 5);
}

if (REQUEST_TYPE != "SSL") {
    $sCDNSrc = str_replace("{i}", $iCDN, HTTP_SERVER_CDN) . "/" . $src;
} else {
    $sCDNSrc = str_replace("{i}", $iCDN, HTTPS_SERVER_CDN) . "/" . $src;
}

// alt is added to the img tag even if it is null to prevent browsers from outputting
// the image filename as default
$image = '<img src="' . xtc_parse_input_field_data($sCDNSrc, array('"' => '&quot;')) . '" alt="' . xtc_parse_input_field_data($alt, array('"' => '&quot;')) . '"';

In the unfolding discussion some folks stated that they use both JPEGmini and ImageOptim when it comes to JPEGs, and that this reduces the file size even more. Fortunately, I had to add pictures for 50 new products today. Every product has three pictures from different angles, which adds up to 150 images; in various sizes, that’s 900 images in total, all JPEG.
So I gave it a try, and here are the results (I bought JPEGmini for this test, as the 20 images on the lite version might not be enough to get reliable results; still, your mileage may vary).

‘Save for Web’

With just ‘Save for Web’ the images had a total size of 84MB. Quite heavy.

ImageOptim

Running these through ImageOptim saved nearly 10% and lowered the total size to 75.7MB.

JPEGmini

Running the ‘Save for Web’ images (not yet touched by ImageOptim) through JPEGmini results in 74.2MB.

ImageOptim after JPEGmini

Now adding another run of ImageOptim to the images already optimized by JPEGmini lowers the total size to 66.9MB. That’s more than 20% in file size saved without compromising image quality.

As far as I can tell, it is not necessary to run the tools multiple times. At least with ImageOptim, it is obvious that the latest version does this by itself, and I wasn’t able to spot any major changes on a second or third run with JPEGmini.

To give you the ability to compare the results, below are two images. The left one is the image just saved from Photoshop, the right one is the image after the whole optimization process.


Modern shop systems, and for the sake of this post I’m counting the dated xt:Commerce 3 as ‘modern’, are full of features. Features you most likely never use. We, for instance, don’t use tell-a-friend (as it’s not compatible with German law) or wishlists (as we never saw a need for them).

However, chances are good that those unused features turn out to be a bottleneck when it comes to performance. Especially when they were coded by someone who has heard the word ‘MySQL’ for the first time (i.e. the guys who wrote xt:Commerce 3).

At X-Skating we recently faced the problem that the front page loaded pretty fast, while the page for a single product had a time to first byte of almost one second (the .700 in the screenshot is better than most other tests we ran).


To give you some context, this story starts last Friday. I attended border:none, a conference for web professionals that took place in Nuremberg. My parents live only an hour north, so I took the opportunity and swung by their place. My brother is among the best German model railroaders; his work is known all over Europe and, thanks to his website, worldwide. His club ran its yearly exhibition that same weekend, and as I hadn’t seen it in action before, I went there to have a look at his new South Africa layout. Of course I took some pictures, and back at my parents’ place I uploaded them to my Dropbox. Well, at least I tried.

My parents have DSL6000, which means 6016 kbit/s down and 576 kbit/s up. However, as the line is too long, they can only use a fraction of that. It took ages to upload just one picture. I put my phone in my pocket and went upstairs to see my other brother. There is no WiFi reception upstairs, and when I unlocked my phone in the attic, the upload was incredibly fast. I was stunned. What happened? Due to the lack of a WiFi connection, my iPhone 5S ‘dropped’ to LTE, which is available in the area. LTE, however, gives me something similar to DSL16000, and all 40 images and two videos got uploaded in less than 10 minutes. You might want to cancel the DSL6000 contract now, put a SIM into your router and use LTE from there on. There is, however, one problem, and it has a name: the carriers.

The contracts offered by Deutsche Telekom (and all other carriers in Germany) don’t include a real flat rate; your bandwidth gets throttled when you hit a ridiculously low amount of transferred data. Remove this cap, Deutsche Telekom, or at least raise it to something useful; 50GB or 100GB would do for a start. Free these mobile contracts! It’s your chance! And your obligation. Make Germany fast!


I’m currently building a new ERP for our company. To make life easier for our customer service, they get direct links from within any order to the parcel tracking. We ship most of our items with DHL Germany. Unlike with DHL Express, the parcel will not be delivered by DHL in the destination country but by the local company instead. DHL provides us with a match code/UPU once the parcel reaches the destination country.

I already built a lot of icons for this, but I wanted to improve it even more for our customer service. So, where available, I tracked down the Track&Trace URLs of the carriers in the destination countries – if possible, the direct link to the result page, so our people don’t need to put the match code into another form and submit it.
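The lookup itself is simple once the list exists; a minimal sketch of the idea, where each destination country maps to a URL template with a placeholder for the match code. The carrier URLs here are purely hypothetical placeholders, not the real endpoints:

```javascript
// Map destination countries to Track&Trace URL templates.
// The hostnames below are made-up examples, not real carrier URLs.
const trackingUrls = {
  NL: 'https://example-postnl.test/track?code={code}',
  FR: 'https://example-laposte.test/track?code={code}',
};

// Build the direct link to the carrier's result page, or return null
// so the UI can fall back to a manual lookup.
function trackingLink(country, matchCode) {
  const template = trackingUrls[country];
  if (!template) return null;
  return template.replace('{code}', encodeURIComponent(matchCode));
}
```

With that in place, the ERP can render the direct result-page link next to the order instead of just the match code.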

I’d like to share this list; maybe it’s helpful to somebody else. If you have suggestions, improvements or want to add missing companies, just drop me a line.


While researching hotels, I not only rely on the HRS ratings but also look the hotel up on TripAdvisor for even more reviews. As this works pretty well, I decided to give something back to the community by rating the hotels we stayed at.

Signup is easy, as you can use your Facebook login. Reviewing is also quite easy; however, your review goes into a pending queue, more commonly known as a black hole, and this is where the fun begins. It stays there for some time. The FAQ says 24 to 48 hours; mine, however, have been in the queue for more than 72 hours each. This is quite annoying, as the reviews pile up while you write new ones, and you don’t even know if they are helpful for the community, as nobody can read them. After the first review I thought this might be because I’m new to the platform, but even with more reviews written, nothing speeds up.

But here is what really sucks: some of my reviews got deleted while pending. No word about why TripAdvisor trashed them, not even an email, and among them were reviews for hotels that have only one review to date. None of them was rude or anything like that, just honest reviews.

I wrote TripAdvisor about this five days ago but haven’t heard back yet. The contact form is also pretty hard to find, and ‘My review got deleted’ is not even an option to select as an issue.
So, all of you who rely on user-generated content: if I spend my time improving your site, let me know what’s going on! Learn from the TripAdvisor disaster!