Monthly Archives: January 2014

The QCon London conference is on in early March (5-7). It is always a conference I look forward to since it is vendor neutral, though with an agile flavour. Although it covers high scale systems it is not the place to go if you think heavyweight Enterprise middleware from a big name vendor will solve your problems. In fact, it was QCon where I heard Martin Fowler and Jim Webber from Thoughtworks expound on Does my Bus look big in this? (what a great title) in which they argue that whatever software need you have, an Enterprise Service Bus is not the best way to meet it. You are not going to hear this kind of disruptive address at vendor-driven events.

So what is on this year? There are 15 tracks, some covering the buzzwords of the day like Big Data Architecture, DevOps, Internet of Things (two different tracks on this) and Bleeding Edge HTML5 and JavaScript (with case studies from Netflix and the Financial Times). Next Gen Cloud intrigues me because it covers multi-cloud services.

Java is almost conspicuous by its absence (though it will no doubt show up throughout) but there is a track on Not Only Java which looks at performance tuning, garbage collection, lambdas and streams in Java 8, and more.

In the wake of Snowden’s revelations, Privacy and Security gets a track to itself, long overdue.

So how does QCon choose its tracks? It’s done by a committee of community experts, InfoQ CEO Floyd Marinescu told me.

“Over 15 people came together, facilitated by QCon, and through an intensive multi-week process debated and voted down our tracks until we had these final 15. We do place some constraints such as a desire to have a certain balance of tracks to cover areas of innovation of interest to different communities such as project managers, architects, engineers as well as operations people.”

What really counts of course is the speakers, and this year they include Jafar Husain (technical lead at Netflix), Eva Andreasson who pioneered deterministic garbage collection, Graham Tackley, director of architecture at Guardian News and Media, Erik Meijer who created Microsoft’s LINQ, and Joe Armstrong, co-inventor of Erlang.

Google has sold Motorola Mobility to Lenovo at some kind of loss, prompting a few quick observations.

It matters little whether Google’s Motorola transactions were profitable in themselves. Google can afford it. This is all about strategy and the long term.

Why did Google acquire Motorola Mobility? Primarily for the patents. The fact that it pushed Google into competing with its Android licensees looks now to have been an unfortunate side-effect. Google has shown no inclination to become Apple and make a virtue of controlling the entire stack from device hardware to web platform.

Why did Google sell Motorola Mobility (though not all its patents)? Maybe because it was trading at a loss, but more because there was no strategic benefit, given that it wants to foster its relationship with OEM vendors rather than undermine it.

Google is not a hardware company. It is an advertising company, but it is now more accurately described as a data company, with advertising the tax it imposes to pay for those data services.

Why does Apple remain a hardware company and not license OS X or iOS to third parties? Because it makes a virtue of controlling every detail of the user experience, and because doing so enables it to charge a premium price, since to get the software you have to buy Apple hardware (yes, there is the hackintosh, but that is not mainstream).

Why is Microsoft doing more hardware alongside Xbox, with Surface tablets, and most recently with the Nokia acquisition? Because its hand was forced. The Windows brand has been damaged by too much poor quality hardware accompanied with too much trialware put there for the OEM’s benefit (it gets paid) rather than for the user’s benefit. There was too little innovation around tablet hardware for Windows 8. There was too much designing down to a price rather than up to a standard. Hence Surface. As for Nokia, the future of Windows Phone depends on it, since it has most of the market. Microsoft could not risk Nokia turning to Android or dialling back on Windows Phone.

Should Microsoft follow Google and dispose of Surface and in due course Nokia? Maybe, but not while the strategic importance of those two businesses remains.

If Windows Phone develops such a strong ecosystem and diverse hardware base that owning Nokia is no longer necessary, then I’d guess that Microsoft would be glad to dispose of it.

What about Surface, is it still needed? The case is less clear. Some hardware partners, like Lenovo, are now doing a reasonable job with Windows 8 hardware. That might suggest that Surface has done its job. Then again, there is the Windows RT problem. Only Microsoft and Nokia/Microsoft offer current Windows RT devices; and Windows RT is strategically important as the version of Windows that is low on maintenance and high on security, like Google’s Chromebook.

Note that Microsoft has not as yet started to offer conventional laptop or desktop PCs. The implication is that its primary goal is not to compete with its hardware partners, but to do something different that will move Windows forward.

Now that Logitech has near-abandoned the Squeezebox (the one remaining player is the UE Smart Radio, and even that is not quite a Squeezebox client unless you download different firmware), existing users may be concerned for the future of the system.

Squeezebox consists of free server software which runs on a PC or NAS (Network Attached Storage) device, while the players are supplied by Logitech and controlled by a web app or smartphone/tablet app. Although more fiddly to set up than rivals like Sonos, Squeezebox is a strong choice for multi-room audio at a modest cost, and its community has come up with solutions such as support for high-resolution audio.

The latest community innovation is a project to make a Raspberry Pi into a Squeezebox client. piCorePlayer is delivered as an image file which you can write to an SD card. Pop the card into a Raspberry Pi, supply power, and it is ready to go – meaning that you need no longer worry about getting hold of a Squeezebox player.
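Writing the image to the card is the usual SD-card routine; here is a sketch for Linux users. Both the image filename and the device name are placeholders, not anything piCorePlayer specifies, so verify the real device with lsblk before running anything.

```shell
# Sketch: writing the piCorePlayer image to an SD card on Linux.
# IMG is an assumed filename and /dev/sdX is a placeholder - check the real
# device with lsblk first, because dd overwrites the target entirely.
IMG="piCorePlayer.img"
DEV="/dev/sdX"
echo "dd if=$IMG of=$DEV bs=4M conv=fsync"   # run this by hand once DEV is verified
```

The echo is deliberate: print the command, check it, then run it yourself.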

I gave this a try. It was almost very easy: my Pi booted successfully from the piCorePlayer image and was immediately recognised by my Logitech Media Server. The player supports output to the built-in audio jack, or HDMI, or a USB DAC, or an add-on DAC for the Raspberry Pi called HifiBerry.

I am using a USB DAC (Teac UD-H01) which requires a little extra configuration. I logged in to the piCorePlayer using PuTTY, and typed picoreplayer to display the configuration menu.

Configuring a USB DAC is a matter of getting a list of available ALSA devices and setting the output accordingly.

It worked, but oddly I found that FLAC in 16/44.1 format played with crackling and distortion. 24-bit files played perfectly.

The only solution I have found (though it sounds counter-intuitive) is to force output to 16-bit by adding -a 40::16 to the Squeezelite arguments. Everything now plays nicely, though limited to 16-bit – you are unlikely to notice much difference but it is a compromise.
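For reference, the resulting Squeezelite command line looks roughly like the sketch below. The binary filename within the tce directory and the ALSA device name are assumptions on my part; run squeezelite -l (or aplay -l) on the Pi to list the real device names. Only the -o and -a flags are standard Squeezelite options.

```shell
# Sketch of a Squeezelite invocation with a USB DAC forced to 16-bit output.
# -a takes buffer_time:period_count:format:mmap, so "40::16" means a 40ms
# buffer, the default period count, and 16-bit samples.
SQUEEZELITE=/mnt/mmcblk0p2/tce/squeezelite   # directory per above; filename assumed
DEVICE="front:CARD=UDH01,DEV=0"              # hypothetical ALSA name for the Teac UD-H01
ARGS="-o $DEVICE -a 40::16"
echo "$SQUEEZELITE $ARGS"
```

Substitute your own device name from the -l listing before using this in anger.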

If you try piCorePlayer, here are a few tips.

Log in with user tc, password nosoup4u.

The Squeezelite executable is stored at:

/mnt/mmcblk0p2/tce

and the settings script is at

/usr/local/sbin/settings_menu.sh

If you need to edit the configuration without the script, you can use vi, which is the only pre-installed editor I have found. Quick start with vi:

Type i to enter edit mode

Press ESC to enter command mode

Quit without saving by typing :q!

Save and quit by typing :wq

There are plenty of vi tutorials out there if you need to know more!

Finally, note that this version of Linux runs in RAM. If you make changes they will not persist unless you create a “backup” with

/usr/bin/filetool.sh -b

This is also an option in the picoreplayer menu, and must be used if you want your changes to survive.
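To sketch that persistence step as a script: on Tiny Core, filetool.sh -b archives the paths listed in /opt/.filetool.lst into a backup file (mydata.tgz) which is restored at the next boot. The guard below is only there so the sketch degrades gracefully on a machine that is not a piCorePlayer.

```shell
# Persist changes on piCorePlayer (Tiny Core Linux runs from RAM).
# filetool.sh -b writes the files named in /opt/.filetool.lst to a backup
# archive (mydata.tgz) which Tiny Core restores at boot.
if [ -x /usr/bin/filetool.sh ]; then
    /usr/bin/filetool.sh -b && MSG="backup written"
else
    MSG="filetool.sh not found - run this on the piCorePlayer itself"
fi
echo "$MSG"
```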

Microsoft’s announcement reassures existing users: “For current users of either SkyDrive or SkyDrive Pro, you’re all set. The service will continue to operate as you expect and all of your content will be available on OneDrive and OneDrive for Business respectively as the new name is rolled out across the portfolio.”

I have no strong views on whether OneDrive or SkyDrive is a better name (the reason for the change was a legal challenge from the UK’s BSkyB).

I do have views on SkyDrive, sorry OneDrive, though.

First, it is confusing that OneDrive and OneDrive for Business share the same name. I have been told by Microsoft that they are completely different platforms. OneDrive is the consumer offering, and OneDrive for Business is hosted SharePoint in Office 365. It is this paid offering that interests me most in a business context.

SharePoint is, well, SharePoint, and it seems fairly solid even though it is slow and over-complex. The Office Web Apps are rather good. The client integration is substandard though. A few specifics:

Yesterday I assisted a small business which has upgraded to full-fat Office 365, complete with subscription to the Office 2013 Windows applications. We set up the team site and created a folder, and used the Open in Explorer feature for convenient access in Windows. Next, run Word, type a new document, choose Save As, and attempt to save to that folder.

Word thought for a long time, then popped up a password dialog (Microsoft seems to love these password dialogs, which pop up from time to time no matter how many times you check Remember Me). We entered the correct credentials; it thought for a bit, then prompted again, this time with a CAPTCHA added as a further annoyance. Eventually we hit cancel out of frustration, and lo, the document was saved correctly after all.

Another time and it might work perfectly, but I have seen too many of these kinds of problems to believe that it was a one-off.

Microsoft offers another option, called SkyDrive Pro, sorry OneDrive Pro. This is our old friend Groove, also once known as Microsoft SharePoint Workspace 2010, but now revamped to integrate with Explorer. This guy is a sync engine, whereas “Open in Explorer” uses WebDAV.

Synchronisation has its place, especially if you want to work offline, but unfortunately SkyDrive Pro is just not reliable. All the businesses I know that have attempted to use it in anger, gave up. They get endless upload errors that are hard to resolve, from the notorious Office Upload Center. The recommended fix is to “clear the cache”, ie wipe and start again, with no clarity about whether work may be lost. Avoid.

One of the odd things is that there seems to be a sync element even if you are NOT using SkyDrive Pro. The Upload Center manages a local cache. Potentially that could be a good thing, if it meant fast document saving and seamless online/offline use. Instead though, Microsoft seems to have implemented it for the worst of every world. You get long delays and sign-in problems when saving, sometimes, as well as cache issues like apparently successful saves followed by upload failures.

OK, let’s use an iPad instead. There is an app called SkyDrive Pro which lets you access your Office 365 documents. It is more or less OK unless you want to share a document – one of the main reasons to use a cloud service. There is no way to access a folder someone else has shared in SkyDrive Pro on an iPad, nor can you access the Team Site which is designed for sharing documents in Office 365. Is Microsoft serious about supporting iPad users?

Office 365 is strategic for Microsoft, and SharePoint is its most important feature after Exchange. The customers are there; but with so many frustrations in trying to use Office 365 SharePoint clients other than the browser, it will not be surprising if many of them turn to other solutions.

It’s Mac anniversary time: 30 years since the first Macintosh (with 128K RAM) in 1984 – January 24th according to Wikipedia; Apple’s beautiful timeline is rather sketchy when it comes to details like actual dates or specs.

My first personal computer though was a hand-me-down Commodore PET 4032 with only 32K of RAM, which pre-dated the Mac by about 4 years (though not by the time I got hold of it).

The PET was fun because it was small enough that you could learn almost everything there was to know about it through a book called The PET Revealed that listed every address and what it did. I had a word processor called Wordcraft that was excellent, provided you could live with only having one page in memory at a time; a spreadsheet called VisiCalc that was even better; and a database that was so bad that I forget its name. You could also play Space Invaders using a character-based screen; the missiles were double-dagger (ǂ) characters.

The small company that I was a little involved with at the time migrated to Macs almost as soon as they were available so I had some contact with them early on. The defining moment in my personal computer history though was when I needed to buy a new machine for a college course. What would it be?

If all the choices had cost the same, I would have purchased a Mac. My second choice, since this was a machine for work, would have been a PC clone. Both were expensive enough that I did not seriously consider them.

Instead, I bought a Jackintosh, sorry an Atari ST, with a mono 640 x 400 monitor and a second disk drive. It had the GEM graphical user interface, 512K RAM, a Motorola 68000 CPU, and built-in MIDI ports making it popular with musicians.

The ST exceeded expectations. Despite being mainly perceived as a games machine, there were some excellent applications. I settled on Protext and later That’s Write for word processing, Signum for desktop publishing, Logistix for spreadsheets, Superbase for database, the wonderful Notator for messing around with MIDI and music notation, and did some programming with GFA Basic and HiSoft C.

If I had had a Mac or PC, I would have benefited from a wider choice of business applications, but lost out on the gaming side (which I could not entirely resist). The ST had some quirks but most things could be achieved, and the effort was illuminating in the sense of learning how computers and software tick.

Despite the Mac-like UI of the Atari ST, my sense was that most Atari owners migrated to the PC, partly perhaps for cost reasons, and partly because of the PC’s culture of “do anything you want” which was more like that of the ST. The PC’s strength in business also made it a better choice in some areas, like database work.

I was also doing increasing amounts of IT journalism, and moving from ST Format to PC Format to Personal Computer World kept me mainly in the PC camp.

For many years though I have found it important to keep up with the Mac, as well as using it for testing, and have had a series of machines. I now have my desktop set up so I can switch easily between PC and Mac. I enjoy visiting it from time to time but I am not tempted to live there. It is no more productive for me than a PC, and Microsoft Office works better on a PC in my experience (no surprise) which is a factor. I miss some favourite utilities like Live Writer, dBpoweramp, and Foobar 2000.

That said, I recognise the advantages of the Mac for many users, in terms of usability, design, and fewer annoyances than Windows. Developers benefit from a UNIX-like operating system that works better with open source tools. There is still a price premium, but not to the extent there was when I picked an Atari ST instead.

Ear buds are massively popular, but most do not sound that good. Tinny bass and splashy treble is nothing unusual. They can sound good though. At CES I heard a couple of true high-end in-ear headsets, Shure’s SE 846 ($999) and Audiofly’s AF180 ($549); I especially liked the AF180 and wrote about it here.

But how about Om’s INEARPEACE at a mere $149? No, they are not the equal of the AF180s, but at one third the price they are delightful, musical, smooth, clear and with actual bass.

Om Audio is a company with some personality – “listening to music should be a sacred experience,” says the website, and that is reflected in the packaging, with the ear buds embedded in the side of a foam inner container.

You get a set of ear buds with an inline controller and microphone for a smartphone, a smart zipped bag, and a packet of ear tips in various sizes.

The ear buds themselves have a distinctive design, with a cylindrical body. The cable is flat and supposedly hard to tangle.

Within each ear bud are two drivers, a balanced armature driver for treble and mid-range, and a 10mm moving-coil driver for bass.

The INEARPEACE ear buds are aimed at those in search of better audio quality than the average in-ear headset, and they deliver. Listen to these and you will not want to go back to the set that came free with your phone. There is adequate treble, but no sign of the shrillness that characterises so many ear buds. The bass is not overpowering, but it is clean and reasonably extended, making music more balanced, rhythmic and enjoyable.

I am not going to get too carried away; these are not the last word in sound quality. There are others to consider in the price range $75 – $150. These are more than decent though, and their musical sound and elegant construction win them a recommendation.

Microsoft has announced record revenue for its second financial quarter, October-December 2013. Revenue was bumped up by the launch of Xbox One (3.9 million sold) and new Surface hardware. The real stars though were the server products.

Here is what is notable. Looking at these figures, Microsoft’s cash cow is obvious: licensing server products, Windows and Office to businesses, which is profitable almost to the point of disgrace, with a gross margin of $10,077 million on sales of $10,888 million. Microsoft breaks this down a little. Hyper-V has gained 5 points of share, it says, and Windows volume licensing is up 10%.

Cloud (Office 365, Azure, Dynamics CRM online) may be growing strongly, but it is a sideshow relative to the on-premises licensing.

How do we reconcile yet another bumper quarter with the Microsoft/Windows is dead meme? The answer is that it is not dead yet, but the shift away from the consumer market and the deep dependency on on-premises licensing are long-term concerns. Microsoft remains vulnerable to disruption from cheap and easy to maintain clients like Google’s Chromebook, tied to non-Microsoft cloud services.

Nevertheless, these figures do show that, for the moment at least, Microsoft can continue to thrive despite the declining PC market, more so than most of its hardware partners.

Postscript: Microsoft’s segments disguise the reality of its gross margins. The cost of “licensing” is small but it is obvious from its figures that Microsoft is not including all the costs of creating and maintaining the products being licensed. If we look at the figures from a year ago, for example, Microsoft reported a gross margin of $2,121 million on revenue of $5,186 million for Server and Tools. That information is no longer provided and, as far as I can tell, we can only guess at the cost per segment of its software products. However, looking at the income statements, you can see that overall Microsoft spent $2,748 million on Research and Development, $4,283 million on Sales and Marketing, and $1,235 million on General and Administrative in the quarter.

At CES Audio-Technica showed off its new range of ear buds, sorry “in-ear headphones”, including this budget SonicFuel ATH-CKX5iS model, at a recommended price of $49.95 and including an in-line mic, answer button and volume control for use with smartphones. While towards the low-end, it is by no means the cheapest in the Audio-Technica range, which starts at just $14.95 for the ATH-CLR100.

The distinctive feature of the SonicFuel range is the C-tip earpieces which have a short curved arm that fits snugly in the ear. The ear tips also rotate so that they angle themselves to the shape of your ear. The result is an exceptionally snug fit, and ear buds that are less likely to fall out when you are on the go. Three sizes of C-tips and four sizes of ear tips are supplied. It does pay to take some time selecting the right size, and journalists attending CES were fortunate to have assistance from an expert fitter.

The truth is that the sound you hear from ear buds does vary substantially according to how snugly they fit, and while not everyone can get a personal fitting at CES, it is essential to fit them correctly to get the best results. Fitting the tips to the ear buds is slightly fiddly, but you only have to do this once.

The in-line controller has a sliding volume control (a mixed blessing as you can accidentally slide it down and wonder where the volume has gone), a microphone and an answer button.

The headset is supplied with a handy bag for your headset and the spare gels.

So how do they sound? The biggest problem is that bass is lacking and the sound overall is thin. Slight sibilance can be annoying on some material. Tonally they are bright rather than warm, though not unpleasantly so. The best thing I can say is that they are inoffensive.

The specifications show an amazing frequency response of 15-22,000 Hz which is hard to reconcile with the puny bass, but since no +/- dB range is shown I guess this does not mean much.

Summary: I love the C-tips and the snug, strong fit; but the sound is a let-down. Possibly going a little further up the SonicFuel range would be worthwhile, though these are the only ones I have heard.

Nokia has released its fourth quarter results for 2013. They make odd reading because of the division into “Continuing operations” and “Discontinued operations”, the latter including the mobile phone business which has been acquired by Microsoft. This tends to cloud the key point of interest for some of us, which is how Windows Phone is faring in the market.

The answer seems to be that sales slightly declined, though it is not clear. Here is what we know.

Mobile phone revenue overall declined by 29% year on year and by 5% quarter on quarter, for the quarter ending December 2013.

Disappointing; though in mitigation Lumia (ie Windows Phone) sales volume in 2013 overall is said to be double that in 2012.

We do know that much of Lumia’s success is thanks to the introduction of low-end devices such as the Lumia 520. That has been good for building market share, but not so good for app sales or mind share – on the assumption that purchasers of high-end devices are more likely to spend on apps, and that aspirational devices have a greater influence on mind share than cheap ones.

That does mean though that units might have gone up even though revenue has fallen.

Still, the results do put a dampener on the theory that Windows Phone is taking off at last.

This is a moment of transition following the Microsoft acquisition. Microsoft has not got a good track record with acquisitions, and the Danger/Kin disaster is hard to forget, but Nokia comes with an influential executive (Stephen Elop) and common sense would suggest that the team which created excellent devices like the Lumia 1020, and which was able to engineer strong budget offerings like the 520, should be kept together as far as possible. Or will it be dragged into the mire of Microsoft’s notorious internal politics? Over to you Microsoft.

Update: it is now reported that Lumia sold 8.2m devices in Q4, down from 8.8m in Q3 but up from 4.4m in the same quarter 2012.

David Sobeski, former Microsoft General Manager, has written about Trust, Users and The Developer Division. It is interesting to me since I recall all these changes: the evolution of Microsoft C++ from Programmer’s Workbench (which few used) to Visual C++ and then Visual Studio; the original Visual Basic, the transition from VBX to OCX; DDE, OLE and OLE Automation and COM automation, the arrival of C# and .NET and the misery of Visual Basic developers who had to learn .NET; how DCOM (Distributed COM) was the future, especially in conjunction with Transaction Server, and then how it wasn’t, and XML web services were the future, with SOAP and WSDL, and then they weren’t, because REST is better; the transition from ASP to ASP.NET (totally different) to ASP.NET MVC (largely different); and of course the database APIs, the canonical case for Microsoft’s API mind-changing, as DAO gave way to ADO gave way to ADO.NET, not to mention various other SQL Server client libraries, and then there was LINQ and LINQ to SQL and Entity Framework and it is hard to keep up (speaking personally I have not yet really got to grips with Entity Framework).

There is much truth in what Sobeski says; yet his perspective is, I feel, overly negative. At least some of Microsoft’s changes were worthwhile. In particular, the transition to .NET and the introduction of C# was successful and it proved a strong and popular platform for business applications – more so than would have been the case if Microsoft had stuck with C++ and COM-based Visual Basic forever; and yes, the flight to Java would have been more pronounced if C# had not appeared.

Should Silverlight XAML have been “fully compatible” with WPF XAML as Sobeski suggests? I liked Silverlight; to me it was what client-side .NET should have been from the beginning, lightweight and web-friendly, and given its different aims it could never be fully compatible with WPF.

The ever-expanding Windows API is overly bloated and inconsistent for sure; but the code in Petzold’s Programming Windows mostly still works today, at least if you use the 32-bit edition (1998). In fact, Sobeski writes of the virtues of Win16 transitioning to Win32s and Win32 and Win64 in a mostly smooth fashion, without making it clear that this happened alongside the introduction of .NET and other changes.

Even Windows Forms, introduced with .NET in 2002, still works today. ADO.NET too has been resilient, and if you prefer not to use LINQ or Entity Framework then concepts you learned in 2002 will still work now, in Visual Studio 2013.

Why does this talk of developer trust then resonate so strongly? It is all to do with the Windows 8 story, not so much the move to Metro itself, but the way Microsoft communicated (or did not communicate) with developers and the abandonment of frameworks that were well liked. It was 2010 that was the darkest year for Microsoft platform developers. Up until PDC in October, rumours swirled. Microsoft was abandoning .NET. Everything was going to be HTML or C++. Nobody would confirm or deny anything. Then at PDC 2010 it became obvious that Silverlight was all but dead, in terms of future development; the same Silverlight that a year earlier had been touted as the future both of the .NET client and the rich web platform, in Microsoft’s vision.

Developers had to wait a further year to discover what Microsoft meant by promoting HTML so strongly. It was all part of the strategy for the tablet-friendly Windows Runtime (WinRT), in which HTML, .NET and C++ are intended to be on an equal footing. Having said which, not all parts of the .NET Framework are supported, mainly because of the sandboxed WinRT environment.

If you are a skilled Windows Forms developer, or a skilled Win32 developer, developing for WinRT is a hard transition, even though you can use a familiar language. If you are a skilled Silverlight or WPF developer, you have knowledge of XAML which is a substantial advantage, but there is still a great deal to learn and a great deal which no longer applies. Microsoft did this to shake off its legacy and avoid compromising the new platform; but the end result is not sufficiently wonderful to justify this rationale. In particular, there could have been more effort to incorporate Silverlight and the work done for Windows Phone (also a sandboxed and touch-based platform).

That said, I disagree with Sobeski’s conclusion:

At the end of the day, developers walked away from Microsoft not because they missed a platform paradigm shift. They left because they lost all trust. You wanted to go somewhere to have your code investments work and continue to work.

Developers go where the users are. The main reason developers have not rushed to support WinRT with new applications is that they can make more money elsewhere, coding for iOS and Android and desktop Windows. All Windows 8 machines other than those running Windows RT (a tiny minority) still run desktop applications, whereas no version of Windows below 8 runs WinRT apps, making it an easy decision.

Changing this state of affairs, if there is any hope of change, requires Microsoft to raise the profile of WinRT among users more than among developers, by selling more Windows tablets and by making the WinRT platform more compelling for users of those tablets. Winning developer support is a factor of course, but I do not take the view that lack of developer support is the chief reason for lacklustre Windows 8 adoption. There are many more obvious reasons, to do with the high demands a dual-personality operating system makes on users.

That said, the events of 2010 and 2011 hurt the Microsoft developer community deeply. The puzzle now is how the company can heal those wounds but without yet another strategy shift that will further undermine confidence in its platform.