Collection of thoughts about what I have seen and experienced in the areas of technology and international business.

I am thinking about the future of the media industry and what is happening to the entertainment and media mix. eBooks are outselling paperbacks, cable TV is losing subscriptions to streaming services, hard-copy purchases of music and video recordings are losing ground to digital purchases, and newspaper subscriptions have been declining for the past 14 years even though the number of households has been growing. But there are winners too. Radio advertising revenues have been growing since 2010, and box office ticket sales are bigger than ever! The US entertainment and media industry overall is growing at a faster pace than GDP. The media pie is not becoming smaller, but rather evolving in its ability to tell stories. Making content digital, mobile accessible and available for real-time streaming, and allowing consumers to buy increments with micro-payments, are just foundational elements that will enable transmedia storytelling, or cross-media integration.

Cross-media integration is being adopted by large corporations. Unilever’s Susan Glenn campaign for Axe is an example of the potential of cross-media integration, gaining more than double the brand recognition of its main competitor. Sprint’s Does Your Phone Dream When It Is Shut campaign is edgy, mobile, interactive and data driven. Both examples are media rich, but only mildly integrated. We are only seeing the tip of the integration iceberg.

From a technology enablement perspective, transmedia storytelling requires:

Both business and content management supported by powerful analytics and compute elasticity

Concept development and resource management

Content acquisition, licensing and management

Content planning, scheduling and quality control

Advertising sales and scheduling

E-Commerce and billing

Automation, master control and digital platform

Digital delivery

To the audience it is all about the content experience, which requires a large subscriber base, a comprehensive media mix and a rich ecosystem of multi-form-factor devices to enable participation.

It has been a while since I’ve reflected on the economy. The last couple of years have seen steady growth, but mostly in stock prices through corporate stock buybacks. The real economy has grown at a far slower pace. One might say that for this reason current stock prices are inflated. However, the bull run seems to continue. November and December have historically seen increases in stock prices, but what will happen when consumer buying drops in Q1FY15?

The second half of 2015 will start being affected by the 2016 Presidential Elections, which typically means that all legislation slows, and until the election is done and the winner announced, there will be a long period of uncertainty. Even if a correction happens in the first half, the uncertainty will slow down the rebound.

The European economy has slowed down for multiple reasons. Some are structural and others relate to the uncertainty with Russia. Germany is the largest consumer of Russian gas and could cut the pipe off, buying its energy from other markets, but many of the other European nations do not have that strength. The formal sanctions on Russia have mostly been a nuisance. The real penalty comes from the US test-selling oil from its strategic reserves and pressuring its oil-producing allies to add supply to the market. The Russian economy is based almost solely on energy sales, and the drop in energy prices is crippling it more than any sanctions could. In many ways Russia’s renewed military capability and ambition are not aligned with its economic base. A military is only mighty if it has a strong economy to sustain its commitment.

As I look at the US economy I see a slowdown in the fundamentals, regardless of the stock markets. The housing comeback has stalled and even taken a step back. Unemployment gains are regional, with net winners and losers. Middle-class pay raises have been on hold for years. When we talk of the trickle-down effect, the theory is predicated on the notion that there actually is a trickle, and that the trickle is large enough to fuel consumer spending, which in turn drives the economy. We now have an engine coughing on a constricted fuel line. What we are currently doing is consolidating wealth at the top with negligible trickle, and it doesn’t take a rocket scientist to figure out that the model will not work for long and the engine will stall. The power to make the engine turn again comes from the battery reserves, not the fuel tank. We are all proverbially in the same car, and the current strategy is not optimal from a distance-travelled perspective, even though the MPG reading might look good now.

Chairman Yellen recently spoke about her concerns over the wealth gap. The rise in stock prices through stock buybacks has made the wealthy even wealthier. At some point the wealthiest will cash out, which will cause a strong market correction. The stock held by the middle class is slower to react and will again carry the brunt of the blow. With a correction the gap will become even wider. The middle class will lose in the correction, and the wealthy will reinvest at the bottom, reaping the gains of the next cycle. Sometimes the correction is uncoordinated, and that is when we have a depression and even the wealthy lose.

Where the current government has failed is in outlawing inversions, limiting stock buybacks, simplifying the tax code, enabling the repatriation of wealth and reducing corporate taxes to be more in line with the global average. All these actions would increase the trickle effect, as capital would be forced back to work at home. The administration talks about supporting the middle class, but it has been largely words. We fixed the housing catastrophe on the taxpayers’ dime, but we did not fundamentally change anything. The nation needs leadership with long-term vision and the willingness to make hard short-term sacrifices. We need structural change badly. I am more republican than democratic in my views, but we need enlightened, strong leadership, which we’ve been lacking for too long.

Next generation is a topic that often comes up in a major technology transformation. Rather than put the same old product on a new platform, many elect to start from a fresh slate, asking themselves: what can I do in this new technology paradigm, and more importantly, what will my customers need in the future?

We are often faced with having to explain to our customers why they should be interested in the new infrastructure option, and it is a very valid question. The benefit may be lower cost or generally better performance, but those arguments do not differentiate against any other solution that has made the same infrastructure decisions. Justification based on cost alone often doesn’t serve to maintain strong product margins either. When we fully embrace and incorporate the potential of a new platform or paradigm, we are no longer forced to compare the old versus the new, as the new offers business benefits that were not technically feasible or economically viable with the old.

Cloud computing is an example of a technology transformation that can have significant next generation implications. Migrating an existing on-premise solution onto cloud infrastructure may offer cost and performance benefits, but it will not differentiate. Starting from a clean slate and fully embracing Platform-as-a-Service, and what the platform and its ecosystem can enable in a next generation sense, will lead to more compelling value promises and true thought leadership. Cloud in itself isn’t transformational. Virtualization and Software-as-a-Service licensing models have been around for quite a while. It is what we do on the cloud that can be.

Big data is one of those terms that really means nothing, or pretty much anything. In multi-tenant environments we generate vast quantities of data that can be easily mined and analyzed across the whole population of customer accounts. Data privacy and the proprietary nature of data in certain industries may cause concern, but if we only analyze a higher-level, sanitized abstraction of the data over large enough populations, these concerns should not be an issue. Benchmarks can be of great value, as they tell us how we are performing in relation to our peers. Adding layers and streams of additional data can further enhance the value of our core data set. When we apply analytics to the data we can generate deeper and more valuable insight. Statistical analysis can help us predict outcomes, and analysis of outcomes can enable systems to prescribe action.

Mobility really became a factor with 4G speeds and the proliferation of smartphones and slates. Next generation services must account for a mobile workforce. Mobility often leads to a reassessment of the user interface. This is in line with the general ‘applification’ trend in software solutions. Some take the approach of mobile first, and in some cases mobile only.

We never really start from a clean slate, as we have legacy and/or open source code libraries. The first build is a minimum viable product (MVP) that is good enough to compete with market entrants and forms a platform for a modular development roadmap executed through agile sprints. Modules can be included in the core price of the solution or can be add-on in nature. They can be functionality, content or services. With next generation services offerings we need to think through average revenue per user (ARPU) and maximizing recurring profit, rather than revenue. The difference is that in maximizing revenue we are focused on top-line sales, while in maximizing profit we also pay attention to delivery performance. Modularization often leads to e-commerce and in-product discovery. Opening the platform to third parties rounds it off as an ecosystem content platform.
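The revenue-versus-profit distinction is easy to show with numbers. The sketch below uses invented prices and delivery costs; the takeaway is that two modules with identical top-line revenue can have very different recurring profit once delivery performance is netted out.

```python
# Sketch: ARPU and recurring profit for a modular offering.
# All prices, costs and module names are invented for illustration.

def arpu(monthly_revenue: float, active_users: int) -> float:
    """Average revenue per user: top-line view only."""
    return monthly_revenue / active_users


def recurring_profit(monthly_revenue: float,
                     delivery_cost_per_user: float,
                     active_users: int) -> float:
    """Recurring profit nets out the cost of delivering the service."""
    return monthly_revenue - delivery_cost_per_user * active_users


# Two hypothetical modules with identical revenue and ARPU:
# a delivery-heavy video module vs. a lightweight reporting module.
video = recurring_profit(monthly_revenue=50_000,
                         delivery_cost_per_user=35.0,
                         active_users=1_000)
reports = recurring_profit(monthly_revenue=50_000,
                           delivery_cost_per_user=5.0,
                           active_users=1_000)
```

Both modules show an ARPU of 50, yet one yields three times the recurring profit of the other, which is why a revenue-maximizing roadmap and a profit-maximizing roadmap can prioritize modules very differently.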

The software industry is going through dramatic changes, and traditional enterprises are not immune to this transformation. For most traditional industries, software-based services are less than 5% of overall gross revenues. As processes become more automated and machine controls become increasingly digitized, enterprises are faced with vast quantities of data. Even most software companies have yet to fully explore their data play, leaving money on the table. Data is increasingly the new currency.

Traditional enterprises will continue to improve core processes, develop new materials, engineer more efficient machines, research new fuels, etc., but as consumers and as business customers, we increasingly expect even traditional products and services to come with applications that are facilitative, collaborative, analytic, predictive and even prescriptive, in order for us to be able to maximize the return on our investments. Intelligent systems enable data to flow across an enterprise infrastructure, spanning from the devices where valuable data is gathered to the back-end systems where that data can be translated into insights and action.

The first step is to structure, collect and display data in a static reporting format. The design of the data architecture needs to recognize that this is only a minimum viable offering, and that the design must support additional data streams, data merges, benchmarking and analytics. As the amount of stored data increases, the architecture must allow for this with minimal degradation in quality of service.

No system exists in a silo, but rather as a component of an ecosystem of solutions and services. Integration with value-adding third-party data streams or overlays should be considered. Examples of overlays are geological, geospatial, socioeconomic, etc. Value-adding data streams can be threat data, macro-/microeconomic data, ERP/CRM data, social feeds, weather data, etc. Layering or merging data increases the value of our core data set, generating a wider range of insights. To fully monetize core data, enterprises should also consider whether other parties within the ecosystem could use the data to generate value.
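As a minimal sketch of what layering a data stream onto a core data set looks like in practice, here is a hypothetical merge of daily weather readings onto production output records. The field names, join key and sample values are all invented; the point is that the merged record supports insights (output versus weather) that neither stream yields alone.

```python
# Sketch: merging a third-party overlay (hypothetical weather readings)
# onto a core data set keyed by site and date. All names and values
# are illustrative.
core = [
    {"site": "plant-a", "date": "2014-11-03", "output_units": 1200},
    {"site": "plant-a", "date": "2014-11-04", "output_units": 950},
]
weather = {
    ("plant-a", "2014-11-03"): {"temp_c": 4, "precip_mm": 0},
    ("plant-a", "2014-11-04"): {"temp_c": -2, "precip_mm": 12},
}


def merge_overlay(core_rows: list[dict], overlay: dict) -> list[dict]:
    """Enrich each core row with overlay fields that share its key.
    Rows without a matching overlay entry pass through unchanged."""
    merged = []
    for row in core_rows:
        key = (row["site"], row["date"])
        merged.append({**row, **overlay.get(key, {})})
    return merged
```

The same pattern generalizes to any of the streams listed above: the core data set stays the system of record, and each overlay is a keyed enrichment that can be added or dropped independently.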

In addition to adding depth to services, enterprises can also add breadth. Value chain integration, expansion to adjacent markets and addressing external stakeholders (such as communities, local governments, etc.) can increase the breadth of the addressable population. Each stakeholder persona has its own needs and motivations that add complexity to the services being offered and the marketing messaging.

Software based ‘overlay’ services to traditional core products and services can increase utilization, satisfaction and loyalty.

The way we interact with machines is evolving. I remember the lines outside electronics stores when the Nintendo Wii first came out. The controller wand revolutionized gaming. Since then Sony has come out with its own wand, and Microsoft upped the game with Kinect. AquaTop is a cool derivative use of Kinect, where a touch surface is made out of a pool of water… and the innovation goes on and on.

Virtual touch has gone mainstream with controllers such as Leap Motion and the Haptix project. Touch gesture control already dominates all mobile devices, from slates to smartphones. We are slowly breaking the boundaries between human and machine. Virtual touch gestures are more intuitive and mimic how humans generally interact with their surroundings through motion.

I remember working with 3D virtual touch sensors in 2006. At the time solutions were very limited and the virtual touch grid was very basic. We were already envisioning storefront screens where passersby could interact with applications through the store window without touching the glass or a screen. We envisioned lobby portals with interactive building maps, and screens for changing booths that would overlay product information to enhance the shopper’s experience. Philips 3D autostereoscopic LCD monitors are still a bit on the expensive side, but they represent the future of 3D. 3D will not take full flight until we, as consumers, can get rid of those ridiculous glasses. When we fuse full 3D projection with accurate virtual touch, I think we will have taken a huge step in the complexity of expression and interaction.

Windows 8 is definitely designed with mobile and touch in mind. I recently purchased the Surface Pro and now use it as my primary work PC. I have it hooked into a Polycom speaker (no more wired headsets) and a monitor, with a Bluetooth mouse and keyboard. This was my first foray into Windows 8… and even though I am waiting for 8.1 to go official, I really don’t understand all the talk about Windows 8 being so different or difficult. It took me about a day to figure it out and become fully productive. Now I could not go back to Windows 7 anymore. Imagine Windows 8 with an autostereoscopic monitor, where the tiles are not just on a two-dimensional plane but can be stacked three-dimensionally, and where live tiles offer 3D content. The thing that I think makes Windows 8 next generational is the integrated nature of services and the seamless flow. What if we could integrate services in three dimensions and overlay data in 3D?

I love movies that are fantastical and push the limits of our imagination. In Harry Potter newspaper pictures are live and have depth. In Minority Report we project screens into the air and interact with virtual touch. In Tron we merge the human consciousness with the virtual world. I don’t think we are that far off with any of these examples.

I’ve now done quite a few business modeling sessions with ISVs about operationalizing their cloud strategies. Every case has its unique needs, but there are some commonalities as well.

When asked about services layer elements, like monitoring, metering, billing and provisioning, all claim to have a handle on them. However, when you scratch the surface, none have all the services elements thought out and automated. A whole ecosystem of services layer partners exists. Some are more mature than others, but all seem to require a degree of configuration and customization to work with complex enterprise solutions. Green Button plays in an interesting space with regards to burst compute provisioning, but I cannot see ISVs agreeing to their margins for long. The ability to provision compute from hot nodes on demand should be a built-in PaaS feature.

Financial transaction modeling is also an area where ISVs need help. It is not that mature ISVs lack advanced financial systems for accounting, or that they cannot calculate their cloud costs. It is more a need for the ability to do sensitivity analysis based on average deal sizes. What is required for breakeven? What is realistic? What is in it for the channel? Often ISVs aim their modeling at reaching an acceptable breakeven model, but neglect to calculate what profit margin that would offer a partner who is looking to commit a full-time resource to promoting the solution. Partners are not in it to break even. They want a profit margin on top of the cost of sales and/or the cost of support.
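The kind of sensitivity analysis I mean can be very simple. The sketch below asks the channel question directly: given an assumed partner margin rate and cost of sales (both placeholder numbers, not from any real engagement), what average deal size does the partner need to clear a target profit, not just break even?

```python
# Sketch: deal-size sensitivity for the channel. Margin rate, cost of
# sales and target profit are illustrative placeholders.

def partner_profit(deal_size: float, margin_rate: float,
                   cost_of_sales: float) -> float:
    """Partner gross profit on one deal: margin earned minus the cost
    of the effort to sell and support it."""
    return deal_size * margin_rate - cost_of_sales


def min_deal_size(margin_rate: float, cost_of_sales: float,
                  target_profit: float) -> float:
    """Smallest average deal size at which the partner clears the
    target profit (target_profit = 0 gives plain breakeven)."""
    return (cost_of_sales + target_profit) / margin_rate
```

Sweeping `min_deal_size` across a range of margin rates and cost assumptions answers “what is realistic?” and “what is in it for the channel?” in one table, which is exactly the modeling step that often gets skipped.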

ISVs that are not used to SaaS pricing need to rethink their incentive models. A recurring SaaS deal will provide less than half the profit margin of a 100k on-premise solution with a 20k S&M component, for a partner with a 40% margin on the first year, all else being equal. We need to look at the lifetime value of the deal. If we were to provide 10% on subsequent years, both for S&M and for SaaS renewal, the partner would still not benefit from a SaaS sale over an on-premise sale. Assuming a typical technology depreciation cycle of 5 years, a SaaS partner would not break even with an on-premise partner before the on-premise partner sells a whole new solution with perpetual licensing and the whole cycle starts again. Pricing in all models needs to be based on the roles and responsibilities of the parties concerned. The question is: what are the roles and responsibilities in a SaaS model, and how do they differ from a perpetual license sale?
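A quick model makes the gap visible. I use the figures given in the text where available (100k license, 20k annual S&M, 40% partner margin in year one, 10% on renewals) and an assumed SaaS subscription price of 30k/year, which is my placeholder since the text does not specify one.

```python
# Sketch: cumulative partner margin over a 5-year cycle, perpetual vs
# SaaS. The 30k/year SaaS fee is an assumed figure for illustration.

def perpetual_partner_margin(years: int = 5, license_fee: float = 100_000,
                             sm_fee: float = 20_000,
                             y1_margin: float = 0.40,
                             renewal_margin: float = 0.10) -> float:
    """Year 1: margin on license + S&M; later years: margin on S&M."""
    total = (license_fee + sm_fee) * y1_margin
    total += sm_fee * renewal_margin * (years - 1)
    return total


def saas_partner_margin(years: int = 5, annual_fee: float = 30_000,
                        y1_margin: float = 0.40,
                        renewal_margin: float = 0.10) -> float:
    """Year 1: margin on the subscription; later years: renewal margin."""
    total = annual_fee * y1_margin
    total += annual_fee * renewal_margin * (years - 1)
    return total
```

Under these assumptions the perpetual partner clears 56k over the cycle against 24k for the SaaS partner, which illustrates why renewal-only percentages do not keep a channel motivated and why SaaS incentive models need rethinking.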

Value articulation can also be tricky. ISVs often fall victim to separating their core value from the value derived from the cloud. Faster, cheaper and easier, associated with cloud in general, doesn’t differentiate any more. What that specific solution can do when powered by the cloud is what differentiates, and it should be integrated into the core value messaging. One value often underutilized is the big data inherent to large multi-tenant solutions. This degree of benchmarking and data analytics has not been possible on premise and is one of the core value adds of the cloud.

Consider the Windows 8 smartphone ecosystem. Nokia is known for mapping, HTC now for audio with Beats Audio, and Samsung will likely be known for integration with smart appliances. So who else would fit this picture?

It has long been rumored that Facebook will come out with a smartphone. It would definitely lead the pack as the socially inspired device. What about Electronic Arts or Activision coming out with a gaming-inspired device? What about a Disney- or Warner-inspired phone? How inspired could a Virgin Mobile device really be? What if RIM went Win 8 and came out with an enterprise information worker inspired device? Would they still lead the pack? Coca-Cola spends millions researching the daily patterns of consumers to optimize the ‘aah’ moment. What would a Coca-Cola-inspired phone be? How about a phone inspired by an automotive giant? Real-time readings on oil viscosity, wireless engine tuning, etc.?

HP has announced that it will get back into the smartphone game, but whom will it inspire? What will its uniqueness be? How will it differentiate, when hardware, form and usability have become commodities? Will HP allow another brand (maybe one listed above) to inspire its device, or will it bring to market a portfolio of inspired devices for different market segments?

At the end of the day, yesterday’s inspiration gets consumed by the ecosystem and becomes commodity. Device manufacturers need to continue to evolve and inspire consumers in new ways. In this sense the game has become even more competitive and unforgiving.

I would predict that Apple will lose out, because however inspired they are, they cannot ultimately compete with the aggregated inspiration of an ecosystem. Apple is also too locked down and Android is too open… Win 8 is just right.