What is Happening?

Earlier this week, Salesforce rolled out its Salesforce1 Lightning Components and App Builder. Built on the Lightning platform launched late in 2014, “Lightning” is being touted by many in the industry as “promising the ability for any business user to assemble apps visually by dragging and dropping pre-built, reusable components.”

While we at Saugatuck believe that Lightning is very cool and important, it is not what is being most widely hyped. Lightning is a platform-based component library and visual environment that enables knowledgeable, experienced people to point, click, and drag and drop important, pre-configured components into functional groups that work within pre-existing Salesforce/AppExchange software and solutions.

That’s very cool, because it can and should help to simplify the development and launch of relatively simple-yet-valuable – read, “mobile” – apps that are practically guaranteed to function well within the industry-leading, increasingly-influential Salesforce / AppExchange environment. And it demonstrates the concept and value inherent in simplifying, streamlining, and speeding the “development” of new functionality using Cloud-based apps, component libraries and services – something that Saugatuck began including in our IT industry scenarios for clients in 2003/2004.

Lightning isn’t – as announced – a revolutionary or innovative approach that enables “any business user to assemble apps.” But it is a compelling and important development (pardon the pun) for bizapp development overall, and it helps point toward more, better, faster, and cheaper approaches to internal enterprise and external marketplace-oriented apps – for mobile, Cloud-based, and traditional systems as well. It even portends significant improvement and acceleration in aligning IT and Business functions, groups, and processes for Digital Business. Continue reading Cloud, DevOps, and Digital Business: Salesforce Rolls Out Lightning→

Earlier this week, Saugatuck had the opportunity to talk to analytics provider Alteryx. Alteryx offers a single tool that combines ETL and advanced analytics, helping its primary customers – LOB analysts – get their analysis done faster.
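Alteryx's engine is proprietary, but conceptually the appeal is collapsing the usual extract-transform-analyze handoff between separate tools into one flow. A rough, made-up sketch in Python (the sample data and field names are invented for illustration):

```python
import csv
import io
import statistics

# Toy extract-transform-analyze flow in a single pass.
# The CSV data and field names are invented for illustration.
raw = io.StringIO("region,sales\nWest,120\nWest,80\nEast,200\nEast,\n")

# Extract + clean: read rows, dropping records with missing sales values
rows = [r for r in csv.DictReader(raw) if r["sales"]]

# Transform: group numeric sales figures by region
by_region = {}
for r in rows:
    by_region.setdefault(r["region"], []).append(float(r["sales"]))

# Analyze: compute the mean sales per region
means = {region: statistics.mean(vals) for region, vals in by_region.items()}
```

The point of combining the stages is that cleaning decisions (like dropping the blank record) stay visible right next to the analysis they feed.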

Over the last two years, Alteryx has gained significant traction with a “land and expand” go-to-market strategy that targets LOB users initially and then expands internally, often with the support of IT. This strategy has helped them grow from 150 customers two years ago to just over 1,000 today (including EMC, Home Depot, Verizon, and Cardinal Health – see Saugatuck Lens360 blog post Alteryx Inspire – The Importance of Analytic Context, published 1 July 2014). They have also been succeeding in deploying their Alteryx Server solution – which enables end users to schedule analytics jobs, publish results, and provide reports on a public or private cloud, rather than just on a local machine.

Alteryx came at the advanced analytics market just at the time that companies were first considering Big Data solutions like Hadoop and MapReduce, but took a different tack. Initially their product aimed to Continue reading Catching Up With Alteryx→

Increasing demand from users for both new functionality (e.g., mobility, analytics, etc.) and new application solutions;

User dissatisfaction over the protracted time and high costs required for the development and implementation of new functionality and new application solutions;

Escalating costs for maintaining a steadily growing inventory of production workloads and infrastructure;

Budget constraints; and,

Diminishing or exhausted “payback” from major infrastructure initiatives in centralization/consolidation and virtualization, as older devices – sometimes lacking enhancements to minimize overheads – are pushed to their limits.

Saugatuck fully agrees that appropriate use of Cloud infrastructure can yield significant benefits applicable to all five of the challenges listed above. However, Saugatuck has identified that challenge three is sometimes being incorrectly interpreted as a reason to implement “wholesale” migrations of production application workloads to Cloud infrastructures. Responses to our recent Cloud Infrastructure Survey provide the following data as indicators of this flaw in IT infrastructure planning: Continue reading Think Beyond Cloud to a Malleable Infrastructure→

The legacy problem has often been viewed as being mainly about the hundreds of millions of lines of COBOL code operating on back-room systems of financial institutions, health care facilities, and government agencies. While the legacy problem rose to particular prominence with the millennium crisis, in which two-digit date handling in old code caused failures, the secret of legacy is that it will never go away. Legacy is not just about COBOL; it is an inherent issue of information technology. In areas of early computer use, such as the USA and Japan, legacy may be about COBOL. In other countries, it might be about Visual Basic. In either case, a number of problems are likely to occur:

Potential for incompatibilities and difficulties in upgrades,

Security issues, since upgrades may be difficult or impossible, particularly if the code is heavily customized,

Inability to optimize operation and performance of hardware and software due to legacy components,

Problems in maintenance, in understanding of the code and the reasons for various operations which have been put into place, and,

Difficulties in locating expertise and hiring to maintain systems.

Over time, software and hardware can become black boxes. Companies no longer wish to provide resources for updates, no longer have access to certified engineers, or no longer understand how the software was developed or what it was intended to do. The software keeps running; the procedures keep being performed; and all is fine until something changes in the environment that brings the whole mechanism to its knees. That, of course, was the millennium crisis, which required updating software to incorporate a new dating scheme. It created panic, but that was nothing compared to what would have ensued had it occurred some years later, when far fewer experienced COBOL programmers were available. Continue reading Legacy’s Big Little Secret→

No, it’s not the infamous Steve Ballmer sweaty-shirt rant & dance this time – it’s IBM’s strategy to increase its presence, relevance, and role(s) in the increasingly Cloudy+Mobile world of software development.

IBM has announced its acquisition of (relatively) tiny, Mountain View-based Compose Inc., which offers a handful of open-source, Cloud-based database offerings. Terms of the purchase were not disclosed; Compose is privately held, and has raised $6.4 million since its founding in 2011.

Why buy Compose? The company attracts developers working on Cloud, web, and Mobile-oriented apps, APIs, and more – mostly net-new developers as regards IBM, because these developers are working with and on agile, lightweight, small-client DBMSs like CouchDB, MongoDB, PostgreSQL, RethinkDB, and other databases. IBM purchased CouchDB-oriented, distributed-DB firm Cloudant in 2014, so the company is not bereft of Cloud/web/mobile DB presence by any means. But Compose adds to and extends IBM’s attractiveness to the generation of developers focused on the most disruptive and forward-thinking business software, apps, UIs, and methodologies.

What is Happening?

Saugatuck attended the Esri User Conference this week in San Diego. The venerable GIS company touted its wares in the form of customer success stories. In the map business, people want maps. And maps they got – example after example, from drought assessment to building management. Yet the company’s primary messaging went beyond the eye-catching visuals. Esri believes the automation of geography in GIS is essential to solving the world’s biggest problems. As an example, Bill Gates appeared virtually, highlighting progress in mapping and championing the Gates Foundation’s experiences using GIS in its work to improve health services and agricultural production.

Why is it Happening?

“We are entering a period of geographic enlightenment,” said Jack Dangermond, Esri’s founder and CEO, in his opening presentation. “GIS is the system of understanding that will alter the evolution of our planet.” The conference theme was Applying Geography Everywhere. Esri says GIS provides the framework and process to help people make smarter decisions. This is truly lofty positioning for a technology company.

Esri also thinks it can change how businesses operate. At the conference, SAP announced new spatial intelligence capabilities, enhancing its HANA platform and Esri integration. But Esri, like other established software providers, faces heavy lifting to keep pace with technology changes such as mobile and the Cloud. Also, as we have recently discussed, the ecosystem of location-based systems and services is changing (1606MKT, Location-based Solutions: Shifting from the Earth to the Cloud, 10Jul2015).

What is Happening?

Earlier this week, Dell Services briefed Saugatuck on their emerging strategy for their Digital Business Services. Dell clearly recognizes the growing opportunity represented by companies that want to embrace Digital Business, but need to rely on external expertise to bring together a holistic portfolio of technologies and services.

Dell Services has a track record of approaching the services business through industry-specific expertise and offerings. Dell’s Digital Business practices follow this same path for good reason. The industry-vertical focus has, in the past, enabled Dell to go after deep solutions in various industries such as Finance or Healthcare, which require specific domain knowledge and careful compliance with industry-specific regulations such as FINRA, PCI, and HIPAA.

One of the key facets of any Digital Business transformation effort today is ensuring that your transformation partner is equipped to handle the growing array of new technologies, all of which must be woven into seamless products, applications, and services. To this end, Dell has ramped up its software portfolio with several strategic acquisitions over the last several years in areas including integration (Boomi), analytics (StatSoft, Kitenga, and Quest), and systems management (KACE). The acquisitions have given them a strong foundation for their Cloud products and are key components of their emerging Digital Business offerings.

In addition, Dell has focused on building out some key partnerships in expanding categories such as Mobility (Apperian and Kony) and the IoT (PTC ThingWorx), among others. The combination of industry-specific knowledge and both in-house and partner technologies expresses the depth and breadth necessary to help companies create and realize Digital Business strategies and move to implementation. With this structure, we believe Dell is on the right track toward enabling their customers to successfully manage the transformation ahead. Continue reading Dell Services: Enabling Digital Business Transformation→

Does it matter strategically if Microsoft offers smartphones? Probably not. Does it matter strategically if Microsoft offers tablets? Unlikely. Does it matter what Microsoft does as regards Mobility and mobile devices in general? Yes, but not in the way(s) that most believe it does.

A new Strategic Perspective for clients of Saugatuck’s CRS subscription research service looks at Mobility as part of Microsoft’s overall business and technology strategy, and comes to the following conclusion: “Mobility” is not Microsoft’s strategy. Mobility – including devices like smartphones and tablets – is a tactical means toward Microsoft’s overall strategic goal of being Always On, everywhere users and businesses may be. Understanding that is the key to making money with Microsoft, or by using Microsoft software and services.

A decade ago, Saugatuck wrote a long-term planning report for a global business management software provider that included our assessment of Microsoft’s long-term strategy, which I termed “Always On.”

“Always On” referred to what today is often termed “Windows Everywhere,” or more accurately, “Windows Experience Everywhere” (i.e., WEE – yes, pun intended). Given Cloud platform capabilities and browser utility, Microsoft can enable the WEE with very limited physical presence on any device anywhere. That gets them closer to their core strategic goal: Anywhere you are, anything you interact with, any action that you take utilizing any computing and communications device or application, Microsoft is Always On. And if Microsoft is Always On, Microsoft is always making more money.

Today, Always On requires having presence and utility in mobile environments – not just where the user and device are moving, but also including remote, location-specific needs enabled and supportable by location-independent services – all linked by the WEE. This is where competitors, partners, and buyers/users tend to get misdirected, because it is so easy to focus on the smartphone and/or the tablet as pillars of mobile/remote/location IT strategies. The important aspect is their relative interoperability with existing ways of doing business. That is a software issue – a language, eco-stack, tools, UI/UX, and OS issue. Continue reading WEE – Microsoft is Always On→

Vulnerability management is the area of security best compared with playing whack-a-mole, a game in which rubber mole heads pop up and out at random from the holes in which they are hiding. Your job is to whack away at the head of each mole with a rubber mallet, thereby forcing it back into the hole from which it came. You score points for each mole you force back into a hole, and the more points you score in the allotted time of play, the higher your total score.

In the game of vulnerability management, you hit the heads of the moles by applying patches and configuration changes to IT assets, eliminating or minimizing the attack surfaces available to hackers. The problem is that hacker-moles like to operate silently; you don’t know which ones are there or which holes they are operating in unless, of course, you are constantly searching all the holes to determine where hacker-moles have room to get in and through.
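The scoring loop described here can be sketched as a toy prioritization queue – a made-up illustration, not any scanner vendor's actual model, with hypothetical hostnames and CVE identifiers:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    host: str
    cve: str           # hypothetical identifiers, for illustration only
    severity: float    # e.g., a CVSS-style base score, 0-10
    days_exposed: int  # how long the hole has been open

def prioritize(findings):
    """Whack the riskiest moles first: sort by severity, breaking
    ties by how long the vulnerability has been exposed."""
    return sorted(findings, key=lambda f: (f.severity, f.days_exposed), reverse=True)

queue = prioritize([
    Finding("web-01", "CVE-2015-0001", 9.8, 3),
    Finding("db-02",  "CVE-2015-0002", 5.0, 90),
    Finding("app-03", "CVE-2015-0003", 9.8, 30),
])
# app-03 surfaces first: equally severe as web-01, but exposed far longer.
```

Real programs weight many more signals (exploit availability, asset criticality), but the core idea is the same: turn raw scan output into an ordered work queue.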

Determining where the holes are that will attract the moles is the job of vulnerability scanners, most of which are now operated as Cloud service subscriptions. And it’s not working. And the reason it’s not working has little to do with the scanning services and almost everything to do with the lack of the other tools and services you need to run your score up by smashing the moles faster and in less time. Having access to information about where the moles are, how many there are, where they are lurking and what their cycle-times are would make you invincible in the face of the onslaught of hacker-moles attacking the enterprise network. Continue reading How the Security Game of Whack-a-mole Changes→

Location is central to Digital Business applications. Over the last decade, large providers invested in mapping and location-based services. But in 2015 many of these providers are shifting courses. Some are enhancing their capabilities while others shed theirs. Meanwhile, open source mapping is reaching critical mass. An API-based Cloud model is emerging as the preferred method for buying and selling geospatial data and services.
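Much of what these API-based geospatial services expose ultimately rests on basic spherical geometry. As a self-contained illustration (not any provider's actual API), here is the classic haversine great-circle distance in Python:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers,
    using the haversine formula on a sphere of the mean Earth radius."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km: mean Earth radius

# Roughly the straight-line distance from San Diego to Mountain View
d = haversine_km(32.7157, -117.1611, 37.3861, -122.0839)
```

Commercial services layer routing, geocoding, and map tiles on top, but distance primitives like this are the kind of computation being repackaged behind Cloud APIs.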

The geospatial industry is practically spinning off its axis from provider actions in 2015. In a short period, Google, Microsoft, Nokia, and Yahoo sold, modified, or discontinued mapping services. Apple continues its journey back from the depths of its 2012 Maps fiasco, which ended in an apology from CEO Tim Cook. Apple is investing in location, with regular enhancements to Apple Maps and other technologies including iBeacon. Uber inserted itself into the mapping data game when it bought image collection capabilities and people from Microsoft’s Bing Maps group. Intergraph, Oracle, and Pitney Bowes added functionality to address business and government needs around Big Data and the Cloud. Esri continued to be widely deployed globally and across industries. Yet the news isn’t all positive.

Collecting, storing, sharing, analyzing, and managing geospatial data is difficult. Before this decade, specialized providers performed those services. Governments collected data. Data was expensive and lacked currency and, sometimes, accuracy. Stovepipe systems dominated. While the world is changing, interoperability remains one of the top challenges in building and deploying applications with location services. Some providers who thought mapping fit their portfolios have decided it is no longer core. Yahoo completely discontinued Yahoo Maps, while Microsoft sold the data collection part of its Bing Maps, and Google ended some of its enterprise mapping services. They are responding to changing markets. Continue reading Earth to Cloud – Shifting Trends in Location-based Services→


Welcome to Lens360

Welcome to Lens 360, Saugatuck Technology's companion blog that features original posts, Research Alerts, and select summaries of premium research. For a more comprehensive and searchable database of our research, click here to go to our Research Library by Topic Area.