Monday, June 20, 2016

What
was on display at HPE Discover 2016 led me to take a closer look at some of the
start-ups where Pyalla Technologies has a presence and the results add further
weight to the argument about just how far NonStop systems have come – who would
have guessed just a year or so ago!

Wandering the exhibition hall at this year’s HPE
Discover, it was a treat for me to see so many cars on display.
Whether it was the HPE-sponsored Formula E car, what I recall was a Nissan Leaf,
or even the BMW i3 pictured above, there seemed to be a lot more engagement with cars
at this show. Of course, there was the opportunity to race a car against other
attendees as part of the HPE Labs exhibit, but the “controls” were a tad
artificial and not to my liking, so I wasn’t tempted to join in the fun.

The BMW i3 is a pure electric play, and HPE’s involvement had to do with
gathering information from many sources in order to keep the car on course and,
in practical terms, safe. The i3 communicated with the HPE IoT Platform
where the integration allowed “for rapid installation of new services or
applications, as well as for communication with nearby connected devices.” What
this integration covered was, “Advance warning of rain, high winds, and
potholes using “swarm” intelligence; Smart home/infrastructure integration and
Geo fencing to alert drivers to restrictions as they cross international
borders.”

Of course, this was all technology complementary to whatever autonomous car
technology develops, but it was a graphic way to demonstrate a future where
unprecedented volumes of data are routinely examined and potentially important
data is communicated to where it’s most wanted. For the NonStop community all of
this may be peripheral to the core function of processing transactions, but for
many years we have heard presentations by the classic bricks-and-mortar retailer
Home Depot on how it has integrated weather feeds from the internet directly
into its transaction processing on NonStop systems to ensure the right
merchandise is in the right store when a local weather crisis develops. And yes,
this is only just the beginning.

Part of what Pyalla supports is a number of start-ups where my involvement has
run the full gamut, from product management and marketing to business
development to simply supporting media outreach programs. One start-up centers
on NonStop while another includes NonStop. The third is not on NonStop, even
though it is a huge user of HPE products. However, there is a synergy evident in these
start-ups. They are all bridging the old with the new and helping us get a firm
foothold in solutions that will allow us to tap vast resources, no matter how
the data is captured, processed or stored. Given the renewed focus on NonStop
within HPE, the associations I have with each of these start-ups are likely a
foretaste of what NonStop users will become very interested in over the
course of the next 18 to 30 months. And the catalyst for all of this interest
has been the arrival of hybrid infrastructures where NonStop X is playing its
part.

My involvement with InfraSoft
dates back to its earliest days as the company came together. Based in Sydney,
Australia, it was only natural for Pyalla Technologies to remain engaged with a
company that influenced the choice of company name – Pyalla. Following
considerable market success with its uLinga communications product suite, it
has been its deep port of Node.js that has really interested me. So many
applications written in JavaScript have to do with processing data that it’s
not surprising to see support for Node.js gain as much momentum as it
has.

Those who attended the presentation by HPE IT at HPE Discover 2016
heard of the choice of JavaScript and Node.js underpinning their
applications (with JDBC access to SQL/MX on NonStop), and this is only the beginning.
Processing voluminous amounts of data and then doing something interesting
with it is right at the center of JavaScript’s sweet spot. With as much talk
as there is of late about microservices, it’s
good to see that there is a solution for NonStop X systems. Whereas Node.js
may be thought of as a platform running on Linux as part of an HPE hybrid
infrastructure, the addition of NonStop can only elevate the importance of
Node.js for many users where NonStop has a presence.

When it comes to data and data analytics however, in my opinion the premier
vendor in this space that should be on the radar screens of every NonStop user
is Striim.
Originally called WebAction and established by former GoldenGate Software
executives, the transition from simply supporting data integration as was being
done by GoldenGate to where the data itself was the interesting element
shouldn’t be a surprise to anyone in the NonStop community.

The first PoCs among NonStop users are being done, and shortly there will be
news about a number of successful use-case scenarios everyone in the NonStop
community will be able to relate to. Striim is synonymous with data stream
analytics, and there will not be a transaction processing solution operating anywhere
without the deep ties to its environment that Striim so effectively supports. Like
InfraSoft, Striim is platform neutral, though Linux is perhaps the more viable
choice of platform, making it an even stronger candidate for running on HPE
hybrid infrastructure.
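To give a flavor of what stream analytics means in practice, here is a toy sliding-window outlier check of the kind a streaming platform such as Striim applies continuously to change-data feeds. The window size, the 3x threshold, and the `amount` field are all invented for the example; this is a sketch of the general technique, not Striim’s actual engine or query language.

```javascript
// Illustrative only: flag transactions whose amount far exceeds the
// recent average, using a fixed-size sliding window over the stream.
class SlidingWindow {
  constructor(size) {
    this.size = size;
    this.events = [];
  }
  push(event) {
    this.events.push(event);
    if (this.events.length > this.size) this.events.shift(); // evict oldest
  }
  average(field) {
    if (this.events.length === 0) return 0;
    const sum = this.events.reduce((acc, e) => acc + e[field], 0);
    return sum / this.events.length;
  }
}

// Flag any transaction more than 3x the recent average amount
// (threshold chosen arbitrarily for illustration).
function detectOutliers(stream, windowSize = 5) {
  const win = new SlidingWindow(windowSize);
  const alerts = [];
  for (const tx of stream) {
    const avg = win.average('amount');
    if (win.events.length === windowSize && tx.amount > 3 * avg) {
      alerts.push(tx);
    }
    win.push(tx);
  }
  return alerts;
}
```

A real streaming platform does this over unbounded feeds with windows expressed declaratively, but the core idea, computing continuously over a moving window rather than a stored table, is the same.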

Whereas my ties to InfraSoft are deeply rooted in my
ties to all things Australian, and my ties to Striim are associated with my good
times at GoldenGate, my connection with InkaBinka
can be traced directly to time spent at a Starbucks in Wood Ranch, Simi Valley.
Chalk it up to unintended consequences or simply to serendipity, but from the
time I met InkaBinka founder Kevin McGushion over a Starbucks latte, I was
hooked. What started out at InkaBinka as a news application, a
neat way to read news summaries while standing in line for that latte, has
developed into a serious piece of raw-feed processing utilizing very advanced
natural language processing.

Or, as Kevin recently explained it, “InkaBinka has created a state variable,
neural network that can summarize vast amounts of information by writing about
a subject, emulating human abstraction. This is especially powerful in internet
search, where 300 million results are common for a single search term. This
neural network, through learning and building of a summary, creates new ways of
visualizing information while allowing rapid discovery of new information that
would have been hidden by the sheer volume of content.”

For me it’s a case of InkaBinka developing an optimized
neural network that learns over time and creates and adjusts relationships
between ideas, something Kevin now calls “connectedness,” as those ideas
evolve. InkaBinka then uses a form of neural network that relates more abstract
ideas to summarize large volumes of data. “Artificial neural networks at a
basic level emulate the function of neurons where, if new information is
introduced, processes may be applied to learn about those new things,
essentially allowing the system to become smarter,” said Kevin.

And of course, from a high-level, macro perspective, “It’s hard to apply these
concepts to abstract things such as language. Traditional NLP (natural language
processing) works sequentially where a next state is dependent on a previous
state and works well for things like spell checking, grammar checking,
translation, even code breaking. This is not how the human brain tends to work.”

Before diving even deeper into what was behind InkaBinka today, by way of
explanation Kevin then suggested that I should, “Take, for example what the
human mind does when it begins to formulate a sentence, a series of sentences
or a thought. It does not create each sentence with each word sequentially.
Rather, it takes into account the main ideas to be covered and the order in
which they are to be covered. This may depend on a number of things, such as
most important concepts, most recent concepts, or even bias. The mind then
takes those main ideas in a prescribed order and weaves them together with
individual words, taking into consideration what has been said and what is left
to be said.”

In the example below (and now viewed many times on
numerous social media sites), 10,024 websites may be learned from in order to
create a summary and a connectedness map of basic ideas covered in these sites.
Any node of the connectedness map may be selected and a new summary created
based on the relationship between these words.
Information can be quickly discovered without the need to sort through
page after page of internet searches, representing a ‘wall of text.’
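As a rough analogue of relating ideas by how often they appear together, here is a toy co-occurrence graph builder. To be clear, this is not InkaBinka’s neural-network method, which Kevin describes above; it is only a minimal illustration of the general notion of a “connectedness map,” with all function names and thresholds invented for the example.

```javascript
// Toy illustration: link terms by how many documents they co-occur in.
// This approximates the *idea* of a connectedness map between concepts;
// it is not the neural-network approach InkaBinka actually uses.
function connectednessMap(documents, minCount = 2) {
  const pairCounts = new Map();
  for (const doc of documents) {
    // Unique lowercase terms per document.
    const terms = [...new Set(doc.toLowerCase().match(/[a-z]+/g) || [])];
    for (let i = 0; i < terms.length; i++) {
      for (let j = i + 1; j < terms.length; j++) {
        // Order-independent key for the term pair.
        const key = [terms[i], terms[j]].sort().join('|');
        pairCounts.set(key, (pairCounts.get(key) || 0) + 1);
      }
    }
  }
  // Keep only pairs seen in at least minCount documents.
  return [...pairCounts.entries()]
    .filter(([, n]) => n >= minCount)
    .map(([key, n]) => ({ terms: key.split('|'), count: n }));
}
```

Even this crude version hints at why a graph of related ideas can surface information a ranked list of links would bury: the strongest edges point to the concepts the sources keep returning to.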

It is often a mystery to a user how internet search
engines relate or rank information, but now, with InkaBinka search, the
connectedness is graphically understood and can be manipulated. As for many in the
NonStop community, visualization works best, and the above goes a long way toward
explaining the powerful transition under way inside InkaBinka.

InkaBinka runs completely on HPE Moonshot processors and is implemented using
JavaScript and Node.js. And it’s heavily focused on data, search and
visualization – all sounding rather familiar given the interests of Pyalla. When
it comes to hybrid infrastructure we have already seen SQL/MX on NonStop X
being accessed by microservices on Linux (at HPE IT), so thinking in terms of
NLP on Linux being accessed by transaction processing on NonStop doesn’t
represent too big a stretch of our imagination.

With the looming presence of virtual NonStop, running on commercial,
off-the-shelf hardware where all that is required is Linux and KVM (think
OpenStack), perhaps there will be those looking for an even lower-cost hardware
option, where Moonshot may become an option to consider. If not Moonshot, how
much attention is being paid to HPE CloudLine? Here, too, there is another
alternative for use with virtual NonStop. In other words, it would be unwise to
place any fences around HPE and NonStop as to where their products will surface,
nor is it wise to rule out participation in a solution by any one product.

Microservices, data analytics and natural language processing all involve
NonStop today. In many ways, too, they point to where NonStop is headed as well
as to the types of platforms we will likely see NonStop becoming a part of. The
support Pyalla is providing is not accidental, nor is it random, and even though
these associations developed several years ago when, at the time, their likely
impact on NonStop transaction processing may have seemed dubious at best and
completely off-the-wall at worst, it’s symbolic in many ways of just how wide
the net is now being cast when it comes to future possibilities for NonStop. HPE Discover 2016 opened many eyes, not the
least being my own. The question now for the NonStop community is simply one of
how open your eyes are!