Dana Gardner: Hello, and welcome to the next edition of the HP Discover Podcast Series. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion of IT innovation and how it's making an impact on people’s lives.

Once again, we’re focusing on how IT leaders are
improving their services to deliver better experiences and payoffs for
businesses and end users alike. I’m now joined by our co-host for this
sponsored series, Chief Software Evangelist at HP, Paul Muller. Welcome Paul, how are you today?

Paul Muller: Fighting fit, and healthy Dana, yourself?

Gardner: Glad to hear it. I’m doing very well, thanks. We’re going to now examine the impact that big-data
technologies and solutions are having on the highly dynamic healthcare
industry. We’ll explore how analytics platforms and new
healthcare-specific solutions together are offering far greater insight
and intelligence into how healthcare providers are managing patient
care, cost, and outcomes.

And we’re going to hear firsthand how these new offerings, announced this week at the HP Discover Conference in Barcelona,
are designed specifically to give hospitals and care providers new
data-driven advantages as they seek to transform their organizations.

Patrick Kelly: Thank you, Dana. It's great to be with both you and Paul.

Gardner: Just to put this into some perspective, Paul, as
you travel the globe, as I know you do, how closely are you seeing an
intersection between big data and the need for analytics in healthcare? Is this a US-specific drive, or is it something that’s sweeping many markets as well?

Muller: It's undoubtedly a global trend, Dana.
One statistic that sticks in my mind is that in 2012 there was an
estimated 500 petabytes
of digital healthcare data across the globe. That’s expected to reach
25,000 petabytes by the year 2020. So, that’s a 50-times increase in the
amount of digital healthcare data that we expect to be retaining.

The
reason for that is simply that having better data helps us drive
better healthcare outcomes. And we can do it in a number of different
ways. We can move toward what we call evidence-based medicine, rather than
subjecting people to a battery of tests, or following a script, if you
like.

The tests or the activities that are undertaken
with each individual are more clearly tailored, based on the symptoms
that they’re presenting with, and data helps us make some of those
decisions.

Basic medical research

The
other element of it is that we’re now starting to bring in more people
and engage more people in basic medical research. For example, in the
US, the Veterans Administration
has a voluntary program that’s using blood samples and health
information from military veterans. Over 150,000 have enrolled
to help give us a better understanding of healthcare.

We’ve
had similar programs in Iceland and other countries, where we’re using
long-term healthcare and statistical data from the population to help
us spot and address healthcare challenges before they become real
problems.

The other, of course, is how we better manage healthcare data. A lot of our listeners, I’m sure, live in countries where electronic healthcare records
(EHR) are a hot topic. Either there is a project under way or you may already
have them, but that whole process of establishing them and making sure
that those records are interchangeable is absolutely critical.

Then,
of course, we have the opportunity of utilizing publicly available
data. We’ve all heard of Google being used to identify outbreaks of flu in various countries, based on the frequency with which people
search for flu symptoms.

There’s a huge array of data that you need to bring together, in addition to just thinking about the size of it.

So,
there’s definitely a huge number of opportunities coming from data. The
challenge that we’ll find so frequently is that when we talk about big
data, it's critical not just to talk about the size of the data we
collect, but the variety of data. You’ve got things like structured EHR. You have unstructured clinical notes. If
you’ve ever seen a doctor’s scribble, you know what I’m talking about.

You have medical imaging data, genetic data, and epidemiological
data. There’s a huge array of data that you need to bring together, in
addition to just thinking what is the size of it. Of course,
overarching all of these are the regulatory and privacy issues that we
have to deal with. It's a rich and fascinating topic.

Gardner:
Patrick Kelly, tell us a little bit about what you see as the driving
need, technically, to get a handle on this vast ocean of healthcare data and
the huge potential for making good use of it.

Kelly:
All the points Paul brought up were spot-on. It really is a problem of
how to deal with such a deluge of data. Also, there’s a great change
being undertaken because of the Affordable Care Act (ACA) legislation, and that’s impacting not only the business model, but also the need to switch to an electronic medical record.

Capturing data

From
an EHR perspective, to date IT has focused on
capturing that data, taking what’s on a medical
record and transposing it into an electronic format. Unfortunately, where we’ve fallen
short in helping the business is in taking that captured data and
making it useful and meaningful through analytics, helping the business
gain visibility and be able to pivot and change as the need to change
the business model is brought to bear on the industry.

Gardner:
For those of our audience who are not familiar with Avnet, please
describe your organization. You’ve been involved with a number of
different activities, but healthcare seems to be pretty prominent in the
group now. [Learn more about Avnet's Healthcare Analytics Practice.]

Kelly: Avnet has made a pretty significant
investment over the last 24 months to bolster the services side of the
world. We’ve brought around 2,000 new personnel on board
to focus on everything in the ecosystem, from -- as we’re talking about
today -- healthcare all the way up to hardware, educational services,
and supporting partners like HP. We happen to be HP’s largest enterprise distributor. We also have a number of critical channel partners.

In
the last eight months, we came together and brought on board a number
of individuals who have deep expertise in healthcare and security. They’re
working to build out a healthcare practice that not only provides
services, but is also developing a healthcare analytics
platform.

Gardner: Paul Muller, you can’t buy
healthcare analytics in a box. This is really a team sport, an ecosystem approach. Tell me a little bit about what Avnet is,
how important they are to HP, and, of course, there are going to
be more players as well.

What Avnet brings to the table is the understanding of the HAVEn
technology, combined with deep expertise in the area of healthcare and
analytics.

Muller: The
"n" -- for any number of apps -- is really where we work together with our
partners to utilize the platform to build better big-data enabled
applications. That’s really the critical capability our partners have.

What
Avnet brings to the table is the understanding of the HAVEn technology,
combined with deep expertise in the area of healthcare and analytics.
Combining that, we've created this fantastic new capability that we’re
here to talk about now.

Gardner: Back to you,
Patrick. Tell me a bit about what you think are the top problems that
need to be solved in order to get healthcare information and analytics
to the right people in a speedy fashion. What are the hurdles to
overcome here?

Kelly: If we pull back the covers
and look at some of the problems or challenges around advancing
analytics and modernization into healthcare, it’s really in a couple of
areas. One of them is that it's a pretty big cultural change.

Significant load

Right
now, we have an overtaxed IT department that’s struggling to bring
electronic medical records online, to deal with a lot of
different compliance requirements around ICD-10, and to still meet meaningful use. So, that’s a pretty significant load on those guys.

Now,
they’re being asked to look at delivering information to the business
side of the world. And right now, there's not a good understanding, from
an enterprise-wide view, of how to use analytics in healthcare very
well.

So, part of the challenge is governance and
strategy and looking at an enterprise-wide road map to how you get
there. From a technology perspective, there’s a whole problem around
industry readiness. There are a lot of legacy systems floating around that can range from 30-year-old mainframes
up to more modern systems. So there’s a great deal of work that has to
go around modernizing the systems and then tying them together. That all
leads to problems with data logistics and fragmentation and really just
equals cost and complexity.

The traditional approaches that other industries have followed -- enterprise data warehouses and extract, transform, load (ETL) --
are just too costly, too slow, and too difficult for
healthcare systems to leverage. Finally, there are a lot of challenges in
the process of the workflow.

Muller: These
sound conceptual at a high level, but the impact on patient outcomes is
pretty dramatic. One statistic that sticks in my head is that
hospitalizations in the U.S. are estimated to account for about 30
percent of the trillions of dollars in annual cost of healthcare, with
around 20 percent of all hospital admissions occurring within 30 days of
a previous discharge.

Better utilizing big-data technology can have a very real impact on the healthcare outcomes of your loved ones.

In
other words, we’re potentially letting people go without having
completely resolved their issues. Better utilizing big-data technology
can have a very real impact, for example, on the healthcare outcomes of
your loved ones. Any other thoughts around that, Patrick?

Kelly:
Paul, you hit a really critical note around re-admissions, something
that, as you mentioned, has a real impact on the outcomes of patients.
It's also a cost driver. Reimbursement rates are being reduced because
of failures to address the shortfalls, either in
education or in follow-up care, that end up landing patients back in the ER.

You’re
dead on with re-admissions, and from a big-data perspective, there are
two stages to look at. There’s a retrospective look that is a challenge
even though it's not a traditional big-data challenge. There’s still a lot
of data and a lot of elements to look into just to identify patients
who have been readmitted and track those.

But the
more exciting and interesting part of this is the predictive side: looking
forward and seeing the patient’s conditions, their co-morbidities, how sick
they are, what kind of treatment they received, what kind of education
they received, the follow-up care, as well as how they behave in the
outside world. Then, it’s bringing all that together and building a
model to be able to determine whether this person is at risk of readmission.
If so, how do we target care to them to help reduce that risk?
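
The predictive stage Patrick describes amounts, in spirit, to combining patient features -- co-morbidities, prior admissions, follow-up care -- into a readmission-risk score. As a purely illustrative sketch (the features, data, and weights here are hypothetical, not Avnet's actual model), a minimal hand-rolled logistic-regression scorer might look like this:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(rows, labels, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression."""
    n = len(rows[0])
    w = [0.0] * (n + 1)  # w[0] is the bias term
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
            err = y - p
            w[0] += lr * err
            for i in range(n):
                w[i + 1] += lr * err * x[i]
    return w

def risk(w, x):
    """Probability-style readmission-risk score for one patient."""
    return sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))

# Hypothetical features: [num_comorbidities, prior_admissions, followup_done]
patients = [[0, 0, 1], [1, 0, 1], [3, 2, 0], [4, 3, 0], [2, 1, 0], [0, 1, 1]]
readmitted = [0, 0, 1, 1, 1, 0]

w = train_logistic(patients, readmitted)
high = risk(w, [4, 3, 0])  # sicker patient, frequent admissions, no follow-up
low = risk(w, [0, 0, 1])   # healthy patient, first visit, follow-up done
```

A real model would of course be trained on far richer clinical and behavioral data; the point is only that the score can then drive targeted follow-up care for the highest-risk patients.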

Gardner:
We certainly have some technology issues to resolve and some cultural
shifts to make, but what are the goals in the medical field, in the
provider organizations themselves? I’m thinking of such things as
cutting cost, but more than that, things like treatments and experience, and
even gaining perhaps a holistic view of a patient, regardless of where
they are in the spectrum.

Waste in the system

Muller:
You kind of hit it there, Dana, with the cutting cost. I was reading a
report today, and it was kind of shocking. There is a tremendous amount
of waste in the system, as we know. It said that in the US, $600
billion of the 17.6 percent of the nation’s GDP
that is spent on healthcare is potentially being misspent. A
lot of that is due to unnecessary procedures and tests, as well as
operational inefficiency.

From a provider perspective,
it's getting a handle on those unnecessary procedures. I’ll give you an
example. There’s been an increase in the last decade of elective
deliveries, where someone comes in and says that they want to have an
early delivery for whatever reason. The impact, unfortunately, is an
additional time in the neo-natal intensive care unit (NICU) for the baby.

It
drives up a lot of cost and is dangerous for both the mother and child.
So, it's getting a handle on where the waste is within their four walls
-- whether it’s operational, unnecessary procedures, or tests -- and being
able to apply Lean Six Sigma and some of these processes to help reduce that.

Then,
you mentioned treatments and how to improve outcomes. Another shocking
statistic is that medical errors are the third leading cause of death in
the US. In addition to that, employers end up paying almost $40,000
every time someone develops a surgical-site infection.

From a provider perspective, it's getting a handle on those unnecessary procedures.

Those
medical errors can be everything from a sponge left in a patient, to a
mis-dose of a medication, to an infection. They all lead to a lot of
unnecessary deaths, as well as driving up cost, not only for the hospital
but for the payers of the insurance. These are areas that providers will get
visibility into, to understand where variation is happening and eliminate
it.

Finally, a new aspect is customer experience.
Reimbursements are going to be tied to -- and this is new for the
medical field -- how I as a patient enjoy, for lack of a better term, my
experience at the hospital or with my provider, and how engaged I become
in my own care. Those are critical measures that analytics are
going to help provide.

Gardner: We have a big
chore ahead of us with the need for changing the way that IT is
conducted in these organizations. Obviously, what you’ve just described
are different ways of doing medicine based on data and analysis, but we
also have this change in the way that medicine is being delivered in the
US. You mentioned the ACA. We’re moving from a pay-by-procedure basis
much more to a pay-by-outcomes basis. This shifts and
transforms things tremendously too.

Now that we have a
sense of this massive challenge ahead of us, what are organizations
like Avnet and providers like HP with HAVEn doing that will help us
start to get a handle on it? Give us a sense, Patrick, of what you are
bringing to market with the announcement in Barcelona.

Kelly:
As difficult as it is to reduce complexity in any of these analytic
engagements, it's very costly and time-consuming to integrate any new
system into a hospital. One of the key things is to be able to reduce
the time to value of a system that you introduce into the hospital
and use to target very specific analytical challenges.

From
Avnet’s perspective, we’re bringing a healthcare platform that we’re
developing around the HAVEn stack, leveraging some of those great
powerful technologies like Vertica and Hadoop, and using those to try to
simplify the integration task at the hospitals.

Standardized inputs

We’re building inputs from HL7,
which is a common data format within the hospital, and trying to build
some standardized inputs from other clinical systems, in order to
reduce the heavy lift of integrating a new analytics package into the
environment.
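
The HL7 v2 format Patrick refers to is line-oriented, pipe-delimited text: each segment (MSH, PID, PV1, and so on) is one line, with fields separated by "|". As a rough illustration only -- the sample message and field positions below are simplified, and real HL7 parsing (component separators, escape sequences, the special MSH field numbering) is considerably more involved -- a minimal reader might look like:

```python
# Minimal illustrative parser for an HL7 v2-style message.
# The sample message is hypothetical, not from any real hospital system.

def parse_hl7(message):
    """Split an HL7 v2 message into {segment_id: [list of field lists]}."""
    segments = {}
    for line in message.strip().split("\r"):  # HL7 separates segments with CR
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

sample = "\r".join([
    "MSH|^~\\&|LAB|HOSP|EHR|HOSP|202312010830||ADT^A01|MSG0001|P|2.5",
    "PID|1||12345^^^HOSP||DOE^JANE||19800101|F",
    "PV1|1|I|ICU^101^A",
])

msg = parse_hl7(sample)
patient_id = msg["PID"][0][3].split("^")[0]  # "12345"
event_type = msg["MSH"][0][8]                # "ADT^A01", an admission event
```

Production integration engines do far more than this, but the sketch shows why a standardized input layer can absorb feeds from many clinical systems at once.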

In addition, we’re looking to build a
unified view of the patient’s data. We want to extend that beyond the
walls of the hospital and build a unified platform. The idea is to put a
number of different tools and modular analytics on top of that to have
some very quick wins, targeted things like we've already talked about,
from readmission all the way into some blocking and tackling operational
work. It will be everything from patient flow to understanding capacity
management.

It will bring a platform that accelerates
integration and analytics delivery in the organization. In addition,
we’re going to wrap that in a number of services that range from
early assessment to road map and strategy, to help with business
integration, all the way through to continuing to build and support the
product with the health system.

The goal is to
accelerate delivery around the analytics, get the tools that they need
to get visibility into the business, and empower the providers and give
them a complete view of the patient.

Gardner:
Paul, it’s very impressive when you look at what can be done when an
ecosystem comes together. When you look at applications, like what Avnet
is delivering, it seems to me they’re also changing the game in terms
of who can use these analytics. We’re seeing visualizations and we’re
seeing modular approaches like Patrick described. How much of a sea
change are we seeing in terms of not just creating better analytics, but
getting them to more people, perhaps people who had never really had access
to this intelligence before?

It’s the immediacy of interaction that is going to make the biggest difference.

Muller:
That’s a critical element. It's simple, easy to understand, and
visualizations are an important element of it. The other is just simply
the ability to turn these sorts of questions around more quickly.

If
you think about traditional medical studies and even something as
simple as drug development, in the past getting access to the data,
being able to have a conversation with the data, has been very
difficult, because sourcing it, scrubbing it, correlating it, processing
it has taken years.

Even simple queries could take
days to run. It’s become more complex. You have to do things like
look for correlations across longitudinal records or understand
unstructured clinical notes that have been written by a doctor or, more
importantly, by different doctors. Each of them is writing something
similar, but in a different way. Then, there’s the massive volume of
information involved. Patrick touched on some of the behavioral aspects
or lifestyle choices people make.

The ability to take
all of that information at one time and have a conversation -- where you can
slice and dice it and interact with it -- is another important aspect of
usability and of democratizing access to some of that information.
Whether it’s researchers, government officials, or healthcare
workers looking, for example, for potential outbreaks of disease
or planning a better healthcare system, it’s not just great
visualizations that are important. They certainly help, but it’s the
immediacy of interaction that is going to make the biggest difference.

Gardner:
Patrick, when you make these basic infrastructure improvements, when you
create a different culture to make data analysis available fast, you
start to get toward that predictive, rather than reactive, approach.
Do you have some sense, or even examples, of what good can come of this?
Are there some tangible benefits, some soft benefits, to get as a
payback -- and, I’m thinking, fairly quickly, because we probably need to
demonstrate value rather soon in this environment?

About visibility

Kelly:
Dana, the first step with any of this is visibility. It opens eyes
around processes in the organization that are problematic, and that can
be very basic -- things like scheduling in the operating room and utilization of
that time, through to length of stay of patients.

A very
quick win is to understand why your patients seem to be continually
having problems and staying in the bed longer than they should be. It’s
being able, while they're filling those beds, to redirect care -- case
workers, medical care, and everything necessary -- to help them get out of
the hospital sooner and improve their outcomes.

A lot
of times, we've seen a look of surprise when we've shown that here is a
patient who has been in for 10 days for a procedure that should have
been only a two-day stay. Giving visibility there is the
first step, though a very basic one.

As we start attacking
some of these problems around hospital-acquired infections -- helping the
provider make sure that they're covering all their bases, following
best practices, and eliminating the variation between each
physician and care provider -- you start seeing some real, tangible
improvements in outcomes and in saving people's lives.

When
you see that for any population -- be it stroke or, as we talked about
earlier, re-admissions with heart failure -- and you're able to make sure
those patients are avoiding things like pneumonia, you bring visibility.

A challenge for a hospital that has acquired a number of physicians is how to get visibility into those physician practices.

Then,
predictive models and optimizing how the providers and the
caregivers are working are really key. There are some quick wins.
Traditionally, we built these master repositories and then
built reports on top of them, with a year and a half to deliver any
value. We’re looking instead to focus on very specific use cases and trying
to tackle them very quickly, in a 90- to 120-day period.

Gardner:
Patrick, do you have any early-adopter examples you can provide for us,
so that we have a sense of what types of organizations are putting this
into place, what they’ve done first, and what have been the outcomes?

Kelly:
We're partnering with a 12-hospital healthcare system, dealing again
with some blocking and tackling around understanding better how to
utilize their physician network.

A challenge for a
hospital that has acquired a number of physicians is how to get
visibility into those physician practices. How do you understand the
kinds of things we've talked about -- cost, patient experience, outcomes
-- out in the wild, in the primary care offices, and in the specialty
offices? That data has traditionally just been completely segmented from
the hospital systems.

The challenge is building tools
that are going to be leveraged by the physicians themselves, as well as by
the hospitals at an executive level, and utilizing that information
to help optimize how those practices are running. It’s kind of a basic
problem for most businesses, but it's something very real for hospitals
to deal with.

Massive opportunity

Gardner:
Paul Muller, this seems to be a massive opportunity, something that
will be going on for many years with HP, Vertica, and HAVEn. Trillions
of dollars have been spent, and there are ways to achieve better patient
experiences, higher health rates, and lower mortality rates. So, it’s a win,
win, win, right? The hospitals win, the insurers win, the governments
win, the patients win, the doctors win. What sort of opportunity is this,
and how is HP going at it?

Muller: You’ve
absolutely nailed the assessment there. It’s an all-around benefit. A
healthy society is a healthy economy. That’s pretty crystal clear to
everybody. The opportunity for HP and our partners is to help enable
that by putting the right data at the fingertips of the people with the
potential to generate life-saving or lifestyle-improving insights. That
could be developing a new drug, improving the inpatient experience, or
helping us identify longer-term issues like genetic or other sorts of
congenital diseases.

From our perspective, it’s about
providing the underlying platform technology, HAVEn, as the big-data
platform. In the great partner ecosystem that we've developed, Avnet is a
wonderful example of an organization that’s taken the powerful platform
and very quickly turned it into something that can help not only save
money but, as we just talked about, save lives, which I think is
fantastic.

Gardner: Patrick, as we wrap up, we
can certainly see many ways in which these technologies and this analysis
can be used immediately for some very significant benefits. But I’m
thinking that it also puts in place a tremendous foundation for what we
know is coming in the future -- more sensors, more information coming
from the patients, and more telemetry coming remotely, maybe
from their bodies, while they are out of the hospital.

In this industry, it’s very much life and death, versus just purely a financial incentive.

We know that mobile devices
are becoming more and more common, not only in patient environments,
but in the hospitals and the care-provider organizations. We know the
cloud and hybrid cloud services are becoming available and can
distribute this data and integrate it across so many more types of
processes.

It seems to me that you not only get a
benefit from getting to a big-data analysis capability now, but it puts
you in a position to be ready when we have more types of data -- more
speed, more end points, and, therefore, more requirements for what your
infrastructure, whether on premises or in a cloud, can do. Tell me a
little bit about what you think the Avnet and HP solution does to
set you up for these future trends.

Kelly: Technology today is just not where it needs to be, especially in healthcare. An EKG
spits out 1,000 data points per second. There is no way, at this point,
without the right technology, that you can actually deal with that.
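
To put that EKG figure in perspective, a quick back-of-the-envelope calculation (assuming, hypothetically, 2-byte samples and continuous 24-hour monitoring, which the transcript does not specify):

```python
# Back-of-the-envelope volume for one continuous EKG feed.
# Assumptions (hypothetical): 1,000 samples/sec as quoted,
# 2 bytes per sample, monitoring around the clock.
SAMPLES_PER_SEC = 1_000
BYTES_PER_SAMPLE = 2

samples_per_day = SAMPLES_PER_SEC * 60 * 60 * 24       # 86,400,000 samples
mb_per_day = samples_per_day * BYTES_PER_SAMPLE / 1e6  # ~172.8 MB per patient
```

Even under these modest assumptions, a single monitored patient generates on the order of hundreds of megabytes a day, which is why Kelly argues the infrastructure has to scale ahead of the devices.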

If
we look to a future where providers do less monitoring -- less vitals
collection, fewer physicals -- and all of that is coming from your mobile
device and from intelligent machines, there really needs to be
an infrastructure in place to deal with that.

I spent a
lot of time working with Vertica even before Avnet. Vertica, Hadoop,
and leveraging Autonomy in the area of unstructured data are technologies
that are going to allow the scalability and growth that’s going to be
necessary to leverage the data, make it an asset and
much less a challenge, and allow us to transform healthcare.

The
key to that is unlocking this tremendous trove of data. In this
industry, as you guys have said, it’s very much life and death, versus
just purely a financial incentive.

Targeting big data

Muller:
I might jump in on that as well, Dana. This is an important point that
we can’t lose sight of as well. As I said when you and I hosted the previous show, big data is also a big target.

It's
not just a governance issue; it's a question of morals and making
sure that we are doing the right thing by the people who are entrusting
us not just with their physical care, but with how they present
in society. Medical information can be sensitive when available not just
to criminals, but even to prospective employers, members of the family,
and others.

The other thing we need to be mindful of
is that we've got to not just collect the big data, but secure
it. We've got to be really mindful of who’s accessing what, when they’re
accessing it, whether they’re accessing it appropriately, and whether they’ve
done something like taking a copy or moving it elsewhere that could indicate
malicious intent.

It's also critical we think about big data in the context of health from a 360-degree perspective.

Kelly:
That’s a great point. And to step back a little bit on that, one of the
things that brings me a little comfort is that there are some
very clear guidelines, in the form of HIPAA, around how this data is
managed, and we look at it from the standpoint of baking security in,
in everything from encryption to auditability.

But
it’s also about training the staff working in these environments and making
sure that all of that training is in place to ensure the safety of
that data. One of the things that always leaves me scratching my head is
that I can go down the street to the grocery store and buy a bunch of
stuff, and by the time I get to the register, they seem to know more about me
than the hospital does when I go to the hospital.

That’s
one of the shocking things that makes you say you can’t wait until big
data gets here. I have a little comfort, too, because there are at least
laws in place to try to corral that data and make sure everyone is using
it correctly.

Muller: Thank you for having me back on the show again, Dana. I really love being here.

Gardner:
Of course. And a thank you also to the supporter of this series, HP
Software, and a reminder to our audience to carry on the dialogue with
Paul Muller through the Discover group on LinkedIn. We've been having a
discussion about how big data and healthcare are intersecting, and how
there’s a huge opportunity for far greater insight and intelligence into
how healthcare providers are managing their patients’ care, the cost,
and ultimately the outcomes.

And I’d also like to
remind you that you can access this, and other episodes of the HP
Discover podcast series on iTunes under BriefingsDirect.

And,
of course, a big thank you to our guest. We’ve been talking with
Patrick Kelly, Senior Practice Manager at the Avnet Services Healthcare
Practice. Thanks so much, Patrick.

Kelly: Thank you, guys.

Gardner:
This is Dana Gardner, Principal Analyst at Interarbor Solutions, your
co-host for this ongoing series. And lastly, a big thank you to our
audience for joining this HP Discover discussion, and a reminder to come
back next time.

Transcript
of a BriefingsDirect on the need to tap the potential of big data to
improve healthcare delivery and how the technology to do that is
currently lagging. Copyright Interarbor Solutions, LLC, 2005-2013. All
rights reserved.

Today, we present a sponsored podcast discussion on how the recent and rapid evolution of mobile and client management requirements has caused considerable complexity and confusion.

We’ll
examine how incomplete solutions and a lack of a clear pan-client
strategy have hampered the move to broader mobile support at
enterprises and mid-market companies alike. This state of muddled
direction has put IT in a bind, while frustrating users who are eager to
gain greater productivity and flexibility in their work habits, and device choice.

To
share his insights on how to better prepare for a mobile-enablement
future that quickly complements other IT imperatives such as cloud, big data, and even more efficient data centers, we’re pleased to welcome our special guest, Tom Kendra, Vice President and General Manager, Systems Management at Dell Software. [Disclosure: Dell is a sponsor of BriefingsDirect podcasts.]

Welcome, Tom. How are you?

Tom Kendra: Hey, Dana. I am doing very well, and with that intro, it sounds like you’ve pretty much got the answers, my friend.

Gardner: Well, we have the questions, Tom. The answers are what people are looking for.

Kendra:
I think that you’ve laid it out quite well in your opening comments.
There is an enormous amount of conversation in this area and it’s moving
very, very rapidly. Similar to many of your listeners, I imagine, the
number of invitations we get to attend conferences on mobility or bring your own device (BYOD) is off the charts.

Every
day my inbox is filled with new invites. So there’s a lot of
conversation around it. Part of that, Dana, is around the fact that this is an evolving space. There are a lot of moving parts, and hopefully,
in the next few minutes, we’ll be able to dive into some of those.

Gardner:
I suppose, Tom, looking at this from a historical perspective, people
have been dealing with a fast-moving client environment for decades.
Things have changed rapidly with the client. We went through the Web
transition and client-server. We’ve seen all kinds of different ways of getting apps to devices. It’s always been a fast-moving target.

Kendra:
Our industry is characterized by speed and agility. Right now, the big
drivers causing the acceleration can be put into three categories: the
amount and type of data that’s available, all the different ways and
devices for accessing this data, as well as the evolving preferences and
policies for dictating who, what, and how data is shared.

For example, there are training videos, charts and graphs
versus just text, and the ability to combine these assets and deliver
them in a way that allows a front-line salesperson, a service-desk staffer, or anyone else in the corporate ecosystem to satisfy customer requests much more efficiently and rapidly.

The
second area is the number of devices we need to support. You touched on
this earlier. In yesterday’s world -- and yesterday was a very short
time ago -- mobility was all around the PC. Then, it was around a
corporate-issued device, most likely a business phone. Now, all of a
sudden, there are many, many, many more devices that corporations are
issuing as well as devices people are bringing into their work
environment at a rapid pace.

We’ve moved from laptops to smartphones that were corporate-issued to tablets. Soon, we’ll get more and more wearables in the environment and machine-to-machine
communications will become more prevalent. All of these essentially
create unprecedented opportunities, yet also complicate the problem.

The
third area that’s driving change at a much higher velocity is the
ever-evolving attitude about work and work-life balance. And, along with
that ... privacy. Employees want to use what they’re comfortable using at
work and they want to make sure their information and privacy rights are
understood and protected. These three items are really driving the
acceleration.

Gardner: And the response to this complexity so far, Tom, has been a mix of suites and mobile device management (MDM)
approaches, trying to have multiple paths to these devices and
supporting multiple types of infrastructure behind that. Why have these
not yet reached a point where enterprises are comfortable? Why have we
not yet solved the problem of how to do this well?

Kendra:
When you think about all the different requirements, you realize there
are many ways to achieve the objectives. You might postulate that, in
certain industries, there are regulatory requirements that somewhat
dictate a solution. So a lot of organizations in those industries move
down one path. In industries where you don’t have quite the same
regulatory environment, you might have more flexibility to choose yet
another path.

The range of available options is wide,
and many organizations have experimented with numerous approaches. Now,
we’ve gotten to the point where we have the unique opportunity -- today
and over the next couple of years -- to think about how we consolidate
these approaches into a more integrated, holistic mobility solution that
elevates data security and mobile workforce productivity.

None
of them are inherently good or bad. They all serve a purpose. We have
to ask, “How do I preserve the uniqueness of what those different
approaches offer, while bringing together the similarities?”

More efficient

How
can you take advantage of similarities, such as the definition of roles
or which roles within the organization have access to what types of
data? The commonalities may be contextual in the sense that I’m going to
provide this kind of data access if you are in these kinds of locations
on these kinds of devices. Those things we could probably pull together
and manage in a more efficient way.

But we still want
to give companies the flexibility to determine what it means to support
different form factors, which means you need to understand the
characteristics of a wearable device versus a smartphone or an iPad.

I
also need to understand the different use cases that are most prevalent
in my organization. If I’m a factory worker, for example, it may be
better to have a wearable in the future, rather than a tablet. In the
medical field, however, tablets are probably preferred over wearables
because of the need to enter, modify and view electronic medical records. So there are different tradeoffs, and we want to be able to support all of them.

Gardner:
Looking again at the historical perspective, in the past when IT was
faced with a complexity -- too many moving parts, too many variables -- they
could walk in and say, “Here’s the solution. This is the box we’ve put
around it. You have to use it this way. That may cause you some
frustration, but it will solve the bigger problem.” And they could get
away with that.

Today, that’s really no longer the case. There’s shadow IT. There’s consumerization of IT.
There are people using cloud services of their own volition without
even going through any of the lines of business. It's right down to the
individual user. How does IT now find a way to get some control, get the
needed enterprise requirements met, but recognize that their ability to
dictate terms is less than it used to be?

Kendra: You’re bringing up a very
big issue. Companies today are getting a lot of pressure from
individuals bringing in their own technology. One of the case studies
you and I have been following for many months is Green Clinic Health System,
a physician-owned community healthcare organization in Louisiana. As
you know, Jason Thomas, the CIO and IT Director, has been very open
about discussing their progress -- and the many challenges --
encountered on their BYOD journey.

As part of Green
Clinic’s goal to ensure excellent patient care, the 50 physicians
started bringing in different technologies, including tablets and
smartphones, and then asked IT to support them. This is a great example
of what happens when major organizational stakeholders -- Green Clinic’s
physicians, in this case -- make technology selections to deliver
better service. With Green Clinic, this meant giving doctors and
clinicians anytime, anywhere access to highly sensitive patient
information on any Internet-connected device without compromising
security or HIPAA compliance requirements.

In other kinds of businesses, similar selection processes are underway as line-of-business
owners are coming forward to request that different employees or
organizational groups have access to information from a multitude of
devices. Now, IT has to figure out how to put the security in place to
make sure corporate information is protected while still providing the
flexibility for users to do their jobs using preferred devices.

Shadow
IT often emerges in scenarios where IT puts too many restrictions on
device choice, which leads line-of-business owners and their
constituents to seek workarounds. As we all know, this can open the door
to all sorts of security risks. When we think about the Green Clinic
example, you can see that Jason Thomas strives to be as flexible as
possible in supporting preferred devices while taking all the necessary
precautions to protect patient privacy and HIPAA regulations.

Similar shift

Gardner:
When we think about how IT needs to approach this differently -- perhaps
embracing and extending what's going on, while also being mindful of
those important compliance risk and governance issues -- we’re seeing a
similar shift from the IT vendors.

I think there’s such a
large opportunity in the market for mobile, for the modern data center, for
the management of the data and the apps out to these devices, that we
are seeing vendor
models shifting, and we’re seeing acquisitions happening.

What's different this time from the vendor
perspective? When you’re trying to bring out a solution for IT operators,
you don’t have the same ability to just plop down a product and say,
“Here’s what you do. Here’s how you buy it.” Is this something that’s
closer to an ecosystem or solution type of approach?

Kendra:
An excellent point again. The types of solutions Dell is bringing to
the market embrace what’s needed today while being flexible enough to
accommodate future applications and evolving data access needs.

The
goal is to leverage customers’ existing investments in their current
infrastructures and find ways to build and expand on those with
foundational elements that can scale easily as needs dictate. You can
imagine a scenario in which an IT shop is not going to have the
resources, especially in the mid-market, to embrace multiple ways of
managing, securing, granting access, or all of these things.

The industry has to move from a
position of providing a series of point-solutions to guiding and leading
with a strategy for pulling all these things together. Again, it comes
down to giving companies a plan for the future that keeps pace with
their emerging requirements, accommodates existing skill sets and grows
with them as mobility becomes more ingrained in their ways of doing
business. That’s the game -- and that’s the hard part.

We were at MobileCON
two months ago in San Jose and we spoke about how companies need to
think through this as they move forward. There are a couple of important
points we think need to be taken into consideration. First of all, it
is not just a line-of-business, IT, legal, security, or HR discussion.
It's getting all those teams together to think about their current and
future requirements. These conversations are critical and they need to
happen in context with what’s happening across the business while taking
into account the intersections and correlations with the various
stakeholders.

Line of business has to step forward and
say, “This is what I think allows me to drive customer value. This is
what I think I need to do.” HR needs to think about it and have a say in
giving employees what they need to achieve ideal work-life balance
while ensuring that policies address the impact on current and future
employees, contractors and consultants.

And IT needs to
say, “Here is how I have to leverage the investments we’re making.”
That conversation has to happen, and it happens in some organizations at
a much more rapid rate than others.

Long-term affair

Gardner:
That’s why I think this is easily going to be a three- to five-year affair. Perhaps it will be longer, because we’re not just
talking about plopping in a mobile device management capability. We’re
really talking about rethinking processes, business models,
productivity, and how you acquire working skills. We’re no longer just
doing word processing instead of using typewriters. We’re not just
repaving cow paths. We’re charting something quite new.

There is that interrelationship between the technology capabilities and
the work. I think that’s something that hasn’t been thought out.
Companies were perhaps thinking, “We'll just add mobile devices onto the
roster of things that we support.” But that’s probably not enough. How
does the vision from that aspect work, when you try to do both a
technology shift and a business transformation?

Kendra:
You’ve hit again on a really important point. You used the term “plop
in an MDM solution.” It's important to understand that the efforts and
the initiatives that have taken place have all been really valuable.
We’ve learned a lot. The issue is, as you are talking about, how to
evolve this strategy and why.

Equally important is
having an understanding of the business transformation that takes place
when you put all these elements together -- it’s much more far-reaching
than simply “plopping” in a point solution for a particular aspect.

In
yesterday's world, I might have had the right or ability to wipe entire
devices. Let’s look at the corporate-issued device scenario. The
company owns the device and therefore owns the data that resides or is
accessed on that device. Wiping the device would be entirely within my
domain or purview. But in a BYOD environment, I’m not going to be able
to wipe a device. So, I have to think about things much differently than
I did before.

As companies evolve their own mobility
strategies, it’s important to leverage their learnings, while remaining
focused on enhancing their users’ experiences and not sacrificing them.
That’s why some of the research we’ve done suggests there is a very high
reconsideration rate in terms of people and their current mobility
solutions.

They’ve tried various approaches and point
solutions and some worked out, but others have found these solutions
lacking, which has caused gaps in usability, user adoption,
and manageability. Our goal is to address and close those gaps.

Gardner:
Let's get to what needs to happen. It
seems to me that containerization has come to the fore, a way of
accessing different types of applications, acquiring those applications
perhaps on the fly, rather than rolled out for the entire populace of
the workforce over time. Tell us a little bit more about how you see
this working better, moving toward a more supported, agile,
business-friendly and user-productivity vision or future for mobility.

Kendra:
Giving users the ability to acquire applications on the fly is hugely
important as users, based on their roles, need to have access to
applications and data, and they need to have it served up in a very
easy, user-friendly manner.

The crucial considerations
here are role-based, potentially even location-based. Do I really want
to allow the same kinds of access to information if I’m in a coffee
house in China as I do if I am in my own office? Does data need to be
resident on the device once I’m offline? Those are the kinds of
considerations we need to think about.

Seamless experience

What’s
needed to ensure a seamless offline experience is where the issue of
containerization arises. There are capabilities that enable users to
view and access information in a secure manner when they’re connected to
an Internet-enabled device.

But what happens when
those same users are offline? Secure container-based workspaces allow me
to take documents, data or other corporate information from that online
experience and have it accessible whether I’m on a plane, in a tunnel
or outside a wi-fi area.

The container provides a
protected place to store, view, manage and use that data. If I need to
wipe it later on, I can just wipe the information stored in the
container, not the entire device, which likely will have personal
information and other unrelated data. With the secure digital workspace,
it’s easy to restrict how corporate information is used, and policies
can be readily established to govern which data can go outside the
container or be used by other applications.

The industry is clearly moving in this direction, and it’s critical that we make it work across corporate applications.

Gardner:
If I hear you correctly, Tom, it sounds as if we’re going to be able to
bring down the right container, for the right device, at the right
time, for the right process and/or data or application activity. That’s
putting more onus on the data center, but that’s probably a good thing.
That gives IT the control that they want and need.

It
also seems to me that, when you have that flexibility on the device and
you can manage sessions and roles and permissions, this can be a cost
and productivity benefit to the operators of that data center. They can
start to do better data management, dedupe, reduce their storage costs,
and do backup and recovery with more of a holistic, agile or strategic
approach. They can also meter out the resources they need to support
these workloads with much greater efficiency, predict those workloads,
and then react to them very swiftly.

We’ve talked so
far about how difficult and tough this is. It sounds like if you
crack this nut properly, not only do you get that benefit of the user
experience and the mobility factor, but you can also do quite a bit of a
good IT blocking and tackling on the backend. Am I reading that
correctly or am I overstating that?

Kendra: I
think you’re absolutely on the money. Take us as individuals. You may
have a corporate-issued laptop. You might have a corporate-issued phone.
You also may have an iPad, a Dell tablet,
or another type of tablet at home. For me, it’s important to know what
Tom Kendra has access to across all of those devices in a very simple
manner.

I don’t want to set up a different approach
based on each individual device. I want to set up a way of viewing my
data, based on my role, permissions and work needs. Heretofore, it's
been largely device-centric and management-centric, as opposed to user
productivity role-centric.

Holistic manner

The
Dell position -- and where we see the industry going -- is
consolidating much of the management and security around those devices
in a holistic manner, so I can focus on what the individual needs. In
doing so, it’s much easier to serve the appropriate data access in a
fairly seamless manner. This approach rings true with many of our
customers who want to spend more resources on driving their businesses
and facilitating increased user productivity and fewer resources on
managing a myriad of systems.

Gardner: By bringing the point of management -- the point of
power, the point of control and enablement -- back into the data center, you’re also able to link up to your legacy assets much more easily than
if you had to somehow retrofit those legacy assets out to a specific
device platform or a device's format.

Kendra:
You’re hitting on the importance of flexibility. Earlier, we said the
user experience is a major driver along with ensuring flexibility for
both the employee and IT. Reducing risk exposure is another crucial
driver and by taking a more holistic approach to mobility enablement, we
can address policy enforcement based on roles across all those devices.
Not only does this lower exposure to risk, it elevates data security
since you’re addressing it from the user point of view instead of trying
to sync up three or four different devices with multiple user profiles.

Gardner:
And if I am thinking at that data center level, it will give me choices
on where and how I create that data center, where I locate it, how I
produce it, and how I host it. It opens up a lot more opportunity for
utilizing public cloud services, or a combination that best
suits my needs and that can shift and adapt over time.

Kendra:
It really does come down to freedom of choice, doesn’t it? The freedom
to use whatever device in whichever data center combination that makes
the most sense for the business is really what everyone is striving for.
Many of Dell’s customers are moving toward environments where they are
using both on-premise and off-premise compute resources. They think
about applications as, “I can serve them up from inside my company or I
can serve them up from outside my company.”

The
issue comes down to the fact that I want to integrate wherever
possible. I want to serve up the data and the applications when needed
and how needed, and I want to make sure that I have the appropriate
management and security controls over those things.

Gardner:
Okay, I think I have the vision much more clearly now. I expect we’re
going to be hearing more from Dell Software on ways to execute toward
that vision. But before we move on to some examples of how this works in
practice, why Dell? What is it about Dell now that you think puts you
all in a position to deliver the means to accomplish this vision?

Kendra:
Dell has relationships with millions of customers around the world.
We’re a very trusted brand, and companies are interested in what Dell
has to say. People are interested in where Dell is going. If you think
about the PC market, for example, Dell has about an 11.9 percent
worldwide market share. There are hundreds and hundreds of millions of
PCs used in the world today. I believe there were approximately 82
million PCs sold during the third quarter of 2013.

The
point here is that we have a natural entrée into this discussion and the
discussion goes like this: Dell has been a trusted supplier of hardware
and we’ve played an important role in helping you drive your business,
increase productivity and enable your people to do more, which has
produced some amazing business results. As you move into thinking about
the management of additional capabilities around mobile, Dell has
hardware and software that you should consider.

Now,
given that we’ve been a trusted supplier for a long time, when getting
into the discussion of our world-class technology around hardware,
software and services, most people are willing to listen. So we have a
natural advantage for getting into the conversation.

World-class technologies

Once
we’re in the conversation, we can highlight Dell’s world-class
technologies, including end-user computing, servers, storage,
networking, security, data protection, software, and services.

As
a trusted brand with world-class technologies and proven solutions,
Dell is ideally suited to help bring together the devices and underlying
security, encryption, and management technologies required to deliver a
unified mobile enablement solution. We can pull it all together and
deliver it to the mid-market probably better than anyone else.

So
the Dell advantages are numerous. In our announcements over the next
few months, you’ll see how we’re bringing these capabilities together
and making it easier for our customers to acquire and use them at a
lower cost and faster time to value.

Gardner:
One of the things that I'd like to do, Tom, is not just to tell how
things are, but to show. Do we have some examples of organizations --
you already mentioned one with the Green Clinic -- that have bitten the
bullet and recognized the strategic approach, the flexibility on the
client, leveraging containerization, retaining control and governance,
risk, and compliance requirements through IT, but giving those end-users
the power they want? What's it like when this actually works?

Kendra:
When it actually works, it's a beautiful thing. Let’s start there. We
work with customers around the world and, as you can imagine, given
people's desire for their own privacy, a lot of them don't want their
names used. But we’re working with a major North American bank that has
the problems that we have been discussing.

They
have 20,000-plus corporate-owned smartphones, growing to some 35,000 in
the next year. They have more than a thousand iPads in place, growing
rapidly. They have a desktop virtualization (VDI) solution, but the VDI solution, as we spoke about earlier, really doesn't support the offline experience that they need.

They
are trying to leverage an 850-person IT department that has worldwide
responsibilities, all the things that we spoke about earlier. And they
use technology from companies that haven’t evolved as quickly as they
should have. So they're wondering whether those companies are going to
be around in the future.

This is the classic case of,
“I have a lot of technology deployed. I need to move to a container
solution to support both online and offline experiences, and my IT
budget is being squeezed.” So how do you do this? It goes back to the
things we talked about.

First, I need to leverage what I
have. Second, I need to pick solutions that can support multiple
environments rather than a point solution for each environment. Third, I
need to think about the future, and in this case, that entails a rapid
explosion of mobile devices.

I need to mobilize rapidly
without compromising security or the user experience. The concept of an
integrated suite of policy and management capabilities is going to be
extremely important to my organization going forward.

Mobile wave

This reminds me of some information we reviewed from a Lopez Research
report. In their “Mobile Management: A Foundation for the New Mobile
Ecosystem,” Maribel Lopez shared that more than half the firms
interviewed as part of custom CIO research plan to mobile-enable
business apps and processes. The mobile wave is coming and it’s coming
fast.

This large financial institution fits that
profile. They're moving rapidly. They’re thinking about how to give
greater access to applications and data and they need streamlined ways
to accomplish that. It’s a typical customer scenario that we are seeing
these days.

Gardner: Tom, who gets to do this
faster, better, cheaper? Is it the large enterprise that's dragging a long
legacy and has a thousand IT people to either help them or hinder them
-- or the mid-size organization that can look to a myriad of sourcing
options and wants to get out of the data center or facilities business?
Is there some sort of a natural advantage, in some way -- a leapfrog
type of an effect -- for those mid-market organizations with this?

Kendra:
The mid-market has the advantage of not having giant deployments and
huge teams, which gives them a certain advantage in being able to move
fast and nimbly. On the flip side, the mid-market organization often is
resource-constrained in terms of budget and skills. Let’s face it, a
10-person IT shop will likely have deep skills in certain areas, but
otherwise has to rely on more generalized experience.

For
them, finding solutions that address multiple problems quickly is an
absolute imperative, so they can roll out simple solutions while
maximizing economies of scale to the fullest extent. That’s not to say
that large enterprises don’t have similar priorities, but they often
have complex legacy issues that exacerbate their challenges. Dell is equally
adept at helping those organizations work through those issues and
devise a plan for what to move, when and how without losing sight of
longer-term plans and business directions.

There
are advantages and disadvantages with each. Both need agile solutions
and want to leverage their resources to the fullest extent. Both are
striving to lower costs and eliminate risks. Both groups are interested
in very much the same things but often take different approaches to
achieving those goals.

Gardner: It certainly
sounds as if Dell is approaching this enterprise mobility manager market
with an aggressive perspective, recognizing a big opportunity in the
market and an opportunity that they are uniquely positioned to go at.
There’s not too much emphasis on the client alone and not just emphasis
on the data center. It really needs to be a bridging type of a value-add
these days. Can you tease us a little bit about some upcoming news?
What should we expect next?

Kendra: The
solutions we announced in April essentially laid out our vision of
Dell’s evolving mobility strategies. We talked about the need to
consolidate mobility management systems and streamline enablement. We
focused on the importance of leveraging world-class security, including
secure remote access and encryption. And the market has responded well
to Dell's point of view.

As we move forward, we have
the opportunity to get much more prescriptive in describing our unified
approach that consolidates the capabilities organizations need to ensure
secure control over their corporate data while still ensuring an
excellent user experience.

You’ll see more from us
detailing how those integrated solutions come together to deliver fast
time to value. You'll also see different delivery vehicles, giving our
customers the flexibility to choose from on-premise, software-as-a-service (SaaS)-based, or cloud-based approaches. You'll see additional device support, and you'll see containerization.

Leverage advantages

We
plan to leverage our advantages, our best-in-class capabilities around
security, encryption, device management; this common functionality
approach. We plan to leverage all of that in upcoming announcements.

As
we take the analyst community through our end-to-end mobile/BYOD
enablement plans, we’ve gotten high marks for our approach and
direction. Our discussions involving Dell’s broad OS support, embedded
security, unified management and proven customer relationship all have
been well received.

Our next step is to make sure that,
as we announce and deliver in the coming months, customers absolutely
understand what we have and where we're going. We think they're going to be
very excited about it. We think we're in the sweet spot of the
mid-market and the upper mid-market in terms of what solutions they need
to ease their mobile enablement objectives.

We also
believe we can provide a unique point-of-view and compelling technology
roadmaps for those very large customers who may have a longer journey
in their deployments or rollout.

We're very excited
about what we're doing. The specifics of what we're doing play out in
early December, January, and beyond. You'll see a rolling thunder of
announcements from Dell, much like we did in April. We’ll lay out the
solutions. We’ll talk about how these products come together and we’ll
deliver.

Gardner: Very good. I’m afraid we'll
have to leave it there. You have been listening to a sponsored
BriefingsDirect podcast discussion on how the recent rapid evolution of
mobile and client management requirements and approaches have caused
complexity and confusion, but we have now heard Dell's vision for how
mobile enablement should be able to quickly complement other IT
imperatives and allow the IT department to do what it does best and for
end-users to innovate and do what they do best as well.

So
a big thank you to our guest, Tom Kendra, Vice President and General
Manager, Systems Management at Dell Software. Thanks so much, Tom.

Kendra: Thank you, Dana.

Gardner:
And also a big thank you to our audience for joining this insightful
discussion. This is Dana Gardner, Principal Analyst at Interarbor
Solutions, thanks again for listening, and come back next time.

Transcript
of a Briefings Direct podcast on the new landscape sculpted by the
increasing use of mobile and BYOD, and how Dell is helping companies
navigate that terrain. Copyright Interarbor Solutions, LLC, 2005-2013.
All rights reserved.

Today, we present a sponsored podcast discussion on the top new business imperatives: Creating big-data capabilities and becoming a data-driven organization.

We’ll examine how business-intelligence (BI)
trends are requiring access and automation across data flows from a variety of sources, formats, and from many
business applications.

Our discussion focuses on
ways that enterprises are effectively harvesting data in all its forms,
and creating integration that fosters better use of data throughout the business process lifecycle.

Here now to share their
insights into using data strategically by exploiting all of the data from
all of the applications across business ecosystems, we’re joined
by Jon Petrucelli, Senior Director of Hitachi Solution Dynamics, CRM and Marketing Practice, based in Austin, Texas. Welcome, Jon.

Gardner:
Betsy, let me start with you. We know that more businesses are trying
to leverage and exploit their data, helping them to become more agile,
predictive, and efficient. What's been holding them
back from gaining access to the most relevant data? What's the roadblock
here?

Bilhorn: There are a couple of things. One is the explosion in the different types and kinds of data. Then, you start mixing that with legacy systems
that have always been somewhat difficult to get to. Bringing those all
together and making sense of that are the two biggest ones. Those have
been around for a long, long time.

That problem is getting exponentially harder, given
the variety of those data sources, and then all the different ways to
get into those. It’s just trying to put all that together. It just gets
worse and worse. When most people look at it today, it almost seems
somewhat insurmountable. Where do you even start?

Gardner:
Jon, how about your customers at Hitachi? What are you seeing in terms
of the struggle that they're facing in getting better data for better
intelligence and analytics?

Legacy systems

Petrucelli:
We work with a lot of large enterprise, global-type customers. To build on
what Betsy said, they have a lot of legacy systems. There's a lot of
data captured inside these legacy systems, and those systems
were not designed with open architectures for sharing their data with
other systems.

When you’re dealing with modern systems, it's definitely getting easier. When you deal with middleware
software like Scribe, especially with Scribe Online, it gets much
easier. But the biggest thing that we encounter in the field with these
larger companies is just a lack of understanding of the modern
middleware and integration and lack of understanding of what the
business needs. Does it really need real-time integration?

Some of our customers definitely have a
good understanding of what the business wants and what their customers
want, but usually the evaluator, decision-maker, or architect doesn’t have a strong background in data integration.

It's
really a people issue. It's an educational issue of helping them
understand that this isn't as hard as they think it is. Let's scope it
down. Let's understand what the business really needs. Usually, that
becomes something a lot more realistic, pragmatic, and easier to do than
they originally anticipated going into the project.

In
the last 5 to 10 years, we've seen data integration get much easier to
do, and a lot of people just don’t understand that yet. That’s the lack
of understanding and lack of education around data integration and how
to exploit this big-data
proliferation that’s happening. A lot of users don't quite understand
how to do that, and that’s the biggest challenge. It’s the people side
of it. That’s the biggest challenge for us.

Gardner:
Rick Percuoco at Trillium, tell us what you are seeing when it comes to
the impetus for doing data integration. Perhaps in the past, folks saw this as too
daunting and complex or involved skill sets that they didn't
have. But it seems now that we have a rationale for wanting to have a
much better handle on as much data as possible. What's driving the need
for this?

Percuoco: I would definitely agree
with what Betsy and Jon said. In dealing with that kind of client base, I
can see that a lot of the principles and a lot of the projects are in
their infancy, even with some of the senior architects in the business.
Certain companies, by their nature, deal with volume data. Telecom
providers or credit card companies are being forced into building these
large data repositories because the current business needs would
support that anyway.

So they’re really at the forefront of most of these efforts.
What we have are large data-migration projects. There are disparate
sources within the companies, siloed bits of information that they want
to put into one big-data repository.

Mostly, it's used from an analytics or BI standpoint, because now you have the capability of using big-data SQL
engines to link and join across disparate sources. You can ask
questions and mine information that you never
could before.
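The kind of cross-source join Rick describes can be sketched with an in-memory SQL engine. This is a minimal illustration of the idea, not any vendor's product; the table names, columns, and data are hypothetical.

```python
import sqlite3

# Two formerly siloed sources loaded into one queryable repository
# (table and column names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE crm_accounts (account_id TEXT, name TEXT)")
conn.execute("CREATE TABLE billing (account_id TEXT, amount REAL)")
conn.executemany("INSERT INTO crm_accounts VALUES (?, ?)",
                 [("A1", "Acme"), ("A2", "Globex")])
conn.executemany("INSERT INTO billing VALUES (?, ?)",
                 [("A1", 120.0), ("A1", 80.0), ("A2", 45.0)])

# A join across the disparate sources answers a question neither
# source could answer alone: total billing per account name.
rows = conn.execute("""
    SELECT c.name, SUM(b.amount)
    FROM crm_accounts c JOIN billing b ON c.account_id = b.account_id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # → [('Acme', 200.0), ('Globex', 45.0)]
```

Once both silos sit behind one SQL engine, the join is a single query rather than a custom data-migration project.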

The aspect of extract, transform, load (ETL) will definitely be affected by the large data volumes, as you can't move the data like you used to in the past. Also, governance
is becoming a stronger force within companies, because as you load many
sources of data into one repository, it’s easier to have some kind of
governance capabilities around that.

Higher scales

Gardner:
Betsy, it sounds as if the technology has moved in
such a way that the big-data analytics platform, the platform for doing analysis,
has become much more capable of dealing at higher scales, faster speeds, and
lower costs. But we still come back to that same problem of getting to
the data, putting it in a format that can be used, directing it,
managing that flow, automating it, and then, of course, dealing with the
compliance, governance, risk, and security issues.

Is
that the correct read on this, that we've been able to move quite well in
terms of the analytics engine capability, but we're still struggling
with getting the fuel to that engine?

Bilhorn:
I would absolutely agree with that. When you look at the trends out
there, when we talk about big data, big analytics and all of that,
that's moved much faster than capturing those data sources and getting
them there. Again, it goes back to all of these sources Jon was
referring to. Some of these systems that we want to get the data from
were never built to be open. So there's a lot of work just to get the data
out of there.

The other thing a lot of people like to talk about is the application programming interface (API) economy:
"We will have an API, and we can get at all this
great stuff through web services." But what we’ve seen in building a platform ourselves and
having that connectivity is that not all of those APIs are created equal.

The
vendors who are supplying this data, or these data services,
are kind of shooting themselves in the foot and making it difficult for
the customer to consume them, because the APIs are poorly written and
very hard to understand, or they simply don’t have the performance to
even get the data out of the system.

On
top of that, you have other vendors who have certain types of terms of
service, where they cut off the service or they may charge you for it.
So when they talk about how great it is that they can do all these
analytics, in getting the data in there, there are just so many
showstoppers on a number of fronts. It's very, very challenging.

Gardner: Let's think about what we are doing in terms of expanding the requirements for business activities and values here. Customer relationship management (CRM),
I imagine, paved the way in trying to get a single view of the
customer across many different types of data and activities. But now, we’re
pushing the envelope to a single view of the patient across multiple
healthcare organizations or a single view of a process that has a cloud
part, an on-premises part, and an ecosystem supply-chain part.

It
seems as if we’ve moved into more complexity here. Jon Petrucelli, how are the
systems keeping up with these complex demands, expanding
concentric circles of inclusion, if you will, when it comes to a single
view of an object, individual, or process?

Petrucelli:
That’s a huge challenge. Some people might call it data taxonomy, data
structuring, or data hygiene, but you have to be able to define a
unique identifier for your primary object in the data. That’s what we
see. Sometimes, businesses have a hard time deciding on that, but
usually it jumps out at you.

The only
things that will transact business with you in the world are people or
organizations, generally speaking. A dog, a tree, or an asset is not
going to actually transact business with you.

Master key

We
have specialists on our team who do this taxonomy, architects who
help organizations figure out what a master key is -- a master globally
unique identifier for an object. Then, you come up with a schema that
allows you either to use an existing identifier or to concatenate a bunch
of the data together to create one. That becomes the way you relate
all of the objects to each other; it sets the foreign key that they
hook up to.
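The concatenation approach Jon describes can be sketched in a few lines. This is a hypothetical illustration: the identifying fields chosen here (`org_name`, `email`, `postal_code`) and the hashing step are assumptions; real taxonomy work decides which fields form the key.

```python
import hashlib

def master_key(record: dict) -> str:
    """Concatenate stable identifying fields into one globally unique
    identifier, then hash it so every system can store a uniform key.
    The fields used here are hypothetical examples."""
    basis = "|".join([
        record.get("org_name", "").strip().lower(),
        record.get("email", "").strip().lower(),
        record.get("postal_code", "").strip(),
    ])
    return hashlib.sha1(basis.encode("utf-8")).hexdigest()[:16]

# The same customer seen by two different systems resolves to the same
# key, which downstream tables can then use as their foreign key.
crm_row     = {"org_name": "Acme Corp", "email": "JON@ACME.COM", "postal_code": "73102"}
billing_row = {"org_name": "acme corp ", "email": "jon@acme.com", "postal_code": "73102"}
assert master_key(crm_row) == master_key(billing_row)
```

Normalizing (trim, lowercase) before concatenating is what lets records from differently behaved systems land on the same key.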

Gardner: I think that helps
illustrate how far you can go with this. It seems, though, as if you
have to get your own house in order -- your own legacy applications, your
own capabilities -- before you can start to expand and gain some of these
competitive advantages. It seems that the more data you
can bring to bear on your analytics, the more predictive, the more
precise, and the more advantageous your business decisions will be.

I
think we understand the complexity, but let's take it back inside the
organization. Rick, tell us first about what Trillium Software does and
how you're seeing organizations take the steps to begin to get the
skills, expertise, and culture to make data integration and data
lifecycle management happen better.

Percuoco:
Trillium Software has always been a data-quality company. We have a
fairly mature and diverse platform for data that you push through.
For analytics, for risk and compliance, or for anything where
you need to use your data to calculate risk ratios
or build the models by which you run your business, the quality of your data is
very, very important.

With
the advent of big data and the volume of more and varied unstructured
data, the problem of data quality is on steroids now.

If
you’re using data that comes in from multiple channels to make
decisions in your business, then obviously data quality -- making that
data as accurate as it can be by matching it against structured
sources -- makes a huge difference in whether you'll be making the
right decisions or not.

You have a quality
issue with your data. If anybody who works in any company is really
honest with themselves and with the company, they see that the integrity
of the data is a huge issue.

As the sources of data become more varied and they come from unstructured data sources like social media,
the quality of the data is even more at risk and in question. There
needs to be some kind of platform that can filter out the chatter in
social media and the things that aren't important from a business
perspective.

Gardner: Betsy Bilhorn, tell us about
Scribe Software and how it fits with what Trillium and Hitachi Solutions are doing for data management.

Bilhorn:
We look at ourselves as the proverbial PVC pipe that brings
data around to various applications and the business processes and
analytics. Where folks like Hitachi leverage our platform is in being able to make that process as easy
and as painless as possible.

We want people to get
value out of their data, increase the pace of their business, and
increase the value that they’re getting out of their business. That
shouldn’t be a multi-year project. It shouldn’t be something that you’re
tearing your hair out over and running screaming off a bridge.

As easy as possible

Our
goal here at Scribe is to make that data integration happen and get data
where it needs to go, to the right person, at the right time, as easily
and simply as possible for companies like Hitachi and their clients.

Working
with Trillium, one of the great things about that partnership is that it
addresses the obvious problem of garbage in/garbage out. Trillium
provides the platform by which not only can you get your data where
you need it to go, but you can also have it cleaned and
deduped. You can have better-quality data as it's moving around in
your business. When you look at those three aspects together, that’s
where Scribe sits in the middle.
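The dedup step Betsy mentions can be sketched simply: normalize each record to a canonical form, then keep only the first occurrence. This is a minimal illustration of the technique, not Trillium's product; the fields and records are hypothetical.

```python
def normalize(record):
    # Canonical form used only for duplicate detection.
    return (record["name"].strip().lower(), record["email"].strip().lower())

def dedupe(records):
    """Keep the first occurrence of each normalized record; later
    duplicates are dropped before the data moves to the next system."""
    seen, clean = set(), []
    for r in records:
        key = normalize(r)
        if key not in seen:
            seen.add(key)
            clean.append(r)
    return clean

raw = [
    {"name": "Jon Petrucelli", "email": "jon@example.com"},
    {"name": "jon petrucelli ", "email": "JON@EXAMPLE.COM"},  # same person
    {"name": "Betsy Bilhorn", "email": "betsy@example.com"},
]
print(len(dedupe(raw)))  # → 2
```

Real data-quality platforms use far richer matching (fuzzy names, address standardization), but the pattern -- normalize, then compare -- is the same.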

Petrucelli: We used to do custom software integration. With a lot of our customers, we see a lot of custom .NET
code or other types of codesets, Java for example, that do the
integration. They used to do that, and we still see some bigger
organizations that are stuck on that stuff. That’s a way to paint
yourself into a corner and make yourself captive to some developer.

We
highly recommend that people move away from that and go to a
platform-based middleware application like Scribe. Scribe is our
preferred middleware platform, because that makes it much more
sustainable and changeable as you move forward. Inevitably, in
integration, someone is going to want to change something later on.

When
you have a custom-code integration, someone has to actually crack open
that code, take it offline to make a change, and then redeploy the code
-- and it's all just pure spaghetti code.

With
a platform like Scribe, it's very easy to pick up industry-standard
training available online. You’re not held hostage anymore. It’s a graphical user interface (GUI) -- literally drag-and-drop mappings and interlock points. That’s a really amazing capability in the Scribe Online
service. Even children can do an integration. It’s like the teaching technique
developed at Harvard or MIT for putting puzzle pieces
together: if it doesn’t work, the puzzle pieces
don’t fit.

They’ve done a really amazing job of making
integration for the rest of us, not just for developers. We highly recommend
that people take a look at it, because it brings the power back to
the business and takes it away from just one developer, a small
development shop, or an outsourced developer.

That’s
one thing. The other thing I want to add is that we see integration as
critical to the success of projects, to high levels of adoption
and return on investment (ROI).
Adoption by the users, and then ultimately ROI for the business, is
important, because integration is like gas in a sports car. Without
the gas, it's not going to go.

We want to give them one
user experience, one user interface, to make users productive -- especially
sales reps in the CRM world and customer-service reps. You don’t want
them tabbing between a bunch of different systems. So we bring them
into one interface, and with a platform like Microsoft CRM, they can use their interface of choice.

They can move from a desktop, to a laptop, to a tablet, to a mobile device,
and they’re seeing one version of the truth, because they’re all
looking through windows into the same realm. And what is
tunneled into that realm comes through pipes -- and those pipes are Scribe.

Built-in integration

What
we do for a lot of customers is intentionally build integration in
using Scribe, because we know that if we can take them down from five
different interfaces to one, they get a 360-degree view of the
customer who’s calling them or whom they’re about to call on.

They’re
really going to like that. Their adoption is going to be higher and
their productivity is going to be higher. If you can raise the
productivity of the users, you can raise the top line of the company
when you’re talking about a sales organization. So, integration is the
key to driving high levels of adoption, high ROI, and high
levels of productivity.

Gardner: Let's talk
about some examples of how organizations are using these approaches,
tools, methods, and technologies to improve their business and their
data value. I know that you can’t always name these organizations, but
let's hear a few examples of either named or non-named organizations
that are doing this well, doing this correctly, and what it gets for
them.

Petrucelli: One that pops to mind, because I just was recently dealing with them, is the Oklahoma City Thunder
NBA basketball team. I know that they’re not a humongous enterprise
account, but sometimes it's hard for people to understand what's going
on inside an enterprise account.

Most people follow and are aware of sports. They have an understanding of
buying a ticket, being a season ticket holder, and what those concepts
are. So it's a very universal language.

The Thunder had a
problem where they were using a ticketing system that would sell the
tickets, but they had very little CRM capabilities. All this ticketing
was done at the industry standard for ticketing and that was great, but
there was no way to track, for example, somebody's preferences. You’d
have this record of Jon Petrucelli who buys season tickets and comes to
certain games. But that’s it; that’s all you’d have.

They couldn’t track who my favorite player was, how many kids I have, if I was married, where I live, what my blog is, what my Facebook
profile is. People are very passionate about their sports team. They want to
really be associated with them, and they want to be connected with those
people. And the sports teams really want to do that, too.

So we had a great project, an award-winning project. It's won a Gartner
award and Microsoft awards. We helped the Oklahoma City Thunder to
leverage this great amount of rich interaction data, this transactional
data, the ticketing data about every seat they sat in, and every time
they bought.

Rich information

That’s
a cool record and that might be one line in the database. Around that
record, we’re now able to wrap all the rich information from the
internet. And that customer, that season ticket holder, wants to share information,
so they can have a much more personalized experience.

Without
Scribe and without integration, we couldn’t do that. With Scribe, we could easily
deploy Microsoft CRM and integrate it into the ticketing system, so all
this data was in one spot for the users. It was a true win-win-win,
because not only did the Oklahoma City Thunder have a much more
productive experience, but their season ticket account managers could
now call on someone and could see their preferences. They could see
everything they needed to track about them and see all of their
ticketing history in one place.

And they could see
whether they’re attending or not attending -- everything about what's
going on with that very high-value customer. So that’s a win for them.
They can deliver personalized service. On the other end of it, you have
the customer, the season ticket holder and they’re paying a lot of
money. For some of them, it’s a lifelong dream to have these tickets or
their family has passed them down. So this is a strong relationship.

Especially
in this day and age, people expect a personalized touch and a
personalized experience, and with integration, we were able to deliver
that. With Scribe and the integration with the ticketing system,
putting that all in Microsoft CRM, it's real-time, it's accessible,
and it's insightful.

It’s not just data anymore. It's
real-time insights coming out of the system. They could deliver a much
better user experience or customer experience, and they have been
benchmarked against the best customer organizations in the world. Of all professional
sports teams, the Oklahoma City Thunder are now rated as having the top
fan experience -- and it's directly relatable to the CRM platform and the
data being driven into it through integration.

Gardner:
Great. You can actually see where there is transformational benefit.
They’re not just iterative or nice to have. It really changes their
business in a major way. Rick Percuoco, any thoughts there at Trillium
Software of some examples that exemplify why these approaches are so
powerful?

Percuoco: I’ve seen a couple of pretty
interesting use cases. One of them is with one of our technical
partnerships. They also have a data platform, where they use a behavioral
account-churn model. It's very interesting in that they take multiple
feeds of different data, like social media data, call-center data, data
that was entered into a blog from a website. As Jon said, they create a
one-customer view of all of those disparate sources of data including
social media and then they map for different vertical industries
behavioral churn models.

In other words, before someone
churns their account or gets rid of their account within a particular
industry -- like insurance, for example -- what steps do they go through
before they churn their account? Do they send an e-mail to someone? Do
they call the call center? Do they send social media messages? Then,
through statistical analysis, they build these behavioral churn models.

They
put data through these models of transactional data, and when certain
accounts or transactional data fall out at certain parts, they match
that against the strategic client list and then decide what to do at the
different phases of the account churn model.

I've
heard of companies, large companies, saving as much as $100 million in
account churn by basically understanding what the clients are doing
through these behavioral churn models.
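A behavioral churn model of the kind Rick describes can be sketched as a weighted score over the precursor events he lists. This is a simplified, hypothetical illustration: the event names and weights here are assumptions; a real model would fit the weights with statistical analysis over historical accounts.

```python
# Hypothetical weights for churn-precursor events; statistical analysis
# over historical account behavior would fit these in practice.
CHURN_SIGNAL_WEIGHTS = {
    "complaint_email": 0.30,
    "call_center_contact": 0.25,
    "negative_social_post": 0.35,
    "missed_payment": 0.10,
}

def churn_risk(events):
    """Score an account from 0 to 1 based on the churn-precursor events
    observed; accounts above a threshold go to the retention team."""
    score = sum(CHURN_SIGNAL_WEIGHTS.get(e, 0.0) for e in events)
    return min(score, 1.0)

at_risk = churn_risk(["complaint_email", "call_center_contact", "negative_social_post"])
healthy = churn_risk(["call_center_contact"])
assert at_risk > 0.8 and healthy < 0.3
```

Matching high-scoring accounts against the strategic client list, as Rick describes, then determines what retention action to take at each phase.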

Sentiment analysis

Probably
the other most prevalent use case I've seen with our clients is
sentiment analysis. Most people are looking at social media data, seeing
what people are saying about them on social media channels, and then
using all different creative techniques to try and match those social
media personas to client lists within the company to see who is saying
what about them.

Sentiment analysis is probably the
biggest use case that I've seen, but the account churn with the
behavioral models was very, very interesting, and the platform was very
complex. On top, it had a predictive analytics engine with about 80
different modeling graphs, and it also had some data visualization tools. So it was very, very easy to create charts and graphs, and it was actually pretty impressive.

Gardner:
Betsy, do you have any examples that also illustrate what we're talking
about when it comes to innovation and value around data gathering,
analytics, and business innovation?

Bilhorn: I’m
going to put a little bit of a twist on that problem. We had a
recent customer, one of the top LED lighting franchisors in the
United States, with a bit of a different problem. They have
about 150 franchises out there, and they were all disconnected.

So,
in the central office, I can't see what my individual franchises are
doing and I can't do any kind of forecasting or business reporting to be
able to look at the health of all my franchises all over the country.
That was the problem.

The second problem was that they had decided to standardize on the NetSuite
platform, and they wanted all of their franchises to use it.
For the individual franchise owner, though, NetSuite was a little too
heavy, and they said overwhelmingly that they wanted to have QuickBooks.

This
customer came to us and said, “We have a problem here. We can't find
anybody to integrate QuickBooks to our central CRM system and we can't
report. We’re just completely flying blind here. What can you do for
us?”

Via integration, we were able to satisfy that
customer requirement. Their franchises can use QuickBooks, which was
easy for them, and then, by synchronizing information
back from all of these franchises into the central CRM, they were able to do
all kinds of analytics, reporting, and dashboarding on the health of
the whole business.

The other side benefit, which also
makes them very competitive, is that they’re able to add franchises
very, very quickly. They can have their entire IT systems up and running
in 30 minutes and it's all integrated. So the franchisee is ready to
go. They have everything there. They can use a system that’s easy for
them to use, and this company is able to have them up and is getting
their data right away.

Consistency and quality

So
that’s a little bit different. It's not social big data, but it’s a
problem that a lot of businesses face: how do I even get these systems
connected so I can run my business? This rapid, repeatable model for
this particular business is pretty new. In the past, we’ve seen a lot
of people try to wire things up with custom code, or everything is ad
hoc. They’re able to stand up full IT systems in 30 minutes, every
single time, over and over again, with a high level of consistency and
quality.

Gardner: Well we have to begin to wrap
it up, but I wanted to take a gauge of where we are on this. It seems to
me that we’re just scratching the surface. It’s the opening innings, if
you will.

Will
we start getting these data visualizations down to mobile devices, or
have people inputting more information about themselves, their devices, or the
internet of things? Let's start with you, Jon. Where are we on the
trajectory of where this can go?

Petrucelli: We’re working on some projects right now with geolocation,
geofencing, and geosensing. When a user with a mobile device comes
within range of a certain store -- provided they have downloaded the app
on their smartphone and opted in -- it will serve them up
special offers to try to
pull them into the store, the same way in which, if you’re walking by a
store, somebody might say, “Hey, Jon.” It knows who I am and knows my
personalization, and when I come into range, it knows my location.
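The "comes within range" check at the heart of this is a geofence: compute the distance between the device and the store, and trigger when it falls under a radius. Here is a minimal sketch using the standard haversine formula; the coordinates and the 150-meter radius are hypothetical.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(user, store, radius_m=150.0):
    """True when an opted-in user's device comes within range of a store."""
    return haversine_m(user[0], user[1], store[0], store[1]) <= radius_m

store  = (35.4634, -97.5151)   # hypothetical store location
nearby = (35.4636, -97.5153)   # user walking past, tens of meters away
far    = (35.5000, -97.5151)   # user across town, kilometers away
assert in_geofence(nearby, store) and not in_geofence(far, store)
```

In a real app, the offer service would run this check against the user's opted-in location updates and serve the personalized offer when it returns true.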

This
could be somebody who has an affinity card with a certain retailer, or it
could be a sports team where, inside the venue, the organization knows
what the fan's preferences are and puts exactly the
right offer in front of the right person, at the right time, in the
right context, and with the right personalization.

We
see some organizations moving to that level of integration. With all of
the available technology, with the electronic wallets, now with Google Glass,
and with smart watches, there is a lot of space to go. I don’t know if
it's really relevant to this, but there is a lot of space now.

We’re
more on the business-app side of it, and I don’t see that going away.
Integration is really the key to driving high levels of adoption, which
drives high levels of productivity, which can drive top-line gains and
ultimately a better ROI for the company. That’s how we really look at
integration.

Gardner: Rick, where are we on the trajectory for using these
technologies to advance business?

Percuoco:
You mentioned specifically location information, and, as Jon mentioned,
it is germane to this discussion. There’s the concept of digital
marketing, marketing coupons to people in real-time over their
smartphones as they’re walking by businesses, and so forth. That’s
definitely one of the very prevalent use cases for location objects.

Shopping patterns

There’s
also an interesting one that goes on top of that, where you
evaluate people's web-traffic shopping patterns using Google location
objects. For large-ticket items, you can actually email them
competitor coupons in real time. For example, a mile down the street, another
company has the same item for $100 or $200 less.

It's
another interesting use case -- a kind of intelligent marketing through
digital media in the mobile market. I also see the mobile delivery of
information being critical as we move forward.

Pretty
much all data integration or BI professionals are basically working
parents. It’s very, very important to be able to deliver that
information, at least in a dashboard format or a summary format on all
the mobile devices. You could be at your kid’s Little League game or you
could be out to dinner with your wife, but you may have to check
things.

The delivery of information through the mobile
market is critical, although the user experience has to be different.
There needs to be a bunch of work in terms of data visualization, the
user experience, and what to deliver. But the modern family aspects of
life and people working are forcing the mobile market to come up to
speed.

The other thing that I would say is in
terms of integration methods and what Jon was talking about: you do have
to watch out for custom APIs. Trillium has a connectivity business, as does
Scribe.

As long as you stick with industry-standard handshaking methods, like XML or JSON or web services and RESTful
APIs, then usually you can integrate packages fairly smoothly. You
really need to make sure that you're using industry-standard hand-offs for a lot of the integration methods. You have four or five
different ways to do that, but it’s pretty much the same four or five.
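The "industry-standard hand-offs" Rick mentions are what make those integrations smooth: a JSON payload needs no custom parser on the receiving end. Here is a minimal sketch; the payload's shape (an account with nested orders) is hypothetical, but the point is that any platform can consume it with standard tooling.

```python
import json

# A payload like one a RESTful API might return. The shape is
# hypothetical, but because it's standard JSON, any integration
# platform can consume it without a custom, vendor-specific parser.
payload = """
{
  "account": {"id": "A1", "name": "Acme"},
  "orders": [
    {"order_id": 101, "total": 120.0},
    {"order_id": 102, "total": 80.0}
  ]
}
"""

doc = json.loads(payload)
total = sum(o["total"] for o in doc["orders"])
print(doc["account"]["name"], total)  # → Acme 200.0
```

Contrast this with a proprietary binary format or an undocumented API, where every consumer has to write and maintain bespoke decoding code.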

Those would be my thoughts on the future. I also see cloud computing, platform as a service (PaaS), and software as a service (SaaS) really taking hold of the market. Even Microsoft and some of the other platform tools, like Office 365
and the email systems in CRM, are all cloud-based applications now, and,
to be honest, they’re better. The service is better, and there’s no
on-premises footprint. I really see the market moving toward PaaS and
SaaS in the cloud computing market.

Gardner: What is Scribe Software's vision, and
what are the next big challenges that you will be taking your technology
to?

Bilhorn: Ideally,
what I would like to see, and what I’m hoping for, is that with mobile and the consumerization of IT, business apps begin to
act more like consumer apps, having more standard APIs and
forcing better plug and play. That would be great for business. What we’re trying to do, in the absence of that, is create that
plug-and-play environment and, as Jon said, make it so easy a child
can do it.

Seamless integration

Our
vision for the future is really flattening that out, but also being able
to provide a seamless integration experience between these disparate systems,
where at some point you wouldn’t even have to buy middleware as an
individual business or a consumer.

The cloud vendors and legacy vendors could embed integration and have true plug and play, so that the individual
user could do integration on their own. That’s where we would really
like to get to. That’s the vision and where the platform is going for
Scribe.

Gardner: Well, great. I’m afraid we’ll
have to leave it there. We've been listening to a sponsored
BriefingsDirect podcast discussion on how business intelligence and
big-data trends are requiring improved access and automation to data
flows from a variety of sources.

We've learned of ways that enterprises
are effectively harvesting data in all its forms and creating
integrations that foster better use of data throughout the entire
lifecycle. The result has been the ability to exploit data strategically
among more aspects of enterprise businesses and across more types of
applications and processes.

So a huge thanks to our
guests. Jon Petrucelli, Senior Director of Hitachi Solutions Dynamics CRM
and Marketing Practice. Thanks so much, Jon.

Petrucelli: Thank you, glad to be here.

Gardner: Also Rick Percuoco, Senior Vice President of Research and Development at Trillium Software. Thank you so much, Rick.

Gardner:
And also a huge thank you to our audience for joining this
insightful discussion. This is Dana Gardner, Principal Analyst at
Interarbor Solutions. Don’t forget to come back and listen next time.

Transcript
of a BriefingsDirect podcast on how creating big-data capabilities has
become a top business imperative in dealing with a flood of data from
disparate sources. Copyright Interarbor Solutions, LLC, 2005-2013. All
rights reserved.