Mission Critical Use Cases
Show How Analytics Architectures
Usher in an Artificial Intelligence Era
A discussion on how artificial intelligence and advanced analytics solutions coalesce into top
competitive differentiators that prove indispensable for digital business transformation.
Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett
Packard Enterprise.
Dana Gardner: Hello, and welcome to the next edition of the BriefingsDirect Voice of AI
Innovation podcast series.
I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for
this ongoing discussion on the latest insights into artificial intelligence (AI) use cases and
strategies.
Major trends in AI and advanced analytics are now coalescing into top competitive
differentiators for most businesses. Access to advanced algorithms, more cloud options,
high-performance compute (HPC) resources, and an unprecedented data asset
collection have all come together to make AI more attainable -- and more powerful --
than ever.
Stay with us as we now examine why AI is indispensable for digital transformation
through deep-dive interviews on prominent AI use cases and their escalating business
benefits.
To learn more about analytics solutions that support
mission-critical use cases, we're joined by two experts. First,
Andy Longworth, Senior Solution Architect in the AI and
Data Practice at Hewlett Packard Enterprise (HPE)
Pointnext Services. Welcome, Andy.
Andy Longworth: Thank you, Dana.
Gardner: We’re also here with Iveta Lohovska, Data
Scientist in the Pointnext Global Practice for AI and Data
at HPE. Welcome, Iveta.
Iveta Lohovska: Thank you.
Gardner: Let’s look at the trends coalescing around modern analytics and AI and why
they’re playing an increasingly essential role in digital business transformation. Andy,
what do you see as top drivers making AI more prominent in most businesses?
AI drives data to boost business
Longworth: We have three main things driving AI at the moment for businesses. First
of all, we know about the data explosion. These AI algorithms require huge amounts of
data. So we’re generating that, especially in the industrial setting with machine data.
Also, the relative price of computing is coming down, giving the capability to process all
of that data at accelerating speeds as well. You know, the graphics processing units
(GPUs) and tensor processing units (TPUs) are becoming more available, enabling us to
get through that vast volume of data.
And thirdly, the algorithms. If we look to organizations like Facebook, Google, and
academic institutions, they’re making algorithms available as open source. So
organizations don’t have to go and employ somebody to build an algorithm from the
ground up. They can begin to use these pre-trained, pre-created models to give them a
kick-start in AI and quickly understand whether there’s value in it for them or not.
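The "kick-start" that pre-trained models provide can be sketched with a deliberately tiny example. Everything here (the data, the one-weight logistic classifier, the thresholds) is invented for illustration, not drawn from any HPE solution: a classifier warm-started from weights learned on a related task converges in far fewer gradient steps than one trained from scratch.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy labeled data: the class is 1 exactly when the feature is positive.
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]

def avg_loss(w):
    """Average logistic (cross-entropy) loss of a one-weight classifier."""
    eps = 1e-12
    return -sum(y * math.log(sigmoid(w * x) + eps)
                + (1 - y) * math.log(1.0 - sigmoid(w * x) + eps)
                for x, y in zip(xs, ys)) / len(xs)

def steps_to_converge(w, threshold=0.2, lr=0.5, max_steps=10_000):
    """Count gradient-descent steps until the loss drops below threshold."""
    steps = 0
    while avg_loss(w) > threshold and steps < max_steps:
        grad = sum((sigmoid(w * x) - y) * x for x, y in zip(xs, ys)) / len(xs)
        w -= lr * grad
        steps += 1
    return steps

cold_steps = steps_to_converge(0.0)  # training from scratch
warm_steps = steps_to_converge(2.0)  # warm-started from a "pre-trained" weight
```

The same economics apply at scale: starting from open-source pre-trained weights lets an organization discover quickly, and cheaply, whether a use case holds value before investing in training from the ground up.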
Gardner: And how do those come together to impact what’s referred to as digital
transformation? Why are these actually business benefits?
Longworth: They allow organizations to
become what we call data driven. They can
use the massive data that they’ve previously
generated but never tapped into to improve
business decisions, impacting the way they
drive the business through AI. It’s
transforming the way they work.
Across several types of industry, data is now driving the decisions. Industrial
organizations, for example, improve the way they manufacture. Without the processing
of that data, these things wouldn’t be possible.
Gardner: Iveta, how do the trends Andy has described make AI different now from a
data science perspective? What’s different now than, say, two or three years ago?
Lohovska: Most of the previous AI algorithms were 30, 40, and even 50 years old in
terms of their linear algebra and mathematical foundations. Higher levels of
computing power now enable new computations and larger amounts of data to train
those algorithms.
Those two components are fundamentally changing the
picture, along with the improved taxonomies and the way
people now think of AI as differentiated between classical
statistics and deep learning algorithms. Now it is not just
technical people who can interact with these technologies and
analytic models. With the simple drag-and-drop interactions of
new products on the market, semi-technical people can adopt
and fail fast -- or succeed faster -- in the AI space. The
models are also getting better and better in
their performance based on the amount of data they get
trained on and their digital footprint.
Gardner: Andy, it sounds like AI has evolved to the point
where it is mimicking human-like skills. How is that
different and how does such machine learning (ML) and deep learning change the very
nature of work?
Let the simple tasks go, to the machines
Longworth: It allows organizations to move jobs that were previously very tedious for
people over to machines, and to repurpose people's skills for more complex work. Take,
for example, computer vision applied to quality control. If you're creating the same
product again and again and paying somebody to look at that product to say whether
there's a defect on it, it's probably not the best use of their skills. And they become
fatigued.
If you look at the same thing again and again,
you start to miss features of that and miss the
things that have gone wrong. A computer
doesn’t get that same fatigue. You can train a
model to perform that quality-control step and it
won’t become tired over time. It can keep going
for longer than, for example, an eight-hour shift
that a typical person might work. So, you’re
seeing these practical applications, which then allows the workforce to concentrate on
other things.
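The fatigue-free check Andy describes can be as simple as comparing each captured image against a "golden" reference and flagging large deviations. This is a minimal illustrative sketch; the images, threshold, and function names are invented for the example, not part of any HPE product:

```python
# Each "image" is a flat list of pixel intensities (0-255).
GOLDEN = [10, 10, 200, 200, 10, 10]  # reference image of a good part

def defect_score(image, golden=GOLDEN):
    """Mean absolute pixel difference from the golden sample."""
    return sum(abs(a - b) for a, b in zip(image, golden)) / len(golden)

def is_defective(image, threshold=15.0):
    """Flag the part when it deviates too far from the reference."""
    return defect_score(image) > threshold

good_part = [12, 9, 198, 203, 11, 10]   # minor sensor noise only
scratched = [12, 9, 90, 203, 11, 10]    # one region much darker than expected
```

Unlike a human inspector, this comparison applies the same threshold to the millionth part as to the first, around the clock; in practice a trained model replaces the simple pixel difference, but the tirelessness is the same.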
Gardner: Iveta, it wasn’t that long ago that big data was captured and analyzed mostly
for the sake of compliance and business continuity. But data has become so much more
strategic. How are businesses changing the way they view their data?
Lohovska: They are paying more attention to the quality of the data and the variety of
the data collection that they are focused on. From a data science perspective, as much
as I want to say that the performance of models is extremely important, and that my data
science skills are a critical component of the AI space and ecosystem, it's ultimately
about the quality of the data and the way it's pipelined and handled.
This process of data manipulation, getting to the so-called last mile of the data science
contribution, is extremely important. I believe it’s the critical step and foundation.
Organizations will realize that being more selective and paying more attention to the
foundations of how they handle big data -- or small data -- will get them to the data
science part of the process.
You can already see the maturity as many customers, partners, and organizations pay
more attention to the fundamental layers of AI. Then they can get better performance at
the last mile of the process.
Gardner: Why are the traditional IT approaches not enough? How do cloud models
help?
Cloud control and compliance count
Longworth: The cloud brings opportunities for organizations insomuch as they can try
before they buy. So if you go back to the idea of processing all of that data, before an
organization spends real money on purchasing GPUs, they can try them in the cloud to
understand whether they work and deliver value. Then they can look at the delivery
model. Does it make sense with my use case to make a capital investment, or do I go for
a pay-per-use model using the cloud?
You also have the data management piece, which is understanding where your data is.
From that sense, cloud doesn’t necessarily make life any less complicated. You still
need to know where the data resides, control that data, and put in the necessary
protections in line with the value of the data type. That becomes particularly important
with legislation like the General Data Protection Regulation (GDPR) and the use of
personally identifiable information (PII).
If you don’t have your data management under control and understand where all those
copies of that data are, then you can’t be compliant with GDPR, which says you may
need to delete all of that data.
So, you need to be aware of what
you’re putting in the cloud versus
what you have on-premises and
where the data resides across your
entire ecosystem.
Gardner: Another element of the past IT approaches has to do with particulars vs.
standards. We talk about the difference between managing a cow and managing a herd.
How do we attain a better IT infrastructure model to attain digital business transformation
and fully take advantage of AI? How do we balance between a standardized approach,
but also something that’s appropriate for specific use cases? And why is the architecture
of today very much involved with that sort of a balance, Andy?
Longworth: The first thing to understand is the specific use case and how quickly you
need insights. We can process, for example, data in near real-time or we can use batch
processing like we did in days of old. That use case defines the kind of processing.
If, for example, you think about an autonomous vehicle, you can’t batch-process the
sensor data coming from that car as it’s driving on the road. You need to be able to do
that in near real-time -- and that comes at a cost. You not only need to manage the flow
of data; you need the compute power to process all of that data in near real-time.
So, understand the criticality of the data and how quickly
you need to process it. Then we can build solutions to
process the data within that framework and within the
right time that it needs to be processed. Otherwise,
you’re putting additional cost into a use case that doesn’t
necessarily need to be there.
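The batch-versus-near-real-time distinction can be made concrete with a toy sensor feed: a streaming rolling average yields an answer after every reading, while batch processing yields one answer only after all the data has arrived. A minimal sketch, with invented readings and window size:

```python
from collections import deque

class StreamingAverage:
    """Near-real-time rolling average over the last `window` sensor readings."""
    def __init__(self, window=3):
        self.buf = deque(maxlen=window)  # old readings fall off automatically

    def update(self, reading):
        self.buf.append(reading)
        return sum(self.buf) / len(self.buf)

def batch_average(readings):
    """Batch processing: a single answer once all the data has arrived."""
    return sum(readings) / len(readings)

readings = [10.0, 12.0, 50.0, 11.0]  # the 50.0 is a spike worth catching early
stream = StreamingAverage(window=3)
live = [stream.update(r) for r in readings]  # one answer per reading
```

The streaming path sees the spike as it happens, at the cost of keeping compute available continuously; the batch path is cheaper but only reports after the fact, which is exactly the cost/criticality trade-off described above.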
When we build those use cases we typically use cloud-like technologies -- be that
containers or scalar technologies. That allows us portability of the use case, even if
we’re not necessarily going to deploy it in the cloud. It allows us to move the use case as
close to the data as possible.
For example, if we’re talking about a computer vision use case on a production line, we
don’t want to be sending images to the cloud and have the high latency and processing
of the data. We need a very quick answer to control the production process. So you
would want to move the inference engine as close to the production line as possible.
And, if we use things like HPE Edgeline computing and containers, we can place those
systems right there on the production line to get the answers as quickly as we need.
So being able to move the use case where it needs to reside is probably one of the
biggest things that we need to consider.
Gardner: Iveta, why is the so-called explore, experiment, and evolve approach using
such a holistic ecosystem of support the right way to go?
Small steps, scientific method, solutions
Lohovska: Because AI is not easy. If it were easy, then everyone would be doing it and
we would not be having this conversation. It’s not a simple statistical use case or a
program or business intelligence app where you already have the answer or even an
idea of the questions you are asking.
The whole process is in the data science title. You have the word “science,” so there is a
moment of research and uncertainty. It’s about the way you explore the data, the way
you understand the use cases, starting from the fact that you have to define your
business case, and you have to define the scope.
My advice is to start small, not exhaust your resources or the trust of the different
stakeholders. Also define the correct use case and the desired return on investment
(ROI). HPE is even working on the definitions and the business case when approaching
an AI use case, trying to understand the level of complexity and the level of prediction
accuracy required for the use case to succeed.
Such an exploration phase is extremely important so that everyone is aligned and
finds the right path to minimize failure and successfully monetize data and AI.
Once you have the fundamentals, once you have experimented with some use cases,
and you see them up and running in your production environment, then it is the moment
to scale them.
I think we are doing a great job
bringing all of those complicated
environments together, with their
data complexity, model complexity,
and networking and security
regulations into one environment
that’s in production and can quickly
bring value to many use cases.
This flow is extremely important, of experimenting and not approaching things like you
have a fixed answer or fixed approach. It’s extremely important, and this is the way we at
HPE are approaching AI.
Gardner: It sounds as if we are approaching some sort of a unified reference
architecture that’s inclusive of systems, cloud models, data management, and AI
services. Is that what’s going to be required? Andy, do we need a grand unifying theory
of AI and data management to make this happen?
Longworth: I don’t think we do. Maybe one day we will get to that point, but what we are
reaching now is a clear understanding of what architectures work for which use cases
and business requirements. We are then able to apply them without having to
experiment every time we engage, which complements what Iveta said.
When we start to look at these use cases, when we engage with customers, what’s key
is making sure there is business value for the organization. We know AI can work, but
the question is, does it work in the customer’s business context?
If we can take out a good deal of that experimentation and come in with a fairly good
answer to the use case in a specific industry, then we have a good jump start on that.
As time goes on and AI develops, we will see more generic AI solutions that can be used
for many different things. But at the moment, it’s really still about point solutions.
Gardner: Let’s find out where AI is making an impact. Let’s look first, Andy, at digital
prescriptive maintenance and quality control. You mentioned manufacturing a little
earlier. What’s the problem, context, and how are we getting better business outcomes?
Monitor maintenance with AI
Longworth: The problem is the way we do maintenance schedules today. If you look
back in history, we had reactive maintenance that was basically … something breaks
and then we fix it.
Now, most organizations are in a preventative mode so a manufacturer gives a service
window and says, “Okay, you need to service this machinery every 1,000 hours of
running.” And that happens whether it’s needed or not.
When we get into prescriptive and
predictive maintenance, we only
service those assets as they
actually need it, which means
having the data, understanding the
trends, recognizing if problems are
forthcoming, and then fixing them
before they impact the business.
The data from that machinery may include temperature, vibration, and speed, giving a
condition-based monitoring view and a real-time understanding of what's happening with
the machinery. You can then also use past history to predict what is going to
happen with that machine in the future.
We can get to a point where we know in real time what’s happening with the machinery
and have the capability to predict the failures before they happen.
The prescriptive piece comes in when we understand the business criticality or the
business impact of an asset. If you have a production line and you have two pieces of
machinery on that production line, both may have the identical probability of failure. But
one is on your critical manufacturing path, and the other is some production buffer.
As a business, the way that you are going to deal with those two pieces of machinery is
different. You will treat the one on the critical path differently than the one where you
have a product buffer. And so the prescriptive piece goes beyond the prediction to
understanding the business context of that machinery and applying that to how you are
behaving, and then how you react when something happens with that machine.
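The prescriptive idea -- the same failure prediction leading to different actions depending on business impact -- can be sketched as a priority score that weights a predictive signal by asset criticality. Everything here, from the crude drift heuristic to the criticality values, is an invented illustration of the concept:

```python
def failure_probability(vibration_history, baseline=1.0):
    """Crude predictive signal: how far recent mean vibration drifts above baseline.

    Clamped to [0, 1] so it can be read as a probability-like score.
    """
    recent = vibration_history[-3:]
    drift = sum(recent) / len(recent) - baseline
    return max(0.0, min(1.0, drift))

def maintenance_priority(vibration_history, criticality):
    """Prescriptive step: weight the prediction by business impact (0-1)."""
    return failure_probability(vibration_history) * criticality

history = [1.0, 1.1, 1.3, 1.5, 1.6]  # vibration trending upward

# Identical sensor signature, different business context:
critical_path = maintenance_priority(history, criticality=1.0)  # no buffer behind it
buffered_line = maintenance_priority(history, criticality=0.3)  # production buffer exists
```

Both machines have the same predicted failure risk, but the one on the critical manufacturing path surfaces at the top of the maintenance queue, which is precisely the prescriptive behavior described above.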
That’s the idea of the solution when we build digital prescriptive maintenance. The side
benefit that we see is the quality control piece. If you have a large piece of machinery
that you can attest is running perfectly during a production run, for example, then you
can say with some certainty what the quality of the product coming from that machine
will be.
Gardner: So we have AI overlooking manufacturing and processing. It’s probably
something that would make you sleep a little bit better at night, knowing that you have
such a powerful tool constantly observing and reporting.
Let’s move on to our next use case. Iveta, video analytics and surveillance. What’s the
problem we need to solve? Why is AI important to solving it?
Scrutinize surveillance with AI
Lohovska: For video surveillance and video analytics in general, the overarching field
is computer vision. This is the most mature and currently the trendiest AI field, simply
because the amount of data is there, the diversity is there, and the algorithms are getting
better and better. It’s no longer state-of-the-art, where it’s difficult to grasp, adopt, and
bring into production. So, now the main goal is moving into production and monetizing
these types of data sources.
When you talk about video analytics or
surveillance, or any kind of quality assurance,
the main problem is improving on or detecting
human errors, behaviors, and environments.
Telemetry plays a huge role here, and there are
many components and constraints to consider
in this environment.
That makes it hardware-dependent and also requires AI at the edge, where most of the
algorithms and decisions need to happen. If you want to detect fire, detect fraud or
prevent certain types of failure, such as quality failure or human failure -- time is
extremely important.
As HPE Pointnext Services, we have been working on our own solution and reference
architectures to approach those problems because of the complexity of the environment,
the different cameras, and the hardware handling the data acquisition process. Even at
the beginning, the diversity is enormous. There is no one-size-fits-all. There is no one
provider or one solution that can handle surveillance use cases or broad analytical use
cases at the manufacturing plant or oil and gas rig where you are trying to detect fire or
oil and gas spills from the different environments. So being able to approach it
holistically, to choose the right solution for the right complement, and design the
architecture is key.
Also, it’s essential to have the right hardware and edge devices to acquire the data and
handle the telemetry. Let’s say when you are positioning cameras in an outside
environment and you have different temperatures, vibrations, and heat. This will reflect
on the quality of the acquired information going through the pipeline.
Some of the benefits of use cases built on computer vision and video surveillance
include real-time information coming from manufacturing plants -- knowing that all the
safety and security standards are being met, and that the people operating the plant are
following the instructions and have the safeguards required for that specific
manufacturing plant.
When you have a quality assurance use case, video analytics is one source of
information to tackle the problem. For example, improving the quality of your products or
batches is just one source in the computer vision field. Having the right architecture,
being agile and flexible, and finding the right solution for the problem and the right
models deployed at the right edge device -- or at the right camera -- is something we are
doing right now. We have several partners working to solve the challenges of video
analytics use cases.
Gardner: When you have a high-scaling, high-speed AI to analyze video, it’s no longer a
gating factor that you need to have humans reviewing the processes. It allows video to
be used in so many more applications, even augmented reality, so that you are using
video on both ends of the equation, as it were. Are we seeing an explosion of
applications and use cases for video analytics and AI, Iveta?
Lohovska: Yes, absolutely. The impact of algorithms in this space is enormous. Also,
open source datasets such as ImageNet, and pre-trained models such as ResNet, give
you a huge amount of data and a head start for training your own algorithms. You can
adjust and fine-tune them for your own use cases, whether it's healthcare,
manufacturing, or video surveillance. It's very enabling.
You can see the diversity of the
solutions people are developing
and the different programs they
are tackling using computer
vision capabilities, not only from
the algorithms, but also from the
hardware side, because the
cameras are getting more and
more powerful.
Currently, we are working on several projects in the part of the spectrum not visible to humans. This is
enabled by the further development of the hardware acquiring those images that we
can’t see.
Gardner: If we can view and analyze machines and processes, perhaps we can also
listen and talk to them. Tell us about speech and natural language processing (NLP),
Iveta. How is AI enabling those businesses and how they transform themselves?
Speech-to-text to protect and serve
Lohovska: This is another strong field where AI is used and still improving. It's not as
mature as computer vision, simply because of the complexity of human language and
speech, and the way speech gets recorded and transferred. So it's not only a
problem of technologists writing algorithms, but also of linguists being able to frame
the grammar problems and write the right equations to solve them.
But one very interesting field in the speech and NLP
area is speech-to-text -- being able to
transcribe speech into text. It's very helpful for
organizations handling emergency calls, or for
fraud detection, where you need to detect fraud or
danger in real time. It's a very common use case
for law enforcement and security organizations, or
simply for improving the quality of service in call
centers.
This example is industry- or vertical-independent. You can have finance, manufacturing,
retail -- but all of them have some kind of customer support. This is the most common
use case, being able to record and improve the quality of your services, based on the
analysis you can apply. Similar to the video analytics use case, the problem here, too, is
handling the complexity of different algorithms, different languages, and the varying
quality of the recordings.
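Downstream of the transcription step, the kind of real-time flagging Iveta mentions can be approximated by scanning transcribed text for danger terms. This sketch assumes the speech-to-text output already exists as plain strings; the term list and call texts are invented for the example:

```python
# Terms that should trigger an alert in a (hypothetical) monitoring system.
DANGER_TERMS = {"fire", "help", "weapon", "fraud"}

def flag_transcript(transcript):
    """Return the danger terms found in one speech-to-text output, sorted."""
    words = {w.strip(".,!?").lower() for w in transcript.split()}
    return sorted(words & DANGER_TERMS)

calls = [
    "Hello, I would like to check my account balance.",
    "Please send help, there is a fire in the building!",
]
alerts = [flag_transcript(c) for c in calls]  # one alert list per call
```

A production system would use NLP models rather than a keyword set, but the pipeline shape is the same: audio to text, text to signal, signal to action, with the transcription quality bounding everything downstream.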
A reference architecture, where you have the different components designed on exactly
this holistic approach, allows the user to explore, evolve, and experiment in this space.
We choose the right component for the right problem and decide how to approach it.
And in this case, if we combine the right data science tool with the right processing tool
and the right algorithms on top of it, then you can simply design the solution and solve
the specific problem.
Gardner: Our next and last use case for AI is one people are probably very familiar with,
and that’s the autonomous driving technology (ADT).
Andy, how are we developing highly automated-driving infrastructures that leverage AI
and help us get to that potential nirvana of truly self-driving and autonomous vehicles?
Data processing drives autonomous vehicles
Longworth: There are several problems around highly autonomous driving as we have
seen. It’s taking years to get to the point where we have fully autonomous cars and there
are clear advantages to it.
If you look at, for example, what the World Health Organization (WHO) says, there are
more than 1 million deaths per year in road traffic accidents. One of the primary drivers
for ADT is that we can reduce the human error in cars on the road -- and reduce the
number of fatalities and accidents. But to get to that point we need to train these
immensely complex AI algorithms that take massive amounts of data from the car.
Just purely from the sensor point of view, we have high-definition cameras giving 360-
degree views around the car. You have radar, GPS, audio, and vision systems. Some
manufacturers use light detection and ranging (LIDAR), some not. But you have all of
these sensors giving massive amounts of data. And to develop those autonomous cars,
you need to be able to process all of that raw data.
Typically, in an eight-hour shift, an ADT car
generates somewhere between 70 and 100
terabytes of data. If you have an entire fleet of
cars, then you need to be able to very quickly
get that data off of the car so that you can get
them back out on the road as quickly as possible. Then you need to get that data from
where you offload it into the data center so that the developers, data scientists, analysts,
and engineers can build to the next iteration of the autonomous driving strategy.
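The figures Andy quotes make the logistics concrete. Some back-of-the-envelope math (the fleet size and link speed are assumptions for illustration, only the per-shift volume comes from the discussion) shows why ingest is a pain point:

```python
# Rough ingest math. Only the 70-100 TB per shift figure comes from the
# discussion; fleet size and link speed are illustrative assumptions.
TB = 1e12  # bytes

shift_data_bytes = 85 * TB            # midpoint of 70-100 TB per 8-hour shift
fleet_size = 10                       # hypothetical test fleet
offload_rate_bps = 100 * 1e9 / 8      # hypothetical 100 Gb/s link, in bytes/s

fleet_bytes_per_day = shift_data_bytes * fleet_size
offload_hours = fleet_bytes_per_day / offload_rate_bps / 3600
```

Even on a fast dedicated link, a modest fleet's daily haul takes most of a day to move, which is why pre-processing on the car and prioritizing the most important data first matter so much.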
When you have built that, tested it, and done all the good things that you need to do, you
need to next be able to get those models and that strategy from the developers back into
the cars again. It’s like the other AI problems that we have been talking about, but on
steroids because of the sheer volume of data and because of the impact of what
happens if something should go wrong.
At HPE Pointnext Services, we have developed a set of solutions that addresses several
of the pain points in the ADT development process. First is the ingest: how can we use
HPE Edgeline processing in the car to pre-process data and reduce the amount of data
that we have to send back to the data center? Also, you have to send back the most
important data after the eight-hour drive first, and then send the run-of-the-mill, backup
data later.
The second piece is the data platform itself, building a massive data platform that is
extensible to store all the data coming from the autonomous driving test fleet. That
needs to also expand as the fleet grows as well as to support different use cases.
The data platform and the development platform are not only massive in terms of the
amount of data they need to hold and process, but also in terms of the required
tooling. We have been developing reference architectures to enable automotive
manufacturers, along with the suppliers of those automotive systems, to build their data
platforms and provide all the processing that they need so their data scientists can
continuously develop autonomous driving strategies and be able to test them in a highly
automated way, while also giving access to the data to the additional suppliers.
For example, the sensor suppliers need to see what’s happening to their sensors while
they are on the car. The platform that we have been putting together is really concerned
with having the flexibility for those different use cases, the scalability to be able to
support the data volumes of today, but also to grow -- to be able to have the data
volumes of the not-too-distant future.
The platform also supports the speed and data locality, so being able to provide high-
speed parallel file systems, for example, to feed those ML development systems and
help them train the models that they have.
So all of this pulls together the different components we have talked about with the
different use cases, but at a scale that is much larger than several of the other use
cases, probably put together.
Gardner: It strikes me that the ADT problem, if solved, enables so many other major
opportunities. We are talking about micro-data centers that provide high-performance
compute (HPC) at the edge. We are talking about the right hybrid approach to the data
management problem -- what to move, what to keep local, and how to take a lifecycle
approach to it all. So, ADT is really a key use-case scenario.
Why is HPE uniquely positioned to solve ADT, which will then lead to so many enabling
technologies for other applications?
Longworth: Like you said, the micro-data center --
every autonomous driving car essentially becomes
a data center on wheels. So it's about being able to
provide that compute at the edge to enable the
processing of all that sensor data.
If you look at the HPE portfolio of products, there are very few organizations that have
edge compute solutions and the required processing power in such small packages. But
it’s also about being able to wrap it up in, not only the hardware, but the solution on top,
the support, and being able to provide a flexible delivery model.
Lots of organizations want to have a cloud-like experience, not just in the way they
consume the technology, but also in the way they pay for it. By providing everything
as-a-service, HPE allows you to pay for your autonomous driving platform as you use
it. Again, there are very few organizations in the world that can offer that end-to-end
value proposition.
Collaborate, corroborate, and crack that nut
Gardner: Iveta, why does it take a team-sport and solution-approach from the data
science perspective to tackle these major use cases?
Lohovska: I agree with Andy. It's the way we approach those complex use cases, and
the fact that you can have them as a service -- not only infrastructure-as-a-service
(IaaS) or data-as-a-service (DaaS), but also AI and models-as-a-service
(MaaS). You can have a marketplace for models, and being able to plug-and-play
different technologies, experiment, and deploy them rapidly allows you to quickly get
value out of those technologies. That is something we do on a daily basis with
amazing experts and people with knowledge of the different layers. They can
attack the complexity of those use cases from each side, because it requires not just
data science and the hardware, but a lot of domain-specific expertise to solve those
problems. This is something we are looking at and doing in-house.
And I am extremely happy to say that I have the pleasure to work with all of those
amazing people and experts within HPE.
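The modeling-as-a-service idea Iveta describes -- a marketplace where models can be plugged in and swapped without disturbing the surrounding pipeline -- can be sketched minimally. Everything below (the registry class and the toy models) is illustrative, not an HPE API:

```python
# A minimal sketch of "modeling-as-a-service": a registry where
# interchangeable models are registered by name and swapped freely.
# All names here are illustrative assumptions, not an HPE product API.

from typing import Callable, Dict, List

class ModelRegistry:
    """Marketplace-style registry: plug in models, look them up by name."""

    def __init__(self) -> None:
        self._models: Dict[str, Callable[[List[float]], float]] = {}

    def register(self, name: str, model: Callable[[List[float]], float]) -> None:
        self._models[name] = model

    def predict(self, name: str, features: List[float]) -> float:
        # Swapping models is just a lookup by name -- the calling code
        # stays the same while experiments run behind the scenes.
        return self._models[name](features)

# Two interchangeable "models": a naive mean baseline and a weighted mean.
registry = ModelRegistry()
registry.register("baseline", lambda xs: sum(xs) / len(xs))
registry.register(
    "weighted",
    lambda xs: sum(w * x for w, x in enumerate(xs, 1)) / sum(range(1, len(xs) + 1)),
)

features = [1.0, 2.0, 3.0]
print(registry.predict("baseline", features))  # 2.0
print(registry.predict("weighted", features))  # (1 + 4 + 9) / 6
```

The point is the plug-and-play boundary: experimenting with a new model means one `register` call, so a failed experiment is cheap and a successful one deploys without rewiring the pipeline.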
Gardner: And there is a great deal more information available on each of these use
cases for AI. There are white papers on the HPE website in Pointnext Services.
What else can people do, Andy, to get ready for these high-level AI use cases that lead
to digital business transformation? How should organizations be setting themselves up
on a people, process, and technology basis to become adept at AI as a core
competency?
Longworth: It is about people, technology, process,
and all these things combined. You don’t go and buy
AI in a box. You need a structured approach. You
need to understand what the use cases are that give
value to your organization and to be able to quickly
prototype those, quickly experiment with them, and
prove the value to your stakeholders.
Where a lot of organizations get stuck is moving from that prototyping, proof of concept
(POC), and proof of value (POV) phase into full production. It is tough getting the
processes and pipelines that enable you to transition from that small POV phase into a
full production environment. If you can crack that nut, then the next use cases that you
implement, and the next business problems that you want to solve with AI, become
infinitely easier. It is a hard step to go from POV through to full production because
there are so many pieces involved.
You have that whole value chain, from grabbing hold of the data at the point of creation
to processing that data and making sure you have the right people and process around it.
And when you come out with an AI solution that gives some form of inference -- some
form of answer -- you need to be able to act upon that answer.
You can have the best AI solution in the world, one that gives you the best predictions,
but if you don’t build those predictions into your business processes, you may as well
have never made them in the first place.
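Longworth’s closing point -- that an inference only matters if the business process acts on it -- can be sketched in a few lines. The thresholds and action names below are illustrative assumptions, not part of any HPE solution:

```python
# A hedged sketch of the "act on the answer" step: a model's inference
# (here, a failure probability) is mapped to a concrete business action.
# Thresholds and action names are assumptions for illustration only.

def act_on_prediction(failure_probability: float) -> str:
    """Turn an inference result into a business-process action."""
    if failure_probability >= 0.8:
        return "stop-line-and-inspect"   # high risk: intervene now
    if failure_probability >= 0.5:
        return "schedule-maintenance"    # medium risk: plan a fix
    return "continue-production"         # low risk: no action needed

# Without this last step, even the best predictions change nothing.
for p in (0.9, 0.6, 0.1):
    print(p, "->", act_on_prediction(p))
```

However simple, this routing is the piece many POC-stage projects never build, which is why good models can still deliver no value in production.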
Gardner: I’m afraid we will have to leave it there. We have been exploring how major
trends in AI and advanced analytics have coalesced into top competitive differentiators
for many businesses. And we have learned how AI is indispensable for digital
transformation by looking at several prominent use cases and their escalating benefits.
So please join me in thanking our guests, Andy Longworth, Senior Solution Architect in
the AI and Data Practice at HPE Pointnext Services. Thank you so much, Andy.
Longworth: Thank you, Dana.
Gardner: And Iveta Lohovska, Data Scientist in the Pointnext Global Practice for AI and
Data at HPE. Thank you so much, Iveta.
Lohovska: Thank you.
Gardner: And a big thank you as well to our audience for joining us for this sponsored
BriefingsDirect Voice of AI Innovation discussion. I’m Dana Gardner, Principal Analyst at
Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-
supported discussions.
Thanks again for listening. Please pass this along to your IT community, and do come
back next time.
Copyright Interarbor Solutions, LLC, 2005-2020. All rights reserved.
You may also be interested in:
• As hybrid IT complexity ramps up, operators look to data-driven automation tools
• Cerner’s lifesaving sepsis control solution shows the potential of bringing more AI-
enabled IoT to the healthcare edge
• How containers are the new basic currency for pay as you go hybrid IT
• HPE strategist Mark Linesch on the surging role of containers in advancing the hybrid IT
estate
• How the Catalyst UK program seeds the next generations of HPC, AI, and
supercomputing
• HPE and PTC Join Forces to Deliver Best Outcomes from the OT-IT Productivity
Revolution
• How rapid machine learning at the racing edge accelerates Venturi Formula E Team to
top-efficiency wins
• The budding storage relationship between HPE and Cohesity brings the best of startup
innovation to global enterprise reach
• HPE’s Erik Vogel on what's driving success in hybrid cloud adoption and optimization
More Related Content

What's hot

Orzota all-in-one Big Data Platform
Orzota all-in-one Big Data PlatformOrzota all-in-one Big Data Platform
Orzota all-in-one Big Data PlatformOrzota
 
After Cutting its Big Data Teeth on Wall Street, Vichara Technologies Grows t...
After Cutting its Big Data Teeth on Wall Street, Vichara Technologies Grows t...After Cutting its Big Data Teeth on Wall Street, Vichara Technologies Grows t...
After Cutting its Big Data Teeth on Wall Street, Vichara Technologies Grows t...Dana Gardner
 
Diginomica 2019 2020 ai ai ethics neil raden articles links and captions
Diginomica 2019 2020 ai ai ethics neil raden articles links and captionsDiginomica 2019 2020 ai ai ethics neil raden articles links and captions
Diginomica 2019 2020 ai ai ethics neil raden articles links and captionsNeil Raden
 
Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automa...
Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automa...Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automa...
Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automa...Dana Gardner
 
Sogeti on big data creating clarity
Sogeti on big data creating claritySogeti on big data creating clarity
Sogeti on big data creating clarityYann SESE
 
How the Modern Data Center Extends Across Remote Locations Due to Automation ...
How the Modern Data Center Extends Across Remote Locations Due to Automation ...How the Modern Data Center Extends Across Remote Locations Due to Automation ...
How the Modern Data Center Extends Across Remote Locations Due to Automation ...Dana Gardner
 
Oea big-data-guide-1522052
Oea big-data-guide-1522052Oea big-data-guide-1522052
Oea big-data-guide-1522052Gilbert Rozario
 
Balance your Supply Chain with Big Data
Balance your Supply Chain with Big DataBalance your Supply Chain with Big Data
Balance your Supply Chain with Big DataBodhtree
 
Impact of big data on analytics
Impact of big data on analyticsImpact of big data on analytics
Impact of big data on analyticsCapgemini
 
Using AI to Solve Data and IT Complexity -- And Better Enable AI
Using AI to Solve Data and IT Complexity -- And Better Enable AIUsing AI to Solve Data and IT Complexity -- And Better Enable AI
Using AI to Solve Data and IT Complexity -- And Better Enable AIDana Gardner
 
Analytics: The Real-world Use of Big Data
Analytics: The Real-world Use of Big DataAnalytics: The Real-world Use of Big Data
Analytics: The Real-world Use of Big DataDavid Pittman
 
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths NowWant a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths NowDana Gardner
 
Agnostic Tool Chain Key to Fixing the Broken State of Data and Information Ma...
Agnostic Tool Chain Key to Fixing the Broken State of Data and Information Ma...Agnostic Tool Chain Key to Fixing the Broken State of Data and Information Ma...
Agnostic Tool Chain Key to Fixing the Broken State of Data and Information Ma...Dana Gardner
 
Cutting Edge Predictive Analytics with Eric Siegel
Cutting Edge Predictive Analytics with Eric Siegel   Cutting Edge Predictive Analytics with Eric Siegel
Cutting Edge Predictive Analytics with Eric Siegel Databricks
 
Investing in AI: Moving Along the Digital Maturity Curve
Investing in AI: Moving Along the Digital Maturity CurveInvesting in AI: Moving Along the Digital Maturity Curve
Investing in AI: Moving Along the Digital Maturity CurveCognizant
 
Big data course | big data training | big data classes
Big data course | big data training | big data classesBig data course | big data training | big data classes
Big data course | big data training | big data classesNaviWalker
 
Smart Data Module 6 d drive the future
Smart Data Module 6 d drive the futureSmart Data Module 6 d drive the future
Smart Data Module 6 d drive the futurecaniceconsulting
 
Analytical thinking 16 - October 2012
Analytical thinking 16 - October 2012Analytical thinking 16 - October 2012
Analytical thinking 16 - October 2012Charlotte Skornik
 

What's hot (20)

Orzota all-in-one Big Data Platform
Orzota all-in-one Big Data PlatformOrzota all-in-one Big Data Platform
Orzota all-in-one Big Data Platform
 
After Cutting its Big Data Teeth on Wall Street, Vichara Technologies Grows t...
After Cutting its Big Data Teeth on Wall Street, Vichara Technologies Grows t...After Cutting its Big Data Teeth on Wall Street, Vichara Technologies Grows t...
After Cutting its Big Data Teeth on Wall Street, Vichara Technologies Grows t...
 
Diginomica 2019 2020 ai ai ethics neil raden articles links and captions
Diginomica 2019 2020 ai ai ethics neil raden articles links and captionsDiginomica 2019 2020 ai ai ethics neil raden articles links and captions
Diginomica 2019 2020 ai ai ethics neil raden articles links and captions
 
Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automa...
Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automa...Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automa...
Platform 3.0 Ripe to Give Standard Access to Advanced Intelligence and Automa...
 
Sogeti on big data creating clarity
Sogeti on big data creating claritySogeti on big data creating clarity
Sogeti on big data creating clarity
 
How the Modern Data Center Extends Across Remote Locations Due to Automation ...
How the Modern Data Center Extends Across Remote Locations Due to Automation ...How the Modern Data Center Extends Across Remote Locations Due to Automation ...
How the Modern Data Center Extends Across Remote Locations Due to Automation ...
 
Oea big-data-guide-1522052
Oea big-data-guide-1522052Oea big-data-guide-1522052
Oea big-data-guide-1522052
 
Balance your Supply Chain with Big Data
Balance your Supply Chain with Big DataBalance your Supply Chain with Big Data
Balance your Supply Chain with Big Data
 
Impact of big data on analytics
Impact of big data on analyticsImpact of big data on analytics
Impact of big data on analytics
 
Using AI to Solve Data and IT Complexity -- And Better Enable AI
Using AI to Solve Data and IT Complexity -- And Better Enable AIUsing AI to Solve Data and IT Complexity -- And Better Enable AI
Using AI to Solve Data and IT Complexity -- And Better Enable AI
 
Future of Big Data
Future of Big DataFuture of Big Data
Future of Big Data
 
Analytics: The Real-world Use of Big Data
Analytics: The Real-world Use of Big DataAnalytics: The Real-world Use of Big Data
Analytics: The Real-world Use of Big Data
 
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths NowWant a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
Want a Data-Driven Culture? Start Sorting Out the BI and Big Data Myths Now
 
Agnostic Tool Chain Key to Fixing the Broken State of Data and Information Ma...
Agnostic Tool Chain Key to Fixing the Broken State of Data and Information Ma...Agnostic Tool Chain Key to Fixing the Broken State of Data and Information Ma...
Agnostic Tool Chain Key to Fixing the Broken State of Data and Information Ma...
 
Cutting Edge Predictive Analytics with Eric Siegel
Cutting Edge Predictive Analytics with Eric Siegel   Cutting Edge Predictive Analytics with Eric Siegel
Cutting Edge Predictive Analytics with Eric Siegel
 
Big Data
Big DataBig Data
Big Data
 
Investing in AI: Moving Along the Digital Maturity Curve
Investing in AI: Moving Along the Digital Maturity CurveInvesting in AI: Moving Along the Digital Maturity Curve
Investing in AI: Moving Along the Digital Maturity Curve
 
Big data course | big data training | big data classes
Big data course | big data training | big data classesBig data course | big data training | big data classes
Big data course | big data training | big data classes
 
Smart Data Module 6 d drive the future
Smart Data Module 6 d drive the futureSmart Data Module 6 d drive the future
Smart Data Module 6 d drive the future
 
Analytical thinking 16 - October 2012
Analytical thinking 16 - October 2012Analytical thinking 16 - October 2012
Analytical thinking 16 - October 2012
 

Similar to Mission Critical Use Cases Show How Analytics Architectures Usher in an Artificial Intelligence Era

Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...Dana Gardner
 
Big Data & Analytics Trends 2016 Vin Malhotra
Big Data & Analytics Trends 2016 Vin MalhotraBig Data & Analytics Trends 2016 Vin Malhotra
Big Data & Analytics Trends 2016 Vin MalhotraVin Malhotra
 
Welcome to Data Science
Welcome to Data ScienceWelcome to Data Science
Welcome to Data ScienceNyraSehgal
 
The 4 Biggest Trends In Big Data and Analytics Right For 2021
The 4 Biggest Trends In Big Data and Analytics Right For 2021The 4 Biggest Trends In Big Data and Analytics Right For 2021
The 4 Biggest Trends In Big Data and Analytics Right For 2021Bernard Marr
 
ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING Goodbuzz Inc.
 
Notes on Current trends in IT (1) (1).pdf
Notes on Current trends in IT (1) (1).pdfNotes on Current trends in IT (1) (1).pdf
Notes on Current trends in IT (1) (1).pdfKarishma Chaudhary
 
Analytics Trends 20145 - Deloitte - us-da-analytics-analytics-trends-2015
Analytics Trends 20145 -  Deloitte - us-da-analytics-analytics-trends-2015Analytics Trends 20145 -  Deloitte - us-da-analytics-analytics-trends-2015
Analytics Trends 20145 - Deloitte - us-da-analytics-analytics-trends-2015Edgar Alejandro Villegas
 
The Long Road of IT Systems Management Enters the Domain of AIOps-Fueled Auto...
The Long Road of IT Systems Management Enters the Domain of AIOps-Fueled Auto...The Long Road of IT Systems Management Enters the Domain of AIOps-Fueled Auto...
The Long Road of IT Systems Management Enters the Domain of AIOps-Fueled Auto...Dana Gardner
 
AI in Business - Key drivers and future value
AI in Business - Key drivers and future valueAI in Business - Key drivers and future value
AI in Business - Key drivers and future valueAPPANION
 
Top 10 trends in business intelligence for 2015
Top 10 trends in business intelligence for 2015Top 10 trends in business intelligence for 2015
Top 10 trends in business intelligence for 2015Tableau Software
 
As Hybrid IT Complexity Ramps Up, Operators Look To Data-Driven Automation Tools
As Hybrid IT Complexity Ramps Up, Operators Look To Data-Driven Automation ToolsAs Hybrid IT Complexity Ramps Up, Operators Look To Data-Driven Automation Tools
As Hybrid IT Complexity Ramps Up, Operators Look To Data-Driven Automation ToolsDana Gardner
 
Practical analytics john enoch white paper
Practical analytics john enoch white paperPractical analytics john enoch white paper
Practical analytics john enoch white paperJohn Enoch
 
The book of elephant tattoo
The book of elephant tattooThe book of elephant tattoo
The book of elephant tattooMohamed Magdy
 
Explore the Roles and Myths of Automation and Virtualization in Data Center T...
Explore the Roles and Myths of Automation and Virtualization in Data Center T...Explore the Roles and Myths of Automation and Virtualization in Data Center T...
Explore the Roles and Myths of Automation and Virtualization in Data Center T...Dana Gardner
 
Democratizing Advanced Analytics Propels Instant Analysis Results to the Ubiq...
Democratizing Advanced Analytics Propels Instant Analysis Results to the Ubiq...Democratizing Advanced Analytics Propels Instant Analysis Results to the Ubiq...
Democratizing Advanced Analytics Propels Instant Analysis Results to the Ubiq...Dana Gardner
 
Master in data science
Master in data scienceMaster in data science
Master in data scienceSagar315324
 
Big data (word file)
Big data  (word file)Big data  (word file)
Big data (word file)Shahbaz Anjam
 

Similar to Mission Critical Use Cases Show How Analytics Architectures Usher in an Artificial Intelligence Era (20)

Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
Industrializing Data Science: Transform into an End-to-End, Analytics-Oriente...
 
Big Data & Analytics Trends 2016 Vin Malhotra
Big Data & Analytics Trends 2016 Vin MalhotraBig Data & Analytics Trends 2016 Vin Malhotra
Big Data & Analytics Trends 2016 Vin Malhotra
 
Welcome to Data Science
Welcome to Data ScienceWelcome to Data Science
Welcome to Data Science
 
The 4 Biggest Trends In Big Data and Analytics Right For 2021
The 4 Biggest Trends In Big Data and Analytics Right For 2021The 4 Biggest Trends In Big Data and Analytics Right For 2021
The 4 Biggest Trends In Big Data and Analytics Right For 2021
 
ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
 
Notes on Current trends in IT (1) (1).pdf
Notes on Current trends in IT (1) (1).pdfNotes on Current trends in IT (1) (1).pdf
Notes on Current trends in IT (1) (1).pdf
 
Analytics Trends 20145 - Deloitte - us-da-analytics-analytics-trends-2015
Analytics Trends 20145 -  Deloitte - us-da-analytics-analytics-trends-2015Analytics Trends 20145 -  Deloitte - us-da-analytics-analytics-trends-2015
Analytics Trends 20145 - Deloitte - us-da-analytics-analytics-trends-2015
 
The Long Road of IT Systems Management Enters the Domain of AIOps-Fueled Auto...
The Long Road of IT Systems Management Enters the Domain of AIOps-Fueled Auto...The Long Road of IT Systems Management Enters the Domain of AIOps-Fueled Auto...
The Long Road of IT Systems Management Enters the Domain of AIOps-Fueled Auto...
 
AI in Business - Key drivers and future value
AI in Business - Key drivers and future valueAI in Business - Key drivers and future value
AI in Business - Key drivers and future value
 
Top 10 trends in business intelligence for 2015
Top 10 trends in business intelligence for 2015Top 10 trends in business intelligence for 2015
Top 10 trends in business intelligence for 2015
 
Unlocking big data
Unlocking big dataUnlocking big data
Unlocking big data
 
As Hybrid IT Complexity Ramps Up, Operators Look To Data-Driven Automation Tools
As Hybrid IT Complexity Ramps Up, Operators Look To Data-Driven Automation ToolsAs Hybrid IT Complexity Ramps Up, Operators Look To Data-Driven Automation Tools
As Hybrid IT Complexity Ramps Up, Operators Look To Data-Driven Automation Tools
 
Practical analytics john enoch white paper
Practical analytics john enoch white paperPractical analytics john enoch white paper
Practical analytics john enoch white paper
 
Data Science for Finance Interview.
Data Science for Finance Interview. Data Science for Finance Interview.
Data Science for Finance Interview.
 
The book of elephant tattoo
The book of elephant tattooThe book of elephant tattoo
The book of elephant tattoo
 
Explore the Roles and Myths of Automation and Virtualization in Data Center T...
Explore the Roles and Myths of Automation and Virtualization in Data Center T...Explore the Roles and Myths of Automation and Virtualization in Data Center T...
Explore the Roles and Myths of Automation and Virtualization in Data Center T...
 
Democratizing Advanced Analytics Propels Instant Analysis Results to the Ubiq...
Democratizing Advanced Analytics Propels Instant Analysis Results to the Ubiq...Democratizing Advanced Analytics Propels Instant Analysis Results to the Ubiq...
Democratizing Advanced Analytics Propels Instant Analysis Results to the Ubiq...
 
Bidata
BidataBidata
Bidata
 
Master in data science
Master in data scienceMaster in data science
Master in data science
 
Big data (word file)
Big data  (word file)Big data  (word file)
Big data (word file)
 

Recently uploaded

Navigating Identity and Access Management in the Modern Enterprise
Navigating Identity and Access Management in the Modern EnterpriseNavigating Identity and Access Management in the Modern Enterprise
Navigating Identity and Access Management in the Modern EnterpriseWSO2
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxRustici Software
 
TEST BANK For Principles of Anatomy and Physiology, 16th Edition by Gerard J....
TEST BANK For Principles of Anatomy and Physiology, 16th Edition by Gerard J....TEST BANK For Principles of Anatomy and Physiology, 16th Edition by Gerard J....
TEST BANK For Principles of Anatomy and Physiology, 16th Edition by Gerard J....rightmanforbloodline
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAndrey Devyatkin
 
[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdfSandro Moreira
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerThousandEyes
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherRemote DBA Services
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusZilliz
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FMESafe Software
 
WSO2's API Vision: Unifying Control, Empowering Developers
WSO2's API Vision: Unifying Control, Empowering DevelopersWSO2's API Vision: Unifying Control, Empowering Developers
WSO2's API Vision: Unifying Control, Empowering DevelopersWSO2
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDropbox
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MIND CTI
 
AI in Action: Real World Use Cases by Anitaraj
AI in Action: Real World Use Cases by AnitarajAI in Action: Real World Use Cases by Anitaraj
AI in Action: Real World Use Cases by AnitarajAnitaRaj43
 
WSO2 Micro Integrator for Enterprise Integration in a Decentralized, Microser...
WSO2 Micro Integrator for Enterprise Integration in a Decentralized, Microser...WSO2 Micro Integrator for Enterprise Integration in a Decentralized, Microser...
WSO2 Micro Integrator for Enterprise Integration in a Decentralized, Microser...WSO2
 
Stronger Together: Developing an Organizational Strategy for Accessible Desig...
Stronger Together: Developing an Organizational Strategy for Accessible Desig...Stronger Together: Developing an Organizational Strategy for Accessible Desig...
Stronger Together: Developing an Organizational Strategy for Accessible Desig...caitlingebhard1
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Victor Rentea
 
Simplifying Mobile A11y Presentation.pptx
Simplifying Mobile A11y Presentation.pptxSimplifying Mobile A11y Presentation.pptx
Simplifying Mobile A11y Presentation.pptxMarkSteadman7
 
Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native ApplicationsWSO2
 

Recently uploaded (20)

Navigating Identity and Access Management in the Modern Enterprise
Navigating Identity and Access Management in the Modern EnterpriseNavigating Identity and Access Management in the Modern Enterprise
Navigating Identity and Access Management in the Modern Enterprise
 
Understanding the FAA Part 107 License ..
Understanding the FAA Part 107 License ..Understanding the FAA Part 107 License ..
Understanding the FAA Part 107 License ..
 
Corporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptxCorporate and higher education May webinar.pptx
Corporate and higher education May webinar.pptx
 
TEST BANK For Principles of Anatomy and Physiology, 16th Edition by Gerard J....
TEST BANK For Principles of Anatomy and Physiology, 16th Edition by Gerard J....TEST BANK For Principles of Anatomy and Physiology, 16th Edition by Gerard J....
TEST BANK For Principles of Anatomy and Physiology, 16th Edition by Gerard J....
 
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
 
AWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of TerraformAWS Community Day CPH - Three problems of Terraform
AWS Community Day CPH - Three problems of Terraform
 
[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf[BuildWithAI] Introduction to Gemini.pdf
[BuildWithAI] Introduction to Gemini.pdf
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Strategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a FresherStrategies for Landing an Oracle DBA Job as a Fresher
Strategies for Landing an Oracle DBA Job as a Fresher
 
Exploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with MilvusExploring Multimodal Embeddings with Milvus
Exploring Multimodal Embeddings with Milvus
 
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers:  A Deep Dive into Serverless Spatial Data and FMECloud Frontiers:  A Deep Dive into Serverless Spatial Data and FME
Cloud Frontiers: A Deep Dive into Serverless Spatial Data and FME
 
WSO2's API Vision: Unifying Control, Empowering Developers
WSO2's API Vision: Unifying Control, Empowering DevelopersWSO2's API Vision: Unifying Control, Empowering Developers
WSO2's API Vision: Unifying Control, Empowering Developers
 
DBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor PresentationDBX First Quarter 2024 Investor Presentation
DBX First Quarter 2024 Investor Presentation
 
MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024MINDCTI Revenue Release Quarter One 2024
MINDCTI Revenue Release Quarter One 2024
 
AI in Action: Real World Use Cases by Anitaraj
AI in Action: Real World Use Cases by AnitarajAI in Action: Real World Use Cases by Anitaraj
AI in Action: Real World Use Cases by Anitaraj
 
WSO2 Micro Integrator for Enterprise Integration in a Decentralized, Microser...
WSO2 Micro Integrator for Enterprise Integration in a Decentralized, Microser...WSO2 Micro Integrator for Enterprise Integration in a Decentralized, Microser...
WSO2 Micro Integrator for Enterprise Integration in a Decentralized, Microser...
 
Stronger Together: Developing an Organizational Strategy for Accessible Desig...
Stronger Together: Developing an Organizational Strategy for Accessible Desig...Stronger Together: Developing an Organizational Strategy for Accessible Desig...
Stronger Together: Developing an Organizational Strategy for Accessible Desig...
 
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
Modular Monolith - a Practical Alternative to Microservices @ Devoxx UK 2024
 
Simplifying Mobile A11y Presentation.pptx
Simplifying Mobile A11y Presentation.pptxSimplifying Mobile A11y Presentation.pptx
Simplifying Mobile A11y Presentation.pptx
 
Architecting Cloud Native Applications
Architecting Cloud Native ApplicationsArchitecting Cloud Native Applications
Architecting Cloud Native Applications
 

Mission Critical Use Cases Show How Analytics Architectures Usher in an Artificial Intelligence Era

  • 1. Page 1 of 14 Mission Critical Use Cases Show How Analytics Architectures Usher in an Artificial Intelligence Era A discussion on how artificial intelligence and advanced analytics solutions coalesce into top competitive differentiators that prove indispensable for digital business transformation. Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise. Dana Gardner: Hello, and welcome to the next edition of the BriefingsDirect Voice of AI Innovation podcast series. I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator for this ongoing discussion on the latest insights into artificial intelligence (AI) use cases and strategies. Major trends in AI and advanced analytics are now coalescing into top competitive differentiators for most businesses. Access to advanced algorithms, more cloud options, high-performance compute (HPC) resources, and an unprecedented data asset collection have all come together to make AI more attainable -- and more powerful -- than ever. Stay with us as we now examine why AI is indispensable for digital transformation through deep-dive interviews on prominent AI use cases and their escalating business benefits. To learn more about analytic solutions that support mission-critical solutions, we’re joined by two experts, Andy Longworth, Senior Solution Architect in the AI and Data Practice at Hewlett Packard Enterprise (HPE) Pointnext Services. Welcome, Andy. Andy Longworth: Thank you, Dana. Gardner: We’re also here with Iveta Lohovska, Data Scientist in the Pointnext Global Practice for AI and Data at HPE. Welcome, Iveta. Iveta Lohovska: Thank you.Longworth
  • 2. Page 2 of 14 Gardner: Let’s look at the trends coalescing around modern analytics and AI and why they’re playing an increasingly essential role in digital business transformation. Andy, what do you see as top drivers making AI more prominent in most businesses? AI drives data to boost business Longworth: We have three main things driving AI at the moment for businesses. First of all, we know about the data explosion. These AI algorithms require huge amounts of data. So we’re generating that, especially in the industrial setting with machine data. Also, the relative price of computing is coming down, giving the capability to process all of that data at accelerating speeds as well. You know, the graphics processing units (GPUs) and tensor processing units (TPUs) are becoming more available, enabling us to get through that vast volume of data. And thirdly, the algorithms. If we look to organizations like Facebook, Google, and academic institutions, they’re making algorithms available as open source. So organizations don’t have to go and employ somebody to build an algorithm from the ground up. They can begin to use these pre-trained, pre-created models to give them a kick-start in AI and quickly understand whether there’s value in it for them or not. Gardner: And how do those come together to impact what’s referred to as digital transformation? Why are these actually business benefits? Longworth: They allow organizations to become what we call data driven. They can use the massive data that they’ve previously generated but never tapped into to improve business decisions, impacting the way they drive the business through AI. It’s transforming the way they work. Across several types of industry, data is now driving the decisions. Industrial organizations, for example, improve the way they manufacture. Without the processing of that data, these things wouldn’t be possible. 
Gardner: Iveta, how do the trends Andy has described make AI different now from a data science perspective? What's different now than, say, two or three years ago?

Lohovska: Most of the previous AI algorithms were 30, 40, and even 50 years old in terms of the linear algebra and their mathematical foundations. The higher levels of computing power enable newer computations and larger amounts of data to train those algorithms.
Those two components are fundamentally changing the picture, along with the improved taxonomies and the way people now think of AI as differentiated between classical statistics and deep learning algorithms.

Now it's not just technical people who can interact with these technologies and analytic models. Semi-technical people can, with simple drag-and-drop interactions based on the new products in the market, adopt and fail fast -- or succeed faster -- in the AI space. The models are also getting better and better in their performance, based on the amount of data they get trained on and their digital footprint.

Gardner: Andy, it sounds like AI has evolved to the point where it is mimicking human-like skills. How is that different, and how do such machine learning (ML) and deep learning change the very nature of work?

Let the simple tasks go, to the machines

Longworth: It allows organizations to move some of the jobs that were previously very tedious for people to machines, and to repurpose people's skills into more complex jobs.

Take computer vision applied to quality control. If you're creating the same product again and again and paying somebody to look at that product to say whether there's a defect on it, that's probably not the best use of their skills. And they become fatigued: if you look at the same thing again and again, you start to miss features and miss the things that have gone wrong.

A computer doesn't get that same fatigue. You can train a model to perform that quality-control step, and it won't become tired over time. It can keep going for longer than, for example, the eight-hour shift that a typical person might work. So you're seeing these practical applications, which then allow the workforce to concentrate on other things.

Gardner: Iveta, it wasn't that long ago that big data was captured and analyzed mostly for the sake of compliance and business continuity.
But data has become so much more strategic. How are businesses changing the way they view their data?

Lohovska: They are paying more attention to the quality of the data and the variety of the data collection they are focused on. From a data science perspective, even if I want to say that the performance of models is extremely important, and that my data science skills are a critical component of the AI space and ecosystem, it's ultimately about the quality of the data and the way it's pipelined and handled.
This process of data manipulation, getting to the so-called last mile of the data science contribution, is extremely important. I believe it's the critical step and foundation. Organizations will realize that being more selective and paying more attention to the foundations of how they handle big data -- or small data -- will get them to the data science part of the process.

You can already see the maturity as many customers, partners, and organizations pay more attention to the fundamental layers of AI. Then they can get better performance at the last mile of the process.

Gardner: Why are the traditional IT approaches not enough? How do cloud models help?

Cloud control and compliance count

Longworth: The cloud brings opportunities for organizations insomuch as they can try before they buy. If you go back to the idea of processing all of that data, before an organization spends real money on purchasing GPUs, they can try them in the cloud to understand whether they work and deliver value.

Then they can look at the delivery model. Does it make sense with my use case to make a capital investment, or do I go for a pay-per-use model using the cloud?

You also have the data management piece, which is understanding where your data is. In that sense, cloud doesn't necessarily make life any less complicated. You still need to know where the data resides, control that data, and put in the necessary protections in line with the value of the data type.

That becomes particularly important with legislation like the General Data Protection Regulation (GDPR) and the use of personally identifiable information (PII). If you don't have your data management under control and understand where all of the copies of that data are, then you can't be compliant with GDPR, which says you may need to delete all of that data.
So, you need to be aware of what you're putting in the cloud versus what you have on-premises, and where the data resides across your entire ecosystem.

Gardner: Another element of the past IT approaches has to do with particulars versus standards. We talk about the difference between managing a cow and managing a herd. How do we attain a better IT infrastructure model to achieve digital business transformation and fully take advantage of AI? How do we balance a standardized approach
with something that's appropriate for specific use cases? And why is the architecture of today very much involved with that sort of balance, Andy?

Longworth: The first thing to understand is the specific use case and how quickly you need insights. We can process data in near real-time, for example, or we can use batch processing like we did in days of old. The use case defines the kind of processing.

If you think about an autonomous vehicle, for example, you can't batch-process the sensor data coming from that car as it's driving on the road. You need to be able to do that in near real-time -- and that comes at a cost. You not only need to manage the flow of data; you need the compute power to process all of that data in near real-time.

So, understand the criticality of the data and how quickly you need to process it. Then we can build solutions to process the data within that framework and within the right time that it needs to be processed. Otherwise, you're putting additional cost into a use case that doesn't need to be there.

When we build those use cases, we typically use cloud-like technologies -- be that containers or scalable technologies. That allows portability of the use case, even if we're not necessarily going to deploy it in the cloud. It allows us to move the use case as close to the data as possible.

For example, for a computer vision use case on a production line, we don't want to be sending images to the cloud, with the high latency of processing the data there. We need a very quick answer to control the production process, so you want to move the inference engine as close to the production line as possible. And if we use things like HPE Edgeline computing and containers, we can place those systems right there on the production line to get the answers as quickly as we need.
So being able to move the use case to where it needs to reside is probably one of the biggest things we need to consider.

Gardner: Iveta, why is the so-called explore, experiment, and evolve approach, using such a holistic ecosystem of support, the right way to go?

Small steps, scientific method, solutions

Lohovska: Because AI is not easy. If it were easy, everyone would be doing it and we would not be having this conversation. It's not a simple statistical use case, or a program, or a business intelligence app where you already have the answer, or even an idea of the questions you are asking.

The whole process is in the data science title: you have the word "science," so there is a moment of research and uncertainty. It's about the way you explore the data, the way
you understand the use cases, starting from the fact that you have to define your business case and its scope. My advice is to start small and not exhaust your resources or the trust of the different stakeholders. Also define the correct use case and the desired return on investment (ROI).

HPE is even working on the definitions and the business case when approaching an AI use case, trying to understand the level of complexity and the level of prediction required for the use case to succeed. Such an exploration phase is extremely important so that everyone is aligned and finds the right path to minimize failure and get to the success of monetizing data and AI.

Once you have the fundamentals, once you have experimented with some use cases and you see them up and running in your production environment, that is the moment to scale them. I think we are doing a great job bringing all of those complicated environments together -- with their data complexity, model complexity, and networking and security regulations -- into one environment that's in production and can quickly bring value to many use cases.

This flow of experimenting, rather than approaching things as if you have a fixed answer or fixed approach, is extremely important, and it is the way we at HPE are approaching AI.

Gardner: It sounds as if we are approaching some sort of unified reference architecture that's inclusive of systems, cloud models, data management, and AI services. Is that what's going to be required? Andy, do we need a grand unifying theory of AI and data management to make this happen?

Longworth: I don't think we do. Maybe one day we will get to that point, but what we are reaching now is a clear understanding of which architectures work for which use cases and business requirements. We can then apply them without having to experiment every time, which complements what Iveta said.
When we start to look at these use cases, when we engage with customers, what's key is making sure there is business value for the organization. We know AI can work, but the question is, does it work in the customer's business context? If we can take out a good deal of that experimentation and come in with a fairly good answer to the use case in a specific industry, then we have a good jump start on that.

As time goes on and AI develops, we will see more generic AI solutions that can be used for many different things. But at the moment, it's really still about point solutions.
Gardner: Let's find out where AI is making an impact. Let's look first, Andy, at digital prescriptive maintenance and quality control. You mentioned manufacturing a little earlier. What's the problem, what's the context, and how are we getting better business outcomes?

Monitor maintenance with AI

Longworth: The problem is the way we do maintenance schedules today. If you look back in history, we had reactive maintenance: something breaks and then we fix it. Now, most organizations are in a preventative mode, where a manufacturer gives a service window and says, "Okay, you need to service this machinery every 1,000 hours of running." And that happens whether it's needed or not.

When we get into prescriptive and predictive maintenance, we only service those assets as they actually need it. That means having the data, understanding the trends, recognizing if problems are forthcoming, and then fixing them before they impact the business.

The data from that machinery may include temperature, vibration, and speed, giving a condition-based monitoring view and an understanding in real time of what's happening with the machinery. You can then also use past history to predict what is going to happen to that machine in the future. We can get to a point where we know in real time what's happening with the machinery and have the capability to predict failures before they happen.

The prescriptive piece comes in when we understand the business criticality, or the business impact, of an asset. If you have a production line with two pieces of machinery on it, both may have an identical probability of failure. But one is on your critical manufacturing path, and the other has some production buffer. As a business, the way you deal with those two pieces of machinery is different. You will treat the one on the critical path differently than the one where you have a product buffer.
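The condition-based monitoring Andy describes -- watching temperature, vibration, and speed for readings that drift away from the machine's normal behavior -- can be sketched in miniature. The sliding window and threshold below are illustrative assumptions, not parameters from any HPE solution; a minimal sketch:

```python
from collections import deque
from statistics import mean, stdev

def anomaly_flags(readings, window=20, threshold=3.0):
    """Flag sensor readings that deviate sharply from the recent baseline.

    A reading is flagged when it sits more than `threshold` standard
    deviations away from the mean of the previous `window` readings.
    """
    history = deque(maxlen=window)
    flags = []
    for value in readings:
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            flags.append(sigma > 0 and abs(value - mu) > threshold * sigma)
        else:
            flags.append(False)  # not enough baseline yet
        history.append(value)
    return flags

# Vibration trace: a stable baseline, then a spike that precedes a failure.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95] * 5 + [5.0]
print(anomaly_flags(vibration, window=10)[-1])  # → True (the spike is flagged)
```

A production system would add the prescriptive layer on top: the same flag on a critical-path asset and a buffered asset would trigger different responses.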
And so the prescriptive piece goes beyond the prediction to understanding the business context of that machinery, applying that to how you behave, and then how you react when something happens with that machine. That's the idea of the solution when we build digital prescriptive maintenance.

The side benefit we see is the quality control piece. If you have a large piece of machinery that you can attest is running perfectly during a production run, for example, then you
can say with some certainty what the quality of the product coming out of that machine will be.

Gardner: So we have AI overlooking manufacturing and processing. It's probably something that would make you sleep a little better at night, knowing that you have such a powerful tool constantly observing and reporting.

Let's move on to our next use case. Iveta, video analytics and surveillance. What's the problem we need to solve, and why is AI important to solving it?

Scrutinize surveillance with AI

Lohovska: For video surveillance and video analytics in general, the overarching field is computer vision. This is the most mature and currently the trendiest AI field, simply because the amount of data is there, the diversity is there, and the algorithms are getting better and better. It's no longer state-of-the-art work that is difficult to grasp, adopt, and bring into production. So the main goal now is moving into production and monetizing these types of data sources.

When you talk about video analytics or surveillance, or any kind of quality assurance, the main problem is improving on or detecting human errors, behaviors, and environments. Telemetry plays a huge role here, and there are many components and constraints to consider in this environment. That makes it hardware-dependent and also requires AI at the edge, where most of the algorithms and decisions need to happen. If you want to detect fire or fraud, or prevent certain types of failure -- quality failure or human failure -- time is extremely important.

At HPE Pointnext Services, we have been working on our own solutions and reference architectures to approach these problems, because the complexity of the environment, the different cameras, and the hardware handling the data acquisition process is, even at the beginning, enormous and very diverse. There is no one-size-fits-all.
There is no one provider or one solution that can handle surveillance use cases, or broad analytics use cases, at a manufacturing plant or an oil and gas rig where you are trying to detect fire or oil and gas spills across different environments. So being able to approach it holistically, to choose the right solution for the right component, and to design the architecture, is key.

It's also essential to have the right hardware and edge devices to acquire the data and handle the telemetry. Say you are positioning cameras in an outside environment, with different temperatures, vibrations, and heat. That will reflect on the quality of the acquired information going through the pipeline.
Some of the benefits of use cases using computer vision and video surveillance include real-time information coming from manufacturing plants. Knowing that all of the safety and security standards are met, and that the people operating are following the instructions and have the safeguards required for a specific manufacturing plant, is also extremely important.

When you have a quality assurance use case, video analytics is one source of information to tackle the problem. For example, improving the quality of your products or batches is just one application of the computer vision field. Having the right architecture, being agile and flexible, and finding the right solution for the problem -- and the right models deployed at the right edge device, or the right camera -- is something we are doing right now. We have several partners working to solve the challenges of video analytics use cases.

Gardner: When you have high-scaling, high-speed AI to analyze video, it's no longer a gating factor that you need humans reviewing the processes. It allows video to be used in so many more applications, even augmented reality, so that you are using video on both ends of the equation, as it were. Are we seeing an explosion of applications and use cases for video analytics and AI, Iveta?

Lohovska: Yes, absolutely. The impact of algorithms in this space is enormous. Open source datasets such as ImageNet, and open source models such as the ResNet family, mean a huge amount of data is available to train any kind of algorithm. You can adjust and pre-train them for your own use cases, whether in healthcare, manufacturing, or video surveillance. It's very enabling.

You can see the diversity of the solutions people are developing and the different problems they are tackling using computer vision capabilities -- not only from the algorithms, but also from the hardware side, because the cameras are getting more and more powerful.
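As a toy illustration of the kind of per-frame check that runs at the edge in these pipelines: real deployments use pre-trained deep models such as the ResNet variants Iveta mentions, but even a simple frame-differencing detector shows the shape of the problem -- flagging frames that change sharply from their predecessor. The tiny frames and threshold here are invented for illustration only:

```python
def frame_delta(prev, curr):
    """Mean absolute per-pixel difference between two grayscale frames."""
    flat_prev = [p for row in prev for p in row]
    flat_curr = [p for row in curr for p in row]
    return sum(abs(a - b) for a, b in zip(flat_prev, flat_curr)) / len(flat_prev)

def detect_events(frames, threshold=10.0):
    """Yield indices of frames that differ sharply from their predecessor."""
    for i in range(1, len(frames)):
        if frame_delta(frames[i - 1], frames[i]) > threshold:
            yield i

# Three 2x2 "frames": a static scene, sensor noise, then a sudden change
# (think smoke or flame entering the field of view).
frames = [[[10, 10], [10, 10]], [[11, 10], [10, 9]], [[200, 180], [190, 170]]]
print(list(detect_events(frames)))  # → [2]
```

The point of running such checks on the edge device, as the discussion notes, is latency: the decision happens next to the camera rather than after a round trip to the cloud.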
Currently, we are working on several projects in the part of the spectrum not visible to humans. This is enabled by the further development of the hardware that acquires the images we can't see.

Gardner: If we can view and analyze machines and processes, perhaps we can also listen and talk to them. Tell us about speech and natural language processing (NLP), Iveta. How is AI enabling those businesses to transform themselves?

Speech-to-text to protect and serve
Lohovska: This is another strong field where AI is used and still improving. It's not as mature as computer vision, simply because of the complexity of human language and speech, and the way speech gets recorded and transferred. It's a bit more complex, so it's not only a problem of technologists writing algorithms, but also of linguists being able to capture the grammar and write the right equations to solve those grammar problems.

One very interesting field in the speech and NLP area is speech-to-text: being able to transcribe speech into text. It's very helpful for emergency organizations handling emergency calls, or for fraud detection, where you need to detect fraud or danger in real time. Detecting that someone is in danger is a very common use case for law enforcement and security organizations, as is simply improving the quality of service for call centers.

This example is industry- and vertical-independent. You can have finance, manufacturing, retail -- but all of them have some kind of customer support. This is the most common use case: being able to record and improve the quality of your services based on the analysis you can apply.

Similar to the video analytics use cases, the problem here, too, is handling the complexity of different algorithms, different languages, and the varying quality of the recordings. A reference architecture, where the different components are designed on exactly this holistic approach, allows the user to explore, evolve, and experiment in this space. We choose the right component for the right problem and the right way to approach it. And in this case, if we combine the right data science tool with the right processing tool and the right algorithms on top, then we can design the solution and solve the specific problem.

Gardner: Our next and last use case for AI is one people are probably very familiar with, and that's autonomous driving technology (ADT).
Andy, how are we developing highly automated driving infrastructures that leverage AI and help us get to that potential nirvana of truly self-driving and autonomous vehicles?

Data processing drives autonomous vehicles

Longworth: There are several problems around highly autonomous driving, as we have seen. It's taking years to get to the point where we have fully autonomous cars, and there are clear advantages to it.
If you look, for example, at what the World Health Organization (WHO) says, there are more than 1 million deaths per year in road traffic accidents. One of the primary drivers for ADT is that we can reduce the human error in cars on the road -- and reduce the number of fatalities and accidents. But to get to that point, we need to train immensely complex AI algorithms that take massive amounts of data from the car.

Purely from the sensor point of view, we have high-definition cameras giving 360-degree views around the car. You have radar, GPS, audio, and vision systems. Some manufacturers use light detection and ranging (LIDAR), some do not. But you have all of these sensors giving massive amounts of data. And to develop those autonomous cars, you need to be able to process all of that raw data.

Typically, in an eight-hour shift, an ADT car generates somewhere between 70 and 100 terabytes of data. If you have an entire fleet of cars, you need to be able to get that data off of the cars very quickly so that you can get them back out on the road as soon as possible. Then you need to get that data from where you offload it into the data center, so that the developers, data scientists, analysts, and engineers can build the next iteration of the autonomous driving strategy.

When you have built that, tested it, and done all the things you need to do, you next need to be able to get those models and that strategy from the developers back into the cars again. It's like the other AI problems we have been talking about, but on steroids, because of the sheer volume of data and because of the impact should something go wrong.

At HPE Pointnext Services, we have developed a set of solutions that addresses several of the pain points in the ADT development process.
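Those volumes hint at why ingest is such a pain point. A back-of-the-envelope calculation, using the 70-100 TB per shift figure from the discussion and an eight-hour offload window that is purely an illustrative assumption:

```python
# Sustained throughput needed to clear one shift's sensor data before the
# next shift. Decimal terabytes (10^12 bytes) are assumed throughout.
TB = 10**12

def required_throughput_gbps(shift_tb, offload_hours):
    """Average ingest rate, in gigabytes per second, to clear the backlog."""
    return shift_tb * TB / (offload_hours * 3600) / 10**9

for shift_tb in (70, 100):
    rate = required_throughput_gbps(shift_tb, offload_hours=8)
    print(f"{shift_tb} TB in 8 h -> {rate:.2f} GB/s sustained")
```

Even at the low end, that is multiple gigabytes per second of sustained ingest per car, which is why the discussion turns next to pre-processing at the edge and prioritizing which data comes off the car first.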
First is the ingest: how can we use HPE Edgeline processing in the car to pre-process data and reduce the amount that has to be sent back to the data center? You also want to send back the most important data first after the eight-hour drive, and then send the run-of-the-mill backup data later.

The second piece is the data platform itself: building a massive data platform that is extensible to store all the data coming from the autonomous driving test fleet. That needs to expand as the fleet grows, and to support different use cases. The data platform and the development platform are massive not only in terms of the amount of data they need to hold and process, but also in terms of the required tooling.

We have been developing reference architectures to enable automotive manufacturers, along with the suppliers of those automotive systems, to build their data platforms and provide all the processing they need, so that their data scientists can continuously develop autonomous driving strategies and test them in a highly automated way, while also giving access to the data to the additional suppliers.
For example, the sensor suppliers need to see what's happening with their sensors while they are on the car. The platform we have been putting together is really concerned with having the flexibility for those different use cases; the scalability to support the data volumes of today, but also to grow to the data volumes of the not-too-distant future; and the speed and data locality, being able to provide high-speed parallel file systems, for example, to feed those ML development systems and help train the models.

So all of this pulls together the different components we have talked about with the different use cases, but at a scale that is much larger than several of the other use cases put together.

Gardner: It strikes me that the ADT problem, if solved, enables so many other major opportunities. We are talking about micro-data centers that provide high-performance compute (HPC) at the edge. We are talking about the right hybrid approach to the data management problem -- what to move, what to keep local, and how to apply a lifecycle approach to it. So, ADT is really a key use-case scenario.

Why is HPE uniquely positioned to solve ADT in a way that will then lead to so many enabling technologies for other applications?

Longworth: Like you said, with the micro-data center, every autonomous driving car essentially becomes a data center on wheels, so you have to provide compute at the edge to enable the processing of all that sensor data. If you look at the HPE portfolio of products, there are very few organizations that have edge compute solutions with the required processing power in such small packages. But it's also about being able to wrap it up in not only the hardware, but the solution on top, the support, and a flexible delivery model.
Lots of organizations want a cloud-like experience, not just in the way they consume the technology, but also in the way they pay for it. HPE providing everything as a service allows you to pay for your autonomous driving platform as you use it. Again, there are very few organizations in the world that can offer that end-to-end value proposition.

Collaborate, corroborate, and crack that nut

Gardner: Iveta, why does it take a team sport and solution approach from the data science perspective to tackle these major use cases?
Lohovska: I agree with Andy. It's the way we approach those complex use cases, and the fact that you can have them as a service -- not only infrastructure-as-a-service (IaaS) or data-as-a-service (DaaS), but AI and modeling-as-a-service (MaaS). You can have a marketplace for models, and being able to plug and play different technologies, experiment, and rapidly deploy them allows you to rapidly get value out of those technologies.

That is something we do on a daily basis with amazing experts, people with knowledge of the different layers. They can attack the complexity of those use cases from each side, because it requires not just data science and hardware, but a lot of domain-specific expertise to solve those problems. This is something we are doing in-house, and I am extremely happy to say that I have the pleasure of working with all of those amazing people and experts within HPE.

Gardner: And there is a great deal more information available on each of these use cases for AI. There are white papers on the HPE Pointnext Services website.

What else can people do, Andy, to get ready for these high-level AI use cases that lead to digital business transformation? How should organizations set themselves up on a people, process, and technology basis to become adept at AI as a core competency?

Longworth: It is about people, technology, process, and all of these things combined. You don't go and buy AI in a box. You need a structured approach. You need to understand which use cases give value to your organization, and be able to quickly prototype them, experiment with them, and prove the value to your stakeholders.

Where a lot of organizations get stuck is moving from that prototyping, proof of concept (POC), and proof of value (POV) phase into full production.
It is tough getting the processes and pipelines in place that enable you to transition from that small POV phase into a full production environment. If you can crack that nut, then the next use cases you implement, and the next business problems you want to solve with AI, become infinitely easier.

It is a hard step to go from POV to full production because there are so many pieces involved. You have the whole value chain, from grabbing hold of the data at the point of creation to processing that data, and making sure you have the right people and processes around it.

And when you come out with an AI solution that gives some form of inference, some form of answer, you need to be able to act upon that answer. You can have the best AI solution in the world giving you the best predictions, but if you don't build those predictions into your business processes, you may as well never have made them in the first place.
Gardner: I'm afraid we will have to leave it there. We have been exploring how major trends in AI and advanced analytics have coalesced into top competitive differentiators for many businesses. And we have learned how AI is indispensable for digital transformation by looking at several prominent use cases and their escalating benefits.

So please join me in thanking our guests, Andy Longworth, Senior Solution Architect in the AI and Data Practice at HPE Pointnext Services. Thank you so much, Andy.

Longworth: Thank you, Dana.

Gardner: And Iveta Lohovska, Data Scientist in the Pointnext Global Practice for AI and Data at HPE. Thank you so much, Iveta.

Lohovska: Thank you.

Gardner: And a big thank you as well to our audience for joining us for this sponsored BriefingsDirect Voice of AI Innovation discussion. I'm Dana Gardner, Principal Analyst at Interarbor Solutions, your host for this ongoing series of Hewlett Packard Enterprise-supported discussions.

Thanks again for listening. Please pass this along to your IT community, and do come back next time.

Listen to the podcast. Find it on iTunes. Download the transcript. Sponsor: Hewlett Packard Enterprise.

Copyright Interarbor Solutions, LLC, 2005-2020. All rights reserved.