Sky-High Capabilities with Analytics in the Cloud

Analyzing data to generate insights ranks as a top priority among enterprises. Organizations are streamlining data access and broadening the adoption of analytics – and, of course, analytics needs to be available on the go.

Join Surya Mukherjee, Lead Analyst, Technology at Ovum, and discover how modern cloud analytics providers have dispelled user anxieties around cloud and enabled exploration of their data, from anywhere.

The hype surrounding AI has reached a fever pitch, but practical applications that deliver concrete business results are already here, and it’s time to get real.

From taking routine requests off the hands of customer service agents – reducing costs, speeding up service, and boosting customer satisfaction – to reliably helping marketers deliver what a customer wants, when and where they want it, AI is popping up everywhere, and companies need to figure out where they stand, fast, to stay competitive.

Join this VB Live event, hosted by Forrester Senior Analyst Brandon Purcell and Kayak Chief Scientist Matthias Keller to learn more about how to identify the AI use cases that can transform your business, and how to get started, stat.

You’ll learn:
* What technologies fall under the AI umbrella
* How companies like Kayak use AI to understand customers and personalize experiences
* How to identify the right AI use case for your business
* Common challenges firms face when implementing AI
* Why AI’s time in the sun has finally come, and why it’s here to stay

AI isn’t a nice-to-have anymore; it’s a must-have. There’s a reason why corporate giants like Google, IBM, Yahoo, Intel, Apple, and Salesforce are competing to snatch up private AI companies. In the first quarter of 2017 alone, 37 AI companies were swallowed whole.

Companies aren’t just using these AI tools to enable existing marketing strategies; they’re taking those strategies to the next level, going beyond simple retention and delivering active engagement. AI technology offers companies unprecedented insight into customer behaviors, patterns, and beliefs, allowing them to seamlessly anticipate customer needs and serve up hyper-personalized, emotionally resonant campaigns where and when they’re most welcome.

2018 is the year to seize the AI advantage. To learn more about the technology you need, the opportunities it unlocks, and what it takes to get in the game, don’t miss this VB Live event!

In this webinar, you'll learn:
* What’s new in AI for 2018 -- and what’s coming down the pike
* How businesses are using AI to drive results
* How to go beyond customer retention and power customer engagement

Watson is a computer system capable of answering questions posed in natural language. Watson was named after IBM's first CEO, Thomas J. Watson. The computer system was specifically developed to answer questions on the quiz show Jeopardy! (where it beat its human competitors) and was then used in commercial applications, the first of which was helping with lung cancer treatment.

NetApp is now using IBM Watson in Elio, a virtual support assistant that responds to queries in natural language. Elio is built using Watson’s cognitive computing capabilities. These enable Elio to analyze unstructured data by using natural language processing to understand grammar and context, understand complex questions, and evaluate all possible meanings to determine what is being asked. Elio then reasons and identifies the best answers to questions with help from experts who monitor the quality of answers and continue to train Elio on more subjects.
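The retrieval step Elio performs can be sketched in miniature. The toy Python example below is not Watson’s actual implementation – the questions, answers, and commands are invented for illustration – but it shows the basic idea of scoring stored Q&A pairs against an incoming natural-language question and returning the best match:

```python
# Toy support-assistant retrieval (hypothetical FAQ content, not Watson):
# score each stored question by token overlap with the user's question
# and return the answer attached to the best-scoring one.

FAQ = {
    "How do I expand an aggregate?": "Use the storage aggregate add-disks command.",
    "How do I create a volume?": "Use the volume create command with -aggregate.",
    "How do I check cluster health?": "Run cluster show and review the health column.",
}

def tokenize(text: str) -> set[str]:
    """Lower-case the text and split into word tokens, dropping punctuation."""
    return {w.strip("?.,!").lower() for w in text.split()}

def best_answer(question: str) -> str:
    """Return the stored answer whose question shares the most tokens."""
    q_tokens = tokenize(question)
    scored = [(len(q_tokens & tokenize(k)), v) for k, v in FAQ.items()]
    return max(scored)[1]

print(best_answer("how can I check the health of my cluster"))
```

A production assistant replaces the overlap score with full natural language processing – grammar, context, and ranking of candidate meanings – but the shape of the pipeline (parse, score candidates, return the best) is the same.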

Elio and Watson represent a novel use of large quantities of unstructured data, helping solve problems on average four times faster than traditional methods. Join us at this webcast to learn more.

AI is changing the way organizations do business and how they interact with customers, and it continues to drive that change. Deep Learning and Natural Language Processing are becoming standard components of AI solutions. Deep Learning, inspired by the structure of the brain, uses deep neural networks. AlphaGo was the first AI system to defeat a professional human Go player and the first program to defeat a Go world champion, and it is arguably the strongest Go player in history. Baidu improved speech recognition accuracy from 89% to 99% using Deep Learning. Knowing Deep Learning tools has effectively become a job requirement for every AI and machine learning scientist.

In this session, we will discuss what Deep Learning is and why it is gaining popularity, and we will explain AI solutions built on Deep Learning with a practical example. Deep Learning has an edge over other machine learning techniques because its performance keeps improving as the volume of data increases. Further, Deep Learning enables hierarchical feature learning, i.e., learning feature hierarchies automatically from the data.
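As a minimal sketch of these ideas, the NumPy example below trains a small neural network with one hidden layer on the XOR problem – a task a linear model cannot solve, because it requires the hidden layer to learn intermediate features. The architecture and hyperparameters are arbitrary choices for illustration, not a recommendation:

```python
# Minimal neural-network sketch in NumPy: one hidden layer learning XOR.
# The hidden layer learns intermediate features; a linear model cannot
# represent XOR at all.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Weights: 2 inputs -> 8 hidden units -> 1 output
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros((1, 1))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses = []
lr = 1.0
for _ in range(2000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)          # hidden features
    p = sigmoid(h @ W2 + b2)          # prediction
    losses.append(float(np.mean((p - y) ** 2)))

    # Backpropagation of the mean-squared-error gradient
    dp = (p - y) * p * (1 - p)
    dW2 = h.T @ dp; db2 = dp.sum(0, keepdims=True)
    dh = dp @ W2.T * h * (1 - h)
    dW1 = X.T @ dh; db1 = dh.sum(0, keepdims=True)

    # Gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Stacking more hidden layers is what turns this sketch into “deep” learning: each layer builds features out of the previous layer’s features, which is exactly the hierarchical feature learning described above.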

In analytical reporting, the data and its presentation are often perfect, yet the data story falls flat. By borrowing storytelling techniques from Hollywood, you can effectively drive home the point of your data and take your data visualizations to the next level.

Join Ted Frank, Principal at Backstories Studio, for this webinar as he shows you the quick wins in storytelling, so that right away your stakeholders understand more and are eager – on the edge of their seats. It starts with finding your key story and staying out of the weeds, then moves to how to visualize it, and finally to how to deliver it so you get heard and make a bigger difference.

You can reach Ted at the following:
- ted@backstories.tv
- www.backstories.tv

Discover his book Get to the Heart below:
- ted@gettotheheartbook.com
- www.gettotheheartbook.com

In this talk we will see how, whether we are building our first product or revamping an existing one, Embedded Analytics can help us solve real customer problems, which builds product value and creates a competitive differentiator to propel our business forward.

Additionally, we'll take a deep look at how Embedded Analytics differs from traditional Business Intelligence and at the factors and trends driving its adoption.

Selling your house in financial-crisis-stricken Greece is to this day a great ordeal. When faced with this challenge, I was baffled by the scarcity of conclusive data on land value in my birthplace, Thessaloniki. Embarking on a personal mission, I collected and processed more than 10K online housing ads together with open data, and managed to render an insightful interactive visualization of actual real estate values at the borough and city-block level, which was published through the Greek media. Join me on a journey through the thought process behind it.
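The core aggregation behind such a visualization can be sketched with pandas. The data below is a tiny synthetic sample standing in for the ~10K scraped ads, and the column names (`borough`, `price_eur`, `area_m2`) are assumptions for illustration:

```python
# Sketch of the aggregation step: median price per square meter at
# borough level, the statistic that would drive a choropleth coloring.
# The ads here are synthetic stand-ins for the scraped listings.
import pandas as pd

ads = pd.DataFrame({
    "borough":   ["Kalamaria", "Kalamaria", "Toumba", "Toumba", "Center"],
    "price_eur": [120_000, 95_000, 80_000, 70_000, 150_000],
    "area_m2":   [80, 70, 75, 65, 100],
})

# Normalize price by area so boroughs with different housing stock compare fairly
ads["eur_per_m2"] = ads["price_eur"] / ads["area_m2"]

# Median is robust to the outlier listings that plague online ad data
per_borough = ads.groupby("borough")["eur_per_m2"].median().round(0)
print(per_borough)
```

In the real pipeline the same group-by simply runs at city-block granularity as well, and the results are joined to open geographic data for rendering.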

RIDE supports development in notebooks, an editor, R Markdown, Shiny apps, Bokeh, and other frameworks. Backed by R-Brain’s optimized kernels, R and Python 3 have full language support, IntelliSense, a debugger, and a data viewer. Autocomplete and content assistance are available for the SQL and Python 2 kernels. Spark (standalone) and TensorFlow images are also provided.

Using Docker to manage workspaces, the platform provides a secure, stable development environment, along with powerful admin controls over resources and levels of access, including memory usage, CPU usage, and idle time.

The latest stable version of the IDE is always available to all users without any upgrades or additional DevOps work. R-Brain also delivers customized development environments for organizations, which can set up their own Docker registries to use their customized images.

The RIDE Platform is a turnkey solution that increases efficiency in your data science projects by enabling data science teams to work collaboratively without needing to switch between tools. Explore and visualize data and share analyses, all in one IDE with root access and connections to Git repositories and databases.

IT is a key player in the digital and cognitive transformation of business processes, delivering solutions for improved business value with analytics. This session will explain, step by step, the journey to secure production while adopting new analytics technologies that leverage mainframe core business assets.

Data scientists are rare and highly valued individuals, and for good reason: making sense of data and using machine learning libraries requires an unusual blend of advanced skills. Why is it, then, that data scientists spend the majority of their time getting data ready for models, and only a fraction actually doing the high-value work?

In this talk we introduce the concept of Data Fabric, a new way to provide a self-service model for data, where data scientists can easily discover, curate, share, and accelerate data analysis using Python, R, and visualization tools, no matter where the data is managed, no matter the structure, and no matter the size.

We will talk through the role of Apache Arrow, the in-memory columnar data standard that is accelerating analytics for GPU-based processing, as well as the role of Pandas and Arrow in providing unprecedented speed in accessing datasets from Python.

We will discuss how Big Data, Artificial Intelligence and Machine learning are rapidly impacting businesses and customers, enabling another massive shift through technology enablement. Todd DeCapua will share how these capabilities are being leveraged in Performance Engineering now, and into the future.

Join us for the next Quality & Testing SIG Talk on Tuesday, January 9, 2018: http://www.vivit-worldwide.org/events/EventDetails.aspx?id=1041157&group=.

Researchers generate huge amounts of valuable unstructured data and articles from research every day. The potential of this information is huge: cancer and pharmaceutical breakthroughs, and advances in technology and cultural research that can improve the world we live in.

This webinar discusses how text mining and Machine Learning can be used to make connections across this broad range of files and help drive innovation and research. We discuss using Kubernetes microservices to analyse the data and then applying Machine Learning and graph databases to simplify the reuse of the data.
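A toy version of that connection-making step might look like the following, where abstracts are linked by shared keywords – a lightweight stand-in for the full text-mining and graph-database pipeline (the abstracts and stopword list are invented for illustration):

```python
# Toy "connection mining": link research abstracts that share key terms.
# Real pipelines replace the keyword overlap with NLP entity extraction
# and store the resulting edges in a graph database for reuse.
from itertools import combinations

abstracts = {
    "paper_a": "gene therapy shows promise in lung cancer treatment",
    "paper_b": "machine learning predicts lung cancer risk from scans",
    "paper_c": "new battery chemistry improves energy storage",
}

STOPWORDS = {"in", "from", "new", "shows", "the", "a", "of"}

def keywords(text: str) -> set[str]:
    """Crude keyword extraction: lower-case tokens minus stopwords."""
    return {w for w in text.lower().split() if w not in STOPWORDS}

# Edge list of the graph: pairs of papers labeled by shared keywords
edges = {}
for (id1, t1), (id2, t2) in combinations(abstracts.items(), 2):
    shared = keywords(t1) & keywords(t2)
    if shared:
        edges[(id1, id2)] = shared

print(edges)
```

Here the two cancer papers end up connected while the battery paper stands alone; at scale, those edges are exactly what a graph database makes queryable for researchers looking to reuse prior work.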

The data economy and digital technologies are deeply transforming almost all areas of our lives. Among the most heavily transformed are insurance and healthcare, with a number of really interesting developments possibly redefining the way we take care of ourselves and the way we consume and use insurance.

From harnessing the power of data to better help mental health patients, carers, and medical personnel with their treatments; to assessing the risk of developing a broad range of illnesses and engaging users with personalised healthy-living plans; to using big data and analytics to track and prepare for epidemics; to using data to better cover cars and drivers with car insurance; and finally to insurers using social media data to better engage with customers – this webinar proposes a fascinating exploration of the opportunities, risks, and new models supporting the digital transformation in insurance and healthcare.

Public cloud deployments have become irresistible in terms of flexibility, low barriers to entry, security, and developer friendliness. But the sheer inertia of traditional data lakes makes them difficult to transition to the cloud. In this talk we'll look at examples of how leading companies have made the transition using open source technologies and hybrid strategies.

Instead of following a "lift and shift" strategy for moving data lake workloads to the cloud, there are considerations unique to the cloud – compute (e.g., GPU, FPGA), storage (object store vs. file store), integrations, and security – that should be weighed alongside traditional approaches.

Viewers will take away techniques they can immediately apply to their own projects.

The concept of data lakes evolved to address the challenges and opportunities of managing big data.

Organizations are investing massive amounts of time and money to upgrade existing data infrastructures and build data lakes whether on-premises or in the cloud.

This talk will discuss architectures and design options for implementing data lakes with open source tools. Also covered are the challenges of upgrading and migrating from existing data warehouses, metadata management, supporting self-service, and managing production deployments.

As an Enterprise customer, you are potentially using IBM Z in a hybrid cloud implementation. Let's understand how to benefit from cloud access to mainframe data without moving it off Z, thereby improving security, reducing integration challenges, and meeting your GDPR auditors' needs.