Cloud Computing

Public Cloud, Private Cloud, and Hybrid Cloud approaches as well as services such as Software-as-a-Service (SaaS), Infrastructure-as-a-Service (IaaS), and Platform-as-a-Service (PaaS) are being embraced for cost efficiency, scalability, and consolidation. Hypervisor technologies, Open Source versus Proprietary Hypervisor technologies, Security, Open Standards, Interoperability, High Availability, and Backup & Recovery are among the issues that are critical to consider when evaluating a move to the cloud.

Cloud Computing Articles

As mobile has pushed deeper into enterprises, there is a growing recognition that it may be possible to run significant parts of businesses from relatively small devices. While mobile devices may not be ready to run entire enterprises, in many cases, they certainly can run more limited functions.

Today's enterprise database environments are growing in size and complexity, fueled by rising data volumes and new business demands. Many databases that have long powered mission-critical applications at the heart of existing enterprises are now being positioned to support new digital-native businesses. As a result, 24x7 high availability is no longer a luxury for select applications; it's a necessity for the bulk of the business. Many organizations can no longer afford downtime in their data environments, even for a minute.

Today's products and services connect to the world around their users. In an effort to help people support more devices and understand the Internet of Things space, Verizon launched ThingSpace, an IoT development platform that allows its enterprise customers to build and connect their own IoT applications.

The top reasons for implementing Internet of Things projects include increasing new business revenue sources, increasing customer and product knowledge, and reducing operating expenses. DBTA recently held a roundtable webinar on how best to derive business value from IoT featuring Kevin Petrie, senior director and technology evangelist at Attunity; Jamie Morgan, senior solutions architect at HPE Security - Data Security; and Becky Hanenkrat, consulting sales specialist for North America database and data warehousing at IBM.

Oracle has signed a definitive agreement to acquire Wercker, a company based in Amsterdam, Netherlands, that provides a Docker-native continuous integration and deployment automation platform for Kubernetes and microservice deployments.

Fujitsu and Oracle jointly announced the launch of Fujitsu SPARC M12, a new lineup of enterprise servers available worldwide. Featuring Fujitsu's new SPARC64 XII processor, the companies say, the Fujitsu SPARC M12 servers achieve the world's highest per CPU core performance in arithmetic processing, offering improvements for a range of database workloads, from mission-critical systems on premises to big data processing in the cloud.

Oracle has unveiled its Cloud Converged Storage, which, it says, represents the first time a public cloud provider at scale has integrated its cloud services with its on-premises, high performance NAS storage systems.

Innovative technologies, such as artificial intelligence, augmented reality, robotics, and IoT, have had, and will continue to have, broad impact that we don't yet fully understand. Organizations that adopt these technologies will require new business models and processes. We will need to understand who our customers are and what they expect. The world of work as we know it today will continue to evolve at a faster pace—this is why adaptability and resilience are critical to a vibrant career.

NTT Security, the specialized security company of NTT Group, has announced the formation of the Global Threat Intelligence Center (GTIC). Cyber threats are by nature global, and NTT Security's threat intelligence capabilities will now reflect this, offering a comprehensive view of the threat landscape but with regional delivery.

M-Files Corporation, a provider of solutions to improve how businesses manage documents and other information, has introduced offerings that help companies better protect their customers' personal data and adhere to GDPR requirements. The objective of the European Union (EU) General Data Protection Regulation (GDPR) is to simplify and harmonize data privacy laws across Europe, and give EU citizens control of their Personally Identifiable Information (PII). There are substantial non-compliance fines—up to 20 million euros or 4% of global annual turnover based on the preceding financial year, whichever is greater.

IBM has announced it is the first provider to make the NVIDIA Tesla P100 GPU accelerator available on the cloud. The combination of NVIDIA's acceleration technology with IBM's Cloud platform is intended to help organizations more efficiently run compute-heavy workloads, such as artificial intelligence, deep learning, and high-performance data analytics.

To meet the new demands of managing cloud infrastructure proactively, a new role has emerged: the "cloud keeper." Part technologist, part accountant, and part administrator, the cloud keeper is financially responsible for keeping infrastructure expenses under control to prevent financial chaos. The role is also technical, since it requires an understanding of how and where resources are deployed: the cloud keeper must know how each resource is paid for and have enough expertise to judge which resources can be spun up or down, or would be better suited to one cloud paradigm than another.

Companies often view processes from their own frame of reference, "cutting" them up by department, business objective, or other internal dimension. Customers, of course, do not act according to the same taxonomy; from the company's perspective, they appear to jump from process to process, department to department, and channel to channel, making it difficult for businesses to truly follow a customer through his or her whole journey.

By now we are all in agreement: The business of data is changing. Business users are more empowered to work with data; IT is becoming less about control and more about enablement. New job titles, such as the data scientist, are springing up as companies everywhere look for the right people with the right skill sets to squeeze more value from their data. Data itself is getting bigger, hardware more economical, and analytical software more "self-service." We've embraced the paradigm shift from traditional BI to iterative data discovery. It's a new era.

Make no mistake: Big data is promising, exciting, and effective—when done right. Once considered an overhyped buzzword, it's now a potential tool that leaders in every vertical want to harness. Unfortunately, the majority of new big data projects—about 55% of them, according to Gartner—are shuttered before they even get off the ground.

There has been a sea change in how enterprises are thinking about Apache Hadoop and big data. Today, a majority of enterprises are thinking about the cloud first, not on-premises, and are increasingly relying on ecosystem standards to drive their Apache Hadoop distribution selection.

Elastic and Google have formed a partnership to bring managed support of Elastic's open source search and analytics platform to Google Cloud Platform (GCP). The partnership will provide customers a managed open source search and analytics solution that leverages GCP's global network and scale.

Voting has opened for the 2017 Database Trends and Applications Readers' Choice Awards. Unlike other awards programs that rely on our editorial staff's evaluations, the DBTA Readers' Choice Awards are unique in that the winning information management solutions are chosen by you - the people who actually use them.

I always look forward to new research from Unisphere Research, a division of Information Today, Inc., publishers of this magazine and other great products for data professionals. The latest report, which you should read, is "SQL Server Transformation: Toward Agility & Resiliency 2017; PASS Database Management Survey."

There has been plenty of dialogue within the IT community about when to migrate to the cloud, how to migrate to the cloud, which provider offers customers the best cloud environments, and the due diligence or governance that is necessary before taking that big step. There are larger waves of change nipping at our heels, yet we seem content to continue discussing a technology that is a means to an end.

There's a wide and growing acceptance that containers are replacing operating systems as the deployment target for application components. While application modules were previously designed to be installed upon a specific version of an operating system on a particular hardware platform, they are now increasingly being designed to run within a virtualized representation of an operating system—most frequently within a Docker container.

Many IT organizations are looking to the cloud to move, create, or extend existing database infrastructure. Perhaps yours is one of them. Yes, the world of the database professional is shifting, this we know. The real questions are how is it shifting, and what can you, as a DBA, do to be successful in this new world?

While not the most media-hyped technology, databases are certainly one of the most crucial when it comes to our always-online, always-connected society. Databases power not just the applications and websites we use every day, but the businesses that generate revenue and fuel the economy. The internet relies on functioning and well-performing databases to operate.

"Temporality" is a term that database managers know well, but it may be a new one for business managers. That has to change, because the temporality your database supports, or how it handles time, could determine whether the business increases revenue, pays a fine, or identifies new opportunities. Especially important in this regard is "bitemporality," the ability to examine data across different points in time.
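To make the idea concrete, here is a minimal Python sketch of a bitemporal record, assuming a hypothetical fact table in which each row carries both a valid-time interval (when the fact was true in the real world) and a transaction-time interval (when the database believed it); the names and data are illustrative, not from any specific product.

```python
from dataclasses import dataclass
from datetime import date

FAR_FUTURE = date(9999, 12, 31)  # conventional "open-ended" sentinel

@dataclass
class Row:
    value: str
    valid_from: date     # when the fact became true in the real world
    valid_to: date       # when it stopped being true
    recorded_on: date    # when the database learned this version
    superseded_on: date  # when a later correction replaced it

rows = [
    # On Jan 10 we recorded that the customer's tier was "gold" from Jan 1.
    Row("gold", date(2017, 1, 1), FAR_FUTURE, date(2017, 1, 10), date(2017, 2, 5)),
    # On Feb 5 a correction arrived: the tier was actually "silver" from Jan 1.
    Row("silver", date(2017, 1, 1), FAR_FUTURE, date(2017, 2, 5), FAR_FUTURE),
]

def as_of(rows, valid_on, known_on):
    """What did the database believe on `known_on` about the state on `valid_on`?"""
    for r in rows:
        if (r.valid_from <= valid_on < r.valid_to
                and r.recorded_on <= known_on < r.superseded_on):
            return r.value
    return None

# Before the correction we believed "gold"; afterward, "silver" --
# the real-world date queried is the same in both cases.
print(as_of(rows, date(2017, 1, 15), date(2017, 1, 20)))  # gold
print(as_of(rows, date(2017, 1, 15), date(2017, 3, 1)))   # silver
```

The two time axes are what let a business answer audit questions such as "what did we report last quarter, given what we knew then?" without losing the corrected history.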

It is difficult to find someone not talking about or considering using containers to deploy and manage their enterprise applications. A container just looks like another process running on a system; a dedicated CPU and pre-allocated memory aren't required in order to run a container. The simplicity of building, deploying, and managing containers is among the reasons that containers are growing rapidly in popularity.

Hadoop continues to gain meaningful traction, and organizations are now anticipating moving their analytics and business intelligence onto the platform. At Data Summit 2017, Josh Klahr, vice president of products at AtScale, will discuss what enterprise users need to know to be successful with business intelligence on big data.

Cohesity, a provider of hyperconverged secondary storage, is receiving its largest funding to date, raising over $90 million in a Series C round co-led by investors GV (formerly Google Ventures) and Sequoia Capital, to expand sales and marketing to meet explosive customer demand. The investment will accelerate Cohesity's research and development of additional secondary storage use cases beyond data protection, with a special focus on analytics, test/dev, file services, and object services.

Data security provider McAfee has begun operating as a new standalone entity, with plans to expand upon its security solutions platform to better enable customers to effectively identify and orchestrate responses to cyberthreats.

Teradata is launching an innovative database license model across hybrid cloud deployments, giving users more portability for deployment flexibility, subscription-based licenses, and simplified tiers with bundled features. With portable database licenses, Teradata customers can now have the flexibility to choose, shift, expand, and restructure their hybrid cloud environment by moving licenses between deployment options as their business needs change.

EnterpriseDB (EDB) has released the EDB Postgres Ark Database-as-a-Service (DBaaS) framework to the Amazon Web Services (AWS) Marketplace. EDB Postgres Ark enables customizable deployments of Postgres clusters to private and public clouds, such as AWS and Red Hat OpenStack.

GridGain Systems, provider of enterprise-grade in-memory computing platform solutions based on Apache Ignite, has obtained certifications from Hortonworks and Tableau and joined their technology partnership programs. GridGain says these relationships will make it easier for its customers to launch high performance big data systems built on Hortonworks that leverage in-memory computing and to visualize in-memory data held in GridGain using Tableau.

In an effort to alleviate an impending critical shortage of developers, Cloud Foundry Foundation, an open source project whose stated purpose is to make Cloud Foundry the leading application platform for cloud computing, is launching a cloud-native developer certification initiative. The "Cloud Foundry Certified Developer" program will be delivered in partnership with The Linux Foundation. Training will be offered by companies including SAP, IBM, and Pivotal to help meet end user demand.

The International Association of Cloud & Managed Service Providers (MSPAlliance) has formed a partnership with Ingram Micro Cloud to launch the MSPAlliance's MSP/Cloud Verify Program, an initiative to promote best practices and improve service delivery among the managed service provider (MSP) and cloud computing community.

The Independent Oracle Users Group (IOUG) has represented the voice of data technologists and professionals for more than 20 years, and we are excited about how our community continues to grow and focus on peer-to-peer education and know-how. With that focus, we look forward to our premier annual event: COLLABORATE 17 - IOUG Forum.

ManageEngine, the real-time IT management company, is adding two-factor authentication (2FA) support in ADManager Plus, its Active Directory (AD) management and reporting solution, making its platforms much more secure. The addition of 2FA support establishes an extra layer of security around ADManager Plus logins, while user provisioning support makes the iOS app a complete user life cycle management tool for on-the-go AD admins.

Kinetica, provider of an in-memory analytics database accelerated by GPUs, has partnered with Safe Software to create FME connectors that read and write data between Kinetica and FME workspaces.

TIBCO Software Inc., a provider of software for integration, API management, and analytics, has announced availability of a new API management offering, TIBCO Mashery Professional, which is targeted at helping to speed up digital transformation activities.