Database Management

Relational Database Management Systems (RDBMSs) continue to do the heavy lifting in data management, while newer database management systems are taking on prominent roles as well. NoSQL database systems such as Key-Value, Column Family, Graph, and Document databases are gaining acceptance due to their ability to handle unstructured and semi-structured data. MultiValue, sometimes called the fifth NoSQL database, is also a well-established database management technology that continues to evolve to address new enterprise requirements.

Database Management Articles

Data governance is sometimes viewed as a roadblock that keeps data scientists and analysts from turning data into business insights quickly and efficiently. Yet, in the enterprise data and analytics work that we've undertaken for clients across diverse industries, we've found that it's often a lack of sound data governance that prevents organizations from realizing the full value of their data.

Nimble Storage's Predictive AF-Series All Flash arrays are now certified by SAP as an enterprise storage solution for the SAP HANA platform. As a result, Nimble customers can leverage their existing hardware and infrastructure components for their SAP HANA-based environments, providing an additional choice for organizations working in heterogeneous environments. This certification adds to the SAP HANA certification Nimble previously obtained for its Adaptive Flash CS-Series arrays for use as enterprise storage solutions for the SAP HANA platform.

SAP is releasing a next-generation data warehouse solution for running a real-time digital enterprise on premises and in the cloud. The new solution, SAP BW/4HANA, will be available on Amazon Web Services (AWS) and SAP HANA Enterprise Cloud (HEC).

The elastic and distributed technologies that run modern applications require a new approach to operations, one that learns about your infrastructure and assists IT operators with maintenance and problem-solving. The interdependencies between new applications are creating chaos in existing systems and surfacing the operational challenges of modern systems. Solutions such as microservices architectures alleviate the scalability pains of centralized proprietary services, but at a tremendous cost in complexity.

While Hadoop can store and access vast amounts of detailed data at lower cost, businesses still face high-performance demands and a bevy of business questions. Emerging in-memory frameworks, such as Apache Spark and SAP HANA Vora, give enterprises the tools to overcome the limitations of batch-oriented processing and achieve real-time, iterative access to data on Hadoop clusters.

SolarWinds, a provider of IT management software, has announced significant updates to the SolarWinds Database Performance Analyzer (DPA). Version 10.2 supports MySQL as a back-end database in addition to SQL Server and Oracle Database, includes visualization and analysis tools to help DBAs optimize for and resolve blocking, locking, and deadlock issues, and adds support for Microsoft SQL Server 2016.

When you think about the role of a database professional, you probably don't include "cost savings" in the list of responsibilities. Maybe if there were a clearer correlation between the work of database professionals and money, people would pay more attention. Well, it turns out, there is.

SHARE recently wrapped up its summer conference in Atlanta. James Vincent, immediate past president of SHARE, reflected on the changes that have taken place in the IT industry during his tenure and the key takeaways from the event, which took place July 31-August 5. "One takeaway is that SHARE is on the right track when it comes to its focus on the new IT generation, what we call zNextGen," said Vincent.

Google has announced that all of its database storage products are generally available and covered by corresponding service level agreements (SLAs). With this announcement, the company says, Cloud SQL, Cloud Bigtable and Cloud Datastore are now generally available. It is also releasing new performance and security support for Google Compute Engine.

The size and complexity of database environments are pushing IT resources at most organizations to the limit. This reduces agility and increases the costs and challenges associated with maintaining the performance and availability of these on-demand services. To address these concerns, many IT departments are looking for ways to automate routine tasks and consolidate databases.

As IT decision making moves out of the IT department and into the functional areas of organizations, partnerships and collaboration become even more critical. According to an article in strategy+business on why CEOs must become more technology savvy, "the majority of technology spending (68%) is now coming from budgets outside of IT, a significant increase from 47% in 2014." What this means is that many critical technology decisions are being made without consulting IT professionals.

Regulatory compliance is a critical aspect of the IT landscape these days, and the ability to audit database activities, showing who did what to which data, and when, is a specific requirement of many industry and governmental regulations. There are six primary methods that can be used to accomplish database auditing.
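As an illustration of one common method, trigger-based auditing writes a row to an audit table whenever the audited table changes. The sketch below uses Python's built-in sqlite3 module; the table and column names are hypothetical, and note that a production audit would also record the acting user, which SQLite triggers do not expose:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, owner TEXT, balance REAL);

-- Audit table: records what changed, to which row, and when.
CREATE TABLE accounts_audit (
    changed_at  TEXT DEFAULT (datetime('now')),
    action      TEXT,
    account_id  INTEGER,
    old_balance REAL,
    new_balance REAL
);

-- Trigger fires after every UPDATE and captures before/after values.
CREATE TRIGGER audit_accounts_update
AFTER UPDATE ON accounts
BEGIN
    INSERT INTO accounts_audit (action, account_id, old_balance, new_balance)
    VALUES ('UPDATE', OLD.id, OLD.balance, NEW.balance);
END;
""")

conn.execute("INSERT INTO accounts (owner, balance) VALUES ('alice', 100.0)")
conn.execute("UPDATE accounts SET balance = 250.0 WHERE owner = 'alice'")

for row in conn.execute(
        "SELECT action, account_id, old_balance, new_balance FROM accounts_audit"):
    print(row)  # -> ('UPDATE', 1, 100.0, 250.0)
```

The same pattern extends to INSERT and DELETE triggers; other methods trade the trigger overhead for transaction-log scraping or network-level capture.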

Redis Labs and Intel announced they have collaboratively benchmarked a throughput of 3 million database operations/second at under 1 millisecond of latency, while generating over 1GB NVMe throughput, on a single server with Redis on Flash and Intel NVMe-based SSDs.

Datameer is launching a newly redesigned Global Partner Program to recruit and enable a dynamic ecosystem of technology, services, and reseller partners. Datameer's new program introduces a rewards system that recognizes partners and helps form a trusted and easily accessible network for customers.

The Independent Oracle Users Group (IOUG) is excited to be joining the Oracle technology community in San Francisco once again at Oracle OpenWorld 2016, September 18-22. IOUG's 30,000+ member community comprises the top Oracle technology experts from around the globe, several of whom will be presenting sessions on hot topics like Data Intelligence, IoT, Data Security, and Cloud migrations.

Thousands of members of the Oracle Applications Users Group (OAUG) get the answers they need by sharing best practices, case studies and lessons learned. As the world's largest education, networking and advocacy forum for users of Oracle Applications, the OAUG helps members connect to find the solutions they need to do their jobs better and to improve their organizations' return on investment in Oracle Applications.

Anyone who has ever attended Oracle OpenWorld knows that you must plan ahead. The conference held in San Francisco each fall is vast, and the upcoming conference, scheduled for September 18-22, 2016, promises to be equally expansive. Just as in years before, tens of thousands of attendees from well over 100 countries can be expected to converge to learn more about Oracle's ever-expanding ecosystem of technologies, products, and services during thousands of sessions held at the Moscone Center and multiple additional venues in downtown San Francisco. Here, Database Trends and Applications presents the annual Who to See @ Oracle OpenWorld special section.

Working through the normal forms, be it a journey to the placid shores of third normal form, the more arid climes of Boyce-Codd normal form, or even the cloudy peaks of fourth normal form and beyond, means the database designer has already covered a lot of groundwork before normalization even begins. Before thinking of normalizing, one needs to have conceptualized the relations that might fall within the solution's scope.
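For a concrete taste of what third normal form demands, consider a hypothetical flat relation in which a customer's city depends on the order only transitively, via the customer. The sketch below (the data and names are made up for illustration) decomposes it into two relations and verifies that rejoining them is lossless:

```python
# Hypothetical flat relation with a transitive dependency:
# order_id -> customer, and customer -> city.
orders_flat = [
    (1, "acme",   "Boston"),
    (2, "acme",   "Boston"),
    (3, "zenith", "Dallas"),
]

# 3NF decomposition: move the customer -> city dependency into its own relation.
orders    = {(oid, cust) for oid, cust, _ in orders_flat}
customers = {(cust, city) for _, cust, city in orders_flat}

# Lossless-join check: natural-joining the two relations on customer
# recovers exactly the original tuples.
rejoined = {(oid, cust, city)
            for oid, cust in orders
            for c, city in customers if c == cust}
assert rejoined == set(orders_flat)
print(sorted(rejoined))
```

After the split, a customer's city is stored once, so updating it cannot leave contradictory copies behind, which is precisely the update anomaly 3NF removes.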

I had the pleasure to spend some time with my old friend Mark Souza, a general manager in the Data Platform team at Microsoft, while speaking at the SQL Saturday event in Dublin, Ireland. Now keep in mind that Mark and I have known each other since the 1990s when SQL Server was just being ported to a brand new operating system called Windows NT. Mark and I were having a laugh and more than a twinge of nostalgia about how much SQL Server has improved over the decades and now sits atop the heap on most analysts' "best database" reports. This isn't just two old-timers sharing a few war stories though. This is a living, breathing transformation that is still in process.

Oracle announced today that it is acquiring NetSuite in a transaction valued at approximately $9.3 billion. The transaction is expected to close in 2016. In a statement, Zach Nelson, CEO of NetSuite, said that NetSuite will benefit from Oracle's global scale and reach to accelerate the availability of its cloud solutions.

Datical, a provider of agile database automation solutions, is releasing an updated version of its namesake platform, improving the solution with change management enhancements to boost workflows. Datical DB 4.0 includes a new Automated Deployment Packager that allows database changes to be continually validated, packaged, and integrated via more pervasive, hands-off automation.

Organizations have directed a lot of attention recently to consolidation, automation, and cloud efforts in their data management environments. This will purportedly result in decreased demand for data managers and fewer DBAs per group of databases. However, the opposite seems to be occurring: in actuality, there is a growing need for more talent, as well as expertise to manage through growing complexity. A new survey, sponsored by Idera and conducted by Unisphere Research among more than 300 data executives, managers, and professionals, finds that a more challenging data environment is arising due to a confluence of factors.

The potential of your business intelligence or data processing application is limited without a comprehensive data connectivity solution. Staying competitive and relevant requires a breadth of data connectivity options.

Our goal at Amazon Web Services (AWS) is to enable our customers to do things that were previously not possible, and make things that our customers can already do simpler and better at a much lower cost.

Data replication advances a number of enterprise goals, supporting scenarios such as distributing information as part of business intelligence and reporting initiatives, facilitating high availability and disaster recovery, and enabling no-downtime migrations.

Everyone knows the three Vs of big data (volume, velocity, and variety), but what's required is a solution that can extract valuable insights from new sources such as social networks, email, sensors and connected devices, the web, and smartphones.

Query and reporting solutions are part of a comprehensive business intelligence approach in every organization. As long as enterprises need to gather data, BI groups look to query and reporting programs as primary applications that produce output from information systems.

As data grows, organizations are looking for ways to dig up insights from underneath layers of information. Data mining solutions provide the tools that enable them to view those hidden gems and facilitate better understanding of new business opportunities, competitive situations, and complex challenges.

Business intelligence encompasses a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards, and data visualizations.

To evaluate data quickly and enable predictive analytics based on sources of rapidly changing data, including social media, sensors, and financial services data, streaming data solutions come to the rescue, providing the data when it is needed: now.
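The core idea, processing each reading the moment it arrives rather than waiting for a batch, can be sketched in a few lines. This is a minimal illustration with an invented sensor feed, not any particular streaming product:

```python
from collections import deque

def sliding_average(stream, window=3):
    """Emit a rolling average over the last `window` readings as each arrives."""
    buf = deque(maxlen=window)          # keeps only the most recent readings
    for reading in stream:
        buf.append(reading)
        yield sum(buf) / len(buf)       # result is available immediately

# Hypothetical sensor feed arriving one reading at a time.
sensor_feed = [10.0, 12.0, 11.0, 30.0, 9.0]
for avg in sliding_average(sensor_feed):
    print(round(avg, 2))
```

Because the generator yields after every reading, a consumer sees an up-to-date aggregate continuously, which is the essential contrast with batch-oriented processing.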

Many enterprises are finding themselves with different options when it comes to moving, working with, and storing data. While one tool may be right for one organization, another tool or a combination of tools may be just what the doctor ordered.

Data virtualization allows the business and IT sides of an organization to work together more closely and in a much more agile fashion, and helps to reduce complexity.

A key component of data integration best practices, change data capture (CDC) is based on the identification, capture, and delivery of the changes made to enterprise data sources. CDC helps minimize access to both source and target systems, supports keeping a record of changes for compliance, and underpins many data processes.
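A minimal sketch of the identify-capture-deliver cycle, assuming a snapshot-diff style of capture (production CDC tools typically read the database transaction log instead, precisely to avoid repeatedly querying the source):

```python
def capture_changes(old, new):
    """Diff two snapshots of a keyed table into insert/update/delete records."""
    changes = []
    for key, row in new.items():
        if key not in old:
            changes.append(("insert", key, row))
        elif old[key] != row:
            changes.append(("update", key, row))
    for key in old:
        if key not in new:
            changes.append(("delete", key, None))
    return changes

def apply_changes(target, changes):
    """Deliver the captured changes to a target replica."""
    for op, key, row in changes:
        if op == "delete":
            target.pop(key, None)
        else:
            target[key] = row

source_before = {1: "alice", 2: "bob"}
source_after  = {1: "alice", 2: "robert", 3: "carol"}

log = capture_changes(source_before, source_after)  # doubles as a compliance record
target = dict(source_before)
apply_changes(target, log)
assert target == source_after
print(log)
```

Note that the change log itself is the artifact that serves both purposes the paragraph mentions: it is the payload delivered to the target, and it is the retained record of who-changed-what for compliance.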

Data integration is critical to many organizational initiatives such as business intelligence, sales and marketing, customer service, R&D, and engineering. For many enterprises, however, the road to data-driven nirvana is stymied by the inflexible, calcified systems and processes that were laid out decades earlier and still control the data flow within many enterprises, according to Joe McKendrick, lead research analyst at Unisphere Research.

Effective data governance improves the usability, integrity, and security of enterprise data. As organizations gather, store, and access increasing volumes of data, strong data governance allows them to have confidence in the quality of that data for a variety of tasks, as well as adhere to security and privacy standards.

The last thing companies want is tainted data merging with incorrect information. The process of maintaining data integrity enhances the reliability of information for use by a business. This is where tools to ensure data quality come in.