Business Intelligence and Analytics

The world of Business Intelligence and Analytics is evolving quickly. Increasingly, the emphasis is on real-time Business Intelligence to enable faster decision making, and on Data Visualization, which enables data patterns to be seen more clearly. Key technologies involved in preparing raw data to be used for Business Intelligence, Reporting and Analytics – including ETL (Extract, Transform, and Load), CDC (Change Data Capture), and Data Deduplication – support a wide range of goals within organizations.
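The extract-transform-load pattern mentioned above can be illustrated with a minimal sketch. The CSV sample, table name, and field names below are invented for illustration; real ETL tools add scheduling, error handling, and connectors for many source and target systems.

```python
# Minimal ETL sketch: extract rows from a CSV source, transform them,
# and load the result into an in-memory SQLite table.
import csv
import io
import sqlite3

RAW_CSV = """customer,amount
alice,100.50
bob,42.00
alice,9.50
"""

def extract(source: str):
    """Extract: parse raw CSV text into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Transform: cast amounts to float and normalize names."""
    return [(r["customer"].title(), float(r["amount"])) for r in rows]

def load(rows, conn):
    """Load: write the cleaned rows into a target table."""
    conn.execute("CREATE TABLE sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 152.0
```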

Business Intelligence and Analytics Articles

The Independent Oracle Users Group (IOUG) is excited to be joining the Oracle technology community in San Francisco once again at Oracle OpenWorld 2016, September 18-22. IOUG's 30,000+ member community comprises the top Oracle technology experts from around the globe, several of whom will be presenting sessions on hot topics like Data Intelligence, IoT, Data Security, and Cloud migrations.

Thousands of members of the Oracle Applications Users Group (OAUG) get the answers they need by sharing best practices, case studies and lessons learned. As the world's largest education, networking and advocacy forum for users of Oracle Applications, the OAUG helps members connect to find the solutions they need to do their jobs better and to improve their organizations' return on investment in Oracle Applications.

Anyone who has ever attended Oracle OpenWorld knows that you must plan ahead. The conference held in San Francisco each fall is vast, and the upcoming conference, scheduled for September 18-22, 2016, promises to be equally expansive. Just as in years before, tens of thousands of attendees from well over 100 countries can be expected to converge to learn more about Oracle's ever-expanding ecosystem of technologies, products and services during thousands of sessions held at the Moscone Center and multiple additional venues in downtown San Francisco. Here, Database Trends and Applications presents the annual Who to See @ Oracle OpenWorld special section.

I had the pleasure to spend some time with my old friend Mark Souza, a general manager in the Data Platform team at Microsoft, while speaking at the SQL Saturday event in Dublin, Ireland. Now keep in mind that Mark and I have known each other since the 1990s when SQL Server was just being ported to a brand new operating system called Windows NT. Mark and I were having a laugh and more than a twinge of nostalgia about how much SQL Server has improved over the decades and now sits atop the heap on most analysts' "best database" reports. This isn't just two old-timers sharing a few war stories though. This is a living, breathing transformation that is still in process.

As IT decision making moves out of the IT department and into the functional areas of organizations, partnerships and collaboration become even more critical. According to an article in strategy+business on why CEOs must become more technology savvy, "the majority of technology spending (68%) is now coming from budgets outside of IT, a significant increase from 47% in 2014." What this means is that many critical technology decisions are being made without consulting IT professionals.

Regulatory compliance is a critical aspect of the IT landscape these days, and the ability to audit database activities showing who did what to which data when is a specific requirement of many industry and governmental regulations. There are six primary methods that can be used to accomplish database auditing.
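One common auditing method is application-side audit logging, where every data change is recorded alongside who made it, what was done, to which data, and when. The sketch below illustrates the idea; the table names, user, and fields are invented, and production systems would typically use database-level triggers, log readers, or dedicated audit tools rather than application code alone.

```python
# Sketch of application-side audit logging: apply a change and record
# an audit entry (who, what, which data, when) in the same transaction.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.execute("CREATE TABLE audit_log (who TEXT, action TEXT, target TEXT, at TEXT)")

def audited_update(conn, user, account_id, new_balance):
    """Update a balance and write the matching audit trail entry."""
    conn.execute("UPDATE accounts SET balance = ? WHERE id = ?",
                 (new_balance, account_id))
    conn.execute("INSERT INTO audit_log VALUES (?, ?, ?, ?)",
                 (user, "UPDATE balance", f"accounts.id={account_id}",
                  datetime.now(timezone.utc).isoformat()))
    conn.commit()

conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
audited_update(conn, "jsmith", 1, 250.0)
who, action, target, at = conn.execute("SELECT * FROM audit_log").fetchone()
print(who, action, target)  # jsmith UPDATE balance accounts.id=1
```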

With 20 million downloads to date, MongoDB is arguably today's fastest-growing database technology. MongoDB's rapid growth has been driven primarily by its attractiveness to developers. By using JavaScript Object Notation (JSON) documents as the native database format, MongoDB reduces the impedance mismatch between program code and database, allowing more agile and rapid application development.
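The reduced impedance mismatch can be seen in miniature below: the structure the program works with and the stored document share one shape, so no object-to-table mapping layer is needed. Plain JSON via the standard library stands in here for MongoDB's document storage, and the document fields are invented.

```python
# A nested in-program structure round-trips through a JSON document
# unchanged -- no joins or object-relational mapping code required.
import json

order = {
    "customer": "Acme Corp",
    "items": [
        {"sku": "A-100", "qty": 2},
        {"sku": "B-200", "qty": 1},
    ],
}

stored = json.dumps(order)       # what a document database would persist
roundtrip = json.loads(stored)   # what the application reads back

print(roundtrip["items"][0]["sku"])  # A-100
```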

When you think about the role of a database professional, you probably don't include "cost savings" in the list of responsibilities. Maybe if there were a clearer correlation between the work of database professionals and money, people would pay more attention. Well, it turns out, there is.

Can Oracle and its partners keep up with the increasing demands of customers for real-time digital capabilities? Is the Oracle constellation of solutions—from data analytics to enterprise applications—ready for the burgeoning requirements of the Internet of Things (IoT) and data-driven businesses? For Oracle—along with its far-flung network of software vendors, integrators, and partners—times have never been so challenging.

The month of June heralded the long-awaited release of Microsoft's SQL Server 2016. The 2016 edition includes unique security functionality with Always Encrypted that protects data both at rest and in motion, ground-breaking performance and scale as evidenced by the number-one performance benchmarks, and accelerated hybrid cloud scenarios and Stretch Database functionality that supports historical data on the cloud.

Oracle announced today that it is acquiring NetSuite in a transaction valued at approximately $9.3 billion. The transaction is expected to close in 2016. In a statement, Zach Nelson, CEO of NetSuite, said that NetSuite will benefit from Oracle's global scale and reach to accelerate the availability of its cloud solutions.

Datical, a provider of agile database automation solutions, is releasing an updated version of its namesake platform, improving the solution with change management enhancements to boost workflows. Datical DB 4.0 includes a new Automated Deployment Packager that allows database changes to be continually validated, packaged, and integrated via more pervasive, hands-off automation.

Organizations have directed a lot of attention recently to consolidation, automation, and cloud efforts in their data management environments. This will purportedly result in decreased demand for data managers and the need for fewer DBAs per group of databases. However, the opposite seems to be occurring. In actuality, there is a growing need for more talent, as well as expertise to manage through growing complexity. A new survey, sponsored by Idera and conducted by Unisphere Research among more than 300 data executives, managers, and professionals, finds that a more challenging data environment is arising due to a confluence of factors.

The potential of your business intelligence or data processing application is limited without a comprehensive data connectivity solution. Staying competitive and relevant requires a breadth of data connectivity options.

Our goal at Amazon Web Services (AWS) is to enable our customers to do things that were previously not possible, and make things that our customers can already do simpler and better at a much lower cost.

Data replication advances a number of enterprise goals, supporting scenarios such as distribution of information as part of business intelligence and reporting initiatives, facilitating high availability and disaster recovery, and as part of a no-downtime migration initiative.
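A toy sketch of log-based replication illustrates the core mechanic: the primary records each write in an ordered change log, and a replica applies the log to reach the same state. The class and key names are invented; real replication systems also handle conflicts, ordering guarantees, and failure recovery.

```python
# Toy log-based replication: primary appends writes to a log,
# replica applies any entries it has not yet seen.

class Primary:
    def __init__(self):
        self.data = {}
        self.log = []          # ordered list of (key, value) writes

    def write(self, key, value):
        self.data[key] = value
        self.log.append((key, value))

class Replica:
    def __init__(self):
        self.data = {}
        self.applied = 0       # position in the primary's log

    def sync(self, primary):
        """Apply all log entries past the last applied position."""
        for key, value in primary.log[self.applied:]:
            self.data[key] = value
        self.applied = len(primary.log)

primary, replica = Primary(), Replica()
primary.write("region", "us-east")
primary.write("region", "eu-west")
replica.sync(primary)
print(replica.data == primary.data)  # True
```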

Everyone knows the three Vs of big data - volume, velocity, and variety - but what's required is a solution that can extract valuable insights from new sources such as social networks, email, sensors, connected devices, the web, and smartphones.

Query and reporting solutions are part of a comprehensive business intelligence approach in every organization. As long as enterprises need to gather data, BI groups look to query and reporting programs as primary applications that produce output from information systems.

As data grows, organizations are looking for ways to dig up insights from underneath layers of information. Data mining solutions provide the tools that enable them to view those hidden gems and facilitate better understanding of new business opportunities, competitive situations, and complex challenges.

Business intelligence encompasses a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards, and data visualizations.

To evaluate data quickly and enable predictive analytics based on sources of rapidly changing data - including social media, sensors, and financial services data - streaming data solutions come to the rescue, providing the data when it is needed ... now.

Many enterprises are finding themselves with different options when it comes to moving, working with, and storing data. While one tool may be right for one organization, another or a combination of tools may be just what the doctor ordered.

Data virtualization enables the business and IT sides of an organization to work together more closely and in a much more agile fashion, and helps to reduce complexity.

A key component to data integration best practices, change data capture (CDC) is based on the identification, capture, and delivery of the changes made to enterprise data sources. CDC helps minimize access to both source and target systems as well as supports the ability to keep a record of changes for compliance, and is a key component of many data processes.
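The identification-capture-delivery cycle can be sketched naively by diffing two snapshots of a source table keyed by primary key and emitting insert, update, and delete events. Real CDC tools usually read the database transaction log instead of comparing snapshots; the data below is invented for illustration.

```python
# Naive snapshot-diff CDC: compare two {pk: row} snapshots of a source
# table and emit the change events needed to bring a target up to date.

def capture_changes(before: dict, after: dict):
    """Identify inserts, updates, and deletes between two snapshots."""
    events = []
    for pk, row in after.items():
        if pk not in before:
            events.append(("insert", pk, row))
        elif before[pk] != row:
            events.append(("update", pk, row))
    for pk in before:
        if pk not in after:
            events.append(("delete", pk, before[pk]))
    return events

before = {1: {"name": "alice"}, 2: {"name": "bob"}}
after = {1: {"name": "alice"}, 2: {"name": "robert"}, 3: {"name": "carol"}}
for event in capture_changes(before, after):
    print(event)
```

Because only the change events travel to the target (and can be retained as a record for compliance), neither the source nor the target needs to be fully rescanned on each sync.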

Data integration is critical to many organizational initiatives such as business intelligence, sales and marketing, customer service, R&D, and engineering. For many enterprises, however, the road to data-driven nirvana is stymied by the inflexible, calcified systems and processes that were laid out decades earlier and still control the data flow within many enterprises, according to Joe McKendrick, lead research analyst at Unisphere Research.

Effective data governance improves the usability, integrity, and security of enterprise data. As organizations gather, store, and access increasing volumes of data, strong data governance allows them to have confidence in the quality of that data for a variety of tasks as well as adhere to security and privacy standards.

The last thing companies want is tainted data merging with incorrect information. The process of maintaining data integrity enhances the reliability of information for use by a business. This is where tools to ensure data quality come in.
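A data quality tool's core job can be sketched as validating records against declared rules and separating clean rows from rejects. The rules and fields below are invented; real tools add profiling, standardization, and matching on top of rule checks like these.

```python
# Simple rule-based data quality check: each rule maps a field name to
# a validator; failing records are quarantined with their failed fields.

RULES = {
    "email": lambda v: isinstance(v, str) and "@" in v,
    "age": lambda v: isinstance(v, int) and 0 <= v <= 130,
}

def check_quality(records):
    """Split records into clean rows and (record, failed_fields) rejects."""
    clean, rejects = [], []
    for rec in records:
        failures = [f for f, ok in RULES.items() if not ok(rec.get(f))]
        if failures:
            rejects.append((rec, failures))
        else:
            clean.append(rec)
    return clean, rejects

records = [
    {"email": "a@example.com", "age": 34},
    {"email": "not-an-email", "age": 34},
    {"email": "b@example.com", "age": -5},
]
clean, rejects = check_quality(records)
print(len(clean), len(rejects))  # 1 2
```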

Modern data modeling products offer a range of important capabilities such as ways to compare models and then synchronize them, reverse- and forward-engineer systems, visual interfaces for improved ease of use, ways for data assets to be visually documented and reused, and native support for new and emerging database platforms. Many data modeling solutions also provide features to help users map and understand their information architecture to prepare for technology upgrades and process improvements, as well as understand the impact of changes before they are implemented for better appreciation of risks and challenges.

Data is increasingly appreciated by companies as their most valuable asset. But the problem is that this view is not held only by organizations themselves; others - including hackers - see it that way as well. IT and data managers can play a pivotal role in enterprise security because they are the insiders with trusted status, and they are aware of where the data is stored and how best to reduce or eliminate threats. Newer security technology can also relieve many of the manual burdens associated with database monitoring.

For fast-paced, turn-on-a-dime digital enterprises, with demands for 24-by-7 uptime, no activity is more vital than keeping database systems up and running. Today, database availability is no longer just a critical IT issue; it is a critical business issue. To be prepared in the event of system failures, infrastructure owners and DBAs have developed strategies to increase resiliency and assure availability of data.

Database development is growing more challenging all the time. Releases are expected to come out faster than ever, and teams are more spread out across global geographies, time zones, and skill levels. Software deployment, meanwhile, needs to be spread across cloud and onsite environments, and be accessible through more devices than ever. What's needed is an integrated end-to-end environment with multi-platform support that helps simplify development and provides automation to achieve repeatable processes and avoid potential risks that can translate into unanticipated delays.

With the growing appreciation of data as a valuable resource and the pressure on organizations to become data-driven, the role of database administration has become more critical than ever, and database administrators are increasingly appreciated for the critical role they play in delivering value to their organizations. Key concerns in database administration are availability, security, and integration across a variety of data types and storage mechanisms so that analysts and business users are empowered to uncover key insights when they need them. It's a lot to think about, and that is why strong database administration solutions are highly valued.

In 2016, Hadoop marked its 10th anniversary and now represents much more than a platform for the storage and batch processing of vast quantities of data from disparate sources in many formats. The Apache Hadoop framework, consisting of Hadoop Common, the Hadoop Distributed File System (HDFS), Hadoop YARN, and Hadoop MapReduce, remains central to most big data projects and to the creation of data lakes, but Hadoop has also expanded to represent a large ecosystem of more than 100 interconnected open source Hadoop-related projects.

There is more data available to organizations than ever before, but the goal remains the same - to unlock nuggets of gold, the useful information that will result in competitive advantage for the organization, allowing it to react to customers' needs with lightning speed, uncover new opportunities, and act fast to counter competitive threats.

Over the years, MultiValue technology has maintained its base of committed advocates despite the decades-long trend toward relational database management systems. And, now with an expanding appreciation for polyglot persistence, or put more simply, the selection of the best tool for the job, there is a growing recognition that different data management systems offer different benefits with some simply better suited for certain requirements than others.

While relational database technology is still the undisputed leader when it comes to enterprise data management, it is also becoming increasingly apparent that it is no longer the only game in town. By now it is clear that Not Only SQL or NoSQL technology represents a key piece of a data management picture that is increasingly diverse.

Cloud solutions and services continue to grow in acceptance for many reasons - the ease of deployment and upgrades, elastic scalability, and pay-as-you-go simplicity. Cloud database options offer an approach that allows organizations to run a database on a cloud platform, either running the database on the cloud independently or selecting a database service that is maintained by a public cloud service provider.

The rise of new data types over the past 10 years has led to new ways of thinking about data, and new data storage and management technologies such as Hadoop, NewSQL, and NoSQL. However, despite all the new technologies that have emerged in the last 10 years, one thing is clear: the relational database management system, which has been the enterprise workhorse for decades, will remain a critical component of the data architecture.

Today, data is being recognized and appreciated as an asset, and even, some have suggested, a kind of currency. But beyond the obvious businesses built on data - such as Airbnb's rental business, Uber's car service app, and Alibaba's online marketplace - every business today is striving to become a data-driven organization, with turn-on-a-dime agility and rapid insights into customer behaviors and desires.

The rapid expansion of the IT market shows no signs of abating. The growth of data in all its forms—from traditional data sources and newer sources such as social media and connected devices—is driving swift innovation. To address the need to secure, integrate, and draw meaningful insights from all this data, a steady flow of products and services, as well as new features to long-established offerings, continues to emerge.

Splice Machine is teaming up with Incedo, a technology solutions provider specializing in data and analytics, product engineering, and emerging technologies, to generate solutions that will help enterprises manage data and accelerate data processing.

SolarWinds, a provider of IT management software, has announced significant updates to the SolarWinds Database Performance Analyzer (DPA). Version 10.2 supports MySQL as a back-end database in addition to SQL Server and Oracle Database, includes visualization and analysis tools to help DBAs optimize for and resolve blocking, locking, and deadlock issues, and adds support for Microsoft SQL Server 2016.

NTT Security Corporation is bringing together its advanced analytics technologies to launch a specialized security company. NTT Security delivers Managed Security Services (MSS) and specialized security professional services that support the Full Security Life Cycle. These services will be taken to market globally and client engagement will be managed by NTT operating companies Dimension Data, NTT Communications and NTT Data.

Reltio, an applications and data management PaaS company, is receiving $22 million in Series B funding, with the investment being used to expand and accelerate the company's hiring and further innovation.

Talend S.A., a provider of big data and cloud integration solutions, successfully launched its IPO. The offering included 5,250,000 American Depositary Shares (ADSs), each representing one of its ordinary shares, at a price to the public of $18 per ADS, which was higher than anticipated - and then rose sharply.