Big data and data warehousing on Azure

The modern data warehouse combines relational and non-relational data, both structured and unstructured, created or stored by Azure services such as Azure Blob Storage and ingested with Azure Data Factory. These stores act as the primary source of data, while the relational layer supplies business rules, governance, and security models. The relational model also serves internal business applications, de-normalizing data where needed for reporting and consumption.
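To make the idea concrete, here is a minimal local sketch of what de-normalizing non-relational data into the relational layer looks like: semi-structured JSON documents (such as those landed in Blob Storage) are flattened and joined against a governed relational table. All names and sample data are hypothetical.

```python
import json

# Hypothetical "relational" table: governed, structured customer data.
customers = [
    {"customer_id": 1, "name": "Contoso"},
    {"customer_id": 2, "name": "Fabrikam"},
]

# Hypothetical semi-structured JSON documents, e.g. files landed in Blob Storage.
raw_events = [
    json.dumps({"customer_id": 1, "event": "page_view", "props": {"page": "/home"}}),
    json.dumps({"customer_id": 2, "event": "purchase", "props": {"amount": 42.0}}),
]

def flatten(doc: str) -> dict:
    """De-normalize one JSON document into a flat row."""
    record = json.loads(doc)
    props = record.pop("props", {})
    # Prefix nested keys so the flattened row has unambiguous column names.
    for key, value in props.items():
        record[f"prop_{key}"] = value
    return record

# Join the flattened events against the relational customer table.
by_id = {c["customer_id"]: c for c in customers}
warehouse_rows = [
    {**by_id[row["customer_id"]], **row} for row in map(flatten, raw_events)
]

for row in warehouse_rows:
    print(row)
```

In a real pipeline this flatten-and-join step would typically run in Azure Data Factory or Azure Databricks rather than in application code; the sketch only illustrates the shape of the transformation.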

Azure Databricks and Azure Machine Learning can be combined to deliver advanced analytics on big data. Likewise, Azure Blob Storage and Azure Data Factory are useful both for analytics and for preparing training data for advanced analytical models. Experimental models can be evaluated against large data sets to establish the credibility of their scores. Azure Cosmos DB ties these services together, as it is designed to support many different data models and consumption patterns.
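Checking "the credibility of model scores" usually means evaluating a trained model against a holdout set it has not seen. The sketch below is a hypothetical stand-in for a model trained in Azure Machine Learning or Azure Databricks, evaluated with a simple thresholded-accuracy check on synthetic data; nothing here calls real Azure APIs.

```python
import random

def score(features: list) -> float:
    """Toy model: the score is simply the mean of the feature vector."""
    return sum(features) / len(features)

def holdout_accuracy(rows, threshold=0.5):
    """Fraction of holdout rows where the thresholded score matches the label."""
    correct = sum(
        1 for features, label in rows if (score(features) >= threshold) == label
    )
    return correct / len(rows)

random.seed(0)
# Synthetic holdout data: the true label follows the mean of the features,
# with ~10% label noise so the evaluation is non-trivial.
holdout = []
for _ in range(1000):
    features = [random.random() for _ in range(4)]
    label = (sum(features) / len(features) >= 0.5) != (random.random() < 0.1)
    holdout.append((features, label))

print(f"holdout accuracy: {holdout_accuracy(holdout):.2f}")
```

The same evaluate-on-holdout pattern scales up directly in Databricks, where the scoring and aggregation run over distributed data instead of an in-memory list.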

Real-Time Analytics

Apache Kafka, Azure Databricks, and Azure IoT Hub extend the batch-based model above to deliver real-time analytics. A fast ingestion layer is added on top of the batch architecture so that inbound data can be analyzed as it arrives, without excluding the existing batch workloads. Azure Stream Analytics performs the same task, and in addition it can join inbound streaming data against data already stored at rest. Azure Cosmos DB supports the integration of modern, intelligent applications on top of this data.
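The join described above, inbound streaming data enriched against stored reference data, can be sketched locally as follows. This is a conceptual illustration of the pattern, not Azure Stream Analytics syntax; the device table, event fields, and `enrich` helper are all hypothetical.

```python
# "Stored" reference data at rest, e.g. a device table kept in Blob Storage.
reference_devices = {
    "dev-1": {"site": "Plant A", "type": "thermostat"},
    "dev-2": {"site": "Plant B", "type": "valve"},
}

# Inbound events from the fast ingestion layer, e.g. Azure IoT Hub or Kafka.
inbound_stream = [
    {"device_id": "dev-1", "temp_c": 21.5},
    {"device_id": "dev-2", "temp_c": 88.0},
    {"device_id": "dev-9", "temp_c": 19.0},  # unknown device: dropped by the join
]

def enrich(events, reference):
    """Inner-join each inbound event against the stored reference data."""
    for event in events:
        ref = reference.get(event["device_id"])
        if ref is not None:
            yield {**event, **ref}

enriched = list(enrich(inbound_stream, reference_devices))
for row in enriched:
    print(row)
```

In Azure Stream Analytics the equivalent is expressed declaratively as a SQL-like `JOIN` between a streaming input and a reference input, with the engine handling ingestion and scale.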

See this article for further reading on data warehouse monitoring and optimization.