
I have been wanting to create this content concept for a long time. I am going to call this the “Azure Bits-N-Bytes” series. The idea is to create short videos on Azure-related topics (ideally less than 5 minutes each) and share them.

In this video I talk about how to create a SQL Azure DB using the portal’s quick create option.

As you can see, this is the simplest option for creating a SQL DB on Azure. Easy as it is, there are a number of options we will need to play around with afterwards to access this DB. We will discuss those in future posts.

Do let me know if you are interested in these types of quick videos in the future too. Based on your feedback, we will increase the frequency of such posts.

I am a big supporter of PowerShell and automation when it comes to working with Azure. In the recent past, I have had SQL developers ask me how they can start playing with PowerShell for Azure. Though the concepts of PowerShell are awesome and I use them with SQL Server, it is often difficult for them to get set up on their machines. The question usually is: where do I start? This blog will get you started with installing and configuring your machine with PowerShell and Azure.

We can see the installation progress, and once done, we will be presented with information about the various components that get installed as part of this.

Now, to use PowerShell with our Azure subscription, we need to link our account. If you are using an organizational account, use the Add-AzureAccount command from the Windows Azure PowerShell prompt. This is useful ONLY when we are using Azure AD for our account. If you are not using Azure AD, move to the next step.

In most cases, we will have a standard Live Account associated with our subscription. In this case, we need to download the certificate. To do this, invoke the Get-AzurePublishSettingsFile command from the PowerShell prompt.

We will be prompted with a save dialog; save this certificate file in a secure location. The next step is to import it so that we can start using PowerShell from our local machine to work with our subscription remotely.

For this we will invoke: Import-AzurePublishSettingsFile “Path of certificate file”

Once this is done, we are connected and we will get a confirmation of the current subscription being used.

We can quickly get details about our current subscription using the Get-AzureSubscription command, which outlines some of this information.
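Putting the steps above together, a typical first session looks roughly like this (the certificate file path is only a placeholder for wherever you saved your own file):

```powershell
# Organizational (Azure AD) accounts only: sign in interactively
Add-AzureAccount

# Live Account route: download the publish settings file (opens a browser)
Get-AzurePublishSettingsFile

# Import the downloaded file to link this machine to the subscription
Import-AzurePublishSettingsFile "C:\Secure\MySubscription.publishsettings"

# Verify which subscription is now active
Get-AzureSubscription
```

Remember that the publish settings file contains a management certificate for your subscription, so delete or lock it down once the import succeeds.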

Now we are all set with getting started with PowerShell and Azure. In future posts we will use this configuration as a starting point to play around with creating new VMs, stopping VMs, deleting VMs, creating other services and much more. Stay tuned!!!

Being a data person and meeting customers almost every single day to discuss moving to Azure (the cloud), I take the opportunity to demystify a lot of basic concepts with customers. After the session, it is about thinking in the right direction while taking data to the cloud. This blog reflects lessons learned, best practices, approaches and understanding from real-world experiences. These are not written-in-stone recommendations, but they are approaches I highly recommend customers keep in mind while taking data to the cloud.

Know your workload

The basic conversation here is to identify if the application is a “Green Field” project or an “Existing Application”. In a “Green Field” project, we will be exposing data via a service for the first time. Like a literal green field, we are dealing with an undeveloped environment and don’t have any expectations, limitations, dependencies or other constraints that would result from having an existing service.

For existing applications, services may reside on our on-premises servers, in a partner-hosted environment or in third-party cloud environments. In such scenarios, there may be a need to move, scale, or provide a redundant environment for these services. When such applications come along, I ask whether the end user is open to exposing just a subset of the existing data as a new service on the cloud.

When discussing a move to the cloud, I generally ask whether the customer wants to migrate their current deployment as-is, or treat the migration of data/services as an opportunity to enhance them, providing a superset of the functionality found in the original application.

If you already have an existing application, but it is reaching the limits of scale and is difficult or impossible to scale further due to technological or financial constraints, one option is to use the cloud as a “scale layer”. In this scenario, cloud infrastructure can be put in front of your existing data services to provide significant scale. This is yet another workload to look at.

Exposing your data

There can be different motivations for exposing your data via APIs. I generally quiz customers around this: is it a compliance requirement, is it self-motivated so that others can integrate, or is it a requirement from consumers? Based on this, the discussion will decide whether it is a need-to-have, a want-to-have or a nice-to-have.

Irrespective of how the data gets exposed by applications, there is some further analysis we do; these are some typical questions we ask:

Data Location: Are there constraints on where the data can be stored? Is geography a constraint?

Data Type and Format: Are there any standards which mandate data to be transmitted in XML, JSON, binary, text, OData, DOM, zip, image or other formats?

Sharing: Is there a specific API style to be implemented, like Web API, WCF, web services, FTP, SAPI, etc.?

Consumption: Are there requirements to retrieve the data in its entirety, or to be able to query the data and retrieve only a subset? Is queryability required, and do we need to certify it?

Commercial Use: It is critical to understand whether internal systems will consume the data or whether it will be available for commercial use. Is there a need to monetize access to the data and services delivered?

Data Discoverability: Is there a need to expose the data and advertise it somehow? Should it integrate with gateways, government services or third-party data services like Azure DataMarket?

Know your Data

There are multiple technologies which can be used to host data, each with its own pros, cons, and pricing. Data size is relevant, as it can be a key factor when determining which technology should be used to host the data. In addition to understanding the current data size, it’s important to also understand the data growth rate. Evaluating the data growth rate can be used to extrapolate storage needs in future years and help scope the technology used.

Know how often the data gets updated. If the customer wants to expose data to the public, then the source data and the data exposed via services may be updated in sync or out of sync.

Putting a copy of your data into the cloud requires the actual transfer of data from your location to the datacenter of the cloud hosting provider. These transfers require time and typically have associated bandwidth charges that should be considered, both for the initial transfer of the current state of your data set and for the periodic refreshes of data.

Sharding is something that most cloud storage mechanisms support. When looking to move large data sets to the cloud, consider sharding and think about the optimal ways to partition the data for query performance. Examples of different sharding schemes include sharding by region, state, time period (month, year, etc.), postal code, customer name, etc.
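As a purely illustrative sketch (the database names, table and shard key are made up, not a prescription), sharding by state means identical tables living in separate databases, with the application routing each query on the shard key:

```sql
-- Hypothetical: one database per state shard; the application picks the
-- connection string based on the state code before running the query.
-- In database Shard_KA:
CREATE TABLE customerdata
(
    id    INT,
    name  NVARCHAR(50),
    state CHAR(2)      -- shard key; every row in this shard has state = 'KA'
);

-- Database Shard_TN holds the same schema, with only state = 'TN' rows.
-- A query scoped to one state then touches only one shard:
SELECT id, name FROM customerdata WHERE state = 'KA';
```

The point of choosing the shard key well is exactly this: the common query patterns should land on a single shard instead of fanning out across all of them.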

Query pattern is yet another dimension we need to keep in mind. Since you pay based on bandwidth utilization, it is critical to understand the query patterns and how queries access the data. Since we are paying for data usage, it is important to understand the amount of data we transfer in every request, the number of columns queried, etc.

As we build our applications, it is important to understand if there are any export requirements for other systems to consume. In a hybrid scenario, we will need to run queries that export files to the local file system; those files must then be transported and imported into a cloud store. Depending on the cloud technology being used for storage, the import may be trivial, or it may require custom code or scripts to export the data from the source and import it into the cloud.

Are you in the data-selling business?

Very few customers are in a B2C scenario, but those who are in this business need to understand that there are new costs to handle while serving data as a service. It is no longer a freebie. Even though cloud computing can greatly reduce costs, there are still costs involved to host, load and support a data service.

As these costs add up, companies aim for at least cost recovery rather than profit in the first run. They look at optimizing later, making profit through volume rather than from individual transactions. I have seen customers adopt different pricing strategies for a Data as a Service or Software as a Service offering. There is no one right way:

Tiered Pricing – Like mobile plans, you charge standard rates per month, time period, data volume, etc.

Pay-as-you-go Pricing – This is something to borrow from how any cloud vendor charges.

Peak and Off-Peak Pricing – We can have different pricing for different times of the day.

One-Time Usage Subscription – Something like trial pricing.

There are many such mechanisms customers adopt in their strategy, but these are just representative methods one might use.

Conclusion

I sincerely hope I have planted some thoughts in your mind as you migrate your workload to the cloud or enable your data on the cloud. The cloud is no different from what you do on-premises; we just need to be doubly careful and take a few extra steps as we enable our customers. In future posts, I will try to bring out more such thoughts.

The more I talk with customers about basic architecture, the more questions there are about how to implement it. These scenarios look great on the whiteboard, yet when customers implement them and come back (say after 6 months), the result is completely different. These are some of the challenges when working with customer developer teams who have just started their careers writing DB-level code.

One such scenario I discuss with customers who have security requirements is column-level encryption. It is one of the simplest implementations and yet difficult to visualize. The scenario is simple: most Hospital Management Systems come with a basic requirement that one person’s data must be masked from another.

So in this post, I thought it would be worth walking step by step through what I am talking about. These capabilities have been in SQL Server since the 2005 version and can be used by anyone. I am not talking about the infra best practices or the deployment strategy yet; that will be a future post.

For our example we have two doctors, and we want to build a mechanism where Doctor1’s patient details are not visible to Doctor2. Let us create a simple table to hold the values.

-- Create table
CREATE TABLE patientdata
(
    id         INT,
    name       NVARCHAR(30),
    doctorname VARCHAR(25),
    uid        VARBINARY(1000),
    symptom    VARBINARY(4000)
)
GO

-- Grant access to the table to both doctors
GRANT SELECT, INSERT ON patientdata TO doctor1;
GRANT SELECT, INSERT ON patientdata TO doctor2;

Basic Encryption steps

The next step is to create our keys. To start with, we need to create the database master key. Then we will create the certificates we will use. In this example, I am using a symmetric key which will be encrypted by the certificates as part of the logic.
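The original screenshots for these steps are missing here, so as a hedged sketch (the key, certificate and sample names, the password and the AES_256 choice are mine, not taken from the original post), the setup and an encrypt/decrypt round trip for doctor1 might look like this:

```sql
-- The database master key protects the certificates
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ng!Passw0rd';

-- One certificate and one symmetric key per doctor
CREATE CERTIFICATE doctor1cert WITH SUBJECT = 'Doctor1 patient data';
CREATE SYMMETRIC KEY doctor1key
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE doctor1cert;

-- Encrypt on insert: the symmetric key must be open in the session
OPEN SYMMETRIC KEY doctor1key DECRYPTION BY CERTIFICATE doctor1cert;
INSERT INTO patientdata
VALUES (1, N'John', 'doctor1',
        EncryptByKey(Key_GUID('doctor1key'), 'UID-001'),
        EncryptByKey(Key_GUID('doctor1key'), 'Fever'));

-- Decrypt on read: without the key, uid/symptom remain VARBINARY noise
SELECT id, name,
       CONVERT(VARCHAR(100), DecryptByKey(uid))     AS uid,
       CONVERT(VARCHAR(100), DecryptByKey(symptom)) AS symptom
FROM patientdata;
CLOSE SYMMETRIC KEY doctor1key;
```

If doctor1 alone is granted permission on doctor1cert and doctor1key (and doctor2 gets a separate certificate/key pair), doctor2’s session simply cannot open the key, so DecryptByKey returns NULL for doctor1’s rows.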

As you can see, this is a very simple implementation of column-level encryption inside SQL Server, and it can be quite effective for masking tenants’ data from each other in a multi-tenant environment. There are a number of reasons one might go for this solution. I thought it was worth a shout even though the implementation has been in the industry for close to a decade now.

I have been wanting to write on this topic for ages but seem to have missed out for one reason or another. How many times in your life have you seen a web page with a bunch of tables that are just boring to read? The tables sometimes have sorting capability, but they lack a striking visualization, to say the least. So I am going to borrow a table from a Wikipedia page about Indian population. There are a number of tables there, and the one of interest to me is the literacy rate. The rough table looks like:

| State/UT Code | India/State/UT | Literate Persons (%) | Males (%) | Females (%) |
| --- | --- | --- | --- | --- |
| 01 | Jammu and Kashmir | 86.61 | 87.26 | 85.23 |
| 02 | Himachal Pradesh | 83.78 | 90.83 | 76.60 |
| 03 | Punjab | 76.6 | 81.48 | 71.34 |
| 04 | Chandigarh | 86.43 | 90.54 | 81.38 |
| … | … | … | … | … |

Well, this is as boring as it can ever get, even when pasted as-is on this blog. Now here is the trick we are going to use, called Excel Interactive View. As the name suggests, we are going to use the power of Excel to turn this mundane table into some fancy charts for analysis. This involves a couple of scripts that need to be added alongside the HTML table, and we are done. It is really as simple as that. So let me add the complete table with the script added. Just click on the button provided above the table to see the magic:
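For reference, the wiring at the time looked roughly like the snippet below: an anchor that the script replaces with the Excel button, plus the script itself that scans the page for tables (treat the URL and attribute names as recalled from memory rather than gospel, and note the table title attribute here is just an illustrative value):

```html
<!-- Placeholder anchor: the script turns this into the Excel button -->
<a href="#" name="MicrosoftExcelButton" data-xl-tabletitle="Literacy rate"></a>

<table>
  <!-- the literacy-rate rows go here -->
</table>

<!-- Excel Interactive View script that wires the button to the table -->
<script src="http://r.office.microsoft.com/r/rlidExcelButton?v=1&kip=1"></script>
```

That is the whole trick: no server-side changes, just the anchor and the script dropped around an ordinary HTML table.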

So how cool is this Excel visualisation? I am sure you will want to build or use this capability in your webpages or internal sites in your organization too. I hope you learnt something really interesting.