SQL Azure does not support SQL Server Agent, but Azure Automation can be used instead to simplify cloud management through process automation.

SQL Server Agent is a component of SQL Server that schedules jobs and handles other automated tasks. It runs as a Windows service, so it can start automatically when the system boots, or it can be started manually.

4

FileStream

Due to the PaaS nature of the service, SQL Azure does not support FileStream.

SQL Server has support for FileStream.

5

Table Partitioning

Table partitioning is allowed in Azure SQL Database. However, because of the PaaS nature of the product, you cannot allocate partitions to different filegroups. You have to create the partition scheme on PRIMARY, and storage is handled by the SQL Azure database itself. You still benefit from partition elimination.

SQL Server has full support for table partitioning and can create and allocate partitions to different filegroups.

6

Linked Server

In SQL Azure you cannot access linked servers.

In SQL Server you can create a linked server and access data from another SQL Server.

7

Default Transaction Isolation Level

The default transaction isolation level in SQL Azure is READ_COMMITTED_SNAPSHOT.

The default transaction isolation level in SQL Server is READ_COMMITTED.

8

Global Temporary Tables

In SQL Azure you can create only local temporary tables. It does not support global temporary tables.

SQL Server has support for both local temporary tables and global temporary tables.

9

Stored Procedures

SQL Azure:

· Extended SP = No

· User-defined SP = Yes

· System SP = Some

SQL Server:

· Extended SP = Yes

· User-defined SP = Yes

· System SP = Yes

10

SQL Trace and Profiler

In SQL Azure, SQL Trace and Profiler cannot be used to trace events. As of now, there is no direct alternative other than using DMVs, monitoring through the Azure Portal, and Extended Events.

In SQL Server you can use SQL Trace and Profiler to trace events.

11

SQL CLR

In SQL Azure, Common Language Runtime (SQL CLR) support was initially added and later removed because of security concerns.

SQL Server still has support for SQL CLR.

12

USE Statement

The USE [DatabaseName] statement for switching the database context is not supported in SQL Azure.

Supported in SQL Server.

13

Log Shipping

SQL Azure does not support log shipping.

SQL Server supports log shipping.

14

Resource Governor

This feature is not available in Azure SQL Database.

Azure SQL Database comes with different service tiers, each suitable for different workloads. You should first evaluate which performance tier your application workload fits in and provision the database for that tier accordingly.

Supported in SQL Server. Resource Governor allows you to throttle/limit resources (CPU, memory, I/O) for different SQL Server workloads.

15

Attach Database

This feature is not supported in SQL Azure

Supported in SQL Server

16

Database Mail

This feature is not supported in SQL Azure

Supported in SQL Server

17

Data Quality Services

Not Supported in SQL Azure

Supported in SQL Server since SQL Server 2012.

18

Event Notifications

Event notifications are not supported in SQL Azure. However, one of the easiest ways to gain some automated insight into what is going on in your Azure SQL databases is to set up alerts on your databases.

Azure Functions code can be written in JavaScript, C#, F#, Node.js, Python, PHP, Batch, Bash, and PowerShell. The code can be written in the Azure Portal or as an Azure Functions project in VS 2017.

In Logic Apps, workflows are created with an easy-to-use visual designer, combined with a simple workflow definition language in the code view.

Connectivity

Azure Functions have the concept of triggers, input bindings, and output bindings. Most of these bindings connect your Azure Functions to other Azure services, such as Event Hubs, Storage, DocumentDb, etc. The HTTP binding is probably the most popular one, as it allows the creation of serverless APIs.

Logic Apps connects to an enormous variety of cloud/on-premises applications, going from Azure and Microsoft services over SaaS applications and social media to LOB systems. Each connector comes with an API connection, which stores the required credentials in a secure way. These API connections can be reused from within multiple Logic Apps.

Exception Handling

In Azure Functions, you have the typical try/catch options available. If you want to enable retries, you need to do the plumbing yourself; there are no resume/resubmit capabilities out of the box.

Logic Apps provides out-of-the-box functionality that allows you to configure automatic retries on every action. In case this doesn't solve the problem, the workflow gets a failed status and can be resubmitted after human intervention.
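As a rough sketch of the retry plumbing that Azure Functions leaves to you (and Logic Apps provides out of the box), a generic helper with exponential backoff might look like the following; the `retry` helper and its parameters are illustrative, not part of any Azure SDK:

```python
import time

def retry(func, max_attempts=3, base_delay=0.1):
    """Call func, retrying with exponential backoff on failure."""
    for attempt in range(1, max_attempts + 1):
        try:
            return func()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * 2 ** (attempt - 1))

# Example: a flaky operation that succeeds on the third call
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(retry(flaky))  # "ok" after two retries
```

In a real Function you would typically wrap only the outbound call (HTTP, queue, database) in such a helper, whereas Logic Apps lets you configure the same policy declaratively per action.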

Networking

When using Azure Functions within an App Service Plan, you have more convenient hybrid connectivity options that reside on the network level. App Service Plans offer support for many networking options like Hybrid Connections, VNET Integration and App Service Environment.

Azure Logic Apps can connect to on premises resources via the On Premises Data Gateway, that needs to be installed on premises.

Deployment

With the new Functions template in VS 2017, you can create a function as a .NET project, with class attributes used to define the function's behavior. It also supports CI/CD through VSTS Build and Release definitions.

Logic Apps also have built-in support for ARM (Azure Resource Manager) deployments, through for example Visual Studio Release Management.

Debugging

Azure Functions can be easily developed and debugged on your local box, or you can attach the debugger to a function deployed in Azure.

Logic Apps run only in the cloud, as they have a dependency on Microsoft-managed connectors. As a consequence, you cannot debug, test, or run Logic Apps locally.

Monitoring

Each Azure Function comes with a Monitor tab, where you can see the execution history. There is also a live event stream that shows the almost real-time processing statistics in nice graphs. On top of that, there’s full integration with Application Insights, where you can take advantage of the powerful Analytics queries.

In Logic Apps, you have a nice overview of the previous runs and their corresponding outcome. You can filter this history, based on a time period and the resulting run status. The monitoring view of a workflow run is the same as the designer view, which makes it very intuitive. For each action, you can see the status and all inputs/outputs. With one button click, you can enable integration with OMS, where you can search on tracked properties.

Billing

With Azure Functions, you have two pricing options. You can opt for the fixed cost of an App Service Plan; in that option you reserve compute power on which you can run Azure Functions and other App Services. The second option is completely serverless, with a consumption plan billed on resource consumption (memory per second) and the number of executions.

Logic Apps has a pure pay-per-usage billing model. You pay for each action that gets executed. It’s important to be aware that you also need to pay for polling triggers, which can be a hidden cost. If you want to benefit from the capabilities of the Integration Account, you should be aware that this comes with a fixed monthly bill.

Security

Azure Functions has a concept of API keys similar to the one described in the Logic Apps section. The API key can be shared for the whole Function App (host key), or you can create a specific one for your Function. If you run your Azure Function in an App Service Plan, you can leverage its codeless authentication functionality with Active Directory, Google, Facebook, etc. Azure Function Proxies can be a lightweight alternative to full-blown API Management, to add security on top of your HTTP-triggered Functions.

In order to access a Logic App with the HTTP trigger, the client must include a Shared Access Signature in the URL. The signature is generated via a secret key that can be regenerated at any time. There is also the ability to restrict access based on incoming IP addresses. To add more authorization logic, you can put Azure API Management in front of it.

Azure Event Grid is the event-based routing as a service offering from Microsoft that aligns with their “Serverless” strategy. Azure Event Grid simplifies the Event Consumption logic by making it more of a “Push” mechanism rather than a “Pull” mechanism – meaning, you can simply listen to and react to events from different Azure services and other sources without having to constantly poll into each service and look for events.

The best example where you can use Azure Event Grid is to automatically get notified when any user makes a slight modification to the production subscription, or when you have multiple IoT devices pumping telemetry data.

Azure Event Grid Connectors for Logic Apps

At present, there is an Azure Event Grid connector with only one trigger: Azure Event Grid – When a resource event occurs. You can use this connector to trigger a Logic App whenever a resource event occurs.

Custom HTML and CSV headers

If you have an array of data, you can easily convert the information into a CSV document or HTML table by using the "Create CSV Table" action. Later, you can pick up this CSV table and easily embed it in an email.
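The same transformation is easy to reproduce outside Logic Apps. A minimal sketch in Python (the field names and sample records are made up for illustration):

```python
import csv, io

# Sample array of records, similar to what the "Create CSV Table"
# action receives in a Logic App (illustrative data)
rows = [
    {"name": "alpha", "count": 3},
    {"name": "beta", "count": 7},
]

# Write the array out as a CSV table with a header row
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "count"])
writer.writeheader()
writer.writerows(rows)
csv_table = buf.getvalue()
print(csv_table)
```

The resulting string can then be attached to or embedded in an email body, just as the Logic Apps action output would be.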

Enable log analytics from create

An easier way to enable Log Analytics is by toggling the status while creating the Logic App. You no longer need to go to the Diagnostics section to enable Log Analytics.

Link to more details on how you can integrate Azure Logic Apps and Log Analytics:

Say you are working with Logic Apps and you add a connector. You can now easily switch between the code view and designer view by clicking "Peek code" in the Options drop-down.

Note: At present, Peek code is available only in read-only mode. If you need to edit the code directly from there, you can send the Logic Apps team feedback on Twitter or through UserVoice.

Advance scheduling in the Logic Apps Designer

There are new options to schedule the Logic App execution on a daily and weekly basis. This was available in the code view, but now you get this experience right in the designer. The Monthly option will be rolled out soon!

In the Schedule trigger, you will notice that when you click on Week, a few advanced operations are available for defining when you want the trigger to execute during the week. Say you want your trigger to execute every Monday, Wednesday, and Friday at 9:35 AM, 1:35 PM, and 5:35 PM. The screenshot below depicts the example. The preview section displays the actual Logic App trigger condition based on the previous selections.

New Connectors

Azure table storage

Azure Event Grid

Azure Log analytics

Azure SQL Data Warehouse

Microsoft StaffHub

MySQL (R/W)

ServiceNow (East US 2 region)

Amazon Redshift

DocFusion 365

What’s in progress?

As usual, another long list of features that the Logic Apps team is currently working on and should be available in the coming weeks.

Concurrency Control (code-view live)

Say your Logic App is executing faster than you actually want it to. In this case, you can make Logic Apps slow down (restrict the number of Logic App runs executing in parallel). This is possible today in the code view, where you can define that, say, only 10 runs can execute in parallel at a particular time. When 10 runs are executing in parallel, the Logic App will stop polling until one of the 10 finishes execution, and then start polling for data again.

NOTE: This works with polling triggers (and not with request triggers, such as the Twitter connector, etc.) without SplitOn enabled.
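The throttling behavior can be sketched with a counting semaphore. The sketch below is an illustration of the concept, not how the Logic Apps runtime is implemented:

```python
import threading, time

MAX_PARALLEL = 10  # analogous to the code-view concurrency setting
slots = threading.BoundedSemaphore(MAX_PARALLEL)
peak = {"now": 0, "max": 0}
lock = threading.Lock()

def run_workflow(i):
    with slots:  # block until one of the 10 slots frees up
        with lock:
            peak["now"] += 1
            peak["max"] = max(peak["max"], peak["now"])
        time.sleep(0.01)  # simulated workflow run
        with lock:
            peak["now"] -= 1

# 50 pending runs, but at most 10 ever execute at the same time
threads = [threading.Thread(target=run_workflow, args=(i,)) for i in range(50)]
for t in threads: t.start()
for t in threads: t.join()
print(peak["max"])  # never exceeds MAX_PARALLEL
```

The semaphore plays the role of the concurrency limit: runs beyond the limit queue up until a slot is released, which mirrors the "stop polling until one finishes" behavior described above.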

Custom Connectors

Get your own connector within your subscription so that it shows up in the connector list. This is currently in private preview and should be available to the public in September.

Large Files

Ability to move large files, up to 1 GB, between specific connectors (Blob, FTP). This is almost ready for release!

SOAP

Native SOAP support to consume cloud and on-premises SOAP services. This is one of the most requested features on UserVoice.

Variables (Code-view live)

Append capability to aggregate data within loops. The AppendToArray will be shipped soon, and AppendToString will come in the next few weeks.

Expression intellisense

This functionality will go live on August 25th. When you are typing an expression, you will see the same IntelliSense view that you see when typing in Visual Studio.

Expression Tracing

You can actually get to see the intermediate values for complex expressions.

Foreach nesting in the designer

This capability will soon be incorporated into the designer in the coming few weeks.

Foreach failure navigation

If there are 1000 iterations in a foreach loop and 10 of them failed, then instead of having to hunt for which ones actually failed, you can easily navigate to the next failed action inside the foreach loop to see what happened.

Functions + swagger

You can automatically render the Azure functions annotated with Swagger. This functionality will be going live by end of August.

Publish Logic Apps to Power Apps and Flow in an easy way

Many users who use Power Apps want to talk to Logic Apps. This will make it a lot easier to call a Logic App; previously, you had to go through some awkward mechanics to light up that scenario.

Time based batching

One of those hidden features everyone should know about is that you can create batches in a Logic App. With the new functionality, once the batch is full, it will be flushed automatically, or it will be flushed according to the time interval you specify.
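A sketch of this flush-on-size-or-time behavior follows; the `Batcher` class and its parameters are hypothetical, shown only to illustrate the batching semantics:

```python
import time

class Batcher:
    """Flush a batch when it is full OR when max_age seconds elapse."""
    def __init__(self, max_size, max_age):
        self.max_size, self.max_age = max_size, max_age
        self.items, self.started = [], None
        self.flushed = []  # completed batches, in order

    def add(self, item, now=None):
        now = time.monotonic() if now is None else now
        if not self.items:
            self.started = now  # batch age starts at first item
        self.items.append(item)
        if len(self.items) >= self.max_size or now - self.started >= self.max_age:
            self.flush()

    def flush(self):
        if self.items:
            self.flushed.append(self.items)
            self.items = []

b = Batcher(max_size=3, max_age=5.0)
b.add("m1", now=0.0)
b.add("m2", now=6.0)   # age exceeded -> batch of two is flushed
b.add("m3", now=6.1)
b.add("m4", now=6.2)
b.add("m5", now=6.3)   # size reached -> batch of three is flushed
print(b.flushed)
```

The explicit `now` argument is only there to make the time-based flush deterministic in the example; a real batcher would use the clock directly.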

In this article we will explain how to migrate or move BizTalk databases from one server to another. Let's assume we want to move our primary BizTalk databases from a local server to a remote server or any other server on the network. The same method can also be used to move databases to a SQL Server cluster.

Assumptions:

· You should have admin user account access to both servers, source and destination.

· You should have basic knowledge of SQL Server.

· You should have basic knowledge of configuring BizTalk Server.

Step by step procedure

There are a few major steps to follow, which are listed below and explained in detail later.

1. Stop all BizTalk server services.

a. Perform a partial stop of all BizTalk services; do not perform a full stop!

2. Open SQL Server Management Studio and restore all the backed-up databases on the destination server.

3. In the SampleUpdateInfo.xml file, comment out all of the databases except for those you have moved to the new server. Edit C:\Program Files\Microsoft BizTalk Server 2006\Schema\Restore\SampleUpdateInfo.xml:

a. Do a "Replace All" and change SourceServer to ….
b. Do a "Replace All" and change DestinationServer to ….

4. Copy the edited SampleUpdateInfo.xml file to %SystemDrive%\Program Files (x86)\Microsoft BizTalk Server <version>\Bins32\Schema\Restore, open a command prompt, go to that folder, and run:

cscript UpdateDatabase.vbs SampleUpdateInfo.xml

5. From the same folder, run:

cscript UpdateRegistry.vbs SampleUpdateInfo.xml

6. Script all SQL Server users from the source server and run those scripts on the destination server.

7. Script all SQL Server Agent jobs from the source server and run those scripts on the destination server.

8. Restart the Windows Management Instrumentation service from services.msc.

9. Reconfigure BizTalk Server administration:

a. Remove the current BizTalk group

b. Connect to the existing group

c. Configure the destination SQL Server by giving the server name

d. Configure the databases

Your BizTalk Server databases are now moved to the other database server, and BizTalk should be running properly.

Visual Studio 2017 Tools

We now have Logic App designer support and tooling in Visual Studio 2017.

Download from Cloud Explorer

You can browse live Logic Apps and their run history using Cloud Explorer.

If you go to Cloud Explorer and open a Logic App run, you can download the full template version of it using the download button in Visual Studio, and then deploy your app directly.

Parallel creation in designer

A new Logic Apps feature is parallel branch creation in the designer. You can click the plus button in the designer and add a parallel action. This was previously available only in the code view; now there is full support in the Logic App designer.

Select and join Action

There are more actions you can perform on arrays. Select lets you take an array of objects with properties A, B, C, and D and project a new array containing only properties A and B. Join lets you take an array of strings or objects and, using a delimiter, join the entries into a single string. This is very useful: one of the new connectors is the Computer Vision service (one of the Cognitive Services), whose describe-image operation returns all the tags for an image, such as dog, building, and so on. With Select and Join, you can filter those tags and turn them into a new string that you can send in an email.
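A quick sketch of what Select and Join do, using made-up tag data:

```python
# Array of objects with properties a, b, c, d (hypothetical tags payload)
items = [
    {"a": "dog", "b": 0.98, "c": "x", "d": 1},
    {"a": "building", "b": 0.75, "c": "y", "d": 2},
]

# "Select": project each object down to properties a and b only
selected = [{"a": it["a"], "b": it["b"]} for it in items]

# "Join": combine the tag names into one delimited string for an email
joined = ", ".join(it["a"] for it in selected)
print(joined)  # dog, building
```

In a Logic App the same two steps are configured declaratively, but the data flow (project, then delimit into a string) is identical.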

Retry information in History

What is happening in the background? How long did it take? What errors are you getting in the intermediate calls? If you go to update something in MS Dynamics and MS Dynamics is down for a sub-second, then instead of simply failing your update, Logic Apps will retry it according to the retry policies. Now you can actually see what happened behind the scenes, for example that it retried three times.

Service Bus Sessions

Service Bus sessions allow you to do things like correlation and sequential convoys (familiar territory for all the BizTalk historians out there). Now you can do that in Logic Apps using the sequential convoy pattern.

A sequential convoy is the ability to have in-order, correlated messages locked to a particular Logic App instance so that no other instance can get a message correlated to it. The set of correlated messages is called a convoy: all of them go to one instance. When that instance locks the convoy (using the session), it continues processing all the messages related to that context.
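The grouping idea behind a convoy can be sketched like this (the session ids and events are illustrative; a real implementation would also lock each session to a single receiver):

```python
from collections import defaultdict

# Incoming messages carry a session id; a convoy means every message
# with the same session id is handled by the same (single) instance,
# in arrival order. The data below is made up for illustration.
messages = [
    ("order-1", "created"), ("order-2", "created"),
    ("order-1", "paid"),    ("order-1", "shipped"),
    ("order-2", "cancelled"),
]

convoys = defaultdict(list)
for session_id, event in messages:
    # the session lock routes each message to its convoy's instance
    convoys[session_id].append(event)

print(dict(convoys))
```

Each list preserves the per-session ordering even though the two sessions' messages were interleaved on the queue, which is exactly the guarantee the convoy pattern provides.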

Run navigation when looking at history

When you look at the Run History blade and open one of the entries, you get a compact view where you can monitor the run in the designer and click through all the items you have filtered.

B2B OMS Message Download

In OMS you can find your run instances and download them, and you can see all the input and output messages and their payloads.

Variables-Set

A Set Variable action is now available: you can put any value you want into a variable and reference that variable anywhere in the Logic App.

Release Notes on Blade

Now, right from the management portal, you can click on the release notes and see a blade with the history of all Logic App releases and find which new features have been added.

Run After Configuration in Designer

You can configure an action to determine when it will run. If you look at the code view, run-after-success is the default setting, but sometimes you want exception handling, so now you can have a branch that runs after success and another that runs after failure, and react to the different outcomes. In the designer, you hit the … on the action and choose which results of the previous step it should run after. There is some styling too: when an action runs after a failure, the arrow looks different, so you can tell which branch is catching an exception.

All preview API Apps, including the BizTalk Rules V1 API App, were deprecated in Azure on Jan 18th, 2017. The migration path is to redeploy a V1 API App as an App Service, but this is not possible for the built-in BizTalk Rules API App.

Here are a few alternatives to the BizTalk Rules V1 API App:

Option 1: On-Premises BizTalk Rules Engine

We can use the BizTalk BRE to define rules in the Rules Editor, deploy them on a BizTalk Server installation, and write a Web API app to execute the policy through code; we then access the Web API app from Azure through a Hybrid Connection. Logic Apps does not support Hybrid Connections yet, so we have to create a relay Web App in Azure that takes HTTP requests from the Logic App and passes them on to the on-premises rules Web API app.

With this approach, we have access to the full capabilities of the BizTalk Rules Engine to create and execute complex rules, and to store and retrieve facts from an on-premises SQL Server database. However, we need a BizTalk 2010+ installation on an on-premises server or on an Azure VM.

Option 2: CodeEffects Rules Engine

CodeEffects Rules Engine provides a web-based rules editing and testing platform. We can choose this option if we don't have a BizTalk Server installation but still want a UI-based rule editor and complex rules processing.

Option 3: Azure Functions

Azure Functions is a solution for easily running small pieces of code in the cloud. The code can be edited and deployed within the Azure Portal. The idea is to use Azure Functions to define and execute the rules logic; the functions are invoked over an HTTP call from Logic Apps or Web Apps.

When working with web services published over the Internet, it is immensely important to secure the data sent to and received from them. To provide data security at the transport level, we can use Secure Sockets Layer (SSL)/TLS. SSL provides encryption and signing of the contents of packets sent over TCP, so that data is securely transmitted at the network layer. But this is not enough when the data being sent is sensitive and no one except the receiver should be able to read it. We also need to consider data archiving or tracking at intermediate storage, and the security of in-flight messages between the two systems. In such scenarios, it is very important to have message-level security along with transport-level security (SSL) in order to keep the data safe and secure.

In order to implement message-level security, the data should be encrypted before being sent over the wire to the web service, and decrypted when it reaches the destination. RSA and AES are among the most commonly used modern cryptosystems.

AES (Advanced Encryption Standard) is a symmetric cryptosystem widely used to encrypt large amounts of data. AES makes use of a password and a salt to encrypt and decrypt data.

RSA, on the other hand, is an asymmetric cryptographic algorithm, which means two keys are generated: a public key used to encrypt the data and a private key used to decrypt it. Unlike AES, RSA has a maximum data size limitation. The maximum size of data that can be encrypted using RSA with a 2048-bit key is 256 bytes. This is the reason why we used RSA for encrypting small pieces of data, i.e. the password and salt, whereas for the larger data payload we used the AES algorithm.

Using only RSA would have had a negative impact on overall performance. As stated earlier, RSA can only encrypt and decrypt data of limited size, so we would have had to partition the data into chunks, each not exceeding the 256-byte maximum including the padding size, encrypt each chunk, and then chain the encrypted chunks back together in their original sequence. The same strategy would also have been needed at decryption time, at the cost of performance degradation. All this would have added huge overhead to the overall complexity and performance of encryption and decryption.
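The arithmetic behind that overhead can be sketched as follows. The 11-byte padding overhead assumes PKCS#1 v1.5 and is an assumption on top of the 256-byte RSA-2048 block size mentioned above:

```python
import math

KEY_BYTES = 256        # RSA-2048 block size (2048 bits / 8)
PADDING_OVERHEAD = 11  # minimum PKCS#1 v1.5 padding (assumed scheme)
MAX_PLAINTEXT = KEY_BYTES - PADDING_OVERHEAD  # 245 bytes per chunk

def rsa_chunks_needed(payload_size):
    """Chunks (and RSA operations) needed if RSA alone encrypted the payload."""
    return math.ceil(payload_size / MAX_PLAINTEXT)

# A 1 MB EDI file would need thousands of RSA operations...
print(rsa_chunks_needed(1_048_576))  # 4280
# ...versus a single AES pass plus one RSA operation for the key material.
```

This is why the hybrid scheme (AES for the payload, RSA only for the password and salt) keeps the RSA cost constant regardless of payload size.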

Instead, we used RSA to encrypt and decrypt the small pieces of data, i.e. the password and salt used in AES encryption, and used AES encryption and decryption for the data payload.

Scenario

In our scenario, there is a producer console application that picks up an HL7 Eligibility EDI file stored in a local folder, encrypts the data, and posts it to a Logic App hosted in Azure.

The Logic App then stores the encrypted data in a Blob storage for archiving and sends the encrypted data to the Service Bus queue.

The consumer is an on premises web application which picks data from the Service Bus queue, decrypts it on premises and displays it on the screen.

We created a self-signed certificate and generated .cer that holds the public key to be used for encryption and .pfx that stores both private and public keys which is used for decryption.

The producer console application used public key to encrypt and the consumer web application used private key to decrypt.

Implementation Steps

1. We used AES cryptosystem to encrypt data using a randomly generated password and a Salt.

2. Then we encrypted randomly generated password and the Salt using RSA cryptosystem with the public key stored in a self-signed public certificate. (.cer file)

3. After getting all three pieces of information encrypted, a JSON message is created to send the encrypted content, encrypted password, and encrypted salt to the Logic App. The data is base64-encoded before being put into the JSON message.

4. Once the message is received at the Logic App, it is stored in blob storage and sent to a Service Bus queue.

5. A consumer web app gets the encrypted message from the Service Bus queue and decrypts the encrypted password and salt using the private key stored in a certificate that carries the private key along with the public key (the .pfx file). Then, using the decrypted password and salt, it decrypts the AES-encrypted data and displays it on the screen.
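Steps 3 and 5 above (building and unpacking the base64-encoded JSON envelope) can be sketched as follows; the field names and random stand-in bytes are illustrative, not the actual message format:

```python
import base64, json, os

# Illustrative stand-ins for the three encrypted pieces; in the real
# producer these are the AES ciphertext of the payload and the
# RSA-encrypted password and salt.
encrypted_content = os.urandom(64)
encrypted_password = os.urandom(256)  # RSA-2048 output is 256 bytes
encrypted_salt = os.urandom(256)

def b64(data: bytes) -> str:
    """Base64-encode raw bytes into a JSON-safe string."""
    return base64.b64encode(data).decode("ascii")

# JSON envelope posted to the Logic App (hypothetical field names)
envelope = json.dumps({
    "content": b64(encrypted_content),
    "password": b64(encrypted_password),
    "salt": b64(encrypted_salt),
})

# The consumer reverses the encoding before decrypting
decoded = json.loads(envelope)
assert base64.b64decode(decoded["content"]) == encrypted_content
```

Base64 is needed because JSON cannot carry raw binary; the Logic App and Service Bus only ever see the encoded envelope, never the plaintext.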

How to Generate Self Signed certificate

There are multiple ways to generate a self-signed certificate, but we used IIS Manager to generate one for our demo.