Category: Security

Microsoft SQL Server 2016 introduced many new features, and Dynamic Data Masking is one that is useful in several scenarios. The main purpose of Dynamic Data Masking is to limit the exposure of sensitive data by masking it from non-privileged or low-privileged users. The feature helps prevent unauthorized access to sensitive data by letting users view only the data they need, with minimal impact on the application layer. It essentially hides the data from the querying user: zeros or other mask characters are returned instead of the actual values. The underlying data does not change; only the query results are obscured for under-privileged users. Dynamic Data Masking is easy to implement with existing applications, since masking rules are applied only to query results, so many applications can mask sensitive data without modifying existing queries.

It is important to understand that Dynamic Data Masking in SQL Server 2016 is not a replacement for security features such as TDE, cell-level encryption, or SSL. With Dynamic Data Masking, the underlying data is not converted into another form; it is stored in its original format and can be viewed by users with appropriate permissions without any decryption or transformation. The use case for this feature is the obfuscation of personally identifiable information (PII) from lesser-privileged users. It can also be used to apply masking in development or test environments where this information should not be visible to developers or testers. In the initial CTP releases of this feature, the community identified many bugs where simple tricks, such as using aliases, temp tables, or variables, could reveal the sensitive data to under-privileged users. Later CTP releases included fixes for these bugs, but the feature still has limited usability and should strictly not be considered a replacement for encryption of data.

Dynamic Data Masking supports different types of masks on some or all of the columns in a table, which helps protect the privacy of the data from low-privileged/normal (non-sysadmin) users. There are four mask types currently supported by Dynamic Data Masking in SQL Server 2016:
– default() – For strings, shows an x for each character (up to 4); for numeric types, shows 0; for dates, shows 2000-01-01.
– email() – Shows only the first character and replaces the remaining characters with XXX@XXXX.com.
– partial() – Lets us define a custom string to represent the mask, including how many leading and trailing characters to show from the original string.
– random() – A random masking function for use on any numeric type, masking the original value with a random value within a specified range.
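As a sketch, the four mask types can be applied as follows (the table, column, and user names are illustrative):

```sql
-- Apply a mask to each column type; a low-privileged user sees masked values
CREATE TABLE dbo.CustomerInfo
(
    CustomerID  int IDENTITY(1,1) PRIMARY KEY,
    FullName    varchar(100) MASKED WITH (FUNCTION = 'default()') NOT NULL,
    Email       varchar(100) MASKED WITH (FUNCTION = 'email()') NULL,
    PhoneNumber varchar(12)  MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)') NULL,
    CreditLimit money        MASKED WITH (FUNCTION = 'random(1, 9999)') NULL
);

CREATE USER MaskTestUser WITHOUT LOGIN;
GRANT SELECT ON dbo.CustomerInfo TO MaskTestUser;

EXECUTE AS USER = 'MaskTestUser';
SELECT FullName, Email, PhoneNumber, CreditLimit FROM dbo.CustomerInfo;  -- returns masked data
REVERT;

-- Granting UNMASK lets the user see the real values again
GRANT UNMASK TO MaskTestUser;
```

Because the masking rules live on the table definition, no query in the application needs to change; only the permissions of the querying user determine what is returned.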

Always Encrypted is a new feature introduced with SQL Server 2016. It was introduced to secure sensitive data in a better way, and it also ports well to SQL Azure databases. Although other encryption features such as TDE, SSL, and cell-level encryption were available before SQL Server 2016, the new Always Encrypted feature ensures critical data is not visible to anyone else: SQL Server DBAs, system administrators, network administrators, or hackers listening on the network between client and server. Always Encrypted achieves this by encrypting the data both at rest and in motion. The important components of the Always Encrypted feature are listed below:
– Column Master Key – An encryption key that protects the column encryption keys. We must have at least one master key before encrypting any columns.
– Column Encryption Key – The encryption key that actually protects our encrypted column data.

The most important thing that allows us to view or work with Always Encrypted data is the connection string.
– Connection string – For a client driver to understand that column encryption is in use, the connection string must include the attribute Column Encryption Setting=enabled;
We have all seen the different connection strings used to connect to SQL Server from applications, with different options depending on the type of authentication and additional parameters when features such as database mirroring are in use. For Always Encrypted as well, we need to add this parameter, Column Encryption Setting=enabled;, to the connection string in order to work with the encrypted data.
Another useful aspect of Always Encrypted is that the application code does not need to be changed to access the encrypted data once we use this parameter in the connection string.
We can find the Column Master Key and Column Encryption Key metadata under DatabaseName -> Security -> Column Master Key Definitions -> Column Encryption Keys.

At this point, this feature is not supported by all client libraries. The only provider that currently works with Always Encrypted data is the ADO.NET 4.6, so we will need to ensure that the .NET Framework 4.6 is installed on any machine that will run the client application that interfaces with Always Encrypted data.
– It is important for many developers and DBAs to be able to view this data. We can use SQL Server Management Studio (SSMS): when connecting, go to Options, which brings up additional tabs, then use the Additional Connection Parameters tab to specify "column encryption setting=enabled". Once we make the connection successfully, SSMS will be able to display the data.
– But when we try to read the Always Encrypted data, we may receive errors saying the encrypted data could not be decrypted, because the system from which we are launching SSMS may not have the certificate that can decrypt the data.
We need to get the certificate installed on the local system with both the public and private keys, after which we will be able to view the encrypted data successfully.

Many new features have been introduced with SQL Server 2016, and you may encounter different errors while using them. One such error DBAs might get while trying to use the new SQL Server 2016 Always Encrypted feature is shown below.
Msg 0, Level 11, State 0, Line 0
Failed to decrypt column ‘test’
Msg 0, Level 11, State 0, Line 0
Failed to decrypt a column encryption key using key store provider: ‘MSSQL_CERTIFICATE_STORE’. The last 10 bytes of the encrypted column encryption key are: ‘F1-S3-K2-8E-69-E7-F7-O8-1W-B4’.
Msg 0, Level 11, State 0, Line 0
Certificate with thumbprint ‘664EBAB58B3A8150AC51E2BEF40BDEC17055167C0′ not found in certificate store ‘My’ in certificate location ‘CurrentUser’.
Verify the certificate path in the column master key definition in the database is correct, and the certificate has been imported correctly into the certificate location/store.
Parameter name: masterKeyPath

This error occurs if we do not have the certificate imported into the local machine's certificate store. Always Encrypted has two main keys. One is the column master key, which is stored in a trusted key store; information about column master keys, including their location, is stored in the database in system catalog views. The other is the column encryption key, which actually encrypts the sensitive data. For clients to be able to access the data, a certificate needs to be installed on the client systems, without which we will get certificate-related errors. This ensures that only authorized users with the certificate are able to access and decrypt the sensitive data.

Apart from this error for not importing the certificate, other errors can be returned by Always Encrypted even when the certificate is installed or imported on the client system, like the one below:
Failed to decrypt a column encryption key using key store provider: ‘MSSQL_CERTIFICATE_STORE’.
The last 10 bytes of the encrypted column encryption key are: ‘F1-S3-K2-8E-69-E7-F7-O8-1W-B4’. Certificate specified in key path ‘CurrentUser/My/664EBAB58B3A8150AC51E2BEF40BDEC17055167C0’ does not have a private key to decrypt a column encryption key. Verify the certificate is imported correctly.
Parameter name: masterKeyPath
The above error occurs even after we import the certificate, because the imported certificate only has a public key, which is not enough to decrypt the data. We also need the private key to be able to decrypt the data successfully.

It is important for every organization to protect its customers' and company data and ensure that no unauthorized person is able to view or update it. There are many security mechanisms in SQL Server that ensure only authorized users can see the data or make changes to it, such as logins and database users that restrict what users can do with the data. However, there are still many ways unauthorized users try to gain access, such as tapping the network between SQL Server and the client system or hacking someone else's account. So security at the login/database-user level alone is not enough, and other mechanisms are available, such as SSL certificates to encrypt the data transmitted over the network, and encryption of data at the individual cell/column level, table level, or database level. Prior to SQL Server 2016, database/cell-level encryption still let database administrators and other users with high permissions gain access to the encrypted data, as the encryption keys are mostly stored in the database and managed by the DBAs. The Always Encrypted feature also helps Microsoft promote SQL Azure databases, showcasing that customer data is accessible only to the clients and no one else.

Starting with SQL Server 2016, the new Always Encrypted feature was introduced to safeguard sensitive data from high-privileged as well as unauthorized users. Always Encrypted is designed to protect sensitive data, such as credit card numbers or national identification numbers, stored in SQL Azure Database or on-premises SQL Server databases. Always Encrypted allows clients to encrypt sensitive data inside client applications and never reveal the encryption keys to the Database Engine (SQL Database or SQL Server). Because of this, Always Encrypted provides a separation between those who own the data (and can view it) and those who manage the data (DBAs, who should have no access to the sensitive data). It ensures that on-premises database administrators, cloud database operators, and other high-privileged or unauthorized users cannot gain access to the encrypted data. This enables comfortable delegation of on-premises database administration tasks to third parties and reduces security-clearance requirements for DBA staff.

The Always Encrypted feature mainly uses two types of keys: column encryption keys and column master keys.
Column master keys – protect the keys used to encrypt the column encryption keys. Column master keys should be stored in a trusted key store. Information about column master keys, including their location, is stored in the database in system catalog views.
Column encryption keys – are used to encrypt the sensitive data stored in database columns. All values in a column can be encrypted using a single column encryption key. The encrypted values of the column encryption keys are stored in the database in system catalog views. We should also store column encryption keys in a secure/trusted location for backup.
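The metadata for both key types can be inspected through the system catalog views mentioned above:

```sql
-- Column master keys: which key store provider holds them, and where
SELECT name, key_store_provider_name, key_path
FROM sys.column_master_keys;

-- Column encryption keys and their encrypted values, tied back to the master key
SELECT cek.name AS cek_name,
       cekv.column_master_key_id,
       cekv.encryption_algorithm_name
FROM sys.column_encryption_keys AS cek
JOIN sys.column_encryption_key_values AS cekv
    ON cek.column_encryption_key_id = cekv.column_encryption_key_id;
```

Note that only the encrypted value of each column encryption key appears in the database; the plaintext key never leaves the client side.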

Another important component of the Always Encrypted feature is the Always Encrypted-enabled driver, which makes the encryption transparent to client applications. This driver calls the server (a round trip for each query with parameters) to retrieve information on how to encrypt the query parameters and whether they should be encrypted. The driver then calls a key store provider to decrypt the encrypted column encryption key values, and the resulting plaintext column encryption keys are cached. Query results containing data from encrypted columns are accompanied by encryption metadata to enable transparent decryption. So, when an Always Encrypted-enabled client driver queries encrypted columns, SQL Server sends information about the encryption settings for the queried columns, including the encryption type, the encrypted values of the column encryption keys that protect the queried columns, and the location of the corresponding column master keys. The driver uses that information to contact the key store containing each column master key and decrypt the column encryption keys encrypted with that master key. If for some reason Always Encrypted is not enabled on the client side, the driver returns the encrypted values, and those values have the varbinary(max) data type.

What is Dedicated Administrator Connection (DAC) in SQL Server?
Microsoft SQL Server provides a Dedicated Administrator Connection (DAC). The DAC allows an administrator to access a running instance of the SQL Server Database Engine to troubleshoot problems on the server, even when the server is unresponsive to other client connections. The DAC is available through the sqlcmd utility and SQL Server Management Studio. By default, the connection is only allowed from a client running on the server itself; no network connections are permitted unless remote admin connections are enabled.

To use SQL Server Management Studio with the DAC, connect to an instance of the SQL Server Database Engine through the Query Editor by typing ADMIN: before the server name. Object Explorer cannot connect using the DAC.

How to Enable Dedicated Administrator Connection (DAC) for SQL Server instance?
To enable the DAC, connect to the instance, open a new query window, and run the following command:

sp_configure 'remote admin connections', 1;
GO
RECONFIGURE;
GO

How to verify if Dedicated Administrator Connection (DAC) is enabled on a SQL Server Instance?
We can verify or confirm that the Dedicated Administrator Connection (DAC) is working by connecting to the SQL Server instance using the following prefix in the "Server name" field of the SSMS login window:

ADMIN:&lt;server name&gt;

How many Dedicated Administrator Connections (DAC) can be opened at a time for a particular SQL Server instance?
Only one Dedicated Administrator Connection (DAC) can be open at a time per SQL Server instance.

When is the Dedicated Administrator Connections (DAC) to be used?
The Dedicated Administrator Connection (DAC) can be used in cases where SQL Server is running but not responding to regular connections due to resource pressure. The DAC has its own memory, scheduler, and dedicated thread, which allows queries over the DAC to run even when SQL Server is otherwise not responding. It is not advisable to run large or complex queries over the DAC; rather, it should be used to identify the resource-consuming queries causing SQL Server to hang or stop responding and to kill any such offending sessions.
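Over a DAC connection, a lightweight diagnostic query like the following (the session_id > 50 filter for skipping system sessions is a common convention, not a hard rule) can surface the offending sessions:

```sql
-- Find the busiest running requests; run this over the DAC connection
SELECT r.session_id, r.status, r.command, r.cpu_time,
       r.total_elapsed_time, r.blocking_session_id, r.wait_type
FROM sys.dm_exec_requests AS r
WHERE r.session_id > 50          -- skip system sessions
ORDER BY r.cpu_time DESC;

-- Terminate an offending session once identified (replace 99 with the real session_id)
-- KILL 99;
```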

One of the important and common responsibilities of a SQL Server DBA is to make sure a solid backup/recovery plan is in place and that the plan can be implemented per the SLAs. Periodic testing is important to ensure the plan works in case of an unexpected disaster. In most environments, backups of SQL Server databases are performed, covering both system and user databases. There can be many scenarios where a database is corrupted and has to be recovered, in which case restoring the database backups works great. In some cases, however, certain server-level objects have to be recovered, and restoring the system databases is either not an option or too time-consuming. In such cases, having a separate backup of server-level objects serves greatly.

Thus it becomes important for a DBA to also consider backing up instance-level objects periodically, so that finer-grained restores of the specific objects that need to be recovered are possible. This also acts as a secondary option in case the primary option of system database backups does not work in some situations. Below are the important SQL Server instance-level objects that should be backed up regularly so they can be used later when required.

– SQL Server instance configuration properties: these settings can be backed up by saving the output of sp_configure or of select * from sys.configurations.
– Database properties: the database properties get backed up with the database backups; however, saving these settings in a text document is useful for keeping track of them, so they can be verified or compared if values are changed by someone and we need to identify what the DBA team originally planned.
– DDL triggers: server-level triggers should be backed up; the backup will be useful if someone deletes the triggers.
– Backup Devices: During a backup operation on a SQL Server database, the backed up data (the backup) is written to a physical backup device. This physical backup device is initialized when the first backup in a media set is written to it. The backups on a set of one or more backup devices compose a single media set.

– Endpoints: SQL Server endpoint is the point of entry into SQL Server. It is implemented as a database object that defines the ways and means in which SQL Server may communicate over the network. SQL Server routes all interactions with the network via endpoints and each endpoint supports a specific type of communication.
– SQL Server Agent properties: these settings are important, and a backup of them is useful in cases where the values are changed by someone and we want to know what the correct ones are.
– SQL Server Agent jobs: it is very important to have a backup of the jobs, so that we can recreate them easily without having to restore the msdb database. Keep in mind that some jobs run DTS/SSIS packages, which have to be backed up separately.
– Linked servers: typically, linked servers are configured to enable the Database Engine to execute a Transact-SQL statement that includes tables in another instance of SQL Server, or another database product such as Oracle. Many types of OLE DB data sources can be configured as linked servers, including Microsoft Access and Excel.
– SQL Server logins: logins allow users to connect to SQL Server, so it is important to keep a backup so that we can recreate the required logins if they are accidentally deleted.
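Much of this instance-level metadata can be captured with catalog view queries like the following; saving their output periodically can serve as a lightweight backup of these objects:

```sql
-- Instance configuration settings
SELECT name, value, value_in_use FROM sys.configurations ORDER BY name;

-- Server-level DDL triggers
SELECT name, type_desc FROM sys.server_triggers;

-- Backup devices
SELECT name, physical_name FROM sys.backup_devices;

-- Endpoints
SELECT name, protocol_desc, state_desc FROM sys.endpoints;

-- Linked servers
SELECT name, product, provider, data_source FROM sys.servers WHERE is_linked = 1;

-- Logins (SQL logins, Windows logins, Windows groups)
SELECT name, type_desc, is_disabled FROM sys.server_principals
WHERE type IN ('S', 'U', 'G');
```

These queries document what exists; to actually recreate the objects, script them out from SSMS or with a scripting tool as well.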

Security is one of the most important aspects of any technology. As SQL Server is an RDBMS product that stores important and critical user data, it is common to see many attempts made to compromise it and gain access to important user or business data. This places responsibility on DBAs, developers, and management to plan and implement security best practices to prevent such attacks from succeeding. It is important to periodically review the security settings to make sure there are no gaps or possibilities of a security breach. Below is a list of items a DBA team may want to periodically review and verify to make sure the security configuration follows best practices.

Database owners: by default, the user who creates a database becomes its owner. Mostly DBAs who create the databases become the owners, so it is important to immediately change the database owner to SA or another standard account. Periodically review the database owners to make sure no user-specific domain accounts are database owners.

Windows or SQL Server logins: periodically review all the Windows logins, SQL Server logins, and Windows groups that have access to the SQL Server instance. Remove any logins that no longer require access. It happens that a user leaves the company but the login remains, or that access is given on a temporary basis for a project and is not removed after the project completes. Periodic review of the logins should resolve the issue of unauthorized logins being present at the server level or within SQL Server.
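Both reviews can be scripted; a sketch in T-SQL (the database name in the ALTER AUTHORIZATION line is illustrative):

```sql
-- Review database owners; a user-specific domain account here is a red flag
SELECT name AS database_name, SUSER_SNAME(owner_sid) AS owner
FROM sys.databases;

-- Reassign ownership of a database to sa
ALTER AUTHORIZATION ON DATABASE::SalesDB TO sa;

-- Review logins that currently have access to the instance
SELECT name, type_desc, is_disabled, create_date
FROM sys.server_principals
WHERE type IN ('S', 'U', 'G')    -- SQL logins, Windows logins, Windows groups
ORDER BY name;
```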

Server-level permissions: SQL Server provides different kinds of server-level and database-level roles. Make sure only the required permissions are granted to users. Granting more permissions than required can cause serious problems, so it is important to periodically review and revoke any additional access granted to any user.

SQL Agent jobs owned by Windows domain accounts: the DBA who creates a SQL Server Agent job becomes its owner by default, so make sure to change the owner to SA.

SQL Server client protocols: by default SQL Server allows communication through several protocols (TCP/IP, Shared Memory, Named Pipes, VIA). It is usually not necessary for all of these protocols to be enabled; those not used by the application can be disabled, so that anyone trying to gain access through that protocol is blocked.

SQL Server TCP/IP port: a default instance of SQL Server uses port 1433 by default, so it is advisable, per best practices, to change it to some other port. Making this change may cause applications to fail to connect to the SQL instance, so care has to be taken to change the application connection string to use the new port number or to enable the SQL Browser service. SQL Server named instances can be configured to use either a permanent static port or a dynamic port, which may change after a SQL Server or server restart. It is advisable to configure a static port for SQL Server named instances.
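To confirm which TCP port the instance is actually using, one quick check from within a query window is:

```sql
-- Shows the TCP port of the current connection
-- (returns NULL when connected over Shared Memory or Named Pipes)
SELECT local_tcp_port
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```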

The above checks and recommendations are high level, and there are more advanced checks to be performed for more critical servers. Compliance regimes like SOX and HIPAA define rules to be followed for critical servers. Also, care should be taken when implementing these recommendations, and proper testing has to be done prior to any implementation to make sure applications do not break due to the changes.

One of the common security recommendations DBAs come across, and one that security teams and management often ask about, concerns the SA login present in a SQL Server instance: the suggestion to disable or delete the 'SA' login for security purposes. It is generally expected that the built-in SA account is the first thing hackers may try to gain access to.

SQL Server installs with a default SQL Server login named SA, an abbreviation of "system administrator". The SA login maps to the sysadmin fixed server role, which has irrevocable administrative permissions on the whole server. There are no limits to the potential damage if an attacker gains access as a system administrator. The SA account needs to be set up with a strong password and should not be used to log in from your application.

Although SQL Server allows DBAs to disable or delete the SA login, it may not be a good option to delete this account, and it is even better not to disable it, because SQL Server installations or upgrades may sometimes fail or behave unexpectedly when the SA account is disabled or deleted. If the security team insists, then the most you should do is keep this login disabled, or better, change the authentication mode to Windows Authentication (making sure no applications use SQL authentication, as they will fail after the change).

You can also rename the SA account by running the command below:
ALTER LOGIN sa WITH NAME = [newsa];

Also, it is important to note that the SA login may be set as the owner of many SQL Server Agent jobs, so before making the change it is better to change the job owner from the SA login to another login that you are sure will never be deleted from the SQL Server instance. There may also be applications that use the SA account, so it is important to identify the applications or processes using it. One way to find which applications are using the SA account is to set up a Profiler trace filtered on the SA login; later you can import the trace into a SQL Server table and search for any applications using the account.
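The job-owner check and fix can be scripted as follows (the job and login names are illustrative):

```sql
-- Find SQL Server Agent jobs currently owned by sa
SELECT j.name AS job_name, SUSER_SNAME(j.owner_sid) AS owner
FROM msdb.dbo.sysjobs AS j
WHERE SUSER_SNAME(j.owner_sid) = 'sa';

-- Change the owner of a job to a stable login
EXEC msdb.dbo.sp_update_job
    @job_name = N'NightlyBackup',
    @owner_login_name = N'DOMAIN\DbaServiceAccount';
```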

Sometimes you may see that the SA account is disabled by default; this happens when SQL Server was installed with only Windows Authentication selected.

The level of permissions in SQL Server can be very tricky sometimes and can cause a lot of pain. We may receive only a generic error message, which makes it very difficult to identify the problem and then find a solution.

One of the issues I came across was related to the "Login failed" error.

If we search on the error number of the error message, we get a lot of results, as it can happen for various reasons. Error 18456 with state 11 or 12 occurs when SQL Server was able to authenticate the account but was not able to validate it against the underlying Windows permissions. It could be that the Windows login has no profile or a corrupt profile, or that the permissions could not be checked due to UAC. Another reason could be that the domain controller could not be reached.

I would like to add another issue which can raise this error that happens due to permission problems.

Consider the scenario where you create a domain group DomainGroupTest and a domain user DomainUserTest, and the domain user is added to the domain group. The domain group DomainGroupTest is added to SQL Server as a login and granted read permission on all databases. But when you try to connect to the SQL Server instance using the user DomainUserTest, it fails with the login failed error.

We first made sure we launched SSMS with "Run as administrator" and also confirmed there were no problems with the domain controller or with connectivity to AD from the server. There were no Windows or OS-related errors in the event logs or the SQL Server error log.

However, we noticed that granting the sysadmin permission to DomainGroupTest allowed the user DomainUserTest to log in successfully, and no permission other than sysadmin allowed the login to connect to the SQL Server instance. In that case, check the permissions of the "public" server role. By default every SQL Server login is a member of the "public" server role, and if any permissions are denied to this role, the login may fail.

How to check if the Connect permission is denied to the "public" server role?
To check whether the "public" server role has the Connect permission denied, connect to the SQL Server instance using SSMS and find the public server role. Right-click it, go to Properties, and click the Permissions tab. Make sure the TCP endpoint has the Connect permission granted.
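The same check can be done in T-SQL. The query below lists the server-level permissions granted or denied to public, and the GRANT restores the default if CONNECT on the default TCP endpoint was denied:

```sql
-- List server-level permissions held or denied by the public server role
SELECT pr.name AS grantee, pe.class_desc, pe.permission_name, pe.state_desc,
       ep.name AS endpoint_name
FROM sys.server_permissions AS pe
JOIN sys.server_principals AS pr
    ON pe.grantee_principal_id = pr.principal_id
LEFT JOIN sys.endpoints AS ep
    ON pe.class = 105 AND pe.major_id = ep.endpoint_id   -- class 105 = ENDPOINT
WHERE pr.name = 'public';

-- Restore the default grant if CONNECT on the TCP endpoint was denied
GRANT CONNECT ON ENDPOINT::[TSQL Default TCP] TO public;
```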

Once you remove any Deny permission for the public server role on the "TSQL TCP Endpoint", you should be able to connect to the SQL Server instance using the login DomainUserTest.

There may be other reasons for this error too; this is just one of the less common causes for state 11.

In today's world, auditing has become very important to control and detect changes to IT systems, whether planned or unplanned. SQL Server databases store important end-user data and are usually secured from unauthorized access, but there can be instances of unexpected changes to the system configuration made to gain access to the secure data, or cases where changes are made to the system and it is difficult to identify who made them. Proactive auditing helps identify the changes made to the system and take corrective steps, providing more control over the systems.

Auditing SQL Server Instance:

SQL Server 2005 was released around 15 years ago and had limited options for built-in audit solutions. With the latest versions, SQL Server 2008 or higher, there are better built-in audit solutions. We can enable auditing in SQL Server using various methods, including the built-in audit options, or by customizing existing features to build an audit solution, which is what most third-party software does.

Below are some of the methods which we can use to Audit changes to SQL Server instances.

DDL Triggers:

Triggers in SQL Server are fired when a certain event occurs and we can define a certain action to be taken like logging the change made into a table. DDL triggers are fired when a change is made using commands like ALTER, CREATE or DROP.

Advantages: DDL triggers allow us to specify an action to be performed when a particular event occurs. This is suitable for scenarios where we need to audit specific events, but not all DDL changes.

Disadvantages: DDL triggers cannot be used to track all changes to the SQL Server instance. Implementing DDL triggers requires a lot of customization, can be a very large task, and can result in issues when implemented without proper planning and testing.
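A minimal sketch of such a trigger, logging table DDL changes into an audit table (the table and trigger names are illustrative):

```sql
-- Audit table to store the captured DDL events
CREATE TABLE dbo.DDLAuditLog
(
    EventTime  datetime2 NOT NULL DEFAULT SYSDATETIME(),
    LoginName  sysname   NOT NULL,
    EventData  xml       NOT NULL
);
GO
-- Fires on table DDL in this database and records who did what
CREATE TRIGGER trg_AuditTableChanges
ON DATABASE
FOR CREATE_TABLE, ALTER_TABLE, DROP_TABLE
AS
BEGIN
    INSERT INTO dbo.DDLAuditLog (LoginName, EventData)
    VALUES (ORIGINAL_LOGIN(), EVENTDATA());
END;
```

EVENTDATA() returns the full event as XML (statement text, object name, time), so the audit table can be queried later for any detail needed.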

SQL Server Server-Side Trace:

Using the SQL Server Traces, we can audit or track almost all activity that takes place on SQL Server instance. Trace data will be stored in a trace file and we can import the trace data into tables periodically and create Scripts to read the trace data and extract the required information.

Advantages: We can track all the activity that happens on the SQL Server instance. We can filter the captured events to trace only required data.

Disadvantages:

– Running a trace on the system can cause additional overhead; the performance overhead depends on the number of events traced and the activity on the server. Often the server-side trace overhead is not much, as most third-party solutions internally rely on traces.

– A lot of customization is required: we need to configure the trace to capture the required data, periodically import that data into tables, and run queries against those tables manually, or through scheduled jobs that send a report via e-mail.

SQL Server Audits:

SQL Server Audit feature was introduced with SQL Server 2008. SQL Server audit lets one create server audits, which can contain server audit specifications for server level events, and database audit specifications for database level events. Audited events can be written to the event logs or to audit files.

Advantages: This is a built-in feature, it is simple to enable and manage and no additional customization is required. Audit results can be saved to event logs or audit files, which can be referenced later. Audit can be enabled at different levels and for different issues.

Disadvantages: Database-level auditing is limited to the Enterprise, Developer, and Evaluation editions. There is no option to send an e-mail when a certain condition is met, so that piece needs to be customized.
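A minimal sketch of a server audit capturing login events (the audit names and file path are illustrative):

```sql
USE master;
GO
-- Create a server audit that writes to files on disk
CREATE SERVER AUDIT LoginAudit
TO FILE (FILEPATH = N'C:\SQLAudit\');
GO
-- Define which server-level action groups to capture
CREATE SERVER AUDIT SPECIFICATION LoginAuditSpec
FOR SERVER AUDIT LoginAudit
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP)
WITH (STATE = ON);
GO
-- Turn the audit on
ALTER SERVER AUDIT LoginAudit WITH (STATE = ON);
GO
-- Read the captured events back from the audit files
SELECT event_time, action_id, server_principal_name
FROM sys.fn_get_audit_file(N'C:\SQLAudit\*', DEFAULT, DEFAULT);
```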

SQL Server Extended Events:

Starting with SQL Server 2008 and higher, Extended Events can be used to track the changes that happen in SQL Server at a granular level without much overhead. Extended Events collects information similar to SQL Server traces, plus additional details at a more granular level.

Advantages: We can track all the activity on a SQL Server instance at more granular level. We can filter events based on our interest. Performance overhead is very minimal.

Disadvantages: This requires a lot of customization to create and manage.
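A minimal sketch of an Extended Events session tracking object changes (the session name and file path are illustrative):

```sql
-- Capture object create/alter/drop events server-wide to a file target
CREATE EVENT SESSION TrackObjectChanges ON SERVER
ADD EVENT sqlserver.object_created,
ADD EVENT sqlserver.object_altered,
ADD EVENT sqlserver.object_deleted
ADD TARGET package0.event_file (SET filename = N'C:\XE\ObjectChanges.xel');
GO
-- Start the session
ALTER EVENT SESSION TrackObjectChanges ON SERVER STATE = START;
```

The captured .xel files can be opened in SSMS or read with sys.fn_xe_file_target_read_file for reporting.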

Other In-built Options:

There are a few other, rarely used built-in options such as C2 Audit, Common Criteria Compliance, and the Default Trace, but they target specific requirements, have limited options, and do not support customization to our requirements.

Third-Party Audit Tools:

Most organizations rely on third-party audit tools, as they do not need much customization and can be configured to generate and send reports by e-mail. The advantage of third-party tools is that they are already tested and used by many organizations, and documentation is available for using them. Disadvantages include additional cost and the overhead of configuring and managing these tools.

Some of the popular third-party audit products for SQL Server are Idera SQL Compliance Manager, ApexSQL Comply, Quest Change Auditor, etc.

Idera SQL Compliance Manager is one of the most widely used products for SQL Server audits and compliance. It provides comprehensive SQL Server auditing that can help us monitor, audit, and alert on SQL Server user activities and data changes. We can get detailed information about who performed what activity, when it was performed, and how the changes were implemented. We can use it to track changes and to monitor and audit data access, schema changes, and login failures. Many built-in reports are available to validate SQL Server audit trails, and a DBA can configure alerts to get notified of suspicious activity. Idera SQL Compliance Manager can help ensure compliance with regulatory and data security requirements.

ApexSQL Comply is another popular tool for SQL Server audits and compliance. It tracks data changes to a SQL Server database, capturing and storing them in a central database, and records who performed what activity, when it was performed, and how changes were implemented. The tool supports many audit reports that run from the central database. It can also tell a user how and when an incident occurred, capturing relevant information such as which SQL or Windows host and application name were involved. Finally, it can track table definition changes and unauthorized or malicious changes.

Quest Change Auditor for SQL Server tracks, audits, reports, and alerts on any changes to our environment in real time. It is capable of tracking all user and administrator activity, such as who made what kind of changes, when the changes were implemented, and from which workstation they were carried out. The tool monitors for critical changes within the environment and can be configured to send real-time alerts whenever such changes occur. Since it maintains a centralized repository, monitoring multiple servers from a single location is easier. It also supports role-based access, allowing auditors to run searches and reports without the ability to make any configuration changes to the application.
