fredag 23 september 2016

At Ignite 2015 Microsoft announced that DPM would be able to back up and, most importantly, restore VMware virtual machines. With the just-released UR 11, Microsoft kept that promise and delivered an agentless solution for VMware protection.

All virtual servers protected by DPM will also be able to leverage cloud-integrated long-term protection via an Azure Backup vault.

In short, DPM will be able to auto-protect newly deployed servers provisioned via vCenter. This is possible because vCenter lets you organize your virtual machines in VM folders.

In the scenario where protected virtual servers are load balanced in the VMware environment, DPM will keep track of those VMs and continue to protect them.

fredag 9 september 2016

This update rollup adds backup and recovery support for VMware server 5.5 and 6.0. Create protection groups, manage policies, and protect and recover VMware VMs with the user interface or new PowerShell cmdlets. This update is available through Microsoft Update or by manual download. UR 11 includes these features:

Agentless backup: DPM doesn't require an agent on the VMware server. You can use the IP address or fully qualified domain name (FQDN), together with login credentials, to authenticate the VMware server with DPM.

Folder-level auto-protection: vCenter lets you organize your VMs in VM folders. DPM detects these folders and enables you to protect VMs at the folder level. Once you have turned on folder-level auto-protection, DPM protects the VMs in that folder and its sub-folders, and any new VMs added to the folder or sub-folders.

DPM can recover individual files and folders from a Windows Server VM.

fredag 5 augusti 2016

A few days ago I bumped into an interesting error where backups of some virtual machines managed via System Center Virtual Machine Manager (SCVMM) kept failing and never could create a Recovery Point. Every backup job ended with the same error code: ID 30111 or 0x800423F4.

As always when it comes to troubleshooting DPM (I should write a blog post regarding this…), the first step is to verify the dependent technologies that DPM integrates with to deliver the backup or restore job for the protected data sources.

Since all VSS writers were reporting Healthy, I moved the troubleshooting on to VMM itself. When DPM backs up a virtual machine, it verifies the VSS writers present inside the guest OS before it takes a snapshot of the virtual machine.

In this case I right-clicked the production server in VMM that I couldn't back up and chose Properties. In the properties for the virtual machine I selected "Hardware Configuration" and, under "Advanced" in the scroll list, "Integration Services".

I unchecked "Backup (volume snapshot)" and saved the configuration to the VM. When that job was done I re-enabled "Backup (volume snapshot)".

After re-initializing the Integration Service for Backup I was able to get the backups up and running again.

Never
underestimate the power of re-initializing an “Integration Service”.
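If you would rather script the toggle than click through VMM, the same switch can be flipped on the Hyper-V host that owns the VM. This is a minimal sketch using the Hyper-V PowerShell module; the VM name SRV01 is a placeholder:

```powershell
# Run on the Hyper-V host that owns the VM; 'SRV01' is a placeholder name.
$vmName = 'SRV01'

# Disable the Backup (volume snapshot) integration service...
Disable-VMIntegrationService -VMName $vmName -Name 'Backup (volume snapshot)'

# ...and re-enable it to re-initialize the in-guest VSS integration.
Enable-VMIntegrationService -VMName $vmName -Name 'Backup (volume snapshot)'

# Verify that the service is enabled again.
Get-VMIntegrationService -VMName $vmName -Name 'Backup (volume snapshot)'
```

In a VMM-managed environment, doing it through the VMM console as described above keeps the VMM database in sync, so prefer that route when VMM owns the host.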

fredag 24 juni 2016

Got the question whether I could consider delivering a session at Ignite in Atlanta later this year, and the answer was of course yes!

I will be delivering a Business Continuity focused session clarifying HOW you can adopt Microsoft technology within, or for, your Business Continuity concept. I will also clarify what Business Continuity is and how you can get started with it when you come back home from Ignite.

tisdag 24 maj 2016

Microsoft has just released Update Rollup 10, and with it a great deal of optimization! The following issues (listed in KB3143871) have been fixed in this update rollup:

If you try to exclude a page file for a VM running on
Microsoft Hyper-V Server 2012 R2 server, DPM may still continue to back up the
page file

DPM provides an Active Directory schema extension tool to make the required changes to Active Directory for DPM End-User Recovery. However, the tool may not work on System Center 2012 R2 Data Protection Manager

If you try to protect a SharePoint content database
that has Always On enabled, and there is a failover of the database,
you may notice that new sites, lists, and items are not displayed on
the Recovery tab. This applies only to new items that are
created after failover. Additionally, the issue is automatically resolved after
failback

If you run Update Rollup 7 for System Center 2012 R2 Data Protection Manager, or a later version, and then try to do item-level recovery for a Hyper-V VM, you may receive the following error message when you click the VHDX file on the Recovery tab: "DPM cannot browse the contents of the virtual machine on the protected computer DPMServerName"

The DPM Console crashes when you try to open any of
the six built-in DPM reports

Optimized item level recovery doesn't work for a
SharePoint farm. This causes the full farm data to be copied to the staging
server that's running Microsoft SQL Server

The DPM UI crashes when you try to recover online
recovery points by using another DPM server that is registered to the same
backup vault as the original server

The Get-DPMJob cmdlet does not provide any
information about recovery jobs that are performed through the external DPM
server

This update revises the message of error code 33504 to
add details about the issue and steps to resolve the issue

If you try to protect Microsoft Exchange Server 2016 while you create a protection group, the server that's running Exchange Server is displayed as "Exchange 2013 Database" instead of "Exchange 2016 Database"

If you use the DPM Central console, and you receive
an EvalShareInquiryAlert (3123) alert on DPM, the alert is still
displayed as active in System Center Operations Manager even though the issue
is resolved on DPM

DPM crashes when you try to configure SMTP settings

If you try to stop protection of a data source in
which the FQDN contains more than 64 characters, the DPM service crashes

fredag 8 april 2016

Got the email saying that the nice community conference in Minneapolis, the Midwest Management Summit (MMS), wanted me as one of their speakers. I will be delivering three sessions this year (or maybe more... we'll see):

Business Continuity using Microsoft Solutions

Be a Hero or be fired. Backup and restore strategy

Tearing down IT Silos

I will be around the conference the whole week and really looking forward connecting with new community members and also old friends.

fredag 1 april 2016

Hopefully around this year's Ignite, Microsoft will release the new version, DPM 2016. With that release comes not just updated code but also a few nice additions in the form of newly supported scenarios from Microsoft.

fredag 18 mars 2016

The DPM software is dependent on the VSS architecture, which of course is nothing new. But when it comes to troubleshooting DPM, many seem to forget that the initial steps start with having a healthy set of VSS writers present in the affected operating system.

It's also recommended that you verify the System log on the server you are trying to protect. In one scenario I bumped into, where DPM could not create new Recovery Points, the local System log was filled with volsnap errors with event ID 39.

In this particular situation the affected volume had lost its association to its shadow copy storage area and therefore could not create any new Recovery Points.

So how did I solve it? Well, you need to create a new unbounded shadow copy storage area. This is done via this simple set of instructions:

Open an elevated PowerShell prompt.

If your D: drive is not getting any Recovery Points, this is what you type: "Vssadmin add shadowstorage /For=D: /On=D: /MaxSize=UNBOUNDED"
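From an elevated prompt, the check and the fix together look like this (D: is a placeholder for whichever volume is failing to get Recovery Points):

```powershell
# List the current shadow copy storage associations to confirm
# which volume has lost (or never had) its association.
vssadmin list shadowstorage

# Create a new, unbounded shadow copy storage area for the D: volume,
# hosted on D: itself.
vssadmin add shadowstorage /For=D: /On=D: /MaxSize=UNBOUNDED
```

If an association already exists but is misconfigured, `vssadmin resize shadowstorage /For=D: /On=D: /MaxSize=UNBOUNDED` can be used instead of `add`.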

fredag 4 mars 2016

Last week I ran into a strange error when I tried to open a DPM console. All of a sudden I got an error saying "MMC cannot open the file . This may be because the file does not exist, is not an MMC console, or was created by a later version of MMC. This may also be because you do not have sufficient rights to the file."

To solve this we need to dig deep into the user profile. What is really happening behind the scenes? Every time the DPM console opens, the file "Microsoft System Center 2012 R2 Data Protection Manager" located in the Roaming AppData folder of your profile is verified. If that file is corrupt or faulty it needs to be re-initialized.

This is done via the simple trick of deleting the file
and restarting the DPM server console.

The file location is "C:\Users\USERNAME\AppData\Roaming\Microsoft\MMC". Keep in mind to enable "Hidden items" in File Explorer, otherwise the AppData folder will not show up.
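The manual delete can also be scripted. A minimal PowerShell sketch, assuming the console is closed and the cached file carries the name mentioned above:

```powershell
# Path to the cached console settings file in the roaming profile.
# $env:APPDATA resolves to C:\Users\USERNAME\AppData\Roaming.
$mmcCache = Join-Path $env:APPDATA 'Microsoft\MMC\Microsoft System Center 2012 R2 Data Protection Manager'

# Delete the file if it exists; it is recreated the next time
# the DPM console starts.
if (Test-Path $mmcCache) {
    Remove-Item $mmcCache -Force
}
```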

fredag 19 februari 2016

Now and then I bump into scenarios where people can't seem to open their DPM console. DPM throws an error saying "Unable to connect", with error ID 948.

This error ID says that a connection can't be made because the DPM server's database (DPMDB) is no longer in sync.

To solve this error you must get the DPMDB in sync again; this is done via the DPM tool called DpmSync.

Open an elevated DPM Management Shell and run the following command: "DpmSync -Sync". After a while the command finishes, and you are now able to log on to your console again.

You may also need to run a Consistency Check on all your protected data sources to get the protection itself back in sync.
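Run from an elevated DPM Management Shell, the whole recovery can be sketched like this; the protection group name PG-FileServers is a placeholder for your own group:

```powershell
# Re-synchronize the DPM database with the protection agents.
DpmSync -Sync

# Afterwards, run a consistency check on the data sources of an
# affected protection group. 'PG-FileServers' is a placeholder name.
$pg = Get-DPMProtectionGroup | Where-Object { $_.FriendlyName -eq 'PG-FileServers' }
Get-DPMDatasource -ProtectionGroup $pg |
    ForEach-Object { Start-DPMDatasourceConsistencyCheck -Datasource $_ }
```

Repeat the consistency check loop for each protection group that shows inconsistent replicas after the sync.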

When there is DPM filter corruption, DPM automatically repairs the corruption by triggering a synchronization job within 15 minutes of the previous sync/backup job failure. (DPM does this instead of running a consistency check.)

In the DPM UI, the Offline/Online tags have been removed from Hyper-V VM names.

If you already upgraded to DPM 2012 R2 Update Rollup 6 or a later version, you will not have to restart the production server when you install this update.

This update is available through Microsoft Update or by manual download. You can download the package and find installation instructions and information about agent updates in KB 3112306.

fredag 15 januari 2016

A while ago Microsoft introduced the great feature of providing long-term protection using a Backup Vault in Azure, replacing on-prem tape solutions.

In the scenario where DPM starts to protect data sources within a workload whose Recovery Point volume is smaller than 3 GB, the Online Recovery Point job will constantly fail with error ID 100034.

To solve this you must increase the size of the Recovery
Point and Replica Volumes to 3 GB each. After this is done your Online Recovery
Point jobs will start to function again.

This is done via the "Modify Disk Allocation" function that you access by right-clicking either a Protection Group or a single data source that is a member of the Protection Group.
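If you prefer the shell over the UI, the resize can be sketched with the DPM cmdlets below. The parameter names are from memory rather than from this post, so verify them with Get-Help before use; the protection group and data source names are placeholders:

```powershell
# Locate the protection group and make it modifiable.
# 'PG-FileServers' and 'D:\' are placeholder names.
$pg  = Get-DPMProtectionGroup | Where-Object { $_.FriendlyName -eq 'PG-FileServers' }
$mpg = Get-DPMModifiableProtectionGroup $pg
$ds  = Get-DPMDatasource -ProtectionGroup $pg | Where-Object { $_.Name -eq 'D:\' }

# Grow the replica and recovery point (shadow copy) volumes to 3 GB each.
Set-DPMDatasourceDiskAllocation -Datasource $ds -ProtectionGroup $mpg `
    -Manual -ReplicaArea 3GB -ShadowCopyArea 3GB

# Commit the change to the protection group.
Set-DPMProtectionGroup -ProtectionGroup $mpg
```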

tisdag 27 oktober 2015

Microsoft has just released the Update Rollup 8 that contains some bugfixes.

The DPM Agent crashes intermittently during a backup

If you are trying to recover data from an imported tape, DPM may crash with a "Connection to the DPM service has been lost" error

If you try to back up a SharePoint site that uses SQL Always On as a content database, SQL logs are not truncated as expected

You cannot verify tape library compatibility for tapes that use RSMCompatmode settings such as IBM 35xx, 2900, and so on

If you have multiple SharePoint farms hosted on the same SQL cluster with different instances but the same database names, DPM cannot back up the correct SharePoint farm content

If you run Update Rollup 7 for Data Protection Manager 2012 R2, and you have already configured online protection for one or more protection groups, trying to change the protection group populates the default DPM settings for the "Select long-term goals" wizard instead of the previous configured values

When you try to protect a SQL failover cluster, the Data Protection Manager UI crashes for every backup or synchronization operation

If you install Update Rollup 7 for Data Protection Manager 2012 R2, self-service recovery for SQL databases may not work, and you receive the following error message:

This update rollup fixes bugs listed in KB 3086084. It's available through Microsoft Update or by manual download.

The ability to recover data to any DPM server that's registered in an Azure Backup vault. If two or more servers are registered in a vault and you back up data to Azure, then any of those registered DPM servers can recover data from the vault. To use this feature after installing Update 7, enable the setting Add External DPM on the Recovery tab in the DPM console and specify vault credentials. Note that credentials are only valid for two days; if they've expired, you'll need to download new credentials from the vault. Select the DPM server whose data needs to be recovered, and provide the encryption password. You can close the dialog after the server appears. Then you can recover the data by browsing the recovery points. To return to the local DPM data view, click Clear external DPM.

Note that if you want to use this feature for existing backed-up DPM data, you'll need to wait at least a day for the DPM protection group metadata to be uploaded to Azure (this happens for the first time during the nightly job).

Support for protection and backup of data on client computers running Windows 10. You back up in the same way you did with client computers running earlier versions of Windows.

This update is available through Microsoft Update or by manual download. You can download the package and find installation instructions and information about agent updates in KB 3065246.

onsdag 20 maj 2015

Support for using SQL Server 2014 as the DPM database. Support for protecting SQL Server 2012 was introduced in Update 4. Now, in Update 6, you can configure a server running SQL Server 2014 as your database. You'll need to install Update 6 and then configure DPM to use SQL Server 2014.

If you're backing up DPM to Azure you can now select to keep online backed up data when deleting a protection group. When you delete a protection group you can choose whether to preserve backed up data in Azure or on a local disk. You can always browse the retained data, together with other recovery points, from the server on the Recovery tab.

This update is available through Microsoft Update or by manual download. You can download the package and find installation instructions and information about agent updates in KB 3030574.

torsdag 5 mars 2015

This is an issue with UR5, where Create\Modify\Delete on a PG with Clustered File Server datasources may fail with Error 197, or cause a console crash.

Please use this workaround and see if it resolves the issue:

Close the UI and all DPM services

Important: Take a full database backup of the DPMDB to a safe location

Run the SQL script on the DPM DB that is posted below

Start all DPM services, and open the UI

This issue may recur whenever the underlying volume of the File Server migrates across nodes of the cluster and an inquiry is triggered on the File Server cluster role. You can run the script again as required.

With Update Rollup 5 for System Center 2012 R2 Data Protection Manager, Data Protection Manager can now protect Microsoft SharePoint Server farms that are hosted on instances of Microsoft SQL Server with an AlwaysOn cluster.

There is no change in the Data Protection Manager UI for the backup and recovery steps. If there is a failover within the SQL Server AlwaysOn cluster, Data Protection Manager will automatically detect the failover and continue to back up from the active SQL Server availability instance without requiring user intervention.

Note The Data Protection Manager Agent must be installed on all the nodes of a SQL Server AlwaysOn cluster.

Support for multiple retention ranges for long-term backup on Microsoft Azure

With Update Rollup 5 for System Center 2012 R2 Data Protection Manager, Data Protection Manager will enable users to configure multiple retention policies for long-term retention of backup data on Microsoft Azure. Users can choose between daily, weekly, monthly, and yearly retention policies and can configure the number of recovery points (retention range) for each policy.

Notes

Data Protection Manager with Update Rollup 5 will enable up to 366 recovery points for each data source.

This option will be available only for protection groups that enable online protection for the first time.

Ability to transfer initial backup copy offline to Microsoft Azure

While it is creating a new protection group or adding data sources to a protection group, Data Protection Manager has to create an initial backup copy for the data sources that were added. Depending on the data source, this data could be large. This could make it difficult to send data over the network.

This update provides an option to transfer the initial backup copy to the Microsoft Azure data centers. Large data can now quickly be transferred to Microsoft Azure without consuming Internet bandwidth. If a user decides to transfer the initial backup copy offline to Microsoft Azure, the backup data will be exported on a disk. This disk is then shipped to Azure data centers. After the data is imported to the customer storage account in Azure, the disk is returned to the user.

More information about this feature can be found here.

Support for protecting Microsoft workloads that are hosted in VMware

With this update, Data Protection Manager can protect Microsoft workloads that are hosted on VMware. It will provide application-consistent protection at the guest OS level. Data Protection Manager Agents have to be installed on the guest OS of the VMware virtual machines that are hosting the workloads to be protected.

Note VMWare VM backup/recovery is not yet supported.

Display missed SLA alerts on the Data Protection Manager console

In Update Rollup 4 for System Center 2012 R2 Data Protection Manager, a feature for configuring backup SLA by using the Set-DPMProtectionGroupSLA cmdlet was added. For any protection group that missed the configured backup SLA, Data Protection Manager raised an alert that was visible only from Operations Manager. With this update, the missed-SLA alert will also be displayed in the Data Protection Manager console.

During its nightly job, Data Protection Manager will check all protection groups that have an SLA configured and raise an alert for each one that missed its SLA. For example, if a user configured a backup SLA of 8 hours for a protection group but no recovery point was created in the past 24 hours, the missed-SLA job that runs at midnight would display three missed-SLA alerts for the protection group.

Note Data Protection Manager SLA miss alerts will not be resolved automatically on the next successful recovery point. Users have to manually activate the alert.

Enhanced reporting with Data Protection Manager Central Console

This update also provides a new enhanced reporting infrastructure that can be used to create customized reports instead of just relying on the standard canned reports that were shipped with Data Protection Manager. We also expose the reporting schema to enable the creation of custom reports.

Users can now have aggregated reports from various Data Protection Manager servers that are managed in Operations Manager. A demonstration report is also available to help users create custom reports.

The new reporting infrastructure will be available after users upgrade their Data Protection Manager servers to Update Rollup 5, install the Data Protection Manager Central Console on Operations Manager, and import the new Data Protection Manager Reporting Management Pack that is available on the Microsoft Download Center.

fredag 9 januari 2015

The majority of datacenters that are built using the modern datacenter approach are committed both to redundancy and to smart designs for providing uptime for the hosted services.

Providing a dedicated backup network that the backup traffic can use and rely on is a good strategy, since it offloads the primary network architecture and removes the dependency on the network hardware that builds up the production network.

There are some prerequisites for enabling a backup network:

Secondary network card

Enabling name resolution

Verifying network connectivity

The configuration for setting up a backup network is done via PowerShell, but first you must have a secondary network card installed, configured with an IP address that is a member of the backup network subnet.

The next step is to verify name resolution: alter the HOSTS file on both the production server and the DPM server and enter the NetBIOS names of the servers involved.

Verify that you can ping the IP address for the backup network on the DPM server from the production environment and also via NetBIOS name.

To configure the backup network, open the DPM Management Shell and use the PowerShell cmdlet Add-BackupNetworkAddress.

To configure the DPM server to use the dedicated network 10.1.1.0/24, enter the following syntax: Add-BackupNetworkAddress -Address 10.1.1.0/24 -DPMServer DPM1 -SequenceNumber 1

The three parameters are:

Address

DPMServer

SequenceNumber

The Address parameter provides the subnet of the backup network. DPMServer provides the NetBIOS name of the DPM server. With the SequenceNumber parameter you set the order of the primary backup network and the failover backup network.

The DPM agent verifies its XML configuration to understand which network it should use as its primary backup network. Normal communication between the DPM agent and the DPM server will still travel over the production network. If backup network connectivity fails, the DPM agent can also be set up with a secondary backup network that acts as a failover network for that traffic. The failover network is something you must configure explicitly for it to work.
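A sketch of a complete primary-plus-failover configuration; the subnets and the server name DPM1 are examples. Note that in current DPM builds the server parameter is spelled -DpmServerName, which may differ from older write-ups:

```powershell
# Primary backup network: sequence number 1.
Add-BackupNetworkAddress -Address 10.1.1.0/24 -DpmServerName DPM1 -SequenceNumber 1

# Failover backup network: sequence number 2, used when the primary
# backup network is unreachable.
Add-BackupNetworkAddress -Address 10.1.2.0/24 -DpmServerName DPM1 -SequenceNumber 2

# Review the configured backup networks.
Get-BackupNetworkAddress -DpmServerName DPM1
```

Run the commands from the DPM Management Shell on the DPM server itself.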

torsdag 27 november 2014

I have had some interesting meetings with the Product Group managers at Microsoft regarding the development of Data Protection Manager, also known as DPM. I'm very pleased that Microsoft is working very hard to make improvements to the product, and that is the reason for me writing this blog post.

I want to understand the community
wishes regarding HOW Microsoft can improve the DPM software from two scenarios:

The on-prem tape story. What do you think Microsoft should improve regarding tape management and features in DPM?

SharePoint protection. What do you think Microsoft should improve regarding SharePoint protection, restore, and features? What is missing from your point of view?

Just to be clear: I will read all of your emails and I will summarize them without changing your words. The information you provide to me will be passed on to the product group managers for System Center and DPM.

fredag 21 november 2014

Lately I have gotten some questions regarding whether the new DPM book will be out and released this year, and I have great news for you: I have only three chapters left to write out of a total of 14.

I will have my part done ASAP, and then the great people who are reviewing my work will finish theirs. The nice people at Packt Publishing will produce the book as soon as all the material is gathered and reviewed.

fredag 14 november 2014

Are you interested in learning how to start your proactive monitoring experience, and also in how to get started with your business continuity planning based on the features and functions of the Microsoft product stack?

If the answer is yes, I would like to personally welcome you to the two sessions I'm delivering at the TechDays event in Stockholm.

onsdag 29 oktober 2014

Microsoft has just released UR 4, and with it comes the requirement to reboot the production servers that you want to protect.

It is important to emphasize why you need to reboot your production environment. When Microsoft makes changes to the drivers that are part of the deduplication process, the DPM team must update their code to adapt to those changes. For DPM to be fully compatible, the operating system needs to be rebooted to re-initialize the deduplication updates made by the Windows Server team. Hence, the reboot is not caused by DPM itself but by the need to re-initialize the code changes to the deduplication engine made by the Windows Server team.

The content of UR4 is a mixture of many good things. The following list describes what has been fixed or added to the product:

Scenario
You try to back up SharePoint when SharePoint has various files with custom access permission settings, and you use a non-English locale for the SharePoint sites. When you try to recover the files, the data is recovered successfully. However, the custom permissions may be lost. This causes some users to no longer have access to a file until the permissions are restored by the SharePoint administrator.

The DPM user interface crashes, and you receive the following error message: Connection to the DPM service has been lost.

This issue occurs when you change some protection groups after you upgrade to DPM 2012 R2 Update Rollup 2 or a later version.

You experience issues when you try to restore a SQL Server database, and the restore fails with error 0x800423f4.

Scenario
You have a SQL Server database that uses multiple mount points pointing to the same volume, and you protect it by using DPM. When you try to recover from an existing recovery point, you receive error message 0x800423f4.

An instance of SQL Server is missing when you try to create a protection group (PG).

Scenario
You have two or more instances of SQL Server on the same server, and a database name in one instance is a substring of the name of another SQL Server instance. For example, you have a default instance of SQL Server and another instance that is named "contosoinstance". Then, you create a database that is named "contosoins" in the default instance of SQL Server.

In this case, when you try to create or change a protection group, the "contosoinstance" SQL instance and its databases do not appear in the Create Protection Group Wizard.

The DPM console or service crashes after VMs are removed from the Hyper-V host.

Scenario
Several virtual machines on the Hyper-V host are physically removed. Then, the DPM UI crashes when you try to stop protection for these virtual machines.

DPM crashes with NullReferenceException.

Scenario
When you protect a SQL database by using long-term protection on tape, and the SQL logs are moved to a different disk volume, the DPM UI crashes when you try to make a backup for the first time after you move the logs. However, after you restart the UI, everything works correctly.

Scheduled backup to tape runs on an incorrect date for quarterly, semi-annually, and yearly backups. In some cases, you see that the backup runs before the specific date.

The DPM UI sometimes crashes when you change a protection group, with the following error: Connection to the DPM service has been lost. (ID: 917)

The following features have been added in UR4:

Added support for protecting SQL Server 2014

This update lets you protect SQL Server 2014 as a workload. There is no change in the user experience or in the scenarios that are supported. Therefore, you can continue to back up SQL Server 2014 by using DPM in the same way that you protected older versions of SQL Server.

Unsupported scenarios in SQL Server 2014:
•You cannot use SQL Server 2014 as the DPM configuration database.
•You cannot back up the data that is stored as Windows Azure blobs from SQL Server 2014.
•The "Prefer secondary backup" preference for the SQL Always On option is not supported completely. DPM always takes a backup from the secondary. If no secondary can be found, the backup fails.

Added support for SQL Server 2012 Service Pack 2 (SP2)

This update lets you protect the latest SQL Server 2012 SP2 as a workload and also use it as a DPM configuration database. There is no change in the user experience or scenarios that are supported. Therefore, you can continue to back up SQL Server 2012 SP2 by using DPM in the same way that you protected older SQL Server versions.

Simplified steps for Online backup registration

Note Existing DPM to Azure customers should upgrade to the latest agent (version 2.0.8689.0 or a later version, available here). If it is not installed, online backups fail, and no DPM to Azure operations will work.

By adopting the latest features and improvements in this update, DPM customers can register for Azure protection in a few clicks. These improvements remove the dependency on creating a certificate by using makecert and uploading it to the Azure portal. Instead, you can download vault credentials from the vault page on the Azure portal and use those credentials for registration.

To register the DPM server to the Azure backup vault, follow these steps.

Prerequisites
You must have a subscription on the Azure portal, and a backup vault must be created on the portal.

If you already use DPM-A
1. From the backup vault page on the portal, download the agent for System Center Data Protection Manager.
2. Run the installer on the server. To do this, follow these steps:
   a. Select the folder to install the agent in.
   b. Enter the correct proxy settings.
   c. Opt in for Microsoft updates, if you are not already opted in. Critical updates are usually propagated through Microsoft updates.
   d. The installer verifies the prerequisite software and installs it.

Please be aware that you do not have to register the server again if the server is already registered and backups are in progress. In this case, you only have to upgrade the agent.

Getting started with DPM-A
3. From the backup vault page on the portal, follow these steps:
   a. Download vault credentials.
   b. Download the agent for System Center Data Protection Manager.
4. Run the installer on the server. To do this, follow these steps:
   a. Select the folder where you want to install the agent.
   b. Enter the correct proxy settings.
   c. Opt in for Microsoft updates, if you have not already opted in. Critical updates are usually propagated through Microsoft updates.
   d. The installer verifies the prerequisite software and installs it.
5. Register the DPM server. To do this, follow these steps:
   a. In Data Protection Manager, click Online, and then click Register Server.
   b. In the Registration window, select the vault credential file that was downloaded from the portal. In a remote scenario, only the network share path is enabled.
   c. Follow the rest of the steps and complete the registration.

fredag 19 september 2014

There is a new opportunity coming up regarding attending the "Master System Center 2012 R2 Data Protection Manager" course that I will teach in Stockholm the 8th of December this year.

If you are looking for an education that gives you the whole picture of how you can integrate DPM with your Business Continuity plan and make the product work optimally with the other System Center families and Azure, this is the course for you.

We will also be focusing on restore, since that is the reason for backup!

fredag 5 september 2014

A few hours ago the product group released the great news that DPM is now supported to run as an IaaS virtual machine in Azure, protecting other Azure IaaS servers. There are some important facts that you need to consider, and this blog post covers the basics you need to get started.

The Azure IaaS virtual machine must be of size A2 or higher. Please keep in mind that DPM can also protect workloads that run across multiple Azure cloud services that share the same Azure virtual network and Azure subscription. The number of disks that can be used for the DPM disk pool is limited by the size of the virtual machine. If you want to know more about size limits, please read this information regarding Azure Virtual Machines (http://msdn.microsoft.com/library/azure/dn197896.aspx).

fredag 29 augusti 2014

The community has for a while mentioned that they need more retention time for the Recovery Points that they choose to store in Azure. With the UR3 release for DPM 2012 R2 a month ago, the DPM team made some final changes and optimizations to make DPM ready to deliver fully supported, optimized and efficient private cloud protection. With that came the architecture needed on the DPM server side to interact with the new version of the Azure agent. Please keep in mind that there is a specific Azure team, developing the Azure services, that cooperates closely with the DPM team to provide all these great features.

From a more personal view, I find this update a milestone: it gives you the possibility to partially remove tape from your restore plans, or even remove it entirely. There is a very positive feeling regarding the development of the DPM technology, and also a great focus on delivering great interactions between systems.

For more information regarding the new retention time in the Recovery Services Backup Vault, please read the article that Sherees published a few days ago.

onsdag 30 juli 2014

Today UR3 for System Center Data Protection Manager 2012 R2 was released, and with that the Microsoft product group team in India sealed the deal regarding private cloud protection. More blog posts will follow regarding protecting your private cloud using the great features of System Center Data Protection Manager 2012 R2.

Features that are implemented in this update rollup

Scalable VM backup

This update rollup improves the reliability at scale for Virtual Machine (VM) backups on Hyper-V and Windows Server 2012 R2 infrastructures. This feature is supported on both Cluster Shared Volumes (CSV) and scale-out file server (SOFS) storage configurations for VMs.

You must upgrade the DPM 2012 R2 agent to DPM 2012 R2 UR3 on all nodes of Windows Server 2012 R2 Hyper-V clusters.

You must restart the cluster nodes.

Backup and consistency check window

Important This feature is supported only for disk protection for VM data sources.

This feature, configured through Windows PowerShell, enables specific time windows to restrict backup and consistency check (CC) jobs. The window can be applied per protection group and will limit all jobs for that protection group to the specified time window.

After the backup window has ended, all in-progress jobs are allowed to continue, while any queued jobs outside the backup and consistency check window are automatically canceled.

This feature affects only scheduled jobs and does not affect specific jobs that are triggered by the user.
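
As a sketch of what that PowerShell configuration can look like — the DPM server name and the time values are placeholders, and the exact parameter set should be verified against the UR3 cmdlet documentation:

```powershell
# Pick the protection group to restrict and make it editable.
$pg  = Get-DPMProtectionGroup -DPMServerName "DPM01"
$mpg = Get-DPMModifiableProtectionGroup $pg[0]

# Allow backup jobs only between 20:00 and 06:00 (10 hours).
Set-DPMBackupWindow -ProtectionGroup $mpg -StartTime 20:00 -DurationInHours 10

# Allow consistency check jobs only between 06:00 and 10:00 (4 hours).
Set-DPMConsistencyCheckWindow -ProtectionGroup $mpg -StartTime 06:00 -DurationInHours 4

# Commit the changes to the protection group.
Set-DPMProtectionGroup -ProtectionGroup $mpg
```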

fredag 25 april 2014

Microsoft keeps on delivering great technologies to their customers; the latest contribution is support for backing up Hyper-V replicated servers.

When building a disaster recovery plan for a customer, one important key component is the Hyper-V Replica feature, which lets you asynchronously replicate a Hyper-V machine between two hosting servers. The only previously supported scenario was to back up the virtual machines running at the primary site; this has now changed. On the 24th of April Microsoft announced that they now support backup of the replicated server located at the replica site, which makes it easier to provide a decent, and also more optimal, disaster recovery strategy to customers.

Please note that backup of a Hyper-V virtual machine should only be considered a disaster recovery strategy. From a DPM perspective, you can recover items that are defined as flat files from within the backup of the virtual machine; this does not apply to the applications (SQL, Exchange etc.) running inside the virtual machine. To summarize: to be able to create a basic recovery strategy, you need to deploy a DPM agent to the virtual machine OS and perform what Microsoft calls a guest-level backup, providing a recovery scenario for the hosted application.

The Hyper-V Replica feature has been a great contribution from Microsoft to their customers, and with this announcement of support, building a more optimal and strategic disaster recovery plan, one that maps directly to the customer's business continuity plan, becomes much easier.

fredag 28 mars 2014

A little
while back I wrote an article for Microsoft that was published on Technet. The
article covers how you get started using the Recovery Services in Microsoft
Azure (name change!!) with focus on the backup vault and DPM integration.

fredag 7 mars 2014

As I mentioned
in my previous blog post (http://robertanddpm.blogspot.se/2014/02/attending-teched-north-america.html)
I’m going to TechEd NA to deliver a session together with John Joyner. We will
talk about how to deliver Backup As A Service (BaaS), Restore As A Service
(RaaS) and Disaster Recovery As A Service (DRaaS) using SCDPM and the other members
of the System Center 2012 R2 stack, Windows Server 2012 R2 and Azure.

If you are attending TechEd NA I would of course be very happy to see you in the audience, but I would be even happier to get the opportunity to listen to your challenges, thoughts or any ideas you may have regarding DPM, SCOM or other members of the System Center family stack.

If you would
like to have a 1 – 1 Q&A @ TechEd NA please send me an email (robert.hedblom@gmail.com)
with the subject TechEd2014.

fredag 28 februari 2014

Are you attending TechEd? If you are, I would love to see you in the audience for my session “How to deliver BaaS, RaaS and DRaaS in a modern datacenter using System Center 2012 R2”, where John Joyner, a PM from the DPM product group, and I will explain how to use the combined power of the System Center stack, Windows Server and Azure to deliver BaaS (Backup as a Service), RaaS (Restore as a Service) and DRaaS (Disaster Recovery as a Service), with focus on SCDPM, which is the key component for this service delivery.

Issue 1

A 0x80070057 error occurs when a session is closed prematurely. This error is caused by a failure during a consistency check.

Issue 2

Lots of concurrent threads or calls to Microsoft SQL Server from the Data Protection Manager (DPM) console cause slow SQL Server performance. When this issue occurs, the DPM console runs out of connections to SQL Server and may hang or crash.

Issue 3

The DPM console crashes, and an error that resembles the following is logged:

fredag 31 januari 2014

As an IT Pro you probably, every now and then, face the fact that you need to look up something for a deeper understanding. The most common dialogue I have with the community is "why is it so hard to find information that is relevant to me".

This blog post may not solve all of your problems, but it should be considered an invitation to investigate the MVA, or Microsoft Virtual Academy, an online repository with great information regarding the Microsoft product stack.

What is important when you design the services in your data center is to see the whole package of a service delivery. To be able to understand how the service should be built up and designed, you need the knowledge. Microsoft Virtual Academy is a great starting point, and from there you are off on a journey exploring new possibilities that no one ever considered possible.

fredag 24 januari 2014

Recently I have gotten a lot of questions on my blog regarding how to back up workloads across domains in the scenario where you set up a two-way transitive trust. I wrote a blog post about this a while back, and it is also applicable to Windows Server 2012 R2.

fredag 17 januari 2014

I ran into a problem when a customer tried to back up his newly added Exchange mailbox databases. The environment was a primary DPM server backing up a DAG Exchange and a secondary DPM server protecting the DAG offsite.

The customer added some new mailbox databases to the primary DPM server and it worked. When the primary DPM server had synced up the databases, he tried to add them to the secondary DPM server and got error 915.

The operation failed because the data source VSS component 39f96ce3-c40d-4232-8248-2261258e690e is missing.

Check to see that the protected data source is installed properly and the VSS writer service is running.

ID: 915

Details: The operation completed successfully (0x0)

The error occurred because the VSS Writer ID had changed: the VSS Writer ID shown in the error dialog had been replaced with another one.

All the configuration regarding your DPM setup is stored in the DPMDB database; this also goes for your VSS Writer IDs. The tables that store that information are tbl_IM_Datasource and tbl_IM_ProtectedObject.
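
If you want to see what DPM has recorded, you can take a read-only look at the DPMDB from PowerShell. This is just a troubleshooting sketch — the instance name is a placeholder and the wildcard column selection is purely illustrative; modifying these tables is not something you should do without guidance from Microsoft support:

```powershell
# Read-only peek at the datasource table in DPMDB. The server\instance
# name below is a placeholder for your own DPM SQL instance.
Invoke-Sqlcmd -ServerInstance "DPMSRV\MSDPM2012" -Database "DPMDB" `
    -Query "SELECT TOP 10 * FROM dbo.tbl_IM_Datasource"
```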

fredag 10 januari 2014

I got the great pleasure to deliver two sessions at the NIC conference in Oslo. I will deliver a session on Business Continuity and RRR (Rock N' Roll Restore) where you will get a deeper understanding of the concept of building a Business Continuity plan for your data center and also how you can provide a BaaS (Backup-as-a-Service), RaaS (Restore-as-a-Service) and DRaaS (Disaster-Recovery-as-a-Service).

The NIC conference will be held 16th and 17th of January in Oslo.

I would be delighted to meet up with any community members that are attending NIC and have questions. Please send me an email @ robert.hedblom@gmail.com and we will meet up.

tisdag 5 november 2013

The biggest Nordic System Center event is about to launch, and it would be a great pleasure to see you in the crowd. Listen to several MVPs, and also Travis Wright, regarding the news and the great features and functions for modern datacenter management using the System Center family.

The event will take place in both Sweden (Stockholm) and Norway (Oslo) on the 4th and 5th of December.

For
registration and more information please go to the following webpage:

fredag 1 november 2013

The Tape and Library Sharing (TLS) feature is not new in the System Center 2012 R2 release; it has been around for a couple of DPM versions.

The TLS function is enabled when you designate a DPM server as the Library Server. The Library Server controls and manages the shared tape library and coordinates the DPM servers that share the library, called Library Clients. For the DPM administrator to be able to start using the TLS feature, the tape library must be presented using fiber channel technology.

There are some
prerequisites that you must consider before you start configuring the TLS
feature:

Use
only fiber channel as the primary communication media.

The physical tape library must be presented to all DPM servers that should use it.

For the server that should be the Library Server, the media changer / tape library AND the tape drives should be enabled in Device Manager.

For the servers that should be Library Clients, the media changer / tape library MUST be disabled in Device Manager on those servers. The tape drives MUST be enabled in Device Manager.

Configure
SQL Named Pipes Protocol on all SQL servers (will be covered in the blog post).

A
service account for the TLS feature. In this blog post I will call the service
account DPMTLS.

The following part of this blog post covers how you configure the TLS feature so the DPM servers can share the tape library.

On all DPM servers that will use TLS, verify that the Medium Changer devices and Tape drives are present.

On the DPM server that will be the library server (SRV1), the Media Changer devices and Tape drives should be enabled.

On the DPM servers that will be library client computers (SRV2), the Media Changer device should be disabled. Keep the Tape drives enabled, since the library client computers will need access to the tape drives.

The next step is to enable the Named Pipes protocol on the SQL servers hosting the DPMDB, for both client and library servers. Open your SQL Server Configuration Manager, expand SQL Server Network Configuration and click on Protocols for MSDPM2012. Verify that Named Pipes is enabled on the right side of the MMC. If not, enable Named Pipes by right-clicking and choosing Enable. You then need to restart the SQL Server (MSDPM2012) service, which will also restart the SQL Agent (MSDPM2012) and DPM services.
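
Enabling Named Pipes can also be scripted through the SQL Server WMI provider. A sketch, assuming the MSDPM2012 instance name used in this post and that the SMO assemblies from the SQL Server installation are available on the machine:

```powershell
# Load the SQL Server WMI management assembly (part of SMO).
[void][Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement")

# Connect to the local machine's SQL Server WMI provider.
$wmi = New-Object Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer

# Enable the Named Pipes protocol for the MSDPM2012 instance.
$np = $wmi.ServerInstances["MSDPM2012"].ServerProtocols["Np"]
$np.IsEnabled = $true
$np.Alter()

# Restart the instance so the change takes effect; this also restarts
# the dependent SQL Agent and DPM services.
Restart-Service 'MSSQL$MSDPM2012' -Force
```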

The next step is to go to all your client library computers and open an elevated command prompt. Go to the directory “C:\Program Files\Microsoft System Center 2012\DPM\DPM\Setup>” and execute the following command:

AddLibraryServerForDpm.exe -DpmServerWithLibrary SRV1.contoso.local

You
must always provide an FQDN for the DPM server name.

It is very important that you run this executable on all your client library computers. After you have executed the command on all of them, go to the DPM server that will be the library server computer.

Open an elevated command prompt, go to the directory “C:\Program Files\Microsoft System Center 2012\DPM\DPM\Setup>” and execute the following command:

AddLibraryServerForDpm.exe -ShareLibraryWithDpm SRV2.contoso.local

Create a service account called DPMTLS and make it a member of the local Administrators group on all your client library computers and on the library server. When the DPMTLS account is a member of the local Administrators group on all the involved DPM servers, you configure the two SQL services:

SQL Server (MSDPM2012)

SQL Server Agent (MSDPM2012)

By default the two services are configured to use local accounts; you should change this for both services so that they use the DPMTLS account.

This applies to both the library server and the client library computers.
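
Changing the service account can also be done from an elevated prompt. A sketch, where the domain, account name and password are placeholders for your own DPMTLS credentials:

```powershell
# Reconfigure both SQL services to run as the DPMTLS account
# (domain and password below are placeholders).
sc.exe config 'MSSQL$MSDPM2012' obj= 'CONTOSO\DPMTLS' password= 'P@ssw0rd'
sc.exe config 'SQLAgent$MSDPM2012' obj= 'CONTOSO\DPMTLS' password= 'P@ssw0rd'

# Restart the SQL Server service so the new credentials take effect;
# -Force also restarts the dependent SQL Agent and DPM services.
Restart-Service 'MSSQL$MSDPM2012' -Force
```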

When that is done, open an elevated command prompt on every client library computer, go to the directory C:\Program Files\Microsoft System Center 2012\DPM\DPM\Setup and run the executable SetSharedDpmDatabase.exe with the following syntax:

To find out the SqlServer\Instance\DatabaseName path, go to the library server and open the DPM console. Click on the About DPM information button and you will see the path.

Now go back to the elevated command prompt on your client computers and type the following syntax:

SetSharedDpmDatabase -DatabaseName SRV1\MSDPM2012\DPMDB

Keep in mind that it will take some time before the TLS configuration is finished. You may see the SQL Server configuration Successful message for a long time before the Client SQL and Client DPM messages appear. DON'T CLOSE THE COMMAND PROMPT BEFORE ALL THREE MESSAGES HAVE BEEN DISPLAYED.

Now go to your library server and perform a Rescan and a Refresh of your tape library. Go to Management in the console and click on Libraries. Right-click the library and choose Rescan first; wait for the process to finish. Then right-click again and choose Refresh. Repeat this on all your client library computers.