Microsoft announced new capabilities for Advanced Data Governance at the Ignite conference in September 2016. As of April 1, these features have been released into the O365 platform. As with all O365 releases, these are rolling out in waves. If you navigate to the Security and Compliance Center, you will see additional options under Classifications and Data Governance.

There are a significant number of features that have been released, but for the purposes of this discussion, we are going to focus on the specific capabilities surrounding information governance and policy.

Retention Policies and Classification Labels

Microsoft’s capabilities around retention are not new. Both Exchange and SharePoint have featured mechanisms to define how long content items should exist, many of them appearing over 10 years ago. What is new with the recent announcement and update is Microsoft’s unification of retention policies across the Office 365 (O365) service. Within a series of mouse clicks, one can create a retention policy and apply it to Exchange (email and public folders), SharePoint sites, OneDrive for Business, Office 365 Groups and Skype for Business content. This now provides an organization the ability to create and enforce a number of information governance initiatives.
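For administrators who prefer scripting, the same kind of policy can also be sketched with the Security and Compliance Center PowerShell cmdlets. The names, scope and duration below are purely illustrative; verify the location parameters available for your tenant before running anything like this.

```powershell
# Sketch (run from a Security & Compliance Center PowerShell session):
# create a retention policy spanning the major workloads, then add a
# rule that keeps content for seven years. Names and durations are examples.
New-RetentionCompliancePolicy -Name "Corporate Retention" `
    -ExchangeLocation All `
    -SharePointLocation All `
    -OneDriveLocation All

# RetentionDuration is expressed in days (7 years ~ 2555 days).
New-RetentionComplianceRule -Name "Keep 7 Years" `
    -Policy "Corporate Retention" `
    -RetentionDuration 2555 `
    -RetentionComplianceAction Keep
```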

Layered on top of Retention Policies are Classification Labels. Classification Labels allow organizations to apply a specific information governance rule to content manually. Classification Label sets can be published to different locations across the Office 365 platform. The result gives users the ability to easily follow information governance policy without needing deep knowledge or training. Apply a Classification Label of “External Proposal” to a new sales proposal and the user is done.

Alternatively, Classification Labels can be defined and applied automatically and/or by default to a set of locations. Maybe your organization has a mergers and acquisitions group. A Classification Label can be defined and automatically applied to all content created in that group, regardless of where it is stored. The added benefit of a Classification Label over a Retention Policy is that the label will actually manifest on the content, whether a document or an email.

Retention Policies and Classification Labels can be applied ubiquitously to all content in the platform as a global policy. Alternatively, they can be applied to very specific content in specific, targeted locations. This flexibility gives organizations great power to solve many diverse and unique information governance challenges.

Before organizations jump into this pool with both feet, I suggest that they take a deeper look at these capabilities so that they obtain the results that they expect.

Retention and Records

The new Advanced Data Governance (ADG) features provide the ability to perform content retention as well as record declaration. Retention, as implemented in the ADG context, retains content for the specified period of time. This doesn’t mean that the content is inaccessible. Users may continue to work with the content, including editing, sharing and collaborating. Content that is under a retention policy behaves like any other content to the user. What is different is what happens when a content item is deleted: the ADG features manage the preservation of the content so that it is not lost.

Retained content is handled differently than content marked as a “record”. Record declaration is a unique option for a Classification Label and removes the ability to modify the content in any way. This prevents users from modifying the specific item and will change the behavior experienced during the normal course of working.

Deletion and Destruction

The process that occurs when content is no longer retained may be defined as a delete action. The retention policy can be defined to do nothing at the end of retention, or to perform a delete. It is important to note that delete in this case doesn’t mean “delete immediately”. ADG uses the recycle bin to provide a staged deletion process. Therefore, “deleted” content may still exist in the recycle bin for a period of time.

In the future, there are plans to put an approval process in place for the retention policy, which will provide organizations the ability to create more formal processes around information destruction.

Date Calculations

With this release, ADG provides date options for the calculation of the retention period. Standard dates such as Created and Modified are available, as is an option based on when the content was labeled (in the case of Classification Labels).

Currently there is no option to support custom date fields or event-driven retention (also known as retention triggers). Per Microsoft, these will be available soon.

Impressions

After being involved with the preview as a Microsoft ISV and working with several joint partners, we have some initial impressions of these new features.

Microsoft has finally moved to unification of policy management – this is a big plus because we no longer have to manage separate, application-specific policies.

Application of Retention Policies and Classification Labels – definition of the policies and labels is quite easy and operates through a very modern wizard, stepping even the novice through the process.

Timeliness of Processes – in our experience, policy application isn’t immediate, with some applications taking as long as seven days. We are sure that as ADG gains wider adoption and Microsoft gains experience with the service, these times will come down.

Not your traditional records management – some adjustment will be needed for more traditional organizations using legacy records management solutions to adapt to these new concepts. For some organizations, these capabilities will augment more regimented processes to address the ever-increasing proliferation of content and the risks it represents.

Terminology – we can only suggest taking your time to understand what each of these new features does. We found ourselves a little confused more than once by the terminology.

Enterprise Perspective – one of the biggest challenges we faced was understanding the what, when and where of the Retention Policies and Classification Labels. Since each of these is a simple listing of defined policies and labels, it becomes very difficult to track where certain policies are applied and what labels are available. We suggest starting with very broad definitions and picking relevant naming conventions (you can name and describe these any way you want). Microsoft is gathering feedback in this area, so we hope to see good things coming.

What about other sources – as stated previously, Microsoft has made great strides with these capabilities. If your organization has other locations or applications that need to have policy assigned, you will need to look for solutions that extend the platform.

In conclusion, Microsoft deserves credit for listening to its clients and taking this leap forward. For those of us committed to the platform, we are happy to see these capabilities. While these are great additions, we believe organizations will find compliance requirements and information handling processes that require broader and deeper functionality in specific areas.

I was working with my new SharePoint 2013 farm and found an interesting situation that may cause you to reconsider your approach to a SharePoint 2013 environment. Since we produce software for multiple SharePoint configurations and instances, I do a tremendous amount of heterogeneous farm configuration testing. One of our engineers pointed out this little anomaly with SharePoint 2013, and I proceeded to do some further testing. Here is the situation: in a greenfield SharePoint 2013 farm, you can create site collections in either the SharePoint 2013 experience (the default) or a compatibility-mode SharePoint 2010 experience. The screenshot below shows the option on the Site Collection creation page:

Now, the end result is that you get a 2010-looking site in the SharePoint 2013 farm. However, this goes beyond the “experience”. The architecture is such that this new site will leverage configuration information in the 14 hive rather than the 15 hive. With respect to the Content Type Hub configuration, what I found was that a site created in the SharePoint 2013 experience will publish content types to those sites running in the 2013 experience. It will not publish content types to a site created with the 2010 experience. Further testing also shows that you can define a 2010 experience site as the Content Type Hub for your 2013 environment, but it will not publish content types to a 2013 experience site. This has tremendous implications for running a governed SharePoint environment as well as for how you approach your site upgrades for existing well-governed SharePoint 2010 environments. I believe that Microsoft has really taken the process of upgrading enterprise environments seriously and has made huge strides in this area; however, little potholes in the road are always good to know about before moving ahead at full speed.

The process of creating content type based workflows is sometimes daunting for the SharePoint professional because it requires a number of steps and the order of those steps is critical.

In this tutorial, we have a Content Type Hub defined for our farm and a team site in which we want to use a content type hierarchy to manage our content. For a specific hierarchy of content types, we want to have a specific workflow available to process the content. We don’t want to have this workflow apply to other content types in the library, only specific types of content.

This example steps through the manual process of workflow creation using SharePoint Designer and standard SharePoint administrative features. Obviously, the process can be automated or packaged into features to expedite the publication of the workflow to multiple sites.

Let’s start with our assumptions:

the Content Type hierarchy has been defined in our Hub

the target team site has been created

The Content Type hierarchy that we have created is as follows:

Document

Enterprise Document

Administration

Equipment Operation Manuals and Specifications

For this exercise, we will be associating our workflow with the Administration content type and subsequently applying the workflow to the child content type, Equipment Operation Manuals and Specifications.

Step 1: Create workflow at the Content Type Hub

This will be a reusable workflow that is associated to our parent content type (because we want to work with specific columns of that content type).

Assign Workflow to highest level content type

When creating the workflow, ensure that you have selected the content type to associate to the workflow. If you don’t select a content type, the specific columns will not be available to the workflow.

Step 2: Define the content of the workflow

In our sample workflow, we are only going to manipulate specific column values. I would anticipate that your workflow will be much more complex than this.

Save and Publish the workflow

The workflow must be saved and published. This will make the workflow available in our Content Type Hub.

Step 3: Export the Workflow to install into our target team site

Because the Content Type Hub only publishes the workflow association and not the workflow itself, we must package and install the workflow into our team site. This involves saving the workflow as a portable template (in the form of a WSP).

The Workflow is saved in the Site Assets library in the content type hub (where we created the content type)

Navigate to the Site Assets library and download a copy of the workflow

Save the WSP to a local location on the drive

Step 4: Install the workflow in the target site

Navigate to the site where the workflow is to be used.

Go to Site Settings > Solutions

Upload the solution to the site

Once uploaded, select Activate

Navigate to site settings (in the target site)

Go to the Site Actions area and select Manage Site Features

The Activated workflow solution will now appear as a site feature. Activate this feature

Step 5: Associate the Workflow to the Content Type Parent

Go back to the Content Type Hub and navigate to the content type that you want to associate to the workflow

Select Workflow Settings

We are going to add the new workflow to the Parent Content Type, Administration

Configure the options that you want for the workflow.

IMPORTANT: Make sure that you select “Yes” for the “Add this workflow to all content types that inherit from this content type?”

In the Workflow Admin screen, select “Update all content types that inherit from this type with these workflow settings”

Step 6: Publish the updated content type

Now we need to publish the content type and the changes back out to our subscribing sites

Navigate back to the content type (parent)

Select Manage Publishing

If the content type has been previously published, you will need to republish it

Step 7: Run the Publishing Timer Jobs

Once Publish/Republish has been selected, go to Central Administration and run the Content Type Hub and Content Type Subscriber timer jobs
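If you would rather not wait on the job schedule, the two timer jobs can also be kicked off from the SharePoint Management Shell. The internal job names below are the ones I have seen in SharePoint 2010/2013 farms; verify them in your own farm with Get-SPTimerJob before relying on this sketch.

```powershell
# Sketch: run the Content Type Hub job, then the per-web-application
# Content Type Subscriber jobs, instead of waiting for the schedule.
Get-SPTimerJob |
    Where-Object { $_.Name -eq "MetadataHubTimerJob" } |
    Start-SPTimerJob

Get-SPTimerJob |
    Where-Object { $_.Name -like "MetadataSubscriberTimerJob*" } |
    Start-SPTimerJob
```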

Step 8: Validate the Target Site Configuration

Once the publishing jobs have completed, navigate to your target site and assign the content types to a library.

Once you import content and assign it to one of your content types, you will then be able to use the workflows

Select an item and choose Workflows

You will see the workflow assigned to the content in the target library

Hopefully you have found this post useful and can easily see where you can apply this process to your environment and applications.

I constantly work with multiple notebooks, tablets and desktops. Since the first introduction of the pen-based convertible tablets over 10 years ago, I have been a convert.

As a result, I was really intrigued in June, when Microsoft announced the Surface. At the time, it seemed to have everything I like about the iPad and a tablet combined into one, simple, elegant package. To top it off, a built-in keyboard and kickstand.

Immediately when the “pre-order” button went live on the Microsoft site, I placed an order for my new Surface RT. I went into the purchase knowing all of the documented limitations of the RT platform, but the form-factor, Windows 8 and Office apps overcame those negatives.

I received my Surface on the announced day – about 2 weeks ago. Since then, I have completely moved off my iPad and have started using the Surface as my primary “task oriented” workstation. When I say “task oriented”, I mean the usual: emails, attachments, document creation/editing and presentation development. Of course, I can’t run my stable of virtual machines for demonstration and development. More on that later.

The first thing that I will say about my experience so far…WOW. I don’t have any significant issues with the Windows 8 RT platform so far. Speed, responsiveness and overall performance are all suitable for daily work tasks.

I’m still getting used to the Mail application, and I do miss Outlook. But the built-in mail application suffices and is only occasionally a small inconvenience. A big miss is the inability to support opening PST files – so archiving email in Outlook and expecting to open it in the Mail app was an incorrect assumption. Hopefully Microsoft will improve the Mail integration capabilities with Outlook.

Some key apps that I have been using consistently that may help many of you:

Remote Desktop – this app does exactly what you expect. I can remote into all of my development machines directly from the Surface.

I recently did a webinar in conjunction with Earley & Associates about creating a Records and Information Management Strategy within your organization. The presentation is divided into two parts, a business perspective and a technical perspective.

I specifically address the foundational components of a strategy within your organization and how to get started. During the second half, I cover the elements within SharePoint 2010 that can be leveraged as part of your strategy.


I’d like to thank Earley & Associates for hosting the webinar and hope you enjoy viewing the replay.

I have been asked several times this week where the SharePoint 2010 content database files and log files are stored by default. Well, it’s sort of hard to answer, as everyone seems to install SQL Server slightly differently, and the location of these files depends on how you approach the installation of SQL Server.

If you “click through” the installation, not really paying any attention and take the defaults, then your database files and log files are going to be created and stored in the default location. This typically is C:\Program Files\Microsoft SQL Server\MSSQL10.MSSQLSERVER\MSSQL\DATA.

If we create our databases in SQL Server Management Studio, we can specify the location of the MDF files and LOG files. However, we will have to go through some manipulation to use a pre-created content database in Central Administration – probably not the most streamlined approach. However, the pre-creation of content databases may be the optimal solution if your organization has DBAs that don’t want the SharePoint Administrators creating databases all over the place.
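If your DBAs do pre-create the content databases, one way to wire them up to SharePoint (a sketch, with hypothetical server and database names) is the Mount-SPContentDatabase cmdlet rather than Central Administration:

```powershell
# Sketch: attach a DBA pre-created (empty) content database to a web
# application; SharePoint provisions its schema inside it on mount.
# Server, database and URL names below are examples.
Mount-SPContentDatabase -Name "WSS_Content_Team1" `
    -DatabaseServer "SQL01" `
    -WebApplication "http://sharepoint.contoso.com"
```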

So, here is a quick tip and modification that you can give to your DBA so that the data and log files go in the location they want, and you don’t have to go through downtime or other configuration steps to have the default Central Administration Create Site Collection process do what you expect it to do.

The simple SQL script below can be executed to change the default location for MDF and LOG files:
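The original script was shown as an image; the sketch below reconstructs the standard approach, which writes the instance-level default locations to the registry. The drive paths are examples only, so substitute the locations your DBA prefers.

```sql
-- Sketch: set the instance default data and log file locations.
-- New databases (including SharePoint content databases created from
-- Central Administration) will then be placed in these paths.
USE master;
GO
EXEC xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultData', REG_SZ, N'E:\SQLData';
GO
EXEC xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultLog', REG_SZ, N'F:\SQLLogs';
GO
```

Note that the SQL Server service must be restarted before the new defaults take effect.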

Windows 7’s libraries are a really convenient tool for quickly accessing your data (or putting data from multiple locations into one window), but they don’t let you add removable flash drives or SD cards. Through some fancy searching and testing (playing around), I figured out how to leverage removable storage within a library.

Go to Libraries and right click on Libraries, select New/Library and give it a name—Removable Drives for example.

Right click it and hit Properties. There’s the magic button “Include a Folder”.

Unfortunately, it doesn’t work: it tells you that you can’t add removable media. Removable media can, in fact, be used – according to Microsoft,

“Only if the device appears in the navigation pane, under Computer, in the Hard Disk Drives section. This is set by the device manufacturer, and in some cases, it can be changed. Contact your manufacturer for more information.”

The way to do this is to go to a known location on your hard drive—C:\ perhaps—and create a folder called Picture Card, or whatever.

Start the Computer Management program (Right click on Computer -> select Manage) and click on “Disk Management” in the left-hand sidebar.

Right click your removable drive and select “Change Drive Letter and Paths”. You will first need to remove the drive letter assigned.

Click Add and select “Mount in the following empty NTFS folder”. Choose the folder you made earlier (e.g., C:\Picture Card). Now if you look in the C:\Picture Card folder you’ll see the contents of your SD card. Now, if you right click your new Library you made earlier and add C:\Picture Card to it, it’ll work!

For each removable resource you want to index or search, create a new folder and go to Computer Management to enable it to be added to a Library.

Note, of course, that the resource needs to be available to Windows for you to see the files. If you put the picture card back in the camera, Windows won’t be able to display the contents. The previously indexed contents, however, will remain available the next time you insert the drive.

If SQL Server ever crashes or is hard booted, you may come across a corrupted SharePoint_Config database. Recently, while doing SharePoint 2010 configuration of our new product suite on my VM, I had to hard reboot the VM. When the VM came back up, I received the dreaded “cannot connect to the configuration database” error. This occurs while accessing any of the content web applications.

I’ve seen this condition before, so I tried the usual debugging avenue – IIS web site availability, IIS application pool availability, IIS application pool identity, and SQL Server availability (see http://support.microsoft.com/kb/823287). After walking through these steps, I noticed that the “SharePoint_Config” database was in suspect mode. (Notice the order of the items in my debugging avenue… yes, SQL Server was last.)

As it turned out, it’s not a SharePoint issue. Any SQL Server database can become corrupted and get into suspect mode. The following steps will fix the suspect mode issue. As usual, I don’t take credit for this solution, but only post it as a reference; the credit goes to the original sources.

-- Use the master database
USE master

-- Verify that the database has issues
EXEC sp_resetstatus 'SharePoint_Config'

-- Put the database in emergency mode
ALTER DATABASE SharePoint_Config SET EMERGENCY
DBCC CHECKDB('SharePoint_Config')

-- Set the database in single-user mode
ALTER DATABASE SharePoint_Config SET SINGLE_USER WITH ROLLBACK IMMEDIATE

-- Repair the database, allowing data loss
DBCC CHECKDB('SharePoint_Config', REPAIR_ALLOW_DATA_LOSS)

-- Set the database back to multi-user mode
ALTER DATABASE SharePoint_Config SET MULTI_USER

-- Verify that the database is reset
EXEC sp_resetstatus 'SharePoint_Config'

To fix a SQL Server database in suspect mode, we use SQL Server’s emergency mode, which allows you to repair the database to its last consistent state.

After running the script on the Master database, SharePoint_Config database suspect mode was fixed and I was able to access the content web application.

I recently delivered a webinar with Idera Software on Information Policies and Compliance with SharePoint 2010. This is a very basic introduction of the component parts of the policy and compliance puzzle.

The overview of the webinar was published as “Information accumulation can be a big issue within such a collaborative platform as SharePoint 2010. There are techniques and best practices to control how information is kept and enumerated as valuable to the organization. During this session, participants will gain valuable insight as to how to assign information policies to Work In Progress or Draft information and the application of Record Retention to critical documents within their SharePoint environment. The topics covered will include SharePoint 2010 Information Management Policies, Content Types, the Record Center, location-based policies and In Place Records management.”

I have recently been working with a number of customers to establish information governance in their SharePoint 2010 environments. The question that always comes up is “how do we get content off the network shares and into the SharePoint environment?”

There are a number of commercial tools that will analyze, cleanup and migrate network share content to SharePoint. These tools come from vendors such as StoredIQ, Active Navigation, EMC Kazeon and Metalogix.

If you are going to be migrating file shares to SharePoint, regardless of the tool being used, there is a basic process that I think you should be taking. This process is broad, but could help you get started thinking about your environment. You can see the basic steps below.

You may look at that process and be thinking that it doesn’t really say anything, nor does it really help you.

As I said, there are many tools out on the market today to help in this endeavor. These tools can help you discover all the files you are looking to migrate; they can also help you filter out files by date, specific strings, or other information that you may not want to migrate to SharePoint.

However! There are a couple tools that you can use right now that can help. They are PowerShell and Microsoft Excel! Yes, PowerShell and Excel!

Take the example of a network file share you want to migrate into SharePoint. The share is N:\team1. In the process above, the first step is to discover what files are in the file share, how old they are, what types there are, etc. Of course, you could go through and manually look at them one by one. Or you could ask the person responsible for the file share to do that. Or you could use PowerShell to iterate through and output the results to a CSV file, simply by running a command like this:
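The original command was shown as a screenshot; a minimal sketch of it (the share path and output file are examples) looks like this:

```powershell
# Sketch: enumerate every file under the share and export name, extension,
# size and dates to a CSV for sorting and filtering in Excel.
Get-ChildItem -Path "N:\team1" -Recurse |
    Where-Object { -not $_.PSIsContainer } |
    Select-Object FullName, Extension, Length, CreationTime, LastWriteTime |
    Export-Csv -Path "C:\export.csv" -NoTypeInformation
```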

At this point, congratulate yourself: you have discovery. Now you can run these PowerShell commands and use Excel to sort and filter.

Now, let’s go to the next step: Cleanse. We want to remove what is often termed ROT (Redundant, Obsolete and Trash or Trivial) files. Once again, there are many products that make this part of the migration or import process.

However, you can do some cleaning with PowerShell as well. When you come up with a list of files you want to get rid of, such as all the MSI files in the \\Ndriveserver\team1\Marketing directory, you can use Remove-Item to get rid of them.
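A sketch of that cleanup (the share path is the example from above) might look like this; note the -WhatIf switch, which previews the deletions without performing them:

```powershell
# Sketch: remove all MSI files under the Marketing folder.
# Drop -WhatIf only once you are satisfied with the preview.
Get-ChildItem -Path "\\Ndriveserver\team1\Marketing" -Recurse -Include *.msi |
    Remove-Item -WhatIf
```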

The -WhatIf parameter will output what the command would have done, without actually doing it.

Like the shot below, if I were to delete all the PPTX files:

Running the command with the -WhatIf parameter will help you make sure you are aware of exactly what the command would delete.

So now we can identify file types, and that may be useful in the case of log files or obscure application output files; however, that is typically not all of the criteria that you would practically use. What if you want to enforce a policy of removing OLD files? For example, what if you wanted to remove anything that has not been modified in the past year? With the capabilities of PowerShell you can do that!
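An age-based sweep can be sketched like this (the share path and one-year cutoff are examples); pipe the result into Remove-Item with -WhatIf once you have reviewed the list:

```powershell
# Sketch: list files whose last modification is more than a year old.
$cutoff = (Get-Date).AddYears(-1)
Get-ChildItem -Path "N:\team1" -Recurse |
    Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
    Select-Object FullName, LastWriteTime
```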

Finally, you do have another option that can take advantage of that CSV file you are creating. Once you create that export file (CSV) you can delete all the entries of files you wish to keep, thus leaving in all the ones you want to delete. Then use the Import-csv PowerShell command and use that as the input to the remove-item command as follows:

Import-Csv C:\export.csv | ForEach-Object { Remove-Item $_.FullName }

The result is that any file listed in that CSV goes away. This is one where you definitely want to be careful and ensure that you are only listing files you really want to get rid of! Remember -WhatIf!

Hopefully, I’ve given you some insight into how to Discover and Cleanse your network file shares in preparation for migrating them to your SharePoint environment!