Latest Posts

I’ve been finding myself in the Exchange 2013 world for the last few months, helping with some administration and updates. As a result I stumbled across an unknown (to me), yet cool Exchange API. Naturally I couldn’t resist trying it out in PowerShell. For those who just want a working script, here it is. This script will return the 10 most recent items.

# This requires the Exchange Web Services Managed API to be installed on the computer where this script is being run

The Exchange Web Services (EWS) Managed API appears to have come onto the scene in early 2009 (where was I?) with its Beta 1.0 release. Fast forward to today: it’s in its 2.2 release and is applicable to Exchange 2007 SP1 and up, including Office 365. This API can be used to work with e-mail messages, calendar, task, and contact information. Translation: it’s so much easier for a non-developer (like me!) to harness Exchange resources without dealing with the underlying SOAP interface of EWS. I mean, how cool is it that Microsoft took the time to make a wrapper for EWS?

The construction was super simple, and is executed like this:

1.) Create the client object which looks like this

2.) Set the URL by passing your email address to the “AutodiscoverUrl()” method. This will go out to your Exchange environment and fill the object with the correct “Exchange.asmx” web service URL. This might be something like “https://mail.contoso.com/EWS/Exchange.asmx”

3.) Create a variable for the inbox folder

4.) Create a variable for mail items, filling it with a search from the inbox using the “FindItems()” method. Incidentally, this is where you might decide to do other things, like use the “MarkAllItemsAsRead()” method to take care of those pesky unread items, should you desire.

5.) Loop through each of the mail items so they are loaded into PowerShell using the “Load()” method

6.) Display them, selecting whatever fields you deem important. For the sake of this demonstration, it’s “Sender,Subject,Body”
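The six steps above can be sketched in PowerShell like this. This is a minimal reconstruction, not the author's exact script; it assumes the EWS Managed API 2.2 is installed at its default path, and the email address is a placeholder:

```powershell
# A sketch of the six steps described above. The DLL path and the
# email address are assumptions -- adjust them for your environment.
Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll"

# 1) Create the client object
$service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService

# 2) Autodiscover the EWS endpoint from the mailbox address
$service.AutodiscoverUrl("user@contoso.com")

# 3) Bind a variable to the Inbox folder
$inbox = [Microsoft.Exchange.WebServices.Data.Folder]::Bind(
    $service, [Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)

# 4) Find the 10 most recent items
$view = New-Object Microsoft.Exchange.WebServices.Data.ItemView(10)
$mailItems = $inbox.FindItems($view)

# 5) Load each item so its full property set is available in PowerShell
$mailItems | ForEach-Object { $_.Load() }

# 6) Display the fields we care about
$mailItems | Select-Object Sender, Subject, Body
```

Note that `AutodiscoverUrl()` may require a redirection-validation callback in environments that redirect autodiscover requests; the single-argument form shown here works in the simple case.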

All this to say, I really am wildly excited about being able to use this as an alternative way to send e-mail from PowerShell scripts, where I currently use the .NET method. Not to mention it demonstrates how powerful and friendly PowerShell really is. Past that, I can think of applications within SharePoint: synchronizing calendar and task items from a mailbox without a third-party program. I’m not saying this is the end-all-be-all by any means, but it’s a nice new egg of putty I can make immediate use of to fill some gaps until a better solution can be designed and implemented.

It certainly made me think… “Hmm… this might work” for this, that, and the other thing too!

I have been running Windows 10 since the preview was released. My experience, so far, has been positive. It has been a very smooth transition. As someone using a machine without a touchscreen, I would say it is a lot more functional than Windows 8/8.1. I still find the full-screen apps lacking, but the operating system is smoother overall, and I enjoy using it.

I own, and generally enjoy using, a MacBook Pro 15. My only complaint on the MacBook is the lack of physical Home, End, Insert, and Delete keys. I have run Windows of various versions on it in the past using Parallels to great success.

Windows 10 has been happily living on my PC laptop, but I wanted to consolidate it, if I could, onto the MacBook Pro, so I could carry one thing. The resolution on the MacBook Pro is much higher as well and makes some of my work much easier. I am typing this on my Windows 10 VM that was restored onto Parallels on the MacBook Pro, and it's been working like a champ for a few days. I thought I would document the steps I took to move from the physical machine to this virtual machine in case anyone else needs them, as it was not that complicated once I located the information I needed.

Make a system image backup of Windows 10

Restore the system image backup of Windows 10 onto Parallels VM

That is the high level, now I'm going to go through the steps I took to get there.

Make a system image backup of Windows 10

I grabbed a USB drive, and plugged it in where it was recognized as F:

Ran the following command

wbAdmin start backup -backupTarget:F: -include:C: -allCritical -quiet

Note that you will need to replace F: with whatever drive you want to use.

Restore the system image backup of Windows 10 onto Parallels VM

Note: This was, for me, a little trickier because my VM didn't seem to see my USB drive no matter how I mounted it to the machine, so I used my NAS instead.

Connected to a share on my NAS from my PC

Copied the folder with the backup from my USB drive to the share.

On the Mac, opened Parallels and selected my Windows 10 ISO

Let Parallels do the entire install (I happened to be going somewhere for a while, so this just completed while I was gone; letting it finish isn't really required)

Held SHIFT and clicked restart to get into the recovery console

Went to repair, advanced, system image recovery (path may be a little different, but as long as you get to system image recovery, it's ok)

Stepped through wizard and mapped drive to NAS share

Picked backup and restored

Restarts happened after that as needed.

Once Windows was started, took a snapshot and began cleaning

Installed Parallels tools and removed the driver software for my hardware I had installed

Took another snapshot when all was stable and started working.

One thing to note is that the Parallels tools need quite a few reboots. So if you are rebooting over and over again manually, you may think something is wrong, but it is probably fine. I wasn't being super observant, but I think it was 4 or 5 reboots that all looked like they were doing the same thing when they started.

This was my first time using the system image recovery tool, and I have to say, compared to Time Machine, it's a bit ridiculous and convoluted. It *did* work, though, once I found the commands and figured out what to do. By contrast, when I got my new Mac, I just pointed it at my Time Machine location (which setup found automatically for me) and pushed a button. I imagine an OS X VM would work the same way. Apple! Why don't you have HOME, END, INSERT, and DELETE keys!!! Then you would be perfect!

I recently had a need to mash-up some data from a SharePoint 2007 list in an Excel document I was working on. I already knew that I could work with SharePoint 2007 data in Excel by using the following instructions from Microsoft:

Do one of the following on a SharePoint site:

Windows SharePoint Services 3.0

If your list is not already open, click its name on the Quick Launch. If the name of your list doesn't appear, click View All Site Content, and then click the name of your list.

On the Actions menu, click Export to Spreadsheet.

If you are prompted to confirm the operation, click OK.

Windows SharePoint Services 2.0

If your list is not already open, click Documents and Lists, and then click the name of your list.

On the page that displays the list, under Actions, click Export to spreadsheet.

In the File Download dialog box, click Open.

If you are prompted whether to enable data connections on your computer, click Enable if you believe the connection to the data on the SharePoint site is safe to enable.

Do one of the following:

If no workbook is open, Excel creates a new blank workbook and inserts the data as a table on a new worksheet.

If a workbook is open, do the following in the Import Data dialog box that appears:

Under Select how you want to view this data in your workbook, click Table, PivotTable Report, or PivotChart and PivotTable Report.

Under Where do you want to put the data, click Existing worksheet, New worksheet, or New workbook.

If you click Existing worksheet, click the cell where you want to place the upper-left corner of the list.

That’s all very well, but in my case I had an existing spreadsheet with quite a few tables and an existing Excel model. I didn’t want to go through these gyrations. Isn’t there some way to just get the data right off the list as a web page?

Yes. Enter the “Import XML data” feature in Excel. MS offers the following guidance for getting XML data from a web service:

Import XML data from a Web service

To do the following procedure, you must have access to a server that is running Windows SharePoint Services. A default installation of Windows SharePoint Services provides a data retrieval service for connecting to data in SharePoint lists. A SharePoint site administrator can install the Microsoft Office Web Parts and Components to add additional data retrieval services for Microsoft SQL Server and Microsoft Business Solutions. The installation program for Microsoft Office Web Parts and Components is available from the Downloads on Microsoft Office Online.

On the Data menu, point to Import External Data, and then click Import Data.

The contents of the XML data file are imported into an XML list in the existing worksheet at the specified cell location.

XML list in new worksheet

The contents of the file are imported into an XML list in a new worksheet starting at cell A1.

If the XML data file does not refer to a schema, then Excel will infer the schema from the XML data file.

To control the behavior of XML data, such as data binding, format, and layout, click Properties which displays the XML Map properties dialog box. For example, existing data in a mapped range will be overwritten when you import data by default, but you can change this.

You have to replace {0} with your site URL, {1} with the GUID of the list ID on SharePoint, and {2} with the view GUID. How do you get these? Well, the easiest way (I think) is to go to your list, then select the view you want. Click the view dropdown and select “Modify this view.” This will open the screen where you can modify that view. The URL in the browser should have a View and a List value. I stripped off the %7B and %7D, as those are the URL encodings of { and } respectively. You don’t need to use a view and can omit that completely if you want to connect directly to the list, but I found I normally wanted to get just a certain view, or I wanted to create a special view just for this activity.
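For reference, the {0}/{1}/{2} placeholders being described match the pattern of SharePoint's built-in owssvr.dll data retrieval service. The following is my reconstruction of that pattern, not the original snippet, so verify it against your own environment:

```
https://{0}/_vti_bin/owssvr.dll?Cmd=Display&List={1}&View={2}&XMLDATA=TRUE
```

Here {0} is the site URL (e.g. sharepoint.contoso.com/sites/team), {1} is the list GUID, and {2} is the optional view GUID.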

Regardless, you can now follow the auto prompts to get the data into a spreadsheet. You can also open the developer tab and dive into the XML itself (see the MS link above for more instructions) and drag and drop the fields to other locations. Now when you get the data it will put it wherever you bound it.

By default it will dump the data into a table and will include the schema items, which you don’t need. You can open the Developer tab, select “Source” in the XML section, and then just deselect the ns2:schema element. It will stop syncing those fields, and you’ll have to clean up your table to get rid of them. I was able to rename the columns, etc., as needed, but since this is a read-only feed I was typically using it to calculate other columns, so I frequently stuck with the ows_* columns that are the defaults for the data element.

Screenshot of the XML section from Developer tab in Excel:

Once I figured out I needed the XML import features of Excel, I was able to find lots of articles online to help. The following was my favorite, and it links to a lot of other ones.

I realize both are rather self-explanatory: the first creates a self-signed certificate, whereas the other provides directory-like interaction with (and access to) the certificate stores within PowerShell, essentially making the need to spin up ye old MMC certificate console a moot point.

Suppose you’re tasked with building a functional lab {insert Microsoft software title here} environment. Naturally you want to automate as much as possible, yet those pesky certificates cause you to break open IIS to create a self-signed certificate. Sure, it’s only an extra manual step or two, but my take is: why do it manually when automation isn’t but half the effort more?

That said, what if the requirements are for a SharePoint 2013 lab with a functioning App Model? The App Model brings with it a requirement for wildcard certificates. Now, I could be missing something, but my testing within IIS 8 didn’t allow for specifying the FQDN (CN).

I guess at this point one could consider a few options. First, one might be inclined to stand up a lab PKI (or leverage an existing one). Of course, a simpler but costlier route would be to use public certificates.

If time and money are constraints, then our friendly neighborhood PowerShell cmdlet and CERT provider can quickly help us out. After all, New-SelfSignedCertificate will let us specify our DNS name, or DNS names. Yes, that was the plural of names, as in more than one. And since we are talking DNS names, we only find ourselves limited by what is defined in DNS or the server HOSTS file (none of us do that, right?).

So take this snippet and incorporate it into your automated lab builds, or conversely, offer your own opinion.
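A minimal sketch of the kind of snippet described: one self-signed certificate covering a wildcard plus named hosts, created and then located via the Cert: provider. The DNS names here are illustrative assumptions; substitute your own lab names:

```powershell
# Create one self-signed cert with multiple DNS names (SANs),
# including a wildcard for the SharePoint App Model domain.
# The DNS names below are placeholders -- adjust for your lab.
$cert = New-SelfSignedCertificate `
    -DnsName "*.apps.contoso.lab", "sharepoint.contoso.lab", "contoso.lab" `
    -CertStoreLocation "Cert:\LocalMachine\My"

# The Cert: provider lets us browse the store like a directory
Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Thumbprint -eq $cert.Thumbprint }
```

From there the certificate can be bound to an IIS site or exported for other servers in the automated build.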

I have a set of documents I need to review regularly. They are *mostly* form data in Word, so I wanted to write a simple script to extract the data I need from all of the documents and put it into a table. I wasn't sure how to do this, so I wrote a little script to iterate over the data in a document. The documents I review have several versions, different form fields, and other irregularities. I wanted to save the little script I wrote that uses various methods to iterate through a document. I use LINQPad regularly, so I wrote this in there, in C#, as a method for opening and handling the Word document and an extension for iterating over it.

I think the standard plan would be great for most people: reasonably priced whether you need Office or not. I think using MS Office 365 would be a better deal for most, but it's certainly nice to get turnkey Office and anti-virus with no work. I'm surprised MS doesn't offer something like this yet, really. I wish there were a save-password option in the client to make using complex passwords easier, but this could be solved by a password reset policy, I suppose.

My Profile:

My computer time is split between Communication and Content Creation about 70/30. Communication is via the usual Email, Chat, and Screen Sharing. Most content creation is in MS Office or some type of coding application. The most important thing for me when utilizing a VDI platform is that it is fairly responsive when I'm working in multiple applications and need to alt-tab back and forth. VDI doesn't exactly shine in this capacity when you are working remotely, but I feel like I can get a good sense of a worst case scenario for typical users when I am testing.

Verdict:

Good, but I am not going to replace my standard solution (local use of Parallels on my MacBook Pro) with this, yet.

I’m throwing this script out there for any developers or admins who are seeking a quick SharePoint focused script for user account creation.

In my environment we’re working towards an automated, unattended install of SharePoint, including account creation. AutoSPInstaller is cool, but it seems too complicated for me. Put another way: if I’m going to spend time learning, I’m choosing to learn PowerShell and SharePoint in more detail.

With that said, here is my script. It could easily import from a CSV (or other file), read a SharePoint list, or use any other means of input. For the purpose of an example, an array is used.

#Make use of an array, just for example. This could easily be a CSV, but since this was for dev, an array was easier

#In the case of a CSV, the column order would be "SamAccountName,FName,LName,Description,Password"

IISReset /noforce

}
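The loop at the heart of the script can be sketched like this. It is a minimal reconstruction of the approach described, not the original script; the account values and the use of New-ADUser are illustrative assumptions:

```powershell
# A sketch of the array-driven account creation described above.
# Account names, passwords, and attributes are placeholders.
Import-Module ActiveDirectory

# In the CSV case the columns would be SamAccountName,FName,LName,Description,Password
$accounts = @(
    @{ SamAccountName = 'spFarm';    FName = 'SP'; LName = 'Farm';    Description = 'SharePoint farm account';    Password = 'P@ssw0rd1' },
    @{ SamAccountName = 'spService'; FName = 'SP'; LName = 'Service'; Description = 'SharePoint service account'; Password = 'P@ssw0rd2' }
)

foreach ($a in $accounts) {
    New-ADUser -Name "$($a.FName) $($a.LName)" `
        -SamAccountName $a.SamAccountName `
        -GivenName $a.FName `
        -Surname $a.LName `
        -Description $a.Description `
        -AccountPassword (ConvertTo-SecureString $a.Password -AsPlainText -Force) `
        -Enabled $true
}
```

Swapping the `$accounts` array for `Import-Csv` is a one-line change, which is why the array form works well for dev.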

Today I had to fall in line and do something I’m not entirely proud of: create a script to replicate files from a file share to SharePoint. Struggling to find value in the effort, I figured a blog post to remind me of this event was fitting. The script was rather basic (a good thing), and I took the opportunity to grow, exploring the world of PowerShell and SharePoint interaction outside of the SharePoint PS snap-in.

Surprisingly, it was a bit easier than I had imagined. While I wish I could take 100% of the credit, the inspiration for this function comes from this article.

1) Get the .NET provider for SQLite from sourceforge.net.

2) Add a reference to System.Data.SQLite to your project.

3) Make sure the reference is marked to be copied locally.

Here we are, a cold, crisp, 20-degree Wednesday in November. I thought to myself… this is not cool (no pun intended), but you know what is cool? Yeah, I’m sure you guessed: PowerShell’s ActiveDirectory module.

Just a quick blog note to show how PowerShell quickly settled a dispute during an upgrade of our AD schema to handle a Windows 2012 DC. Of course this wasn’t a big dispute; many other tools could have been used. The question was: had the schema already been extended to support a 2012 server? Again, there are many tools that could provide the answer, but what made this so cool was being able to share the experience with others who didn’t know PowerShell could replace some of the old stand-by AD tools. So this is more of an ah-ha moment that felt right to share (along with the script), all brought to us by PowerShell and the ActiveDirectory module.

(An academic honesty note here: this script is not 100% my own work. More like 5% to 10% is my work; I can’t remember where I snagged the meat of this script, so the credit remains unknown.)

#This script will query AD for the schema version of AD, Exchange, and Lync. Can be run as a least-privileged user.
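The AD portion of such a query can be sketched in a few lines. This is a minimal version covering only the AD schema (the Exchange and Lync versions live in other attributes); the version-to-release mapping shown is the standard published one:

```powershell
# Read the AD schema version from the objectVersion attribute of the
# Schema naming context. No elevated rights are needed to read it.
Import-Module ActiveDirectory

$schemaPath = (Get-ADRootDSE).schemaNamingContext
$adSchema   = (Get-ADObject $schemaPath -Properties objectVersion).objectVersion

# Known values: 44 = 2008, 47 = 2008 R2, 56 = 2012, 69 = 2012 R2
"AD schema version: $adSchema"
```

If the value comes back as 56 or higher, the schema has already been extended for Windows Server 2012, which was exactly the dispute in question.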

The other day, while reviewing an Exchange 2010 environment, I noticed a few active mailboxes belonging to disabled users. For obvious reasons this isn’t a good thing; if nothing else, it clutters up the Exchange Address Book.

The next thought in my mind: so what’s the best way to hide these disabled users? Having the PowerShell bias that I do, I had to spend 15 minutes reviewing the options.

Use a manual process. This would include disabling the user in AD, followed by the steps described here.

Use Exchange Address Book Policies (ABPs). As indicated in this article, ABPs have a dependency on Exchange 2010 SP2. That said, it seems like a viable and interesting approach.

Use PowerShell. As I stated from the outset, I’m biased right now, so a PowerShell-only approach seems “more better”.

Here is the script I used in a resource/user forest environment. Keep in mind this is a down-and-dirty version, a proof of concept. I would limit the use of this example to inspiration only (good or bad).

#This script will query for all LinkedMailboxes when run on an Exchange Server

#It will return a user set who show their Linked Master Accounts as disabled

#Use the results with "Set-Mailbox -HiddenFromAddressListsEnabled $true" to change

#all of the disabled users to hidden from the address book. Example Below
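The steps in the comments above can be sketched as a single pipeline. This is a hedged reconstruction, not the original script; it assumes it is run from the Exchange Management Shell with the ActiveDirectory module available, and that the linked master accounts resolve in the reachable forest:

```powershell
# Find linked mailboxes whose linked master account is disabled,
# then hide them from the address book. LinkedMasterAccount is in
# DOMAIN\user form, so the domain prefix is stripped before lookup.
Import-Module ActiveDirectory

Get-Mailbox -RecipientTypeDetails LinkedMailbox -ResultSize Unlimited |
    Where-Object {
        $_.LinkedMasterAccount -and
        -not (Get-ADUser -Identity ($_.LinkedMasterAccount -replace '^.+\\', '')).Enabled
    } |
    Set-Mailbox -HiddenFromAddressListsEnabled $true
```

In a true resource/user forest setup, point `Get-ADUser` at the account forest with the `-Server` parameter.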

select so.name as TableName, o.list as IdentityColumnName
from sysobjects so
cross apply
(select
column_name
+
case when exists (
select id from syscolumns
where object_name(id) = so.name
and name = column_name
and columnproperty(id, name, 'IsIdentity') = 1
) then
''
end + ' '
from information_schema.columns where table_name = so.name
) o (list)
where so.xtype = 'U'
AND o.list is not null
AND so.name NOT IN ('dtproperties')

If you have a connection to a database through Entity Framework and you need to switch it to another database (with the exact same structure), you just need to set Connection.ConnectionString (as seen below). I had an application where we created a copy of the master database when setting up a new client, so using Entity Framework I switched from the master database to the client's, depending on what the admin was doing.

In my continued automation efforts, I was looking to convert documentation provided by a consultant into something more… well… automated. In this first of four parts, I’ll walk through creating a Result Source with PowerShell.

Subsequent posts (parts 2 through 4) will give examples of creating Result Types, Query Rules, and Search Navigation. The aim of this effort is to use PowerShell to rebuild a search service application, essentially cloning it without its data. This is useful when Microsoft support gives the classic solution of “rebuild your service application.” Doh!
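The Result Source creation can be sketched with the SharePoint 2013 search object model. This is a minimal illustration, not the full script from this series; the source name and query transform are placeholder assumptions:

```powershell
# Create a Result Source at the Search Service Application level
# using the FederationManager. Name and query template are examples.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$ssa   = Get-SPEnterpriseSearchServiceApplication
$owner = Get-SPEnterpriseSearchOwner -Level Ssa
$fed   = New-Object Microsoft.Office.Server.Search.Administration.Query.FederationManager($ssa)

$source = $fed.CreateSource($owner)
$source.Name = "Cloned Result Source"
$source.ProviderId = $fed.ListProviders()['Local SharePoint Provider'].Id
$source.CreateQueryTransform("{searchTerms} contentclass:STS_ListItem")
$source.Commit()
```

Scoping the `SearchObjectOwner` to a site or web instead of the SSA lets you recreate site-level sources the same way.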

First things first. I’m not a developer. I seem to do ok working my way through the SharePoint object model with PowerShell and writing automation scripts, but that doesn’t make me a developer.

With that out of the way, I do find myself in an odd place, with developers neglecting (for whatever reason) to automate population of web parts in the content pages their solution has deployed. An example might be a Search Content web part needing to have the proper display template selected for displaying conversations. Ah, my dev friends (don’t hate me), but why not go that extra mile? If I can do it through script, surely you can employ your superior coding skills to include it in the solution (WSP)!

For those of you who may find yourselves in my shoes, here is a script I created to help ease that cross-farm (environment) pain. Essentially the script is a function with a few parameters. This is the core; from here you can customize it to your needs. It’s a great starting point for anyone who wants to automate changes to web parts via PowerShell. You can copy and paste the script below into PowerShell ISE and save it under whatever name you like.
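The pattern such a function follows can be sketched like this. It is a hedged reconstruction of the approach, not the script itself; the function name, parameters, and the example URLs and property name are all illustrative assumptions:

```powershell
# Open a page, grab its web part manager, set one property on a
# named web part, and save. All names below are placeholders.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

function Set-WebPartProperty {
    param(
        [string]$WebUrl,
        [string]$PageUrl,
        [string]$WebPartTitle,
        [string]$PropertyName,
        $Value
    )
    $web = Get-SPWeb $WebUrl
    try {
        $manager = $web.GetLimitedWebPartManager($PageUrl,
            [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)
        $webPart = $manager.WebParts | Where-Object { $_.Title -eq $WebPartTitle }
        if ($webPart) {
            $webPart.$PropertyName = $Value
            $manager.SaveChanges($webPart)
        }
    }
    finally {
        $web.Dispose()
    }
}

# Hypothetical usage (inspect your web part's type for the real property name):
# Set-WebPartProperty -WebUrl "http://sharepoint/sites/team" `
#     -PageUrl "Pages/default.aspx" -WebPartTitle "Content Search" `
#     -PropertyName "ItemTemplateId" -Value "~sitecollection/_catalogs/masterpage/Display Templates/Content Web Parts/Item_Conversation.js"
```

Remember to check the page out and back in (or publish) if the page library requires it; that step is omitted here for brevity.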