Recently at my client, Sunstate Equipment Company, I have been performing some SEO (Search Engine Optimization) work on their public-facing SharePoint 2013 website and customer portal, sunstateequip.com.

Sunstate Equipment offers construction equipment rental services at more than 60 locations around the country. They rent both large and small equipment: backhoes, excavators, loaders, trenchers, forklifts, generators, scissor lifts, boom lifts, and thousands of other items.

As part of this work, we decided to take their existing local websites, which reside on roughly 26 different domains, and move them into SharePoint as local site pages so that the traffic comes through their primary domain. The Digital Marketing Manager prefers blogging in WordPress, so I have created a WordPress blog on Azure and will soon be setting up a blog subdomain to host it. If you are interested in how to set up a WordPress blog on Azure, you can read my blog post here. The blog will contain news stories that are pertinent to the company and to the local communities where the branches do business. The Digital Marketing Manager will be able to tag content related to those local communities, and we want that content to show up on our SharePoint 2013 local site pages.

I thought about using RSS, since it is already supported in WordPress and SharePoint has a web part to pull in RSS feeds. I prototyped this fairly quickly and could see the content using this approach, but it didn't match the look of my site. The traditional way of changing the look of this web part in SharePoint is to modify the XSLT, which is not my favorite thing to do. Since I am using AngularJS and a lot of client-side JavaScript, I would prefer to consume the blog data as JSON. Additionally, I have written a custom deployment framework that tears down all the pages, images, layouts, etc. and reprovisions them through my build scripts on the fly. It will be much easier to deploy this if I can just mark up my local site page layout with Angular and have the JavaScript controller make an asynchronous request to fetch the data from the blog and display it on our site.

I started writing a service to convert RSS data to JSON, but then I saw that someone had written a JSON API plugin for WordPress. This seemed like a really simple way to solve the problem. After reading through the documentation, I was quickly able to determine how to make an HTTP request return JSON data from the blog site.

I also needed to install a CORS plugin for WordPress called WP-CORS. Cross-Origin Resource Sharing (CORS) is a W3C spec that allows cross-domain communication from the browser. It builds on top of the XMLHttpRequest object, allowing developers to work with the same idioms as same-domain requests.

In the admin section, click on the CORS menu option and add your domain to the “Allowed domains” area. For my development environment I have set it to *.

Now that I have those two pieces in place, I can simply make a request to my site and append a query string parameter like this to return JSON data:

Community Stories

Loading...

{{story.title}}
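The page layout's markup lost its tags in formatting above; only the “Community Stories” heading, the “Loading...” indicator, and the {{story.title}} binding survived. As a hedged sketch of the controller side, assuming the JSON API plugin's default ?json=1 switch and its { posts: [...] } response shape (both worth verifying against your own blog), the logic looks roughly like this; the function names are made up:

```javascript
// Sketch of the local-site page logic, not the original code.
// The ?json=1 query string and the { posts: [...] } response shape
// are assumptions based on the JSON API plugin's defaults.
function buildJsonApiUrl(blogUrl) {
  // Appending json=1 asks the WordPress JSON API plugin to return JSON
  // instead of the normal HTML page.
  return blogUrl + (blogUrl.indexOf('?') === -1 ? '?' : '&') + 'json=1';
}

function extractStories(response) {
  // Keep only what the page layout binds to: title, excerpt, and url.
  return (response.posts || []).map(function (post) {
    return { title: post.title, excerpt: post.excerpt, url: post.url };
  });
}
```

In the real page, an AngularJS controller would call $http.get(buildJsonApiUrl(...)) and put extractStories(data) on $scope.stories, which is what the {{story.title}} binding above iterates over.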

Note: This isn’t meant to be an AngularJS tutorial, so if you are not familiar with AngularJS, its documentation is a good place to start.

Here is a screenshot showing the original WordPress site with two sample blog entries.

Here is a screenshot showing our custom page layout in SharePoint 2013 rendering the same two blog posts, using an async call to load the JSON API data from the WordPress-hosted blog.

The page layout renders the title and excerpt text with a “read more” link that takes the user to the full blog site. I will be adding filtering so that only entries with specific tags get displayed. For example, posts for the Phoenix branch could be tagged “Phoenix” in WordPress, and the Phoenix local site page would pull in only the posts with that tag.
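When the filtering goes in, it could be as simple as this sketch. The postsForBranch name is made up, and the tags array of { slug } objects follows the JSON API plugin's usual response shape, which is an assumption worth checking against your own feed:

```javascript
// Filter blog posts down to the ones tagged for a given branch.
// Assumes each post carries a tags array of { slug: '...' } objects,
// as the WordPress JSON API plugin returns; adjust if your shape differs.
function postsForBranch(posts, branchTag) {
  var wanted = branchTag.toLowerCase();
  return posts.filter(function (post) {
    return (post.tags || []).some(function (tag) {
      return tag.slug.toLowerCase() === wanted;
    });
  });
}
```

The controller would run this over the fetched posts before putting them on scope, so the Phoenix page only ever binds to Phoenix-tagged stories.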

With this approach you could build a custom user interface in the language or system of your choice and serve up the WordPress data, or even create posts, entirely through the JSON API. This is going to be a great solution for my client, giving them the productivity and familiarity of the tool they prefer while integrating with our current system and not reinventing the wheel.

You should see a page similar to this. Enter your email address and password to sign in.

Once signed in successfully you should see the Microsoft Azure Dashboard which looks similar to the screenshot below.

Click on the New menu option at the top left and you will see the panel slide out with the search box.

Enter the search term: ‘WordPress Bitnami’ into the box where it says ‘Search the marketplace’

Next you will see search results containing items similar to this. Click on the item named WordPress from the publisher Bitnami.

You will now see the description of that item:

WordPress powered by Bitnami is a pre-configured, ready to run image for running WordPress on Microsoft Azure. WordPress is one of the world’s most popular web publishing platforms for building blogs and websites. It can be customized via a wide selection of themes, extensions and plug-ins. Bitnami provides technical support for installation and setup issues through their support center at https://bitnami.com/support/azure. Default credentials are username: ‘user’ / password: ‘bitnami’.

Next click the “Create” button at the bottom of the screen.

Now you will be prompted to create the Ubuntu virtual machine. You will walk through the wizard completing the 5 steps.

Step 1 – Basics

You can hover over the information (i) icons to get more details if you have a question about what a field is used for.

SSH Key

You can use a key generator to create an SSH public key. I used the one built into MobaXterm Personal Edition; you will find a MobaKeyGen option under the Tools menu. You generate a key by making random movements with your mouse. Save the public and private keys; you will be prompted to set a password for the private key. Paste the public key into the SSH Public Key field on this Step 1 screen. It will be used later if you want to remote into the Linux box using SSH, which can also be done inside MobaXterm.

Step 2 – Choose a size

I am picking a very small size (A0) since this is for a test environment. You have to pick a regular hard drive rather than an SSD in order to see the A0 size.

Step 3 – Settings

Step 4 – Summary

Step 5 – Purchase

I already have a credit card associated with my account so I didn’t need to do anything else. This page shows you the pricing information. Simply click purchase if you are happy with the offer details.

It will take about 5 minutes to spin up the new machine. Once it is up and running, you will be able to access it by its IP address.

This is what the default site will look like. Notice the Manage icon on the bottom right; it is part of Bitnami’s installation (they have instructions on how to remove it). Click on it to see more information.

Once you click Manage in the bottom corner, you will see the following information. It shows the default username and password and a login link to the admin console.

After you have logged in you should see the dashboard. Now you can configure it like any normal WordPress site. Make sure to change your default password and add your own accounts as necessary.

Note: You will get a new dynamic IP every time you stop and start your A0 VM instance.

Thanks to everyone who came out to the SharePoint Saturday event, and also to those who attended my session on SharePoint client-side development with the REST API, jQuery, and an intro to Knockout. Here is the slide deck from the presentation.
The session covered reading SharePoint list data with the REST API and retrieving that data using the jQuery ajax method. We built up the first example using just jQuery: taking advantage of $.each, building our HTML strings, and replacing the text/html via a jQuery selector.
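That first example boiled down to something like this sketch. The Title field and the #vehicleList selector are made-up names, and plain forEach stands in for $.each so the string-building part runs on its own without jQuery:

```javascript
// Build an HTML string from the items returned by the SharePoint REST call.
// In the session this loop used jQuery's $.each; plain forEach is used here
// so the sketch is self-contained.
function buildListHtml(items) {
  var html = '<ul>';
  items.forEach(function (item) {
    html += '<li>' + item.Title + '</li>';
  });
  return html + '</ul>';
}

// In the $.ajax success handler the result would be pushed into the page
// with a jQuery selector, e.g.:
//   $('#vehicleList').html(buildListHtml(data.d.results));
```

This string-concatenation style works, but it is exactly the kind of code that gets tedious as the UI grows, which is what motivated the move to Knockout in the next part of the session.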

Next we discussed Knockout.js and how it can make your life easier by splitting the UI into a model and view model, then binding the view model data declaratively to the view. We also used Knockout’s templating capabilities to rewrite how the HTML was generated for the select filter, manufacturers list, and vehicle list. Finally, we set the data-binding context to a specific property on our view model called selectedVehicle; whenever that property changed, the vehicle details section updated on the fly.

I hope you found the information useful. As you enhance your skills and move toward more complex types of applications in SharePoint, I believe you will find these techniques very useful.

You can take a look at how things were done. I didn’t have time to package up the lists it used, but you should be able to change the REST calls to point to your own lists and columns and get the example code working. Feel free to reach out if you have any questions.

This is what the final example looked like for the session on the SharePoint REST API, jQuery, and Knockout.

I spent some time this evening exploring making an app part for a SharePoint 2013 app I am building. I ran into a few issues while trying to get it to work the way I wanted.

1. I kept getting an error stating that this content couldn’t be iframed when inserting it into my SharePoint page in Internet Explorer. (In Google Chrome I got the following error: “Refused to display ‘https://xxxxxxxxxxxx’ in a frame because it set ‘X-Frame-Options’ to ‘SAMEORIGIN’.”)

Content cannot be displayed in a frame

2. I had to add the name of my site, https://xxxxxxxx-public.sharepoint.com, to my trusted sites zone in Internet Explorer for the app part to log in behind the scenes. (I read somewhere that this is a known issue for IE and that Microsoft is working on a fix.)

3. Since the content of my app is getting iframed, it contains a separate header with the logo and global navigation from the app page, which I wanted to hide.

To solve issue number 1, I went back to the source page for my application and inserted the following line below the “PlaceHolderMain” content placeholder (you need the 2nd line).
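The snippet itself was stripped during formatting. Based on the standard SharePoint 2013 fix, it was most likely the AllowFraming control; the markup below is my reconstruction, not the original (the comment line is mine, and the control is the line that matters):

```aspx
<%-- Allow this page to be hosted inside an app part iframe --%>
<WebPartPages:AllowFraming ID="AllowFraming" runat="server" />
```

This control tells SharePoint not to emit the X-Frame-Options: SAMEORIGIN header for the page, which is what was blocking the iframe. The WebPartPages tag prefix is registered by the standard SharePoint page directives, so no extra Register directive is usually needed.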

That allowed me to start seeing my app part displayed in my page, which brought me to the chrome problem in issue number 3. To solve that one, it apparently works like the dialog did in SharePoint 2010: you can hide the outer shell items like navigation and the logo. I went to the elements file for my app part, which looked like this.
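The elements file was also stripped out here. A typical client web part definition looks like the hedged sketch below (the names and page URL are placeholders); the 2010-dialog trick is appending IsDlg=1 to the Src URL so the master page hides its chrome inside the iframe:

```xml
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <ClientWebPart Name="MyAppPart" Title="My App Part"
                 Description="Sample app part" DefaultWidth="600" DefaultHeight="400">
    <!-- IsDlg=1 makes the master page hide navigation, logo, etc.,
         just like a SharePoint 2010 dialog -->
    <Content Type="html" Src="~appWebUrl/Pages/Default.aspx?{StandardTokens}&amp;IsDlg=1" />
  </ClientWebPart>
</Elements>
```

{StandardTokens} is the standard placeholder SharePoint expands with the host and app web URLs at runtime; the IsDlg=1 addition is the part that suppresses the extra header and global navigation.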

Tonight I finally started looking into developing apps for SharePoint 2013. I created an Office 365 Enterprise E3 trial (available here) of SharePoint Online.

Once I got up and running, I created a developer site collection with the developer template. This gives you some tiles that let you install the free Napa development tools. The developer template also allows you to “sideload” apps into that site collection from Visual Studio 2012 when you deploy.

I developed my app and wanted to move it to my production site, but since this is just a test demo site, I didn’t want to put it into the store. I went into my project in Visual Studio, changed the site URL to my production URL, and attempted to deploy it.

It gave me an error: “Error occurred in deployment step ‘Install app for SharePoint’: Sideloading of apps is not enabled on this site.”

Here are the steps I took to deploy to my “production” site.

1. Go to the Admin > SharePoint menu.
2. From the left navigation, select Apps.
3. From the Apps landing page, select the App Catalog option.
4. Select the “Create a new app catalog site” radio button and click OK.
5. Fill out the dialog for your app catalog site collection and click OK.
6. SharePoint will provision your site collection; give it a minute before you go back in and access it.
7. Once it is done, go back to the App Catalog (steps 2 and 3).
8. You will now see 3 tiles (“Distribute apps for SharePoint” is the one I wanted). Click it.
9. Go to Visual Studio, right-click on your project, and select the Publish option.
10. Note the path to your .app file, because you will upload it back into SharePoint’s app catalog.
11. From SharePoint (the Apps for SharePoint page), click the New App + sign.
12. A dialog will display; browse to your .app file.
13. You will be prompted to fill out some details like images and links to support/video pages.
14. Click OK at the bottom, filling in any details you feel necessary.
15. Now go to your production site, click the settings gear, and select “Add an app”.
16. Locate your app, click on Details, then click the “Add It” button. Trust it, but only if you trust your own app :).

We launched the new Community Partnership of Southern Arizona (CPSA) web site today on SharePoint 2010.

CPSA is the Regional Behavioral Health Authority (RBHA) contracted by the state of Arizona to coordinate and manage publicly funded behavioral health services for children, adults and their families in Pima County. CPSA coordinates, by way of a provider network, the delivery of mental health and substance use treatment services, and behavioral health wellness and prevention services.

I had a lot of fun working on this project. There is great branding, and a lot of cool features and tools allow CPSA to easily maintain the site. We have also integrated with several third-party systems (Eventbrite, SurveyMonkey, and iContact, to name a few). I imagine I may have some more blog posts highlighting some of the items that were built.

I found myself in a situation where I had to modify some XML values directly on a server for a client. In this particular scenario I didn’t have any tools like Visual Studio on the server. I also couldn’t cut and paste from the remote desktop session to my own box due to security, so I was looking for an easy way to format some XML I was pasting in from the SharePoint Advanced Search properties without having to format it manually in Notepad.

I was able to download and install Notepad++ on the server from http://notepad-plus-plus.org/. A plugin manager is now included, which you can launch from within the application via the Plugins > Plugin Manager > Show Plugin Manager menu.

Once you have the Plugin Manager open, scroll down and look for the plugin called “XML Tools”, check its checkbox, and click the Install button.

This will show a progress dialog while the plugin downloads and installs; you will be prompted to restart Notepad++ after it finishes.

Once you have restarted Notepad++, you can paste in your badly formatted XML and begin the formatting process.

Here is a simple example of XML on one line.
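The sample itself didn’t survive formatting, so here is a hypothetical stand-in for the kind of one-line XML you might paste in (the element and attribute names are made up, not the actual Advanced Search properties):

```xml
<root><Columns><Column Name="Title" DisplayName="Title"/><Column Name="Author" DisplayName="Author"/></Columns></root>
```

After running Pretty print, each element lands on its own indented line, which makes it far easier to spot and edit individual attributes.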

You can now use the new XML Tools menu option from the Plugins menu to format your document.

Select “Pretty print (XML only - with line breaks)”, or use the shortcut Ctrl + Alt + Shift + B, to format your document and make it easy to read and modify. After making your edits, if you want to collapse it back to one line, you can use the Linearize XML feature.

Here is what the XML looks like after linearizing it back to a single line.

That’s a wrap, literally: you now have your beautifully formatted XML that you can modify easily. Thanks, Notepad++ and the XML Tools plugin.

Those of us in the SharePoint world know that User Profile Sync can be quite challenging to configure and probably contributes directly to Diet Coke sales due to the late nights and long hours of troubleshooting.

In my particular scenario, the services were up and running, the import job was syncing from AD and bringing in user accounts, and we thought things were working just fine. There were no errors in the FIM client either. The FIM client is located at C:\Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell\miisclient.exe; you can use it to debug issues with the user profile import.

I recently discovered that things weren’t working how they were supposed to. When users were imported into the system, some were coming in with an account name of corp\username and others with corp-xxxx\username. This was causing problems with the org chart not working correctly. If I viewed a user profile in Central Admin, I could also see that the account in the manager field was not resolving. After lots of research, experimenting, and Diet Coke, it turns out this happens because the NetBIOS name is different from the domain name. So how do we tell when those two are different?

To find the domain name, go to the web front end and open the Active Directory Users and Computers window from the Start > Administrative Tools menu.

Active Directory Computers and Users

The domain should be showing in that window; in my case I saw corp-xxxx.org in the tree with a bunch of folders underneath it. (That one we probably all already knew how to get.) So that gives us the domain name. Note that the screenshot doesn’t match my original names; it is just provided as a reference from my development environment.

Domain Name location

The one I was less familiar with was the NetBIOS name. You can find it by right-clicking on the domain name and looking at the “Domain name (pre-Windows 2000)” field in the properties; in my case this was set to corp. Note that in my environment the domain name and NetBIOS name are actually the same; if this were the environment that had the problem, we would see corp-xxxx.org on the left and corp in the red box below.

Netbios name location

With that information I was able to get to the heart of the problem, whereas earlier I didn’t quite know where to start. I have now learned that you have to configure the User Profile Service differently in this scenario (NetBIOS name different than domain name). To make this change, you can open PowerShell and run a few commands to enable NetBIOS names on the UPS.

You can start by opening the SharePoint 2010 Management Shell on a box in your SharePoint farm and running:

Get-SPServiceApplication

You will see a list of service applications. Look for your User Profile Service Application and copy its GUID from the window using Edit > Mark.

powershell command Get-SPServiceApplication

With that information we can now get the specific User Profile Service Application and store it in a variable so we can modify a property.
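The command itself was stripped from the post; it would have been along these lines, with the GUID pasted in from the previous step (placeholder shown, not a real value):

```powershell
# Grab the User Profile Service Application by the GUID copied above
$ups = Get-SPServiceApplication -Identity "<GUID from Get-SPServiceApplication>"
```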

Now you can view the properties of your $ups variable with a command like:

$ups | Get-Member

This will list all the members and properties of the object. You will see one named NetBIOSDomainNamesEnabled.

That is the property we want to update to TRUE.

We can check the current value using the following command. (Remember, you can use the Tab key in PowerShell to complete commands; for instance, type $ups.net and hit Tab and the rest should get filled in.)

$ups.NetBIOSDomainNamesEnabled

The current value of the property will be either true or false. If it is false, as it was in my case, and you are having the same problem, you will want to update it to true.

You can update it to true using the following command.

$ups.NetBIOSDomainNamesEnabled = $true

Now commit the change using the Update method:

$ups.Update()

You can verify that you have set it appropriately by reading the property again, type:

$ups.NetBIOSDomainNamesEnabled

After you have this configured, you will need to delete your existing synchronization connections in Central Admin. I created a new one, pointed it back to AD, and picked the appropriate OU containers for my users.

I then did a full import/sync on that connection. After it completed, I had only around 8 user profiles imported instead of 200+. That wasn’t good. I opened the FIM client to troubleshoot the import and noticed a security error. This brings us to the next issue: this particular scenario (NetBIOS name different than domain name) needs some additional Active Directory permissions for the sync account.

I found and followed a good TechNet article to make sure I had all the appropriate security permissions in Active Directory; the article lays out the steps very nicely.

I worked with my client’s network administrator over a Live Meeting and prompted him to perform all the documented changes in that article, and bam: I now have 200+ profiles, all with account names of corp\username. The org charts are working better than ever, and best of all I could stop losing sleep over SharePoint, at least for one day.

Today I ran into an issue at my client that I thought was worth writing about.

I have created a separate web application for collaboration sites (i.e., http://collaboration). The client wanted a security-trimmed list (links) of all the collaboration sites the user was able to access. I did some research and settled on an approach using the search.asmx web service. I created a new data source referencing http://collaboration/_vti_bin/search.asmx and configured my search query for the “queryex” method. I found a good example of what I wanted to do on a blog called Mike’s Notebook (thanks, Mike): http://mikesnotebook.wordpress.com/2012/01/16/global-site-directory-for-sharepoint-2010/.

I am using a Data View Web Part to display the data from the data source. The query in the data source uses a specific search scope to return all the sites that the logged-in user has access to under the specific site collection. It returns the title, description, and URL of each site. Here is an example of what it looks like.

Collaboration Links Web Part

This code worked in my single-server dev environment with no issues. In the production environment it was only working intermittently; it would fail with an error, but I could hit F5 to refresh the page and then it would work. After digging into the ULS logs I saw that it was getting a 401 Unauthorized error.

Both of the web front ends had been configured to disable the loopback check. To the best of my knowledge, what was happening was probably a double-hop type issue, since we are using NTLM.

The request would come in under the load-balanced IP address, say 175.xxx.xx.xxx, land on one of the web front ends, say 10.xxx.xx.150, and then try to call out to the web service (i.e., http://collaboration.somesite.com/_vti_bin/search.asmx). This call was failing because the host name would resolve back out to the 175.xxx.xx.xxx address.

I ended up putting a hosts file entry on both web front end servers to route collaboration.somesite.com to the local virtualized IP address of the individual web front end.
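On Windows the entry lives in C:\Windows\System32\drivers\etc\hosts on each front end. A hedged example, using the same placeholder values as above:

```
# On the WFE whose local IP is 10.xxx.xx.150: resolve the load-balanced
# host name to this server itself so search.asmx calls stay local
10.xxx.xx.150    collaboration.somesite.com
```

Each front end gets its own entry pointing at its own IP, so the web service call never leaves the box and never bounces off the load balancer.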

I hope this may save someone in the future from dealing with a similar issue. I could envision this happening in other areas as well.

Today I had lunch at a local sandwich shop near my house and was surprised to see they had a Coca-Cola Freestyle machine. I saw a TV program on CNBC about Coke, probably over a year ago, where they mentioned this machine. Read the CNBC Coke story…

The machine is an interesting concept. My understanding is that it uses micro-dispensing technology from the drug industry; Coke says it uses their proprietary PurePour Technology™. The machine can send data back to Coca-Cola on which drinks are being served, and with over 100 flavors, hopefully you can find something you like. Most of the beverages let you add a flavor, so you could have Orange Coke, Raspberry Coke, Lime Coke, Cherry Coke, or Vanilla Coke. I tried a Cherry Vanilla Coke.

Being a programmer living on caffeine, I drink a lot of Diet Coke and Coke Zero, and I have to say my Coke Zero didn’t taste the same. My girlfriend ordered a Dr Pepper and complained that the taste was different as well. I am curious what others have to say about their experiences.

I think a few of the downsides are:

Lines can form because people are deciding on what new flavors to try

You have to touch a screen that hundreds of others have touched, which can be dirty.

If you just want ice for your iced tea, you have to wait in line, because this machine is also the ice dispenser.

The normal Coke Zero and Dr Pepper flavors didn’t taste as good as from the old fountain dispenser.

On the positive side:

Lots of options to make your own favorite drink.

I can see the machine being a draw in places where there are a lot of kids