According to Computerworld, 42% of IT decision makers planned to increase spending on cloud computing in 2015. While companies are adopting cloud technologies at a rapid pace, migrating from existing technology to the cloud can still be a challenge. For an optimal (read: as smooth and quick as possible) move to the cloud, it’s critical to consider your requirements and explore the available features.

While this ZDNet article, in which Mary Jo Foley interviews a former Azure evangelist, is a couple of years old, I think the tips it offers to guide your thought process still hold true: plan for the long term, start with something simple, and know the outcomes you want to achieve.

When done right, migrating on-premises applications to the cloud can lead to a significant ROI. The process, however, can seem rather complex, as there are numerous paths to take and changes that are required. A directive as simple as “let’s move some applications to the cloud” involves many choices and, according to Gartner, must account for an organization’s requirements, evaluation criteria and architectural principles. You’ll want to look at a number of salient points, which we will cover in a webinar this Wednesday, July 29 at 1 pm CT: Optimize Business Performance by Moving Apps to the Microsoft Cloud.

During the session, Perficient’s Joe Crabtree, National Practice Lead for Custom Application Development, and Chris Pietschmann, a Microsoft Certified Azure Solutions Architect, will give an overview of migrations to Microsoft Azure, a rundown of the various service options and a tried and true approach to planning for a successful migration.

What to consider when moving applications to the Microsoft cloud was first posted on July 27, 2015 at 7:36 am.

Office 365 – A Deeper Look at the Microsoft “Send” App

On Wednesday, Microsoft announced a new Office 365 mobile app called “Send”. The idea behind the app is to be able to send quick and simple messages to other users via email.

The first question that came to mind was, “What problem is this app trying to solve?” My phone can already send emails, IMs and text messages; did we really need another option?

My thoughts then shifted to “How does this work?”, “What kind of security options are there?” and other questions that clients are bound to ask.

In the interest of peeking behind the scenes, I ran the app through Fiddler to see what the traffic looked like. Some thoughts based on the results are below…

Getting The App

At its initial release, the app is only available on iOS and only in the US and Canada. This will obviously change in the future, although it’s a bit interesting to see how iOS seems to receive some of the first releases from Microsoft these days. Finding the app in the Apple App Store is a bit tricky; it’s listed under “Send, a Microsoft Garage Project”.

If you look for other “Microsoft Garage” apps, you’ll notice another interesting one called “Tossup”, which looks to be for sending friends surveys on where to go for lunch, etc.

Logging In (and Out?)

The process to log in to the app is pretty much the same as for most Office 365 applications; I was able to log in easily via my organization’s AD FS. When I went to look at how to log out, I found there isn’t a way. The FAQ says you basically need to uninstall the application to log out.

Sending a Message

Sending messages is certainly quick and simple; the look and feel is very similar to sending a text. If the recipient has the “Send” app, the message will pop up there; otherwise they will just receive the message in their mailbox. The messages are sent via your Office 365 mailbox and will show up in your “Sent Items”. Since these messages go out through your Exchange Online environment, they’ll flow through any DLP or other filtering you’re performing on SMTP traffic.

Receiving Messages

Even when you have the “Send” app installed, the messages arrive in your mailbox in addition to appearing in the app; the messages all have “#Send” in the subject line. The duplicate alerts quickly get annoying, so I made a folder called “#Send History” and set up a rule in Outlook to move messages there based on that subject.

The “#Send” subject appears to be what causes messages to pop up in the app: I was able to compose a new message with that subject line and it popped up in the recipient’s app.
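Since the routing is driven entirely by that subject tag, the filter behind my Outlook rule is a one-liner; here is a minimal sketch of the logic (the function name and sample messages are mine, not part of the app):

```python
def is_send_message(subject: str) -> bool:
    """Mirror the Outlook rule: match any message carrying the '#Send' tag."""
    return "#Send" in subject

# Simulate filing matching messages into the "#Send History" folder.
inbox = ["#Send lunch at noon?", "Quarterly report", "#Send running late"]
send_history = [s for s in inbox if is_send_message(s)]
remaining = [s for s in inbox if not is_send_message(s)]
```

The real rule does exactly this on the server side: subject contains “#Send”, move to the folder.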

The Network Trace

Authentication appears to use OAuth, and the rest of the app’s communication goes through an API rather than directly using Exchange Web Services (EWS) or ActiveSync. All API traffic goes to the host “flow-prod-api.outlookapps.com” and does not appear to match any APIs I’ve seen published. That said, I’m by no means a developer and will probably need some other eyes to take a look at the requests.

The “flow” in that hostname seems to indicate this is the “Microsoft Flow” app whose details leaked back in May.
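The traffic follows the usual OAuth bearer-token pattern: acquire a token, then attach it to every API call. In generic form (the host is the one from the trace; the path and token are placeholders I made up for illustration, since this is not a published API):

```python
# Sketch of the request shape seen in the trace; NOT a documented API.
API_HOST = "flow-prod-api.outlookapps.com"

def build_request(path: str, token: str) -> dict:
    """Assemble the URL and OAuth bearer header for a call to the app's API."""
    return {
        "url": f"https://{API_HOST}{path}",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    }

req = build_request("/messages", "<access-token-from-oauth-flow>")
```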

Securing Access

While Microsoft’s approach seems to be allowing users to access their email from any device and any location, I have some clients that take a significantly more restrictive stance. Clients will certainly ask us to block access to this application, and the method for doing that is still TBD.

So Do We Need This?

In the short time the app has been out, it’s amazing how people seem to land at one extreme or the other with it. Some of my clients and coworkers love it right out of the gate; others think it overlaps multiple existing technologies and see no need for it.

I can see certain situations where the app could be useful. It’s definitely a bit beta, but I imagine that’s to be expected from the “Microsoft Garage”. A logout option would be nice, and it doesn’t seem like you can add additional people to an existing group conversation.

More Info to Come

The above are all pretty early observations; more info is sure to come. Microsoft has set up a group in Yammer for the application, and there is a “YamJam” scheduled for July 28th at 9:00 AM Pacific.

Did you find this article helpful?

Leave a comment below or follow me on Twitter (@JoePalarchio) for additional posts and information on Office 365.

Office 365 – A Deeper Look at the Microsoft “Send” App was first posted on July 22, 2015 at 10:30 pm.

Azure, IoT and the Future Enterprise Data Center

The Cloud and the Internet of Things (IoT) are two of the most exciting areas of computing right now. IoT is all about small devices with sensors and other components that can both gather data from and interact with the real world. The cloud is the perfect platform for aggregating all the data from these devices so it can be analyzed using the cloud’s massive compute and storage resources.

Microsoft has been positioning itself as a leader in the cloud with the Azure platform since its initial public availability at the beginning of 2010. Since then, Microsoft has built out the PaaS, IaaS and SaaS features of Azure to make it the enterprise data center of the future. The Azure platform supports any compute, storage, security and server configuration an enterprise would require, without the need to build and maintain an expensive internal data center, all while offering a superior SLA at a far more affordable price.

The Internet of Things is an exciting space, with hardware such as the Raspberry Pi 2, Arduino, MinnowBoard Max, Intel Galileo and other DIY-style devices available today. The IoT ecosystem has reached a really exciting point where developers of IoT devices and systems no longer need to design all the hardware themselves. Whichever hardware platform you choose, there are plenty of “off the shelf” components that can be easily integrated. This commodity hardware has made the technology more accessible today than ever before!

What is an IoT Device?

An IoT device is any network-connected device that can integrate sensors, motors or other hardware with the network-connected pieces of the overall system. A few hardware devices such as the Raspberry Pi 2, Arduino and MinnowBoard were mentioned above, but these are just one set of hardware options that could be used as IoT devices. The Internet of Things is about far more than just these platforms.

With the release of Windows 10, Microsoft is offering a free version called Windows 10 IoT Core. Windows 10 IoT Core lets you use all the excellent developer tools from Microsoft (Visual Studio, the .NET Framework, C#, etc.) when developing IoT solutions on top of the various hardware mentioned previously. The combination of Windows 10, developer tools and hardware provides an unprecedented lowering of the barrier to entry for building IoT solutions.

Any network-connected device could be used as an IoT device. This means that any PC or smartphone could serve as one, in addition to custom-built devices.

Where does the Microsoft Azure platform fit into all this? IoT is about much more than just small devices with sensors and motors. All IoT solutions require some sort of backend to provide the storage and compute resources for analyzing the data gathered by these devices, as well as for controlling and coordinating the actions they carry out. This is where the Microsoft Azure cloud platform fits in perfectly.

Below is a sample solution architecture to better convey how the Internet of Things (IoT) and Microsoft Azure could fit together within an enterprise.

High Level Example of an Azure IoT Solution

Services like Bing Maps and companies like UPS or FedEx are what most people think of first when discussing location tracking. However, another type of location tracking that comes into play for enterprises is tracking personnel and/or assets within a building, not just across the globe. RFID is a technology well suited to this, since GPS generally isn’t accurate down to the level of detail necessary within a building.

To track personnel and assets within a building, you first give each one an RFID tag. RFID readers are then placed at key locations within the building. Whenever a person or asset passes a reader, the unique identifier of the RFID tag is picked up and can then be logged in a database. On the surface this is a fairly simple solution for tracking the movements and locations of each entity assigned a tag.

The RFID readers are best positioned just inside the doorways of rooms, in hallways, and even just outside building exits. The room readers will record which tags enter each room. The hallway readers will record tags approaching a room, or simply log movement across the building. Lastly, the readers just outside building exits will track when personnel and assets might be leaving the premises.

In addition to the fixed readers, handheld RFID readers (such as one connected to an iPhone/iPod, Android or Windows Phone device) can be used to scan assets manually. These would be tied to software that allows inventory to be taken periodically and item locations to be updated when items are put away. The software on the handheld device could use an additional RFID tag or barcode to let the user easily enter the storage location as well as the asset’s identifier.

This type of system would be made up of a number of custom-built hardware devices (the RFID readers), the handheld mobile devices connected over Wi-Fi, and software built to integrate the system with Azure-hosted services.

The RFID readers could be built using Raspberry Pi 2 hardware, with the software written as a Universal Windows App running on the free Windows 10 IoT Core operating system. These would be connected to the local network using Ethernet, as it’s much more reliable than Wi-Fi. Each reader would pick up RFID tag readings and send them to Azure storage. Azure Table storage (or another NoSQL storage solution) would be a better fit than an Azure SQL database, since this system is likely to fall into the realm of Big Data. In addition to storing the raw scan data in NoSQL storage, the reading event message would be sent to an Azure Service Bus Queue so that a downstream message receiver can perform any actions that may be necessary based on the personnel or asset that triggered the event.
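In outline, each reader does two things with a tag read: persist the raw record and enqueue an event message. A minimal sketch of that flow, using in-memory stand-ins for Azure Table storage and the Service Bus queue (real code would use the Azure SDKs; the names here are mine):

```python
import time
from collections import deque

table_storage = []     # stand-in for Azure Table storage (raw scan log)
event_queue = deque()  # stand-in for an Azure Service Bus Queue

def record_tag_read(reader_id: str, tag_id: str) -> dict:
    """Persist the raw read and enqueue an event for downstream processing."""
    event = {"reader": reader_id, "tag": tag_id, "ts": time.time()}
    table_storage.append(event)  # raw data lands in the NoSQL store
    event_queue.append(event)    # message for a receiver to act on later
    return event

record_tag_read("hallway-3", "TAG-00042")
```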

The software on the handheld RFID readers would be built on the chosen mobile platform (iPhone/iPod, Android or Windows Phone) and would persist data into the NoSQL storage solution in a similar fashion to the fixed readers. In addition, however, the handheld devices would implement Azure Mobile Services so they can both receive notifications and store data on the local device for additional reporting functionality. It would also be good to build in an offline mode so the handhelds can still be used if they are unable to connect to the Wi-Fi network for short periods of time.

Once all the data is being scanned and stored in Azure, it can be analyzed and processed to gain intelligence from it. The first requirement is to implement one or more message receivers, hosted as Azure Web Jobs, that monitor the Azure Service Bus Queue for new messages and then store, process and initiate whatever actions are necessary based on the data.
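The receiver side is the mirror image of the readers: drain the queue and dispatch an action per event. A sketch of that loop, again with an in-memory queue standing in for Service Bus (the "AST-" tag-prefix convention is a hypothetical I chose for illustration):

```python
from collections import deque

def process_queue(queue: deque, handlers: dict) -> list:
    """Drain the queue, dispatching each event to a handler by entity kind."""
    processed = []
    while queue:
        event = queue.popleft()
        # Hypothetical convention: asset tags use an "AST-" prefix.
        kind = "asset" if event["tag"].startswith("AST-") else "person"
        handlers.get(kind, lambda e: None)(event)
        processed.append(event)
    return processed

exit_alerts = []
queue = deque([
    {"tag": "AST-7", "reader": "exit-1"},
    {"tag": "EMP-3", "reader": "lab"},
])
done = process_queue(queue, {"asset": lambda e: exit_alerts.append(e["reader"])})
```

An Azure Web Job would run this same loop continuously against the real queue.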

Basic reporting and tracking of personnel/assets would be implemented as an ASP.NET MVC application (so it works across platforms and devices on the web) hosted as an Azure Web App. This application would provide the primary interface for that functionality, with some of the same functionality implemented in the mobile app on the handheld devices as necessary.

With all of the above built, there are a couple of really innovative Azure services that could be added to enhance the overall business intelligence drawn from the data. First, Azure Machine Learning could provide predictive analytics, such as anticipating the future movements of certain personnel or assets. Second, Azure Stream Analytics could provide real-time analytics, as it is built specifically for processing massive amounts of IoT data in real time.

This is just one of a nearly infinite number of potential enterprise IoT systems, and the above is far from a complete solution architecture document. Hopefully, however, it paints a clear enough picture of how Microsoft Azure and the Internet of Things can be used together to build some really innovative solutions.

Azure and IoT are the Future

The Internet of Things (IoT) is still mostly a buzzword, as most people are still figuring out what it really means. IoT devices and solutions have been built for at least the last 20 years, but they have traditionally been fairly expensive to both build and maintain. IoT solutions have also been fairly difficult to build until the more recent availability of Microsoft Azure and the services mentioned in the use case above.

Another factor driving IoT costs down in recent years has been the availability of hardware development platforms like the Raspberry Pi and Arduino. Some may think these aren’t really ready for the enterprise yet, but their much lower price tag surely makes them appealing.

Data is a key component of learning how to improve any business; the larger the amount of accurate data, the more intelligent the reporting that can be done. By integrating Microsoft Azure and IoT, enterprises can gather “Big Data” volumes of data and use services such as Azure Machine Learning and Azure Stream Analytics to answer questions that couldn’t be answered before.

Azure, IoT and the Future Enterprise Data Center was first posted on July 21, 2015 at 7:00 am.

Power BI General Availability Date Announced

In a bit of an underplayed blog post, Microsoft announced last weekend that Power BI would reach general availability on July 24, 2015. This is great to finally hear/read, because it has felt like DECADES since the preview of the Power BI “new experience” went online.

In reality, though, it’s only been since the beginning of this year, and the development of the product, given constant online feedback, has been impressive. If Microsoft intends for this offering to compete with Tableau and Qlik in the modern analytics visualization market, they needed to step up; I feel they have. They have added features, refined the UX, and apparently brought the previewed Power BI Designer tool into its own as the re-dubbed Power BI Desktop.

So what does GA bring for users of the current preview service? On that date, those accounts will be converted to free Power BI accounts, and features included in Power BI Pro (the paid version of the service) will be disabled. Bottom line: preview users will have to decide whether the difference in functionality is worth paying for. At $9.99 a month per user, it’s a relatively inexpensive prospect.

What does this bring for Microsoft? It’s the third draft of a “Microsoft BI” story they’ve been working on for several years — at least since 2010. The story editing has definitely improved things, but there are still plots that go nowhere (hello, PerformancePoint Services?), new characters every chapter (welcome, Rev R and DataZen!), and sometimes it’s like a choose-your-own-adventure (“… if you choose NOT to run your solution in Azure, turn to page 108 …”).

But the Power BI offering is solid, and it finally fills a long-standing gap in the Microsoft toolset: front-end visualizations. The “2-for-1” idea of linking that offering to self-service BI is also smart. If they can continue to establish a coherent set of functionality around the brand, incorporating new tool acquisitions, I have high hopes for the product. I believe the final pieces of the puzzle are overcoming (or just waiting out) the general market’s cloud aversion, and establishing an enterprise-level story for Power BI — particularly around security.

Power BI General Availability Date Announced was first posted on July 20, 2015 at 3:06 pm.

Office 365 – Office ProPlus C2R Not Compatible With MSI Installs

During a session at Microsoft Ignite, I heard a presenter mention that running MSI-installed versions of Office products such as Visio and Project is not compatible with the “Click-to-Run” deployment of Office 365 ProPlus.

This was something I sort of stored in the memory banks with the intention of researching. One thing I did know was that I was running that exact configuration: MSI-installed versions of Visio and Project alongside Office 365 ProPlus. It had me wondering what the issues were and what exactly didn’t work.

While Microsoft Ignite was close to three months ago, I still have a laundry list of items that I’m following up on or sessions that I still plan to watch. The above issue wasn’t really a priority on my research list, until now…

Recently I uninstalled Office 365 ProPlus (from my corporate laptop) and when I went to reinstall it, I was greeted with the error below:

Office 365 – Office ProPlus C2R Not Compatible With MSI Installs was first posted on July 20, 2015 at 8:00 am.

Azure: Did You Know? Azure RemoteApp, Access Apps from Any Device

Azure RemoteApp provides a way to deliver Windows applications to any device with the power and scalability of Azure. That’s a pretty bold statement… so what, exactly, does it mean? Essentially, you can package up any Windows application on a VM image hosted in Azure and provide users access to it from any device using Remote Desktop Services. Being hosted on Azure also brings benefits such as scalability and fault tolerance with little to no effort on your part.

First, “any” device needs a little clarification

Azure RemoteApp supports most devices, but not every device. The RemoteApp FAQ page lists the supported devices, which essentially include Windows, iOS, Mac OS X and Android. Other devices such as BlackBerry, Xbox, etc. are not listed. In most cases this should be fine, since the target audience is line-of-business apps, but note that it is not in fact “any” device.

So, How does it work?

Azure RemoteApp provides access to applications via Remote Desktop Services hosted on an image in Azure. Both cloud and hybrid configurations are supported, giving you lots of flexibility in determining what part of the app lives in the cloud and what part lives on-prem. The beauty of this is that users can use “any” device to access your applications and you don’t have to maintain the infrastructure behind the scenes. Typically, in an on-prem world, when we need a new application requiring VMs, we have to configure the VM image and ensure our host environment has enough capacity for the expected usage. In addition, usage tends to spike at different times of the year, so if capacity isn’t planned properly we can easily hit the ceiling and cause outages for users. With the infrastructure hosted on Azure, these concerns are alleviated to some extent, freeing you up to address more critical requests from the business.

What is “the image”?

The image is basically nothing more than a Windows image, just like you would use for any VM, that has been prepped with SYSPREP and meets the RemoteApp requirements.

The following out of the box images are available with your subscription by default:

Considerations

Images should be stateless – Applications should never store data directly on the image… all data should be persisted elsewhere or in the user profile.

Videos – video may have trouble playing smoothly… remember that users are accessing this via RDP, not through a web site or native application on their local devices, so graphics-intensive operations like playing video may not perform as expected.

Image size – when uploading, the image size must be a whole multiple of megabytes or the upload will fail.

Remote Desktop dependencies – Remote Desktop Session Host role and the Desktop Experience feature must be installed… however, Remote Desktop Connection Broker role must not be installed.

SYSPREP – there are specific parameters documented that need to be used. For example, do not use the “/mode:vm” parameter.
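The image-size consideration above is easy to verify before starting a long upload. A quick pre-flight check, assuming my reading of the requirement is right (the image size must be a whole number of megabytes):

```python
MB = 1024 * 1024

def vhd_size_ok(size_bytes: int) -> bool:
    """The uploaded image size must be a whole number of megabytes."""
    return size_bytes > 0 and size_bytes % MB == 0

# A 30 GB image passes; the same image with one stray byte would fail the upload.
ok = vhd_size_ok(30 * 1024 * MB)
bad = vhd_size_ok(30 * 1024 * MB + 1)
```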

Conclusion

Azure RemoteApp can be a great way to provide easy access to applications for users in any enterprise. Hosting the infrastructure in Azure lets you take advantage of Azure’s scalability and reliability with less effort than maintaining a VM environment on-prem. Apps published through RemoteApp should be chosen carefully to ensure they don’t suffer major performance drawbacks from RDP access to the cloud. In many ways this is comparable to some Citrix offerings, although it shouldn’t be confused with Citrix and is an entirely different platform. It’s also worth noting that many people assume “cloud services” are simply websites and web services; this is a great example of capabilities far beyond simple web sites, and it gives companies a relatively easy way to host an app in the cloud with little to no code changes (depending on the design of the application).

Azure: Did You Know? Azure RemoteApp, Access Apps from Any Device was first posted on July 16, 2015 at 11:57 pm.

Azure Active Directory Reporting APIs Now Available

One of the most common things we hear from our clients is their need to automatically access security-related reports from Azure AD. With last week’s announcement from Microsoft about the AAD Reporting APIs Public Preview, we now have that capability.

Azure AD already has a robust set of activity, security and audit reports, with some of the most useful provided within Azure AD Premium, and they can all easily be viewed within the Azure portal. With the new APIs, we can now programmatically access that data via “any tool or programming language which supports REST APIs with OAuth” and integrate it into a custom dashboard, Power BI/Excel, or your favorite SIEM solution.
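Because it’s plain REST plus OAuth, pulling a report can be as simple as one authenticated GET. A sketch of assembling such a call — the tenant name is a placeholder, and the exact path and api-version reflect my reading of the preview announcement, so treat the URL shape as an assumption:

```python
def reporting_url(tenant: str, report: str, api_version: str = "beta") -> str:
    """Build the reporting endpoint URL for a tenant and report name."""
    return (f"https://graph.windows.net/{tenant}"
            f"/reports/{report}?api-version={api_version}")

def auth_headers(token: str) -> dict:
    """OAuth bearer header required on every reporting API call."""
    return {"Authorization": f"Bearer {token}"}

url = reporting_url("contoso.onmicrosoft.com", "auditEvents")
```

Any tool that can issue this request (PowerShell, Excel/Power BI, a SIEM connector) can consume the same data.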

Azure Active Directory Reporting APIs Now Available was first posted on July 15, 2015 at 8:45 am.

MSFT US Partner & Industry Team Partner of the Year: Perficient!

I am absolutely thrilled to share that Perficient has been named Microsoft EPG United States Partner and Industry Team Partner of the Year for 2015. This is a huge honor, and we are incredibly thankful to Microsoft, our customers and our partners for sharing in this recognition and working so closely with us this past year.

The newly created award recognizes Perficient for sweeping Microsoft’s regional Enterprise Platform Group (EPG) honors. In addition to the national award, Perficient was named Microsoft’s East Region National Solution Provider (NSP) Partner of the Year, Central Region NSP Partner of the Year and West Region NSP Partner of the Year.

“Perficient’s ability to deliver high-quality digital experiences, business optimization and industry solutions for our largest and most critical customers in the enterprise customer segment is what won them National Solution Provider Partner of the Year across East, Central and West regions in U.S. EPG,” said Rich Figer, Senior Director, U.S. Enterprise Partner Group Sales at Microsoft. “We are also honored to name Perficient the overall U.S. EPG Partner and Industry Team Partner of the Year in 2015 based on unanimous confidence across our partner sales executive teams in all three regions across the United States. They consistently deliver cloud-optimized solutions across our platforms and productivity services with complete customer and Microsoft sales executive satisfaction.”

Perficient has activated over 2.5 million Office 365 seats and has 26 Azure certified consultants, more than any other NSP.

“Microsoft’s enterprise offerings have grown increasingly cloud-based as companies move to adopt this innovative, efficient and secure technology,” said Mike Gersten, Vice President of Perficient’s Microsoft national business group. “Cloud computing lowers operating costs and provides agility and scalability options unavailable on limited legacy infrastructure. We are honored to receive these partner awards, which reflect the strength of Perficient’s Microsoft cloud consultation and delivery expertise at work across the country.”

While at Microsoft’s Worldwide Partner Conference (WPC), it’s also been exciting to see the video highlighting how Perficient and Microsoft have worked together to empower our customers, and in this case, Partners In Health, to achieve more through technology. Dave Mayo, CIO at Partners In Health, spoke of the value of our partnership:

“Perficient’s proven track record in the health care industry, its Microsoft cloud services expertise, and its recognition as a top U.S. Microsoft partner all stood out prominently when we were looking at technology solutions providers,” said Dave Mayo, CIO of Partners In Health. “Thanks to our partnership with Perficient, we have adopted a consolidated, reliable platform for colleague interactions that enables us to more effectively serve the world’s most vulnerable. Perficient’s generosity and support of our efforts has earned our profound gratitude.”

Thanks to our entire team that helped to make this happen. What I love most about working here at Perficient is the people. Our team is second to none, and every single day, I’m amazed by the talent and passion they possess. Sweeping the nation with all three regional awards for Microsoft EPG PTU Partner of the Year is proof of their commitment to helping our customers scale operations and remain agile with Microsoft solutions. Well done, team! I can’t wait to see what we can accomplish together in the coming year.

MSFT US Partner & Industry Team Partner of the Year: Perficient! was first posted on July 14, 2015 at 10:51 am.]]>http://blogs.perficient.com/microsoft/2015/07/msft-us-partner-industry-team-partner-of-the-year-perficient/feed/0Microsoft HDInsight Joins the BAA for HIPAA Compliancehttp://blogs.perficient.com/microsoft/2015/07/microsoft-hdinsight-joins-the-baa-for-hipaa-compliance/
For those who would like to take advantage of HDInsight — Microsoft’s Azure-based Hadoop service — but have had concerns regarding compliance and security, this news is for you.

Microsoft has added HDInsight to its HIPAA Business Associate Agreement (BAA), removing a compliance barrier for healthcare workloads. This is of particular interest when looking slightly forward, at the frontier of IoT (Internet of Things) applications in the healthcare analytics realm. The cloud, particularly for Microsoft, weighs heavily in the general story around these “advanced analytics” solutions. HDInsight (actually a Hortonworks distribution of open source Hadoop tweaked to run in Azure) offers as its chief advantage a supported Hadoop capability that integrates easily into Windows/SQL Server/Azure environments. It’s an obvious choice for Big Data storage and analysis if you are already working in the Azure ecosystem. Couple that with the elastic compute and storage capability of Azure, and HDInsight is quite compelling for Microsoft-centric IT organizations looking to expand into the cloud.

In healthcare, regulatory pressure and consumer competition are pushing the industry down the digital experience path, and the cloud will increasingly become part of solutions designed to improve customer experience, increase patient safety, and reduce cost. Microsoft Azure is going to become more and more of a sensible option for healthcare organizations looking to do those things.

Microsoft HDInsight Joins the BAA for HIPAA Compliance was first posted on July 13, 2015 at 12:06 pm.

Office 365 – How to Handle Departed Users (Part 2 of 2)
Office 365 – How to Handle Departed Users (Part 2 of 2) was first posted on July 13, 2015 at 10:00 am.

As a result of a decision made by either the employee or the employer, users will inevitably leave your organization. Whether you call these departures “separations”, “terminations” or “offboarding”, the impact to IT is the same: network access needs to be secured and the user’s data needs to be addressed.

When using cloud services such as Office 365, there are additional aspects to consider which will make your process different from an on-premises scenario. There may be a licensing impact, which can equate to costs, and you are dependent upon another party (Microsoft) for the disposal of data.

In this two-part series, I cover some of the ways to handle Office 365 data for users that have left your organization. Part 1 of this series covered how to handle the user’s mailbox in Exchange Online. This article, part 2, covers how to handle the user’s OneDrive for Business data.

As in the previous article: if the Active Directory account is left in place or merely disabled, nothing happens. However, if the Active Directory account is deleted or removed from the DirSync scope, the timer on the OneDrive data removal begins.
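That trigger condition can be expressed as a tiny predicate. This is purely illustrative (a model of the documented behavior, not a Microsoft API):

```python
def cleanup_timer_starts(account_deleted: bool,
                         removed_from_sync_scope: bool,
                         account_disabled: bool) -> bool:
    """Model of when the OneDrive removal timer begins.

    Disabling an account alone does nothing; deleting it or removing
    it from the DirSync scope is what starts the clock. Note that
    account_disabled intentionally has no effect on the result.
    """
    return account_deleted or removed_from_sync_scope

# A disabled-but-present account does not start the timer.
print(cleanup_timer_starts(False, False, True))  # False
```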

Advanced Preparation

Part of this process involves some up-front preparation. Notification regarding data removal in OneDrive for Business depends upon the user’s “Manager” attribute being populated in Active Directory. If that value is not populated on the deleted user account, there is a failover to the SharePoint Online “Secondary Owner”, assuming one has been assigned. You can populate the “Secondary Owner” by opening the SharePoint Online Admin Center, going to “User Profiles” and then “Setup My Sites”. From there, you will see an option to assign the “Secondary Owner” in the “My Site Cleanup” section.

If the user’s manager cannot be determined, the assigned “Secondary Owner” will receive the notifications that the manager would have received.
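The failover logic amounts to a simple preference order, which can be sketched as follows (illustrative only; the attribute values here are hypothetical):

```python
def pick_notification_recipient(manager, secondary_owner):
    """Return who receives the OneDrive cleanup notifications.

    Mirrors the documented behavior: the user's AD "Manager" attribute
    is preferred; the SharePoint Online "Secondary Owner" is the
    failover; if neither is populated, no one is notified.
    """
    if manager:
        return manager
    if secondary_owner:
        return secondary_owner
    return None

# The manager wins when populated; otherwise the secondary owner.
print(pick_notification_recipient("manager@contoso.com", "admin@contoso.com"))  # manager@contoso.com
print(pick_notification_recipient(None, "admin@contoso.com"))                   # admin@contoso.com
```

The `None` case is exactly why assigning a Secondary Owner up front matters: without it, and without a Manager attribute, nobody is told the data is about to be deleted.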

User Deletion

When a OneDrive for Business user is deleted, the “My Site Clean Up” timer job will eventually run. At that time, the SharePoint Online profile is marked for deletion and an email notification is sent to the user’s manager (or the secondary owner) stating that they have been granted access and that the site will be deleted after 30 days.

A second email is sent when 3 days remain, stating the same, and at the conclusion of the 30 days the data is deleted.

In Pictures…

The above process is probably best shown in pictures, so I’ve put together a flow chart that hopefully helps illustrate it. Click the image to see a larger version.

What To Do With The Data

This is where there really aren’t a lot of great options just yet. The manager or secondary owner has access to the OneDrive for Business data, but moving it out of the deleted user’s profile is not easy. You can access the OneDrive for Business site via the URL in the notification email, but the browser doesn’t let you do much with more than one file at a time. You can click the “Sync” link to configure the sync client for that site; you then have access to the files via Explorer and can copy them where you please. Alternatively, there are third-party migration tools that can be used to migrate the data. Asking the user’s manager to preserve the data and move it with the access they’ve been granted is asking a bit much in many cases; hopefully we’ll see some improvements here in the future.
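Once the site has been synced locally via the “Sync” link, copying everything off to an archive location is the easy part. A minimal sketch (the folder paths and username are hypothetical, and the local sync path will depend on your tenant):

```python
import shutil
from pathlib import Path

def archive_onedrive(synced_folder: str, archive_root: str, username: str) -> Path:
    """Copy a locally synced OneDrive for Business folder into a
    per-user archive directory, preserving the folder structure."""
    dest = Path(archive_root) / username
    # dirs_exist_ok lets the copy be re-run into an existing archive folder.
    shutil.copytree(synced_folder, dest, dirs_exist_ok=True)
    return dest

# Hypothetical example:
# archive_onedrive(r"C:\Users\manager\OneDrive - Contoso\jdoe",
#                  r"\\fileserver\departed-users", "jdoe")
```

Whatever you copy with, do it before the 30-day window closes; after that, the source is gone.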

Summary

You should configure the “Secondary Owner” in SharePoint Online (go do it now!)

Preservation of the data is, by default, a task delegated to the departed user’s manager (or the secondary owner)

Did you find this article helpful?

Leave a comment below or follow me on Twitter (@JoePalarchio) for additional posts and information on Office 365.