Category Archives: Technologies


I recently gave a talk at the .NET Developer Conference (http://www.dotnet-developer-conference.de/) about microservices and scalable backends with Azure Service Fabric, as well as a transition path from a legacy application to a scalable application.

Another sensation on the topic of Microsoft and Linux: Microsoft has joined the Linux Foundation and will be a board member there in the future! At the same time, Google has become a member of the .NET Foundation!

Microsoft has unveiled Visual Studio for Mac. The new IDE is not a port of the existing Visual Studio, which runs only on Windows, but an extended version of Xamarin Studio, which Microsoft acquired in early 2016 along with Xamarin and which in turn is based on the free MonoDevelop. In line with Microsoft's "mobile first, cloud first" strategy, the new IDE initially supports only the development of Xamarin apps for iOS and Android, as well as web applications and REST-based web services with .NET Core. Developers can choose between C# and F# as programming languages.

Visual Studio "15" is now called Visual Studio 2017 and is available as a release candidate.

Visual Studio 2017 and Visual Studio for Mac now offer a graphical preview for Xamarin.Forms. Until now, developers could write XAML tags in the editor but only saw the result at run time. Graphical design with the mouse, which developers of Windows Presentation Foundation (WPF), Silverlight, or Windows universal apps are used to, is still not possible for a Xamarin.Forms interface.

In Entity Framework Core 1.1, Microsoft brings back some programming features from its predecessor, Entity Framework 6.1, that were missing in Entity Framework Core 1.0. These include searching for objects in the cache with Find(), explicitly loading related objects with the Load() method, refreshing previously loaded objects with Reload(), and querying object state, old and new, with GetDatabaseValues() and GetModifiedProperties(). Mapping to plain fields instead of properties and the recovery of lost database connections (connection resiliency) for Microsoft SQL Server or SQL Azure are also possible again. Brand new is the ability to map objects to the memory-optimized tables of Microsoft SQL Server 2016. In addition, the makers have simplified the API with which the standard functions of Entity Framework Core can be replaced by custom implementations.

.NET Core 1.1 adds Linux Mint 18, OpenSUSE 42.1, macOS 10.12, and Windows Server 2016 as supported operating systems. Samsung also delivers .NET Core for Tizen. A further step toward cross-platform availability of Microsoft products is .NET Core for the Tizen operating system, which Samsung, a member of the .NET Foundation since June 2016, has developed together with corresponding Visual Studio tools.

Team Foundation Server "15" is now Team Foundation Server 2017, receiving the same version number as Visual Studio. It had previously reached the Release Candidate 2 stage and is now available as an RTM version.

The new Visual Studio Mobile Center is a cloud application that automatically builds source code for iOS and Android apps hosted on GitHub at each commit, tests it on real hardware, and distributes successfully tested app packages to beta testers. Usage analytics and run-time crash reporting are also possible. The supported programming languages are Swift, Objective-C, Java, and C# (Xamarin). Support for Cordova and Windows universal apps is planned.

SQL Server for Linux: in March, Microsoft announced that it would also offer SQL Server for Linux in the future. A preview version of the database server is now available for Red Hat Enterprise Linux, Ubuntu Linux, macOS, and Windows, as well as Docker. A version for SUSE Linux Enterprise Server is to follow soon. The latest release is called SQL Server Community Technology Preview vNext and represents an evolution of SQL Server 2016, with new features beyond platform independence that were shown in the keynote but not previously listed on the website.

For SQL Server 2016, Service Pack 1 is now available; in addition to bug fixes, it contains significant improvements to the licensing model.

For Mac developers, Xamarin Studio is now available as a benefit of Visual Studio Professional or Enterprise subscriptions. Developers can use the newly created Xamarin Studio Community Edition for free.

Another big announcement is that the Mono Project has been added to the .NET Foundation, including some previously proprietary mobile-specific improvements to the Mono runtime. Mono will also be re-released under the MIT License, to enable an even broader set of uses for everyone. More details are on the Mono Project blog.

The changes to Mono remove all barriers to adopting a modern, high-performance .NET runtime in any software product, embedded device, or engine, and open the door to easily integrating C# with apps and games on iOS, Android, Mac, Windows, and any emerging platforms developers want to target in the future.

Use Team Foundation Server and Visual Studio to increase productivity and transparency in your application, and to increase the rate at which you can ship high-quality software throughout the application lifecycle.

The Scaled Agile Framework, or SAFe, is popular among organizations looking to scale Agile practices to the enterprise level. SAFe is a comprehensive framework, covering practices from portfolio level planning to release planning to coding practices.

While TFS does not provide full support for all SAFe practices, it can be used to implement many of the planning practices. This whitepaper also provides practical guidance on how to implement SAFe practices using TFS. It covers the following topics:

The first two sections are conceptual and provide a quick overview of how TFS supports SAFe. The last two sections are guidance and provide detailed steps for the TFS administrator to configure and customize TFS to support SAFe.

SAFe supports a portfolio view of multiple agile teams. SAFe illustrates how a portfolio vision is met by a hierarchy of teams, all of whom have their own specific objectives. This framework breaks down Epics into Features and Stories, which teams work on in Sprints and deliver through Program Increments (PIs) and Release Trains. Also, the portfolio backlog can track how deliverables map to Strategic Themes and associated budgets.

The examples in this paper illustrate how to add the Epic WIT and backlog, configure a three-level team hierarchy, and map teams to their respective area and iteration paths. The examples build from the TFS Agile process template. However, the changes can be applied to any TFS process template.

Because TFS supports a hierarchical team structure, each team has its own view of their work which rolls up to the next level within the team hierarchy.

The section "Customize TFS process to support SAFe" details the changes to the Scrum, Agile, and CMMI process templates that enable SAFe support. The goal is not to create a SAFe process template, but to modify the existing process templates to enable SAFe practices. These changes are minimal and don't encumber teams who choose not to use SAFe.

You now have the following options to update the templates to include these changes:

You can download the standard Scrum, Agile, CMMI process templates with changes for SAFe here.

If you have customized process templates, you can follow the instructions in the guidance. Additionally, this blog post shows how to automate the process with PowerShell.

Here are some tips on how to migrate file shares to SharePoint and use OneDrive for Business (ODFB, formerly SkyDrive), if you are planning to migrate file share content into SharePoint and want to make use of ODFB for synchronizing the SharePoint content offline.

Note: these steps are valid for both SharePoint 2013 on-premises and SharePoint Online (SPO).

First Step – Analyze your File Shares

As a first step, try to understand the data that resides on the file shares. Ask yourself the following questions:

What is the total size of the file share data that the customer wants to migrate?

How many files are there in total?

What are the largest file sizes?

How deep are the folder structures nested?

Is there any content that is not being used anymore?

What file types are there?
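As a sketch of how you might gather these numbers automatically, here is a small Python script that walks a file share and collects the totals; the path you pass in and the shape of the result are placeholders for illustration, not part of the original guidance:

```python
import os

def analyze_share(root):
    """Walk a file share and collect the numbers the questions above ask for."""
    total_size = 0
    file_count = 0
    largest = ("", 0)          # (path, size) of the largest file
    max_depth = 0              # deepest folder nesting below root
    extensions = {}            # file type -> count
    for dirpath, dirnames, filenames in os.walk(root):
        depth = dirpath[len(root):].count(os.sep)
        max_depth = max(max_depth, depth)
        for name in filenames:
            path = os.path.join(dirpath, name)
            size = os.path.getsize(path)
            total_size += size
            file_count += 1
            if size > largest[1]:
                largest = (path, size)
            ext = os.path.splitext(name)[1].lower()
            extensions[ext] = extensions.get(ext, 0) + 1
    return {"total_bytes": total_size, "files": file_count,
            "largest": largest, "max_depth": max_depth,
            "extensions": extensions}
```

Running this against the share root gives you the raw figures to discuss with the customer before any migration planning starts.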

Let me try to explain why you should ask yourself these questions.

Total Size

If the total size of the file shares is more than the storage capacity you have in SharePoint, you need to buy additional storage (SPO) or increase your disk capacity (on-premises). To determine how much storage you will have in SPO, check the total available tenant storage in the tables in this article. Another issue that may arise is that you reach the capacity limit per site collection. For SPO that is 1000 gigabytes (changed from 100 GB to 1 TB); for on-premises, the recommended size per site collection is still around 200 gigabytes.

What if we have more than 1000 gigabytes?

Try to divide the file share content over multiple site collections when it concerns content which needs to be shared with others.

If certain content is just for personal use, try to migrate that specific content into the personal site of the user.
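As a back-of-the-envelope check against the site collection limit mentioned above, here is a trivial sketch; it assumes the content can be divided freely across site collections:

```python
import math

def site_collections_needed(total_gb, limit_gb=1000):
    """Minimum number of site collections for the given content size,
    assuming content can be split freely (SPO limit: 1000 GB each)."""
    return math.ceil(total_gb / limit_gb)
```

For example, 2.5 TB of shared content needs at least three SPO site collections; for on-premises you would use the recommended 200 GB as the limit instead.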

How Many Files

The total number of files on the file shares is important, as there are some limits in both SharePoint and ODFB that can result in an unusable library or list within SharePoint; you might also end up with missing files when using the ODFB client.

First, in SPO we have a fixed limit of 5000 items per view, folder, or query. The reasoning behind this 5000-item limit boils down to how SQL Server works under the hood. If you would like to know more about it, please read this article. On-premises there is a way to raise this limit, but it is not something we recommend, as performance can decrease significantly when you increase it.

There is also a limit of 5 million items within a document library, but I guess that most customers in SMB won't reach that limit very easily.

What should I do if my data that I want to migrate to a document library contains more than 5000 items in one folder?

Try to divide that amount over multiple subfolders or create additional views that will limit the amount of documents displayed.
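A minimal sketch of the subfolder split, assuming files can simply be redistributed into numbered folders (the "Part" naming is a hypothetical convention for illustration):

```python
def plan_subfolders(filenames, max_per_folder=5000):
    """Assign a flat list of files to numbered subfolders so that
    no folder exceeds the SharePoint view/sync limit."""
    plan = {}
    for i, name in enumerate(sorted(filenames)):
        folder = "Part{:02d}".format(i // max_per_folder + 1)
        plan.setdefault(folder, []).append(name)
    return plan
```

The same idea applies whether you physically move the files before migrating or let a migration tool create the target folders.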

But wait! If I already have 5000 items in one folder, doesn't that mean that the rest of the documents won't get synchronized when I use ODFB?

Yes, that is correct. So if you would like to use ODFB to synchronize documents offline, make sure that the total number of documents per library in a team site does not exceed 5000.

How do I fix that limit?

Look at the folder structure of the file share content and see if you can divide that data across multiple sites and/or libraries. If there is a Marketing folder, for example, it might make more sense to migrate that data into a separate site anyway, as this department probably wants to store additional information besides documents (e.g. a calendar, general info about the marketing team, a site mailbox, etc.). An additional benefit of spreading the data over multiple sites/libraries is that it gives ODFB users more granularity over which data they take offline. If you migrated everything into one big document library (not recommended), all users would need to synchronize everything, which can have a severe impact on your network bandwidth.

Largest File Sizes

Another limit that exists in both SPO and on-premises is the maximum file size: for both, the maximum size per file is 2 gigabytes. On-premises the default is 250 MB, but it can be increased to a maximum of 2 gigabytes.

So, what if I have files that exceed this size?

Well, it won't fit in SharePoint, so you can't migrate these files. See what type of files they are and determine what they are used for in the organization. Examples could be software distribution images, large media files, training courses, or other materials. If these are still being used and not highly confidential, it is not a bad thing to keep them on alternative storage like a SAN, NAS, or DVDs. If it concerns data that just needs to be kept for legal reasons and doesn't need to be retrieved instantly, you might put it on DVD or an external hard drive and store it in a safe, for example.

Folder Structures

Another important aspect to look at on your file shares is the depth of nested folders and the length of file names. The recommended total length of a URL in SharePoint is around 260 characters. You would think that 260 characters is pretty lengthy, but remember that URLs in SharePoint often have encoding applied to them, which takes up additional space. E.g. a space is one character, but percent-encoded it becomes %20, which takes up three characters. The problem is that you can run into issues when the URL becomes too long. More details about the exact limits can be found here, but as a best practice try to keep the URL length of a document under 260 characters.
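To illustrate the encoding effect on URL length, here is a small Python check; the contoso URL is a made-up example, and treating "/" and ":" as safe characters is an assumption for illustration:

```python
from urllib.parse import quote

def encoded_url_length(url):
    """Length of a SharePoint URL after percent-encoding:
    a space is one character on disk but becomes %20 (three characters)."""
    return len(quote(url, safe="/:"))

def within_limit(url, limit=260):
    """Best-practice check against the ~260-character recommendation."""
    return encoded_url_length(url) <= limit
```

Each space in a path adds two characters to the encoded URL, so deeply nested folders with long, space-filled names eat into the budget quickly.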

What if I have files that will have more than 260 characters in total URL length?

Make sure you keep your site URLs short (the site title can be long, though). E.g. don't call the URL Human Resources, but call it HR. If you land on the site, you will still see the full name Human Resources, as site title and URL are separate things in SharePoint.

Shorten the document name (e.g. strip off "…v.1.2" or "…modified by Andre"), as SharePoint has versioning built in. More information about versioning can be found here.

Idle Content

Migrating file shares into SharePoint is often also a good moment to clean up some of the information that the organization has been collecting over the years. If you find there is a lot of content that has not been accessed for a couple of years, what would be the point of migrating that data to SharePoint?

So, what should I do when I come across such content?

Discuss this with the customer and determine if it is really necessary to keep this data.

If the data cannot be purged, you might consider storing it on a DVD or external hard drive and keep it in a safe.

If the content has multiple versions, such as proposal 1.0.docx, proposal 1.1.docx, proposal final.docx, or proposal modified by Andre.docx, you might consider moving just the latest version instead of migrating them all. This manual process might be time-consuming, but it can save you lots of storage space in SharePoint. Versioning is also built into SharePoint and is optimized to store multiple versions of the same document: SharePoint stores only the delta from the previous version, saving storage space that way. This functionality is called Shredded Storage.
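A sketch of picking only the latest version per document, assuming the hypothetical naming convention from the examples above ("name major.minor.ext", with "final" treated as the newest); files that don't match the pattern are kept as-is:

```python
import re

# Hypothetical convention: "<base> <major>.<minor>.<ext>" or "<base> final.<ext>".
VERSION_RE = re.compile(r"^(?P<base>.+?) (?P<ver>\d+\.\d+|final)\.(?P<ext>\w+)$")

def pick_latest(filenames):
    """Return the single file per base name worth migrating."""
    groups = {}
    for name in filenames:
        m = VERSION_RE.match(name)
        if not m:
            # No recognizable version suffix: keep the file unchanged.
            groups.setdefault((name, ""), []).append((float("inf"), name))
            continue
        key = (m.group("base"), m.group("ext"))
        ver = float("inf") if m.group("ver") == "final" else float(m.group("ver"))
        groups.setdefault(key, []).append((ver, name))
    return sorted(max(g)[1] for g in groups.values())
```

This only automates the selection; reviewing the result with the content owners before deleting anything is still advisable.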

Types of Files

Determine what kind of files the customer has. Are they mainly Office documents? If so, SharePoint is the best place to store such content. However, if you come across developer code, for example, it is not a good idea to move that into SharePoint. There are also file extensions that are not allowed in SPO and/or on-premises. A complete list of blocked file types for both SPO and on-premises can be found here.

What if I come across such blocked file extensions?

Well, you can't move them into SharePoint, so you should ask yourself: do I still need these files? And if so, is there an alternative storage facility, such as a NAS, I can store them on? If it concerns developer code, you might want to store it on a Team Foundation Server instead.
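A quick pre-migration filter could look like this; the extension set is an illustrative subset only, not the authoritative blocked-file-type list referenced above:

```python
import os

# Illustrative subset; the full blocked list differs between SPO and on-premises.
BLOCKED_EXTENSIONS = {".exe", ".dll", ".bat", ".cmd", ".vbs"}

def split_migratable(filenames):
    """Separate files that can move to SharePoint from blocked ones."""
    ok, blocked = [], []
    for name in filenames:
        ext = os.path.splitext(name)[1].lower()
        (blocked if ext in BLOCKED_EXTENSIONS else ok).append(name)
    return ok, blocked
```

Run the blocked list past the customer: anything on it either stays on alternative storage or moves to a more suitable system, such as source control for code.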

Tools for analyzing and fixing file share data

In order to determine whether you have large files or exceed the 5000-item limit, for example, you need some kind of tooling. There are a couple of approaches here.

There is a PowerShell script, polished up by Hans Brender, which checks for blocked file types, bad characters in files and folders, and the maximum URL length. The script can even fix invalid characters and file extensions for you. It is a great script, but it requires some knowledge of PowerShell. Another alternative I was pointed at is a tool called SharePrep, which scans for URL length and invalid characters.

There are other third-party tools that can scan your file share content, such as TreeSize. Such tools do not necessarily check for the SharePoint limitations discussed in the earlier paragraphs, but at least they will give you much more insight into the size of the file share content.

Finally, there are third-party migration tools that move the file share content into SharePoint and check for invalid characters, extensions, and URL length upfront. We will dig into these tools in the second step, migrating your data.

Second Step – Migrating your data

So, now that we have analyzed our file share content, it is time to move them into SharePoint. There are a couple of approaches here.

Document Library Open with Explorer

If you are in a document library, you can open up the library in Windows Explorer and simply copy and paste files into SharePoint.

There are some drawbacks to this scenario. First of all, I've seen lots of issues trying to open up a library in Windows Explorer. Secondly, the technology used for copying the data into SharePoint is not very reliable, so keep that in mind when copying larger chunks of data. Finally, there is also drag & drop, but it is limited to files (no folders) and a maximum of 100 files per drag. So if you have 1000 files, you need to drag them in 10 chunks. More information can be found in this article. Checking for invalid characters, extensions, and URL length upfront is also not addressed when using the Open with Explorer method.

OneDrive (formerly SkyDrive) for Business

You could also use ODFB to upload the data into a library. This is fine as long as you don't sync more than 5000 items per library. Remember, though, that ODFB is a sync tool, not a migration tool, so it is not optimized for copying large chunks of data into SharePoint. Things like character and file type restrictions, path length, etc. are on the ODFB team's list to address, but are currently not handled.

The main drawback of using either the Open with Explorer option or ODFB is that these tools don't preserve the metadata of the files and folders on the file shares. By this I mean that fields like modified date or owner are not migrated into SharePoint: the owner becomes the user who copies the data, and the modified date becomes the timestamp of when the copy operation was executed. So if this metadata on the file shares is important, don't use either of the methods mentioned earlier; use one of the third-party tools below.

Pros: Free, easy to use, works fine for smaller amounts of data (max 5000 per team site library or 20000 per personal site)

Some tools focus on SMB, while others focus more on the enterprise segment. We can't express a preference for one tool or another, but most tools have a free trial version available, so you can try them out yourself.

Summary

When should I use what approach?

Here is a short summary of capabilities:

| | Open in Explorer | OneDrive for Business (with latest update) | 3rd party |
|---|---|---|---|
| Amount of data | Relatively small | No more than 5000 items per library | Larger data sets |
| Invalid character detection | No | No | Mostly yes¹ |
| URL length detection | No | No | Mostly yes¹ |
| Metadata preservation | No | No | Mostly yes¹ |
| Blocked file types detection | No | No | Mostly yes¹ |

¹ This depends on the capabilities of the 3rd party tool.

Troubleshooting

ODFB gives me issues when synchronizing data. Check whether you have the latest version of ODFB installed. There have been stability issues in earlier builds of the tool, but most of the issues should be fixed by now. You can check whether you are running the latest version by opening Word -> File -> Account and clicking Update Options -> View Updates. If your current version number is lower than the latest one, click the Disable Updates button (click Yes if prompted), then click Enable Updates (click Yes if prompted). This forces a download of the latest version of Office and thus the latest version of the ODFB tool.

If you are running the stand-alone version of ODFB, make sure you have downloaded the latest version from here.

Why is the upload process taking so long? This really depends on a lot of things. It can depend on:

The method or tool that is used to upload the data

The available bandwidth for uploading the data. Tips:

Check your upload speed at http://www.speedtest.net and do a test for your nearest Office 365 data center. This will give you an indication of the maximum upload speed.

Often companies have less available upload bandwidth than people at home. If you have the chance, uploading from a home location might be faster.

Schedule the upload at times when more bandwidth is available (usually at night).

Test your upload speed upfront by uploading around 1% of the data. Multiply the time by 100 and you have a rough estimate of the total upload time.

The computers used for uploading the data. A slow laptop can become a bottleneck while uploading.
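The 1% rule of thumb above amounts to a simple extrapolation; the sample figures below are made up for illustration:

```python
def estimate_upload_hours(sample_bytes, sample_seconds, total_bytes):
    """Extrapolate total upload time from a small test upload,
    assuming the measured rate stays roughly constant."""
    rate = sample_bytes / sample_seconds   # bytes per second
    return total_bytes / rate / 3600       # hours

# Example: a 1 GB sample took 30 minutes; total content is 100 GB.
hours = estimate_upload_hours(1 * 1024**3, 30 * 60, 100 * 1024**3)
```

In this example the estimate comes out at 50 hours, which is exactly why scheduling uploads overnight and spreading them across machines matters.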

Visual Studio 2012 only appeared in its final version last September, and Visual Studio 2012 Update 2 (VS 2012.2) is already available as a CTP following Update 1. Tonight at 5 pm, Dariusz Parys and Christian Binder present their personal highlights of the second update in a live stream. There is also the opportunity to ask questions in the live chat.

August 15th: IT professionals testing Windows 8 in organizations will be able to access the final version of Windows 8 through their TechNet subscriptions.

August 16th: Education institutions with existing Microsoft Software Assurance for Windows will be able to download Windows 8 Enterprise edition through the Volume License Service Center (VLSC), allowing you to test, pilot and begin adopting Windows 8 Enterprise within your organization.

August 16th: Microsoft Partner Network members will have access to Windows 8.

On September 4. That’s when Windows Server 2012 will be generally available for evaluation and download by all customers around the world. On that day we will also host an online launch event where our executives, engineers, customers and partners will share more about how Windows Server 2012 can help organizations of all sizes realize the benefits of what we call the Cloud OS. You will be able to learn more about the features and capabilities and connect with experts and peers. You’ll also be able to collect points along the way for the chance to win some amazing prizes. You don’t want to miss it. Visit this site to save the date for the launch event.

Each TFS component maintains its own set of transaction databases. This includes work items, source control, tests, bugs, and Team Build. This data is aggregated into a relational database. The data is then placed in an Online Analytical Processing (OLAP) cube to support trend-based reporting and more advanced data analysis.

The TfsWarehouse relational database is a data warehouse designed to be used for data querying rather than transactions. Data is transferred from the various TFS databases, which are optimized for transaction processing, into this warehouse for reporting purposes. The warehouse is not the primary reporting store, but you can use it to build reports. The TfsReportDS data source points to the relational database. The Team System Data Warehouse OLAP Cube is an OLAP database that is accessed through SQL Server Analysis Services. The cube is useful for reports that provide data analysis of trends such as ‘how many bugs closed this month versus last month?’ The TfsOlapReportDS data source points to the Team System Data Warehouse OLAP cube in the analysis services database.

10 Steps to troubleshoot TFS Reporting

1. On the TFS Application tier server, open an Administrative Command Prompt

2. Run the following command: Net Stop TFSJobAgent

3. Once this completes, run the following command to restart the TFSJobAgent: Net Start TFSJobAgent

4. Open the TFS Administration console, and select the Reporting Node

5. Click the Start Rebuild link to rebuild the warehouse. Refresh this page until it displays “Configured and Jobs Enabled”

6. Open a web browser and navigate to the warehousecontrolservice.asmx page at:

Refresh TFS Warehouse, Cube and Reports on demand

By default, TFS will process its Data Warehouse and Analysis Services Cube (and thus update the data for the reports) every 2 hours. Be careful with changing it to values lower than every hour:

Important

If you reduce the interval to less than the default of two hours (7200 seconds), processing of the data warehouse will consume server resources more frequently. Depending on the volume of data that your deployment has to process, you may want to reduce the interval to one hour (3600 seconds) or increase it to more than two hours. [Source: MSDN]

Alternatively you can use this small command line utility from Neno Loje:

Below are a few issues which I guess one would run into on their first usage of TFS & Team Explorer; some of them are fixed in TFS 2010 and some others in TFS 11.

1) Permanently deleting dummy projects: after playing around for a while, there will be a few dummy team projects created. By default TFS uses a soft delete. For a permanent (hard) delete, one can use the tf command line utility with the destroy option.

Note: if you have already deleted a project, you need to undelete it and check in pending changes, since destroy doesn't work on deleted projects. Also, the folder you are trying to delete should be mapped to a workspace (File -> Source Control -> Workspaces…).

2) Logging in as a different user: by default VS.NET asks for credentials to connect to TFS every time you run it. You can avoid this by caching the required credentials: go to Control Panel -> User Accounts -> Manage your network passwords (left column) -> click Add and enter the required details. Once added, VS.NET won't ask you for credentials anymore.

3) Deleting a workspace: a workspace belongs to an owner (a user authenticated by TFS). Let's say you have logged in as Admin and set your working folder to C:\WorkingFolder. Now you want to log on to TFS as a local user (without admin rights) and use the same mapped path (C:\WorkingFolder). TFS will complain that Admin is already using that location, so you can't use it. In order to remove the workspace created by Admin, you again need to fall back on the tf utility.

4) Automatic check-out not working: if you go to VS.NET Tools -> Options -> Source Control -> Environment, you will see two drop-downs. One reads Editing – Check Out Automatically. This means that when you have an open project and edit files via Solution Explorer, they are checked out automatically. But sometimes this doesn't work. A possible reason: your solution is not bound to source control. To restore the bindings, click File -> Source Control -> Change Source Control (note: Source Control Explorer should be closed at this point). There you will see a list of your projects and solutions. Select them and click the Bind button on the toolbar. Things should now work as expected. When you bind the TFS project and solution, by default they will be checked out and two additional files (.vspscc and .vssscc) are created. Check in the project (.csproj) and solution (.sln) files to avoid rebinding the solution next time. There is no need to include the .vspscc and .vssscc files in TFS.

5) Unlocking a file: two steps – first find the workspace that belongs to the user, then execute the undo command, specifying the workspace, user account, and file path. Find the workspace with tf workspaces /owner:domain\userid (you get these parameters from the file lock message), then run tf undo /workspace:workspacename;domain\userid $/filePath (the filePath can be copied from File Properties in Source Control).

6) Permanently deleting a work item: there is quite a possibility that your team creates dummy work items while getting familiar with the system. At times it might be important to clean up these items so that they don't impact your charts and reports. Below is the command you can use to delete a work item permanently.