This is an exciting development, evolving the mobile-centric Xamarin Studio IDE into a true mobile-first, cloud-first development tool for .NET and C#, and bringing the Visual Studio development experience to the Mac.

I tend to agree that it is a significant piece of news. It signals Microsoft’s intent to offer first-class support for Mac developers. Other than at Microsoft events, the majority of the developers I see at conferences carry Macs rather than Windows laptops, and if the company is to have any hope of winning them over to its cross-platform ASP.NET web application framework, getting excellent development support on Macs is a critical step.

Naming things is not Microsoft’s greatest strength though. Sometimes it gives different things the same name, such as with OneDrive and OneDrive for Business, or Outlook for Windows and Outlook for iOS and Android. It makes sense from a marketing perspective, but it is also confusing.

This is another example. No, Microsoft has not ported Visual Studio to the Mac. This is a rebrand of Xamarin Studio, originally a cross-platform IDE for its C# mobile app framework, but more recently Mac-only.

Hutchinson makes the best of it:

Its UX is inspired by Visual Studio, yet designed to look and feel like a native citizen of macOS …. Below the surface, Visual Studio for Mac also has a lot in common with its siblings in the Visual Studio family. Its IntelliSense and refactoring use the Roslyn Compiler Platform; its project system and build engine use MSBuild; and its source editor supports TextMate bundles. It uses the same debugger engines for Xamarin and .NET Core apps, and the same designers for Xamarin.iOS and Xamarin.Android.

The common use of MSBuild is a key point. “Although it’s a new product and doesn’t support all of the Visual Studio project types, for those it does have in common it uses the same MSBuild solution and project format. If you have team members on macOS and Windows, or switch between the two OSes yourself, you can seamlessly share your projects across platforms,” says Hutchinson.
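For illustration, here is a minimal SDK-style project file of the kind used by .NET Core tooling. Because it is plain MSBuild, the same file can be opened and built on Windows or on the Mac (the project name and target framework here are just examples):

```xml
<!-- HelloWorld.csproj: a minimal .NET Core console app project.
     The Sdk attribute pulls in the standard build targets, so the
     same file works with MSBuild on either operating system. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp1.0</TargetFramework>
  </PropertyGroup>
</Project>
```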

The origins of what will now be Visual Studio for the Mac actually go back to the early days of the .NET Framework. Developer Mike Kruger decided to write an IDE in C# in order to work more easily with a pre-release of .NET Framework 1.0. His IDE was called SharpDevelop. Here is an early version, from 2001:

Of course by then most developers used Visual Studio to work with C#, but there were several reasons why SharpDevelop continued to have a following. Unlike Visual Studio, it was built in C# and you could get all the code. It was free. It was also of interest to Mono users, Mono being the open source implementation of the .NET Framework originated by Miguel de Icaza (also now at Microsoft). In 2003, Mono developers started work on porting SharpDevelop to run on Linux using the GNOME toolkit (Gtk#). This forked project became MonoDevelop.

Xamarin (the framework) of course has its roots in Mono and when Xamarin (the company) decided to create its own IDE it based it on MonoDevelop. So MonoDevelop evolved into Xamarin Studio.

Incidentally, SharpDevelop is still available and you can get it here. MonoDevelop is still available and you can get it here.

So now some sort of circle is complete and what began as SharpDevelop, a rebel imitation of Visual Studio, will now be an official Microsoft product called Visual Studio for the Mac – though how much SharpDevelop code remains (if any) is another matter.

Historical digression aside, the differences between Visual Studio and Visual Studio for the Mac are not the only point of confusion. There is also Visual Studio Code, an editor with some IDE features, which is cross-platform on Windows, Mac and Linux. This one is based on GitHub's Electron framework (which embeds Chromium) and has won quite a few friends.

Should Mac users now use Visual Studio Code, or Visual Studio for the Mac, for their .NET Core or ASP.NET Core development? Microsoft will say “your choice” but it is a good question. The key here is which project will now get more attention from both Microsoft and other open source contributors.

Still, we should not complain. Two rival Microsoft IDEs for the Mac are a considerable advance on none, which was the answer until Visual Studio Code went into preview in April 2015.

The Mio MiVue 688 is a high quality dashcam which will record your journeys as well as alerting you to lane drift and speed cameras.

In the box is the device itself – around 90 x 45 x 37mm – together with a vehicle power adapter and a suction mount. You will need a couple more things to get going: a Micro SD memory card (8GB to 128GB) and a USB Mini-B to Type A cable, presuming you want to connect it to a PC. It is always annoying to find that you have to buy extras, though you may have some spares anyway, and also annoying that Mio still uses the older Mini-B connector, which is relatively uncommon now.

The MiVue 688 has a rechargeable battery, though for full use you will want to keep it powered continuously with the adapter.

After charging, the first thing you will want to do is to set the date and time as well as your preferred distance measure. Being in the UK I set it to miles.

In doing so, you will get an idea of how the MiVue's controls work. There is a nice, bright colour display, but it is not touch-controlled. Instead, there are six buttons:

Power button on the left edge

Event button (for emergency recording) on the front right

Four function buttons on the right edge

The control system is not all that intuitive. By default the unit records when it is on. The function keys come into play when you go into the menu. The top key is the menu key; it displays or exits the current menu. The next key is Enter. The two lower keys are cursor keys. At first you might think that the buttons align with the menu item you want to operate, but they do not. Of course you are not intended to operate this fiddly menu system while driving.

The normal use is that recording starts as soon as the unit receives power, in other words when you start the engine. It then records continuously, creating 3-minute video files. If it runs out of space it overwrites old files.
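The loop-recording behaviour can be pictured with a small sketch (illustrative only – the clip naming and card capacity here are invented, not the device's firmware):

```python
from collections import deque

CLIP_MINUTES = 3  # the MiVue writes 3-minute video files


def record_journey(total_minutes: int, card_capacity_clips: int) -> list:
    """Simulate loop recording: when the card is full, the oldest
    clip is discarded so that recording never has to stop."""
    card = deque()
    for start in range(0, total_minutes, CLIP_MINUTES):
        clip = f"clip_{start:04d}-{start + CLIP_MINUTES:04d}.mp4"
        if len(card) == card_capacity_clips:
            card.popleft()  # overwrite the oldest file to make room
        card.append(clip)
    return list(card)


# A 30-minute journey on a card holding only 5 clips keeps the last 15 minutes:
print(record_journey(30, 5))
# → ['clip_0015-0018.mp4', 'clip_0018-0021.mp4', 'clip_0021-0024.mp4',
#    'clip_0024-0027.mp4', 'clip_0027-0030.mp4']
```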

When recording starts, the screen shows a live view of what is being captured. After a short time this blanks out and you just get the time; however, it is still recording.

The device has a Sony Exmor sensor, records 1080p video and displays on a 2.7″ screen. It has an F1.8 aperture and a 140° wide-angle lens.

The MiVue 688 in use

I tried the MiVue on a 3-hour journey on a rather damp day. The first challenge is mounting the MiVue, the main problem being getting the power cable connected without it hanging dangerously or getting in the way. I found some short lengths of gaffer tape essential, to secure the cable to the edge of the windscreen. The MiVue cable is fortunately fairly long.

I then sited the camera towards the top of the windscreen. Again, care is needed as you do not want it to obscure your view.

I found the way the device works confusing at first. In particular, I thought that when the screen changed from the live recording to the clock, that recording had stopped. It was only when I got back and connected the device to a PC that I realised the entire journey was on video. I do think this is preferable; despite the emergency button, you want the recording to happen without having to think about it.

My journey passed without incident, but having a recording, given how simple this is to achieve, does make sense. If you are the innocent party in a collision, it will provide crucial evidence. Note that it records your speed and exact location as it goes, thanks to built-in GPS. A side-effect of having a dashcam may be that you are less inclined to take chances, knowing that there will be evidence.

When we parked, I removed the MiVue, because I did not want the embarrassment of risking theft of my loan gadget. This is a dilemma, as the MiVue has a parking function that will automatically record if it detects a collision when parked. If you think someone might steal the device though, that will not help you.

Annoyances

Wiring up the MiVue all felt a bit DIY and it would be good to see provision for dashcams built into modern vehicles. I also found several nits with the MiVue:

Packaging does not make it clear that you need to supply your own memory card and USB cable – as well as gaffer tape or equivalent

Extras

On the plus side, there are a few extras. The safety camera warnings worked, though if you have SatNav of some kind you probably already have this. There is the parking function mentioned above. The speed always shows, and since this is more accurate than my in-car speedometer this is a benefit.

A camera feature lets you take still images. Could be handy after an incident.

A motion sensor kicks off a recording automatically in the event of sudden movement. This also tends to happen when you handle the unit, for example when connecting it to a PC!

There are also some Advanced Driver Assistance features. Specifically, this covers Lane Departure Warning (could be a life-saver if you fell asleep), which beeps if you drift out of your lane; and Front Collision Warning System which beeps if it thinks you are driving too close to the vehicle in front.

These are handy features, but they require regular calibration to work. You have to tell the MiVue where the horizon is and where your bonnet (hood) ends. You cannot do this while driving, so you need a passenger.

I would have thought this kind of calibration could become automatic as systems like this evolve.

MiVue Manager

You can download a MiVue Manager app to help you view your videos. I did not get on well with this. The first annoyance was that the MiVue Manager app insists on running with admin rights on Windows. Next, I found it still did not work because of missing codecs.

However, I can view the videos fine using the Windows 10 built-in app, or VLC. So I gave up on the MiVue Manager.

Conclusion

The MiVue 688 will cost you around £150 and works well. As noted above though, there are some annoyances and you might prefer a touch control unit like the 658, which is a similar price.

I am still impressed. The quality of the video is very good, and this MiVue provides significant benefit at modest cost.

Somewhere I’ve got a book, “David Bowie in his own words.” Sadly Bowie is no longer with us, and we have to make do with David Bowie in other people’s words. Here are some good ones.

This book is subtitled “A tribute to Bowie by his artistic collaborators and contemporaries,” which describes it exactly. It’s been put together by Rolling Stone writer Brian Hiatt, who conducted the interviews, and I doff my cap to him: he’s managed to ask the right people the right questions, and assemble the results into a tasteful and compelling portrait.

The first contributor is George Underwood, a schoolfriend who became an artist and contributed to some of Bowie’s album covers.

Amazingly, George Underwood left a message for Bowie on his answerphone in 1976 or thereabouts, saying “I’m happy, hope you’re happy too.” The words later turned up on Ashes to Ashes. “But I don’t know if I’m the Action Man,” writes Underwood.

Then there’s Dana Gillespie, an early singer friend for whom Bowie wrote the song Andy Warhol, though it first appeared on Hunky Dory.

And Mike Garson, who is fascinating about the tension of being a classical and jazz pianist and working with a rock musician. “There was a part of me for sure that recognized his genius … but let me tell you for sure, there was another part of me at the time that just thought, this is way below my gift and abilities.”

He later remarks, “I was the longest member in the band, when you put all the hours and tours together.”

Earl Slick writes frankly about his work with Bowie. I was interested in his remarks about recording Station to Station. “He was not as out of control as he was made out to be, in terms of his functionality. When he got his mind into something he could hyper-focus like a _. I don’t care if he was living on milk.”

He also reveals that there was nearly a tour after the Reality tour. “There were about three or four close calls where I did get phone calls, and I was put on hold to tour, but it didn’t happen.”

And later Slick writes, “there were parts of David that you could never get through.”

Carlos Alomar: “The master puppeteer actually did know what he was doing, and not only can you understand that now, but you see it play out constantly on all those albums.”

He also recounts his goodbye. “I saw David at Tony Visconti’s birthday party last year and he was very, very fragile. In hindsight, I can see what was happening … we talked about old times and it was good to talk about things, heal old wounds … now I understand it was that goodbye.”

Artist Derek Boshier tells a story about Bowie being photographed. “As we were chatting the PR person came over and said, ‘David, there’s a photographer here from Paris Match.’ David, in real life, used to always walk quite slowly and talk quietly, he never shouted, different from the almost narcissistic public persona. … they start shooting and he becomes David Bowie. And then straight after that, he said, ‘Let’s go sit down again.’ It was like watching Clark Kent going into the telephone booth and becoming Superman, then turning back.”

And Nile Rodgers, of Chic, says Bowie really did call him up and say, “you do hits, I’d like you to do a record of hits” – it became, of course, Let’s Dance. His account of how it was recorded is incredible, gripping. I won’t spoil it for you by quoting everything.

This isn’t a picture book, but it is illustrated with around 40 photos and artworks, most of which I had not seen before. The printing is high quality and this is just a lovely book; you will know Bowie better after reading it.

The only thing I don’t much like is the cover, which looks rather cheap to me, not hinting at the wonders within. And I suppose there are other contributors it would have been nice to see included, Brian Eno, Robert Fripp, Tony Visconti and more; but you never get everyone in a project like this.

I’ve been trying Microsoft’s ADConnect tool, the replacement for the utility called DirSync, which synchronises on-premises Active Directory with Azure AD, the directory used by Office 365.

It is therefore a key piece in Microsoft’s hybrid cloud story.

In my case I have a small office set-up with Active Directory running on Server 2012 R2 VMs. I also have an Office 365 tenant that I use for testing Microsoft’s latest cloud stuff. I have long had a few basic questions about how the sync works so I created a small Server 2012 R2 VM on which to install it.

ADConnect can be installed on a Domain Controller, though this used to be unsupported for DirSync. However, it seems tidier to give ADConnect its own server, and doing so is less likely to cause problems.

There are a number of pre-requisites but for me the only one that mattered was that your domain must be set up on the Office 365 tenant before you configure ADConnect. You cannot configure it using the default *.onmicrosoft.com domain.

Adding a domain to Office 365 is straightforward, provided you have access to the DNS records for the domain, and provided that the domain is not already linked to another Office 365 tenant. This last point can be problematic. For example, BT uses Office 365 to provide business email services to its customers. If you want to migrate from BT to your own Office 365, detaching the domain from BT’s tenant, to which you do not have admin access, is a hassle.

When I tried to set up my domain, I found another problem. At some point I must have signed up for a trial of Power BI, and without my realising it, this created an Office 365 tenant. I could not progress until I worked out how to get admin access to this Power BI tenant and assign my user account a different primary email address. The best way to discover such problems is to attempt to add the domain and note any error messages. And to resist the wizard’s efforts to get you to set up your domain in a different tenant to the one that you want.

That done, I ran the setup for ADConnect. If you use the Express settings, it is straightforward. It requires SQL Server, but installs its own instance of SQL Server Express LocalDB by default.

You enter credentials for your Office 365 tenant and for your on-premises AD, then the wizard tells you what it will do.

I was interested in the link on the next screen, which describes how to get all your Windows 10 domain-joined computers automatically “registered” to Azure AD, enabling smoother integration.

If you follow the link, and read the comments, you may be put off; I was. It involves configuring Active Directory Federation Services as well as Group Policy and looks fiddly. I suspect this is worth doing though, and hope that configuration will be more automated in due course.

The next step was to look at the outcome. One thing that is important to understand is that synced users are distinct from other Office 365 users. Imagine then that you have existing users in Office 365 and you want to match them with existing on-premises users, rather than creating new ones. This should work if ADConnect can match the primary email address. It will convert the matching Azure AD user into a synced user. Otherwise, it will just create new users, even if there are existing Azure AD users with the same names. If it goes wrong, there are ways to recover. Note that the users are not actually linked via the email address, they are linked by an attribute called an ImmutableID.
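In the default configuration the ImmutableID is simply the Base64 encoding of the on-premises objectGUID (in .NET byte order). A small sketch shows the mapping both ways; the GUID below is made up for illustration:

```python
import base64
import uuid


def immutable_id(object_guid: str) -> str:
    """Derive the Azure AD ImmutableID from an on-premises objectGUID.

    ADConnect's default source anchor is the objectGUID, stored as the
    Base64 encoding of the GUID's bytes in .NET Guid.ToByteArray()
    order, which Python's uuid module exposes as bytes_le.
    """
    return base64.b64encode(uuid.UUID(object_guid).bytes_le).decode("ascii")


def object_guid_from_immutable_id(value: str) -> str:
    """Reverse the mapping: recover the objectGUID from an ImmutableID."""
    return str(uuid.UUID(bytes_le=base64.b64decode(value)))


# Example with a made-up GUID; the result is a 24-character Base64 string
guid = "c6ad2b28-2bc3-4f57-b72b-7c2a8b9d0e1f"
iid = immutable_id(guid)
print(iid)
print(object_guid_from_immutable_id(iid))  # round-trips to the GUID
```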

The Office 365 admin portal is fully aware of synced users and the user list shows the distinction. Users are designated as “In Cloud” or “Synced with Active Directory”.

Synced users cannot be deleted from the Office 365 portal. You delete them in on-premises AD and they disappear.

The next obvious issue is that if you dive in like me and just install ADConnect with Express Settings, you will get all your on-premises users and groups in Azure AD. In my case I have things like “ASP.NET Machine Account”, various IUSR* accounts, users created by various applications, and groups like “DHCP Administrators” and “Exchange Trusted Subsystem” that do not belong in Office 365.

These accounts do not do much harm; they do not consume licenses or mess up Office 365. On the other hand, they are annoying and confusing. You may also have business reasons to exclude some users from synchronization.

Fortunately, there are various ways to fine-tune, both before and after initial synchronization. You can read about it here. This document also states:

With filtering, you can control which objects should appear in Azure AD from your on-premises directory. The default configuration takes all objects in all domains in the configured forests. In general, this is the recommended configuration.

I find this puzzling, in that I cannot see the benefit in having irrelevant service accounts and groups synced to Office 365 – though it is not entirely obvious what is safe to exclude.

I went back to the ADConnect tool and reconfigured, using the Domain and OU filtering option. This time, I selected what seems to be a minimal configuration.

The excluded objects are meant to be deleted from Office 365, but at first they were not. I was not sure if this would fix itself. (Update: it did, though I also re-ran a full initial sync to help it along). If it does not, you can temporarily disable sync, manually delete them in the Office 365 portal, then re-enable sync.

What if you want to exclude a specific user? I used the steps described to create a DoNotSync filter based on setting extensionAttribute15. You use the ADConnect Synchronization Rules Editor to create the rule, then set the attribute using ADSIEdit or your favourite tool. This worked, and the user I marked disappeared from Office 365 on the next sync.

Incidentally, you can trigger an immediate sync using this PowerShell command:

Start-ADSyncSyncCycle -PolicyType Delta
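A couple of related commands from the same ADSync PowerShell module that ships with ADConnect may also be useful – a full (initial) sync, which re-evaluates every object and is worth running after a filtering change, and a check on the sync scheduler:

```powershell
# Full synchronization, re-evaluating all objects (e.g. after changing filters)
Start-ADSyncSyncCycle -PolicyType Initial

# Show the current sync schedule and whether the sync cycle is enabled
Get-ADSyncScheduler
```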

Complications

Setting up ADConnect does introduce complexity into Office 365. You can no longer do everything through the portal. It is not only deletion that does not work. When I tried to set up a mailbox in Office 365 I hit this message:

“This user’s on-premises mailbox hasn’t been migrated to Exchange Online. The Exchange Online mailbox will be available after migration is completed.”

I can see the logic behind this, but there might be cases where you want a new empty mailbox; I am sure there is a way around it, but now there is more to go wrong.

Update: there is a rather important lesson hiding here. If you are running Exchange on-premises and want to end up on Office 365 with ADConnect, you must take care about the order of events. Once ADConnect is running, you cannot do a cutover migration of Exchange, only a hybrid migration. If you don’t want hybrid (which adds complexity), then do the cutover migration first. Convert the on-premises mailboxes to mail-enabled users. Then run ADConnect, which will match the users based on the primary email address.

It is also obvious that ADConnect is designed for large organisations and for administrators who know their way around Active Directory. There is a simplified sync tool in Windows Server Essentials, though I have not used it. It would be good though to see something between Essentials and the complexity of ADConnect. For example, I had imagined that there might be a mapping tool that would let you see how ADConnect intends to match on-premises users with Office 365 users and let you amend and exclude users with a few clicks.

Microsoft has been working on this stuff for some time and is not done yet. In preview for example is Group Writeback, which lets you sync Office 365 groups back to on-premises AD.

Maybe Microsoft might also consider using different icons for the various ADConnect utilities as they do look a bit silly if you pin them to the taskbar:

The tools are:

Azure ADConnect (Wizard)

Synchronization Rules Editor (advanced filtering)

Synchronization Service WebService Connector Config (SOAP stuff)

Synchronization Service Key Management (what it says)

On the plus side, I have not hit any mysterious Active Directory errors and it has all worked without having to set up certificates, reverse proxies, special DNS entries (other than the standard ones for Office 365), or anything too fiddly, though note that I avoided ADFS and automatic Windows 10 registration.

Final thoughts

If you need to implement this, it is worth doing what I did and trying it out on a test domain first. There seem to be quite a few pitfalls, and as ever, it is easier to get things right at the start than to fix them up afterwards.