Every so often, I decide "Gosh, I'd really like to write code for some
Bluetooth LE devices, but I don't really do much on mobile. Maybe
things have gotten better on desktop!" So far, I have been
disappointed every time. Now is no exception, but I've decided to
actually write down that disappointment as a form of therapy.

This post will go over how different desktop OSes, libraries, and
hardware deal with Bluetooth LE. I'm sticking to desktop here because
product manufacturers assume BTLE devices will usually be used with
phones. Mobile certainly isn't a solved problem either, but it's
better than desktop right now.

This article isn't an introduction to BTLE itself. I'm going to assume
readers know the basic terms and differences between, say, pairing and
connection. If you're not familiar, I recommend checking
out
Adafruit's BTLE Intro. Apple's
CoreBluetooth Overview has
some nice explanation also, though the examples are obviously platform
specific.

Operating System Support

OS X

Starting off easy. OS X has had support for acting as a BTLE central
since 10.6, and as a peripheral since 10.9. Done!

Linux

And then right on up the difficulty curve to Linux, where we
have bluez. I've yet to ever hear anyone say
"yay bluez!"

Bluez got BTLE support around 4.93. As of this writing (November
2016), we're at 5.43. That's a full major version and a ton of minor
versions of difference.

Between bluez 4 and 5, APIs moved from direct access to dbus. Then,
within the bluez 5 line, the dbus methods themselves have changed
multiple times. I spent part of last weekend trying to write some dbus
code for accessing BTLE devices with no luck, as I couldn't seem to
identify services on the device. It turns out that I'm on Debian
Jessie, which comes with bluez 5.23 (released September 2014). After
looking around a bit, it seems most current bluez-supporting libraries
expect users to have at least 5.38, and sure enough, those versions
expose different methods.
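For a concrete sense of what the modern route looks like, here's a
sketch of service discovery against the bluez 5.38+ D-Bus API. The
interface names come from the bluez D-Bus GATT API docs; the device
path is a placeholder, and `list_services` assumes you have
dbus-python installed and a running bluez to talk to.

```python
# Sketch: enumerating GATT services through the bluez >= 5.38 D-Bus API.
# "org.bluez.GattService1" is the documented service interface; the
# device path below is a placeholder for a real discovered device.

GATT_SERVICE_IFACE = "org.bluez.GattService1"

def find_gatt_services(managed_objects, device_path):
    """Filter a GetManagedObjects() result down to the GATT services
    belonging to one device, returning (object path, UUID) pairs."""
    services = []
    for path, interfaces in managed_objects.items():
        props = interfaces.get(GATT_SERVICE_IFACE)
        if props is not None and props.get("Device") == device_path:
            services.append((path, props.get("UUID")))
    return services

def list_services(device_path):
    """Query a live bluez over D-Bus. Needs the dbus-python package
    and a BTLE adapter; only the pure filter above runs without them."""
    import dbus
    bus = dbus.SystemBus()
    manager = dbus.Interface(bus.get_object("org.bluez", "/"),
                             "org.freedesktop.DBus.ObjectManager")
    return find_gatt_services(manager.GetManagedObjects(), device_path)
```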

I was pretty confused by this, as I'd been
using pygattlib with no
problems on the same Linux box to write some BTLE test scripts. Turns
out, pygattlib is similar to the
C-based gattlib. Both of these
use the GATT functions from gatttool in the bluez 4 line to talk to
BTLE devices without having to go through dbus, letting older
machines/kernels talk BTLE. That's why things worked: pygattlib was
just bypassing the dbus interface entirely.
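The pygattlib route is pleasantly short by comparison. Here's a
sketch of reading the standard Battery Level characteristic (UUID
0x2A19); the device address is a placeholder, and the `read_battery`
function assumes pygattlib's `GATTRequester` interface, so only the
parsing helper runs without hardware.

```python
# Sketch: talking to a device with pygattlib, which goes through the
# bluez 4-era GATT code rather than D-Bus. The address below is a
# placeholder for your own device.

def parse_battery_level(value):
    """The GATT Battery Level characteristic is a single byte, 0-100.
    Handles both bytes and str values, since pygattlib predates py3."""
    first = value[0]
    return first if isinstance(first, int) else ord(first)

def read_battery(address):
    """Needs pygattlib and a BTLE adapter; blocks while connecting."""
    from gattlib import GATTRequester
    req = GATTRequester(address)  # connects on construction by default
    # 0x2A19 is the Bluetooth SIG's Battery Level characteristic.
    data = req.read_by_uuid("00002a19-0000-1000-8000-00805f9b34fb")[0]
    return parse_battery_level(data)
```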

Windows

tl;dr: Either hope your device requires pairing, or your code can deal
with WinRT APIs and will only run on updated Windows 10. Otherwise, do
something crazy.

Windows started supporting communication with BTLE devices as of
Windows 8. However, this didn't mean you could just go talking
willy-nilly to any device you pleased. You actually had to pair with
devices before they were available to the API at all.

The problem is that device manufacturers are lazy and cheap, and
pairing is an optional part of the BTLE handshake process. It's also
the part that's vaguely secure, but a lot of products don't really
care about that. For many devices, you just connect, query for
services, and off you go. This is possible on both OS X and Linux, but
on Windows, it was a no-go up until a set of Windows 10 updates.

There is one other workaround for BTLE on Windows, even for
pre-Windows 10 platforms, but it's not pretty. Check out the section
below on Noble for more information.

Cross-Platform ways to access BTLE currently

Given those warnings, if you still want to access bluetooth in a
pre-written, cross-platform way, here are a few choices. This is by no
means a complete list of bluetooth wrappers/libraries; it's just what
I looked up while figuring all this out.

Qt (C++)

Qt was actually one of the first places I went to check for this, as
they're usually pretty good about supporting as much functionality as
possible across platforms. The Qt Bluetooth module does include a BTLE
API, with the central role supported on Linux (via bluez), Android,
iOS, and OS X. True to the theme of this article, though, there's no
Windows BTLE support there yet either.

Web Bluetooth

There is currently
a proposed spec
being implemented by Google in Blink (the browser engine that backs both
Chrome and Opera) that will allow webpages to interact with BTLE
devices. This is slated
to
ship in Chrome 56.
It will support Linux, OS X, Android, and ChromeOS. Windows is
apparently coming with WinRT later.

So while this API may show up in Chrome, it may also ONLY show up in
Chrome. That situation certainly hasn't stopped anyone from using
single-browser APIs before, though.

And for anyone that is saying "Wait, Kyle, didn't you implement one of
the Bluetooth stacks for FirefoxOS? What about that?", my reply is
"DON'T MENTION THE WAR." (Translation: That was an early version of a
non-standardized API that has since been removed from Gecko, Mozilla's
browser engine)

Noble (Node.js)

Noble is a node.js BTLE
library that supports OS X, Linux, and Windows.

Yes, Windows. Without WinRT. Crazy, right?

Well, yes. It actually is crazy. To use Noble on Windows (which many
IoT/Maker programs do), you have to
install
WinUSB drivers over the standard Bluetooth Dongle drivers.
Noble then handles the full bluetooth stack for you, bypassing the
connection/scanning APIs missing from regular old Windows. While a
clever way to do that, it's not exactly something you'd want to ship
to non-savvy end-users.

BGAPI

Some people aren't happy to just bitbang bluetooth to a dongle though.
Instead, they go all the way and implement a specialized API
specifically for their dongle.
The
BlueGiga BLED112 BTLE Dongle comes
with a special, proprietary API that allows users to connect to BTLE
devices on Windows (and other platforms), also routing around the lack
of OS API functionality. So, as long as your platform can talk USB, it
can also talk BTLE.
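The "talk USB" part boils down to framing BGAPI packets and writing
them to the dongle's virtual serial port. Here's a rough sketch of
that framing; the 4-byte header layout follows BlueGiga's API
reference, but the class/command IDs in the usage comment are
illustrative placeholders, not looked-up values.

```python
# Sketch of BGAPI framing for the BLED112's virtual serial port.
# A command packet is: message type byte (0x00 for commands), payload
# length, command class, command id, then the payload bytes.

def bgapi_command(class_id, command_id, payload=b""):
    """Frame a single BGAPI command packet as bytes."""
    if len(payload) > 0xFF:
        raise ValueError("payload too long for a single BGAPI frame")
    return bytes([0x00, len(payload), class_id, command_id]) + payload

# Shipping it out with pyserial would look roughly like (not run here):
#   import serial
#   port = serial.Serial("/dev/ttyACM0", 115200, timeout=1)
#   port.write(bgapi_command(0x06, 0x02, b"\x01"))  # placeholder IDs
```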

Conclusion

Well, that's the state of things for the moment. Those are some of the
reasons there's no libusb-equivalent for
bluetooth yet. Hopefully we'll see Microsoft fill out the Windows API
surface soon and make this article a little less sad, 'cause the WinRT
stuff is kinda painful. Until then, though, this is what we get to
deal with.

Thanks to Sandeep Mistry for
filling in some of the details on the Windows situation.

Last weekend I decided to dust off my Sensable Phantom Omni (now
the
3D Systems Geomagic Touch,
but I bought mine before Geomagic bought Sensable and 3D Systems
bought Geomagic), and see if it was still usable. I had to order
a
PCIe 1394b card,
but other than that, I hooked it up, and the Phantom drivers seemed to
install correctly on Windows 10. However, when running the Phantom
demo software, any time a program tried to access the motors, it
would crash. Sensor readings seemed OK, but I couldn't get any force
feedback.

A quick call to the Geomagic Freeform support line turned up the
solution. As with many video products that used 1394b, the Phantom
Omni requires the "Legacy" firewire drivers. These were included with
Windows up to Windows 7, but as of Windows 8 and above, are no longer
included with the operating system.

After installing the drivers and changing the 1394 PCIe interface to
use the Legacy drivers
(process documented here),
I rebooted and the demos worked fine, with force feedback and all!

Along the way, I also found some information on repairing internal
cable breakage in the Omni, from a research team at Johns Hopkins
University. The original site had died, but the instructions and
images were still
available
at this link via the Internet Archive.

Anyways, hope this helps others that still want to get some life out
of their haptic controllers!

I was an Artist In Residence at
Autodesk Pier 9 from July 2014 to January
2015, concentrating on a sound art project to extract new and
interesting sounds from the prototyping machines around the workshop.
There's a video of the lecture I gave covering my time at Pier 9:

Also, I've just moved a lot of the health driver projects I currently
maintain to the OpenYou organization on GitHub. The hope is to
get more developers working on these projects, versus having the world
waiting on me to have time to work on things. There's more information
available in the post on openyou.org.

Yay! Thanks to mAngO on the comment thread for my last keepon post, we now know that grounding out the bus during keepon's powerup allows you to act as the master of the bus! This means we can now control the motors and sound, as seen in the video above. I'm just controlling motors there, using the Control Program for Android to send OSC messages to a python script I wrote. The python script talks to the USB serial port, and the arduino turns the commands coming over serial into I2C to send to keepon.
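The glue script is roughly this shape. To be clear, the two-byte
motor command below is a stand-in, NOT keepon's real I2C protocol
(that lives in the repository linked below), and the handler assumes
a pyserial `Serial` object opened on the arduino's port.

```python
# Sketch of OSC-to-serial glue: an OSC handler packs a motor command
# into bytes and writes it to the arduino, which relays it over I2C.
# The byte format here is illustrative, not keepon's actual protocol.

def pack_motor_command(motor, position):
    """Pack a motor index (0-3) and a 0-255 position into two bytes
    for the arduino to relay onto the I2C bus."""
    if not 0 <= motor <= 3:
        raise ValueError("unknown motor index")
    return bytes([motor, position & 0xFF])

def handle_osc_motor(serial_port, motor, position):
    """Body of an OSC message handler; serial_port would be a
    pyserial serial.Serial instance on the arduino's USB device."""
    serial_port.write(pack_motor_command(motor, position))
```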

All the source code for this is available in completely raw, uncommented form at

So, that's the first part finished. Now it's on to polishing things out and figuring out the rest of the parts of the hardware we don't have access to yet. I'm keeping the github issues list updated with things we have left to do.

While this post still has relevant information, the engineers at
BeatBots have created a far more stable
firmware. I highly recommend using their MyKeepon firmware, as it fixes
a lot of the timing issues the KeepOff firmware had. The MyKeepon
firmware is available at:

Keepon hacking has made a major step! Thanks to mAngO on the comment thread for my last keepon post,
we now know that grounding out the bus during keepon's powerup allows
you to act as the master to the bus!
There's a proof-of-concept video posted on YouTube now.
I'm leaving the rest of this post as it was when I first wrote it for
history's sake, but the information in it, plus knowing that you just
need to hold down the I2C lines for a second when the keepon powers
up, is enough to actually get control going. The reverse engineering
document and code in the keepoff repository will be updated to reflect
this information.

I'm pretty sure I've never spent so much time cursing at something
so adorable. The past week has been yelling, crying, and generally
losing my emotional shit at a few servos wrapped in a weird, sticky,
plasticky skin, better known as the MyKeepon Dancing Robot.

What better way to atone for my sin of vivisecting the most adorable
Christmas toy of the year than writing up what I found? That way,
future generations can avoid the pain I inflicted on it, and the pain
it inflicted on me.

But good lord, it's so fucking CUTE.

Usually I wouldn't write this up until after I had things completely
finished, but I gave myself a week deadline for that, and that
deadline passed 2 days ago. I'm still in the middle of a few different
ideas for reversing it, but those could take a while (stupid real life
getting in the way of toy hacking), so I figured I'd dump what
information I do have now.

On this big day in UI development, let's take a look at the current
console controller landscape, and what it means to non-game
developers.

Why focus on game console controllers? They've driven down sensor
prices like crazy, due to mass manufacturing and the required price
points for game sales. They've established more than a few careers of
non-game-developers now. Uses of the Kinect and the Wiimote for
projects not pertaining to their original consoles have been all over
the media lately. Keeping a forecast of where development for these
technologies is going means we have a better idea of how to ride the
wave when it comes.

Disclaimers

In terms of licensing issues, I am not a lawyer. I do not play one
on TV. However, I do have a lawyer fursona.

While I am part of the OpenKinect project, I do not speak for others
involved in the project. All opinions expressed here are my own, and
all cursing is far fucking better than anyone else on the project
could turn out, so while I may share my source code, I'm not giving
them rights to that.

I strive to keep all the information as correct as possible, but,
well, I've been drinking.

I am not a game developer. I am a reverse engineer that specializes
in controls and interface devices. My view of this hardware is
purely from the driver and capabilities side.

I have not directly used the Move SDK or Kinect SDK. But I have read
some articles and created very strong opinions, which means they are
valid for internet consumption.

This article is only about reversing/using alternative console
controllers, not about reversing consoles themselves. There's a
completely different history to that which would take much more than
a blog post to cover, though I will admit that it does have some
influence on the information here.

Then there's the Quantified Self Conference on May 28-29th,
2011, at the Computer History Museum in Mountain View, CA. There's no
central presentation, but honestly, I probably won't stop talking at
any point during the two days, as I have a table at the expo, plus
I'll be helping out with the health hardware session and the
hackathon.