Consisting of mostly water and curiosity

I’ve been working on updating an older model JCut CNC Router that I came across last year. The machine works well, but its controller card is old enough that most modern computers can’t support it (it’s an ISA card!!!). Plus, the most recent OS that the driver works with is Windows Vista. Ideally, I’d want to use this with a modern computer and OS. I got to investigating and found GRBL – an open source CNC controller firmware for the Arduino Uno. It’s pretty customizable and meets almost all the requirements for my machine. The only place it fell short was in spindle speed control. This particular machine doesn’t have continuously variable speed control; instead, it has several discrete operating speeds. GRBL assumes that you’re using a variable frequency drive and outputs a pulse width modulated (PWM) signal to indicate the desired speed. You could, hypothetically, go into the GRBL code and rewrite what you need, but I found it easier to just tack on an Arduino Nano to act as a speed selector. It takes the PWM signal from the Arduino Uno and then selects the nearest discrete speed that the machine supports.

I’m not an electrical or circuits guy, so this project was a big learning experience. Anyway, it made a good summer project and saved a little money. Just gotta make a neat case for it now.

I’ve been playing around with augmented reality lately. Here are some really terrible cell-phone videos of the results. The first is from last year and uses ARToolKit markers to track the position of virtual objects in the room. The more recent video uses a Vive Tracker and a Leap Motion to provide head and hand tracking, respectively. Both of these were proof-of-concept demos that I put together for local technology expos. The HMD being used is a custom-built binoculars-style display. I hope you enjoy the videos!

The site has been quiet for a minute now. 2015 – 2016 has been an insanely busy time! New job, another cross-country move, got married, and started my own lab. Really fun stuff!

Long story short, 2012 through 2014 was mostly spent working with the Mixed Reality Lab at the University of Southern California, seeing Oculus before it got off the ground, and doing some pretty cool research. After that, I was offered a postdoctoral fellowship with the Virtual Environments Group at Clemson University. For 2014 – 2015, we got some pretty cool research published (modeling the human eye to make super accurate augmented and virtual reality [paper], new ways of calibrating and understanding virtual reality optics [paper], and how people reach and interact with objects in virtual reality [paper 1, paper 2]).

Ok, it has been a super long time since I’ve made a post… To break the silence, I wanted to write up a bit of info on something I recently had to work out for a project: how to drive a CD-ROM stepper motor using the Seeed Motor Shield for the Arduino Uno. I did some quick searching and didn’t immediately turn up any posts on how to do this (though I found several DIY setups and one using an Adafruit motor shield). The motor isn’t marked with any branding, so finding the pinout was trial and error. Once you’ve got that, the motor works just like any other stepper. As shown in the diagram, I labeled the pins from top to bottom, but that ordering was an arbitrary choice.

I soldered wires directly onto where the ribbon cable was attached (see photos) so that it would be easier to work with. Once everything was wired up, I ran the Seeed “StepperMotorDemo” code and everything worked. Well, sort of. 200 steps were too many and made the laser carriage run into the ends; 40 seemed to be a good number. USB power was enough to run this, so I didn’t need to add external power.

So, there ya go. A short, sweet post that will hopefully be helpful to somebody.

A colleague is working on a book chapter in which he wants to include a section on AR. He asked me for a brief history (not to be included in the book) to give him an idea of how AR came to be what it is today. I sent him the following. There is A LOT missing here, but it was only intended to be an informal “one page history”. I thought I’d share it here.

Augmented reality has a history much longer than most people think. To understand its history, you must first understand what “augmented reality” means at its most fundamental: quite literally augmenting one’s experience of the physical world by modifying or adding sensory information that would not otherwise be available. In one way or another we have been doing this since torch light was first used to allow us to see clearly in the dark. However, augmented reality in its modern form (and potentially future forms) was first formally conceptualized by Ivan Sutherland in his historic article “The Ultimate Display”. Not long thereafter, Sutherland developed the first head-mounted augmented reality display. This display was dubbed “The Sword of Damocles” since the device and its associated equipment seemed to dangle precariously above the head of the user. For better or worse, the basic design of most augmented reality systems, and later virtual reality systems, remained essentially unchanged for decades to come. (An important side note is that of a common misconception. Many people assume that virtual reality was the predecessor to augmented reality, mostly due to VR’s early popularity in the 1990s. However, Sutherland’s 1968 AR display served as the template for essentially all future AR and VR displays.)

For quite some time afterward, advances in augmented reality centered around non-head-worn displays, like the head-up displays (HUDs) often used in modern aircraft. These displays enabled pilots to view critical control information while retaining a direct view of their surroundings. As is the case with many new technologies, the ramifications of its adoption were not fully understood at first. Unfortunately, misconceptions and misapplications of head-up displays led to dozens of preventable aircraft accidents in the early 1980s. However, through considerable research and hard-earned experience, HUD designs improved to the point that they are commonplace in many commercial and military aircraft.

As computer and display technologies became faster and smaller, virtual reality saw a surge in popularity in the 1990s. However, the majority of the systems developed during this time could only offer limited experiences, and usually only in tightly controlled lab settings. Excellent examples of these were the virtual reality systems employed by NASA to visualize and virtually explore the terrain of our neighboring planets. (One of the VR displays used by Michael McGreevy, Steve Ellis, and others from NASA Ames is currently on display at the Smithsonian Air and Space Museum.) A few commercial systems were available to the public, though. Unfortunately, most of these were expensive, cumbersome, and provided lackluster experiences. Perhaps a too-often-cited example is the Nintendo Virtual Boy game system, which debuted in 1995. The Virtual Boy, though hyped among the youth at the time, offered a monochromatic, low-resolution gaming experience that was often reported to quite efficiently make players very uncomfortable, a condition dubbed simulator sickness.

In the late 1990s, augmented reality saw a bit of a resurgence with the release of commercially available see-through head-mounted displays such as the Sony Glasstron. The Glasstron was mostly geared toward businessmen who wanted to discreetly watch videos while traveling, but it became one of the most commonly used displays in augmented reality research due to its small size and relatively affordable price. It was around this time that AR for training became a point of interest and led to the development of experimental AR frameworks for military training. Though a myriad of augmented reality display designs and manufacturers have come about since the early 2000s, the same basic head-worn archetype pioneered by Sutherland persisted as the dominant display type. This, however, rapidly changed as smartphones became ubiquitous. Smartphones contained almost all of the components used in AR and VR systems: cameras, motion sensors, and a display. This gave rise to a “magic window” form of augmented reality that allowed users to view computer-generated imagery superimposed on the video feed from the phone’s camera. This method became quite popular and has even been adopted in the realm of handheld game systems. Though smartphone- and tablet-based AR applications are becoming quite common, there are still significant efforts to bring head-worn AR into the mainstream. Probably the most notable of these is the Google Glass project, which is pushing to bring lightweight, see-through head-up displays to the masses. However, substantial research is still being devoted to the use of traditional head-worn stereoscopic 3D AR and VR displays. This will likely continue as cheap VR displays, like the Oculus Rift, become more commonplace.

I have an interesting story about the Oculus Rift that I might post later. We’ll see if time allows.

Well, I recently had some unexpected difficulties with my old WordPress install, and I had to do some major fixes. With that said, I’m currently in the process of updating my site. I’ll be sure to get my old posts back up soon!

I recently gave a talk at Ames and got to see some of the vibration testing work they are doing. I, however, did not get to see this. You have got to love it when brilliant minds, like theirs, come up with beautifully elegant solutions to complex problems.

Recently, a couple of us were talking about white LEDs. White LEDs are a somewhat recent commodity. Red LEDs have been around forever, yellow LEDs for the limit approaching forever, and green LEDs for simply a really long time. Blue LEDs are more recent, but they have been around long enough to have been installed in nearly every electronic device and some varieties of toilet paper. White LEDs are considerably more recent than any of these. Actual white light is composed of a wide spectrum of light, ranging from violet to red. LEDs are typically constrained to a small color band, so how do white LEDs work?

White light can be simulated by combining red, green, and blue light. This is the method used by TVs, LCD screens, and compact fluorescent lights. Could white LEDs be doing something similar? Could they just be red, green, and blue diodes combined into one unit? The simplest way to answer this question would be to google it, but let’s wait on that. This gives us the perfect opportunity to do some empirical experimentation. Let’s look at the light spectrum produced by a white LED. To do this we need a spectroscope…or some cardboard and a CD. There are multiple sites out there with instructions on building your own. Here is a photo of the one I built. You can find instructions to build a similar one here. BTW: You can also use these spectroscopes to determine what gasses are used in your fluorescent lights – a topic for another time.

If the white LEDs were combining multiple colors to simulate white light, it would look something like the image below. This image, by the way, is the spectrum of a typical compact fluorescent light. You can see “bumps” in the image that represent the individual wavelengths of light that compose the simulated white light.

<Image missing. Sorry, this is a post retrieved from my archive before the site crash back in 2012. The original image seems to be lost.>

However, this is not what you see when you look at a white LED. Below is the image of the spectrum I captured from a common LED flashlight. This is interesting! “Why?”, you may ask. It is surprisingly continuous. It seems to be composed mostly of a wide, smooth blue/green band and a wide, smooth red band. There is practically no yellow, though. This seems to indicate that there are probably not three separate diodes in there, but the wideness and smoothness of the spectrum is a bit strange.

<Image missing. Sorry, this is a post retrieved from my archive before the site crash back in 2012. The original image seems to be lost.>

Another interesting aspect of this spectrum is that the red and blue ends look like they have a similar shape. I’ve sketched this out in the plot below. Now, I’ve hit a point where I need to google for answers. After a quick search, I found that many white LEDs are actually blue LEDs that have been doped with phosphors that fluoresce red when exposed to blue light. This explains the observation that the light was mostly composed of two similarly shaped blue and red spectra. So there, a quick little experiment that taught us something about how LEDs work.