A linear resonant actuator (LRA) is a vibration motor that produces an oscillating force along a single axis. Unlike a DC eccentric rotating mass (ERM) motor, a linear resonant actuator relies on an AC voltage to drive a voice coil that pushes against a moving magnetic mass suspended by a spring. When the voice coil is driven at the resonant frequency of the spring–mass system, the entire actuator vibrates with a perceptible force. Although the frequency and amplitude of a linear resonant actuator may be adjusted by changing the AC input, the actuator must be driven at or near its resonant frequency to generate a meaningful amount of force for a given current.

Almost every modern video game console includes some form of vibrotactile feedback, but this was not always the case. As more video games were made for computers and at-home entertainment systems, arcade game manufacturers sought ways to make their cabinet games more immersive. Though arcade controls were typically customized to each individual game, the growing availability of video games outside of arcades pressured companies to give arcade visitors experiences uniquely tailored to branded game cabinets. In 1976, Sega’s game Moto-Cross (rebranded as Fonz) became the first to feature vibrotactile feedback, letting each player feel the rumble of their motorcycle as it crashed into another player’s bike on the screen. The control scheme was a success.

Expressive aphasia, also known as Broca’s aphasia, is a neurological condition characterized by an individual’s inability to produce grammatically correct speech. It often results from physical damage to the anterior regions of the brain, which impairs the neurons that would otherwise help construct vocalizations of grammatically correct sentences [1]. In the suburbs of Paris in 1861, Paul Broca identified the region responsible for expressive aphasia after conducting an autopsy of a patient incapable of uttering any word other than “tan” [2]. Though speculation about the structure and function of human consciousness had existed for several centuries, Broca’s discovery resulted in a new framework for understanding the brain’s role in producing conscious experience. The patient’s brain had incurred a lesion from injury, yet only a small subset of his cognitive function was impaired. Naturally, psychologists concluded that different parts of the brain mediate different cognitive processes. By 1874, Carl Wernicke had discovered receptive aphasia (Wernicke’s aphasia), which results from damage to posterior regions of the brain [3], and increasing numbers of scientists began exploring which regions of the brain were responsible for different aspects of cognition. Brain science adopted a new goal: mapping the locations of the brain corresponding to each observable function in human consciousness.

While we were getting started with the hardware development for Moment, our first product, we needed to find a suitable way to handle the processing and Bluetooth Low Energy communications for the device. The nRF51 from Nordic Semiconductor, a BLE system on a chip, was very appealing for a few reasons:

That being said, we encountered a few things we wish we had known before starting:

The Keil development environment can be used for free, but use GCC if you plan to write larger or more complicated firmware (there’s a ~40kb limit on the compiled binary size with Keil’s free license)

In spite of that, debugging with Keil is much easier than with a GCC setup

The provided iOS and Android example apps offer an excellent demonstration of the over-the-air device firmware update (OTA DFU), but the core functionality has only recently been split into modular components – integration with your own app can require a lot of tweaking

Many of the example programs included with the SDK require some modification before they can be effectively deployed to a new target board – making your own BSP (board support package) can be very useful if you make sure to properly include all the necessary definitions

Always use the latest SDK and documentation whenever possible – many tutorials you find in a Google search target an older version of the SDK or SoftDevice, and searching for examples outside of the latest SDK documentation can be unproductive

Use GCC. Keil has some great tools for development, but GCC is available across all platforms without restrictions. If you’re a Windows-only team without much price sensitivity, you may prefer Keil for its superior debugging.

Have a solid game plan for integrating the chip into your own circuits. How will you update the firmware? What kind of a programmer will you use?

The examples are a good starting point for quickly trying out different SDK components, but starting from scratch with your own code structure will help you keep your code more modular

In 1995, inventor Geir Jensen designed a device that mounted on a watch strap and could provide caller identification through haptic rhythms. His design was based on a frequency-controlled actuator providing tactile feedback, but he never actually built the hardware. Instead, he submitted the idea to a technology competition and was rejected by the Norwegian Industrial and Regional Development Fund. Jensen was thinking two decades ahead of his time.

Though haptic feedback often relies on mapping information directly to a stimulus produced by tactile actuators, we can also leverage human sensory illusions to produce a richer tactile experience for users. One such illusion is the “Cutaneous Rabbit Illusion” (also known as sensory saltation):

The cutaneous rabbit illusion (also known as cutaneous saltation and sometimes the cutaneous rabbit effect or CRE) is a tactile illusion evoked by tapping two or more separate regions of the skin in rapid succession. The illusion is most readily evoked on regions of the body surface that have relatively poor spatial acuity, such as the forearm. A rapid sequence of taps delivered first near the wrist and then near the elbow creates the sensation of sequential taps hopping up the arm from the wrist towards the elbow, although no physical stimulus was applied between the two actual stimulus locations.

This illusion allows us to simulate the experience of movement across the skin using a smaller number of vibration motors – if we want to communicate direction or spatial relationships, sensory saltation allows us to quickly deliver the information by letting the human brain fill in the gaps.

Haptics.js is a simple JavaScript library that allows you to implement vibrotactile effects using the HTML5 implementation of navigator.vibrate. In addition to providing a cross-browser compatibility layer, Haptics.js allows you to quickly create different vibrotactile effects, including fading vibrations, heartbeats, notifications, and error alerts.
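For comparison, here is what building an effect directly against navigator.vibrate looks like. The heartbeat durations below are illustrative, not the values Haptics.js itself uses:

```javascript
// A hand-rolled heartbeat in the navigator.vibrate pattern format:
// alternating [vibrate, pause, vibrate, pause, ...] durations in ms.
function heartbeatPattern(beats) {
  const pattern = [];
  for (let i = 0; i < beats; i++) {
    // strong pulse, short gap, weaker pulse, longer rest between beats
    pattern.push(100, 80, 60, 400);
  }
  pattern.pop(); // drop the trailing rest so the pattern ends on a pulse
  return pattern;
}

console.log(heartbeatPattern(2)); // [100, 80, 60, 400, 100, 80, 60]

// In a browser that supports the Vibration API:
if (typeof navigator !== 'undefined' && navigator.vibrate) {
  navigator.vibrate(heartbeatPattern(2));
}
```

Libraries like Haptics.js wrap this pattern-building and the browser-support check so you don’t have to manage the raw arrays yourself.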

In the past, we’ve discussed driving a linear resonant actuator using a DRV2605 haptic driver chip from Texas Instruments. Though the DRV2605 provides plenty of features (audio-to-haptics conversion, licensed effects from Immersion, and flexible I2C or PWM input), it also requires a more complicated integration with an existing circuit. Though an extra capacitor or two doesn’t introduce too much complexity, the DRV2605 isn’t well suited to circuits that drive multiple motors at the same time. The chip’s I2C address is fixed and identical on every chip, so integrating multiple DRV2605s on a single I2C bus requires an I2C switch or multiplexer – multiple slaves with the same address cannot be controlled by the same master on one bus.

There must be a simpler way! Texas Instruments provides one in the DRV2603. The DRV2603 haptic driver forgoes the licensed effects and audio input in favor of a simple PWM-only interface. Each motor driven by a DRV2603 needs only a single digital output pin, so any signal source capable of producing a PWM signal between 10kHz and 250kHz can drive multiple motors simultaneously.
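Generating that PWM signal amounts to mapping a desired drive level onto a duty cycle. The sketch below assumes a 50%-centered convention (50% duty = idle, deviations from 50% set the drive strength); consult the DRV2603 datasheet for the chip’s exact duty-to-amplitude mapping:

```javascript
// Map a desired drive level in [-1, 1] to a PWM duty cycle, assuming a
// 50%-centered convention: 0.5 = idle, 1.0 = full forward, 0.0 = full
// reverse/braking. A sketch only -- verify against the DRV2603 datasheet.
function driveLevelToDuty(level) {
  const clamped = Math.max(-1, Math.min(1, level));
  return 0.5 + clamped * 0.5; // duty as a fraction of the PWM period
}

console.log(driveLevelToDuty(0));    // 0.5  (idle)
console.log(driveLevelToDuty(1));    // 1.0  (full drive)
console.log(driveLevelToDuty(-0.5)); // 0.25 (half reverse)
```

On a microcontroller, the returned fraction would be scaled to the timer’s compare register to set the actual output duty cycle.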

For haptics projects that rely on multiple actuators to produce feedback, the DRV2603 provides a simpler way to get started using linear resonant actuators.

Now, it’s common to use an audio signal to drive a haptic actuator and produce tactile effects that correspond to sounds. The Apple Watch and its Taptic Engine use this technique to render haptic feedback.

In general, the best approach for most consumer electronics devices is to use a haptic driver that accepts an audio signal as input. The DRV2605 from Texas Instruments can provide this form of feedback with either an eccentric rotating mass motor or a linear resonant actuator.

Another option, less suited to most portable electronic devices, is a surface transducer. Surface transducers are frequently used in speakers to produce vibrations in an enclosure that result in sound. They consume a lot of power, but they can produce vibrations that propagate across a hard surface quickly and consistently.