
Now, imagine seamless authentication everywhere: software that, using the sensors you already have on your phone, wearables, and devices at home or in the office, knows it’s you. No more six-digit PIN or string of mixed-case letters and numbers to prove that it is really you making a purchase, logging in, or swiping a key card. Anywhere online or offline where you need to identify yourself, UnifyID promises that your “you-ness” can be determined with 99.999% accuracy from everyday behaviors such as how you sit, walk, and type (passive factors, also known as implicit authentication in academia). When the machine learning algorithms are unsure, an active challenge is triggered on your nearest phone or device (e.g., fingerprint verification, among a dozen other challenges in development).

The UnifyID iOS active challenge is triggered when the machine learning algorithm requires additional verification to confirm that it is really you.

UnifyID has been called the holy grail of authentication: the security and sophistication of its machine learning are unparalleled, and its convenience and focus on usability make trying the product remarkably easy.

For now, we’re in private beta, ensuring that the flows are easy and work as expected. UnifyID launched out of stealth at TechCrunch Disrupt in September. The initial flows for signing up, logging out, and logging back into sites have gone through more than 25 iterations in a few weeks (thanks to the onsite testers!). We’re ready to move to a remote private beta and test outside our four walls.

Join us on this journey to disrupt passwords. While “The Oracle” (our machine learning engine) is still under development, we are moving full speed ahead on making sure that, at this stage, the UnifyID user flows are easy for everyone to use, many times, every day, across all sites.

***

Sign up for the UnifyID Private Beta: visit https://unify.id, click “Apply for Private Beta,” and enter “Imagination” along with why you are interested in participating in the beta in the secret handshake field.

Today, UnifyID, a service that can authenticate you based on unique factors like the way you walk, type, and sit, announced the final 16 fellows selected for its inaugural Artificial Intelligence Fellowship for Fall 2016. Each of the fellows has shown exemplary leadership and curiosity in making a meaningful difference in our society, and each has a clear aptitude for making sweeping changes in this rapidly growing area of AI.

Fresh off the company’s launch and success at TechCrunch Disrupt, where it was named SF Battlefield Runner-Up (second out of 1,000 applicants worldwide), UnifyID CEO John Whaley said, “We were indeed overwhelmed by the amazing response to our first edition of the AI Fellowship and the sheer quality of applicants we received. We also take immense pride in the fact that more than 40% of our chosen cohort will be women, which further reinforces our commitment as one of the original 33 signees of the U.S. White House Tech Inclusion Pledge.”

The final 16 fellows hail from Israel, Paris, Kyoto, Bangalore, and cities across the U.S., holding Ph.D., M.S., M.B.A., and B.S. degrees from MIT, Stanford, Berkeley, Harvard, Columbia, NYU-CIMS, UCLA, and Wharton, among other top institutions.

Aidan Clark, a triple major in Math, Classical Languages, and CS at UC Berkeley

This highly selective, cross-disciplinary program covers the following areas:

Deep Learning

Signal Processing

Optimization Theory

Sensor Technology

Mobile Development

Statistical Machine Learning

Security and Identity

Human Behavior

Our UnifyID AI Fellows will each choose one of 16 well-defined projects in the broad area of applied artificial intelligence, in the context of solving the problem of seamless personal authentication. The Fellows will be led by our esteemed Fellowship Advisors, renowned machine learning experts holding PhDs from CMU, Stanford, and the University of Vienna, Austria.

The UnifyID product consists of an app that runs on your devices plus a cloud service. The local apps periodically collect sensor data from the device, process it, and communicate with the cloud service. We use a variety of data sources, all of which are implicit in nature and require no conscious action by the user. On mobile devices, we make use of a variety of sensors, including GPS, accelerometer, gyroscope, magnetometer, barometer, ambient light, and Wi-Fi and Bluetooth signal telemetry. All sensor data is processed locally, and we send only a small stream of extracted features to our cloud-based machine learning system, which automatically finds correlations between factors and discovers what makes you unique. Raw data is kept on the local device, encrypted, and anonymized using best practices in differential privacy. We sample sensor data only periodically, when it’s necessary, so the impact on battery life and data usage is minimal.
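To make the “process locally, send only features” idea concrete, here is a minimal sketch of on-device feature extraction from a window of accelerometer samples. The function name, feature set, and sample rate are hypothetical illustrations, not UnifyID’s actual pipeline; the point is that only a handful of summary statistics, not raw readings, would ever leave the device.

```python
import numpy as np

def extract_features(window, sample_rate_hz=50.0):
    """Summarize a window of 3-axis accelerometer samples into a few features.

    window: array of shape (n_samples, 3) with x/y/z readings.
    Only these summary statistics would be transmitted, never raw samples.
    """
    window = np.asarray(window, dtype=float)
    mag = np.linalg.norm(window, axis=1)            # overall motion magnitude
    centered = mag - mag.mean()                     # remove DC component
    spectrum = np.abs(np.fft.rfft(centered))        # frequency content
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / sample_rate_hz)
    return {
        "mean_g": float(mag.mean()),
        "std_g": float(mag.std()),
        "dominant_freq_hz": float(freqs[spectrum.argmax()]),
    }
```

A periodic motion such as a walking cadence shows up directly in `dominant_freq_hz`, while the raw trace stays on the device.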

On the PC and laptop side, we look at factors such as keystroke timing (not what you type, but how you type) and mouse/touchpad movements (finger length affects swipe and scroll arcs), as well as Wi-Fi and Bluetooth telemetry from not only your devices but also other signals found around you. We tap into the constant signals emitted by Bluetooth LE to keep track of where you are relative to known and unknown devices.
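Keystroke timing is usually summarized with two quantities: dwell time (how long a key is held) and flight time (the gap between releasing one key and pressing the next). The sketch below is a hypothetical illustration of those two features; the event format and function name are assumptions, not UnifyID’s implementation.

```python
def keystroke_features(events):
    """Compute mean dwell and flight times from key events.

    events: list of (key, press_time, release_time) tuples, times in seconds.
    Note: only timings are used, never which keys were pressed.
    """
    dwells = [release - press for _key, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return {"mean_dwell_s": mean(dwells), "mean_flight_s": mean(flights)}
```

Different typists hold keys and transition between them at characteristically different speeds, which is what makes these timings a usable (if noisy) behavioral factor.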

Many of these factors are extremely noisy and have a high false-positive rate when examined individually. On the backend, we combine these noisy factors into a highly accurate “confidence level” using proprietary machine learning algorithms to determine whether it’s really you or someone else using a given device. Best of all, each user always has direct control over which implicit factors are used, and can even purge the data on command. This is the first time ever that users will have full control and management over their biometric and behavioral data derived from connected sensors.
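The actual fusion algorithms are proprietary, but one textbook way to see why combining weak factors helps is log-odds fusion under an independence assumption: each factor’s probability nudges the overall logit, so several individually unreliable signals can yield a confident combined score. This is a generic sketch, not UnifyID’s method.

```python
import math

def fuse_confidence(factor_probs, prior=0.5):
    """Combine per-factor probabilities that the genuine user is present.

    Naive log-odds fusion: assumes the factors are independent given the user.
    Returns a combined probability in (0, 1).
    """
    logit = math.log(prior / (1 - prior))
    for p in factor_probs:
        p = min(max(p, 1e-6), 1 - 1e-6)   # clamp to avoid infinite log-odds
        logit += math.log(p / (1 - p))
    return 1 / (1 + math.exp(-logit))
```

Three mediocre factors at 0.6, 0.7, and 0.8 already fuse to a combined confidence above 0.9, which illustrates how many weak passive signals can add up to a strong decision.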

Our system is highly accurate. Using just four available sensors, it already achieves five nines (99.999%) of accuracy, far more secure and convenient than the status quo of login credentials used today. We can also reach high accuracy with only a small amount of data; for example, our gait detection algorithms can identify a user from four seconds of walking data.

The graphs below show one example of how individuals can be distinguished with passive sensor data. These two graphs show accelerometer and gyroscope data from two users while they are sitting down. The two users were paired to have the same height, weight, and BMI. As you can see, there are clear differences in how the two users sit down, as indicated by the grouping of the dots.

There are, of course, cases where the data is noisy or missing, or where the underlying physical process changes (for example, when someone sits in a different chair, or is injured or sore from exercise). This is why sitting is only one of the more than 100 factors we use to authenticate you.
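The distinct dot groupings in the graphs above suggest a simple way to picture identification: enroll each user as the centroid of their feature vectors, then assign a new sample to the nearest centroid. This is a deliberately minimal sketch with made-up names and two-dimensional features, far simpler than any production system.

```python
import numpy as np

def enroll(samples):
    """Average a user's enrollment feature vectors into a single centroid."""
    return np.asarray(samples, dtype=float).mean(axis=0)

def identify(feature, centroids):
    """Return the enrolled user whose centroid is nearest the new sample."""
    names = list(centroids)
    dists = [np.linalg.norm(np.asarray(feature, dtype=float) - centroids[n])
             for n in names]
    return names[int(np.argmin(dists))]
```

With well-separated clusters (like the paired users in the graphs), even this nearest-centroid rule classifies reliably; noisy or overlapping factors are exactly why many of them must be fused rather than trusted individually.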