The Rekognition AI platform is now capable of identifying fear, adding an eighth entry to its emotion-detection capability. (Image credit: Amazon)

Amazon’s Rekognition platform is a deep learning image and video analysis service that’s part of the company’s AI software suite. It uses machine-learning and deep-learning APIs to identify scenes, objects, faces, and text within images and video. The platform provides face-based user verification by comparing live images against reference photos. It also offers sentiment and demographic analysis, interpreting emotions and gender from facial imagery. What’s more, it features unsafe content detection, celebrity recognition, and text detection for classifying ads, sports scores, news, and more.

Amazon’s Rekognition platform has also reached a new milestone in its facial analysis feature, with increased accuracy and functionality. The system was previously capable of identifying only seven emotions: happy, sad, angry, surprised, disgusted, calm, and confused. It can now add fear to that list, and it offers improved accuracy at identifying gender, age range, and other attributes.
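In practice, Rekognition reports emotions as a list of labels, each paired with a confidence score, for every face it detects. The sketch below shows one way to pull the dominant emotion out of a DetectFaces-style response in Python; note that `sample_response` is an illustrative stand-in for real API output, not data returned by the service.

```python
# Sketch: extracting the dominant emotion per face from a
# Rekognition DetectFaces-style response. The sample_response
# dict below is an illustrative fragment, not real API output.

def dominant_emotions(response):
    """Return the highest-confidence emotion label for each detected face."""
    results = []
    for face in response.get("FaceDetails", []):
        emotions = face.get("Emotions", [])
        if emotions:
            # Each entry pairs an emotion Type with a Confidence score;
            # pick the one the service is most confident about.
            top = max(emotions, key=lambda e: e["Confidence"])
            results.append(top["Type"])
    return results

sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "FEAR", "Confidence": 92.1},
                {"Type": "SURPRISED", "Confidence": 5.4},
                {"Type": "CALM", "Confidence": 1.2},
            ]
        }
    ]
}

print(dominant_emotions(sample_response))  # ['FEAR']
```

A real call would come from the AWS SDK (for example, boto3’s Rekognition client with `Attributes=['ALL']` requested), but the response-parsing logic is the same either way.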

Rekognition provides a confidence score indicating how certain it is about each identified object, including the animal type and breed of this golden retriever. (Image credit: Amazon)

The Rekognition system is an excellent platform for identifying subjects (friends, family, pets) across the images and video you have stored, offering a quick way to sort them into different categories. As you might imagine, however, others see this new advancement as having nefarious applications that could lead to a dystopian future, especially since Amazon has licensed the software to law enforcement agencies.

Regardless, Rekognition is not perfect: it has trouble identifying women and people of color, and according to the ACLU it even misidentified 28 US lawmakers. That said, Amazon did pitch the software to US Immigration and Customs Enforcement; however, the company states the technology is used to advocate for crime victims.