The complete source code for this blog post can be found on my GitHub. It is based on Angular CLI and Bootstrap 4.

The Goal

The goal of this blog post, and of the private project behind it, was to consume the face recognition API: capture a picture with your laptop's camera, send it to the API, and then analyze and display the results.

We fire a request to an endpoint with specific parameters and headers, and a body that is an object whose "data" property holds a source URL. We can do that easily with Angular as well:
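As a framework-free sketch of that request shape: the endpoint path, query parameters, and the `Ocp-Apim-Subscription-Key` header follow Azure's conventions, but the exact parameter list is an assumption for illustration; in the actual app the call would go through Angular's `HttpClient` rather than being built by hand.

```typescript
// Sketch of the request described above. Names marked "assumed" are
// illustrative, not taken from the original project's code.
interface FaceRequest {
  url: string;
  headers: Record<string, string>;
  body: { data: string };
}

function buildFaceRequest(
  endpoint: string,
  subscriptionKey: string,
  imageUrl: string
): FaceRequest {
  // Assumed query parameters; the Face API accepts flags like these.
  const params = 'returnFaceId=true&returnFaceAttributes=age,gender';
  return {
    url: `${endpoint}/face/v1.0/detect?${params}`,
    headers: {
      'Content-Type': 'application/json',
      // Azure Cognitive Services expect the API key in this header.
      'Ocp-Apim-Subscription-Key': subscriptionKey,
    },
    // The post describes a body object with the source URL under "data".
    body: { data: imageUrl },
  };
}
```

With `HttpClient`, the same shape would be passed as `this.http.post(url, body, { headers })`.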

The challenge here was not to send a URL in the body to the Face API, but to use the base64 representation of the image instead. Sending blobs to an API is not difficult with the new HttpClient Angular provides us. After some searching I found the Stack Overflow answers shared in the "Links" section at the end of this article, modified them a bit, and wrapped them in a service, so this method takes care of generating the correct blob:
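A minimal sketch of that conversion, adapted from the common Stack Overflow approach the article refers to: decode the base64 payload of a data URI byte by byte and wrap the result in a `Blob` with the original MIME type. The function name is my own, not necessarily the one used in the project.

```typescript
// Convert a data URI (e.g. from a <canvas> snapshot of the camera feed)
// into a Blob that can be posted as a request body.
function dataUriToBlob(dataUri: string): Blob {
  // A data URI looks like "data:image/png;base64,AAAA...".
  const [meta, base64] = dataUri.split(',');
  const mime = meta.match(/:(.*?);/)?.[1] ?? 'application/octet-stream';

  // Decode base64 into raw bytes.
  const binary = atob(base64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return new Blob([bytes], { type: mime });
}
```

The resulting blob can then be sent with `HttpClient` using an `application/octet-stream` content type.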

The API returns a lot of information as JSON, which we can easily cast to a TypeScript object to work with. So we ask the camera service for the photo, use a switchMap to hand the data to the face recognition service, and get back the result.