Furhat Blog

Follow us on our journey as we bring social robots to the world

Should social robots have touch screens?

Interaction with a social robot like Furhat is inherently different from interaction with a voice assistant. The difference is similar to that between a face-to-face conversation in the same room and a phone call.

I can see clearly now – the visual perception of a social robot

For a robot to interact socially with human beings, it must be able not only to detect human faces, but also to understand what they represent. Here is a glimpse into how we make this possible.

How to integrate NLU and Dialog with Furhat

In a conversation, we constantly try to interpret what our conversational partner is saying and decipher what meaning they want to convey. For social robots, we call this capability Natural Language Understanding. But how do we use this in the Furhat platform?
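At its core, NLU maps a user's utterance to an intent the system can act on. As a rough illustration of that idea, here is a toy keyword-based intent matcher; the intent names and keywords are invented for this sketch, and the Furhat platform's actual NLU is far more sophisticated than simple word overlap.

```python
# Toy sketch of intent classification: map an utterance to the intent
# whose keywords it overlaps most. Intents/keywords are illustrative
# examples, not part of the Furhat platform.

INTENTS = {
    "greeting": {"hello", "hi", "hey"},
    "goodbye": {"bye", "goodbye", "later"},
    "order_coffee": {"coffee", "espresso", "latte"},
}

def classify(utterance: str) -> str:
    """Return the intent whose keywords overlap most with the utterance."""
    words = set(utterance.lower().split())
    best_intent, best_score = "unknown", 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best_intent, best_score = intent, score
    return best_intent

print(classify("hi there"))          # greeting
print(classify("one latte please"))  # order_coffee
```

A real NLU component replaces the keyword overlap with statistical or neural models trained on example phrases, but the contract is the same: text in, intent out.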

5 steps to making robot interactions stronger

You are designing a social robot interaction. You are fighting against the monotone intonation of speech synthesis, against latency in speech recognition, and against the very narrow conversational path you have to lead your users down, because "real" AI does not exist yet. How can you keep your interaction from being experienced as slow and flat? How do you make it come to life?