After watching the darkly humorous short film L1ZY (which deals with an insidious digital home assistant that goes a little too far), we really can't stop thinking of it whenever we talk about its real-life counterpart, Amazon's Echo. Despite becoming one of the leading AI assistants on the market, though, Alexa is still kind of clunky, especially when it comes to communicating with its users. To remedy this, Amazon has introduced some changes to make conversations more natural, including a self-learning technique that allows Alexa to pick up on context clues.

One of the big steps forward is "name-free recognition," which allows a user to dispense with specific keywords when asking Alexa to do something. According to The Verge, users can now say things like "Alexa, get me a car" to order a car from a ride service like Uber. This will probably help smooth over interactions where people want to hear some music but forget to add "on Spotify" to their request—which happens a lot.
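Conceptually, name-free recognition means matching an utterance to a skill by its intent rather than by an explicitly named app. Here's a minimal, made-up sketch of that idea in Python—the skill names and keyword lists are invented for illustration and have nothing to do with Amazon's actual system:

```python
# Toy sketch of "name-free" skill routing: map an utterance to a skill
# by keyword overlap instead of requiring the user to name the skill.
# All skill names and keywords below are hypothetical.

SKILL_KEYWORDS = {
    "ride-hailing": {"car", "ride", "taxi"},
    "music": {"play", "song", "music"},
    "weather": {"weather", "forecast", "temperature"},
}

def route(utterance: str) -> str:
    """Pick the skill whose keywords best overlap the utterance."""
    words = set(utterance.lower().replace(",", "").split())
    best = max(SKILL_KEYWORDS, key=lambda s: len(SKILL_KEYWORDS[s] & words))
    return best if SKILL_KEYWORDS[best] & words else "unknown"

print(route("Alexa, get me a car"))  # -> ride-hailing
print(route("play some music"))      # -> music
```

The real system presumably uses trained language models rather than keyword sets, but the routing problem—utterance in, unnamed skill out—is the same shape.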

Another big change is the "context carryover" feature, which allows Alexa to answer a chain of different but related questions. This is how Ruhi Sarikaya, the director of applied science for Alexa, describes it: "For example, if a customer says 'What's the weather in Seattle?' and, after Alexa's response, says 'How about Boston?', Alexa infers that the customer is asking about the weather in Boston. If, after Alexa's response about the weather in Boston, the customer asks, 'Any good restaurants there?', Alexa infers that the customer is asking about restaurants in Boston."
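The chain Sarikaya describes boils down to remembering two pieces of state—the current topic and the current entity—and resolving follow-ups against them. A toy sketch of that idea, purely illustrative and not Amazon's implementation:

```python
# Toy sketch of "context carryover": remember the last topic (weather)
# and last entity (a city) so follow-ups like "How about Boston?" and
# "Any good restaurants there?" can be resolved. Hypothetical logic only.

class Dialogue:
    def __init__(self):
        self.topic = None  # e.g. "weather"
        self.place = None  # e.g. "Seattle"

    def ask(self, utterance: str) -> str:
        u = utterance.lower().rstrip("?")
        if "weather" in u:
            self.topic = "weather"
            if " in " in utterance:
                self.place = utterance.split(" in ")[-1].rstrip("?")
        elif u.startswith("how about"):
            # carry over the previous topic; only the entity changes
            self.place = utterance[len("How about "):].rstrip("?")
        elif "restaurants" in u and "there" in u:
            # resolve "there" to the remembered place
            return f"Searching restaurants in {self.place}"
        return f"Answering {self.topic} for {self.place}"

d = Dialogue()
d.ask("What's the weather in Seattle?")      # topic=weather, place=Seattle
d.ask("How about Boston?")                   # topic carries over, place=Boston
print(d.ask("Any good restaurants there?"))  # Searching restaurants in Boston
```

The hard part in practice is deciding *when* a question is a follow-up versus a fresh request, which is where the self-learning comes in.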

Hopefully, these changes will make Alexa better at doing basic things around the house. As it stands, $69.99 is pretty steep for a device that tells you the weather and plays music.