Vogels told a massive AWS re:Invent audience gathered at a stadium in the MGM Grand in Las Vegas that customer input plays a crucial role as Amazon decides how to enable computing architectures for the 21st century.

"We allow you to set our roadmap," he said, referencing the almost 4,000 features Amazon has introduced since launching its public cloud.

"We are delivering tools now for the systems you want to be running in 2020," Vogels said. Every new feature helps partners and developers build solutions that will benefit their customers a few years down the road, he said.

Advances in data processing, the primacy of data in business, the Internet of Things, and machine learning are all driving a radical shift in architectures, he said. And the cloud is creating an environment where developers will soon only have to write business logic that takes advantage of those capabilities, not underlying system software.

A major disruption that's coming soon and will revolutionize how people interact with applications will be natural interfaces, starting with voice, Amazon's CTO said.

Interfaces have always catered to the requirements of the machine, not the humans using it. "I believe the interfaces of the future are no longer going to be machine-driven," Vogels said.

With true natural-language interfaces to digital systems that are capable of interpreting human feelings and sentiments, "a whole environment will become active" in ways it has never been before, Vogels said, giving the example of a surgeon talking to an application during an operation.

Amazon's Alexa-powered devices, such as the Echo home assistant, have already given the company a lot of experience delivering that technology.

To that end, Vogels announced the launch of Alexa for Business, an artificially intelligent language recognition system to power business applications.

It's "a fully managed service for having many Alexa devices at work," Vogels said, "to manage them, to manage the users, to manage unique skills you may have at your work."

Alexa for Business is integrated with video-conferencing systems from vendors like Cisco and Polycom to make sure it "works really well with conference rooms" and "so you never have to type in again a conference ID."

It will deliver capabilities to allow developers to build "conversational systems that will delight your customers."

Vogels also told re:Invent attendees the threat landscape means every developer has to simultaneously be a security engineer.

"Protecting your customer should be your number-one priority," Vogels said. "Without that, you don’t have a business. This comes before any other feature development."

For that reason, security is the largest area of investment at AWS, he said.

People still don't take encryption seriously enough, he said, both for data in motion and data at rest.

"Encryption is the only tool you have to be absolutely sure you are the only one who controls access to your data," he said, noting encryption is integrated into almost all AWS services.

"There is no excuse anymore to not use encryption," he said. At a minimum, customers' personally identifiable information and critical business information should always be encrypted.
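One common way to enforce encryption at rest on AWS is a bucket policy that rejects unencrypted uploads. The sketch below, in which the bucket name is a placeholder, denies any S3 `PutObject` request that does not specify KMS server-side encryption:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyUnencryptedUploads",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::example-bucket/*",
      "Condition": {
        "StringNotEquals": {
          "s3:x-amz-server-side-encryption": "aws:kms"
        }
      }
    }
  ]
}
```

With a policy like this attached, unencrypted writes fail outright rather than silently storing plaintext objects.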

Availability is another area that warrants a high degree of planning from partners and their customers.

The number of nines of uptime a service offers can vary with the redundancy built into its architecture, but business decisions must drive those considerations, Vogels said.

Sometimes two nines of uptime is fine. For three nines, you can add multiple availability zones and set up master-slave replication for databases. Even five nines, only minutes of downtime a year, can be achieved by spanning multiple regions and using services like DynamoDB, a fully managed database.
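The downtime budget behind each level of nines is simple arithmetic; a short sketch makes the "only minutes a year" claim for five nines concrete:

```python
# Rough downtime budget per year for each availability target.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_minutes(availability: float) -> float:
    """Minutes of allowed downtime per year at a given availability."""
    return (1 - availability) * MINUTES_PER_YEAR

for label, a in [("two nines", 0.99), ("three nines", 0.999), ("five nines", 0.99999)]:
    print(f"{label} ({a:.3%}): {downtime_minutes(a):,.1f} min/year")
```

Two nines allows roughly 3.7 days of downtime a year, three nines under nine hours, and five nines only about five minutes, which is why each additional nine costs substantially more to engineer.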

"But it’s the business rules that need to decide which availability scenario you want," Vogels said. Higher availability comes at a higher price.

Vogels advised AWS developers to take advantage of as many of the cloud's managed services as possible. The more they use, "the likelier the system will be secure, reliable and be able to scale."

The new EKS container service for Kubernetes is an example of such a managed service, one that enables developers to build applications based on microservices without having to deal with the pains of infrastructure management.

"Container technology has become sort of the default mechanism if you want to build microservices," he said.

Abby Fuller, AWS' senior technical evangelist, joined Vogels on stage to tell attendees, "we want to be the best place to run your container workloads however you want to run them."

Developers should look for the minimum viable system with containers that supports their business goals, she said.

"Containers in production can be really hard work," Fuller said. "As Werner mentioned, you end up with lots of pieces."

But the new AWS Fargate service will change that by managing the underlying infrastructure of container workloads to allow easy scaling and rapid launching of containerized applications, she said.

The goal of Fargate is "no setup, no service discovery. Just run the application," Fuller said.
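A Fargate workload is described declaratively rather than by provisioning servers. A minimal ECS task definition might look like the following sketch, in which the family name, image, and CPU/memory sizing are illustrative assumptions, and a real definition would also reference an execution role:

```json
{
  "family": "demo-web-app",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "256",
  "memory": "512",
  "containerDefinitions": [
    {
      "name": "web",
      "image": "example-account/demo-web-app:latest",
      "portMappings": [{ "containerPort": 80, "protocol": "tcp" }],
      "essential": true
    }
  ]
}
```

Once a definition like this is registered, the service launches and scales the containers without the operator ever managing the underlying EC2 instances.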

Vogels also announced the general availability of AWS Cloud9, an integrated development environment that will facilitate the work of AWS developers. The environment is highly collaborative, and aids in tasks like debugging Lambda functions.

AWS will continue to deliver technologies "you can't get anywhere else," he said.

Lambda serverless compute, which is seeing rapid pickup among large enterprises for large applications, is one popular feature because customers never pay for the system when it's idle.

AWS is ramping Lambda's capabilities with new features, including API Gateway integration with virtual private clouds and support for 3 GB of memory.
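The pay-only-when-running model centers on a handler function that Lambda invokes per event. A minimal Python handler might look like the sketch below, where the event's "name" field and the response shape are illustrative assumptions:

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler: return a greeting built from the event.

    Billed only while it executes; there is no charge while idle.
    The 'name' field and response shape are illustrative assumptions.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local invocation for testing (the context argument is unused here):
print(lambda_handler({"name": "re:Invent"}, None))
```

Because the handler is just a function, it can be exercised locally with a plain dictionary before being deployed behind a trigger such as API Gateway.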

On the machine learning front, the new frontier is bridging edge devices with intelligent back-end systems.

IoT is one area where machine learning integrates deeply with the real world, Vogels said.

Matt Wood, AWS general manager for artificial intelligence, said Amazon is working to make it easier to train AI models in its cloud, then deploy them to the edge of networks. Brainstorming sessions on the technology led to the path of pairing the cloud's "infinite compute" with "some sophistication on the device."

Embedding intelligence in the field is important where a round trip causes prohibitive latency, or devices have to operate in disconnected environments, Wood said.