Build an Amazon Lex Chatbot to Manage Amazon WorkSpaces

Amazon WorkSpaces is a Desktop-as-a-Service (DaaS) offering that lets customers build scalable, secure cloud-based desktops for any number of users. It gives users access to the documents, applications, and resources that they need – anywhere, anytime, from any supported device.

Customers using Amazon WorkSpaces usually start with a specific use case, such as providing a corporate-connected desktop infrastructure to external contractors. This lets them take advantage of a Bring Your Own Device (BYOD) policy, or provide employees with a powerful GPU-enabled desktop, without purchasing and managing expensive physical hardware.

As customers get comfortable with Amazon WorkSpaces, they usually notice that they can extend the same benefits to any desktop use case in their corporation. However, they then face a huge increase in the number of WorkSpaces to provision and maintain. This is where automation helps.

Amazon WorkSpaces provides you with a full set of public APIs, which allow you to automate all your daily tasks. At the same time, automation makes those tasks repeatable and more resilient to human error.

Let’s look at creating WorkSpaces through the CreateWorkspaces API as an example. You pass required parameters such as “BundleId”, which has a format similar to “wsb-abcde12fg”, and “DirectoryId”, which looks like “d-12345a678b”. Parameters like these are hard for users to remember, which is exactly where a chatbot can help.
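To make this concrete, here is a minimal sketch of a direct CreateWorkspaces call with Boto3, the Python SDK used later in this post. The bundle and directory IDs are hypothetical placeholders; substitute values from your own account.

```python
def build_workspace_request(user_name,
                            bundle_id="wsb-abcde12fg",     # hypothetical BundleId
                            directory_id="d-12345a678b"):  # hypothetical DirectoryId
    """Assemble the batch payload that the CreateWorkspaces API expects."""
    return {
        "Workspaces": [{
            "DirectoryId": directory_id,
            "UserName": user_name,
            "BundleId": bundle_id,
        }]
    }


def create_workspace(user_name):
    """Call the API; requires credentials with workspaces:CreateWorkspaces."""
    import boto3  # imported here so the payload helper is usable offline
    client = boto3.client("workspaces")
    return client.create_workspaces(**build_workspace_request(user_name))
```

Notice that even this tiny call already needs two opaque IDs – the friction the rest of the post sets out to remove.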

What Amazon Lex offers to the WorkSpaces administrator experience

Amazon Lex lets you create a layer of functionality to acquire the parameters described in the previous section. You can get them from the Amazon WorkSpaces administrator, or even from the end users themselves.

Amazon Lex enables customers to build conversational bots (chatbots) quickly and easily using the same deep learning technologies that power Amazon Alexa.

For an Amazon Lex bot, create an intent for each Amazon WorkSpaces API operation; the intent guides the user interaction and collects the parameters that the API needs. In those intents, use custom slot types.

A slot type specifies parameters that indicate the information an intent uses to respond to a user’s request. A slot type contains a list of values that Amazon Lex uses to train the machine learning model to recognize values for a slot.

Slot values can have synonyms that you define. For example, for the BundleId “wsb-abcde12fg”, you can define a more descriptive synonym, like “graphics” or “standard”. For the “DirectoryId,” you can reference your Active Directory domain, like “example.org.”
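Your Lambda function can also keep a small lookup table mirroring those synonyms, so a friendly value that reaches the backend still maps to the ID the API needs. A sketch with hypothetical IDs:

```python
# Hypothetical synonym-to-ID tables; real IDs come from your own account.
BUNDLES = {"standard": "wsb-abcde12fg", "graphics": "wsb-hijkl34mn"}
DIRECTORIES = {"example.org": "d-12345a678b"}


def resolve_bundle(slot_value):
    """Map a friendly slot value to a BundleId; pass real IDs through."""
    return BUNDLES.get(slot_value.lower(), slot_value)


def resolve_directory(slot_value):
    """Map a domain name to a DirectoryId; pass real IDs through."""
    return DIRECTORIES.get(slot_value.lower(), slot_value)
```

If you configure the slot with the TOP_RESOLUTION value selection strategy, Lex resolves synonyms to the canonical value for you, and these helpers simply pass the IDs through unchanged.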

The following screenshot shows where you specify a synonym for a slot type. You can access the slot types by choosing Slot types in the main menu of the Amazon Lex console.

These capabilities in Amazon Lex are how our chatbot approach makes the interaction more user-friendly.

What does AWS Lambda offer to the WorkSpaces chatbot approach?

After Amazon Lex collects the parameters from the user through the intent, it can call an AWS Lambda function that interacts with the Amazon WorkSpaces API. AWS Lambda lets you run scheduled or event-triggered code without provisioning or managing servers, and it takes care of all scalability aspects for you.

For the example in this post, the trigger for running your code in AWS Lambda, and interacting with the Amazon WorkSpaces service, is Amazon Lex.

Architecture

The following outline of the solution’s architecture gives you a better idea of how it works.

To accomplish this objective, we create a serverless system composed of the following components:

The front end is a static website based on HTML5 and JavaScript, hosted in Amazon Simple Storage Service (Amazon S3). On this website, I used the AWS JavaScript SDK to interact with Amazon Lex.

An Amazon Lex bot with intents for user interaction. The intents map to the different Amazon WorkSpaces APIs and use custom slot types with synonyms defined. Remember that the synonyms, as previously described, enable user-friendly interactions.

Amazon Lex intents that drive user interaction. The intents collect all the parameters and then trigger an AWS Lambda function, which receives those parameters in the event, in JSON format, and uses them with the Amazon WorkSpaces API. For this scenario, I use our Python SDK, Boto3, in the Lambda functions. I can create different Lambda functions for the different APIs.

As a result, I created Lambda functions to receive this information from Amazon Lex. It arrives in the event parameter as a JSON payload. Each function parses the needed information, stores it in variables, and passes them to the API client.

It then gets the response from the API and sends it back to Amazon Lex, which presents it to the user. For more information about creating a Lambda function, see the AWS Lambda Documentation.
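A minimal sketch of such a Lambda handler follows, assuming the Lex V1 Lambda event format and slot names (DirectoryId, BundleId, UserName) matching the intent built later in this post:

```python
def close(message, state="Fulfilled"):
    """Build the response shape Lex expects to end the conversation."""
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": state,
            "message": {"contentType": "PlainText", "content": message},
        }
    }


def lambda_handler(event, context):
    # Lex delivers the collected slot values in the event's currentIntent.
    slots = event["currentIntent"]["slots"]

    import boto3  # imported here so close() stays testable offline
    client = boto3.client("workspaces")
    resp = client.create_workspaces(Workspaces=[{
        "DirectoryId": slots["DirectoryId"],
        "UserName": slots["UserName"],
        "BundleId": slots["BundleId"],
    }])

    # create_workspaces reports per-item failures rather than raising.
    if resp.get("FailedRequests"):
        return close("The request failed: "
                     + resp["FailedRequests"][0].get("ErrorMessage", ""),
                     state="Failed")
    return close("Creating a WorkSpace for {}.".format(slots["UserName"]))
```

The close() helper is the important part: whatever your function does, Lex only understands a response wrapped in that dialogAction structure.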

Creating the Lex Chatbot

Log in to the AWS Management Console for Amazon Lex, and then create a bot.

For this example, I name the bot WorkSpacesVending. You can name yours whatever you want.

The following screenshot shows where you specify the entries to create your bot.

You can then start creating intents, which define the conversational flow you have with your users. For this example, I created a different intent for each Amazon WorkSpaces API that I chose to interact with through my chatbot.

Create the intent by first creating the utterances. Utterances are the spoken or typed phrases that invoke the intent. In my example, I’ve chosen the following phrases, but you can choose anything that makes sense for you and your users: “I want a WorkSpace”, “Provision a WorkSpace”, “Create a WorkSpace”.

After creating the utterances, create the slot types. The slots are the actual data that the user must provide to fulfill the intent.

For each of the parameters that the methods expect to receive, I created a custom slot type, as described earlier in this post, with its corresponding synonym.

Earlier in this post, I showed you the DirectoryId example. The following is a similar example:

I created a custom slot for UserName, but in this case, the slot stores the information passed by the user directly, so a synonym isn’t required.

When the custom slot types are created, you add them to the intent.

In the Slots area, choose a name, a slot type, and then create a phrase that interacts with the user to collect that type of information.

The following examples show the slots I created for my CreateWorkSpace intent:

After you create the slots, you can call the AWS Lambda function that we created before. You reference your AWS Lambda function in the Fulfillment section, and grant the Amazon Lex bot permission to invoke it.

You can use the same strategy for the other methods. The following sections show the intents I created for the reboot_workspaces, rebuild_workspaces, and terminate_workspaces methods.
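The backing Lambda functions for those methods look nearly identical, because each API takes a small batch of single-key requests. A sketch of the reboot case (rebuild and terminate swap in RebuildWorkspaceRequests and TerminateWorkspaceRequests, respectively):

```python
def reboot_request(workspace_id):
    """Payload for reboot_workspaces; the other methods mirror this shape."""
    return {"RebootWorkspaceRequests": [{"WorkspaceId": workspace_id}]}


def reboot_workspace(workspace_id):
    """Requires credentials with workspaces:RebootWorkspaces permission."""
    import boto3  # imported here so the payload helper is usable offline
    client = boto3.client("workspaces")
    return client.reboot_workspaces(**reboot_request(workspace_id))
```

Here the intent only needs to collect a single slot, the WorkspaceId, which is why these intents are so much simpler than CreateWorkSpace.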

RebootWorkspace Intent

RebuildWorkspace Intent

TerminateWorkspace Intent

Notice that for the TerminateWorkspace intent, I added a confirmation prompt so the user has a last chance to make a different decision.

After you create your intents, you can build and test your chatbot in the Amazon Lex console. If everything works as expected, publish your chatbot. At this point, it is ready to be used!

Creating the front end

To create the front end, build a simple HTML5 and JavaScript website. This website takes advantage of the AWS JavaScript SDK to interact with our Amazon Lex chatbot.

Secure your front end by using an Amazon Cognito Federated Identities pool. Integrate the pool ID with the chatbot through JavaScript, and build a static HTML and JavaScript website that uses the AWS.LexRuntime class from the AWS JavaScript SDK, with its postContent method, to expose the Amazon Lex chatbot on a static webpage.
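Before wiring up the browser, you can exercise the published bot with the analogous runtime call from Python. This sketch assumes the bot name used earlier and a hypothetical published alias named “prod”:

```python
def build_post_text(text, bot_name="WorkSpacesVending",
                    bot_alias="prod",        # hypothetical published alias
                    user_id="console-test"):
    """Arguments for post_text, the text counterpart of postContent."""
    return {"botName": bot_name, "botAlias": bot_alias,
            "userId": user_id, "inputText": text}


def ask_bot(text):
    """Send one utterance to the published bot and return its reply."""
    import boto3  # imported here so build_post_text stays testable offline
    client = boto3.client("lex-runtime")
    return client.post_text(**build_post_text(text)).get("message")
```

This is handy for smoke-testing the published bot from a terminal with the same conversation your website will drive through the JavaScript SDK.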

After your front end is working, you can host it on Amazon S3. That completes your serverless-based ChatOps platform for Amazon WorkSpaces. For more information, see Hosting a Static Website on Amazon S3.

Conclusion

Automating your skilled SysAdmins’ tasks helps you manage Amazon WorkSpaces at scale and create a collaborative environment. At the same time, it introduces your end-user computing services to the world of DevOps. You can use the same concepts described in this post to create ChatOps platforms for any AWS service. Let the chatbots make our lives easier!