Amazon Lex chatbot integrations and limits to note

Amazon Lex opens the door to conversational interactions with apps. But while it's easy to create a chatbot, you must have the right pieces in place to take full advantage of Lex.

Amazon Lex enables users to interact with applications through natural language processing. Essentially, the service takes spoken or written sentences and converts them into parameters that an application can use.

With the service, developers can create Amazon Lex chatbots, which enables many potential use cases. First, though, it's important to grasp how the service works, as well as its integration capabilities.

Get started with Amazon Lex

Developers can access Amazon Lex via the AWS Management Console and define the main elements they need to create a basic Amazon Lex chatbot, such as:

  • Intents -- The actions a chatbot can perform.
  • Utterances -- Sample human sentences that trigger an intent.
  • Slots -- The specific parameters to extract from human sentences.
  • Error handling -- For example, when the bot doesn't understand a sentence.
  • Response format -- For example, plain text or interactive response cards that include buttons and images.
  • Integrations with AWS Lambda functions -- Once Lex translates a human sentence into an action and extracts the relevant data, it can invoke a Lambda function to act on that information. For example, when a user asks, "How's the weather in Seattle?" the Lambda function can call a weather API and return the forecast for Seattle.
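To make the fulfillment step concrete, here is a minimal sketch of a Lambda handler for the weather example, using the Lex V1 event and response format. The `GetWeather` intent, the `City` slot and the `fetch_forecast` helper are hypothetical names for illustration; `fetch_forecast` stands in for a call to your own weather API.

```python
def fetch_forecast(city):
    # Stand-in for a real weather API call; replace with your own back end
    return "sunny and 72 degrees"

def lambda_handler(event, context):
    """Fulfillment handler for a hypothetical GetWeather intent (Lex V1 event format)."""
    slots = event["currentIntent"]["slots"]
    city = slots.get("City") or "your area"
    forecast = fetch_forecast(city)
    # Lex V1 expects a dialogAction that closes the conversation with a message
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {
                "contentType": "PlainText",
                "content": "The weather in {} is {}.".format(city, forecast),
            },
        }
    }
```

Lex passes the matched intent and filled slots in the event, so the handler only needs to read the slot values, do its work and return a `dialogAction` telling Lex what to say.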

As of now, Lex, which uses the Amazon Alexa voice technology, only integrates with Lambda as a fulfillment mechanism for a chatbot application. If developers have an existing service or API they want to integrate with Lex, they need to write Lambda functions that integrate with their existing back end. Keep this in mind when you estimate how much time and effort is required to build an Amazon Lex chatbot.

It's also simple to integrate an Amazon Lex chatbot with a Lambda function, even though Lex isn't currently supported as an AWS Serverless Application Model (SAM) event source. When building a chatbot from scratch, this integration makes it easier to write and deploy the application code that powers the chatbot's functionality.
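Because Lex can't be declared as a SAM event source, one workaround is to define the fulfillment function in SAM as usual and grant Lex invoke permission with a plain CloudFormation resource in the same template. This is a sketch under those assumptions; the function name `WeatherBotFulfillment` and the code path are hypothetical.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  WeatherBotFulfillment:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.6
      CodeUri: ./src
  # Lex is not a SAM event source, so grant it invoke permission directly
  LexInvokePermission:
    Type: AWS::Lambda::Permission
    Properties:
      FunctionName: !GetAtt WeatherBotFulfillment.Arn
      Action: lambda:InvokeFunction
      Principal: lex.amazonaws.com
```

The `AWS::Lambda::Permission` resource does what the console's "add permission" prompt would otherwise do by hand, so the whole deployment stays in one template.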

Properly format your bots

It's best to treat Amazon Lex chatbots as code, even if there's no CloudFormation support for the service yet. The good news is that a bot can still be defined in JSON format, where developers can specify intents, utterances, Lambda integrations and other elements. Then they can use the PutBot API to create and update a bot.

For example, for a bot created in the AWS console, a developer can use the Lex API to export a JSON-formatted bot definition and then stop using the console for further changes. From that point on, they can edit the JSON definition, apply the changes with the PutBot API and track the bot's development cycle in the same way as any other application code.
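A minimal sketch of that workflow with the boto3 `lex-models` client might look like the following. The file name `weather_bot.json` is hypothetical, and the mapping assumes the definition carries the fields that `put_bot` requires (`name`, `locale`, `childDirected`).

```python
import json

def bot_definition_to_put_bot_args(definition):
    """Map a JSON bot definition (as exported from the console) to
    keyword arguments for the Lex V1 put_bot call."""
    return {
        "name": definition["name"],
        "locale": definition["locale"],
        "childDirected": definition["childDirected"],
        "intents": definition.get("intents", []),
        "processBehavior": "BUILD",  # build the bot right after updating it
    }

def deploy_bot(path):
    # Import here so the mapping above stays usable without the AWS SDK
    import boto3  # third-party AWS SDK
    with open(path) as f:
        definition = json.load(f)
    lex = boto3.client("lex-models")  # Lex V1 model-building API
    return lex.put_bot(**bot_definition_to_put_bot_args(definition))
```

With the definition in version control, `deploy_bot("weather_bot.json")` becomes a repeatable deployment step; note that updating an existing bot also requires passing the checksum returned by `get_bot`.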

Amazon Lex integration

Lex has built-in integrations with Facebook, Kik, Slack and Twilio SMS, which eliminates the cumbersome, time-consuming work of integrating with each chat platform's API and delivers a working chatbot to users faster. However, there are some limitations to be aware of with these integrations. With response cards, developers are limited to the format Lex supports, which isn't necessarily the same as what a target chat platform supports. For example, Slack's interactive messages offer many formatting options that Lex response cards don't.

With Lex's built-in integrations, a bot can't initiate a conversation. For example, for a Slack bot to start a conversation, it needs the token that was generated when a user allowed the bot to access a particular channel. AWS stores these tokens, but the Lex API doesn't grant access to them. Therefore, a bot can't proactively send updates to users without a custom integration with the chat platform.

It's also important to have a properly configured back end to deliver a successful application, especially a chatbot. For example, Slack times out after three seconds; if a Lambda function doesn't respond in time, users see a timeout error message. Consider system performance from the start when building Amazon Lex chatbots, and if a chatbot sees only occasional traffic, keep its Lambda functions warm so they can respond quickly.
