Serverless Integration for Large Language Model (LLM) using AWS Lambda
About
We are going to use an LLM to summarize a web page and store the result in a DynamoDB table. The page is provided as a URL, which also serves as the key under which the summarization result is stored in the DynamoDB table.
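The flow above can be sketched as a single Lambda handler. This is a minimal illustration, not the repo's actual code: the table name, model ID, and the shape of the incoming event are assumptions, and the page text is truncated rather than parsed.

```python
"""Sketch of the summarization Lambda (names below are illustrative assumptions)."""
import json
import urllib.request

TABLE_NAME = "WebPageSummaries"  # assumed DynamoDB table name
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"  # assumed Bedrock model


def fetch_page(url: str) -> str:
    """Download the raw content of the page to summarize."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")


def build_item(url: str, summary: str) -> dict:
    """DynamoDB item keyed by the page URL, as described above."""
    return {"url": {"S": url}, "summary": {"S": summary}}


def handler(event, context):
    # Assumes the event carries the page URL, e.g. {"url": "https://example.com"}
    import boto3  # available in the Lambda runtime

    url = event["url"]
    page = fetch_page(url)

    # Ask the model (via Amazon Bedrock) for a summary; truncate long pages.
    bedrock = boto3.client("bedrock-runtime")
    resp = bedrock.converse(
        modelId=MODEL_ID,
        messages=[{"role": "user",
                   "content": [{"text": f"Summarize this web page:\n{page[:8000]}"}]}],
    )
    summary = resp["output"]["message"]["content"][0]["text"]

    # Store the summary, keyed by the URL.
    boto3.client("dynamodb").put_item(TableName=TABLE_NAME,
                                      Item=build_item(url, summary))
    return {"statusCode": 200, "body": json.dumps({"url": url})}
```

Because the URL is the partition key, re-summarizing the same page simply overwrites the previous item.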
Prerequisites
- A computer. If using Windows, make sure it runs Windows 11 or a recent version of Windows 10 (version 1903 or higher, with build 18362 or higher).
- An AWS account. You can create a free account if you don’t have one yet.
- Install Docker Desktop from the Docker website.
- Install Visual Studio Code and the Dev Containers extension.
- Install Git.
- Clone this repo and open it in Visual Studio Code. Choose “Reopen in Container” when prompted.
- Install the AWS CLI and CDK by running ./script/install_tools.sh in Visual Studio Code’s Integrated Terminal.
Architecture
References
- Udemy - LLM Engineering: Master AI, Large Language Models & Agents, by Ligency Team and Ed Donner. Last updated 12/2024.
- AWS - Track, allocate, and manage your generative AI cost and usage with Amazon Bedrock, by Kyle Blocksom and Dhawalkumar Patel, 1 Nov 2024.