Introduction
Hey there, fellow traders, engineers, and AI enthusiasts! We’ve reached the final post in this series, where I’ve been sharing my journey of building an AI-powered stock analysis agent. From laying the foundation with LangChain, OpenBB, and Claude 3 Opus to integrating advanced analysis techniques and risk management strategies, we’ve covered a lot of ground. I’ve demonstrated how to add tools for analyzing trends, charts, sentiment, and risk, finally orchestrating them together using LangGraph as a multi-agent chat application.
However, a tool is only as good as its accessibility. That’s why, in this post, I’ll show you how to deploy the AI stock analysis agent on AWS using the Copilot CLI.
Why AWS Copilot is Special
AWS Copilot stands out because it can streamline the entire lifecycle of containerized applications, from development to production. It encapsulates best practices and production-ready configurations, reducing developers’ learning curve and operational overhead. By automating infrastructure provisioning and deployment pipelines, Copilot allows developers to focus on application development and optimization rather than the intricacies of cloud infrastructure management.
Prerequisites
Before we dive in, make sure you have the following:
- An AWS account. If you don’t have one, sign up at https://aws.amazon.com/.
- AWS CLI installed and configured. Follow the instructions at https://aws.amazon.com/cli/.
- Docker installed. Download it from https://www.docker.com/.
- Copilot CLI installed. Follow the instructions at https://aws.github.io/copilot-cli/docs/getting-started/install/ for your operating system.
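Before going further, it can help to sanity-check that the required CLIs are actually on your PATH. Here's a minimal, purely illustrative sketch in Python (the app's own language):

```python
import shutil


def missing_tools(tools: tuple[str, ...] = ("aws", "docker", "copilot")) -> list[str]:
    """Return the subset of required CLI tools not found on PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]


if __name__ == "__main__":
    gaps = missing_tools()
    print("All set!" if not gaps else f"Missing: {', '.join(gaps)}")
```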
Load the Boat and Ship It
Step 1: Dockerize the Application
First, we need a Dockerfile for our AI stock analysis agent. If you’ve been following along, you’ll know the repo already contains a Dockerfile, but the app comprises both a LangServe API and a Streamlit UI. For simplicity (and article length), I’ll focus on the LangServe API here, though Copilot can orchestrate multiple services together just as easily. To keep those concerns separate, I’ve split them into two Dockerfiles.
# api.dockerfile
FROM python:3.11-slim

# No .pyc files in the image; unbuffered output so logs reach the container's stdout
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Build tools needed to compile native Python dependencies
RUN apt-get update \
    && apt-get install -y --no-install-recommends build-essential gcc \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*

RUN pip install poetry==1.6.1
RUN poetry config virtualenvs.create false

WORKDIR /code

# Install dependencies first so this layer caches between code changes
COPY ./pyproject.toml ./README.md ./poetry.lock* ./
COPY ./package[s] ./packages
RUN poetry install --no-interaction --no-ansi --no-root

COPY ./app ./app
RUN poetry install --no-interaction --no-ansi

EXPOSE 8080
CMD exec uvicorn app.server:app --host 0.0.0.0 --port 8080
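Before handing the image to Copilot, it’s worth a quick local smoke test. A hedged sketch: it assumes you’ve already built and started the container yourself (e.g., docker build -f api.dockerfile -t financial-chat-api . followed by docker run --rm -p 8080:8080 financial-chat-api) and that the app exposes the same /health route the service manifest points its health check at.

```python
# smoke_test.py — quick local check that the API container responds.
from urllib.request import urlopen
from urllib.error import URLError


def is_healthy(url: str = "http://localhost:8080/health", timeout: float = 3.0) -> bool:
    """Return True if the health endpoint answers with HTTP 200."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        return False


if __name__ == "__main__":
    print("healthy" if is_healthy() else "not responding")
```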
Step 2: Initialize Copilot
Next, navigate to the project root directory in the terminal and run this command.
copilot init
Follow the prompts to select the type of application (Load Balanced Web Service), name your service (e.g., “financial-chat”), and choose api.dockerfile as the Dockerfile. I have provided all the necessary YAML manifests in the repo, so there’s no need to add extra environments or services unless you want to. For example, here is the manifest.yml for the API service.
name: financial-chat-api
type: Load Balanced Web Service

http:
  path: '/'
  healthcheck: '/health'

image:
  build: api.dockerfile
  port: 8080

cpu: 1024
memory: 4096
platform: linux/x86_64
count: 1
exec: true

network:
  connect: true

variables:
  LOG_LEVEL: debug

secrets:
  LANGCHAIN_TRACING_V2: /copilot/financial-chat/test/secrets/langchain_tracing
  LANGCHAIN_ENDPOINT: /copilot/financial-chat/test/secrets/langchain_endpoint
  LANGCHAIN_API_KEY: /copilot/financial-chat/test/secrets/langchain_api_key
  LANGCHAIN_PROJECT: /copilot/financial-chat/test/secrets/langchain_project
  OPENAI_API_KEY: /copilot/financial-chat/test/secrets/openai_api_key
  IMGUR_CLIENT_ID: /copilot/financial-chat/test/secrets/imgur_client_id
  IMGUR_CLIENT_SECRET: /copilot/financial-chat/test/secrets/imgur_client_secret
  TAVILY_API_KEY: /copilot/financial-chat/test/secrets/tavily_api_key
  TIINGO_API_KEY: /copilot/financial-chat/test/secrets/tiingo_api_key
  FMP_API_KEY: /copilot/financial-chat/test/secrets/fmp_api_key
  INTRINIO_API_KEY: /copilot/financial-chat/test/secrets/intrinio_api_key
  ANTHROPIC_API_KEY: /copilot/financial-chat/test/secrets/anthropic_api_key
  OPENBB_TOKEN: /copilot/financial-chat/test/secrets/openbb_token
I have added secrets to AWS Systems Manager using Copilot; they can be referenced at the /copilot/&lt;app&gt;/&lt;env&gt;/secrets/&lt;key&gt; path (note the path contains the environment name, test, not the service name). These secrets can be uploaded in bulk with the following command, which ensures they get properly tagged for Copilot access. Just be sure the YAML file follows the syntax Copilot expects.
copilot secret init --cli-input-yaml secrets.yml
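For reference, here is a hypothetical secrets.yml matching the shape the Copilot docs describe for bulk input — each secret keyed by name, with one value per environment. The values below are placeholders only:

```yaml
# secrets.yml — placeholder values; replace with your real keys
secrets:
  OPENAI_API_KEY:
    test: 'sk-replace-me'
  ANTHROPIC_API_KEY:
    test: 'sk-ant-replace-me'
  OPENBB_TOKEN:
    test: 'replace-me'
```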
Step 3: Deploy
Now that we have all the config files in place, let’s spin this setup up.
copilot env init # create a new environment, using "test" as the name
copilot env deploy --name test # deploy the environment
copilot svc init --name financial-chat-api # create a service using the manifest.yml above
copilot svc deploy --name financial-chat-api --env test # let it rip!
Copilot will package the application, create the necessary infrastructure, and deploy the service to AWS. It may take a few minutes, but once it’s done, you’ll see the URL where your AI stock analysis agent is live and ready to use.
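Once the service is live, you can hit the LangServe routes directly. Here's a sketch of a minimal client; the /chat/invoke path and the payload shape are assumptions, since the exact route depends on how app/server.py registers the chain:

```python
import json
from urllib.request import Request, urlopen


def build_invoke_request(base_url: str, question: str) -> Request:
    """Build a POST to a LangServe /invoke route; the body wraps the chain input."""
    payload = json.dumps({"input": question}).encode()
    return Request(
        url=f"{base_url}/chat/invoke",  # hypothetical route name
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Usage against the URL Copilot prints after `copilot svc deploy`:
# req = build_invoke_request("http://<your-load-balancer-dns>", "How does NVDA look today?")
# with urlopen(req) as resp:
#     print(json.loads(resp.read()))
```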
Step 4: もったいない (Waste Not)
When you’re done with it, tearing down the entire CloudFormation stack is easy with Copilot.
copilot app delete
Why not Bedrock?
I tested the current support for Anthropic’s models in AWS Bedrock using the langchain_aws Python package, but I ran into a few snags along the way. Function calling via the ChatBedrock provider is currently missing bind_tools support, which this app relies on heavily. These libraries are being improved continuously, and I expect full support shortly; Anthropic released the general availability of its tool calling last week, and I will update this article once those bits land.
Conclusion
And there you have it! Using the Copilot CLI, we’ve successfully deployed the AI-powered stock analysis agent on AWS. Now, you can access its wealth of insights and data-driven recommendations anywhere, anytime.
Remember, this is just the beginning. With your agent in the cloud, you can continue to refine and expand its capabilities, making it an even more valuable tool in your trading arsenal.
I hope this series has been as fun and informative for you as it has been for me. As I continue to work on a commercial application in this space, I must admit that a good trader keeps a few cards close to the vest. Gotta know when to hold ’em, know when to fold ’em. 😉
I’ll see you again next time when I explore another topic in finance, trading, tech, and AI while I continue swapping symbols.