See beyond your dashboards, metrics, and logs. OpsTower.ai takes the grunt work out of DevOps, connecting to your systems (AWS, Docker, Kubernetes, etc.) and debugging problems on its own.
🚧 As of Oct '23, OpsTower.ai can answer questions about your AWS resources and perform calculations on CloudWatch metrics from the command line. Learn about our larger vision.
🏆 OpsTower.ai is the current state of the art (SOTA) for accuracy on the DevOps AI Assistant Open Leaderboard for AWS Services, AWS CloudWatch Metrics, and AWS Billing.
📅 Book a time on my calendar or email [email protected] to chat about what capabilities you'd like to see.
Vision • Install • Usage • Demo Mode
OpsTower.ai is an AI assistant that takes the grunt work out of DevOps. It connects to your systems (AWS, Docker, Kubernetes, etc.) and debugs and researches problems on its own.
You've probably played with some fragile demos in this space. That's not OpsTower.ai. At its heart is a robust evaluation framework that measures the agent's accuracy against a large dataset of questions across problem domains, letting us iterate on its capabilities and track performance over time. We'll share our results transparently.
See our Capabilities Roadmap below for more details on current evaluation results and planned knowledge areas.
OpsTower.ai is installed as a plugin for LLM, a CLI utility for interacting with large language models.
Install llm if you haven't yet:
pip install llm
Install the OpsTower plugin:
llm install https://github.com/opstower-ai/llm-opstower/archive/refs/heads/main.zip
Set your API key. Use demo to try it against a live demonstration AWS account, or see our docs on creating a read-only IAM user for your own account:
llm keys set opstower --value demo
Make opstower your default model:
llm models default opstower
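To confirm the change, running llm models default with no argument should print the currently selected default model:

llm models default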
AWS Services
llm "list each ec2 instance id, name, state, last restart, image, and size"
AWS CloudWatch Metrics
llm "What is the average CPU utilization of my EC2 instances?"
AWS Billing
llm "breakdown bill by aws service over the past 30 days"
The default timeframe is the past hour. You can specify a different timeframe in the question (ex: "...over the past 10 minutes").
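For example, to scope a CloudWatch question to a shorter window:

llm "What was the maximum CPU utilization of my EC2 instances over the past 10 minutes?"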
OpsTower.ai provides an OpenAI-compatible API. The llm CLI utility sends your credentials and question to our API. Server-side, OpsTower.ai generates AWS SDK code to answer your question, executes it in an isolated environment, and summarizes the response.
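As a rough, hypothetical illustration of what that generated code does (this is not OpsTower.ai's actual output), the average-CPU-utilization question above maps to a CloudWatch statistics query along the lines of this AWS CLI equivalent; the instance ID and time window are placeholders:

```sh
# Illustrative sketch only: OpsTower.ai generates and runs the equivalent AWS SDK code for you.
# The instance ID and timestamps below are placeholders.
aws cloudwatch get-metric-statistics \
  --namespace AWS/EC2 \
  --metric-name CPUUtilization \
  --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
  --start-time 2023-10-01T00:00:00Z \
  --end-time 2023-10-01T01:00:00Z \
  --period 300 \
  --statistics Average
```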
Read how the agent is structured and our current evaluation results against an AWS question dataset.
You can query your own AWS account by setting your API key in the following format:
llm keys set opstower --value AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY:AWS_REGION
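If you still need a read-only IAM user, one way to create one is with the AWS CLI and the AWS-managed ReadOnlyAccess policy (a sketch; the user name is just an example):

```sh
# Create a dedicated read-only IAM user (example name; any name works).
aws iam create-user --user-name opstower-readonly

# Attach the AWS-managed read-only policy.
aws iam attach-user-policy \
  --user-name opstower-readonly \
  --policy-arn arn:aws:iam::aws:policy/ReadOnlyAccess

# Generate the access key pair to plug into the key format above.
aws iam create-access-key --user-name opstower-readonly
```

Use the returned access key ID and secret access key, plus your region, in the AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY:AWS_REGION format shown above.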
You can execute commands against a demonstration AWS account that contains an EC2 instance, an RDS instance, an Application Load Balancer, several Lambda functions, S3 buckets, and more.
Set the opstower key to demo:
llm keys set opstower --value demo
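Then ask about the demo account's resources, for example:

llm "Which Lambda functions are deployed and what runtime does each use?"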
Just as a junior DevOps engineer isn't assigned high-level tasks, OpsTower.ai's training starts with a focus on information retrieval (ex: "what is our estimated AWS bill for this month"). After building up a base set of skills, OpsTower.ai will tackle reasoning you can't infer from a ready-made dashboard (ex: "which resources appear unused and/or over-provisioned, and how much could we reduce costs?").
Knowledge area | Status | Accuracy % |
---|---|---|
AWS services | Released | 92% |
AWS CloudWatch metrics | Released | 89% |
AWS billing | Released | 91% |
AWS CloudWatch logs | 🚧 | N/A |
AWS deployment | | N/A |
AWS security | | N/A |
Advanced AWS reasoning | | N/A |
Docker | | N/A |
Kubernetes | | N/A |
Terraform | | N/A |
Advanced Kubernetes reasoning | | N/A |
Incident investigation | | N/A |
- Only read-only operations are permitted. To be safe, you should only use an IAM user with read-only access.
- OpsTower.ai does not support higher-level and/or abstract questions like "Has there been a sudden change in any critical EC2, RDS, or S3 metrics?". Try to be specific (see the examples after this list).
- Support for the llm --continue option is not yet available.
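For example, instead of asking about "any critical metrics", break it into specific questions:

llm "What was the maximum CPU utilization of my EC2 instances over the past hour?"
llm "How many HTTP 5xx errors did my Application Load Balancer return over the past hour?"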
To uninstall the plugin:

llm uninstall llm-opstower