PeakIQ

PeakIQ is an AI-powered optimization tool designed to keep APIs running at peak performance. Built with an LLM-powered recommendation engine and deployed on the Akash Network, PeakIQ analyzes real-time metrics to detect bottlenecks and provide actionable insights for API improvements. These insights include recommendations for caching, load balancing, and database tuning.

Table of Contents

  • Project Overview
  • Features
  • Getting Started
  • API Endpoints
  • Usage Guide
  • Updates in Latest Version
  • Acknowledgments
  • License

Project Overview

PeakIQ aims to help developers and DevOps teams optimize API performance through intelligent, data-driven recommendations. The tool collects real-time performance metrics, analyzes them, and suggests improvements that reduce latency and improve resource utilization.


Features

  • Real-Time Metrics Tracking: Collects response times, request volumes, and error rates.
  • AI-Driven Recommendations: Suggests optimizations based on metrics, including caching, load balancing, and indexing.
  • Interactive Dashboard: Visualizes API metrics and recommendations for quick insights.
  • Containerized with Docker: Easily deployable on the Akash Network.

Getting Started

Prerequisites

  • Docker: Ensure Docker is installed and running.
  • Git: Use Git to clone the repository.

Clone the Repository

git clone https://github.com/your-username/PeakIQ.git
cd PeakIQ

Docker Setup

Build and Run the Application

The application consists of several services, including the backend, LLM engine, Redis database, and a frontend dashboard.

  1. Start Docker Containers: Run the following command to build and start the containers:

    docker-compose up --build
    
  2. Check Status: Verify that all containers (backend, llm_engine, redis, and dashboard) are running without errors. You should see logs indicating the backend is accessible at http://localhost:8000.

  3. Stop Containers: To stop the containers, use:

    docker-compose down
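
The repository's own docker-compose.yml is authoritative; as a rough illustration of the four services described above, a compose file might be structured like this (service names, build contexts, image tags, and port mappings here are assumptions, not the repository's actual file):

```yaml
# Illustrative sketch only -- see the docker-compose.yml in this repository.
services:
  backend:
    build: ./backend
    ports:
      - "8000:8000"       # backend reachable at http://localhost:8000
    depends_on:
      - redis
      - llm_engine
  llm_engine:
    build: ./llm_engine
    ports:
      - "8001:8001"       # direct LLM engine access (see LLM Endpoint below)
  redis:
    image: redis:7-alpine # metrics storage
  dashboard:
    build: ./dashboard
    ports:
      - "3000:80"         # dashboard port is an assumption
```

`depends_on` only orders container startup; it does not wait for Redis to be ready, so the backend should retry its Redis connection on boot.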
    

API Endpoints

1. Metrics Endpoint

Use this endpoint to submit API metrics that PeakIQ will analyze.

  • Endpoint: /metrics
  • Method: POST
  • Example Request:
curl -X POST http://localhost:8000/metrics -H "Content-Type: application/json" -d '{"endpoint": "/api/test", "response_time": 120, "status_code": 200, "timestamp": "2024-11-03T22:10:00"}'
    
  • Expected Response:
    {"status": "Metrics received"}
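
Metric payloads can also be built programmatically before POSTing. The helper below is an illustrative sketch, not part of PeakIQ; the field names simply mirror the example request above:

```python
import json
from datetime import datetime

def build_metric(endpoint, response_time_ms, status_code, timestamp=None):
    """Serialize one metric record in the shape the /metrics endpoint expects."""
    payload = {
        "endpoint": endpoint,
        "response_time": response_time_ms,
        "status_code": status_code,
        # Default to "now", matching the ISO-8601 format in the example above.
        "timestamp": timestamp or datetime.now().isoformat(timespec="seconds"),
    }
    return json.dumps(payload)

# Body for the example curl request shown above.
body = build_metric("/api/test", 120, 200, "2024-11-03T22:10:00")
```

The resulting string can be passed as the `-d` body of the curl call above, or sent with any HTTP client.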
    

2. Recommendations Endpoint

This endpoint provides optimization suggestions based on submitted metrics.

  • Endpoint: /recommendations
  • Method: POST
  • Example Request:
    curl -X POST "http://localhost:8000/recommendations" -H "Content-Type: application/json" -d '{"endpoint": "/api/test"}'
    
  • Expected Response:
    {"recommendation": "Consider caching this endpoint for optimization."}
    

3. LLM Endpoint

To directly interact with the LLM engine for testing purposes:

  • Endpoint: /recommendations
  • Port: 8001
  • Example Request:
    curl -X POST "http://localhost:8001/recommendations" -H "Content-Type: application/json" -d '{"text": "Optimize API performance"}'
    
  • Expected Response:
    {"recommendation": "Optimize API performance by..."}
    

Usage Guide

  1. Access the Dashboard: Open your browser and navigate to http://localhost:YOUR_DASHBOARD_PORT to view real-time metrics and recommendations.

  2. Submit Metrics: Use the /metrics endpoint to continuously submit data for any API you’re monitoring.

  3. View Recommendations: After submitting metrics, use the /recommendations endpoint to receive improvement suggestions. These insights are shown on the dashboard as well.


Updates in Latest Version

The latest merge implemented the following key features:

  1. Core Backend Features:
  • Metrics Collection and Storage: Added endpoints to collect, store, and retrieve API metrics. This includes capturing response times, status codes, and request timestamps for all API interactions.
  • Recommendation Engine: Integrated the LLM-powered recommendation endpoint to provide optimized suggestions for improving API performance based on usage data.
  2. API Enhancements:
  • POST /api/metrics: Collects API usage metrics and stores them in a data structure to be accessed by the frontend.
  • GET /api/metrics: Retrieves all stored metrics in real time, allowing for dynamic monitoring of API performance.
  • POST /api/recommendations: Generates performance optimization suggestions for API endpoints using the recommendation engine.
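
Conceptually, the metrics pair above behaves like an append-and-list store keyed by endpoint. The class below is a plain-Python illustration of that behavior, not the repository's actual implementation (`avg_response_time` is a hypothetical aggregate of the kind a recommendation engine might consume):

```python
from collections import defaultdict

class MetricsStore:
    """Minimal in-memory store mirroring POST /api/metrics and GET /api/metrics."""

    def __init__(self):
        # One list of metric records per API endpoint path.
        self._by_endpoint = defaultdict(list)

    def add(self, metric):
        """Record one metric (the POST /api/metrics behavior)."""
        self._by_endpoint[metric["endpoint"]].append(metric)
        return {"status": "Metrics received"}

    def all_metrics(self):
        """Return every stored metric (the GET /api/metrics behavior)."""
        return [m for records in self._by_endpoint.values() for m in records]

    def avg_response_time(self, endpoint):
        """Average response time for one endpoint -- the kind of aggregate
        a recommendation engine might use to flag slow endpoints."""
        records = self._by_endpoint[endpoint]
        return sum(m["response_time"] for m in records) / len(records)
```

In PeakIQ itself the store is backed by Redis rather than process memory, so metrics survive backend restarts and can be shared across services.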

Acknowledgments

This project was developed by Jackson Jacobson (github: @jjacobsonn). Special thanks to contributors and collaborators who supported its development.

If you use this project, please provide appropriate credit by linking back to this repository and following the guidelines outlined in the license.

License

This project is licensed under the MIT License. See the LICENSE file for more details.
