
RPAI2024

Applied LLM workshop material. Find me on LinkedIn

| Title | Open in Colab |
| --- | --- |
| 🐍 Python Refresher | Open In Colab |
| 📘 LLMs for Structuring Data | Open In Colab |
| 📝 Simple QA Retriever | Open In Colab |
| 🔍 Self-Query Retriever | Open In Colab |
| 🏷️ LLM Label Synthesis | Open In Colab |
| 🔧 LLM Finetuning | Open In Colab |

🚀 GenAI App Deployment with LangServe

📘 Tutorial 🗂️ HF Repository

Links & Cool Material for Building AI Applications

Below are some useful tools for building AI applications, each with a short overview and its key features.

Argilla

  • Overview: Argilla is an open-source data curation platform designed for Large Language Models (LLMs). It supports the entire MLOps lifecycle from data labeling to model monitoring.
  • Key Features:
    • Human and Machine Feedback: Combines human insights with machine learning for better data annotation.
    • Integration: Compatible with major NLP libraries like Hugging Face transformers, spaCy, and more.
    • End-to-End Solution: Facilitates the development, evaluation, and continuous improvement of NLP models.
    • Deployment: Easily deployable using Docker, enabling quick setup and scalability.

Phidata

  • Overview: Phidata is a framework for building AI Assistants equipped with memory, knowledge, and various tools.
  • Key Features:
    • Web Search Integration: Supports web searching via DuckDuckGo, Google, etc.
    • API Data Retrieval: Pulls data via libraries and APIs such as yfinance and Polygon.
    • Data Analysis: Utilizes SQL, DuckDB, etc., for analyzing data.
    • Research and Reporting: Capable of conducting research and generating comprehensive reports.
    • Task Automation: Automates tasks such as sending emails and querying databases.

Unsloth

  • Overview: Unsloth optimizes the fine-tuning of models such as Llama 3, Mistral, Phi-3, and Gemma, making the process faster and more memory-efficient.
  • Key Features:
    • Efficiency: Fine-tunes models 2-5x faster with up to 80% less memory usage.
    • Performance: Focuses on optimizing model performance and resource utilization.
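Much of the saving comes from 4-bit loading plus LoRA adapters, so only a small fraction of weights is trained. A loading sketch (requires a GPU; the checkpoint name and LoRA hyperparameters are illustrative):

```python
def load_lora_model(model_name="unsloth/llama-3-8b-bnb-4bit"):
    """Load a 4-bit quantized model and attach LoRA adapters via Unsloth."""
    from unsloth import FastLanguageModel  # GPU-only import

    model, tokenizer = FastLanguageModel.from_pretrained(
        model_name=model_name,   # pre-quantized 4-bit checkpoint
        max_seq_length=2048,
        load_in_4bit=True,
    )
    # Only the small LoRA adapter matrices will be trained:
    model = FastLanguageModel.get_peft_model(
        model,
        r=16,            # LoRA rank
        lora_alpha=16,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    )
    return model, tokenizer
```

The returned model can then be handed to a standard trainer (e.g. trl's `SFTTrainer`) as in the fine-tuning notebook above.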

LangServe

  • Overview: LangServe deploys LangChain chains and runnables as REST APIs, with a hosted option for one-click deployment.
  • Key Features:
    • Ease of Deployment: Simplifies the process of deploying LangChain applications.
    • Scalability: Designed for quick and scalable deployment in production environments.

Ollama

  • Overview: Ollama is a tool for downloading and running large language models (LLMs) locally, letting you get started quickly.
  • Key Features:
    • User-Friendly: Suitable for both beginners and advanced users.
    • Quick Setup: Helps users get up and running with LLMs efficiently.
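Once a model is pulled (e.g. `ollama pull llama3`), Ollama serves it over a local REST API on port 11434 by default. A stdlib-only sketch of a single non-streaming generation call (the model name and prompt are placeholders):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_payload(prompt: str, model: str = "llama3") -> bytes:
    """JSON body for a single, non-streaming generation request."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(prompt: str, model: str = "llama3") -> str:
    """Send the prompt to a locally running Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(generate("Explain tokenization in one sentence."))
```

There is also an official `ollama` Python package that wraps this API, but the raw endpoint makes it clear that any HTTP client can talk to a local model.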

Chainlit

  • Overview: Chainlit is an open-source asynchronous Python framework for building conversational AI and agentic applications.
  • Key Features:
    • Rapid Development: Enables building production-ready AI applications in minutes.
    • Scalability: Supports the development of scalable conversational AI solutions.
    • Community Support: Extensive documentation and community resources to assist developers.
