Applied LLM workshop material. Find me on LinkedIn
| Title | Open in Colab |
|---|---|
| 🐍 Python Refresher | |
| 📘 LLMs for Structuring Data | |
| 📝 Simple QA Retriever | |
| 🔍 Self-Query Retriever | |
| 🏷️ LLM Label Synthesis | |
| 🔧 LLM Finetuning | 📘 Tutorial 🗂️ HF Repository |
Below are tools worth knowing when building AI applications, each with a short overview and its key features.
### Argilla
- Overview: Argilla is an open-source data curation platform designed for Large Language Models (LLMs). It supports the entire MLOps lifecycle, from data labeling to model monitoring.
- Key Features:
- Human and Machine Feedback: Combines human insights with machine learning for better data annotation.
- Integration: Compatible with major NLP libraries like Hugging Face transformers, spaCy, and more.
- End-to-End Solution: Facilitates the development, evaluation, and continuous improvement of NLP models.
- Deployment: Easily deployable using Docker, enabling quick setup and scalability.
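As a sketch of the Docker-based setup (the image name and port below follow Argilla's documented quickstart at the time of writing; verify against the current docs before relying on them):

```shell
# Pull and run the Argilla quickstart image, which bundles the server and UI.
# Image tag and port 6900 are assumptions based on Argilla's quickstart docs.
docker run -d --name argilla -p 6900:6900 argilla/argilla-quickstart:latest

# The UI should then be reachable at http://localhost:6900
```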
### Phidata
- Overview: Phidata is a framework for building AI Assistants equipped with memory, knowledge, and various tools.
- Key Features:
- Web Search Integration: Supports web searching via DuckDuckGo, Google, etc.
- API Data Retrieval: Pulls data from APIs like yfinance and polygon.
- Data Analysis: Uses SQL, DuckDB, and similar tools to analyze data.
- Research and Reporting: Capable of conducting research and generating comprehensive reports.
- Task Automation: Automates tasks such as sending emails and querying databases.
### Unsloth
- Overview: Unsloth optimizes the fine-tuning of models such as Llama 3, Mistral, Phi-3, and Gemma, making the process faster and more memory-efficient.
- Key Features:
- Efficiency: Fine-tunes models 2-5x faster with up to 80% less memory usage.
- Performance: Focuses on optimizing model performance and resource utilization.
### LangServe
- Overview: LangServe helps deploy LangChain chains and runnables as REST APIs, with a hosted option for one-click deployment.
- Key Features:
- Ease of Deployment: Simplifies the process of deploying LangChain applications.
- Scalability: Designed for quick and scalable deployment in production environments.
### Ollama
- Overview: Ollama lets you download and run large language models (LLMs) locally with a simple command-line interface.
- Key Features:
- User-Friendly: Suitable for both beginners and advanced users.
- Quick Setup: Helps users get up and running with LLMs efficiently.
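A minimal local session might look like the following (the model name `llama3` is an example; any model from the Ollama library can be substituted, and Ollama itself must already be installed):

```shell
# Download the model weights (one-time, model name is an example).
ollama pull llama3

# Run a one-off prompt against the local model.
ollama run llama3 "Summarize what a large language model is in one sentence."
```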
### Chainlit
- Overview: Chainlit is an open-source asynchronous Python framework for building conversational AI and agentic applications.
- Key Features:
- Rapid Development: Enables building production-ready AI applications in minutes.
- Scalability: Supports the development of scalable conversational AI solutions.
- Community Support: Extensive documentation and community resources to assist developers.
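To get a feel for Chainlit, its built-in demo can be launched in two commands (assumes a working Python environment; `chainlit hello` is Chainlit's documented starter app):

```shell
# Install Chainlit into the current Python environment.
pip install chainlit

# Launch the built-in demo app in the browser.
chainlit hello
```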