中文Mixtral-8x7B(Chinese-Mixtral-8x7B)
Updated Apr 2, 2024 - Python
🤖️ An AI chat Telegram bot with web search, powered by GPT-3.5/4/4 Turbo/4o, DALL·E 3, Groq, Gemini 1.5 Pro/Flash and the official Claude 2.1/3/3.5 API, written in Python and deployable on Zeabur, fly.io and Replit.
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
🐳 Aurora is a Chinese-language MoE model. Aurora builds on Mixtral-8x7B with further training that activates the model's Chinese open-domain chat capability.
Fast Inference of MoE Models with CPU-GPU Orchestration
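MoE inference projects like the one above exploit the fact that a Mixtral-style layer routes each token through a gating network that selects only a small subset of experts. A minimal toy sketch of top-2 routing in plain Python (random placeholder weights, not the actual model):

```python
import math
import random

random.seed(0)
DIM, N_EXPERTS, TOP_K = 4, 8, 2  # Mixtral 8x7B uses 8 experts with top-2 routing

# Random placeholder weights; a real model learns these during training.
gate_w = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_EXPERTS)]
expert_w = [[[random.gauss(0, 1) for _ in range(DIM)] for _ in range(DIM)]
            for _ in range(N_EXPERTS)]

def matvec(w, x):
    return [sum(wi * xi for wi, xi in zip(row, x)) for row in w]

def moe_layer(x):
    # Gating: score every expert, keep the top-k, softmax over those scores.
    scores = matvec(gate_w, x)
    top = sorted(range(N_EXPERTS), key=lambda i: scores[i], reverse=True)[:TOP_K]
    exp_s = [math.exp(scores[i]) for i in top]
    probs = [s / sum(exp_s) for s in exp_s]
    # Only the selected experts run -- this sparsity is what makes
    # inference cheap relative to the total parameter count, and what
    # CPU-GPU orchestration schemes schedule around.
    out = [0.0] * DIM
    for p, i in zip(probs, top):
        for j, v in enumerate(matvec(expert_w[i], x)):
            out[j] += p * v
    return out, top, probs

y, chosen, probs = moe_layer([1.0, -0.5, 0.3, 0.7])
```

Since only `TOP_K` of the `N_EXPERTS` expert matrices are touched per token, the inactive experts can be kept in slower memory (e.g. CPU RAM) and paged in on demand.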
Build LLM-powered robots in your garage with MachinaScript For Robots!
Examples of RAG using Llamaindex with local LLMs - Gemma, Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B
Train LLMs (BLOOM, LLaMA, Baichuan2-7B, ChatGLM3-6B) with DeepSpeed pipeline-parallel mode. Faster than ZeRO/ZeRO++/FSDP.
An innovative Python project that integrates AI-driven agents for Agile software development, leveraging advanced language models and collaborative task automation.
Examples of RAG using LangChain with local LLMs - Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B
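The RAG examples above all follow the same pattern: embed document chunks, retrieve the chunks closest to the query, and prepend them to the prompt before calling the LLM. A toy sketch of that retrieval step in plain Python, with a bag-of-words similarity standing in for a real embedding model (the documents and query here are made-up placeholders):

```python
import math
from collections import Counter

# Toy corpus standing in for chunks of locally indexed documents.
docs = [
    "Mixtral 8x7B is a sparse mixture-of-experts language model.",
    "RAG retrieves relevant chunks and adds them to the prompt.",
    "ChromaDB is a vector database often used in RAG pipelines.",
]

def embed(text):
    # Stand-in embedding: a bag-of-words count vector. Real pipelines
    # use a learned embedding model behind LangChain or LlamaIndex.
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=1):
    # Rank chunks by similarity to the query and keep the top k.
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query):
    # The retrieved chunks become grounding context for the local LLM.
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("What does RAG add to the prompt?"))
```

The frameworks differ mainly in how they wrap these steps (loaders, splitters, vector stores, retrievers), not in the underlying pattern.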
Reference implementation of Mistral AI 7B v0.1 model.
An unofficial C#/.NET SDK for accessing the Mistral AI API
Tool for testing different large language models without writing code.
Notes on the Mistral AI model
Turn any YouTube video into a nice blog post, using Groq and Deepgram.
LLM prompt augmentation with RAG: integrate external custom data from a variety of sources and chat with those documents.
Unofficial .NET SDK for the Mistral AI platform.
Chat with your PDF files for free, using Langchain, Groq, ChromaDB, and Jina AI embeddings.
A versatile CLI and Python wrapper for Groq AI's breakthrough LPU Inference Engine. Streamline the creation of chatbots and generate dynamic text at speeds of up to 800 tokens/sec.
Examples of RAG using Llamaindex with local LLMs in Linux - Gemma, Mixtral 8x7B, Llama 2, Mistral 7B, Orca 2, Phi-2, Neural 7B