Interface

Overview

Interface is a privacy-preserving tool for interacting with cloud-hosted Large Language Models (LLMs). It ensures user data remains protected by leveraging an on-device Small Language Model (SLM) for initial queries.

If the SLM cannot generate a satisfactory response, the query is automatically routed to a cloud-hosted LLM. However, before sending the query, all Personally Identifiable Information (PII) is replaced with synthetic data to maintain privacy. Once the response is received, the tool reverses this process, restoring the original information seamlessly.
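The end-to-end flow can be summarised roughly as in the sketch below. Every helper in it is a trivial stand-in for the real components (the local SLM, the PII anonymiser, the cloud LLM); the actual implementation in this repository differs in detail.

  # Illustrative flow only; every helper is a stand-in, not project code.

  def query_local_slm(prompt: str) -> str:
      return ""  # stand-in: pretend the SLM could not answer this one

  def is_satisfactory(reply: str) -> bool:
      return bool(reply.strip())

  def anonymize(prompt: str) -> tuple[str, dict]:
      # Replace PII with synthetic placeholders and remember the mapping.
      return prompt.replace("John", "<NAME_1>"), {"<NAME_1>": "John"}

  def query_cloud_llm(prompt: str) -> str:
      return f"Cloud reply to: {prompt}"  # stand-in for the LLM call

  def deanonymize(reply: str, mapping: dict) -> str:
      for placeholder, original in mapping.items():
          reply = reply.replace(placeholder, original)
      return reply

  def answer(prompt: str) -> str:
      local_reply = query_local_slm(prompt)
      if is_satisfactory(local_reply):
          return local_reply                      # nothing leaves the device
      safe_prompt, mapping = anonymize(prompt)    # PII -> synthetic values
      cloud_reply = query_cloud_llm(safe_prompt)  # only anonymised text goes out
      return deanonymize(cloud_reply, mapping)    # restore the original values

  print(answer("Draft a polite email for John."))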

Key Features

  • On-Device Processing: Provides a local SLM for handling queries without external data exposure.
  • Intelligent Query Routing: Automatically forwards complex queries to cloud LLMs only when necessary.
  • Privacy Protection: Anonymizes sensitive data before transmission and restores it upon response.
  • Seamless Integration: Ensures a smooth user experience without compromising security.

Why Use Interface?

  • Protects user privacy while leveraging powerful cloud-based AI models.
  • Reduces reliance on cloud resources, minimizing costs and latency.
  • Ensures sensitive data never leaves the local environment in an identifiable form.

Getting Started

Follow the installation instructions to set up Interface on your system.

  1. Install Requirements

    • Use the following command:
      pip install -r requirements.txt
  2. Get a Free Gemini Key

    • Obtain a free Gemini API key
    • Add it as an environment variable:
      export GEMINI_API_KEY=your_api_key_here
  3. Install deepseek-r1:1.5b

    • Pull the on-device SLM with the following command (see the usage sketch after these steps):
      ollama pull deepseek-r1:1.5b
  4. Install en_core_web_lg, a pretrained English language model used for tasks such as semantic understanding and text classification

    • Run the following command:
      python -m spacy download en_core_web_lg
  5. Run the Project

    • Start the project using:
      python main.py
  6. Start Chatting


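To make steps 2 and 3 concrete, here is a minimal sketch of querying the local deepseek-r1:1.5b model through Ollama and falling back to Gemini. It assumes the ollama and google-generativeai Python packages are available, and the Gemini model name shown is a placeholder; the project's actual routing, satisfaction check, and anonymisation logic are more involved than this.

  # Minimal sketch, not the project's actual code.
  import os

  import google.generativeai as genai
  import ollama

  genai.configure(api_key=os.environ["GEMINI_API_KEY"])  # from step 2

  def ask_local(prompt: str) -> str:
      # deepseek-r1:1.5b is the on-device SLM pulled in step 3.
      reply = ollama.chat(
          model="deepseek-r1:1.5b",
          messages=[{"role": "user", "content": prompt}],
      )
      return reply["message"]["content"]

  def ask_cloud(prompt: str) -> str:
      # Cloud fallback; in Interface, PII is anonymised before this call.
      model = genai.GenerativeModel("gemini-1.5-flash")
      return model.generate_content(prompt).text

  question = "Summarise the benefits of on-device inference."
  answer = ask_local(question)
  if not answer.strip():  # placeholder for the real "satisfactory?" check
      answer = ask_cloud(question)
  print(answer)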

Feature Demo

Secure LLM feature (Anonymisation)

Interface uses Microsoft's Presidio to anonymise data in prompts before they are sent to the cloud LLM. Here is an example.
(Screenshots: an example prompt and its anonymised version)
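For reference, a minimal, self-contained sketch of this kind of Presidio usage is shown below. It uses the public presidio-analyzer and presidio-anonymizer APIs with a single catch-all "replace" operator; the entities, operators, and reversible mapping configured in this repository may differ. Presidio's default spaCy-based NLP engine uses the en_core_web_lg model installed in step 4.

  # Minimal Presidio sketch: detect PII, then replace each entity with a
  # placeholder. Illustrative only; Interface's configuration may differ.
  from presidio_analyzer import AnalyzerEngine
  from presidio_anonymizer import AnonymizerEngine
  from presidio_anonymizer.entities import OperatorConfig

  text = "Hi, I'm John Smith, reach me at john.smith@example.com."

  # Detect PII entities in the text.
  analyzer = AnalyzerEngine()
  results = analyzer.analyze(text=text, language="en")

  # Replace every detected entity with a generic placeholder.
  anonymizer = AnonymizerEngine()
  anonymized = anonymizer.anonymize(
      text=text,
      analyzer_results=results,
      operators={"DEFAULT": OperatorConfig("replace", {"new_value": "<REDACTED>"})},
  )

  print(anonymized.text)
  # e.g. "Hi, I'm <REDACTED>, reach me at <REDACTED>."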
