Tensorlink

Distributed AI Inference & Training for Everyone

What is Tensorlink?

Tensorlink is a Python library and decentralized platform that makes distributed AI accessible to everyone. Access Hugging Face models through simple APIs, run PyTorch models across a network of peers, or contribute compute resources to earn rewards, all without relying on centralized infrastructure.

Early Access: We're in active development! Some features are still stabilizing. Join our Discord for updates and support.

Key Features

  • Drop-in PyTorch replacement - Run models in your existing workflows without needing local VRAM
  • Simple REST APIs - Access Hugging Face models with familiar HTTP requests
  • Privacy-first architecture - Your data stays local, never stored on external servers
  • Earn while you contribute - Get rewarded for sharing idle compute resources

Quick Start

Tensorlink can be accessed via API or directly within Python.

Use the Inference API

import requests

response = requests.post(
    "http://smartnodes-lab.ddns.net/tensorlink-api/generate",
    json={
        "hf_name": "Qwen/Qwen2.5-7B-Instruct",
        "message": "Explain quantum computing in simple terms",
        "max_new_tokens": 256,
        "temperature": 0.7
    }
)

print(response.json())
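
Generation on a shared network can take a while, so in practice it helps to wrap the call with a timeout and basic error handling. The sketch below reuses the endpoint and payload from the example above; the timeout value is arbitrary, and since the response schema isn't documented here, it simply prints the raw JSON.

import requests

payload = {
    "hf_name": "Qwen/Qwen2.5-7B-Instruct",
    "message": "Explain quantum computing in simple terms",
    "max_new_tokens": 256,
    "temperature": 0.7,
}

try:
    # Allow up to two minutes for the network to generate a response
    response = requests.post(
        "http://smartnodes-lab.ddns.net/tensorlink-api/generate",
        json=payload,
        timeout=120,
    )
    response.raise_for_status()  # surface HTTP errors (4xx/5xx)
    print(response.json())
except requests.exceptions.RequestException as exc:
    print(f"Inference request failed: {exc}")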

Installation

pip install tensorlink

Requirements: Python 3.10+, PyTorch 2.3+, UNIX/macOS (Windows support coming soon)

Run Your First Distributed Model

from tensorlink import DistributedModel
from transformers import AutoTokenizer
import torch

# Connect to a pre-trained model on the network
model = DistributedModel(
    model="Qwen/Qwen2.5-7B-Instruct",
    training=False,
    device="cuda",
    dtype=torch.float16
)

# Tokenize the prompt locally, then use the model like any PyTorch model
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct")
inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Contribute Compute (Mining)

  1. Download the latest tensorlink-miner from Releases
  2. Configure your wallet address in config.json (see the sketch below)
  3. Run: ./run-worker.sh

That's it! Your GPU will earn rewards by processing AI workloads from the network.
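
The exact keys in config.json are defined by the file bundled with the tensorlink-miner release; the field name and address format below are illustrative placeholders only, not the confirmed schema.

{
    "wallet_address": "YOUR_WALLET_ADDRESS"
}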

Learn More

See the project documentation for detailed guides, and join our Discord for updates and support.

Contributing

We welcome contributions! Open an issue or pull request on GitHub, or join our Discord to get involved.

License

Tensorlink is released under the MIT License.
