Deepseek Coder

See marcoscannabrava/local-llm for a more recent and robust multi-model implementation leveraging the excellent llm package.

This package uses the Deepseek Coder 1.3B Instruct language model for code generation in the terminal.

A GPU is not required, but one makes responses faster.

It's based on:

Installation

pip install deepseek-coder

Usage

deepseek-coder

The first time you run deepseek-coder, the model weights are downloaded, which can take a while.

tail -f /tmp/deepseek_coder.log # this will show the download progress
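Since tail -f blocks until you interrupt it, a small sketch that checks the log once and returns immediately (assuming the default log path above; adjust if you passed --logpath):

```shell
# Default log path from the command above (an assumption if --logpath was used).
LOG="/tmp/deepseek_coder.log"

if [ -f "$LOG" ]; then
  tail -n 20 "$LOG"            # last 20 lines of download/progress output
else
  echo "No log yet at $LOG"    # the first run has not started writing logs
fi
```

Unlike tail -f, this exits right away, so it is convenient inside scripts or a watch loop.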

Options

deepseek-coder --logpath /path/to/logs

Backlog

  • [ ] improve Aider default prompts and prevent multiple completions due to "improperly formatted response"
  • [ ] env var for tokens
  • [ ] multi-platform tests
