GPT3 Tokenizer

This is an isomorphic TypeScript tokenizer for OpenAI's GPT-3 model, with support for both gpt3 and codex tokenization. It should work in both Node.js and browser environments.

Usage

First, install:

yarn add gpt3-tokenizer
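
Or, if you use npm instead of Yarn, the equivalent install should be:

npm install gpt3-tokenizer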

In code:

import { GPT3Tokenizer } from 'gpt3-tokenizer';

const tokenizer = new GPT3Tokenizer({ type: 'gpt3' }); // or 'codex'
const str = "hello 👋 world 🌍";
const encoded: { bpe: number[]; text: string[] } = tokenizer.encode(str);
const decoded = tokenizer.decode(encoded.bpe);
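
A common use is counting the tokens in a prompt before sending it to the API. A minimal sketch, assuming the same named export and encode() result shape shown above (the token count is simply the length of the bpe array):

import { GPT3Tokenizer } from 'gpt3-tokenizer';

const tokenizer = new GPT3Tokenizer({ type: 'gpt3' });

// Returns the number of BPE tokens the model would see for this text.
function countTokens(text: string): number {
  const { bpe } = tokenizer.encode(text);
  return bpe.length;
}

console.log(countTokens('hello 👋 world 🌍'));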

Reference

This library is based on the following:

The main difference between this library and gpt-3-encoder is that this library supports both gpt3 and codex tokenization (the dictionary is taken directly from OpenAI, so the tokenization results match the OpenAI Playground). In addition, the Map API is used instead of plain JavaScript objects, in particular for the bpeRanks object, which should improve performance.
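
To illustrate that design choice (this is a sketch only; the actual bpeRanks keys and values inside the library may be shaped differently): a Map keyed by a joined byte-pair string avoids prototype-key pitfalls of plain objects and keeps lookups fast inside the BPE merge loop.

// Hypothetical example, not the library's internal code.
const bpeRanks = new Map<string, number>();
bpeRanks.set('h,e', 0);
bpeRanks.set('l,l', 1);

function rankOf(pair: [string, string]): number {
  const rank = bpeRanks.get(pair.join(','));
  return rank ?? Number.POSITIVE_INFINITY; // unknown pairs never win a merge
}

console.log(rankOf(['h', 'e'])); // 0
console.log(rankOf(['x', 'y'])); // Infinity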

License

MIT
