LLM x Law Hackathon.

We are fine-tuning MPT-7B (or 40B) on Pile of Law and LegalBench. We then take a convex combination of the fine-tuned weights with an instruct model's weights.
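A minimal sketch of the weight-interpolation step, assuming both checkpoints share the same MPT architecture; the legal fine-tune path and the mixing coefficient `ALPHA` are placeholders, not the project's actual values:

```python
# Convex combination of a legal fine-tune with an instruct model:
# theta_merged = ALPHA * theta_legal + (1 - ALPHA) * theta_instruct
import torch
from transformers import AutoModelForCausalLM

ALPHA = 0.5  # mixing coefficient in [0, 1] (placeholder)

legal = AutoModelForCausalLM.from_pretrained(
    "path/to/mpt-7b-pile-of-law-finetune",  # hypothetical fine-tuned checkpoint
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)
instruct = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b-instruct",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)

# Interpolate every parameter tensor; shapes must match across the two models.
instruct_state = instruct.state_dict()
merged_state = {
    name: ALPHA * param + (1 - ALPHA) * instruct_state[name]
    for name, param in legal.state_dict().items()
}

legal.load_state_dict(merged_state)
legal.save_pretrained("mpt-7b-legal-instruct-merged")
```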

  • Few-shot prompt MPT models (their context length is up to 65k tokens, compared to 2k for Falcon); see the sketch below.
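A few-shot prompting sketch for an MPT model. The model name, example Q/A pairs, and query are placeholder assumptions; the point is that the long MPT context window leaves room for many more in-context examples than a 2k-context model would allow.

```python
# Few-shot prompting with an MPT model via Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-7b-instruct"  # assumption: any MPT instruct/long-context variant
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Placeholder in-context examples; with a 65k window this list can be much longer.
few_shot_examples = [
    ("Is an oral agreement to sell land enforceable?",
     "Generally no; the statute of frauds requires land-sale contracts to be in writing."),
    ("Can a minor disaffirm a contract?",
     "Yes; contracts entered into by minors are typically voidable at the minor's option."),
]
query = "Does a signed email exchange satisfy a writing requirement?"

# Assemble the few-shot prompt and generate a completion for the final question.
prompt = "".join(f"Q: {q}\nA: {a}\n\n" for q, a in few_shot_examples) + f"Q: {query}\nA:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0, inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```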

