LLM x Law Hackathon. Plan: fine-tune MPT-7B (or 40B) on Pile of Law and LegalBench, then take a convex combination of the fine-tuned weights with those of an instruct model. For inference, few-shot prompt the MPT models (the MPT-7B-StoryWriter variant supports a 65k-token context, versus 2k for Falcon).
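The convex-combination step could be sketched as below. This is a minimal illustration, not the actual merge script: `convex_combine` is a hypothetical helper, and the toy dicts of floats stand in for real PyTorch state dicts (the same keywise interpolation applies to tensors).

```python
def convex_combine(base_sd, instruct_sd, alpha=0.5):
    """Keywise linear interpolation: alpha*base + (1 - alpha)*instruct.

    Assumes both models share an identical architecture (same keys and shapes),
    which holds when both are derived from the same MPT base checkpoint.
    """
    assert base_sd.keys() == instruct_sd.keys(), "models must share architecture"
    return {k: alpha * base_sd[k] + (1 - alpha) * instruct_sd[k] for k in base_sd}

# Toy example: scalar "parameters" standing in for tensors.
finetuned = {"w": 1.0, "b": 0.0}   # hypothetical Pile of Law fine-tune
instruct  = {"w": 3.0, "b": 2.0}   # hypothetical instruct model
merged = convex_combine(finetuned, instruct, alpha=0.25)
# merged["w"] = 0.25*1.0 + 0.75*3.0 = 2.5
```

With `alpha=1.0` the merge reduces to the pure fine-tune and with `alpha=0.0` to the pure instruct model, so `alpha` can be swept to trade legal-domain knowledge against instruction-following.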