
Most of the project needs to be completed by the end of March.

Questions

  • Standard LoRA: which training configuration should we use? One LoRA adapter per dataset, one per task type, or one for all tasks? (A minimal sketch of the per-task-type option follows this list.)
  • LoRA MoE: which training configuration should we use?
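
A minimal sketch of the "one LoRA adapter per task type" option using Hugging Face PEFT. The base model name, rank, target modules, and adapter names are illustrative assumptions, not decided project settings.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")  # assumed base model

lora_cfg = LoraConfig(
    r=8,                                   # rank (assumed)
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],   # assumed target modules
    task_type="CAUSAL_LM",
)

# One adapter per task type: separate adapters (e.g. XBRL extraction,
# sentiment) share the same frozen base weights.
model = get_peft_model(base, lora_cfg, adapter_name="xbrl_extraction")
model.add_adapter("sentiment", lora_cfg)
model.set_adapter("xbrl_extraction")  # activate the adapter to train or serve
```

The per-dataset and all-tasks options would use the same API; only the number of adapters and the data routed to each one changes.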

Todo

Datasets

  • Confirm the XBRL extraction pipeline (i.e., decide what the subtasks are) and finish creating the dataset. (A possible record layout is sketched below.)
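
One possible record layout for the XBRL extraction dataset, assuming an instruction-tuning format. The field names, concept, and values are placeholders, not the finalized schema.

```python
# Placeholder instruction / input / output record for XBRL extraction.
example = {
    "instruction": "Extract the value reported for the given US GAAP concept "
                   "from the XBRL context.",
    "input": "<xbrl context snippet> Concept: us-gaap:Revenues",
    "output": "12345000",
}
```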

Finetuning

  • Fine-tune on XBRL extraction. (A minimal training sketch follows.)
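
A minimal fine-tuning sketch for an XBRL extraction adapter with Transformers and PEFT, assuming the dataset uses the instruction/input/output record format sketched above. The base model name, file path, and hyperparameters are placeholders.

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model

model_name = "meta-llama/Llama-3.1-8B-Instruct"   # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(model_name)
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16,
                                         target_modules=["q_proj", "v_proj"],
                                         task_type="CAUSAL_LM"))

# Assumed JSONL file with instruction / input / output fields.
data = load_dataset("json", data_files="xbrl_extraction_train.jsonl", split="train")

def tokenize(ex):
    text = f"{ex['instruction']}\n{ex['input']}\nAnswer: {ex['output']}{tokenizer.eos_token}"
    return tokenizer(text, truncation=True, max_length=1024)

data = data.map(tokenize, remove_columns=data.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora_xbrl_extraction",
                           per_device_train_batch_size=2,
                           num_train_epochs=1,
                           learning_rate=1e-4),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("lora_xbrl_extraction")  # saves only the LoRA adapter weights
```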

Benchmark

  • FNXL