best model: deepset/xlm-roberta-large-squad2
tokenizer: mecab.bpe64k.ko
execute.py -> hyperparameter tuning (using Colab)
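The notes do not say which hyperparameters execute.py tunes, so as a minimal sketch, a simple grid search over a hypothetical search space (the parameter names and values below are assumptions, not taken from the project):

```python
from itertools import product

# Hypothetical search space; the actual parameters tuned by execute.py
# are not specified in the notes.
search_space = {
    "learning_rate": [1e-5, 3e-5, 5e-5],
    "batch_size": [8, 16],
}

def grid(space):
    """Yield every hyperparameter combination as a dict."""
    keys = list(space)
    for values in product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

configs = list(grid(search_space))
# 3 learning rates x 2 batch sizes -> 6 configurations to evaluate
```

Each config dict would then be passed to a single training run, with the best one kept by validation score.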
Use dense and sparse retrieval scores together
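One common way to combine the two score types is to normalize each per query and take a weighted sum. A minimal sketch, assuming min-max normalization and a mixing weight `alpha` (both are illustrative choices, not specified in the notes):

```python
def minmax(scores):
    """Scale a list of scores into [0, 1] so sparse (e.g. BM25/TF-IDF)
    and dense (e.g. embedding dot-product) scores become comparable."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

def hybrid_scores(sparse, dense, alpha=0.5):
    """Per-document weighted combination of sparse and dense scores.
    alpha is a hypothetical mixing weight to be tuned on validation data."""
    s, d = minmax(sparse), minmax(dense)
    return [alpha * a + (1 - alpha) * b for a, b in zip(s, d)]

# Example with three candidate passages: passage 1 wins overall
# because it ranks first on both score lists.
combined = hybrid_scores([10.0, 12.0, 3.0], [0.2, 0.9, 0.4])
```

Rank-based fusion (e.g. reciprocal rank fusion) is an alternative when the two score distributions are hard to normalize.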
Reader improvement ideas
-> Since BERT truncates input at max sequence length anyway, it is unclear whether this will have much impact.
-> The backbone network can be reused regardless of the downstream task.
Retrieval improvement ideas (todo list)
-> Use only the substring of the context up to max_sequence_length.
-> Use overlap to split one context into multiple chunks and store them.
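The overlap idea above can be sketched as a sliding window over the tokenized context, so that an answer near a chunk boundary still appears whole in at least one chunk. The `max_len` and `stride` values are illustrative assumptions, not values from the notes:

```python
def chunk_with_overlap(tokens, max_len=384, stride=128):
    """Split a tokenized context into overlapping windows.

    Consecutive chunks share `stride` tokens, so an answer span that
    straddles one chunk's boundary is fully contained in the next.
    max_len/stride are hypothetical defaults to be tuned per model.
    """
    chunks = []
    start = 0
    while start < len(tokens):
        chunks.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # final window reached the end of the context
        start += max_len - stride
    return chunks

# A 1000-token context becomes 4 chunks; adjacent chunks overlap by 128 tokens.
chunks = chunk_with_overlap(list(range(1000)), max_len=384, stride=128)
```

Each chunk would then be stored as its own retrieval unit, with a mapping back to the original document so duplicate hits can be merged at query time.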
Role assignments
Focus on implementing retrieval until Thursday.