Investigating the extent to which transformer-based language models learn long-distance dependencies

Author

  • Yingqin HU

The goal of this thesis is to assess whether, and to what extent, transformer-based language models acquire long-distance dependencies by comparing their behavior with that of humans.

Three main studies were conducted:

  • Modeling acceptability and log-probability (model.R; a regression sketch appears at the end of this section)
  • Comparing acceptability and log-probability in subject-island studies (code_for_compute_lp_surprisal.ipynb)
  • Comparing reading time and surprisal on de qui/dont relative clauses (code_for_compute_lp_surprisal.ipynb; a sketch of the log-probability/surprisal computation follows this list)
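
The two notebook-based studies both rely on per-token log-probabilities from a causal language model, from which surprisal is derived as the negative log-probability of a word given its left context. Below is a minimal sketch of that computation using the Hugging Face transformers library; the model name "gpt2" and the helper function are illustrative placeholders, not the thesis's exact code (which lives in code_for_compute_lp_surprisal.ipynb).

```python
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model; the thesis may use a different (e.g. French) LM.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def token_surprisals(sentence: str):
    """Return (token, surprisal in bits) for every token after the first."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits          # (1, seq_len, vocab_size)
    log_probs = torch.log_softmax(logits, dim=-1)
    ids = enc["input_ids"][0]
    results = []
    for i in range(1, len(ids)):
        # Logits at position i-1 predict token i, so this indexes
        # log P(token_i | tokens_<i).
        lp = log_probs[0, i - 1, ids[i]].item()
        results.append((tokenizer.decode(ids[i]), -lp / math.log(2)))
    return results

print(token_surprisals("The cat that the dog chased ran away."))
```

A sentence's log-probability is then the sum of its per-token log-probabilities, which (up to the change of base to bits) is the negative of the summed surprisals.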

In addition, plot.ipynb was used to generate plots to visualize the results obtained from these studies.
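
For the first study, acceptability ratings are related to log-probability in R (model.R). As a rough illustration of that kind of analysis, here is a minimal Python sketch of a mixed-effects regression using statsmodels; the column names (rating, logprob, participant) and the file ratings.csv are hypothetical, and the actual specification in model.R may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per (participant, sentence) with an acceptability
# rating and the sentence's log-probability under the language model.
df = pd.read_csv("ratings.csv")

# Random intercepts by participant; model.R's actual specification may differ.
md = smf.mixedlm("rating ~ logprob", df, groups=df["participant"])
fit = md.fit()
print(fit.summary())
```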
