---
tags:
  - from_paper
---

# Embeddings Can't Possibly Be Right

I was randomly reading a paper from a journal whose name sounded legitimate but which turned out to be one of those predatory journals (supposedly with an acceptance rate of 90+%, which is why I'm not linking to it). The authors raised something provocative that I thought was worth debunking:

One of the core insights of (statistical) NLP is the idea of using embeddings (vectors in Euclidean space) to represent words (concepts). However, this seems to raise some philosophical quandaries; namely, now that we're in Euclidean space, which is equipped with a natural metric, we can measure an exact numerical distance between any two concepts.
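
To make that concrete, here is a minimal sketch of what "equipped with a natural metric" buys us. The vectors below are hand-picked toy values, not trained embeddings, and the function names are mine; it's just meant to illustrate the kind of distance arithmetic the paper's authors find suspect.

```python
import numpy as np

# Hypothetical 3-dimensional "embeddings", for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.7, 0.7, 0.2]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def euclidean(u: np.ndarray, v: np.ndarray) -> float:
    """The natural metric on R^n: straight-line distance between two vectors."""
    return float(np.linalg.norm(u - v))

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Angle-based similarity, the measure more commonly used in practice."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# "king" is numerically close to "queen" and far from "apple".
print(euclidean(embeddings["king"], embeddings["queen"]))
print(euclidean(embeddings["king"], embeddings["apple"]))
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
```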

Aside: distributional hypothesis vs distributed representation? I'm pretty sure I use these terms interchangeably.