Updating global variables without locking #35

Open
GoogleCodeExporter opened this issue Jul 21, 2015 · 0 comments

Some of the global variables are updated in the function TrainModelThread 
without the use of locking. Here is one of them:

word_count_actual += word_count - last_word_count;

Can someone please explain how this works?
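
For context on this first question: the quoted line is a plain read-modify-write on a shared counter, executed concurrently by every training thread. Below is a stand-alone sketch of that pattern; the thread count, iteration counts, and test harness are made up for illustration, and only the unsynchronized update of word_count_actual mirrors what word2vec.c does. The usual reading is that the race is tolerated because this counter only drives the progress display and the learning-rate decay, so a small undercount is harmless.

    #include <stdio.h>
    #include <pthread.h>

    #define NUM_THREADS 4
    #define ITERS 1000000LL

    long long word_count_actual = 0;   /* shared; updated with no lock */

    void *TrainModelThread(void *id) {
        long long word_count = 0, last_word_count = 0;
        (void)id;  /* thread id unused in this sketch */
        for (long long i = 0; i < ITERS; i++) {
            word_count++;
            if (word_count - last_word_count > 10000) {
                /* Unsynchronized read-modify-write: increments from
                   other threads can be lost between the read and the
                   write back, but nothing downstream needs the counter
                   to be exact. */
                word_count_actual += word_count - last_word_count;
                last_word_count = word_count;
            }
        }
        word_count_actual += word_count - last_word_count;
        return NULL;
    }

    int main(void) {
        pthread_t pt[NUM_THREADS];
        for (long a = 0; a < NUM_THREADS; a++)
            pthread_create(&pt[a], NULL, TrainModelThread, (void *)a);
        for (long a = 0; a < NUM_THREADS; a++)
            pthread_join(pt[a], NULL);
        /* Often prints slightly less than 4000000: the lost updates. */
        printf("word_count_actual = %lld\n", word_count_actual);
        return 0;
    }

Compiling this with gcc -pthread and running it a few times shows the undercount directly.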

Also, can someone please help me with the following question:

For negative sampling, why do we need a unigram table to draw the negative
samples from? Why can't we just pick words uniformly at random from the vocabulary?
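
For reference on this second question: the table samples words neither uniformly nor in proportion to their raw frequency; word2vec.c raises each count to the power 0.75, which damps very frequent words while still sampling them more often than rare ones. Here is a minimal sketch of that construction. BuildUnigramTable and its parameters are hypothetical names for illustration; the real InitUnigramTable in word2vec.c works on its global vocab array and uses a table of 1e8 entries.

    #include <stdlib.h>
    #include <math.h>

    /* Build a sampling table in which word i occupies a share of slots
       proportional to count[i]^0.75. Drawing table[rand() % table_size]
       then yields negative samples from the smoothed unigram
       distribution. */
    int *BuildUnigramTable(const long long *count, int vocab_size,
                           int table_size) {
        const double power = 0.75;  /* flattens raw frequencies */
        double train_words_pow = 0.0, d1;
        int *table = malloc(table_size * sizeof(int));
        int i = 0;
        for (int a = 0; a < vocab_size; a++)
            train_words_pow += pow((double)count[a], power);
        d1 = pow((double)count[0], power) / train_words_pow;
        for (int a = 0; a < table_size; a++) {
            table[a] = i;
            /* Advance to the next word once the current word's share
               of the table has been filled. */
            if (a / (double)table_size > d1) {
                i++;
                if (i >= vocab_size) i = vocab_size - 1;
                d1 += pow((double)count[i], power) / train_words_pow;
            }
        }
        return table;
    }

Choosing uniformly from the vocabulary would make the negatives overwhelmingly rare words; the 0.75 exponent is the compromise that Mikolov et al. report as significantly outperforming both the raw unigram and the uniform distribution.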

Sincerely,

Vishal

Original issue reported on code.google.com by [email protected] on 21 Jul 2015 at 6:26
