
verbose #95

Open
weichen1984 opened this issue Apr 27, 2017 · 3 comments
Comments

@weichen1984

Am I missing something? I can't seem to find where to set a verbose flag. I'd like to see how the metrics improve after each iteration, similar to the output from the original implementation.

@brawner

brawner commented Jun 19, 2017

It does not appear that there is any logging capability currently. libFM prints the training and test regression or classification accuracy after each iteration, and with scipy minimization you can specify how often details should be printed. That output is very helpful for understanding how quickly things are converging.
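For the scipy route, a minimal sketch of per-iteration progress output via the `callback` option of `scipy.optimize.minimize` (the objective here is a made-up quadratic, purely for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Toy quadratic objective whose minimum is at x = [1, 2].
def objective(x):
    return (x[0] - 1) ** 2 + (x[1] - 2) ** 2

iteration = [0]

def report(xk):
    # scipy calls this after each iteration with the current iterate.
    iteration[0] += 1
    print(f"iter {iteration[0]}: loss = {objective(xk):.6f}")

result = minimize(objective, x0=np.zeros(2), method="L-BFGS-B", callback=report)
print("converged:", result.success)
```

`options={'disp': True}` gives similar solver-side output for many methods, but a callback lets you log whatever metric you care about.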

It would be great to see some logging/output capability added. Thanks for this excellent tool though!

@ibayer
Owner

ibayer commented Jun 20, 2017

The model parameters in fastFM can be inspected after each iteration without much overhead, which lets you create your own logging output in Python. This approach is much more flexible but requires some boilerplate code (see http://ibayer.github.io/fastFM/guide.html#learning-curves).

@brawner

brawner commented Jun 20, 2017 via email
