
Remove Theano from Week 5 #348

Merged · 3 commits into master · Feb 8, 2021
Conversation

@yhn112 (Collaborator) commented Apr 10, 2020

Fixes #174. Part of #258.

@review-notebook-app commented

Check out this pull request on ReviewNB. You'll be able to see the Jupyter notebook diff and discuss changes. Powered by ReviewNB.

@dniku marked this pull request as draft on April 10, 2020 20:08
@dniku changed the title from "Remove theano from week5" to "Remove Theano from Week 5" on Apr 10, 2020
@dniku (Collaborator) left a comment

  • Cleanup notebook metadata
  • Prettify Python code
  • Remove redundant outputs
  • Fix Colab initialization

@dniku (Collaborator) left a comment

This is a superficial review. I haven't run the code or even opened the notebook; I've only looked at the diff. The whole thing looks close to finished. I'll test whether it actually works later.

"""
A single-file module that makes your lasagne network into a bayesian neural net.
Originally created by github.com/ferrine , rewritten by github.com/justheuristic for simplicity
import torch
Collaborator:

Since this file is converted from someone's module, I believe credit should be given to the original author.

self.bias.data.zero_()

def forward(self, x):

Collaborator:

Redundant newline.

# Local reparameterization trick
out = mean + std * epsilon


Collaborator:

Redundant newline.
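
For context on the snippet above: a minimal sketch of the local reparameterization trick, assuming a Gaussian weight posterior. The function and parameter names here are illustrative, not the PR's actual code.

    import torch
    import torch.nn.functional as F

    def local_reparam_forward(x, w_mu, w_log_sigma, eps=1e-8):
        # Rather than sampling a weight matrix, sample the pre-activations
        # directly: under q(w) = N(w_mu, sigma^2), the pre-activations are
        # Gaussian with the mean and variance computed below.
        mean = F.linear(x, w_mu)
        var = F.linear(x ** 2, torch.exp(2 * w_log_sigma))  # sigma^2 per weight
        std = torch.sqrt(var + eps)
        epsilon = torch.randn_like(std)  # one noise sample per activation
        return mean + std * epsilon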

@@ -2,7 +2,7 @@
"cells": [
{
"cell_type": "code",
"execution_count": null,
"execution_count": 1,
Collaborator:

Please clean these up. Use my script or Jupyter's native feature (Cell -> execution counts -> clear all, I believe).
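
Programmatically, the cleanup could look like this, as a sketch using nbformat; the notebook path is illustrative.

    import nbformat

    path = "week05_explore/bayesian_exploration.ipynb"  # illustrative path
    nb = nbformat.read(path, as_version=4)
    for cell in nb.cells:
        if cell.cell_type == "code":
            cell.execution_count = None  # reset "execution_count": 1 back to null
            cell.outputs = []            # drop stored outputs
    nbformat.write(nb, path)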

"metadata": {},
"outputs": [],
"source": [
"def calc_kl(model):\n",
Collaborator:

This is already implemented in ModuleWrapper.forward, no?

" for module in model.modules():\n",
" if hasattr(module, 'kl_loss'):\n",
" kl = kl + module.kl_loss()\n",
" return kl/10000"
Collaborator:

/10000 ?
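
For reference, Bayes by Backprop weights the KL term by the number of minibatches per epoch rather than a hard-coded constant; a minimal sketch of the intended scaling (names are illustrative):

    def elbo_loss(nll, kl, num_batches):
        # Scale the KL term so that, summed over one epoch, the full
        # KL divergence is counted exactly once -- instead of dividing
        # by an arbitrary constant like 10000.
        return nll + kl / num_batches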

" updates=updates,\n",
" allow_input_downcast=True)\n",
" self.n_actions = n_actions\n",
" self.model = < Your network here> # Use BBBLinear instead of Linear layers\n",
Collaborator:

<YOUR CODE: network. Use BBBLinear instead of linear layers>
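
For instance, the placeholder could be filled with a small sequential stack. This is a sketch: it assumes BBBLinear from this PR's bayes.py is in scope, and the dimensions are illustrative.

    import torch.nn as nn

    state_dim, hidden_size, n_actions = 4, 64, 2  # illustrative dimensions
    model = nn.Sequential(
        BBBLinear(state_dim, hidden_size),
        nn.ReLU(),
        BBBLinear(hidden_size, n_actions),
    )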

@@ -1423,9 +1365,22 @@
}
],
"metadata": {
"kernelspec": {
Collaborator:

Please clean this up.

"\n",
" def predict_sample_rewards(self, states):\n",
" return self.model(torch.Tensor(states))\n",
Collaborator:

tensor is lowercase nowadays
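
The distinction being pointed out: torch.Tensor is the legacy class constructor and always produces float32, while the lowercase torch.tensor factory infers dtype from its argument. A quick illustration:

    import numpy as np
    import torch

    states = np.zeros((4, 3))                      # float64 by default
    a = torch.Tensor(states)                       # legacy constructor: float32
    b = torch.tensor(states, dtype=torch.float32)  # preferred factory function
    assert a.dtype == b.dtype == torch.float32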

@Qwasser (Contributor) left a comment

Left some comments on code style. Will review how it works later.
You should fix the environment setup in the notebook, since it no longer requires a Theano installation.


__all__ = ['NormalApproximation', 'get_var_cost', 'bbpwrap']
class GaussianVariational(nn.Module):
#Samples weights for variational inference as in Weights Uncertainity on Neural Networks (Bayes by backprop paper)
Contributor:

Needs a space after #.
Add a link to the referenced paper.



class ScaleMixturePrior(nn.Module):
#Calculates a Scale Mixture Prior distribution for the prior part of the complexity cost on Bayes by Backprop paper
Contributor:

Use ''' docstrings. Add a link to the paper.
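
For reference, the scale mixture prior from the Bayes by Backprop paper is a two-component Gaussian mixture; a minimal sketch (the hyperparameter values are illustrative):

    import math
    import torch

    def scale_mixture_log_prior(w, pi=0.5, sigma1=1.0, sigma2=0.0025):
        # log p(w) under pi * N(0, sigma1^2) + (1 - pi) * N(0, sigma2^2),
        # summed over all weights (Blundell et al., 2015).
        def gaussian_pdf(x, sigma):
            return torch.exp(-x ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
        return torch.log(pi * gaussian_pdf(w, sigma1) + (1 - pi) * gaussian_pdf(w, sigma2)).sum()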

log q(weights|learned mu and rho) aka log q(theta|x)
Calculates the log_likelihood for each of the weights sampled relative to a prior distribution as a part of the complexity cost
returns:
torch.tensor with shape []
Contributor:

what shape?

# if layer is bayesian or pretends so
cost += layer.get_var_cost()
return cost
Bayesian Linear layer, implements the linear layer proposed on Weight Uncertainity on Neural Networks
Contributor:

link to paper

freeze = False):
super().__init__()

#our main parameters
Contributor:

Spaces after #. Comment is redundant.

self.bias = bias
self.freeze = freeze

#parameters for the scale mixture prior
Contributor:

Space

def forward(self, x):
# Sample the weights and forward it

#if the model is frozen, return frozen
Contributor:

Comment repeats the code. Remove it.

# Sample the weights and forward it

#if the model is frozen, return frozen
if self.freeze:
Contributor:

It is a boolean variable. Name it is_frozen.

(Two additional review threads on week05_explore/bayes.py were marked outdated and resolved.)
@Qwasser (Contributor) left a comment

  • Supply a docstring for your layer implementation. It will help students understand it.
  • In the notebook, the sample_prediction method does not work without .detach().numpy() (see the sketch after this list).
  • Is it possible to make a 2-layer network now? If not, then the "your code here" section in __init__ of BNNAgent is redundant. Can you fill it with the model, or provide the possibility of a multilayer agent?
  • Why is the Bayesian layer called BBBLayer?
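
On the .detach().numpy() point: a hedged sketch of what the method could look like. The method and attribute names are assumptions, not the PR's actual code.

    import torch

    def sample_prediction(self, states, n_samples=10):
        x = torch.as_tensor(states, dtype=torch.float32)
        # each forward pass resamples the weights, giving one prediction draw
        samples = torch.stack([self.model(x) for _ in range(n_samples)])
        # .numpy() raises on tensors that require grad, hence .detach() first
        return samples.detach().numpy()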

super().__init__()

#our main parameters
def __init__(self, in_features, out_features, alpha_shape=(1, 1), bias=True, name='BBBLinear'):
Contributor:

Could you write a detailed docstring here?
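
As an illustration of what such a docstring might cover — the parameter descriptions below are assumptions inferred from the signature, not documented behavior:

    def __init__(self, in_features, out_features, alpha_shape=(1, 1), bias=True, name='BBBLinear'):
        """Bayesian linear layer trained with Bayes by Backprop.

        Reference: Blundell et al., "Weight Uncertainty in Neural Networks",
        ICML 2015, https://arxiv.org/abs/1505.05424

        Args:
            in_features: size of each input sample.
            out_features: size of each output sample.
            alpha_shape: shape of the learned variance-scaling parameter
                (a single per-layer scalar by default).
            bias: if True, learn an additive bias term.
            name: human-readable layer name.
        """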

@yhn112 linked an issue on May 11, 2020 that may be closed by this pull request
@dniku changed the base branch from spring20 to master on August 4, 2020 13:52
@@ -2,7 +2,7 @@
"cells": [
Collaborator:

Please remove installation of Theano/Lasagne from the Colab initialization cell.

@Qwasser marked this pull request as ready for review on February 8, 2021 17:06
@Qwasser merged commit dab6ae3 into master on Feb 8, 2021
Successfully merging this pull request may close these issues:

  • Remove Theano/Lasagne support
  • Port week5 BNN from Theano/Lasagne to PyTorch (or at least TF)