
Update nnbp.m #85

Open
wants to merge 1 commit into base: master
Conversation

ericstrobl

added rectified linear and softplus activation functions
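
For context, a minimal sketch of what adding these two cases to the activation-derivative switch in nnbp.m could look like. This is not the exact diff from this pull request; it assumes nn.a{i} holds the activations of layer i, and the softplus derivative is rewritten in terms of the activation itself:

switch nn.activation_function
    case 'sigm'
        d_act = nn.a{i} .* (1 - nn.a{i});
    case 'relu'
        % ReLU: a = max(0, z), so da/dz is 1 wherever a > 0 and 0 elsewhere
        d_act = double(nn.a{i} > 0);
    case 'softplus'
        % softplus: a = log(1 + exp(z)), so da/dz = 1/(1 + exp(-z)) = 1 - exp(-a)
        d_act = 1 - exp(-nn.a{i});
end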
@rasmusbergpalm
Owner

Please merge this into your other pull request, and see the comments there.
Also, I think happynear's suggestion should work, so please try that.

@AK747

AK747 commented Jul 13, 2014

Hello. I have a question about the convolutional neural network. I have forty-five gray-scale pictures (256*256) as training samples, divided into three categories (apple, camel, car) with 15 pictures per category: apple is labeled 1, camel is labeled 2, and car is labeled 3. I also encoded the labels as 100, 010, 001. I chose an extra fifteen pictures as test samples. My .mat files have the same format as the handwritten-digit database. I only modified the data in the main function and did not change the other functions. Regardless of the number of iterations, the results are the same: with three categories, the error rate is 2/3.
Accordingly, I also tested other cases: with four categories the error rate is 3/4; with five categories it is 4/5, and so on.
I found that the line [~, h] = max(net.o) in cnntest gives the wrong result: the predictions are all the same value, e.g. all 1, or all 2, or all 3. Why does this happen? Why doesn't it return the correct h?
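
For reference, a rough sketch of how that argmax step in cnntest is typically structured (this may not match the toolbox code exactly; it assumes net.o is num_classes x num_samples and y holds the one-hot labels column-wise):

[~, h] = max(net.o);            % predicted class index for each sample (per column)
[~, a] = max(y);                % true class index for each sample
er = sum(h ~= a) / size(y, 2);  % fraction of misclassified samples

Under this layout, if every column of net.o has its largest value in the same row, every sample gets the same predicted label, which is exactly what produces error rates of 2/3, 3/4, 4/5 for 3, 4, 5 balanced classes.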
