I trained a simple neural network with three hidden linear layers plus a linear output layer, ReLU activations, dropout, and the Adam optimizer:
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(12, 128)
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64, 32)
        self.fc4 = nn.Linear(32, 1)
        self.dropout = nn.Dropout(0.2)

    def forward(self, x):
        # Three hidden layers with ReLU and dropout, then a linear output
        x = torch.relu(self.fc1(x))
        x = self.dropout(x)
        x = torch.relu(self.fc2(x))
        x = self.dropout(x)
        x = torch.relu(self.fc3(x))
        x = self.dropout(x)
        x = self.fc4(x)
        return x
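A minimal training-loop sketch for the setup described above, assuming feature and target tensors named X_train and y_train (these names, the placeholder data, and the learning rate are assumptions, not from the original post), with MSE loss and the Adam optimizer:

import torch
import torch.nn as nn

model = Net()
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Placeholder tensors; the actual dataset is not shown in the post.
X_train = torch.randn(1000, 12)
y_train = torch.randn(1000, 1)

model.train()
for epoch in range(100):
    optimizer.zero_grad()
    outputs = model(X_train)             # forward pass
    loss = criterion(outputs, y_train)   # MSE between predictions and targets
    loss.backward()                      # backpropagation
    optimizer.step()                     # Adam update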
The error seems to be very high.
Something about this dataset appears to be unusual.
A custom model tailored to this dataset may be required.
You need to find the minima of the function given in the dataset. What you have done instead is minimize the MSE loss between the fitted NN and the target values, which changes the objective function altogether. Also, scaling plays a big role in NN convergence, so try scaling the values.
Epoch [95/100], Loss: 15795335555002138624.0000
Epoch [96/100], Loss: 15947571735960748032.0000
Epoch [97/100], Loss: 22230783709444833280.0000
Epoch [98/100], Loss: 24243408957763223552.0000
Epoch [99/100], Loss: 20029352523428003840.0000
Epoch [100/100], Loss: 26956084463393570816.0000
Mean Squared Error: 20261864048330539008.0000
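Loss values this large usually point to unscaled inputs or targets. A minimal sketch of the scaling suggestion above, assuming the data lives in NumPy arrays X and y (the names, shapes, and placeholder magnitudes are assumptions, not from the thread); standardizing both before training keeps the loss in a reasonable range:

import numpy as np
import torch

# Placeholder data standing in for the real dataset (not from the thread):
# 12 features and targets with large magnitudes.
X = (np.random.rand(1000, 12) * 1e6).astype(np.float32)
y = (np.random.rand(1000) * 1e9).astype(np.float32)

# Standardize features and targets to zero mean, unit variance
X_mean, X_std = X.mean(axis=0), X.std(axis=0) + 1e-8
y_mean, y_std = y.mean(), y.std() + 1e-8
X_scaled = (X - X_mean) / X_std
y_scaled = (y - y_mean) / y_std

X_t = torch.from_numpy(X_scaled)
y_t = torch.from_numpy(y_scaled).unsqueeze(1)

# Train Net() on (X_t, y_t) as before; predictions can be mapped back with
# y_pred_original = y_pred_scaled * y_std + y_mean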