This repository contains a portable deep learning (deep neural network) library for the .NET platform. The library supports both inference and training, and all of the code is written in C#.
You can install the Merkurius NuGet package with the .NET Core CLI:

```sh
> dotnet add package Merkurius
```

or with the NuGet Package Manager console:

```sh
PM> Install-Package Merkurius
```
To build Merkurius from source, run the .NET Core CLI build command:

```sh
> dotnet build Merkurius.csproj
```
Example of a convolutional neural network (CNN):

```csharp
var model = new Model(
    new Convolution(ch, iw, ih, f, fw, fh, (fanIn, fanOut) => Initializers.HeNormal(fanIn),
    new Activation(new ReLU(),
    new MaxPooling(f, mw, mh, pw, ph,
    new FullyConnected(f * ow * oh, (fanIn, fanOut) => Initializers.HeNormal(fanIn),
    new Activation(new ReLU(),
    new FullyConnected(100, 10, (fanIn, fanOut) => Initializers.GlorotNormal(fanIn, fanOut))))))));

model.Fit(trainingList, 50, 100, new Adam(), new SoftmaxCrossEntropy());
```
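In the CNN example, `ow` and `oh` stand for the output width and height after convolution and pooling, so `f * ow * oh` is the flattened input size of the first fully connected layer. As an illustrative Python sketch (not Merkurius code), assuming unit stride and no padding for the convolution, the standard output-size formula is:

```python
def conv_output_size(input_size, filter_size, stride=1, padding=0):
    """Standard convolution/pooling output size:
    floor((n + 2p - f) / s) + 1."""
    return (input_size + 2 * padding - filter_size) // stride + 1

# Hypothetical numbers (not taken from the example): a 28x28 input,
# 5x5 convolution filters (stride 1, no padding), then 2x2 pooling
# with stride 2.
w_conv = conv_output_size(28, 5)            # 24
ow = conv_output_size(w_conv, 2, stride=2)  # 12
print(w_conv, ow)
```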
Example of a recurrent neural network (RNN):

```csharp
var model = new Model(
    new Recurrent(1, 128, 10, true, false, (fanIn, fanOut) => Initializers.LeCunNormal(fanIn),
    new FullyConnected(128, 10, (fanIn, fanOut) => Initializers.LeCunNormal(fanIn),
    new Activation(10, new Identity()))));

model.Fit(trainingList, 50, 10, new SGD(), new MeanSquaredError());
```
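A plain recurrent layer like the one above computes, at each time step, a new hidden state from the current input and the previous hidden state. A minimal Python sketch of that recurrence (illustrative only, not Merkurius code; toy weights are hypothetical):

```python
import math

def rnn_step(x, h_prev, w_xh, w_hh, b):
    """One vanilla RNN step: h_t = tanh(W_xh @ x + W_hh @ h_prev + b).
    Vectors are plain lists; weight matrices are lists of rows."""
    hidden = len(h_prev)
    h_next = []
    for i in range(hidden):
        s = b[i]
        s += sum(w_xh[i][j] * x[j] for j in range(len(x)))
        s += sum(w_hh[i][j] * h_prev[j] for j in range(hidden))
        h_next.append(math.tanh(s))
    return h_next

# Toy dimensions: input size 1, hidden size 2 (hypothetical numbers).
h = rnn_step([0.5], [0.0, 0.0],
             [[1.0], [-1.0]],
             [[0.0, 0.0], [0.0, 0.0]],
             [0.0, 0.0])
print([round(v, 4) for v in h])
```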
Features:

- Inference
- Training
- Code-first modeling
- .NET Standard 2.1 library
- Dependency-free
Supported activation functions:

- ELU (Exponential linear unit)
- Hyperbolic tangent
- Identity
- ReLU (Rectified linear unit)
- SELU (Scaled exponential linear unit)
- Sigmoid
- Softmax
- SoftPlus
- Softsign
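For reference, a few of the listed activations sketched in Python (illustrative only, not Merkurius code; the SELU constants are the standard self-normalizing values):

```python
import math

def relu(x):
    # ReLU: max(0, x)
    return max(0.0, x)

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

def softplus(x):
    # SoftPlus: log(1 + e^x), a smooth approximation of ReLU
    return math.log1p(math.exp(x))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # SELU: scaled ELU with the standard self-normalizing constants
    return scale * (x if x > 0 else alpha * (math.exp(x) - 1.0))

print(relu(-2.0), round(sigmoid(0.0), 2))
```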
Supported layers:

- Batch normalization
- Convolution
- Dropout
- Embedding
- Fully connected
- GRU (Gated recurrent unit)
- LSTM (Long short-term memory)
- Max pooling
- Recurrent
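As an illustration of one of these layers, inverted dropout can be sketched in a few lines of Python (illustrative only, not Merkurius code; the library's actual dropout implementation may differ):

```python
import random

def dropout(values, rate, training=True, rng=random):
    """Inverted dropout: during training, zero each value with
    probability `rate` and scale survivors by 1/(1-rate) so the
    expected activation is unchanged; at inference, pass through."""
    if not training or rate <= 0.0:
        return list(values)
    keep = 1.0 - rate
    return [v / keep if rng.random() < keep else 0.0 for v in values]

random.seed(0)
print(dropout([1.0, 2.0, 3.0, 4.0], rate=0.5))
```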
Supported loss functions:

- Cross-entropy
- Mean squared error (MSE)
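Both losses have simple closed forms; an illustrative Python sketch (not Merkurius code):

```python
import math

def mean_squared_error(predictions, targets):
    # MSE: average of squared differences
    n = len(predictions)
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / n

def cross_entropy(probabilities, one_hot_targets, eps=1e-12):
    # Cross-entropy between a predicted distribution and a one-hot target;
    # eps guards against log(0)
    return -sum(t * math.log(p + eps)
                for p, t in zip(probabilities, one_hot_targets))

print(mean_squared_error([1.0, 2.0], [0.0, 2.0]))  # 0.5
```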
Supported optimizers:

- AdaDelta
- AdaGrad
- Adam
- Momentum
- Nesterov
- RMSprop
- SGD
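For reference, the update rules behind a few of these optimizers can be sketched in scalar Python form (illustrative only, not Merkurius code; hyperparameter defaults are the commonly used values):

```python
def sgd_step(w, grad, lr=0.1):
    # Plain SGD: w <- w - lr * grad
    return w - lr * grad

def momentum_step(w, grad, velocity, lr=0.1, mu=0.9):
    # Momentum: v <- mu * v - lr * grad; w <- w + v
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

def adam_step(w, grad, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected first (m) and second (v) moment estimates;
    # t is the 1-based step count
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad * grad
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (v_hat ** 0.5 + eps), m, v

print(sgd_step(1.0, 0.5))  # 0.95
```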