godeep
Feed forward/backpropagation neural network implementation. Currently supports:
- Activation functions: sigmoid, hyperbolic tangent, ReLU
- Solvers: SGD, SGD with momentum/Nesterov, Adam
- Classification modes: regression, multi-class, multi-label, binary
- Parallel batch training
- Bias nodes
Networks are modeled as a set of neurons connected through synapses. No GPU computation is performed, so don't use this for any large-scale applications.
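The building block of such a network is easy to sketch: each neuron takes a weighted sum of its inputs plus a bias and passes it through an activation function such as the sigmoid. A minimal, self-contained illustration (the helper names here are made up for the example, not godeep's actual code):

```go
package main

import (
	"fmt"
	"math"
)

// sigmoid squashes any real input into the range (0, 1).
func sigmoid(x float64) float64 {
	return 1.0 / (1.0 + math.Exp(-x))
}

// neuronOutput computes a single neuron's activation: the weighted sum
// of its inputs plus a bias term, passed through the activation function.
func neuronOutput(inputs, weights []float64, bias float64) float64 {
	sum := bias
	for i, in := range inputs {
		sum += in * weights[i]
	}
	return sigmoid(sum)
}

func main() {
	// With zero weights and zero bias, sigmoid(0) is exactly 0.5.
	fmt.Println(neuronOutput([]float64{1, 2}, []float64{0, 0}, 0)) // 0.5
}
```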
Todo:
- Dropout
- Batch normalization
Install
```shell
go get -u github.com/patrikeh/godeep
```
Usage
Import the godeep package
```go
import (
	"fmt"

	deep "github.com/patrikeh/godeep"
	"github.com/patrikeh/godeep/training"
)
```
Define some data...
```go
var data = training.Examples{
	{[]float64{2.7810836, 2.550537003}, []float64{0}},
	{[]float64{1.465489372, 2.362125076}, []float64{0}},
	{[]float64{3.396561688, 4.400293529}, []float64{0}},
	{[]float64{1.38807019, 1.850220317}, []float64{0}},
	{[]float64{7.627531214, 2.759262235}, []float64{1}},
	{[]float64{5.332441248, 2.088626775}, []float64{1}},
	{[]float64{6.922596716, 1.77106367}, []float64{1}},
	{[]float64{8.675418651, 0.242068655}, []float64{1}},
}
```
Create a network with two hidden layers of size 2 and 2 respectively:
```go
n := deep.NewNeural(&deep.Config{
	/* Input dimensionality */
	Inputs: 2,
	/* Two hidden layers consisting of two neurons each, and a single output */
	Layout: []int{2, 2, 1},
	/* Activation functions: Sigmoid, Tanh, ReLU, Linear */
	Activation: deep.ActivationSigmoid,
	/* Determines output layer activation & loss function:
	   ModeRegression: linear outputs with MSE loss
	   ModeMultiClass: softmax output with Cross Entropy loss
	   ModeMultiLabel: sigmoid output with Cross Entropy loss
	   ModeBinary: sigmoid output with binary CE loss */
	Mode: deep.ModeBinary,
	/* Weight initializers: {deep.NewNormal(μ, σ), deep.NewUniform(μ, σ)} */
	Weight: deep.NewNormal(1.0, 0.0),
	/* Apply bias */
	Bias: true,
})
```
Train:
```go
// params: learning rate, momentum, alpha decay, nesterov
optimizer := training.NewSGD(0.05, 0.1, 1e-6, true)
// params: optimizer, verbosity (print stats at every 50th iteration)
trainer := training.NewTrainer(optimizer, 50)

training, heldout := data.Split(0.5)
trainer.Train(n, training, heldout, 1000) // training, validation, iterations
```
resulting in:
```
Epochs   Elapsed       Error
---      ---           ---
5        12.938µs      0.36438
10       125.691µs     0.02261
15       177.194µs     0.00404
...
1000     10.703839ms   0.00000
```
Finally, make some predictions:
```go
fmt.Println(data[0].Input, "=>", n.Predict(data[0].Input))
fmt.Println(data[5].Input, "=>", n.Predict(data[5].Input))
```
Alternatively, batch training can be performed in parallel:
```go
// params: learning rate, beta1, beta2, epsilon
optimizer := training.NewAdam(0.001, 0.9, 0.999, 1e-8)
// params: optimizer, verbosity (print info at every n:th iteration), batch size, number of workers
trainer := training.NewBatchTrainer(optimizer, 1, 200, 4)

training, heldout := data.Split(0.75)
trainer.Train(n, training, heldout, 1000) // training, validation, iterations
```
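The batch trainer's fan-out can be pictured as: split the examples into fixed-size batches and hand each batch to a worker goroutine. A standard-library sketch of that partition-and-fan-out pattern (an illustration, not godeep's implementation; here each worker simply sums its batch):

```go
package main

import (
	"fmt"
	"sync"
)

// batches splits data into consecutive chunks of at most size elements.
func batches(data []float64, size int) [][]float64 {
	var out [][]float64
	for start := 0; start < len(data); start += size {
		end := start + size
		if end > len(data) {
			end = len(data)
		}
		out = append(out, data[start:end])
	}
	return out
}

func main() {
	data := []float64{1, 2, 3, 4, 5, 6, 7, 8}
	parts := batches(data, 3) // [[1 2 3] [4 5 6] [7 8]]

	// Each worker processes one batch in parallel; writes go to
	// distinct indices, so no mutex is needed.
	sums := make([]float64, len(parts))
	var wg sync.WaitGroup
	for i, b := range parts {
		wg.Add(1)
		go func(i int, b []float64) {
			defer wg.Done()
			for _, v := range b {
				sums[i] += v
			}
		}(i, b)
	}
	wg.Wait()
	fmt.Println(sums) // [6 15 15]
}
```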
Examples
See training/trainer_test.go for a variety of toy examples of regression, multi-class classification, binary classification, etc.
See examples/ for more realistic examples:
| Dataset | Topology | Epochs | Accuracy |
| --- | --- | --- | --- |
| wines | [5 5] | 10000 | ~98% |
| mnist | [50] | 25 | ~97% |