randomforest alternatives and similar packages
Based on the "Machine Learning" category.
- Gorgonia: a library that helps facilitate machine learning in Go.
- m2cgen: transform ML models into native code (Java, C, Python, Go, JavaScript, Visual Basic, C#, R, PowerShell, PHP, Dart, Haskell, Ruby, F#, Rust) with zero dependencies.
- gosseract: Go package for OCR (Optical Character Recognition) using the Tesseract C++ library.
- gago: evolutionary optimization library for Go (genetic algorithms, particle swarm optimization, differential evolution).
- ocrserver: a simple OCR API server, easy to deploy with Docker or on Heroku.
- onnx-go: import a pre-trained neural network into Go without being tied to a particular framework or library.
- Goptuna: a hyperparameter optimization framework, inspired by Optuna.
- shield: Bayesian text classifier with flexible tokenizers and storage backends for Go.
- go-fann: Go bindings for FANN, a library for artificial neural networks.
- neat: plug-and-play, parallel Go framework for NeuroEvolution of Augmenting Topologies (NEAT).
- go-featureprocessing: fast, simple sklearn-like feature processing for Go.
- neural-go: a multilayer perceptron network implemented in Go, with training via backpropagation.
- go-cluster: k-modes and k-prototypes clustering algorithms implemented in Go.
README
GoDoc: https://godoc.org/github.com/malaschitz/randomForest
Test:
go test ./... -cover -coverpkg=.
randomForest
Random Forest implementation in golang.
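The package can be fetched with the standard Go tooling; the module path below is taken from the GoDoc link above.

go get github.com/malaschitz/randomForest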
Simple Random Forest
package main

import (
	"fmt"
	"math/rand"

	randomforest "github.com/malaschitz/randomForest"
)

func main() {
	// toy dataset: four random features, class = truncated sum
	xData := [][]float64{}
	yData := []int{}
	for i := 0; i < 1000; i++ {
		x := []float64{rand.Float64(), rand.Float64(), rand.Float64(), rand.Float64()}
		y := int(x[0] + x[1] + x[2] + x[3])
		xData = append(xData, x)
		yData = append(yData, y)
	}
	forest := randomforest.Forest{}
	forest.Data = randomforest.ForestData{X: xData, Class: yData}
	forest.Train(1000)
	// test
	fmt.Println("Vote", forest.Vote([]float64{0.1, 0.1, 0.1, 0.1}))
	fmt.Println("Vote", forest.Vote([]float64{0.9, 0.9, 0.9, 0.9}))
}
Extremely Randomized Trees
forest.TrainX(1000)
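In context, a minimal sketch, assuming the same toy xData and yData as in the simple example above; TrainX simply takes the place of Train.

// Extremely randomized trees: same data preparation as above,
// but trained with TrainX instead of Train.
forest := randomforest.Forest{}
forest.Data = randomforest.ForestData{X: xData, Class: yData}
forest.TrainX(1000) // 1000 extremely randomized trees
fmt.Println("Vote", forest.Vote([]float64{0.9, 0.9, 0.9, 0.9}))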
Deep Forest
Deep forest inspired by https://arxiv.org/abs/1705.07366
dForest := forest.BuildDeepForest()
dForest.Train(20, 100, 1000) // 20 small forests with 100 trees each help to build a deep forest with 1000 trees
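A minimal sketch of the full flow, assuming the base forest only needs its Data set before BuildDeepForest; only the calls shown above are used here, so see the GoDoc for the deep forest's prediction methods.

// Deep forest sketch: reuse a data-loaded base forest.
forest := randomforest.Forest{}
forest.Data = randomforest.ForestData{X: xData, Class: yData}
dForest := forest.BuildDeepForest()
dForest.Train(20, 100, 1000) // 20 small forests with 100 trees each, final forest with 1000 trees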
Continuous Random Forest
Continuous Random Forest is for data that keeps arriving (forex, weather, user logs, ...). New data creates new trees, and the oldest trees are removed.
forest := randomforest.Forest{}
data := []float64{rand.Float64(), rand.Float64()}
res := 1 // result (label) for this row
forest.AddDataRow(data, res, 1000, 10, 2000)
// AddDataRow: add a new row, trim the oldest rows if there are more than 1000 rows, grow 10 new trees, and remove the oldest trees if there are more than 2000 trees.
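A sketch of how this might look in a streaming loop; the feature values and labels here are placeholders, and the AddDataRow parameters are the ones explained in the comment above.

// Streaming sketch: feed each new labelled observation to the forest,
// keeping at most 1000 rows and 2000 trees, growing 10 new trees per row.
forest := randomforest.Forest{}
for i := 0; i < 10000; i++ {
	data := []float64{rand.Float64(), rand.Float64()} // placeholder features
	res := i % 2                                      // placeholder label
	forest.AddDataRow(data, res, 1000, 10, 2000)
}
fmt.Println("Vote", forest.Vote([]float64{0.5, 0.5}))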
Boruta Algorithm for feature selection
The Boruta algorithm was originally developed as a package for the R language and is one of the most effective feature selection algorithms; it is described in a paper in the Journal of Statistical Software.
The Boruta algorithm uses random forests to select the important features.
xData := ... //data
yData := ... //labels
selectedFeatures := randomforest.BorutaDefault(xData, yData)
// or randomforest.BorutaDefault(xData, yData, 100, 20, 0.05, true, true)
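As a follow-up sketch, the selected features can be used to build a reduced dataset and train a forest on it; this assumes BorutaDefault returns the indices of the selected columns as a []int, which should be verified against the GoDoc.

// Keep only the columns selected by Boruta (selectedFeatures assumed to be
// a []int of column indices) and train a forest on the reduced data.
reducedX := make([][]float64, len(xData))
for i, row := range xData {
	kept := make([]float64, 0, len(selectedFeatures))
	for _, f := range selectedFeatures {
		kept = append(kept, row[f])
	}
	reducedX[i] = kept
}
forest := randomforest.Forest{}
forest.Data = randomforest.ForestData{X: reducedX, Class: yData}
forest.Train(1000)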
The /examples directory contains an example with the MNIST database. The picture shows the selected features (495 of 784) from the images.
![boruta 05](boruta05.png)