Dropout code snippet: a DeepLearning4J (DL4J) MultiLayerConfiguration that applies dropout (0.5) to the first hidden layer.
import org.deeplearning4j.nn.api.OptimizationAlgorithm;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.Updater;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// seed, iterations, learningRate, numInputs, numHiddenNodes and numOutputs are defined elsewhere
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .seed(seed)
    .iterations(iterations)
    .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
    .learningRate(learningRate)
    .updater(Updater.NESTEROVS).momentum(0.9)
    .regularization(true).l2(1e-4)
    .weightInit(WeightInit.XAVIER)
    .activation(Activation.TANH)
    .list()
    // Dropout is configured per layer: here dropOut(0.5) is applied to the first hidden layer only
    .layer(0, new DenseLayer.Builder().nIn(numInputs).nOut(numHiddenNodes).dropOut(0.5)
        .build())
    .layer(1, new DenseLayer.Builder().nIn(numHiddenNodes).nOut(numHiddenNodes)
        .build())
    .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
        .activation(Activation.SOFTMAX)
        .nIn(numHiddenNodes).nOut(numOutputs).build())
    .pretrain(false).backprop(true).build();
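
For context, a minimal usage sketch (not part of the original gist) showing how a configuration like this is typically turned into a trainable network in DL4J. The trainIter iterator and numEpochs count are hypothetical placeholders for your own data pipeline.

import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;

MultiLayerNetwork model = new MultiLayerNetwork(conf);
model.init();
model.setListeners(new ScoreIterationListener(100)); // log the training score every 100 iterations
for (int epoch = 0; epoch < numEpochs; epoch++) {    // numEpochs: hypothetical epoch count
    model.fit(trainIter);                            // trainIter: hypothetical DataSetIterator
    trainIter.reset();
}

Dropout is only active while fitting; DL4J disables it automatically when the network is used for evaluation or inference.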