Hello Neurons – ENCOG Neural Network XOR example in F#
I’ve been playing with Machine Learning lately, starting with Abhishek Kumar’s Introduction to Machine Learning video on PluralSight.
I’m not going to go into the details of ML or Neural Networks here (I don’t know them, for a start), but one thing I found was that the .Net ENCOG examples were all in C#. As such, I thought I’d post my F# version here. (See the C# version for comparison).
So, without further ado:
open Encog.ML.Data.Basic
open Encog.Engine.Network.Activation
open Encog.Neural.Networks
open Encog.Neural.Networks.Layers
open Encog.Neural.Networks.Training.Propagation.Resilient

// Build a 2-2-1 network: two inputs, one hidden layer of two
// sigmoid neurons, and a single sigmoid output neuron.
let createNetwork() =
    let network = BasicNetwork()
    network.AddLayer( BasicLayer( null, true, 2 ))
    network.AddLayer( BasicLayer( ActivationSigmoid(), true, 2 ))
    network.AddLayer( BasicLayer( ActivationSigmoid(), false, 1 ))
    network.Structure.FinalizeStructure()
    network.Reset()
    network

// Train a clone of the network with resilient propagation,
// recursing until the error drops below the threshold.
let train trainingSet (network: BasicNetwork) =
    let trainedNetwork = network.Clone() :?> BasicNetwork
    let trainer = ResilientPropagation(trainedNetwork, trainingSet)
    let rec trainIteration epoch error =
        match error > 0.001 with
        | false -> ()
        | true ->
            trainer.Iteration()
            printfn "Iteration no : %d, Error: %f" epoch error
            trainIteration (epoch + 1) trainer.Error
    trainIteration 1 1.0
    trainedNetwork

[<EntryPoint>]
let main argv =
    let xor_input =
        [|
            [| 0.0 ; 0.0 |]
            [| 1.0 ; 0.0 |]
            [| 0.0 ; 1.0 |]
            [| 1.0 ; 1.0 |]
        |]
    let xor_ideal =
        [|
            [| 0.0 |]
            [| 1.0 |]
            [| 1.0 |]
            [| 0.0 |]
        |]
    let trainingSet = BasicMLDataSet(xor_input, xor_ideal)
    let network = createNetwork()
    let trainedNetwork = network |> train trainingSet
    trainingSet
    |> Seq.iter ( fun item ->
        let output = trainedNetwork.Compute(item.Input)
        printfn "Input: %f, %f Ideal: %f Actual: %f"
            item.Input.[0] item.Input.[1] item.Ideal.[0] output.[0] )
    printfn "Press return to exit.."
    System.Console.Read() |> ignore
    0 // return an integer exit code
The main differences from the C# version are that the training iterations are done with recursion instead of looping, and that training returns a new network rather than updating the existing one. There’s nothing wrong with the C# way per se, but it gave me a warm feeling inside to make it all ‘functional’.
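To show the recursion-instead-of-looping idea on its own, here’s a minimal, self-contained sketch (no Encog dependency; the halving “error” is a made-up stand-in for `trainer.Error`). A tail-recursive inner function carries the epoch count and current error as parameters instead of mutating loop variables:

```fsharp
// Hypothetical example: iterate until "error" falls below a threshold.
// The recursive call is in tail position, so the F# compiler turns it
// into a loop — no stack growth, no mutable state.
let runUntilConverged threshold =
    let rec loop epoch error =
        if error <= threshold then
            epoch                       // return the epoch we stopped at
        else
            printfn "Iteration no : %d, Error: %f" epoch error
            loop (epoch + 1) (error / 2.0)  // stand-in for trainer.Error
    loop 1 1.0

runUntilConverged 0.001 |> ignore
```

The `trainIteration` function in the listing above follows the same shape, with `trainer.Iteration()` supplying the next error value.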
It may be a while before I create Skynet, but you’ve got to start somewhere..