Neural Network Part 2: Perceptrons

I started working through the second chapter of McCaffrey’s book Neural Networks Using C# Succinctly to see if I could write the examples using F#.

McCaffrey’s code is tough to read, though, because of its emphasis on loops and global mutable variables.  I read through his description, and this is how (I think) the Perceptron should be constructed.

The inputs are a series of independent variables (in this case, age and income) and the output is a single dependent variable (in this case, party affiliation).  The values have been encoded and normalized as described in this post.

An example of the input (from page 31 of his book) is:

image

Or in a more abstract manner:

image

In terms of data structures, each individual input (each row) is placed into an array of floats and the output is a single float.

image

I call this single set of inputs an “observation” (my words, not McCaffrey’s).
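In F#, a single observation might look like the following (the numbers are invented for illustration; as in the code later in this post, I use a float list rather than an array):

```fsharp
// one observation: the encoded/normalized inputs plus the known output
let xValues = [1.5; -0.8]    // age and income after encoding/normalization
let yExpected = -1.0         // party affiliation, encoded as -1.0 or 1.0
```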

Looking at McCaffrey’s example for a perceptron Input-Output,

image

not all of the variables you need are included. Here is what you need:

image

Where A0 and B0 are the same as X0 and X1, respectively, in his diagram.  Also, McCaffrey uses the word “Perceptron” to mean two different concepts: the entire system as a whole and the individual calculation for a given list of X values and a bias.  I am a big believer in ubiquitous domain language, so I am calling the individual calculation a neuron.

Once you run these values through the neuron for the 1st observation, you might have to alter the weights and bias based on the result (Y).  Therefore, the data structure coming out of the neuron is

image

These values are fed into the adjustment function to alter the weights and bias, with the output as

image

I am calling this process of taking a single observation, the weights, and the bias and turning them into a new set of weights and a new bias a “cycle” (my words, not McCaffrey’s).

image

 

The output of a cycle is then fed in with the next observation, and the cycle repeats for as many observations as are fed into the system.

image

 

I am calling the process of running a cycle for each observation in the input dataset a “rotation” (my words, not McCaffrey’s); the perceptron runs rotations a set number of times to train itself.

image

 

Finally, the Perceptron takes a new set of observations where the Y is not known and runs a Rotation once to predict what the Y will be.

So with that mental image in place, the coding became much easier.  Basically, there was a one-to-one correspondence between the F# functions and each step laid out.  I started with an individual cycle:

type cycleInput = {xValues:float list; yExpected:float; mutable weights:float list; mutable bias:float; alpha:float}

let runNeuron (input:cycleInput) =
    let valuesAndWeights = input.xValues |> List.zip input.weights
    let output = valuesAndWeights
                 |> List.map (fun (xValue, xWeight) -> xValue * xWeight)
                 |> List.sum
    output + input.bias

let runActivation input =
    if input < 0.0 then -1.0 else 1.0

I used record types all over the place in this code just so I could keep things straight in my head.  McCaffrey uses ambiguously-named arrays and global variables.  Although this makes my code a bit wordier (especially for functional programmers), I think the increased readability is worth the trade-off.

In any event, with the neuron and activation calculations out of the way, I created the functions that adjust the weights and bias:

let calculateWeightAdjustment (xValue, xWeight, alpha, delta) =
    match delta > 0.0, xValue >= 0.0 with
    | true, true -> xWeight - (alpha * delta * xValue)
    | false, true -> xWeight + (alpha * delta * xValue)
    | true, false -> xWeight - (alpha * delta * xValue)
    | false, false -> xWeight + (alpha * delta * xValue)

let calculateBiasAdjustment (bias, alpha, delta) =
    match delta > 0.0 with
    | true -> bias - (alpha * delta)
    | false -> bias + (alpha * delta)
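Tracing one adjustment by hand (the numbers are made up, but the arithmetic follows the functions above):

```fsharp
// if yExpected = -1.0 and yActual = 1.0, then delta = 2.0,
// and (delta > 0.0) means we subtract from the weight and bias
let delta = 1.0 - (-1.0)                    // 2.0
let weight' = 0.5 - (0.001 * delta * 3.0)   // calculateWeightAdjustment -> 0.494
let bias' = 0.1 - (0.001 * delta)           // calculateBiasAdjustment -> 0.098
```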

This code is significantly different from the nested for/if that McCaffrey uses.

image

I maintain that using this kind of pattern matching makes the intention much easier to comprehend.  I also split the adjustment of the weights and the adjustment of the bias into individual functions.

With these functions ready, I created input and output record types and implemented the adjustment function:

// the input and output records for the adjustment function
type adjustmentInput = {xValues:float list; weights:float list; bias:float; alpha:float; yExpected:float; yActual:float}
type adjustmentOutput = {weights:float list; bias:float; yActual:float}

let runAdjustment (input:adjustmentInput) =
    match input.yExpected = input.yActual with
    | true -> {weights=input.weights; bias=input.bias; yActual=input.yActual}
    | false ->
        let delta = input.yActual - input.yExpected
        let valuesAndWeights = input.xValues |> List.zip input.weights
        let weights' = valuesAndWeights |> List.map (fun (xValue, xWeight) -> calculateWeightAdjustment(xValue, xWeight, input.alpha, delta))
        let bias' = calculateBiasAdjustment(input.bias, input.alpha, delta)
        {weights=weights'; bias=bias'; yActual=input.yActual}

There is not a corresponding method in McCaffrey’s code; rather, he just does some Array.copy calls and mutates the global variables in the Update method.  I am not a fan of side-effect programming, so I created a function that does the modification explicitly.

And to wrap up the individual cycle:

let runCycle (cycleInput:cycleInput) =
    let neuronResult = runNeuron(cycleInput)
    let activationResult = runActivation(neuronResult)
    let adjustmentInput = {xValues=cycleInput.xValues; weights=cycleInput.weights; yExpected=cycleInput.yExpected;
                           bias=cycleInput.bias; alpha=cycleInput.alpha; yActual=activationResult}
    runAdjustment(adjustmentInput)

Up next is running the cycle for each of the observations (called a rotation):

type observation = {xValues:float list; yExpected:float}
type rotationInput = {observations: observation list; mutable weights:float list; mutable bias:float; alpha:float}
type trainingRotationOutput = {weights:float list; bias:float}
type predictionRotationOutput = {observation: observation; yActual:float}

let runTrainingRotation (rotationInput: rotationInput) =
    for i = 0 to rotationInput.observations.Length - 1 do
        let observation = rotationInput.observations.[i]
        let neuronInput = {cycleInput.xValues=observation.xValues; cycleInput.yExpected=observation.yExpected; cycleInput.weights=rotationInput.weights;
                           cycleInput.bias=rotationInput.bias; cycleInput.alpha=rotationInput.alpha}
        let cycleOutput = runCycle(neuronInput)
        rotationInput.weights <- cycleOutput.weights
        rotationInput.bias <- cycleOutput.bias
    {weights=rotationInput.weights; bias=rotationInput.bias}

Again, note the liberal use of records to keep the inputs and outputs clear.  I also created a prediction rotation, designed to be run only once, that does not alter the weights and bias.

let runPredictionRotation (rotationInput: rotationInput) =
    let output = new System.Collections.Generic.List<predictionRotationOutput>()
    for i = 0 to rotationInput.observations.Length - 1 do
        let observation = rotationInput.observations.[i]
        let neuronInput = {cycleInput.xValues=observation.xValues; cycleInput.yExpected=observation.yExpected; cycleInput.weights=rotationInput.weights;
                           cycleInput.bias=rotationInput.bias; cycleInput.alpha=rotationInput.alpha}
        let cycleOutput = runCycle(neuronInput)
        let predictionRotationOutput = {observation=observation; yActual=cycleOutput.yActual}
        output.Add(predictionRotationOutput)
    output

With the rotations done, the last step was to create the Perceptron to train and then predict:

type perceptronInput = {observations: observation list; weights:float list; bias:float}
type perceptronOutput = {weights:float list; bias:float}

let initializeWeights (xValues, randomSeedValue) =
    let lo = -0.01
    let hi = 0.01
    let xWeight = (hi - lo) * randomSeedValue + lo
    xValues |> List.map (fun _ -> xWeight)

let initializeBias (randomSeedValue) =
    let lo = -0.01
    let hi = 0.01
    (hi - lo) * randomSeedValue + lo

let runTraining (perceptronInput: perceptronInput, maxEpoches: int) =
    let random = System.Random()
    let alpha = 0.001
    let baseObservation = perceptronInput.observations.[0]
    let mutable weights = initializeWeights(baseObservation.xValues, random.NextDouble())
    let mutable bias = initializeBias(random.NextDouble())
    for i = 1 to maxEpoches do
        let rotationInput = {observations=perceptronInput.observations; weights=weights; bias=bias; alpha=alpha}
        let rotationOutput = runTrainingRotation(rotationInput)
        weights <- rotationOutput.weights
        bias <- rotationOutput.bias
    {weights=weights; bias=bias}

let runPrediction (perceptronInput: perceptronInput, weights: float list, bias: float) =
    let alpha = 0.001
    let rotationInput = {observations=perceptronInput.observations; weights=weights; bias=bias; alpha=alpha}
    runPredictionRotation(rotationInput)
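The initialization formula is worth a quick hand-trace: it maps a random value in [0.0, 1.0) onto the interval [-0.01, 0.01):

```fsharp
// (hi - lo) * r + lo stretches r in [0.0, 1.0) onto [lo, hi)
let lo, hi = -0.01, 0.01
let r = 0.75                     // stand-in for random.NextDouble()
let w = (hi - lo) * r + lo       // 0.02 * 0.75 - 0.01 = 0.005
```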

 

Before I go too much further, I have a big code smell: I am iterating and using the mutable keyword.  I am not sure how to take the result of a function applied to the 1st element in a sequence and then input that into the second.  I need to do that with the weights and bias data structures: each time they are used in an expression, they need to change and feed into the next expression.  I think the answer is List.reduce, so I am going to pick this up after looking at that in more detail.  I also need to implement a shuffle method so that the cycles are not called in the same order across rotations….
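For what it’s worth, List.fold (a cousin of List.reduce that carries an explicit accumulator) looks like a fit for threading the weights and bias through the observations. A rough, untested sketch, assuming the runCycle function above (runRotation' is my hypothetical name):

```fsharp
// sketch: carry (weights, bias) as the fold accumulator,
// so no mutable fields are needed
let runRotation' alpha observations (weights, bias) =
    observations
    |> List.fold (fun (ws, b) obs ->
        let out = runCycle {xValues=obs.xValues; yExpected=obs.yExpected;
                            weights=ws; bias=b; alpha=alpha}
        (out.weights, out.bias)) (weights, bias)
```

The accumulator that comes out of one cycle becomes the input to the next, which is exactly the weights-and-bias hand-off described above.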


