Java Classes for Hopfield Neural Networks

algorithms for storing and recalling patterns at the same time. In a Hopfield neural network simulation, every neuron is connected to every other neuron. Consider a pair of neurons indexed by i and j. There is a weight W(i,j) between these neurons that corresponds in the code to the array element weights[i][j]. We can define the energy of the association between these two neurons as:

  energy[i, j] = -weight[i, j] * activation[i] * activation[j]

In the Hopfield neural network simulator, we store activations (i.e., the input values) as floating-point numbers that get clamped in value to -1 for "off" or +1 for "on". In the energy equation, we consider an activation that is not clamped to a value of one to be zero. This energy is like "gravitational potential energy" using a basketball court analogy: think of a basketball court with an overlaid 2D grid, where different grid cells on the floor are at different heights that represent energy levels. When you throw a basketball onto the court, the ball bounces around and finally stops in a low grid cell near the place where you threw it; that is, it settles at a locally low energy level. Hopfield networks function in much the same way: when shown a pattern, the network attempts to settle in a local minimum energy point as defined by a previously seen training example. When training a network with a new input, we are looking for a low energy point near the new input vector. The total energy is the sum of the above equation over all i, j.

The class constructor allocates storage for input values, temporary storage, and a two-dimensional array to store weights:

  public Hopfield(int numInputs) {
    this.numInputs = numInputs;
    weights = new float[numInputs][numInputs];
    inputCells = new float[numInputs];
    tempStorage = new float[numInputs];
  }

Remember that this model is general purpose: multi-dimensional inputs can be converted to an equivalent one-dimensional array. The method addTrainingData is used to store an input data array for later training. All input values get clamped to an "off" or "on" value by the utility method adjustInput. The utility method truncate truncates floating-point values to an integer value. The utility method deltaEnergy has one argument: an index into the input vector. The class variable tempStorage is set during training to be the sum of a row of trained weights.
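The utility methods addTrainingData, adjustInput, and truncate are described above but not listed in this section. The following is a minimal sketch of what they might look like, assuming trainingData is a java.util.Vector (consistent with the calls to size() and elementAt() in the train method below); treat it as illustrative rather than the exact code from the book's ZIP file:

  // Illustrative sketches only; the exact implementations are in the
  // book's source ZIP file. Assumes: protected Vector trainingData = new Vector();

  public void addTrainingData(float [] data) {
    trainingData.addElement(data);  // store a pattern for later training
  }

  private float adjustInput(float x) {
    // clamp an input value to -1 ("off") or +1 ("on")
    if (x < 0.0f) return -1.0f;
    return 1.0f;
  }

  private float truncate(float x) {
    // truncate a floating-point value to an integer value
    return (float)((int)x);
  }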
So the method deltaEnergy returns a measure of the energy difference between the input vector in the current input cells and the training input examples:

  private float deltaEnergy(int index) {
    float temp = 0.0f;
    for (int j = 0; j < numInputs; j++) {
      temp += weights[index][j] * inputCells[j];
    }
    return 2.0f * temp - tempStorage[index];
  }

The method train is used to set the two-dimensional weight array and the one-dimensional tempStorage array in which each element is the sum of the corresponding row in the two-dimensional weight array:

  public void train() {
    for (int j = 1; j < numInputs; j++) {
      for (int i = 0; i < j; i++) {
        for (int n = 0; n < trainingData.size(); n++) {
          float [] data = (float [])trainingData.elementAt(n);
          float temp1 = adjustInput(data[i]) * adjustInput(data[j]);
          float temp = truncate(temp1 + weights[j][i]);
          weights[i][j] = weights[j][i] = temp;
        }
      }
    }
    for (int i = 0; i < numInputs; i++) {
      tempStorage[i] = 0.0f;
      for (int j = 0; j < i; j++) {
        tempStorage[i] += weights[i][j];
      }
    }
  }

Once the arrays weights and tempStorage are defined, it is simple to recall an original input pattern from a similar test pattern:

  public float [] recall(float [] pattern, int numIterations) {
    for (int i = 0; i < numInputs; i++) {
      inputCells[i] = pattern[i];
    }
    for (int ii = 0; ii < numIterations; ii++) {
      for (int i = 0; i < numInputs; i++) {
        if (deltaEnergy(i) > 0.0f) {
          inputCells[i] = 1.0f;
        } else {
          inputCells[i] = 0.0f;
        }
      }
    }
    return inputCells;
  }
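The recall method applies an asynchronous update: each cell is switched on or off depending on the sign of deltaEnergy, so after a few iterations the network tends to settle into a nearby low-energy state, as in the basketball-court analogy. To relate this back to the energy equation given earlier, the sketch below shows a hypothetical totalEnergy helper (it is not part of the book's Hopfield class) that sums the pairwise energy term over all cells:

  // Hypothetical helper, not in the book's Hopfield.java: sums the pairwise
  // term energy[i, j] = -weights[i][j] * activation[i] * activation[j]
  // over all i, j using the current contents of inputCells.
  private float totalEnergy() {
    float energy = 0.0f;
    for (int i = 0; i < numInputs; i++) {
      for (int j = 0; j < numInputs; j++) {
        energy -= weights[i][j] * inputCells[i] * inputCells[j];
      }
    }
    return energy;
  }

Printing the value of such a helper before and after each pass of recall is one simple way to watch the network settle toward a local energy minimum.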

7.3 Testing the Hopfield Neural Network Class

The test program for the Hopfield neural network class is Test_Hopfield. This test program defines three test input patterns, each with ten values:

  static float [] data [] = {
    { 1,  1,  1, -1, -1, -1, -1, -1, -1, -1},
    {-1, -1, -1,  1,  1,  1, -1, -1, -1, -1},
    {-1, -1, -1, -1, -1, -1, -1,  1,  1,  1}
  };

The following code fragment shows how to create a new instance of the Hopfield class and train it to recognize these three test input patterns:

  test = new Hopfield(10);
  test.addTrainingData(data[0]);
  test.addTrainingData(data[1]);
  test.addTrainingData(data[2]);
  test.train();

The static method helper is used to slightly scramble an input pattern, then test the trained Hopfield neural network to see if the original pattern is re-created:

  helper(test, "pattern 0", data[0]);
  helper(test, "pattern 1", data[1]);
  helper(test, "pattern 2", data[2]);

The following listing shows an implementation of the method helper (the called method pp simply formats a floating-point number for printing by clamping it to zero or one). This version of the code randomly flips one test bit, and we will see that the trained Hopfield network almost always correctly recognizes the original pattern. The version of method helper included in the ZIP file for this book is slightly different in that two bits are randomly flipped; we will later look at sample output with both one and two bits randomly flipped.

  private static void helper(Hopfield test, String s,
                             float [] test_data) {
    float [] dd = new float[10];
    for (int i = 0; i < 10; i++) {
      dd[i] = test_data[i];
    }
    int index = (int)(9.0f * (float)Math.random());
    if (dd[index] < 0.0f) dd[index] = 1.0f;
    else                  dd[index] = -1.0f;
    float [] rr = test.recall(dd, 5);
    System.out.print(s + "\nOriginal data:      ");
    for (int i = 0; i < 10; i++)
      System.out.print(pp(test_data[i]) + " ");
    System.out.print("\nRandomized data:    ");
    for (int i = 0; i < 10; i++)
      System.out.print(pp(dd[i]) + " ");
    System.out.print("\nRecognized pattern: ");
    for (int i = 0; i < 10; i++)
      System.out.print(pp(rr[i]) + " ");
    System.out.println();
  }

The following listing shows how to run the program and lists the example output:

  java Test_Hopfield
  pattern 0
  Original data:      1 1 1 0 0 0 0 0 0 0
  Randomized data:    1 1 1 0 0 0 1 0 0 0
  Recognized pattern: 1 1 1 0 0 0 0 0 0 0
  pattern 1
  Original data:      0 0 0 1 1 1 0 0 0 0
  Randomized data:    1 0 0 1 1 1 0 0 0 0
  Recognized pattern: 0 0 0 1 1 1 0 0 0 0
  pattern 2
  Original data:      0 0 0 0 0 0 0 1 1 1
  Randomized data:    0 0 0 1 0 0 0 1 1 1
  Recognized pattern: 0 0 0 0 0 0 0 1 1 1

In this listing we see that the three sample training patterns in Test_Hopfield.java are re-created after scrambling the data by changing one randomly chosen value to its opposite value.
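The formatting method pp referenced by helper is not listed in this section. A minimal sketch consistent with its description (clamping a floating-point value to zero or one for printing) might look like the following; the return type and the exact threshold are assumptions, not necessarily what the book's ZIP file uses:

  // Hypothetical version of pp: prints an activation as "0" or "1".
  // The 0.1f threshold is an assumption made for this sketch.
  private static String pp(float x) {
    if (x < 0.1f) return "0";
    return "1";
  }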