Project 6: Eye-Tracking

Chenran Jin - 50217058 - chenranj@buffalo.edu

Introduction

The goal of this project is to train a machine to track the position of the pupil. We first use eye pictures generated from SVG graphics as the training data. A txt file generated along with the eye pictures stores the pupil position, which we use to check the training results. The project uses TensorFlow to build a CNN, relying on built-in convolution layers, ReLU activations, batch normalization, and max/average pooling as ready-made modules. The model is assembled from these convolutional blocks followed by a flattening step. We then train the model for a number of epochs on the training set and note how many epochs are needed before the predictions come close to the true positions.

We can then run the trained network on held-out test pictures rather than the pictures it was trained on. Comparing test and training accuracy lets us evaluate how well the network generalizes.

Process

Import packages

In [1]:
# import the necessary packages
from tensorflow.keras.layers import BatchNormalization
from sklearn.model_selection import train_test_split
from tensorflow.keras.layers import MaxPooling2D
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Activation
from tensorflow.keras.layers import Dropout
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.layers import Flatten
from tensorflow.keras.layers import Conv2D
from tensorflow.keras.layers import Dense
from tensorflow.keras.layers import Input
from tensorflow.keras.models import Model
import matplotlib.pyplot as plt
from os.path import join
from glob import glob
import pandas as pd
import numpy as np
import cv2
import os

Build & train the network

Creating this neural network is similar to applying a stack of filters to our image. We define the number of filters per stage as (16, 32, 64) and require the user to give the image's width, height, and depth (number of channels). Each stage processes the image over local regions determined by the filter size and stride. First, Conv2D creates a two-dimensional convolution layer from the number of filters, kernel size, stride, and other parameters. The ReLU activation function then zeroes out negative responses, introducing the nonlinearity the network needs. After that, the outputs are batch-normalized and max-pooled, keeping only the largest value within each 2x2 region. At last, we flatten the volume that we got and output a model.

In [2]:
# build the network
def create_cnn(width, height, depth, nfilters=(16, 32, 64), regress=False):
    # initialize the input shape and channel dimension, assuming
    # TensorFlow/channels-last ordering
    inputShape = (height, width, depth)
    chanDim = -1

    # define the model input
    inputs = Input(shape=inputShape)

    # loop over the number of filters
    for (i, nf) in enumerate(nfilters):
        # if this is the first CONV layer then set the input
        # appropriately
        if i == 0:
            x = inputs

        # CONV => RELU => BN => POOL
        x = Conv2D(nf, (3, 3), padding="same")(x)
        x = Activation("relu")(x)
        x = BatchNormalization(axis=chanDim)(x)
        x = MaxPooling2D(pool_size=(2, 2))(x)

    # flatten the volume, then FC => RELU => BN => DROPOUT
    x = Flatten()(x)
    x = Dense(16)(x)
    x = Activation("relu")(x)
    x = BatchNormalization(axis=chanDim)(x)
    x = Dropout(0.5)(x)

    x = Dense(2, activation="linear")(x)  # 2D output: the (x, y) pupil position

    # construct the CNN
    model = Model(inputs, x)

    # return the CNN
    return model
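
As a quick sanity check (a sketch, not part of the original run; demo is a hypothetical name), we can instantiate the network for the 90x48 RGB eye images used below and trace the shapes:

# each 2x2 max-pool halves the spatial dimensions:
# 48x90 -> 24x45 -> 12x22 -> 6x11,
# so Flatten produces 6 * 11 * 64 = 4224 features feeding Dense(16)
demo = create_cnn(90, 48, 3)
demo.summary()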

Load the training images and train the network we just defined. We then compare against the CSV-style txt file to check the accuracy of the predictions. From the chart summarizing the data, we can conclude: the thinner the predicted point cloud, i.e. the closer the formed line is to f(x) = x, the more accurate the prediction results are.
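
For reference, setup() below reads the attributes with pd.read_csv(datafile, sep=' '), so the txt file is assumed to be a space-separated table whose header includes the x and y columns, roughly along these lines (values hypothetical):

x y
17 23
64 10
...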

In [3]:
# now actually do it
def setup(datafolder, columns_to_use):
    datafile = join(datafolder, datafolder + '.txt')
    # construct the path to the input .txt file that contains information
    # on each image in the dataset and then load the dataset
    print("Loading attributes...")
    df = pd.read_csv(datafile, sep=' ')[columns_to_use]

    # load the images and then scale the pixel intensities to the range [0, 1]
    print("Loading images...")
    imagefiles = sorted(glob(join(datafolder, '*.png')))
    images = np.array([
        cv2.imread(imagefile)[:, :, :3]  # slice off any opacity layer
        for imagefile in imagefiles
    ])
    images = images / 255.0

    # partition the data into training and testing splits using 75% of
    # the data for training and the remaining 25% for testing
    split = train_test_split(
        df, images, test_size=0.25,
        random_state=42)  # can alternatively make your own function to do this
    (trainAttrX, testAttrX, trainImagesX, testImagesX) = split

    # rescale trainAttrX and testAttrX to [0, 1] using the training-set min and range
    minAttr = trainAttrX.min(axis=0)
    maxAttr = trainAttrX.max(axis=0)
    rangeAttr = maxAttr - minAttr

    trainAttrX -= minAttr
    trainAttrX = (trainAttrX / rangeAttr).values

    testAttrX -= minAttr
    testAttrX = (testAttrX / rangeAttr).values

    # Let's return minAttr and rangeAttr for use in interpretation of predictions
    return trainAttrX, testAttrX, trainImagesX, testImagesX, minAttr, rangeAttr


tra, testa, tri, testi, mina, rangea = setup('one_eye', ['x', 'y'])

# create our Convolutional Neural Network and then compile the model
# using mean squared error as our loss
model = create_cnn(90, 48, 3, regress=True)
opt = Adam(lr=1e-3,
           decay=1e-3 / 200)  # more sophisticated than plain gradient descent
model.compile(loss="mean_squared_error", optimizer=opt)

# train the model
print("Training model...")
model.fit(tri, tra, validation_data=(testi, testa), epochs=100, batch_size=8)
model.save('model.h5')  # in case we want to take up training from where we left off

# make predictions on the testing data
print("Predicting ...")
preds = model.predict(testi)
print(preds)
print('Actual')
print(testa)
Loading attributes...
Loading images...
Training model...
Train on 750 samples, validate on 250 samples
Epoch 1/100
750/750 [==============================] - 4s 5ms/sample - loss: 1.1553 - val_loss: 7.2373
Epoch 2/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.5609 - val_loss: 26.3445
Epoch 3/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.3768 - val_loss: 15.6102
Epoch 4/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.2799 - val_loss: 1.9767
Epoch 5/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.2107 - val_loss: 0.3861
Epoch 6/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.1644 - val_loss: 0.0575
Epoch 7/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.1385 - val_loss: 0.0169
Epoch 8/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0939 - val_loss: 0.0071
Epoch 9/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0803 - val_loss: 0.0040
Epoch 10/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0668 - val_loss: 0.0047
Epoch 11/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0562 - val_loss: 0.0062
Epoch 12/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0500 - val_loss: 0.0051
Epoch 13/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0465 - val_loss: 0.0060
Epoch 14/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0456 - val_loss: 0.0065
Epoch 15/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0382 - val_loss: 0.0071
Epoch 16/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0379 - val_loss: 0.0066
Epoch 17/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0330 - val_loss: 0.0057
Epoch 18/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0353 - val_loss: 0.0056
Epoch 19/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0336 - val_loss: 0.0061
Epoch 20/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0334 - val_loss: 0.0064
Epoch 21/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0318 - val_loss: 0.0063
Epoch 22/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0307 - val_loss: 0.0063
Epoch 23/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0319 - val_loss: 0.0076
Epoch 24/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0311 - val_loss: 0.0073
Epoch 25/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0304 - val_loss: 0.0054
Epoch 26/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0299 - val_loss: 0.0052
Epoch 27/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0314 - val_loss: 0.0060
Epoch 28/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0316 - val_loss: 0.0062
Epoch 29/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0288 - val_loss: 0.0063
Epoch 30/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0294 - val_loss: 0.0059
Epoch 31/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0305 - val_loss: 0.0058
Epoch 32/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0279 - val_loss: 0.0056
Epoch 33/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0284 - val_loss: 0.0060
Epoch 34/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0296 - val_loss: 0.0050
Epoch 35/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0297 - val_loss: 0.0067
Epoch 36/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0284 - val_loss: 0.0055
Epoch 37/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0296 - val_loss: 0.0047
Epoch 38/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0282 - val_loss: 0.0055
Epoch 39/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0285 - val_loss: 0.0061
Epoch 40/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0269 - val_loss: 0.0074
Epoch 41/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0289 - val_loss: 0.0067
Epoch 42/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0292 - val_loss: 0.0060
Epoch 43/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0279 - val_loss: 0.0044
Epoch 44/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0276 - val_loss: 0.0071
Epoch 45/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0287 - val_loss: 0.0052
Epoch 46/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0277 - val_loss: 0.0058
Epoch 47/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0292 - val_loss: 0.0063
Epoch 48/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0288 - val_loss: 0.0055
Epoch 49/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0268 - val_loss: 0.0057
Epoch 50/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0282 - val_loss: 0.0054
Epoch 51/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0310 - val_loss: 0.0068
Epoch 52/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0286 - val_loss: 0.0055
Epoch 53/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0317 - val_loss: 0.0045
Epoch 54/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0280 - val_loss: 0.0049
Epoch 55/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0301 - val_loss: 0.0044
Epoch 56/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0312 - val_loss: 0.0052
Epoch 57/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0257 - val_loss: 0.0037
Epoch 58/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0295 - val_loss: 0.0074
Epoch 59/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0279 - val_loss: 0.0043
Epoch 60/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0276 - val_loss: 0.0042
Epoch 61/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0276 - val_loss: 0.0052
Epoch 62/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0287 - val_loss: 0.0031
Epoch 63/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0288 - val_loss: 0.0055
Epoch 64/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0296 - val_loss: 0.0048
Epoch 65/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0291 - val_loss: 0.0050
Epoch 66/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0274 - val_loss: 0.0046
Epoch 67/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0302 - val_loss: 0.0061
Epoch 68/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0278 - val_loss: 0.0060
Epoch 69/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0277 - val_loss: 0.0069
Epoch 70/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0284 - val_loss: 0.0043
Epoch 71/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0295 - val_loss: 0.0046
Epoch 72/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0262 - val_loss: 0.0043
Epoch 73/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0281 - val_loss: 0.0056
Epoch 74/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0285 - val_loss: 0.0059
Epoch 75/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0300 - val_loss: 0.0049
Epoch 76/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0274 - val_loss: 0.0060
Epoch 77/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0312 - val_loss: 0.0065
Epoch 78/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0314 - val_loss: 0.0050
Epoch 79/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0276 - val_loss: 0.0053
Epoch 80/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0283 - val_loss: 0.0074
Epoch 81/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0282 - val_loss: 0.0043
Epoch 82/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0282 - val_loss: 0.0042
Epoch 83/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0281 - val_loss: 0.0048
Epoch 84/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0265 - val_loss: 0.0063
Epoch 85/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0287 - val_loss: 0.0037
Epoch 86/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0271 - val_loss: 0.0066
Epoch 87/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0272 - val_loss: 0.0057
Epoch 88/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0285 - val_loss: 0.0060
Epoch 89/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0275 - val_loss: 0.0038
Epoch 90/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0288 - val_loss: 0.0037
Epoch 91/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0272 - val_loss: 0.0051
Epoch 92/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0286 - val_loss: 0.0059
Epoch 93/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0285 - val_loss: 0.0057
Epoch 94/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0271 - val_loss: 0.0057
Epoch 95/100
750/750 [==============================] - 3s 4ms/sample - loss: 0.0264 - val_loss: 0.0035
Epoch 96/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0283 - val_loss: 0.0060
Epoch 97/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0287 - val_loss: 0.0074
Epoch 98/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0294 - val_loss: 0.0055
Epoch 99/100
750/750 [==============================] - 3s 3ms/sample - loss: 0.0287 - val_loss: 0.0055
Epoch 100/100
750/750 [==============================] - 2s 3ms/sample - loss: 0.0268 - val_loss: 0.0041
Predicting ...
[[0.14408115 0.62644064]
 [0.5191398  0.8470916 ]
 [0.8146733  0.16779172]
 [0.44692144 0.34224552]
 [0.22456911 0.3416016 ]
 [0.5903549  0.11671969]
 [0.684707   0.8429208 ]
 [0.50616854 0.22188017]
 [0.4077308  0.752016  ]
 [0.24113753 0.319127  ]
 [0.16315389 0.19399145]
 [0.3470545  0.15546694]
 [0.42384005 0.44141093]
 [0.15000165 0.38906735]
 [0.7609581  0.19539115]
 [0.60182685 0.22274107]
 [0.7599214  0.33015645]
 [0.50664026 0.28423655]
 [0.17955181 0.18834129]
 [0.531073   0.2897105 ]
 [0.4934024  0.5175084 ]
 [0.25734636 0.36592725]
 [0.5981566  0.11399177]
 [0.24418229 0.4554207 ]
 [0.5396902  0.37145042]
 [0.86369765 0.28228846]
 [0.41910875 0.15682721]
 [0.8557017  0.7813239 ]
 [0.8888368  0.51534396]
 [0.31313062 0.640778  ]
 [0.6735526  0.5405044 ]
 [0.28552827 0.87406147]
 [0.20721304 0.18960294]
 [0.50520545 0.67909026]
 [0.72769594 0.8189574 ]
 [0.25639302 0.33082992]
 [0.762216   0.686227  ]
 [0.67765254 0.644508  ]
 [0.47811842 0.84967744]
 [0.34465832 0.38893157]
 [0.3464507  0.46374056]
 [0.51143533 0.4389072 ]
 [0.14475363 0.5506094 ]
 [0.32011718 0.7175544 ]
 [0.43593335 0.08831102]
 [0.14765301 0.4393273 ]
 [0.21535566 0.8832363 ]
 [0.7292905  0.34678966]
 [0.3244319  0.61598146]
 [0.52114356 0.72229564]
 [0.4315038  0.19273794]
 [0.17468375 0.31130248]
 [0.26790917 0.24728335]
 [0.14689925 0.63165426]
 [0.40042907 0.21993774]
 [0.20067763 0.17879504]
 [0.46799055 0.8485749 ]
 [0.6631969  0.8432307 ]
 [0.6532811  0.4432854 ]
 [0.51932544 0.4081323 ]
 [0.27448338 0.29066938]
 [0.17556664 0.25341374]
 [0.28902614 0.190658  ]
 [0.23220035 0.19110963]
 [0.31329268 0.5743778 ]
 [0.5335208  0.08677095]
 [0.16061491 0.7445782 ]
 [0.3876862  0.83816516]
 [0.58746475 0.6309232 ]
 [0.28192183 0.6806697 ]
 [0.8180792  0.81291616]
 [0.6901829  0.11707538]
 [0.49205327 0.10121977]
 [0.5111289  0.18505919]
 [0.8403695  0.19062197]
 [0.33774918 0.12314615]
 [0.731677   0.5576516 ]
 [0.47587445 0.20942816]
 [0.58039993 0.18330967]
 [0.22010827 0.47793946]
 [0.6981875  0.64825714]
 [0.79891086 0.5654951 ]
 [0.16957477 0.26549676]
 [0.51957184 0.38623697]
 [0.79397    0.77589774]
 [0.84257555 0.59280264]
 [0.423027   0.7376752 ]
 [0.2104722  0.43012172]
 [0.8621595  0.5172892 ]
 [0.5074603  0.70961756]
 [0.6696625  0.09784788]
 [0.2705348  0.2569383 ]
 [0.31965172 0.8673848 ]
 [0.8535112  0.78979456]
 [0.66905767 0.08738798]
 [0.43133163 0.35110748]
 [0.33585572 0.1577008 ]
 [0.5847937  0.12496749]
 [0.29608303 0.8874289 ]
 [0.18943131 0.18886063]
 [0.34808707 0.21149328]
 [0.74771    0.4450989 ]
 [0.64661556 0.20603278]
 [0.61942655 0.14787313]
 [0.77994686 0.20838073]
 [0.59720093 0.12753567]
 [0.5063036  0.59288925]
 [0.5157981  0.5099315 ]
 [0.7350179  0.49320602]
 [0.3727103  0.8726578 ]
 [0.7110821  0.84356606]
 [0.56732637 0.3005522 ]
 [0.13612083 0.5026382 ]
 [0.78675467 0.83615464]
 [0.16745162 0.2807988 ]
 [0.6935107  0.7782248 ]
 [0.2073417  0.16267702]
 [0.3471728  0.61275566]
 [0.5432636  0.85380715]
 [0.7803464  0.0976131 ]
 [0.2126477  0.26564413]
 [0.57687485 0.5857147 ]
 [0.81241834 0.7623415 ]
 [0.4548598  0.66666514]
 [0.4341923  0.58842504]
 [0.8550574  0.787428  ]
 [0.3201592  0.13751763]
 [0.69092906 0.77999175]
 [0.8466188  0.21316755]
 [0.7944111  0.69035816]
 [0.6927586  0.5240405 ]
 [0.5484333  0.08809835]
 [0.5031522  0.13255584]
 [0.6850059  0.5945349 ]
 [0.15556183 0.3618269 ]
 [0.6837022  0.84621096]
 [0.77421284 0.77528137]
 [0.41636136 0.74950695]
 [0.8660555  0.26254252]
 [0.38739    0.46826133]
 [0.81140566 0.810634  ]
 [0.25255665 0.31698668]
 [0.7385689  0.8522467 ]
 [0.29227048 0.1499457 ]
 [0.34119266 0.19960505]
 [0.4654393  0.81395036]
 [0.14938956 0.68194085]
 [0.2358529  0.6786954 ]
 [0.7966244  0.1218459 ]
 [0.13977996 0.72846943]
 [0.20149255 0.7536553 ]
 [0.33100536 0.7674438 ]
 [0.5249912  0.1573497 ]
 [0.25567397 0.16773847]
 [0.77762896 0.851761  ]
 [0.8020388  0.1498701 ]
 [0.463736   0.32049114]
 [0.58980966 0.7308762 ]
 [0.24411601 0.5125661 ]
 [0.45426926 0.13302746]
 [0.15819284 0.2285637 ]
 [0.3588019  0.34836397]
 [0.6860625  0.38814282]
 [0.54986274 0.0822885 ]
 [0.13877532 0.81101835]
 [0.6778724  0.632426  ]
 [0.324983   0.7553723 ]
 [0.5003114  0.1822455 ]
 [0.4303826  0.17720205]
 [0.56589025 0.8461152 ]
 [0.39803457 0.16718924]
 [0.22589558 0.3116852 ]
 [0.47083712 0.6550364 ]
 [0.6610575  0.28160003]
 [0.4924281  0.4529826 ]
 [0.21179953 0.5277155 ]
 [0.82245326 0.15514264]
 [0.14625716 0.7754959 ]
 [0.16329882 0.18462393]
 [0.16375065 0.41571027]
 [0.1347045  0.8249008 ]
 [0.47515157 0.6636346 ]
 [0.8747082  0.6317812 ]
 [0.8229375  0.77548945]
 [0.6095131  0.12419856]
 [0.6306819  0.09737796]
 [0.8207773  0.76487565]
 [0.13934547 0.8935003 ]
 [0.6462196  0.81890625]
 [0.6696199  0.50150955]
 [0.27469337 0.29949832]
 [0.15120056 0.64359415]
 [0.35408705 0.8822886 ]
 [0.6206207  0.0691739 ]
 [0.8614496  0.4912231 ]
 [0.49461323 0.4784555 ]
 [0.88737136 0.5015279 ]
 [0.8533972  0.7511945 ]
 [0.16792184 0.82326674]
 [0.69032514 0.14665022]
 [0.2822287  0.3558089 ]
 [0.6984528  0.08551261]
 [0.18557441 0.17722294]
 [0.70471597 0.25701013]
 [0.80134    0.188095  ]
 [0.2652443  0.15301383]
 [0.25426728 0.69704145]
 [0.4105659  0.44468343]
 [0.7570108  0.2764391 ]
 [0.7224787  0.51786935]
 [0.793735   0.11439461]
 [0.52802014 0.43545976]
 [0.24646366 0.59762436]
 [0.42186874 0.24123918]
 [0.36223936 0.14346132]
 [0.65857327 0.5694055 ]
 [0.20870948 0.68095684]
 [0.3715433  0.1204873 ]
 [0.7751267  0.14794436]
 [0.8749598  0.44760424]
 [0.78399706 0.44213963]
 [0.17361441 0.6172792 ]
 [0.6499276  0.7552761 ]
 [0.86348116 0.6981952 ]
 [0.58500826 0.19753212]
 [0.46549308 0.24089164]
 [0.45134616 0.14441931]
 [0.82983637 0.16502592]
 [0.87700605 0.6702883 ]
 [0.44547838 0.22114083]
 [0.6812509  0.3166645 ]
 [0.7607736  0.58807933]
 [0.4411634  0.10455802]
 [0.8520916  0.4932834 ]
 [0.6481034  0.78032905]
 [0.37856498 0.62977165]
 [0.7940172  0.11570027]
 [0.84613407 0.57587665]
 [0.46818435 0.7869393 ]
 [0.43334803 0.1765444 ]
 [0.18070808 0.48977435]
 [0.14202985 0.4972176 ]
 [0.8409641  0.18746287]
 [0.67569226 0.83781356]
 [0.1677705  0.2606644 ]
 [0.41085035 0.8090441 ]
 [0.2660777  0.88638353]
 [0.6797192  0.18616384]
 [0.36660072 0.2264396 ]
 [0.58679974 0.09670779]]
Actual
[[ 8.01021302e-03  6.12684297e-01]
 [ 5.04343037e-01  9.73354284e-01]
 [ 8.63188065e-01  1.75386981e-01]
 [ 4.21787779e-01  4.19176891e-01]
 [ 1.47600691e-01  2.97315369e-01]
 [ 5.95747078e-01  1.03139313e-01]
 [ 7.17890310e-01  9.65531076e-01]
 [ 4.85694260e-01  2.88087994e-01]
 [ 3.73826629e-01  8.34375313e-01]
 [ 1.80117149e-01  2.87185316e-01]
 [ 3.25665223e-02  7.84995487e-02]
 [ 2.95176350e-01  1.05479589e-01]
 [ 3.98007460e-01  5.17201030e-01]
 [-2.50319157e-04  3.19280532e-01]
 [ 7.81871887e-01  2.13433185e-01]
 [ 5.90089865e-01  2.62545552e-01]
 [ 7.68442264e-01  3.77720571e-01]
 [ 4.83766803e-01  3.53682592e-01]
 [ 1.95999900e-02  1.62816355e-02]
 [ 5.11627325e-01  3.73274046e-01]
 [ 4.69248292e-01  5.93494032e-01]
 [ 2.00943703e-01  3.52713049e-01]
 [ 6.06986408e-01  9.13376350e-02]
 [ 1.86199905e-01  4.45087092e-01]
 [ 5.18974192e-01  4.60399184e-01]
 [ 9.32901950e-01  2.88288589e-01]
 [ 3.83751784e-01  1.55126876e-01]
 [ 9.38459035e-01  8.56006152e-01]
 [ 9.44053668e-01  5.12888235e-01]
 [ 2.65876493e-01  6.85099127e-01]
 [ 6.73095697e-01  6.15158303e-01]
 [ 2.10393251e-01  9.50386146e-01]
 [ 1.42281409e-01  1.21995253e-01]
 [ 4.94618138e-01  7.59018421e-01]
 [ 7.59931413e-01  9.06790144e-01]
 [ 2.01131443e-01  3.07278259e-01]
 [ 7.87491552e-01  7.47116445e-01]
 [ 7.03872437e-01  7.19166862e-01]
 [ 4.56694786e-01  9.69843870e-01]
 [ 2.87516584e-01  4.14797232e-01]
 [ 2.96077499e-01  4.93764836e-01]
 [ 4.88635510e-01  5.16131189e-01]
 [ 3.18531127e-02  5.06536057e-01]
 [ 2.68767679e-01  7.61492428e-01]
 [ 4.09321885e-01 -1.06984053e-03]
 [ 3.22285915e-02  3.88285246e-01]
 [ 1.31592781e-01  9.59747250e-01]
 [ 7.26526321e-01  4.06238508e-01]
 [ 2.76978147e-01  6.61562636e-01]
 [ 5.06370623e-01  8.14182073e-01]
 [ 4.03314226e-01  2.27976330e-01]
 [ 8.68482315e-02  2.76553776e-01]
 [ 2.16363363e-01  2.17277925e-01]
 [ 1.89867081e-02  6.19872288e-01]
 [ 3.51373001e-01  2.32690315e-01]
 [ 1.15484743e-01  7.97365518e-02]
 [ 4.40436557e-01  9.57440407e-01]
 [ 7.01932464e-01  9.77733944e-01]
 [ 6.29890611e-01  5.34117883e-01]
 [ 5.00700894e-01  4.93263348e-01]
 [ 2.20481113e-01  2.61509144e-01]
 [ 9.51838594e-02  2.09120390e-01]
 [ 2.44048662e-01  1.45030256e-01]
 [ 1.80354953e-01  1.34666176e-01]
 [ 2.66339583e-01  6.06198389e-01]
 [ 5.35382613e-01  5.52305172e-02]
 [ 6.96638214e-02  7.72458293e-01]
 [ 3.51197777e-01  9.13777540e-01]
 [ 5.70314651e-01  7.07465481e-01]
 [ 2.28241007e-01  7.25552472e-01]
 [ 8.91173747e-01  8.91377754e-01]
 [ 7.22483667e-01  8.69579753e-02]
 [ 4.72927983e-01  6.48590819e-02]
 [ 4.93429122e-01  2.35532078e-01]
 [ 9.37307567e-01  1.83511083e-01]
 [ 2.80808030e-01  1.91233994e-02]
 [ 7.45750832e-01  6.25756411e-01]
 [ 4.46406669e-01  2.46197051e-01]
 [ 5.73243385e-01  2.28644980e-01]
 [ 1.60667351e-01  4.64745411e-01]
 [ 7.17514831e-01  7.12513791e-01]
 [ 8.24100728e-01  6.24954030e-01]
 [ 7.30556459e-02  2.08786065e-01]
 [ 5.01301660e-01  4.75744709e-01]
 [ 8.39557936e-01  8.46778777e-01]
 [ 8.78169666e-01  6.30670991e-01]
 [ 3.96818444e-01  8.19096653e-01]
 [ 1.21817818e-01  3.91795660e-01]
 [ 8.97319082e-01  5.32145365e-01]
 [ 4.95181356e-01  7.96696867e-01]
 [ 6.96275251e-01  6.76005483e-02]
 [ 2.16838970e-01  2.25301728e-01]
 [ 2.53973817e-01  9.43900237e-01]
 [ 9.69561191e-01  9.10434288e-01]
 [ 6.99642044e-01  3.87148541e-02]
 [ 4.05354327e-01  4.23723714e-01]
 [ 2.82234849e-01  9.61184848e-02]
 [ 5.88162407e-01  1.21928388e-01]
 [ 2.15649954e-01  9.79138110e-01]
 [ 9.42451626e-02  8.27454783e-02]
 [ 2.91484142e-01  1.97887065e-01]
 [ 7.46702045e-01  5.08374845e-01]
 [ 6.46699542e-01  2.39577413e-01]
 [ 6.24771584e-01  1.56297014e-01]
 [ 8.02035095e-01  2.22426532e-01]
 [ 6.03619615e-01  1.22062118e-01]
 [ 4.99336654e-01  6.75604293e-01]
 [ 5.04017622e-01  5.88278560e-01]
 [ 7.37377657e-01  5.49162515e-01]
 [ 3.19982978e-01  9.79505867e-01]
 [ 7.45450449e-01  9.57741299e-01]
 [ 5.50789757e-01  3.81799338e-01]
 [ 3.76730331e-03  4.49399886e-01]
 [ 8.32085910e-01  9.17321387e-01]
 [ 6.20040552e-02  2.26505299e-01]
 [ 7.22784050e-01  8.69780348e-01]
 [ 1.10453328e-01  1.56464177e-02]
 [ 3.02135222e-01  6.66945271e-01]
 [ 5.41615560e-01  9.90003678e-01]
 [ 8.54013868e-01  3.12259704e-02]
 [ 1.46324063e-01  2.28410952e-01]
 [ 5.59888858e-01  6.64604995e-01]
 [ 8.61122932e-01  8.23910936e-01]
 [ 4.29935668e-01  7.41666945e-01]
 [ 4.12000300e-01  6.58386547e-01]
 [ 9.66131818e-01  9.02677945e-01]
 [ 2.65964104e-01  5.86740664e-02]
 [ 7.16989161e-01  8.70047808e-01]
 [ 9.18270795e-01  2.09187256e-01]
 [ 8.28694085e-01  7.46180335e-01]
 [ 7.07189166e-01  6.06265254e-01]
 [ 5.51903677e-01  5.82060112e-02]
 [ 4.86295026e-01  1.38009428e-01]
 [ 7.03947533e-01  6.71224633e-01]
 [ 2.20030539e-02  3.03433519e-01]
 [ 7.20355954e-01  9.83852095e-01]
 [ 8.09557185e-01  8.53331550e-01]
 [ 3.87118576e-01  8.27020160e-01]
 [ 9.85406393e-01  2.51211929e-01]
 [ 3.53788580e-01  5.21112634e-01]
 [ 8.79195975e-01  8.91912674e-01]
 [ 1.94598113e-01  2.90461703e-01]
 [ 7.77040727e-01  9.76196048e-01]
 [ 2.42559263e-01  6.24185082e-02]
 [ 2.85463967e-01  1.83979138e-01]
 [ 4.37532854e-01  9.07024172e-01]
 [ 3.42061128e-02  6.85232857e-01]
 [ 1.78853038e-01  7.22376383e-01]
 [ 9.47520589e-01  1.93908595e-03]
 [ 1.56824952e-02  7.49155829e-01]
 [ 1.30291121e-01  8.03216208e-01]
 [ 2.79894365e-01  8.29795059e-01]
 [ 5.16858995e-01  1.95647086e-01]
 [ 2.02971288e-01  8.96325766e-02]
 [ 8.23637638e-01  9.60549631e-01]
 [ 8.38231245e-01  1.49677376e-01]
 [ 4.34866955e-01  3.93366989e-01]
 [ 5.69388470e-01  8.12276420e-01]
 [ 1.91731958e-01  5.08809468e-01]
 [ 4.25479987e-01  1.21560630e-01]
 [ 1.89116123e-02  1.36036909e-01]
 [ 3.01759744e-01  3.60502825e-01]
 [ 6.79879346e-01  4.65280332e-01]
 [ 5.55095246e-01  3.50372773e-02]
 [ 3.60709905e-02  8.44705961e-01]
 [ 7.01369246e-01  7.06362208e-01]
 [ 2.73310971e-01  8.03082478e-01]
 [ 4.80024531e-01  2.27073652e-01]
 [ 4.01111417e-01  1.98221390e-01]
 [ 5.62016571e-01  9.69008057e-01]
 [ 3.50371724e-01  1.65089766e-01]
 [ 1.57500814e-01  2.69766975e-01]
 [ 4.46656988e-01  7.35114172e-01]
 [ 6.45460462e-01  3.44388352e-01]
 [ 4.68234499e-01  5.28835545e-01]
 [ 1.45998648e-01  5.16900137e-01]
 [ 9.06668502e-01  1.30721139e-01]
 [ 9.51212796e-04  8.27220755e-01]
 [ 3.06390648e-02  5.16198054e-02]
 [ 6.92132469e-02  3.72538531e-01]
 [ 1.07136599e-02  8.66771422e-01]
 [ 4.53853663e-01  7.43071111e-01]
 [ 9.42989812e-01  6.65541105e-01]
 [ 8.90310145e-01  8.47447427e-01]
 [ 6.21730206e-01  1.09357761e-01]
 [ 6.54547047e-01  5.43947043e-02]
 [ 8.82112193e-01  8.33506068e-01]
 [ 2.12270645e-02  9.58978302e-01]
 [ 6.74372325e-01  9.31797666e-01]
 [ 6.58426994e-01  5.79820133e-01]
 [ 2.17639991e-01  2.70703086e-01]
 [ 4.23540014e-02  6.38226739e-01]
 [ 2.90332674e-01  9.83350607e-01]
 [ 6.45598138e-01 -6.68650329e-04]
 [ 8.90347693e-01  5.13790913e-01]
 [ 4.68509850e-01  5.56417372e-01]
 [ 9.47495557e-01  5.00819097e-01]
 [ 9.26744099e-01  8.20367089e-01]
 [ 9.85381361e-02  8.75564174e-01]
 [ 7.06100278e-01  1.52586005e-01]
 [ 2.17752635e-01  3.38203337e-01]
 [ 7.36952114e-01  2.99889673e-02]
 [ 4.58709855e-02  1.87222092e-03]
 [ 7.16926581e-01  2.94473605e-01]
 [ 8.38206213e-01  2.06044599e-01]
 [ 2.04761070e-01  4.03530474e-02]
 [ 1.96901049e-01  7.41198890e-01]
 [ 3.83739268e-01  5.15729999e-01]
 [ 7.63060402e-01  3.02263381e-01]
 [ 7.33347518e-01  5.82327572e-01]
 [ 8.64965331e-01  6.90381465e-02]
 [ 5.10213022e-01  5.19909064e-01]
 [ 1.95812161e-01  6.18300960e-01]
 [ 3.87168640e-01  2.72441577e-01]
 [ 3.13537260e-01  9.80910033e-02]
 [ 6.58314351e-01  6.43642807e-01]
 [ 1.43658164e-01  7.04556852e-01]
 [ 3.24238404e-01  5.01487747e-02]
 [ 8.06252973e-01  1.42857143e-01]
 [ 9.13852662e-01  4.53278058e-01]
 [ 7.87416456e-01  4.98846578e-01]
 [ 9.13790082e-02  6.04760790e-01]
 [ 6.55548324e-01  8.42733443e-01]
 [ 9.25855466e-01  7.49924777e-01]
 [ 5.78938146e-01  2.41048444e-01]
 [ 4.34554056e-01  2.90495136e-01]
 [ 4.21499912e-01  1.46969342e-01]
 [ 9.28008210e-01  1.52987195e-01]
 [ 9.55605898e-01  7.16057638e-01]
 [ 4.13151769e-01  2.71505466e-01]
 [ 6.80204761e-01  3.80228010e-01]
 [ 7.72234599e-01  6.56347163e-01]
 [ 4.12588550e-01  4.37297315e-02]
 [ 8.79408746e-01  5.19775333e-01]
 [ 6.62719968e-01  8.76366554e-01]
 [ 3.40934692e-01  6.96064993e-01]
 [ 9.26406168e-01  6.48590819e-03]
 [ 8.79033267e-01  6.04025275e-01]
 [ 4.41725700e-01  8.79676373e-01]
 [ 4.05804901e-01  2.04740731e-01]
 [ 7.42696939e-02  4.41175487e-01]
 [ 1.81481389e-02  4.45053659e-01]
 [ 9.61313174e-01  1.70572699e-01]
 [ 7.11356980e-01  9.60649928e-01]
 [ 6.99141405e-02  2.00427936e-01]
 [ 3.78419985e-01  8.90775969e-01]
 [ 1.84209868e-01  9.59178897e-01]
 [ 6.95649453e-01  2.11861857e-01]
 [ 3.07967659e-01  2.15338839e-01]
 [ 5.92317705e-01  5.78382535e-02]]
In [4]:
# draw a scatter plot of predicted vs. actual x-coordinates to evaluate the trained network
plt.scatter(preds[:, 0], testa[:, 0])
Out[4]:
<matplotlib.collections.PathCollection at 0x1a41fee650>
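
setup() returned mina and rangea precisely so that normalized predictions can be mapped back to the original coordinate space, and the scatter above only covers the x-coordinate. A minimal sketch of both follow-ups (assuming the variables from the cells above; not part of the original run):

# undo the min/range scaling to recover positions in the units of the txt file
orig_preds = preds * rangea.values + mina.values
print(orig_preds[:5])

# repeat the scatter check for the y-coordinate, with the reference line f(x) = x
plt.scatter(preds[:, 1], testa[:, 1])
plt.plot([0, 1], [0, 1])
plt.xlabel('predicted')
plt.ylabel('actual')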

Conclusion

In this test, removing another FC layer greatly improved the recognition results. The reason may be that the sample code trained on four images at a time, while we only need to train on one image, so the extra FC layer is not necessary in this network. After training for 100 epochs, the predicted-versus-actual scatter approached a straight line. This shows that the ratio of preds to testa approaches 1, i.e. the predicted pupil positions closely match the actual ones.
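
To make "approaches a straight line" concrete, one could compute the correlation between predictions and ground truth per coordinate (a hypothetical check, not run here):

# Pearson correlation per coordinate; values near 1 mean the scatter
# lies close to the line f(x) = x
for i, name in enumerate(['x', 'y']):
    r = np.corrcoef(preds[:, i], testa[:, i])[0, 1]
    print(name, 'correlation:', round(r, 3))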