Wine recognition dataset (tf.keras): wine classification
I understand there is no need to use Keras for this wine recognition task: scikit-learn is simpler and gives good results, and the dataset has fewer than 200 samples. I am just trying Keras out on a small categorical classification dataset.
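For comparison, here is a minimal sketch of the kind of scikit-learn baseline I mean. The choice of LogisticRegression behind a StandardScaler is my own, not anything prescribed; the dataset's references (in the DESCR output below) note that the classes are separable, so almost any scaled linear model should do well here.

# Hedged baseline sketch: standardize the features, then fit a simple linear classifier.
from sklearn.datasets import load_wine
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

wine = load_wine()
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
print(cross_val_score(clf, wine.data, wine.target, cv=5).mean())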
# Download dataset
import numpy as np
from sklearn.datasets import load_wine

wine = load_wine()
print(wine.DESCR)
.. _wine_dataset:

Wine recognition dataset
------------------------

**Data Set Characteristics:**

    :Number of Instances: 178 (50 in each of three classes)
    :Number of Attributes: 13 numeric, predictive attributes and the class
    :Attribute Information:
        - Alcohol
        - Malic acid
        - Ash
        - Alcalinity of ash
        - Magnesium
        - Total phenols
        - Flavanoids
        - Nonflavanoid phenols
        - Proanthocyanins
        - Color intensity
        - Hue
        - OD280/OD315 of diluted wines
        - Proline
        - class:
            - class_0
            - class_1
            - class_2

    :Summary Statistics:

    ============================= ==== ===== ======= =====
                                   Min   Max   Mean     SD
    ============================= ==== ===== ======= =====
    Alcohol:                      11.0  14.8    13.0   0.8
    Malic Acid:                   0.74  5.80    2.34  1.12
    Ash:                          1.36  3.23    2.36  0.27
    Alcalinity of Ash:            10.6  30.0    19.5   3.3
    Magnesium:                    70.0 162.0    99.7  14.3
    Total Phenols:                0.98  3.88    2.29  0.63
    Flavanoids:                   0.34  5.08    2.03  1.00
    Nonflavanoid Phenols:         0.13  0.66    0.36  0.12
    Proanthocyanins:              0.41  3.58    1.59  0.57
    Colour Intensity:              1.3  13.0     5.1   2.3
    Hue:                          0.48  1.71    0.96  0.23
    OD280/OD315 of diluted wines: 1.27  4.00    2.61  0.71
    Proline:                       278  1680     746   315
    ============================= ==== ===== ======= =====

    :Missing Attribute Values: None
    :Class Distribution: class_0 (59), class_1 (71), class_2 (48)
    :Creator: R.A. Fisher
    :Donor: Michael Marshall (MARSHALL%PLU@io.arc.nasa.gov)
    :Date: July, 1988

This is a copy of UCI ML Wine recognition datasets.
https://archive.ics.uci.edu/ml/machine-learning-databases/wine/wine.data

The data is the results of a chemical analysis of wines grown in the same
region in Italy by three different cultivators. There are thirteen different
measurements taken for different constituents found in the three types of
wine.

Original Owners:

Forina, M. et al, PARVUS -
An Extendible Package for Data Exploration, Classification and Correlation.
Institute of Pharmaceutical and Food Analysis and Technologies,
Via Brigata Salerno, 16147 Genoa, Italy.

Citation:

Lichman, M. (2013). UCI Machine Learning Repository
[https://archive.ics.uci.edu/ml]. Irvine, CA: University of California,
School of Information and Computer Science.

.. topic:: References

    (1) S. Aeberhard, D. Coomans and O. de Vel,
    Comparison of Classifiers in High Dimensional Settings,
    Tech. Rep. no. 92-02, (1992), Dept. of Computer Science and Dept. of
    Mathematics and Statistics, James Cook University of North Queensland.
    (Also submitted to Technometrics).

    The data was used with many others for comparing various classifiers.
    The classes are separable, though only RDA has achieved 100% correct
    classification.
    (RDA : 100%, QDA 99.4%, LDA 98.9%, 1NN 96.1% (z-transformed data))
    (All results using the leave-one-out technique)

    (2) S. Aeberhard, D. Coomans and O. de Vel,
    "THE CLASSIFICATION PERFORMANCE OF RDA"
    Tech. Rep. no. 92-01, (1992), Dept. of Computer Science and Dept. of
    Mathematics and Statistics, James Cook University of North Queensland.
    (Also submitted to Journal of Chemometrics).
wine.data
array([[1.423e+01, 1.710e+00, 2.430e+00, ..., 1.040e+00, 3.920e+00,
        1.065e+03],
       [1.320e+01, 1.780e+00, 2.140e+00, ..., 1.050e+00, 3.400e+00,
        1.050e+03],
       [1.316e+01, 2.360e+00, 2.670e+00, ..., 1.030e+00, 3.170e+00,
        1.185e+03],
       ...,
       [1.327e+01, 4.280e+00, 2.260e+00, ..., 5.900e-01, 1.560e+00,
        8.350e+02],
       [1.317e+01, 2.590e+00, 2.370e+00, ..., 6.000e-01, 1.620e+00,
        8.400e+02],
       [1.413e+01, 4.100e+00, 2.740e+00, ..., 6.100e-01, 1.600e+00,
        5.600e+02]])
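Two things worth noticing in wine.data: its shape is (178, 13), and the columns live on very different scales (the last column, Proline, runs into the thousands while features like Hue stay near 1, as the summary statistics above show). Nothing in this post rescales the inputs, which is worth keeping in mind when the accuracy later disappoints.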
wine.target
array([0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0,
       0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
       1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2,
       2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
       2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2,
       2, 2])
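Note that wine.target stores the classes as plain integers 0-2 rather than one-hot vectors; that is why sparse_categorical_crossentropy is the appropriate loss when the model is compiled below.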
from sklearn.model_selection import train_test_split as split
x_train, x_test, y_train, y_test = split(wine.data, wine.target, train_size=0.8, test_size=0.2)

from __future__ import absolute_import, division, print_function, unicode_literals

# Install TensorFlow
try:
  # %tensorflow_version only exists in Colab.
  %tensorflow_version 2.x
except Exception:
  pass

import tensorflow as tf

# Create a sequential model (probably no need to build it this deep; a simpler model would do).
# Only the first layer needs input_shape; Keras infers the rest.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(26, activation='relu', input_shape=(13,)),
    tf.keras.layers.Dense(52, activation='relu'),
    tf.keras.layers.Dense(104, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax')
])
model.summary()
Model: "sequential" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= dense (Dense) (None, 26) 364 _________________________________________________________________ dense_1 (Dense) (None, 52) 1404 _________________________________________________________________ dense_2 (Dense) (None, 104) 5512 _________________________________________________________________ dense_3 (Dense) (None, 3) 315 ================================================================= Total params: 7,595 Trainable params: 7,595 Non-trainable params: 0 _________________________________________________________________
model.compile(loss='sparse_categorical_crossentropy',
              optimizer='adam',
              metrics=['acc'])
history = model.fit(x_train, y_train, batch_size=128, epochs=100, verbose=1)
Epoch 1/100
2/2 [==============================] - 0s 3ms/step - loss: 8.1767 - acc: 0.3451
Epoch 2/100
2/2 [==============================] - 0s 1ms/step - loss: 18.4229 - acc: 0.2817
Epoch 3/100
2/2 [==============================] - 0s 2ms/step - loss: 1.6280 - acc: 0.5000
...
Epoch 98/100
2/2 [==============================] - 0s 2ms/step - loss: 0.5778 - acc: 0.7465
Epoch 99/100
2/2 [==============================] - 0s 2ms/step - loss: 0.8838 - acc: 0.6268
Epoch 100/100
2/2 [==============================] - 0s 2ms/step - loss: 0.9017 - acc: 0.6690
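The 2/2 shown at every epoch explains part of the noise in this log: with 142 training samples (80% of 178) and batch_size=128, each epoch is only two gradient updates, the second on a leftover batch of 14 samples, so the per-epoch loss jumps around a lot.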
# Evaluation
score = model.evaluate(x_test, y_test, batch_size=32)
print("loss : " + str(score[0]) + ", accuracy : " + str(score[1]*100) + "%")

# Prepare a virtual test sample and classify it.
import numpy as np
x = np.array([[11.23, 1.41, 3.43, 12.6, 167, 2.00, 4.06, 0.38, 2.30, 5.64, 1.04, 3.92, 1065]])
r = model.predict(x)
print(r)

answer = r.argmax()
if answer == 0:
    print("This wine is class 0.")
elif answer == 1:
    print("This wine is class 1.")
else:
    print("This wine is class 2.")
2/2 [==============================] - 0s 2ms/step - loss: 1.3145 - acc: 0.5556
loss : 1.3145475387573242, accuracy : 55.55555820465088%
[[0.92950535 0.00265262 0.06784209]]
This wine is class 0.
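A quick check on the prediction: the three softmax outputs sum to one as expected (0.9295 + 0.0027 + 0.0678 ≈ 1.0), and argmax picks index 0, hence class 0.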
The accuracy is not a good score; with scikit-learn it is easy to get a better result. Clearly some overfitting is going on, so using EarlyStopping seems like a good idea, but this is just a test.
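For reference, here is a minimal sketch of how EarlyStopping could be wired in. The patience and validation_split values are my own guesses rather than tuned settings, and note that EarlyStopping needs validation data, which none of the runs in this post set aside.

# Hedged sketch: hold out part of the training data and stop
# when the validation loss stops improving.
early_stop = tf.keras.callbacks.EarlyStopping(monitor='val_loss',
                                              patience=10,
                                              restore_best_weights=True)
history = model.fit(x_train, y_train,
                    batch_size=128, epochs=100, verbose=1,
                    validation_split=0.2,
                    callbacks=[early_stop])

Let's take a look at the plots.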
import matplotlib.pyplot as plt

# List all data in history
print(history.history.keys())

plt.plot(history.history['acc'])
plt.title('Accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train'], loc='upper left')
plt.show()

plt.plot(history.history['loss'])
plt.title('Model loss')
plt.ylabel('Loss')
plt.xlabel('Epoch')
plt.legend(['train'], loc='upper left')  # only the training curve is plotted
plt.show()
# Create model using dropout
model = tf.keras.Sequential([
    tf.keras.layers.Dense(26, activation='relu', input_shape=(13,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(52, activation='relu'),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(104, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax')
])
model.summary()
Model: "sequential_1" _________________________________________________________________ Layer (type) Output Shape Param # ================================================================= dense_4 (Dense) (None, 26) 364 _________________________________________________________________ dropout (Dropout) (None, 26) 0 _________________________________________________________________ dense_5 (Dense) (None, 52) 1404 _________________________________________________________________ dropout_1 (Dropout) (None, 52) 0 _________________________________________________________________ dense_6 (Dense) (None, 104) 5512 _________________________________________________________________ dense_7 (Dense) (None, 3) 315 ================================================================= Total params: 7,595 Trainable params: 7,595 Non-trainable params: 0 _________________________________________________________________
model.compile(loss='sparse_categorical_crossentropy',
              optimizer='adam',
              metrics=['acc'])
history = model.fit(x_train, y_train, batch_size=128, epochs=100, verbose=1)
Epoch 1/100
2/2 [==============================] - 0s 2ms/step - loss: 78.4047 - acc: 0.3732
Epoch 2/100
2/2 [==============================] - 0s 3ms/step - loss: 72.1076 - acc: 0.3944
Epoch 3/100
2/2 [==============================] - 0s 2ms/step - loss: 50.2278 - acc: 0.3803
...
Epoch 98/100
2/2 [==============================] - 0s 2ms/step - loss: 5.5881 - acc: 0.4296
Epoch 99/100
2/2 [==============================] - 0s 2ms/step - loss: 6.5911 - acc: 0.3592
Epoch 100/100
2/2 [==============================] - 0s 2ms/step - loss: 7.3447 - acc: 0.4225
And it got even worse. As expected, there is no need for dropout here. I think it is better to use scikit-learn for data this small. Thanks.