Gyeonggi-do AI Development Course / Python

[Python] Predicting Boston Housing Prices and Iris Species with TensorFlow Hidden Layers

agingcurve 2022. 7. 11. 12:21

Predicting Boston Housing Prices with a TensorFlow Hidden Layer

  • Let's predict Boston housing prices with TensorFlow.
  • Independent variables: crim, zn, indus, chas, nox, rm, age, dis, rad, tax, ptratio, b, lstat
  • Dependent variable: medv
  • medv is the median value of the homes in each town.
  • We use a regression algorithm and make the prediction with deep learning.
  • This time, we add a hidden layer to the price-prediction model.
  • Hidden layer: a layer inserted between the input layer and the output layer; the model stacks one extra layer of nodes between input and output.
  • For example, a hidden layer with 5 nodes requires 5 perceptrons.
  • Here we stack just a single hidden layer.
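The parameter count of such a model can be worked out by hand: each hidden node gets one weight per input feature plus one bias. A quick sanity check of the arithmetic (plain Python, no TensorFlow needed):

```python
# Parameter count of a Dense layer: (inputs x nodes) weights + one bias per node
n_inputs, n_hidden, n_outputs = 13, 10, 1

hidden_params = n_inputs * n_hidden + n_hidden    # 13*10 + 10 = 140
output_params = n_hidden * n_outputs + n_outputs  # 10*1  + 1  = 11

print(hidden_params, output_params, hidden_params + output_params)  # 140 11 151
```

These totals match what model.summary() reports later in this post.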

Classifying Iris Species with TensorFlow Hidden Layers

  • We want to classify the iris (붓꽃) dataset.
  • Independent variables: 꽃잎길이 (petal length), 꽃잎폭 (petal width), 꽃받침길이 (sepal length), 꽃받침폭 (sepal width)
  • Dependent variable: 품종 (species)
  • Because this is a classification problem, the categorical dependent variable must be one-hot (dummy) encoded.
  • Activation function: softmax, which turns the outputs into probabilities.
  • Here we stack three hidden layers.
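Softmax maps the raw outputs of the last layer to positive values that sum to 1, so they can be read as class probabilities. A minimal NumPy sketch (the logit values here are made up for illustration):

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])  # hypothetical raw scores for 3 classes
probs = softmax(logits)

print(probs)           # each value lies in (0, 1)
print(probs.argmax())  # index 0 has the largest score, hence the highest probability
```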
In [1]:
import pandas as pd
import tensorflow as tf
import numpy as np
In [2]:
파일경로 = 'https://raw.githubusercontent.com/blackdew/tensorflow1/master/csv/boston.csv'
보스턴 = pd.read_csv(파일경로)
In [13]:
파일경로 = 'https://raw.githubusercontent.com/blackdew/tensorflow1/master/csv/iris.csv'
아이리스 = pd.read_csv(파일경로)

Boston Housing Price Prediction

In [3]:
print(보스턴.columns)
보스턴.head()
Index(['crim', 'zn', 'indus', 'chas', 'nox', 'rm', 'age', 'dis', 'rad', 'tax',
       'ptratio', 'b', 'lstat', 'medv'],
      dtype='object')
Out[3]:
      crim    zn  indus  chas    nox     rm   age     dis  rad  tax  ptratio       b  lstat  medv
0  0.00632  18.0   2.31     0  0.538  6.575  65.2  4.0900    1  296     15.3  396.90   4.98  24.0
1  0.02731   0.0   7.07     0  0.469  6.421  78.9  4.9671    2  242     17.8  396.90   9.14  21.6
2  0.02729   0.0   7.07     0  0.469  7.185  61.1  4.9671    2  242     17.8  392.83   4.03  34.7
3  0.03237   0.0   2.18     0  0.458  6.998  45.8  6.0622    3  222     18.7  394.63   2.94  33.4
4  0.06905   0.0   2.18     0  0.458  7.147  54.2  6.0622    3  222     18.7  396.90   5.33  36.2

Separating the independent and dependent variables

In [4]:
독립 = 보스턴[['crim', 'zn', 'indus', 'chas', 'nox', 'rm', 'age', 'dis', 'rad', 'tax',
       'ptratio', 'b', 'lstat']]
종속 = 보스턴[["medv"]]
In [5]:
print(독립.shape, 종속.shape)
(506, 13) (506, 1)

Building the model structure (one hidden layer)

In [6]:
X = tf.keras.layers.Input(shape=[13])
H = tf.keras.layers.Dense(10, activation = "swish")(X)
Y = tf.keras.layers.Dense(1)(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss="mse")
In [7]:
# Check the model structure
model.summary()
Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_1 (InputLayer)        [(None, 13)]              0         
                                                                 
 dense (Dense)               (None, 10)                140       
                                                                 
 dense_1 (Dense)             (None, 1)                 11        
                                                                 
=================================================================
Total params: 151
Trainable params: 151
Non-trainable params: 0
_________________________________________________________________
In [8]:
# Train the model on the data
model.fit(독립, 종속, epochs=100)
Epoch 1/100
16/16 [==============================] - 0s 732us/step - loss: 26327.2168
Epoch 2/100
16/16 [==============================] - 0s 652us/step - loss: 13436.7129
Epoch 3/100
16/16 [==============================] - 0s 598us/step - loss: 7715.2832
...
Epoch 99/100
16/16 [==============================] - 0s 598us/step - loss: 39.0122
Epoch 100/100
16/16 [==============================] - 0s 665us/step - loss: 36.8732
Out[8]:
<keras.callbacks.History at 0x1bb8fc679d0>
In [9]:
# Use the model to predict
print(model.predict(독립[:5]))
print(종속[:5])
[[31.611774]
 [26.954277]
 [29.4788  ]
 [29.359203]
 [28.425667]]
   medv
0  24.0
1  21.6
2  34.7
3  33.4
4  36.2
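Because the model was compiled with loss="mse", the reported loss can be reproduced by hand. A NumPy sketch using the five predictions and targets printed above (note this averages over only 5 rows, so it merely approximates the full-data loss):

```python
import numpy as np

# First-five predictions and true medv values copied from the printed output
pred = np.array([31.611774, 26.954277, 29.4788, 29.359203, 28.425667])
true = np.array([24.0, 21.6, 34.7, 33.4, 36.2])

mse = np.mean((pred - true) ** 2)  # mean squared error over these 5 rows
print(round(mse, 2))  # roughly 38, the same ballpark as the final epoch loss
```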
In [12]:
# Inspect the model's weights (its learned formula)
print(model.get_weights())
[array([[ 3.3245251e-01,  1.0988521e-01,  1.1101019e-01, -1.7416896e-01,
        -1.2876163e-01,  4.3043011e-01, -3.0944335e-01, -3.8369218e-01,
         1.7854728e-01, -1.4274819e-01],
       [ 1.2754864e-01,  1.3259703e-01,  6.1892256e-02,  3.6035109e-02,
        -3.8911682e-01, -2.5312856e-01,  4.6911547e-01, -2.9607448e-01,
        -1.5533793e-02,  2.5129177e-02],
       [-3.0209079e-01, -4.9431354e-01, -3.8058403e-01,  1.7640278e-01,
         2.9478657e-01,  3.3800131e-01,  3.8129285e-01, -3.2706574e-02,
        -7.6932758e-02, -3.5227850e-01],
       [-8.6337060e-01,  4.8707899e-02,  9.3200862e-01, -9.2294765e-01,
        -5.0375539e-01,  5.0701302e-01,  8.3834529e-01,  1.1029038e+00,
         4.3934995e-01, -2.8047821e-01],
       [-9.5416620e-02, -7.6460950e-02,  3.8516748e-01,  7.3103063e-02,
         2.7310044e-01, -4.2848027e-01,  2.3440103e-01, -2.5112621e-02,
         2.5381921e-02,  4.2854768e-01],
       [ 9.6862167e-02,  4.1507627e-03,  3.1982914e-01, -6.7877698e-01,
         1.5494627e-01,  4.9266583e-01, -7.9922773e-02,  2.9438335e-01,
         5.3284788e-01, -7.1749568e-02],
       [-2.0784123e-01, -9.9493444e-02,  2.2725922e-01, -2.8962451e-01,
         9.1701508e-02, -4.5276868e-01,  3.3928341e-01, -9.3295395e-02,
        -1.5507750e-02,  8.3057443e-05],
       [ 1.6504687e-01, -6.9693792e-01, -8.3533889e-01,  8.8298768e-02,
         2.5216803e-01, -3.5490805e-01, -7.7157277e-01, -1.4981267e-01,
        -2.2698294e-01, -4.0516520e-01],
       [-4.9801845e-02, -1.3625306e-01,  3.2507142e-01,  4.2427799e-01,
        -2.0052855e-01, -2.7404052e-01, -2.1450582e-01,  6.5812424e-02,
        -1.8650414e-01,  2.2661595e-01],
       [ 5.8594555e-02,  1.8929440e-01,  3.6961949e-01,  4.3627780e-02,
         4.8772979e-01, -3.3931524e-01,  9.1169558e-02,  2.4337918e-01,
        -8.8033434e-03, -4.6037668e-01],
       [-6.4908862e-02, -1.6524553e-01, -3.4110674e-01,  3.7266046e-01,
        -2.2949973e-01, -1.6166589e-01, -4.1413659e-01, -2.7708641e-01,
         3.0013180e-01, -2.1090658e-01],
       [ 4.8540726e-01, -2.4107383e-01,  4.6429098e-01,  4.3627375e-01,
         1.8313807e-01, -5.0825745e-01,  2.0175559e-02,  2.1854751e-01,
         1.6839325e-01,  2.3896493e-01],
       [ 2.9405817e-01, -2.8048506e-01, -1.9925767e-01,  3.5444263e-02,
         6.1701965e-01,  1.7088240e-01, -3.9511999e-01, -8.4584451e-01,
         2.2413298e-01, -2.5673485e-01]], dtype=float32), array([ 0.00034696, -0.11694708, -0.01176737, -0.00680507,  0.01170198,
        0.        , -0.01114792, -0.01120993,  0.12635516, -0.04369785],
      dtype=float32), array([[-0.13675573],
       [ 0.31583285],
       [ 0.18055497],
       [-0.11585219],
       [-0.41009825],
       [ 0.36333317],
       [ 0.18611383],
       [ 0.5206608 ],
       [ 0.43113285],
       [-1.0553864 ]], dtype=float32), array([-0.01115046], dtype=float32)]
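The weights above define the model's formula: the output is a linear layer applied to swish(xW1 + b1). A NumPy sketch of that forward pass, using small random matrices in place of the actual fitted weights (the shapes match the model: 13 → 10 → 1):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights standing in for model.get_weights()
W1, b1 = rng.normal(size=(13, 10)), np.zeros(10)
W2, b2 = rng.normal(size=(10, 1)), np.zeros(1)

def swish(z):
    return z / (1.0 + np.exp(-z))  # swish(z) = z * sigmoid(z)

x = rng.normal(size=(1, 13))  # one sample with 13 features
h = swish(x @ W1 + b1)        # hidden layer of 10 nodes
y = h @ W2 + b2               # linear output, shape (1, 1)
print(y.shape)
```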

Iris Species Classification

In [14]:
아이리스 = pd.get_dummies(아이리스)
아이리스.columns
Out[14]:
Index(['꽃잎길이', '꽃잎폭', '꽃받침길이', '꽃받침폭', '품종_setosa', '품종_versicolor',
       '품종_virginica'],
      dtype='object')

Creating the dependent and independent variables

In [15]:
독립 = 아이리스[["꽃잎길이", "꽃잎폭", "꽃받침길이", "꽃받침폭"]]
종속 = 아이리스[['품종_setosa', '품종_versicolor','품종_virginica']]
print(독립.shape, 종속.shape)
(150, 4) (150, 3)

Building the model structure (three hidden layers)

In [16]:
X = tf.keras.layers.Input(shape=[4])
H = tf.keras.layers.Dense(8, activation="swish")(X)
H = tf.keras.layers.Dense(8, activation="swish")(H)
H = tf.keras.layers.Dense(8, activation="swish")(H)
Y = tf.keras.layers.Dense(3, activation="softmax")(H)
model = tf.keras.models.Model(X, Y)
model.compile(loss="categorical_crossentropy", metrics="accuracy")
In [17]:
# Check the model structure
model.summary()
Model: "model_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_2 (InputLayer)        [(None, 4)]               0         
                                                                 
 dense_2 (Dense)             (None, 8)                 40        
                                                                 
 dense_3 (Dense)             (None, 8)                 72        
                                                                 
 dense_4 (Dense)             (None, 8)                 72        
                                                                 
 dense_5 (Dense)             (None, 3)                 27        
                                                                 
=================================================================
Total params: 211
Trainable params: 211
Non-trainable params: 0
_________________________________________________________________
In [18]:
# Train the model
model.fit(독립, 종속, epochs=100)
Epoch 1/100
5/5 [==============================] - 0s 1ms/step - loss: 1.1253 - accuracy: 0.2800
Epoch 2/100
5/5 [==============================] - 0s 1ms/step - loss: 1.0852 - accuracy: 0.3667
Epoch 3/100
5/5 [==============================] - 0s 748us/step - loss: 1.0578 - accuracy: 0.4800
...
Epoch 99/100
5/5 [==============================] - 0s 998us/step - loss: 0.1299 - accuracy: 0.9667
Epoch 100/100
5/5 [==============================] - 0s 748us/step - loss: 0.1279 - accuracy: 0.9733
Out[18]:
<keras.callbacks.History at 0x1bba6026f40>
In [19]:
# Use the model to predict
print(model.predict(독립[:5]))
print(종속[:5])
[[9.9617094e-01 3.8290883e-03 1.1422100e-10]
 [9.9107856e-01 8.9214463e-03 1.3839503e-09]
 [9.9404329e-01 5.9567560e-03 6.1645972e-10]
 [9.9185210e-01 8.1478627e-03 1.4671592e-09]
 [9.9672049e-01 3.2794459e-03 8.4045430e-11]]
   품종_setosa  품종_versicolor  품종_virginica
0          1              0             0
1          1              0             0
2          1              0             0
3          1              0             0
4          1              0             0
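Each prediction row is a probability distribution over the three species, so the predicted class is simply the argmax of the row. A small sketch using the five softmax outputs printed above:

```python
import numpy as np

classes = ['setosa', 'versicolor', 'virginica']

# First-five softmax outputs copied from the prediction above
probs = np.array([
    [9.9617094e-01, 3.8290883e-03, 1.1422100e-10],
    [9.9107856e-01, 8.9214463e-03, 1.3839503e-09],
    [9.9404329e-01, 5.9567560e-03, 6.1645972e-10],
    [9.9185210e-01, 8.1478627e-03, 1.4671592e-09],
    [9.9672049e-01, 3.2794459e-03, 8.4045430e-11],
])

labels = [classes[i] for i in probs.argmax(axis=1)]
print(labels)  # all five rows are most confident in 'setosa'
```

This agrees with the one-hot targets above: the first five rows of the dataset are all 품종_setosa.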