Gyeonggi-do AI Development Course / Python

[Python] Predicting Boston Housing Prices with TensorFlow

agingcurve 2022. 7. 11. 09:58
import pandas as pd
import tensorflow as tf
import numpy as np

Predicting Boston Housing Prices

  • Let's predict Boston housing prices using TensorFlow
  • Independent variables: crim, zn, indus, chas, nox, rm, age, dis, rad, tax, ptratio, b, lstat
  • Dependent variable: medv
  • medv is the median value of the homes in each town
  • We make predictions via deep learning, using a regression algorithm
In [22]:
파일경로 = 'https://raw.githubusercontent.com/blackdew/tensorflow1/master/csv/boston.csv'
보스턴 = pd.read_csv(파일경로)
In [23]:
print(보스턴.columns)
보스턴.head()
Index(['crim', 'zn', 'indus', 'chas', 'nox', 'rm', 'age', 'dis', 'rad', 'tax',
       'ptratio', 'b', 'lstat', 'medv'],
      dtype='object')
Out[23]:
   crim     zn    indus  chas  nox    rm     age   dis     rad  tax  ptratio  b       lstat  medv
0  0.00632  18.0  2.31   0     0.538  6.575  65.2  4.0900  1    296  15.3     396.90  4.98   24.0
1  0.02731  0.0   7.07   0     0.469  6.421  78.9  4.9671  2    242  17.8     396.90  9.14   21.6
2  0.02729  0.0   7.07   0     0.469  7.185  61.1  4.9671  2    242  17.8     392.83  4.03   34.7
3  0.03237  0.0   2.18   0     0.458  6.998  45.8  6.0622  3    222  18.7     394.63  2.94   33.4
4  0.06905  0.0   2.18   0     0.458  7.147  54.2  6.0622  3    222  18.7     396.90  5.33   36.2

Separating the independent and dependent variables

In [25]:
독립 = 보스턴[['crim', 'zn', 'indus', 'chas', 'nox', 'rm', 'age', 'dis', 'rad', 'tax',
       'ptratio', 'b', 'lstat']]
종속 = 보스턴[["medv"]]
In [26]:
print(독립.shape, 종속.shape)
(506, 13) (506, 1)

Building the model structure

In [27]:
X = tf.keras.layers.Input(shape=[13])
Y = tf.keras.layers.Dense(1)(X)
model = tf.keras.models.Model(X, Y)
model.compile(loss="mse")
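
A single `Dense(1)` unit on 13 inputs is just linear regression: the prediction is `x @ w + b`, with 13 weights and 1 bias (14 trainable parameters), and `loss="mse"` is the mean squared error over a batch. A minimal NumPy sketch of that forward pass and loss (the weights and sample values here are illustrative, not the trained ones):

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(13, 1))   # 13 weights, one per feature (illustrative values)
b = np.zeros(1)                # 1 bias
print(w.size + b.size)         # 14 trainable parameters, matching the Keras model

x = rng.normal(size=(5, 13))   # a batch of 5 fake samples
y = rng.normal(size=(5, 1))
y_hat = x @ w + b              # what Dense(1) computes in the forward pass

# loss="mse": mean squared error between targets and predictions
mse = np.mean((y - y_hat) ** 2)
print(y_hat.shape)             # (5, 1) -- one prediction per sample
```

Training then just nudges `w` and `b` to shrink this MSE, which is exactly what the `model.fit` calls below do.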

Training the model

In [28]:
model.fit(독립, 종속, epochs=10)
Epoch 1/10
16/16 [==============================] - 1s 2ms/step - loss: 28793.9883
Epoch 2/10
16/16 [==============================] - 0s 2ms/step - loss: 23754.8086
Epoch 3/10
16/16 [==============================] - 0s 2ms/step - loss: 19923.2852
Epoch 4/10
16/16 [==============================] - 0s 2ms/step - loss: 16577.0488
Epoch 5/10
16/16 [==============================] - 0s 2ms/step - loss: 13682.0469
Epoch 6/10
16/16 [==============================] - 0s 2ms/step - loss: 11181.2148
Epoch 7/10
16/16 [==============================] - 0s 2ms/step - loss: 9052.7422
Epoch 8/10
16/16 [==============================] - 0s 2ms/step - loss: 7310.6748
Epoch 9/10
16/16 [==============================] - 0s 2ms/step - loss: 5977.0327
Epoch 10/10
16/16 [==============================] - 0s 2ms/step - loss: 4962.4990
Out[28]:
<keras.callbacks.History at 0x7f96173e10d0>

Let's train for 100 more epochs

In [29]:
model.fit(독립, 종속, epochs=100)
Epoch 1/100
16/16 [==============================] - 0s 1ms/step - loss: 4260.5791
Epoch 2/100
16/16 [==============================] - 0s 1ms/step - loss: 3786.9917
Epoch 3/100
16/16 [==============================] - 0s 2ms/step - loss: 3425.4236
...
Epoch 98/100
16/16 [==============================] - 0s 1ms/step - loss: 44.9781
Epoch 99/100
16/16 [==============================] - 0s 2ms/step - loss: 45.0291
Epoch 100/100
16/16 [==============================] - 0s 1ms/step - loss: 44.7501
Out[29]:
<keras.callbacks.History at 0x7f961625ff10>

Let's train for 1,000 more epochs (silently), then 10 more

In [30]:
model.fit(독립, 종속, epochs=1000, verbose=0)
model.fit(독립, 종속, epochs=10)
Epoch 1/10
16/16 [==============================] - 0s 1ms/step - loss: 26.2411
Epoch 2/10
16/16 [==============================] - 0s 1ms/step - loss: 26.1759
Epoch 3/10
16/16 [==============================] - 0s 1ms/step - loss: 26.4039
Epoch 4/10
16/16 [==============================] - 0s 1ms/step - loss: 26.5664
Epoch 5/10
16/16 [==============================] - 0s 1ms/step - loss: 26.4499
Epoch 6/10
16/16 [==============================] - 0s 1ms/step - loss: 26.3733
Epoch 7/10
16/16 [==============================] - 0s 1ms/step - loss: 26.2376
Epoch 8/10
16/16 [==============================] - 0s 2ms/step - loss: 25.8005
Epoch 9/10
16/16 [==============================] - 0s 2ms/step - loss: 26.2990
Epoch 10/10
16/16 [==============================] - 0s 2ms/step - loss: 26.4971
Out[30]:
<keras.callbacks.History at 0x7f960e2883d0>
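
After 1,000+ extra epochs the loss has flattened out around 26. One common way to speed up convergence for a linear model like this (not done in this post) is to standardize each feature to zero mean and unit variance before training, since the raw columns have very different scales (`tax` is in the hundreds while `nox` is below 1). A hedged sketch with plain NumPy, using a small illustrative matrix in place of 독립:

```python
import numpy as np

# Illustrative stand-in for the 독립 feature matrix (any float array works).
X = np.array([[0.00632, 18.0, 296.0],
              [0.02731,  0.0, 242.0],
              [0.02729,  0.0, 242.0]])

# Standardize each column: subtract its mean, divide by its standard deviation.
mean = X.mean(axis=0)
std = X.std(axis=0)
std[std == 0] = 1.0            # guard against constant columns
X_scaled = (X - mean) / std

print(X_scaled.mean(axis=0))   # each column now has mean ~0
print(X_scaled.std(axis=0))    # and std ~1
```

With features on a common scale, gradient descent no longer has to balance step sizes across wildly different column magnitudes, so the loss typically drops much faster per epoch.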
In [31]:
print(model.predict(독립[0:5]))
[[29.394705]
 [24.690996]
 [30.410822]
 [29.720282]
 [29.090536]]
In [32]:
종속[0:5]
Out[32]:
   medv
0  24.0
1  21.6
2  34.7
3  33.4
4  36.2

Checking the model weights

In [33]:
print(model.get_weights())
[array([[-0.08651359],
       [ 0.07526172],
       [-0.06145329],
       [ 3.3342063 ],
       [ 1.8765093 ],
       [ 4.032421  ],
       [ 0.01375858],
       [-0.9366195 ],
       [ 0.14706182],
       [-0.0098104 ],
       [ 0.02117945],
       [ 0.01604049],
       [-0.57715064]], dtype=float32), array([2.533991], dtype=float32)]
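
Since the model is linear, a prediction is just the dot product of a row's features with these weights, plus the bias. As a sanity check, recomputing the first row of the dataset (from the `head()` output above) against the printed weights reproduces the first `model.predict` value, about 29.39:

```python
import numpy as np

# Weights and bias as printed by model.get_weights() above.
w = np.array([-0.08651359, 0.07526172, -0.06145329, 3.3342063, 1.8765093,
              4.032421, 0.01375858, -0.9366195, 0.14706182, -0.0098104,
              0.02117945, 0.01604049, -0.57715064])
b = 2.533991

# First row of the dataset (crim ... lstat), from 보스턴.head() above.
x = np.array([0.00632, 18.0, 2.31, 0.0, 0.538, 6.575, 65.2, 4.0900,
              1.0, 296.0, 15.3, 396.90, 4.98])

y_hat = x @ w + b
print(y_hat)   # ~29.3947, matching model.predict(독립[0:5])[0]
```

The largest-magnitude weights sit on rm (~4.03) and chas (~3.33), but note that because the features were never standardized, comparing raw weight sizes across columns is not a reliable measure of importance.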