💿 Data/Miscellaneous

    [๋”ฅ๋Ÿฌ๋‹]ํ•˜์ดํผ ํŒŒ๋ผ๋ฏธํ„ฐ ํŠœ๋‹(sklearn์˜ RandomizedSearchCV, keras_tuner์˜ RandomSearch)

    Hyperparameter tuning with RandomSearch
    0. Loading the data and normalization
    # load the data
    from tensorflow.keras.datasets import mnist
    (X_train, y_train), (X_test, y_test) = mnist.load_data()
    # check the input and target data
    X_train.shape
    set(y_train)
    # Normalization
    X_train = X_train / 255.
    X_test = X_test / 255.
    1. Modeling
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense, Flatten, Dropout..
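Since the title also mentions sklearn's RandomizedSearchCV, here is a minimal sketch of the same random-search idea on a small synthetic dataset; the dataset, estimator, and parameter grid below are illustrative assumptions, not taken from the post itself:

```python
# Random hyperparameter search with sklearn's RandomizedSearchCV.
# Instead of trying every combination (grid search), it samples
# n_iter random combinations from param_distributions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Small made-up classification dataset
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Candidate hyperparameter values (illustrative)
param_distributions = {
    "n_estimators": [50, 100, 200],
    "max_depth": [3, 5, None],
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions,
    n_iter=5,        # sample only 5 random combinations
    cv=3,            # 3-fold cross-validation per combination
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)  # the best sampled combination
```

keras_tuner's RandomSearch follows the same principle, but samples the values declared inside a model-building function instead of a dict of lists.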

    [๋”ฅ๋Ÿฌ๋‹]keras_cifar100 ์ด์šฉํ•œ ๊ฐ„๋‹จ ์‹ ๊ฒฝ๋ง ๋ฐ ๊ณผ์ ํ•ฉ ๋ฐฉ์ง€, ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ ํŠœ๋‹

    0) Loading and checking the data, normalization
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense, Flatten, Dropout
    from tensorflow.keras.optimizers import Adam
    from tensorflow.keras import regularizers
    # load the data
    from tensorflow.keras.datasets import cifar100
    (X_train, y_train), (X_test, y_test) = cifar100.load_data()
    # data shape..
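The overfitting-prevention tools the excerpt imports (Dropout and kernel regularizers) can be wired into a small Sequential model like this; the layer sizes and rates are placeholder assumptions, not the post's actual architecture:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten, Dropout
from tensorflow.keras import regularizers

# Small dense model for 32x32x3 CIFAR-100 images (sizes are illustrative)
model = Sequential([
    Flatten(input_shape=(32, 32, 3)),
    # L2 weight penalty discourages large weights, reducing overfitting
    Dense(256, activation="relu",
          kernel_regularizer=regularizers.l2(1e-4)),
    # Dropout randomly zeroes 30% of activations during training only
    Dropout(0.3),
    Dense(100, activation="softmax"),  # CIFAR-100 has 100 classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```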

    [๋”ฅ๋Ÿฌ๋‹]๊ฐ„๋‹จ ์‹ ๊ฒฝ๋ง ๋ฐ ๋จธ์‹ ๋Ÿฌ๋‹ ๋ชจ๋ธ๋ง, ์„ฑ๋Šฅ ๋น„๊ต

    0) Checking and preprocessing the data
    # load the data
    import tensorflow as tf
    boston_housing = tf.keras.datasets.boston_housing
    (X_train, y_train), (X_test, y_test) = boston_housing.load_data()
    # check the dataset shape
    X_train.shape
    # scale the data features
    from sklearn.preprocessing import StandardScaler
    scaler = StandardScaler()
    X_train_scaled = scaler.fit_transform(X_train)
    X_test_scaled = scaler.transform(X_test)
    1) Neural network model
    model = ..
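One detail worth flagging in the excerpt: the scaler is fit on the training set only (fit_transform), and the same learned statistics are reused on the test set (transform), which avoids leaking test-set information. A small numeric sketch with made-up data:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_train = rng.normal(loc=10.0, scale=2.0, size=(100, 3))  # made-up features
X_test = rng.normal(loc=10.0, scale=2.0, size=(20, 3))

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)  # learns mean/std from train
X_test_scaled = scaler.transform(X_test)        # reuses the train statistics

# Train features are now zero-mean, unit-variance per column
print(X_train_scaled.mean(axis=0).round(6))
print(X_train_scaled.std(axis=0).round(6))
```

Calling fit_transform on the test set instead would give the test split its own mean and std, silently changing what "0" means between the two sets.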

    [๋”ฅ๋Ÿฌ๋‹]์˜ตํ‹ฐ๋งˆ์ด์ €(Optimizer)

    Have you heard of a sherpa? "Sherpa" is a Tibetan word used here in the sense of "someone whose job is to guide a climb." If we had to scale a truly treacherous mountain, not an ordinary local one, we would need a good guide: someone to tell us where base camp is, how many kilometers to cover each day, how to pace the team depending on its members, and so on. Gradient descent feels closer to descending than to climbing, but in the same way, the optimizer plays the sherpa's role as we search for the optimum. Each guide teaches the route under slightly different conditions, and depending on the guide we may or may not complete the climb safely. Types of optimizers: GD (Gradient Des..
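The "descent" in the sherpa analogy is plain gradient descent (the GD the excerpt starts to list). A minimal numpy-free sketch minimizing f(w) = (w - 3)^2, where the starting point, learning rate, and step count are all illustrative choices:

```python
def grad(w):
    """Gradient of f(w) = (w - 3)^2, whose minimum is at w = 3."""
    return 2 * (w - 3)

w = 0.0    # arbitrary starting point on the "mountain"
lr = 0.1   # learning rate: how big each downhill step is
for _ in range(100):
    w -= lr * grad(w)  # step in the direction opposite the gradient

print(round(w, 4))  # prints 3.0: converged to the minimum
```

Fancier optimizers (SGD with momentum, RMSProp, Adam, ...) keep this same update loop but adapt the step using gradient history, i.e., different guides with different pacing rules.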

    [๋”ฅ๋Ÿฌ๋‹]์†์‹ค ํ•จ์ˆ˜

    What is a loss function? Loss function = cost function (cost). When an input value (x) is passed through a model F(w), the result is an output value (y_pred; the prediction). The model's ultimate goal is to produce predictions as close as possible to the actual values (y_true; the ground truth, the observations). The function that measures the difference between the predicted and actual values is the loss function. In other words, it is a metric of how well the model we designed predicts from its inputs. Just as each sport plays by its own rules, we use different loss functions depending on the model we design (more precisely, on the kind of problem we want to solve). Types of loss functions: the problems we deal with fall broadly into three categories: 1) regression, 2) binary classification, 3) multi-class classi..
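As a concrete instance of "different problems, different losses," MSE (typical for regression) and binary cross-entropy (typical for binary classification) can be computed by hand; the sample values are made up for illustration:

```python
import numpy as np

# Regression: mean squared error between y_true and y_pred
y_true = np.array([3.0, 5.0, 2.0])
y_pred = np.array([2.5, 5.0, 3.0])
mse = np.mean((y_true - y_pred) ** 2)
print(mse)  # (0.25 + 0.0 + 1.0) / 3 = 0.41666...

# Binary classification: binary cross-entropy
t = np.array([1.0, 0.0, 1.0])   # true labels
p = np.array([0.9, 0.2, 0.8])   # predicted probabilities of class 1
bce = -np.mean(t * np.log(p) + (1 - t) * np.log(1 - p))
print(round(bce, 4))  # low because each prediction favors the true label
```

Multi-class problems use categorical cross-entropy, the same idea extended over all classes with a softmax output.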