■ Shows how to build a multilayer perceptron neural network with the MLPClassifier class.
▶ main.py
import sklearn.datasets as datasets
import sklearn.model_selection as model_selection
import sklearn.neural_network as neural_network
import sklearn.preprocessing as preprocessing

print("Fetching the MNIST data...")

mnistBunch = datasets.fetch_openml("mnist_784")

print("Finished downloading the MNIST data.")

imageNDArray = mnistBunch.data
labelNDArray = mnistBunch.target

imageNDArray = preprocessing.normalize(imageNDArray, norm = "l2") # "l1" normalization can also be used.

trainImageNDArray, testImageNDArray, trainLabelNDArray, testLabelNDArray = model_selection.train_test_split(imageNDArray,
    labelNDArray, test_size = 0.25, random_state = 17)

mlpClassifier = neural_network.MLPClassifier(hidden_layer_sizes = (300, 300, 300), max_iter = 50, solver = "sgd",
    learning_rate_init = 0.01, verbose = True)

print("Starting neural network training...")

mlpClassifier.fit(trainImageNDArray, trainLabelNDArray)

print("Finished neural network training.")

print("Network performance : %f" % mlpClassifier.score(testImageNDArray, testLabelNDArray))

"""
Fetching the MNIST data...
Finished downloading the MNIST data.
Starting neural network training...
Iteration 1, loss = 2.01495220
Iteration 2, loss = 0.69619084
Iteration 3, loss = 0.42475069
Iteration 4, loss = 0.35455444
Iteration 5, loss = 0.31833714
Iteration 6, loss = 0.29176438
Iteration 7, loss = 0.26984975
Iteration 8, loss = 0.24962781
Iteration 9, loss = 0.23111968
Iteration 10, loss = 0.21396440
Iteration 11, loss = 0.19797201
Iteration 12, loss = 0.18409881
Iteration 13, loss = 0.17178327
Iteration 14, loss = 0.16095801
Iteration 15, loss = 0.15054954
Iteration 16, loss = 0.14160059
Iteration 17, loss = 0.13336125
Iteration 18, loss = 0.12576072
Iteration 19, loss = 0.11900718
Iteration 20, loss = 0.11208487
Iteration 21, loss = 0.10663341
Iteration 22, loss = 0.10152752
Iteration 23, loss = 0.09642898
Iteration 24, loss = 0.09181761
Iteration 25, loss = 0.08624251
Iteration 26, loss = 0.08296494
Iteration 27, loss = 0.07905917
Iteration 28, loss = 0.07541192
Iteration 29, loss = 0.07185268
Iteration 30, loss = 0.06843048
Iteration 31, loss = 0.06545506
Iteration 32, loss = 0.06233578
Iteration 33, loss = 0.05947011
Iteration 34, loss = 0.05644594
Iteration 35, loss = 0.05451590
Iteration 36, loss = 0.05169441
Iteration 37, loss = 0.04988134
Iteration 38, loss = 0.04732568
Iteration 39, loss = 0.04542914
Iteration 40, loss = 0.04318924
Iteration 41, loss = 0.04146527
Iteration 42, loss = 0.03971327
Iteration 43, loss = 0.03743316
Iteration 44, loss = 0.03632515
Iteration 45, loss = 0.03451298
Iteration 46, loss = 0.03323524
Iteration 47, loss = 0.03185525
Iteration 48, loss = 0.03023780
Iteration 49, loss = 0.02904490
Iteration 50, loss = 0.02757689
D:\TestProject\TestProject\env\lib\site-packages\sklearn\neural_network\_multilayer_perceptron.py:702: ConvergenceWarning: Stochastic Optimizer: Maximum iterations (50) reached and the optimization hasn't converged yet.
  warnings.warn(
Finished neural network training.
Network performance : 0.976457
"""
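The fitted classifier can also be queried directly. The sketch below is an illustrative assumption, not part of main.py: it expects mlpClassifier, testImageNDArray, and testLabelNDArray from the script above to still be in scope, predicts a handful of test digits with predict(), and reads loss_curve_, the per-iteration losses behind the log shown above.

import numpy as np

# Follow-on sketch (assumes main.py above has just run in the same session).
# Predict the first five test digits and compare them with the true labels.
predictedLabelNDArray = mlpClassifier.predict(testImageNDArray[:5])
print("predicted labels :", list(predictedLabelNDArray))
print("actual labels    :", list(np.asarray(testLabelNDArray)[:5]))

# For the "sgd" and "adam" solvers, loss_curve_ stores one loss value per iteration.
print("iterations run :", len(mlpClassifier.loss_curve_))
print("final loss     :", mlpClassifier.loss_curve_[-1])

The ConvergenceWarning in the log simply means the optimizer stopped at max_iter = 50 before the loss had plateaued; raising max_iter or passing early_stopping = True to MLPClassifier are common ways to address it (neither is used by main.py above).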
▶ requirements.txt
cycler==0.11.0
fonttools==4.34.4
joblib==1.1.0
kiwisolver==1.4.4
matplotlib==3.5.3
numpy==1.23.1
packaging==21.3
pandas==1.4.3
Pillow==9.2.0
pip==22.0.4
pyparsing==3.0.9
python-dateutil==2.8.2
pytz==2022.2
scikit-learn==1.1.2
scipy==1.9.0
setuptools==58.1.0
six==1.16.0
sklearn==0.0
threadpoolctl==3.1.0
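Before running main.py, it can be worth confirming that the pinned packages above are actually installed in the active environment. A minimal check, assuming Python 3.8 or later so that importlib.metadata is available (this snippet is an illustration, not one of the book's files):

import importlib.metadata

# Print the installed versions of the packages main.py relies on most directly.
for packageName in ["scikit-learn", "numpy"]:
    print(packageName, importlib.metadata.version(packageName))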