Table of Contents:
- I. What Is Classification Learning
  1.Classification
  2.MNIST
- II. Implementing MNIST Classification with Keras
- III. Summary

- https://github.com/eastmountyxz/AI-for-TensorFlow
- https://github.com/eastmountyxz/AI-for-Keras

I have been learning Python for nearly eight years and have met many experts and friends along the way, for which I am grateful. My aim is to help more beginners get started, so all the code is open-sourced on GitHub and the articles are published in sync on my WeChat public account. I know how much I still have to learn, so I keep pushing forward; there are no shortcuts in programming, you just have to do the work. I hope to study and write more thoroughly in the future, and to learn to do truly independent research during my PhD years. Many thanks as well to the authors in the references for their articles and sharing.

- https://blog.csdn.net/eastmount
I. What Is Classification Learning

1.Classification

Classification learning consists of two steps, illustrated by the sketch after this list:

- Training: given a dataset in which every sample contains a set of features and a class label, a classification algorithm is called to train a model.
- Prediction: the trained model is used to classify a new dataset (the test set), and the classification results are evaluated.
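A minimal sketch of these two steps, using scikit-learn's built-in 8x8 digits dataset purely for illustration (scikit-learn is not used in the rest of this article):

from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Step 1 (training): each sample has a feature vector and a class label
X, y = load_digits(return_X_y=True)      # 8x8 digit images flattened to 64 features
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)                # the classification algorithm learns a model

# Step 2 (prediction): the trained model classifies the unseen test set
y_pred = clf.predict(X_test)
print("test accuracy:", (y_pred == y_test).mean())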
2.MNIST

The MNIST handwritten digit dataset is commonly divided into three parts:

- Training set: 55,000 samples, mnist.train
- Test set: 10,000 samples, mnist.test
- Validation set: 5,000 samples, mnist.validation
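Note that this 55,000/10,000/5,000 split is how the TensorFlow tutorial reader divides MNIST. The keras.datasets.mnist.load_data() call used later in this article returns 60,000 training images and 10,000 test images, with no separate validation set; a quick check of the shapes:

from keras.datasets import mnist

# Load MNIST via Keras (downloaded automatically on first run)
(X_train, y_train), (X_test, y_test) = mnist.load_data()
print(X_train.shape, y_train.shape)   # (60000, 28, 28) (60000,)
print(X_test.shape, y_test.shape)     # (10000, 28, 28) (10000,)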
II. Implementing MNIST Classification with Keras
import numpy as np
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.optimizers import RMSprop
Two preprocessing steps are applied to the raw data:

- X_train.reshape(X_train.shape[0], -1) / 255
  This flattens each 28x28 image into a 784-value vector and normalizes every pixel from the 0-255 range to the 0-1 range.
- np_utils.to_categorical(y_train, num_classes=10)
  This calls np_utils to convert each class label into a one-hot vector of length 10: if the digit is 3, the position for 3 is set to 1 and all other positions to 0, i.e. {0,0,0,1,0,0,0,0,0,0}. A small sketch of this conversion is given after the code below.
# Load the MNIST data (downloaded automatically on first run)
# X_train shape (60000, 28, 28), y_train shape (60000,)
(X_train, y_train), (X_test, y_test) = mnist.load_data()
# Data preprocessing: flatten each 28x28 image and normalize pixels to 0-1
X_train = X_train.reshape(X_train.shape[0], -1) / 255   # normalize
X_test = X_test.reshape(X_test.shape[0], -1) / 255      # normalize
# Convert class labels to one-hot matrices: digit 5 becomes 0 0 0 0 0 1 0 0 0 0
y_train = np_utils.to_categorical(y_train, num_classes=10)
y_test = np_utils.to_categorical(y_test, num_classes=10)
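To make the one-hot conversion concrete, here is a small standalone sketch using the same np_utils.to_categorical call:

import numpy as np
from keras.utils import np_utils

labels = np.array([3, 5])
one_hot = np_utils.to_categorical(labels, num_classes=10)
print(one_hot)
# [[0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]
#  [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]]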
One way to build a network is to create the model first and then add layers one at a time, e.g.:

- model = Sequential()
- model.add(Dense(units=1, input_dim=1))

Here the layers are instead passed to Sequential as a list. The required classes are imported first:

- from keras.layers import Dense, Activation
- from keras.optimizers import RMSprop

The network itself is defined as follows:

- The first layer is Dense(32, input_dim=784); it maps the 784 input values to 32 outputs.
- An activation function Activation('relu') is then applied to make the data non-linear.
- The second layer is Dense(10), which outputs 10 units. Keras infers a layer's input size from the previous layer's output, so the input of 32 can be omitted.
- Finally an activation function Activation('softmax') is applied, which performs the classification.

An equivalent layer-by-layer version of this model is sketched after the code block below.
# Another way to build your neural net
model = Sequential([
    Dense(32, input_dim=784),   # 784 (28*28) inputs => 32 outputs
    Activation('relu'),         # activation function: make the data non-linear
    Dense(10),                  # output layer with 10 units
    Activation('softmax')       # activation function: softmax for classification
])
# Another way to define your optimizer
rmsprop = RMSprop(lr=0.001, rho=0.9, epsilon=1e-08, decay=0.0)
# We add metrics to get more results you want to see
# Compile (activate) the neural network
model.compile(
    optimizer=rmsprop,                   # optimizer that speeds up training
    loss='categorical_crossentropy',     # loss function
    metrics=['accuracy'],                # report accuracy as well as loss
)
print("Training")
model.fit(X_train, y_train, epochs=2, batch_size=32)
print("Testing")
loss, accuracy = model.evaluate(X_test, y_test)
print("loss:", loss)
print("accuracy:", accuracy)
# -*- coding: utf-8 -*-
"""
Created on Fri Feb 14 16:43:21 2020
@author: Eastmount CSDN YXZ
O(∩_∩)O Wuhan Fighting!!!
"""
import numpy as np
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.optimizers import RMSprop

#--------------------------- Load data and preprocess ---------------------------
# Load the MNIST data (downloaded automatically on first run)
# X_train shape (60000, 28, 28), y_train shape (60000,)
(X_train, y_train), (X_test, y_test) = mnist.load_data()

# Data preprocessing: flatten each 28x28 image and normalize pixels to 0-1
X_train = X_train.reshape(X_train.shape[0], -1) / 255   # normalize
X_test = X_test.reshape(X_test.shape[0], -1) / 255      # normalize

# Convert class labels to one-hot matrices: digit 5 becomes 0 0 0 0 0 1 0 0 0 0
y_train = np_utils.to_categorical(y_train, num_classes=10)
y_test = np_utils.to_categorical(y_test, num_classes=10)

#--------------------------- Build the network ---------------------------
# Another way to build your neural net
model = Sequential([
    Dense(32, input_dim=784),   # 784 (28*28) inputs => 32 outputs
    Activation('relu'),         # activation function: make the data non-linear
    Dense(10),                  # output layer with 10 units
    Activation('softmax')       # activation function: softmax for classification
])

# Another way to define your optimizer
rmsprop = RMSprop(lr=0.001, rho=0.9, epsilon=1e-08, decay=0.0)   # learning rate lr

# We add metrics to get more results you want to see
# Compile (activate) the neural network
model.compile(
    optimizer=rmsprop,                   # optimizer that speeds up training
    loss='categorical_crossentropy',     # loss function
    metrics=['accuracy'],                # report accuracy as well as loss
)

#--------------------------- Train and evaluate ---------------------------
print("Training")
model.fit(X_train, y_train, epochs=2, batch_size=32)   # number of epochs and batch size
print("Testing")
loss, accuracy = model.evaluate(X_test, y_test)
print("loss:", loss)
print("accuracy:", accuracy)
Using TensorFlow backend.
Downloading data from https://s3.amazonaws.com/img-datasets/mnist.npz
11493376/11490434 [==============================] - 18s 2us/step
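After training, the model can also be used to predict individual digits. A short sketch, assuming the model, X_test and y_test from the script above are still in memory (this step is not part of the original script):

import numpy as np

# Predict class probabilities for the first 5 test images
probs = model.predict(X_test[:5])            # shape (5, 10), one probability per digit
pred_digits = np.argmax(probs, axis=1)       # predicted digit = index of the largest probability
true_digits = np.argmax(y_test[:5], axis=1)  # y_test is one-hot, so argmax recovers the digit
print("predicted:", pred_digits)
print("true:     ", true_digits)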
# -*- coding: utf-8 -*-
"""
Created on Fri Feb 14 16:43:21 2020
@author: Eastmount CSDN YXZ
O(∩_∩)O Wuhan Fighting!!!
"""
import numpy as np
from keras.datasets import mnist
from keras.utils import np_utils
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.optimizers import RMSprop
import matplotlib.pyplot as plt
from PIL import Image

#--------------------------- Load data and preprocess ---------------------------
# Load the MNIST data (downloaded automatically on first run)
# X_train shape (60000, 28, 28), y_train shape (60000,)
(X_train, y_train), (X_test, y_test) = mnist.load_data()

#--------------------------- Display sample images ---------------------------
def show_mnist(train_image, train_labels):
    n = 6
    m = 6
    fig = plt.figure()
    for i in range(n):
        for j in range(m):
            plt.subplot(n, m, i * m + j + 1)
            index = i * m + j                  # index of the current image
            img_array = train_image[index]
            img = Image.fromarray(img_array)
            plt.title(train_labels[index])
            plt.imshow(img, cmap='Greys')
    plt.show()

show_mnist(X_train, y_train)

# Data preprocessing: flatten each 28x28 image and normalize pixels to 0-1
X_train = X_train.reshape(X_train.shape[0], -1) / 255   # normalize
X_test = X_test.reshape(X_test.shape[0], -1) / 255      # normalize

# Convert class labels to one-hot matrices: digit 5 becomes 0 0 0 0 0 1 0 0 0 0
y_train = np_utils.to_categorical(y_train, num_classes=10)
y_test = np_utils.to_categorical(y_test, num_classes=10)

#--------------------------- Build the network ---------------------------
# Another way to build your neural net
model = Sequential([
    Dense(32, input_dim=784),   # 784 (28*28) inputs => 32 outputs
    Activation('relu'),         # activation function: make the data non-linear
    Dense(10),                  # output layer with 10 units
    Activation('softmax')       # activation function: softmax for classification
])

# Another way to define your optimizer
rmsprop = RMSprop(lr=0.001, rho=0.9, epsilon=1e-08, decay=0.0)   # learning rate lr

# We add metrics to get more results you want to see
# Compile (activate) the neural network
model.compile(
    optimizer=rmsprop,                   # optimizer that speeds up training
    loss='categorical_crossentropy',     # loss function
    metrics=['accuracy'],                # report accuracy as well as loss
)

#--------------------------- Train and evaluate ---------------------------
print("Training")
model.fit(X_train, y_train, epochs=2, batch_size=32)   # number of epochs and batch size
print("Testing")
loss, accuracy = model.evaluate(X_test, y_test)
print("loss:", loss)
print("accuracy:", accuracy)
III. Summary

This article in the series: 17. Building a Classification Neural Network with Keras and an MNIST Digit Image Case Study.

As heaven moves with vigor, a gentleman should strive constantly for self-improvement.
As the earth is broad and supportive, a gentleman should carry all things with great virtue.
References:
- [1] Introductory notes on neural networks and machine learning - the author's earlier articles
- [2] Prof. Andrew Ng's Stanford machine learning course: https://class.coursera.org/ml/class/index
- [3] "Deep Learning" (the book)
- [4] Morvan (莫烦) video course on NetEase Cloud Classroom:
  https://study.163.com/course/courseLearn.htm?courseId=1003340023
- [5] Neural network activation functions - deeplearning
- [6] Machine Learning in Action: MNIST handwritten digit recognition - RunningSucks
- [7] https://github.com/siucaan/CNN_MNIST
Originally published on the WeChat public account (娜璋AI安全之家): Python Artificial Intelligence | 17. Building a Classification Neural Network with Keras and an MNIST Digit Image Case Study