Basic Operations
01 Tensor Data Types <https://www.cnblogs.com/nickchen121/p/10840234.html>
02 Creating Tensors <https://www.cnblogs.com/nickchen121/p/10840266.html>
03 Tensor Indexing and Slicing <https://www.cnblogs.com/nickchen121/p/10840274.html>
04 Dimension Transformations <https://www.cnblogs.com/nickchen121/p/10841062.html>
05 Broadcasting <https://www.cnblogs.com/nickchen121/p/10849477.html>
06 Mathematical Operations <https://www.cnblogs.com/nickchen121/p/10849481.html>
07 Forward Propagation (Tensors) - Hands-On <https://www.cnblogs.com/nickchen121/p/10849484.html>
Advanced Operations
08 Merging and Splitting <https://www.cnblogs.com/nickchen121/p/10849538.html>
09 Data Statistics <https://www.cnblogs.com/nickchen121/p/10851359.html>
10 Tensor Sorting <https://www.cnblogs.com/nickchen121/p/10852639.html>
11 Padding and Tiling <https://www.cnblogs.com/nickchen121/p/10852722.html>
12 Tensor Clipping <https://www.cnblogs.com/nickchen121/p/10853473.html>
13 Higher-Order Operations <https://www.cnblogs.com/nickchen121/p/10864401.html>
Neural Networks and Fully Connected Layers
14 Data Loading <https://www.cnblogs.com/nickchen121/p/10872939.html>
15 Testing (Tensors) - Hands-On <https://www.cnblogs.com/nickchen121/p/10877191.html>
16 Fully Connected Layers <https://www.cnblogs.com/nickchen121/p/10878347.html>
17 Output Methods <https://www.cnblogs.com/nickchen121/p/10900983.html>
18 Error Calculation <https://www.cnblogs.com/nickchen121/p/10901445.html>
Stochastic Gradient Descent
19 Introduction to Gradient Descent <https://www.cnblogs.com/nickchen121/p/10901468.html>
20 Activation Functions and Their Gradients <https://www.cnblogs.com/nickchen121/p/10906230.html>
21 Loss Functions and Their Gradients <https://www.cnblogs.com/nickchen121/p/10906835.html>
22 Single-Output Perceptron and Its Gradient <https://www.cnblogs.com/nickchen121/p/10908602.html>
23 Multi-Output Perceptron and Its Gradient <https://www.cnblogs.com/nickchen121/p/10914273.html>
24 The Chain Rule <https://www.cnblogs.com/nickchen121/p/10914433.html>
25 The Backpropagation Algorithm <https://www.cnblogs.com/nickchen121/p/10914739.html>
26 Function Optimization Hands-On <https://www.cnblogs.com/nickchen121/p/10915301.html>
27 Handwritten-Digit Recognition Hands-On (Layers) <https://www.cnblogs.com/nickchen121/p/10921539.html>
28 TensorBoard Visualization <https://www.cnblogs.com/nickchen121/p/10921824.html>
Keras High-Level Interface
29 Keras High-Level APIs <https://www.cnblogs.com/nickchen121/p/10921824.html>
30 Custom Layers and Networks <https://www.cnblogs.com/nickchen121/p/10922806.html>
31 Model Saving and Loading <https://www.cnblogs.com/nickchen121/p/10922944.html>
32 CIFAR10 Custom Network Hands-On <https://www.cnblogs.com/nickchen121/p/10923333.html>
Overfitting
33 Overfitting <https://www.cnblogs.com/nickchen121/p/10923344.html>
Convolutional Neural Networks (CNN)
34 What Is Convolution <https://www.cnblogs.com/nickchen121/p/10923353.html>
35 Convolutional Neural Networks <https://www.cnblogs.com/nickchen121/p/10925663.html>
36 Pooling and Sampling <https://www.cnblogs.com/nickchen121/p/10930571.html>
37 CIFAR100 and VGG13 Hands-On <https://www.cnblogs.com/nickchen121/p/10939750.html>
38 Classic Convolutional Networks: VGG, GoogLeNet, Inception <https://www.cnblogs.com/nickchen121/p/10957051.html>
39 ResNet and DenseNet <https://www.cnblogs.com/nickchen121/p/10958899.html>
40 ResNet Hands-On <https://www.cnblogs.com/nickchen121/p/10960206.html>
Recurrent Neural Networks (RNN)
41 Sequence Representation <https://www.cnblogs.com/nickchen121/p/10960206.html>
42 Recurrent Network Layers <https://www.cnblogs.com/nickchen121/p/10961001.html>
43 Using RNNCell <https://www.cnblogs.com/nickchen121/p/10963544.html>
44 RNN and Sentiment Classification Hands-On - Loading the IMDB Dataset <https://www.cnblogs.com/nickchen121/p/10963848.html>
Auto-Encoders
45 Unsupervised Learning <https://www.cnblogs.com/nickchen121/p/10964187.html>
46 Auto-Encoder Principles <https://www.cnblogs.com/nickchen121/p/11073127.html>
47 Auto-Encoder Variants <https://www.cnblogs.com/nickchen121/p/11073131.html>
48 Adversarial Auto-Encoders <https://www.cnblogs.com/nickchen121/p/11073136.html>
49 The Reparameterization Trick <https://www.cnblogs.com/nickchen121/p/11073138.html>
50 Variational Auto-Encoder Principles <https://www.cnblogs.com/nickchen121/p/11073147.html>
51 Auto-Encoders Hands-On <https://www.cnblogs.com/nickchen121/p/11073620.html>
Recommended Reading
Python: From Beginner to Giving Up <https://www.cnblogs.com/nickchen121/p/10718112.html>
Machine Learning <https://www.cnblogs.com/nickchen121/p/10802091.html>
What Can Python Do? <https://www.cnblogs.com/nickchen121/p/10825705.html>