Table of Contents
- Experiment Task
- (1) Model Construction
- (2) Experimental Results
Experiment Task
Implement a 2D convolutional network with torch.nn. The dataset is the same as in [PyTorch] Manually Implementing a 2D Convolutional Neural Network for Vehicle Classification. Data preprocessing, training, and testing are identical to that post; this post mainly presents the code for the 2D convolutional model built with torch.nn.
(1) Model Construction
# Build the 2D convolutional model with torch.nn
import torch
import torch.nn.functional as F

class ConvModule(torch.nn.Module):
    def __init__(self):
        super(ConvModule, self).__init__()
        # Define three convolutional layers
        # (fewer conv layers give smoother curves; more conv layers give higher accuracy)
        self.conv = torch.nn.Sequential(
            # First convolutional layer
            # stride: step size, padding: zero padding
            torch.nn.Conv2d(in_channels=3, out_channels=32, kernel_size=3, stride=1, padding=0),
            torch.nn.BatchNorm2d(32),
            torch.nn.ReLU(inplace=True),
            # Second convolutional layer
            torch.nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, stride=1, padding=0),
            torch.nn.BatchNorm2d(64),
            torch.nn.ReLU(inplace=True),
            # Third convolutional layer
            torch.nn.Conv2d(in_channels=64, out_channels=128, kernel_size=3, stride=1, padding=0),
            torch.nn.BatchNorm2d(128),
            torch.nn.ReLU(inplace=True)
        )
        # Output layer: map the 128 output channels to the number of classes
        # (num_classes, device, and lr come from the setup shared with the previous post)
        self.fc = torch.nn.Linear(128, num_classes)

    def forward(self, X):
        out = self.conv(X)            # output shape: (batch_size, C_out, H, W)
        # Global average pooling: each 26x26 feature map is reduced to 1x1
        out = F.avg_pool2d(out, 26)
        # Reshape out from (batch_size, 128, 1, 1) to (batch_size, 128)
        # (squeeze assumes batch_size > 1; with a single sample the batch dim is dropped too)
        out = out.squeeze()
        out = self.fc(out)
        return out

net = ConvModule()
net.to(device)

# Loss function and optimizer
loss = torch.nn.CrossEntropyLoss()    # cross-entropy loss
optimizer = torch.optim.SGD(net.parameters(), lr=lr)
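The pooling kernel size of 26 in forward follows from the input size: with 32x32 input images (which is what F.avg_pool2d(out, 26) implies), three 3x3 convolutions with padding=0 shrink the feature maps from 32 to 30, 28, and finally 26. A quick shape check, reusing net and device from above and assuming num_classes has been defined, is a useful sanity test:

# Illustrative shape sanity check (assumes the setup above is already in place)
dummy = torch.randn(4, 3, 32, 32, device=device)   # (batch, channels, H, W)
with torch.no_grad():
    logits = net(dummy)
print(logits.shape)                                 # torch.Size([4, num_classes])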
(2) Experimental Results
Some of the hyperparameters used in the experiment are as follows:
batch_size = 128   # a smaller batch size gives better accuracy, but a noisier curve
num_epochs = 100   # number of training epochs
lr = 0.01          # learning rate
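For reference, a minimal training loop consistent with the log format below might look like the following sketch. It assumes a train_iter DataLoader from the data pipeline of the previous post; the accumulator names are illustrative, not from the original code. Evaluation on the test set follows the same pattern with net.eval() and torch.no_grad().

# Minimal training-loop sketch (assumes train_iter from the previous post's data pipeline)
for epoch in range(num_epochs):
    net.train()
    total_loss, total_correct, total_n = 0.0, 0, 0
    for X, y in train_iter:
        X, y = X.to(device), y.to(device)
        y_hat = net(X)                      # forward pass
        l = loss(y_hat, y)                  # cross-entropy loss
        optimizer.zero_grad()
        l.backward()                        # backpropagation
        optimizer.step()                    # SGD update
        total_loss += l.item() * y.size(0)
        total_correct += (y_hat.argmax(dim=1) == y).sum().item()
        total_n += y.size(0)
    print(f"epoch {epoch + 1}, train loss: {total_loss / total_n:.4f}, "
          f"train acc: {total_correct / total_n:.3f}")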
The output during training is as follows:
epoch 90, train loss: 0.3220, train acc: 0.888
test loss: 0.2839, test acc: 0.882
epoch 91, train loss: 0.3132, train acc: 0.885
test loss: 0.2930, test acc: 0.889
epoch 92, train loss: 0.3216, train acc: 0.890
test loss: 0.4076, test acc: 0.856
epoch 93, train loss: 0.3138, train acc: 0.892
test loss: 0.3139, test acc: 0.886
epoch 94, train loss: 0.3174, train acc: 0.887
test loss: 0.6971, test acc: 0.812
epoch 95, train loss: 0.3206, train acc: 0.885
test loss: 0.3292, test acc: 0.889
epoch 96, train loss: 0.3081, train acc: 0.889
test loss: 0.2781, test acc: 0.897
epoch 97, train loss: 0.2958, train acc: 0.895
test loss: 0.2661, test acc: 0.886
epoch 98, train loss: 0.2988, train acc: 0.901
test loss: 0.3309, test acc: 0.860
epoch 99, train loss: 0.3121, train acc: 0.892
test loss: 0.3602, test acc: 0.849
epoch 100, train loss: 0.3060, train acc: 0.894
test loss: 0.2816, test acc: 0.893
The training and test accuracy curves and loss curves are shown below:
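Assuming the per-epoch metrics were appended to lists during training (the list names below are hypothetical, not from the original code), curves like these can be produced with matplotlib:

# Hypothetical plotting snippet for the accuracy and loss curves
import matplotlib.pyplot as plt

epochs = range(1, num_epochs + 1)
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
# accuracy curves
ax1.plot(epochs, train_acc_list, label="train acc")
ax1.plot(epochs, test_acc_list, label="test acc")
ax1.set_xlabel("epoch")
ax1.set_ylabel("accuracy")
ax1.legend()
# loss curves
ax2.plot(epochs, train_loss_list, label="train loss")
ax2.plot(epochs, test_loss_list, label="test loss")
ax2.set_xlabel("epoch")
ax2.set_ylabel("loss")
ax2.legend()
plt.show()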