
How to use torch.nn.Softmax() and torch.nn.functional.softmax()

Reference link: torch.nn.Softmax(dim=None)

Reference link: torch.nn.functional.softmax(input, dim=None, _stacklevel=3, dtype=None)

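The two APIs compute the same function. torch.nn.Softmax is a Module: the dim is fixed when the module is constructed, and it can be placed inside nn.Sequential like any other layer. torch.nn.functional.softmax is the plain function form, where dim is passed on every call. A minimal sketch showing that both produce the same result:

import torch
from torch import nn
import torch.nn.functional as F

x = torch.randn(2, 3, 4)

# Module form: dim is fixed at construction time.
layer = nn.Softmax(dim=2)
out_module = layer(x)

# Functional form: dim is passed on each call.
out_functional = F.softmax(x, dim=2)

# Both forms compute the same softmax.
print(torch.allclose(out_module, out_functional))  # True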

Experiment code:

Microsoft Windows [Version 10.0.18363.1256]
(c) 2019 Microsoft Corporation. All rights reserved.

C:\Users\chenxuqi>
C:\Users\chenxuqi>conda activate ssd4pytorch1_2_0

(ssd4pytorch1_2_0) C:\Users\chenxuqi>python
Python 3.7.7 (default, May  6 2020, 11:45:54) [MSC v.1916 64 bit (AMD64)] :: Anaconda, Inc. on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> from torch import nn as nn
>>> input = torch.randn(2, 3, 4)
>>> input
tensor([[[-0.1335,  0.1574, -0.4618, -0.1629],
         [-1.1302, -0.2782,  0.2689,  1.4722],
         [ 1.8547,  3.0593,  1.7146, -0.4395]],

        [[-0.0102,  1.4679,  0.0138,  0.5245],
         [ 2.2345,  2.0089,  2.0074,  0.4197],
         [-1.4187, -0.0887,  0.9257,  0.2516]]])
>>>
>>>
>>> torch.nn.functional.softmax(input, dim=None)
__main__:1: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
tensor([[[0.4692, 0.2124, 0.3833, 0.3346],
         [0.0334, 0.0922, 0.1495, 0.7413],
         [0.9635, 0.9588, 0.6876, 0.3338]],

        [[0.5308, 0.7876, 0.6167, 0.6654],
         [0.9666, 0.9078, 0.8505, 0.2587],
         [0.0365, 0.0412, 0.3124, 0.6662]]])
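
Note that calling softmax with dim=None only triggers a deprecation warning; the computation still runs, and for this 3-D input the implicitly chosen dimension is 0 (the output is identical to the explicit dim=0 call further below).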
>>> torch.nn.functional.softmax(input, dim=X)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'X' is not defined
>>> torch.nn.functional.softmax(input, dim='X')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "D:\Anaconda3\envs\ssd4pytorch1_2_0\lib\site-packages\torch\nn\functional.py", line 1230, in softmax
    ret = input.softmax(dim)
TypeError: softmax(): argument 'dim' (position 1) must be int, not str
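
The two failed calls show that the "dim=X" in the warning message is only a placeholder: dim must be an actual Python integer, not an undefined name and not a string.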
>>>
>>> torch.nn.functional.softmax(input)
tensor([[[0.4692, 0.2124, 0.3833, 0.3346],
         [0.0334, 0.0922, 0.1495, 0.7413],
         [0.9635, 0.9588, 0.6876, 0.3338]],

        [[0.5308, 0.7876, 0.6167, 0.6654],
         [0.9666, 0.9078, 0.8505, 0.2587],
         [0.0365, 0.0412, 0.3124, 0.6662]]])
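
Calling softmax without any dim argument behaves the same way: the result again matches dim=0. The deprecation warning is not printed a second time because Python's default warning filter reports each warning only once per call site.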
>>>
>>> torch.nn.functional.softmax(input, dim=0)
tensor([[[0.4692, 0.2124, 0.3833, 0.3346],
         [0.0334, 0.0922, 0.1495, 0.7413],
         [0.9635, 0.9588, 0.6876, 0.3338]],

        [[0.5308, 0.7876, 0.6167, 0.6654],
         [0.9666, 0.9078, 0.8505, 0.2587],
         [0.0365, 0.0412, 0.3124, 0.6662]]])
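
With dim=0, softmax normalizes across the first dimension, i.e. across the two 3×4 slices: corresponding elements sum to 1 (for example, 0.4692 + 0.5308 = 1).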
>>> torch.nn.functional.softmax(input, dim=1)
tensor([[[0.1153, 0.0504, 0.0841, 0.1452],
         [0.0426, 0.0326, 0.1746, 0.7447],
         [0.8421, 0.9171, 0.7413, 0.1101]],

        [[0.0936, 0.3415, 0.0923, 0.3757],
         [0.8835, 0.5866, 0.6779, 0.3383],
         [0.0229, 0.0720, 0.2298, 0.2860]]])
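
With dim=1, softmax normalizes along the second dimension, i.e. down each column of a 3×4 slice: each column sums to 1 (for example, 0.1153 + 0.0426 + 0.8421 = 1).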
>>> torch.nn.functional.softmax(input, dim=2)
tensor([[[0.2482, 0.3320, 0.1788, 0.2410],
         [0.0479, 0.1122, 0.1939, 0.6460],
         [0.1885, 0.6287, 0.1638, 0.0190]],

        [[0.1232, 0.5403, 0.1262, 0.2103],
         [0.3626, 0.2894, 0.2889, 0.0591],
         [0.0487, 0.1843, 0.5081, 0.2589]]])
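
With dim=2, softmax normalizes along the last dimension, i.e. across each row: every row sums to 1 (for example, 0.2482 + 0.3320 + 0.1788 + 0.2410 = 1).

The general rule: softmax(input, dim=d) rescales the values along dimension d so that they are positive and sum to 1, leaving every other dimension untouched. A quick sketch that checks this for all three dimensions of a (2, 3, 4) tensor:

import torch
import torch.nn.functional as F

x = torch.randn(2, 3, 4)

for d in range(x.dim()):
    out = F.softmax(x, dim=d)
    # Summing along the softmaxed dimension should give 1 everywhere.
    sums = out.sum(dim=d)
    print(d, torch.allclose(sums, torch.ones_like(sums)))  # True for every d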
