Translation 13: Loss Functions


1. Loss Functions

What a loss function does:
1. It quantifies the gap between the network's actual output and the target value.
2. That gap is then used, via backpropagation, to update the network's parameters in later training steps (see the sketch below).
[Loss Functions API in the official PyTorch docs](https://pytorch.org/docs/stable/nn.html#loss-functions)
The docs list many loss functions; each has its own use cases and its own formula.
![Loss functions listed in the PyTorch docs](https://img-blog.csdnimg.cn/182c0c476f0d484cb76492bd463d33b2.png)
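A minimal sketch of how a loss value drives backpropagation, assuming a single learnable tensor stands in for the network's output:

```python
import torch
from torch.nn import L1Loss

# A toy "prediction" that requires gradients, standing in for a network output
pred = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
target = torch.tensor([4.0, 5.0, 6.0])

loss_fn = L1Loss()            # mean absolute error (reduction='mean' by default)
loss = loss_fn(pred, target)  # forward pass: mean(|pred - target|) = 3.0

loss.backward()               # backward pass: fills pred.grad
print(loss)       # 3.0
print(pred.grad)  # each element's gradient is -1/3, since pred < target everywhere
```

In a real training loop, an optimizer would use these gradients to update the parameters after each backward pass.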

2. L1Loss

[torch.nn.L1Loss(size_average=None, reduce=None, reduction='mean')](https://pytorch.org/docs/stable/generated/torch.nn.L1Loss.html#torch.nn.L1Loss)
As the parameter descriptions in the official docs note, size_average and reduce are both deprecated.
The reduction parameter has three modes: none, mean, and sum.

![L1Loss formula from the docs](https://img-blog.csdnimg.cn/199035744294461abc84a55a5d92b343.png)

```python
import torch
from torch.nn import L1Loss

input = torch.tensor([1,2,3],dtype=torch.float32)
target = torch.tensor([4,5,6],dtype=torch.float32)

input = torch.reshape(input,(1,1,1,3))
target = torch.reshape(target,(1,1,1,3))

loss_1 = L1Loss(reduction='sum')
result_1 = loss_1(input,target)
print(result_1)#tensor(9.)   4-1 + 5-2 + 6-3 = 9

loss_2 = L1Loss(reduction='mean')
result_2 = loss_2(input,target)
print(result_2)#tensor(3.)   (4-1 + 5-2 + 6-3) / 3 = 3

loss_3 = L1Loss(reduction='none')
result_3 = loss_3(input,target)
print(result_3)#tensor([[[[3., 3., 3.]]]])
```

3. MSELoss (mean squared error)

[torch.nn.MSELoss(size_average=None, reduce=None, reduction='mean')](https://pytorch.org/docs/stable/generated/torch.nn.MSELoss.html#torch.nn.MSELoss)
![MSELoss formula from the docs](https://img-blog.csdnimg.cn/97e6f4c3a0a94c2eb5d478c4fe5c855f.png)

```python
import torch

input = torch.tensor([1,2,3],dtype=torch.float32)
target = torch.tensor([4,5,6],dtype=torch.float32)

input = torch.reshape(input,(1,1,1,3))
target = torch.reshape(target,(1,1,1,3))

loss_mse = torch.nn.MSELoss()
result = loss_mse(input,target)
print(result)#tensor(9.)   [(4-1)^2 + (5-2)^2 + (6-3)^2] / 3 = 9
```
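MSELoss takes the same reduction parameter as L1Loss. A small sketch, assuming the same input and target as above, where reduction='sum' sums the squared errors instead of averaging them:

```python
import torch

input = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([4.0, 5.0, 6.0])

# Sum the squared errors instead of averaging them
loss_mse_sum = torch.nn.MSELoss(reduction='sum')
print(loss_mse_sum(input, target))  # tensor(27.)   3^2 + 3^2 + 3^2 = 27
```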

4. CrossEntropyLoss (cross-entropy)

[torch.nn.CrossEntropyLoss(weight=None, size_average=None, ignore_index=- 100, reduce=None, reduction='mean', label_smoothing=0.0)](https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html#torch.nn.CrossEntropyLoss)
![CrossEntropyLoss documentation (1)](https://img-blog.csdnimg.cn/db5778a27abd4e7493ec7ab34aa29054.png)
![CrossEntropyLoss documentation (2)](https://img-blog.csdnimg.cn/9adc71e5119e4b9d91eca04d82d6a30c.png)

Suppose x covers three classes: dog (0), cat (1), pig (2), and the network outputs the raw scores (1.0, 2.0, 3.0) for them.
The true target is pig, i.e. class index 2.

```python
import torch

x = torch.tensor([1.0,2.0,3.0])
target = torch.tensor([2])

x = torch.reshape(x,(1,3))# reshape x into (N, C) form: batch size 1, 3 classes
print(x.shape)#torch.Size([1, 3])

loss_cross = torch.nn.CrossEntropyLoss()
result = loss_cross(x,target)

print(result)#tensor(0.4076)
```

![CrossEntropyLoss calculation](https://img-blog.csdnimg.cn/3034b71b336d4838b983bc95855db078.png)
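As a quick sanity check of the 0.4076 above, the same value can be computed by hand from the formula shown in the docs, loss = -x[class] + log(sum_j exp(x[j])):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])
cls = 2  # the target class, "pig"

# CrossEntropyLoss is log-softmax followed by negative log-likelihood
manual = -x[cls] + torch.log(torch.exp(x).sum())
print(manual)  # tensor(0.4076), matching nn.CrossEntropyLoss above
```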

Source: CSDN. Author: beyond谚语. Original article: https://beyondyanyu.blog.csdn.net/article/details/126452789