How to execute backward() multiple times

2024-11-06 10:00:17
import torch

x = torch.tensor([2.], requires_grad=True)
w = torch.tensor([1.], requires_grad=True)

# Method 1: keep the computation graph with retain_graph=True,
# so backward() can be called repeatedly on the same y.
a = torch.add(w, x)   # a = w + x
b = torch.add(w, 1)   # b = w + 1
y = torch.mul(a, b)   # y = (w + x) * (w + 1)

y.backward(retain_graph=True)
print(w.grad) # tensor([5.])

w.grad.zero_()
y.backward(retain_graph=True)
print(w.grad) # tensor([5.])

# Gradients accumulate into w.grad unless it is zeroed between calls.
y.backward(retain_graph=True)
print(w.grad) # tensor([10.])

y.backward(retain_graph=True)
print(w.grad) # tensor([15.])

# Method 2: rebuild the graph inside the loop, so each backward()
# runs on a fresh graph and retain_graph is not needed.
for _ in range(4):
    a = torch.add(w, x)
    b = torch.add(w, 1)
    y = torch.mul(a, b)

    w.grad.zero_()   # clear the accumulated gradient before each backward()
    y.backward()
    print(w.grad)    # tensor([5.]) on every iteration

"""
y.backward()
y.backward() 2次執行 backward()會報錯
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed).
Saved intermediate values of the graph are freed when you call .backward() or autograd.grad().
Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.

"""