This repository was archived by the owner on Jan 26, 2024. It is now read-only.

Commit ba517f1 (1 parent: e4e0d8d)

Add gradient norm log

File tree: 1 file changed (+2, −2 lines)


train.py

Lines changed: 2 additions & 2 deletions
@@ -235,14 +235,14 @@ def save_model():
         model.zero_grad()
         loss.backward()

-        # norm = utils.compute_gradient_norm(model.parameters())
+        norm = utils.compute_gradient_norm(model.parameters())
         nn.utils.clip_grad_norm_(model.parameters(), 1.0)

         optimizer.step()

         if enable_logging:
             writer.add_scalar('loss', loss.item(), iteration)
-            # writer.add_scalar('norm', norm.item(), iteration)
+            writer.add_scalar('norm', norm.item(), iteration)

             print(f'iter {iteration}, loss: {loss.item()}')
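The `utils.compute_gradient_norm` helper itself is not part of this diff. A minimal sketch of what such a helper typically computes, assuming an L2 norm over all parameter gradients (the same quantity `nn.utils.clip_grad_norm_` measures by default, with its default `norm_type=2.0`), might look like:

```python
import torch

def compute_gradient_norm(parameters):
    """Total L2 norm over all parameter gradients.

    Hypothetical reimplementation for illustration; the actual
    utils.compute_gradient_norm in this repo may differ.
    """
    total = 0.0
    for p in parameters:
        if p.grad is not None:
            # Accumulate the squared per-parameter gradient norms.
            total += p.grad.detach().norm(2).item() ** 2
    # Return a tensor so the call site can use norm.item().
    return torch.tensor(total ** 0.5)
```

Logging this value before the `clip_grad_norm_` call, as the diff does, records the raw (pre-clipping) gradient norm, which is useful for spotting exploding gradients in TensorBoard.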
0 commit comments