pytorch/torch/utils/trainer/plugins
Roy Fejgin b479494ed4 loss plugin: Fix indexing into a scalar (#9143)
Summary:
The loss plugin was using the old-style loss[0] access, which in PyTorch 0.4 and
later indexes into a scalar tensor and generates a warning.
Replaced it with loss.item().

This fixes
https://github.com/pytorch/pytorch/issues/9142
Closes https://github.com/pytorch/pytorch/pull/9143

Differential Revision: D8726403

Pulled By: ezyang

fbshipit-source-id: 6c496b140a74d22c8423f511db901b18615fd6fa
2018-07-03 14:25:44 -07:00
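
For reference, a minimal runnable sketch of the before/after pattern described in the
summary; the get_loss_value helper is hypothetical and only illustrates the one-line
change made inside loss.py.

    import torch

    def get_loss_value(loss):
        # Pre-0.4 style: loss[0] assumed the loss was a 1-element tensor; on 0.4+
        # the loss is a 0-dim (scalar) tensor, so indexing it triggers a warning
        # (and in later releases an error).
        # 0.4+ style: .item() extracts the Python number from a scalar tensor.
        return loss.item()

    # Usage: any scalar loss tensor works the same way.
    loss = torch.nn.functional.mse_loss(torch.randn(4), torch.zeros(4))
    print(get_loss_value(loss))  # a plain Python float
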
__init__.py [pep8] Fix most lint automatically with autopep8 2017-01-28 01:15:51 +01:00
accuracy.py [pep8] Fix most lint automatically with autopep8 2017-01-28 01:15:51 +01:00
logger.py [pep8] Fix most lint automatically with autopep8 2017-01-28 01:15:51 +01:00
loss.py loss plugin: Fix indexing into a scalar (#9143) 2018-07-03 14:25:44 -07:00
monitor.py [pep8] Fix most remaining lint manually 2017-01-28 01:15:51 +01:00
plugin.py [pep8] Fix most lint automatically with autopep8 2017-01-28 01:15:51 +01:00
progress.py Ensure displayed progress in ProgressMonitor is between 0 and 100%. 2017-03-24 15:21:52 +01:00
time.py [pep8] Fix most lint automatically with autopep8 2017-01-28 01:15:51 +01:00