author     Alex Morehead <alex.morehead@gmail.com>    2024-10-02 02:50:49 -0500
committer  GitHub <noreply@github.com>                2024-10-02 09:50:49 +0200
commit     d661859ad4206217afb274be709837d19218b3f8 (patch)
tree       e4126070c1a6d5e34e4b68a205f9e89b6f2d0b65 /README.md
parent     a3edfae103d79d49ad51aea2cdd0a3dba37d6403 (diff)
Update README.md (#16)  (HEAD, master)
Diffstat (limited to 'README.md')
 -rw-r--r--  README.md  8
 1 file changed, 4 insertions(+), 4 deletions(-)
diff --git a/README.md b/README.md
index 40fa3ff..cc717ae 100644
--- a/README.md
+++ b/README.md
@@ -67,7 +67,7 @@ with ema.average_parameters():
### Manual validation mode
-While the `average_parameters()` context manager is convinient, you can also manually execute the same series of operations:
+While the `average_parameters()` context manager is convenient, you can also manually execute the same series of operations:
```python
ema.store()
ema.copy_to()
@@ -77,7 +77,7 @@ ema.restore()
### Custom parameters
-By default the methods of `ExponentialMovingAverage` act on the model parameters the object was constructed with, but any compatable iterable of parameters can be passed to any method (such as `store()`, `copy_to()`, `update()`, `restore()`, and `average_parameters()`):
+By default the methods of `ExponentialMovingAverage` act on the model parameters the object was constructed with, but any compatible iterable of parameters can be passed to any method (such as `store()`, `copy_to()`, `update()`, `restore()`, and `average_parameters()`):
```python
model = torch.nn.Linear(10, 2)
model2 = torch.nn.Linear(10, 2)
@@ -90,11 +90,11 @@ ema.copy_to(model2)
### Resuming training
-Like a PyTorch optimizer, `ExponentialMovingAverage` objects have `state_dict()`/`load_state_dict()` methods to allow pausing, serializing, and restarting training without loosing shadow parameters, stored parameters, or the update count.
+Like a PyTorch optimizer, `ExponentialMovingAverage` objects have `state_dict()`/`load_state_dict()` methods to allow pausing, serializing, and restarting training without losing shadow parameters, stored parameters, or the update count.
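
(Illustrative, not part of this diff: a minimal sketch of pausing and resuming training with the `state_dict()`/`load_state_dict()` methods described above; the checkpoint dictionary layout and file name are arbitrary choices, not part of the library.)
```python
import torch
from torch_ema import ExponentialMovingAverage

model = torch.nn.Linear(10, 2)
ema = ExponentialMovingAverage(model.parameters(), decay=0.995)

# ... train for a while, calling ema.update() after each optimizer step ...

# Pause: serialize the model and the EMA state together.
torch.save({"model": model.state_dict(), "ema": ema.state_dict()}, "checkpoint.pt")

# Resume: restore both, then continue training as before.
checkpoint = torch.load("checkpoint.pt")
model.load_state_dict(checkpoint["model"])
ema.load_state_dict(checkpoint["ema"])
```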
### GPU/device support
`ExponentialMovingAverage` objects have a `.to()` function (like `torch.Tensor`) that can move the object's internal state to a different device or floating-point dtype.
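
(Illustrative, not part of this diff: a short sketch of moving the EMA state, assuming `.to()` accepts `device`/`dtype` keyword arguments analogous to `torch.Tensor.to()`.)
```python
import torch
from torch_ema import ExponentialMovingAverage

model = torch.nn.Linear(10, 2)
ema = ExponentialMovingAverage(model.parameters(), decay=0.995)

# Keep the EMA's internal state on the same device as the model:
model.to("cuda")
ema.to(device="cuda")

# Optionally store the shadow parameters in a different floating-point dtype:
ema.to(dtype=torch.float16)
```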
-For more details on individual methods, please check the docstrings. \ No newline at end of file
+For more details on individual methods, please check the docstrings.