mirror of
https://github.com/zebrajr/pytorch.git
synced 2025-12-06 12:20:52 +01:00
[ez] fix grammar mistakes in StatefulSymbolicContext comment (#152598)
Pull Request resolved: https://github.com/pytorch/pytorch/pull/152598
Approved by: https://github.com/malfet
ghstack dependencies: #151407
This commit is contained in:
parent
36e5ff6bc4
commit
1f898657e6
@@ -1996,12 +1996,12 @@ class StatefulSymbolicContext(StatelessSymbolicContext):
     other values - dynamic_sizes and constraint_sizes will not be read if we cache
     hit.
 
-    It is the cache owners responsibility to maintain the lifecycle of the cache
-    w/r/t different shape_envs, clearing, etc.
+    It is the cache owner's responsibility to maintain the lifecycle of the cache
+    with respect to different shape_envs, clearing, etc.
     """
 
     tensor_source: Source = None  # type: ignore[assignment]
-    # Why is this keyd on int first?
+    # Why is this keyed on int first?
     # That integer is actually the id of the shape_env. This cache short-circuits symbol
     # creation, and we must store it per shape env. Now, while tracing invariants are a single
     # shape env per tracing context, and every new frame gets a new shape_env. So where would we have
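The comment in the diff describes a cache keyed first on the id of a ShapeEnv, so that symbol creation can be short-circuited per shape environment. A minimal sketch of that keying pattern is below; `FakeShapeEnv`, `get_or_create_symbol`, and the `_symbol_cache` name are hypothetical stand-ins for illustration, not PyTorch's actual implementation.

```python
# Hypothetical sketch of a per-shape-env symbol cache, keyed on id(shape_env)
# first so each tracing context's environment gets its own sub-cache.

class FakeShapeEnv:
    """Stand-in for torch.fx.experimental.symbolic_shapes.ShapeEnv."""
    pass

# Outer key: id(shape_env); inner key: the source name for a symbol.
_symbol_cache: dict[int, dict[str, str]] = {}

def get_or_create_symbol(shape_env: FakeShapeEnv, source_name: str) -> str:
    per_env = _symbol_cache.setdefault(id(shape_env), {})
    if source_name in per_env:
        # Cache hit: short-circuit symbol creation entirely.
        return per_env[source_name]
    # Placeholder for real symbol allocation (s0, s1, ...).
    symbol = f"s{len(per_env)}"
    per_env[source_name] = symbol
    return symbol

env_a, env_b = FakeShapeEnv(), FakeShapeEnv()
get_or_create_symbol(env_a, "x.size(0)")  # creates a symbol in env_a's sub-cache
get_or_create_symbol(env_b, "x.size(0)")  # env_b gets its own, independent entry
```

Because every new frame gets a new ShapeEnv, keying on the environment's id keeps symbols from one tracing context from leaking into another.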