This class provides the Python interface to C_GradAccumulator, the C++ class which performs the heavier workloads.
def                            | __delitem__ (self, int index)
                               | Delete named parameter gradients at a given index.
C_GradAccumulator.MapOfTensors | __getitem__ (self, int index)
                               | Retrieve named parameter gradients at a given index.
def                            | __init__ (self, List[str] parameter_keys, int bootstrap_rounds)
def                            | __len__ (self)
                               | Get the number of named parameters' accumulated gradients so far.
None                           | __setitem__ (self, int index, Iterable named_parameters)
                               | Set named parameter gradients at a given index.
None                           | accumulate (self, Iterable named_parameters)
                               | Accumulates the parameters from the model.
None                           | clear (self)
                               | Clears the accumulated gradients.
C_GradAccumulator.MapOfTensors | mean_reduce (self)
                               | Performs the mean reduction of accumulated gradients.
C_GradAccumulator.MapOfTensors | sum_reduce (self)
                               | Performs the sum reduction of accumulated gradients.
This class provides the Python interface to C_GradAccumulator, the C++ class which performs the heavier workloads. It is used for accumulating gradients and performing reduction operations on them.
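The sketch below shows the intended accumulate-then-reduce flow. It is a minimal, illustrative example: the import path follows the qualified names used on this page, and the torch model, loss, and round count are assumptions made purely for demonstration.

import torch

from rlpack._C.grad_accumulator import GradAccumulator  # import path assumed from the qualified names below

model = torch.nn.Linear(4, 2)  # toy model, illustration only
bootstrap_rounds = 4

# parameter_keys must match the model's named parameters.
accumulator = GradAccumulator([name for name, _ in model.named_parameters()], bootstrap_rounds)

for _ in range(bootstrap_rounds):
    loss = model(torch.randn(16, 4)).pow(2).mean()
    loss.backward()
    accumulator.accumulate(model.named_parameters())  # store this round's gradients
    model.zero_grad()

mean_grads = accumulator.mean_reduce()  # MapOfTensors keyed by parameter name
accumulator.clear()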
◆ __init__()
def rlpack._C.grad_accumulator.GradAccumulator.__init__ (self, List[str] parameter_keys, int bootstrap_rounds)
- Parameters
- parameter_keys | List[str]: The parameter keys (names) of the model.
- bootstrap_rounds | int: The bootstrap rounds defined for the agent.
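A construction sketch, assuming a toy torch model and the import path used above; both are for illustration only.

import torch
from rlpack._C.grad_accumulator import GradAccumulator  # assumed import path

model = torch.nn.Linear(8, 2)
# Positional arguments follow the documented order: parameter_keys, then bootstrap_rounds.
accumulator = GradAccumulator([name for name, _ in model.named_parameters()], 4)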
◆ __delitem__()
def rlpack._C.grad_accumulator.GradAccumulator.__delitem__ (self, int index)
Delete named parameter gradients at a given index.
- Parameters
- index | int: The index at which we wish to delete the gradient values.
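Deletion goes through Python's del statement, which dispatches to __delitem__. A hedged sketch, reusing model and accumulator from the construction example under __init__ and assuming at least one round has been accumulated:

model(torch.randn(3, 8)).sum().backward()
accumulator.accumulate(model.named_parameters())

del accumulator[0]  # dispatches to __delitem__; removes the gradients stored at index 0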
◆ __getitem__()
C_GradAccumulator.MapOfTensors rlpack._C.grad_accumulator.GradAccumulator.__getitem__ (self, int index)
Retrieve named parameter gradients at a given index.
- Parameters
- index | int: The index at which we wish to obtain the gradient values.
- Returns
- MapOfTensors: The custom map object from C++ backend with gradient of parameters for each key.
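Indexing dispatches to __getitem__. In the sketch below, dict-like lookup on the returned MapOfTensors by parameter key is an assumption based on its description as a map; model and accumulator are reused from the construction example under __init__.

grads_at_0 = accumulator[0]         # dispatches to __getitem__; returns a MapOfTensors
weight_grad = grads_at_0["weight"]  # assumed dict-like lookup ("weight" is a key of torch.nn.Linear)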
◆ __len__()
def rlpack._C.grad_accumulator.GradAccumulator.__len__ (self)
Get the number of named parameters' accumulated gradients so far.
- Returns
- int: The size of GradAccumulator.
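Calling len() dispatches to __len__ and reports how many gradient sets have been stored so far, for example:

# accumulator constructed with bootstrap_rounds=4 as in the sketch under __init__.
if len(accumulator) == 4:  # dispatches to __len__
    mean_grads = accumulator.mean_reduce()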
◆ __setitem__()
None rlpack._C.grad_accumulator.GradAccumulator.__setitem__ (self, int index, Iterable named_parameters)
Set named parameter gradients at a given index.
- Parameters
- index | int: The index at which we wish to set the gradient values.
- named_parameters | Iterable: The iterable of named parameters (use model.named_parameters()).
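Index assignment dispatches to __setitem__ and overwrites the gradients stored at that index. A sketch under the same assumptions as before; whether out-of-range indices are accepted is not documented here, so index 0 is assumed to exist already.

model(torch.randn(3, 8)).sum().backward()
accumulator[0] = model.named_parameters()  # dispatches to __setitem__; replaces the gradients at index 0
model.zero_grad()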
◆ accumulate()
None rlpack._C.grad_accumulator.GradAccumulator.accumulate (self, Iterable named_parameters)
Accumulates the parameters from the model. The C++ backend extracts the gradients from the parameters.
- Parameters
- named_parameters | Iterable: The iterable of named parameters (use model.named_parameters()).
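accumulate() is called once per round after backward(), since the gradients it extracts only exist once backpropagation has run. An illustrative loop, reusing the setup from the __init__ sketch:

for _ in range(4):
    loss = model(torch.randn(16, 8)).pow(2).mean()
    loss.backward()
    accumulator.accumulate(model.named_parameters())  # backend reads the gradients held by the parameters
    model.zero_grad()                                 # avoid mixing gradients between rounds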
◆ clear()
None rlpack._C.grad_accumulator.GradAccumulator.clear (self)
Clears the accumulated gradients.
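clear() is typically called after a reduction so that the next window of rounds starts from an empty accumulator, for example:

mean_grads = accumulator.mean_reduce()
accumulator.clear()  # drop all stored gradients
# len(accumulator) is expected to be 0 again after this point (assumed post-condition).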
◆ mean_reduce()
C_GradAccumulator.MapOfTensors rlpack._C.grad_accumulator.GradAccumulator.mean_reduce (self)
Performs the mean reduction of accumulated gradients.
- Returns
- MapOfTensors: The custom map object from C++ backend with mean of gradient of parameters for each key.
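A sketch of applying the mean-reduced gradients back onto the model before an optimizer step. Dict-like access on MapOfTensors and assigning its tensors to param.grad are assumptions, not documented behaviour; the optimizer is illustrative.

optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

mean_grads = accumulator.mean_reduce()
for name, param in model.named_parameters():
    param.grad = mean_grads[name].clone()  # assumed: values are torch tensors keyed by parameter name
optimizer.step()
accumulator.clear()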
◆ sum_reduce()
C_GradAccumulator.MapOfTensors rlpack._C.grad_accumulator.GradAccumulator.sum_reduce (self)
Performs the sum reduction of accumulated gradients.
- Returns
- MapOfTensors: The custom map object from C++ backend with sum of gradient of parameters for each key.
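sum_reduce() behaves like mean_reduce() but returns the sum instead of the mean; a brief sketch under the same assumptions:

sum_grads = accumulator.sum_reduce()
total_weight_grad = sum_grads["weight"]  # assumed dict-like lookup, as in the __getitem__ sketch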
◆ c_grad_accumulator
rlpack._C.grad_accumulator.GradAccumulator.c_grad_accumulator
The instance of C_GradAccumulator; the C++ backend object which performs the heavier workloads.
◆ map_of_tensors
rlpack._C.grad_accumulator.GradAccumulator.map_of_tensors
The instance of MapOfTensors; the custom map object used by the C++ backend.