# src.config
Classes:

| Name | Description |
|---|---|
| `Config` | Training configuration class including logging (summary writer) handle. |
## Classes
### Config
Bases: `ContextDecorator`
Training configuration class including logging (summary writer) handle.
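Because `Config` subclasses `ContextDecorator`, it can be used both as a context manager and as a function decorator. The snippet below is a minimal, self-contained sketch of that pattern; `DemoConfig` is a hypothetical stand-in, since `Config`'s actual `__init__` arguments are not shown on this page:

```python
from contextlib import ContextDecorator


class DemoConfig(ContextDecorator):
    """Hypothetical stand-in illustrating the ContextDecorator pattern."""

    def __enter__(self):
        # Acquire resources here (e.g. open a summary writer).
        self.entered = True
        return self

    def __exit__(self, exc_type, exc, tb):
        # Release resources here (e.g. close the summary writer).
        self.closed = True
        return False  # do not suppress exceptions


# As a context manager:
with DemoConfig() as cfg:
    print(cfg.entered)  # True
```

The same object also works as a decorator (`@DemoConfig()`), wrapping a function call in the enter/exit logic.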
Methods:

| Name | Description |
|---|---|
| `__enter__` | |
| `__exit__` | |
| `__init__` | |
| `get_global_training_epoch` | Get the global training epoch. |
Attributes:

| Name | Type | Description |
|---|---|---|
| `args` | | |
| `batch_size` | | |
| `device` | | |
| `epochs` | | |
| `log_interval` | | |
| `logger` | | |
| `loss` | | |
| `optimizer` | | |
| `scheduler` | | |
| `summary_writer` | | |
Source code in src/config.py
#### Attributes
##### `optimizer` *class-attribute* *instance-attribute*

```python
optimizer = partial(SGD, lr=0.05, momentum=0.9, nesterov=True, weight_decay=0.0001)
```
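A hedged sketch of how such a `partial` is typically consumed: the hyperparameters are pre-bound, and the model parameters are supplied later when the optimizer is instantiated. `SGDStub` below is a hypothetical stand-in for the real optimizer class (presumably `torch.optim.SGD`), used only so the example is self-contained:

```python
from functools import partial


class SGDStub:
    """Hypothetical stand-in for the real SGD optimizer class."""

    def __init__(self, params, lr, momentum=0.0, nesterov=False, weight_decay=0.0):
        self.params = list(params)
        self.lr = lr
        self.momentum = momentum
        self.nesterov = nesterov
        self.weight_decay = weight_decay


# Hyperparameters are fixed now; model parameters are bound later.
optimizer_factory = partial(
    SGDStub, lr=0.05, momentum=0.9, nesterov=True, weight_decay=0.0001
)
optimizer = optimizer_factory([0.0, 1.0])
print(optimizer.lr, optimizer.nesterov)  # 0.05 True
```

Storing a `partial` rather than a constructed optimizer lets the configuration be defined before the model (and hence its parameters) exists.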
##### `summary_writer` *instance-attribute*
#### Functions
##### `__exit__`
##### `get_global_training_epoch`

```python
get_global_training_epoch(local_epoch: int) -> int
```
Get the global training epoch.
Calculates and returns the global training epoch based on the local epoch and the training round.
Note:
`self.args.round` (the training round) is zero-based; testing rounds are not counted.
Example:
Suppose the client returns the model to the server after every three local epochs. If the training round is 2 (zero-based) and the local epoch is 1, the global training epoch is 1 + 3 * 2 = 7.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `local_epoch` | `int` | local training epoch | required |
Returns:

| Name | Type | Description |
|---|---|---|
| `int` | `int` | global training epoch |
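The calculation described above can be sketched as a standalone function (the real method takes only `local_epoch` and reads the training round and epochs-per-round from the instance; the extra parameters here are an assumption made so the example runs on its own):

```python
def global_training_epoch(local_epoch: int, training_round: int, epochs_per_round: int) -> int:
    """Global epoch = local epoch + epochs-per-round * zero-based training round."""
    return local_epoch + epochs_per_round * training_round


# Matches the documented example: round 2, local epoch 1, three local epochs per round.
print(global_training_epoch(1, training_round=2, epochs_per_round=3))  # 7
```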