.. Comment: this file is automatically generated by `update_example_docs.py`.
   It should not be modified manually.

.. _example-category-overriding_configs:

Overriding Configs
==================

In these examples, we show how :func:`tyro.cli` can be used to override values
in pre-instantiated configuration objects.

.. _example-01_dataclasses_defaults:

Dataclasses + Defaults
----------------------

The :code:`default=` argument can be used to override default values in
dataclass types.

.. note::

    When ``default=`` is used, we advise against mutation of configuration
    objects from a dataclass's :code:`__post_init__` method [#f1]_. In the
    example below, :code:`__post_init__` would be called twice: once for the
    :code:`Args()` object provided as a default value and another time for the
    :code:`Args()` object instantiated by :func:`tyro.cli`. This can cause
    confusing behavior! Instead, we show below one example of how derived
    fields can be defined immutably.

.. [#f1] Official Python docs for ``__post_init__`` can be found in the
   :mod:`dataclasses` documentation.

.. code-block:: python
    :linenos:

    # 01_dataclasses_defaults.py
    import dataclasses

    import tyro


    @dataclasses.dataclass
    class Args:
        """Description.
        This should show up in the helptext!"""

        string: str
        """A string field."""

        reps: int = 3
        """A numeric field, with a default value."""

        @property
        def derived_field(self) -> str:
            return ", ".join([self.string] * self.reps)


    if __name__ == "__main__":
        args = tyro.cli(
            Args,
            default=Args(
                string="default string",
                # A sentinel that marks this field as required on the CLI.
                reps=tyro.MISSING,
            ),
        )
        print(args.derived_field)

.. code-block:: console

    $ python ./01_dataclasses_defaults.py --help
    usage: 01_dataclasses_defaults.py [-h] [--string STR] --reps INT
    
    Description. This should show up in the helptext!
    
    ╭─ options ─────────────────────────────────────────────────────────────╮
    │ -h, --help          show this help message and exit                   │
    │ --string STR        A string field. (default: 'default string')       │
    │ --reps INT          A numeric field, with a default value. (required) │
    ╰───────────────────────────────────────────────────────────────────────╯
    
.. code-block:: console

    $ python ./01_dataclasses_defaults.py --reps 3
    default string, default string, default string
    
.. code-block:: console

    $ python ./01_dataclasses_defaults.py --string hello --reps 5
    hello, hello, hello, hello, hello
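
The double-call pitfall described in the note above can be reproduced without
:mod:`tyro` at all. In the sketch below (names are illustrative, not from the
example), :code:`__post_init__` mutates a field; constructing a second
instance from the first one's field values, as :func:`tyro.cli` effectively
does with a ``default=`` object, compounds the mutation:

.. code-block:: python

    import dataclasses

    calls = []


    @dataclasses.dataclass
    class Config:
        string: str = "hello"

        def __post_init__(self) -> None:
            # Runs on *every* construction, including re-construction from
            # another instance's field values.
            calls.append(self.string)
            self.string = self.string + "!"


    default = Config()  # First __post_init__ call.
    override = Config(string=default.string)  # Second call: "!" appended again.
    print(override.string)  # hello!!

This is why the example above computes :code:`derived_field` in a
:code:`@property` instead of assigning it from :code:`__post_init__`.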
    
.. _example-02_overriding_yaml:

Overriding YAML Configs
-----------------------

:mod:`tyro` understands a wide range of data structures, including standard
dictionaries and lists. If you have a library of existing YAML files that you
want to use, :func:`tyro.cli` can help override values within them.

.. note::

    We recommend dataclass configs for new projects.

.. code-block:: python
    :linenos:

    # 02_overriding_yaml.py
    import yaml

    import tyro

    # YAML configuration. This could also be loaded from a file! Environment
    # variables are an easy way to select between different YAML files.
    default_yaml = r"""
    exp_name: test
    optimizer:
        learning_rate: 0.0001
        type: adam
    training:
        batch_size: 32
        num_steps: 10000
        checkpoint_steps:
        - 500
        - 1000
        - 1500
    """.strip()

    if __name__ == "__main__":
        # Convert our YAML config into a nested dictionary.
        default_config = yaml.safe_load(default_yaml)

        # Override fields in the dictionary.
        overridden_config = tyro.cli(dict, default=default_config)

        # Print the overridden config.
        overridden_yaml = yaml.safe_dump(overridden_config)
        print(overridden_yaml)

.. code-block:: console

    $ python ./02_overriding_yaml.py --help
    usage: 02_overriding_yaml.py [-h] [OPTIONS]
    
    ╭─ options ───────────────────────────────────────────────╮
    │ -h, --help              show this help message and exit │
    │ --exp-name STR          (default: test)                 │
    ╰─────────────────────────────────────────────────────────╯
    ╭─ optimizer options ─────────────────────────────────────╮
    │ --optimizer.learning-rate FLOAT                         │
    │                         (default: 0.0001)               │
    │ --optimizer.type STR    (default: adam)                 │
    ╰─────────────────────────────────────────────────────────╯
    ╭─ training options ──────────────────────────────────────╮
    │ --training.batch-size INT                               │
    │                         (default: 32)                   │
    │ --training.num-steps INT                                │
    │                         (default: 10000)                │
    │ --training.checkpoint-steps [INT [INT ...]]             │
    │                         (default: 500 1000 1500)        │
    ╰─────────────────────────────────────────────────────────╯
    
.. code-block:: console

    $ python ./02_overriding_yaml.py --training.checkpoint-steps 300 1000 9000
    exp_name: test
    optimizer:
      learning_rate: 0.0001
      type: adam
    training:
      batch_size: 32
      checkpoint_steps:
      - 300
      - 1000
      - 9000
      num_steps: 10000
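
Conceptually, the command-line overrides above behave like a deep merge into
the default nested dictionary. Here is a rough stdlib-only sketch of that
merge; this is an illustration of the behavior, not :mod:`tyro`'s actual
implementation:

.. code-block:: python

    def deep_update(defaults: dict, overrides: dict) -> dict:
        """Recursively merge `overrides` into a copy of `defaults`."""
        merged = dict(defaults)
        for key, value in overrides.items():
            if isinstance(value, dict) and isinstance(merged.get(key), dict):
                merged[key] = deep_update(merged[key], value)
            else:
                merged[key] = value
        return merged


    default_config = {
        "exp_name": "test",
        "optimizer": {"learning_rate": 0.0001, "type": "adam"},
        "training": {"batch_size": 32, "num_steps": 10000},
    }
    overridden = deep_update(default_config, {"training": {"num_steps": 20000}})
    print(overridden["training"])  # {'batch_size': 32, 'num_steps': 20000}

Only the overridden leaf changes; sibling keys like :code:`batch_size` keep
their default values.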
    
    
.. _example-03_choosing_base_configs:

Choosing Base Configs
---------------------

One common pattern is to have a set of "base" configurations, which can be
selected from and then overridden. This is often implemented with a set of
configuration files (e.g., YAML files). With :mod:`tyro`, we can instead
define each base configuration as a separate Python object.

After creating the base configurations, we can use the CLI to select one of
them and then override (existing) or fill in (missing) values.

The helper function used here, :func:`tyro.extras.overridable_config_cli`, is
a lightweight wrapper over :func:`tyro.cli` and its Union-based subcommand
syntax.

.. code-block:: python
    :linenos:

    # 03_choosing_base_configs.py
    from dataclasses import dataclass
    from typing import Callable, Literal

    from torch import nn

    import tyro


    @dataclass(frozen=True)
    class ExperimentConfig:
        # Dataset to run experiment on.
        dataset: Literal["mnist", "imagenet-50"]

        # Model size.
        num_layers: int
        units: int

        # Batch size.
        batch_size: int

        # Total number of training steps.
        train_steps: int

        # Random seed.
        seed: int

        # Not specifiable via the commandline.
        activation: Callable[[], nn.Module]


    # We could also define this library using separate YAML files (similar to
    # `config_path`/`config_name` in Hydra), but staying in Python enables
    # seamless type checking + IDE support.
    default_configs = {
        "small": (
            "Small experiment.",
            ExperimentConfig(
                dataset="mnist",
                batch_size=2048,
                num_layers=4,
                units=64,
                train_steps=30_000,
                seed=0,
                activation=nn.ReLU,
            ),
        ),
        "big": (
            "Big experiment.",
            ExperimentConfig(
                dataset="imagenet-50",
                batch_size=32,
                num_layers=8,
                units=256,
                train_steps=100_000,
                seed=0,
                activation=nn.GELU,
            ),
        ),
    }

    if __name__ == "__main__":
        config = tyro.extras.overridable_config_cli(default_configs)
        print(config)

Overall helptext:

.. code-block:: console

    $ python ./03_choosing_base_configs.py --help
    usage: 03_choosing_base_configs.py [-h] {small,big}
    
    ╭─ options ──────────────────────────────────────────╮
    │ -h, --help         show this help message and exit │
    ╰────────────────────────────────────────────────────╯
    ╭─ subcommands ──────────────────────────────────────╮
    │ {small,big}                                        │
    │     small          Small experiment.               │
    │     big            Big experiment.                 │
    ╰────────────────────────────────────────────────────╯
    
The "small" subcommand:

.. code-block:: console

    $ python ./03_choosing_base_configs.py small --help
    usage: 03_choosing_base_configs.py small [-h] [SMALL OPTIONS]
    
    Small experiment.
    
    ╭─ options ──────────────────────────────────────────────────────────────────╮
    │ -h, --help              show this help message and exit                    │
    │ --dataset {mnist,imagenet-50}                                              │
    │                         Dataset to run experiment on. (default: mnist)     │
    │ --num-layers INT        Model size. (default: 4)                           │
    │ --units INT             Model size. (default: 64)                          │
    │ --batch-size INT        Batch size. (default: 2048)                        │
    │ --train-steps INT       Total number of training steps. (default: 30000)   │
    │ --seed INT              Random seed. (default: 0)                          │
    │ --activation {fixed}    Not specifiable via the commandline. (fixed to:    │
    │                         <class 'torch.nn.modules.activation.ReLU'>)        │
    ╰────────────────────────────────────────────────────────────────────────────╯
    
.. code-block:: console

    $ python ./03_choosing_base_configs.py small --seed 94720
    ExperimentConfig(dataset='mnist', num_layers=4, units=64, batch_size=2048, train_steps=30000, seed=94720, activation=<class 'torch.nn.modules.activation.ReLU'>)
    
The "big" subcommand:

.. code-block:: console

    $ python ./03_choosing_base_configs.py big --help
    usage: 03_choosing_base_configs.py big [-h] [BIG OPTIONS]
    
    Big experiment.
    
    ╭─ options ──────────────────────────────────────────────────────────────────╮
    │ -h, --help              show this help message and exit                    │
    │ --dataset {mnist,imagenet-50}                                              │
    │                         Dataset to run experiment on. (default:            │
    │                         imagenet-50)                                       │
    │ --num-layers INT        Model size. (default: 8)                           │
    │ --units INT             Model size. (default: 256)                         │
    │ --batch-size INT        Batch size. (default: 32)                          │
    │ --train-steps INT       Total number of training steps. (default: 100000)  │
    │ --seed INT              Random seed. (default: 0)                          │
    │ --activation {fixed}    Not specifiable via the commandline. (fixed to:    │
    │                         <class 'torch.nn.modules.activation.GELU'>)        │
    ╰────────────────────────────────────────────────────────────────────────────╯
    
.. code-block:: console

    $ python ./03_choosing_base_configs.py big --seed 94720
    ExperimentConfig(dataset='imagenet-50', num_layers=8, units=256, batch_size=32, train_steps=100000, seed=94720, activation=<class 'torch.nn.modules.activation.GELU'>)
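
Because :code:`ExperimentConfig` is frozen, overriding a base configuration is
conceptually the same as selecting an entry from the dictionary and applying
:func:`dataclasses.replace`. Below is a stdlib-only sketch, with a simplified
field set, of what the :code:`small --seed 94720` invocation above does:

.. code-block:: python

    import dataclasses


    @dataclasses.dataclass(frozen=True)
    class ExperimentConfig:
        dataset: str
        batch_size: int
        seed: int


    base_configs = {
        "small": ExperimentConfig(dataset="mnist", batch_size=2048, seed=0),
        "big": ExperimentConfig(dataset="imagenet-50", batch_size=32, seed=0),
    }

    # Select the base config by name, then override a field immutably.
    config = dataclasses.replace(base_configs["small"], seed=94720)
    print(config)  # ExperimentConfig(dataset='mnist', batch_size=2048, seed=94720)

The entry in :code:`base_configs` is untouched, so repeated invocations always
start from the same defaults.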