#869 Add DagsHub Logger to Super Gradients

Merged
Ghost merged 1 commit into Deci-AI:master from timho102003:dagshub_logger
from typing import Dict, Optional

import hydra.utils
from omegaconf import DictConfig

from super_gradients.common.environment.cfg_utils import load_arch_params


def get_arch_params(config_name: str, overriding_params: Dict = None, recipes_dir_path: Optional[str] = None) -> DictConfig:
    """
    Create an arch parameters dictionary, taking defaults from yaml
    files in src/super_gradients/recipes/arch_params.

    :param config_name:       Name of the yaml to load (e.g. "resnet18_cifar_arch_params")
    :param overriding_params: Dict, dictionary-like object containing entries to override.
    :param recipes_dir_path:  Optional. Main directory where all recipes are stored. (e.g. ../super_gradients/recipes)
                              This directory should include an "arch_params" folder,
                              which itself should include the config file named after config_name.
    """
    overriding_params = overriding_params if overriding_params else dict()
    arch_params = load_arch_params(config_name=config_name, recipes_dir_path=recipes_dir_path)
    arch_params = hydra.utils.instantiate(arch_params)
    arch_params.update(**overriding_params)
    return arch_params
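The merge order in `get_arch_params` (recipe defaults first, caller overrides last) can be sketched with plain dicts. This is a standalone sketch, not the library's code: the YAML loading and `hydra.utils.instantiate` steps are replaced by a hardcoded defaults dict, and the parameter values are hypothetical rather than taken from any real arch_params recipe:

```python
def merge_arch_params(defaults: dict, overriding_params: dict = None) -> dict:
    # Mirrors the merge step of get_arch_params: entries in
    # overriding_params win over the recipe defaults.
    overriding_params = overriding_params if overriding_params else dict()
    merged = dict(defaults)
    merged.update(**overriding_params)
    return merged


# Hypothetical recipe defaults (illustrative only):
defaults = {"num_classes": 10, "dropout": 0.0}

# Override a single entry; untouched defaults pass through.
print(merge_arch_params(defaults, {"dropout": 0.2}))
# → {'num_classes': 10, 'dropout': 0.2}
```

Because `update` is applied after loading, an override always takes precedence, and passing `overriding_params=None` simply returns the defaults unchanged.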