#869 Add DagsHub Logger to Super Gradients

Merged
Ghost merged 1 commit into Deci-AI:master from timho102003:dagshub_logger
from typing import Callable, Any

from super_gradients.common.abstractions.abstract_logger import get_logger

logger = get_logger(__name__)


def wrap_with_warning(cls: Callable, message: str) -> Any:
    """
    Emits a warning when the target class or function is called.

    >>> from super_gradients.training.utils.deprecated_utils import wrap_with_warning
    >>> from super_gradients.training.utils.callbacks import EpochStepWarmupLRCallback, BatchStepLinearWarmupLRCallback
    >>>
    >>> LR_WARMUP_CLS_DICT = {
    >>>     "linear": wrap_with_warning(
    >>>         EpochStepWarmupLRCallback,
    >>>         message="Parameter `linear` is deprecated and will be removed in the next SG release. Please use `linear_epoch` instead.",
    >>>     ),
    >>>     "linear_epoch": EpochStepWarmupLRCallback,
    >>> }

    :param cls:     A class or function to wrap
    :param message: A message to emit when this class is called
    :return:        A factory method that returns the wrapped class
    """

    def _inner_fn(*args, **kwargs):
        logger.warning(message)
        return cls(*args, **kwargs)

    return _inner_fn
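
For context, here is a minimal sketch of how the factory behaves at call time. `OldCallback` and the message text are hypothetical placeholders for this illustration, not part of Super Gradients:

# Minimal usage sketch; `OldCallback` is a hypothetical stand-in for any
# deprecated class, not an actual SG callback.
class OldCallback:
    def __init__(self, lr: float):
        self.lr = lr

deprecated_factory = wrap_with_warning(
    OldCallback,
    message="`OldCallback` is deprecated; please use `NewCallback` instead.",
)

# Calling the factory logs the deprecation warning, then constructs the
# wrapped class with the same arguments, so existing configs keep working.
instance = deprecated_factory(lr=0.01)
assert isinstance(instance, OldCallback) and instance.lr == 0.01

Because `_inner_fn` forwards `*args` and `**kwargs` unchanged, registering the wrapped factory under a deprecated key (as in `LR_WARMUP_CLS_DICT` above) is a drop-in replacement for the class itself.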