#869 Add DagsHub Logger to Super Gradients

Merged
Ghost merged 1 commit into Deci-AI:master from timho102003:dagshub_logger
from super_gradients.training.utils.utils import Timer, HpmStruct, WrappedModel, convert_to_tensor, get_param, tensor_container_to_device, random_seed
from super_gradients.training.utils.checkpoint_utils import adapt_state_dict_to_fit_model_layer_names, raise_informative_runtime_error
from super_gradients.training.utils.version_utils import torch_version_is_greater_or_equal
from super_gradients.training.utils.config_utils import raise_if_unused_params, warn_if_unused_params
from super_gradients.training.utils.early_stopping import EarlyStop
from super_gradients.training.utils.pose_estimation import DEKRPoseEstimationDecodeCallback, DEKRVisualizationCallback

__all__ = [
    "Timer",
    "HpmStruct",
    "WrappedModel",
    "convert_to_tensor",
    "get_param",
    "tensor_container_to_device",
    "adapt_state_dict_to_fit_model_layer_names",
    "raise_informative_runtime_error",
    "random_seed",
    "torch_version_is_greater_or_equal",
    "raise_if_unused_params",
    "warn_if_unused_params",
    "EarlyStop",
    "DEKRPoseEstimationDecodeCallback",
    "DEKRVisualizationCallback",
]
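The file above only re-exports the training utilities; the DagsHub logger this PR adds would be selected through the trainer's logging configuration. Below is a minimal sketch of what enabling it could look like, assuming the logger is registered under the name "dagshub_sg_logger" and accepts a "dagshub_repository" parameter; the experiment name, repository path, and hyperparameters are placeholders, not taken from the PR.

from super_gradients import Trainer

# Hypothetical experiment name; any string works here.
trainer = Trainer(experiment_name="dagshub_logger_demo")

training_params = {
    "max_epochs": 10,          # placeholder hyperparameters
    "initial_lr": 1e-3,
    "loss": "cross_entropy",
    # Assumed registration name and parameters for the DagsHub logger:
    "sg_logger": "dagshub_sg_logger",
    "sg_logger_params": {
        "dagshub_repository": "<user>/<repo>",  # placeholder DagsHub repository
    },
}

# trainer.train(model=model, training_params=training_params,
#               train_loader=train_loader, valid_loader=valid_loader)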