################################################################################
# Copyright (c) 2020-2021, NVIDIA CORPORATION. All rights reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
################################################################################
Prerequisites:
- DeepStreamSDK 5.1
- NVIDIA Triton Inference Server
- Python 3.6
- Gst-python
- NumPy
To set up Triton Inference Server:

For x86_64 and Jetson Docker:
1. Use the provided docker container and follow the directions for
   Triton Inference Server in the SDK README --
   be sure to prepare the detector models.
2. Run the docker with this Python Bindings directory mapped.
3. Install the required Python packages inside the container:
   $ apt update
   $ apt install python3-gi python3-dev python3-gst-1.0 python3-numpy -y
For Jetson without Docker:
1. Install NumPy:
   $ apt update
   $ apt install python3-numpy
2. Follow the instructions in the DeepStream SDK README to set up
   Triton Inference Server:
   2.1 Compile and install the nvdsinfer_customparser
   2.2 Prepare at least the Triton detector models
3. Add the following to LD_PRELOAD:
   /usr/lib/aarch64-linux-gnu/libgomp.so.1
   This works around the TLS usage limitation described at
   https://gcc.gnu.org/bugzilla/show_bug.cgi?id=91938
4. Clear the GStreamer cache if pipeline creation fails:
   $ rm ~/.cache/gstreamer-1.0/*
To run the test app:
   $ python3 deepstream_ssd_parser.py <h264_elementary_stream>
This document describes the sample deepstream-ssd-parser application.
It demonstrates how to write a custom neural network output parser and
use it in the pipeline to extract meaningful insights from a video stream.
This example:
- Uses an SSD neural network running on Triton Inference Server
- Selects custom post-processing in the Triton Inference Server config file
- Parses the inference output into bounding boxes
- Performs post-processing on the generated boxes with NMS (non-maximum
  suppression); a minimal sketch of this step follows the list
- Adds the detected objects to the pipeline metadata for downstream processing
- Encodes the OSD output and saves it to an MP4 file; note that there is no
  visual output on screen
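
For illustration only, the NMS step above can be sketched in plain NumPy as a
greedy loop that keeps the highest-scoring box and drops any remaining box
whose overlap (IoU) with it exceeds a threshold. This is a generic sketch, not
the sample's actual post-processing code; the [left, top, width, height] box
layout and the 0.5 default threshold are assumptions.

    import numpy as np

    def nms(boxes, scores, iou_threshold=0.5):
        """Greedy non-maximum suppression.

        boxes  -- (N, 4) array of [left, top, width, height] (assumed layout)
        scores -- (N,) array of detection confidences
        Returns the indices of the boxes to keep.
        """
        if len(boxes) == 0:
            return []

        boxes = np.asarray(boxes, dtype=np.float32)
        x1 = boxes[:, 0]
        y1 = boxes[:, 1]
        x2 = boxes[:, 0] + boxes[:, 2]
        y2 = boxes[:, 1] + boxes[:, 3]
        areas = boxes[:, 2] * boxes[:, 3]

        order = np.argsort(scores)[::-1]   # highest confidence first
        keep = []
        while order.size > 0:
            i = order[0]
            keep.append(int(i))

            # Overlap of the chosen box with all remaining boxes.
            xx1 = np.maximum(x1[i], x1[order[1:]])
            yy1 = np.maximum(y1[i], y1[order[1:]])
            xx2 = np.minimum(x2[i], x2[order[1:]])
            yy2 = np.minimum(y2[i], y2[order[1:]])
            inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)

            iou = inter / (areas[i] + areas[order[1:]] - inter)
            # Keep only the boxes that do not overlap the chosen box too much.
            order = order[1:][iou <= iou_threshold]
        return keep

In the sample, this kind of filtering is applied after the raw Triton output
has been parsed into boxes, and only the surviving boxes are added to the
pipeline metadata.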
Known Issues:
1. On Jetson, if libgomp is not preloaded, this error may occur:

   (python3:21041): GStreamer-WARNING **: 14:35:44.113: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/libgstlibav.so': /usr/lib/aarch64-linux-gnu/libgomp.so.1: cannot allocate memory in static TLS block
   Unable to create Encoder

2. On Jetson Nano, ssd_inception_v2 is not expected to run with a GPU
   instance. Switch to a CPU instance when running on Nano by updating the
   config.pbtxt files in samples/trtis_model_repo:

   # Switch to CPU instance for Nano since memory might not be enough for
   # certain models.
   # Specify CPU instance.
   instance_group {
     count: 1
     kind: KIND_CPU
   }

   # Specify GPU instance.
   #instance_group {
   #  kind: KIND_GPU
   #  count: 1
   #  gpus: 0
   #}
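
For reference, the objects that this sample attaches to the pipeline metadata
can be read back downstream, for example from a buffer probe on the sink pad
of the on-screen-display element. The following is a minimal sketch under the
assumption that the pyds and GStreamer Python bindings listed in the
prerequisites are installed; the probe function name is illustrative.

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst
    import pyds

    def osd_sink_pad_buffer_probe(pad, info, u_data):
        """Print class id, confidence and box for each detected object."""
        gst_buffer = info.get_buffer()
        if not gst_buffer:
            return Gst.PadProbeReturn.OK

        batch_meta = pyds.gst_buffer_get_nvds_batch_meta(hash(gst_buffer))
        l_frame = batch_meta.frame_meta_list
        while l_frame is not None:
            frame_meta = pyds.NvDsFrameMeta.cast(l_frame.data)
            l_obj = frame_meta.obj_meta_list
            while l_obj is not None:
                obj_meta = pyds.NvDsObjectMeta.cast(l_obj.data)
                rect = obj_meta.rect_params
                print("frame %d: class %d conf %.2f box (%d, %d, %d, %d)"
                      % (frame_meta.frame_num, obj_meta.class_id,
                         obj_meta.confidence, rect.left, rect.top,
                         rect.width, rect.height))
                try:
                    l_obj = l_obj.next
                except StopIteration:
                    break
            try:
                l_frame = l_frame.next
            except StopIteration:
                break
        return Gst.PadProbeReturn.OK

Such a probe would be attached with
pad.add_probe(Gst.PadProbeType.BUFFER, osd_sink_pad_buffer_probe, 0)
on the chosen pad.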