################################################################################
# Copyright (c) 2019-2021, NVIDIA CORPORATION. All rights reserved.
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
################################################################################
Prerequisites:
- DeepStreamSDK 5.1
- Python 3.6
- Gst-python

# The DeepStream msgbroker supports sending messages to Azure (MQTT) IoT Hub, Kafka and AMQP brokers (RabbitMQ)
Dependencies
------------
$ sudo apt-get update

Azure IoT:
----------
$ sudo apt-get install -y libcurl4-openssl-dev libssl-dev uuid-dev libglib2.0 libglib2.0-dev
# If your host machine is x86 and running Ubuntu 18.04, additionally install the below
$ sudo apt-get install -y libcurl3

Kafka:
------
$ sudo apt-get install libglib2.0 libglib2.0-dev
$ sudo apt-get install libjansson4 libjansson-dev
$ sudo apt-get install librdkafka1=0.11.3-1build1

AMQP (rabbitmq):
----------------
Install the rabbitmq-c library
------------------------------
$ git clone -b v0.8.0 --recursive https://github.com/alanxz/rabbitmq-c.git
$ cd rabbitmq-c
$ mkdir build && cd build
$ cmake ..
$ cmake --build .
$ sudo cp librabbitmq/librabbitmq.so.4 /opt/nvidia/deepstream/deepstream-<version>/lib/
$ sudo ldconfig
SETUP:
1. Use the --proto-lib or -p command line option to set the path of the adaptor library.
   The adaptor libraries can be found at /opt/nvidia/deepstream/deepstream-<version>/lib
     kafka lib           - libnvds_kafka_proto.so
     azure device client - libnvds_azure_proto.so
     AMQP lib            - libnvds_amqp_proto.so

2. Use the --conn-str command line option as required to set the connection to the backend server.
   For Azure - Full Azure connection string
   For Kafka - Connection string of format: host;port;topic
   For AMQP  - Connection string of format: host;port;username. Password to be provided in cfg_amqp.txt
   Provide the connection string under quotes, e.g. --conn-str="host;port;topic"

3. Use the --topic or -t command line option to provide the message topic (optional).
   The Kafka message adaptor also has the topic param embedded within the connection string format.
   In that case, the "topic" from the command line should match the topic within the connection string.

4. Use the --schema or -s command line option to select the message schema (optional).
   The JSON payload sent to the cloud can be generated using different message schemas:
     schema = 0; Full message schema with separate payload per object (Default)
     schema = 1; Minimal message with multiple objects in a single payload.
   Refer to the user guide for more details about the message schemas.

5. Use --no-display to disable the display.

6. Use the --cfg-file or -c command line option to set the adaptor configuration file.
   This is optional if the connection string has all relevant information.

   For Kafka, use cfg_kafka.txt as a reference. This file is used to define the
   partition key field to be used while sending messages to the Kafka broker.
   Refer to the Kafka Protocol Adaptor section in the DeepStream 4.0 Plugin Manual for
   more details about using this config option. The partition-key setting within cfg_kafka.txt
   should be set based on the schema type selected with the --schema option: set it to
   "sensor.id" for the Full message schema, and to "sensorId" for the Minimal message schema.
   For Azure, use cfg_azure.txt as a reference. It has the following section:
   [message-broker]
   #connection_str = HostName=<my-hub>.azure-devices.net;DeviceId=<device_id>;SharedAccessKey=<my-policy-key>
   #shared_access_key = <my-policy-key>

   Azure device connection string:
   -------------------------------
   You can provide the connection_str within cfg_azure.txt in the format:
   connection_str = HostName=<my-hub>.azure-devices.net;DeviceId=<device_id>;SharedAccessKey=<my-policy-key>
   OR
   optionally, you can pass in part of the required connection string with the --conn-str option in the format: "url;port;device-id"
   AND provide the shared_access_key within cfg_azure.txt
   shared_access_key = <my-policy-key>

   For AMQP, use cfg_amqp.txt as a reference. It has the following default options:
   [message-broker]
   password = guest
   #optional
   hostname = localhost
   username = guest
   port = 5672
   exchange = amq.topic
   topic = topicname

   AMQP connection string:
   -----------------------
   Provide the hostname, username and password details in cfg_amqp.txt
   OR
   optionally, you can pass in part of the required connection string with the --conn-str option in the format: "hostname;port;username"
   AND provide the password within cfg_amqp.txt
   password = <your_amqp_broker_password>
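   For example, using the default values shown above (adjust them for your broker), the
   connection could be passed as:
   --conn-str="localhost;5672;guest"
   with the password still supplied through cfg_amqp.txt.
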
   NOTE:
   - DO NOT delete the line [message-broker] in the cfg file. It's the section identifier used for parsing.
   - For Azure & AMQP:
     If you use the --conn-str command line option as in step 2), make sure to provide the password details in the cfg file,
     OR
     you can skip the --conn-str command line option and provide the full connection details within the cfg file.

7. Enable logging:
   Go through the README below to set up & enable logs for the messaging libraries (kafka, azure, amqp):
   $ cat ../../../tools/nvds_logger/README

To run:
$ python3 deepstream_test_4.py -i <H264 filename> -p <Proto adaptor library> --conn-str=<Connection string> -s <0/1>
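
For example, to stream one of the DeepStream sample clips to a local Kafka broker with the
Full message schema (the clip path, broker address and topic below are only illustrative;
substitute your own):

$ python3 deepstream_test_4.py \
    -i /opt/nvidia/deepstream/deepstream-<version>/samples/streams/sample_720p.h264 \
    -p /opt/nvidia/deepstream/deepstream-<version>/lib/libnvds_kafka_proto.so \
    --conn-str="localhost;9092;test-topic" -s 0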

NOTE: More details about the message adapters can be found in the READMEs inside DS_PACKAGE_DIR/sources/libs/*_protocol_adaptor

This document describes the deepstream-test4 sample application.

This sample builds on top of the deepstream-test1 sample to demonstrate how to:
* Use the "nvmsgconv" and "nvmsgbroker" plugins in the pipeline.
* Create NVDS_META_EVENT_MSG type of meta and attach it to the buffer.
* Use NVDS_META_EVENT_MSG for different types of objects, e.g. vehicle, person etc.
* Provide copy / free functions if the metadata is extended through the "extMsg" field.

The "nvmsgconv" plugin uses NVDS_META_EVENT_MSG type of metadata from the buffer
and generates the "DeepStream Schema" payload in JSON format. Static properties
of the schema are read from the configuration file in the form of key-value pairs.
Check dstest4_msgconv_config.txt for reference. The generated payload is attached
as NVDS_META_PAYLOAD type metadata to the buffer.

The "nvmsgbroker" plugin extracts NVDS_META_PAYLOAD type of metadata from the buffer
and sends that payload to the server using the protocol adaptor APIs.

Generating custom metadata for different types of objects:
In addition to the common fields provided in the NvDsEventMsgMeta structure, the user can
also create custom objects and attach them to the buffer as NVDS_META_EVENT_MSG metadata.
To do that, NvDsEventMsgMeta provides the "extMsg" and "extMsgSize" fields. The user can
create a custom structure, fill that structure, assign the pointer of that structure as
"extMsg" and set "extMsgSize" accordingly.
If the custom object contains fields that can't simply be mem-copied, the user should
also provide functions to copy and free those objects.

Refer to generate_event_msg_meta() to see how to use the "extMsg" and "extMsgSize"
fields for custom objects, how to provide copy/free functions, and how to attach that
object to the buffer as metadata.
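
As an illustration of that flow, the sketch below mirrors the pattern used in
deepstream_test_4.py for attaching NvDsEventMsgMeta as user meta. The
generate_event_msg_meta(), meta_copy_func() and meta_free_func() callbacks are the ones
defined in the sample, and the pyds helper names may differ between bindings versions,
so treat this as a sketch rather than a drop-in snippet:

    # Sketch only: follows deepstream_test_4.py; generate_event_msg_meta(),
    # meta_copy_func() and meta_free_func() are defined in the sample itself.
    import pyds

    def attach_event_msg_meta(batch_meta, frame_meta, frame_number, obj_meta):
        # Allocate an NvDsEventMsgMeta and fill the common fields from the
        # detected object.
        msg_meta = pyds.alloc_nvds_event_msg_meta()
        msg_meta.bbox.top = obj_meta.rect_params.top
        msg_meta.bbox.left = obj_meta.rect_params.left
        msg_meta.bbox.width = obj_meta.rect_params.width
        msg_meta.bbox.height = obj_meta.rect_params.height
        msg_meta.frameId = frame_number
        msg_meta.confidence = obj_meta.confidence

        # The sample's generate_event_msg_meta() is where a custom struct can be
        # allocated and hooked in through the extMsg / extMsgSize fields.
        msg_meta = generate_event_msg_meta(msg_meta, obj_meta.class_id)

        # Attach the event message meta to the frame as user meta so that
        # "nvmsgconv" can pick it up. Copy/free callbacks are registered
        # because extMsg points to memory that cannot simply be mem-copied.
        user_event_meta = pyds.nvds_acquire_user_meta_from_pool(batch_meta)
        if user_event_meta:
            user_event_meta.user_meta_data = msg_meta
            user_event_meta.base_meta.meta_type = pyds.NvDsMetaType.NVDS_EVENT_MSG_META
            pyds.user_copyfunc(user_event_meta, meta_copy_func)
            pyds.user_releasefunc(user_event_meta, meta_free_func)
            pyds.nvds_add_user_meta_to_frame(frame_meta, user_event_meta)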

NOTE: By default this app sends a message for the first object of every 30th frame. To
change the frequency of messages, modify the following line in the source code accordingly:
if (is_first_object and not (frame_number % 30)) should be changed to:
if (not (frame_number % 30))  # To get all objects of a single frame