v1.5.15

actions-user committed Jan 22, 2025
1 parent 6123232 commit e3aabe9
Showing 5 changed files with 138 additions and 27 deletions.
3 changes: 3 additions & 0 deletions RELEASE.md
@@ -1,5 +1,8 @@
# Release Notes

## v1.5.15

- Update deployments README for readability

## v1.5.14

2 changes: 1 addition & 1 deletion VERSION
@@ -1 +1 @@
v1.5.14
v1.5.15
130 changes: 119 additions & 11 deletions deployment-examples/SIOOnDemandAnalytics/README.md
@@ -2,27 +2,135 @@

## Summary

This sample implements a REST API based analytics, allowing monitoring of an RTSP stream, and retrieving a most-recent image with associated analytics, as well as providing an inline image for analytics.
This sample implements a REST API-based analytics system, allowing monitoring of an RTSP stream and retrieving the most recent image with associated analytics, as well as providing an inline image for analytics.

## Implementation

This sample is backed by two separate pipelines, executing within the same process. One pipeline is responsible for monitoring an RTSP stream, and emitting it's output to a shared storage volume, as well as maintaining the lifecycle of that output. The output is generated and managed with an extension module.
This sample is backed by two separate pipelines, executing within the same process:

The other pipeline is a vanilla Folder Watch pipeline.
1. **RTSP Monitoring Pipeline**: Responsible for monitoring an RTSP stream, outputting its results to a shared storage volume, and managing the lifecycle of the output. The output is generated and managed using an extension module.

REST API Flask module is bringing it all together, providing a front-end implementation of the API. Upon request for an analytics, it finds the most recent file generated by the live pipeline, and serves JSON containing the image and relevant analytics. Another flavor of that API will serve an image/jpg, which is useful for debugging.
2. **Folder Watch Pipeline**: A standard pipeline that monitors a folder for changes.

The REST API is implemented using the Flask module, providing a front-end interface for the system. Upon a request for analytics, the API locates the most recent file generated by the live pipeline and serves JSON containing the image and relevant analytics. Another API variant serves an `image/jpg`, which is useful for debugging.
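A minimal sketch of that front end, written with Flask, is shown below. The shared-volume path, per-frame JSON file layout, and the helper name `find_latest_result` are assumptions standing in for the extension module's actual output handling:

```python
# Minimal sketch of the REST front end. Assumes the live pipeline drops
# per-frame JSON results (image plus analytics) into SHARED_DIR; the path
# and file layout are illustrative, not the actual extension module output.
import glob
import json
import os

from flask import Flask, jsonify

app = Flask(__name__)
SHARED_DIR = "/data/shared/analytics"  # assumed shared storage volume


def find_latest_result():
    """Return the most recently written result file, or None if none exist."""
    files = glob.glob(os.path.join(SHARED_DIR, "*.json"))
    return max(files, key=os.path.getmtime) if files else None


@app.route("/alpr/v1.0/locations/<int:location_id>")
def latest_analytics(location_id):
    latest = find_latest_result()
    if latest is None:
        return jsonify({"error": "no analytics available yet"}), 404
    with open(latest) as f:
        payload = json.load(f)
    payload.setdefault("location", {"id": location_id})  # echo the requested location
    return jsonify(payload)
```

The `image/jpg` variant mentioned above would instead decode the stored image and return it with a `Content-Type: image/jpeg` header.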

## Special Instructions for Jetson Devices

When deploying on Jetson devices, additional configuration is required:

1. Set the `SIO_DOCKER_TAG_VARIANT` environment variable to the appropriate value:
```bash
export SIO_DOCKER_TAG_VARIANT=-r32.7.3-arm64v8
```
2. Configure Docker to use the NVIDIA runtime:
```bash
export SIO_DOCKER_RUNTIME=nvidia
```
3. Modify `docker-compose` to add the network:
```yaml
networks:
  sh-device-ui_sh-ui-net:
    external: true
```
4. Add the following to the analytics service in `docker-compose` (a combined sketch showing where steps 3 and 4 land in the compose file appears after these steps):
```yaml
networks:
  sh-device-ui_sh-ui-net:
    aliases:
      - sioanalytics
```
5. Update the pipeline configuration to use the camera URL:
```json
"VIDEO_IN": "rtsp://sh-ui-backend:8555/live"
```
To automate this step, use a `sed` command to replace the `VIDEO_IN` URL in `pipelines.json`:
```bash
sed -i 's|"VIDEO_IN".*:.*"rtsp://.*"|"VIDEO_IN": "rtsp://sh-ui-backend:8555/live"|' config/analytics/pipelines.json
```
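
Putting steps 3 and 4 together, the relevant parts of `docker-compose.yml` take roughly the following shape. This is only a sketch; the `analytics` service name is a placeholder for the actual analytics service defined in the shipped compose file:

```yaml
# Sketch of where the network settings from steps 3 and 4 fit.
# The service name "analytics" is a placeholder, not the shipped compose file.
services:
  analytics:
    # ... existing image, runtime, volumes, and environment settings ...
    networks:
      sh-device-ui_sh-ui-net:
        aliases:
          - sioanalytics

networks:
  sh-device-ui_sh-ui-net:
    external: true
```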

## Testing the API

After everything is running, you can test the API by running:
```bash
curl http://<camera_ip or localhost>:8080/alpr/v1.0/locations/100
```
This will return an output similar to the following:
```json
{
  "location": {
    "id": 100
  },
  "streamDetail": [
    {
      "image": [
        {
          "id": 1,
          "imageData": "",
          "licensePlate": {
            "lpRegion": "unknown",
            "lpString": "unknown",
            "lpnConfidence": 0
          },
          "vehicle": {
            "color": "unknown",
            "make": "unknown",
            "model": "unknown"
          }
        }
      ],
      "preferredImage": 1,
      "stream": {
        "id": 1
      }
    }
  ]
}
```
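
For a quick field-level check, the response can be piped through `jq` (assuming the same host and location ID as above); for example, to print the detected plate string:

```bash
curl -s http://localhost:8080/alpr/v1.0/locations/100 \
  | jq -r '.streamDetail[0].image[0].licensePlate.lpString'
```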

## Client

A sample client is implemented in `SIOOnDemandAnalytics/clients/OnDemandTest.py`.
The client can be ran using `python3 ./clients/OnDemandTest.py [-i inputImage] [-o outputFolder]`
A sample client is implemented in `SIOOnDemandAnalytics/clients/OnDemandTest.py`. You can run the client using:
```bash
python3 ./clients/OnDemandTest.py [-i inputImage] [-o outputFolder]
```
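
For reference, the core of such a client can be sketched in a few lines of Python. The sketch below is not the bundled `OnDemandTest.py`; it simply calls the documented endpoint, writes the returned JSON to an output folder, and decodes the base64 `imageData` field when present (host, port, and location ID are assumptions):

```python
# Illustrative client sketch, not the bundled OnDemandTest.py.
# Fetches the latest analytics and saves the JSON plus the decoded image.
import base64
import json
import os
import urllib.request

API_URL = "http://localhost:8080/alpr/v1.0/locations/100"  # assumed host, port, and location
OUT_DIR = "./output"

os.makedirs(OUT_DIR, exist_ok=True)
with urllib.request.urlopen(API_URL) as resp:
    payload = json.load(resp)

with open(os.path.join(OUT_DIR, "analytics.json"), "w") as f:
    json.dump(payload, f, indent=2)

image_data = payload["streamDetail"][0]["image"][0]["imageData"]
if image_data:  # empty string when no image is available
    with open(os.path.join(OUT_DIR, "latest.jpg"), "wb") as f:
        f.write(base64.b64decode(image_data))
```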

## OS Compatibility

`SIO_DOCKER_TAG_VARIANT` environment variable used in `docker-compose` controls the flavor of SIO analytics container image. While on x86 systems thing largely work without setting it, on Jetson-based system, set it to the value most compatible with your base OS.
The `SIO_DOCKER_TAG_VARIANT` environment variable in `docker-compose` controls the flavor of the SIO analytics container image. For Jetson-based systems, it should be set based on your OS version:

- `-r32.4.3-arm64v8` (built for hardware running Jetpack 4.4)
- `-r32.7.3-arm64v8` (built for hardware running Jetpack 4.6)
- `-r35.3.1-arm64v8` (work in progress, built for hardware running Jetpack 5.1)

For x86-based systems, use:

- `-amd64`
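
In the compose file the variable is typically appended to the image tag via variable interpolation. The fragment below is hypothetical; the image name, base tag, and service name are placeholders, not the actual values used by this deployment:

```yaml
# Hypothetical interpolation of SIO_DOCKER_TAG_VARIANT; names and tags are placeholders.
services:
  analytics:
    image: example.io/sio-analytics:v1.5${SIO_DOCKER_TAG_VARIANT:-}
```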

## Sample Pipeline Configuration

Below is an example of the RTSP section of the `pipelines.json` configuration:

```json
{
  "rtsp": {
    "pipeline": "./share/pipelines/VehicleAnalytics/VehicleAnalyticsRTSP.yaml",
    "restartPolicy": "restart",
    "parameters": {
      "VIDEO_IN": "rtsp://sh-ui-backend:8555/live",
      "boxFilterConfig": "/config/analytics/boxFilter.json",
      "detectionModel": "gen7es",
      "lptModel": "gen7es",
      "lptFilter": "['eu', 'us']",
      "lptMinConfidence": "0.7",
      "sourceId": "rtsp-stream-1",
      "lptPreferAccuracyToSpeed": "false",
      "fpsLimit": "2",
      "updateOnlyOnChange": "false",
      "splitMakeModel": "true",
      "extensionModules": "/config/analytics/extension.py",
      "extensionConfigurations": "/config/analytics/extensionConfig.json"
    }
  }
}
```
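
After editing `pipelines.json` (for example with the `sed` command shown earlier), a quick syntax check helps catch malformed JSON before the pipeline restarts:

```bash
python3 -m json.tool config/analytics/pipelines.json
```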

26 changes: 13 additions & 13 deletions docs/schemas/anypipe/anypipe.html
@@ -29,7 +29,7 @@ <h1>
<br/>
<span class="description">
<p>
Analytics data sent by the Sighthound video/image analysis pipeline. This data is sent based on configuration when the number of detected objects or attributes of detected objects changes, the confidence of detected objects or their attributes improves, or a configurable timeout occurs.
</p>
</span>
<span class="badge badge-info no-additional">
@@ -229,7 +229,7 @@ <h2 class="mb-0">
<br/>
<span class="description">
<p>
The dimensions (width and height) of the frame represented by frameId. Also used as the coordinate base for all bounding box coordinates.
</p>
</span>
<div class="accordion" id="accordionframeDimensions_w">
@@ -499,7 +499,7 @@ <h2 class="handle">
<br/>
<div class="description collapse" id="collapseDescription_metaClasses_pattern1">
<p>
An plural MetaClass name. Supported MetaClasses
<br/>
include:
<br/>
@@ -583,9 +583,9 @@ <h2 class="handle">
<p>
A Unique ID representing this object, used to map
<br/>
additional object properties. This ID is guaranteed unique
<br/>
for each object, regardless of streamId. It will change the object drops out of
<br/>
detection/tracking
</p>
@@ -708,7 +708,7 @@ <h2 class="mb-0">
<br/>
<span class="description">
<p>
Object specific class returned by the model. For objects of the vehicles metaclass this may include car, truck, bus, motorbike, etc based on model capabilities
</p>
</span>
</div>
@@ -890,7 +890,7 @@ <h2 class="handle">
<br/>
<div class="description collapse" id="collapseDescription_metaClasses_pattern1_pattern3_attributes_pattern1">
<p>
A map of attributes for this object. Not all atributes are supported for all object types. Example attributes include:
<br/>
color - The color of an object
<br/>
@@ -2345,7 +2345,7 @@ <h2 class="mb-0">
<br/>
<span class="description">
<p>
An object hash which uniquely identifies this object and associated attributes. Will change when attributes change. Reserved for future use
</p>
</span>
</div>
@@ -2958,11 +2958,11 @@ <h2 class="mb-0">
<p>
A map of maps describing an event type.
<br/>
- The top level map key is a name describing the event type. Supported types are presenceSensor, lineCrossingEvent, speedEvent.
<br/>
- The sub level map key is a Unique ID representing the event, used to map
<br/>
additional object properties. This ID is guaranteed unique
<br/>
for each event for a given stream ID.
</p>
@@ -3205,7 +3205,7 @@ <h2 class="handle">
<p>
Describes an event where one or more objects are present in a region of interest.
<br/>
The event starts when the first object enters a region of interest. Updates are sent for each change in status, with updateCount incremented for each update. When the last object exits and the region is empty, the sensor event will become immutable and will track the total amount of time at least one object was present in the region of interest. An entry of an object will start a new event and reset the updateCount to 1. Region definitons, object filtering and other items related to sensor definitions are tracked as a part of the sensorId associated with the event.
</p>
</div>
<div>
@@ -3932,7 +3932,7 @@ <h2 class="mb-0">
<br/>
<div class="description collapse" id="collapseDescription_sensorEvents_pattern1_pattern3_items_oneOf_i0_updateCount">
<p>
The cumulative number of updates sent for this sensor, starting with 1 for the initial update and incremented once for each update sent for each unique sensor event ID. An update refers to a change in the state of the sensor due to a corresponding sensor event (entry, exit, crossing, ...). For sensors which include multiple updates per sensor event (presense sensors), the updateCount will be reset to 1 to indicate the first update for a given event. For sensors (count) which only include 1 update per event, updateCount will be cumulative and count the total number of events per sensor.
</p>
</div>
<div>
@@ -5515,7 +5515,7 @@ <h2 class="mb-0">
<a href="https://github.com/coveooss/json-schema-for-humans">
json-schema-for-humans
</a>
on 2024-11-14 at 16:30:00 +0000
on 2025-01-22 at 17:55:07 +0000
</p>
</footer>
</body>
4 changes: 2 additions & 2 deletions services/sio/examples/camera/sio.json
@@ -3,7 +3,7 @@
"pipeline" : "./share/pipelines/TrafficAnalytics/TrafficAnalyticsRTSP.yaml",
"restartPolicy" : "restart",
"parameters" : {
"VIDEO_IN" : "rtsp://sh-camera-rtsp:8555/live",
"VIDEO_IN" : "rtsp://sh-ui-backend:8555/live",
"sourceId" : "camera1",
"recordTo":"/data/sighthound/media/output/video/camera1/",
"imageSaveDir":"/data/sighthound/media/output/image/camera1/",
@@ -15,4 +15,4 @@
"amqpErrorOnFailure":"true"
}
}
}
}
