Starting Multi-Graph Runner
Starts head detection on the frames of a previously configured video stream.
Note: This guide describes how to start MGR for running the UVAP Feature Demos only. To start MGR for production use, see the Operating the Multi-Graph Runner guide in the Operation Guide section.
Prerequisites
It is assumed that UVAP is properly configured. For more information on configuration, see Configuring UVAP.
For information on Multi-Graph Runner (MGR) configuration, see Configuring Multi-Graph Runner.
Starting the Multi-Graph Runner service
To start MGR:
Run the microservice:
Attention! Before starting this microservice, the command below silently stops and removes the Docker container named uvap_mgr if it already exists.
$ "${UVAP_HOME}"/scripts/run_mgr.sh -- --net=uvap
Attention! The first startup of MGR can take a few minutes, especially on Jetson TX2 or other small machines, because the neural networks are optimized for the given runtime environment. The optimized model files are then cached on the host (usually under ~/.cache/multi-graph-runner/). The script mounts this directory to /ultinous_app/cache in the container, and MGR is configured to use that location.
The output of the above command contains the following:
- Information about pulling the required Docker image
- The ID of the Docker container created
- The name of the Docker container created: uvap_mgr
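To confirm that the model cache described above is being populated after the first start, the cache directory can be listed on the host (the path below is the default mentioned above; adjust it if --cache-dir is overridden):
$ ls -lh ~/.cache/multi-graph-runner/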
There are additional optional parameters for the run_mgr.sh script to override defaults. The following list describes these options in detail; an example invocation follows the list:
- -- : any options after -- are passed to the docker container create command, which is called by the script
- --help : prints brief usage information for the script
- --models-dir : directory path of the AI models. Default value: ${UVAP_HOME}/models
- --config-dir : directory path of the configuration files. Default value: ${UVAP_HOME}/config/uvap_mgr
- --cache-dir : directory path of the model cache files. Default value: ~/.cache/multi-graph-runner
- --image-name : tag of the Docker image to use. The default value is determined by Git tags
- --license-data-file : data file of your UVAP license. Default value: ${UVAP_HOME}/license/license.txt
- --license-key-file : key file of your UVAP license. Default value: ${UVAP_HOME}/license/license.key
- --gpu-specification : NVIDIA® GPU specification for Docker. Controls which GPU cards should be visible to the container. See the NVIDIA Container Runtime documentation for the possible values. Default value:
  - if the license is bound to GPU information: the GPU UUID found in the license data file
  - otherwise: the GPU index 0
- --run-mode : determines how the service should be started. Possible values: background or foreground. Default value: background
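For example, the following invocation (a sketch; it assumes the script options are given before the -- separator, consistent with the usage shown above) starts MGR in the foreground and still passes the network option on to Docker:
$ "${UVAP_HOME}"/scripts/run_mgr.sh --run-mode foreground -- --net=uvap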
All video devices (/dev/video*) on the host (where MGR is started with run_mgr.sh) are mounted into the uvap_mgr container.
If prerecorded videos (stored on the local filesystem) are configured as streams to be analyzed, the files need to be mounted into the uvap_mgr container. This can be done by passing regular Docker mount parameters at the end of the above command line (after the -- parameter). For more information on Docker mounting, see the Add bind mounts or volumes using the --mount flag section in the documentation of Docker.
For example, if there is a video file on the host at /mnt/videos/video1.avi, and it is configured for MGR as /some/directory/my_video.avi, the following command runs MGR accordingly:
$ "${UVAP_HOME}"/scripts/run_mgr.sh -- --net=uvap \
    --mount type=bind,readonly,src=/mnt/videos/video1.avi,dst=/some/directory/my_video.avi
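Mounting a whole directory works the same way with a Docker bind mount (a sketch with placeholder paths): every file under /mnt/videos on the host becomes available under /some/directory inside the container, so several configured video streams can be covered with a single --mount parameter:
$ "${UVAP_HOME}"/scripts/run_mgr.sh -- --net=uvap \
    --mount type=bind,readonly,src=/mnt/videos,dst=/some/directory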
Check if the uvap_mgr container is running:
$ docker container inspect --format '{{.State.Status}}' uvap_mgr
Expected output:
running
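When MGR is started in background mode, the container may need a short time to reach the running state. The following optional shell loop (a sketch; the 30-second limit is arbitrary) polls the status once per second until it becomes running:
$ for i in $(seq 1 30); do \
    [ "$(docker container inspect --format '{{.State.Status}}' uvap_mgr)" = "running" ] && break; \
    sleep 1; \
  done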
Note: If the status of the uvap_mgr container is not running, send the output of the following command to support@ultinous.com:
$ docker logs uvap_mgr
These Docker containers can be managed with standard Docker commands. For more information, see the documentation of the docker (base command).
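For example, the container logs can be followed live and the container can be restarted with standard Docker commands (shown here only as a convenience; any docker CLI command targeting the uvap_mgr container works the same way):
$ docker logs --follow uvap_mgr
$ docker restart uvap_mgr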
Check if the Kafka topics are created:
$ docker exec kafka kafka-topics --list --zookeeper zookeeper:2181
Expected output:
- In case of Base Mode Demos:
base.cam.0.ages.AgeRecord.json
base.cam.0.anonymized_original.Image.jpg
base.cam.0.dets.ObjectDetectionRecord.json
base.cam.0.frameinfo.FrameInfoRecord.json
base.cam.0.genders.GenderRecord.json
base.cam.0.masks.FaceMaskRecord.json
base.cam.0.original.Image.jpg
base.cam.0.poses.HeadPose3DRecord.json
base.cam.1.ages.AgeRecord.json
base.cam.1.anonymized_original.Image.jpg
base.cam.1.dets.ObjectDetectionRecord.json
base.cam.1.frameinfo.FrameInfoRecord.json
base.cam.1.genders.GenderRecord.json
base.cam.1.masks.FaceMaskRecord.json
base.cam.1.original.Image.jpg
base.cam.1.poses.HeadPose3DRecord.json
- In case of Feature Vector Mode Demos:
fve.cam.0.dets.ObjectDetectionRecord.json
fve.cam.0.fvecs.FeatureVectorRecord.json
fve.cam.0.frameinfo.FrameInfoRecord.json
fve.cam.0.ages.AgeRecord.json
fve.cam.0.original.Image.jpg
fve.cam.1.dets.ObjectDetectionRecord.json
fve.cam.1.fvecs.FeatureVectorRecord.json
fve.cam.1.frameinfo.FrameInfoRecord.json
fve.cam.1.ages.AgeRecord.json
fve.cam.1.original.Image.jpg
- In case of Skeleton Mode Demos:
skeleton.cam.0.original.Image.jpg
skeleton.cam.0.skeletons.SkeletonRecord.json
skeleton.cam.1.original.Image.jpg
skeleton.cam.1.skeletons.SkeletonRecord.json
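To check the topics of a single mode only, the listing can be filtered with grep (a convenience one-liner, not part of the UVAP scripts); for example, for the Base Mode topics of camera 0:
$ docker exec kafka kafka-topics --list --zookeeper zookeeper:2181 | grep '^base\.cam\.0\.'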
Fetch data from a Kafka topic (the example below uses a Base Mode Demo topic):
$ docker exec kafka kafka-console-consumer --bootstrap-server kafka:9092 \
    --topic base.cam.0.dets.ObjectDetectionRecord.json
Expected example output:
{"type":"PERSON_HEAD","detection_confidence":0,"end_of_frame":true} {"type":"PERSON_HEAD","detection_confidence":0,"end_of_frame":true} {"type":"PERSON_HEAD","bounding_box":{"x":747,"y":471,"width":189,"height":256},"detection_confidence":0.99951756,"end_of_frame":false} {"type":"PERSON_HEAD","detection_confidence":0,"end_of_frame":true} {"type":"PERSON_HEAD","bounding_box":{"x":730,"y":484,"width":190,"height":255},"detection_confidence":0.991036654,"end_of_frame":false} {"type":"PERSON_HEAD","detection_confidence":0,"end_of_frame":true} {"type":"PERSON_HEAD","bounding_box":{"x":713,"y":467,"width":173,"height":252},"detection_confidence":0.999676228,"end_of_frame":false} {"type":"PERSON_HEAD","detection_confidence":0,"end_of_frame":true} {"type":"PERSON_HEAD","bounding_box":{"x":713,"y":467,"width":172,"height":252},"detection_confidence":0.999602616,"end_of_frame":false} {"type":"PERSON_HEAD","detection_confidence":0,"end_of_frame":true} {"type":"PERSON_HEAD","bounding_box":{"x":701,"y":468,"width":178,"height":253},"detection_confidence":0.999979258,"end_of_frame":false} {"type":"PERSON_HEAD","detection_confidence":0,"end_of_frame":true}