
To run the pipeline live, first adjust the template live.launch to your sensor and scene. Further customization, such as which bootstrapper to use, is explained in the launch file itself.

Make sure you have updated calibration files for your event camera in the dvs_tracking/parameters/calib folder. The yaml files have the same format as the provided ones (single camera format, multiple cameras format). Note that there are two calibration files, in two different formats, for the same sensor: the single camera format has to be placed in the dvs_tracking/parameters/calib folder, and the multiple cameras format in the dvs_tracking/parameters/calib/ncamera folder. See this section for further references on calibration.

The calibration file paths will be built as $(find dvs_tracking)/parameters/calib/ncamera/$(arg camera_name).yaml and $(find dvs_tracking)/parameters/calib/$(arg camera_name).yaml, where camera_name is specified as an argument to the launch file.

    roslaunch dvs_tracking live.launch bootstrap_image_topic:=<image topic> auto_trigger:=<true/false>

For instance, bootstrap_image_topic:=/dvs/image_raw.
If your sensor provides frames under a topic /my_sensor/image_raw, and you want to bootstrap from the traditional frames, you can use bootstrap_image_topic:=/my_sensor/image_raw.
You can also set it as default in live.launch.
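As a concrete sketch, a full invocation combining these arguments might look as follows; the camera name and frame topic are illustrative placeholders for your own sensor, not shipped defaults:

    # expects calibration files named after camera_name:
    #   dvs_tracking/parameters/calib/my_davis.yaml
    #   dvs_tracking/parameters/calib/ncamera/my_davis.yaml
    roslaunch dvs_tracking live.launch \
        camera_name:=my_davis \
        bootstrap_image_topic:=/my_sensor/image_raw \
        auto_trigger:=true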
Setting auto_trigger:=true will automatically trigger the pipeline.

Alternatively, it is also possible to trigger one module at a time:

- Press the Start/Reset button in rqt_evo.
- Perform a circle (or more), and then press Update.
- If the map looks correct, press Switch to tracking to start tracking with EVO.
- As the camera moves out of the current map, the latter will be automatically updated if Map-Expansion is enabled.
- You may disable Map-Expansion to track high-speed motions using the current map (single keyframe tracking).

The scene should have enough texture, and the motions should resemble the ones shown in the provided examples. If anything fails, just press Ctrl+C and restart the live node :)

Tuning

Adjust the parameters in the launch file dvs_tracking/launch/template.launch. Tuning is crucial for good performance of the pipeline, in particular the min_depth and max_depth parameters in the mapping node, and the bootstrap node parameters.

An explanation of all the parameters for each module of the pipeline can be found in the Wiki. The main parameters can be found in the template launch file and are explained contextually. We still invite you to have a look at the Wiki to discover further interesting features :)
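As a rough illustration of where such parameters live, a mapping-node block in template.launch might look like the sketch below; the node, package, and type names and the numeric values are hypothetical placeholders, not the repository's defaults (only min_depth and max_depth come from the text above):

    <!-- hypothetical mapping-node block in dvs_tracking/launch/template.launch -->
    <node name="dvs_mapping" pkg="dvs_mapping" type="dvs_mapping_ros" output="screen">
      <!-- expected scene depth range in meters; tune to your environment -->
      <param name="min_depth" value="0.4" />
      <param name="max_depth" value="5.0" />
    </node>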
If you are not using the fronto-planar bootstrapper, then you might need to tune SVO. You can test the SVO bootstrapping from traditional frames:

    roslaunch dvs_tracking live.launch bootstrap_image_topic:=<image topic> auto_trigger:=false

For instance, bootstrap_image_topic:=/dvs/image_raw. In this case, it makes sense to use the SVO-based bootstrapping only. This option is recommended to debug, improve, or extend the rest of the EVO pipeline without worrying about the quality of the bootstrapping. Press the Bootstrap button in the EVO GUI, then reset and start the pipeline until it tracks decently well.
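Before debugging SVO itself, it may be worth confirming that traditional frames are actually being published on the chosen topic. rostopic is standard ROS tooling; the topic name below is the example one from above:

    # check that frames arrive on the bootstrap topic, and at what rate
    rostopic hz /dvs/image_raw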
What can go wrong

In the following we outline the main problems currently known and possible remedies. You are very welcome to contribute to this pipeline to make it even better!

If you don't have an event camera and want to run the code on more data, you can do one of the following:

- Use one of our event camera datasets.
- Generate one using the event-camera simulator.
- Convert a standard video to events using our "Video to Events: Recycling Video Datasets for Event Cameras," cited above (code here: ).
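Once you have a recording, a typical playback workflow looks like the sketch below; the bag name is a placeholder, and whether live.launch or one of the provided example launch files is the right entry point for recorded data is an assumption to check against the examples:

    # terminal 1: start the pipeline (e.g., one of the provided example launch files)
    # terminal 2: feed it the recorded events
    rosbag play example.bag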