Active tracking using bioinspired event-based vision – Towards improving interpretability of individual and collective behavior

Using a combination of video cameras and event-based cameras to extract meaningful motion information on complex animal behavior

Research Unit: 3

Project Number: 36

Disciplines:
Computer Vision

 

Principal Investigators:
Guillermo Gallego

Doctoral Researchers:
Friedhelm Hamann

 

Expected Project Duration:
2021–2024


The complex behavior of animals is often analyzed from video recordings, because cameras provide an economical and non-invasive way to acquire abundant data. Developing computer vision tools to extract relevant information from such a rich yet raw data source is therefore essential to support behavioral analysis. We propose to develop computer vision algorithms that combine video (i.e., frame-based) cameras and event-based cameras to extract meaningful motion information about individuals, whether in isolation or as part of groups. The two sensor types are complementary: event-based cameras excel at capturing high-frequency temporal content, while traditional cameras are better at acquiring slowly varying content.

Event-based cameras are novel, biologically inspired sensors that mimic the transient pathway of the human visual system. Each pixel responds asynchronously to motion in the form of brightness changes, called “events”. These cameras capture the dynamics of a scene with high dynamic range and high temporal resolution, and, unlike traditional cameras, they do not suffer from motion blur. Moreover, they record only motion information, which we will exploit for long-term tracking and better segmentation of behaviors.
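To make the sensing principle concrete, here is a minimal Python sketch of the idealized event-generation model. It is illustrative only: the function name, the contrast threshold C, and the linear timestamp interpolation are simplifying assumptions on our part, not the behavior of a specific camera. The sketch converts a pair of video frames into a list of events by thresholding per-pixel log-brightness changes:

```python
import numpy as np

def events_from_frames(prev_frame, frame, t0, t1, C=0.2, eps=1e-6):
    """Idealized event-generation model (illustrative assumption):
    a pixel emits an event of polarity +/-1 each time its
    log-brightness change crosses the contrast threshold C."""
    log_prev = np.log(prev_frame.astype(np.float64) + eps)
    log_curr = np.log(frame.astype(np.float64) + eps)
    delta = log_curr - log_prev
    events = []
    # Visit only the pixels whose change exceeds the threshold.
    for y, x in zip(*np.nonzero(np.abs(delta) >= C)):
        polarity = 1 if delta[y, x] > 0 else -1
        n = int(abs(delta[y, x]) // C)  # number of threshold crossings
        for k in range(1, n + 1):
            # Spread the events' timestamps between the two frame times.
            t = t0 + (t1 - t0) * k / (n + 1)
            events.append((x, y, t, polarity))
    return sorted(events, key=lambda e: e[2])  # sort by timestamp
```

In a real sensor, each pixel operates asynchronously and independently at microsecond time scales; the frame-pair formulation above is only a convenient approximation for simulation.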

We pursue an active vision approach, in which the camera viewpoint can change to improve tracking performance. Such a system enables robust detection of individuals regardless of their 3D location and prevents targets from leaving the field of view during long-term tracking. In a subsequent analysis phase, the resulting motion tracks will help categorize relevant behaviors and interactions among individuals.
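To illustrate the control idea, the following minimal sketch shows one iteration of a hypothetical active-tracking loop. The function recenter_step, the event-centroid target estimate, and the proportional gain are our own illustrative choices, not the project’s actual tracker:

```python
import numpy as np

def recenter_step(events, width, height, gain=0.05):
    """One step of a toy active-tracking loop (hypothetical design):
    estimate the target as the centroid of recent events and return
    pan/tilt corrections that re-center it in the image."""
    if not events:
        return 0.0, 0.0  # no motion observed; hold the current viewpoint
    xs = np.array([e[0] for e in events], dtype=np.float64)
    ys = np.array([e[1] for e in events], dtype=np.float64)
    # Pixel offset of the event centroid from the image center.
    err_x = xs.mean() - width / 2.0
    err_y = ys.mean() - height / 2.0
    # Proportional control: small corrections per step limit overshoot.
    return gain * err_x, gain * err_y
```

The returned corrections would be sent to a pan/tilt unit each iteration; keeping the target near the image center is what prevents it from disappearing during long-term tracking.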


Publications

Shiba, S., Klose, Y., Aoki, Y., & Gallego, G. (2024). Secrets of Event-based Optical Flow, Depth and Ego-motion Estimation by Contrast Maximization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1–18. https://doi.org/10.1109/TPAMI.2024.3396116
Shiba, S., Hamann, F., Aoki, Y., & Gallego, G. (2023). Event-based Background-Oriented Schlieren. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1–16. https://doi.org/10.1109/TPAMI.2023.3328188
Hamann, F., Ghosh, S., Martínez, I. J., Hart, T., Kacelnik, A., & Gallego, G. (2024). Low-power, Continuous Remote Behavioral Localization with Event Cameras. IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). https://doi.org/10.48550/arXiv.2312.03799
Hamann, F., & Gallego, G. (2022). Stereo Co-capture System for Recording and Tracking Fish with Frame- and Event Cameras. International Conference on Pattern Recognition (ICPR), Workshop on Visual Observation and Analysis of Vertebrate and Insect Behavior. https://arxiv.org/abs/2207.07332
