PeTrack

Authors: Luke Dreßen, Alica Kandler, Simon Hermanns, Tobias Schrödter, Janine Klein, Deniz Kilic, Ann Katrin Boomers, Arne Graf, Paul Häger, Tobias Arens, Daniel Hillebrand, Ricardo Martin Brualla, Maik Boltes, Juliane Adrian, Daniel Salden, Mira Küpper, Paul Lieberenz

Keywords: Image processing, Machine learning, FAIR data, Data analysis, Data visualization, FAIR software, Open source, Tracking, Pedestrian, Detection, Crowd, Calibration, Correction, Stereo, Marker, Video recording, Annotation, Computer vision, Trajectory, Controlled experiments

Understanding the dynamics inside crowds requires reliable empirical data, which enable an increase in safety and comfort for pedestrians and the design of models that reflect the real dynamics. Manual procedures for collecting such data are very time-consuming and usually do not provide sufficient accuracy in space and time.

For this reason we are developing the tool PeTrack (Pedestrian Tracking) to automatically extract accurate pedestrian trajectories from video recordings. The joint trajectories of all pedestrians provide data such as velocity, flow and density at any time and position. With such a tool, extensive experimental series with a large number of persons can be analyzed. Individual codes enable personalized trajectories enriched with static information about each participant (e.g. age, gender).
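To illustrate the kind of analysis such trajectories permit, the sketch below computes per-frame speeds and a simple global density from a trajectory table. It is a minimal example and not part of PeTrack itself; the column layout (id, frame, x, y), the frame rate and the measurement area are assumptions and must be adapted to the actual export format.

```python
import numpy as np
import pandas as pd

FPS = 25.0        # assumed camera frame rate
AREA_M2 = 20.0    # assumed size of the measurement area in m^2

# Assumed layout: one row per person and frame with columns id, frame, x, y
# (positions in metres); the real PeTrack export format is configurable.
traj = pd.read_csv("trajectories.txt", sep=r"\s+",
                   names=["id", "frame", "x", "y"], comment="#")

# Individual speed: displacement between consecutive frames times the frame rate.
traj = traj.sort_values(["id", "frame"])
dx = traj.groupby("id")["x"].diff()
dy = traj.groupby("id")["y"].diff()
traj["speed"] = np.hypot(dx, dy) * FPS                       # m/s

# Global density per frame: persons in the measurement area divided by its size.
density = traj.groupby("frame")["id"].nunique() / AREA_M2    # 1/m^2

print(traj[["id", "frame", "speed"]].head())
print(density.head())
```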

The program has to deal with wide-angle lenses and a high density of pedestrians. Lens distortion and the perspective view are taken into account. The procedure comprises calibration, recognition, tracking and height detection.
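The distortion-correction step can be pictured with standard pinhole-camera tools. The sketch below undistorts a frame and a set of detected head positions with OpenCV; the camera matrix and distortion coefficients are placeholder values, and the snippet only illustrates the underlying model, not PeTrack's own calibration implementation.

```python
import cv2
import numpy as np

# Placeholder intrinsics: focal lengths, principal point and radial/tangential
# distortion coefficients, as a chessboard calibration would produce them.
camera_matrix = np.array([[1200.0,    0.0, 960.0],
                          [   0.0, 1200.0, 540.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.array([-0.32, 0.12, 0.001, 0.0005, -0.02])  # k1 k2 p1 p2 k3

frame = cv2.imread("frame.png")

# Remove lens distortion so that straight lines in the scene stay straight
# in the image; recognition and tracking then work in corrected coordinates.
undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)

# Pixel coordinates of detected heads can be undistorted directly as well.
heads_px = np.array([[[500.0, 300.0]], [[900.0, 620.0]]], dtype=np.float32)
heads_undist = cv2.undistortPoints(heads_px, camera_matrix, dist_coeffs,
                                   P=camera_matrix)
print(heads_undist.reshape(-1, 2))
```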

Different kinds of markers (e.g. with height information, head direction, or an individual code) are implemented. With a stereo camera, more accurate height measurements and also markerless tracking are possible.
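The stereo height estimate essentially follows from triangulation: with focal length f, baseline B and disparity d, the distance of a head to the camera is roughly Z = f·B/d, and for an overhead view the body height is the camera mounting height minus Z. A minimal sketch, with all parameter values assumed:

```python
def height_from_disparity(disparity_px, focal_px=1200.0,
                          baseline_m=0.12, camera_height_m=7.5):
    """Estimate a pedestrian's height from stereo disparity (overhead camera).

    All parameters are assumed example values: focal length in pixels,
    stereo baseline and camera mounting height in metres.
    """
    # Standard stereo triangulation: distance from the camera to the head.
    distance_m = focal_px * baseline_m / disparity_px
    # With the camera looking straight down, the body height is what remains
    # between the head and the floor.
    return camera_height_m - distance_m

print(height_from_disparity(disparity_px=25.0))  # ~1.74 m with these defaults
```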

The source code as well as precompiled executables of PeTrack are available. The brief documentation of PeTrack cannot answer all questions; you may therefore contact the authors before setting up experiments and automatic extraction with PeTrack: petrack@fz-juelich.de. Trajectories collected in past experiments are also publicly available.


Publications

PeTrack

Boltes M, Kilic D, Schrödter T, Arens T, Dreßen L, Hermanns S, Adrian J, Boomers A, Kandler A, Küpper M, Graf A, Salden D, Brualla R, Häger P, Hillebrand D, Lieberenz P, Klein J - Zenodo - 2025


PeTrack

Boltes M, Adrian J, Boomers A, Brualla R, Dreßen L, Graf A, Häger P, Hillebrand D, Kilic D, Lieberenz P, Salden D, Schrödter T - Zenodo - 2022


PeTrack

Boltes M, Boomers A, Adrian J, Brualla R, Graf A, Häger P, Hillebrand D, Kilic D, Lieberenz P, Salden D, Schrödter T - Zenodo - 2021


PeTrack

Boltes M, Adrian J, Brualla R, Graf A, Häger P, Hillebrand D, Kilic D, Lieberenz P, Salden D, Schrödter T, Seemann A - Zenodo - 2021


Collecting pedestrian trajectories

Boltes M, Seyfried A - Neurocomputing - 2013


Automatic Extraction of Pedestrian Trajectories from Video Recordings

Boltes M, Seyfried A, Steffen B, Schadschneider A - Pedestrian and Evacuation Dynamics 2008 - 2009


License
GPL-3.0-only
