Supporting Social Distancing with AI and Camera – A Demonstrator


Published on 14 April 2021

Our emerging Living Lab will be home to many of our competences at ScaDS.AI. To present them to our visitors and partners, many different demonstrators are being developed and will be displayed. Unfortunately, due to the Covid-19 pandemic, several social distancing rules apply nowadays. Of course, we must also incorporate these new conditions into the development of our Living Lab. Existing solutions like the Corona-Warn-App monitor the contacts between users and warn them if they have been too close, based on the distance between their smartphones. For the Living Lab, Alexander Leipnitz and Timo Adameit therefore developed a new method to monitor the social distance between our visitors via cameras. The goal is to visualize contacts between visitors of the Living Lab in a graph and to highlight dangerous contacts while preserving data privacy. With this approach we can identify contact groups, analyze group dynamics, and thus show how the Corona virus could possibly spread.

This video shows the web interface of the demonstrator. The screen is divided into three areas. In the top left, the video feeds from the cameras are displayed. Detected persons are highlighted with differently colored boxes according to their internally assigned IDs. Below that is the floor plan of the Living Lab, on which the past trajectory of each person is drawn. This shows how visitors spread spatially in the Living Lab and reveals larger and therefore dangerous gatherings. On the right side is the main contact graph, which visualizes contacts between visitors as well as interactions with demonstrators. A contact occurs whenever two persons come closer than the safe distance of 1.5 m. The color and thickness of an edge indicate the duration and therefore the risk of the contact.
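The contact logic described above can be sketched in a few lines: per frame, every pair of persons closer than 1.5 m accumulates contact time, and the accumulated time can later drive the color and thickness of the corresponding graph edge. This is a minimal illustration, not the demonstrator's actual code; the function and variable names are our own.

```python
import math

SAFE_DISTANCE_M = 1.5  # safe distance between two persons, in metres


def update_contacts(positions, contact_seconds, dt):
    """Accumulate contact duration for every pair closer than the safe distance.

    positions: dict mapping person ID -> (x, y) floor coordinates in metres
    contact_seconds: dict mapping frozenset({id_a, id_b}) -> accumulated seconds
    dt: time elapsed since the previous frame, in seconds
    """
    ids = sorted(positions)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if math.dist(positions[a], positions[b]) < SAFE_DISTANCE_M:
                pair = frozenset((a, b))
                contact_seconds[pair] = contact_seconds.get(pair, 0.0) + dt
    return contact_seconds


# Example: persons 1 and 2 stand 1 m apart, person 3 is far away.
positions = {1: (0.0, 0.0), 2: (1.0, 0.0), 3: (5.0, 5.0)}
contacts = update_contacts(positions, {}, dt=1.0)
```

In a live system this would run once per processed frame, so long contacts accumulate a large duration, which the web interface could map to a thicker, redder edge.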

How does it work?

The camera feeds are streamed to a central server inside the ScaDS.AI office. Multiple state-of-the-art AI systems in the form of convolutional neural networks (CNNs) detect each person and assign a consistent ID to them. The cameras are calibrated specifically for the Living Lab space, so that the position of a person in the room can be reconstructed from the camera images. Based on these positions, the distance between two persons can be determined and visualized at any given time. Every visitor is represented only by an ID in the system. No visual information is stored, so later back-tracing of an ID to a visitor's appearance is not possible. The motion profiles are stored only as long as the visitor is present in the Living Lab. This rules out classic Corona contact tracing of users; instead, it lets visitors see what is happening inside the Living Lab during their visit and raises awareness of Corona infection chains.
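Mapping a detection from camera pixels to a room position can be sketched with a planar homography, a common way to use a floor-plane calibration like the one described above. The 3×3 matrix below and the function name are illustrative assumptions, not the demonstrator's actual calibration pipeline.

```python
def image_to_floor(h, u, v):
    """Project an image pixel (u, v) onto floor-plane coordinates (x, y)
    in metres, using a 3x3 homography h (row-major nested lists) obtained
    from calibrating the camera against known points on the floor."""
    x = h[0][0] * u + h[0][1] * v + h[0][2]
    y = h[1][0] * u + h[1][1] * v + h[1][2]
    w = h[2][0] * u + h[2][1] * v + h[2][2]
    # Divide by the homogeneous coordinate to get Euclidean floor coordinates.
    return x / w, y / w


# Toy calibration: 2 pixels correspond to 1 metre on the floor.
h = [[0.5, 0.0, 0.0],
     [0.0, 0.5, 0.0],
     [0.0, 0.0, 1.0]]
pos = image_to_floor(h, 4.0, 6.0)  # foot point of a detected person
```

Once two detections are mapped to floor coordinates this way, their pairwise distance follows directly, which is what the 1.5 m contact check operates on.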

Authors: Alexander Leipnitz, Timo Adameit
