Privacy Aspects of High-Performance Video Analytics
Irisity develops systems for security and safety using video analytics, triggering real-time alerts based on configured rules. For many applications, data protection and personal privacy are the main customer concerns: one example is detecting patients falling to the ground in nursing homes while maintaining absolute privacy for the monitored individuals. The first privacy aspect concerns training data. Our tech stack relies heavily on deep learning, and it is hard to obtain large quantities of real-world training data from sensitive environments such as hospitals. The second aspect is privacy in live systems. Our system includes anonymization features, where every detected human can be replaced with a low-resolution, pixelized version that hides the person’s identity. The final aspect of privacy and data protection is supporting video processing at the edge, as many of our customers are reluctant to upload video data to cloud systems. We found GPUs at the edge to be impractical in many cases, and have spent significant effort on making our model inference run highly optimized on edge CPUs. This talk describes how we address these three challenges, with examples, results, and a few lessons learned.
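To illustrate the kind of pixelization mentioned above: assuming a detector has already produced a bounding box for each person, the region can be coarsened by averaging fixed-size blocks, so the silhouette remains visible while the identity is hidden. This is a minimal NumPy sketch with illustrative names and parameters, not Irisity’s actual implementation.

```python
import numpy as np

def pixelize_region(frame: np.ndarray, box: tuple, block: int = 16) -> np.ndarray:
    """Return a copy of `frame` with the box (x0, y0, x1, y1) pixelized.

    Each `block` x `block` tile inside the box is replaced by its mean
    colour, i.e. a nearest-neighbour downscale/upscale round trip.
    Hypothetical helper for illustration only.
    """
    x0, y0, x1, y1 = box
    out = frame.copy()
    region = out[y0:y1, x0:x1]          # view into the copy
    h, w = region.shape[:2]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = region[by:by + block, bx:bx + block]
            # Average the tile and broadcast the mean back over it.
            region[by:by + block, bx:bx + block] = (
                tile.mean(axis=(0, 1), keepdims=True).astype(frame.dtype)
            )
    return out
```

In a live system the same function would be applied per frame to every detected bounding box before the video leaves the device, so the original pixels never reach downstream viewers.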
Erik completed his PhD in Linköping in 2008, focusing on computer vision and machine learning. After a few years of initial industrial experience, he founded the consulting agency Visionists in 2012, specializing in computer vision solutions for customers in various industries. By 2019, Visionists had grown to a strong technical team of 13 employees. Around then, Visionists was acquired by Irisity, where Erik spent his first few years as CTO. Erik currently holds the role of VP Future Labs at Irisity, with responsibility for the company’s most research-oriented and forward-looking activities.