Challenges of (Chinese) AI in Europe

China’s artificial intelligence (AI) industry is growing rapidly. One of the key reasons is the availability of data. Because privacy standards in China are not as high as in Europe, Chinese companies can build rich databases filled with their customers’ personal data. That data can be used to improve products and tailor them to customer needs. It is no surprise that European companies are interested in using technologies that have been refined in ways they could not match under their own data protection restrictions. But transferring these technologies to markets with stricter data protection laws can be challenging.

Chinese surveillance software and the GDPR

The Chinese start-up Watrix is one of the companies that has caught the interest of a company in Europe. Watrix has developed surveillance software that can identify suspects of a crime by their body shape and the way they move, from up to 50 meters away. According to Watrix CEO Huang Yongzhen, the technology even works when an individual’s face is hidden or covered. Trying to mislead the system by changing the way you walk will not help either, because the software analyzes features of the entire body.

If a European security company were to use this gait recognition technology, the General Data Protection Regulation (GDPR) would apply. Under the GDPR, organisations must have a legal ground and a legitimate purpose to process personal data. The surveillance system processes far more personal data than a ‘normal’ camera does, including ‘special categories of personal data’ such as biometric data and perhaps also information about someone’s race. This kind of data is considered more sensitive than a name or e-mail address, and processing it is prohibited unless an exception applies. If you deploy this technology, you must be able to demonstrate that you need all this data and cannot reach your goal with less data or with a less intrusive technology.

High-tech products such as gait recognition software have to comply with strict requirements in Europe. Products developed outside the European Union, under different legal regimes, may therefore be restricted in how they can be used. In practice, this can mean that some features legally have to be disabled: the software must offer an option to switch off features that are non-compliant with European law, and if it does not, you may not be able to use the software at all (a minimal sketch of such an option follows below). As EU data protection requirements are largely overlooked in China, removing non-compliant features can be difficult.
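As a purely illustrative sketch, and not a description of any real Watrix product, the “option to disable features” could take the form of region-aware feature toggles, where processing that relies on special categories of personal data is switched off for a European deployment unless a documented legal exception applies. The feature names and the “EU” profile below are assumptions made for the example.

```python
# Hypothetical sketch: region-aware feature toggles for a surveillance product.
# The feature names and the "EU" profile are illustrative assumptions,
# not part of any real vendor API.

from dataclasses import dataclass, field

# Features that would process special categories of personal data under the GDPR.
SPECIAL_CATEGORY_FEATURES = {"gait_recognition", "face_recognition", "ethnicity_estimation"}


@dataclass
class DeploymentConfig:
    region: str                               # e.g. "EU" or "CN"
    enabled_features: set = field(default_factory=set)

    def effective_features(self) -> set:
        """Return the features that may actually run in this region.

        For an EU deployment, features relying on special categories of
        personal data are switched off unless a documented legal exception
        applies (not modelled here)."""
        if self.region == "EU":
            return self.enabled_features - SPECIAL_CATEGORY_FEATURES
        return self.enabled_features


if __name__ == "__main__":
    config = DeploymentConfig(
        region="EU",
        enabled_features={"gait_recognition", "object_detection"},
    )
    print(config.effective_features())  # {'object_detection'}
```

The point of the sketch is only that compliance needs to be configurable per deployment; whether a given feature may be re-enabled in Europe remains a legal question, not a technical one.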

These challenges show the need to assess whether a technology can be legitimately deployed in Europe. More generally, the use of AI can raise legal and ethical issues.

Considerati has developed an AI Impact Assessment to identify potential legal and ethical issues. Contact Considerati now for more information.

Romy ter Beek, Legal Consultant
