RTBF podcast

Translation and transcription of the podcast titled AI Revolution, Episode 7: Smile, you're being filmed, scanned, tracked…

In streets, parks, train stations, at your workplace, in hospitals, shops, car parks, perhaps even in your own home, installed on your own initiative… surveillance cameras are everywhere.

Millions of hours of images, faces and behaviours are recorded for one reason: security.

“Appeal for witnesses: this man has been seen several times. He was filmed by surveillance cameras. If you recognize him, please contact the police via the following toll-free number…”

Millions of images… in effect, these are databases that can be searched for relevant information, for instance by artificial intelligence (AI) systems given missions such as recognizing people.

“The camera makes it possible to capture a face. First, it detects a face, its edges. Then, a facial signature is calculated based on different points between the mouth, nose, eyes, chin and forehead.” (Valerio Burgarello)

Valerio Burgarello is co-founder and technical director at Piximate, a Belgian start-up from La Hulpe.

Valerio Burgarello: It makes it possible to calculate a unique facial signature that will serve to recognize the person if they come back. So, we do not recognize a person by their face, but by their face's signature. We do not identify people by their Facebook profile, where you can get their name and all kinds of other information. We only have a signature, which is anonymous to some extent, if I may say so.
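
To make the idea concrete, here is a minimal sketch of what such a signature could look like: a handful of facial landmarks reduced to a vector of normalised pairwise distances, then compared between two visits. The landmark set, the normalisation and the matching threshold are assumptions invented for illustration; Piximate's actual method is not public.

```python
# Sketch: a facial "signature" as normalised pairwise landmark distances.
# All values and thresholds here are illustrative assumptions.
import numpy as np

def facial_signature(landmarks: np.ndarray) -> np.ndarray:
    """Turn (N, 2) landmark coordinates into a vector of pairwise distances,
    scaled by the largest distance so the signature does not depend on how
    close the face is to the camera."""
    n = len(landmarks)
    dists = np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                      for i in range(n) for j in range(i + 1, n)])
    return dists / dists.max()

def same_person(sig_a: np.ndarray, sig_b: np.ndarray, threshold: float = 0.1) -> bool:
    """Two signatures count as 'the same face' if they are close enough."""
    return np.linalg.norm(sig_a - sig_b) < threshold

# Toy landmarks: left eye, right eye, nose tip, mouth corners, chin.
visit_1 = np.array([[30, 40], [70, 40], [50, 60], [38, 80], [62, 80], [50, 98]], float)
visit_2 = visit_1 * 1.5  # same face, filmed closer to the camera
print(same_person(facial_signature(visit_1), facial_signature(visit_2)))  # True
```

Note that the signature, not the image, is what gets stored and compared: the same person at a different distance from the camera still produces a matching vector.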

Journalist: I can see that when I’m getting closer, my face appears on the screen. There’s a blue frame around my face and yours.

V.B.: Exactly. In fact, the system has detected a face. The frame is there for the demo, for better illustration. Inside the frame, the system picks the points that will serve for the calculation of the signature. Then, based on this calculation, it will be able to ascertain on the next screen whether it's a man or a woman. Today we can determine the age group, so we're not categorical about age: we can give a five-year age range.

Journalist: You can determine the gender, the age group and… well, detect people’s emotions.

V.B.: The calculation of emotions also relies on facial points – the signature – and on the comparison of different datasets, which are collections of images. These images were used to teach the algorithm that an emotion, a smile for instance, is when the distance between certain points becomes larger than normal, that is, than when the face is neutral. Then, using this information and the images it recorded, the system can tell whether a person is smiling or not. There are three basic emotions: smiling, neutral and sad.
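
A toy illustration of the distance-based rule described here: a smile widens the mouth relative to a neutral face. The 15% threshold and the eye-distance normalisation below are assumptions made up for this sketch, not the real system's parameters.

```python
# Sketch: classifying an expression from landmark distances, as described.
# Thresholds and the normalisation choice are illustrative assumptions.
import numpy as np

def mouth_width_ratio(landmarks: dict) -> float:
    """Mouth-corner distance, normalised by inter-eye distance so the
    measure does not depend on the face's size in the image."""
    mouth = np.linalg.norm(np.subtract(landmarks["mouth_left"], landmarks["mouth_right"]))
    eyes = np.linalg.norm(np.subtract(landmarks["eye_left"], landmarks["eye_right"]))
    return mouth / eyes

def classify_emotion(face: dict, neutral_baseline: float) -> str:
    """Three basic categories, as in the interview: smiling, neutral, sad."""
    ratio = mouth_width_ratio(face) / neutral_baseline
    if ratio > 1.15:   # mouth noticeably wider than on the neutral face
        return "smiling"
    if ratio < 0.90:   # mouth narrower / drawn in
        return "sad"
    return "neutral"

neutral = {"eye_left": (30, 40), "eye_right": (70, 40),
           "mouth_left": (40, 80), "mouth_right": (60, 80)}
smiling = {"eye_left": (30, 40), "eye_right": (70, 40),
           "mouth_left": (35, 78), "mouth_right": (65, 78)}
baseline = mouth_width_ratio(neutral)  # learned from "neutral" training images
print(classify_emotion(smiling, baseline))  # smiling
```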

How does a machine become able to detect emotions? What kind of opportunities can arise with such solutions? Will it be possible to identify terrorist behaviour before an attack even occurs? Are we monitored all the time and everywhere? And who uses these images, and for what purposes?

A brief lesson in machine training by Valerio Burgarello who is back with us.

V.B.: The first step is the collection of what we call a dataset – a collection of annotated images. If we're working on the smiling emotion, it means we need plenty of images of smiling people, each tagged as smiling.

Journalist: It’s a kind of food for the machine.

V.B.: Exactly. The machine will learn from this information. Our algorithm processes it all and makes it possible to teach the machine that a given configuration of facial points corresponds to a smile. Once it's in our database, every newly captured image will go through our system and the machine will be able to say which category it belongs to. If all conditions are fulfilled, at the end, the machine considers, with a certain confidence level, that a person is smiling.
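
The three steps described here – annotated dataset, training, classification with a confidence level – can be sketched in a few lines. The features below are synthetic stand-ins for the facial-point measurements, and the choice of model is an assumption; the real pipeline is not disclosed in the interview.

```python
# Sketch of the training loop described above, with made-up feature values.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Step 1: the annotated dataset ("food for the machine").
# One feature per example (a mouth-width ratio); label 1 = smiling, 0 = neutral.
smiling_faces = rng.normal(loc=0.75, scale=0.05, size=(200, 1))
neutral_faces = rng.normal(loc=0.50, scale=0.05, size=(200, 1))
X = np.vstack([smiling_faces, neutral_faces])
y = np.array([1] * 200 + [0] * 200)  # the human-provided tags

# Step 2: training - the algorithm learns which measurements mean "smile".
model = LogisticRegression().fit(X, y)

# Step 3: a newly captured image goes through the system and receives a
# category together with a confidence level.
new_face = np.array([[0.72]])
confidence = model.predict_proba(new_face)[0, 1]
print(f"smiling with {confidence:.0%} confidence")
```

This also makes the "more data, better percentage" point below tangible: the model's reliability is bounded by how many annotated examples it has seen.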

Journalist: What is your algorithm’s reliability percentage?

V.B.: Today, it’s around 85-90%.

Journalist: But the more it is trained, the better it will get?

V.B.: Indeed. The more we train it, the better it will work. But we need thousands of additional images to increase the percentage.

Journalist: Where do you get these images? Your clients can certainly provide you with them. But the machine had to be trained in the first place. So, what images did you use at the beginning?

V.B.: There are datasets that can be bought on specialised websites. We bought some. But you can also find datasets that are free to use. Google offers some, for instance, and so does Microsoft. We also use data from our own cameras to improve our algorithm.

Journalist: Basically, you’re a kind of machine super trainer, a coach.

V.B.: That’s it. I’m a teacher, I’m teaching… (chuckles)

This is how a machine becomes able to recognize an emotion whose features it has learned, on a face it has never seen before.

Let’s summarize. At Piximate, they mainly work in the retail sector, with stores. Using images recorded by surveillance cameras, they can define someone’s facial signature – a kind of encrypted face print with all its features – as well as their age, gender, emotions, presence in a store, the time they spent there and whether they came back several times.

Laure Uytdenhoef is CEO at Piximate.

Laure Uytdenhoef: This solution enables retailers, and therefore stores, to gain better knowledge of their visitors and customers at physical points of sale. In fact, nowadays, when it comes to visitor behaviour analysis, e-commerce is much more advanced than brick-and-mortar stores. This solution can give those stores information they can use to adapt a series of things. I can give you a very concrete example: if a brand offers a product originally intended for men aged 25-30, but realizes that the customers who come to the store and buy this product are in fact women aged 35-40, the brand has two options. It can either adapt its communication or adapt the product. This is what it’s made for. It can also help with reorganizing a point of sale, or with adapting the opening hours or the number of open checkout counters, which obviously has an impact on customer satisfaction.

That’s it for marketing. Emotion recognition makes it possible to assess whether a customer is satisfied when leaving the point of sale, or whether their interaction with the salesperson went well. But would you have imagined that Piximate is collaborating with the French gendarmerie on emotion recognition?

Rémy Millescamps: The French gendarmerie is an organisation that has existed for many years now and has always been very close to citizens.

Rémy Millescamps is CEO at DC Communication, a French company that helps public organisations, including the gendarmerie, make the digital transition.

R.M.: Ensuring citizens’ satisfaction and assisting them is very important. In a strategy of measuring satisfaction, the aim is also to recreate a close relationship with citizens, rebuild trust and be able to continue interacting. There is a real goal of improving the institution and assisting policemen in their daily work since, as you know, when people are satisfied with policemen’s work, it’s also much easier for policemen to carry out their duties.

The aim is the same as in stores. Citizens’ faces are scrutinized as they enter and leave the police station in order to assess whether or not they are satisfied with the service provided. Citizens are not individually notified of the surveillance camera’s presence, but a mandatory sign informs them of it. Three police stations are currently equipped with this AI solution developed by Piximate. It’s being tested. But Rémy Millescamps is already considering the next step.

R.M.: You know, Piximate’s platform offers great possibilities. The amazing thing about AI is that people keep developing it. So, it will be possible to use this platform for new purposes and make it evolve. You can also support and better help policemen who may suffer from pressure in their everyday lives, run into difficulties, feel pain or sadness, or even sometimes commit horrific acts. This emotions platform will make it possible to alert the gendarmerie internally about abnormal behaviours and to react immediately, at an early stage, in order to better assist those policemen who might be suffering emotional hardship.

In 2018 in France, 35 policemen and 33 members of the gendarmerie committed suicide. The year 2019 is already following the same trend. The problem is very worrying, and I can see the point. But something still intrigues me. Being scrutinized at work, your emotions scrutinized, day after day: isn’t that a little intrusive?

R.M.: It becomes intrusive when a digital service is misused. It becomes positive when you can detect a problem early and provide support. But it’s up to the gendarmerie to judge the relevance of this solution and to decide. However, since this data will never be published, you, as a citizen, or at least as a policeman, keep the freedom to decide. You’re already being evaluated and assisted by your managers. It’s a complementary way for an officer to better assist his men, since that is his responsibility.

Cameras are all over the city, and they go unnoticed. But did you know that these cameras can nowadays identify you by the colour of your clothes and follow your movements in the street?

Surveillance cameras, emotion detection, gendarmerie… When I hear all that, my mind races. Could we not imagine detecting suspicious behaviour in public spaces? For instance, detecting the moment when a demonstration is about to turn violent?

Laure Uytdenhoef: Sure. It’s something we could easily develop. I think we’d need to gather some information, but the kind of information we can already obtain now, namely the number of people detected in a specific place. If it increases significantly, it means a crowd is building up. We are already able to get this kind of information. And by coupling it with emotion detection, if we see in the image a growing number of people with a negative facial expression, such as aggressiveness or anger, then we can consider that there is a potential danger and that it should attract attention. The machine can send an alert saying that something suspicious was detected and that a real person should go and check whether it’s true, and if necessary, authorities can take action much quicker than they do now.
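
The alerting logic she outlines – a crowd building up combined with a rising share of negative expressions, verified by a human – could look roughly like this. Both thresholds and the smoothing window are invented for illustration; in practice they would be set by the operators.

```python
# Sketch: crowd-plus-emotion alerting, with made-up thresholds.
from dataclasses import dataclass

@dataclass
class FrameStats:
    people_detected: int
    negative_faces: int  # faces classified as angry / aggressive

def should_alert(frames: list[FrameStats],
                 crowd_threshold: int = 100,
                 negative_share: float = 0.3) -> bool:
    """Alert a human operator, who must verify on the ground, when the most
    recent frames show a crowd with a high share of negative expressions."""
    recent = frames[-5:]  # smooth over the last few frames
    avg_people = sum(f.people_detected for f in recent) / len(recent)
    avg_negative = sum(f.negative_faces for f in recent) / len(recent)
    return avg_people >= crowd_threshold and (avg_negative / avg_people) >= negative_share

stream = [FrameStats(40, 3), FrameStats(90, 20), FrameStats(130, 45),
          FrameStats(150, 60), FrameStats(160, 70)]
print(should_alert(stream))  # True: crowd building up, anger spreading
```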

Journalist: And when it comes to individual behaviours? Could AI identify a criminal or even a terrorist before they commit a crime?

L.U.: Technologically speaking, the answer is yes. However, the implementation is more complex because we have to define together what counts as suspicious behaviour, and obviously there can be a lot of variation. But the machine will only detect the series of behaviours it has been taught. It can’t think by itself and draw inferences or extrapolate, at least not in an unlimited way. So, it’s obvious that if it’s given a series of ten suspicious behaviours, it will be able to recognize these ten behaviours and send an alert saying: “one of the suspicious behaviours has just been detected, you can react rapidly”. However, these behaviours may vary a lot, and so if there is an 11th type of suspicious behaviour, the machine will never detect or recognize it as such.
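
Her point is the closed-world limit of such systems, which is easy to see in miniature. The behaviour labels below are invented placeholders.

```python
# Sketch: a closed set of taught behaviours. Anything outside it is invisible.
KNOWN_SUSPICIOUS = {
    "abandoned_bag", "running_against_crowd", "loitering_at_exit",
    # ... up to the ten behaviours the machine was trained on
}

def check_behaviour(observed: str) -> str:
    if observed in KNOWN_SUSPICIOUS:
        return f"ALERT: suspicious behaviour detected ({observed})"
    # An 11th, never-taught behaviour falls through silently:
    return "no alert"

print(check_behaviour("abandoned_bag"))        # ALERT
print(check_behaviour("novel_11th_behaviour"))  # no alert
```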

AI can serve as additional help for security services when they have to react. But since we’re talking about security, I have a question for Rémy Millescamps: is it possible today to design a high-performing lie detector by combining emotion recognition technology with other elements?

R.M.: I’ll give you a very clear answer, Marie. It is absolutely possible. However, this obviously raises an ethical issue and a question of reliability. Let’s consider it a support, an aid. There are two things to take into account. If it’s used in the context of a specific investigation, it would be possible to detect lies by analysing not only emotions, but also by combining this kind of information with temperature jumps and blood pressure, since we know that blood pressure can be detected by observing the movements of someone’s eyes. So we would link together statistical information that gives us trends on whether a person is likely to be lying or not.

Images, faces, emotions… all of this can be analysed and provide significant information to whoever has access to these images and uses the right algorithms. And even if it isn’t identification information, even if it’s only a facial signature, as was said at the beginning, it is still information related to who we are. All that needs to be done is to cross-reference facial signatures with, for instance, a Facebook profile, with your name, city of residence and so many other things, and that will provide a great number of elements making it possible to determine quite precisely who you are. Who we are, what we do, where we go, our character, our preferences, our reactions…

Today, that’s forbidden. The GDPR does not allow cross-referencing such information without the consent of the individuals concerned. Only anonymised information can be shared.

But there are still some things that are allowed. Laure Uytdenhoef is back with us.

L.U.: It would be absolutely feasible if our client – a brand or a store – had a photo of their customer’s face in their own database, one not provided by us. It’s possible, for instance, to link the end customer’s loyalty card with our technology. We could perfectly well consider recognizing a customer personally, no longer anonymously, if, by accepting the terms of use of their loyalty card, they agreed to be photographed and to make their photo available for the analysis we do for our client. Technologically, it’s possible. Are we doing it now? No, we are not.
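
Mechanically, the linkage she describes as technically possible is just a nearest-neighbour match: the anonymous in-store signature is compared against signatures precomputed from consented loyalty-card photos. The toy vectors and the matching threshold below are assumptions for illustration only.

```python
# Sketch: de-anonymising an in-store signature via a loyalty-card database.
import numpy as np

# Retailer's own database: loyalty-card holders who consented to a photo,
# each photo reduced to a facial signature ahead of time.
loyalty_db = {
    "card_001": np.array([0.12, 0.80, 0.33, 0.55]),
    "card_002": np.array([0.90, 0.10, 0.70, 0.20]),
}

def deanonymise(store_signature: np.ndarray, threshold: float = 0.1):
    """Return the loyalty-card ID whose signature is closest, if close enough;
    otherwise the visitor stays anonymous."""
    best_id, best_dist = None, float("inf")
    for card_id, sig in loyalty_db.items():
        dist = np.linalg.norm(store_signature - sig)
        if dist < best_dist:
            best_id, best_dist = card_id, dist
    return best_id if best_dist < threshold else None

camera_sig = np.array([0.13, 0.79, 0.34, 0.54])  # captured at the entrance
print(deanonymise(camera_sig))  # card_001: no longer anonymous
```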

For a store, this makes it possible to link information such as customers’ behaviour on the internet, their advertising exposure and their behaviour in points of sale. The customer’s whole consumption journey can be tracked, which obviously provides very valuable data.

And as for the user who might be bothered by this, you’ll tell them they should have read the terms of use.