
Scholars from Erasmus University Rotterdam (EUR) are conducting research on various academic fronts to find ways of making our society fairer and happier. Our safety is an important prerequisite in this context. Professor Gabriele Jacobs, Professor of Organisational Behaviour and Culture, is examining ways in which we can use ‘smart technology’, such as CCTV monitoring, wireless hotspots and facial recognition, for the benefit of public safety, while respecting human rights and public values.

Artificial Intelligence (AI) plays an important part in your work as a social scientist.

“True. AI can help us make better decisions when dealing with complex problems. Take climate change, for example. Extreme weather and earthquakes not only cause inconvenience, but often great human suffering as well. AI allows us to combine and integrate data from many different sources. As a result, we may become aware of patterns in climate change that we might not otherwise notice, and be able to predict and prevent dangerous situations. In the social field, for example, AI is used for crowd management in the Netherlands as a means of preventing accidents.”

Can you tell us something about your involvement in the AI-MAPS consortium?

“I am Chair at AI-MAPS, focusing on public safety. In AI-MAPS [Artificial Intelligence for Multi-Agency Public Safety Issues], 26 organisations work together in the field of public safety and AI. Partners include the universities of Leiden, Delft and Rotterdam (LDE), but also, for example, TNO, the police and companies such as Deloitte and Nokia. By combining our knowledge and expertise, we aim to develop methods that help prevent or deal with social unrest.

Social unrest is often accompanied by a lack of trust. If you no longer trust the government or your neighbours, you’re more likely to withdraw from or fight against society. The use of technology can reinforce that feeling. That’s why it’s important that every layer of society can be part of the discussion about how technology is used: public bodies, parties from the private sector, scholars and, above all, citizens. Citizens in particular should be involved in decisions about their living environment. In other words, what we should not do is invent, implement and use technologies and only then ask citizens if they like what we’ve done.

When it comes to technology in public spaces, citizens are often regarded as consumers. However, most of them are oblivious to all the technology used in public spaces: sensors, cameras trained on them, facial recognition, motion mapping… This may well be perfectly watertight in legal terms, but try looking at it from a social and moral perspective! As citizens, we can no longer shrug our shoulders and think there’s nothing we can do about it anyway. Public safety is something that affects us all and that we all share responsibility for.”

‘Although Artificial Intelligence has a seat at the table, it can never be allowed to make the decisions’

So what you’re saying is that new technology isn’t always welcome?

“Looking at public safety, I feel technology is sometimes deployed inordinately early. Some streets have sensors installed to detect shouting. Is that really a safety problem? And do we really need to put up cameras or other equipment in areas where young people congregate? Don’t they have an equal right to be on the streets? If the plan is to provide all residents of a neighbourhood with a pleasant living environment, young people have the same right to it as anyone else.

In too many cases, technology is seen as a silver bullet without the social impact being properly analysed. With AI-MAPS, we ask ourselves when we genuinely need AI and when it’s better to solve a problem based on purely human intelligence. Although Artificial Intelligence has a seat at the table, it can never be allowed to make the decisions.

Take the child allowance affair, where a naive faith in computers caused major personal tragedies – just try winning back the trust of the victims. People with a higher level of education usually know their way around society. Those who fare less well in society often seem, or actually are, less engaged. Is that because of a lack of interest? Of course not! Many people have no idea what’s going on, or don’t have their questions or concerns taken seriously. And let’s be honest, this is obviously quite a complex issue.”

Where does your passion for public safety come from?

“I grew up in West Germany, where the tensions and pain caused by World War II were still felt everywhere during the 1970s and 1980s. Moreover, we were in fear of another nuclear war. My generation worried about questions such as how we could ensure that peace endured and how to stay safe and guarantee democracy. That also shaped who I am today.

Finding ways of contributing to world justice is a major motivator in my work as a scholar. The problems in our society are overwhelming: war, refugees, climate change, polarisation and inequality. Perhaps the biggest threat, though, is that we spend so little time listening to each other, and don’t listen carefully to what others say even when we do. The challenge is, first and foremost, to restore trust in each other and in society and to regain our curiosity about other people!”

‘With AI-MAPS, we hope to cultivate an ecosystem of trust in which people start talking to each other again’

How can we boost trust in society?

“Citizens must be part of the discussion and be able to say what they do and do not want in their neighbourhoods. Otherwise, we’ll only alienate them further and leave those decisions to smart entrepreneurs who will come up with solutions that approach problems one-sidedly, from a technology-based perspective. Together with the business community, civil-society organisations and public bodies, we need to guide and steer those innovations to ensure that technologies serve our society.

With AI-MAPS, we hope to cultivate an ecosystem of trust in which people start talking to each other again. ‘They never listen to us anyway’, many citizens cry now. Public bodies, on the other hand, say ‘they’re not interested’ or ‘they believe in conspiracies’. I truly believe that both sides are well-intentioned and would be perfectly happy to talk to each other. All we need to do is create a safe setting and learn to listen to each other again. With AI-MAPS, we intend to ensure that technology brings people together instead of driving them apart.”

Prof. dr. Gabriele Jacobs

  • Erasmus School of Social and Behavioural Sciences

Gabriele Jacobs studied Psychology at the University of Cologne and obtained her PhD from the University of Münster in 1998. She has been with EUR since 2000. Her previous posts include those of co-director of the Centre of Excellence on Public Safety Management at the Rotterdam School of Management (RSM) and dean of Erasmus University College. Since 2017, Gabriele has been Professor of Organisational Behaviour and Culture, originally at the RSM and from 2020 at the Erasmus School of Social and Behavioural Sciences (ESSB).

Want to read more?

Consortium AI-MAPS, supported by Leiden-Delft-Erasmus, receives a €2 million grant