How can AI be developed so that it does not discriminate or consume large amounts of energy, but actually benefits society? A large-scale national AI initiative, housed at UvA's Informatics Institute, is exploring this question. Not through science alone, but in close cooperation with companies and (semi-)government institutions.

Researching how to improve logistics and reduce food waste with supermarket chain Ahold. Working with the Netherlands Cancer Institute on more precise cancer radiotherapy. Or working with the City of Amsterdam and the Ministry of the Interior and Kingdom Relations on reducing bias in AI models, so that people are not discriminated against because of their origin, address or skin colour.

Photo: Benedicta Badu

These are some examples of research projects currently being carried out within one of the 49 labs of the Innovation Centre for Artificial Intelligence (ICAI), a network in which knowledge institutions, businesses, governments and non-profit organisations work together to further develop knowledge on responsible AI and to attract AI talent and deploy it on relevant social issues. The national partnership was founded in 2018 on the initiative of the University of Amsterdam (UvA) and the Vrije Universiteit Amsterdam (VU). Since then, a total of 18 knowledge institutions and over 145 other partners have joined the initiative.

Access to data

Esther Smit is business manager of ICAI. She has worked at the UvA's Informatics Institute (IvI) for over eight years, at the intersection of knowledge sharing, community building and setting up collaborations. The ICAI labs are demand-driven, she says. ‘Each lab always involves a non-knowledge institution, such as a company or (semi-)government body. They approach us with specific questions or challenges they are facing. Together, we look at the scientific questions that we can address at the same time.’ PhD research is central; at least five PhD students are involved in each lab. They can spend part of their time at the external organisation to jointly take up the research. ‘Not by working for the company, but to better understand its research topic. Moreover, they often get access to data from that organisation. This gives the researchers the opportunity to test their algorithms on ‘real’ data that they would not normally have access to.’


SDGs

An important aspect of ICAI is its focus on societal impact. All labs have committed to one or more Sustainable Development Goals (SDGs). ‘Think, for example, of challenges around the energy transition. The Dutch power grid is currently overloaded; how can we better match power supply and demand? And in the Caribbean, in the Illustre programme, besides a reliable energy supply, access to clean water is also an issue.’

The 17 labs of the large-scale, long-term NWO programme ROBUST are also set up and formally housed at ICAI. This programme focuses on increasing the reliability of AI. Many AI systems still make too many mistakes or suffer from bias. This can lead to people being misinformed and therefore making wrong decisions, or to systems making wrong decisions with undesired consequences. Think of a medical AI system prescribing the wrong treatment, or a self-driving car causing an accident.

Leading the way

‘With the AI Act, the European Union is also increasingly pushing for human-centric, responsible AI. With ICAI and ROBUST, we picked up on that quickly in the Netherlands.’ According to Smit, the Netherlands even has a leading position when it comes to responsible AI: in the Global Index on Responsible AI, the Netherlands ranks first. ‘The Netherlands has always been strong in AI. At the UvA, VU, Utrecht University and Radboud University Nijmegen, AI has been researched since the 1980s, and there are strong machine learning groups there. We have also had degree programmes specifically on artificial intelligence since the 1990s. Because this knowledge is housed at broad, multidisciplinary universities, there has always been a link with other disciplines, including the social sciences. As a result, we are well placed to build bridges.’

Community

ICAI is physically housed in the LAB42 building, near IvI. IvI researchers who do not participate in ICAI, but who are looking for cooperation with societal partners, can also contact Smit. She also supports setting up other collaborations, makes agreements on access to data and assists in contract negotiations with external parties. In addition, she and her team connect companies renting offices in LAB42 with each other and with researchers and students. ‘We want to connect the different occupants of the building to build a community,’ she says.

ICAI, meanwhile, is in a transition phase, Smit explains. A number of labs have already been completed, and the next step is to connect the labs better with each other and to highlight their societal impact more clearly. The network has built a strong name for itself, Smit believes. ‘The format in which we work with external partners has proven itself. It allows us to collaborate with them in a good way.’