Exploring the Potential of Technology to Enhance Global Peace at the First Annual Conference of the Global PeaceTech Hub

The first Annual Conference of the Global PeaceTech Hub brought together experts and representatives from PeaceTech labs, research institutes, non-profit organizations, governments, and companies to discuss how technology can enhance peace at the global level. The conference was an invitation-only event, although on the second day three parallel sessions were open to the public.

The conference addressed topics such as matching algorithms and digital identities for refugee management, predictive analytics for peace, peaceful digital ecosystems, cyber diplomacy, the ethics of human rights in tech governance, satellite and space-tech for peace, peace engineering, and investment in technologies of peace. The event also marked the award of the first Kluz Prize for PeaceTech, an initiative supported by Artur Kluz, Founder and CEO of Kluz Ventures, given to a project that demonstrates exceptional potential in applying emerging technologies to positively impact transnational peace and cooperation. Juan Carlos Lucero, Co-Founder of the Magnolia Foundation, received the award to further his organization’s School of Peace, a multidisciplinary approach to teaching peace, mediation, and rehabilitation.

The conference was a valuable opportunity for attendees to learn about the latest developments in PeaceTech and to network with other experts and practitioners in the field. Through open discussion and the sharing of ideas, the conference aimed to promote a global culture of genuine human growth that meets the current and future needs of all people and serves as an inspiration for future generations. A breakdown of the conference sessions is offered below, as reported by Uma Kalkar in her blog post “Situating PeaceTech: Takeaways from the Inaugural Global PeaceTech Conference”.

Opening Panel

To kick off the conference, individuals from the public, private, and non-profit sectors discussed key challenges, triumphs, and questions facing the use of technology in peacekeeping and peacebuilding efforts.

  • Cameran Ashraf, Human Rights Lead at the Wikimedia Foundation and Co-Founder of AccessNow, introduced the growing problem of “entryism,” wherein disinformation agents distort trust in online spaces by becoming pillars of a digital community over time and then spreading false narratives. He discussed how Russian and Belarusian agents have weaponized collaborative editing and forum spaces to spread disinformation about the Russia-Ukraine War.
  • Elizabeth Crossick, Head of Government Affairs EU and Global Policy Lead AI at RELX, presented the eyeWitness to Atrocities app, an Android app that time- and geo-stamps user-captured pictures of human atrocities and uploads them to a secure server to create a legally admissible chain of custody for the evidence (a rough sketch of this idea follows the list below).
  • Helena Puig Larrauri, Strategy Lead & Co-founder at Build Up, then turned to the classification, application, regulation, and implementation of peace technologies, stating that PeaceTech “is neither good nor bad, but it is political.” She argued that the focus on ‘high tech’ is misdirected and that peace practitioners should focus on usable ‘low tech’ for better uptake and action.
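
To make the chain-of-custody concept behind eyeWitness more concrete, the following is a minimal sketch of how a capture record might pair a UTC timestamp and GPS coordinates with a cryptographic hash of the image. The field names and hashing choice are illustrative assumptions, not the app’s actual implementation.

```python
import hashlib
import json
from datetime import datetime, timezone


def build_evidence_record(image_path: str, latitude: float, longitude: float) -> dict:
    """Create a tamper-evident record for a captured image.

    The record pairs a UTC capture timestamp and GPS coordinates with a
    SHA-256 digest of the image bytes; any later alteration of the file
    can be detected by re-hashing it and comparing digests.
    """
    with open(image_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    return {
        "captured_at_utc": datetime.now(timezone.utc).isoformat(),
        "latitude": latitude,
        "longitude": longitude,
        "sha256": digest,
    }


if __name__ == "__main__":
    # Hypothetical usage: hash a local photo and print the record that would
    # accompany the image when uploaded to a secure evidence server.
    record = build_evidence_record("photo.jpg", latitude=50.45, longitude=30.52)
    print(json.dumps(record, indent=2))
```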

 

Defining (Global) PeaceTech

In the next session, members of the Global PeaceTech Hub presented their initiatives to scope and set the agenda for the field.

  • Kalypso Nicolaïdis, Chair in Global Affairs at the EUI School of Transnational Governance, and Michele Giovanardi, Coordinator of the Global PeaceTech Hub, outlined their definition of Global PeaceTech as a “field of analysis applied to all processes connecting local and global practices aimed at achieving social and political peace through the responsible use of frontier technologies.” They demonstrated how Global PeaceTech stands in the middle of power asymmetries between different sectoral actors, ideas and material tools, and interests.
  • Stefaan Verhulst, Co-Founder and Chief Research Officer of The GovLab, and Uma Kalkar, Researcher at The GovLab, along with Andrea Renda, Adjunct Professor of Digital Policy at the EUI School of Transnational Governance, presented the methodology and findings of the “PeaceTech Topic Map: A Research Base for an Emerging Field.” They outlined the six overarching categories of where PeaceTech is applied and the challenges that the field faces, specifically with regard to dual-use technologies and poor existing oversight and regulation.
  • Lucia Bosoer, Project Associate at the Global PeaceTech Hub, and Michele Giovanardi, Coordinator of the Global PeaceTech Hub, debuted the Global PeaceTech Atlas, a repository of existing PeaceTech initiatives organized by country to understand who is carrying out PeaceTech and where (a simple sketch of this kind of country index follows the list below).
  • Evelyne Tauchnitz, Senior Researcher at the Institute of Social Ethics ISE, University of Lucerne, discussed the ethical and theoretical framework against which PeaceTech should be gauged, pointing to the need for human rights to be set as the minimal ethical standard.
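
As a rough illustration of what a country-indexed repository like the Atlas involves, the sketch below groups initiative records by country. The records and field names are placeholders invented for the example, not entries from the actual Atlas.

```python
from collections import defaultdict

# Placeholder records invented for this example; the Atlas's actual schema
# and entries are not reproduced here.
initiatives = [
    {"name": "Example dialogue platform", "country": "Kenya"},
    {"name": "Example early-warning project", "country": "Colombia"},
    {"name": "Example mediation toolkit", "country": "Kenya"},
]


def group_by_country(records: list[dict]) -> dict[str, list[str]]:
    """Index initiative names by country, showing who does PeaceTech where."""
    atlas: dict[str, list[str]] = defaultdict(list)
    for record in records:
        atlas[record["country"]].append(record["name"])
    return dict(atlas)


if __name__ == "__main__":
    for country, names in group_by_country(initiatives).items():
        print(f"{country}: {', '.join(names)}")
    # Kenya: Example dialogue platform, Example mediation toolkit
    # Colombia: Example early-warning project
```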

 

The Many Ways to PeaceTech

Peace experts then discussed the ways in which they have incubated PeaceTech actions, as well as the challenges and avenues of further research they have to consider.

  • Lisa Schirch, Professor of Peace Studies at the Kroc Institute for International Peace Studies at the University of Notre Dame, discussed the business case (or rather, lack thereof) for creating peace on digital platforms. She touched on the inherent design of platforms as “Gladiator arenas” that amplify polarizing and hateful content in order to generate likes and profits, and how efforts to regulate content have been met with contempt from users.
  • Sheldon Himelfarb, CEO of the PeaceTech Lab, mentioned an initiative to create an ‘International Panel on the Information Environment,’ akin to the Intergovernmental Panel on Climate Change, to tackle the growing social and political threats posed by misinformation and hate speech.
  • Christine Bell, Director and Principal Investigator at PeaceRep at the University of Edinburgh, showed her lab’s peace process trackers, which use data to visualize peace agreements and peacebuilding processes. She showed the messy, back-and-forth nature of the “peace-conflict continuum” and concluded with a look towards better, more ethical data gathering and sharing to improve peace insights.
  • Lisa Glybchenko, Founder of Color Up Peace, showcased how she combines technology and art to create a “digital visuality” of peace in conflict-affected zones and to imagine futures of peace.

 

Thematic Discussions

The conference also featured three concurrent workshop sessions on different aspects of PeaceTech, including AI ethics, cyber peace, and peacebuilding tools.

Towards Ethical Design and Implementation of AI in Peacebuilding

Panelists:

  • Branka Panic, AI for Peace
  • Andrea Renda, EUI School of Transnational Governance
  • Evelyne Tauchnitz, Institute of Social Ethics ISE, University of Lucerne
  • Stefaan Verhulst, The GovLab

Branka Panic, Founder and Executive Director of AI for Peace, led a panel discussion on the need to examine existing AI ethics practices, identify gaps, and offer solutions for AI ethics issues in PeaceTech. She focused on the ‘do no harm’ principle and ‘conflict sensitivity’ practices as ways to assess and upgrade tools to be more peaceful, and discussed the efficacy of these approaches for ethical AI for peace.

Evelyne mentioned the need to think about who bears responsibility for the deployment of AI, noting that the onus for ethical use falls to humans. Andrea gave an overview of the existing landscape of AI ethics frameworks created by the US government, the OECD, the Council of Europe, and other high-level sources. He concluded with the need for a “PeaceTech by design” approach to AI ethics principles that stress-test the trustworthiness, robustness, and ethical nature of an AI system. Lastly, Stefaan introduced the responsibility of ‘non-use,’ arguing that declining to deploy a technology in the name of ‘do no harm’ where it could be useful is as dangerous as deploying unethical tools. He called for policymakers to consider not only the misuse of technology when shaping AI ethics but also its missed use and the consequences that follow.

Cyber Peace: Cooperating for the Stability and Security of the Cyber Domain

Panelists:

  • Andrea Calderaro, Centre for Internet and Global Politics, Cardiff University
  • Francesca Bosco, Cyber Peace Institute
  • Madeline Carr, University College London
  • Martin Koyabe, AU-GFCE Project
  • Vladimir Radunovic, DiploFoundation

Andrea Calderaro, Robert Schuman Centre for Advanced Studies Fellow, Associate Professor in International Relations, and Director of the Centre for Internet and Global Politics at Cardiff University, moderated this panel on the challenges and opportunities of cyber diplomacy. He asked the panel what international cooperation means with respect to cyber security and whether the existing venues and tools available in the cyber domain were sufficient for transnational governance and peacebuilding.

Madeline began by stating that typical instruments of diplomacy are not well suited for the cyber domain. She noted the slow pace at which existing diplomacy processes move and how mismatched this is with the speed of technological progress. A first step to rectifying this issue, she argued, is to build consensus and a shared understanding of the issues affecting peace. Francesca complemented this stance by mentioning the potential for civil society and advocacy organizations to identify, assess, and monitor existing and potential cyber threats and to improve existing international cyber security governance. She called for international organizations to co-design cyber peace initiatives with civil society rather than merely consult them. Next, Martin noted that standards to support value chains remain an important and unaddressed issue when it comes to implementing cyber diplomacy initiatives. He called for better knowledge sharing, which Vladimir expanded on by introducing the notion of inclusive and representative cyber diplomats who work across sectors and agencies to achieve change.

PeaceTech Toolkit for Peacebuilding in the 21st Century

Panelists:

  • Gilbert Beyamba, Pollicy
  • Krystel Tabet, Build Up
  • Alexander Young, Peace Tech Lab

Krystel kicked off the workshop with an overview of Build Up’s Phoenix tool, an open-source, non-commercial tool that helps peacebuilders use social media data ethically. The tool draws on the concept of ‘social listening,’ an engagement strategy that sources online reviews and discourse around products to improve targeting and sales. Krystel discussed how Build Up applies social (media) listening to gauge online narratives and harness information about comments, posts, and shares in order to understand where polarization occurs, what groups facilitate hate speech, and how peacebuilders can tailor their responses.
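
As a rough illustration of the social listening idea (not Phoenix’s actual code; the keyword list, data format, and community labels below are invented for the example), a peacebuilder might start by counting how often flagged, polarizing terms appear in posts from different communities:

```python
from collections import Counter

# Hypothetical keyword list; a real deployment would use curated,
# context-specific lexicons or trained classifiers rather than raw keywords.
POLARIZING_TERMS = {"traitor", "enemy", "invader"}


def score_posts(posts: list[dict]) -> Counter:
    """Count flagged-term mentions per community.

    Each post is a dict like {"community": "...", "text": "..."}; the result
    indicates which communities' discourse contains the most flagged language,
    pointing peacebuilders toward where polarization may be concentrating.
    """
    hits: Counter = Counter()
    for post in posts:
        words = {w.strip(".,!?").lower() for w in post["text"].split()}
        hits[post["community"]] += len(words & POLARIZING_TERMS)
    return hits


if __name__ == "__main__":
    sample = [
        {"community": "forum_a", "text": "That group is the enemy, a traitor to us all."},
        {"community": "forum_b", "text": "Let's organise a neighbourhood clean-up."},
    ]
    print(score_posts(sample))  # e.g. Counter({'forum_a': 2, 'forum_b': 0})
```

A production tool built on this idea would also need the ethical safeguards around consent and data handling that Build Up emphasizes for using social media data.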

Next, Gilbert presented Digital Safe-Tea, a digital security game Pollicy created to train African women to protect themselves against hate online. The game features multiple avatars and scenarios of online challenges women face, along with resources to educate on and address online threats.

Lastly, Alexander walked through the PeaceTech Lab’s tailored training sessions, which unlock the power of low tech by helping participants not only collect data but also tell a story with the information they gather. Specifically, he discussed the US ‘Road to Equal Justice’ training session, which hosted 39 participants from 28 organizations and taught them how to leverage easily accessible and familiar survey and data collection tools, data analysis and visualization, and social media strategies to advance their racial equity and justice work.

 

Investing in PeaceTech

The final session of the conference looked at avenues for sustainable, long-term growth and investment in PeaceTech.

  • Margarita Quihuis, Executive Director of the Peace Innovation Lab, framed peace as a service that can facilitate better behavior between parties. Margarita referred to Peace Dot, an initiative that used Facebook to facilitate friendships across groups that have historically been in conflict with each other, and Play Nice, a project to bolster healthy online communities. Her findings showed how online platforms can enable prosocial, peaceful behaviors.
  • Mark Nelson, Co-Founder and Director of Innovation of the Peace Innovation Lab, then discussed the financing of peace-related work, noting that much of it has been supported by military organizations, a pattern that biases the agenda for peace and gives rise to the dual-use nature of peace technologies. He introduced the notion of peace finance, which encourages the creation of profit-generating peace efforts to build an investable portfolio of peace capital and put the global market in the service of peace.

 
