TECHNOLOGY, ACCOUNTABILITY AND GOVERNANCE: ICOCA’S MARCH 2025 WORKSHOP REPORT

Summary Report on ICoCA’s Consultative Workshop on Responsible Security in the Digital Age, March 2025

 

In 2024, ICoCA teamed up with ICT4Peace to support the creation of the first Toolkit for the Responsible Use of Technology in the Private Security Sector. The Toolkit identifies the main challenges brought by the development of new technologies in the private security sector and offers practical recommendations for companies to manage the risks they pose. Upon the release of the Toolkit, ICoCA convened a consultative workshop of experts on 26 March 2025 to further investigate and elaborate on how technology is transforming the private security industry and how the regulatory and governance framework of private security needs to be adapted and developed in response.

The following provides a summary of the discussions which took place around four key questions posed during the workshop:

  • How does technology transform the private security sector?
  • What are the risks for human rights and international humanitarian law (IHL)?
  • How to adapt/develop the regulation of private security?
  • How to adapt/develop the governance of private security?

 

How does technology transform the private security sector?

  1. Considerations on ethical challenges

Whilst it is difficult to draw clear-cut lines between ethical/lawful and unethical/unlawful technologies, the manner in which private security companies (PSCs) use technologies ultimately depends on the risk they are willing to take. There are specific areas in which a PSC’s use of technology can be deemed unlawful or unethical, for example when it comes to collecting intelligence to target specific individuals such as journalists and opposition leaders. As international legal instruments have long recognised the right to privacy as a fundamental human right, the responsible use of technologies in the digital age must imply sensitivity to data collection and protection issues and to how these can often disproportionately affect vulnerable communities. The cascading effects on human rights can extend well beyond privacy risks.

  2. Considerations on regulatory and legal challenges

Governments should consider the type of regulatory and legal frameworks needed to ensure responsible data collection, management and protection practices by actors involved in the private security industry, whether traditional PSCs or tech companies. The effectiveness of existing accountability mechanisms for companies providing surveillance services needs careful assessment. Companies may or may not be aware of the obligations and risks associated with providing these types of services in the first place. It is not only the responsibility of businesses to gain clarity on the compliance and human rights due diligence requirements they should abide by, but also the responsibility of governments and relevant institutions in the field to formulate guidelines that contextualise new technologies and digital tools and clearly incorporate them into current frameworks regulating the provision of private security services.

There is a need to expand procedures for data management and protection beyond the European Union’s General Data Protection Regulation (EU GDPR) – one of the most comprehensive regulatory frameworks to date – with a view to also facilitating the integration into oversight mechanisms of the new actors now comprising the market for private security.

More inclusive and exhaustive guidelines are necessary not only because they make effective compliance and accountability mechanisms possible, but also because they represent an essential source of guidance for PSCs that operate in complex and challenging environments. PSCs can face difficulties when reaching out to authorities in areas lacking any sort of enforceable legal code, which makes them more receptive to guidance on IHL and human rights issues. Drafting all-encompassing guidelines in a participatory way, seeking the contributions of all actors involved, will also allow institutions to acquire a deeper understanding of private actors’ needs and shift the discussion around the responsible use of technologies in private security from a sanctions-centred focus to an incentives-centred focus, raising PSCs’ stakes in complying with more defined and explicit regulations. Technology often gives PSCs leeway to provide security services in contexts where demand cannot be met through traditional methods; therefore, elaborating a clear and convincing value proposition articulating why businesses should instead limit or more strictly regulate their use of new technologies is key.

  3. Considerations on workforce and local communities

The private security industry provides important job opportunities in countries with weak socioeconomic conditions; by replacing human personnel with drones or other types of technologies, PSCs risk losing their “social licence” to operate, namely the legitimacy they have gained over the years by employing locals and interacting with their communities. One way for PSCs to deal with such conflicting interests could be to position themselves as mediators: instead of picking sides and endorsing a clear stance in favour of either their clients or the local communities that provide most of their labour supply, PSCs can use their corporate responsibility values to reduce the risk of – or at least de-escalate – social conflicts, proposing compromise solutions that strike a balance between making more substantial use of technological tools and continuing to guarantee employment opportunities.

  4. Considerations on complex environments and humanitarian operations

The role of PSCs operating in complex environments that are subject to humanitarian operations is evolving because of technology. With activities now ranging from traditional demining, manned guarding and close protection to new tech-related areas involving data collection and processing, there is increasing complexity in mapping all the stakeholders PSCs work with and the risks they take in collecting, handling and storing sensitive data in highly volatile contexts. Given PSCs’ multidimensional and ever-evolving role in the humanitarian sphere, there are challenges in co-designing processes with other actors operating in this type of high-risk context, above all civil society organisations.

 

What are the risks for human rights and IHL?

  1. Technology, private security and the right to privacy 

The impact of technology-driven security services on individuals’ privacy rights needs careful consideration. Given the proliferation of actors that now handle sensitive information such as biometric data and may incur privacy and human rights violations, identifying and promoting essential responsible data management practices is key. Privacy violations can often act as a gateway to broader human rights abuses: even extreme cases of extrajudicial killing – which obviously represents a breach of the right to life – typically require privacy violations to take place beforehand, ranging from undisclosed surveillance practices to sensitive data retention.

  2. The interdependence between public and private sectors

The increasing dependence of the public sector on digital security services provided by the private sector raises serious questions around the dilution of state sovereignty. While governments benefit from the sector’s research and investment capacities and its most up-to-date technologies, relying on PSCs’ services also implies that they do not have total control over whether the data utilised and/or collected by such technologies is managed in compliance with IHL and human rights provisions. PSCs and tech companies can acquire large volumes of civilian data sourced from online interactions through websites, social media platforms, gaming, etc., for very low prices.

  3. Adapting existing regulatory frameworks

Existing human rights instruments can still apply to new technologies and services. Relatively small amendments to the International Code of Conduct (the Code) could already be a good step towards addressing the new challenges characterising private security’s digital era. Any adaptation of the existing regulatory frameworks, however, needs to be conceptualised, delivered and communicated with clarity and precision. Given the ambiguity of labour standards in poorer countries and the lack of clearly designated roles between the private sector’s compliance requirements and states’ regulatory regimes, new regulatory solutions must be formulated with, and explicitly address, the public as well as the private sector, assigning clear responsibilities and due diligence requirements to both.

 

How to adapt/develop the regulation of private security?

  1. Legal interpretations and definitions

Technologies may be considered new types of “means and methods” through which security services are provided, since the aim of a security service provided through technological means remains similar to that of a security service provided through traditional means. Therefore, the two services should be treated equally under the law or statute in question. Grouping relevant technologies as “means and methods” in a way that is neither too specific nor too broad will still be challenging. Mapping the most relevant threats fuelling the demand for tech-security services, and then identifying which types of technologies PSCs and tech companies provide to address them, is key. This could be a way to strike a balance between principles-based regulation – which relies on high-level, broadly stated principles that set standards for how regulated firms conduct business but remain open to interpretation – and rule-based regulation, which instead relies on detailed, prescriptive and far more stringent rules.

  2. Revising the Code

Identifying ways to adapt the Code to the new types of digital services now provided by PSCs and tech companies alike is key. Existing regulations do not include terms like “privacy”, “data protection” or “cybersecurity”, as they were drafted with mostly physical attacks and military operations in mind. The Code’s current definition of security services risks being too narrow in two ways: first, it lacks explicit references to tech-related services such as those offered by app developers, cloud providers and similar actors; second, by focusing only on the “use” of a specific service, whether tech-related or not, the Code de facto excludes most of the development life cycle of tech products. As a result, any data collection or management practice taking place before a product is on the market is much more exposed to IHL and human rights violations. Whether cybersecurity services and the associated human rights and IHL risks are sufficiently different to require a specific approach also needs careful consideration. Options range from a full-fledged revision of the Code to lighter approaches, such as integrating existing articles with explicit, more precise tech-specific terms; alternatively, updating the Code’s compliance indicators could provide more clarity on compliance requirements related to tech services without needing to update the Code itself.

  3. Keeping pace with technology

The strategic effectiveness of “chasing” tech companies directly by amending the Code and “forcing” them into its framework is worthy of consideration. ICoCA Members and Affiliates are interested in demonstrating to their clients adherence to frameworks that ensure secure data management and compliance with human rights. But the rapid pace of technological change means that updating or reviewing the Code to keep up with such developments is extremely challenging. The real question is identifying areas of priority: what is under-addressed, what is over-addressed, and which challenges PSCs are seeking solutions for.

 

How to adapt/develop the governance of private security?

  1. Engagement with tech companies and stakeholders

Outreach strategies towards technology companies that are new to the private security market will need to be developed. The potential for ICoCA to reach out to technology companies and expand dialogue on new technologies needs assessing. Establishing clear pathways for engagement not only with tech companies, but also with multi-stakeholder platforms and civil society organisations that operate in the field, will be key.

  2. Engagement strategy and challenges

ICoCA should strive for more diversified partnerships through engagement with civil society organisations and organisations that represent network coalitions, such as the Global Network Initiative (GNI) and the Heartland Initiative. This would also allow ICoCA to expand its governance model and include a broader range of stakeholders.

  3. Value proposition and raising awareness

There is a need to raise awareness of, and communicate more effectively, companies’ stakes in complying with regulations on the technologies they use and/or provide. The reputational and security risks posed by technologies are still not perceived as strongly as other incidents related to conventional corporate responsibility and ethics. Developing case studies that could be disseminated through ICoCA’s case-map would allow ICoCA to articulate the risks and best practices related to the use of technology in private security more clearly and to translate the Toolkit’s guidelines into practical, evidence-based recommendations. This could also act as a feedback mechanism to keep the Toolkit updated and relevant.

 

Conclusion: Navigating technology’s opportunities and challenges: collaborative governance for responsible security in the digital age

The challenges posed by technology in the private security sector are multidimensional and involve actors across the whole governance spectrum (international, national, private and public). Therefore, any strategy aiming to uphold and enhance IHL and human rights accountability in the field must be grounded in a “co-regulatory governance” framework involving the “meaningful participation from state, business and civil society actors” (David Kaye, former UN Special Rapporteur on Freedom of Opinion and Expression).

ICoCA is committed to adopting a more up-to-date strategy designed as a collaborative, multi-stakeholder plan of action. Through this strategy, ICoCA aims to: (i) enlarge its platform and adapt it to all the new actors brought into the industry by the advent of new technologies; (ii) promote and help operationalise the Toolkit to support security providers’ compliance with human rights, IHL and the Code’s provisions; (iii) consider reviewing the Code to help develop a comprehensive regulatory and governance framework that the international community can use as reference to expand the principles of ethical business conduct to the realm of private security in the digital age.