Principles for Combating Image-Based Sexual Abuse
While image-based sexual abuse (IBSA) is not a new problem, technological advances have made it easier than ever to create, distribute, and monetize nude or sexually explicit imagery of individuals without their consent. Regardless of whether the images are authentic or inauthentic, IBSA can cause devastating psychological, financial, and reputational harm. It can also be a factor in harassment, impersonation, and offline violence. IBSA has a chilling effect on the free expression and full civil participation of those targeted, disproportionately silencing women and girls (especially women and girls of color) and the LGBTQI+ community.
After the White House issued a Call to Action to Combat Image-Based Sexual Abuse for tech and civil society on May 23, 2024, the Center for Democracy and Technology (CDT), the Cyber Civil Rights Initiative (CCRI), and the National Network to End Domestic Violence (NNEDV) invited civil society organizations and tech industry leaders to a multistakeholder working group (“Working Group”) focused on combating IBSA. This working group convened a series of meetings to provide opportunities for experts and survivor advocates to share information about the definitions, scope, and impact of various forms of IBSA (including the nonconsensual distribution of intimate images (NDII) sometimes referred to as “nonconsensual pornography” or “revenge porn”; sexually explicit digital forgeries, often referred to as “AI-generated porn” or “deepfakes”; and sexual image extortion, often referred to as “sextortion”) with tech industry leaders; for tech industry leaders to share information about their existing and ongoing efforts to prevent and address IBSA; and for civil society and industry to collaborate on the development of actionable best practices and further opportunities to stem IBSA.
The following voluntary principles are derived from these initial discussions. The Working Group anticipates that these principles will inform the development of industry best practices and be refined as technology advances and industry standards evolve.
Consent and Autonomy by Design
Consent should be central to and woven throughout both policy and product design to preserve victim safety, privacy, self-expression, and self-determination. Individuals should have the ability to control whether and how their likeness or body is depicted, or appears to be depicted, in intimate imagery, as well as whether and how such depictions are disseminated. When reviewing current products and services and when developing new ones, all companies should identify how those products and services may be exploited to contribute to the creation, solicitation, distribution, or monetization of nonconsensual intimate imagery; develop interventions that mitigate those risks; and continually assess and improve the efficacy and proportionality of those interventions.
Accessibility and Inclusion
Company approaches to preventing and mitigating IBSA should be gender inclusive, accessible to people with disabilities, considerate of cultural implications, and available in the language in which the user interacts with the service. They should minimize the burden on victims and be responsive to the needs of users regardless of their language, country, or region.
Clear Content Prohibition and Policy
All tech platforms and services should expressly prohibit the creation, solicitation, distribution, or monetization of nonconsensual intimate imagery. These policies should be clear, conspicuous, and accessible to all users.
Trauma-Informed Approaches
Policies, product features and functionality, and procedures for responding to abuse should all be developed through a trauma-informed lens to ensure not only sensitivity to these issues but also effective mitigation of and response to potential harm. This may include training staff, partnering with experts, building safety features that alert users to the risk of intimate image abuse, and investing in resources and tooling to ensure the rapid processing of reports.
Prevention and Harm Mitigation
The same technologies that can facilitate free expression, intellectual inquiry, privacy, and finding and building community can be misused to facilitate IBSA. Given the severe and often irreparable harm caused by IBSA, companies should, as appropriate to each service, invest in preventive practices, respond promptly to identified or potential intimate image abuse, and provide trauma-informed responses when harm occurs.
Effective Implementation
Companies should implement effective, prominent, and easy-to-use tools to prevent, identify, mitigate, and respond to IBSA. Reports of abuse should be quickly addressed in ways that support the needs of the victim, and companies should consistently enforce their platform policies against the nonconsensual creation or distribution of intimate images.
Transparency and Accountability
Users should be able to understand how companies address the creation and distribution of nonconsensual intimate images. Companies should be transparent about their policies and enforcement related to the creation and distribution of nonconsensual intimate images and should engage with civil society organizations that have subject matter expertise to inform their approach.
Commitment and Collaboration
The development of best practices to mitigate intimate image abuse will require ongoing collaboration among victims, industry, civil society, and government. Companies should commit to ongoing coordination and engagement to develop shared knowledge, technological solutions, and educational strategies to combat IBSA.
Affiliations listed for identification purposes only.
former research manager at the Stanford Internet Observatory
Georgetown University
Georgetown University
Northumbria University Newcastle
For press inquiries, contact press@cdt.org
To sign your organization onto the IBSA Principles, contact signon@ibsaprinciples.org