Both Congress and Private Sector Innovators Have Important Roles in Combating Deep Fake Intimate Imagery

Over the years, leading U.S.-based tech companies have developed technologies that have created new opportunities for Americans. However, while these innovations have benefited many consumers and businesses, they have also introduced new risks and potential for misuse.

As generative imagery technology has improved in recent years, there has been a concerning increase in artificially generated images – known as “deep fakes” – that portray people in sexually explicit contexts without their consent. These non-consensual intimate images (NCII) cause serious harm to victims and can be deeply distressing for those affected, and young Americans are particularly at risk.

Recognizing the horrific effect of these images, American companies are often the first line of defense against NCII and are taking significant measures to combat it. Many tech platforms, for example, have simplified the process for victims to request the removal of deep fake explicit imagery from search engines at scale and are continuing to refine their detection and removal capabilities. This includes leveraging AI technology to find and remove explicit images and their duplicates. Additionally, companies have trained their algorithms to proactively identify and flag search terms that present a higher risk of surfacing explicit deep fake content. As companies continue to innovate, this technology will only improve, reducing the risk, scale, and impact of NCII.

SIIA welcomes the proactive approach the private sector has taken in tackling this issue, but it’s also important that lawmakers use the tools at their disposal to prevent this abusive behavior from gaining traction. Concerningly, despite bills passed at the state level and executive actions taken by the White House, federal laws do not currently provide adequate protections for victims of NCII, especially AI-generated NCII. However, leaders in Washington – including Senators Ted Cruz, Amy Klobuchar, Josh Hawley, Dick Durbin and Lindsey Graham – are leading the charge on legislative solutions designed to address the issue by creating criminal and civil penalties for bad actors. Passing such legislation would address gaps in current law and provide the first federal legal framework to hold perpetrators accountable for creating and distributing real or deep fake NCII without consent. Additionally, by establishing clear legal consequences, including fines, damages, and other penalties, these solutions would serve as a deterrent to potential offenders and offer victims a way to seek justice.

Further, as Congress considers legislation such as the TAKE IT DOWN Act and DEFIANCE Act to hold bad actors accountable, it’s also essential that lawmakers provide legal clarity allowing private actors to continue effectively combating NCII.

Congress can and should act urgently to establish a legal framework to combat the production and distribution of NCII. It is equally important for lawmakers to work with American companies, which can continue to play a critical role in defending against these abuses and can often anticipate future risks and respond to them in a timely manner. This collaborative approach is essential to creating a comprehensive federal standard that both punishes offenders and empowers victims. By combining legislative action with technological innovation, elected officials can embrace new solutions to better protect individuals from the harmful effects of NCII.