You may not know the word for it, but you, like many people, probably saw the recent doctored video of U.S. House Speaker Nancy Pelosi. The “deepfake” made it look as though she was slurring her words, perhaps implying that she was under the influence of alcohol. The Washington Post reported that the manipulator apparently slowed down the original video and modified the pitch of the Speaker’s voice.
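To see how little technology a manipulation like that requires, here is a minimal Python sketch using only the standard library, with a generated test tone standing in for real speech audio. It slows a recording simply by declaring a lower sample rate for the same samples, which also lowers the pitch; a real manipulator would then re-pitch the audio to mask the slowdown. This is only an illustration of the general technique, not the actual pipeline used on the Pelosi clip.

```python
import math
import struct
import wave

# Generate a one-second 440 Hz test tone (a stand-in for real speech audio).
rate = 8000
frames = b"".join(
    struct.pack("<h", int(16000 * math.sin(2 * math.pi * 440 * n / rate)))
    for n in range(rate)
)
with wave.open("original.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)
    w.setframerate(rate)
    w.writeframes(frames)

# "Slow down" the recording by about 25% by writing the identical samples
# out with a lower declared sample rate: playback takes longer and the
# pitch drops correspondingly.
with wave.open("original.wav", "rb") as src, wave.open("slowed.wav", "wb") as dst:
    dst.setnchannels(src.getnchannels())
    dst.setsampwidth(src.getsampwidth())
    dst.setframerate(int(src.getframerate() * 0.75))
    dst.writeframes(src.readframes(src.getnframes()))

with wave.open("slowed.wav", "rb") as w:
    duration = w.getnframes() / w.getframerate()
print(round(duration, 2))  # one second of audio now plays for about 1.33 seconds
```

The point is not the specific numbers but how trivial the edit is: a few lines of code, no machine learning required, which is part of why such clips spread faster than they can be debunked.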
Deepfakes are “videos and presentations enhanced by artificial intelligence and other modern technology to present falsified results … [including] deepfakes involv[ing] … celebrities, politicians or others saying or doing things that they never actually said or did,” explains Techopedia.
Political process vulnerable
As the Pelosi example suggests, deepfakes could wreak havoc in the political space. In response, a bill is pending in the California state legislature that would ban the use of audio, image and video deepfakes (or other kinds of alterations) of a candidate for office within 60 days of an election, unless the manipulation is disclosed. For the law to apply, the person promoting the deepfake would have to either intend to harm the reputation of the candidate or to influence voters.
As of this writing on July 17, the bill has already passed the Assembly by a unanimous vote and is winding its way through the committee process in the state Senate.
Intellectual property rights also at risk
But another enormous concern is what deepfakes could do to intellectual property rights. Above the Law recently published an insightful article on this topic. Let’s talk about copyright and the right of publicity in this context.
Say a photographer or painter created a copyrighted work. The Above the Law article raises the possibility that someone could create a deepfake of the copyrighted image that disparages the work or the copyright owner, and that the deepfake creator could then argue they are not violating the copyright under the doctrine of fair use. This could become a new application of the fair use defense.
Deepfakes, especially altered images of celebrities, could violate state right of publicity statutes, such as California’s, if they are used to make money. In our state (and some others), a person (or their heirs or beneficiaries if the person is deceased) can sue for damages when someone misappropriates their image for commercial profit, as a violation of their right of publicity.
While altered images and recordings have been around for a long time, deepfakes are becoming so hard to detect that these and other related intellectual property issues will likely only become more common and more complex. We will keep tabs on this growing area of the law to understand how it will impact our individual and business clients.