


Herrmann Technology Inn of Court and Melson-Arsht Inn of Court Hold Joint Meeting on Deepfake Technology

Last week, on Tuesday, January 9, the Richard K. Herrmann Technology Inn of Court and the Melson-Arsht Family Law Inn of Court held a joint meeting where members could connect and discuss developments in artificial intelligence and their effect on legal discovery and family law. The joint meeting was originally planned as an in-person event, but weather conditions necessitated a shift to an online webinar. Despite the change, attendance was robust, and participants enjoyed an in-depth discussion of "deepfake" technology and its impact, led by Daniel Shin, an attorney and cybersecurity researcher at the Center for Legal and Court Technology at William & Mary School of Law.

The term "deepfake" applies to A.I.-generated content in which a subject's voice or appearance is digitally altered, or created out of whole cloth, to give the appearance that the subject is someone else entirely. Deepfakes are enabled by machine learning programs in which computer algorithms analyze existing media and then map that information onto a new framework, creating a deceptive video or audio clip in which someone appears to be doing or saying something they never actually did.

The rise of this technology creates serious evidentiary issues, as video and audio documentation may no longer be reliable. In a legal proceeding, a video of one party appearing to do or say something may not actually represent the reality of the situation. While the popular conception of deepfakes usually involves celebrities or other prominent figures, many smartphone apps now facilitate the creation of deepfake images of anyone, and the implications for evidence in domestic disputes are particularly troubling. As deepfake technology spreads, efforts to detect its use in media must keep pace, because the visual cues and indicators that once revealed a deepfake may no longer be effective.

It is important for attorneys in all practice areas, and for the public generally, to be aware of the increasing prevalence of deepfake technology and the increasing ease with which it can be applied.


artificial intelligence, chat gpt, legal technology