{"id":100170,"date":"2025-01-13T13:02:40","date_gmt":"2025-01-13T21:02:40","guid":{"rendered":"https:\/\/xira.com\/p\/2025\/01\/13\/sex-lies-and-deepfakes-ces-panel-paints-a-scary-portrait\/"},"modified":"2025-01-13T13:02:40","modified_gmt":"2025-01-13T21:02:40","slug":"sex-lies-and-deepfakes-ces-panel-paints-a-scary-portrait","status":"publish","type":"post","link":"https:\/\/xira.com\/p\/2025\/01\/13\/sex-lies-and-deepfakes-ces-panel-paints-a-scary-portrait\/","title":{"rendered":"Sex, Lies, And Deepfakes: CES Panel Paints A Scary Portrait"},"content":{"rendered":"<p>Lack of trust has enormous implications for lawyers, judges, and the way we resolve disputes.<br \/>\nThe post Sex, Lies, And Deepfakes: CES Panel Paints A Scary Portrait appeared first on Above the Law.<\/p>\n<div id=\"attachment_1079719\" class=\"wp-caption alignright\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" aria-describedby=\"caption-attachment-1079719\" class=\"wp-image-1079719\" src=\"https:\/\/i0.wp.com\/abovethelaw.com\/wp-content\/uploads\/2025\/01\/GettyImages-2192956385-scaled.jpg?resize=352%2C264&#038;ssl=1\" alt=\"Consumer Electronics Show 2025 Kicks Off In Las Vegas\" width=\"352\" height=\"264\" title=\"\">\n<p id=\"caption-attachment-1079719\" class=\"wp-caption-text\">The 2025 CES trade show in Las Vegas. (Photo by Zhang Shuo\/China News Service\/VCG via Getty Images)<\/p>\n<\/div>\n<p>Lies. Scams. Disinformation. Misinformation. Voice cloning. Likeness cloning. Manipulated photographs. Manipulated videos. AI has exploded the possibilities of all these things to the point that it\u2019s almost impossible to trust anything. 
Lack of trust has enormous implications for lawyers, judges, and the way we resolve disputes.<\/p>\n<p>And if you believe a Thursday afternoon <a href=\"https:\/\/www.ces.tech\/\" target=\"_blank\" rel=\"noopener nofollow\">CES<\/a> panel presentation entitled <i>Fighting Deepfakes, Disinformation and Misinformation<\/i>, it\u2019s likely a problem that will only get worse and for which there are precious few solutions.<\/p>\n<p><b>The Bad News<\/b><\/p>\n<p>A year ago, it was relatively easy to tell if a photograph had been substantially manipulated. Today, according to the panelists, it\u2019s next to impossible. In a year, the same will be true of manipulated or AI-generated fictitious video. Right now, it takes the bad guys about six seconds of audio to clone a voice so well it\u2019s hard to tell the difference \u2014 and that window will only shrink.<\/p>\n<p>The bad guys are only going to get better. Add to this the fact that, according to the panel, we are accustomed to assuming that a photograph, video, or even audio recording is what it purports to be. Camera, video, and audio companies have spent years convincing us this assumption is valid.<\/p>\n<p>Finally, as we begin to use AI-generated avatars, digital twins, and even AI agents of and for ourselves, it will get worse: The bad guys won\u2019t have to create a fake; we will do it for them.<\/p>\n<p><b>What\u2019s to Be Done?<\/b><\/p>\n<p>The panel talked about solutions, none of which struck me as that great. First, there is detection. There are sophisticated tools and analyses that attempt, with varying success, to detect deepfakes. The problem, though, is similar to what the cybersecurity world faces: The bad guys can figure out ways to avoid detection faster than we can figure out how to detect the fakes. Yes, tools do exist to detect fakes. But the tools will always lag behind the abilities of deepfake producers to elude detection. 
In addition, forensic tools and experts are expensive, giving the bad guys yet another edge. And there are a lot more bad guys than forensic experts.<\/p>\n<p>The second way to combat the problem is referred to as <a href=\"https:\/\/en.wikipedia.org\/wiki\/Provenance\" target=\"_blank\" rel=\"noopener nofollow\">provenance<\/a>. Provenance is a way to determine where the object in question came from and what data was used to create it. It informs and\/or labels any object that may have been manipulated. Watermarks are perhaps a familiar example. The idea is to create something like the nutrition labels on foods.<\/p>\n<p>But again, the panelists noted that provenance examination and labeling don\u2019t always work, since the bad guys will always be a step ahead of the game and can erase or hide the information. Provenance doesn\u2019t completely solve the problem in any event, particularly when, as in a court of law, accuracy counts. Provenance may tell you a photo could have been manipulated, but it won\u2019t necessarily tell you for certain whether it was, or how. (Keep in mind that with photos, for example, some level of manipulation may be acceptable or even expected. The issue is when the process creates an altered or fictitious image.) So the question remains subject to debate.<\/p>\n<p>Where did the panelists come down? Detection and provenance need to be used together to achieve the maximum chance of success. I didn\u2019t get a warm and fuzzy feeling from this solution, though.<\/p>\n<p><b>So What Are Lawyers to Do?<\/b><\/p>\n<p>Deepfakes pose tough questions for lawyers, judges, and juries. For lawyers and judges, while we may want to believe what we are seeing, we now have to accept that we can\u2019t. We can no longer assume that something is what it purports to be. We have to view evidence with new, more critical eyes. We have to be prepared to ask tougher evidentiary authentication questions. Authentication can\u2019t be assumed. 
It is no longer the tail wagging the proverbial dog. It may be the dog.<\/p>\n<p>One thing the panelists did agree on: You can\u2019t determine if something is fake just by looking at it or listening to it. So we have to ask questions. We may have to use experts.<\/p>\n<p>We have to keep abreast of the tools available to question authenticity; we have to keep abreast of the tools and strategies the bad guys are using.<\/p>\n<p>The panelists offered some help using what they called the human firewall to ferret out deepfakes. We need to ask questions like: Where did the object come from? What is the credibility of the source? What is the motive of the object provider? Does the object depict something that is consistent with the remaining evidence, or is it in stark contrast? Is the photograph consistent with other photographs from other sources?<\/p>\n<p>In short, we have to treat those attempting to authenticate evidence the same way we treat substantive witnesses.<\/p>\n<p>Judges, too, have a significant role. They need to understand the threat. They need to know that authenticity matters and can\u2019t be assumed. They, too, have to keep abreast of what\u2019s happening with AI and deepfakes and what the threats are in real time. They need to know that \u201cletting the jury decide\u201d is not a solution.<\/p>\n<p>We need more and better rules for assessing evidentiary credibility. Just as <i>Daubert<\/i> was a watershed case for ensuring the credibility of expert witnesses and evidence, courts need some definitive guidance in the rules as to how to assess deepfake issues.<\/p>\n<p>The public from which juries come needs to be constantly educated about the threat so they, too, can take evidence that comes to them with a grain of salt if the court does not make the determination.<\/p>\n<p><b>Is This Realistic?<\/b><\/p>\n<p>Despite these potential solutions, it\u2019s hard not to be pessimistic. Precious few resources are allocated to our court systems already. 
It\u2019s hard to see legislatures providing the funds necessary to better educate judges on deepfake issues. The expense of experts and forensic analysis will place less well-heeled litigants at a disadvantage. It will be hard to convince people that they can\u2019t believe what they see when they have been conditioned to do so.<\/p>\n<p>And with today\u2019s polarization of political beliefs and ideologies, it may be hard to convince people that something is fake if they want to believe to the contrary. As lying and misinformation become more prevalent, litigants and even lawyers may be more and more tempted to use deepfakes to justify what they believe and want.<\/p>\n<p>Put all this together, and I\u2019m fearful of what technology may do to our cherished legal institutions. I\u2019m generally an evangelist when it comes to technology. Sometimes though, shiny new objects turn out to be nothing more than a bucket of shit.<\/p>\n<hr \/>\n<p><em><strong>Stephen Embry is a lawyer, speaker, blogger and writer. He publishes\u00a0<a href=\"https:\/\/www.techlawcrossroads.com\/\" target=\"_blank\" rel=\"noopener nofollow\" data-saferedirecturl=\"https:\/\/www.google.com\/url?q=https:\/\/www.techlawcrossroads.com\/&amp;source=gmail&amp;ust=1704470220915000&amp;usg=AOvVaw2xvKCnFgr7_FK8LOfTB9nF\">TechLaw Crossroads<\/a>, a blog devoted to the examination of the tension between technology, the law, and the practice of law.<\/strong><\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Lack of trust has enormous implications for lawyers, judges, and the way we resolve disputes. The post Sex, Lies, And Deepfakes: CES Panel Paints A Scary Portrait appeared first on Above the Law. The 2025 CES trade show in Las Vegas. (Photo by Zhang Shuo\/China News Service\/VCG via Getty Images) Lies. Scams. Disinformation. Misinformation. 
Voice [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":100171,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[16,17],"tags":[],"class_list":["post-100170","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-above_the_law","category-legal_matters"],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/xira.com\/p\/wp-content\/uploads\/2025\/01\/GettyImages-2192956385-scaled-KSv12o.jpeg?fit=2560%2C1920&ssl=1","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/posts\/100170","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/comments?post=100170"}],"version-history":[{"count":0,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/posts\/100170\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/media\/100171"}],"wp:attachment":[{"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/media?parent=100170"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/categories?post=100170"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/xira.com\/p\/wp-json\/wp\/v2\/tags?post=100170"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}