M+E Daily

CPS 2023: How Adobe is Using AI to Help Security Compliance

Adobe is leveraging artificial intelligence (AI) to achieve greater content authenticity, easier and safer content generation, and more effective fraud prevention, while also enabling amazing user creativity and efficiency, according to the company.

For “almost everybody who works at Adobe … I think actually anybody in the room, but certainly at Adobe, our lives changed rather materially when we started to see generative AI emerge about a year ago,” Todd Burke, principal solutions consultant at Adobe and a Content Delivery & Storage Association (CDSA) board member, said at the Dec. 5 Content Production Summit (CPS) at The Culver Theater in Culver City, California.

“Certainly, as we started to develop Firefly and then, as that was released in September – that feels like multiple generations ago – we’ve seen a significant change in the way our business works insofar as new products, new services, new capabilities for the creators, the storytellers and so forth,” he said during the session “Adobe’s Content Authenticity Initiative & Extensive Leverage of AI.”

Adobe Firefly is a family of creative generative AI models covering image generation and text effects, pushing generative AI integrations directly into production workflows, the company said earlier this year.

Or, as Burke described it at CPS, “Firefly is generative AI and it’s a number of models that we’ve trained to create images, where those models are based upon assets that we’ve licensed from creatives within the Adobe Stock portfolio,” which offers more than 300 million assets. “We train on those assets,” he pointed out, adding that Firefly is artist- and creator-friendly because they can opt in or out.

There is “a lot more to come” on how Firefly works with video and “we’re going to have more to say … on that in 2024,” he told attendees.

Santiago Lyon, head of advocacy and education at the Content Authenticity Initiative (CAI), then discussed “Establishing Provenance: The Content Authenticity Initiative” remotely via video.

“For the last five or six years, I’ve been working at Adobe, more recently for the last three years on the Content Authenticity Initiative,” noted Lyon, who previously served as a photojournalist for the Associated Press (AP) and others.

“This work is really, in many ways, a logical extension of my life’s work as it relates to truth and authenticity and transparency.”

The initiative was “kicked off by Adobe in late 2019, really in response to this perennial problem of mis- and disinformation that we face in our lives every day and because Adobe makes a lot of very powerful software that sometimes falls into the hands of bad actors, and foments the spread of misinformation,” he said.

Lyon added: “We thought it was really important to take on a leadership role in this space to address some of these problems. And from the get-go, we were very clear that we wanted this to be an open-source initiative, which is to say all of the underlying code and development work is available for anybody and everybody to use, including Adobe’s business competitors.”

More recently, “we’ve seen the rise of generative AI imagery” that he said is “indicative of the sort of technology that’s at people’s fingertips with text-to-image generators like Firefly and many others that allow for the increasingly hyper-realistic creation of images that are, in many cases, impossible to distinguish.”

The initiative had just three members when it started in 2019, and “now we’re at over 2,000,” he noted, saying that membership is made up of media companies, “major technology players” and now “major advertising companies.”

Produced by MESA, the Content Production Summit was presented by Fortinet, and sponsored by Convergent Risks, Friend MTS, Amazon Studios Technology, Indee, NAGRA, EIDR, and Eluv.io, in association with the CDSA and the Hollywood IT Society (HITS).