Bruce MacCormack, co-lead of Project Origin, on the Making the Media podcast

Trust is fundamental to the reputation of news organizations. But with misinformation and claims of fake news now abundant, what can the industry do to ensure that viewers and followers know its content is genuine?

In this episode, we discuss the aims of Project Origin, an alliance of media and technology organizations looking to do just that. But how does it hope to achieve this? And what are the risks to the news industry if this issue is not tackled head-on?

Listen to Hear:

  • How Project Origin plans to establish a chain of trust from publishers to consumers
  • Why trust is central to news organizations
  • The risks to the news industry if more is not done to stem misinformation

Our Guest This Episode

Bruce MacCormack is the co-lead of Project Origin, a media provenance initiative whose members are the BBC, the New York Times, CBC/Radio-Canada, and Microsoft. He is also an invited expert on the technical working group of the Coalition for Content Provenance and Authenticity (C2PA).

 

Bruce is focused on the impact of Artificial Intelligence on news operations, with a particular emphasis on mitigating the threat of misinformation. He is a member of the Partnership on AI steering committee on AI and Media Integrity. He was also Head of Business Strategy for Technology at CBC/Radio-Canada and the Executive Architect of the Enterprise Media Asset Management system. 

Once people start to think anything can be faked, then nothing is real. I guess that’s the logical extension of all this. So, what we’re hoping to do with this technology is to provide a higher level of security to say this piece of content has not been faked and this is how we know it’s not been faked.

Bruce MacCormack, Co-Lead, Project Origin

Mentioned in This Episode

All Things AI

In this podcast episode, we discuss the threats and opportunities of the use of Artificial Intelligence in newsrooms with AI expert Felix Simon.
Listen to the podcast

Security Best Practices for Cloud News Production

Cybersecurity is essential for broadcast and news operations—especially with the growth of work-from-home situations. See how Avid is working to make cloud deployments as secure as our on-premises solutions and make remote production easier for you.
Watch now

 

Episode Transcript

Craig Wilson: Hi, and welcome to the Making the Media podcast. My name is Craig Wilson, and as always, it's a pleasure to have you join me. In this episode, we're talking trust and exploring an initiative called Project Origin. Never heard of it? Well, stay tuned. There's a lot to talk about.

Project Origin is an alliance of leading organizations from the publishing and technology worlds who are working to establish a chain of trust from the publisher to the consumer. Now, those leading organizations are the BBC, CBC/Radio-Canada, Microsoft, and the New York Times—so some really big players involved here, and they're not the only ones.

I spoke to Bruce MacCormack from Project Origin about what they're trying to do and how they're trying to achieve it. But I began by asking him to set out a bit of background to the project and how he got involved.

Bruce MacCormack: I'm a strange hybrid of a telecommunications engineer who's now masquerading as a journalist, working with journalists. So, my background is the technical infrastructure behind broadcast and media, and I got to that via a career in telecommunications that morphed into the early days of the Internet, where I was running the Internet digital operations of the largest newspaper chain in Canada—about 20 years ago, I guess, now. From there, I moved on through some local newspapers and to CBC/Radio-Canada, which is the national broadcaster in Canada, and ended up doing corporate strategy, where I was thinking about “How does technology impact the way we do business? What are the things that are going to surprise us? What are the things that are just on the horizon?” And when we started looking at AI, we started to think about, with the advent of synthetic media coming on board, what's that going to do to our ability to tell a story and be trusted in telling the story?

So, if anybody can create media and anybody can create the illusion of a story, then what does that do to our storytelling roles? And what do we have to do as defence? So, what do I have to do to make sure we're closing the barn door before the horse leaves, I guess would be the way to express it.

So, through that and some conversations with colleagues I was having at the BBC and the New York Times, we realized we were all thinking the same thing and we basically joined up and said, “Let's work on this together.”

CW: So, you mentioned the “T-word” there, which I think is trust. How important do you think it is to news organizations that people who access their content can trust that it's correct?

BM: So, trust has many levels. So, trusting that it's correct is one level. Trusting the source is another level, and trusting that the piece of content you have in your hands actually came from that source is yet another level. So, trust is paramount. It is the coin of the realm for news organizations.

So, when I was running a local newspaper, it was our brand—we had been in the business of telling the stories of Nova Scotia since 1825. You know, you establish trust through legitimacy of long tenure. And look at any local news advertisements—they'll tell you they're the most trusted or the most watched news in their market, right? Trust is the coin of the realm.

CW: So, you mentioned there about you're beginning to look at different levels and different types of trust. So, what really is Project Origin about? And how do you think it can help people be confident that what they're looking at is trustworthy?

BM: So, the first thing we did was we said, “OK, what problems are we trying to solve? What problems are we not trying to solve with Project Origin?” And making sure we kept that scope tight was very important, because this is a broad-ranging problem. And what we said we're going to start with is: How do we make sure people can trust a piece of content that's come to them through some channel that we may not control? I mean, if you think about the heritage of news organizations: we broadcast from our tower, it got picked up on your rabbit ears, there was nothing in between, and you watched the news, and we said, “The top story is…” and you had every reason to believe us.

Now you're seeing a clip of news, or maybe a story, possibly taken out of context from the program it was in, distributed by a party that may or may not have anything to do with the people that published the news story, and it's made it to you through some channel which is probably legitimate, but you don't know. And it doesn't take much interference with the story to add a few frames, drop a few frames, inject a “not,” or take a “not” out of a sentence, so that it reads “We will not do this.” You know, you could flip context quite easily.

So, we said, “OK, how are we going to make sure that we can ensure the integrity of the content that person receives? That what you've received is what we transmitted?” Because there is potential for mischief in between. And: How do you know that the source that sent it is the source it purports to be? It's not just somebody with a camera saying, “I'm from the BBC, and this is what I'm doing.” They actually are, legitimately, from the BBC, and you can confirm that it is from the BBC.

So that's what Project Origin started off to do. It's to do those two things: ensure that you understand who the source of a piece of content is and that it hasn't been tampered with in transit—easier said than done. I mean, it's technically doable, because it's technologies from banking and cryptography that just have to be ported over—but it's a fair bit of work to port them over.
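To make that concrete, here is a minimal sketch of the publisher-signs, consumer-verifies pattern Bruce is describing. This is not Project Origin's actual implementation: the file names and key handling are illustrative, and it uses the third-party Python cryptography package.

```python
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Publisher side: hash the media file and sign the digest.
# In practice the public key would be distributed in a certificate
# tied to the news organization's identity, not generated ad hoc.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

with open("news_clip.mp4", "rb") as f:  # hypothetical file name
    media_bytes = f.read()
signature = private_key.sign(hashlib.sha256(media_bytes).digest())

# Consumer side: recompute the hash and verify the signature.
with open("news_clip_received.mp4", "rb") as f:
    received_bytes = f.read()
try:
    public_key.verify(signature, hashlib.sha256(received_bytes).digest())
    print("Content is intact and was signed by the claimed source.")
except InvalidSignature:
    print("Content was altered in transit, or the signer is not who it claims.")
```

The two checks Bruce lists map onto the two properties a signature gives you: the verify step fails if the bytes changed (tamper detection) or if the signature was not produced by the holder of the matching private key (source authentication).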

But we were fortunate enough to have Microsoft also join us. Microsoft Research is a partner in Project Origin, and they said, “OK, we will help with some of that heavy-lifting stuff.” So, the media organizations were helping with structuring what the problem looks like and describing our workflows, and Microsoft was able to come in and say, “OK, this is what we can do to help adjust the file structures and the cryptography.”

CW: So, before we delve into the details of it, one thing I want to be clear about is: what you’re not doing here is content authenticity, to an extent. You're not verifying the originating source of, for example, something that someone shot on a camera somewhere that ended up in a newsroom system.

BM: So, as I said, this is a multi-faceted problem. And what we found is that there are a number of partners in the industry working on different aspects of the problem. So, we started off focusing on: How do we deliver video from a broadcaster to a consumer and make sure that piece of the chain has provenance? That’s what we’re calling it—the ability to track it through.

We discovered that Adobe was doing similar work, from the light hitting the lens on the camera, on a still photo to start with, and into the editing process. So, Adobe's got something called the Content Authenticity Initiative. They put out a white paper about a year ago now. If you read their white paper, you will see that all of the members of Project Origin are co-authors on the Adobe white paper, so they are leading that aspect of the research, but we're all cooperating. And in February of 2021, we actually formalized that cooperation. We formed something called the Coalition for Content Provenance and Authenticity or, as we call it, C2PA. That group has been formed to build common technical standards to make sure end-to-end provenance can be traced from its source all the way through.

Now, there may be legitimate reasons why you don't want to pass that information through. Nothing says it's mandatory, but the facility to carry metadata in a secure, encrypted way, from source through to delivery, with all of the editorial edits and changes accounted for—I mean, you are always going to want to change content. You're going to color adjust it, you're going to crop it, you're going to change it for time. But you want to know who's made those changes, and is the person that made those changes declaring on playout, “It was the CBC, and we were editing this piece, and we've signed off on the work we’ve done.” So, that end-to-end provenance is something that we're working through as a group.
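The C2PA specification defines the real manifest format, but the underlying idea, a tamper-evident record of who did what to a piece of content, can be sketched in a few lines. This is an illustrative hash chain, not the C2PA data model; the actors and actions are hypothetical, and a production system would cryptographically sign each record rather than only hash it.

```python
import hashlib
import json

def add_assertion(chain, actor, action):
    """Append a provenance record that binds to everything before it."""
    record = {
        "actor": actor,
        "action": action,
        "prev": chain[-1]["hash"] if chain else "",
    }
    # Hash the record's contents; any later change to an earlier
    # record invalidates every hash that follows it.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    chain.append(record)

chain = []
add_assertion(chain, "CBC camera", "capture")
add_assertion(chain, "CBC edit suite", "crop + color adjust + cut for time")
add_assertion(chain, "CBC playout", "editorial sign-off for broadcast")
```

Each edit is declared rather than hidden, which is exactly the distinction Bruce draws: changing content is normal, but the chain records who changed it and whether they signed off.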

Since then, the C2PA has been joined by Intel, by Arm, by Twitter. We're building quite a large industry coalition to make sure that we solve this problem once and solve it consistently.

CW: So, you mentioned there about taking technology from things like the banking industry. You know, I think some people may be surprised that you take that kind of approach. So, where do you think there are similarities in what you could take from other industries and bring it into the media technology world?

BM: Well, cryptography is well established as a science. It’s been applied to media to make sure that legitimate content doesn't end up in illegitimate channels. I want to make sure that my soccer/football games don’t get shown to someone who hasn't paid for a license. What we're doing is the complete opposite. We’re making sure that illegitimate content cannot make it into legitimate channels and purport to be legitimate. So, it's the same problem, but in reverse, sort of like looking through a telescope the other way around. It's all the same lens pieces, but when you look from the other end of the telescope, it changes the way the machine works.

So, we're taking that and thinking through each of those steps and saying, “OK, how do we reposition these pieces of technology to ensure they serve our purpose?” Which, as I said, is to do the end-to-end validation of source and tamper resistance.

CW: So, what stage is the project in at the moment for the kind of things you're looking at?

BM: Yeah, so this is, as you can probably imagine, a multi-year project. Right now, we have assembled the coalition, and we’re doing this at a couple of levels. So, the C2PA is working diligently to produce a public draft of a specification later this year—an open standard specification. That will be available this fall and then will be taken out for comment and will work through the standards bodies.

The work we're doing at Project Origin is going to rely on the fact that companies like Avid or Intel are building this functionality into their products. So, once it's a public standard, we're assuming that it becomes a function, and that none of this is much more than a function. If you're a reporter, this should be baked into your edit suite. You know, it’s going to carry that information through just as it carries much of the other production metadata.

But from a workflow perspective, and talking about how it gets used in journalism, Project Origin has assembled a coalition of about 20 or so other news organizations to think about: How are we going to use this technology in a consistent and interoperable way? Because media files get exchanged between broadcasters and publishers frequently, you want to make sure that we find the sweet spot: a consistent implementation that allows us to maintain provenance end-to-end, without going into this Herculean task of saying, “Let's align all our metadata standards across multiple broadcasters,” 'cause, you know, none of us are silly enough to try to do that. So, we're trying to find that sweet spot, and that's a series of conversations that are ongoing now and will go on probably for the next nine months or so, on adoption methodologies, to make sure that there's a smooth and fairly rapid adoption, and also to feed requirements back into the C2PA standards. So, as we talk to other broadcasters, we're testing out ideas: what's important, what's not important, and making sure those get baked into the standards work at C2PA.

CW: I guess one of the other things, from the broadcasters' perspective, is they don't necessarily want to have something that is going to delay them putting news on air, 'cause there's a constant debate now between “I want to be first” and “I want to be right.” Ideally, you'd be first and right. So, I guess that's something else that you have to work through as well.

BM: Yeah, it is. But nothing says that you can't do both. You can release a file, and there are going to be a lot of files released without metadata in place, especially in the early days. But you can go back and revisit it: you can set it as breaking news, but when it goes out in the second edition, you've put the rest of the provenance information in place and cleaned it up. Or, ideally, it's done in such an automated way that it's just part of the workflow and it doesn't hold you up. It's going to be a matter of how the implementations get built by the various vendors, such as Avid.

CW: And what about the standards? What work is going on at the moment to try to gather that together? Is it through bodies like SMPTE or others that you're looking to utilize?

BM: Yeah, there’s another team that's working through who the appropriate standards organization is. We have liaison agreements with a number of the standards bodies, which are consulting with us as we're writing the draft standard we're putting together, and then the endorsements and the ratifications will roll up through the various standards bodies. I can't comment on which ones we'll be using formally at this point in time.

CW: And what do you think success is for this project? What, in essence, is the challenge that you really want to solve?

BM: It's sort of like the Y2K problem, for anybody who was in the business back in the year 2000. Nothing happened, and everyone went, “Well, that was no big deal.” And nothing happened because an awful lot of hard work was done to clean up some bad code to keep things working smoothly. And ideally, what we want is for trust in news to continue going forward.

To give you sort of an extreme scenario of what happens if you can fake legitimacy: one of my friends, Gregory, has a great line that says, “Prepare, but don't panic.” You know, when you're preparing, you have to think about “What are the attack scenarios? And how could you be affected by this?” And misinformation, well timed around known events, can have dramatic impact, even if it's only for a short period of time.

You can pre-empt a CEO doing an earnings announcement and get something up that looks like the CEO but says something that's not true, enough to drop the stock just long enough to do some quick trading and make some money, right? So, there are announcements that can be timed, and can be faked, and have disproportionate impact.

Well, one of the things we started thinking about is, “Well, what if somebody faked our hosts on our sets and took all of the signs of legitimacy?” In the UK, the crown has the sceptre and the sword, and those are the signs of legitimacy. Well, for a news organization, it's the anchor on the set, with the desk, and the way they tell the story, with the theme music. If you could take that and adjust that person's behaviors and speech such that they said something that we did not want them to say, and push it out, all of the halos of legitimacy of the broadcaster are used to amplify this disinformation. And that's not good! You know, would we be doing our job if we weren’t figuring out a way to put a few locks on the door to make sure that that was harder to do?

So, what we're hoping to do is maintain legitimacy by saying that “We've done that.” There's another piece of work being done by some really good people at something called the Partnership on AI, which is a coalition of a lot of AI organizations that funds joint research into best practices with AI. And there's a team there, led by Claire Leibowicz and Emily Saltz, doing great work on labelling. So, as we start to say this material has legitimate provenance, how do we communicate that to an audience in such a way that it lets the person know the end-to-end track is there? It also doesn't make a comment on the reliability of the journalism. It just simply says it's come from that journalistic source.

So again, you started earlier talking about levels of trust. The legitimacy of the journalism is a different thing than the legitimacy of the container carrying the journalism. So, Project Origin has been working on the legitimacy of the container, and then there's other layers, and there's other work being done by other organizations about the legitimacy of the content. So, all of these things have to work together.

That's a confusing story to give to a consumer, and we have to figure out how to break it down into small bites: you know, what does a thumbs up (or whatever equivalent we end up using) mean, and what should that signal to the audience? So, how the user is educated, and how we signal that content through to the end user, is work that yet another part of this broad coalition is working on.

CW: You pre-empted what my next question was going to be, Bruce, because I think access to information generally is more widely available now than it's ever been, but I'm not entirely convinced that people are better informed than they perhaps were in the past. So, from a user perspective, would they necessarily see anything different? You're absolutely right: there's the technical standard, “This piece of content has legitimacy all the way through it,” and I absolutely get what you mean about the different layer of trust in the actual journalism itself. So, I'm just thinking about the user experience. What do you think that might be?

BM: Well, I won't pre-empt the user experience task force we have working on this, but there are a lot of people actually working through: What information is signalled, and how is it conveyed in such a way that it can't be misinterpreted? And how do we make sure that, during the initial phases of rollout, the lack of a signal does not constitute illegitimacy? Because this is going to take a while and roll out in pieces, and you don't want to imply that a piece of content that doesn't have a positive signal is illegitimate.

So, in the initial rollout, I can see this being used by journalists who want to check the forensic integrity of a file before they use it in their content, right? So, you've got an educated, trained journalist that's taking a piece of content in, checking the seals, making sure everything is OK, and then using it in their content.

That's very different than a consumer who's in a sit-back mode and just watching the content flow over them. So, you know, we're going to see this done in a couple of layers. There's going to be a signal level to tell people, “Yes, this content has authenticity information,” right? Then there may be an interrogation point, where you click on it and it says, “Where was this picture taken? When was it taken? Who took the picture?” and gives you that kind of information.

And then there might be a third layer that's forensic, where you're going through a video frame by frame to make sure the “Tylenol safety seals” on each frame are intact. No one expects a consumer to do that, but you may do that if you're checking news footage that you've gotten from some source and you just want to be absolutely sure. So, it'll depend on the use and the context and the way it's displayed. We're talking to a number of organizations that control displays in various forms to say, “OK, how do we build this into the standards?” and just make it part of the day-to-day consumer experience, but that's an ultimate goal. We've got to build the foundation of getting the signal to the device before we can think about how we're going to show it on the device.
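That forensic third layer can also be sketched. This is purely illustrative: the real per-frame integrity data would live in a signed provenance manifest, and the function and parameter names here are hypothetical.

```python
import hashlib

def verify_frame_seals(frames, expected_digests):
    """Check each frame's "safety seal" against the publisher's signed list.

    frames: list of raw frame bytes as decoded from the video.
    expected_digests: per-frame SHA-256 hex digests carried in the
    (hypothetical) provenance manifest.
    """
    if len(frames) != len(expected_digests):
        return "Frames were added or dropped."
    for i, (frame, expected) in enumerate(zip(frames, expected_digests)):
        if hashlib.sha256(frame).hexdigest() != expected:
            return f"Frame {i} fails its seal: content was altered here."
    return "All frame seals intact."
```

The small edits Bruce described earlier, adding or dropping a few frames, show up immediately as a length mismatch or a failed digest.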

CW: So, you talked about the other different working groups. You've mentioned lots of different organizations. It's a wide-ranging project. How difficult is it, Bruce, to bring all these things together?

BM: It's easier now than it was two years ago. If you think about starting an avalanche, you start with a tiny little snowball and you get it rolling. We were lucky in that I was concerned about this just in my day-to-day role at CBC/Radio-Canada. I was at a dinner with the CTO of the BBC, we were just chatting, and the conversation had some resonance. So, the BBC said, “Let’s start to see if we can do something.” The Partnership on AI had similar conversations. The New York Times chatted with their head of R&D and said, “We’re working on a similar project.” And there was an evening when the Chief Architect at the BBC, Jatin Aythora, Marc Lavallee, the Head of R&D at the New York Times, and myself were having dinner, and, you know when the chemistry is right at a dinner and everybody's thinking the same way, you can sort of feel it? We said, “Yeah, OK, this makes sense. This is a real problem. The approach makes some sense to work on.” And then we looked at each other and said, “OK. Well, if the BBC, the CBC, and the New York Times aren’t critical mass to start a project, I don't know who is.”

So, we said, “OK, let's see if we can find friends!” As I said, that helped. But then you get other organizations. The BBC did something called the Trusted News Initiative, which is bringing together a number of journalistic organizations, so we said we can be a technical element of that. So, it's about friends introducing friends to other friends. And now there's enough critical mass to say, “OK, we are all working together here.”

As I said, there are different facets of the problem that different groups have taken the lead on solving, but, by and large, I think most of us know each other, and there are probably 30 people working actively on this right now. I would say somewhere… maybe 50, maybe 25, but it's in that order of magnitude. But I would say that, you know, everybody knows eight out of ten of the others, so there's a critical mass in the conversations; there's a high frequency of conversations; and that's making it easier to find other groups that want to join and support and endorse the work we're doing.

CW: And how would you summarize it? I guess you talked about there being a flow of trust. (Maybe that's a good phrase to use for how this is going to work.) How would you summarize where you are in the journey of getting to the point where that trust can begin to flow, if you like?

BM: If you think about it as a technical stack: we're thinking about how it's going to get built in at the chip level with partners like Intel and Arm; we are building standards on how it can be used with the work that the C2PA is doing; and we're talking to a number of vendors that will build this functionality into their products.

So, once it's a thing, once we put a name on it, we give it a standard number, and we have a bunch of broadcasters and journalists saying, “We want our newsroom to comply with this standard,” then the tool vendors will say, “OK, well, we're going to put that in our road map, and it will be released in version X in nine months.” And it'll be out there, and then it starts to become something that is just part of the way we do business—in much the same way as any new technology gets adopted. It starts slow, and then it becomes widespread, and then it becomes sort of just a given that’s in place.

So, this will take a little bit of time to work its way through, but for the most part, it's technical sausage making. If you were to talk to a lot of journalists, they don't have to know what goes on past the lens, right? The lens is where their job stops, and then the engineers start on the other side of the lens. And to a large degree, it's going to be much the same way.

Once we've said that having a story securely sealed with a tamper-proof seal is the expectation, then the engineers can figure out how to maintain that seal end-to-end as the file moves through the system.

CW: And ultimately, what do you think are the benefits something like this brings to news organizations? Because there's been so much concern about, you know, fake news, and people look at things like deep fakes and other things as well. What do you think are the ultimate benefits it can bring to news organizations?

BM: Well, deep fakes is where this started, right? Deep fakes were the original concern, but right now the concern comes from shallow fakes, where it's really just a small edit rather than a fancy piece of technology. But as the technology for creating synthetic media becomes more widespread and widely distributed, there will be a general erosion in trust, because once people start to think anything can be faked, then nothing is real. I guess that's the logical extension of all of this.

So, what we're hoping to do with this technology is to provide a higher level of security, to say, “This piece of content has not been faked, and this is how we know it's not been faked. Therefore, we’re maintaining trust.” So, it's slowing the erosion, if you will—going back to my earlier point. It’s not that we're trying to make things much, much better—we're putting in a defence so things do not get much worse. I guess that's the way I would phrase it. Which is why I said it’s not glamorous, it's not sexy, it just has to be done. And as we're starting to find out, cybersecurity is a part of our lives. We have to start building it in, and when you don't build it in, bad things happen to pipelines, to airports, to hospitals, and we can't let news be one of those industries that gets taken down by a lack of care in securing our product.

CW: So, Bruce—really interesting stuff. There's one question I'm asking everyone that's on the podcast, so I'll ask it to you: When you look at the scenario, when you look at the situations, what is it, if anything, that keeps you awake at night?

BM: Oh! Pause while I think about this. What is it that keeps me awake?

It originally was deep fakes. That was the original concern. I was reading about deep fakes, and I went, “Gee, you know, somebody should be doing something about this.” And then I sort of looked at my job description and went, “Well, darn it—I think it's me.” But, as I said, “prepare, but don't panic” is taking me away from the edge on deep fakes, by saying, “It's coming, but we're starting to have the defences in place.”

It's consumer education, I think. It's how we educate consumers to understand both the technology they are receiving media on and the business of media that is populating the containers sending that content to them, and to do so in a way that maintains truth—however truth gets defined. And defining truth becomes the thing that keeps me awake at night, at the end of that sentence. So, it's the nature of “We're putting very complex technologies into the hands of a broader population that was never intended to have broad technology knowledge,” I guess. It's a complex world.

CW: It certainly is a complex world! Of that, there is no doubt. I'm sure we will hear a lot more from Project Origin as the year develops and into future years.

I want to thank Bruce for joining us on the podcast. What did you think? Let us know. I'm on social. My username is @CraigAW1969. Or, email us here at the podcast—our address is MakingtheMedia@Avid.com. Also, check out the show notes for links to a webinar about cybersecurity featuring experts from both Avid and Microsoft. You can also check out another podcast episode with AI researcher Felix Simon, where we talk about the impact AI could have on journalism.

Next time on the podcast, we're going to delve into the world of digital news with Adam Weiner, Executive VP of CBS Local Digital Media in the US. Let's hear a bit of what he had to say:

Adam Weiner: The lines are blurred between what we had done before for traditional broadcast or network news shows and the digital offerings that we have, so much so that we’re in the process—and I think that every media company is in the same process—of this transitionary period of understanding that we’re always on. Regardless of the deadlines that we have connected to specific broadcast shows, there are people looking for information from us on all platforms, at all times.

CW: Don’t forget to subscribe to get notified when the episode with Adam is released. That’s all for now. Thanks to our producer, Rachel Haberman, but most of all thanks to you for listening. My name is Craig Wilson. Join me next time for more behind the scenes discussion on the challenges facing the news business in Making the Media.