How can journalists from the U.S. and other nations better protect themselves and their audiences from being duped by fake videos? Is there a way for journalists working in remote conflict areas to quickly alert national and international support organizations when their colleagues are beaten, kidnapped or killed? Last week, I joined about 80 technologists and journalists sprinting through sessions to tackle these and other problems at a two-day TechCamp co-sponsored by USIP.

Lightning Rounds Spark Tech Solutions for Media Dangers
The BBC’s Samantha Barry talking fast about digital storytelling (here in Kyiv, but on deck in NYC too). Photo Credit: Department of State/Jamie Findlater

This TechCamp, the 24th run by the State Department’s Office of eDiplomacy, focused on the needs of journalists working in conflict zones. After lightning tech trainings on data visualization (USIP’s presentation), mobile security, SMS and storytelling platforms, and much more, teams formed to address specific problems and work out solutions. The idea of TechCamps is to fast-track interaction among people across the stovepipes that typically exist within and between organizations, professions, and cultures and that hinder creativity. In corridors, parks, and bars around the City University of New York (CUNY), the creative juices were flowing.

Journalists are increasingly under attack around the world. The Committee to Protect Journalists reports that almost 1,000 journalists have been killed worldwide since 1992, with 31 killed by July 31 this year alone. Many more are intimidated, beaten, and kidnapped.

Conflict-sensitive reporting relies on the ability to get into conflict areas to understand the driving factors of violence, to report the multiple sides of stories, to give voice to under-represented groups, and to hold all parties to account. But as brave as many journalists are, the risks involved in going into conflict zones and reporting without self-censorship can be too much for anyone. One consequence is reporting from afar: coverage that highlights the violence but does little to delve into the hows and whys.

So when something bad happens in a contested or remote location in Afghanistan, Libya, Mali, or Syria, how does a journalist get help?

The Committee to Protect Journalists, Reporters Sans Frontières, the International Federation of Journalists, and the freedom-of-expression network IFEX are among the international organizations that track violence against journalists and advocate for their safety and for investigations, prosecutions, and legal and practical protections. With their national counterparts, these organizations do some great work, and websites like Speak Justice Now draw attention to problems such as impunity for those who attack journalists.

But the way they find out about cases is through a web of formal and informal networks and media reports. Different organizations have varying priorities, diverse local relationships, and distinct methods of defining cases. For a journalist deep in the field, there’s no simple way to quickly and securely alert all the national and international organizations that could offer assistance.

In an afternoon, we put together a standard reporting form that could be accessed over mobile devices, sent securely using the Mobile Martus open-source platform (currently in development), and distributed to the relevant national and international support organizations.
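To make the idea concrete, here is a minimal sketch of what such a standard reporting form might look like as a structured record. The field names and the list of organizations to notify are illustrative assumptions, not the actual form the team built or the Martus record format, and the secure transport layer (Mobile Martus) is deliberately left out.

```python
# Illustrative sketch only: field names are assumptions, not the team's actual
# form or the Martus record format. The secure transport step is omitted.
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone
from typing import List

@dataclass
class IncidentReport:
    reporter_name: str          # may be a pseudonym for safety
    contact_channel: str        # e.g. a secure messaging handle
    incident_type: str          # "killing", "kidnapping", "assault", "threat", ...
    location: str               # only as precise as is safe to disclose
    occurred_at: str            # ISO 8601 timestamp
    description: str
    witnesses: List[str] = field(default_factory=list)
    notify: List[str] = field(default_factory=list)  # e.g. ["CPJ", "RSF", "IFJ", "IFEX"]

    def to_json(self) -> str:
        """Serialize the report so it can be handed to a secure transport layer."""
        return json.dumps(asdict(self), ensure_ascii=False, indent=2)

if __name__ == "__main__":
    report = IncidentReport(
        reporter_name="pseudonym-017",
        contact_channel="secure-dropbox-42",
        incident_type="kidnapping",
        location="(withheld) - northern district",
        occurred_at=datetime.now(timezone.utc).isoformat(),
        description="Colleague taken by armed men at a checkpoint after an interview.",
        notify=["CPJ", "RSF", "IFJ", "IFEX"],
    )
    print(report.to_json())
```

The value of a shared structure like this is that one submission, sent once over a secure channel, can be routed to every support organization at the same time instead of relying on ad hoc emails and phone calls.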

Another group worked on the problem of video verification. The volume of eyewitness video produced by governments, activists, and others during the Arab Spring has given bloggers and journalists a treasure trove of material to take their audiences deep inside the conflicts, except when these videos turn out to be fool’s gold. A video distributed by Reuters in 2011 and broadcast by, among others, the Australian Broadcasting Corporation, with the common caveat that it “cannot be independently verified,” purported to show “heavily armed Syrian security beating anti-government protestors.”

An audience member e-mailed the network’s own watchdog program, noting that the Lebanese Army uniforms, Lebanese accents and Lebanese car registration plates might call into question the video’s authenticity. An ABC radio journalist then tweeted to her contacts and within five minutes learned that the video was actually from Lebanon and was shot in 2008. This GlobalPost story has many more examples.

Large news organizations like The New York Times have responded by turning the problem into a solution, curating such videos and presenting what is known about them (and, more often, what is not) as content in its own right.

But for bloggers and journalists without the resources of The New York Times, working on tight deadlines or chasing a potential scoop, simple tools can help avoid the embarrassment or incitement that can come with publishing faked social media content.

The TechCamp team created a tool to quickly search for social media traces of the purported authors of video and photo evidence, so that journalists can get a better idea of who those authors are, assess their credibility, and start the process of verification. Along with common techniques such as noting landmarks or distinguishing features that allow cross-referencing with other video or photo records from the same site, or checking the weather conditions visible in a video against weather reports, this tool could help speed up verification and avoid mistakes.
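The TechCamp tool itself is not described in detail here, so as a rough illustration of the first step only (finding social media traces of a purported uploader), a few lines of Python can check whether a handle appears on common platforms. The profile-URL patterns below are assumptions, platforms change them often and may block automated requests, and a successful response is only a hint for further reporting, not verification.

```python
# Illustrative sketch, not the TechCamp team's tool: checks whether a handle
# appears at a few assumed public profile-URL patterns. Treat hits as leads only.
import requests

PROFILE_PATTERNS = {
    "twitter": "https://twitter.com/{handle}",
    "youtube": "https://www.youtube.com/@{handle}",
    "facebook": "https://www.facebook.com/{handle}",
}

def trace_handle(handle: str, timeout: float = 10.0) -> dict:
    """Return which platforms appear to host a public profile for this handle."""
    hits = {}
    for platform, pattern in PROFILE_PATTERNS.items():
        url = pattern.format(handle=handle)
        try:
            resp = requests.get(url, timeout=timeout, allow_redirects=True)
            hits[platform] = {"url": url, "found": resp.status_code == 200}
        except requests.RequestException as exc:
            hits[platform] = {"url": url, "error": str(exc)}
    return hits

if __name__ == "__main__":
    for platform, result in trace_handle("example_uploader").items():
        print(platform, result)
```

Even a crude check like this gives a journalist somewhere to start: a handle with a long public history is easier to assess than an account created the day a video was posted.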

Those are just two examples of tools that both teams plan to keep developing. The Office of eDiplomacy TechCamp caravan moves on to Ramallah, Nairobi, and Cambodia later this year. Take a look here if you’re interested in taking part, or organize your own TechCamp with their TechCamp-In-A-Box. And stay tuned for more TechCamping from USIP …

Michael Dwyer is a senior program officer in USIP’s Center of Innovation for Media, Conflict and Peacebuilding.
