How Behavio's Smart Metadata Could Help Verify Amateur News Photos

Technology from the Knight News Challenge winner Behavio could provide valuable insight to journalists when fact-checking user-generated content.

This Monday, six winners of the latest Knight News Challenge, which focused on the topic of networks, were announced. One of the winners, awarded $355,000, was an open-source Android platform called Behavio, which "turns phones into smart sensors of people's behaviors and surroundings."

The technology, which emerged from founder Nadav Aharony's (@nadavaha) project at the MIT Media Lab, proposes a software development kit for apps that would let individuals explore data about their lives by tracking their activity through the multiple sensors now present on most Android smartphones.

As Behavio's website explains:

Your smartphone is sensing dozens of signals right now. Things like location, movement, app activity, radio networks, devices around it, and much more. These simple signals, put together, can be used to infer much more interesting and useful things about us, our environment, and our communities.

One use of this "smart metadata" can be seen in a mockup from Behavio, in which a photo is shown with a superimposed layer of information gathered at the time of capture. Journalists who deal with amateur eyewitness images will be familiar with metadata, as this information (such as a photo's date and time stamps, and the make and model of the device that captured it) can be extremely helpful in the fact-checking process. At Citizenside, we've even been able to analyze details of extracted metadata to determine whether a file has been copied from a Facebook gallery, and then provide a link back to the gallery where the photo originally appeared. All of this helps establish that the files are original and authentic.
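As a rough illustration of the kind of check described above, here is a minimal sketch in Python. It assumes the EXIF fields have already been extracted into a dictionary; the field names follow the EXIF standard, but the `check_photo_metadata` helper and its rules are hypothetical, not Citizenside's actual tooling.

```python
# Hypothetical sketch: flag eyewitness photos whose extracted EXIF
# metadata is missing or incomplete. Files re-saved by social networks
# (e.g. downloaded from a Facebook gallery) typically have their EXIF
# data stripped, so an empty record is itself a useful signal.

REQUIRED_FIELDS = ("DateTimeOriginal", "Make", "Model")

def check_photo_metadata(exif: dict) -> list:
    """Return a list of human-readable warnings for a fact-checker."""
    warnings = []
    if not exif:
        warnings.append("No EXIF data: file may have been re-saved "
                        "by a social network or editing tool.")
        return warnings
    for field in REQUIRED_FIELDS:
        if not exif.get(field):
            warnings.append(f"Missing EXIF field: {field}")
    return warnings

# Example: a record that looks like a camera original raises no warnings,
# while an empty record (stripped metadata) is flagged.
original = {"DateTimeOriginal": "2012:06:18 14:03:22",
            "Make": "Samsung", "Model": "GT-I9100"}
print(check_photo_metadata(original))  # → []
print(check_photo_metadata({}))        # → one warning about stripped EXIF
```

In practice such checks only raise flags for a human fact-checker to follow up on; missing metadata alone does not prove a photo is inauthentic.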

However, the "smart metadata" gathered with Behavio's technology could give journalists a game-changing amount of new information about the conditions under which an image was captured. Factors like the ambient noise level, the weather conditions, the speed at which the photographer was moving, and a list of nearby devices would shed so much light on the situation that it could become far easier to determine whether the person was actually present at the event, and whether the event took place as they described.
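To make this concrete, a hypothetical "smart metadata" record and a simple plausibility check against an uploader's claims might look like the following sketch. The field names, thresholds, and `plausibility_warnings` helper are invented for illustration; they are not part of Behavio's actual data format or SDK.

```python
# Hypothetical sketch: compare sensor readings captured alongside a
# photo with what the uploader claims about the scene.
from dataclasses import dataclass, field

@dataclass
class SmartMetadata:
    ambient_noise_db: float  # sound level at capture time
    speed_kmh: float         # movement speed from GPS/accelerometer
    nearby_devices: list = field(default_factory=list)  # radio IDs in range

def plausibility_warnings(meta: SmartMetadata, claim: dict) -> list:
    """Return warnings where sensor data contradicts the uploader's claim."""
    warnings = []
    # A large street protest should be loud; a quiet reading is suspect.
    if claim.get("crowded_event") and meta.ambient_noise_db < 60:
        warnings.append("Claimed a crowded event, but ambient noise is low.")
    # Claiming to be on foot while moving at vehicle speed.
    if claim.get("on_foot") and meta.speed_kmh > 15:
        warnings.append("Claimed to be on foot, but speed suggests a vehicle.")
    # A dense crowd with no other devices in radio range is unusual.
    if claim.get("crowded_event") and not meta.nearby_devices:
        warnings.append("Claimed a crowded event, but no nearby devices "
                        "were sensed.")
    return warnings

# Example: readings consistent with a noisy crowd and walking pace.
meta = SmartMetadata(ambient_noise_db=85.0, speed_kmh=3.0,
                     nearby_devices=["dev-a", "dev-b"])
print(plausibility_warnings(meta, {"crowded_event": True, "on_foot": True}))
# → []
```

As with conventional metadata, such signals would support rather than replace human judgment: they narrow down what to verify, not what to conclude.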

With its Knight News Challenge winnings, Behavio plans to build out a software development kit that will allow developers to create Android apps that tap into the phone's many sensors for a variety of uses, something that until now has been very difficult to do. Aharony explained in an interview with Andrew Phelps (@andrewphelps) of Harvard's Nieman Journalism Lab: "When the big organizations do this, it's usually closed-source ... so small developers end up doing everything from scratch."

As this "smart metadata" becomes easier to obtain, it will inform not only journalists fact-checking eyewitness photos, but also professionals in the health and science fields studying crowd dynamics and tracking epidemics. A number of other interesting ideas for the Behavio platform can be found in the Nieman Journalism Lab article mentioned above.