Open data is a buzzword, or more accurately a buzz-phrase, these days. In certain segments of the humanitarian community, there is real energy behind making data more openly and publicly available. Whether looking at UN OCHA’s impressive new HDX data-sharing site, the IATI registry, or a bevy of individual organizations’ online data portals, one could conclude that open data has fully arrived.
As media and technology rapidly change the quantity and quality of the information circulating in our daily lives, we all know intuitively that our practices and standards for dealing with these shifts have not caught up.
On 17 September, I was invited by the European Journalism Centre (EJC) to attend the PICNIC Festival 2012 at the EYE Film Institute in Amsterdam, where they hosted an entire session entitled “Maps, the Power of the Crowd, and Big Data Verification.” This session focused on the crucial role of crowdsourced information in humanitarian emergencies.
Beyond the hype of “going digital,” many media outlets that have led the way in multimedia content production increasingly face the question of what to do with all this new material, how to keep it safe, and how to share it over the long term. A new study from the Internews Center for Innovation & Learning, “Digital Media Preservation: Why Media Organizations Can’t Archive (And What They Can Do Instead),” explains options for the safe storage of content and the challenges faced by organizations attempting to store large amounts of data. The study also makes recommendations for effective cataloguing and management of digital archives, so that content within an organization can be described, discovered, and re-used.