I tried to listen to the livestream of Sean Spicer’s daily White House press briefing while following and reporting on the conversation unfolding on Twitter before, during and after the briefing. My Storify is here.
Photo courtesy of the Nieman Foundation
After spending much of the past decade reporting on politics and science in his home state of North Carolina, Tyler Dukes became concerned about a glaring gap in the skill sets being taught to the next generation of reporters in journalism schools. As an investigative reporter on the state politics team for the local television station WRAL in Raleigh, Dukes has focused on using data and public records to uncover and tell stories of the problems plaguing mental health care in state prisons and the implementation of protection orders for victims of domestic violence. Yet in his experience teaching at UNC-Chapel Hill’s journalism school and as a researcher at Duke University’s Reporters’ Lab, he saw that data journalism was being addressed in only the most superficial ways, if at all.
“Very few courses are offered,” Dukes said in an interview, speaking of journalism schools across the country, “and when they are, they are far outstripped by courses like how to make pretty graphs and how to do data visualization. It is not data analysis first. It is not using data as a source first. It is not acquiring data through public records and things like that. So it skips this really important step, which is data literacy.”
Hoping to address this problem, Dukes and his wife moved to Cambridge in the fall, where Dukes would spend an academic year as a Nieman Fellow at Harvard University studying best practices for college journalism programs and newsrooms looking to democratize data-driven reporting for underserved communities. Now more than halfway through his fellowship, Dukes acknowledges he is far from discovering a solution to those challenges. Much of the fall semester was spent taking advantage of Harvard’s offerings to address gaps in his own knowledge in areas like statistics, machine learning and artificial intelligence. These classes, he said, helped both to demystify certain concepts and to offer pedagogical lessons on how to teach subjects like statistics from a practical, applied perspective to policy makers or journalists who may have limited math backgrounds.
In his remaining months in Cambridge, Dukes plans to continue conversations he began in the fall with students and reporters about how best to move ahead with his idea of creating an extracurricular, independent study resource covering the various facets of data journalism. He says he envisions some kind of platform, perhaps recorded Google hangouts or Skype calls with experts, that students who do not feel served by their journalism schools could easily access. While many similar online modules and resources for journalists exist in an ad hoc fashion, he says that dedicated organizations have had difficulty incorporating these models into universities and colleges.
“If we are pretending we are equipping them to be journalists in modern times, they have to have basic data literacy,” Dukes said. “And if journalism schools aren’t going to do it, someone is going to have to force their hand.”
In the meantime, Dukes and his fellow Niemans are also using their time in Cambridge to reflect on the deeper questions about their role in the journalism ecosystem that have emerged in the politically volatile past few months. He admits he is starting to feel the pull to return to his newsroom, which, despite widespread consternation about the future of local news, is still relatively robust. While the overall economic climate for journalism is shaky and shifting quickly, Dukes thinks people are too quick to generalize about an industry that is hardly monolithic and varies widely based on platform and location. Though increased coverage and competition from other outlets would be welcome, Dukes has the luxury of returning to a healthy newsroom in a fairly well-covered media market that is continuing to aggressively report on post-election dynamics in his absence.
Though he concedes a twinge of regret at not being in the thick of things, he says that “the impact of elections are felt for years…the story is not going away.” And at a moment in which the role of the press in covering politics is being hotly debated, there is a certain “perspective that comes from being forced not to do your job for several months. Hopefully it is going to make our work that much better when we get back.”
In the spirit of doing the assignment in a medium outside my wheelhouse, I attempted to create a multimedia storytelling account of a lecture I attended at the MFA. The end result is OK, though since I was confined to the free version of the platform Atavist, I had to make do without most of the bells and whistles and stick to relatively basic functions. I wonder whether the final product is ultimately anything more than a glorified PowerPoint, which leads me to question whether traditional journalistic reportage is sometimes still the best option. The time constraint was also an issue as I sought to master this new platform. A major error: I had hoped to upload audio of a bilingual portion of the event, but I had recorded in m4a, Atavist only accepted mp3 format, and I didn’t have time to do a conversion. Thus, for the time being, the audio is a placeholder of birds chirping…
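For next time, the format mismatch has a quick fix. Here is a minimal sketch of a batch conversion, assuming the ffmpeg command-line tool is installed; the file and folder names are hypothetical placeholders, not ones from my project.

```python
# Sketch: batch-transcode .m4a recordings to the .mp3 format Atavist accepts.
# Assumes ffmpeg is installed and on the PATH; file names are hypothetical.
import subprocess
from pathlib import Path

def mp3_convert_cmd(src: Path) -> list[str]:
    """Build the ffmpeg argv that transcodes one .m4a file to .mp3."""
    dst = src.with_suffix(".mp3")
    return [
        "ffmpeg", "-y", "-i", str(src),
        "-codec:a", "libmp3lame", "-q:a", "2",  # VBR, plenty for speech
        str(dst),
    ]

def convert_folder(folder: str) -> None:
    """Convert every .m4a file in a folder in place."""
    for src in Path(folder).glob("*.m4a"):
        subprocess.run(mp3_convert_cmd(src), check=True)
```

Running `convert_folder("recordings")` would leave an mp3 next to each m4a, ready to upload.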
You can see my story here.
I used RescueTime to track my media consumption but was not completely satisfied with the detail of the report I received at the end of the week. I would have liked more in-depth information about the sites I was visiting (rather than the top three per category), as well as how the program chooses what to classify as “news and opinion” vs. “references and learning,” etc. Another problem is my very dangerous tab habit: I average about 40 tabs open in a browser at a time, and they may stay open for weeks…so I think the time analysis per page was somewhat skewed. Nonetheless, the snapshot view provided by the platform, along with some additional reporting and recording on my part, left me with a few takeaways:
- Like most people, I found that scheduling and communication, primarily email and calendars, was my single greatest category of media consumption. According to RescueTime, it accounted for 36% of my time and a total of eight and a half hours over the course of the week. That actually seems pretty low to me, and I wondered about the nature of the tracking; it did not include emailing and messaging on my phone.
- The rest of my top 5 was pretty expected – numerous course readings, the New York Times and Amazon (streaming TV shows). I also spent 4 hours actually going to the movies (the perks and joys of a long weekend).
- Digging a bit deeper into my browser history, while a significant number of the sites I visited were news and opinion sites, I spent far shorter periods of time on each page, meaning I am either a faster or more superficial reader than I thought.
- My news consumption started daily with a broader reading across international, local, national news and arts sections (though perhaps limited in ideological perspective) and then became increasingly narrow in scope and theme as the day progressed. This was because most of my clicks came either from social media, where like-minded friends were posting on the issues that I care about, or through the various thematic email newsletters I subscribe to (Latin American politics and human rights, media industry, freedom of expression and press freedom etc.). Besides the Times, my reading was very piecemeal, with never more than one or two articles a day from the same news source. I realized through this exercise that while my daily news consumption may include a variety of sources, aside from the day’s top stories, the subject matter was generally more or less always the same.
- The big surprise and rude awakening was the fact that shopping came in as the third-ranked category on my RescueTime list. I am not a big online shopper, so I did a little more digging and realized the program was reading my Amazon time as shopping, rather than streaming content. Nonetheless, I did attempt to buy a couch online this week, and this program laid bare all the wasted hours on that failed enterprise.
- Main conclusions: unsurprisingly, I spend a lot of time writing and responding to emails, reading the news and going to the movies. My indecision means I should not be allowed to online shop for furniture.
The endless stream of information and content provided by social media is the world’s greatest gift to a reporter or researcher and also his or her worst nightmare. As helpful and empowering as crowdsourcing this kind of newsgathering or research can be, if your job is to corroborate that information, it can present a minefield. How to verify the overwhelming flow of information, particularly in a breaking-news, high-volume context such as violence during massive demonstrations or in a conflict zone? We’ve all seen (and perhaps disparaged) people who have shared images from one conflict zone incorrectly labeled as another, whether by honest mistake or as part of a concerted propaganda campaign. But it’s all too easy to be duped by such material, particularly if it is shared widely in a high-pressure, deadline-looming situation.
A number of people and organizations have sought to tackle this problem by creating various kinds of verification tools. A recent one is NewsCheck, a Chrome extension launched by First Draft, a coalition of organizations and places like the Google News Lab working on tools to improve skills and standards in online reporting.
The extension is a web-friendly version of a previously published guide to verification for photos and videos and essentially works by presenting the user with a checklist of considerations to run through: Are you looking at the original version? Do you know who captured the image? Do you know where the image was captured? Do you know when the image was captured? The app scores the user based on the answers, and these results can be published alongside the embedded image on the intended website so that other users can see for themselves to what extent it has been possible to authenticate the information. This isn’t a perfect fix, obviously, and I would love to see this tool expanded to automatically feed into some of the best and most vetted online authentication tools available, as sometimes the number of tools can be as overwhelming as the amount of content, and further curation is always helpful. But it’s a nice step to attempt to systematize basic verification into workflows for anyone sharing this kind of content and to increase transparency on these efforts to readers/viewers.
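To make the checklist-and-score idea concrete, here is a minimal sketch. The four questions are the ones quoted above from the First Draft guide; the scoring scheme and summary format are invented placeholders for illustration, not NewsCheck’s actual logic.

```python
# Hypothetical sketch of a NewsCheck-style verification checklist.
# The questions come from the First Draft guide; the scoring is a
# made-up placeholder, not the extension's real algorithm.

CHECKLIST = [
    "Are you looking at the original version?",
    "Do you know who captured the image?",
    "Do you know where the image was captured?",
    "Do you know when the image was captured?",
]

def verification_score(answers: list[bool]) -> str:
    """Summarize how many checks were confirmed, one answer per question.

    The resulting string is the kind of result that could be published
    alongside the embedded image for readers to judge for themselves.
    """
    confirmed = sum(answers)
    return f"{confirmed}/{len(CHECKLIST)} verification checks confirmed"
```

A reporter who could confirm the origin and the photographer, but not the place or time, would publish “2/4 verification checks confirmed” next to the image.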
I’m a first-year graduate student in Comparative Media Studies and a Research Assistant at MIT’s Open Documentary Lab. Before coming to MIT, I was the Researcher on Central America at Amnesty International, based in Mexico City. There I covered human rights issues in the region and led a year-long project on Central American migrants fleeing (and being deported back to) unrelenting violence. Before that I was the Americas Program Researcher at the Committee to Protect Journalists, based in New York, where I covered press freedom issues in Latin America and the United States. I’ve also worked as a freelance journalist and with a number of international NGOs and foundations throughout Latin America, predominantly in Argentina and Colombia, as well as in my hometown of New York City. I’m a journalism junkie and film buff and am interested in looking at how to apply new narrative and storytelling techniques to the human rights issues I’ve been working on for the past several years, particularly in the area of freedom of expression.