Producing data stories and visualisations the Reuters way

Reuters has an exceptionally strong track record of creating innovative data journalism projects and effective visualisations. We spoke to their data editor to find out how they do it.

Whether it’s about tracking construction in the South China Sea, visualising the perilous state of Rohingya refugee camps, or examining the evolution of dog breeds, Reuters has consistently produced some of the most outstanding examples of visual storytelling and data journalism out there.

We caught up with their data journalism editor Janet Roberts to discuss the news agency’s approach to data projects, how data visualisations have evolved over time, and what skills are important when entering the field of data journalism.

After we spoke to Janet, Reuters’s Life in the camps project won the ‘Data visualisation of the year’ award at GEN’s Data Journalism Awards 2018.

Reuters is a sponsor of the Data Journalism Den.
The interview has been edited for clarity and brevity.

 

When you start working on a data story, what are your first steps?

I wouldn’t say that there’s any one set way. Sometimes we respond to news and try to create something quickly. Sometimes beat reporters come to us with ideas that we help germinate. Sometimes we explore our own ideas.

When we start a project, we first check the landscape to see what’s already been done, as we don’t want to revisit things that have already been revealed. We’ll do a review of academic literature to see what smart people have to say on the topic, what types of approaches they may have taken, and what kinds of datasets they’ve used. And we see how easy or hard it would be to acquire data for the project.

I’m not a big believer in using data to search for a story. I think that data journalism tends to be much richer when you start with a story, and then look for empirical findings by which to tell that story, rather than the other way around.

How long does it take to get from the initial idea to publication?

If it’s a breaking news topic, we try to turn out a story in a week or two, while the news is still fresh. But other times we may take a really deep dive into a topic in an original way with a lot of difficult data analysis. We’ve had a couple of projects that have spanned more than one year.

I think the longest one so far has been the Echo Chamber project, which looked into the influence elite lawyers have on the US Supreme Court. That took a couple of years to pull off. We have another project going on right now that started over a year ago, but I can’t say more about that yet.

But even with these long projects, it’s not like we hold all the stories until the end: we’ll do data analysis and roll out stories as the project progresses.

Chart from the Echo Chamber story.

How many people in your team normally work on a project together?

Right now we have six journalists on our team, including me, and we’re about to add another. On some projects, like the one we did about the European migration crisis, every single member of the team takes a piece of the project and works on it. But it depends on the story, and what resources are needed to pull it off. There was a story last year that two journalists from my team worked on together from start to finish. And other times someone from my team will pair up with a general reporter from the file.

What are the biggest challenges that you typically face during a project? Are they about tech, access to data, or something else?

Every project has its own challenges, but there are some typical ones. Getting access to the data can be challenging, for example when we try to do a national analysis and need to get data from all 50 states in the US. Sometimes there are freedom of information challenges, and we have to fight reluctant government agencies to get data. Sometimes there are technical problems, like when you have an enormous dataset that consumes a lot of resources and you have to figure out an efficient way of managing the data. And data can also be very ugly and require a lot of programming before you even get to the point where you can analyse it.
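To make that last point concrete, here is a minimal sketch, in Python, of the kind of programming that taming a large, ugly dataset can involve; the file and column names are made up for illustration, and this is not Reuters’ actual tooling. The idea is to read the file in chunks so it never has to fit in memory at once, cleaning and aggregating as you go.

```python
# A minimal sketch (hypothetical file and columns): process a large,
# messy CSV in chunks so it never has to fit in memory all at once.
import pandas as pd

CHUNK_SIZE = 100_000  # rows per chunk; tune to the memory you have
totals = {}

for chunk in pd.read_csv("filings.csv", chunksize=CHUNK_SIZE, dtype=str):
    # Typical "ugly data" fixes: trim whitespace, normalise case,
    # and coerce a numeric column that arrives as text with stray symbols.
    chunk["state"] = chunk["state"].str.strip().str.upper()
    chunk["amount"] = pd.to_numeric(
        chunk["amount"].str.replace(r"[$,]", "", regex=True),
        errors="coerce",
    )
    # Aggregate as we go instead of keeping every row around.
    for state, amount in chunk.groupby("state")["amount"].sum().items():
        totals[state] = totals.get(state, 0) + amount

print(sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:10])
```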

How does your team approach visualisations?

We’re kind of left-brain analytical thinkers, so we’re not the best people in the room to figure out how to visualise data. We work with our graphics team to find out the best way to present our findings visually. We just hired a new editor whose primary mission is visual storytelling: she gets involved very early in the discussions on a project and we then try to make sure that we have the pieces that we need when we go out and collect information.

I also have a developer on my team, who comes up with ideas and has done some pretty cool things for some of our projects. But typically we sit down with the head of graphics, Matthew Weber, and explain the project and the data sources we have and what findings we’re expecting. Then we do some brainstorming about how we might present them visually.

A lot of Reuters projects seem to use the ‘scrollytelling’ format, where content is revealed or changed as the user scrolls. Could you tell us a little bit more about that?

The graphics staff is fond of those and it indeed seems to be a common theme in our work of late. Thanks to our new visual editor, we’re now thinking harder about how best to present our findings: not only how to present the data visually, but how to incorporate it into the narrative as well.

I think we’re going to see more integration, not only in the production of charts and graphs but also in how we tell the story in text. Do we tell it all the way through in 4,000 words, or are we going to tell it in bite-size chunks? Or should we present it more visually? We’ll be thinking about questions like these.

Has your workflow evolved with time?

We’ve certainly got better at involving all of the different pieces of the story early on, including graphics, photos, video and even social media, so that we’re not scrambling at the end of the project. And as we’ve learned, the collaboration itself has improved over time. For example, we make sure that everyone on the team knows what our strongest material is going to be, so everyone can prepare for that.

What recent data projects have been highlights for you?

We were part of the collaboration about Daphne Caruana Galizia, the reporter who was assassinated in Malta. It wasn’t a particularly heavy data lift, but we tried some new storytelling techniques on that story that were well received.

I really liked our project on Duterte’s drug war in the Philippines. I think we did a good job of combining graphics, data and storytelling, specifically in the story about the Davao boys, an elite police squad that was brought in to carry out missions of rounding up drug suspects. And I really liked our piece on the killing of 10 Rohingya men in Rakhine state in Myanmar, the story our journalists are now imprisoned over in that country. It was a really deep dive into what happened there, and the human toll came across very well.

The boys from Davao project examined the secretive police squad that is involved in a lot of drug-war killings in the Philippines.

Has it been a challenge to use data to convey a human story? Data journalists deal with numbers, but in the end, you want to get to the people behind them.

I think that is a driving force behind all of our projects. When we are contemplating which stories to take on, we ask ourselves questions like: Who are the victims here? Who are the villains? What is the impact that reporting on this topic might bring?

We’re always considering the human angle: do the numbers illuminate something about people, about suffering, about some injustice? What light can we bring to bear on a topic through our analysis? There are people lurking in the numbers. And quite often the data helps find them.

Would you have any tips for somebody who is getting started with data journalism or wants to develop their skills?

I think every journalist today, whether you call yourself a data journalist or not, should know how to use a spreadsheet to sort data, write formulas, do a pivot table, and import data they find on the internet. And you should learn how to integrate data as one of your sources. Just like you develop people on your beat, you should be developing data: be aware of what data the agencies you interact with have, in what format, and how they make it available. Figure out how these datasets can inform your coverage.
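For readers who want to see how those spreadsheet basics translate into code, here is a rough sketch in Python with pandas; the URL and column names are hypothetical stand-ins, not a real dataset.

```python
# The same spreadsheet basics, sketched in pandas (hypothetical data).
import pandas as pd

# "Import data you find on the internet"
df = pd.read_csv("https://example.org/school-spending.csv")

# "Sort data"
df = df.sort_values("spending_per_pupil", ascending=False)

# "Write formulas": a derived column
df["spending_share"] = df["spending_per_pupil"] / df["spending_per_pupil"].sum()

# "Do a pivot table": average spending by state and year
pivot = df.pivot_table(index="state", columns="year",
                       values="spending_per_pupil", aggfunc="mean")
print(pivot.head())
```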

If you want to dive a little deeper from there, think of what skills you need to interview any data sources that you develop on your beat. Is Excel enough, or do you need to learn some SQL? And if you want to go beyond dabbling in data and want to have that ‘data journalist’ label attached to your name, you definitely should pick yourself a coding language. Python and R seem to be the most popular choices these days, but if you have a foundation already in another language, just run with what you already know. It’s really important to have some coding skills in your pocket; they’re going to enable you to push past those inevitable roadblocks you encounter when you start working with challenging datasets.
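As a small illustration of what ‘interviewing’ a dataset with SQL can look like once Excel runs out of steam, here is a minimal sketch using Python’s standard-library sqlite3 module; the database, table and column names are invented for the example.

```python
# A minimal sketch: asking a question of a dataset with SQL.
# The database, table and columns are hypothetical.
import sqlite3

con = sqlite3.connect("inspections.db")
rows = con.execute("""
    SELECT facility, COUNT(*) AS violations
    FROM inspections
    WHERE result = 'FAIL'
    GROUP BY facility
    ORDER BY violations DESC
    LIMIT 10
""").fetchall()

for facility, violations in rows:
    print(facility, violations)

con.close()
```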

But the most important thing is to develop your aptitude for learning new technical skills. The people who really get ahead as data journalists are the people who are keen to learn and are always looking to develop new skills. Keep an eye on the latest statistical techniques out there and what the computer scientists are up to. What could we take and apply to our work that would allow us to explore frontiers that haven’t been explored yet?

One thing that excites me is marrying journalism with academia. I love talking to computer scientists and professors of statistics, and I love reading their papers and seeing what they’re up to, because I think they’re doing a lot of really cool things that could also be applicable to journalism. But there is often a kind of gap, as they don’t really understand what we do, and we have a hard time explaining it. But when you find the right alliances, really cool things can happen.

Can you describe your own journey into data journalism? How did you enter the field?

I majored in journalism in college and started out as a beat reporter in North Carolina. I was a reporter for about five years and then joined the editing ranks. I did that for about another five years and was feeling kind of burned out, but was also hearing about the cool things IRE (Investigative Reporters and Editors) was doing with computer-assisted reporting. I convinced my bosses at the time to send me to one of their boot camps in Missouri. That was the beginning of it all right there; it was so exciting that I never really looked back from that week forward.

I did that kind of journalism at my first paper for a couple of years, but it was too small a place to have someone devoted to that. So I moved on to a bigger metro paper and worked at the St. Paul Pioneer Press in Minnesota. From there I went to the New York Times and I was one of the first members of the data journalism team they formed there in the mid-2000s. I worked there for about seven years, and when Reuters came calling in 2011, I was up for a new challenge and decided to come over here.

It sounds like it was really your own passion and interest that allowed you to develop your data journalism skills?

It’s all self-taught, everything I know, though I’ve certainly had some great mentors over the years. Tom Torok, who headed the computer-assisted reporting team at the New York Times, took me under his wing and I learned a ton from him about coding and SQL servers. But generally, it’s a matter of finding out what’s going on out there, understanding how it might be applied to the journalism that you want to do, and seeking out the people who can help you learn.

I always say, I know whatever it is I need to know to do the story I want to do. I’m not a skilled expert in any particular area. I have strong skills in some areas and weak skills in others, but I have the skills I need to do the story.

Mapping migrant kidnappings on the Libyan coast. From the Migration Machine investigation.

From your point of view, how has data journalism as a field evolved over the last few years?

Data visualisation is obviously all the rage right now; that’s what everyone is talking about. If you look back maybe ten years, everyone was talking about interactives. Then suddenly those went out of vogue, because they’re not really scalable and it’s not easy to make them work well on phones and tablets.

One of the biggest changes has been growing the graphics desk muscle and marrying that with the analysis. I think we’re still figuring out how to make that marriage work. More and more graphics people are becoming storytellers instead of just chart drawers, and more and more data journalists are learning to think more visually about our work. I definitely see that as the biggest change.

I worry that we’re losing a lot as a whole just because of the nature of the business and how a lot of small and large metro daily newspapers have been economically gutted. There aren’t as many people doing this kind of journalism out there in the smaller geographic areas. But it’s encouraging to see that there are still metro dailies in the US that are committed to investigative journalism and are still putting out some really great projects. Another trend has been the rise of the non-profit journalism model, which has been really interesting to watch.

It’s also really exciting to see how many people come to the conferences every year. I remember going to NICAR when maybe 250 people were at the conference; now there are over a thousand every year, and it’s become almost overwhelming.


The main image is from the Reuters project Casting a wider net.