Rob Grant Individual Portfolio
Organisation: Trinity Mirror Data Unit (United Kingdom)

Publication Date: 04/09/2015



My name is Rob Grant and I’m a data journalist at the Trinity Mirror Data Unit. We work with the national Mirror titles in London and our regional titles such as the Liverpool Echo, Manchester Evening News (M.E.N.) and Birmingham Mail. I divide my work into news and project work. On the news side, every day I find data, analyse it for news stories, write copy specific to our regional titles and wire it to my colleagues via email. They can then either publish the story as it is or get one of their reporters to spend more time on it. In this way one good dataset can lead to stories in several of our titles. Here are five sets of news stories that have come from my data analysis and reporting:

1. The first piece was an analysis of gun crime figures by police force in England and Wales from the Office for National Statistics. Comparing the rates for 2013/14 with 2012/13, I found that the West Midlands had overtaken London as the gun crime capital of England and Wales. The Birmingham Mail splashed on this news. It was a real agenda-setting story that was followed up by The Times, The Guardian and the BBC. It also generated strong lines for the Liverpool Echo and the M.E.N.

2. I sent Freedom of Information requests to public bodies around the country asking how much money they had paid the so-called ‘Big Four’ accountancy firms - Deloitte, EY, KPMG and PwC - as well as outsourcing giant Capita. The dozens of responses needed collating into one spreadsheet. The results were revealing: the five companies were being paid millions for auditing, consultancy and other outsourced work. It provided two hard-hitting business exclusives for the M.E.N.

3. One dataset that our readers always love to read about is the house price data from the Land Registry. The full dataset lists every house and flat sale going back to 1995 - millions of rows of data, much too large for everyday spreadsheet programs to handle. I used Tableau to provide some fascinating insight into the growth of the housing market in Britain over the last fifteen or twenty years, averaging the prices paid for houses in each calendar year for the postcodes we were interested in. It showed startling growth in some areas over this time. The story published by the Liverpool Echo showed how you would likely have trebled your money by now if you had bought a house in the L7 postcode around Kensington and Edge Hill in Liverpool back in 2000.

4. One of the coalition government’s most important jobs and benefits policies has been the Work Programme, designed to move people off Jobseeker’s Allowance or Employment and Support Allowance. However, it has been criticised by opposition politicians, trade unions and campaigners for its low success rate. Ahead of the election, I found that one of the private companies contracted to run the scheme in Wales, Rehab JobFit, had failed to get even one in ten people into a long-term job in the country - one of the worst records in Britain. WalesOnline picked up the story.

5. Things are better for disabled people on public transport than in years gone by - but there is still some way to go before they can travel freely around the country by train. I scraped the National Rail website, which has a page for each train station in the country, using OutWit Hub, and exported the data into a spreadsheet program. The Office of Rail Regulation (ORR) publishes data on station usage, so I cross-referenced my data with the ORR statistics to get the local authorities in which the stations are located. I was able to show that in places such as Glasgow there are many smaller stations that are still effectively off-limits to people who cannot easily climb stairs or step on or off trains.

I would also like to highlight one data-led project I worked on this year.
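The averaging step behind the house price story (item 3 above) was done in Tableau, but the same idea can be sketched in a few lines of Python. The column names and figures here are illustrative only - the real Land Registry price-paid file has no header row and many more fields:

```python
import csv
import io

# Illustrative extract of price-paid data; not the real Land Registry schema.
price_paid = io.StringIO(
    "price,date,postcode_district\n"
    "45000,2000-03-14,L7\n"
    "52000,2000-09-02,L7\n"
    "140000,2015-05-21,L7\n"
    "151000,2015-11-30,L7\n"
)

# Group sale prices by (postcode district, calendar year).
groups = {}
for row in csv.DictReader(price_paid):
    key = (row["postcode_district"], row["date"][:4])
    groups.setdefault(key, []).append(int(row["price"]))

# Average price paid per district per year.
averages = {key: sum(prices) / len(prices) for key, prices in groups.items()}
print(averages)  # {('L7', '2000'): 48500.0, ('L7', '2015'): 145500.0}
```

With the full dataset, comparing the earliest and latest yearly averages for a district is what shows the kind of growth - trebling in L7 since 2000 - described above.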
Our project work at the data unit focuses on working with very large datasets to shed new light on topics our readers care about. Last year was the hundredth anniversary of the beginning of the First World War - one of the most significant events in British history. We wondered whether we could find out how many servicemen and women died from each of the cities and counties our titles cover. I contacted the Commonwealth War Graves Commission (CWGC) and asked whether we could have a copy of their records to analyse. Very kindly, they agreed.

With a million rows of data loaded into Tableau, we faced the challenge of assigning each soldier to a particular area. This was no easy task because the biographical data was messy. Sometimes a spouse was listed with an address; other times the parents were. In some cases the birthplace was recorded, and often we had some combination of the three. If a soldier’s wife lived in Birmingham and his parents in Coventry, where was he from? To make things even trickier, the data was first compiled in the years after the war, when the country was still a very different place from the Britain of today. The huge urban areas such as Greater Manchester and Tyneside did not exist in the same way they do now. I had to do my best to ‘map’ our patches on to data that was nearly 100 years old. This whole process took several weeks. By the end, I was able to say as definitively as possible how many soldiers from Greater Manchester, Merseyside, the West Midlands and our other cities died in World War I - the first time a news organisation had ever attempted this.

Our output was planned for print and digital. With the help of my colleagues Dmitri Thompson and Carlos Nóvoa, we designed pages showing how many people died in each large town in our titles’ patches. Carlos automated the process, using a spreadsheet and Adobe Photoshop to generate the pages.
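One way to make the area-assignment step described above concrete is a simple priority rule: prefer the birthplace where it exists, then fall back to the parents’ address, then the spouse’s. This is a minimal sketch under that assumption - the field names are hypothetical and the ordering is illustrative, not our exact editorial rule:

```python
def assign_area(record):
    """Pick a 'home' area for a casualty record using a simple priority:
    birthplace first, then parents' address, then spouse's address.
    Field names are illustrative; the real CWGC data is messy free text."""
    for field in ("birthplace", "parents_address", "spouse_address"):
        area = record.get(field)
        if area:
            return area
    return "unknown"


# Hypothetical records showing the three cases described above.
records = [
    {"birthplace": "Coventry", "spouse_address": "Birmingham"},
    {"parents_address": "Salford"},
    {"spouse_address": "Liverpool"},
    {},
]

areas = [assign_area(r) for r in records]
print(areas)  # ['Coventry', 'Salford', 'Liverpool', 'unknown']
```

In practice the hard part was not the rule itself but normalising century-old place names on to today’s metropolitan areas, which is why the process took weeks rather than minutes.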
They were put together in print supplements that, town by town, showed the scale of the sacrifice made by places in and around Birmingham, Manchester, Newcastle, Liverpool and other cities. They also named the first and last people to die, and the oldest and youngest killed in action, in each area. The automation meant we were able to produce dozens of pages with a bare minimum of manual editing. Carlos and Dmitri also designed a widget to sit on our websites, which let readers search the entire database for a name or a street. It was warmly received - we heard from people who had found relatives lost in the war - and proved very popular, with more than 400,000 searches made in the first couple of weeks. Timed to coincide with the 100th anniversary of the outbreak of the war, the project provided new journalistic insight into the First World War, as well as being a fitting tribute to each city’s sacrifice. We were delighted to produce a piece of work that provided spreads and features for our different newspapers and even a full supplement for the M.E.N., adding thousands of extra sales.

Technologies used for this project:

Google Sheets, OpenOffice Calc, Tableau, OutWit Hub, Adobe Photoshop, Adobe Illustrator