At yesterday’s Business Performance Roundtable, with the help of David Anker from Lightfoot Solutions and Alan Meekings from Landmark Consulting, we spent a lot of time looking at examples where the use of data created actionable insight.
But one of the key issues was the data itself. Is it good data, and do we trust it? As Martin Scott of Unipart kindly reminded us, there is a quality issue and an integrity issue.
Let me give a couple of examples to illustrate the point. A hospital will have a good record of the time from first cut to discharge from the operating theatre, as it is a statutory requirement. But how often do we find that the recorded time of admission onto the ward was before the time of first cut, because someone doesn’t think the time really matters and is in a rush to complete the record? A police force may record the crimes committed, but how much data do we really have? The fact that a house was entered and goods stolen is very basic. However, if we know the time of day, that the entrance was via the back door, that the search was tidy and that only jewellery was taken, we have significantly more information. Are the majority of your product failures coded under “other reasons”? That is all about data quality.
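The hospital example is the kind of inconsistency a simple automated check can surface. Here is a minimal sketch of such a check; the record layout and field names are hypothetical, chosen only to illustrate the rule that a patient cannot arrive on the ward before surgery begins:

```python
from datetime import datetime

# Hypothetical theatre records: each holds a first-cut time and a ward admission time.
records = [
    {"patient": "A", "first_cut": "2023-05-01 09:15", "ward_admission": "2023-05-01 11:40"},
    {"patient": "B", "first_cut": "2023-05-01 10:05", "ward_admission": "2023-05-01 09:30"},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Flag records where the patient was apparently on the ward before the first cut --
# a sign the admission time was entered carelessly, not that time ran backwards.
suspect = [r["patient"] for r in records
           if parse(r["ward_admission"]) < parse(r["first_cut"])]

print(suspect)  # -> ['B']
```

A routine report of such flagged records tells you where the data-entry discipline, not the patient pathway, needs attention.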
Data integrity, by contrast, is about deliberate massaging of the data. Do we re-categorise violent crime to make the numbers look better? Do we delay ambulances outside A&E because we need to reduce admissions and push back the start of the four-hour treatment window? Do we invoice a customer before the job is completed or the goods are dispatched, to flatter the month-end figures? Do the data jockeys strip out the extreme cases of variation when compiling data for reports? This is deliberate data manipulation, and it is a matter of data integrity.