Tag Archives: human error

Organ Donor Records Mix-up

The Sunday Times reported in April 2010 that NHS Blood and Transplant, which runs the UK organ donor register, had written to new donors the previous year confirming their consent details. After respondents complained that the information was incorrect, it emerged that 800,000 individuals’ details had been recorded incorrectly. 45 of those affected have since died, and their incorrectly recorded wishes were carried out!

“The mistake occurred in 1999 when a coding error on driving licences wrongly specifying donors’ wishes was transferred to the organ registry.”

400,000 of the affected records have been changed, and the remaining 400,000 people will be contacted soon and asked to update their consent.

Information Quality – Every Little Helps

[Thanks to Tony O’Brien for sending this one in to us recently. For those of you not familiar with Tesco and their marketing slogans, this is their corporate website.]

ManagementToday.com has a great story (from 25th November) of how six bicycles purchased by Tesco from a supplier came with an apparent £1 million (US$1.62 million) price tag.

Some red faces at Tesco HQ this morning, after news emerged that Britain’s biggest supermarket accidentally paid one of its suppliers almost £1m for six bikes.

The unit cost for each bicycle turned out to be a whopping £164,000 instead of the usual £164.

While the majority of the money was repaid, the trouble for Tesco is that they are engaged in a dispute with the supplier over other matters, so the supplier has held on to 12% of the money. Tesco have therefore called in their lawyers, which means the total cost of failure will inevitably be much higher by the time the whole mess is sorted out.

Of course, simple consistency checks on data entry could have trapped that error and saved Tesco money and embarrassment.
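To illustrate the point, here is a hypothetical sketch of such a consistency check — the function name, the order-of-magnitude tolerance, and the interface are all assumptions for illustration, not anything Tesco actually runs:

```python
# Hypothetical sketch: an order-of-magnitude consistency check on data
# entry that would flag a £164,000-for-£164 keying error before payment.

def check_unit_price(entered_price: float, typical_price: float,
                     max_ratio: float = 10.0) -> bool:
    """Return True if the entered price is within an order of magnitude
    of the typical price for this item; False flags it for review."""
    if typical_price <= 0:
        # No reference price on file: route to manual review.
        return False
    ratio = entered_price / typical_price
    return (1 / max_ratio) <= ratio <= max_ratio

# The Tesco case: the usual price is £164, but £164,000 was entered.
assert check_unit_price(164.0, 164.0) is True       # plausible entry
assert check_unit_price(164000.0, 164.0) is False   # flagged for review
```

Even a crude check like this, run against a reference price from the product master data, turns a £1m payment error into a rejected invoice line.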

It seems that with Information Quality, as with Retail Grocery, every little helps.

An Air Travel Trainwreck Near-Miss

From today’s Irish Independent comes a story which clearly shows the impact that poor quality information can have on a process or an outcome. The tale serves to highlight the fact that information entered as part of a process can feed into other processes and result in a less than desirable outcome.

On 20th March 2009, poor quality information nearly resulted in the worst air traffic disaster in Australian history when an Airbus A340-500 narrowly avoided crashing on take-off into a residential area of Melbourne. The aircraft sustained damage to its tail and also damaged various lights and other systems on the runway at Melbourne Airport.

The provisional report of the Australian air crash investigation found that the root cause of the incident was the entry of an incorrect take-off weight of 262 tonnes, whereas the aircraft actually weighed 362 tonnes. This error affected the calculations of the airspeed required for take-off and the thrust needed to reach that speed.

The end result was that the plane failed to take off and gain height as required; the tail of the plane struck the runway and then ploughed through a lighting array and airport instruments at the end of the runway.

It is interesting, from an Information Quality perspective, to read the areas that the Accident Investigation team are looking at for further investigation (I’ve put the ones of most interest in Bold text, and the full report is available here):

  • human performance and organisational risk controls, including:
    • data entry
    • a review of similar accidents and incidents
    • organisational risk controls
    • systems and processes relating to performance calculations
  • computer-based flight performance planning, including:
    • the effectiveness of the human interface of computer based planning tools.
  • reduced power takeoffs, including:
    • the risks associated with reduced power takeoffs and how they are managed
    • crew ability to reconcile aircraft performance with required takeoff performance, and the associated decision making of the flight crew
    • preventative methods, especially technological advancements.

The report by the Australian authorities also refers to some of the mitigations that the aircraft operator was considering to help prevent a recurrence of this risk:

  • human factors – including review of current pre-departure, runway performance calculation and cross-check procedures, to determine whether additional enhancement is feasible and desirable, with particular regard to error tolerance and human factors issues.
  • training – including review of the initial and recurrent training in relation to mixed fleet flying and human factors.
  • fleet technical and procedures – including introduction of a performance calculation and verification system which will protect against single data source entry error by allowing at least two independent calculations.
  • hardware and software technology – including liaising with technology providers regarding systems for detecting abnormal take-off performance.
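The dual independent calculation mentioned in the third bullet can be sketched in a few lines. This is purely illustrative — the function name, the one-tonne tolerance, and the interface are assumptions, not the operator's actual system:

```python
# Hypothetical sketch of a dual-entry cross-check: the take-off weight
# is entered independently twice, and the performance calculation only
# proceeds if the two values agree within a tolerance.

def cross_checked_weight(entry_a: float, entry_b: float,
                         tolerance_tonnes: float = 1.0) -> float:
    """Return the agreed take-off weight, or raise ValueError if the
    two independent entries disagree -- catching single-source entry
    errors such as 262 t keyed in place of 362 t."""
    if abs(entry_a - entry_b) > tolerance_tonnes:
        raise ValueError(
            f"Weight entries disagree: {entry_a} t vs {entry_b} t"
        )
    return (entry_a + entry_b) / 2

# A mistyped 262 t against a correct independent entry of 362 t is
# caught before any airspeed or thrust figures are computed:
try:
    cross_checked_weight(262.0, 362.0)
except ValueError as err:
    print(err)  # Weight entries disagree: 262.0 t vs 362.0 t
```

The design point is that the check works only if the two entries really are independent — two people reading from the same wrong source would both key in 262 t and the error would sail through.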

For those of us familiar with Information Quality practices, this is an impressive haul of information quality management improvement actions, focussed on ensuring that this type of near-miss never happens again. It is doubly interesting that causes of poor quality information feature among the items subject to further investigation (e.g. “human factors”, risk controls, etc.), and that common approaches to the resolution or prevention of information quality problems form 75% of the action plan put forward by the operator (process enhancement, improved checking of accuracy/validity, assuring consistency with other facts or measures, etc.).