It collected the data from 134 hospital trusts between January and December last year and then statistically analysed the results.

There were around nine million discharges, from which 294,000 deaths were recorded either while in hospital or within 30 days of discharge.

The findings were supposed to have been published last month, but officials had to push the date back. The reason behind the delay is unclear.

In the NHS report, 103 trusts reported no excess deaths and 18 even had a lower number of fatalities than expected.

The data showed slightly less than a tenth, or 9.7 per cent, of NHS trusts reported higher than expected deaths.

WHAT WAS THE MID STAFFS SCANDAL?

A disputed estimate suggested that hundreds of people may have needlessly died at Stafford Hospital, run by Mid-Staffordshire NHS Trust, due to poor care between 2005 and 2009.

In what is one of the worst care scandals in living memory, anecdotes provided by Julie Bailey, who was responsible for exposing neglect at the hospital, suggested patients were left lying in their own excrement and had been so thirsty that they were reduced to drinking water from vases.

The Francis report, the public inquiry into the hospital published in 2013, found that box-ticking bureaucrats prioritised targets over basic levels of care.

But not a single individual will be prosecuted in connection with the scandal, police admitted last year – despite a three-year investigation.

Stafford Hospital has been renamed County Hospital, and is now run by a different trust.

They included Blackpool Teaching Hospitals NHS FT, South Tyneside NHS FT and Wye Valley NHS Trust, which have all had excess deaths since 2013.

Chiefs warn the data is only a ‘smoke alarm’ that signals the need for further investigation to establish the cause of the excess deaths.

But Professor Jarman’s recalculation of the data shows more than a fifth of NHS trusts, or 21.6 per cent, have deaths higher than expected.

He used an internationally recognised system recommended by the Association of Public Health Observatories, called Byar’s confidence intervals.

A similar process helped uncover poor care at the Mid Staffordshire trust, which was at the centre of one of the biggest scandals to ever hit the NHS.

However, the NHS uses another method, called overdispersion, which gives trusts slightly more leeway in terms of recording deaths.

Under that system, a trust is regarded as having a higher than expected number of deaths only if its SHMI is about 12 per cent or more above the national average.

In contrast, the Byar’s method is used internationally and classifies a much higher proportion of trusts as being significantly above the national average.

Professor Jarman told MailOnline: ‘I don’t consider it is appropriate to give some additional “leeway” to hospitals, if it leads to the Care Quality Commission considering the 16 trusts I have listed as having “no problem with mortality”.

Professor Jarman said the NHS had used the internationally recognised Byar’s confidence intervals in the past.

Until January 2012 it was used in tandem with overdispersion. But the Byar’s method was then dropped, with bosses at NHS Digital saying it was confusing to have two systems running at once.

Chris Roebuck, director of publications and head of profession for statistics for NHS Digital, said: ‘The SHMI is designed to act as a high level measure and is a helpful indicator that may flag potential problems that require further investigation.

‘It needs to be used with care and should always be used in conjunction with other information. The UK Statistics Authority has endorsed its use in this way.

‘It was commissioned by the Department of Health following a review of the design and use of similar mortality indicators.

‘That review benefitted from input from specialists from NHS organisations, regulators, universities, royal colleges and field experts including Professor Jarman.’

He added: ‘All the experts we worked with brought different perspectives on how to construct the indicator, and we took all of their advice into account.

‘The expertise of this group was invaluable to the development of SHMI methodology to best serve what is a complex area.

‘It has been continually reviewed and developed drawing on expert opinion and we will continue to do this.

‘NHS Digital in association with other relevant organisations have recently been reviewing the use and existence of a range of mortality indicators including SHMI.’

South Tyneside NHS Foundation Trust said: ‘These figures include patients from Sunderland who die in St Benedict’s Hospice, which we run, but the patients have not otherwise been under our care during their treatment.

‘Removing deaths in St Benedict’s results in our mortality figures returning to within the expected range. This has also been confirmed by an independent review conducted by the North East Quality Observatory.’

HOW DID PROFESSOR SIR BRIAN JARMAN MAKE THE CALCULATION?

NHS Digital releases its Summary Hospital-level Mortality Indicator (SHMI) data every summer for the previous year. Last week the body flagged 13 trusts as having more deaths than expected.

It uses a method called overdispersion to sift through the data and highlight trusts with excess deaths, which gives trusts slightly more leeway in terms of recording deaths.

Under that system, a trust is regarded as having a higher than expected number of deaths only if its SHMI is about 12 per cent or more above the national average.

Professor Sir Brian Jarman used an internationally recognised system, recommended by the Association of Public Health Observatories, called Byar’s confidence intervals, to sift through NHS Digital data.

In contrast to overdispersion, the Byar’s method is much stricter, classing anything with a SHMI of more than six or seven per cent above the national average as significantly high.
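The article does not reproduce the formula itself, but the standard formulation of Byar’s approximation recommended by the Association of Public Health Observatories can be sketched as below. The death counts and the 95 per cent confidence level are illustrative assumptions for the sketch, not NHS Digital’s actual figures or code:

```python
import math

def byars_limits(observed, expected, z=1.96):
    """Approximate confidence limits for a mortality ratio
    (observed deaths / expected deaths) using Byar's approximation,
    as recommended by the Association of Public Health Observatories.
    z=1.96 gives roughly 95 per cent limits."""
    o = observed
    lower = (o / expected) * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3
    upper = ((o + 1) / expected) * (
        1 - 1 / (9 * (o + 1)) + z / (3 * math.sqrt(o + 1))
    ) ** 3
    return lower, upper

def higher_than_expected(observed, expected):
    # A trust is classed as having higher than expected deaths only if
    # even the LOWER confidence limit of its ratio sits above 1.0,
    # i.e. above the national average.
    lower, _ = byars_limits(observed, expected)
    return lower > 1.0
```

For example, a hypothetical trust with 1,100 deaths against 1,000 expected (a ratio 10 per cent above the national average) would be flagged under this sketch, because the lower limit of its confidence interval still exceeds 1.0; overdispersion, by contrast, widens the acceptable band so that such a trust may not be flagged.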

A similar process helped uncover poor care at the Mid Staffordshire trust, which was at the centre of one of the biggest scandals to ever hit the NHS.