A review of errors in key health figures has slammed nearly every aspect of Te Whatu Ora's attempt to publish the data monthly.
The report indicated the move was poorly thought out, badly executed and provided information of little value to the public anyway.
The review was triggered by wildly inaccurate emergency department waiting-time figures in Te Whatu Ora data, which prompted the organisation to stop publishing them.
It said Te Whatu Ora had decided to publish data on 12 measures every month instead of every quarter, and without the three-month time lag previously used to ensure accuracy.
But in doing so it ignored advice and asked data teams to do work for which they were ill-prepared and under-resourced.
The report, published on Thursday, said the timeframe for starting monthly reporting was not realistic and failed to recognise that the selected measures did not have stable sources of data.
"Teams inexperienced in the national data collections data were required to extract and prepare the data reports without clear instructions and expertise in the national collections. The teams were not able to effectively check the data," it said.
The report said the 12 indicators did not provide meaningful data for the public because some of it was difficult to understand without explanation and some was only useful if viewed over longer time periods such as every quarter.
It said Te Whatu Ora did not have the capability to support monthly reporting, and the effort required would be better spent elsewhere.
"Local data teams are overwhelmed with data demands, do not have sufficient capacity to meet increased demand and do not have a framework/guidance to enable them to prioritise their work."
The report also criticised Te Whatu Ora for its handling of data about Māori.
"The inherent rights and interests that Māori have in relation to the collection, ownership and application of Māori data are not recognised in a clear reporting framework and in all performance reporting," it said.
One of the errors that sparked the review happened because an extra line was included when data was extracted to a spreadsheet, the report said. That offset the correct emergency department waiting-time data by one line for about half of the regions.
Other errors occurred when data was entered (sometimes retrospectively), copied, or extracted.
The 12 measures were:
- Immunisation rates for children at 24 months
- Rates of preventable hospital admissions for under-fives
- Percentage of under-25s accessing mental health services within three weeks of referral
- Rates of preventable hospital admissions for 45-64-year-olds
- Number of days in hospital following acute admission
- Percentage of cancer patients receiving first treatment within 31 days of decision to treat
- Patients waiting longer than four months for first specialist assessment
- Number of patients on the waiting list not treated within four months
- Number on a waiting list for more than a year
- Emergency department attendances
- Emergency department admissions
- Percentage of ED patients treated, transferred or discharged within six hours.
Te Whatu Ora acting chief executive Dr Nick Chamberlain said it had resumed public reporting on a quarterly basis.
"Following robust evaluation and validation of the data, we are publishing 11 of the 12 metrics today," he said, adding that most of the measures had not improved when compared to the relevant previous reporting period.
Chamberlain said one measure - emergency department admissions - had been temporarily removed because the definition of a hospital admission from ED had been interpreted inconsistently, meaning the data could not be accurately reported.
Te Whatu Ora was now implementing the review's 28 recommendations, he said.
"As an organisation, we are determined to get this right, as we wholeheartedly believe in the importance of performance reporting as a key means to build public trust and confidence in our work.
"Many of the problems identified in the review are not new, and we now have a plan to fix them. Te Whatu Ora is combining processes and systems from 29 organisations, and the review has been an opportunity to better understand data collection and validation. This will assist our focus on performance of the health system, including steps to further improve public reporting."
Chamberlain said Te Whatu Ora had underestimated the time required to move from the former reporting system, particularly given the complexity of the data and systems involved.
"While the inaccurate data we reported was unfortunate, as a new organisation we now have a robust understanding of the issues we need to get on top of to give the public confidence in our future performance reporting."
"The decision was really with the new health system to be transparent, to let our communities know how we're performing" - Interim chief clinical officer Richard Sullivan
Te Whatu Ora aimed to be transparent with communities about the health system and at the time believed the best way to do this was monthly reporting, interim chief clinical officer Richard Sullivan told Morning Report.
The agency took advice and decided it had enough confidence in the data to make the change, though there was some advice to the contrary. "Yes there was advice to suggest for instance ... that maybe that wasn't the right way," he said.
The agency was already working on some of the review's recommendations, he said.
"In part it's about improving our analyst capability and function, part of it's about looking towards automated reporting.
"We now have much more confidence in the data. You'll see from the review we now have a quality assurance process, we go back to the clinical teams to check on it."