Bridging the gap for better outcomes

Steve Collier
Managing Director, Penelope QIP

 

The revisions to the National Quality Standards (NQS) were introduced on February 1. Change is often accompanied by uncertainty. To help, I drew on the Penelope QIP team’s expertise in quality regulation and systems for efficient continuous improvement to provide a free ‘Understanding the revised NQS’ webinar series. Indicative of the uncertainty surrounding the revisions, more than 3,000 people from within Australian Early Education and Child Care (EECC) have registered to date.

 

A key factor in the unprecedented demand for this webinar is that many people don’t feel supported to adapt to the changes. Even those aware of resources such as ACECQA’s 633-page ‘Guide to the National Quality Framework (NQF)’ struggle first to identify the changes relevant to them. There’s a further feeling of despair surrounding how services need to adapt their practice.

 

‘The goal posts had only just entered their sight before being shifted again.’

Assessment and rating is still relatively new in Australian EECC. Through work in similar care settings, we know it can take some getting used to. We’ve seen how these feelings of frustration often leave people working in services feeling disconnected from those assessing them. A general feeling of ‘us against them’.

 

The process of assessment and rating first provides assessors with evidence to assure consumers that your service performs to base standards: your service’s policies and procedures are robust, and your people adhere to them when interacting with children to ensure a safe and positive experience. However, this process sits within the National Quality Framework (NQF). The true intention of the framework, clearly documented on page 316 of ACECQA’s Guide, is to promote and facilitate the continuous quality improvement of each service, resulting in better outcomes for children.

 

Always doing our best to benefit children. Few are in EECC for any other reason. Seeing regulators as the enemy only results in added frustration, confusion and expense. Perhaps a useful first step to bridging this gap is recognising the common ground: regardless of our individual roles within the EECC space, we’re all working towards nurturing the minds of our next generation.

 

The recent revisions to the NQF also show that the people working in the space are being heard by regulators. Many of the changes were made in response to feedback received through a lengthy consultation period and a comprehensive analysis of assessment and rating data. This represents a significant commitment of resources to improving the overall system we operate in.

 

For example, thousands of completed quality assessments were analysed during the revision process. It was found that in many cases two elements received the same rating. The old elements assessing sustainability, 3.3.1 and 3.3.2, for instance, received the same rating in over 92% of the quality assessments reviewed. In the revised NQS, these two elements have been merged into 3.2.3. The intention is to ensure each element addresses a unique concept of quality and to reduce duplication.

 

When first analysing the original NQS on its inception, it was pleasing to see that lessons had been learned from other spaces, specifically in moving away from standards written so prescriptively that they stifled creativity and innovation. However, in hindsight, the right balance wasn’t quite achieved. Attempting to introduce a flexible standard within an entirely new mandatory process was a tough ask. It is evident that many services, while understanding the intention of the NQS, have also craved more direction from the standards. Ambiguity in the standards has perhaps helped create doubt in the minds of already time-poor people — people focused on providing more for children through their daily practice rather than on the underpinning framework and its administrative requirements. This has created inefficiencies that have further fanned the flames of discontent regarding assessment and rating at all levels of many provider teams’ hierarchies.

 

Conversations about the assessment and rating process often report poor experiences, or negative preconceptions built on hearing of the poor experiences of others. Many express a lack of faith in an inconsistent system, particularly where assessors are often seen to rely on subjectivity over evidence within vague criteria and a complicated five-level rating system.

 

For example, I recently met with the management of a small regional organisation that operated two almost identical services just three doors apart. The same person wrote and submitted the QIP for both. The services used the same systems, policies and procedures, and a shared pool of staff. Both services’ most recent assessment and rating visits were only days apart but conducted by different assessors. The first service achieved an Exceeding rating while the other received Working Towards — a rating later overturned following an expensive and exhausting appeal process.

 

In addition to reducing duplication, ACECQA has acted to provide more clarification through the revisions. The standards have been reorganised, reducing from 18 standards and 58 elements to 15 and 40 respectively. Beneath the streamlined standards sit the assessment criteria documented in ACECQA’s guide. These provide further clarity for services undertaking self-assessment to inform their QIP, along with the evidence sought by formal assessors.

 

An example is former element 2.1.2, which has been revised into element 2.1.1 to make explicit that services must provide for each child’s wellbeing as well as comfort. While the majority of services would likely have already assumed this in their practice, evidently enough haven’t to warrant the additional direction.

 

There is also now more guidance around what ‘above and beyond’ means when it comes to eligibility for an Exceeding rating. Even better, the application fee for the previously controversial Excellent rating has been removed. An Excellent rating should no longer be seen as ‘paid for’.

Pleasing everyone can be hard. For anyone working in a service, satisfying families might prove a relatable comparison. For example, some providers have already argued that the changes don’t facilitate quality improvement, seeing the quantity of information added to clarify requirements as amounting to an increase in regulation and workload.

 

The NQF is six years old and the revisions are less than one month in. The central focus of an evolving quality system for EECC is children. We would all band together, adjust and demonstrate patience in nurturing a developing child. Perhaps collaboration and compromise can help refine our system and deliver better outcomes for all children.

 

Given who is at stake, it’s at least worth giving reform a chance.