Learning Lessons from Inspections

Exploring the CIF: The importance of Data Analytics

Guest writer: Judy Bloxham

Recent headline cases of poor and inadequate Ofsted inspection results should act as an alert to all of the change of focus in the Common Inspection Framework (CIF) introduced last year. Of the first five colleges inspected under the new CIF, three fell a grade, and in the last two months two independent providers have seen their grades fall from Good to Inadequate, one of which had its SFA funding withdrawn, leaving thousands of learners stranded.

When it was announced that all educational institutions would be judged under the same criteria, whether a primary school or an independent work-based learning provider, many thought the change would be unworkable. However, the new CIF was introduced last summer, along with extended notes giving guidance to account for differences between educational establishments. The changes mean that all educational provision is judged in the same way, with reports written in a common language, to allow better comparison of performance.

The new CIF has changed the focus of inspections, and many organisations have not yet recognised this. Some of the areas of inadequacy identified from various recent Ofsted reports include:

“Managers do not identify and tackle the differences in achievement between various groups of learners, which have increased.”

“Performance management of managers and staff is weak and has not raised the quality of provision to good across much of the college’s work.”

“Leaders and managers do not use data well enough to monitor the performance of all groups of apprentices and aspects of quality of the provision, including that of subcontractors.”

“Managers do not use data well enough to monitor learners’ progress effectively, or take decisive action to bring about swift improvement.”

“There is insufficient rigour in the monitoring of underperforming programme areas.”

“Managers do not use data and information well enough to identify weaknesses and to intervene to bring about rapid improvement.”

“The analysis of differences between the achievement of various groups of learners is perfunctory.”

At the introduction of the new CIF, Sir Michael Wilshaw said: “HMI will be looking to see that the leadership has a clear understanding of the key areas for development – and a credible and effective plan for addressing these.”

It is difficult to make any judgements of progress and needs if you are not able to identify them easily. It is important that senior managers have an overview of the situation, but equally, it is useful for staff to be able to see their own, and their learners’, progress against targets. One of the identified trends in learning technology for 2016 is the importance of Data Analytics. There has been increasing momentum in this direction over several years, and Ofsted comments like those above show the level of understanding of what is happening in an organisation that is now required.

The emphasis on learner segmentation means the learning provider needs to be able to slice and dice learner profiles to make comparisons between the various groupings. Many examples of learner groupings are given in the Ofsted handbook (section 14): sexual orientation, disadvantaged learners, minority ethnic backgrounds, EFL, Traveller/Roma/Gypsy and so on. Are you able to segment your learners by these profiles, then look at these groups and say that you are giving all an equal opportunity? That you are implementing interventions when there are differences in performance? If so, can you provide evidence to Ofsted?
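In practice, this kind of segmentation amounts to grouping achievement records by a learner characteristic and comparing the rates. As a minimal sketch in plain Python (the field names, groups and figures below are invented for illustration, not drawn from any real MIS):

```python
from collections import defaultdict

# Hypothetical learner records; in practice these would come from your MIS.
learners = [
    {"group": "male", "achieved": True},
    {"group": "male", "achieved": False},
    {"group": "female", "achieved": True},
    {"group": "female", "achieved": True},
]

def achievement_by_group(records, key="group"):
    """Return the achievement rate for each learner group."""
    totals = defaultdict(int)
    achieved = defaultdict(int)
    for r in records:
        totals[r[key]] += 1
        if r["achieved"]:
            achieved[r[key]] += 1
    return {g: achieved[g] / totals[g] for g in totals}

rates = achievement_by_group(learners)

# Flag any group whose rate falls notably below the overall rate,
# as a prompt for intervention.
overall = sum(r["achieved"] for r in learners) / len(learners)
gaps = {g: overall - rate for g, rate in rates.items() if rate < overall - 0.1}
```

The same grouping could be run against any of the handbook's characteristics; the point is that the comparison, and the flagging of gaps, should be routine rather than a one-off exercise.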

So how do you deal with making your data more accessible to staff?

Having suitable data visualisation available to all staff, not just senior managers, is the key. All staff need to be able to see what is happening in real time, so they can take ownership of improving their own performance, just as managers need it for improving performance at a higher level.

[Figure: Gender profile]

To enable this, a data dashboard that presents data in a simple, easy-to-understand format is an important step. Data can be presented through graphs, statistics, or reports highlighting exceptions. Within this it should be possible to view the data at a level that suits your role in the organisation, and then to drill down into the details for a more granular view. Some of the organisations that have implemented this successfully do not restrict who can see what, but have transparency, so that all staff can see all data (as long as confidentiality is not breached). Doing this requires a multilevel dashboard that allows you to drill down to the level you need at that point in time.
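One way to picture a multilevel view is as the same underlying records aggregated at different depths of a hierarchy, say organisation, then department, then course. A toy sketch (department and course names are invented for illustration):

```python
from collections import defaultdict

# Hypothetical attendance marks: (department, course, attended?)
marks = [
    ("Engineering", "Welding L2", True),
    ("Engineering", "Welding L2", False),
    ("Engineering", "Motor Vehicle L1", True),
    ("Care", "Health & Social L2", True),
]

def attendance_rate(records, levels):
    """Aggregate attendance at the chosen drill-down depth.

    levels=0 gives the whole-organisation figure,
    levels=1 a figure per department, levels=2 per course.
    """
    buckets = defaultdict(list)
    for record in records:
        key = record[:levels]  # the first `levels` fields form the group key
        buckets[key].append(record[-1])
    return {key: sum(v) / len(v) for key, v in buckets.items()}

org = attendance_rate(marks, 0)        # one figure for the whole organisation
by_dept = attendance_rate(marks, 1)    # drill down one level
by_course = attendance_rate(marks, 2)  # drill down to individual courses
```

A senior manager starts at the organisation figure; a curriculum manager looks at their department; a tutor goes straight to their own course. The data is the same throughout, only the depth of the view changes.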

I remember working in an organisation where both staff and learners used to question the data’s accuracy, because attendance was only ever correct once a week, as that was how often it was updated. This brings me to another important criterion for a dashboard: it needs to show real data, in real time. For staff to have faith in a system, they need to see its value and its accuracy.

[Figure: College attendance]

Where staff have this level of confidence, adoption of the system improves, which in turn means the system works effectively and has a real effect on the organisation’s progress. For any system to work effectively, all staff need to buy into it and understand its value.

Data dashboards may be seen as an expense in these cash-strapped times, but as recent events show, is the alternative not more expensive? Having nothing tends to result in an ad hoc arrangement of systems from which it is impossible to collate anything of value. Some proprietary dashboards can be very inflexible, designed for a specific market sector, and so do not work well with different learner groups. Alternatively, getting these ‘off the shelf’ systems customised can prove difficult or expensive.

Here are some suggestions of things you should be looking for if you are considering investing in a data visualisation system:

[Figure: Assessment results]

Can the dashboard pull data from a wide variety of systems in real time and be refreshed on demand?

Does the visualisation allow for easy drilling down into the data?

Can you get the system to work with your specific data needs, or are you stuck with what comes out of the box?

Is the system usable by all staff in a way that will suit different organisational needs?

Is the data presented in a way that is easy to understand?

Can you run reports easily?

Does the system allow tracking over time, and year on year comparisons?

Can the reports be modified easily to reflect any changes in policy?
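On the last two points, tracking over time need not be complicated: at its simplest it means holding each period's figures and reporting the areas that have moved in the wrong direction. A rough sketch (the programme areas, rates and threshold are invented for illustration):

```python
# Hypothetical achievement rates per programme area for two years.
last_year = {"Construction": 0.78, "Hairdressing": 0.85, "ICT": 0.70}
this_year = {"Construction": 0.74, "Hairdressing": 0.88, "ICT": 0.71}

def declining_areas(prev, curr, threshold=-0.02):
    """Return programme areas whose rate has fallen by more than the threshold,
    with the size of the change, so managers can intervene early."""
    return {
        area: round(curr[area] - prev[area], 2)
        for area in curr
        if curr[area] - prev[area] < threshold
    }

declining = declining_areas(last_year, this_year)
```

A report like this, refreshed each term rather than each year, is exactly the kind of "decisive action to bring about swift improvement" evidence the inspection reports quoted above found lacking.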

Ofsted want to see that you are managing your data effectively and reacting to situations in a timely manner. A dashboard can help you show this, and a good dashboard can demonstrate how you are managing your learners and striving to improve. It also provides staff with a clear picture of the situation within the organisation, so they can have more input into its success.