Are Analytics the Key to Predicting Scholastic Success?

Analytics can drive better outcomes in education, but the process isn’t easy.

Administrators in the Spokane Public Schools in Washington used to pore over spreadsheets and PDFs to see which students might be at academic risk. By the time they had interpreted the information, it was often too late to act. Now they are using sophisticated analytic techniques to gauge student performance in real time, and district leaders say it is making a world of difference.

“You can’t have a process that takes two or three weeks to give you a report. If you are going to intervene early, you have to know right now,” said Chief Academic Officer Steven Gering.


Spokane is on the cutting edge of a trend that uses analytics as a means to improve the quality of K-12 and higher education. Proponents say these techniques can help to shape course content, improve student outcomes and even boost enrollment in the competitive higher education landscape. At the same time, however, a number of hurdles must be overcome in order to reap the full benefits of analytics.

Many academic institutions are struggling to put their data to work. According to KPMG’s recent Higher Education Industry Outlook Survey, 39 percent of respondents said adopting new analytical techniques is a top data challenge, and just 29 percent report using data to inform strategic decisions. Another 36 percent say that while they have good data, they lack the resources to analyze it.

At the same time, schools are under increasing pressure to put data to work, according to Gartner Research Director Glenda Morgan. In K-12 education, public funding has become increasingly contingent upon the ability to demonstrate outcomes. In higher education, schools are pushed by competitive pressures to show graduation rates and academic success in order to recruit and retain students.

Arguably, solid analytics could help schools to achieve these ends. The corporate world thinks so: Gartner predicts that the use of analytics as a way to drive decision-making will be the top CIO priority through 2017.

How will that play out in education? It all begins with the data, and right now access to relevant data is a mixed bag. In some cases, data can be culled from universal sources, especially standardized testing, where results are often aggregated automatically for easy comparison against local, state and even national benchmarks.

Some schools can also do this with internal data relatively easily. As a nonprofit online school, Western Governors University can cull data on enrollment, attendance, grades and even which questions an individual student got right or wrong on a test. “We have mapped all our assessment items to individual learning outcomes, so we have a really good understanding of what areas students are doing well in and what areas they need to improve,” said Jason Levin, the university’s vice president of institutional research.

Sometimes data is harder to come by. K-12 teachers, for instance, may have to enter scores, attendance and other markers manually on a spreadsheet. In KPMG’s survey, higher education leaders said they struggle with data residing in multiple locations (60 percent), as well as with the quality of data available (40 percent).

Once data is acquired, by whatever means, it enters an analytics process typically comprising three key elements: a data warehouse, a processing tool and a visual presentation component. It’s in that middle phase — the software component — that the number-crunching takes place, and this can take numerous forms. For example, software may assign a risk profile when students are not on track; it may cross-reference performance among many students to see where a teacher can improve; or it may use outcomes to determine where course content can be improved.
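To make the risk-profile idea concrete, here is a toy sketch of the kind of rule such software might apply in that middle, number-crunching phase. Every threshold and field name below is invented for illustration; real products use far richer models.

```python
# Illustrative sketch only: a toy risk-profile rule of the kind an
# analytics engine might apply. All thresholds and names are invented.

def risk_profile(gpa, attendance_rate, missing_assignments):
    """Assign a coarse risk level from a few common indicators."""
    flags = 0
    if gpa < 2.0:                   # failing or near-failing grades
        flags += 1
    if attendance_rate < 0.90:      # below 90 percent attendance
        flags += 1
    if missing_assignments > 3:     # pattern of missed work
        flags += 1
    return ["on_track", "watch", "at_risk", "high_risk"][flags]

print(risk_profile(gpa=1.8, attendance_rate=0.85, missing_assignments=5))
# prints "high_risk"
```

The point is not the specific cutoffs but the shape of the logic: several independent indicators roll up into one label a teacher can act on immediately.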

Analytics also can align attendance with geography: Maybe kids are late because there’s no public transportation available. Or a district might look at demographic factors. Are kids from single-parent households having a harder time? While the price tag of such analytic tools can vary widely, costs typically include new software, added storage capacity, and training for personnel who will enter data or access information.
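Cross-referencing attendance against geography can be as simple as grouping tardiness records by neighborhood. The sketch below uses made-up records and ZIP codes purely to show the mechanics.

```python
from collections import defaultdict

# Hypothetical records: (student_id, zip_code, times_late_this_term)
records = [
    (1, "99201", 7), (2, "99201", 9), (3, "99205", 1),
    (4, "99205", 0), (5, "99201", 8), (6, "99208", 2),
]

# Group tardiness counts by neighborhood (ZIP code)
late_by_zip = defaultdict(list)
for _, zip_code, late in records:
    late_by_zip[zip_code].append(late)

for zip_code, counts in sorted(late_by_zip.items()):
    avg = sum(counts) / len(counts)
    print(f"{zip_code}: avg {avg:.1f} tardies")
```

A pattern like one ZIP code averaging far more tardies than the others is exactly the kind of signal that might point to a transportation gap rather than a student problem.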

Ultimately, the outcomes of these investments should be concrete, including enhanced student performance, better retention rates in higher education and improved graduation rates at all levels.

“We don’t say that technology is the only answer, but when we do look at technology, we want to find tools that can enhance student learning and student success,” said Bill Moses, managing director of education programs at the Michigan-based Kresge Foundation, a $3.6 billion organization that supports education and diverse social needs.

“Right now we see analytics as addressing the fundamental processes of delivering education,” he said, noting that today’s data tools are a significant leap forward from yesterday’s spreadsheet-driven world. “In the past, decisions would be based either on very old information or else on gut instinct. Having real-time information means you know where students are currently, and it also means you have the potential to act on that in real time.”

Moses’ criteria for a sound analytics investment present a meaningful rundown of what the best of these solutions can offer. He is looking for:

Rapid turnaround: The ability to analyze data and get it into teachers’ hands quickly.

Scalability: Analytics must be reproducible outside a single class in order to use the investment campuswide.

Educators and tech experts point to a number of potential hurdles in implementing any analytics solution. A program needs reliable data, as well as thoughtful policies for its use. Educators must be trained to interpret the findings, and public concerns about data privacy must be addressed. In some cases, the technology investment may simply be out of reach, especially for smaller school districts. There is no silver bullet: each of these issues must be addressed in any analytics program.

In the end, the outcome of all this work may come at the district level, with the formulation of policies that respond specifically to verifiable information. For many, though, analytics will play out locally as a means to drive early intervention. When the numbers can highlight a negative trend and send up a red flag alerting a teacher to potential trouble, that’s when analytics really prove their worth.

This means putting findings to use at the ground level. In Prince George’s County Public Schools in Maryland, for instance, measures and indicators are fed to the district, but findings ultimately get passed back to individual schools to determine interventions. Principals and teachers may call for parent-teacher meetings, along with student meetings. They also may use the data to improve communications among teachers or to drive classroom support efforts for at-risk kids.

Janeal Maxfield likewise looks to keep the analytic outcomes close to home. As elementary instructional specialist for the North Thurston Public Schools in Washington state, she has driven implementation of an online math curriculum from ORIGO, a system whose assessments are used to generate a fuller picture of students’ learning status. That information in turn goes right to those on the front line via reports at the school, teacher and class levels.

“If we were to keep the data at the district level, it would get stale to the point where there is nothing the teachers can really do about it,” Maxfield said. “So we put it in the hands of the principal and the teachers, since they are the ones taking action — they are the ones who take that data and put it to use in the classroom.”

Spokane Analyzes for Attendance and Performance

With 30,000 K-12 students, the Spokane Public Schools are an early entrant into the analytics arena, with a Web-based effort to compile, correlate, interrogate and disseminate a range of student data.

“In the old way of doing things, data was dead. Somebody downtown would suck out the information, put it into Excel spreadsheets and make a PowerPoint, or make charts and tables and graphs, and then send it out to schools,” said Chief Academic Officer Steven Gering. Analytics means more than just understanding the data. “This is about looking at grades daily.”

The data dive goes beyond just grades. Running business intelligence tools from Tableau Software on a virtualized server, the school district brings together grades, attendance, behavior and test scores, weighted to paint a picture that identifies at-risk students. In three years, graduation among at-risk students rose nearly 8 percentage points, from 76.6 to 84.5 percent.
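The district has not published its actual weighting, but the idea of blending several indicators into one at-risk score can be sketched in a few lines. The weights and indicator values below are invented to show the mechanism, nothing more.

```python
# Invented weights, purely for illustration; the district's real model
# is proprietary. Indicators are normalized so 0 = worst, 1 = best.
WEIGHTS = {"grades": 0.4, "attendance": 0.3, "behavior": 0.15, "tests": 0.15}

def composite_score(indicators):
    """Combine normalized indicators into a single weighted score."""
    return sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)

student = {"grades": 0.55, "attendance": 0.70, "behavior": 0.9, "tests": 0.6}
score = composite_score(student)
print(f"composite score: {score:.3f}")  # lower means higher risk
```

A single composite number like this is what lets a dashboard rank an entire school and surface the students who most need attention first.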

Assessment goes both ways. In addition to looking at student performance metrics, administrators also collect student feedback on teacher performance, school safety and other factors that might be addressed at the school or district level. Course content and other variables can be adjusted accordingly.

The school used a Microsoft SQL Server database to construct a data warehouse that pulls this information from a range of sources, including a student management system that tracks attendance, discipline, test scores and other metrics. The Tableau software presents the information in a dashboard, as well as in a variety of reports.

That visual access has been a major driver of success thus far. “We are able to connect data we have never been able to connect before,” said Gering. “Principals can see all their data side by side, in a single view, all on one screen, where before I used to get a binder from the district, a report here, a spreadsheet in the mail, and we expected principals to somehow do something with all of this.”

The path has not been without its bumps, and Gering’s team has learned some important lessons about the nature of metrics.

“We can get a number that looks wrong and we can say, ‘Fix this.’ Then we get unintended consequences,” he said. “It is easy for a principal to fix a graduation rate. You just pass all your kids.”

Sometimes the problem lies in a faulty curriculum or some other factor that goes deeper than a quick fix. Gering tries to present the data in a way that will make that deeper need apparent.

“We want to help them get at the real solution, to get past whatever might be inhibiting them from moving forward,” he said.

Stemming Enrollment Decline at the University of North Carolina

While charged with fulfilling a range of educational priorities, administrators at the University of North Carolina (UNC) System have paid special attention lately to the state’s looming teacher shortage. Specifically, they have been looking to analytics to help close the gap in teacher training.

The system operates 15 teacher-training colleges statewide, and while the schools have trained more than 20,000 newly licensed teachers over the past five years, enrollment has dropped by about 30 percent since 2010. “We have some information about why these enrollments have declined, we have theories and assumptions, but we are still painting the whole picture and the analytics are helping to do that,” said Alisa Chapman, vice president for academic and university programs at UNC.

To sort it out, planners weigh a range of factors: Do students come from North Carolina or other states? What do their evaluations look like? How long do they stay in the job? What is their instructional practice? By cross-checking all of these, they can pinpoint which teacher trainees are likely to complete the coursework and succeed in the classroom.

“That allows us to refine and make programmatic adjustments where they are needed most, as opposed to making broad sweeping changes that might be changing something we are already doing really well,” Chapman said.

An advisory committee drawn from among state educational leaders helped draft the initial criteria, and a newly formed committee of national experts plans to meet quarterly to oversee the ongoing research. The results of this work appear on an online database and an educator dashboard. Both use data visualization and analytics from software firm SAS; both ensure broad public access to the university’s research and trend data.

This transparency is a critical piece of an analytics program in a public institution, Chapman said. Tight budgets, along with the privacy concerns that grow as institutions gather more data about students, are pressuring school leaders to make information visible so that everyone can see what is being collected and how it is used. That expectation now applies to any emerging analytics program.

“People can be dismissive of a printed report, but with a publicly visible dashboard they are asking very detailed questions and we have to be responsive to what those outcomes mean. It has really changed the conversation,” Chapman said. This puts new pressure on administrators to get it right the first time, but it also creates the opportunity for more effective policymaking. “It allows us to skip over to a higher level of conversation.”

Early Warning Analytics at Western Governors University

In four years, the nonprofit online Western Governors University has grown from 32,000 to 70,000 students. Despite the challenges that may come with rapid growth, success metrics have risen steadily in that time. The school claims a 37 percent graduation rate among undergraduate students 25 and older, 10 percent higher than the national average. WGU projects graduation rates will rise to 45 to 50 percent among students admitted in recent years.

“Certainly there are other factors involved, but we know that analytics has been a big part of that whole picture,” said Jason Levin, vice president of institutional research. The crux of the school’s analytic effort is the ability to give teachers early warning when a student may be in trouble academically. To get there, Levin’s team has implemented Talend, a software tool to perform extract, transform and load functions, feeding data into a central repository and formatting it for analysis by IBM Cognos, which sits on top of an Oracle data warehouse.
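The extract, transform and load pattern Levin describes can be shown in miniature. This is not Talend, just a generic Python sketch with made-up assessment data: extract raw rows, transform them into clean, typed records, and load them into an in-memory stand-in for the warehouse.

```python
import csv
import io

# Made-up raw export; "ABS" marks an absence that should not become a score.
raw = """student_id,Assessment,Score
101,algebra-1,82
102,algebra-1,ABS
103,algebra-1,91
"""

def extract(text):
    """Extract: parse the raw CSV export into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize field names, cast types, drop bad records."""
    out = []
    for r in rows:
        if not r["Score"].isdigit():   # skip absences / unscorable rows
            continue
        out.append({"student_id": int(r["student_id"]),
                    "assessment": r["Assessment"],
                    "score": int(r["Score"])})
    return out

# Load: in a real pipeline this would write to the data warehouse.
warehouse = transform(extract(raw))
print(warehouse)
```

The cleanup step in the middle is where most of the real-world effort lives; the data-quality complaints in the KPMG survey are largely about getting this stage right.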

The system drills down into specific test scores, measuring academic performance as the key indicator of an at-risk situation. “The best measures are measures of actual learning, and that’s what this is,” Levin said.

Based on statistical analysis, the system will broadcast a green, yellow or red alert to teachers, flagging potential problems. By correlating course assessments and other factors over time, the system also can draw attention to possible changes needed in the curriculum.
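The alert step itself is a simple mapping once the statistical model has produced a prediction. The thresholds below are illustrative, not WGU's; the point is that a probability becomes a color a teacher can scan at a glance.

```python
def alert_color(predicted_pass_prob):
    """Map a model's predicted probability of passing to an alert color.

    Thresholds are invented for illustration; a real system would tune
    them against historical outcomes.
    """
    if predicted_pass_prob >= 0.8:
        return "green"    # on track
    if predicted_pass_prob >= 0.5:
        return "yellow"   # watch closely
    return "red"          # intervene now

for p in (0.92, 0.67, 0.31):
    print(f"{p:.2f} -> {alert_color(p)}")
```

Broadcasting three colors instead of raw probabilities is a deliberate design choice: it trades precision for speed of interpretation on the front line.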

It takes a lot of hands-on effort to make the system work, including three engineers and three analysts who respond to daily requests for new dashboards and data tracking. “There is an endless queue of stuff that can be done,” Levin said.

Having multiple eyes on the system helps to ensure success, since metrics are as much about the process as about the product. “These dashboards and reports have to be meaningful and useful, and you have to be sure of the validity of these things, so there has to be some governance,” said Levin. “Someone always needs to be asking, ‘What are the important processes here? Who is going to be responsible for that?’ So we have built a lot of internal audits into the system.”