Patrick Malley (http://patrickmalley.com/)

Bringing Data Together
Data · Patrick Malley · Fri, 06 Jan 2017 · http://patrickmalley.com/blog/2017/1/6/bringing-data-together

As mentioned in a previous post, I am in the process of updating the mathematical model used by my school to determine when students are ready to take college-level courses. This model is important to us because we send over a third of our juniors and half of our seniors to college each year and we don’t want to mistakenly send students to college before they are ready. Using this model, my team has gotten pretty good at determining readiness; last year our students passed 97% of the college courses they attempted.

Before the model can be applied, the data must first be brought together into a single database or spreadsheet. Depending on your systems, this can be a quick or time-consuming endeavor. For me, bringing together all of the data we have on students took a little over six hours. Here’s what I did:

Google Sheets

Because it is shareable and applies edits in real-time, I do all of my modeling in a single Google Sheet. For anyone who is an Excel devotee, this may sound crazy. It is. But, for me, the benefits outweigh the costs.

For this year’s update, I created a new Google Sheet called “Master Data File” where I pasted an export from our Student Information System (SIS) containing each student’s name, ID, DOB, sex, graduation year, and cumulative GPA. Because our SIS contains the most up-to-date information regarding student enrollments, I always start there and then use that data as a reference for gathering the rest. No need to gather data on a student no longer enrolled.

Microsoft Excel

So far, there is only one function I need that is not easily done in Google Sheets: consolidating data. At one time, I would spend hours manually inputting data from one system’s export file to another. Excel can consolidate data from two spreadsheets in minutes.

The Consolidate function is on the "Data" tab of the ribbon in Microsoft Excel.

For example, data downloaded from the College Board website looks different than data taken from our SIS. The College Board data includes some students who have left my school, is missing data for students who are newly enrolled, and may have other formatting differences that would make a simple copy/paste impossible to do.

As long as I have a single column that uniquely identifies individual students (student ID, “Last Name, First Name” combinations, etc.), Excel can consolidate the data from both sources into a single row to be included in the master file.
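Outside of Excel, the same consolidation step can be sketched in a few lines of Python: start from the SIS roster (so departed students drop out) and left-join the other export on the unique ID. All of the names, IDs, and scores below are invented for illustration.

```python
# SIS export: the authoritative enrollment roster, keyed by student ID
# (all names, IDs, and scores here are invented for illustration)
sis = {
    101: {"name": "Avery, Jo", "gpa": 3.4},
    102: {"name": "Baker, Sam", "gpa": 2.9},
    103: {"name": "Cruz, Lee", "gpa": 3.8},  # newly enrolled, no scores yet
}

# College Board export: includes a departed student (104) and is
# missing the newly enrolled one (103)
college_board = {101: 1150, 102: 1010, 104: 990}

# Consolidate like Excel's Consolidate does: keep only students on the
# SIS roster, and leave the score blank (None) where no match exists
master = {
    sid: {**info, "sat_total": college_board.get(sid)}
    for sid, info in sis.items()
}
```

The key design point, in any tool, is the same one the post makes: everything hinges on a column that uniquely identifies each student in both sources.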

Data Brought Together

Here’s the data I consolidated into the single Google Sheet for each student, organized by source:

Student Information System

Demographic Information: used for sorting and aggregated data analysis

High School Grade Point Average: used as a primary indicator of future college success. This topic will be expanded upon further in a later post.

College Board

PSAT 8/9, 10, and 11: We give the PSAT to all students every year in grades 8 through 11. While we do not yet use this data in our model, I decided to pull it in hopes of future analysis and reporting.

SAT: In Michigan, all 11th graders are required to take the new SAT. Our community college partner accepts SAT scores for determining college course placement, so we use these scores as part of our readiness model.

Accuplacer: While this is technically a College Board product, we get this data from our college partner. Beginning in the 9th grade, our students take this college placement assessment each year until they place into college-level coursework.

ACT

ACT: Now that the state of Michigan has moved from the ACT to the SAT for its college readiness assessment, we only have a few students each year who take this assessment. For those who do, though, I need to consider their scores when determining readiness.

Compass: Until this year, our college partner used the ACT’s Compass assessment for determining college placement. This assessment was replaced by Accuplacer but we still consider Compass data in determining students’ college readiness.

Other

Agency Score: Each year, we ask our teachers to rate each student’s skill at exercising agency on a scale of 0-5. Agency, for those not familiar with the concept, is one’s ability to be an “agent” of his or her own learning. It consists of two components, both a part of our instructional model: 1.) completing tasks to specification and on time, and 2.) growing from challenging work and setbacks. I simply ask teachers to rate each student and take the average of their input. More on this measure of college readiness later.

When recording assessment data, I like to separate it by the year it was taken relative to the student. I like to know what each student’s score was each year they took it. This allows me to see growth or stagnation in student performance, and makes analysis and reporting of data much easier to do.
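That per-year layout amounts to a simple pivot: keep one record per test sitting, then give each student one row with a column per grade level taken. The structure and values below are my own illustration, not the actual sheet.

```python
# One record per (student, grade level) test sitting -- values invented
records = [
    {"sid": 101, "grade": 9,  "score": 58},
    {"sid": 101, "grade": 10, "score": 71},
    {"sid": 102, "grade": 9,  "score": 64},
]

# Pivot to one row per student with a column per grade taken, so
# growth (or stagnation) across years is visible at a glance
by_student = {}
for r in records:
    by_student.setdefault(r["sid"], {})[f"grade_{r['grade']}"] = r["score"]
```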

Next up: what I do with this data once I have it all in one location.

Modeling Future Student Success
Data · Patrick Malley · Sun, 01 Jan 2017 · http://patrickmalley.com/blog/2017/1/1/modeling-future-student-success

Over the next few weeks, I will be updating the mathematical model I created to predict students' future success in college. That model, which my school has been using and revising for the past four years, looks for patterns in academic and behavioral data to help predict individual students' likelihood of earning passing scores in college coursework.

I created the model in response to learning that standardized test scores alone left far too many edge cases to accurately predict future academic success. Too many students had previously scored well on tests yet did poorly in college classes. Similarly, some students we thought could handle college coursework did not score well on traditional measures of college "readiness."

Using this model, my school sends a third of its juniors and half of its seniors to college. Last year, these students passed 97% of the courses attempted. Ninety-three percent passed with a C or better.

To learn more about my school and why we send so many students to college while still in high school, I recommend reading my post from June titled Early College For All.

There is nothing magical about the model. It simply applies what is already known about past students' success to predict how well current students might do in college coursework.

The model uses three primary sources of data:

Standardized college placement or college readiness scores: I have used data from different assessments over the years with relatively similar results (Compass, Accuplacer, ACT, and SAT).

High school grade point average: in my school, the strongest predictor of future academic success is past student success.

Teachers' subjective assessment of student "agency": Each winter, I ask my faculty to evaluate each student on how well they are perceived to grow through challenging work and complete work on time.

Each year, the weight applied to each of these data sources has changed to reflect what we've learned about past student success. Last year, high school GPA and test scores were weighted about evenly. Agency, while found to be an accurate predictor, was weighted very little (approximately 10%) due to its subjective nature and the potential for perceived bias.
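As a concrete sketch of how such a weighted composite might be computed: the 45/45/10 split below is my reading of "about evenly" plus "approximately 10%", and the normalization scales (a 4.0 GPA scale, a 0-5 agency rating) are assumptions, not the post's actual formula.

```python
# Illustrative weights: GPA and test score roughly even, agency ~10%
# (these are assumed values, not the school's real weights)
WEIGHTS = {"gpa": 0.45, "test": 0.45, "agency": 0.10}

def readiness_score(gpa, test_pct, agency):
    """Blend the three data sources, each normalized to a 0-1 scale."""
    normalized = {
        "gpa": gpa / 4.0,        # assumes a 4.0 GPA scale
        "test": test_pct,        # test score already as a 0-1 fraction
        "agency": agency / 5.0,  # teachers rate agency on 0-5
    }
    return sum(WEIGHTS[k] * normalized[k] for k in WEIGHTS)
```

Re-weighting the model each year, as the post describes, is then just editing the `WEIGHTS` values as new outcome data comes in.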

Over the coming weeks, as I update the model, I hope to share more of the details that go into its creation and revision. I see great value in having more schools analyze data in this way and think it's a simple enough process that it can be replicated with a bit of time and effort.

Disclaimer: I am not a mathematician and do not claim to be an expert in inferential statistics. I am simply a practitioner with a good memory of his Statistics 101 class. I welcome any feedback from readers with stronger mathematical grounding.

If you have questions about this model that you would like me to expand upon or would simply like to learn more, feel free to leave a comment or reach out by email.

Early College for All
Outcomes · Patrick Malley · Sat, 11 Jun 2016 · http://patrickmalley.com/blog/2016/6/11/early-college-for-all

Last week, I wrote a press release about my school that was picked up by our local paper. I'm proud of the work that it represents and I'm proud of the students who make this possible. So, I want to share it here:

Two-Thirds of Meridian Grads Opt for Free First Year of College

SANFORD, MI - June 4, 2016 - One hundred twenty students walked across the stage at Meridian Public Schools’ commencement ceremony this past Thursday night, but only forty of them took home a diploma. That’s because the other eighty students – two-thirds of the graduating class – have chosen to participate in Meridian’s fifth year program for a free first year of college.

“As an early college high school, we are set up to offer all students five years of education,” said Patrick Malley, Meridian’s high school principal. “During students’ fifth year, they take a full-time course load with one of our early college partners.”

The majority of students will take courses at Delta College. Others have opted to earn vocational credentials through the Greater Michigan Construction Academy or Bayshire Beauty Academy.

None of the students participating in fifth year have to set foot on the high school campus. For the most part, they are treated just like any other first-year college students.

“This graduating class has already earned over 1,600 college credits during their junior and senior years,” said Meridian Superintendent, Craig Carmoney. “Now, with so many of our students staying for fifth year, we estimate that over 90% of them will participate in postsecondary education.”

Meridian transitioned its high school to an early college four years ago, when the students in this class were just freshmen. According to Principal Malley, the decision to become an early college made sense considering the work they were already doing: “The district had just joined the New Tech Network in an effort to improve student success after high school. We had the support of our teachers, parents, community, and Board to re-imagine our high school to improve outcomes. We saw alignment between our work with New Tech and the Early College movement, so we went for it.”

As one of only twenty-two early college high schools in the state of Michigan, Meridian is offering students an experience that was unimaginable just five years ago. In the fifth year, students receive funding for tuition, books, and supplies. They also get monthly gas cards to help offset the cost of transportation, and are assigned a laptop that they can use in the classroom and take home. Additionally, they are linked with an Early College Coach who supports them through their first year college experience.

“Our goal is to remove as many barriers to college and career success as possible for these students,” said Superintendent Carmoney.

What are the other third of the class not staying for the extra year doing next year? Most of them applied for early graduation after just four years and are going on to a university, the military, or to work in a family business. Because of the opportunities offered, only a few of them graduated undecided about their next step after high school.

“While our evaluation of the success of this program will have to wait until students finish their fifth year, the early results appear very positive,” said Malley. “By eliminating the major stumbling blocks to college success – funding, transportation, and support – we anticipate we’ll see our first year college completion rates more than double our ten year average.”

Based on the program's past success and the number of students participating, it is likely that over eighty percent of Meridian graduates will earn a year of college credits before exiting the program.

According to Amy Boxey, Dean of Student Transitions at Meridian, some will even graduate with over 60 credits.

“Our goal is to send students to postsecondary programs once they are able to show us that they are ready,” said Boxey. “Students who demonstrated readiness their junior year went to college. More were ready and went to college during senior year. Now that these students are entering the fifth year, the opportunity has opened to all. We look forward to supporting so many of our students on their next step after high school.”

NewSchool Learning Closed for Business

It was nine years ago that I launched NewSchool Learning, a small Moodle design company that has produced just shy of 1,000 custom themes for clients across the planet. As of this past December, sadly, NewSchool Learning is no more.

The reason I decided to close shop is simple: the company stopped making money. Revenue had fallen 50% each year for the past three years. This last quarter, expenses exceeded revenue and I knew it was time to call it quits.

The reason for the decline in revenue? I honestly cannot say. I’ve been too distant for too long from the day-to-day operations of the business and the Moodle community to even guess. I just know that the accounting stopped making sense.

The fact that NewSchool Learning lasted as long as it did is pretty amazing, and credit certainly belongs to Lead Designer and Developer, John Stabinger. As I moved from teacher to school administrator over the past five years, I stepped farther away from my work with and on the company. John has kept the lights on while I’ve paid the bills, and for that I’ll be forever grateful.

I’d be lying if I said I was wholly sad to see the company close; part of me is happy to be free of dealing with payroll, accounts, and corporate taxes. At the same time, nine years of relatively passive income from a company that started as a curiosity makes it challenging to say goodbye.

Here’s to other adventures.

The new ‘magical thinking’ about high-tech in schools — and why it’s a problem
Technology · Patrick Malley · Sun, 08 Nov 2015 · https://www.washingtonpost.com/news/answer-sheet/wp/2015/11/07/the-new-magical-thinking-about-high-tech-in-schools-and-why-its-a-problem/

The notion that education is a "problem to be solved" is just flat out wrong.

Could Rubric-Based Grading Be the Assessment of the Future?
Assessment · Patrick Malley · Fri, 06 Nov 2015 · http://ww2.kqed.org/mindshift/2015/10/14/could-rubric-based-grading-be-the-assessment-of-the-future/

So, apparently the Association of American Colleges and Universities has been piloting the use of rubric assessments of "cross-cutting skills." They call their rubrics Valid Assessment of Learning in Undergraduate Education, or VALUE.

According to Katrina Schwartz's reporting on the pilot last month, the professors involved were surprised by what they, themselves, learned by doing assessments in this way:

Professors began realizing how much the language of their assignment prompts communicated what they expected from students. That might seem obvious, but without other samples to compare to, professors just thought their students didn’t have the skills.

Why Do So Many Kids Have Difficulty Adjusting to School?
Discipline · Patrick Malley · Wed, 04 Nov 2015 · https://www.psychologytoday.com/blog/freedom-learn/201007/adhd-school-assessing-normalcy-in-abnormal-environment

Peter Gray, MD, writing for Psychology Today back in 2010:

"From an evolutionary perspective, school is an abnormal environment. Nothing like it ever existed in the long course of evolution during which we acquired our human nature. School is a place where children are expected to spend most of their time sitting quietly in chairs, listening to a teacher talk about things that don't particularly interest them, reading what they are told to read, writing what they are told to write, and feeding memorized information back on tests."

We need to be talking more as a society about our perceived need to medicate 12% of boys and 4% of girls so they can make it through each school day.

It Is Rocket Science
Instruction · Patrick Malley · Tue, 03 Nov 2015

New theories about the science of learning from the Deans for Impact. Interesting findings include:

The idea that we each have different learning styles? Unsupported by research.

"Research shows that taking a quiz or forcing oneself to recall information is a better practice" than, say, rereading a book chapter or completing a study guide.

Peer tutoring? "When we want a student to learn something, have the learners recall what they know and teach someone else instead of sitting with a few peers who already get it."

"Teachers [should] alternate practice with different kinds of content rather than practicing one type of problem several times before moving on."

My sense of this: The better able students are at being agents of their own learning, and the better teachers are at supporting that type of learning, the more students learn.

Allowing Student Choice in Their Daily Schedule: a Technical How-To
Technology · Patrick Malley · Tue, 20 Oct 2015 · http://patrickmalley.com/blog/2015/10/19/l1jc9r7zbzc8qoe2jnjilntim98reg

This year, using Google Forms and two Add-Ons, I cobbled together a system that allows teachers to account for our 400 high school students during a relatively open 30-minute period of their day. When developing this system, I had two primary objectives:

I wanted students to choose which class they go to for these 30 minutes.

I didn't want to use passes or paper and pencil sign-ups.

I needed to know where students were and whether or not they attended, but I didn't want students to have to go to one teacher's class for attendance just to leave (as is done in typical seminar-like structures I've seen elsewhere). I find this to be a waste of time and energy.

After some help from the internet and time to tinker, I came up with the following system:

1. Students Register for the Class They Want to Attend

We offer these 30-minute periods, called FIT (for Focused Instructional Time), every Tuesday and Thursday in the middle of the afternoon. Prior to the start of each FIT period, students navigate to this page on our website, click on the name of the teacher whose class they need to focus on, and complete a Google Form letting the teacher know they plan to attend.

For reasons that will become evident in the next step, each of the registration links on this page goes to a separate Google Form specific to that teacher's class.

2. Registrations are Capped at Thirty Students Per Class

To prevent some classes from becoming overrun with students, I use the formLimiter add-on by New Visions Cloud Lab. While not perfect, this add-on looks at the spreadsheet where registrations are being recorded and turns off the form once registration levels hit a pre-determined level (in our case thirty students).

3. Students Receive an Automated Email Confirming Registration

Whenever a student successfully registers for a class, they see a message on the screen and receive an email confirming our expectation that they will attend. This email is sent using Google's own Form Notifications add-on. It becomes the fall-back receipt in the event of a registration getting lost.

4. Registrations are Captured in a Single Google Sheet

To ensure my entire staff can easily find any student during this 30-minute period, I have set the destination of each of the individual teacher's registration forms to the same Google Sheet. My entire staff needs to have edit rights to this spreadsheet, so I protected all of the cells they shouldn't edit to prevent errors from breaking everything.

5. Teachers Take Attendance Using Registrations

Every Tuesday and Thursday, during this 30-minute period, teachers open up the spreadsheet, navigate to their course tab, and take attendance using the list of students who have registered to be in their class during FIT. If a student who registered is absent, the teacher copies the registration information and pastes it into another tab labeled "Absent." My dean of students checks the absent tab for students who should be present while my instructional coach and a paraprofessional "sweep" the halls looking for students who may have forgotten to register.

6. Once Attendance is Taken, Teachers Delete the Registrations

Within the sheet is a tab containing formulas that count the registrations for each teacher. It is this tab that each registration form looks at to determine whether or not the class is at capacity (30) and the form needs to close. Deleting the day's registrations after attendance is taken resets the count tab value to zero for the course, allowing another 30 students to register the next time around.

If a teacher forgets to delete their registrations after taking attendance, then only as many students as there are seats available will be able to register the next time around before the form automatically shuts itself off. For this reason, deleting student registrations after attendance is taken is key to ensuring this system works.
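Stripped of the Forms machinery, the whole system reduces to cap-and-reset logic. Here is a minimal Python sketch of that behavior; in the real setup, formLimiter and the count tab handle this, and the function names below are mine, not part of any Google API.

```python
CAPACITY = 30  # registration limit per class, per FIT period

registrations = {}  # teacher name -> list of registered student IDs

def register(teacher, student):
    """Accept a registration only while the class is under capacity,
    mirroring formLimiter closing the form at the cap."""
    roster = registrations.setdefault(teacher, [])
    if len(roster) >= CAPACITY:
        return False  # form is closed; student must pick another class
    roster.append(student)
    return True

def take_attendance_and_reset(teacher):
    """Deleting the day's registrations resets the count to zero,
    which is what reopens the form for the next FIT period."""
    return registrations.pop(teacher, [])
```

Note how forgetting to call the reset function leaves stale registrations counting against the cap, which is exactly the failure mode described above.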

Every Friday, I call five parents. While calling them, I share something great about their student from that week. It could be a concept they worked hard to improve, a great peer interaction, or showing respect to me or another teacher. I do this every Friday without fail.

A classroom well balanced between breadth and depth might introduce new concepts on a regular basis and practice them to ensure basic understanding while at the same time have students always working on one project or task that goes deeper in a keystone area. While the majority of class might still be used to introduce, explain, and practice new content, a significant portion of class time might be devoted to projects and tasks focused on keystone concepts, which students would spend considerable out-of-class time on as well.

‘Student Agency’ Is Not Something You Give or Take
Ownership · Patrick Malley · Sat, 17 Oct 2015 · https://www.edsurge.com/news/2015-10-16-student-agency-is-not-something-you-give-or-take

Andrew Rikard, a junior at Davidson College in North Carolina:

When educators say that I am an equal, even when I clearly am not intellectually, everything changes ... I feel both a sharp fear and an intense freedom. Suddenly, my voice is valuable ... my thoughts can change the mind of the other collaborators. This is empowering. It is the exclamation that we all are learning together.

New Tech GPA Stronger Predictor of College Success
Data · Patrick Malley · Fri, 20 Feb 2015 · http://patrickmalley.com/blog/2015/2/20/new-tech-gpa-better-predictor-of-college-success

A few weeks ago, I shared that my dual enrollment students' high school GPA was the strongest predictor of college success — stronger even than scores on college placement exams. Last week, it struck me that half the group of students we sent (our juniors) were taught in 100% New Tech courses before dual enrolling in college. The other half were seniors who were taught in traditional classes one year ahead of our New Tech initiative. What a great opportunity for data comparison!

For those unfamiliar with New Tech, let me explain:

Three years ago, my district contracted with the New Tech Network to support change in our high school in three key areas:

Empowering students through increased voice and choice in their learning.

Engaging students in deeper learning of course content through wall-to-wall implementation of project- and problem-based learning as our instructional model.

Enabling students to foster their own learning by providing them with 1-to-1 technology and teaching them to use it effectively.

As part of this initiative, we spent over 2.5 million dollars renovating spaces, buying furniture and technology, and training teachers and leaders. As a result, our staff is now working collaboratively to design authentic projects. We've moved our teacher desks into one of two "Bullpens" where teachers meet between classes and during prep. We integrate courses whenever integration makes sense. Our students take classes like "GeoDesign," "BioLit," "American Studies," and "Civic Reasoning." Each of these classes has two teachers and more time to learn from their work. We are doing a lot of things differently. And better.

To put things back into perspective, we have two groups of students dual enrolling this year: seniors and juniors. Both were educated by the same teachers in the same school. The juniors are part of our New Tech initiative. The seniors are not. The circumstances are begging for further analysis!

To start, let me describe the students. Last semester, we had 67 students dual enroll: thirty-nine juniors and twenty-eight seniors. Both groups represent what we would consider our "top third" performers (more juniors dual enrolled because their class size was larger). The average high school GPAs for the two groups were close: 3.39 and 3.32, respectively.

They were also demographically similar. Both groups had a few more boys than girls. Free and reduced lunch students were represented at only a third of their schoolwide rate (18% of dual enrolled students vs. 55% of total high school enrollment). They were racially similar, 99% white, which is consistent with our district and community makeup.

The one demographic difference that stands out to me is the obvious one: seniors are, on average, one year older than juniors. They also have one more year of high school experience and are one year closer to entering college full-time. While I cannot say that this information is statistically significant, after working in high schools for the past ten years, it feels anecdotally significant.

They also performed similarly in college, on average. Seniors passed 96% of their college classes with a GPA of 3.01. Juniors passed 92% of their college classes with a GPA of 2.90. Just three students experienced failure: one senior and two juniors.

One other comparison that seems notable is that juniors and seniors took similar courses in college, with one potentially significant exception: being farther ahead in the curriculum, more seniors took advanced math than juniors (46% vs. 13%).

Where performance differences become noticeable is in the way individual GPA distributes across students. The graphs below demonstrate that difference by overlapping the distribution of high school and college GPAs for each group independently.

Generally speaking, it is clear that both groups performed better at the top of the GPA range in high school than they did in college; both groups saw fewer individual students with a college GPA in the 3.0–4.0 range. It is notable, however, that the size of the gap between high school GPA and college GPA at the top of the range is smaller for the New Tech juniors than it is for the seniors (this will be highlighted later). And, while that gap continues to exist — albeit in the opposite direction — for seniors in the middle of the GPA range (1.5–3.0), it seems to disappear for juniors. At the bottom of the range, of course, more juniors than seniors earned a GPA below a 1.5.

The degree to which high school GPA and college GPA move together can be further illustrated in the following two scatterplots:

N = 28, r = +0.65, r^2 = 0.418

N = 39, r = +0.84, r^2 = 0.705

As previously reported, there was a strong positive correlation between high school GPA and college GPA for all dual enrolled students (r = +0.74). As this data shows, the correlation was higher for juniors (r = +0.84) than it was for seniors (r = +0.65). And, while I do not have the mathematical chops to tell you whether or not this difference of 0.19 is statistically meaningful, I can tell you that I find it encouraging.
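For readers who want to reproduce these r values from their own exports, Pearson's r is just the covariance of the two GPA columns divided by the product of their spreads. A small sketch with toy numbers, not the post's real data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between paired lists,
    e.g. high school GPA vs. college GPA."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy GPAs only; square r to get the r^2 shown on the scatterplots
hs_gpa      = [3.0, 3.4, 3.8, 2.6]
college_gpa = [2.5, 3.1, 3.6, 2.0]
r = pearson_r(hs_gpa, college_gpa)
```

Google Sheets' CORREL function computes the same statistic, so this could also live directly in the master file.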

As an educator, I strive to give students accurate information about their potential to succeed after high school. I find it satisfying to learn that our New Tech initiative may be increasing that accuracy.

Time will tell whether or not this trend will continue. I don't want to make any broad claims about why our New Tech educated students' GPAs are better predictors of college success. I will, however, close with some wonders:

I wonder what effect our measurement of skills (collaboration, agency, oral & written communication) in addition to content is having on high school success as it relates to college success?

I wonder if this trend will continue with our next group of New Tech students who dual enroll? Specifically, I wonder if the model will apply equally to lower high school GPA-earning students?

I wonder if other New Tech high schools have found similar results.

I wonder if I will be satisfied if the only quantifiable difference between our New Tech educated students' college success and those students taught in our traditional high school is this increase in our ability to predict said success? I wonder if our community would be satisfied?

I wonder what questions I'm not asking that may have compelling answers in this data?

Our New Tech students are taking the ACT for the first time next week. We will also begin scheduling our second group of Early College participants. I can't wait to add this data to the mix for further analysis to see how they compare.

The Problem with Boys
Data · Patrick Malley · Sun, 01 Feb 2015 · http://patrickmalley.com/blog/2015/2/1/college-success-for-poor-males

As previously mentioned, my high school is now dual enrolling more students than ever — about ten times more. A quarter of all juniors and seniors took half their classes at the community college last semester as part of our early college efforts.

By most measures, these students did very well. As a group, they earned over 95% of the credits they attempted with an average GPA over 3.0. They were, after all, able to dual enroll because of their past performance on standardized tests and high school coursework. They went to college because we thought they were "ready."

Yet, unsurprisingly, not all students performed equally well. About 15% of our dual enrolled students ended the semester with a college GPA below a 2.0. A few students even experienced their first academic failure in college. So, even within our high average of success, not all students shared the same experience.

First Semester 2014-15 Dual Enrollment GPA Distribution (N=67)

We consider this fact — that some students didn't do as well as expected — to be a really big deal. It means that our algorithm for credentialing students for college readiness isn't yet perfect. To be clear, we didn't expect it to be, and while we acknowledge that reaching "perfect" isn't probable, wanting perfect gives us reason to dig into our data in hopes of finding some clues that will help us identify relative risk in the future.

Our biggest takeaway?

Boys did much worse in college coursework than girls — a whole grade point worse, on average.

Girls earned college GPAs that were 1.05 points higher than boys, on average.

This is despite the fact that girls and boys performed equally on both the COMPASS and ACT assessments, which we use to determine eligibility for college-level coursework. We're talking less than 0.01 difference between boys and girls on these tests.

Being a boy had a stronger negative effect on student success than any other factor: free/reduced lunch status, high school GPA, etc. At the same time, these factors still added to the risk — going to college as a boy receiving free lunch with a high school GPA below 3.0 was clearly tough: these students earned an average GPA below 1.5 in college.

The average college GPA for girls receiving free lunch with a high school GPA below 3.0: a respectable 2.5.
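For anyone curious how a slice like this gets computed, here's a minimal sketch in Python. The records, field names (`sex`, `frl`, `hs_gpa`, `college_gpa`), and GPA values are all hypothetical stand-ins, not our actual student data:

```python
from collections import defaultdict

# Hypothetical records; "frl" marks free/reduced lunch status
students = [
    {"sex": "M", "frl": True, "hs_gpa": 2.8, "college_gpa": 1.4},
    {"sex": "M", "frl": True, "hs_gpa": 2.6, "college_gpa": 1.5},
    {"sex": "F", "frl": True, "hs_gpa": 2.9, "college_gpa": 2.6},
    {"sex": "F", "frl": True, "hs_gpa": 2.7, "college_gpa": 2.4},
]

# Collect college GPAs for the at-risk slice discussed above:
# free lunch and a high school GPA below 3.0, grouped by sex
groups = defaultdict(list)
for s in students:
    if s["frl"] and s["hs_gpa"] < 3.0:
        groups[s["sex"]].append(s["college_gpa"])

for sex, gpas in sorted(groups.items()):
    print(sex, round(sum(gpas) / len(gpas), 2))
```

With the made-up numbers above, the girls' mean lands well above the boys', mirroring the gap we saw in the real data.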

What now?

We certainly can't increase our requirements for boys above that of girls without raising some eyebrows. What we can do is educate parents and students on the relative risks of going to college and how our data should inform that risk. While hope will likely spring eternal for most, some students may delay college entry in hopes of better results down the road.

We can also raise our expectations overall, since doing so would result in sending fewer students with high school GPAs below 3.0 to college. Even though most boys saw their GPA decline in college, the decline was less detrimental for students who started college with a high school GPA above 3.0. This seems obvious, but it is good to have data to back it up now.

Lastly, I think it's crucial that we think of new ways to support students, specifically these struggling boys, while in college. To do this appropriately, we're going to have to get to know our boys a bit better to start to decipher what is going on. Is it maturity? Is it social expectations? Is it video games? We need to learn more about what is going on with them so that we can build in better supports for them to be successful.

Predicting College Success
Patrick Malley · Fri, 30 Jan 2015 04:08:50 +0000
http://patrickmalley.com/blog/2015/1/29/predicting-college-success

I spent my morning analyzing the grades of the sixty-seven juniors and seniors who dual enrolled from my school this past semester. Of the 464 college credits attempted, 440 were earned, giving us a pass rate just a hair under ninety-five percent. Half the group had a college GPA above a 3.43. I'd say this is pretty good news for our first cohort of New Tech students taking college classes.

One of the goals of my analysis was to assess how well we predicted college readiness amongst these young advanced students. While only four of the sixty-seven students who dual enrolled experienced failure, some students still performed worse than expected. Pushing students to college too early could potentially blemish their college transcript. Defining "ready" has therefore become a really big deal.

Aligning our thinking with both our college partner and the state, we placed the greatest weight on students' college entrance exam scores last year. In deciding who got to go, we let test scores trump all other valid readiness indicators such as high school GPA, teacher perception, etc.

So, how did that work out for us?

The worst predictor of student success for us was their score earned on the COMPASS, taken by our current juniors who had not yet taken the ACT. The COMPASS is used by our community college partner to place students into courses at appropriate levels. For us, it turned out that the COMPASS provided only a minor ability to predict college success (r=0.25).

The correlation between student COMPASS scores and college GPA was a low r=+0.25.

Coming in second was the ACT assessment, taken by all juniors in the state of Michigan. The ACT proved to be a fair predictor of college success (r=0.44).

The correlation between student ACT scores and college GPA was a moderate r=+0.44.

The best predictor of college success turned out to be student GPA (r=0.76).

The correlation between student high school GPA and college GPA was a high r=+0.76.
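The r values above are Pearson correlation coefficients. For readers who want to reproduce this kind of analysis, here's a minimal sketch in Python; the GPA lists are made-up examples, not our students' scores:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical: high school GPAs vs. college GPAs for five students
hs_gpa = [3.9, 3.2, 2.8, 3.6, 2.5]
college_gpa = [3.7, 2.5, 2.4, 3.5, 2.0]
print(round(pearson_r(hs_gpa, college_gpa), 2))  # → 0.97
```

A value near +1 means the two measures rise and fall together, which is why a high r makes a measure a useful readiness predictor.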

While the state of Michigan allows schools to use varied methods of determining college readiness before allowing students to dual enroll, it is interesting that they will not allow GPA to be a primary determining factor, given its apparent ability to correctly predict student success.

What we will most likely do in the future, given this data, is create a single numerical value for each student that takes into account their college entrance exam score and their high school GPA. This would appear to provide some additional predictive ability (r=+0.82 to r=+0.86) not possible using test scores alone.
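One simple way to build such a composite is to z-score each measure against the cohort and take a weighted sum. The sketch below is only an illustration of the idea — the function name, the ACT/GPA figures, and the 0.4/0.6 weights are my hypothetical placeholders, not weights we have actually settled on:

```python
import statistics

def composite_readiness(test_score, hs_gpa, cohort_tests, cohort_gpas,
                        w_test=0.4, w_gpa=0.6):
    """Blend a standardized test score and high school GPA into one index
    by z-scoring each against the cohort and taking a weighted sum."""
    z_test = (test_score - statistics.mean(cohort_tests)) / statistics.stdev(cohort_tests)
    z_gpa = (hs_gpa - statistics.mean(cohort_gpas)) / statistics.stdev(cohort_gpas)
    return w_test * z_test + w_gpa * z_gpa

# Hypothetical cohort of ACT scores and high school GPAs
acts = [18, 21, 24, 27, 30]
gpas = [2.5, 2.9, 3.2, 3.6, 3.9]

# A student above the cohort mean on both measures scores positive
print(composite_readiness(27, 3.6, acts, gpas) > 0)  # → True
```

Z-scoring puts a 36-point test and a 4-point GPA on the same scale before weighting, so neither measure dominates just because its raw numbers are bigger.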

UPDATE—January 30, 2015: Looking at this with fresh eyes, I think it's important to point out that we used the minimum COMPASS and ACT scores required for college-level coursework placement with our community college partner as our cutoff for allowing students to dual enroll. We did not use the state minimum scores, which are higher. It is logical that using the higher scores would have increased these assessments' predictive ability. We are choosing to use the lower scores to increase access with the hope of keeping risk to a minimum for our students.

We Still Don't Know the Difference Between Change and Transformation
Patrick Malley · Fri, 16 Jan 2015 00:10:39 +0000
https://hbr.org/2015/01/we-still-dont-know-the-difference-between-change-and-transformation

Ron Ashkenas, writing for Harvard Business Review:

Unlike change management, [transformation] doesn’t focus on a few discrete, well-defined shifts, but rather on a portfolio of initiatives, which are interdependent or intersecting. More importantly, the overall goal of transformation is not just to execute a defined change — but to reinvent the organization and discover a new or revised business model based on a vision for the future. It’s much more unpredictable, iterative, and experimental. It entails much higher risk. And even if successful change management leads to the execution of certain initiatives within the transformation portfolio, the overall transformation could still fail.

While his audience is clearly the business community, the above statement speaks volumes to educators just the same.

If I could break down a party for you in social media terms, here’s how it would pan out:

You post yourself getting ready for the party, going to the party, having fun at the party, leaving at the end of the party, and waking up the morning after the party on Snapchat.

On Facebook you post the cute, posed pictures you took with your friends at the party with a few candids (definitely no alcohol in these photos).

On Instagram you pick the cutest one of the bunch to post to your network.

Snapchat is where we can really be ourselves while being attached to our social identity. Without the constant social pressure of a follower count or Facebook friends, I am not constantly having these random people shoved in front of me. Instead, Snapchat is a somewhat intimate network of friends who I don't mind seeing me at a party having fun.

Five Rules for Getting Things Done as a Principal
Patrick Malley · Tue, 13 Jan 2015 12:57:22 +0000
http://patrickmalley.com/blog/2015/1/13/five-rules-for-getting-things-done-as-a-principal

One of the more challenging aspects of school administration is undoubtedly managing your time and attention. Between the emails, phone calls, texts, memos, agendas, and chats in the hall, demand always seems to be greater than supply. Handling all those inputs requires a great deal of skill, one I am definitely still developing and think about a lot.

The following "rules" represent what I've learned it takes to accomplish daily tasks as a high school principal. My thinking has been influenced largely by David Allen's treatise on personal productivity, Getting Things Done.

Rule 1. View yourself and your management of time, tasks and things as a system.

Systems produce the result they produce because they are designed that way; there are no excuses for systems. If you regularly fail to keep commitments that you make – to yourself or others – then your system has a design flaw requiring increased attention. It's not personal and it's certainly not "the nature of the work." It's a flaw that requires intentional thinking and planning to remedy.

Rule 2: Capture your ideas, next actions, and commitments as they come into your system.

Stop relying on your brain to capture what you think and need to do. It will fail you every time. Decide on the specific tools and methods you will employ to capture this information and get disciplined about using them everywhere, without exception.

Rule 3: Make time daily to process and organize everything you capture.

You must decide what work must be done the next day, week, month, or year. For me, this almost always occurs between the hours of 9pm and 7am, when I'm not at work and have enough time and space away to think about what needs to be done. Without this time, your efforts to capture what's coming into your system will be fruitless, and what you inevitably do will be absent the type of intentionality that leads to meeting your goals.

Rule 4: Crank through your next actions and commitments each day.

Procrastination is not your friend. Be honest with yourself about what can be done and do the work. If an urgent event demands your attention – because it will – regroup and get back to work on your list of next actions and commitments as soon as you can.

Rule 5: Review open projects, commitments, and goals every week.

Did you forget to do something? Did a commitment go unmet? How do you know? I spend an hour to an hour and a half each weekend reviewing open loops and establishing next actions. Without regularly taking a 35,000-foot overview of your work, you're bound to miss something, which is a flaw in your system.