Ok, so that headline might sound a little dramatic. Yet even on the most collegial of campuses, a serious conversation about retention rates – especially if that number has gone in the wrong direction over the last year or two – can quickly devolve into a wave of finger-pointing and rekindle a litany of old grudges.

We’ve all heard the off-handed comments. “If admissions would just recruit better,” “If faculty would just teach better,” “If student affairs would just help students fit in better,” “If financial aid would just award more money.” Lucky for me, all of these assertions are testable claims. And since we have the data . . . (cue maniacal laughter).

Yet using our data to find a culprit would fly in the face of everything that we are supposed to be about. We say that our students learn because of the holistic nature of the Augustana experience. And analyzing our student data by individual characteristics implies that our students are somehow one-dimensional robots. Most importantly, if we want to improve, then we have to assess with the specific intent of starting a conversation – not ending it. That means that we have to approach this question with the assumption that we are all critical contributors to retaining students.

This year’s first-to-second year retention rate isn’t great. At 82.9%, it’s the lowest it’s been in five years. In the context of Augustana 2020’s target retention rate of 90%, there is certainly reason for raised eyebrows. To understand what is going on underneath that overall number, it’s worth looking at our data in a way that mirrors the students’ interaction with Augustana up until they decide to stay or leave. So let’s organize these trends into two categories: students’ pre-college demographic traits and the students’ first year experiences.

Scholars of retention research generally point to five pre-college demographic traits that most powerfully impact retention: gender, race, socioeconomic status, academic preparation, and first generation status. Below is five-year trend data across these categories.

| First-to-second year retention rate | 2009 | 2010 | 2011 | 2012 | 2013 |
|---|---|---|---|---|---|
| Overall | 87.8% | 87.6% | 84.4% | 84.9% | 82.9% |
| Female | 91.2% | 89.6% | 85.7% | 90.1% | 82.7% |
| Male | 83.6% | 85.0% | 82.8% | 78.6% | 83.2% |
| White | 88.1% | 89.0% | 84.4% | 85.8% | 84.2% |
| Multicultural | 87.0% | 82.6% | 85.6% | 81.3% | 78.4% |
| Pell grant recipient | 78.4% | 83.3% | 84.0% | 81.3% | 80.8% |
| Only qualified for loans | 88.7% | 87.4% | 83.5% | 83.3% | 81.2% |
| Did not qualify for need aid | 90.6% | 90.6% | 86.2% | 89.5% | 86.7% |
| First generation | n/a | n/a | n/a | 83.0% | 80.8% |
| ACT <= 22 | 77.8% | 82.1% | 83.3% | 75.0% | 78.6% |
| ACT 23-25 | 90.5% | 89.5% | 84.3% | 87.7% | 84.9% |
| ACT 26-27 | 92.4% | 88.2% | 83.0% | 90.4% | 83.8% |
| ACT >= 28 | 91.0% | 90.9% | 89.0% | 87.0% | 82.2% |
| Test Optional | 76.7% | 75.0% | 75.8% | 81.3% | 84.8% |
| ACT top three quartiles | 91.0% | 89.9% | 85.3% | 88.3% | 83.9% |

(First-generation data was not collected until 2012.)

As you can see, our own data suggests a more complicated picture. Although nationally women persist at higher rates than men, our data flipped last year when persistence among men actually eclipsed persistence among women. Our retention rate for multicultural students (our euphemism for non-white students) has trended steadily downward, a fact made more pressing by a steady increase in the number of multicultural students. Although we haven’t tracked first-generation status for more than a few years, this retention rate has also dropped while the number of first generation students has increased. While our retention rate of Pell Grant recipients (those students with the highest need) has increased slightly, the retention rates of students who only qualified for loans has dropped steadily. At the same time, the retention rate of students from the highest socioeconomic status has dropped a bit.

Finally, academic preparation retention rates paint an interesting picture. The national data would suggest that our worst retention rates should be among those students who come from the lowest ACT quartile. At Augustana, those students’ retention rates are also lower and haven’t changed much. By contrast, the retention rates of students from each of the other three quartiles, although they are still higher than the lowest quartile, dropped substantially between 2012 and 2013. Interestingly, the retention rate of the students who applied test-optional has gone up almost 8 points over the past five years.
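For readers who like to poke at numbers themselves, the arithmetic behind a table like the one above is simple: a retention rate is the share of a cohort subgroup that returns for a second year. Here is a minimal sketch in Python; the field layout and the handful of sample records are invented for illustration and are not our actual data model.

```python
# Hypothetical illustration: computing first-to-second-year retention rates
# by subgroup from student-level records. All values below are invented.
from collections import defaultdict

students = [
    # (cohort_year, gender, pell_recipient, retained_to_year_two)
    (2013, "F", True, True),
    (2013, "F", False, False),
    (2013, "M", True, True),
    (2013, "M", False, True),
]

def retention_rate(records, key):
    """Percent retained within each value of `key` (an index into the tuple)."""
    stayed, total = defaultdict(int), defaultdict(int)
    for rec in records:
        total[rec[key]] += 1
        if rec[-1]:                      # last field flags retention
            stayed[rec[key]] += 1
    return {k: round(100 * stayed[k] / total[k], 1) for k in total}

print(retention_rate(students, key=1))   # rates by gender
```

The same function, pointed at a different tuple index, produces every row of the trend table from one student-level file.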

But students’ likelihood of persisting to their second year is not etched in stone before they start college. Another way to look at some of these trends is to examine the characteristics of the students who leave against those traits that might indicate an experience that differs from the mainstream in some important way – especially if we know that this difference in experience might affect the calculus by which the student determines whether it is worth the time, money, and emotional investment to stay. So as you look through this table, remember that these percentages are the proportion of departed students who fit each category.

| Characteristic of departed students | 2009 | 2010 | 2011 | 2012 | 2013 |
|---|---|---|---|---|---|
| Cumulative GPA was below 2.5 | 57.3% | 50.5% | 47.3% | 43.4% | 39.3% |
| Male | 60.0% | 51.6% | 47.3% | 63.6% | 45.8% |
| Female | 40.0% | 48.4% | 52.7% | 36.4% | 54.2% |
| White | 68.0% | 73.1% | 78.2% | 71.7% | 69.2% |
| Multicultural | 9.3% | 21.5% | 16.4% | 28.3% | 30.8% |
| Pell grant recipient | 29.3% | 32.3% | 25.5% | 31.3% | 31.8% |
| Only qualified for loans | 36.0% | 38.7% | 47.3% | 46.5% | 42.1% |
| Did not qualify for need aid | 34.7% | 29.0% | 27.3% | 22.2% | 26.2% |
| First generation | n/a | n/a | n/a | 24.2% | 30.8% |
| ACT <= 22 | 32.0% | 28.0% | 20.9% | 30.3% | 25.2% |
| ACT 23-25 | 21.3% | 23.7% | 28.2% | 23.2% | 26.2% |
| ACT 26-27 | 12.0% | 19.4% | 20.9% | 11.1% | 15.9% |
| ACT >= 28 | 21.3% | 19.4% | 18.2% | 24.2% | 24.3% |

Much of this data corroborates what we saw in the examination of retention rates by pre-college characteristics in the case of gender, race, socioeconomic status, first generation status, and academic preparation. However, one new trend adds some interesting nuance to the impact of the first year experience on retention. If we look at the proportion of departing students who were also in academic difficulty when they left, there is a clear difference of almost 20 percentage points over these five years among those who had less than a 2.5 GPA. In other words, far fewer of our departing students are in a position where their grades might be the primary reason for their departure. This suggests to me that, if they aren’t departing because of grades, then other key elements of the first year experience must be primary contributors to the decision to depart.
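It is worth being careful about the two ways these tables slice the data: the first reports retention rates within each group, while the second reports each group’s share of the students who left. The toy numbers below (invented for illustration) show how a group can make up most of the leavers while still retaining nearly all of its members.

```python
# Invented example contrasting "share of leavers" with "retention rate."
leavers = ["M", "M", "F"]          # departed students by gender (hypothetical)
cohort  = {"M": 50, "F": 50}       # starting cohort sizes (hypothetical)

share_of_leavers = {g: round(100 * leavers.count(g) / len(leavers), 1)
                    for g in cohort}
retention_rate   = {g: round(100 * (cohort[g] - leavers.count(g)) / cohort[g], 1)
                    for g in cohort}

print(share_of_leavers)   # men are two-thirds of the leavers...
print(retention_rate)     # ...yet both groups retain at 96% or better
```

The lesson: a category can look alarming in the second table without its retention rate in the first table moving much, and vice versa.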

We aren’t going to answer the question of what is negatively impacting retention today. Even if we were to pinpoint a significant factor in a particular year, because the nature of each class differs, evidence from one year might be entirely useless the next. My point today is simply to highlight the degree to which all of us impact retention together.

I’d like to think that on some level we know that finger pointing is foolish. Yet in an environment where we are simultaneously immersed in our own silos and entirely dependent on the efforts of others (e.g., faculty don’t have a job if admissions doesn’t recruit anyone), it doesn’t seem all too surprising that such behavior (especially if budgets are under threat) might surface despite the best of intentions. So maybe if you hear someone grouse about retention rates and “rounding up the usual suspects,” you’ll remind them that we are in this together. If we fail to improve, it won’t be because someone didn’t do their job – it will be because we all didn’t pull together.

I think I’ve resigned myself to the fact that it is nearly impossible to explain to someone who hasn’t experienced Augustana’s 10-week terms what it feels like to go from standing still less than a month ago to flying by the seat of your pants at week four. But here it is verging on mid-term time, and I’m hurtling through space trying everything I can to get my bearings!

So, to show that the Institutional Research and Assessment staff (AKA Kimberly and I) doesn’t just sit around dreaming up ways to collect more data, I thought I’d share with you . . . more data. (I guess this doesn’t really debunk my assumptions about your assumptions, does it?)

Every spring, we ask our graduating seniors to complete a survey that asks all sorts of questions about their experiences at Augustana. In addition, we ask a few questions that we’ve found to be useful outcome measures (would you choose Augustana again, is your post-graduate plan a good fit for who you are and where you want your life to go, and do you already have a job or grad school placement).

It takes a lot of work to process this data into a readable report, but it’s finally finished and posted on the IR web page. Here is the direct link to the 2014 Senior Survey results.

Now you could jump on that link right away and start swimming in the data – percentages, averages, and standard deviations (oh, my!). And you might survive the experience, although your eyes will probably start to glaze over as you look at mean score after mean score, and your brain will likely start to go soft wondering (rightly so) what exactly each average score means – is it good? is it bad? is it just right? (sort of like Goldilocks in the house of the three bears . . . replacing the bowls of porridge with Excel spreadsheets, of course).

So if I may, let me make a suggestion that I hope will make some of this data more meaningful. Instead of looking at the numbers first, put a sheet of paper over the numbers and look at the questions first. Reflect on why each question might matter for students and what might be the “about right” distribution of responses. Pick out 3-5 questions that seem particularly interesting to you.

THEN take away the sheet of paper covering the numbers. Do your musings match up with the average score or the distribution of responses? If the difference between your reflections and the actual scores suggests that institutional improvement might be valuable, what more would you like to know to get a better handle on what we could do to improve?

This data isn’t of much use if it doesn’t help us get better at what we do. And you – the people on the ground floor who are working with students every day – are the ones who are ideally suited to tackle this data, jump into this process, and benefit from the results of your efforts.

If you have any questions, comments, suggestions, or criticisms (I prefer to think of it as constructive feedback!) about the senior survey, PLEASE PLEASE PLEASE contact us – Kimberly or me – in the IR office. Nothing we have built is so important that it can’t be changed . . . especially if those changes make the survey better.

All too often we talk about feedback as if it’s something that either happens or doesn’t. Students get feedback or they don’t. Faculty give feedback or they don’t. Moreover, all too often I think it’s easy for people like me to unintentionally imply that if students would just get the right feedback at the right time, they would respond to it like budding valedictorians.

However, the concept we are really talking about is much more complicated than simple information given in response to student work. At its fullest, effective feedback encompasses a recursive sequence of precisely timed instructor actions intertwined with positive student responses that produces a change in both the quality of the student effort AND the quality of the student work. Yet despite our best efforts, we know that we have only partial control over this process (since the student controls how he or she responds to feedback) even as we agonize over our contribution to it. So it doesn’t help when it feels like what we hope for and what we get are two very different things.

In this context it’s no wonder that raising the issue of effective feedback can cut close to the quick. All of us do the work we do because we care about our students. To those who burn the midnight oil to come up with just the right comments for students, suggesting that we could improve the quality of the feedback we provide to students could easily come off as unfair criticism. To those who think that there isn’t much point in extended feedback because students today rarely care, raising the issue of faculty feedback seems like preaching to the (wrong) choir.

I, for one, have not always been precise enough in my own language about the issue of effective feedback. So I ought to start by offering my own sincere mea culpa. The conversations we’ve had on campus over the last month about gathering more comprehensive data on our students’ progress early in the term have helped me think much more carefully about the concept of feedback and the ways that we might approach our exploration of it if we are to get better at what we do. With that in mind, I’d like to share some recent data from our freshmen regarding feedback and suggest that we explore more deeply what it might mean.

For the last several years we’ve asked freshmen to respond to the statement, “I had access to my grades or other feedback early enough in the term to adjust my study habits or seek help as necessary.” The five response options ranged from “strongly disagree” to “strongly agree.”

Two events combined to start our consideration of a question like this. First, changes in federal financial aid law steepened the ramifications for dropping classes, making it critical that students know their status in a course prior to the drop date. In addition, we had been hearing from a number of people who work with struggling students that many of those students hadn’t realized they were struggling until very late in the term. Recognizing the pervasiveness of willful blindness among many of those same struggling students, it took us a while to phrase this question in a way that at least allowed for the difference between students who simply never looked at their grades or other relevant feedback versus students who never received a graded assignment until the second half of the term.

Here is the distribution of responses from last year’s mid-year freshman survey.

“I had access to my grades or other feedback early enough in the term to adjust my study habits or seek additional academic help.”

| Response | Count | Percent |
|---|---|---|
| strongly disagree | 62 | 16% |
| disagree | 111 | 30% |
| neutral | 75 | 20% |
| agree | 104 | 28% |
| strongly agree | 24 | 6% |
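For anyone who wants to recompute or extend a distribution like this (say, by combining response categories), the arithmetic is just counts over the total. A quick sketch in Python, using the counts from the table above:

```python
# Summarizing a Likert-item distribution: percentages are each response
# count over the total number of respondents.
counts = {"strongly disagree": 62, "disagree": 111, "neutral": 75,
          "agree": 104, "strongly agree": 24}

total = sum(counts.values())                          # 376 respondents
pct = {k: round(100 * v / total) for k, v in counts.items()}
agree_side = pct["agree"] + pct["strongly agree"]     # combined "top box"

print(pct)
print(agree_side)   # only about a third of freshmen land on the agree side
```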

What should we take from this? Clearly, this isn’t the distribution of responses that we’d all like to see. At the same time, the meaning of this set of responses isn’t so easily interpreted. So here are some suppositions that I think are probably worth exploring further.

Maybe students are, in fact, regularly ignoring specific improvement-focused feedback that they get from their instructors. Maybe they assume that since the assignment is already graded, any comments from the instructor are not applicable to improving future work. Given the “No Child Left Behind” world in which our students grew up, it seems likely that they would need substantial re-educating on the way that they use feedback if the feedback we provide is specifically designed to guide and improve future work.

On the other hand, maybe students are getting lots of feedback, but it isn’t the kind of feedback that would spur them to recalibrate their level of effort or apply the instructor’s guidance to improve future work. Maybe the feedback they get is largely summative (i.e., little more than a grade with basic descriptive words like “good” and “unclear”) and they aren’t able (for whatever reason) to convert that information into concrete actions that they can take to improve their work.

Maybe students really aren’t getting much feedback at all until the second half of the term. If they are taking courses that are organized around a midterm exam, a final paper, and a final exam, then there would be no substantive feedback to provide early in the term. Given the inclination of some (i.e., many) students to rationalize their behaviors in the absence of hard evidence, this combination of factors could spell disaster.

Finally, maybe students are getting feedback that is so thoroughly developmental in nature that it is difficult for the student to benchmark their effort along a predictive trajectory. In other words, maybe the student knows exactly what they need to do in order to improve a particular paper, but they don’t understand that partial improvement won’t necessarily translate into the grade that they wanted or believed they might receive based on the kindness and empowering nature of the instructor’s comments.

The truth is that all of these scenarios are reasonable and in no way suggest abject failure on the part of the instructor. And it is highly likely that all students experience some combination of these four scenarios through the academic year.

Whatever the reason, our own data suggests that there is something gumming up the works when it comes to creating a fluid and effective feedback loop in which students’ effort and performance are demonstrably influenced by the application of the feedback provided to them.

What should we do next? I’d humbly suggest that we dig deeper. To do that, we need to know more about the kind of feedback students receive, the way that they use or don’t use feedback, the ways that students learn to use feedback more effectively, and the ways that instructors can provide feedback more efficiently. In other words, we need the big picture. Maybe the new mid-term reporting system will help us with that. But even if it doesn’t, we still would do ourselves some good to look more closely at 1) the result that we intend from the feedback we give, and 2) the degree to which the feedback we give aligns with that intent.

If history is any predictor of our potential, I think we are more than capable of tackling the gnarly problem of effective feedback.

When Augustana faculty, staff, and administrators were discussing the possibility of a single building that combined student life offices, a dining hall, multiple academic services, and the Tredway Library under one roof, one of the suggested advantages to such a design was grounded in the potential for proximity and efficiency. The proximity argument asserted that students would take advantage of more opportunities and services because these offices were conveniently located together. The efficiency argument claimed that students who already intended to use many of these services would be able to do so more quickly and easily.

Unfortunately, we will never be able to produce iron-clad proof that the Center for Student Life (CSL) has lived up to its billing. That would require building an identical Augustana College campus on which we did NOT build a CSL connected to a Tredway Library so that we could compare student behaviors under both conditions (how’s that for a fund-raising challenge!). However, now that we have a year of Gävle gatherings and “all-you-care-to-eat” dining under our collective belts, we ought to be able to examine our student data to see if use of the CSL is contributing to student growth and success.

So let’s start with the aforementioned rationales for attaching the CSL to the Tredway Library.

The proximity of academic and student life offices and facilities would collectively boost student use of academic services and involvement in student groups.

The convenience of locating all these facilities and services in one place would help students engage the totality of the Augustana experience more efficiently.

In last year’s freshman surveys, we asked several questions that we can analyze together to test these assertions. The question central to today’s analysis asked, “How often did you study – by yourself or in small groups – in any part of the CSL/Tredway library building?” Although this question focuses on academic pursuits, if the prior assertions hold true, responses to this question ought to correlate positively with increased use of academic resources, increased involvement in student groups, and growth on some important developmental or learning outcome of the freshman year.

It turns out that the Center for Student Life appears to be functioning exactly as we hoped it would. Even after accounting for differences in gender, race, incoming academic ability, and socio-economic status, the frequency of students’ studying in the CSL or Tredway Library predicted a stronger response to the statement “I took advantage of academic support resources (faculty office hours, reading and writing center, tutors, study groups, etc.) when I could benefit from their help.” Likewise, the frequency of studying in the CSL or Tredway Library building predicted stronger agreement to the statement “I am participating in at least one student group/organization that interests me.”

Finally, students’ frequency of studying in the CSL/Tredway Library significantly predicted their agreement with the statement, “During the year I got better at balancing my academics with my out-of-class activities.” By comparison, students’ frequency of studying in their dorm room produced no such relationship.
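For the statistically curious, “after accounting for differences in gender, race, incoming academic ability, and socio-economic status” describes a regression in which those background covariates are entered alongside the predictor of interest. Here is a hedged sketch of that idea on simulated data; every variable name and value is invented, and the actual analysis may well have used a different model (an ordinal regression, for instance).

```python
# Sketch of a covariate-adjusted regression: estimate the effect of study
# frequency on an outcome while holding background variables constant.
# All data here is simulated purely to illustrate the technique.
import numpy as np

rng = np.random.default_rng(0)
n = 200
study_freq = rng.integers(0, 5, n)            # how often studied in CSL/library
act_score  = rng.normal(25, 3, n)             # incoming academic ability proxy
female     = rng.integers(0, 2, n)            # demographic control
# Simulated outcome with a built-in study-frequency effect of 0.4:
outcome = 2.0 + 0.4 * study_freq + 0.05 * act_score + rng.normal(0, 0.5, n)

# Ordinary least squares via the normal equations (lstsq):
X = np.column_stack([np.ones(n), study_freq, act_score, female])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)

print(coef[1])   # estimated study-frequency effect, net of the controls
```

If the coefficient on study frequency stays clearly positive after the controls are added, the relationship is not merely a proxy for who the students were when they arrived, which is the logic behind the finding reported above.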

So what does all of this mean? In short, the Center for Student Life seems to be cultivating the kind of student behavior patterns that improve multiple aspects of their engagement as well as a key aspect of their development. The more time students spend studying in the CSL/Tredway Library, the more likely they are to use the academic resources they need when they need them, find and join student groups that fit their interests, and improve their ability to balance all the in-class and out-of-class elements of the Augustana experience that we believe are important for learning. These findings suggest that we ought to take a hard look at students’ propensity to study in their dorm rooms (75% of last year’s freshmen spent at least half of their study time in their dorm rooms) and the ways that we guide them to make more effective use of space and place.

Moreover, this is the kind of guidance that students need to hear over and over. In many cases our students are coming from life experiences where they didn’t leave their homes – or even their rooms – to study. In addition, they may not know much about the importance of establishing effective behavior patterns or conditions most conducive to learning. We know that a fundamental difference between high school and college success lies in the shift to a more assertive approach to learning, and the idea that one would find a distinct location to study is a longstanding example of such an approach to college.

Based on our freshman data, the benefits of the CSL seem pretty clear. This isn’t to say that the CSL is perfect or that there aren’t other things that we could do to improve the building or the way that we use it. But in terms of its effect on increasing the quality of our students’ experience and helping Augustana meet its educational mission, the CSL seems to be off to a good start.

But the building isn’t so effective that it will magically suck students into some sort of learning vortex. If they don’t use it, then it’s of little use. So I hope that you will strongly encourage your students to put themselves in a position to reap benefits of the CSL and the Tredway Library.

It’s great to feel the energy on campus again. And it’s exciting to restart my Delicious Ambiguity blog: Season 4 (I’ve been renewed!). Each week I share some tidbit (data that comes from statistics or focus groups) from our own Augustana student data that will help you do what you do just a little bit better the next time you do it. If you’re new to Delicious Ambiguity, you might also want to know that you can search the three years of previous posts (about 100 in all) for everything from athletes to introverts, Greeks to geeks. In addition to a ton of useful findings, you might even find a few funny quips (AKA bewildering side comments).

By now you’ve probably heard me say on at least one occasion that building assessment efforts around genuine improvement, as opposed to doing assessment to find out what’s already happened (i.e., to prove what you think you are already doing), thoroughly changes every part of the assessment process. More importantly, it’s the only way to actually get better because:

You’ve backward-designed the entire project around finding out what you need to do to get better instead of just finding out what happened, and

You’ve humbled yourself to the possibility of improvement and thereby matched your efforts with the way that educational processes actually work.

I’d like to share an example of one program at Augustana that has clearly benefited from an “assessment for improvement” approach. My goal here isn’t to brag, but rather to walk you through an example of such a process in the hope that something I share might be applicable to your own unique context.

Augustana has run some version of freshman orientation for a very long time. And by and large, it’s been a pretty successful program. Yet everyone involved has always wondered what they might do to make it just a little bit better. Much of our prior data only told us the proportion of students who were “satisfied” with the experience. Although we could pat ourselves on the back when the numbers were decent (which they were virtually every year), we had no way of turning that information into specific changes that we could trust would actually make the experience demonstrably more effective.

So a few years ago, folks from Student Affairs, Academic Affairs, and the IR office began applying an improvement-centered assessment approach to orientation. First, we talked at length about drilling down to the core learning goals of freshman orientation. Sure, we have lots of things we’d love to result from orientation, but realistically there are only so many things you can do in four days for a group of 18-year-olds who are some combination of giddy, overwhelmed, and panicked – none of which makes for particularly fertile learning conditions.

So with that in mind, we needed to strip our goals down to a “triage” set of learning goals for orientation. We settled on three concepts.

Welcome Week will connect new students with the people necessary for success.

Welcome Week will connect new students with the places necessary for success.

Welcome Week will connect new students with the values necessary for success.

The people we identified included all of the individuals who might influence a student’s first year experience – other students, student affairs and residential life staff, and specific faculty members. The concept of place involved a) knowing EXACTLY how to walk to one’s classes and specific first-year resources, and b) finding other places on campus that a student might use for emotional rejuvenation as well as intellectual work. The values we discussed focused on clarifying a strategy for getting the most out of a liberal arts college setting. This meant introducing students to a mindset that focuses on actively participating in a process of learning and growth, and showing them how this approach will increase their likelihood of success, both in the first year and beyond.

Once we spelled out our goals for Welcome Week, then we could set about our work from two directions. First, we could start to alter the design of the experience to meet those goals. Second, we could build a survey that examined the degree to which students came away from Welcome Week connected to the people, places, and values that substantially increase the likelihood of success in the first year.

Over the last two years the survey findings have provided a number of interesting insights into the degree to which certain experiences were already meeting the goals we had set. More importantly, the survey data has become a critical conversation guide for specific improvements. Because the questions were built around specific experiences, it has given everyone – particularly peer mentors – a clear target to shoot for with each student. For example, if the goal was to ensure that each student would say that they knew exactly how to get to their classes on the first day, then the peer mentor could shift from merely pointing at buildings while walking around campus to creating some way for new freshmen to walk right up to the door of the room where their class would be.

At the same time that we were using data to guide specific adjustments, the folks planning Welcome Week examined the design of the entire program. This led them to introduce several changes, including the Saturday morning concurrent sessions titled “Augie 101,” focusing on all kinds of issues that would specifically increase the likelihood of successful academic acclimation.

We will survey the freshmen in the next week or so to find out how the most recent set of changes impacted their experience during Welcome Week. But even without that data, I suspect that the new programming this year improved the experience. My confidence comes from one particularly compelling data point that isn’t a number (I know – sit down and take a deep breath!). During the Augie 101 sessions, peer mentors and other older students who were assisting with Welcome Week kept saying, “I wish we would have had something like this during my Welcome Week experience.” To me, that is a powerful endorsement of our efforts.

We will likely never be perfect, but we have mounting evidence that we keep getting better at what we do. That doesn’t mean that we have any reason to brag or rest on our laurels. It just means that we are doing things right. And that’s what makes doing this work so much fun.

As we claw our way toward the finish line at the end of another spring term, it isn’t hard to look around and see proof of our passion for our students’ development. But one disadvantage of working in the unusually autonomous environment of a small college is that we don’t often get the chance to step back and enjoy the totality of our collective efforts. So in my last post of the 2013-14 academic year, I hope this data I share below will give you a chance to revel in our success and take some real pride in what we have accomplished together.

A few weeks ago, Gallup released the summary report of its first large-scale study of college graduates (they hope to make this an annual study). The project, titled the Gallup-Purdue Index, explored the relationship between undergraduate experiences and the nature of college graduates’ engagement at work and overall well-being. You can read some of the reviews of these findings in The Chronicle and Inside Higher Ed (or the actual report) here. Essentially, after surveying over 30,000 individuals across the country, Gallup found what we have known for a very long time: the quality of student-faculty interaction is fundamentally important to a college graduate’s long-term quality of life.

Interestingly, the various questions on which the Gallup findings are based look awfully familiar. That is because we’ve been asking many of the same questions for years now, and using that data to inform our work and our perpetual effort to improve. So I thought it would be nice to take a moment to step back, compare the responses from our students to those questions with the responses from the Gallup study participants, and smile.

Below I list each of the Gallup data points followed by a couple of similar Augustana data points.

Gallup-Purdue Index

I had at least one professor at [College] who made me excited about learning.

63% strongly agree

Augustana Senior Survey

My one-on-one interactions with faculty have had a positive influence on my intellectual growth and interest in ideas.

54% strongly agree + 37% agree

I really worked hard to meet my instructors’ expectations.

45% very often + 39% often

Gallup-Purdue Index

My professors at [College] cared about me as a person.

27% strongly agree

Augustana Senior Survey

Faculty in my major cared about my development as a whole person.

52% strongly agree + 34% agree

My major advisor genuinely seemed to care about my development as a whole person.

50% strongly agree + 28% agree

The faculty with whom I have had contact were interested in helping students grow in more than just academic areas.

41% strongly agree + 48% agree

Gallup-Purdue Index

I had a mentor who encouraged me to pursue my goals and dreams.

22% strongly agree

Augustana Senior Survey

Faculty in my major knew how to help me prepare to achieve my post-graduate plans.

37% strongly agree + 38% agree

How often did your major advisor ask you about your career goals and aspirations?

84% very often, often, or sometimes

I am certain that my post-graduate plans are a good fit for who I am right now and where I want my life to go.

During my senior inquiry project I learned a lot about myself (work habits, handling setbacks, managing a larger project, etc.) in addition to the topic of my paper/project.

42% strongly agree + 38% agree

Gallup-Purdue Index

I had an internship or job that allowed me to apply what I was learning in the classroom.

29% strongly agree

Augustana Senior Survey

60% of seniors participated in an internship

My out-of-class experiences have helped me connect what I learned in the classroom with real-life events.

22% strongly agree + 54% agree

65% of seniors work on campus

Gallup-Purdue Index

I was extremely active in extracurricular activities and organizations while attending [College].

20% strongly agree

Augustana Senior Survey

How many student groups or clubs did you find that fit your interests?

32% many + 48% some

Gallup-Purdue Index

I feel emotionally attached to [College].

18% strongly agree

Augustana Senior Survey

I felt a strong sense of belonging on campus.

24% strongly agree + 43% agree

If you could relive your college decision, would you choose Augustana again?

40% definitely yes + 33% probably yes

While the senior survey data we highlighted in this table came from the 2013 senior class, these responses aren’t any different from the 2012 class or the class that will graduate this weekend. So take a moment, even in the midst of all the last-minute craziness and stress that comes with finishing out the last term of an academic year, and pat a colleague on the back today. Because we do this together. And we have every right to be proud of ourselves, each other, and the work that we do.

This year we tried something new with our freshman survey. Instead of administering the entire survey in the spring term, we split it into two pieces: one part administered in the middle of the year and one at the end, with the mid-year survey concentrating on academic and social acclimation and the end-of-year survey focusing on learning and development. This allowed us to get much better data from struggling students, who often are no longer enrolled in the spring. It also allowed us to link the conceptual emphases of each part of the survey with what students were more likely to be experiencing at the time they took it.

Over the last couple of months I’ve shared some of the findings from the mid-year survey – findings that can help us improve our support of students’ acclimation to college life. In this post I’d like to share some early learning and development findings from the end-of-the-first-year survey. The two items below are among the new items we added as potential outcomes of the first year. You’ll notice in the phrasing of the questions that we approached these outcomes developmentally. In other words, we don’t conceive of freshman-year outcomes as absolute thresholds that have to be met at the end of the first year. Instead, we think of the first year as part of a larger process in which students move at different speeds and at different times. In the end, we are trying to get a sense of the degree to which freshmen believe they have made progress toward one skill and one disposition that undergird a successful college experience.

During the year I got better at balancing my academics with my out-of-class activities.

Strongly disagree: 8 (5%)
Disagree: 10 (6%)
Neutral: 35 (21%)
Agree: 84 (51%)
Strongly agree: 29 (17%)

Over the past academic year, I have developed a better sense of who I am and where I want my life to go.

Strongly disagree: 2 (1%)
Disagree: 15 (9%)
Neutral: 41 (25%)
Agree: 71 (43%)
Strongly agree: 37 (22%)
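As a quick sanity check on the two items above, the raw counts for each item sum to 166 respondents, and the reported percentages are simply each count’s rounded share of that total. A minimal sketch using the second item’s counts:

```python
# Counts from the second survey item above; each item sums to 166 respondents.
counts = {
    "Strongly disagree": 2,
    "Disagree": 15,
    "Neutral": 41,
    "Agree": 71,
    "Strongly agree": 37,
}

total = sum(counts.values())  # 166
percentages = {label: round(100 * n / total) for label, n in counts.items()}
print(total, percentages)
```

Running the same computation on the first item’s counts (8, 10, 35, 84, 29) reproduces its reported percentages as well.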

In both cases, there appears to be reason to smile and reason to frown. On one hand, about two-thirds of the freshman respondents agree or strongly agree with these statements. While one could quibble about whether some of these students were already fully capable of balancing their academic and out-of-class responsibilities, or already had a strong sense of self, direction, and purpose, I think it is fair to suggest that students are more likely responding to the phrasing about their own perceptions of personal growth.

On the other hand, about a third of our students responded neutral, disagree, or strongly disagree to these questions. While there are probably more than a few potential explanations that are outside of our control, I suspect that future analysis, once all of our data is cleaned and recoded for analytic purposes (that is fancy talk for “turned into numbers that statistical geeks like to use to play in the statistics sandbox”), will help us understand some of the experiences that positively and negatively predict students’ responses to these items.

Then comes the really hard work. What do we do with that knowledge? I hope we will do what we are learning to do more often: change our practices and/or policies to improve our students’ development and learning. So stay tuned as we unpack all of our new data. And PLEASE PLEASE PLEASE, if you have any freshmen in your classes, implore them to complete the freshman survey that has been emailed to them several times over the last few weeks.

This upcoming week includes the annual spring Board of Trustees meetings, so every one of us up in Founders Hall is scurrying back and forth putting together data reports and prepping for two days of meetings. And since it’s week nine of the spring term, I know that most of you are in the same high-push mode just to get through to the end of the month.

So even though I don’t have as much time to dedicate to this week’s post as I’d like, I thought I’d show you a couple of numbers that might surprise a few folks. It turns out that we – faculty, staff, and administrators – aren’t the only ones who are working our tails off.

In the last couple of years we’ve talked about the potential developmental benefits of student employment on campus. Yet even I was surprised by the proportion of our seniors who work a job, either on-campus or off-campus, during their senior year.

It turns out that 86% of our 2012-13 seniors worked a job for pay, either on-campus or off-campus, during their senior year. And with most of this year’s seniors’ responses recorded (514), 89% of our 2013-14 seniors worked either on-campus or off-campus. I suspect that these numbers are higher than what most faculty and staff would have guessed.

In both cases, most of those students work 10 hours or less per week. However, there is a substantial proportion of our students who are working 20 hours or more each week.

Of course there are trade-offs within this reality. Our students may be learning many valuable skills through their work experience. However, this obligation takes time away from their ability to be involved on campus or put additional time into their academic pursuits.

Either way, unless Augustana wins a couple of state lotteries and puts it all into a fund for financial aid, this is going to be a reality for most of our students for the foreseeable future. We can either grouse about it or find ways to take advantage of it for the educational benefit of our students.

Over the past two years we’ve been asking seniors to give us a ballpark comparison of their participation rates in on-campus events during their junior and senior years. We inserted this question into our senior survey for a couple of reasons. First, we thought it would be useful to get a sense of whether our seniors maintained a similar level of campus engagement once they moved off campus. Since we describe ourselves as a four-year residential liberal arts college, it seemed appropriate to ask whether our seniors’ participation patterns met the spirit of such a claim even if, technically, the reality is not quite so. Second, given the possibility that living off campus might set circumstances in motion that decrease campus participation among seniors, we thought it would be useful to know if any particular experiences increased the likelihood that seniors continued to stay involved on campus despite living elsewhere.

Even though surveying this year’s seniors isn’t finished yet, the response to this question in each of the last two years suggests a clear change in campus engagement between the junior and senior year. Here’s the distribution of responses for both classes of seniors.

“How often did you participate in on-campus events during your senior year?”

2012-13 Seniors

4% – More than when I lived on campus

54% – About the same as when I lived on campus

41% – Less than when I lived on campus

2013-14 Seniors (with about 80% of the seniors’ responses submitted so far)

4% – More than when I lived on campus

46% – About the same as when I lived on campus

49% – Less than when I lived on campus

Of course, there are a variety of opinions on whether this is a good thing or a bad thing. There may be some value in seniors stretching their legs and starting the transition to independent life after college while they are still seniors. Similarly, there are more than a few opinions about whether we should try to build more residences on campus and require seniors to live in them. But as long as seniors continue to move off campus for their last year at Augustana, it seems to me that the question we ought to ask (assuming that we would like our seniors involved in on-campus events during their senior year – a foregone conclusion, I hope) is this: what do we know about the factors that predict more campus involvement among our seniors? And how can we ensure these factors are equally experienced across our entire student community?

With the data we now have at our disposal, we can begin to peel back this onion a little bit. The guiding question for this analysis ultimately turned on whether or not the obligations of participation in formally organized activities (sports teams, music ensembles, student groups, etc.) explained all of the difference between seniors who stayed involved on campus and those who did not, or if there were other informal experiences that influenced on-campus participation above and beyond those obligations.

It turns out that the degree to which seniors said that they felt a strong sense of belonging on campus correlated significantly (in a statistical sense) with participation in on-campus events compared to the junior year, even after taking into account membership in athletics, music, Greek groups, or student clubs.
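The “taking into account” step described above is essentially a regression that includes both the belonging measure and the formal-membership indicators as predictors, then asks whether belonging still carries a significant coefficient. The sketch below illustrates that control-variable logic on invented toy data; every variable name and number here is hypothetical, not the actual survey analysis:

```python
import numpy as np

# Toy illustration of controlling for formal memberships: does a "sense of
# belonging" measure still predict participation once membership indicators
# are in the model? All data below is randomly generated for illustration.
rng = np.random.default_rng(0)
n = 400

# Binary indicators for four formal memberships (athletics, music, Greek, clubs)
memberships = rng.integers(0, 2, size=(n, 4))

# 1-5 Likert "sense of belonging" response
belonging = rng.integers(1, 6, size=n)

# Simulated outcome: participation relative to junior year, constructed so
# that belonging matters above and beyond Greek membership (column 2)
participation = 0.4 * belonging + 0.3 * memberships[:, 2] + rng.normal(0, 1, n)

# Ordinary least squares with an intercept:
# participation ~ belonging + athletics + music + greek + clubs
X = np.column_stack([np.ones(n), belonging, memberships])
coefs, *_ = np.linalg.lstsq(X, participation, rcond=None)

# coefs[1] is the belonging coefficient after the membership controls
print("belonging coefficient:", round(coefs[1], 2))
```

In the real analysis one would also want to respect the ordinal, three-category participation response rather than treating it as continuous; OLS here is just the simplest way to show why belonging can remain predictive after the membership controls are included.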

At first glance you might argue that this is self-evident. And I wouldn’t argue with you one bit. However, I’d add that, in the context of the way in which we currently organize our students’ college experience, this finding makes even more clear the importance of helping students feel like they belong on our campus. We already know that this sense of belonging varies across student types and groups. For example, students in the Greek system on average feel a significantly stronger sense of belonging than non-Greek students. Similarly, some of our data suggests that students in some of the smaller STEM majors also feel a lower sense of belonging on campus. Based on these variations, once seniors move off campus, it is reasonable to suggest that the culture of our campus might be shaped in part by the type of seniors who choose to stay involved in campus events during their last year. This, in turn, could perpetuate a similar variable sense of belonging across student types, and make it more likely that we cultivate a student culture that privileges some types of students more than others.

I’m not saying that we are desperately off-kilter or need some sort of radical readjustment in our student culture. I’m only hoping to point out that a feeling of belonging is more than just an abstract feeling. It has real consequences in student behaviors that in turn produce a demonstrable student culture with identifiable characteristics. Finally, this finding also means that we shouldn’t consider ourselves powerless to change it.

As you probably know by now, the new Augustana 2020 strategic plan places our graduates’ success after college at the center of our institutional mission. In real terms, this means that what our students learn in college matters to the degree that it contributes to their success after college. Put another way, even if our students learn all kinds of interesting knowledge and complicated skills, if what they have learned can’t be effectively and meaningfully applied to life after college, then we haven’t really done our job.

Now whether you think that this is the last nail in the liberal arts coffin or the long-awaited defibrillator to revive liberal arts education, our own success hinges on something else that I’m pretty sure we haven’t thought much about. Exactly what are we talking about when we talk about a successful life after college? Do we have a working definition of what might make up a successful life for an Augustana graduate? In order to grapple with those vertigo-inducing questions, we have to know a lot more about what happens to our graduates after college. But do we have anything more than vague notions about our own graduates’ lives?

I’m afraid that the answers to those questions are probably no, no, and no. In part, it’s because these are big, hard questions. And to be fair, I don’t know of a college that has tried to get a real handle on these ideas. So . . . . here we go . . . .

This is the kind of research project that can keep you up at night. Because it isn’t just about getting data to figure out the relationship between one thing (an Augustana college experience) and another thing (a successful life after college). For starters, these are two monstrously complicated constructs. Distilling them down to some essential qualities may well be impossible. I’m not saying that it’s NOT possible; I’m just admitting that I’m intimidated by the very idea of trying to identify a set of valid essential qualities. And as if that weren’t enough, we (higher education researchers writ large) have yet to develop a conceptual framework complex enough to account for the almost infinite range of ways in which people’s lives evolve. To date, every effort to link alumni success to their college experience has presumed a straight line – even when we know that very few of us traveled a straight path to get to where we are now.

So over the past six months or so, Kimberly and I have built a multi-stage study in an attempt to get at some of these questions. We settled on calling it “The Winding Path Study” (all credit to Kimberly for the title) and we have organized it around two initial stages, with room for additional exploration. First, we had to find a conceptual framework that fit the way people actually live their lives. We found one that I think works; it comes out of sociology and anthropology and is called the Life Course Perspective. Essentially, this framework describes lives as amazingly complex and almost infinitely unique, yet full of three common elements: trajectories, transitions, and turning points. While Life Course scholars have detailed definitions for each of these terms that I won’t try to summarize here, I think we all know what these terms mean because we can likely point to moments in our own lives where the impact of these concepts became clear.

Next, we built a survey (but of course!) to try to get a better sense of the range of trajectories, transitions, and turning points that our graduates have experienced. I hoped that we might get 1,000 responses. From these respondents, I hoped that we might find 100 who were willing to participate in a 30-minute interview.

Well, apparently we struck a chord. We got 1,000 responses from Augustana alumni in the first 12 hours of the survey, and finished with 2,792. In addition, over 1,200 respondents said that they would be willing to participate in a 30-minute interview.

I’ll share more about this project in the next several months as we pore over the data. One thing that jumped out at me as I began to watch the data coming in was the extent to which people were willing to tell us surprisingly personal details about their lives. Our respondents wrote and wrote and wrote. We now have a treasure trove of data that we have to read through and organize. At the end of this project, however, we will likely have a much greater understanding of the range of life courses that our alums have taken. Better yet, we hope to find some patterns that will help us think about the way that we guide our students during college.

The goals of the Augustana 2020 strategic plan are lofty and complicated. I’m not sure we even realized how challenging this plan would be when the Board approved it in the winter or when we designed it last fall. But now that we’ve started to roll up our sleeves, I think we already have information on our graduates that most colleges could only wish that they had. Now comes the fun part!