During our team’s work on the Storytelling for Social Change course with University of Michigan (U-M) faculty member Anita Gonzalez, we recognized the need for a tool that would let learners share their text- or image-based course work with one another easily and openly, and receive robust feedback on it from their peers. Because of the nature of our learners’ work in the course, and because of the expectations and experiences we wanted them to have, we sought an alternative to the course platform’s peer-review tool: one where work and feedback could both be shared more freely, and in a way that prioritized high-quality interactions (especially dialogue) over numerical scores and one-way assessment. Ultimately, we built a Gallery that facilitates this sort of learner interaction and empowers learners to share a work, or multiple works, without fear that criticism of their work or the particular “rules” of a peer-review tool would impede their successful progress in the course.

The Gallery Tool is being used not only in the Storytelling for Social Change course, but also in our Python Basics course, which introduces the basics of Python 3. For Python Basics, we wanted learners to have an opportunity to practice their Turtle programming skills and to submit their work for peer feedback. We wanted a lightweight option: something that would allow learners to share their work in a “low-stakes” environment, without the formality and restrictions of peer-graded assignments. The Gallery Tool let us create a forum where learners upload their drawing(s) and create prompts that ask their peers for specific feedback about their drawing. We set the tool up to allow learners to filter by type of drawing, such as abstract, animal, building, logo, and nature.

We are already seeing a tremendous range of subject matter in the Gallery, including spider webs, pyramids, U-M logos, nature scenes, and many, many abstract drawings. Learners are asking for feedback on topics such as how to create color effects, how to create specific shapes, and areas for improvement. Interestingly, learners are also asking questions of other learners that relate to skills they have demonstrated in their drawings, such as “How do you fill a shape?”

Figure 1: Two Turtle drawings, published in the Python Basics course using the Gallery Tool
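For readers curious about that “How do you fill a shape?” question, here is a minimal sketch using Python’s built-in turtle module. The function name, colors, and sizes are our own illustrative choices, not code from the course; the helper takes a turtle-like object as a parameter so it can be exercised without opening a graphics window.

```python
def draw_filled_square(t, size=100, color="gold"):
    """Draw a square with side length `size`, filled with `color`."""
    t.fillcolor(color)
    t.begin_fill()       # everything drawn until end_fill() becomes the fill path
    for _ in range(4):
        t.forward(size)
        t.left(90)
    t.end_fill()         # close the path and apply the fill color

if __name__ == "__main__":
    # turtle is imported here so the helper above can run without a display.
    import turtle
    screen = turtle.Screen()
    pen = turtle.Turtle()
    draw_filled_square(pen, size=120, color="blue")
    screen.mainloop()
```

The key idea is the `begin_fill()` / `end_fill()` pair: any shape the turtle traces between those two calls is filled with the current fill color.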

What does the Gallery Tool do?

Figure 2: The Upload Submission screen learners see before submitting a piece for peer feedback.

Learners can upload a text- or image-based artifact, or a link to an artifact in another medium, to the Gallery, where they can also provide a synopsis of the artifact and pose questions to potential reviewers. In turn, they can browse other learners’ work and provide feedback on it, guided by the very questions their colleagues have posed alongside each piece. The Gallery is very much a place of reciprocity: it thrives on learners contributing meaningful thoughts and reactions and receiving them from others.

A collaboration across Academic Innovation teams

The creation of this tool was very much a joint effort across Academic Innovation teams: the Online Tools team, which did the heavy lifting of designing and building the tool; the Design Management team; and the Learning Experience Design team. Our former colleague Steve Welsh provided a lot of early guidance on the tool’s design from a learning-experience perspective, and Anita Gonzalez contributed helpful ideas about its purpose and execution, as well as a thoughtful early critique of a prototype. Members of all of these teams met regularly to assess the Gallery’s features, design, and likely efficacy in the context of our courses.

One personally eye-opening aspect of the development process was balancing two goals: designing a tool robust enough to be truly effective in Storytelling for Social Change, the course that prompted its creation in the MOOC context, while keeping it easily adaptable to other courses and contexts from both the pedagogical and programming perspectives.

What’s next for the Gallery Tool?

We see a lot of potential for this tool in future projects. Essentially, the Gallery is a forum for learners to participate in a “show and tell” of their work, because it allows them to share creative artifacts and receive feedback from peers. Some courses ask learners to complete a final project; the Gallery Tool would be a great place for learners to share sketches and drafts, and to receive responses to questions about their work, before they submit the project for summative evaluation. Learners can also browse previous examples before beginning work on a challenging project. Some of our courses are hosted on two platforms simultaneously; since the Gallery Tool works through Learning Tools Interoperability (LTI) integration, it could serve as a bridge between both versions of a course, so that learners on Coursera could share work and interact with learners on edX, and vice versa. In sum, Learning Experience Designers and others at Academic Innovation are excited by the flexibility the tool affords, and are eager to use it in situations where learners would benefit from the opportunity to share and showcase early work with a receptive and constructive audience.

Since 2012, the University of Michigan has scaled access to rich learning experiences through massive open online courses, course series, and Teach-Outs in more than 190 countries around the world. This global reach expands upon the university’s public purpose while also providing personalized pathways for lifelong learning. Through partnerships within, and outside of, the U-M community, the Office of Academic Innovation is building upon this diverse library of online learning experiences and is enriching the learning process for residential students at U-M (and beyond) through an expanding portfolio of digital educational technology tools.

Infographic: Innovation Impact at the University of Michigan and Beyond (by Trevor Parnell, December 6, 2018)

Personalization is a popular concept in and outside of higher education, yet definitions vary, sometimes widely, about what it means to “personalize” educational experiences for students. ECoach, a tailored communication system, uses personalization backed by well-researched behavioral science, smart user experience design, and ongoing software development to help students succeed in large courses. Professor Tim McKay, Arthur F. Thurnau Professor of Physics, Astronomy, and Education and founder of ECoach, explains what it was like for him to grapple with meaningfully and successfully reaching hundreds of students in his large introductory physics course. Listen as Professor McKay describes the “a-ha” moments that motivated him to create a digital solution that provides the right information, at the right time, and in the right way to his students. Hear Professor McKay examine how ECoach has evolved over time, and what the future may look like for ECoach and for thoughtful, student-centered, technology-driven personalization in higher education.

ECoach is a digital platform originally developed by a research team led by Professor Timothy McKay, Arthur F. Thurnau Professor of Physics, Astronomy, and Education, to create a tailored communication system for large introductory courses at the University of Michigan. Currently implemented in courses such as statistics, chemistry, economics, and biology, ECoach provides personalized and timely feedback to students on how to succeed. ECoach content is informed by behavioral science techniques, such as motivational affirmation, and by multiple data streams, including input submitted by students themselves. This digital tool helps learners navigate big (and sometimes overwhelming) classes by providing tailored communication, insights into their progress, and ways to approach common obstacles. By making information more transparent and personalized to each student, the hope is to increase student motivation and engagement with the course. In the past few years, this electronic personal coaching platform has grown immensely, and its use continues to expand.

Infographic: Growth and Adoption of ECoach Across the University of Michigan (by Eric Joyce, August 10, 2018)

At the Office of Academic Innovation, we improve our digital tools through feedback from students and other users. As a former Innovation Advocacy Fellow at Academic Innovation, my work focused on helping to initiate innovative forms of usability testing. In this blog post, I will talk about one form of usability testing we have conducted in the past and how it is a valuable means of collecting feedback to inform iterative improvements to our digital tools.

What are “Pop-up” tests and what advantages do they provide?

Figure 1: “Pop-up” test on north campus.

“Pop-up” tests are an experimental form of usability testing that I worked on from its initial stages during my time with Academic Innovation. Unlike traditional formats, such as one-on-one interviews and focus groups, “pop-up” tests free us from the constraints of small, enclosed meeting spaces and a traditional Q&A format. Instead, they allow researchers to interact with students during their daily routines, encouraging more interaction between participants and interviewers. Advantages of this approach include gathering quick feedback from a larger and wider student body in a short period of time and making more students and faculty aware of digital tools developed by Academic Innovation. Through these tests we also realized that feedback-gathering activities need not be confined to rigid interviews: because of the flexibility of the “pop-up” environment, participants can shift from passive to active roles, and their responses and reactions can even change the direction of the activity. With that in mind, we came up with a hands-on activity for a “pop-up” test researching the course-page layout of our data visualization tool, Academic Reporting Tools 2.0 (ART 2.0).

Using “pop-up” tests to inform layouts that make the most sense for students

ART 2.0 helps students, faculty, and staff make more informed decisions by providing access to, and analysis of, U-M course and academic program data. By allowing students and faculty to explore data on courses and majors from past academic terms, ART 2.0 supports data-driven decision making and opens new opportunities at U-M. With this tool, students can see which majors other students like them pursue and which courses they might consider taking the following semester. Many students report that they like to use it alongside Wolverine Access when backpacking courses.

Figure 2: ART 2.0 Course Page.

Although ART 2.0 is already an established website (see Figure 2), we still wanted to learn what layout of information is optimal for student users. I proposed an alternative, hands-on activity to engage student participants instead of a traditional Q&A format for gathering user feedback. To accomplish this, we took the website and created a board with the information displayed on the page separated into small components. We put Velcro on the back of these components so students could combine and move the pieces around until they reached the kind of layout that made the most sense to them (see Figure 3). This hands-on activity makes it easier to assess intrinsic factors, like curiosity, instead of only extrinsic factors, such as treats or rewards, in participants’ decision-making process. It is also a “free of fail” activity: unlike a Q&A format, where participants may be embarrassed by not knowing the correct answer to a question, here different people simply have different preferences.

As we expected, no two of the 30 boards we collected were identical. Some students preferred a more concise layout, and others proposed combining similar groups of information, for example pre-enrollment, co-enrollment, and post-enrollment for a particular class. From there, we assigned different scores to different areas of the board (upper, middle, lower). Components placed in the upper section received three points, the middle section two points, the lower section one point, and all others zero points. With this strategy, and our experience interacting with participants, we were able to identify some general patterns:

The top three factors students take into consideration when deciding on a course are grade distribution, instructor reviews, and student evaluations.

Graduate students pay less attention to school, major, enrollment trends, and grade distribution because they have fewer instructors to choose from.

Different schools/colleges also have their own ways of collecting course evaluations, and students wish to see more information that is tailored to their own school/college.
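The scoring scheme described above (three points for the upper section, two for the middle, one for the lower, zero otherwise) can be sketched in a few lines of Python. The component names and board placements below are hypothetical illustrations, not the actual study data:

```python
# Points awarded by board section, as described in the text.
SECTION_POINTS = {"upper": 3, "middle": 2, "lower": 1}

def score_placements(placements):
    """Sum each component's points across all participant boards.

    `placements` is a list of dicts mapping component name -> board section.
    Sections not listed in SECTION_POINTS (e.g. off the board) score zero.
    """
    totals = {}
    for board in placements:
        for component, section in board.items():
            totals[component] = totals.get(component, 0) + SECTION_POINTS.get(section, 0)
    return totals

# Two hypothetical participant boards:
boards = [
    {"grade distribution": "upper", "instructor reviews": "upper", "enrollment trends": "lower"},
    {"grade distribution": "upper", "instructor reviews": "middle", "enrollment trends": "off-board"},
]
print(score_placements(boards))
# → {'grade distribution': 6, 'instructor reviews': 5, 'enrollment trends': 1}
```

Ranking components by these totals surfaces the general placement patterns across all collected boards.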

During this first round of hands-on, “pop-up” usability testing, we were able to gather valuable feedback while identifying a process that we can keep improving upon. We are confident in the advantages of a substantial user pool and in feedback collected locally from U-M students. Through this process, we hope Academic Innovation will keep creating and improving tools that best serve students.

What happens when an English faculty and a Chemistry faculty partner to create a writing-to-learn program? You get M-Write.

Listen to the latest episode in the Origin Stories podcast as Anne Ruggles Gere, Arthur F. Thurnau Professor and Gertrude Buck Collegiate Professor of Education and English Language and Literature, Director of the Sweetland Writing Center, and President of the Modern Language Association, and Ginger Schultz, Assistant Professor of Chemistry, discuss how they came together, from disparate fields, to create the M-Write program. Hear how M-Write uses pedagogy and purpose-built software tools to help students use writing exercises to learn science, economics, and engineering concepts in large STEM courses. Professors Gere and Schultz talk to us about how they partnered with the Office of Academic Innovation to help scale M-Write, and explore their long-term plans for the program.