Tools and Systems: Thoughts on the Future of Canvas

I have been informed by the Community Managers that my opinions here have crossed a line that violates the Community Guidelines. It was not my intent to be rude or disrespectful, and I understand that when you blog in other people's spaces, you follow their Guidelines. Since someone has been offended (I am not sure who), I would like to apologize to that person; it was not my intent to offend anyone but only to put my opinions out there. I will do that in another space in order not to violate the Community sense of how things should be.

In a back-and-forth with some people at Twitter yesterday, an idea started to take shape that I wanted to share here: the difference between tools and systems. I am a fan of tools, but I am not a fan of systems...

A TOOL: something you use to do things; you are the user, and you decide how you want to use the tool and for what purpose

A SYSTEM: you are part of the system, and the system does things; the work you do is part of how the system works

So, Canvas is an LMS, a learning management system as the name proclaims, but I wish it were more like a tool.

Here are just two examples that are on my mind as I think about this distinction of tool versus system; if I have time later this week, I will write some more about this.

GRADEBOOK LABELS. The Canvas Gradebook looks like a spreadsheet, but it is not. A spreadsheet is a tool that I can use for all kinds of purposes (and I do! spreadsheets rule my world!), but the Canvas Gradebook is a system imposed on my class by Canvas, and I cannot configure it except in small and trivial ways.

Because the Missing labels are completely inappropriate for my classes, I used James Jones's script to remove them as soon as enrollment was concluded (I could not do that sooner because James's script cannot alter the Gradebook system but instead only the records in it: item by item, student by student).
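To make the "item by item, student by student" constraint concrete, here is a hypothetical sketch (not James Jones's actual script) of what clearing labels through the Canvas REST API looks like. There is no bulk switch: each label change is one PUT request per assignment per student. The host name, token, and IDs are placeholders; the `submission[late_policy_status]` parameter is the Canvas API field that drives the Late/Missing labels.

```python
# Hypothetical sketch of clearing Late/Missing labels via the Canvas
# REST API, one submission at a time. BASE_URL, TOKEN, and all IDs
# are placeholders you would supply for your own institution.
import urllib.parse
import urllib.request

BASE_URL = "https://canvas.example.edu"  # your institution's Canvas host

def label_payload(status="none"):
    """Form payload that overrides a submission's late-policy label.

    Canvas accepts values such as "late", "missing", or "none";
    "none" clears the label from the Gradebook cell.
    """
    return {"submission[late_policy_status]": status}

def clear_label(token, course_id, assignment_id, user_id):
    """Send one PUT to clear the label on a single student's submission."""
    url = (f"{BASE_URL}/api/v1/courses/{course_id}"
           f"/assignments/{assignment_id}/submissions/{user_id}")
    data = urllib.parse.urlencode(label_payload("none")).encode()
    req = urllib.request.Request(
        url, data=data, method="PUT",
        headers={"Authorization": f"Bearer {token}"})
    return urllib.request.urlopen(req)  # network call; not run here
```

In practice a script like this has to loop over every assignment and every enrolled student, which is exactly why it cannot usefully run until enrollment has settled: it edits records in the system one by one, rather than changing the system itself.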

But what about those Late labels: would students want to be reminded by a red Late label that they had used the "grace period" I offer them, a no-questions-asked extension on any assignment? Canvas presumably thinks students benefit from this constant reminder of their past lapses, and maybe Canvas is right: maybe students want the Late labels...? To find out, I had the students vote during the mid-semester review week: remove the Late labels, or not? The majority voted yes, so I removed the labels.

But what about those few students who wanted to keep the labels? Too bad for them: the system cannot accommodate what they want. Instead of being able to use the Gradebook like a tool in their own way, the students are just part of the Gradebook system: like it or not, there can be only one way to configure the Gradebook for the class... and even that option is only thanks to James Jones who used the Canvas API (because, yes, the API is a tool, and a powerful one!) so that we have a partial solution to the Gradebook label problem. Thank goodness for that at least; I don't know what I would be doing right now without James's script.

So, the Gradebook might look like a spreadsheet, a tool for me to use. But don't be fooled: the Gradebook is a system and your class must conform to the expectations and assumptions of that system. There is no other way.

CANVAS AI AND ANALYTICS. If you have not read Phil Hill's piece on the latest robotutor-in-the-sky hype from Instructure, then you have to go read it, depressing though it is:

This quote from Dan Goldsmith sums up the vision, if you can call it that: one AI to rule them all, one AI to bind them.

We can take that information, correlate it across all sorts of universities, curricula, etc, and we can start making recommendations and suggestions to the student or instructor in how they can be more successful. Watch this video, read this passage, do problems 17-34 in this textbook, spend an extra two hours on this or that.

The claim, as we have heard again and again from ed-tech solutionists, is that instead of giving students and teachers tools we can use for our own purposes, they are going to monitor us and build a system that controls (or attempts to control) our behavior. The student doesn't know best, the teacher doesn't know best... the machine knows best: the machine is going to do the learning and then tell us all what to do, as opposed to offering students and teachers tools we can use to design and direct our own learning.

Even if it worked, I would say that is undesirable: people need to learn how to make their own choices about life as well as about learning. But it is not even going to work.

Why won't it work?

Because the data is not there, not really.

Canvas has lots of data, sure, but it does not have MEANINGFUL data on which to make these decisions. Canvas knows NOTHING of importance about our students... because it has never asked them. Canvas knows when they log on and off, it knows how long they have a page open in a browser, it knows what score they get on a quiz.

But that data is all TRIVIA.

And if you build your AI on trivia, then it will indeed be nothing more than a trivial pursuit.

So, in conclusion:

I say we need more meaningful communication, and less trivial data.

Likewise, we need more tools, and less system.

And we need to be able to make more choices, instead of having the system choose for us.

Don't get me wrong. I love the digital world. I love teaching online. That is why I will fight back every time educational systems take away our freedom to explore that world and use what we discover in our own ways and for our own purposes. I still believe in the promise of the web. This web, the one that we build. With our own tools.

Laura, I'm curious if the no and strong no supporters provided any qualitative feedback on why they didn't want the labels removed? Just wondering about their rationale.

As for the AI, I've had the opportunity to try (with my own courses) a couple of "Canvas X" tools I believe Dan was referencing during his talk. While none of them were specifically AI-related, I can see where they could easily be the stepping stones in that direction. These tools provided an extra layer of information to me (the teacher) to help me better understand how my students were doing and (in the one I was the most excited about) how their interaction with the course content and the course itself was related to their course performance. Having access to this information as a teacher was pretty great in that it helped confirm in some cases what I already knew, but in other cases brought to light things I hadn't realized or didn't know for sure. I honestly could see a lot of value and use in having that information for myself (as a teacher) and in the faculty I work with having it as well - even if it was just a way to jump-start a deeper conversation about the course content and assignments or how to better engage with students.

From the quote you used above, I think the main point is this - "... start making recommendations and suggestions to the student or instructor." To me this doesn't mean blind following as much as being provided with information so someone can make an informed or possibly better decision about how to spend their time or what to focus on. In reality this is already being done in education (Ex: Aleks & Knewton) and other areas (Ex: Netflix & Amazon).

My College uses Dropout Detective, which utilizes various types of Canvas course data to provide Instructors and Administrators with information about which students are most at risk and why they are at risk. We've been using this program for almost six years and have had great (and statistically significant) success in our ability to connect with and help students before they have failed or dropped out of their courses. The biggest reason we went with Dropout Detective in the first place is that they were able to provide information we'd never had easy access to before in a straightforward dashboard that lent itself to quickly seeing how students were doing and who needed the most help.

Yes, as a teacher who is very connected to her students, I can tell you who is probably most at risk without checking Dropout Detective, but sometimes it catches students who are trending down that I hadn't noticed. This then provides me with the opportunity to dig deeper into what's going on and then have a conversation with the student. Maybe it was an off week, maybe they just didn't understand the content for this chapter, maybe it's life or personal problems. Either way, I can connect with that student and find out if there is anything I or the College can do to help support them. This reaching out not only helps the students know that I really do care, but also serves as the chance to change the downward trend and get them back on track before they are too far gone.

To me, this is the power of having access to this type of student/course information - in an easy to access/understand format - we can make a real (and personalized) difference with our students.

tl;dr - Technology isn't needed to be a great teacher, but as long as we continue to humanize our use of technology and information I believe it can be used to improve online teaching and learning.

If you were designing this blend of the personal and the technological, I would have more faith, Kona Jones, but when you look at the kinds of things that Goldsmith is saying, that is not what he is talking about. Instead, it is the robotutor-in-the-sky type of stuff: not analytics that highlight a problem to bring to the teacher's attention, but algorithms, generated by Canvas-based machine learning on data across classes and even across schools, that would design the curriculum and tell students what content to read/watch, etc. That is something very different from Dropout Detective. Did you read the quotes in Phil's article?

It's the same robotutor-in-the-sky promise that other companies have made too, completely untested and unproven, and it is all going to be very costly. Even more costly if it fails and Instructure has expended resources chasing that robotutor-in-the-sky instead of investing in ways to give us access to our data to use in ways that make sense to us (machine learning, by definition, does not make sense to humans... which is why I do not put a lot of faith in it).

Yes, I read what Phil wrote and I guess my thoughts are that if Canvas really does go down the road that you think they will, it will either be a paid (so extra $$) or opt-in feature. Either way, it will likely be something a teacher (or program) individually decides if they want to use or not.

As to Instructure chasing something that could likely fail, my hope is that they keep students and learning their number one priority as it is developed. As such, I would expect, based on previous experience, that they will do their due diligence to make sure the needed research and testing is done along the way to ensure whatever they end up with is useful and something that teachers would want to use and that would actually help students.

Am I putting a lot of trust in Instructure? Yes, but in the last seven years they've earned my trust through their actions and openness. Yet I will also continue to pay attention to new updates and information, volunteer to provide input or even test whatever they are doing, and generally make sure I'm an active part of where Canvas is headed in the future so I can help make sure it's done the right way.

I'm obviously participating too, Kona Jones, but I am really concerned. There were a lot of alarm bells for me in Goldsmith's keynote conversation at InstructureCon, and there were a lot of projects from Project Khaki postponed with no explanation or information. Will there be a Project Khaki this year? Based on the previous schedule, there would be, so that is my next canary-in-the-coal-mine, waiting to see if they run a Project Khaki and, if they do, what we can learn about the postponed projects from what they tell Project Khaki participants this time around.

There was a big article in the Chronicle last week about well-intentioned people who nevertheless squandered huge amounts of time, money, and opportunity chasing impossible dreams driven by big data and other "massiveness"... my school also wasted millions of dollars on similar stuff. Goldsmith's comments about supposedly being able to get insights out of a big database that would drive teaching and learning (aggregating data across classes and campuses) seem to me exactly the same kind of hubris that led so many schools to squander millions of dollars in the MOOC madness. But, like Phil said, maybe it was just Goldsmith speaking in a way that is just marcomm and not really what Instructure is aiming for.

I've asked before why Instructure is not collecting analytics that would give more direct access to what students are THINKING as opposed to what they are doing (surface behaviors). That would mean letting students rate content as more/less useful, letting them rate assessments as more/less accurate measures of their learning, and building in student self-assessments along with course assessments, which could then be used to contextualize the behavioral data in meaningful ways. Until Instructure starts collecting data with more insight FROM the students, I don't think they are going to be very successful in changing those student behaviors. Behavior is just that, behavior. If you want to change people's behaviors, you have to start investigating the causes of those behaviors. Canvas analytics are still at the very primitive stage of measuring behaviors, not revealing causes. To get at the causes, they are actually going to have to start asking the students questions about the WHY, not just the what and when.

It's interesting that you mention this - "letting students rate content as more/less useful, letting them rate assessments as more/less accurate measures of their learning.." - because that's pretty much the exact same thing Adam Williams, Chris Long, and I came up with/talked about needing at the 2015 (??) Canvas Unconference. We envisioned something like a star rating system where students could rate what they thought of all of the course content (videos, pages, files, etc), as well as the assignments. Personally I'd LOVE something like this and would find it very valuable. It's also why I ask for feedback from my students EVERY week and specifically ask them to let me know what they thought of the course content and assignments. It's amazingly insightful information and I've made a number of changes even this semester (in the middle of the semester) to better accommodate my students and improve the course and their learning. :-)

I asked Jared Stein about this in a convo a long time ago too! He explained that it's not part of Analytics, at least it was not then -- and for me, there's no point in using Analytics until that student component is the heart of the process. I'm not using grades to force students to change their behaviors (because I don't think that's going to result in real, long-term learning); instead, I'm trying to get insight into my students' motivations and to understand the challenges they face. Finding good content for them to use and good learning activities is my goal, and I only know if I have succeeded based on what the students tell me, and I try to gather feedback in lots of forms. Structured, unstructured; formal, informal; direct, indirect; numeric, non-numeric: I am not anti-data. But I am anti-trivia. :-)

I've written elsewhere about how I do gather data and use a spreadsheet to track it. The Canvas Gradebook as it stands right now is useless to me for those purposes because it is based on assumptions that do not apply to my classes in any way.

Laura, while I understand your frustration with the lack of customization, I would argue with your Tool/System definitions. To be sure, yes, Canvas is a system, but I find it to be a system of tools (perhaps a toolbox?). You are defining a tool as

something you use to do things; you are the user, and you decide how you want to use the tool and for what purpose

However, this is not really true of tools and presents a false premise. I cannot pick up a hammer and decide to saw a piece of wood with it. Well, I guess I could, but I shouldn't complain when it doesn't make a nice clean cut. Tools have purpose by their very nature, along with an implied method of use. Can I unscrew a Phillips screw with a flat-head screwdriver? Sure. Is that the most effective way? No. But again, I am choosing to use a tool that is not the correct tool for the task I want to complete. If it is the only one I have, do I complain to the maker of the screws that they should have made them to work with the screwdriver I chose?

Please don't get me wrong, but as you say yourself, you do not build courses in a way that is "in the system", so why the frustration when the same platform doesn't accommodate your toolbox? As someone who has used many different platforms over the years, Canvas is by far the most flexible enterprise-level solution. Light years ahead. Hands down. No contest. The tools and means they provide allow so much, like the ability for community members to write a program that will remove the labels that you don't want to see.

While I would like to see more granularity in some parts of Canvas, they have to design for the majority and, where it makes sense, accommodate those who want to "bend" the product. I think Canvas does this.

Matthew Jennings, the point of digital tools is that you don't have to limit yourself to designing for the majority... but yes, that is what Canvas makes you do, and that is what I had to do with my students and those Late labels. I was able to make only one choice for the whole class, even though some students would have preferred to leave the labels on.

Worse: the only reason I was able to turn the labels off at all is because James Jones was able to use the API tool to build me something I could use to turn off the labels.

It's a chicken-and-the-egg thing: does Canvas impose these limitations because that is how most people teach, or do most people teach this way because Canvas (and the other LMSes, for that matter) has such a narrow, top-down view of what learning is...? If they gave us a tool so we could choose, well, we might have a better answer to that question, but instead they give us a system, and we are expected to conform.

So, not only is Canvas willing to settle for the "majority" (as opposed to inclusion and accommodation), but by doing so, they are forcing people, like it or not, to conform to that majority way of doing business.

Just speaking for myself, I will not use Canvas if that conformity means I cannot do my job as a teacher. Without James's script right now, I don't know what I would do.

And you should read Phil's article if you have not read it yet: in pursuit of machine learning and new corporate customers, Instructure is making some real choices right now about where to invest its time and money. I personally don't care one way or the other about Quizzes.Next, but for the people who are (still) waiting on features there, I am guessing that the news about Instructure chasing AI/machine-learning instead is not good news. Do you really think we are going to benefit from the kind of algorithms Goldsmith is promising? (and which they are unlikely to deliver anyway...). I do not.

point of digital tools is that you don't have to limit yourself to designing for the majority

I still disagree that this means all digital tools should allow you to customize any way you like. Digital tools still have to have a standard purpose. I would not open MS Word to participate in social networking or to edit a photo. That is not in the design of the program. There have to be decisions made about the best way to organize and design a tool to accomplish a certain task. That is what Canvas has done. It is A tool, but not the only tool. It cannot, and I would argue should not, be everything to everyone. There was a presentation from Christi Wruck at InstructureCon 2017 (https://community.canvaslms.com/videos/3212-from-instructure-with-love-innovations-in-canvas-ux-christi-wruck) that talks about the decision process and has a great example using a Simpsons episode.

I don't think it is necessarily a chicken-and-the-egg situation. I believe, from the evidence of discussions with Canvas at things like Khaki, from the presentation above, and from other conversations I have had at InstCon, that they put a lot of thought into how they are building things. I am not saying that it does everything I would like, but I have an understanding that there is NO tool or system (digital or otherwise) that will give me complete autonomy to do whatever, whenever, and however I want. That is not the purpose of the product. Even when I hosted my own blog with WordPress, Tumblr, Blogger, and several others, I was always limited to doing things the way the tool was designed, with the options they provided. I would have to create my own product for that kind of control, or accept that the product I select has certain limitations or ways I have to work around it to make it do what I want.

On top of the intended design of a product, if it is being used as an enterprise solution, there are limits placed on users by the institutions as well. The way an admin configures it can greatly change the way it can be used. There are also, in my case at least, policies from our IT and legal departments that prohibit me from performing certain activities that could increase my productivity. There are many things I would like to do that Canvas would allow, but I cannot because of the other limiting parties.

I am not trying to be a jerk or an apologist for Canvas, but it feels like you are very worked up about a product not working the way you want and a company not focusing on the things you think are important. I am just trying to give some counterpoint.

Please note: I am not addressing anything about the whole AI portion of your discussion. It's not that I don't care about it, but I know I am not informed enough to respond to you regarding it. I am only addressing the statements about Canvas not being a tool but a system; I personally think it is more of a starter toolbox. It comes with the basic things you need and plenty of space for you to buy and add any additional tools you may need for what you are wanting to accomplish.

If there are policies at my school that say I can or cannot do something, that's between me and my school. There is no reason for Canvas to tell me what I can or cannot do in that same way as my school might; instead, Canvas should make possibilities available so that students, teachers, and schools can configure the software in ways that work best for them.

And yes, I know software has limitations by definition, but I am talking about the ASSUMPTION OF AUTONOMY within that design. So, within the purposeful design of the software, what assumptions are made about autonomy?

WordPress is not an LMS, so it does not have a Gradebook. Canvas is an LMS, so it has a Gradebook. How much does Canvas let me configure the Gradebook as I might configure a blog with WordPress? Basically not at all. That's because Canvas has a low assumption of autonomy when it comes to the Gradebook. (And hey, it's not surprising that Canvas makes low assumptions about instructor autonomy, because most instructors make low assumptions about student autonomy: it's education karma coming back around to kick us in the butt.)

In terms of another software parallel, the Gradebook is just a glorified spreadsheet, and other spreadsheet programs show us how much more powerful the Gradebook COULD be if Canvas did not operate on an assumption of very low autonomy for its users. Just the opposite of WordPress, which is designed with very high user autonomy in mind. That's why I am calling WordPress a tool, while Canvas is a system (learning management system after all).

Online teaching in digital spaces is something that is still incredibly new, and we have to experiment -- experiment WILDLY in my opinion -- to discover what is going to work best.

You are willing to accept that Canvas already knows best.

I am not prepared to accept that. I am still experimenting.

So, we can just agree to disagree about it. It's a matter of opinion, and I have a very different opinion about the kind of autonomy I want/need to do a good job teaching online.

P.S. I should add that it is precisely because of school rules and FERPA that I CANNOT replace the Canvas Gradebook with a spreadsheet that actually works for my classes. In other cases, yes, I simply use other tools.

But for legal reasons I am stuck with the Canvas Gradebook, which is why I have to keep complaining about how bad it is instead of just using another/better tool instead as I do for everything else.

Screenshot of the actually useful spreadsheet I use to track my own work in the class as a student instead of the Canvas Gradebook (I take my classes as well as teaching them): I can sort/filter the whole thing by type, plan my choices, color code automatically, etc. etc. You know, work with the data like a real spreadsheet allows. By using this spreadsheet myself, I am constantly reminded of how much less useful the Canvas Gradebook is for my students, but I have no choice when it comes to things related to grades: I have to use the Canvas Gradebook, even though I have (thank goodness) freedom in the other tools I choose and that my students choose to use -- they choose what blogging platform they prefer, what web publishing platform they prefer, what they want to read, etc. etc.: autonomy. I am a fan.
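To make the sort/filter/plan workflow concrete, here is a minimal sketch of the kind of operations a real spreadsheet (or any tabular tool) makes trivial and the Gradebook does not. The column names, assignment titles, and point values are invented for illustration:

```python
# Hypothetical sketch of spreadsheet-style operations on assignment
# records: filter by type, sort, and plan remaining points. All field
# names and values here are invented examples, not Canvas data.
from operator import itemgetter

assignments = [
    {"title": "Storybook draft",   "type": "project", "points": 10, "done": True},
    {"title": "Reading diary 3",   "type": "reading", "points": 4,  "done": False},
    {"title": "Famous Last Words", "type": "blog",    "points": 6,  "done": True},
    {"title": "Reading diary 4",   "type": "reading", "points": 4,  "done": False},
]

def filter_by_type(rows, kind):
    """Show only one category of assignment, like a spreadsheet filter."""
    return [r for r in rows if r["type"] == kind]

def points_still_available(rows):
    """Plan ahead: total points remaining in the not-yet-done rows."""
    return sum(r["points"] for r in rows if not r["done"])

# Filter to the readings and sort them, as a spreadsheet view would:
readings = sorted(filter_by_type(assignments, "reading"), key=itemgetter("title"))
# How many points can still be earned this week?
remaining = points_still_available(assignments)  # 4 + 4 = 8
```

Each of these one-liners is a choice the student makes about how to view their own data; the point of the comparison is that the Gradebook offers no equivalent of that autonomy.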

Now this is why I love Community. Laura your blog made me sit up and think. It also made me breathe a sigh of relief knowing there are so many creative teachers out there who continue to breathe life into the way they teach within their teaching environments including Canvas.

tl;dr - Technology isn't needed to be a great teacher, but as long as we continue to humanize our use of technology and information I believe it can be used to improve online teaching and learning.

Then Kelley gave hope when referencing those creative teachers who aren't slaves to the system; they make it work for them.

While I would like to see more granularity in some parts of Canvas, they have to design for the majority and, where it makes sense, accommodate those who want to "bend" the product. I think Canvas does this.

You know I will keep on bending, Bobby Pedersen...! But if we ignore the kind of news that Phil reported in his article, we do so at our peril. I think we need to let Instructure know what we think about chasing these AI/machine-learning/robo-tutor-in-the-sky goals: speaking for myself, I think it is a terrible idea, especially if R&D for Canvas is increasingly limited to make room for Bridge and those new corporate customers, as Phil reports in the article.

So, are machine-learning-driven algorithms what people really want/need to do a better job in their teaching?

Or would they rather have features of the sort that were voted up at Project Khaki but which were deferred indefinitely?

I think it will be very telling to see if there is another Project Khaki this year or not (as there would have been), and also what will happen to the items from the last Project Khaki that were deferred and did not happen, like global search. In fact, I find it pretty ironic that Instructure is making all these claims about its massive database, but they cannot even let users (instructors or students) search their own course content. That's not even machine-learning or AI... that's just searching!

Anyway, my impression is that the spirit of Project Khaki belongs to the old way of doing business at Instructure, not the new way.

All of the Community Managers have read this post and had time to discuss it. We very much appreciate people expressing their passions, but we'd also like to remind you that if you put yourself out there, there are actual humans on the other side of the screen that are just as passionate about teaching and learning, but with different perspectives and ideas. I'm a bit disappointed that a conversation like this does not demonstrate more growth mindset and is so narrowly focused in opinions at some points that it feels adversarial. Don't get me wrong, there are some comments that are very oriented in growth mindset in this thread, and I am thankful for them.

This might be a good time to call upon the guidelines, just as a reminder.

And with that reminder, there is one fact I'd like to correct and one question I feel needs to be answered.

When speaking of Khaki priorities that were completed and shelved, this document should have been referenced: Khaki 2018 Update. There are blog posts that precede it, but this is the final summary of Khaki 2017. Only one priority that was originally prioritized was shelved, and there was clear communication as to why it was shelved. [tl;dr: the scope was much larger than originally thought and would have taken the entirety of engineering resources for Khaki]

Khaki has never been a regularly scheduled event and probably never will be. We'd like to find ways to do similar activities that provide the same type of learning experience for more than just 30 people every couple years. We never duplicated attendance, but even then that's only 60 people in 4 years. We don't know what the answer is to this yet, but you know us, we'll keep trying different things.

Renee Carney Just a quick reminder: growth mindset is about getting OUT of your comfort zone, and I thought that was what the Community space was for. If you are telling me that my sharing Phil's article expressing his concerns about Instructure developments (important concerns IMO), along with reiterating my profound dislike of the Gradebook labels is making people uncomfortable, then I would say that is what growth is about: challenging ourselves to try new things and think about new possibilities. But if you would prefer for me to take my opinions elsewhere, there are lots of spaces where people can connect and share online. You need to be more direct, though, okay? I can blog elsewhere if that is what you are saying.

And one of the best things about Khaki was exactly the idea of bringing in new people every time! That's exactly how you get new ideas. But I cannot really tell what you are saying here about whether there will or will not be another Project Khaki this spring. Again, since you are not being direct, I'm not sure, but I am curious. And just speaking for myself (I can't speak for anyone else), I think Instructure needs some Project Khaki goodness right now during this obviously big corporate transition where there are very big top-down agendas emerging as Phil documented in his post, with fewer resources going to Canvas.

As for the failure of search, the fact that it was posed as an all-or-nothing is very unfortunate. If the project needed to be rescoped, I am not sure why it was not rescoped. I tried to ask that question at InstructureCon and got no response; I was hoping that it would be rescoped with community input. In fact, I am still hoping it will be rescoped with community input. And I will continue to be skeptical about claims re: AI and machine-learning if it is not even possible for people to search their own course content. After the failures of big data in education so far (see the Chronicle of Higher Ed article cited above just as one example among many), I think some skeptics are useful, in or out of the Community.

So, like I said, I really like blogging here, but if my own opinions are better expressed elsewhere, that works too. Your comment here is really not clear, but it sounds like I should go back to blogging separately from this Instructure platform. It's fine with me either way. The web is a big place.

Hey Laura. Long time no talk. I respect your passion for teaching and learning and the time you invest in that endeavour, and I hope everything is going OK for you outside of this endeavour as well. There are quite a few assumptions being tossed around here, and you requested more directness, so let me try to be direct so we can get back on the same page as soon as possible. I would like to start by referencing the first two Canvas Community Guidelines.

Be cool. It’s OK to be critical and express frustration from time-to-time, but rudeness is not acceptable. Always treat others with respect. Behind every comment is a living, breathing human being, and we are all in this together. Personal attacks or criticisms of another's abilities or motives will not be tolerated.

Be accurate. You are entitled to your own opinions but not your own facts. Feel free to express your opinions, but if you express an opinion, identify it as such. If you make an assertion, be prepared to back it up with sources.

On the "Be accurate" guideline, you made some statements that weren't accurate about Khaki which I believe detracted from the points you were trying to make. We want everyone reading to have the correct information on this. Specifically, this line caught my attention:

Laura Gibbs wrote:

...and there were a lot of projects from Project Khaki postponed with no explanation or information.

On the "Be cool" guideline, again, I understand the impact of technology on teaching and learning is something you are passionate about. We are passionate about it too. However, I think that passion is leading to some very strong opinion-based statements that are moving past "expressing frustration from time-to-time" and approaching "rudeness" to the human beings who are working hard to make things better every day. Furthermore, it seems these opinions are based upon speculation about what might happen in the future. Perhaps it would be better to reserve that passion to some degree until things transpire beyond statements and speculation. If we screw up, we know you will all tell us, and I hope you know we will work to make it right. After all, there was a change to the missing and late labels, and I have to think that your and others' opinions and examples of how it affected you in your situations played a role in that. So I would hope we have some level of trust in that regard.

I hope you truly want to engage with us and the Community in a productive manner, as I have always felt you do. If so, please consider the fact that passion about things we are unhappy with must be handled with care, lest it inadvertently turn into rudeness toward those with whom we wish to engage and maintain a longstanding relationship.

I hope you choose to continue to engage here. You have brought much value to our shared Community. But if you feel the need to express yourself beyond the guidelines, I would recommend doing that in another forum. This is not mutually exclusive, of course. You can choose to express yourself one way here, and another way in other forums. Either way, I hope we are able to continue to work together towards solutions. Cheers!

I've been doing a lot of self-censoring the past years here, and I did not realize I was not self-censoring enough to meet the Community guidelines. Per Adam and Renee, I see that it's time to dust off the old Digital Teaching blog.

Thanks for all the good discussions; I honestly thought this was a good discussion too, but you know the community managers mean business when they call you out in public like this. (For anybody who's curious, no, I did not get any kind of private heads up, just being called out in public here. And about that, I have to say: Ouch. That is also not cool IMO.)

We have a lot of respect for you, which is why we have reached out privately so many times in the past. Please do not misconstrue what we're asking you (and ultimately everyone) to reflect on here. It's OK to express opinion, and it's OK to express frustration. All of it needs to be done in a way that is respectful and uplifting of all parties involved.

I don't have time to join this discussion, which is a shame, because I also have some thoughts on this and come from the foundation of doing things with our students, rather than doing them to our students.

Date & Time: April 16, 2019; 3:00-5:00 pm PT

In recent years, a new article or book has appeared every few weeks on the importance of fostering growth mindsets, resilience, and belonging in today's students. How can college educators translate this research into practice? What difference do these psychological factors make in college math classrooms anyway? In this webcast, participants will explore strategies to foster growth mindsets, build resilience, and increase belonging in mathematics/quantitative reasoning classrooms. Participants will also learn about currently available resources for addressing psychological factors in course design and daily instruction. A significant amount of time will be spent collaborating in virtual breakout rooms with peers from across Washington. For best use of the breakouts, participants should join individually from a computer with a working microphone and camera. They are encouraged to meet with campus colleagues after the webinar to discuss what they have learned.

Thanks Kelley L. Meeusen ! I have seen more and more awareness of growth mindset at my school in the past years because now when I ask students if they have encountered it before, sometimes they have -- and often in the context of math and engineering classes! Yay! :-)

What a thought-provoking post! I really enjoyed reading this and appreciate your student-centric approach. It's clear that student-centered learning isn't just the flavor of the month for you, and I love that you asked your students for feedback on something as mundane as the labels in the gradebook. The results also surprised me; I had just assumed that students find the labels useful. I bet you are the first teacher who has ever asked them about that (that would be a good question to add to your next survey), and I think that is part of the bigger problem.

If a large number of educators aren't actively involving their students, collecting their feedback, and using that to make changes, then I wonder if that is something Ed Tech companies can sell. Can a tool change the system? Is the Canvas Community a tool or a system?

Thanks Chris Long! It was actually thanks to someone here at the Community (James Jones I think?) who said that the students might find the Late labels useful. I had just assumed the opposite, so it was great to stop and think and then be able to just ask the students. That was good for the students, too, of course, because it also got them to stop and think, and also to show them that I'd like to configure Canvas in the best way I can, even though it is a bad fit for my classes overall (grading-wise).

I can understand the fear of Instructure moving toward corporate development/learning software. However, I don't think they'd be able to, or interested in, abandoning the education solution that is Canvas for HigherEd and K12.

But I'm struggling to understand the apprehension about AI and prediction models.

You say there is no data that Canvas can use to leverage those kinds of recommendations, and I must respectfully disagree. As someone who has been building web applications for 20 years, I can say that the amount of data Canvas creates for each school, and globally, would provide an extremely large data set for AI and prediction models to generate various kinds of student-centered and even individualized feedback.

While I have no experience in AI or prediction models, a lot of what I'm working on is the ability to identify patterns in student activity and course design that create success. Find a common problem in the course over multiple semesters, or a pain point for students, make a correction, compare the results to the previous data, and you have a system for 'suggestions' for either students or teachers.

Put simply, the same course delivered over multiple semesters to different groups of students is going to produce a data set that allows you to make 'educated' predictions about how much time a student spends, how they navigate, which resources they access, how they communicate, and ultimately their progress and success (or not) in the course. The teacher can then choose how to act: send notifications, or not.
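To make the idea above concrete, here is a minimal Python sketch of that kind of 'suggestion' system: pool weekly activity data from prior offerings of the same course, then flag students in the current offering whose activity falls well below the historical baseline. Everything here is hypothetical, including the data shapes and the 50% threshold; real Canvas data would look different.

```python
# Hypothetical sketch: flag students whose weekly activity falls well below
# the historical average for the same course across prior semesters.
# Data shapes and thresholds are illustrative only, not real Canvas structures.
from statistics import mean

def baseline_minutes(past_semesters):
    """Average weekly minutes spent, pooled across prior offerings."""
    all_minutes = [m for semester in past_semesters for m in semester]
    return mean(all_minutes)

def flag_students(current, past_semesters, threshold=0.5):
    """Return ids of students whose activity is under threshold * baseline."""
    baseline = baseline_minutes(past_semesters)
    return [sid for sid, minutes in current.items()
            if minutes < threshold * baseline]

# Example with made-up numbers: two prior semesters, one current roster.
past = [[120, 90, 150], [100, 110, 130]]     # weekly minutes per student
current = {"s1": 40, "s2": 115, "s3": 95}
print(flag_students(current, past))          # s1 falls below half of ~116.7
```

The teacher stays in the loop: the output is just a list of students who might need attention, and the teacher decides whether to send a notification at all.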

UCF provides a fantastic dashboard designed to help students assess their effort and success (Student View).

There is an example in this presentation where he can show, for a student who clicked a quiz question, whether they changed their answer multiple times. If you analyze that data across students and across semesters, you can determine what percentage of students change their answer and whether those changes resulted in a correct answer or a wrong one. Does the question need to be improved? The student can then decide how to act: do more, ask for help, or not.
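That question-level analysis could be sketched roughly as follows in Python. The event format (student, question, whether the answer was changed, whether the final answer was correct) is invented for illustration; it is not the shape of any real Canvas quiz data.

```python
# Hypothetical sketch: summarize answer-change behavior per quiz question.
# Each event is (student_id, question_id, changed_answer, final_correct);
# this event format is invented for illustration.
from collections import defaultdict

def summarize_changes(events):
    """For each question: (% of students who changed their answer,
    % of those changes that ended in a correct answer)."""
    stats = defaultdict(lambda: {"students": 0, "changed": 0, "changed_correct": 0})
    for student, question, changed, correct in events:
        s = stats[question]
        s["students"] += 1
        if changed:
            s["changed"] += 1
            if correct:
                s["changed_correct"] += 1
    summary = {}
    for q, s in stats.items():
        pct_changed = 100 * s["changed"] / s["students"]
        pct_correct_after_change = (
            100 * s["changed_correct"] / s["changed"] if s["changed"] else 0.0)
        summary[q] = (round(pct_changed, 1), round(pct_correct_after_change, 1))
    return summary

events = [
    ("s1", "q1", True, False),
    ("s2", "q1", True, True),
    ("s3", "q1", False, True),
    ("s1", "q2", False, True),
]
print(summarize_changes(events))
# q1: 2 of 3 students changed their answer (66.7%); 1 of those 2 ended correct (50.0%)
```

A question where most answer-changes end wrong might be a candidate for rewording; a question almost nobody gets right after changing might simply be ambiguous.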

Identifying patterns of deployment and student behavior has high potential for improving what we can do with learning.

Robert Carroll As I said to Kona, this is not about analytics in the way that Canvas has offered it before. Getting feedback to help teachers correct poorly written quiz questions, or getting information about students whose level of participation is low, is very different from the machine-learning-driven predictive analytics that are meant to take the teacher OUT of the equation: combining data across schools and having machines make predictions, which is what Goldsmith is promising. In Goldsmith's own comments, he was emphatic that this is no longer just about analytics but about AI and algorithms. As someone who has resisted the way that TurnItIn has profited (to the tune of one and a half billion dollars) by monetizing the student data that schools turn over to them, I am very sad to see that this is apparently Instructure's next play: creating AI and machine-learning systems based on the student data that they have collected from schools in the past. That is not just analytics anymore, and it is very different from what schools might choose to do with their own data.

But I really cannot say more here since I do not know how to self-censor enough to stay within the Guidelines.