In this part III we’d like to circle back to the spectrum of approaches to digital evolution discussed in part I, looking through the lens of “What strategy is going to give the most bang for the buck, both near term and longer term?” After all, every step that a state authority or local authority takes is an investment of precious time and resources - which cannot afford to be wasted. The spectrum is visualized in the following diagram:

Each of the three strategies illustrated along the spectrum has its strengths and weaknesses, summarized here:

Digital curriculum toolkit
- Weakness: requires evolution of teaching, learning and institutional processes that enable personalized and experiential learning to obtain full benefit

Digital learning objects in the sky
- Strength: Google search is already what everyone does in all aspects of life, why not education?
- Weakness: difficult/time consuming to achieve coherence of instructional materials or the instructional experience

What strategy makes the most sense? Well, the reality is that all three strategies are going to occur, simply because they are all options that are out there. However, we think there is a clearly better choice both now and into the foreseeable future – and that is the “digital curriculum toolkit” strategy.

Why does the digital curriculum toolkit strategy make the most sense? The digital curriculum toolkit is the only strategy that attempts to improve instruction in a realistic manner. Why?

It is much more likely that coherent digital curriculum that can be personalized will come through a combination of working with trusted suppliers/sources and institutionally created content (noting that this latter category may include some of what comes from the Learning Objects in the Sky).

As discussed in Part II, actually achieving the five potential benefits of evolving to digital curriculum requires going substantially beyond the “PDF or e-text versions of books” strategy (even though some resources of those types could clearly become part of the digital curriculum toolkit strategy).

The digital curriculum toolkit strategy seems to fit the “way institutions/states in K-12 do business” in terms of working with trusted providers and taking responsibility for curriculum (albeit, as noted in the above table, a concerted effort to evolve instruction to more personalized and experiential models will be required to achieve full benefit).

If you are in the U.S., you are seeing a huge amount of investment from the Gates Foundation, CCSSO and the U.S. Department of Education in the Common Core State Standards (CCSS). So, it might be natural to ask,

“Won’t the Common Core provide the glue that will enable any and all of the three strategies?”

To this we would answer that we are hopeful that it might help, but realistically it is not going to be the panacea that everyone is hoping for. Here are the reasons why the Common Core will not radically change things:

No one can be sure that the Common Core will work. There is already a huge amount of deviation occurring, and some are saying that it is not implementable.

The Common Core, while potentially a positive step for U.S. education, is at best still a “driving while looking in the rear view mirror” approach to educational reform (the Common Core approach largely copies what some other countries have been doing for years).

One can make a very good case that to fully actualize the goals of the Common Core, instruction must be radically evolved to a much more interdisciplinary, experiential approach to learning – and in doing so the Common Core (or any agreed-upon set of learning standards for math and language) becomes a relatively small (while potentially important) part of the solution.

To that third point, the IMS Instructional Innovation through Interoperability Leadership Council (I3LC) of school districts and states has recently published a position paper that attempts to put some of the myriad projects and investments made in the last few years in the U.S. by the Gates Foundation into perspective. These initiatives include the Learning Registry (initially funded by the U.S. government, later by Gates), LRMI (Learning Resource Metadata Initiative) and SLC (Shared Learning Collaborative), now InBloom. These projects all share the notion that learning objects or progress can be referenced back to a common set of educational standards, and are generally complementary, and perhaps even dependent upon the success of the Common Core.

The paper may be viewed as controversial in some circles because it clearly concludes that despite the huge investments from the Gates Foundation in the combined set of projects, they will not enable districts to support evolution to effective instructional reform. In the terms of this blog series, this is because the Gates investments are largely focused on enabling the “learning objects in the sky” strategy. While those investments may indeed help lead the way to enabling that alternative – in which case the Gates Foundation can point to a meaningful contribution – they are not focused on the most relevant strategy: the digital curriculum toolkit.

That’s the potentially “bad news.” The potentially “good news” is that organizations such as IMS can work pragmatically across the membership to help “bridge” these strategies. Metadata and federated search are great examples of areas in which the IMS membership will be working pragmatically to make the various marketplace pieces fit. What pieces need to fit? Here is a diagram excerpted from the position paper.

In part II we will address the roles and importance of interoperability standards in the evolution to digital curriculum. We also discuss a common sense ordering of "putting standards in place" based on feedback from the market.

Now, when we say “standard” we could mean a lot of things, as standards in their best sense mean a voluntary collaboration among education community participants on the technical approach to interoperability as well as a fair/neutral decision-making process. However, the following paragraphs are just as relevant if what we mean by an interoperability standard is one agreed-upon way for two applications to exchange the information necessary for those applications to work together in a well-defined way (in comparison to multiple and diverse ways to accomplish essentially the same thing).

Here is our explanation of the critical role of interoperability standards in evolving to digital curriculum, specifically with respect to achieving the five potential benefits outlined in part I.

Potentially lower cost. Some people seem to think that all digital learning materials should be free because the distribution costs of an additional copy (once the digital version has already been produced) are essentially zero. A very small zealot group of “free software” advocates has come to the same conclusion regarding software. However, for those of us who live in the real world and want to see higher and higher quality digital products, it is very obvious that digital materials will still have a cost associated with them – and the price will be market-driven – meaning it may be lower, or may even be higher, than today’s printed books. Regardless, it is very clear that having to reformat digital learning into a wide array of formats to run on a wide variety of devices and software platforms (e.g. Apple, Google, Amazon, Blackboard, Desire2Learn, Instructure, Moodle, Pearson, Global Scholar) will add cost to the production equation. Even if the set of options in the education space were limited and static, this would be a daunting situation. It even becomes a “competitive” situation where content providers try to “be the first to market” on newer and sexier platforms with large market share. While this may all seem “fun” to the end users, the reality here is that the dollars spent on essentially reformatting and recoding are dollars NOT spent on creating better learning materials. And the cost of having to deal with the diverse platforms is shifted to the end users (teachers and students) and the IT departments who must figure out how to equitably support BYOT (Bring Your Own Technology). Unless innovative digital learning experiences are easy to support in the educational context, well, they just won’t get incorporated. Thus, the critical need for interoperability between content and platforms to help remove the cost associated with platform diversity is very clear.
While the worldwide web interoperability standards (such as HTML5, managed by the W3C) and browsers (as the ‘platforms’) go a long way toward providing content interoperability, they are lacking with respect to some key additional constructs used frequently in education but rarely on the generic worldwide web (such as assessment).
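IMS addresses this produce-once/import-anywhere problem with content packaging standards such as Common Cartridge, where a unit is packaged in a standard zip structure described by an imsmanifest.xml that any conformant platform can import. The sketch below is illustrative only: the XML is a heavily simplified, hypothetical manifest (real cartridges use XML namespaces and a much richer schema), and `list_items` is an invented helper, not part of any IMS API.

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical manifest in the spirit of IMS Content
# Packaging / Common Cartridge -- NOT a complete conformant example.
MANIFEST = """\
<manifest identifier="unit-algebra-1">
  <organizations>
    <organization>
      <item identifierref="res-1"><title>Linear Equations</title></item>
      <item identifierref="res-2"><title>Graphing Practice</title></item>
    </organization>
  </organizations>
  <resources>
    <resource identifier="res-1" type="webcontent" href="lesson1.html"/>
    <resource identifier="res-2" type="assessment" href="quiz1.xml"/>
  </resources>
</manifest>
"""

def list_items(manifest_xml):
    """Return (title, href) pairs by joining items to their resources."""
    root = ET.fromstring(manifest_xml)
    # Map resource identifiers to the files they point at.
    resources = {r.get("identifier"): r.get("href")
                 for r in root.iter("resource")}
    return [(item.findtext("title"), resources[item.get("identifierref")])
            for item in root.iter("item")]

for title, href in list_items(MANIFEST):
    print(title, "->", href)
```

The point is that a platform only has to understand one manifest structure, rather than every supplier inventing its own packaging.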

More interactive and engaging. It has been very encouraging to see exciting new learning innovations each year as finalists in the IMS Learning Impact Awards, such as game-based learning, adaptive tutors, social learning and simulations. Some of the most innovative applications come from small start-ups with very limited resources. Unless innovative digital learning experiences are easy for IT, teachers and students, as well as suppliers, to integrate into the educational context, well, they just won’t get incorporated. The hurdles that get in the way are multiple logins, manual transfer of enrollment information, passing of other parameters that enable students to interact in the right groups, and so on. If every application and platform accomplishes these integrations with its own APIs (Application Program Interfaces) – all of which evolve over time – well, it’s difficult to get any reasonable number of tools integrated in the first place, much less maintained over the years. Most IT departments, even at well-funded institutions, struggle with the care and feeding of 3-5 integrations. Therefore, there is a very obvious and critical need for interoperability standards to make “plug and play” of innovative digital tools and learning experiences easy.
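The IMS LTI (Learning Tools Interoperability) standard is the main answer to this plug-and-play problem: the platform launches the tool with a signed message carrying identity, roles and context, so there is no separate login or manual enrollment transfer. As a hedged illustration, the sketch below shows the general shape of the OAuth 1.0a HMAC-SHA1 signing that LTI 1.x launches use; it is deliberately simplified (no nonce/timestamp or consumer-key bookkeeping), and the tool URL and shared secret are made up.

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_launch(params, url, consumer_secret):
    """Compute an OAuth 1.0a HMAC-SHA1 signature over launch parameters,
    in the style of an LTI 1.x launch (simplified sketch: real launches
    also include oauth_consumer_key, nonce and timestamp parameters)."""
    # Percent-encode, then sort, the parameter pairs.
    encoded = sorted((urllib.parse.quote(k, safe=""),
                      urllib.parse.quote(str(v), safe=""))
                     for k, v in params.items())
    param_str = "&".join(f"{k}={v}" for k, v in encoded)
    # Signature base string: METHOD & encoded-URL & encoded-params.
    base_string = "&".join([
        "POST",
        urllib.parse.quote(url, safe=""),
        urllib.parse.quote(param_str, safe=""),
    ])
    key = urllib.parse.quote(consumer_secret, safe="") + "&"  # empty token secret
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

launch = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "unit-3-quiz",
    "user_id": "student-42",   # identity passed once -- no extra login
    "roles": "Learner",        # grouping/role info passed automatically
}
# Hypothetical tool URL and secret, for illustration only.
launch["oauth_signature"] = sign_launch(
    launch, "https://tool.example.com/launch", "shared-secret")
print(launch["oauth_signature"])
```

Because every platform and tool signs and verifies the same way, one integration pattern replaces N proprietary APIs.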

More personalized and accessible. The popular idea of “learning objects” – meaning chunks of content or learning experiences that can be delivered at the right place and the right time – is not new. It has been a primary objective envisioned since the explosion of the Internet/worldwide web, and before that with CBT (computer-based training). In fact, there have been many products over the last 20 years that have focused on this approach – with adaptive tutors/homework applications perhaps now becoming the most successful in the education context (while still penetrating only a relatively small percentage of the market). The goal is personalized learning. However, making this work when more than one content/application provider/source is involved requires a lot of interoperability to make finding the right resource at the right time tractable for teachers or students. First of all, for highly relevant objects to be “found” there needs to be some agreement on the metadata used to search for them. This metadata not only describes the content, but also potentially the state/progress of student learning, so that the two can be compared. Once the right object is found, there are potentially the same integration issues as detailed in (1) and (2) above. The other very important aspect of personalization is accessibility. Not only do students have preferences for how they can best learn digitally (audio vs. visual, font size and type, etc.), but the exploding use of a rapidly evolving array of tablet devices means that alternative representations of learning objects that fit the user and the usage are required. Without interoperability standards to enable user preferences and platform versatility, the development of content and apps becomes much more expensive.
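A minimal sketch of why shared metadata matters for discovery: once all sources describe resources with the same vocabulary (as LRMI-style properties aim to enable), matching a learner’s current gap and accessibility preferences becomes a simple query. The catalog, property names and values below are invented for illustration, not taken from any real metadata schema.

```python
# Hypothetical mini-catalog tagged with LRMI-style properties:
# an alignment to a standard, a resource type, accessibility features.
CATALOG = [
    {"title": "Fractions video", "alignment": "CCSS.Math.4.NF.1",
     "type": "video", "features": {"captions"}},
    {"title": "Fractions worksheet", "alignment": "CCSS.Math.4.NF.1",
     "type": "exercise", "features": set()},
    {"title": "Decimals game", "alignment": "CCSS.Math.4.NF.6",
     "type": "game", "features": {"captions", "highContrast"}},
]

def find_resources(catalog, alignment, required_features=frozenset()):
    """Match a learner's current gap (alignment) and accessibility
    preferences against shared metadata -- only tractable when all
    sources agree on the vocabulary."""
    return [r["title"] for r in catalog
            if r["alignment"] == alignment
            and required_features <= r["features"]]

print(find_resources(CATALOG, "CCSS.Math.4.NF.1", {"captions"}))
```

If each supplier tagged resources differently, the same query would have to be rewritten per source, and most resources would simply never be found.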

Producing usable data. As mentioned in (3), a primary foundation of achieving personalized learning digitally is the need to describe student progress. The concept of progress is often thought of as a learner profile, and the potential prescribed paths are often referred to as learning maps. As with (3), if the application is completely self-contained and does not provide data to other applications, then interoperability is not required. However, if it is desired to have multiple content/applications/assessments work together to help teachers and students, then interoperability standards for activities, outcomes, learner profiles and learning maps become critical. While one can certainly conceive of a data warehouse holding a huge amount of data that complies with no standard, the more aspects of student progress that can be agreed upon, the more actionable the data can become. Of course, this is the goal for standardized testing and other forms of assessment.
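A hedged sketch of the point about agreement making data actionable: when several applications report outcomes against the same standard identifiers, a simple learner profile and gap list can be computed across all of them. The record format, scores and mastery threshold below are invented for illustration.

```python
from collections import defaultdict

# Hypothetical outcome records from three separate applications that
# all reference the same standard identifiers -- the agreement that
# makes the data actionable across tools.
RECORDS = [
    {"app": "quiz-tool", "standard": "CCSS.Math.4.NF.1", "score": 0.9},
    {"app": "homework",  "standard": "CCSS.Math.4.NF.1", "score": 0.7},
    {"app": "game",      "standard": "CCSS.Math.4.NF.6", "score": 0.4},
]

def learner_profile(records, mastery_threshold=0.8):
    """Average the evidence per standard and flag standards still
    below the mastery threshold (a simple 'learner profile')."""
    by_standard = defaultdict(list)
    for r in records:
        by_standard[r["standard"]].append(r["score"])
    profile = {s: sum(v) / len(v) for s, v in by_standard.items()}
    gaps = [s for s, avg in profile.items() if avg < mastery_threshold]
    return profile, gaps

profile, gaps = learner_profile(RECORDS)
print(gaps)  # standards needing more work
```

Without the shared identifiers, the three score streams could still sit in one warehouse, but no cross-application profile like this could be computed.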

Easier to transport. One laptop or notebook computer certainly weighs less and takes less space than multiple paper textbooks. But if we put all of the learning materials into an accepted format, such as PDF, this would allow us to eliminate the books without making any progress on potential benefits (2), (3) or (4). Worse yet, it is entirely possible that the teacher, student and IT department could end up having to deal with a myriad of platforms (because not all apps and content run on all platforms) AND textbooks. Yikes! More cost, more weight, more space. Thus, an absence of interoperability standards could result in – and probably is resulting in – the worst possible scenario for students, teachers and institutions.

Now, since relatively little of the interoperability required for personalized digital learning (per the above) exists in the marketplace today, a natural question to ask is “where is the best place to start?” Another way to ask this question is “what needs to come first in order to enable evolution over time to personalized digital learning?”

The method for determining such things in IMS is to start multiple threads of action and see which ones the market adopts first. Absent third-party incentives (such as grants that favor one priority over another), the education community participants are pretty smart about building their future. It is very difficult to achieve market adoption of a “standard” when there is large diversity and competition among approaches. In such cases it is better to consider early developments as potential input to the standards process – rather than as a standard.

For those institutions, states and districts worldwide that wish to take advantage of the progress IMS has achieved in market adoption of these standards, especially those wanting to put in place a strong foundation for digital curriculum and personalized digital learning, IMS has recently released a document that describes how to specify requirements for digital content and applications based on open standards. Please read the press release and the document, Open-Standards Requirements for Digital Content and Application Integration with Enterprise Learning Platforms, and let us know if you have any questions! We are pleased to help all institutions and states evolve to open standards.

Does this mean that IMS is ignoring the other areas such as outcomes data, analytics, profiles or learning maps? Absolutely not. IMS has been active in these areas for years and is in the process of rolling these out at market speed, using the DLS standards as the backbone. The prioritization centers on supporting key market drivers, such as support for the U.S. Common Core State Standards, the rise of e-textbooks, the need for federated search (as integration of multiple products grows), etc. IMS members that are experts and experienced market participants are driving each area – and these requirements are addressed in incremental/evolved versions of the specifications. Such evolution also allows for region-specific variations since, depending on the interoperability area, there can be some significant diversity. This is of course less true in the plumbing layer.

In the next installment, part III, we will address the spectrum of three scenarios for evolving to a digital learning ecosystem. Whereas the discussion above and the RFP guidance that IMS has produced will help you regardless of which of the scenarios you choose, there is a clearly preferred approach that makes sense for today and probably the next 5-10 years. Perhaps surprisingly, our view is VERY different from what is being encouraged by huge investment from the Gates Foundation in projects like LRMI (Learning Resource Metadata Initiative) and SLC (Shared Learning Collaborative) / InBloom. We will explain in part III.

Introduction to the spectrum of approaches to evolving to digital curriculum that we are seeing in the marketplace

Many state level education authorities, local education authorities and colleges/universities are looking at accelerating the movement towards digital curriculum materials. This is because digital materials have many potential advantages. Among these are:

Potentially lower cost than printed textbooks because there are potential savings in printing and distribution.

More interactive and engaging student experience as digital curriculum can potentially be much more interactive and up-to-date than printed materials.

More personalized and accessible in that digital materials can be chunks of content or applications that go directly at a perceived gap in competency or an alternative learning style or a preferable set of user delivery preferences.

We feel that the digital toolkit metaphor is the right one to latch on to if you want to get this right. In fact, it is critical to actualizing the benefits listed above. This is because we see much misguided effort going into two other alternatives that are on opposite sides of the spectrum in terms of the approach. The approaches are illustrated in the following diagram.

At the very “basic” level of going digital there is the PDF (non-reflowable digital book) or e-text (such as Kindle or Nook) version of a textbook approach. The advantage of this approach is that it is very consistent with well-understood textbook-based models of instruction. The disadvantage is that it only addresses “lower cost” and “easier to transport.” The other major issue with this approach is that many of the mobile readers each have their own proprietary platform formats. This includes the proprietary standalone app formats of Apple devices. In other words, support across a range of platforms is a major issue with anything other than static PDF. And static PDF is not usable across all screen sizes – that is, a printed textbook is generally better.

At the very other end of the spectrum is what we affectionately term the “learning objects in the sky” approach. This is a very advanced vision of being able to find, mix & match instructional materials from all over the web. This is not a new vision at all. The reason we have affection for this idea is that IMS has been involved in working to enable this vision since 1995. There has been a lot of well-intentioned investment in this vision – and there is a new wave of such activities today that is attracting quite a bit of buzz. One is the Learning Registry, which has been funded by the U.S. Government and the Gates Foundation. Another is LRMI (Learning Resource Metadata Initiative), funded by the Gates Foundation. And there is the Shared Learning Collaborative, now InBloom, again funded by the Gates Foundation. While IMS completely agrees with the learning object vision, the fatal flaw has been the inability of these “found materials” to fit together and thus produce a better instructional experience. There are numerous other issues associated with achieving this vision – none of which have been adequately addressed by the new initiatives. We will cover these in more depth in future installments. The bottom line is that whereas this approach has great aspirations to achieve the interactive/engaging, personalized experience and usable data benefits listed above, the current implementations are far from overcoming the large obstacles to getting there.

In the middle is the digital toolkit idea. This is where the productive activity is today. The approach is fundamentally about carefully selecting digital material suppliers (commercial, OER, whatever works) that are complementary to the desired instructional approach, able to provide the data desired by students & teachers, and able to easily work in a coordinated fashion with the other sources selected. Based on our interactions with both buyers and sellers, while there are still many issues that need to be worked out to get to this more modest goal (versus the learning objects in the sky), it is a more pragmatic approach with a higher probability of getting to all five potential benefits above.

In the next two installments we will discuss some specifics regarding the types of content interoperability that must be supported to achieve the five potential benefits of evolving to digital, and the perennial issue of metadata (needed to describe and thus search for learning resources). Also, we will review some RFP guidance, policy and analysis papers as they are released in the next couple of weeks.

Framingham State University is ready to put the Learning Information Services (LIS v 2.0) specification to the test as the next step toward fulfilling a commitment to adopt the IMS Global Learning Consortium open interoperability standards. The target implementation will be integration between Blackboard Learn (hosted and managed remotely by Blackboard) and Banner (hosted and managed locally by Framingham State). The objective is to more efficiently enable combined uses of educational technology that rely on the import and export of student information without the need for custom system integrations that inhibit innovation and are costly to implement.

The hope is that this first LIS integration will be followed by others in order to meet the growing demand for easier and more cost efficient ways to merge complementary content and services from third party providers within a coherent user experience that depends on the exchange of data with the University’s student information system.

Learning Information Services is a specification that maps the exchange of data between student information systems and educational software or services for the management of information about people, courses, groups, memberships and outcomes. LIS was developed as an open industry standard by members of the non-profit IMS Global Learning Consortium. It is based on six services that can be used individually or in combination:

The “Bulk Data Exchange Management Service” provides for the transfer and batch processing of data in order to initialize the exchange of information between a system of record and one or more educational resources, and is used to keep them synchronized. This includes support for the data models from each of the other five services.

The “Group Management Service” provides management and manipulation of organizational structures, and other group structures, through the exchange of data about those structures.

The “Membership Management Service” provides management and manipulation of enrollment in courses, and other activities, through the exchange of data about those memberships.

The “Course Management Service” provides management and manipulation of course structure information through the exchange of data about courses.

The “Person Management Service” provides management and manipulation of information about people through the exchange of data about participants.

The “Outcomes Management Service” provides management and manipulation of results information, from grade books etc., through the exchange of data about outcomes.
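To illustrate how these services divide the work, here is a hedged sketch of a bulk initialization followed by an incremental membership change. The class and method names are invented for clarity; the actual LIS v2.0 bindings are formal web services defined by the specification, not this Python shape.

```python
# Illustrative sketch only: real LIS v2.0 defines formal XML web-service
# bindings; the class and method names here are invented for clarity.

class MembershipManagementService:
    """Receives membership data from the system of record (e.g. Banner)
    and applies it on the learning platform side (e.g. Blackboard Learn)."""

    def __init__(self):
        self.memberships = {}  # (person_id, course_id) -> role

    def create_membership(self, person_id, course_id, role):
        self.memberships[(person_id, course_id)] = role

    def delete_membership(self, person_id, course_id):
        self.memberships.pop((person_id, course_id), None)

class BulkDataExchange:
    """Batch wrapper: replays a snapshot of memberships to initialize
    or resynchronize the target system, mirroring the role of the
    Bulk Data Exchange Management Service for one data model."""

    def __init__(self, target):
        self.target = target

    def sync(self, snapshot):
        for person_id, course_id, role in snapshot:
            self.target.create_membership(person_id, course_id, role)

lms = MembershipManagementService()
# Initial bulk load from the system of record.
BulkDataExchange(lms).sync([
    ("s-1001", "MATH-101", "Learner"),
    ("f-2001", "MATH-101", "Instructor"),
])
# Later, an incremental event: a student drops the course.
lms.delete_membership("s-1001", "MATH-101")
print(len(lms.memberships))
```

The same pattern applies to the group, course, person and outcomes services: bulk exchange initializes and resynchronizes, while the individual services handle ongoing incremental changes.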

Obviously, the University needs to maintain or improve the existing integration between Blackboard and Banner, which already works to most people’s satisfaction (albeit with room for improvement). In other words, it is essential that Framingham State’s adoption of the Learning Information Services specification not “break” anything or degrade functionality as compared to what exists now.

Blackboard Learn and Banner must be equally conformant with the LIS v 2.0 specification in order to meet the minimum criteria for a successful implementation as defined above. In addition, whether or not Blackboard Learn and Banner provide adequate service call support at the current time also needs to be verified. The extent of conformance and service call support will hopefully prove to be a sufficient match to Framingham State’s integration requirements. The IMS Global Learning Consortium provides a testing and certification program for suppliers - and it lists the suppliers & specific products that have achieved certification on its web site, imscert.org. Ensuring that all vendors have achieved IMS LIS v2.0 conformance certification will greatly reduce the cost of implementation and integration at Framingham State, because certified products have gone through rigorous interoperability testing witnessed by the neutral IMS Global. IMS also provides problem resolution services for certified products - so this is about as good a guarantee as an institution can get that the integration will work!

Gating Factor #2: Test Target Implementation and Verify it Works

Once the degree to which both systems are conformant is determined, and whether or not mutually supported service calls meet Framingham State’s required functionality, thorough testing of the target implementation of LIS will be required to verify everything actually works as specified. (Framingham State has test environments for both a remotely hosted Blackboard Learn and locally hosted Banner instance to use for this purpose).

A team of people from Blackboard, Ellucian, Framingham State University and other members of the IMS Global Learning Consortium are actively engaged in taking LIS from a roadmap to an open roadway of interoperability. This first blog post will be followed by others to chronicle the journey, and (more importantly) help the many who follow by providing insights and guidance.