Learning: we know a thing or two about a thing or two (we think).

08/01/2012

The Internet has transformed almost every aspect of our lives, from how we work and play to how we connect with each other professionally and personally. The shift is on a scale rarely seen in human history, comparable to the invention of movable type, the printing press, and the ability to transmit recorded sound and images.

Based on this change, corporate learning and development (L&D) organizations need to recognize the three forces fueling the learning transformation that is happening now:

Globalization

Global access to markets and talent is reshaping the fundamentals of business operations

Demographics

Multiple generations (up to five by 2020) will be working side-by-side in organizations, creating a unique talent development challenge

Technology

Digital tools and ubiquitous connectivity are changing how, when and where people learn

To attract, develop and retain a productive workforce in the face of transformational change such as this, your HR strategy should align itself to these three forces. Developing people in today’s work world means adapting rapidly to support the business’s imperatives. To do this, the L&D organization must adapt its business model. This affects everything, including:

How learning is designed and delivered

Developing on-demand learning requirements

Reconsidering the shelf life of learning products

Creating personalized learning

This means L&D will need to redefine -- and in many instances reallocate -- budgets to invest more in informal learning structures and new learning design models, including rethinking the “traditional” definition of eLearning to consider:

Social and collaborative learning

Serious gaming

Immediate learning

Short content bursts rather than courses

Availability on multiple devices

User generated content

Professional learning roles are changing as well. Sample new skill sets for L&D include:

Being adept at using the tools of learners

Focusing more on end results and less on event management

More crowdsourcing

Sample new job roles for L&D include:

Community gardeners

Chief technologists

App developers

The audiences L&D organizations support are now:

Comfortable with weightless technologies such as mobile devices, tablets and laptops

Expecting those technologies to be available to them at work and in their learning opportunities

Valuing training and development, particularly coaching and mentoring

Members of social networking sites

Full of digital expectations, not necessarily digital skills

To support the global aspect of these changes, L&D needs to:

Provide practice to proficiency by using interactive learning experiences

Minimize the need to travel to training

Use instructors to teach context and application, not knowledge or procedural transfer

Use selection and on-boarding to increase employee speed-to-competency

Help employees integrate work and life

As we know, change can be difficult at any time, for individuals and for organizations alike, because patterns are set and some people fear the unknown. However, change can be managed appropriately and aligned to the imperatives of the business as long as the team chartered with supporting learning is proactive and prepared for what’s coming.

08/09/2011

Assessment is used to measure learning outcomes, but if you see it only as a testing tool you’re missing half its value. Assessment should not be punitive; it should uncover the learner’s strengths and help identify areas for improvement. Well-planned and integrated assessment creates opportunities for learners to reflect on their learning and apply new skills and knowledge, and it enables the business to recognize ROI on the learning intervention. Adapting existing models can take you much further than reinventing the wheel. In this post, we examine some guiding principles for effective assessment.

An excellent example of assessment as a way to support learning goals is Adobe’s Certified Associate practice tests. Learners can take a certification exam without any structured practice, but there is also an unlimited certification prep test package. The practice test closely mimics the structure of the final exam. Both combine multiple choice questions and simulations of the Adobe environment. The multiple choice questions require the learner to demonstrate knowledge of design concepts and the production process. The simulations require the learner to complete a task within the simulated software application interface. Keyboard shortcuts are disabled, but otherwise, learners can use any correct method they choose to complete the task, using menus or panels as appropriate.

Another good example is the Sun (now Oracle) Java certification paths. Each path contains a test prep “kit” that includes preparation recommendations, additional resources, a practice test, and a re-take policy. Each path is designed to prepare the learner to achieve a specific level of certification, and serves as a benchmark against industry standards. These certifications are recognized by employers and can advance a person’s career.

Accreditation or certification can be used to validate mastery of a topic, but this is not the only way that assessment can be useful. By treating assessment as an integral part of instruction, we can support the learner’s career development while measuring true performance for the business.

Factors that make an ASSESSMENT tool also a useful INSTRUCTIONAL tool are:

Authenticity

The learner performs the task in context, not recalling theory, but actually demonstrating competency.

Open ended

Because the end product is assessed, not the method used to get there, learners are able to use whatever menus or panels they choose.

Learning while doing

Learners use contextual clues and critical thinking to complete tasks. They may not know how to adjust alpha levels in Photoshop, but they may know to investigate the color panel to find them.

Self-reporting

Learners can mark questions that they’d like to return to if they have time or opportunity.

Cumulative time

The test is timed with one master clock, not with individual times for certain sections or items.

Feedback

Learners receive feedback on each item, with notes about the correct answer.

Tracking

Performance from one practice test to another is tracked.
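
To make these factors concrete, here is a minimal sketch -- in Python, with invented names and data, not drawn from any real testing product -- of a practice test that supports self-reporting, a single cumulative clock, per-item feedback, and attempt-over-attempt tracking:

```python
from dataclasses import dataclass

@dataclass
class Item:
    prompt: str
    answer: str
    feedback: str          # notes about the correct answer, shown afterward
    flagged: bool = False  # self-reporting: learner marks items to revisit

class PracticeTest:
    """Toy model of a practice test: one master clock, learner-flagged
    items, per-item feedback, and performance tracked across attempts."""

    def __init__(self, items, time_limit_s):
        self.items = items
        self.time_limit_s = time_limit_s  # one cumulative limit, not per-section timers
        self.history = []                 # score per attempt, for tracking progress

    def flag(self, index):
        # mark a question to return to if time allows
        self.items[index].flagged = True

    def grade(self, responses):
        score = sum(r == item.answer for r, item in zip(responses, self.items))
        self.history.append(score)
        notes = [item.feedback for item in self.items]  # feedback on every item
        return score, notes
```

Comparing entries in `history` from one attempt to the next is what turns this from a test into an instructional tool.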

Assessment should not occur only in a testing situation, indicating pass/fail rates; it should be integrated throughout instruction so learners know how they are doing and can learn more effectively. Don’t leave your learner half-assessed!

07/28/2011

We spend a great deal of time planning for learner success. Do we ever create the ability for learners to lose? Should we?

Our society puts a premium on “winning” -- nowadays it seems like every team gets a trophy, not just the winner. Americans are culturally bound to the ideal of winning. In the real world, however, people lose all the time: athletes lose games and tournaments, politicians lose elections. In high-stakes situations like these, losing has real consequences. How does this affect learning?

Low-stakes losing occurs every day. When we lose, we tend either to give up or to keep practicing to overcome the failure. Any casual gamer is familiar with losing and attempting to overcome the loss by quickly replaying the game. Losing, in this context, can be a big motivator, driving the will to practice, which can lead to increased skill. Learning to lose effectively is actually a skill in itself. Resiliency in the face of losing, and using strategies to improve, is an important life lesson and professional skill. If losing always equates to complete failure, learners stop striving, stop attempting creative solutions, and see themselves as incapable.

In learning design, a primary goal should be to create an atmosphere where losing is acceptable and intrinsically motivates the learner to try again. The potential learning value from this type of experience is measured not by how often the player loses, but by how much they improve through repeated practice.

In your training design, include opportunities for low-stakes losing before learners are thrust into their high-stakes, real-world situations -- be it a qualifying exam or on-the-job performance.

Recommendations for a “losing” design:

Make content available on demand so that learners can review material as needed

A common mistake is being overly controlling with content. If the goal is for learners to gain knowledge and skills, shouldn’t they have free access to the tools that can make that happen?

Communicate how the learner is performing relative to passing scores or via peer-to-peer review as appropriate

By communicating how comparatively well they are doing, learners can better sense where they can improve. This also gets their competitive juices flowing.

Allow plenty of practice opportunities before the final assessment occurs

If the only time learners can demonstrate knowledge is a final graded test, their opportunities to “lose” are reduced to one high-stakes moment. Build in low-stakes assessment opportunities that prepare learners for the one that counts.

07/21/2011

“Training” and “competition” are not usually words that are associated with each other. However, competition is an innate motivator and humans by nature enjoy winning.

Certain aspects of winning are universal to all competitive activities, including learning. A recent Newsweek article about winning provides insight into how instructional designers can create more engaging training. The author notes that winning by itself is not the most compelling impetus, but that winning while a competitor loses is more satisfying (this would seem obvious to anyone with siblings).

Rather than using a “task completion” metaphor, instructional designers should use a gaming and winning metaphor when designing training. Rote tasks can be made more engaging if instead of simply reading and reacting in a safe environment, the learner triumphs over a tension-filled activity. Similarly, you can provide competitive opportunities with other learners virtually.

Many K-12 “safe” learning environments strip out much of the excitement of competition and rating. You have an obligation to make sure every beginning learner succeeds, but you also have an obligation to groom the special talents of individuals.

How do you turn cognitive tasks into challenges?

Learning objectives can remain the same; it’s the way they are presented that changes. You don't have to completely redesign your training to make it more challenging. Consider these simple ideas:

Allow learners to be wrong

Allow incorrect answers, and penalize them. If learners can complete a course by merely clicking through content, they have little reason to engage with it. Activities that allow for “failure” can create “good tension”.

Add variable scoring

Reward learners for learning more difficult material by acknowledging that all content is not equal.

Add timed components for some activities

While not appropriate for all activities, timing does create a feeling of tension.

Allow for replay opportunities

Replay softens some of the negative consequences of the “failure” that other competitive components allow.

Rather than learning taking place within a silo, authentic learning events can bleed into the larger community.

A simple multiple choice game can be either boring or competitive with a simple design tweak:

Boring

The learner answers a question and then views the answers tagged with "correct" and "incorrect" feedback.

Competitive

The learner answers a question and receives variable points based on the correctness of the answer, the difficulty of the challenge, and the speed with which they answered. They are then shown their score relative to other learners. The feedback is also contextual and continues the gameplay.
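
As a sketch of what that variable scoring might look like -- the point values, the speed-bonus formula, and the function names here are my own invention, not a prescription:

```python
def score_answer(correct, difficulty, seconds_taken, time_limit):
    """Variable scoring: a correct answer earns base points scaled by
    item difficulty, plus a bonus for answering quickly."""
    if not correct:
        return 0
    base = 100 * difficulty                                 # harder items pay more
    speed_bonus = max(0.0, 1 - seconds_taken / time_limit)  # 1.0 means an instant answer
    return int(base * (1 + speed_bonus))

def leaderboard(scores):
    """Show each learner's score relative to the others."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Showing the leaderboard after each question, rather than only at the end, is what keeps the competitive tension running through the whole activity.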

Take some lessons from this short quiz. How many times did you feel compelled to play?

04/23/2011

During the past several months there have been low background rumblings in the land of education and training. That is the sound of the learning world discovering what Internet professionals in other vertical markets have known for years: the digital “breadcrumbs” learners leave behind -- their viewing, reading, engagement and assessment behaviors, interests and preferences -- provide massive amounts of data that can be mined to better personalize online experiences.

I have always thought that "adaptive learning design" would finally give us the holy grail of "contextual relevance" for our learners. This post illuminates what to consider when analyzing learner behavior through their data trails. Combine that with observing learner behavior and you can potentially get to that holy grail of relevance.
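
As a toy illustration of mining those breadcrumbs -- the event kinds and catalog fields here are made up for the example, not taken from any real product:

```python
from collections import Counter

def personalize(events, catalog):
    """Naive personalization from a learner's digital trail: count
    engagement events per topic, then surface catalog items on the
    topics the learner engages with most but hasn't yet passed."""
    engagement = Counter(e["topic"] for e in events)
    passed = {e["topic"] for e in events if e["kind"] == "passed_assessment"}
    candidates = [c for c in catalog if c["topic"] not in passed]
    # most-engaged, not-yet-mastered topics float to the top
    return sorted(candidates, key=lambda c: -engagement[c["topic"]])
```

Real adaptive systems weigh recency, assessment scores, and stated preferences too; the point is simply that the raw material is already in the data trail.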

03/06/2011

This month I will be participating in a panel discussion about the Potential of Augmented Reality (AR) for Education at SXSW in Austin. I'm lucky to be joined on the panel by some leading experts in the field of AR: Tish Shute, Brendan Scully, Karen Hamilton and our fellow blogger here, Enzo Silva.

Tish Shute, founder of Ugotrade, a leading AR blog, explains that AR is in a critical development stage:

"Augmented Reality is in its infancy and we are just beginning to see AR learning apps emerge for young children on consumer devices, and museums are pioneering AR in situated educational experiences. AR has a long history in military and industrial training and simulation exercises because in these areas bulky high priced equipment is not an obstacle to adoption. We are in the middle of a data revolution. Soon just about everything in our world will have a data shadow. Sensors everywhere will allow us to communicate with the world around us in totally new ways. AR is the natural interface to bring relevancy to data. It will turn arcane spreadsheets, vast unstructured data sets - the data wakes of our cyborg lives - into visible, visceral, interactive, optimizable, and actionable parts of everyday life."

Brendan Scully joins us from metaio, Karen Hamilton from George Brown College in Toronto, and Enzo Silva from Oracle. Enzo organized the panel and will be moderating.

We're hoping to have a dynamic and interactive discussion about where AR is currently, where it's going, and how organizations can begin to leverage the technology to build more authentic and immersive educational experiences.

We hope you can join us in Austin! Until then, check out this cool video from metaio on multichannel AR:

01/02/2011

Hi all, Happy New Year! Hope 2011 gives you what 2010 didn't... ;). 2010 was interesting on several fronts -- and work-wise was a year of learning and learning design for me! More on that soon.

To kick-off 2011 I've decided to make only one resolution: to make work simpler. That doesn't mean to make my work less engaging, interesting, or informative... it really means to undress it, to de-accessorize it. In that vein, it dawned on me that as an instructional designer I really only need to do three things:

Provide information before interaction. This is a traditional approach to instruction — inform the learner about the subject matter and communicate what it does (This is what the button does when you click it).

Offer feedback during interaction. This is how the object behaves so that it reinforces what it does, or what it’s doing (The button displays the help screen).

Encourage reflection after interaction. This is what the person should know after she has finished interacting.
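
The three moments around a single interaction can be sketched in a few lines -- the help-button scenario and all the strings here are hypothetical, just to show the pattern:

```python
def run_interaction(learner_input):
    """Sketch of inform-before, feedback-during, reflect-after
    around one interaction (a made-up help-button example)."""
    # 1. Inform before interaction: tell the learner what the object does.
    inform = "This button opens the help screen."
    # 2. Feedback during interaction: show how the object behaves.
    if learner_input == "click_help":
        feedback = "The help screen is displayed."
    else:
        feedback = "Nothing happened. Try the help button."
    # 3. Reflection after interaction: prompt for what was learned.
    reflect = "Where would you look the next time you need help?"
    return inform, feedback, reflect
```

Everything else -- gamification, social features, media polish -- is accessorizing around these three calls.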

By understanding how to "do" these three things, you can create more effective learning experiences. That's what we really strive for -- and there's been a lot of talk about "dressing up" what we do... build engagement, heighten emotion, ensure contribution... sure, there are valid points on all these fronts. But if you unwrap the glitz, you'll find that these three elements are the foundation of anything instructional. How you accessorize is, well... up to you and your audience.

07/15/2010

Many training organizations may think their curriculum is aligned to the company's business needs. Can you confidently say that yours is?

When you look holistically at your curriculum, does it:

Meet definable and observable performance standards?

Enable you to determine whether learners achieve the curriculum objectives?

Allow you to determine if the instructional elements effectively meet the curriculum objectives?

Do you evaluate the effectiveness of your instructional programs against the following criteria:

The instructional content teaches to the objectives specified

The time required for instructional programs is appropriately aligned to the student time available

Instructional materials and resources are easily accessible and contextual to the student's physical and virtual environments

Appropriate assessment mechanisms are included that provide performance-related remediation

Performance-related data is tracked and stored

The instructional activities are evaluated to determine effectiveness

Too often, training organizations become siloed and redundant in their efforts to train the workforce. Consistently measuring the effectiveness of your curriculum and ensuring it's aligned not only to identified business needs, but also to the organization's competencies, will help you provide a measurable and sustainable ROI to the business.

The question is: how do you intend to ensure your curriculum strategy stands up to the test of alignment and effectiveness? Consider these three actions to help you get started:

Evaluate all curriculum and instructional components annually

Leverage business intelligence software to gather usage data on your course catalog. Consider deeper analysis on courses and/or programs that are not being used or accessed frequently

Ensure that all courses fit into a program or curriculum that is tied directly to a business need

Question any request for training when its desired outcome, or its alignment to a business need or goal, can't be easily shown

Provide supplemental support services for post-course evaluation
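
A simple sketch of what that usage-data audit might look like -- the record fields and the launch threshold are illustrative, not taken from any particular BI tool:

```python
def audit_catalog(courses, min_launches=10):
    """Annual catalog audit sketch. Each course record is a dict with
    invented fields: "name", "launches", "business_need".
    Flags low-usage courses for deeper analysis, and surfaces any
    course not tied to a business need."""
    low_usage = [c["name"] for c in courses if c["launches"] < min_launches]
    unaligned = [c["name"] for c in courses if not c.get("business_need")]
    return {"review_low_usage": low_usage, "question_unaligned": unaligned}
```

Running a report like this once a year is a cheap way to find the siloed and redundant corners of a catalog before they erode ROI.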

The primary success factor resulting from a well-defined curriculum strategy is performance improvement. By focusing your efforts on an appropriate strategy, you will move one step closer to demonstrating true ROI to the business and the audiences you support.

07/08/2010

When creating online learning, the primary goal is to create a course or program that fosters "a continuing motivation to learn," which is defined by Maehr as "voluntary engagement in continuing to learn more about a given topic."

The complexity of human motivation makes it hard to find a one-solution-fits-all model for learners. However, Keller's ARCS model establishes basic criteria an instructional designer can leverage to increase the potential for learner motivation while reducing learner malaise, frustration, and dropout.

The ARCS model consists of these four components:

Attention

Relevance

Confidence

Satisfaction

Consider the following tactics for each ARCS component. However, there is no substitute for proper analysis to uncover the motivational elements specific to your audience. Without that analysis, you are likely to include too many, too few, or the wrong motivational elements.

Attention:

Gain learner attention by integrating:

High-resolution graphics

Interactivity and animation

Conflict and failure

Problem-solving and inquiry

Relevance:

Provide relevance to the learner's situation:

Describe clear goals that are consistent with learner expectations

Connect content to the learner's prior experience and/or knowledge

Include guided instruction and navigational systems (while allowing self-determination in the consumption of the content)

Incorporate "authentic" activities based on learner's job or tasks

Confidence:

Make the learner feel confident and aware of learning outcomes:

Communicate clear expectations for successful completion

Teach to the learner's "Zone of Proximal Development" -- avoid tasks and/or activities that are too easy or too hard

Include activities that require critical thinking and minimize guessing

Satisfaction:

Reinforce the learner's sense of accomplishment:

Provide opportunities to apply newly acquired knowledge and skills in meaningful ways

Offer timely recognition and positive feedback for success

Keep assessment standards and consequences consistent and fair across learners

04/21/2010

I've been following the "Apple vs. Adobe" debate about Flash for several weeks now, and have come to the conclusion that it's not really about Apple forcing its will on Adobe, as many in the media seem to be reporting. A milestone occurred yesterday when Adobe's Flash Platform product manager, Mike Chambers, blogged about Adobe giving up on creating further iPhone app development tools.

As a designer and developer of corporate training materials, I obviously have a vested interest in this debate. Why, you may ask? Well, if you scan the landscape of eLearning development, you can rest assured that a large majority of online educational material relies on Adobe's Flash platform for its delivery.

I go back all the way to the pre-Internet Authorware days... yes, Authorware 1.0, when it was Mac-only authoring software. I loved Authorware. It was designed specifically for developing computer-based training programs. As the Internet era dawned and Macromedia put its energy behind Flash, I saw the writing on the wall for Authorware. At first, it was great... Macromedia kept rolling out new versions and even put a lot of effort into the Authorware web player. But the dark days eventually came, and I knew what was happening when friends at Macromedia lamented that no new Authorware engineers were being hired... and only one was left at that time!

As Flash emerged, the learning developer community embraced it and began retrofitting it for educational interactivity. I never completely transitioned from the flowline to the timeline. I still remember Robert Milton's explanation of the flowline in Authorware training, and how dang logical it was. For me, the timeline was anything BUT logical.

When Flash became the de facto standard for learning development, chaos ensued:

Constant player updates adversely affected courses designed in earlier versions of Flash

Just TRY to get Flash to communicate with any LMS without having a programming wizard (if you could find one). I used to beg Andrew Chemey (a Macromedia engineer) to help, and he could always make it work, but he could never explain how.

Throw out the idea of modular development, which was one of Authorware's greatest strengths.

Expect a new or heavily modified scripting language with every major update

And the list can go on

All this for a "light web player" -- people complained about the 3-6 MB Authorware web player (and this was pre-broadband). Granted, Flash has made important progress as it relates to learning development, but this whole "Apple/Adobe" conversation got me thinking about the state of tools in eLearning design and development.

Flash the parent, and all its unruly children such as Articulate, Captivate, Camtasia... all the tools that produce .swf files... are essential for various tasks in eLearning development. Adobe seems to be whining that Apple is being a big bully about the Flash platform. I think it's actually the opposite: Adobe has been the bully over the last several years.

Apple is basically saying, "we don't want to have to adopt or promote a company's proprietary technology on our platform when it's not consistent with the technologies that serve as the foundation for our platform." Flash is anything but open... it's locked technology that requires software available only from the vendor. It's not open source, or even open, for that matter.

I remember the days before the Flash player (remember DHTML?). I remember coding HTML to work in various versions of browsers, and it wasn't pretty. Flash made life a whole lot easier on several fronts because it created its own "walled garden" that was guaranteed to operate consistently across multiple operating systems and browsers. I fear HTML5 will bring back the whole issue of compatibility and interoperability.

I also know from first-hand experience that creating a software product with the Flash platform is fraught with potential issues because any potential change in player versions could cripple features/functionality. And try getting updates from Adobe on changes, or roadmaps on functionality.

I tend to side more with Apple on this matter, only because they're really not bullying Adobe -- they're just saying: look, we have our "walled garden" and we've decided which technologies best support it. You have your proprietary technology; it may be a standard among a specific group of developers, but it's not an open standard for the Internet.

What does this mean for learning developers? Just as we were all forced to move away from Authorware several years ago, we need to keep an eye on where the industry is going. I tend to think more open technologies will always win at the end of the day. But that must be balanced against how we get our jobs done. Tools like Flash do let developers work efficiently. Change is guaranteed, and learning new tools and languages will always be part of the job.