Recently in Teaching Category

The MLA (Modern Language Association) has an interactive language map of language communities in the U.S. based on the 2000 Census data (with updates from 2005) at:
http://www.mla.org/resources/census_main

In addition to the basics, you can find information on language communities by state, county and even zip code. If you really want to check it out, I recommend viewing data from the Los Angeles area. It's probably as linguistically diverse as New York.

As a fun class exercise, I just took the basic U.S. map showing concentrations of non-English speakers (bluer = higher percentage of non-English speakers) and asked students to guess which language communities were being represented. Another fun exercise would be to have people look up the third most-spoken languages in different regions. Overall in the U.S., the third most-spoken language is Chinese, but in Pennsylvania it's German (and Tagalog (Philippines) in California).

P.S. I should note that today the map is hanging when collecting data, but Internet speeds have been slow in general...hopefully it's a temporary glitch. If the map isn't working, you can retrieve the raw data by clicking "Tabular View".

I've been following an interesting discussion on whether a "conversational" approach should be used for Latin or not.

For modern language courses, the behavioral objectives are fairly obvious. After 2-3 semesters of a language, you want to be able to walk into a cafe or bar, read the menu and order the beverage you want (or figure out how to get the train to Marseilles, or get the latest scoop from ¡Hola! magazine). That is, you want a certain level of listening, reading and speaking proficiency, with enough writing thrown in to fill out an application or compose a quick thank-you note.

These days a conversational approach is advocated so that students learn to communicate in the target language "on their feet". Exposure to native language speech input is also recommended whenever possible so that learners can parse audio.

With classical languages like Latin and Greek, the objectives may be different. For instance, Attic Greek (i.e. the language Sophocles spoke) is what you need to read the original Ancient Greek literature. If you're in Greece, Attic Greek is helpful for reading street signs and monument inscriptions. But if you want to order some ouzo in Athens, you probably need to learn Modern Greek. That is, learning classical languages is usually about being able to read in the target language - not being able to speak it.

Can the conversational approach help here? Interestingly, many of the Latin teachers said they DID advocate the conversational approach. Apparently learning Latin without using it conversationally was a little too "abstract" for students. I'll admit that my Latin teacher burned the supine into my brain with "correctives" like horribile dictu (or "Ugh! Horrible Latin!"). Interestingly, Latin has taken on a life of its own as a living language community. You can even get your news (nuntii) in Latin. Clearly, there's something to this.

It should be noted that traditional Latin pedagogy focused more on grammar and translation. The idea was that if you understood in detail how a Latin phrase or sentence was built, then you would be able to read Latin by "deconstructing" the combination of words and grammatical endings. In practice, though, I would say the result is that many students can recite a lot of paradigms but end up having trouble reading actual texts from Cicero.

But...even with the conversational approach, I wonder if you hit a wall. I'm glad we have "modernized" Latin, but it can't be the same as what Cicero wrote. It's a form of Latin spoken mostly by speakers of modern European languages - none of which much resemble Latin anymore. Even modern Italian has very different syntax than Classical Latin.

What I found was that even with "conversation" and "grammar", I had great trouble parsing Cicero - I could translate the words, but couldn't string them together so that they meant anything. There's a certain pragmatic logic in Latin that is lost in literal translation. After all, Cui bono doesn't literally mean "Who benefits?" but "Good for whom?"

I would say that I didn't truly understand how to learn Classical languages until I took Middle Welsh. Although we did learn some grammar, the focus wasn't being able to speak or even translate anything. Instead we just picked up an actual text with a glossary in the back and plowed through. I took notes in the text, but the book was so small that I learned to translate only the key vocabulary words I didn't get. The more "simple" words I could memorize, the faster the reading went. In other words, I was learning to read the syntax directly. I felt a slight pang that I would not be able to order a mead in Middle Welsh, but then again, that's not really possible anyway.

Another benefit to the "learn as you read" approach is that you are less likely to be thrown off by minor inconsistencies. Many medieval languages were "flexible" in terms of grammar and spelling - it really is more important that you be able to recognize a potential irregular past tense than that you know exactly what it is.

When I thought about it, I realized this is probably the best approach - after all you are trying to read the language, and sometimes you may need to read an undiscovered document which may contain new verb forms as well as previously unattested vocabulary. Sometimes reading ancient texts is a decoding exercise.

In the end, it's about the reading, and neither the speaking nor the translation. There's just one remaining problem - by the time I got to Middle Welsh, I had Modern Welsh under my belt. If you're starting from scratch, it really can be an interesting chicken-and-egg challenge.

Since linguistics invokes mathematical formalism (i.e. phrase trees, feature bundles, rules or tableaux, etc.), I am interested in some aspects of how math is taught.

One question that comes up a lot is why it is important for all students to learn algebra or trigonometry if only a small minority will ever use these tools in daily life. The standard answer is that algebra teaches you "mathematical thinking," but I'm pretty sure most students (especially those who hate math) miss the point. Actually, I would say that if you want to learn "deductive" skills, you're better off taking formal logic or rhetoric.

However, there is one aspect of algebra that is important in real life but rarely pointed out, and that's its ability to provide multiple representations for "the same thing". For instance, the concept of "1" can be represented as "1", 4/4 (four-fourths), x⁰, |i²| and my personal favorite - 0.999999... And believe me, I haven't even touched the tip of the iceberg. Although these formulations all represent the same quantity, they do not have quite the same meaning.

You normally use "1" in real life, but if you're working on a weird property issue where a piece of land is divided into quarters, maybe the formulation "4/4" would have meaning. Or maybe you have a formula in which you raise x to a certain power - whatever it is. It's just that when the power is zero, the result is 1.
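To make the point concrete, here is a small Python sketch (my own illustration, not from the original post) showing that several of these formulations evaluate to the same quantity even though each arrives there by a different route:

```python
from fractions import Fraction

one = 1                        # the everyday integer
four_fourths = Fraction(4, 4)  # "four-fourths" - normalizes to the rational 1
x = 7                          # any nonzero base works here
x_to_zero = x ** 0             # x^0 is 1 whenever x != 0
i_squared = abs(1j ** 2)       # |i^2| = |-1| = 1.0, via the complex plane

# All four agree on the quantity, though each "spells" it differently
assert one == four_fourths == x_to_zero == i_squared
```

(The repeating decimal 0.999... is the one member of the list that a finite float can only approximate, which is itself a nice illustration that the representations are not interchangeable.)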

My point isn't just that the "same" item can have multiple representations but that the different representations can be selected to help you focus on a different aspect. To borrow a concept from Semantics class, the meaning of something is partly fixed by your context - but you have to know EXACTLY what your context is.

The use of multiple representations does extend beyond algebra (and I don't just mean linguistics either). For instance, there are lots of places around the world which have multiple place names, and sometimes you select one based on what era you are studying.

For instance, modern historians may be studying "Turkey", but historians of the 14th to early 20th century may be studying the heartland of the "Ottoman Empire", while those who specialize in the Bronze Age probably study "Anatolia" and Roman historians are probably studying "Asia Minor." It's roughly the same place, but the different names not only establish the time context, but can be used to fudge minor details like changing political borders.

You don't want to start calling modern Turkey "Anatolia", but the use of the term "Anatolia" is useful for referencing the set of Bronze Age cultures in the region (none of which are now related to the modern Turkish culture in terms of language or religion)...so you don't usually call ancient Anatolia "Ancient Turkey" either (unless you're writing a tourist brochure). And no matter what - you never want to confuse Turkey with Turkestan (not cool).
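As a toy illustration (my own sketch, with era labels simplified from the examples above), the name-by-context idea really is just a lookup keyed on the period you are writing about:

```python
# One region, several context-dependent names (pairs taken from the post)
NAMES_BY_ERA = {
    "Bronze Age": "Anatolia",
    "Roman": "Asia Minor",
    "Ottoman": "Ottoman Empire",
    "modern": "Turkey",
}

def place_name(era: str) -> str:
    """Pick the name appropriate to the historical context."""
    # Default to the modern name for eras not listed
    return NAMES_BY_ERA.get(era, "Turkey")

print(place_name("Roman"))  # Asia Minor
```

The interesting part is not the lookup itself but the fact that the key is external context, exactly as with the algebraic representations of "1".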

This kind of mathematical thinking isn't about accepting one "right answer," but systematically determining what the possible answers are and when to deploy them while understanding that some answers are just plain wrong!

I'm teaching phonology again, and once again, I am contemplating the issue of which phonetic system to use. It seems like a trivial issue, but it actually gets into some tricky issues.

One of the trickier issues is transcribing the "y" sound of "yes".

FROM Y TO J

Although linguists generally stick to IPA, there is a close variant called American Transcription. Usually, I just teach both (and accept both for credit), but I do like to stick to one variant in my lecture notes if at all possible.

The "y" sound is /y/ in American, but /j/ in IPA (following German spelling convention). In the past, I used /y/ in order not to confuse my students who are generally familiar with English/French/Spanish - all of which spell this sound as "y".

American - yes = /yɛs/, boy = /boy/
IPA - yes = /jɛs/, boy = /boj/

This time I have changed my mind and have moved to /j/. One reason is that all other Penn State classes use IPA. Another is that even Wikipedia uses /j/. At this point, I'm starting to look a little "backwards" for sticking with American /y/ instead of the more continental /j/.

NOT A COMPLETE SWITCH

But I haven't made a complete switch... Following Kenstowicz's 1994 textbook Generative Phonology, I still prefer American for some sounds. Some because they show phonological relations more clearly (per Kenstowicz). Others because, quite frankly, it's easier to crank them out on a keyboard.

Some examples:
* I use American /ñ/ (as in señor) instead of IPA /ɲ/ for the palatal nasal.

This is because a) it's easy to type /ñ/ (especially on a Mac), b) American students are familiar with the Spanish sound and c) there are too many n's with tails in IPA (ŋ ɲ ɳ). At small point sizes, I think it's easier to spot ñ.

* I use American /ṭ,ḍ,ṇ/ instead of IPA /ʈ,ɖ,ɳ/ for retroflexes.

This one is because a) almost all scholars of language of India (the prime retroflex languages) use the dot underneath b) I can generate these on the Mac extended keyboard and c) I still don't like that IPA retroflex tail visually.

* I use American /ü,ö/ instead of IPA /y,ø/ for front rounded vowels.

Because a) German spelling uses umlauts and b) it signals "front rounded". It also means I never have to use /y/ in transcriptions - avoiding the whole "What does /y/ mean?" issue.

* I use American /č,ǰ/ instead of IPA /tʃ,dʒ/ for alveolo-palatal affricates

The reason for this one is that even though affricates are technically "two sounds", they are generally treated as a special kind of stop in most languages. Interestingly, Indic scripts all treat these two sounds as one letter, as does Arabic (and English "j" and Italian "c").

Just to be weird though, I use IPA /ʃ,ʒ/ instead of /š,ž/. This was clearer to many students for some reason, and they are distinct in shape. These IPA symbols are also very common in French linguistics.
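For the record, the substitutions above can be collected into a small mapping. This Python sketch is my own (the function name is invented); a character-by-character rewrite is safe here precisely because, as noted, /y/ never appears as a vowel in this hybrid system:

```python
# American-transcription symbols and their IPA counterparts, per the list above
AMERICAN_TO_IPA = {
    "y": "j",                      # the glide of "yes"
    "ñ": "ɲ",                      # palatal nasal
    "ṭ": "ʈ", "ḍ": "ɖ", "ṇ": "ɳ",  # retroflexes
    "ü": "y", "ö": "ø",            # front rounded vowels
    "č": "tʃ", "ǰ": "dʒ",          # affricates (one symbol -> two in IPA)
}

def to_ipa(transcription: str) -> str:
    """Rewrite an American-style transcription in IPA, symbol by symbol."""
    return "".join(AMERICAN_TO_IPA.get(ch, ch) for ch in transcription)

print(to_ipa("yɛs"))  # jɛs
print(to_ipa("boy"))  # boj
```

Note that /ʃ,ʒ/ need no entry, since I already use the IPA symbols for those.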

I won't claim that this is a perfect solution - after all IPA is becoming more of a universal standard these days than when I was learning linguistics. If nothing else though, I do like to mention the alternates because both were in use for a long time.

A linguist (even me) has to learn to make adjustments for different linguistic documents.

I am one of those cranky people who see new social technologies like Twitter and MySpace and ask "Why do I want to tell this stuff to strangers?" or "Why should I care what some guy in Denver is doing Friday night?"

But I am intrigued by the FICTIONAL incarnations of these tools, where people assume virtual identities based on known historical or fictional figures and then do their blogs, Twitter and MySpace profiles.

Many classes like history and business have latched on to role play, but these technologies take it to a whole new level.

Thus far my favorite examples have been:

Second Life Gold Rush Game - Why shop in a virtual world when you can earn money in 1849 California? A real simulation from a real course.

Geoffrey Chaucer Hath a Blog - Both a way to practice "modern" Middle English (with French & Latin phrases) and a comparison of 14th century vs 21st century living.

Silliness aside, there is a chance here for students to explore the messiness of politics and daily life from previous generations in a way that makes it more real. Thomas Paine had his pamphlets, but I'm pretty sure he'd be a prolific blogger today.

The H-Teach List is having a really in-depth discussion on plagiarism. The general consensus is that US plagiarists generally know they are being naughty, but Doug Deal asked an interesting question about most textbooks:

Textbooks have a lot of written material and lists of "suggested readings," but as far as I know, they don't usually have footnotes or endnotes or works cited. They don't cite specific sources for the information or the analyses they contain. Why not?

The lecture presents an oral version of the same problem. We undoubtedly could cite sources for some of what we say in every lecture, but typically we don't, or at least we don't do so scrupulously. Is that okay too?

That is, many of our lecture PowerPoints for the classroom tend to present facts "as-is" and not delve too deeply into where we got our information. To give some people credit, some lectures and textbooks do include citations (and I try to squeeze them in), but it's rarely a key feature in the information most people see.

By the way, it's not just the classroom. Most popular non-fiction, news stories and informational Web sites hide or eliminate citations. Why does it happen? Basically to simplify presentation for the audience. In a teaching situation, it may be the case that students don't care where you got your Welsh data from...just that they have to memorize it.

Another problem many instructors/news providers may encounter is that many younger or less-advanced students usually don't want to hear "maybe this or maybe that". I know I didn't want to hear it when I took one of these classes as a junior. In the beginning, students usually want to hear one method/story and be done with it.

For instance, if you ask "how many sounds does English have?", I bet you don't want to hear the linguist answer "It depends on the dialect..." (even though it does).

So...how do we train students to care about citations? We can use the traditional stick method (it's the one I mostly use). Problem Based Methodology would say that practicing research would teach students the importance of citations. That would probably help.

Here are some things that have taught me to love citations:

Sometimes I need to look a data point back up (usually an Old Irish verb form, in my case). Citations really narrow down the search process.

The stories I hear about people trying to figure out where different ancient authors really got their information make me appreciate citations more. When reading ancient travelogues with crazy third-hand stories, you really do wish they had included a citation somewhere.

And it does help to have other sources to back up your kooky idea. It's not just YOUR kooky idea, it's just a minor extension of [citation here].

Finally, I like to look up other people's data points (usually Fula verb forms) to see if any strategic editing has been done. Are ALL the data points included? Is ALL the text quoted? The answer is maybe not. In fact I'm downright paranoid about using other linguists' data...I usually prefer to go straight to the original non-linguistic grammar or text.

It's this last point that made me really understand the importance of looking up the original source and how important an honest citation is. It's only when you can look at and TRACK several sources that you can begin to filter out unconscious bias.

Every now and again it's nice to check how non-US cultures have handled education. A great example is the Chinese Civil Service, or Keju, exam. Lasting from the 7th century AD to the last dynasty in the early 20th century, this was an exam that determined whether a person was academically qualified enough to become a well-paid government administrator/bureaucrat. Today we have counterparts like the SAT (how good a college you can go to), the MCAT (how good a medical school) and the LSAT (law school).

On the positive side, exams like these theoretically allow anyone with the resources and the ability to learn enough material to pass to become credentialed. But Dr. Hoi Suen of Penn State feels that this kind of high-stakes exam will inevitably lead to problems.

"With approximately 1,300 years of history and extensive official and unofficial records that were kept throughout this period, China’s is the only examination system that can provide us with a glimpse of what might be some long-lasting chronic problems of high-stakes, large-scale testing programs as well as of the efficacy of attempts to remove unintended negative consequences" — Suen (2006)

The Problems

So what were they? Pretty much what you get with the SAT/MCAT:

Keju test handbooks similar to the Kaplan handbooks

Violent or suicidal behavior from students who either failed the exam or were studying for the exam

Massive cheating in the form of

students memorizing entire essays and poems to copy down later

bribing test proctors and graders - sometimes graders looked for "key phrases" planted in essays so they would know whom to score highly

Baton Effect

In addition to these issues, Suen also found something called the baton effect - which basically says that Chinese society focused only on learning the material on the test, which in this case meant literature and poetry at the expense of medicine or technology. The baton effect shows that high-stakes testing can actually influence what people learn.

Think of how many people today complain that we don't study literature and poetry enough because US students are too concerned with studying "for-profit" fields like accounting, medicine or the law.

So the problem may not be that testing inhibits education, but that it can be TOO influential. Ideally you want a test that matches what your society really needs, and yes, there are plenty of diverging views out there.

Any Solutions?

Of course, China and the U.S. are not the only high-stakes testers out there. Most countries today have some sort of high-stakes test for university admissions and some used to have them for high school. And I think it's safe to say that once a test becomes high stakes enough to count, you will get the cheating and the destructive psychological behaviors described. I've heard great academic dishonesty stories from some of my non-US colleagues.

But what are the alternatives? Traditionally the alternative has been "the old boys' network" or whatever variant you have in mind. I need to hire someone for a task and I check in with my social network to see what "qualified" applicants are out there. When the population is small, this might actually be the best solution.

But once your population gets too large, caste thinking tends to set in. Anyone not born into the right circles would somehow have to find a patron (possible, but not easy).

Is there a third way that's more equitable? Maybe the ultimate solution is just to open more pathways to success. We're not all meant to be doctors or lawyers or government workers, so why should we all be trying? Wouldn't a system that rewarded something other than academic performance be interesting?

For instance, a highly-skilled welder may actually be well-paid and know quite a bit about metallurgy, gas chemistry and structural engineering. Welders may even need to receive continuous training to keep up with the latest techniques...Some welders even wield their torches to become metal craftsmen (and their art may command high prices).

But how many professors or lawyers want their children to grow up to become welders? (Actually see P.S. 2 for my reasons why not).

Class and High Stakes Exams

All of this speculation leads me to think that the biggest reason for the problems encountered by the Keju and the SAT/LSAT/MCAT is that they are entry points for people to gain or maintain a relatively higher social status. Hence there is much more competition around them (as well as a very strong desire to circumvent the system).

There are actually lots of other high-stakes certification exams, like the CPP (Certified Payroll Professional) and ones for welding, yet I don't think the issues of academic dishonesty are quite as prevalent. They're challenging, but not as many people take them.

Postscripts

P.S. 1 - An interesting development recently is that "chef" has become a much more glamorous profession thanks to the rise of cooking channels. One person admitted that he was glad to have found cooking...because he really hated school.

P.S. 2 - Another annoying quasi-universal is that societies often assign the highest social class to those who do the least work. Cognitive labor is always above manual labor, and no labor at all can be the best. I think part of it is that manual labor can be a bit dangerous (welding accidents are more likely than attorney accidents). And cognitive labor is cleaner, which counts for a lot in our subconscious mammal brains... Still, many societies have missed out on key technical innovations because the philosophers "didn't want to get their hands dirty" doing actual experiments.

Learning has been defined as an alteration in long-term memory. If nothing has altered in long-term memory, nothing has been learned.

Specifically, cognitive load theory says that knowledge (I assume facts/procedures in this case) is stored with some organizational structure attached (possibly all hierarchical schemas, or schemas plus other structure). In contrast, new information has to be filtered through working memory (a type of short-term memory). Sweller proposes that when we receive information, working memory will try to retrieve a schema from long-term memory to reduce cognitive load (I would say you try to recognize first, then learn new information). Interestingly, he proposes that if the learner's memory can't retrieve a schema, the learner may first try to see if another person has one available (an instructor, a peer or the textbook).

This fits the social aspect of learning, but it slightly contradicts the constructivist approach, in that constructivism does not really assume the learner is looking for a close match in internal content organization. Constructivists assume the learner is constructing everything from scratch. On the other hand, if the learner already has a schema in place, it is easier to process new additions (the more you know, the easier it may be to learn more). Then again, CHANGING a schema to fit new information can be pretty tricky.

Some non-intuitive predictions I found interesting:

Redundant/duplicate information adds cognitive load - because you have to process ALL the material before you can determine it was duplicate. Instructional designers sometimes advocate showing the same information multiple ways to help different learner types, but you can get into overkill territory if you're not careful (been there, done that)

Worked examples are critical - Sweller cites research that learners may need to see fully worked example problems to best learn a technique. Asking learners to "recreate" a technique from scratch may not be as productive. On the other hand, you do have to help learners transition to solving their own problems. Although Sweller does not address the creative arts, it is interesting to note that art is usually taught by showing many examples of how a "design problem" is solved.

Experts actually store a lot of "factoids" - But Sweller contends that experts index factoids in such a way that they can recall the correct one for the problem they're currently solving. It's different from being able to recite a random list of trivia. But you still need to get the factoids in there at some point...

I think this theory is on the right track, but there are a few valid criticisms I think a constructivist could make:

There is no overt role for motivation or emotion - I generally feel that motivation is something that either enhances or interferes with the ability of learners to place content in long-term memory. A complete model should take this factor into account somehow, and I actually think a model like this could easily accommodate affective factors as influences on memory storage.

Assumes all hierarchical schemas - Actually, I think this is what EXPERTS store (and only "left-brained" knowledge). Novices may be storing facts as "unstructured lists" and need help sorting what they know into appropriately structured schemas. Still, some creative processes involve a "subconscious" or "right-brained" mulling of the problem, with strange tangents, that is not well understood.

Cognitive load theory may be more math/science geared - That is, his focus is on learning a set body of facts and procedures. He does not really address issues like creativity in the arts or multiple points of view in sociology. On the other hand, even policy studies rely on being able to interpret facts and figures.

Does not acknowledge cultural differences - Even if people are born with the same brain, they don't get the same upbringing. Conflicts between home culture and academic culture can interfere with learning (because of affective issues), and acknowledging cultural differences can enhance opportunities for learning (especially for the instructor). I suspect Sweller would NOT believe different cultures store knowledge with different mechanisms. Different cultures may have different schemas (e.g. the tropics subdivide fruits into "hot" and "cold" varieties for various reasons), but they're still schemas!

Does not acknowledge "inborn" learner differences - On the other hand, some people may wonder if such a thing exists!

As my instructional design colleagues already know, I moved straight from theoretical linguistics to Web design, then on to instructional design. I don’t recommend this for everyone because I will be the first to admit I was weak on pedagogical theory. In fact, I had to “construct” my own meaning of “constructivism” and it was full of “cognitive dissonance” (I thought concepts contradicted each other). I’m still not sure I have it right, which is why I’m still a “linguist among constructivists.” Here’s why:

This mirrors the Chomskyan language acquisition model, which assumes that children acquire language by listening to adults (not through overt instruction, by the way, but by children processing the raw signal and concocting their own grammar).

So far so good, but the pesky linguist in me immediately asked exactly WHAT kind of structure a learner is constructing. Does it have parts? Do they come in more than one type depending on complexity level (following Bloom's taxonomy?) or modality (verbal vs. kinesthetic?)

After all, linguists divide language into components like phonology (sound), morphology (word structure), syntax (sentence structure), semantics (literal meaning) and pragmatics (actual meaning). There must be even MORE components for something like critical thinking or algebra.

Yet most typical sources on constructivism do not really specify this at all (although I do see references to concept maps and schemas). The closest answer I’ve gotten on the constructivist road is that it’s “very complex and counterintuitive” [http://chiron.valdosta.edu/whuitt/col/cogsys/construct.html]... I’m sure that’s true.

I finally realized after a while that most constructivists assume a “holistic” model in which defining the parts is not necessarily critical. Theoretically, if the child is in the correct learning environment, then the right structure will be built.

At this point, I will have to say that the insight that the parts add up IS important. Learning does involve a complex interaction of perception, cultural biases, physical health, previous mental structures and motivation (conation). Mess any one of these up and the learner will more than likely have problems.

But in the end, I can’t abandon the idea of defining components of cognition and learning. After all, HOW do we define the optimal environment if we don’t understand all the components of the environment? Which strategies can we deploy to maximize the functioning of each component in the learner? And if we assume learner differences, what are they exactly?

Some say the actual cognitive model might be “too complex” to work with at this time, but if the meteorologists can sort through a complex mix of climatological data (carbon emissions, sun spots, humidity levels, season, wind flow, volcanic emissions) to make a weather forecast... I have faith that we can do the same. Meteorologists keep refining their models, and so can we. I think it’s important to try.

P.S. What do I think is being constructed in a learner? My best guess is that the learner makes a change somewhere in long term memory and that it varies depending on the content. Choices include semantic memory (facts), procedural (how-tos) and autobiographical (single events). Of course, this also has to go through the perception channels to short term memory to some sort of internal processing. And I don’t necessarily understand how memory chunks are stored and organized.

Because my working environment is a strong advocate of team learning, I have been experimenting with group activities...some of which actually work. Looking back through my notes, I think the ones that generate the most excitement (and hence talking) from the students are ones which build on something they already know well (like Thanksgiving).

For reasons I discuss below, most focus on sociolinguistics instead of topics like phonetics or syntax which tend to require more formal "mathematical" machinery.

Thanksgiving Ethnography

I asked students to get in groups and compare Thanksgiving traditions based on several dimensions such as patterns in dinner table conversation (topic, formality, interruptions) and treatment of older relatives (as well as side dishes). It's a chance for students to see cultural differences in linguistic behavior.

"New Jerseyite" vs. "New Jerseyan"

I first asked students to look up "New Jerseyite" and "New Jerseyan" on Google to determine which form was "correct". Since students used different strategies, I asked them to meet in groups to discuss how they approached the question, then we did a summary. In this activity we generally have a good discussion of "prescriptive" vs. "descriptive" grammar, since the dictionary mandates "New Jerseyite" but native New Jerseyans universally reject the word.

Color Chart

I split the group into men and women and asked each group to assign labels to a color wheel with 12 colors. Unlike the stereotype that "women know more color words", both men and women did equally well in this case. My point here was that individuals can diverge from "group" norms in their behavior.

Investigating Missing Fudge

In an online discussion forum, I found that I got the most passionate answers when students were asked to discuss how they would ask roommates about missing fudge. Answers ranged from the expected indirect questioning to outright accusations (but only if they knew the person well).

Why I think they worked

I think students found these the most exciting because they were asked to analyze a "common" situation in a brand new way. Not only did students see connections with the content, but scaffolding was built in. Students were able to take old concepts and critically think about them. With newly learned data, critical thinking seems to be harder.

I have done semi-successful group activities based on new data, but the excitement is not the same, and you do often see students who don't participate because they feel lost.

One of the more interesting cases was when I asked students to solve a morphology problem in pairs (think algebra but with letters). Unlike other activities, students immediately fell silent and worked on the problem individually instead of talking to each other. In this case, their instinct was to work it out "on their own".

When one instructor told me that she dropped discussion in her Gen Ed class, I wasn't surprised that her reason was "the students didn't know enough." At the Gen Ed stage, students may still be stuck at the low-level knowledge-and-facts stage and not yet able to advance to a higher level on content alone. A connection to something they already knew from daily life may have been needed.

As for critical thinking outside "daily life", maybe that IS something that needs to wait a semester (or at least a few weeks). There may be a time issue involved in moving from level to level.

I'm always amused when a paper talks about the benefits of class discussion and the example comes from a graduate level class. If they've made it to graduate school, we can be sure they've mastered most of the lower level content already!


About Me

I am a former linguistics Ph.D. (Celtic languages) turned instructional designer and part-time linguistics instructor. I am especially interested in monitoring developments in historical linguistics, morphology and phonology.