Robert Talbert, one of my colleagues here at GVSU, has been using Python in teaching Discrete Mathematics for Computer Science for years; but I kept playing around with other systems (Haskell or Java) without ever really ‘tasting’ Python. Then, midway through last semester (i.e. just a few months ago), when I was about to tell my students what system they should use to do some work in this introductory course on Discrete Mathematics, I went through the Python tutorial — and I was hooked. Python provided pretty much precisely what I wanted!

I went down the hallway to Robert’s office, and I said:
“Why didn’t you tell me how wonderful Python is?”
He replied:
“I wanted you to find out for yourself.”
(Incidentally, can you tell from this reply of his that Robert is a mathematician?)

Here’s what I’m doing with Python this semester:

Of course it’s necessary to start with teaching students the basics of Python since our CS curriculum here at GVSU does not yet (;-) use Python in our beginning Computer Science course. I start presenting Python during the second week of the course (after covering translation between decimal, binary, and hexadecimal numbers etc. during the first week). And teaching Python is simple, considering how easy it is to start using it: you can just start its interpreter and type expressions which get evaluated. How easy is that?! (The contrast between interpreted and compiled languages is classic.) None of my students have any trouble starting to use it. For an initial homework exercise, I ask them to try some of the actions specified in the Python tutorial, after I show them such actions myself during lecture. And actually what I show them first in lecture is how Python can do this course's entire first week's assignment, translating between decimal and binary and hexadecimal etc., by typing just a few lines of text in Python:
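For instance, a minimal sketch of that demo (my own reconstruction; the exact transcript I use in lecture isn't shown here) using the built-ins bin(), hex(), and int() with a base argument:

```python
# Translating among decimal, binary, and hexadecimal at the interpreter:
print(bin(2015))              # 0b11111011111
print(hex(2015))              # 0x7df
print(int('11111011111', 2))  # 2015
print(int('7df', 16))         # 2015
print(int('ff', 16), 0b101)   # 255 5
```

Students can type these lines one at a time at the `>>>` prompt and see each result immediately.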

● An integer number is said to be a perfect number if its factors, including 1 (but not the number itself), sum to the number. For example, 6 is a perfect number, because 6 = 1 + 2 + 3, and 8 is not a perfect number, because 8 ≠ 1 + 2 + 4. Write a function perfect(n) that determines whether parameter n is a perfect number.
[Acknowledgement: this exercise is derived from Deitel & Deitel.]

(While I don't present map() or lambda with the initial material here, I do ask the students to try, for example, list(filter(perfect, range(9999))).)
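One way perfect(n) might be written (a sketch, not necessarily the solution I show in class):

```python
def perfect(n):
    """Return True if n equals the sum of its proper divisors (including 1)."""
    if n < 2:
        return False
    return n == sum(d for d in range(1, n) if n % d == 0)

# The perfect numbers below 10000:
print([n for n in range(1, 10000) if perfect(n)])  # [6, 28, 496, 8128]
```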

The total number of points for this Homework Assignment #2 is 24, and my students' median score is 24 and their mean is 22.89. Looks like Python is easy to learn!

Next, like any general-purpose programming language, Python provides functions for mapping characters to their numeric codes and vice versa: ord(c) and chr(n). I ask students to use these to do Caesar ciphering and deciphering. Since we continue to work in the interpreter (this course’s purpose is Discrete Math, not really programming per se), managing input is a non-issue, so students can do this work much more easily than they could in Java (which I had students use in the past).
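A sketch of the kind of solution students produce (assuming uppercase messages; the function name caesar is my own choice):

```python
def caesar(text, shift):
    """Shift each uppercase letter by `shift` positions, wrapping A-Z."""
    return ''.join(chr((ord(c) - ord('A') + shift) % 26 + ord('A'))
                   if c.isupper() else c
                   for c in text)

msg = caesar('ATTACK AT DAWN', 3)
print(msg)               # DWWDFN DW GDZQ
print(caesar(msg, -3))   # ATTACK AT DAWN
```

Deciphering is just ciphering with the negated shift, which makes for a nice in-class observation.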

Above, I showed bin(n). Students can use it when I ask them to learn operations on binary numbers: addition, multiplication, left shift, bitwise AND, etc. Of course Python provides all the standard bitwise operators. And whereas Java’s Integer.toBinaryString(n) may appear to handle negative integers better than Python’s bin(n) (the difference arises because Python’s integers aren’t limited in size), well, we’re Computer Science professors and Python is a programming language:
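A sketch of the kind of demonstration I mean (the 32-bit mask is my illustrative choice):

```python
# Bitwise operators work directly on Python's unbounded ints:
print(bin(0b1010 & 0b0110))   # 0b10
print(bin(0b1010 | 0b0110))   # 0b1110
print(bin(0b1010 ^ 0b0110))   # 0b1100
print(bin(1 << 4))            # 0b10000

# bin(-5) prints '-0b101', since Python ints aren't fixed-width.
# Masking to 32 bits recovers the two's-complement view that
# Java's Integer.toBinaryString(-5) shows:
print(bin(-5 & 0xFFFFFFFF))   # 0b11111111111111111111111111111011
```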

I don't use Python for this course's next topics: boolean algebra, propositional calculus, predicate logic, and introduction to writing proofs — these topics appear to me to require skills that are orthogonal to mere computation. (Actually I do use the C programming language to show students that true + true is true, etc., and that bool values are output as 0 and 1.) A few years ago I was actually the instructor of record for an undergraduate Honors College student's Senior Project which was a Python program that performed theorem proving. But I don't have my students use that Python program: to learn theorem proving for the Computer Science curriculum, I think students need to 'get their hands dirty' more, by trying to construct proofs themselves. For reference, consider learning outcomes specified in "Computer Science Curricula 2013":

Of course, it’s necessary to handle infinite sets carefully, as in Haskell (with e.g. [0..]): avoid, for example, any attempt to display such a set in its entirety, or to test for a value the iterator has already ‘passed’ (such as a negative number for ℕ). Nonetheless, I think this is cool.
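Here is a minimal sketch of that idea with itertools (the particular predicates are my illustrative choices):

```python
from itertools import count, islice, takewhile

N = count()                     # 0, 1, 2, ... : a stand-in for the naturals
print(list(islice(N, 5)))       # [0, 1, 2, 3, 4]

# Like Haskell's takeWhile over [0..]: safe, because the predicate
# eventually becomes false.
evens_below_20 = list(takewhile(lambda n: n < 20, (2*k for k in count())))
print(evens_below_20)           # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

Asking for `list(N)` outright, or searching the iterator for -1, would of course never terminate; that caveat is part of the lesson.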

Now, when a 'calculator' appears to be available to do students' math homework for them, a teacher might react negatively. But I think a 'calculator' actually provides opportunities for pedagogy to quickly get through relatively mundane details and reach more thorough or deeper understanding of concepts, e.g. set identities. (How good are your students’ proofs using set identities if they haven’t been able to clearly see the results of some of the operations such as union and complement?) And of course it's also possible to use a 'calculator' to verify or clarify points that students have doubts about such as what is the cardinality of {∅, {∅}}, and whether {1,2,3} may be a subset of itself.
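For example, such ‘calculator’ checks might look like this (the universe U and the sets A and B are my illustrative choices; complement here is taken relative to U):

```python
U = set(range(10))              # a small universe, so ~X means U - X
A = {1, 2, 3}
B = {2, 3, 5, 7}
print((U - (A | B)) == ((U - A) & (U - B)))   # True: De Morgan's law

# The cardinality of {∅, {∅}}: the inner sets must be frozensets.
S = {frozenset(), frozenset({frozenset()})}
print(len(S))                   # 2

print({1, 2, 3} <= {1, 2, 3})   # True: every set is a subset of itself
```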

Exercises that I assign to students on this material include simple ‘calculator’ exercises to figure out how to get Python to calculate a range of set expressions such as A ∪ ~(B ∩ C), and exercises to write a function powerset(S) and a function is_partition(𝒞), where 𝒞 is a collection or set of sets. (By the way, the sets that are inside a collection or set of sets actually need to be frozensets in Python. Can you guess what Disney movie I reference in my class when this comes up? ;-) )
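A sketch of possible solutions (one of many; note the frozensets, and note that this is_partition merely checks that the blocks are nonempty and pairwise disjoint, i.e. that they partition their own union):

```python
def powerset(S):
    """Return the set of all subsets of S, each as a frozenset."""
    result = {frozenset()}
    for x in S:
        result |= {subset | {x} for subset in result}
    return result

def is_partition(C):
    """True if the blocks in C are nonempty and pairwise disjoint."""
    blocks = list(C)
    if any(len(b) == 0 for b in blocks):
        return False
    union = set().union(*blocks)
    return sum(len(b) for b in blocks) == len(union)

print(len(powerset({1, 2, 3})))                            # 8
print(is_partition({frozenset({1}), frozenset({2, 3})}))   # True
print(is_partition({frozenset({1, 2}), frozenset({2, 3})}))  # False
```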

Following sets, the next topic in this course is Tuples and Relations. Well, guess what Python provides — even using traditional notation, if you want?
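For example, a sketch of the sort of thing I mean, with the Cartesian product written as a set comprehension and a relation as a set of ordered pairs:

```python
A = {1, 2}
B = {'x', 'y'}

# The Cartesian product A x B, in nearly traditional notation:
AxB = {(a, b) for a in A for b in B}
print(len(AxB))            # 4 ordered pairs

# A relation is just a set of pairs: "divides" on {1, 2, 3, 4}.
divides = {(a, b) for a in range(1, 5) for b in range(1, 5) if b % a == 0}
print((2, 4) in divides)   # True
print((3, 2) in divides)   # False
```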

(The video clip I show to my students regarding relations may be a little risqué; sorry.)

The next topic in this course is an introduction to mathematical induction. See above regarding whether students’ learning to construct proofs is well served by handing the work over to a programming environment.

The next topic is performance: O() etc. And look what you can do in Python:
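For instance, a sketch along these lines (the particular f, g, and witness constant c are my illustrative choices):

```python
# Is f(n) = 5n^2 + 3n + 7 in O(n^2)?  Try the witness c = 6.
f = lambda n: 5*n**2 + 3*n + 7
g = lambda n: n**2
c = 6

for n in [1, 10, 100, 1000]:
    # The inequality fails at n = 1 but holds from n = 5 onward,
    # illustrating the "for all n >= n0" part of the definition.
    print(n, f(n), g(n), c*g(n), f(n) <= c*g(n))
```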

I actually show students the values of f(n), g(n), and c*g(n). And I think demonstrations like this make O(), Ω(), etc. come more alive for students. Seeing actual values appears to clarify relative growths more than considering 'abstract' aspects of functions such as derivatives (and how memorable are the positions of different curved lines in graphs?).

What computing systems do you like to use in your introductory course on Discrete Mathematics for Computer Science? What are the strengths (and/or weaknesses) that you perceive with them?

P.S. A few days after I originally made this posting, I gave a midterm examination to this class; and here's what I asked about Python. (Did I say above that Python might not be good for boolean algebra and logic?)

[8 points] What are Python's outputs if material is entered as follows?
(Write the outputs in the spaces for them below.)

You — and/or your children, students, etc. — could build your own copy of the original Analytical Engine! Out of LEGOs!! (Along with other items I’ve mentioned in my earlier posts, LEGOs are another thing I love: I estimate that at least a third of my children’s toys are LEGO sets…)
OK, this LEGO model wouldn’t actually function. ☹ But hey, the Analytical Engine was never originally built and wouldn’t have functioned either, so this feature — being nonfunctioning — would actually bolster this model’s authenticity!!?! ☺

Here’s another view of the proposed model, this time with LEGO minifigures of Lovelace and Babbage:
For reference, here are images of Lovelace and Babbage wearing dark clothes ≈like that:
(Don’t worry about the beard appearing with the proposed minifigure for Babbage; that detail should get corrected.)

If you think the proposed LEGO Analytical Engine needs to be made more accurate, well, if the LEGO company approves it then they might actually improve it, hopefully in consultation with historically accurate sources. For perspective on the LEGO company’s work producing models, I’ll say that I think their Architecture division has been doing a nice job reflecting the spirit of various monuments such as the Leaning Tower of Pisa:

If you will indeed support / vote for the proposed LEGO model of the Analytical Engine with Lovelace and Babbage, then access the proposal’s website, https://ideas.lego.com/projects/102740, and click the blue “Support” button which is along the right side (you may need to scroll down the page a little). When you’re prompted, create your own LEGO ID (completing the form etc. for this may take a couple of minutes). Then, finalize your support (by clicking the proposal’s “Support” button again)!

By the way, here are a few other places where this proposed LEGO set is featured — so if you want you can link to these social media from yours…:

My local public library (which my family and I also greatly appreciate, by the way) where I borrowed this book categorizes it as a graphic novel. I include a sample of this book’s main content with this posting — in a footnote (;-) below.* But this book contains much more than typical fictional comic-book panels: while I was perusing this book, I actually spent more than half the time reading Padua’s factual footnotes, endnotes, and appendices: they contain a wonderful selection of primary source material and illustrated explanations which I greatly appreciated for CS — particularly things that were new to me such as Lovelace’s calling card; I’ll show some of these things to you as follows:

On page 26, the footnote:

What was certainly Lovelace’s original realization was to be the essential root of computer science: that by manipulating symbols according to rules, any kind of information, not only numbers, can be operated on by automatic processes.

[The Engine] might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine. Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.

On page 34, Section 1′s Endnote #4:

Politically radical though he was, Frend was a mathematician so conservative that he managed to write an entire book on algebra (Principles of Algebra, 1796) rejecting the use of negative numbers, and he even wrote a satirical burlesque ridiculing the use of zero. His quotation “we desire certainty not uncertainty, science not art,” is from an objection to the use of general undefined symbols in algebra as opposed to purely numbers, a big subject of debate in mathematics in the 1820s and ’30s. Lovelace was to propose using Babbage’s Analytical Engine for the manipulation of general symbols, a truly radical idea.

On page 34, Section 1′s Endnote #5:

This is as good a place as any to confront the fraught issue of what to call our heroine. Her name when she was born was Augusta Ada Gordon, as her father was George Gordon, Lord Byron. She was called familiarly Ada Byron (dropping the “Augusta” because she was named after Byron’s half-sister, whom Byron . . . oh, geez, it’s too complicated). She married William King when she was nineteen, becoming Augusta Ada King; then her husband became the Earl of Lovelace in 1838, so then she was Augusta Ada King, Countess of Lovelace, or Lady Lovelace. It is quite incorrect to call her Ada Lovelace, but everybody did, and still does.

On page 36, Section 1′s Endnote #10:

Shorthand of this period, by the way, has an intriguingly codelike look — like this from Thomas Gurney’s Bracygraphy: or, An Easy and Compendious System of Short-hand, 1835:

Viewing Gurney’s book, the text corresponding to this shorthand reads as follows: “My Lords and Gntlmn, The jst cnsrn, which I have flt in my on brst, on the sdn deth of the lat King, my ryl grandfathr, mks me not dot, but you mst al have been dply afctd with so sver a los…”

On page 77, the footnote:

Lovelace is equipped with a handheld puncher [in a panel drawn on this page] for emergencies [to patch code ;-] . Herman Hollerith, who first effected the use of punch cards in computing, was inspired by railway-ticket punches such as Lovelace has here; the first punch cards for analyzing the U.S. Census of 1890 were punched by hand until the repetitive-strain injuries to the tendons of the operators, so familiar to modern computer workers, necessitated Hollerith inventing a keyboard puncher.

On page 78, the footnote:

Lady Lovelace declared that the Engine could do only what it was “ordered to perform.” Alan Turing (1912-1954), the great theorist of twentieth-century computing, argues back in his Computing Machinery and Intelligence:

LADY LOVELACE’S OBJECTION. Our most detailed information of Babbage’s Analytical Engine comes from a memoir by Lady Lovelace. In it she states, “The Analytical Engine has no pretensions to originate anything. It can do whatever we know how to order it to perform” (her [emphasis]). The view that machines cannot give rise to surprises is due, I believe, to a fallacy to which philosophers and mathematicians are particularly subject.

On page 90, part of Section 2′s Endnote #2 (click to view larger):

On page 277, “M.L.”‘s Personal Recollections of Babbage:

One day [. . .] when he had come to see me, he had also forgotten his cards, so he took a small brass cog-wheel out of his waistcoat-pocket and scratched his name on it and left it for a card!

On pages 281-82:

Calling card of Ada, Countess of Lovelace, in Babbage’s possession at his death:
. . . and Ada Lovelace’s handwriting on the back!

Appendix II comprises history and explanations of the workings of the Analytical Engine such as its Store (memory), Mill (≈ALU), and punch cards, with illustrations of the gears etc. and associated historical material such as 1890 U.S. Census card reading and Colossus paper tape reading. On page 309 is some material as follows:

Babbage [. . .] seemed to have paid almost no attention to software, leaving that to Lovelace (and some of his other assistants through the years) — the scattering of programs in Lovelace’s paper is pretty much all there is. [. . .] Almost a hundred years after Babbage first thought of making his Difference Engine eat its own tail, and Lovelace proposed that it could play symbols beyond the realm of numbers, mathematician Alan Turing described another imaginary machine, the Universal Computer. Turing didn’t bother with the engineering and hardware specifics that Babbage spent so many decades on, picturing instead an abstract, formless device, a platonic form of a computer. Turing’s Universal Machine would have some way to “read” and “write” data, a way to move the data in and out of a storage system, and a code of symbols by which the computer could instruct itself. The Turing Machine is still the standard against which all computers are measured, and by that standard the Analytical Engine was the first computer.

I enthusiastically recommend that you check out Sydney Padua’s The Thrilling Adventures of Lovelace and Babbage. You should appreciate it, and your students should benefit from it.

* Here in my footnote (;-) I’ll show you some of The Thrilling Adventures of Lovelace and Babbage‘s nominally main content, comic-book panels (click to view larger):

This example brings up another reason why I appreciate this book: I had not been conscious of the rationale for the name “Difference Engine” until I saw this material, though I can remember greatly appreciating the method of differences back when I was around 8th grade — I thought that method was very cool. Now, since it’s coming up like this, I feel like sketching it as follows in this footnote. (If the method of differences isn’t new to you, I hope this will give you a warm, fuzzy feeling of remembering it from high school or middle school, as I do…)

For an example, consider: what polynomial p(n) yields the following sequence of values?

Well, calculate the differences between successive values of p(n), and then the differences between those resultant successive values, and then the differences between those successive values, and so on:
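In Python the table of differences takes only a couple of lines; here I use p(n) = n² + 1 as an illustrative polynomial, since the post’s original sequence isn’t reproduced:

```python
def differences(seq):
    """Differences between successive values of a sequence."""
    return [b - a for a, b in zip(seq, seq[1:])]

values = [n**2 + 1 for n in range(7)]        # p(n) = n^2 + 1
print(values)                                # [1, 2, 5, 10, 17, 26, 37]
print(differences(values))                   # [1, 3, 5, 7, 9, 11]
print(differences(differences(values)))      # [2, 2, 2, 2, 2]
# Constant second differences: the values come from a degree-2 polynomial,
# which is exactly the fact the Difference Engine exploited.
```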

Specifically, I want to present how revision of our first Discrete Mathematics course is proceeding here at GVSU.

For starters, in the first lecture I greet students saying “Bonjour! Bonjour!”, and then I actually start lecturing in French for the first minute or so, and then I shift into English, saying: “…<French words> gradually here you’ll gain familiarity with all this notation, terminology, and operations, so whenever you may need to use them in later courses and work, you’ll be comfortable and handy with them — again, from having learned this language.” And I really do have the material I display for that first lecture written in both languages, English and French. As I hinted in my earlier posting regarding Math, I started the first lecture this way to indicate to students the perspective I’m developing/promoting: that for CS, the mathematical material in this course can serve as basic language which the students can then recall and use throughout the CS curriculum, e.g. set notation and quantifiers such as ‘exists’ in a course on databases, and binary and hexadecimal numbers in a course on computer organization and architecture.

Then, beyond starting my first lecture that way, the list of topics I’m settling on in this course is as follows (note that some of these topics require less than a week):

Introduction / L’Introduction

Different Language(s) Used in Computer Science for Representing Numbers: Binary, Hexadecimal, and Octal

Introduction to Python and Its Facilities for Discrete Mathematics

Some Other Data Representation in Computers: Characters (Application: Cryptography)

Basics of Arithmetic in Computers Using Binary Numbers: Addition and Multiplication

Low-level Computer Processing Using Binary: Bitwise Operations

Further Representation and Processing of Numbers in Binary: Negative Integers, Subtraction, and “Decimal”s (i.e. Fixed-Point Fractions)

As some things in that list indicate, I connect this mathematical material to the rest of Computer Science. But I must admit that this list of topics is a work in progress: the sequence of topics I used this past semester was actually somewhat different from that; so to be honest I’m hoping to settle things this Fall.

One element there I want to highlight is Python. I hadn’t mentioned it in my previous postings. I knew that I wanted to include some actual computing in this course, but I hadn’t settled on exactly what to do. An issue is that — like tendencies Pete Henderson criticizes in his last “Math CountS” column — my institution starts the Computer Science major with a very low threshold for problem solving and mathematics: only pre-college (remedial) algebra as the prerequisite for our initial “Computer Science I” course!! And speaking of that, oddly this Discrete Mathematics course I’m discussing here doesn’t even have our initial “Computer Science I” course, which uses Java, as a prerequisite! So if I want to use computing in this Discrete Mathematics course, I can’t expect students to possess any background beyond college admission.

Last Fall, I tried having students use Haskell. It’s cool; it would add a functional, curried language to the repertoire of languages that they experience in college; and it has list comprehensions and lazy evaluation (so, for example, it can represent ℕ). But while Haskell could be used as a discrete-structures expression evaluator, like general mathematical packages such as Maple and Mathematica, I felt that it was a bit of a stretch for me to try to teach actual programming in Haskell to students with such uncertain computing/mathematical backgrounds and maturity.

And then one day I tried Python — like Perl, Python is already available on systems that I use (Mac & Linux). And I fell in love with it!!! 😍 Python can be used interactively as just an expression evaluator, so students can start using it without knowing programming already! And it straightforwardly handles expressions I want to use such as exponents and binary and translation between characters and numbers! And it’s not necessary to have overhead such as “... public static void main(String[] args) { ...”! And it actually uses braces “{ }” for sets! And it features set comprehension and unbounded iterators (so it can represent ℕ: basically N = itertools.count()) and tuples! And its syntax is simple and clear! And the students here can do coding in it such as Caesar encryption and decryption after just a week and a half of exposure to the language! And of course it has great capabilities beyond Discrete Mathematics — how could anyone not love a language that has Turtle programming built into it?!? ☺

In particular, this posting provides perspectives of Michael Goldweber and Henry M. Walker, authors of articles in the ACM Inroads issue, that respond to Mark’s blog.

Response by Michael Goldweber

In a recent CACM blog post, Mark Guzdial commented on the special section in the March issue of ACM Inroads on the role of programming in a non-majors CS course. In particular, he wrote somewhat critically about two of the articles: one authored by me and one by Henry Walker.

His comments centered around two issues:

the lack of evidence-based pedagogy in these two papers, and

the process whereby the content of such courses is decided upon.

I offer the following in the spirit of continuing this dialog.

Regarding point #1, I agree 100% with Dr. Guzdial: neither my article nor Henry’s contains any evidence-based pedagogy. In fact, neither article contains any discussion of pedagogy at all — evidence-based, folk, or otherwise. So while I agree with Dr. Guzdial’s comments regarding the absence of evidence-based pedagogy, they are nonetheless not germane.

As for point #2, Dr. Guzdial argues against CS faculty unilaterally deciding the content of such non-majors courses. He asks (rhetorically?) “Are CS faculty the right ones to define the learning goals for students who do not plan on a career in computer science?”

I suppose by this argument philosophy faculty members do not have the right to define the learning goals for students taking a philosophy elective, who do not plan on a career in philosophy. It appears that Dr. Guzdial has never taken part in a liberal arts or Jesuit/Catholic (or other religion-based) institution’s core curriculum war—errr, I mean discussion.

Non-major courses do not exist in isolation; they fit into the wider focus or nature of an institution such as a technical school, liberal arts college, or research university. Dr. Guzdial’s institution is a technical school where everyone is required to take a course in computing. Again, I agree with Dr. Guzdial, it is inappropriate for CS faculty to design the learning goals for a service course in isolation. The design of any service course should carefully consider the needs of the “client” department(s). If a client department requires programming, then programming must be included. Notwithstanding, my comments and Henry’s were not meant to inform the design of any course that is required by any department/program.

When Henry and I first began planning this special section, we recognized the wide variety of institutional foci and goals. Hence, we invited a set of authors who would speak to the breadth of this space as best we could, given the constraints of the medium. The goal was to present a plethora of perspectives on this question.

So, while I agree with Dr. Guzdial’s point regarding the process of designing a service course, it is nonetheless not germane. The design of a non-major’s course in computing, which is not a service course for some other department/program, should belong in the hands of the CS faculty. Students electing to explore a discipline take these courses. Surely, discipline experts are those who can best decide what to present from the discipline.

Dr. Guzdial’s comments aside, the purpose of the special section was not to discuss pedagogy or to focus on the process of topic selection. The point was to explore the question of whether programming should be one of the included topics.

There is no question that including programming is the traditional route for non-major courses; perhaps, one might say, the way white male faculty have been electing to do it for many years. My and Henry’s articles (yes, we are both white males, as is Dr. Guzdial) dare to explore what a non-major’s course might look like without including the traditional programming component.

The purpose of the Special Section was to engage computing educators about the appropriate content for non-majors courses in computing, and I am delighted with the conversations this Special Section has generated.

Response by Henry M. Walker

From my perspective, initial course planning starts with content, goals, and context. After identifying appropriate material to be addressed, pedagogy considers how that material might be organized and taught effectively. Evidence-based pedagogy provides a fine mechanism to move from high-level themes to “the integration of professional wisdom with the best available empirical evidence in making decisions about how to deliver instruction” [US Department of Education, quoted in Mark Guzdial’s blog]. However, the focus of the Special Section was on the Big Ideas, not on pedagogy. Once one knows what material to cover, the references in Mark’s blog to evidence-based practice may provide faculty with guidance on how to organize, plan, and conduct a course — but all of that comes after initial planning.

Returning to a consideration of initial planning, some comments seem appropriate regarding content, goals, and context. In the language of CS Principles, what are the Big Ideas that a course should explore? As indicated in Mark’s blog, the CS Principles effort has identified seven fine Big Ideas that might be considered for a non-majors course in computing. My article, “Priorities for the non-majors CS course,” suggests directions for identifying numerous other Big Ideas as well. My intention was not to present final statements of Big Ideas, but rather to suggest additional themes and directions that might be considered. I doubted that a long laundry list of possible themes would help faculty consider what makes sense for their courses and institutions; rather, the point was to suggest additional directions.

Mark’s blog also mentions some courses that have a programming component. For example, I have followed the development of the courses at both Kalamazoo College and Georgia Tech for years, and I certainly agree that those courses do a fine job in what they attempt to do. Similarly, the CS Principles course is well thought out. However, at least one of the Big Ideas in each of these courses is related to programming. For example, one of the CS Principles Big Ideas states, “Students in this course design and produce solutions, models, and artifacts … .” If this is a starting point, then further development of a course using programming seems natural. In summary, if programming is a goal of the course, then the course should cover programming.

However, my article and the article by Michael Goldweber dare to suggest that the collection of Big Ideas need not include programming. Although we approach the non-majors course from different perspectives, we both identify a range of topics that might be considered — programming-related topics are just one of numerous candidates.

Turning to course goals, the purposes of a course need to take into account the target audience. A non-majors course for engineering students may need to address specific skills useful for engineering, a non-majors course for accountants may need to consider elements of spreadsheets or databases, a non-majors course for artists or historians or language majors may need to address other needs. More generally, if a non-majors computing course is being designed to address the needs of a client department, then the goals and Big Ideas may need to reflect some of those needs. Computer scientists naturally will have insights about what material should be covered, but the course is only useful if it connects with the target audience and their needs as well. As a separate matter, a general non-majors course may need to touch upon a range of student interests and have a hybrid target audience.

Context represents yet another factor shaping a course. Courses do not exist in isolation, and students do not take only one course in their undergraduate programs. Thus, in planning a course, one should connect themes and Big Ideas to broader goals of an overall curriculum, and these will necessarily vary according to each institution. What are the various institutional learning goals and how do they fit with departmental goals? Needs for a course naturally reflect the nature of the institution (e.g., technical school versus a liberal arts college), the role of the course (e.g., course for both majors and non-majors versus a course specifically for non-majors), and the role the course might play in students’ overall education.

Putting these pieces together, my article and the article by Michael Goldweber identify numerous possibilities for course themes and Big Ideas. With such a range of topics to choose from, it is curious that Mark’s blog states, “The claim that ‘significant other topics must be dropped’ is an empirical claim.” If a course could cover any range of topics, then there would be no need for CS2013 to recommend about 300 hours of class time for a major in computer science. Further, the CS2013 Task Force would not have been faced with the challenge of how to shape their recommendations to fit within a realistic undergraduate program. With the range of themes and Big Ideas suggested in my article, choices are needed as well.

In summary, initial planning for a non-majors course in computing requires consideration of content, goals, and context. The articles in the Special Section identify a large number of possible themes and Big Ideas — far more than can be covered in a single course, and this requires faculty to make choices. Mark’s blog rightly identifies some successful courses in which faculty have chosen to include programming-related themes, and I have no objection to that. (To correct Mark’s blog: my article does not argue that programming should not be a priority for a non-CS-majors course, as claimed in the blog. Rather, I argue that programming need not be a priority.)

The point of my article is that other legitimate choices are possible as well. One size does not fit all, and there is no reason to believe that one selection of themes or Big Ideas will fit all courses in all institutions.

Of course, once the themes and Big Ideas are identified, then one can move on to the pedagogy — perhaps using evidence-based practice, as Mark’s blog suggests. However, the starting point is identifying the themes and Big Ideas, not pedagogy.

The first historical circumstance pertaining to the CS major at GVSU which I’ll present is reflected in my department’s name: “School of Computing and Information Systems”. At GVSU, our CS major shares departmental resources, including introductory courses, with our Information Systems (IS) major. To be specific, until 2012 students pursuing either of these two majors were required to take the same ‘technical core’ courses during their first year: “CIS 162: Computer Science I”, “CIS 163: Computer Science II”, “MTH 225: Discrete Structures: Computer Science”, “STA 215: Introductory Applied Statistics” (or a more advanced Statistics course), and either “WRT 150: Strategies in Writing” or “COM 201: Speech”. And while our department’s professors naturally meet all together to discuss and decide the curriculum for each of our majors, it was difficult to change critical courses such as CIS 163 for the purposes of the CS major because IS had different aims.

In 2012, the IS major diverged a little more from the CS major, switching from CIS 163 and MTH 225 to alternatives; but the other ‘technical core’ courses remain shared, again because our department must share resources, and because we want to facilitate students changing from one of our majors to the other, which happens a lot — if no courses were shared, changing majors would onerously cost students a semester or two. Since the IS major requires no prior exposure to computer programming and no mathematical qualifications beyond what’s necessary for admission to college, our primary shared course “CIS 162: Computer Science I” is very introductory, really starting from nothing. And after that rudimentary first-semester course, it is difficult for our sequel CS course “CIS 163: Computer Science II” to progress very far in computer science; for example, it has only one week to cover linked lists, and it does not cover binary trees at all.

Another historical circumstance pertaining to the CS major at GVSU concerns the C programming language. As you may know, although C is now old, it remains important: consider that the TIOBE Programming Community index gives C the highest rating of approximately 17.5%, Java the second-highest rating of approximately 15%, then Objective-C approximately 9%, and C++ approximately 6% — and the latter two languages are built directly on C. Consistent with such prominence of C, our CS major at GVSU includes significant courses which use C, e.g. “CIS 452: Operating Systems Concepts”. But… our coverage of C has actually varied somewhat. As of 2002, our CS major included the required course “CS 361: C and Unix”. The next year, the CS major underwent changes including the following:

“We are dropping CS 361 as a requirement and will eventually drop it from the catalog as well. [...] Some content from this course will be moved to CS 163 which is expanding to a four hour course and CS 343 which already covers the C and C++ programming languages. This course will now be accepted as an elective. [...] The CS curriculum is consistent with the national curriculum as reported in the ACM/IEEE-CS Joint Task Force on Computing Curricula 2000.”

That referenced course “C[I]S 343” is “Structure of Programming Languages”. As is usual at any university, such a course surveys the range of programming languages, so it includes discussion of C and maybe C++; but it is difficult for such a survey to really teach all of C. So now, professors who want to use C in courses such as “CIS 452: Operating Systems Concepts” observe that students who learned C by happening to have chosen the elective CIS 361 handle the C programming smoothly, while other students, who have not taken CIS 361, struggle with it.

Now it’s fair to counter, “But… the ACM/IEEE CS Curriculum Guidelines don’t mandate teaching the specific language C.” That is true; however, the Guidelines do specify including material such as “references and aliasing”, “manual memory management”, and “how fundamental high-level programming constructs are implemented at the machine-language level”; and clearly the C programming language is currently the best environment for thoroughly teaching such material. Anyway, aside from C, some other material specified by the Guidelines which we need to cover better is as follows:

● clear coverage of algorithmic strategies such as brute-force vs. greedy

● full coverage of the Knowledge Unit “AL/Basic Automata, Computability, and Complexity” [3 Core-Tier1 hours plus 3 Core-Tier2 hours], which includes topics such as finite-state machines, regular expressions, and an introduction to the classes P and NP and the P vs. NP problem

● full coverage of the Knowledge Unit “SF/Evaluation” [3 Core-Tier1 hours]

Now of course, sometimes some of our instructors have optionally covered some of this material. (And again, some of this material such as automata was included in our required courses until the 2002/03 change.) However, we don’t have this material really ‘nailed down’ clearly in our curriculum right now.

So we’ve been discussing how we might address these deficiencies of our curriculum, including discussing some elements actually for several years before the publication of the 2013 ACM/IEEE CS Curriculum Guidelines. But… as I indicated above, sometimes we’ve been held back because of competing purposes for our courses such as “CIS 163: Computer Science II”. Or last semester, we were ‘holding our breath’ while ABET was evaluating us for reaccreditation; and that may actually occupy some more time for us since we need to address some concerns they raised. Is there any hope for us to clean up our ‘mess’?!?

Well now, looking more at the ACM/IEEE Computer Science Curriculum Guidelines, consider the redistribution of Core hours shown on page 37 of the original document:
In that redistribution of Core hours among Knowledge Areas, note particularly that the Knowledge Area “Architecture and Organization” was previously assigned 36 Core hours but now is assigned only 16 Core-Tier2 hours. At GVSU, we taught all the previously specified Architecture and Organization material in two required courses, “CIS 251: Computer Organization” and “CIS 451: Computer Architecture”. Now, however, considering the reduction of hours for this Knowledge Area, we should be able to cover the necessary material in just one course.

And if we do remove one required course from our CS major, what can we do with that available space? Why, change CIS 361 on C back to a required course again! Though of course, we’d also take this opportunity to satisfy more ACM/IEEE CS Curriculum Guidelines, e.g. covering version control and changing topics in further courses to address the deficiencies itemized above. (These changes to our curriculum would to some extent be performing an ‘undo’ of the changes done in 2002/03.)

But… there are complications. For one thing, students should take this ‘new’ required course on C etc. in their second year to maximize the opportunity for later courses to use C; but… inserting a course at such a random point in the curriculum would necessitate displacing various other courses, and possibly lengthening our CS major’s prerequisite chains of courses. And… if the prerequisite chains of courses are too long, then students might find it difficult to graduate from college in four years if they need to do anything unusual, e.g. studying abroad which entails missing a semester or two of our courses, or if they’re transfer students from community colleges who have completed General Education and need to just take CS major courses.

Another ‘but…’ concerns Computer Organization and Architecture. It’s actually difficult to cram all the necessary material into just one semester; it would help if some topics such as binary number representation were taught elsewhere such as in “MTH 225: Discrete Structures: Computer Science”. But… that course is under the purview of the Mathematics department, so we can’t just willy-nilly dictate what happens in it.

And another ‘but…’ concerns “CIS 163: Computer Science II”. It’s a critical course: it needs to transition students from the very introductory course “CIS 162: Computer Science I” to, well, all of Computer Science; it’s the principal prerequisite for most of our CS courses. And its current setup includes some Linux material which should be covered in CIS 361. Then making CIS 361 required again means that CIS 163 might have some room to cover different material, e.g. basic binary trees. But… should this be done? Our students currently find CIS 163 very challenging; instead of adding more material to CIS 163, should we just lengthen coverage of the material that’s already in it, so it’ll be easier?

Stay tuned to find out what happens to our intrepid department facing the prospect of modernizing our CS major. Will we make changes as described above? Or will we encounter more “but…”s which hinder this process?

How has modernization of your CS major proceeded at your institution? Do you want to compare your situation to the situation at GVSU? Or how about this: Do you want to get together with me and write up a cross-institution experience report on curricular reform being done to conform to the new ACM/IEEE CS Curriculum Guidelines? I’m game if you are….

Along this vein, if a student majors in Mathematics during college but then switches to Computer Science for graduate school, as I did, then a perspective I’ve heard from Mathematics faculty is that the student has ‘gone over to the dark side’. (;-)

While I am in the field of Computer Science, I love Mathematics, so one course I teach is “Discrete Structures: Computer Science” [1]. Now, some notes regarding this course are as follows:

A catalog description of such a course is as follows: “This course covers the mathematical foundations of Computer Science. Topics include propositional and predicate logic, set theory, counting techniques, relations and functions, and mathematical induction.”

Discrete structures are foundational material for computer science. By foundational we mean that relatively few computer scientists will be working primarily on discrete structures, but that many other areas of computer science require the ability to work with concepts from discrete structures. [...] The material in discrete structures is pervasive in the areas of data structures and algorithms but appears elsewhere in computer science as well. For example, an ability to create and understand a proof—either a formal symbolic proof or a less formal but still mathematically rigorous argument—is important in virtually every area of computer science, including (to name just a few) formal specification, verification, databases, and cryptography. Graph theory concepts are used in networks, operating systems, and compilers. Set theory concepts are used in software engineering and in databases. Probability theory is used in intelligent systems, networking, and a number of computing applications.

What might be more attention-grabbing is that CS Curricula 2013 allocates 37 Core-Tier1 hours plus 4 Core-Tier2 hours, totaling 41 Core hours, for “Discrete Structures”: a close second to the Knowledge Area “Software Development Fundamentals”, which is allocated 43 Core-Tier1 hours, and significantly more than any other Knowledge Area (the next largest allocation is 28 hours).

The most popular textbook for such a course is Discrete Mathematics and Its Applications, by Kenneth Rosen. Other introductory textbook authors include Susanna S. Epp, Richard Johnsonbaugh, Judith L. Gersting, and the duo of Alfred V. Aho and Jeffrey D. Ullman. The cover of the textbook by Aho & Ullman (dated 1994) is as follows:
This book’s Preface says the following:

“About the Cover”
It is a tradition for computer science texts to have a cover with a cartoon or drawing symbolizing the content of the book. Here, we have drawn on the myth of the world as the back of a turtle, but our world is populated with representatives of some of the other, more advanced texts in computer science that this book is intended to support. They are:

For a long time, I bought into those perspectives that introductory discrete mathematics is “foundational” for Computer Science. So when I’ve taught this course, I’ve made students fill in truth tables, use quantifiers, write proofs by induction, and so on. In fact, I even developed some software for constructing proofs, and I made students learn how to use it.

And I was happy. But… I nonetheless kept feeling a need to tinker with this course. So, at different times I tried different textbooks; or I spent a lot of class time on the number theory which is the basis for RSA encryption as a motivating example; or I tried to change things up in other ways. And while one might say that symbolic logic and doing proofs can be a unifying theme for this material, I realized that what I was teaching was actually various disconnected topics — and students weren’t using them much in their subsequent college courses or jobs. How often does any CS student in college write a universal quantifier “∀” outside of this course, or maybe a theorem-proving segment in an Artificial Intelligence course? When does any CS student in college ever check whether a function is “into” or “onto”? Does a current course on networking really involve proofs, or rather packets and protocols and settings? Is this material really foundational for Computer Science, in the sense of all of it being really necessary to be able to do CS? Or is this material just some background theory of the field for which it’s nice that it’s there, but it’s simply not useful for day-to-day activities — just as, for an analogy, it’s good that the theory of everyday physics has been worked out (F = ma and all that rot ;-), but we don’t actually use any of that theory when we do real activities such as playing catch or deciding when to apply brakes while driving — not even when teaching people how to drive, by the way.

In considering this question, I started focusing on what material can be covered in this course that could be useful for subsequent college courses. And at first I was disappointed, because it seemed to still be a disconnected collection of notation and jargon. But then I realized that that’s exactly what’s useful: notation and jargon! It’s typical for a field to use its own notation and jargon. For theater, consider the terms “stage left” and “upstage”. For organic chemistry, which is a requirement for medicine, I’ve heard that it’s necessary to memorize a bunch of notation and terminology. For Computer Science, consider the following notation and terminology which this introductory course could present: binary, hexadecimal, ASCII/Unicode, bitwise operations, boolean algebra, symbolic logic, set operations, recursive definitions, O(), finite-state machines, regular expressions, and more. Proofs may be considered explanations/justifications of claims, written in this mathematical language for Computer Science. For example, the best way to explain the N*log(N) time complexity of mergesort is to prove it. And is this material foundational in the sense that it’s all necessary for day-to-day practice of Computer Science, i.e. writing computer programs? No! Or is this material all really strongly interconnected? No! But is this material useful like introducing a foreign language to someone, providing them with a variety of words — for “hello”, “please”, “bus”, “water”, “hostel”, “one”, “two”, and so on — so they may be able to understand and use some of the terminology in different situations they may encounter? Yes!! Well, that’s what this course can be about. “Bonjour, étudiants! Bienvenus à ce cours, où vous allez apprendre du langage mathématique pour l’informatique….” (“Hello, students! Welcome to this course, where you’ll learn some mathematical language for Computer Science….”) After this course, say titled “Introduction to Discrete Structures for Computer Science”, a later course can present the theory of Computer Science: varieties of automata, equivalence with categories of formal languages, Turing machines, decidability, reducibility, complexity, P vs. NP, etc.
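Much of that “vocabulary” maps directly onto Python, the language I use in this course. Here is a minimal sketch; the particular expressions are just illustrative examples of mine, not prescribed course material:

```python
import re

# Binary and hexadecimal notation:
n = 0b101010             # binary literal for 42
assert n == 42
assert hex(n) == '0x2a'  # hexadecimal rendering
assert bin(42) == '0b101010'

# Bitwise operations:
assert (0b1100 & 0b1010) == 0b1000  # AND
assert (0b1100 | 0b1010) == 0b1110  # OR
assert (0b1100 ^ 0b1010) == 0b0110  # XOR

# Set operations:
A = {1, 2, 3}
B = {2, 3, 4}
assert A | B == {1, 2, 3, 4}   # union
assert A & B == {2, 3}         # intersection
assert A - B == {1}            # difference

# Regular expressions:
assert re.fullmatch(r'(ab)*', 'ababab') is not None
assert re.fullmatch(r'(ab)*', 'aba') is None
```

So a student who has learned to read “∪”, “∩”, hexadecimal, and regular-expression notation can immediately exercise that vocabulary at the Python prompt.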

Qu’est-ce que tu penses de ce sujet? (What do you think about this subject?)


It’s my first year in the classroom, and it very much feels like I’m going back to school – and I don’t even think of myself as a “real” teacher. (I help teach an AP Computer Science course at a public school in San Francisco through the TEALS program, but I’m a professional engineer by day.) So that meant a summer of preparation, combing the web for AP CS resources; I’d hoped to lean on the work of more experienced educators as I put together material.

I didn’t find much online, which was surprising. There’s all sorts of information about all sorts of things online, but AP Computer Science teaching resources were meagre. Though most American high schools don’t offer an AP Computer Science class, the curriculum and evaluation are standardized, which, I thought, might have encouraged more teachers to share and reuse resources.

So I began posting materials I’d made and (with permission) found on Teach APCS. My hope is that the site becomes a living repository of free, good-quality computer-science education materials and that my initial contributions are dwarfed by what others give.

There’s currently:

● exercises that have been tested in classrooms – these have been the most popular with busy teachers so far.

● an interactive Java REPL to help students get a handle on using bits of Java: realizing they know how to do math in Java, poking at the Math class, experimenting with Random number generators or substrings, that sort of thing.

● a dictionary of common terms with examples. (If you’re looking for a place to contribute, or to have your students do so, this is probably the easiest place to start.)

● a compiler-error-to-English translator, since speaking compilerish is an unfortunately-acquired skill.

● a “microtext” – snippet-sized explanations of key AP CS concepts, put together in a way that might resemble an internet textbook.

● a list of microlectures – explanation videos from YouTube that could supplement classroom instruction or be another resource for kids at home.

At the end of the “editor’s letter” on page 5, by editor-in-chief Moshe Y. Vardi, it says “Copyright held by author.” To help you see why this is funny, I’ll provide a copy of the entire article, which you can access via the following link (OK, I’m just kidding about making my own copy; but anyway, do you get why I think those words are a little funny, considering the content of that article?).

Next, Vint Cerf’s column “ACM and the Professional Programmer” appears on page 7, and in it he considers “whether and how ACM can adapt its activities and offerings to increase the participation of [...] professionals.” Now in this article, he mentions “professionals”, “computer science researchers”, and his “mentors in graduate school”. Well, considering his stated goal, I think he should have also mentioned basic instructors who teach basic (i.e. introductory, i.e. undergraduate) computer science courses. If it’s desirable to influence people, isn’t it generally most productive to “get ‘em while they’re young”? (Once when I was in college, I was lucky to be in a small group addressed by preeminent linguist Noam Chomsky. At one point I asked him what’s the best way to learn a language, and he replied, “First, make sure you’re less than five years old….”) Computer science professionals generally spend several years in college learning basic computer science (beginning programming, data structures, algorithms, etc.) before starting their professional activities. If it’s desirable to inculcate these students with a message such as “ACM membership is a mark of a computing professional”, then wouldn’t it be most effective if this message comes regularly from their instructors — who may be teachers more than researchers, by the way? (A last thought of mine regarding this material is that in this CACM article Vint Cerf says, “In order to reach the very parties whose opinions and interests I would most like to gauge, this column is being published here and in ACM Queue, an online publication devoted to practitioners.” To achieve the goals he indicates, maybe Vint Cerf should also publish a column in ACM Inroads!)

Then, pages 8-9, “Blog @ CACM” is “Why the U.S. Is Not Ready for Mandatory CS Education” by Mark Guzdial. At the end he says, “How can we argue that computer science is more important than all the [other sciences and mathematics] and demands a mandate that the others do not deserve?” I’ll refrain from discussing here whether science and mathematics deserve to be required so our electorate can reason about issues such as health (insurance) and the environment, but I do want to discuss here whether computer science is more important than all the rest and deserves to be required. But to prepare for this discussion, consider another point of Mark’s: “Making [CS education] mandatory will force the majority of schools to invent something they will call ‘computer science,’ but it will look like Photoshop or CAD software instruction because that is all they can do without a CS teacher.” Well then, to start discussing whether CS education should be mandatory, it’s necessary to start by first specifying what is ‘general CS education’. I don’t know the content of California Bill AB 1530, but if it doesn’t include such a specification, then I would agree with Mark about what will probably happen if it passes.

(Caveat regarding this discussion: I am not an expert on K-12 CS education. But of course there are other sources of such, e.g. the entire Computer Science Teachers Association, which has an annual conference.)

There are many books providing an introduction to CS for people in general, e.g. Computer Science Illuminated by Nell Dale & John Lewis and Invitation to Computer Science by G. Michael Schneider & Judith L. Gersting, and regarding specification of what general CS education may be, the latter book for example says the following:

The introductory computer science service course has undergone many changes over the years. In the 1970s and early 1980s, it was usually a course in FORTRAN, BASIC, or Pascal. [Later], a typical [...] version of this course would spend a semester teaching students to use word processors, databases, spreadsheets, presentation software, and electronic mail. [My institution, Grand Valley State University in Michigan, has this.] But most computer science service courses do not teach students the foundations and fundamentals of computer science! We believe quite strongly that students in a computer science service course should receive a solid grounding in the fundamental intellectual concepts of computer science in addition to important uses of computing and information technology. This parallels the introductory course in biology, physics, or geology, where the central underpinnings of the respective fields are introduced. The topics in such a computer science course would not be limited to “fun” applications such as Web page design, social networking, and interactive graphics, but would also cover issues such as algorithms, hardware design, computer organization, system software, language models, theory of computation, and social and ethical issues of computing. An introduction to these core ideas exposes students to the overall richness and beauty of the field, and it allows them to not only use computers and software effectively but to understand and appreciate the basic ideas underlying their creation and implementation.

For further descriptions of what introductory CS education could be, consider the CS Principles effort, as presented at http://www.csprinciples.org. (One of the Comments printed with this article by Mark Guzdial also mentions “CS principles”.)

Though they may not be required, schools provide such introductory courses in biology, physics, and mathematics; e.g. I remember a biology course, which I took in 10th grade, that included somewhat advanced topics such as the Krebs cycle. I believe that if schools can provide introductory courses in other disciplines like that, then they can provide such an introductory course in computer science. Would it be somewhat different at different levels, such as in 10th grade versus in 5th grade? Of course. For example, as an algorithm, a class at a lower level might do Euclid’s original algorithm for calculating greatest common divisors, which involves repeatedly comparing two numbers and subtracting the smaller one from the larger one, until zero (or equality) is reached. Can such a lower level do this? Hey, can 5th graders compare two numbers, subtract the smaller one from the larger one, and either stop at zero or repeat? Of course. (My father tells me about volunteering with inner-city children and having them do such an activity, using the book Algorithms and Automatic Computing Machines by Trakhtenbrot.) And is this activity computer science, like a computer program? Well, let’s see: this processing includes sequence, conditional selection of alternatives, and repetition; so, yes! 5th graders can also do binary search, sorting, and so on. Beyond that, general computer science could cover cool history including the Jacquard loom and Moore’s law, artificial intelligence which is interesting, and information security which is relevant to their daily activities using Facebook, text messaging, etc. And again, there are textbooks such as Invitation to Computer Science which present all these topics and include engaging weekly activities to illustrate them.
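That subtraction-based version of Euclid’s algorithm is simple enough to sketch in a few lines of Python (the function name here is my own, and the sketch assumes positive integers, matching the by-hand activity described above):

```python
def gcd_by_subtraction(a, b):
    """Euclid's original algorithm, for positive integers a and b:
    repeatedly compare the two numbers and subtract the smaller
    from the larger, stopping when they are equal (the answer)."""
    while a != b:          # repetition
        if a > b:          # conditional selection of alternatives
            a = a - b
        else:
            b = b - a
    return a               # sequence: the steps run one after another

# The steps a 5th-grader would carry out for (12, 18):
# (12, 18) -> (12, 6) -> (6, 6) -> answer 6
```

Note that the code exhibits exactly the three ingredients mentioned above: sequence, conditional selection of alternatives, and repetition.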

After that discussion of what general CS education can be, I’ll directly address Mark’s question, “How can we argue that computer science is more important than all the [other sciences and mathematics] and demands a mandate that the others do not deserve?” This is easy: Almost all work and a lot of leisure these days involves use of computers: Buying anything at almost every store involves them. Driving (and navigating) involves them. (Vint Cerf’s article actually reflects how much work involves computer science.) And Facebook, text messaging, etc. depend on them. (The first Comment printed with this article by Mark Guzdial appears to me to be mistaken about whether adults do computing. Just managing one’s privacy settings in Facebook is intricate.) It might be fair to say that use of computers has become a fundamental skill like reading, writing, and arithmetic; so some attention to it in K-12 schools makes perfect sense.

Then, as in the material to which Mark Guzdial is responding, after preparing all people to participate well in society using computers, I believe a K-12 course on general CS education will stimulate a broader (than we see now) spectrum of students to delve further into this field, to do what may be considered traditional computer science (programming etc.).

What do you think about this question?

Hugh


In frighteningly few days, classes will start for the next academic year. It’s frightening because I have so much to do: prepare for classes, write some committee reports I was supposed to finish at the conclusion of the preceding academic year (a.k.a. during the summer), and do all the family activities my wife thought were necessary during the summer before losing me to work again.

In my first lecture for each of the classes I teach, I tell my students some personal information about myself such as my favorite author (J. K. Rowling), the ages of my children (currently 10, 7.8, and 6), and my favorite book on parenting: Parenting with Love and Logic by Foster Cline and Jim Fay.

Another child-development-related book which I like is The Read-Aloud Handbook by Jim Trelease. I enjoy points in it such as the following:

“Computers are incredibly fast, accurate, and stupid; humans are incredibly slow, inaccurate, and brilliant; together, they are powerful beyond imagination. — Albert Einstein” (Of course we all enjoy this point.)

“Comic books are a frequent childhood choice of people who grow up to become fluent readers.” (Can you guess why I enjoy this point?)

“When children are watching television, closed-captioning should be activated along with sound. [...] Reasonable doses of captioned television can do no harm, and most likely it can help greatly with reading.”

“Every year McDonald’s spends more money on advertising than it did the previous year, which comes to more than $1 million per day. Its marketing people never think, ‘Everyone has heard our message. They should be coming to us on their own, instead of spending all this money on advertising.’ Every time we read aloud to a child or class, we’re giving a commercial for the pleasures of reading.”

“Make sure your children see you reading for pleasure.”

The book that I’ve been reading recently for pleasure (in front of my children) is Dave Barry Turns 50 (by Dave Barry). Some of this book’s points which I enjoy are as follows:

“When you turn 50, [...] you can’t read anything. [...] Actually, this started happening to me when I was 48; I started noticing that when I tried to read restaurant menus, they looked like this:

At first I thought that this had nothing to do with me — that, for some reason, possibly to save ink, the restaurants had started printing their menus in letters the height of bacteria; all I could see was little blurs.”

“Let’s say you’re a customer-service representative, and you’re on the phone with a typical member of the public, by which I mean an individual who has the cognitive powers of celery.”

Dave Barry makes a lot of interesting points. OK, I’ll be more honest: Dave Barry makes a lot of very entertaining points. But really, part of what makes such humorous material so captivating is that it actually contains a lot of truths. (For further, more CS-relevant examples of humor containing truths, look at some of the cartoons of http://www.abstrusegoose.com or http://xkcd.com, e.g. http://abstrusegoose.com/206 or http://xkcd.com/835/.) My development of presbyopia strikingly paralleled what Dave Barry described, including starting at age 48 and initially appearing to me to be occasional odd situations (e.g. once I couldn’t read a street map and I thought it was because the light was too dim) rather than age-related deterioration of my body.

But besides some humor and truths that Dave Barry uncovers, what’s on my mind is the question of how old-fashioned I may be at age 50, considering that I’m supposed to teach modern Computer Science. There are many aspects to this question; but in this posting, I’m focusing on the topic of books. How appropriate is it to require physical textbooks for our courses? Or, from the perspective of a developer rather than a user, how appropriate is it to write one?

Here’s an email which a student sent to me a few days ago:

Hello, I am scheduled to be in your class and I wanted to verify that I need to get a book for class and that the book I have listed is correct.

I am asking because first semester last year I bought a $150 book I didn’t need so I waited for classes in the winter semester and I ended up not needing any books although all of the classes listed them.

I list “Digital Design & Computer Architecture” as the book needed for your class.

Thank you and I look forward to getting the semester started!

[a student]

And here’s the response that I sent to this student:

Hello [...],

In this modern era, you may as well not buy the textbook listed for this course. In lectures I generally present most of the information that you need to do the homework, and you can probably look up anything else you need online.

See you in a few weeks.

Regards,
Prof. McGuire

What are your thoughts regarding requiring physical textbooks for courses these days?

Regards,
Hugh

P.S. There are actually a lot of issues mentioned above that are worth discussing further. For example, I mentioned “lectures”; well, how appropriate are lectures these days, versus ‘flipping’ classes?