Common Core & Data Collection

A particularly troubling aspect of the Common Core scheme is the emphasis on massive data collection on students, and the sharing of that data for various purposes essentially unrelated to genuine education. U.S. Secretary of Education Arne Duncan has said:

Hopefully, some day, we can track children from preschool to high school and from high school to college and college to career . . . . We want to see more states build comprehensive systems that track students from pre-K through college and then link school data to workforce data. We want to know whether Johnny participated in an early learning program and then completed college on time and whether those things have any bearing on his earnings as an adult.

To know all this, of course, we have to know pretty much everything Johnny does, throughout his lifetime.

The underlying philosophy has its roots in early-20th-century Progressivism. The Progressives believed that the modern world had become so much more complex than the world that existed at the time of the American founding, that the old principles of individual freedom and limited government were no longer sufficient. In the modern world, experts would be needed to address increasingly complex challenges. Experts – armed with sweeping data on the citizenry – offer the best hope for societal progress.

An essential component of this “necessary” data is data that can be gleaned from the captive audience of public-school students (and maybe private and homeschooled as well). Progressive education reformers such as Marc Tucker, of the National Center on Education and the Economy, have long advocated the creation of massive student databases that can be used to track children from birth into the workforce. This is what Arne Duncan, quite openly, wants to do.

Now the problem here, from Duncan’s point of view, is that a federal statute prohibits maintaining a national student database (20 U.S.C. § 7911). What to do? What the federal government has chosen to do – and this predates the Obama Administration – is to incentivize the states to build identical databases so that the data can be easily shared. We end up with a de facto national student database.

All this was done, of course, through the power of the federal purse. In 2002 the federal government began something called the Statewide Longitudinal Data System grant program to offer grants to states that agreed to build their student data systems according to federal dictates (20 U.S.C. § 9501 et seq.). The most recent iterations of this grant program were the infamous Stimulus bill of 2009, which required the construction of particular data systems in exchange for money from the State Fiscal Stabilization Fund, and then the Race to the Top program. A successful Race to the Top application required the state to adopt the Common Core Standards, to adopt an assessment aligned with Common Core, and to commit to expanding its student database. This is what states such as Kentucky agreed to do.

What kinds of data are we talking about? The National Education Data Model includes over 400 data points, including health history, disciplinary history, family income range, voting status, religious affiliation, and on and on.

Well, is this really connected with Common Core? Yes. The most direct connection is through the national assessments, PARCC and Smarter Balanced. Each of those consortia has signed a cooperative agreement with the U.S. Department of Education in which the Department is allowed access to all student-level data the consortium gets through the testing. This access is to be allowed “on an ongoing basis” for purposes including undefined “research.” Parents will not be allowed to object to this; indeed, they won’t even know it is happening.

Even for those states that have had the good sense to opt out of the PARCC and Smarter Balanced assessments, the privacy threat remains. The U.S. Department of Education is becoming increasingly aggressive about demanding personally identifiable student data in conjunction with all sorts of federal grants, as even states that are not in Common Core can attest. And the federal government is encouraging widespread sharing of student data within states, such as with departments of labor, public health, corrections, etc. The idea is that the State (upper case) should know everything there is to know about a student, so that he can be better directed toward his proper slot in the economic machine.

We are told not to worry about this, because any sharing of data will comply with the Family Educational Rights and Privacy Act (FERPA). But as of January 2012, FERPA has been gutted, and no longer protects our children’s data from almost unlimited sharing. Under the new regulatory interpretation, the U.S. Department of Education (USED) (and in fact state departments of education) may disclose personally identifiable student data to literally anyone in the world, as long as the disclosing agency uses the correct language to justify its action.

Pursuant to this enthusiasm for sharing student data, where might that data end up? One illustrative example (and an obvious one, in this workforce-development model) is departments of labor. In fact, USED and the U.S. Department of Labor have a joint venture called the Workforce Data Quality Initiative, the purpose of which is “developing or improving state workforce data systems with individual-level information and enabling workforce data to be matched with education data.” Student education data are to be shared, to the extent possible, with labor agencies to promote the goal of workforce development.

Of course, under the new regulations that gutted federal student-privacy law, data can now be shared with literally any agency if the correct enabling language is used: the Department of Health and Human Services, Homeland Security . . . the IRS?

USED is supporting a plethora of other programs that encourage states to build ever bigger and ever more “useful” data systems on students. It’s funding something called “Common Education Data Standards” to help states develop a common vocabulary for their data – the better to enable interstate sharing. It’s funding “Digital Passport,” which will allow interstate sharing of data on students who move across state lines. It’s funding the “Assessment Interoperability Framework” to “allow for the transfer of assessment-related data across applications within a district, between a district and a state agency, and across state lines.”

So although the federal government assures us it is not building a de facto national student database, everything it is doing in the area of technology is designed to allow for just that.

Both USED and state education officials further insist that privacy concerns are overblown, because student-level data will be anonymized. In the first place, this is simply not true – the data coming from the Common Core assessment consortia, and the workforce tracking data showing which students participated in which education programs and then earned which salaries, are necessarily student-specific.

In the second place, in the era of Big Data, there really is no such thing as anonymization. When there are multiple, perhaps hundreds, of items in the database, the absence of a name or a Social Security number becomes a mere inconvenience, not an obstacle to identifying the student.
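The re-identification problem described above can be made concrete with a short sketch. The records and names below are entirely hypothetical, but the technique is the standard one: take a “de-identified” dataset, take any outside dataset that does carry names (a directory, a roster), and link the two on a handful of quasi-identifiers such as ZIP code, birth year, and gender. When the combination is unique, the missing name falls right out.

```python
# Hypothetical illustration: re-identifying "anonymized" records by linking
# quasi-identifiers against a separate, named dataset.

# "Anonymized" student records: no name, no SSN -- but rich in attributes.
deidentified_records = [
    {"zip": "40502", "birth_year": 1999, "gender": "F", "iep_status": True},
    {"zip": "40502", "birth_year": 2000, "gender": "M", "iep_status": False},
    {"zip": "40517", "birth_year": 1999, "gender": "F", "iep_status": False},
]

# A public dataset (say, a team roster) that does carry names.
public_records = [
    {"name": "Jane Doe", "zip": "40502", "birth_year": 1999, "gender": "F"},
    {"name": "John Roe", "zip": "40517", "birth_year": 2001, "gender": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "gender")

def reidentify(anon_rows, named_rows, keys=QUASI_IDENTIFIERS):
    """Link each de-identified row to a named row when the quasi-identifiers
    match exactly one candidate in the named dataset."""
    matches = []
    for anon in anon_rows:
        candidates = [p for p in named_rows
                      if all(p[k] == anon[k] for k in keys)]
        if len(candidates) == 1:  # a unique match is a re-identification
            matches.append((candidates[0]["name"], anon))
    return matches

for name, record in reidentify(deidentified_records, public_records):
    print(f"{name} -> {record}")
```

With only three attributes, the first “anonymous” record matches exactly one named person, and her sensitive field (here, special-education status) is exposed. The more fields a database holds, the more such unique combinations exist, which is why richly “useful” databases resist anonymization.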

One scholar who has studied this problem has explained that “[u]tility and privacy [of data] are . . . two goals at war with one another. . . . [A]t least for useful databases, perfect anonymization is impossible” (Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L. Rev. 1701, 1752 (2010)). And USED fully intends for student data to be enormously useful to the Progressive machine. Anonymization will be impossible.

Where are we headed with all this? It is instructive to look at what USED itself is working on and writing about.

One report that appeared on the Department’s website in February of 2013 is called Promoting Grit, Tenacity, and Perseverance. The thesis is that education must inculcate these qualities in students, and that their presence or absence must be measured in some way. How? The report suggests assessing the physiological reactions a student exhibits under stress, anxiety, or frustration. These reactions could be measured through posture analysis, skin-conductance sensors, EEG brain-wave patterns, and eye-tracking (pp. 41-45). And the report barely mentions the appalling invasion of privacy this kind of physiological measurement would entail; rather, it focuses on the “problem” that this isn’t practical for the classroom – yet (p. 45).

The Grit, Tenacity, and Perseverance authors also drew a direct line to the Common Core Standards, noting that the math standards expressly require perseverance in struggling through problems (p. 6). If it is in the Standards, they reason, it must be measured.

Another USED report that came out around the same time focused on the enormous windfall of student data that will result from digital-learning technologies and digital assessment. This report was authored primarily by Karen Cator, who now heads up another federal data-development project. The type of digital learning that the Cator report promotes is not simply an alternative means of accessing text or lectures. Rather, it’s the type of computerized products that work by stimulus-response – the student sees something on the screen and has to choose a response, which leads to another prompt, and so on. Think Pavlov. This type of technological interplay generates enormous amounts of data on each student’s behaviors and dispositions; the term for it is “data exhaust.”

The Cator report urges that this data exhaust be used to develop individual profiles on students, that it be shared with various institutions and other stakeholders who may have an interest, and that it be used “for studying the noncognitive aspects of 21st-century skills, namely, interpersonal skills (such as communication, collaboration, and leadership) and intrapersonal skills (such as persistence and self-regulation).”

USED also emphasizes that the gathering of this “extremely fine-grained information” on students will help with the implementation of the Common Core Standards, which promote “deeper learning objectives” rather than acquisition of academic knowledge.

In October 2013, USED hosted a conference to explore the possibilities of implementing Common Core with the help of this intrusive digital learning. They called this conference “Datapalooza.” The CEO of one educational technology company waxed enthusiastic about the future. He said, “We are collecting billions of records of data . . . pulling data from everywhere . . . tens of thousands of places.” This data, he said, will help students develop the “21st-century skills” that the government has determined students will need.

And how are these “21st-century skills” being promoted in the classroom? Through Common Core. “Common Core,” he said, “is the glue that ties everything together.”

I haven’t even mentioned the ever-present problem of data-security. Hacking into student databases will occur, and in fact has already occurred. The wealth of data collected on students and their families is a hugely tempting target for people with malicious motives. But as serious as this problem is, the deeper problem is that the government has deemed our children little machines to be programmed, “human capital” to be exploited. Progressives have yearned to do this for at least 100 years – now they have the technology to do it. And the Common Core Standards, which diminish academic knowledge in favor of the “21st-century skills” that are developed and measured by this technology, are their passport to the Progressive future.

Comments

I wrote and made the recommendation that was approved at the 1960 White House Conference on Children and Youth, which went to Congress and was signed into law by President Kennedy, establishing vocational programs in America except for DECA. This was to help lower the high school dropout rate. I was a 17-year-old high school senior from Louisiana at the time. To give an example of the incompetence of all these so-called education experts: children are sent home to study for a test and have never been taught a course on how to study. Professor Olney has DVDs on Where There Is a Will There Is an A. There are dozens of programs to help students at all levels. After 37 years of reeducating myself I would debate any of these Drs. and PhDs. Four of our Founding Fathers died on July 4th. It is not in the book and the teacher does not know it. I know why they died on July 4th and I challenge anyone to answer why.

Watch this video; trust me, you have to see it to believe it. This was filmed at the White House in 2012 during a “Datapalooza” and shows the vision of data collection and data mining in public education. The video is of Jose Ferreira, the CEO of Knewton. Knewton’s largest partners are . . . get ready for this . . . Pearson and Microsoft.

This video helps tie all of the pieces together of a very well thought out strategy that has been in the works for years. A very small number of elite individuals including Gates and Duncan have the vision of revolutionizing global education using a computer based system called Next Generation Learning or Individualized Learning. Some claim it will personalize education. Others claim it will use data mining of powerful, predictive and personal information to profile your child and put them on a predefined educational track. While the benefits seem intriguing at first glance, the risks are simply beyond comprehension if we don’t protect our children’s privacy. All of this is happening without transparency because the elites know parents and teachers would never go for it.

Loopholes were added to the Family Educational Rights and Privacy Act (FERPA) in 2008/2011 to allow these companies access to students’ personally identifiable information (PII) without a parent’s KNOWLEDGE or CONSENT in order to “conduct research” or “improve instruction.” The Common Core national standards provide the tagging structure used for the “atomic concept level” data to be gathered in a uniform way. Common Core is not a curriculum, but the more detailed the tagging structure becomes, the closer and closer it gets to a national curriculum.

But . . . but . . . but the UN will save us from hackers by installing a kill switch; see, if we all are shut out then the hacking just won’t happen. We need to be controlled and tamed and herded like cattle, we need to be managed by some well chosen elites and corporate interests.

Campbell’s Law

"The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."