As you interact with the Services, metacog collects realtime streaming process data based on those interactions. This data is limited to HOW learners solve challenges, and we use NO NAMES, only keys representing anonymous users. We limit the data we collect to only what we believe is necessary to provide a better learning experience.

We do not ask certifying agencies, higher education institutions, corporations, publishers, learners, or assessment companies to submit Personally Identifiable Information. In rare cases we may unintentionally receive persistent identifiers; these are anonymized, kept secure, and remain the property of the content creators, not metacog.

Available, Reliable, Fault Tolerant and Scalable

Metacog is designed to satisfy a complex and challenging set of user and systems requirements, including near real-time data ingestion and queryability, as well as high availability, reliability, fault tolerance, and scalability for large data and query volumes.

Massive Data Volumes

Specifically, Metacog handles petabytes of data, processes millions of data point updates per second, and serves billions of queries that fetch trillions of results per day. Metacog was built to handle anywhere from tens to tens of millions of simultaneous users. There is no need to constantly maintain, administer, and resize brittle database platforms and schemas: metacog does that work for you automatically.

Data Consistency, Safety and Security

Metacog is geo-replicated across multiple datacenters and provides consistent and repeatable query answers at low latency, even when an entire datacenter fails.

Extensive API Documentation

Metacog maintains a separate developer site with documentation for the various metacog APIs. Visit www.developer.metacog.com to learn more. Because the metacog APIs can be implemented throughout your stack, you do not have to re-engineer your existing systems. Metacog can run in parallel with your current and future infrastructure and data storage, and does not require data scientists to implement.

Low Cost, Immediate Value

You can try metacog's sandbox for free to see how it works in your new or existing products. Metacog is built as a service that is very cost-effective (literally pennies on the dollar) compared to building and maintaining a highly performant Internet-scale data infrastructure and platform, and it is value-priced based on volume. Metacog enables you to run the services you need at scale. Whether you are a small or a large organization that wants to self-serve and purchase metacog, information on API pricing can be found here.

Based on 20 years of Foundational Educational Data Mining and Learning Analytics Research

Our guiding principle has been to provide a platform that fulfills the requirements of academic rigor and delivers suitable validity and reliability for our most demanding customers. While our platform is rooted in Psychometrics, Educational Data Mining, and Learning Analytics best practices, we also keep a keen eye toward scaling these capabilities via our easy-to-implement API. What this means for you is that you can rest assured that a wide range of studies and papers support our methodologies and implementation.

Data and Code Ownership

Code ownership: metacog owns the code running on its servers; that is, the stuff behind the API is ours. APIs serve as a technical contract boundary between caller (e.g., you, the customer) and callee (metacog). That is also the natural ownership boundary: metacog has no ownership of the code running on your servers; that is, the stuff in front of the API is yours. Any metacog client libraries you use within your code do not infringe on your ownership in any way.

Data ownership: You, the customer, own the data that metacog captures and can get it back at any time. You also own any computed results or derived data. metacog retains a license to use the data for two purposes: (1) automated regression testing, to make sure improvements to metacog don't change behavior in unexpected ways; and (2) aggregated, anonymized usage statistics; think of McDonald's old claim, "over 2 billion hamburgers served". No mention of customers' names or what kind of burgers, just a running count. We'd like to be able to compute and say "sessions captured" or "sessions scored".

What is the Difference Between Metacog and xAPI/TinCan?

They're both event streams whose aggregation supports conclusions about human performance. They differ in purpose but can complement each other for a full solution:

The primary difference is that xAPI was originally designed for results and achievements (learning statements) whereas metacog is designed to capture the complete learning process that led to the results.

A metacog-instrumented widget could generate thousands of events per second (in one case a partner is generating tens of thousands of events per second). The metacog platform is designed to handle this volume across millions of users. Most xAPI implementations aren't intended to scale to this level of detail.

Metacog offers the ability to define scoring rubrics, create training sets, and create machine-scoring models to score the event streams. xAPI is a standard format for emitting and collecting learning statements that is at the core of the next generation of learning record and management systems.
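As a rough sketch of what rubric-based machine scoring over an event stream could look like (the event schema, field names, and the `score_persistence` rubric below are invented for illustration; they are not metacog's actual API or data model):

```python
# Hypothetical event stream captured from one learner session.
# Event names and fields are illustrative, not metacog's actual schema.
session_events = [
    {"type": "hint_requested", "t": 12.4},
    {"type": "tool_used", "tool": "ruler", "t": 20.1},
    {"type": "answer_submitted", "correct": False, "t": 45.0},
    {"type": "answer_submitted", "correct": True, "t": 88.3},
]

def score_persistence(events):
    """Map an event stream onto a toy 0-2 rubric:
    2 = solved after retrying, 1 = solved on the first try, 0 = never solved."""
    attempts = [e for e in events if e["type"] == "answer_submitted"]
    solved = any(e["correct"] for e in attempts)
    if not solved:
        return 0
    return 2 if len(attempts) > 1 else 1

print(score_persistence(session_events))  # -> 2
```

A real deployment would learn such a scoring model from human-labeled training sets rather than hand-code it, but the shape is the same: event stream in, rubric level out.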

Metacog can act as an xAPI statement emitter, since it offers the ability to map scores into xAPI statements and forward them to an xAPI Learning Record Store.
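The statement structure below follows the public xAPI specification (actor, verb, object, result); the `to_xapi_statement` helper, the anonymous learner key, and the activity URL are illustrative assumptions, not metacog's actual emitter:

```python
import json

def to_xapi_statement(learner_key, activity_id, scaled_score):
    """Wrap a machine-computed score in an xAPI 'scored' statement.
    The actor is an anonymous account key, matching the no-PII model."""
    return {
        "actor": {
            "objectType": "Agent",
            "account": {"homePage": "https://example.com", "name": learner_key},
        },
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/scored",
            "display": {"en-US": "scored"},
        },
        "object": {
            "objectType": "Activity",
            "id": activity_id,
        },
        # xAPI "scaled" scores are normalized to the range [-1, 1].
        "result": {"score": {"scaled": scaled_score}},
    }

stmt = to_xapi_statement("anon-7f3a", "https://example.com/widgets/circuit-sim", 0.85)
print(json.dumps(stmt, indent=2))
```

A statement like this can be POSTed to any conformant Learning Record Store's statements endpoint.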

Combine the two and what do you get? xAPI records resulting from authentic learning experiences (scored by metacog), with metacog serving as the audit trail for the achievements recorded within an xAPI LRS.

What is the difference between metacog and Google Analytics? (or other analytics platforms)

Metacog is focused solely on learning analytics, and specifically at the semantic Learning Object level: it collects and analyzes learner METACOGNITIVE PROCESS data (streaming data on how learners solve challenges and interact with the Learning Object), not simple activity-level analytics. While both are analytics platforms that can count page-level visits and activity-level behaviors, metacog goes far deeper than simple descriptive statistics (visit and activity counts) to include full-spectrum process capabilities such as diagnostic, predictive, and prescriptive analytics. These powerful next-generation capabilities can be embedded easily when integrated into your learning products via metacog's APIs.

Build Smarter Products with our APIs and Process Data Platform

Metacog’s technology enables much deeper assessment challenges to optimize human learning and performance. Use AI to automatically improve technical, college and career readiness and to develop the harder-to-measure, complex 21st century problem-solving skills.

Experiential technologies like VR/AR/MR, Serious Games, and Simulations emphasize evidence-based demonstration of procedural knowledge and cannot be practically assessed using multiple-choice scoring. Immersive environments provide an incredible new way to assess the ability to synthesize knowledge in realistic situations, measuring the ability to PERFORM in lifelike conditions, not simply to MEMORIZE and regurgitate information.

The problem is that, apart from time-consuming hand-scoring, there has been no way to score open-ended challenges at scale and assess competencies from these rich, authentic, experiential learning challenges. Metacog's technology tracks complex problem-solving processes to learn what learners can actually do, automatically and inexpensively.

Use data to provide direct assessment for Competency-based Learning models

The technology industry has been very good at pushing finely crafted content to users, but until now there has been no way of hearing back from users at scale and obtaining direct evidence of utilization and competencies.

Users generate rich data streams as they solve challenges. This data can be used not only to understand how they arrived at their final answer, but also to assess the quality of the learning objects themselves, to more deeply understand the user's cognitive and non-cognitive state and emerging utilization pathways, and to determine what they should be doing next.

As users go about their normal interactions and solve challenges, metacog obtains the "data exhaust": incredibly fine-grained streaming information about performance, such as how they think, how they approach difficult problems, and whether they try very hard or give up easily, as well as higher-order cognitive processes. We believe this type of information is the next generation of valuable data.
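A minimal sketch of the kind of process features such a "data exhaust" stream supports, assuming an invented event schema with timestamps in seconds (none of these names come from metacog's actual platform):

```python
# Hypothetical timestamped event stream from one challenge session.
events = [
    {"type": "challenge_started", "t": 0.0},
    {"type": "tool_used", "t": 5.2},
    {"type": "answer_submitted", "correct": False, "t": 30.0},
    {"type": "tool_used", "t": 41.7},
    {"type": "answer_submitted", "correct": False, "t": 75.5},
    {"type": "session_ended", "t": 80.0},
]

def extract_features(events):
    """Derive simple process features: time on task, attempt count,
    and whether the learner quit without solving (a give-up signal)."""
    times = [e["t"] for e in events]
    attempts = [e for e in events if e["type"] == "answer_submitted"]
    return {
        "time_on_task": max(times) - min(times),
        "attempts": len(attempts),
        "gave_up": bool(attempts) and not any(e["correct"] for e in attempts),
    }

print(extract_features(events))
```

Features like these, aggregated across millions of sessions, are what feed the diagnostic and predictive analytics described above.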

Build smart + connected products: Compete in the new data age

Improve your data strategy, product development roadmap, and competitiveness in the coming data-driven years. In the Internet of Things (IoT) era, companies that can harness data, connect their products, and create systems of systems to optimize the customer experience will be the clear winners.

With metacog you have a platform that can easily data-enable your products to create and deliver optimal customer value without an army of additional data scientists. We make it possible to easily supplement your existing data people with subject matter experts, lowering costs, automating feature extraction and model building, and speeding your go-to-market strategy.

Envision fast-lifecycle learning products that far better prepare learners for success in complex skills, college, careers, and beyond. 21st century careers require 21st century learners and technologies to fill skills gaps. Metacog makes it fast, easy, and inexpensive to integrate advanced data capabilities into your new and existing products.