+ Page 57 +
---------------------------------------------------------------------------
####### ######## ######## ###########
### ### ## ### ## # ### # Interpersonal Computing and
### ### ## ### ## ### Technology:
### ### ## ### ### An Electronic Journal for
### ######## ### ### the 21st Century
### ### ### ###
### ### ### ## ### ISSN: 1064-4326
### ### ### ## ### January 1996
####### ### ######## ### Volume 4, Number 1, pp. 57 - 74
---------------------------------------------------------------------------
Published by the Department of Education
University of Maryland Baltimore County
Additional support provided by Georgetown University
This article is archived as NEWMAN IPCTV4N1 on
LISTSERV@LISTSERV.GEORGETOWN.EDU
-------------------------------------------------------------------------
AN EXPERIMENT IN GROUP LEARNING TECHNOLOGY: EVALUATING CRITICAL THINKING IN
FACE-TO-FACE AND COMPUTER-SUPPORTED SEMINARS
D. R. Newman, Chris Johnson, Clive Cochrane and Brian Webb
Introduction
In 1992, a group of lecturers at Queen's University Belfast wished to
explore the possibilities of using computer-supported seminars as an
alternative to face-to-face seminars. We were faced with increasing class
sizes, leading to seminar groups of up to 30 students, only a few of whom
took part in each discussion. The hope was that computer conferencing could
be used to support discussions among more students without increasing the
lecturer's time.
However, it is important to ensure that the quality of the learning
does not decline. So we set up a controlled classroom experiment, in which
the students of an Information Society module did half their seminars face-
to-face, and half over a computer conferencing system. In each week, some
of the seminar groups had face-to-face seminars, while others went in each
day to our computer lab, logged on to our Network Telepathy computer
conferencing system (Ashmount Research 1992), looked at their own group
topic, and left comments on the subject being discussed. Over two weeks the
comments accumulated as the discussion continued. The face-to-face seminars
were held for one hour, with groups of 10 to 20 students. The lecturer used
the same approach in both, rather than trying to adapt his style to each
medium, so this was a comparison of average, rather than optimised, use of
each technology.
--------
Figure 1. Browsing an on-line tutorial on Network Telepathy
--------
+ Page 58 +
The main purpose of holding seminars in Information Society was to
encourage deep learning approaches among the students, in which they
achieved an in-depth understanding of the subject, rather than adopting a
surface learning approach that merely helps them pass examinations. In
particular, the
lecturer (Clive Cochrane) wished to encourage critical thinking about
contentious issues in IT and society, such as computers and privacy. It is
quite easy for face-to-face discussions to degenerate into monologues,
silence filled by the teacher, or an exchange of unjustified opinions. So
there is even a question of whether critical thinking takes place in face-
to-face seminars, let alone computer supported ones.
This experiment was analysed using two techniques. The students were
given a questionnaire to complete at the end of the semester, in which they
rated how much each technology contributed to features of critical thinking
as set out by Garrison. This was then analysed by factor analysis, and has
been reported elsewhere (Webb et al. 1994).
We also transcribed both face-to-face and computer-assisted seminars,
and analysed their contents. An earlier paper (Newman et al. 1995)
describes our content analysis method. This paper reports the results of
this content analysis in the first year we used computer conferencing as an
alternative to conventional seminars.
Theory
This section summarises the theories of critical thinking upon which we
based our content analysis method, for those who have not yet read the
methodology paper (Newman et al. 1995).
Garrison developed a theory of critical thinking, as a kind of
problem-solving process (Garrison 1992). Critical thinkers move through 5
stages, identifying a problem, defining it more clearly, exploring the
problem and possible solutions, evaluating their applicability, and then
integrating this understanding with existing knowledge. Although he
initially developed it as a means of studying individual distance learners,
it is well suited to the analysis of critical thinking within group
learning, since these same stages are followed.
+ Page 59 +
Table 1. Stages and skills in the critical thinking process
Garrison's CT stages Henri's critical reasoning skills
1. Problem identification Elementary clarification
a triggering event arouses observing or studying a
interest in a problem problem, identifying its elements,
observing their linkages
2. Problem definition In-depth clarification
define problem boundaries, analysing a problem to
ends and means understand its underlying
values, beliefs and assumptions
3. Problem exploration Inference
ability to see to heart of admitting or proposing an idea
problem based on deep based on links to admittedly
understanding of situation true propositions
4. Problem applicability Judgement
evaluation of alternative making decisions, evaluations
solutions and new ideas and criticisms
5. Problem integration Strategies
acting upon understanding to for application of solution
validate knowledge following on choice or decision
----------
Henri (1991) identified five dimensions for analysing Computer-Mediated
Communication (CMC): participative, social, interactive, cognitive
and metacognitive. Questions of deep learning and critical thinking are in
the cognitive dimension, so we concentrated on that. She laid out 5 skills
needed for critical reasoning. It turns out that each skill is used mainly
in one of Garrison's stages, as Table 1 shows. For each stage,
Garrison, Henri and we identified indicators that showed (or at least
suggested) critical thinking was taking place. For the content analysis, we
picked pairs of indicators: a +ve indicator, showing evidence of critical
thinking, and a -ve indicator, showing its opposite (e.g. uncritical
acceptance or denial, deviations from the subject).
These are discussed in detail in the earlier paper, and listed in
Appendix A, where we show how we think they map into Garrison's 5 stages of
critical thinking.
+ Page 60 +
Results and analysis
Five face-to-face seminars were tape recorded and transcribed. These were
from 3 groups (A, B and C) and covered 3 subjects (office automation,
privacy and the information industry). The same groups also took part in
discussions on the computer conferencing system. Each was given a topic for
discussion within the seminar group, and in addition there was a general
topic open to all students. These discussions were automatically captured
on disk by the system. Since the CC discussions were much shorter, all the
discussions on all subjects for each group were analysed together. So we
had:
Transcribed Group A Group B Group C All
seminars students
f2f seminars Privacy Privacy
Office Office
Automation Automation
Information
Industry
Computer All All All All
conferences subjects subjects subjects subjects
Each transcript was analysed by marking each statement that obviously
indicated deep (+) or surface (-) learning approaches, according to the
indicators explained in our IPCT-J paper (Newman et al. 1995). From these
we calculated ratios of the depth of processing. This is called the depth
of critical thinking (CT) ratio in this paper. It ranges from -1 (all
surface statements, no deep) to +1 (all deep statements, no surface). It is
calculated as:
depth of CT ratio = (x+ - x-)/(x+ + x-)
where x is one of the indicators, like justification, linking ideas or
relevance, x+ is the count of positive statements and x- is the count of
negative statements in a transcript. First look at the overall depth of CT
ratio, calculated from the total +ve and -ve counts.
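As a sketch of this calculation (the statement counts below are made up for illustration, not drawn from the study's data), the ratio is straightforward to compute:

```python
def depth_of_ct_ratio(pos: int, neg: int) -> float:
    """(x+ - x-) / (x+ + x-): ranges from -1 (all surface) to +1 (all deep)."""
    total = pos + neg
    if total == 0:
        raise ValueError("no markable statements in the transcript")
    return (pos - neg) / total

# e.g. 44 deep and 6 surface statements marked in a transcript:
print(depth_of_ct_ratio(44, 6))  # 0.76
```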
Overall comparison between seminars and computer conferences
------------------------------------------------------------
+ Page 61 +
Figure 2. Critical thinking is deeper in computer conferences.
Group Overall depth of CT ratio
-0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8
A c 0.76 ccccccccccccccccccc
s 0.62 sssssssssssssss
B c 0.81 cccccccccccccccccccc
s 0.57 ssssssssssssss
C c 0.89 cccccccccccccccccccccc
s 0.58 ssssssssssssss
c = computer conferences
s = seminars
---------
As Figure 2 shows, we found evidence for critical thinking in both
face-to-face seminars and computer conferences. The depth of critical
thinking ratios were more positive in computer conferences for all seminar
groups. This difference is significant at 4% as measured in a matched
sample difference test (t = 4.58 > critical value of 4.3). The highest
difference was found in group C's transcripts, which would be interesting,
if significant, since that was the largest seminar group. However, an
analysis of variance with group and technology as the factors showed no
significant difference between groups (Table 2).
---------
Table 2. ANOVA of overall depth of CT ratio, by group and
technology.
Source of
Variation SS df MS F P-value F crit
Groups 0.002 2 0.00 0.29 77.3% 19.00
Technologies 0.078 1 0.08 20.94 4.5% 18.51
Error 0.007 2 0.00
-------
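The matched-sample test reported above can be reproduced approximately from the overall ratios plotted in Figure 2; this sketch uses only the Python standard library, and because the figure values are rounded it gives a t close to, but not exactly, the reported 4.58:

```python
import math

# Paired (matched-sample) t-test on the overall depth of CT ratios,
# using the rounded values read from Figure 2; the paper's unrounded
# counts give the reported t = 4.58, so expect a slightly different t.
cc  = [0.76, 0.81, 0.89]   # computer conferences: groups A, B, C
sem = [0.62, 0.57, 0.58]   # face-to-face seminars: groups A, B, C

diffs = [c - s for c, s in zip(cc, sem)]
n = len(diffs)
mean = sum(diffs) / n
sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
t = mean / (sd / math.sqrt(n))
print(round(t, 2))               # close to the reported 4.58
```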
This increase in depth of critical thinking took place against a
background of reduced participation. We found 18 times more markable
statements per week in the seminar transcripts than in the computer
conferences. Perhaps they found writing in a computer conference to be less
spontaneous and take more thought and time than making a comment in a
seminar. To explore this, and other questions, we need to look in more
detail at the different elements of critical thinking.
+ Page 62 +
Effects on indicators of critical thinking
We can look at how different indicators of critical thinking are affected
by the technology used for the discussions, by grouping together all the
statement counts for relevance, importance, novelty, outside material,
linking ideas, justification, criticism, resolving ambiguity, widening the
discussion and practical grounding. Figure 3 shows the overall pattern,
comparing all the analysed computer conference and seminar transcripts. No
ratios were plotted for W (widening) and P (practical grounding) for the
computer conferences because the sample was too small to make the ratio
reliable.
--------
Figure 3. Patterns in depth of critical thinking by indicator
for all CC and f2f seminars.
Indicator Depth of CT ratio
-0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8
R c 0.88 ccccccccccccccccccccc
s 0.82 ssssssssssssssssssss
I c 0.89 cccccccccccccccccccccc
s 0.30 sssssss
N c 0.35 cccccccc
s 0.56 ssssssssssssss
O c 1.00 ccccccccccccccccccccccccc
s 0.74 ssssssssssssssssss
L c 0.29 ccccccc
s -0.04
J c 0.68 cccccccccccccccc
s 0.52 ssssssssssss
C c 0.92 ccccccccccccccccccccccc
s 0.77 sssssssssssssssssss
A c 0.00
s 0.02
W s 0.20 sssss
P s 0.88 sssssssssssssssssssss
c = computer conferences, s = seminars
--------
On average, the students taking part in the computer conferences
brought in relevant outside material more often and were better at linking
together ideas and solutions, while in the face-to-face seminars students
came up slightly more often with new rather than old ideas. The students
seem to have adopted a more serious, worthier, style when taking part in
the computer conferences, as if it were writing an essay, as shown by the
higher ratio for important statements.
+ Page 63 +
--------
Figure 4. The CC-Seminar differences in depth of CT ratio for
different indicators.
Indicator Difference in depth of CT ratios
-1.0 -0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8 1.0 1.2
R a 0.17 aaaa
b -0.05 b
c 0.19 cccc
I a 0.58 aaaaaaaaaaaaaa
b 0.61 bbbbbbbbbbbbbbb
c 0.00
N a 0.22 aaaaa
b -0.44 bbbbbbbbbbb
c 0.36 ccccccccc
O a 0.27 aaaaaa
b 0.24 bbbbbb
c 0.22 ccccc
L a -1.04 aaaaaaaaaaaaaaaaaaaaaaaaaa
b 1.39 bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb
c -0.42 cccccccccc
J a 0.00
b 0.43 bbbbbbbbbb
c 0.57 cccccccccccccc
C a 0.28 aaaaaa
b 0.06 b
c 0.33 cccccccc
A a -0.18 aaaa
a = group A, b = group B, c = group C
--------
How consistent are these differences? One test is to calculate the
differences between depth of CT ratios for matched samples: i.e. group A CC
- group A seminar, and so on. These are plotted in Figure 4. Only for the O
(bringing in outside knowledge and material) indicator is the difference
consistent for all three groups, significant at 0.3% (t = 18.8 > critical
value of 4.3). None of the other indicator differences are significant (at
p = 5%), nor is a repeated measures multivariate analysis of variance
(Table 3). Given the small sample size, this is not surprising, but there
are notable differences for I (importance), N (novelty) and L (linking
ideas) in some groups.
+ Page 64 +
------
Table 3. Repeated measures ANOVA of CC-Sem. differences by indicator.
Vari- Hypoth. Error F Sig. ETA Power
able MS MS of F Square
R 0.02 0.01 1.8 0.312 0.47 0.13
I 0.24 0.06 4.0 0.184 0.67 0.22
N 0.00 0.09 0.0 0.864 0.02 0.05
O 0.09 0.00 353.9 0.003 0.99 1.00
L 0.00 0.80 0.0 0.978 0.00 0.05
J 0.17 0.04 3.8 0.191 0.65 0.21
C 0.08 0.01 7.1 0.117 0.78 0.33
--------
Figure 6 shows the pattern for the computer conferences run by
different groups. The patterns are all similar except for the ratios for L
(linking ideas). Figure 5 shows a similar variability in L for face-to-face
seminars, and a similar variability in I (important statements). Looking
more closely at Figure 5, we notice that these differences are consistent
with the subject discussed. Although the group makeup and size are
different, the group seems to have less effect on the depth of CT ratios
the subject.
---------
Figure 5. Seminar depth of CT for different indicators.
Indicator Depth of CT ratio
-0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8
R a 0.90 aaaaaaaaaaaaaaaaaaaaaa
A 0.67 AAAAAAAAAAAAAAAA
b 0.89 bbbbbbbbbbbbbbbbbbbbbb
B 0.68 BBBBBBBBBBBBBBBBB
c 0.81 cccccccccccccccccccc
I a 1.00 aaaaaaaaaaaaaaaaaaaaaaaaa
A -0.17 AAAA
b 1.00 bbbbbbbbbbbbbbbbbbbbbbbbb
B -0.78 BBBBBBBBBBBBBBBBBBB
c 1.00 ccccccccccccccccccccccccc
N a 0.50 aaaaaaaaaaaa
A 0.60 AAAAAAAAAAAAAAA
b 0.73 bbbbbbbbbbbbbbbbbb
B 1.00 BBBBBBBBBBBBBBBBBBBBBBBBB
c 0.24 ccccc
O a 1.00 aaaaaaaaaaaaaaaaaaaaaaaaa
A 0.50 AAAAAAAAAAAA
b 0.33 bbbbbbbb
B 1.00 BBBBBBBBBBBBBBBBBBBBBBBBB
c 0.78 ccccccccccccccccccc
L a 0.33 aaaaaaaa
A -0.50 AAAAAAAAAAAA
b 0.33 bbbbbbbb
B -0.86 BBBBBBBBBBBBBBBBBBBBB
c 0.75 cccccccccccccccccc
J a 0.31 aaaaaaa
A 0.69 AAAAAAAAAAAAAAAAA
b 0.57 bbbbbbbbbbbbbb
B 0.56 BBBBBBBBBBBBBB
c 0.43 cccccccccc
C a 0.61 aaaaaaaaaaaaaaa
A 0.81 AAAAAAAAAAAAAAAAAAAA
b 0.78 bbbbbbbbbbbbbbbbbbb
B 1.00 BBBBBBBBBBBBBBBBBBBBBBBBB
c 0.67 cccccccccccccccc
A a 0.20 aaaaa
A 0.14 AAA
b 0.20 bbbbb
B -0.20 BBBBB
c -0.25 cccccc
W a 0.50 aaaaaaaaaaaa
A 1.00 AAAAAAAAAAAAAAAAAAAAAAAAA
b -1.00 bbbbbbbbbbbbbbbbbbbbbbbbb
B 0.50 BBBBBBBBBBBB
P a 1.00 aaaaaaaaaaaaaaaaaaaaaaaaa
A 0.75 AAAAAAAAAAAAAAAAAA
c 1.00 ccccccccccccccccccccccccc
a = group A Off. Auto.; A = group A Privacy
b = group B Info. Ind.; B = group B Privacy
c = group C Off. Auto.
+ Page 65 +
---------
Figure 6. CC depth of CT for different indicators.
Indicator Depth of CT ratio
-0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8
R a 1.00 aaaaaaaaaaaaaaaaaaaaaaaaa
b 0.76 bbbbbbbbbbbbbbbbbbb
c 1.00 ccccccccccccccccccccccccc
I a 1.00 aaaaaaaaaaaaaaaaaaaaaaaaa
b 0.67 bbbbbbbbbbbbbbbb
c 1.00 ccccccccccccccccccccccccc
N a 0.78 aaaaaaaaaaaaaaaaaaa
b 0.40 bbbbbbbbbb
c 0.60 ccccccccccccccc
O a 1.00 aaaaaaaaaaaaaaaaaaaaaaaaa
b 1.00 bbbbbbbbbbbbbbbbbbbbbbbbb
c 1.00 ccccccccccccccccccccccccc
L a -1.00 aaaaaaaaaaaaaaaaaaaaaaaaa
b 1.00 bbbbbbbbbbbbbbbbbbbbbbbbb
c 0.33 cccccccc
J a 0.50 aaaaaaaaaaaa
b 1.00 bbbbbbbbbbbbbbbbbbbbbbbbb
c 1.00 ccccccccccccccccccccccccc
C a 1.00 aaaaaaaaaaaaaaaaaaaaaaaaa
b 1.00 bbbbbbbbbbbbbbbbbbbbbbbbb
c 1.00 ccccccccccccccccccccccccc
A a 0.00
a = group A, b = group B, c = group C
----------
The I and L depth of CT ratios are negative for both privacy
discussions and positive for the others, whereas the N depth of CT ratios
are higher for the privacy discussions. It appears that the students
generated a lot of new ideas in their discussions of IT and privacy, and
kept the discussion wide, but were less able to link these ideas together,
resolve ambiguities, bring in relevant outside material, or to keep the
discussion centred on important, non-trivial, issues. It is with such
subjects that computer conferencing could be of most benefit, since it
shows signs of supporting those aspects of critical thinking that were most
lacking in this face-to-face discussion.
+ Page 66 +
----------
Table 4. Subjects discussed by CT indicators ANOVA.
Source of SS df MS F P-value F-crit
Variation
Subjects 0.53 1 0.53 10.6 0.50% 4.49
Indicators 3.56 7 0.51 10.1 0.01% 2.66
Interaction 3.5 7 0.5 9.98 0.01% 2.66
Within 0.8 16 0.05
Total 8.39 31
----------
As a check on the significance of the subject discussed upon the depth
of critical thinking ratios, we ran a 2-way analysis of variance on subject
by indicator, for the depth of CT ratios found in the two office automation
and the two privacy seminars.
This showed a strong, highly significant interaction effect. So the
subject discussed affected the depth of processing adopted in seminar
discussions differently for different indicators.
Effect of participation
Since the computer conference transcripts were shorter, we thought it
worthwhile to check if the differences we have found were affected by the
participation levels. A reasonable measure of participation is a count of
statements made, which we approximate as the sum of +ve and -ve indicators.
So we have plotted the CT ratios versus total counts for each indicator in
Figure 7 (for CC) and Figure 8 (for seminars).
There is no obvious relationship between the two for the computer
conference discussions. Nearly all the ratios are strongly +ve. For the
seminars, there is a tendency for most indicator ratios to increase with
participation (except perhaps novelty). There was no sign of the often
feared trivialisation of discussion as people talk more. Perhaps none of
the discussions got that excited!
----------
Figure 7. Quality versus participation for computer conference
discussions.
----------
Figure 8. Quality versus participation for seminars.
----------
Relating our findings to Garrison's stages of critical thinking
---------------------------------------------------------------
+ Page 67 +
We have not attempted to break down our discussion transcripts according to
which of Garrison's stages is taking place at any point. This is quite
difficult, since some individuals may be exploring the problem and
solutions while others are still defining it, so the stages overlap within
the group.
However, it is possible to get an indirect indication of the depth of
the critical thinking going on in each stage, by relating each indicator to
the stage in which it is most expected (see Appendix A). For example, we
would expect new problem-related information (NP+) to be introduced in
Garrison's stage 2. By mapping our indicators to Garrison's 5 stages, we
get an estimate of how deep the learning style is at each stage of critical
thinking or problem solving. The depth of critical thinking has been
plotted against Garrison's stages, for each seminar group.
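This aggregation can be sketched as a weighted sum over the indicator counts. The weights below merely imitate the fractional mapping style of Appendix A (e.g. a relevance statement counted half in stage 1 and half in stage 3); both the weights and the counts are hypothetical illustrations, not the study's data:

```python
# Sketch of the per-stage estimate: indicator counts are mapped onto
# Garrison's stages and combined into a depth of CT ratio per stage.
# Weights and counts are hypothetical, for illustration only.

stage_map = {                      # indicator -> {stage: weight}
    "R+": {1: 0.5, 3: 0.5}, "R-": {1: 0.5, 3: 0.5},
    "NP+": {2: 1.0},        "NP-": {2: 1.0},
    "JS+": {4: 1.0},        "JS-": {4: 1.0},
    "P+": {5: 1.0},         "P-": {5: 1.0},
}

counts = {"R+": 9, "R-": 1, "NP+": 4, "NP-": 2,
          "JS+": 6, "JS-": 3, "P+": 2, "P-": 0}

pos = {s: 0.0 for s in range(1, 6)}
neg = {s: 0.0 for s in range(1, 6)}
for ind, n in counts.items():
    for stage, w in stage_map[ind].items():
        (pos if ind.endswith("+") else neg)[stage] += w * n

for s in range(1, 6):
    total = pos[s] + neg[s]
    if total:
        print("stage", s, round((pos[s] - neg[s]) / total, 2))
```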
----------
Figure 9. Group B changes in depth of CT with Garrison's stages.
Stage Depth of CT ratio
-0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8
G1 c 0.76 ccccccccccccccccccc
s 0.69 sssssssssssssssss
S 0.65 SSSSSSSSSSSSSSSS
G2 c 0.80 cccccccccccccccccccc
s 0.53 sssssssssssss
S 0.20 SSSSS
G3 c 0.60 ccccccccccccccc
s 0.44 sssssssssss
S 0.25 SSSSSS
G4 c 0.93 ccccccccccccccccccccccc
s 0.75 ssssssssssssssssss
S 0.85 SSSSSSSSSSSSSSSSSSSSS
G5 c 0.75 cccccccccccccccccc
s 0.60 sssssssssssssss
S 0.33 SSSSSSSS
c = computer conference, all subjects
s = information industry seminar
S = privacy seminar
----------
+ Page 68 +
Content analysis of Group B's discussions showed an overall increase
in the depth of critical thinking across Garrison's stages when using
computer conferencing. A similar pattern was found for the other two
seminar groups.
----------
Figure 10. Group A changes in depth of CT with Garrison's stage.
Stage Depth of CT ratio
-0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8
G1 c 1.00 ccccccccccccccccccccccccc
s 0.87 sssssssssssssssssssss
S 0.71 SSSSSSSSSSSSSSSSS
G2 c 0.79 ccccccccccccccccccc
s 0.78 sssssssssssssssssss
S 0.45 SSSSSSSSSSS
G3 c 0.09 cc
s 0.49 ssssssssssss
S 0.41 SSSSSSSSSS
G4 c 0.83 cccccccccccccccccccc
s 0.57 ssssssssssssss
S 0.78 SSSSSSSSSSSSSSSSSSS
G5 c 1.00 ccccccccccccccccccccccccc
s 1.00 sssssssssssssssssssssssss
S 0.15 SSS
c = computer conference, all subjects
s = office automation seminar
S = privacy seminar
----------
Figure 11. Group C changes in depth of CT with Garrison's stages.
Stage Depth of CT ratio
-0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8
G1 c 1.00 ccccccccccccccccccccccccc
s 0.81 ssssssssssssssssssss
G2 c 1.00 ccccccccccccccccccccccccc
s 0.33 ssssssss
G3 c 0.52 ccccccccccccc
s 0.38 sssssssss
G4 c 1.00 ccccccccccccccccccccccccc
s 0.65 ssssssssssssssss
G5 c 1.00 ccccccccccccccccccccccccc
s 0.78 sssssssssssssssssss
c = computer conferences, s = office automation seminar
----------
+ Page 69 +
But for them, the advantage of computer conferencing was least during
the problem exploration phase, stage 3. During problem exploration, the
participants should be creatively exploring new ideas. This is a somewhat
different task to the structured problem-solving found in the problem
identification, problem definition and problem integration stages. Once
again, we find that computer conferencing helps the more structured, less
creative, parts of the critical thinking process. It is in stages 1 and 5 where
computer conferencing shows a significant consistent gain over face-to-face
seminars, as shown by the difference plot in Figure 12 and confirmed by
matched sample t-tests (Table 5).
-----------
Figure 12. Matched sample CC-seminar differences by Garrison's
stages
Stage Difference in depth of CT ratio
-0.8 -0.6 -0.4 -0.2 0.0 0.2 0.4 0.6 0.8
G1 a 0.18 aaaa
b 0.09 bb
c 0.19 cccc
G2 a 0.19 aaaa
b 0.44 bbbbbbbbbb
c 0.67 cccccccccccccccc
G3 a -0.36 aaaaaaaa
b 0.28 bbbbbbb
c 0.14 ccc
G4 a 0.15 aaa
b 0.13 bbb
c 0.35 cccccccc
G5 a 0.43 aaaaaaaaaa
b 0.31 bbbbbbb
c 0.22 ccccc
a = group A, b = group B, c = group C
-----------
Table 5. Matched sample CC-seminar differences by Garrison's
stages.
G1 G2 G3 G4 G5
A 0.18 0.19 -0.36 0.15 0.43
B 0.09 0.44 0.28 0.13 0.31
C 0.19 0.67 0.14 0.35 0.22
t 4.6 3.1 0.1 3.0 5.5
t-crit 4.3 4.3 4.3 4.3 4.3
p 4.4% 9.0% 93.0% 9.8% 3.2%
-----------
+ Page 70 +
If we now examine the privacy and office automation seminars in these
figures, we once again find a common pattern. There is a difference in the
depth of learning style adopted in stage 5, the integration of the problem
back into the world. There is good evidence for this integration in the
office automation discussions, but the privacy discussions seem to have
degenerated at this stage. So this looks like an effect of the discussion
subject on critical thinking.
To test this, we compared the CT ratios for the seminars in which the
same subjects (office automation and privacy) had been discussed. An
analysis of variance of these seminars on the same subject shows that all
the relationships were significant (see Table 6). There was an overall
difference between the depth of learning over the stages, and a significant
interaction between the subject and the stage, confirming the impression
given in Figures 10-12. Looking in more detail at each stage, none of the
differences with subject at different stages are significant at 5% in this
small sample, but stages 5, 4 and 1 come closest.
----------
Table 6. ANOVA of Garrison's stages by subject discussed in
seminars (office automation and privacy).
Source of SS df MS F P-value F crit
Variation
Subjects 0.17 1 0.17 8.69 1.5% 4.96
G1-5 stages 0.43 4 0.11 5.42 1.4% 3.48
Interaction 0.37 4 0.09 4.69 2.2% 3.48
Within 0.20 10 0.02
Total 1.18 19
----------
Table 7. F-tests on each of Garrison's stages for differences
between subjects in seminars.
Vari- Hypoth. Error F Sig. ETA Power
able MS MS of F Square
G1 0.014 0.002 7.06 0.12 0.88 0.34
G2 0.030 0.065 0.45 0.69 0.31 0.07
G3 0.007 0.009 0.72 0.58 0.42 0.08
G4 0.021 0.003 8.38 0.11 0.89 0.38
G5 0.210 0.021 10.12 0.09 0.91 0.43
----------
+ Page 71 +
Conclusions
In our earlier paper, we laid out a detailed, theory-based, methodology for
content analysis, in the hope that it would be a powerful way of studying
the quality of learning in group learning situations. When applied to our
small-scale experiment, it has produced some interesting findings:
1. We found more statements indicating critical thinking than the
opposite. So the cynical view that no critical thinking takes place in any
kind of seminar was not confirmed.
2. The next worry was that computer conferencing might reduce the
critical thinking in seminars. But in fact, the computer conference
discussions showed a significantly deeper overall critical thinking ratio
than the face-to-face seminars. This was independent of group differences.
3. However, the students said less in the computer-supported seminars.
From the content analysis, we do not know why. But we found a negative
factor in the factor analysis of student questionnaires that appears to be
related to difficulties of learning and using the computer conferencing
technology (Webb et al. 1994).
4. Is the difference in depth of CT ratio an effect of the different
levels of participation? For example, one might expect more deviations and
distractions in the longer or more energetic discussions found in the face-
to-face seminars, leading to lower ratios. But, if anything, the CT ratios
increased with participation.
5. Apart from the overall depth of critical thinking, the content
analysis technique allows us to study different aspects of critical
thinking, through the different indicators. We found deeper CT ratios for
bringing in outside material and experiences, linking ideas together, and
making important points in the computer conference transcripts. But for
some groups there were more new ideas in the face-to-face seminars. This
reflects a somewhat "worthy" style of messages on the computer conferencing
system, somewhat closer to points in an essay than oral conversation.
6. It is possible to get an idea of the depth of critical thinking
taking place in each of Garrison's stages of critical thinking by
considering which skills and indicators would be found in each stage. Doing
this, we found deeper critical thinking at all stages in computer
conferencing, but with the smallest difference in stage 3, problem
exploration. This is in the stage where most creativity is required,
including the generation of new ideas. It seems that the computer
conference discussions did not stimulate the writing down of new ideas.
This may be due to the self-censorship of new ideas before committing
finger to keyboard, or less spontaneity at the slower pace of asynchronous
computer conferencing.
+ Page 72 +
7. Stage 5, problem integration, was more affected by the subject
discussed than the technology used to support the discussion. Privacy
discussions were not brought to a successful conclusion integrating the
solutions into the students' knowledge.
8. The content analysis technique allows us to study the effects of
other things than just the technology used, such as teaching and learning
techniques, and, most noticeably in this case, the subject matter studied.
Finally, where do we go next? Since this was a small sample, of 3
seminar groups of 10-20 students over one semester, similar studies need to
be carried out on other classes. The content analysis technique seems to be
a powerful way of studying critical thinking in group learning, in
particular in the way it allows us to study different aspects of critical
thinking and the stages of the critical thinking process. A particularly
important issue to investigate is the effect of differing learning tasks
upon critical thinking: does all group learning follow the problem-solving
approach of Garrison's theory?
From our own results, we are now looking at new combinations of
technology and learning task to draw on the strengths found for each
technology. For example:
a) Exploring more easily learned technologies, to try and reduce the
negative factor mentioned above, such as later versions of PowWow, and
World-Wide Web based discussion systems.
b) Stimulating the generation of new ideas by bringing together
participants with differing experiences who could surprise each other, like
students in Northern Ireland and Brazil. This could be supported by
synchronous technologies, like Internet Relay Chat, Maven or Cu-See Me,
where they cannot be brought together in the same place.
c) Drawing on the identified strength of computer conferencing in
linking ideas together by setting up systems optimised for this, such as
ideas mapping software, like CM/1, or group editing environments, such as
WebShare, for use by student project groups.
d) Implementing computer support for proven techniques of encouraging
group learning face-to-face, including those mentioned by Gibbs and Jenkins
(1992) and creativity techniques (Burnett 1994).
REFERENCES
Ashmount Research (1992). Mailto:ashmount@ashmount.cix.compulink.co.uk for
details of the software. Later we changed to the Windows version of the
same software, called PowWow, but at the time of the work reported here we
used the DOS version.
Burnett, Andrew (1994) Computer-assisted creativity. In Peter Lloyd (ed.),
Groupware in the 21st Century. London: Adamantine Press.
+ Page 73 +
Garrison, D. R. (1992) Critical thinking and self-directed learning in
adult education: an analysis of responsibility and control issues. Adult
Education Quarterly, 42(3), 136-148.
Gibbs, G. & Jenkins, A. (1992) Teaching large classes in Higher Education.
London: Kogan Page.
Henri, F. (1991) Computer conferencing and content analysis. In O'Malley,
C. (Ed.) Computer Supported Collaborative Learning. Heidelberg: Springer-
Verlag.
Newman, D. R., Webb, B. & Cochrane, C. (1995) How to measure critical
thinking in face-to-face and computer supported seminars through content
analysis, IPCT-J, 3(2), 56-77.
Webb, B., Newman, D. R. & Cochrane, C. (1994) Towards a methodology for
evaluating the quality of student learning in a computer-mediated
conferencing environment. In Gibbs, G. (Ed.) Improving Student Learning:
Theory and Practice. Oxford: Oxford Centre for Staff Development, Oxford
Brookes University. 1st International Symposium Improving Student Learning:
Theory and Practice, Warwick University, Sept. 1993.
Appendix A. Mapping of CT indicators to Garrison's stages of
critical thinking.
Garrison's stages (+ deep, - surface)
1+ 2+ 3+ 4+ 5+ 1- 2- 3- 4- 5- Indicator
.5 .5 R+ relevant statements
.5 .5 R- irrelevant statements,
diversions
.5 .5 I+ Important points/issues
.5 .5 I- unimportant, trivial
points/issues
1 NP+ new problem related
information
1 NP- repeating what has been
said
1 NI+ New ideas for discussion
1 NI- False or trivial leads
1 NS+ new solutions to problems
1 NS- Accepting first offered
solution
1 NQ+ Welcoming new ideas
1 NQ- Squashing, putting down
new ideas
1 NL+ Student (learner) brings
new things in
1 NL- dragged in by tutor
1 AC+ Clear unambiguous
statements
1 AC- Confused statements
1 A+ Discuss ambiguities to
clear them up
1 A- Continue to ignore
ambiguities
1 OE+ Drawing on personal
experience
1 OC+ Refer to course material
1 OM+ Use relevant outside
material
1 OK+ Evidence of using
previous knowledge
1 OP+ course related problems
brought in
.5 .5 OQ+ Welcoming outside
knowledge
.5 .5 OQ- Squashing attempts to
bring in outside knowledge
.5 .5 O- Sticking to prejudice or
assumptions
1 L+ linking facts, ideas and
notions
1 L+ Generating new data from
information collected
1 L- Repeating information
without making inferences or
offering an interpretation
1 L- Stating that one shares
the ideas or opinions stated,
without taking these further
or adding any personal
comments
1 JP+ Providing proof or
examples
1 JP- Irrelevant or obscuring
questions or examples
1 JS+ Justifying solutions or
judgements
1 JS+ Setting out advantages
and disadvantages of
situation or solution
1 JS- Offering judgements or
solutions without
explanations or justification
1 JS- Offering several
solutions without suggesting
which is the most appropriate
1 C+ Critical
assessment/evaluation of own
or others contributions
1 C- Uncritical acceptance or
unreasoned rejection
1 CT+ Tutor prompts for
critical evaluation
1 CT- Tutor uncritically
accepts
1 P+ relate possible solutions
to familiar situations
1 P+ discuss practical utility
of new ideas
1 P- discuss in a vacuum
1 P- suggest impractical
solutions
.5 .5 W+ Widen discussion
.5 .5 W- Narrow discussion.
------------------------------------------------------------------------
+ Page 74 +
Interpersonal Computing and Technology: An Electronic Journal for the
21st Century
Copyright 1996 University of Maryland Baltimore County. Copyright of
individual articles in this publication is retained by the individual
authors. Copyright of the compilation as a whole is held by the
University of Maryland Baltimore County. It is asked that any
republication of this article state that the article was first published
in IPCT-J.
Contributions to IPCT-J can be submitted by electronic mail in APA style
to: Susan Barnes, Editor IPCT-J SBBARNES@PIPLELINE.COM