Canadian Journal of Learning and Technology

Volume 32(1) Winter / hiver 2006

Using asynchronous online discussion to learn introductory programming: An exploratory analysis

Robin Kay

Authors

Dr. Robin Kay is an Assistant Professor, University of Ontario Institute of Technology, Faculty of Education, 2000 Simcoe St. North, Oshawa, Ontario, L1H 7L7. Correspondence regarding this article can be directed to him at: robin.kay@uoit.ca

Abstract

Abstract: Previous research on online discussions has focused on university students learning higher level subjects. The purpose of the current study was to examine whether online discussions could be used effectively by secondary school students attempting to learn introductory level topics. Forty-five male students, ranging in age from 13 to 15 years old, participated in two consecutive online discussions used to supplement the learning of HTML (24 days) and beginning programming (36 days) respectively. Students were able to actively understand and apply new concepts and procedures using an online discussion format. The majority of students posted clear, good quality messages that covered material which went beyond the course curriculum. Although attitudes toward using online discussions and participation rates were uneven, most students reported gaining useful information from the discussion board. Almost three quarters of all discussion threads were resolved. Finally, and perhaps most important, participation in the discussion board was significantly and positively correlated with learning performance.

Résumé: Une recherche antérieure sur les discussions en ligne mettait l’accent sur des étudiants de l’université s’intéressant à des matières plus avancées. L’objet de l’étude en cours était d’examiner si les discussions en ligne peuvent être utilisées efficacement par les élèves du secondaire qui tentent de s’initier à des cours de base. Quarante-cinq étudiants masculins, âgés de 13 à 15 ans, ont participé à deux discussions en ligne consécutives utilisées pour compléter l’apprentissage du HTML (24 jours) et pour débuter la programmation (36 jours). Les étudiants ont été en mesure de très bien comprendre et d’appliquer de nouveaux concepts et de nouvelles procédures en discutant en ligne. La majorité des élèves ont enregistré des messages clairs et de bonne qualité qui abordaient de la matière qui allait au delà du plan de cours. Même si on a observé certains écarts quant aux attitudes relatives à l’utilisation de discussions en ligne et quant au taux de participation, la plupart des élèves ont déclaré avoir pu tirer profit du groupe de discussion. Plus de trois-quarts des sujets abordés ont trouvé réponses. La dernière conclusion, et non la moindre, la participation au groupe de discussion était corrélée au rendement de l’apprentissage de façon importante et positive.

Overview

It has been well documented that properly structured, face-to-face, cooperative learning can have significant positive effects on learning (e.g., Dewey, 1966; Johnson & Johnson, 1994, 1998; Kagan, 1997; Sharon, 1999; Slavin, 1995), particularly when students are actively constructing meaning (e.g., Bereiter & Scardamalia, 1989; Bruner, 1983, 1986; Scardamalia & Bereiter, 1999; Vygotsky, 1978). Over the past ten years, considerable research has examined whether the benefits of cooperative and constructive learning models can be extended to a computer-based environment in the form of online discussion (e.g., Berge & Muilenburg, 2000; Blignaut & Trollip, 2003; Bonk & King, 1998; Burstall, 2000; Hara, Bonk, & Angeli, 1998; Knowlton & Knowlton, 2001; Li, 2003; Loomis, 2000; Love, 2002; Mazzolini & Maddison, 2003; Schrum & Hong, 2002; Shaw & Pieter, 2000; Son, 2002; Wickstrom, 2003; Wu & Hiltz, 2004; Zhu, 1998). The vast majority of these studies have focused on higher education students. Considerably less has been written on the use of online discussion by secondary school students (Knowlton & Knowlton, 2001; Love, 2002). However, the evidence supporting cooperative learning and constructivism applies to a wide range of age groups, including adolescents (Johnson & Johnson, 1994, 1998; Kagan, 1997). It is reasonable, then, to expand the analysis of online discussions to secondary school students.

The focus of online discussions has been on non-technical, higher level subject areas, such as astronomy (Mazzolini & Maddison, 2003), cognitive psychology (Hara et al., 1998), English (Love, 2002), environmental studies (Thomas, 2002), nutrition (Shaw & Pieter, 2000), reading assessment (Wickstrom, 2003), research methods (Loomis, 2000) and teacher education (Blignaut & Trollip, 2003; Li, 2003; Son, 2002). No formal studies could be found involving the use of online discussion in more technical subject areas, such as computer science or mathematics, particularly with respect to learning introductory level concepts. The success of cooperative learning and constructivist models is not dependent on topics that are non-technical or higher level (Johnson & Johnson, 1994, 1998; Kagan, 1997). These models have been used in a wide range of subject areas, depending on the level of the students being taught. Given that our understanding of how to use online discussion in effective and meaningful ways has been described as minimal at best (Blignaut & Trollip, 2003), it is prudent to expand the range of topics and cognitive processing levels that have been investigated to date.

The purpose of this study is to broaden the scope of previous online discussion research by looking at younger participants (Grade 9 students) who were learning introductory level concepts in a technical subject area (basic computer programming).

Literature Review

A formal model has not been developed to coordinate the wide range of results reported on the use of online discussions. This study proposes a framework based on Stephen Ceci’s (1990) model of intellectual development. Ceci’s framework includes three key components: context, person and process. The context component refers to environmental influences (e.g., initial question, role of teacher, content of discussion), the person component encompasses personality dispositions (e.g., attitude, style, ability), and the process component incorporates mental dispositions (e.g., cognitive processing, social interaction). Ceci’s model was chosen because (a) it is not excessively modular, like Gardner’s (1983) theory of multiple intelligences and therefore does not restrict the range of variables that might influence the use of online discussions; (b) it does account for process components, unlike Gardner’s (1983) theory; (c) process components are considered domain-specific, unlike those in Sternberg’s (1990) model of intelligence, and this assumption is consistent with current expertise research (Ericsson & Smith, 1991); and (d) Ceci’s person components permit an analysis of non-intellectual factors such as attitude and style. Therefore, the “context–person–process” framework was used as a lens to organize and interpret online discussion literature over the past ten years.

Context Components

The context in which online discussion takes place has been looked at from several perspectives: quality of initial question in a thread (Berge & Muilenburg, 2000; Hara et al., 1998; Savage, 1998; Wickstrom, 2003), role of the educator/teacher (Berge & Muilenburg, 2000; Blignaut & Trollip, 2003; Burstall, 2000; Figallo, 1998; Hara et al., 1998; Knowlton & Knowlton, 2001; Li, 2003; Love, 2002; Mazzolini & Maddison, 2003; Moller, 1998; Wickstrom, 2003), navigational structure (Burstall, 2000; Hammond, 2000; Hara et al., 1998; Knowlton & Knowlton, 2001; Son, 2002; Wickstrom, 2003), and location of learning (e.g., Hara et al., 1998; Love, 2002; Schrum & Hong, 2002).

Previous research suggests that the initial question starting a discussion thread is germane to the quality of subsequent interaction (Berge & Muilenburg, 2000; Hara et al., 1998; Savage, 1998; Wickstrom, 2003). Specifically, more successful questions are clear (Berge & Muilenburg, 2000), provocative (Love, 2002) and promote higher level thinking (Savage, 1998). One might predict that questions about less provocative and basic concepts like beginning programming would not stimulate effective discussions. However, Hara et al. (1998) noted that clear patterns in questioning were challenging to discover.

The role of the educator in online discussion has received considerable attention (Berge & Muilenburg, 2000; Blignaut & Trollip, 2003; Burstall, 2000; Figallo, 1998; Hara et al., 1998; Knowlton & Knowlton, 2001; Li, 2003; Love, 2002; Mazzolini & Maddison, 2003; Moller, 1998; Wickstrom, 2003), although researchers have yet to agree on the most appropriate strategy for involvement. One school of thought proposes that educators are critical to the success of an online discussion (Blignaut & Trollip, 2003; Figallo, 1998; Knowlton & Knowlton, 2001; Love, 2002; Moller, 1998). The educator serves to raise the discussion to a higher level (Figallo, 1998). Moreover, giving students the responsibility to determine the direction of discussions may not be a viable approach (Moller, 1998). The other school of thought claims that educators should let students construct their own knowledge (Burstall, 2000; Li, 2003; Mazzolini & Maddison, 2003). These researchers have reported that peer messages are more effective than educator messages at stimulating discussion and that instructor presence can actually shut dialogue down (Li, 2003; Mazzolini & Maddison, 2003). One might speculate that more active participation of an instructor may not be necessary for lower level subject areas because there is no need to ask higher level questions. However, an instructor may need to be more involved in order to correct misconceptions that develop early in the learning process.

Students and instructors can face considerable problems trying to navigate through a typical discussion board. Specific problems observed include length of messages (Hara et al., 1998), number of entries (Burstall, 2000; Hara et al., 1998; Hammond, 2000; Knowlton & Knowlton, 2001; Son, 2002; Wickstrom, 2003), unclear subject lines (Hara et al., 1998) and lack of organization (Chen & Hung, 2002; Li, 2003). In other words, the number and length of messages can be overwhelming, particularly if messages are not organized well. Chen and Hung (2002) add that the traditional threaded discussion format may be inadequate for true knowledge building. There appears to be no obvious reason why secondary school students learning beginning level programming would experience fewer or more navigation problems than university level students studying more complex subject areas.

A large number of online discussions are used in conjunction with face-to-face learning (e.g., Hara et al., 1998; Love, 2002; Schrum & Hong, 2002), yet there is no research on how much discussion actually goes on outside of the school environment. While the assumption may be that students are spending time reflecting and posting messages at home, there are no data to support this supposition.

Person Components

A wide range of person characteristics have been investigated with respect to online discussion use. These characteristics can be divided into three subcategories: attitudes, ability, and style. Attitude results have been mixed, with some students reporting relatively positive perceptions of online discussion use (Hammond, 2000; Son, 2002), and others, especially beginners, being reticent to add messages (Hammond, 2000; Mazzolini & Maddison, 2003; Wickstrom, 2003).

Research on ability has noted that some students have considerable difficulty typing messages (Hammond, 2000; Hara et al., 1998). The negative influence of poor organization skills and low levels of self-discipline (Schrum & Hong, 2002) has also been documented.

Finally, research on style has revealed the role of humour, sarcasm, or tone in leading to misinterpretation (Berge & Muilenburg, 2000; Knowlton & Knowlton, 2001) and the distinct roles that students acquire when participating in online discussions: non-participants (Hammond, 2000; Wickstrom, 2003), reflectors (Hara et al., 1998), and mediators (Palloff & Pratt, 1999).

An argument could be made that technologically savvy secondary school students might be more comfortable with, and more able to use, online discussion than their higher education counterparts simply because computers appear to be playing an ever-increasing role in adolescent life both inside and outside of school (e.g., computer games, Internet use, instant messaging). More positive attitudes and higher ability, then, might lead to increased use of online discussion. It is less clear, though, whether secondary school students would assume the specific roles observed with higher education students, such as reflectors and mediators.

Process Components

The process elements of online discussion have not been looked at in extensive detail. Two key areas have been examined: social learning and cognitive processes. Vygotsky (1978) was a pioneer in exploring the role of language in thought. He noted that conceptual learning was a collaborative effort requiring supportive dialogue. Slavin (1995) added that widespread research supports the positive effects of cooperative learning on achievement. Online discussion, then, has the potential to support collaboration and concept development. A number of researchers, though, have reported that true social interaction leading to cognitive development is rare (e.g., Berge & Muilenburg, 2000; Hara et al., 1998; Son, 2002; Wickstrom, 2003). In fact, Hara et al. (1998) report that most students rarely participate a second time in an online discussion thread. One might speculate that adolescent students would be more comfortable with technology, especially with respect to communicating with peers. Consequently, social interaction may be more frequent with this age group. However, it is difficult to predict whether increased interaction translates into socially constructed meaning and the learning of new concepts.

While detailed content analyses of online discussion have been done by several investigators (Berge & Muilenburg, 2000; Hara et al., 1998; Zhu, 1998), only Berge and Muilenburg (2000) and Knowlton and Knowlton (2001) used a theoretically-based taxonomy to investigate cognitive processes. To date, there is little systematic research to guide the use of online discussion in promoting higher level thinking, although Savage (1998) provides a list of reasonable, yet untested suggestions. The current study will use a revised version of Bloom’s taxonomy (Anderson & Krathwohl, 2001) to look at both the knowledge type and the processing level of online discussion messages.
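
To make this coding concrete, the short Python sketch below illustrates the two-dimensional tagging implied by the revised taxonomy: each message receives one knowledge type and one processing level, and the tags are tallied across all messages. The message identifiers and codes shown are hypothetical; the actual rubric used in this study appears in Appendix B.

    from collections import Counter

    KNOWLEDGE_TYPES = ["factual", "conceptual", "procedural", "metacognitive"]
    PROCESS_LEVELS = ["remembering", "understanding", "applying",
                      "analyzing", "evaluating", "creating"]

    # Hypothetical codes: each message is assigned one knowledge type and
    # one processing level according to the rubric (see Appendix B).
    coded_messages = {
        "msg-001": ("procedural", "applying"),
        "msg-002": ("factual", "remembering"),
        "msg-003": ("conceptual", "understanding"),
    }

    knowledge_counts = Counter(k for k, _ in coded_messages.values())
    process_counts = Counter(p for _, p in coded_messages.values())

    # Report tallies in taxonomy order.
    print({k: knowledge_counts.get(k, 0) for k in KNOWLEDGE_TYPES})
    print({p: process_counts.get(p, 0) for p in PROCESS_LEVELS})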

It should be noted that because the sample in this study was collected from secondary school students who were 13 to 15 years old, the level of knowledge and processing may be concentrated at the concrete operational stage (Piaget, 1954, 1974). In other words, some students may not have developed the ability to think at a metacognitive or abstract level. Furthermore, the introductory level of the topic might preclude the use of higher level thinking skills like analyzing or creating.

Summary

Stephen Ceci’s (1990) three-pronged “context, person and process” model of intellectual development was used to frame the literature review on online discussion use. Contextual components included the quality of the question starting off a discussion thread, the role of the educator, the navigational structure of the discussion board, and the location of learning. Person components included attitudes, ability, and style. Process components included social learning and cognitive processes. This model is illustrated in Figure 1.

Figure 1. Model for Examining Discussion Board Use

The principal research question guiding this study is “Can secondary school students participate meaningfully and effectively in online discussion about a subject area that is technical and situated at an introductory level?” The research methods used to explore this question are summarized next.

Method

Sample

The sample tested consisted of 45 secondary school students enrolled in an introductory computer science course at a private boys’ school in a metropolitan area. The students, all male and ranging in age from 13 to 15 years old, were divided into two classes consisting of 22 and 23 students. The assignment of a student to a particular class was based solely on his schedule at the beginning of the year. The data were collected and analyzed a year after the students finished the course. Post facto permission to use the data was obtained on the condition that the teacher, who saw all discussion board data and grades when the course was given, would be the only person to view and analyze the data. All names were removed from the data to preserve anonymity.

Procedure

Students were asked to contribute messages in two consecutive asynchronous online discussions used to supplement the learning of HTML (24 days) and beginning programming (36 days). The online discussions were part of a regular face-to-face course that met every other day for 90 minutes. Participation in the online discussion was worth 10% of the final grade. Specific grading guidelines were not provided in order to encourage as much participation as possible. It was emphasized that messages consisting of questions or answers would be given equal weighting. It is worth noting that the majority of discussion board research is based on courses where participation is graded (e.g., Burstall, 2000; Hara et al., 1998; Li, 2003; Love, 2002; Mazzolini & Maddison, 2003; Schrum & Hong, 2002; Son, 2002; Thomas, 2002; Wickstrom, 2003). The discussion board was intended to be student-led, and the teacher would only intervene if there were problems that students could not resolve. After each of the course topics was completed (HTML first, programming last), students were asked to fill in a survey consisting of two open-ended questions.

Data Collection and Analysis

Acquiring cohesive and useful information on the use of discussion boards is partially dependent on developing a consistent, comprehensive, theory-driven metric to assess quality and effectiveness. Currently there is considerable variation in the tools used to assess online discussion boards (e.g., Berge & Muilenburg, 2000; Blignaut & Trollip, 2003; Burstall, 2000; Henri, 1992; Love, 2002; Mazzolini & Maddison, 2003; Wickstrom, 2003), which makes it difficult to combine results into a cohesive base of knowledge to guide practice and education. In addition, the majority of studies have looked at only one or two aspects of online discussion in detail. Several researchers have attempted more complete and detailed analyses (e.g., Hara et al., 1998; Zhu, 1998), although the scope is still somewhat limited with respect to the full range of factors that could influence successful performance.

Because a clear, comprehensive, theoretically-based metric of discussion board use has yet to be developed, four steps were followed to collect and analyze data. First, an extensive review of the literature was done to identify measures used to examine and evaluate discussion board use. Second, the measures were organized according to Ceci’s (1990) “context–person–process” model of intellectual development. Third, three sources of data were used in order to analyze all of the key measures identified in the literature review. These included (a) coded online discussion board messages, (b) statistics accumulated by the discussion board software (Blackboard 5.0) on actual use, and (c) attitude survey data at the end of each topic. Finally, where possible, two or more variables were used to evaluate specific context, person, and process components in order to improve accuracy and validity. Appendix A provides a summary of the specific context, person, and process components assessed, the list of variables used to measure the constructs, the data sources, and the references supporting the use of the metric.

Coding of Online Discussion Board Messages

In order to make the coding scheme as transparent as possible, Appendix B provides a detailed rubric for the key variables used in this study.

Statistics on Use

The Blackboard program automatically collected the following statistics: time when message was posted, number of times a message was read by others, number of visits an individual student made to the discussion board, number of days an individual student visited the discussion board, and total number of posts an individual student made.
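
Although these totals were produced automatically by Blackboard, the following minimal sketch (in Python, with hypothetical student identifiers, days, and values) illustrates how the per-student measures relate to a raw log of visits and posts.

    from collections import defaultdict

    # Hypothetical raw records; Blackboard 5.0 reported equivalent totals directly.
    visits = [("s01", "day01"), ("s01", "day01"), ("s01", "day03"), ("s02", "day02")]
    posts = [("s01", "day01"), ("s02", "day02"), ("s02", "day02")]

    per_student = defaultdict(lambda: {"visits": 0, "days": set(), "posts": 0})
    for student, day in visits:
        per_student[student]["visits"] += 1      # total visits to the board
        per_student[student]["days"].add(day)    # distinct days visited
    for student, _ in posts:
        per_student[student]["posts"] += 1       # total messages posted

    for student, s in sorted(per_student.items()):
        print(student, s["visits"], len(s["days"]), s["posts"])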

Survey Data

Two key questions were asked of students after they completed each course topic:

  1. Did you use the discussion board? Please explain in detail why or why not.
  2. Was the discussion board useful to you? Explain in detail why or why not.

The responses from students were examined to identify and categorize (a) reasons for participating or not participating and (b) why the discussion board was thought to be useful or not. Frequencies of each category were then calculated.
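
The responses themselves were categorized by hand; the frequency counts are straightforward, as the short sketch below illustrates. The category labels used here are purely hypothetical and are not drawn from the actual survey data.

    from collections import Counter

    # Hypothetical hand-coded categories assigned to open-ended survey responses.
    coded_reasons = [
        "prefer asking the teacher in person",
        "too many messages to search",
        "prefer asking the teacher in person",
        "other resources (book, Internet) were faster",
    ]

    for category, count in Counter(coded_reasons).most_common():
        print(f"{category}: {count}")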

Learning Performance

External measures of learning performance (term project and term test grades) were used to evaluate the overall effectiveness of participating in discussion boards. Only one previous study could be found that looked at how discussion board participation may have affected performance. In that study, Wu and Hiltz (2004) examined “perceived” rather than actual learning performance; that is, the students rated how much they had learned.

The current study examined the relationship between discussion board use and actual learning performance. More specifically, the final term project grade and final exam score were correlated with the total number of visits a student made to the discussion board, the total number of days that a student visited the discussion board, and the total number of messages posted.
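
As an illustration of this correlational analysis, the following Python sketch computes Pearson correlations between participation measures and grades. The column names and values are hypothetical stand-ins for the measures described above, not the study's actual data.

    import pandas as pd
    from scipy import stats

    # Hypothetical per-student records; the real study used Blackboard usage
    # statistics and course grades for 45 students.
    students = pd.DataFrame({
        "visits":        [12, 40, 3, 25, 18],
        "days_visited":  [6, 20, 2, 14, 9],
        "posts":         [2, 9, 0, 5, 4],
        "project_grade": [71, 88, 60, 83, 79],
        "exam_grade":    [68, 90, 55, 80, 75],
    })

    # Correlate each participation measure with each performance measure.
    for usage in ["visits", "days_visited", "posts"]:
        for grade in ["project_grade", "exam_grade"]:
            r, p = stats.pearsonr(students[usage], students[grade])
            print(f"{usage} vs. {grade}: r = {r:.2f}, p = {p:.3f}")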

Results

Overview

Overall, a total of 260 messages were posted for both HTML and introductory programming topics. The mean length of a discussion thread was 3.5 messages (SD = 2.3; range 1 to 11 messages) and the average number of words per message was 48.3 (SD = 46.2; range 1 to 263 words). Subject lines were moderately clear (M = 1.68, SD = .9; scale range from 0 to 3) and the quality of messages was fair to good (M = 2.3, SD = .9; scale range from 0 to 4). A typical message was read an average of 29.5 times (SD = 11.3; range 2 to 77). The average time to respond to a posted message was 3630 minutes or 2.5 days (SD = 7377 minutes; range 1 to 49109 minutes).

With respect to content, a majority of messages were either related to or went beyond the official course curriculum covered in class (n = 223, 86%). The primary purpose of a typical message was to ask a question (n = 63, 24%) or to offer an answer (n = 175, 73%). The discussion board was rarely used for non-academic purposes (n = 15, 6%).

Context Components

Initial question: The impact of the initial question in a discussion thread was assessed by looking at the effect of five variables (whether the question was easily answered elsewhere, subject line clarity, message quality, knowledge type and processing level) on the average number of messages read in a discussion and the total number of messages posted. Ten one-way ANOVAs revealed no significant differences. In other words, the quality of the first question in a discussion thread appeared to have no significant impact on the amount of subsequent interaction that took place.
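
As an illustration only, the sketch below shows how one such one-way ANOVA could be set up, using the subject line clarity of the initial question as the grouping variable and the number of times messages in the thread were read as the outcome. The codes and values are hypothetical.

    from scipy import stats

    # Hypothetical grouping: reading counts for messages in threads whose
    # initial question was coded from 0 (unclear) to 3 (very clear) on
    # subject line clarity.
    reads_by_clarity = {
        0: [5, 9, 12],
        1: [18, 22, 15, 30],
        2: [25, 27, 33],
        3: [28, 31, 26, 40],
    }

    # One-way ANOVA comparing the groups defined by the clarity code.
    f, p = stats.f_oneway(*reads_by_clarity.values())
    print(f"F = {f:.2f}, p = {p:.3f}")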

Role of educator: Students initiated questions in 95% (n = 50) of the discussion threads started. Students also posted the last message in a discussion thread a majority of the time (n = 49, 89%). Overall, there were no significant differences between teacher and student messages with respect to the number of times each was read, length of message, and how fast a message received a response (response time).

Navigation: Navigation issues were examined by looking at the effect of subject line clarity and the location of a message within a thread (message number) on how often messages were read (reading rate) and how fast a message received a response (response time).

The clarity of a subject line was not significantly related to reading rate or response time. In other words, messages with clear subject lines were not read or responded to more quickly than messages with unclear or confusing subject lines.

Message number was significantly and negatively correlated with the average number of times the message was read (r = -.26; p < .001). There was a steady drop in the average number of times a message was read from the initial message (M = 39.18) to message number 11 (M = 14.5). Message number was not significantly related to response time.

From the post-task survey data, navigation was reported as the number one problem in using the discussion board in both the HTML and programming topics (n = 35; 54%). Specific concerns voiced were that it was hard to find specific content because there were too many messages and that too much clicking was required to access messages. Several students thought that the discussion board was being diluted with messages because students were being graded. A number of students suggested that there should be greater division and classification of topics to decrease navigation time.

Learning location: Just over half (n = 142; 55%) of all messages posted on the discussion board were completed outside of school hours. There were no significant differences between in-school and out-of-school messages with respect to clarity of subject line, message quality, response time, and number of words; however, messages posted out of school were rated as more difficult to answer than messages posted in school (p < .05).

Person Components

Attitude: The measure of attitude in this study focused on perceived usefulness of the discussion board. In the post-task survey, over one third of the students thought the online discussion was an effective learning tool (n = 24/65; 37%). With respect to actual use, 38% of the students used the discussion board frequently, 25% occasionally, and 27% not at all. Almost two thirds (65%) of the students reported that they had received useful information, while over one third (39%) thought they had provided helpful information. Eighty-two percent of the students indicated that the grade was not a key motivator for participating in the discussion board.

Ability: The discussion board ability of students was examined using four variables: message clarity, message quality, course knowledge, and external resources used. A majority of messages were clear (n = 174, 67%) or somewhat clear (n = 70; 27%). Only 16 messages (6%) were unclear. Message quality was good or excellent 41% of the time, fair 47% of the time, and poor or incorrect 12% of the time. The content of messages was directly related to (n = 78, 30%) or went beyond (n = 145, 56%) the course curriculum material. Non-academic (n = 24, 9%) and administrative issues (n = 6, 2%) were discussed infrequently. Finally, students did not cite resources in their messages one third of the time; however, they did refer to specific coding examples (HTML or programming) in 41% (n = 105) of all messages posted. Web site references were a distant second in terms of resources noted in a message (13%; n = 34).

Style: Individual style differences among students who posted at least five messages were observed in the average number of messages read (p < .001), average response time (p < .001), number of words used (p < .001), and message quality (p < .001). Students also differed with respect to the number of messages they posted, which ranged from 1 to 17. Students did not differ significantly with respect to clarity of subject line, knowledge type, and processing level demonstrated.

Process Components

Social learning: Evidence for social learning was based on length of discussion threads, number of messages read, primary focus of posted message, whether new knowledge was added, the number of situations where students participated two or more times in the same discussion thread, and whether the discussion thread was resolved.

The number of discussion threads containing 5 or more messages was 26/55 or 47%. The mean number of times a typical message was read was 29.5 (SD = 11.3) and ranged from 2 to 77 times. Specific responses to other students in the form of questions or answers occurred in 66% of all messages posted (n = 172). New knowledge was added either indirectly (n = 69, 27%) or directly (n = 103, 40%) in a majority of messages posted. Students participated in the same discussion thread two or more times in 13 out of 28 HTML discussion threads (46%) and 10 out of 35 programming discussion threads (29%). Finally, collective information was provided by students that either resolved (n = 51, 37%) or went beyond resolving (n = 49, 36%) questions asked in the discussion threads.

Cognitive processing: Based on Bloom’s revised taxonomy (Anderson & Krathwohl, 2001), the predominant knowledge type demonstrated was procedural (n = 140, 57%), followed by conceptual (n = 51, 21%) and factual (n = 50, 21%). Metacognitive knowledge was present in only 3 out of 244 messages evaluated. With respect to processing level, students displayed understanding most (n = 85, 35%), followed by remembering (n = 66, 27%), applying (n = 52, 22%), analyzing (n = 31, 13%) and evaluating (n = 10, 4%).

Learning performance

Learning performance (final grade on term project and term test) for both HTML and beginning programming topics was significantly and positively correlated with the number of visits, the number of days visited, and the number of messages posted, with one exception—the number of visits to the HTML discussion board was not significantly correlated with the final web page project grade (see Table 1).

The results above are supported by the post-task survey, where over one third of the students reported learning significant concepts using the discussion board (n = 24/65; 37%).

Table 1. Correlations Among Discussion Board Participation and Learning Performance Measures

Discussion

The main research question in this study was “Can secondary school students participate meaningfully and effectively in online discussion about a subject area that is technical and situated at an introductory level?” Overall, the evidence suggests that students of this age group can use online discussions successfully to learn basic programming. A majority of messages in the online discussion contained information that was related to or went beyond the course curriculum, and these messages were read frequently by nearly three quarters of the class. Over 70% of all discussion threads were resolved. Additionally, the significant correlations between discussion board participation and final grades support the premise that online discussion can supplement the learning of technical, introductory level concepts for secondary school students.

A more detailed analysis, guided by Ceci’s (1990) “context–person–process” model of intellectual development, reveals key similarities and differences between secondary school students discussing basic programming concepts and university students conversing about higher level concepts.

Context Components

Initial question: The impact of the initial question in a discussion thread had a marginal influence on participation, unlike previous results that specifically advocated the use of higher level, provocative questions (Berge & Muilenburg, 2000; Love, 2002; Savage, 1998). A reasonable argument could be made that higher-level, controversial questions would not be appropriate or necessary for a discussion board focussing on basic level programming. The role of an initial question, then, may be dependent on the content being discussed.

Role of educator: The educator in this study did not dominate or excessively stimulate discussion. Students initiated questions and provided conclusions for the vast majority of discussion threads. This finding supports the philosophy of allowing students to construct their own knowledge (Burstall, 2000; Li, 2003; Mazzolini & Maddison, 2003). Students were not only successful at interacting and building new knowledge, but their participation appears to be related to better performance on final projects and tests. This result does not preclude the possibility that they could have performed even better if the teacher had taken a more active role. It does indicate, though, that students are capable of taking responsibility for a discussion and learning new facts, concepts, and applications without significant teacher intervention and participation. Since much of the knowledge covered in the discussion board went beyond the standard curriculum and students participated in discussion outside of class more than fifty percent of the time, the online discussion board has the potential to be a powerful supplement to a traditional computer science classroom format.

Navigation: Navigation problems observed in this study were consistent with previous research. The large number of entries (Burstall, 2000; Hammond, 2000; Hara et al., 1998; Knowlton & Knowlton, 2001; Son, 2002; Wickstrom, 2003) and poor organization of messages (Chen & Hung, 2002; Li, 2003) were identified as problematic by secondary school students. Unlike previous results (Hara et al., 1998), subject line clarity had little impact on whether a student read or responded to a message. Students in this study, who started with a limited knowledge base, might have been less discriminating and more accepting of unclear subject lines while learning basic concepts.

The observation that reading rate dropped sharply after the first message, and then declined at a steady rate, brings up two critical questions: how many messages are users willing to read within a specific discussion, and why do they stop reading? Chen and Hung’s (2002) speculation that the traditional online discussion format is limited with respect to supporting true and personal knowledge building was not backed up by the current results. As stated earlier, most students, in spite of the navigation issues, managed to participate regularly and learn effectively. Nonetheless, features such as notifying the author of a message when there is a response to that message or providing specific prompts to encourage knowledge building may improve learning (Scardamalia & Bereiter, 1999).

Location of learning: The discussion board in this study was used to supplement the teaching of a secondary school course in introductory programming. Clearly, students were willing to participate outside of school hours. This result is even more powerful given that a majority of the topics covered on the discussion board went beyond the curriculum and that use of the discussion board was significantly correlated with learning performance. In fact, more difficult topics were discussed during out-of-school time than during in-school time.

Successful, meaningful, and effective use of discussion boards outside of school hours could prove to be beneficial to educators. In large classes, it is often not possible to answer the range and number of questions raised during class—the discussion board offers a viable and effective alternative.

Person Components

Attitude: The mixed attitudes toward online discussion expressed by secondary school students are consistent with previous research on higher education students (Hammond, 2000; Mazzolini & Maddison, 2003; Son, 2002; Wickstrom, 2003). Roughly one third of all students said they used the discussion board sparingly or not at all. They noted either that the discussion board did not match their personal style of learning or that they thought there were more efficient ways for them to learn (e.g., using a book, talking with someone, using the Internet). Another third of all students appeared to have an indifferent attitude toward online discussion, using it only on occasion. The final third were enthusiastic participants who received and offered new ideas frequently. These differences in use and attitude should be noted by educators. While some students may thrive with this tool, others need more convincing or may not be prepared to use the discussion board at all. Note that the policy of grading participation, practiced by numerous educators (e.g., Burstall, 2000; Hara et al., 1998; Li, 2003; Love, 2002; Mazzolini & Maddison, 2003; Schrum & Hong, 2002; Son, 2002; Thomas, 2002; Wickstrom, 2003), may need to be re-examined if only a limited number of students actively and enthusiastically contribute.

Ability: Secondary school students’ ability to use the discussion board appeared to be high. These students posted clear, good quality messages that met or exceeded course expectations almost 90% of the time. Poor typing skills (Hammond, 2000; Hara et al., 1998) and low levels of self-discipline (Schrum & Hong, 2002) reported previously did not appear to play a prominent role in this study.

Style: To date, individual differences in using discussion boards have not been looked at in a systematic way, although anecdotal evidence suggests that students have different styles or roles (e.g., Hammond, 2000; McGrath & Hollingshead, 1994; Palloff & Pratt, 1999; Wickstrom, 2003). The results from the current study support previous anecdotal observations. Students could be differentiated based on the number of messages they read, how fast they responded to messages, the number of words they wrote, and the overall quality of their messages.

Process Components

Social learning and cognitive processing: There is clear evidence that the secondary school students in this study were genuinely engaged in social activity. Many discussion threads contained five or more messages, and a majority of posts consisted of specific responses to fellow students’ comments and problems. As well, roughly one third of all students participated in the same discussion thread two or more times.

Social “activity” does not necessarily guarantee that social “learning” is taking place (e.g., Berge & Muilenburg, 2000; Hara et al., 1998; Son, 2002; Wickstrom, 2003). The analysis of the discussion board messages in this study, though, suggests that students were actively and cooperatively trying to understand and apply new HTML and programming concepts. Furthermore, they were successful in collectively resolving almost three quarters of all discussion threads.

However, metacognitive knowledge and the higher processing levels of analyzing and evaluating were not observed often. This result is partially predicted by Piaget (1954, 1974), who notes that formal operations may not occur in younger students. Case (1992, 1998) suggests that abstract thinking, indicative of the formal operations stage, might be accelerated if a student acquired more advanced knowledge of a particular topic. However, the two topics taught in this study were introductory in focus and it is unlikely that many students had a highly developed knowledge base. One other explanation for the absence of higher level knowledge and processing might rest in the task-oriented nature of the introductory computer science curriculum. Students were primarily focused on learning specific ways to carry out particular procedures, not reflecting on the process.

The final piece of evidence to suggest the existence of social learning was the positive and significant correlation between discussion board participation and performance on a term project and exam. Students who participated regularly in the online discussion performed better than students who were less involved.

Theoretical Implications

Three key theoretical implications can be drawn from this study. First, the evidence collected in this study suggests that (a) secondary school students are capable of and willing to engage in online discussion and (b) introductory level concepts of a more technical nature can be discussed meaningfully and productively using an online format. These findings are consistent with a considerable base of research on cooperative learning (Dewey, 1966; Johnson & Johnson, 1994, 1998; Kagan, 1997; Sharon, 1999; Slavin, 1995) and the principles of constructivism (Bereiter & Scardamalia, 1989; Bruner, 1983, 1986; Scardamalia & Bereiter, 1999; Vygotsky, 1978).

Second, Ceci’s (1990) model of intellectual development provided a much-needed descriptive framework to organize previous research in online discussion and interpret the results of this study. This kind of framework is an important start to bringing together the widely disparate results reported in previous online discussion research.

Third, the comprehensive, theoretically driven, collection of measures used in this study to assess the use of online discussion provided valuable and detailed information. Researchers should be encouraged to use more wide-ranging metrics in order to resolve discrepancies that currently exist in the online discussion board literature.

Educational Implications

There are a number of educational implications for the use of discussion boards that emerge from this study:

  1. Online discussion was used effectively by secondary school computer science students to solve significant problems that would not have been discussed during a traditional class where the student-teacher ratio is high;
  2. Topics on the discussion board not only exceeded standard curriculum expectations, they were resolved a majority of the time;
  3. Participation by the instructor was not critical or necessary for effective discussion to occur;
  4. Meaningful participation in online discussion occurred outside of school hours;
  5. There are individual differences among students in their use of discussion boards. Not all students learned using this tool; however, most students gained some useful information from online discussions;
  6. The commonly used policy of grading participation in online discussions may need to be revisited given that only one third of the students in this study actively participated;
  7. Regular participation in online discussion was significantly correlated with classroom performance.

Caveats

The results of this study should be treated with a certain degree of caution for the following reasons:

  1. The sample selection is clearly limited: all boys from a private school. Additional research needs to be done on more diverse populations.
  2. The topic of the discussion board in this study was computer-related. Different results might be observed for other introductory level topics.
  3. The analysis of attitude in this study was based on two open-ended questions. A more detailed examination using direct questions may reveal richer and more informative insights.
  4. Even though a majority of students said that grading participation did not affect their involvement, it would be worth examining online discussions where participation was not graded.
  5. Online discussion was used as a supplement to face-to-face classes. The results might be markedly different if students did not have face-to-face interaction.

Future Research

A natural extension of this study would be to examine more diverse sample populations and different technical subject areas such as mathematics or science. In addition, research is clearly needed on addressing the navigation challenges that all students, regardless of educational level, appear to experience. Exploring online, introductory level courses without face-to-face interaction would add another dimension to the results observed in this study. Finally, and perhaps most importantly, researchers of online discussion need to work toward building a model of discussion board use through the use of more systematic and comprehensive metrics.

References

Anderson, L. W., & Krathwohl, D. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.

Bereiter, C., & Scardamalia, M. (1989). Intentional learning as a goal of instruction. In L. B. Resnick (Ed.), Knowing, learning, and instruction (pp. 361–392). Hillsdale, NJ: Erlbaum Associates.

Berge, Z., & Muilenburg, L. (2000). Designing discussion questions for online, adult learning. Educational Technology, Sept-Oct, 53–56.

Blignaut, S., & Trollip, S. R. (2003). Developing a taxonomy of faculty participation in asynchronous learning environments – an exploratory investigation. Computers & Education, 41, 149–171.

Bonk, C. J., & King, K. S. (Eds.). (1998). Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse. Mahwah, NJ: Lawrence Erlbaum Associates.

Bruner, J. (1983). Child's talk: Learning to use language. Toronto, Canada: George J. McLeod Ltd.

Bruner, J. (1986). Actual minds, possible worlds. Cambridge, MA: Harvard University Press.

Burstall, J. (2000). Learning communities for social change in forums on the web. Australian Journal of Adult Learning, 40(1), 33–52.

Case, R. (1992). The mind’s staircase: Exploring the conceptual underpinnings of children’s thought and knowledge. Mahwah, NJ: Lawrence Erlbaum.

Case, R. (1998). The development of conceptual structures. In D. Kuhn & R. S. Siegler (Eds.), Handbook of child psychology: Vol. 2: Cognition, perception, and language (pp. 745–800). New York: Wiley.

Ceci, S. J. (1990). On intelligence … more or less: A bio-ecological treatise on intellectual development. Englewood Cliffs, NJ: Prentice Hall.

Chen, D., & Hung, D. (2002). Personalized knowledge representations: The missing half of online discussions. British Journal of Educational Technology, 33(3), 279–290.

Dewey, J. (1966). Democracy and Education. New York: Macmillan.

Ericsson, K. A., & Smith, J. (1991). Prospects and limits of the empirical study of expertise: An introduction. In K. A. Ericsson & J. Smith (Eds.), Toward a general theory of expertise (pp. 1–38). Cambridge: Cambridge University Press.

Figallo, C. (1998). Hosting web communities: Building relationships, increasing customer loyalty, and maintaining a competitive edge. New York: Wiley.

Gardner, H. (1983). Frames of Mind. New York: Basic Books Inc.

Hammond, M. (2000). Communication within on-line forums: The opportunities, the constraints and the value of a communicative approach. Computers & Education, 25, 251–262.

Hara, N., Bonk, C. J., & Angeli, C. (1998). Content analysis of online discussion in an applied educational psychology course (Center for Research on Learning and Technology, No. 2-98). Retrieved March 17, 2005, from http://crlt.indiana.edu/publications/techreport.pdf

Henri, F. (1992). Computer conferencing and content analysis. In A. R. Kaye (Ed.), Collaborative learning through computer conferencing: The Najaden papers (pp. 115–136). New York: Springer.

Johnson, D. W., & Johnson, R. T. (1994). An overview of cooperative learning. In J. S. Thousand, R. A. Villa, & A. I. Nevin (Eds.), Creativity and collaborative learning: A practical guide to empowering students and teachers (pp. 31–44). Baltimore, MD: Brookes.

Johnson, D. W., & Johnson, R. T. (1998). Learning together and alone: Cooperation, competition, and individualization (5th ed.). Englewood Cliffs, NJ: Prentice-Hall.

Kagan, S. (1997). Cooperative Learning (2nd ed.). San Juan Capistrano, CA: Resources for Teachers.

Knowlton, D. S., & Knowlton, H. M. (2001). The context and content of online discussions: Making cyber-discussions viable for the secondary school curriculum. American Secondary Education, 29(4), 38–52.

Li, Q. (2003). Would we teach without technology? A professor’s experience of teaching mathematics education incorporating the internet. Educational Research, 45(1), 61–77.

Loomis, K. D. (2000). Learning styles and asynchronous learning: Comparing the LASSI model to class performance. Journal of Asynchronous Learning Networks, 4(1), 23–32.

Love, K. (2002). Mapping online discussion in senior English. Journal of Adolescent & Adult Literacy, 45(5), 382–396.

Mazzolini, M., & Maddison, S. (2003). Sage, guide or ghost? The effect of instructor intervention on student participation in online discussion forums. Computers & Education, 40, 237–253.

McGrath, J., & Hollingshead, A. (1994). Groups interacting with technology. Thousand Oaks, CA: Sage.

Moller, L. (1998). Designing communities of learners for asynchronous distance education. Educational Technology Research and Development, 46(4), 115–122.

Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace – effective strategies for the online classroom. San Francisco: Jossey-Bass.

Piaget, J. (1954). The construction of reality in the child. New York: Basic Books.

Piaget, J. (1974). Understanding causality (D. Miles & M. Miles, Trans.). New York: Orion Press.

Savage, L. B. (1998). Eliciting critical thinking skills through questioning. The Clearing House, 71(5), 291–293.

Scardamalia, M., & Bereiter, C. (1999). Schools as knowledge-building organizations. In D. Keating & C. Hertzman (Eds.), Today’s children, tomorrow’s society: The developmental health and wealth of nations (pp. 274–289). New York: Guilford.

Schrum, L., & Hong, S. (2002). Dimensions and strategies for online success: Voices from experienced educators. Journal of Asynchronous Learning Networks, 6(1), 57–67.

Sharon, S. (Ed.). (1999). Handbook of Cooperative Learning Methods. Westport, CT: Praeger.

Shaw, G. P., & Pieter, W. (2000). The use of asynchronous learning networks in nutrition education: Student attitude, experiences, and performance. Journal of Asynchronous Learning Networks, 4(1), 40–51.

Slavin, R. E. (1995). Cooperative Learning (2nd ed.). New York: Longman.

Son, J. (2002). Online discussion in a CALL course for distance language teachers. CALICO Journal, 20(1), 127–144.

Sternberg, R. J. (1990). Metaphors of Mind. New York: Cambridge University Press.

Thomas, M. J. W. (2002). Learning within incoherent structures: an examination of the virtual space of on-line discussion forums. Journal of Computer Assisted Learning, 18(3), 351–366.

Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.

Wickstrom, C. D. (2003). A funny thing happened on the way to the forum. Journal of Adolescent & Adult Literacy, 46(5), 414–423.

Wu, D., & Hiltz, S. R. (2004). Predicting learning from asynchronous online discussions. Journal of Asynchronous Learning Networks, 8(2), 139–152.

Zhu, P. (1998). Learning and mentoring: Electronic discussion in a distance learning course. In C. J. Bonk, & K. S. King (Eds.), Electronic collaborators: Learner-centered technologies for literacy, apprenticeship, and discourse (pp. 233–259). Mahwah, NJ: Erlbaum.


Appendix A

Variables Used to Analyse Discussion Board Messages

Appendix B

Detailed Rubric for Assessing Online Discussion Data


ISSN: 1499-6685