Authors
Terry Anderson is a professor and Canada Research Chair in Distance Education at Athabasca University. Correspondence concerning this article can be sent to Terrya@athabascau.ca
Abstract: Design-based research is proposed as a strategy to address the need for innovation in educational contexts. The article argues the case for increased and more effective research and development in education, and then presents a discussion of design-based research as a methodological set of tools to address this need. The four phases of design-based research (Bannan-Ritland, 2003) are explicated and an example, in which design-based methods are used to develop and assess a call centre innovation at Athabasca University, is presented.
Résumé [translated]: Design-based research is proposed as a strategy for addressing the need for innovation in educational contexts. The article argues that research and development in education must become more effective and more widespread, and then presents a discussion of design-based research as a methodological tool for meeting this need. The four stages of design-based research (Bannan-Ritland, 2003) are explained, and an example is presented in which design-based methods are used to develop and evaluate a new call centre at Athabasca University.
Design-based research is a relatively new methodology and research orientation that is proving to be an effective educational research tool. It has high potential to aid in the development and assessment of innovations in education such as e-learning. This paper presents a rationale for design-based research, describes its components and methodologies, and discusses its strengths and weaknesses. The paper concludes with a case study of a design-based investigation in which call centre organizational structures and technologies are used to cost-effectively enhance student support in a large distance education undergraduate program.
Design-based research arose as a strategic response to two compelling and related characteristics of educational research. The first is that there isn’t much of it!
An easy scapegoat is the lack of public or foundation funding for educational research. Although governments in most developed countries spend only slightly less on education than on health, the discrepancy in the amounts spent on research in these respective fields is large and growing. The amount spent in North America on educational research is estimated to be about 0.01% of total educational expenditures (Burkhardt & Schoenfeld, 2003). Health researchers, by contrast, have set a goal of 3%, or 300 times as much, for basic and applied health research. This can be compared to high technology information businesses, which typically spend 10-20% of their revenues on research and development. It is obvious that educational research suffers from an extremely impoverished funding base compared to related social and private enterprises. But is that a cause or merely a symptom of relatively low educational research productivity?
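Put as simple arithmetic on the figures cited above, the gap between the current estimate and the health-research goal is stark:

\[
\frac{3\%}{0.01\%} = 300
\]

and the 10-20% of revenues typical of high-technology firms is larger still.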
The second characteristic of the context is that education practice, unlike many other professions or economic activities, is not marked by regular and continuous innovation. Practitioners, politicians, parents, teachers and learners are frustrated at the general lack of innovation in schools. As long ago as 1922, John Dewey complained that "the school system represents not thinking but the domination of thought by the inertia of immemorial customs" (Dewey, 1998, p. 270). Branson (1998) contends that "there has been no sustained managed changes in the typical school model in the past one hundred and fifty years even though there has been enormous advances in educational research and development" (p. 5).
This lack of innovation is directly related to the absence of an effective research and development component within the educational system. Bereiter (2002) argues that predominant models of educational research that are based on correlation and experimental studies are ineffective at stimulating and sustaining innovation. Innovation is necessarily a slow process in conservative contexts, such as education, that are not primarily driven by bottom line economics. Yet sustained innovation is necessary if the current models of education are to meet the challenges and demands of a networked and knowledge society based on lifelong learning opportunities (Friesen & Anderson, 2004).
An essential component of effective strategic change is an active research and development component of the system, designed to ensure that pedagogical, technological, sociological, political and commercial changes and opportunities are both developed and exploited within that system. These insights from effective research and development originate both within education domains and in related disciplines from which they are imported. However, they must be transferable and have impact on ordinary educational contexts if they are to justify the necessary cost. Brown (1992) argues that "an effective intervention should be able to migrate from our experimental classroom to average classrooms operated by and for average students and teachers, supported by realistic technological and personal support" (p. 143).
Many researchers have described the challenges of reform in education systems (e.g., Tyack & Cuban, 1995). Some, like Sabelli and Dede (2001), blame researchers themselves:
Insights and innovations from research provide valid new strategies for improvement to consider, but seldom make a general and sustainable impact on the field. This is in part because researchers often focus on issues peripheral to the concerns of practitioners and policy makers, or study highly restricted situations from which results do not generalize to other contexts, or report their outcomes in a scholarly manner that does not stress the types of evidence and process details that educational implementers find persuasive, or provide insufficient opportunities for practitioners to develop a deep understanding of the conceptual basis of the innovation, of its goals, and how it looks in the classroom when adequately implemented. (online)
Burkhardt and Schoenfeld (2003) argue that lack of education research support and activity is due to a combination of factors that includes negligible industry support, lack of personnel for whom educational research is their vocational priority, no large-scale integrated efforts (such as the human genome project or large international physics projects) and a pervasive lack of trust in the efficacy of educational researchers amongst all stakeholders. The authors of the U.S. National Research Council report Scientific Research in Education (2002) argue that educational research has lost touch with the guiding principles of scientific inquiry that have been the basis for much more sustained progress in harder domains such as natural sciences and medical research. In sum, there is an absence of an effective educational research culture.
The solution to this problem proposed by some is to adapt a natural science research paradigm of blinded, control-group studies designed to determine once and for all what "works" in education (for example, see the What Works Clearinghouse at http://www.w-w-c.org/). Rather than rushing backwards to slavishly import these so-called "evidence-based" research models, as suggested by some leading education researchers (Slavin, 2002) and government funding bodies (U.S. Department of Education, 2002), we are encouraged to seek research methods that are attuned to the continually changing and complex context of real educational settings. We have been guided in this search by recent special issues of major educational research journals: The Journal of the Learning Sciences (13(1), 2004), Educational Researcher (32(1), 2003) and Educational Psychologist (39(4), 2004) have highlighted an emerging research methodology known as design-based research. The following section overviews this methodology and attempts to address a few of the contentious issues related to the definition and practice of design-based research.
Design-based research is a method developed for conducting educational research that focuses on the systematic and multifaceted development and evaluation of interventions in operating educational contexts. It shares the naturalistic research imperative to study interventions in the context of actual use, as opposed to research conducted in laboratory settings. Like action research, design-based research entails work on a practical problem identified by practicing professionals. Design-based research involves a partnership between educational practitioners and trained researchers. Thus, unlike action research, which often places the burden of research on busy practitioners, this partnership serves to share both the workload and the expertise of all participants. Design-based research is a relatively long-term strategy that typically iterates through multiple applications of an ever-changing intervention in multiple contexts. Thus, it is differentiated from more rigid forms of comparison studies in that the intervention (or independent variable) itself is changed and improved in subsequent use and as it is adapted to different contexts.
Bannan-Ritland (2003) describes four stages of design-based research and maps these to more traditional forms of education research and publication. The salient features of each of these four stages are discussed below.
The first stage is one of informed exploration, in which literature review, theoretical extrapolation, and expert and participant input are used to design the intervention. This first stage also focuses on the search for what Dewey referred to as an ideal. This ideal is not a complete formalism from which actions can be induced, but rather an incomplete motivator that generates working hypotheses to be tested in real contexts. Dewey expounds on the nature of these ideals as "working hypothesis of action" that serve a function "like the stars; we steer by them, not towards" (Dewey, 1894, in Boydston, 1971, 4:262). In other words, the ideal provides a vision and a guide, as well as a significant component of the measuring stick by which the ideal, as instantiated in actions within a real context, is measured. Dewey also notes the reciprocal influence of the ideal and the consequences of its functioning in real contexts. He wrote, "the leading idea must direct and clarify the work; the work must serve to criticize, to modify and to build up the theory" (Dewey, 1896, in Boydston, 1971, 5:288). The ideal of the informed exploration is used to energize the work and functions as a lever to secure funding to move to further stages in the design experiment. Diesing (1991) notes that this ideal is conditioned by the social status and class of the researchers themselves and the various political or value orientations to which they adhere. The ideal also has the potential to blind researchers to problems or contradictions inherent within it. Diesing (1991) writes that "the value component is useful as it energizes researchers to search for the empirical workings of the phenomenon, to clarify or revise parts and to fill in details. But when the interest in the good or the bad is so strong that it demands unquestioning loyalty and action then science has passed over into propaganda" (p. 178). Data collection in this first phase focuses on literature reviews, expert interviews, and assessment of interventions in comparable educational contexts.
In the second stage, enactment, the intervention is constructed, with attention paid to documenting design and production decisions. This is a high-visibility stage of the research. Often the most funds are expended during this stage, and attention is paid to the cost and length of time needed to design and build the intervention. The cost and complexity of the intervention range from large-scale construction processes involving teams of programmers and multimedia experts to pedagogical innovations focused on non-technological intervention. Often this production process, with its project management phases, is clearly specified and monitored. But even in less formal projects, in which an innovative process or a simple learning activity is developed, this second phase of structured construction is often the part of the research that is easiest to talk about and communicate. Data collection in this second phase seeks to document production decisions, processes, barriers and costs.
In the third phase, evaluation within the local context, a variety of qualitative and quantitative measures are used to assess the multiple impacts of the intervention in the original context for which it was designed. In this critical phase, evaluative instruments are created to describe, monitor and assess both the intended and the unintended consequences of the intervention. Usually a combination of qualitative and quantitative techniques is used to gather more objective data as well as more interpretative data related to the meaning of the intervention in the lives of the participants.
In the final phase, broader impact evaluation, the intervention is studied in multiple contexts and efforts are made to theorize its impact and improve the design across ever larger and more generalizable contexts (Collins, Joseph, & Bielaczyc, 2004). In this fourth phase, broader generalizations about the effect of the intervention are developed, along with knowledge about the ways in which the specific characteristics of each unique educational context affect its efficacy. As Dewey noted, the search in this final stage is not for overriding de-contextualized principles or grand theories that function with equal effect in all contexts. Rather, we acknowledge that design results reflect the conditions in which they operate. What we can expect from this fourth stage are the tools and conceptual models to understand and adjust the context and the intervention so as to enable effective learning to occur. Dewey again warns of these fourth-stage results: "while general ideals are of utmost value in the direction and enlargement of conduct, they are also dangerous: they tend to be set up as fixed things in themselves, apart from reference to any particular case" (Dewey, 1932, in Boydston, 1971, 7:232). Dewey realized that new meanings, values and attitudes become encultured in schools only when they have become embodied and are sustained within real-life contexts. This requirement disadvantages those types of research that unilaterally descend for testing in a classroom and then disappear with the researcher once the experiment has been concluded. We also see in these diverse applications of the intervention the marked effect of individuals, change agents, leadership and a host of other cultural influences on the innovation. Design-based research does not seek universal solutions but rather a deep understanding of innovations and the factors that effect improvement in local contexts.
Design-based methods leave room for, and encourage, multiple iterations through all four of the phases, resulting in the intervention's continuous evolution and development. Indeed, Bereiter (2002) defines design research as "any kind of research that produces findings that are fed back into further cycles of innovative design". Obviously the ideals that emerge across contexts in the fourth stage become guiding ideals for further study in a regenerated first stage. However, after each phase we are more knowledgeable than we were at the entry point to the previous stage. Thus, knowledge grows in a circular fashion as it iterates through the phases of the design-based research project. But design-based research, like all field-based naturalistic research, rarely follows pre-designed paths. Thus, one can expect iterations especially between the second and third stages, where instances of the intervention are developed and tested in a formative manner.
Some writers have defined design-based research by a set of qualifying adjectives: iterative, process focused, interventionist, collaborative, multileveled, utility oriented, theory driven and generative (Shavelson, Phillips, Towne, & Feuer, 2003). Bereiter (2002) adds that design research is driven by a vision of as yet unrealized possibilities and is therefore characterized by emergent goals and design flexibility. Design-based research has been criticized, even by its own proponents, as lacking clear definition, approaches, nomenclature, grammar and theoretical underpinnings (Kelly, 2004). This criticism is quite natural for a methodology that is relatively new and in a rapid state of development and evolution. However, design-based research is inherently pragmatic, and thus its practitioners seem, at present, little disposed towards extensive efforts in detailing its epistemological or ontological underpinnings.
Design-based research is not just a more complicated name for formative evaluation. Ann Brown (1992), the American researcher often referenced as the original developer of the term, viewed design-based research as iterative interventions within a real-world context. In conjunction with practitioners, design-based researchers refine interventions iteratively, improving them and testing them in different contexts, moving back and forth between applying and expanding theory, and striving to explicate theoretical insights that, though contextually grounded, can be extrapolated and modified for effective use in ever more diverse contexts (diSessa & Cobb, 2004). Design-based research methods also speak to antecedent activities of theory and literature adaptation and expert review. They go beyond formative review to conclude with theory generation and efforts to understand and support innovation and sustainability across multiple contexts.
It is perhaps no surprise that design-based methodology has arisen largely within American contexts. Design-based research takes its philosophical grounding from the pragmatic theories and practice of American philosophers, notably William James and John Dewey. Rather than searching for elusive grand designs or theories that transcend everyday contexts, design-based research, like pragmatic philosophy, focuses on practical solutions to everyday problems. For Dewey and James, the "problem of truth" is resolved instrumentally by searching for an idea's practical contribution to the solution of an important problem. In his "short catechism concerning truth," Dewey (1910) eloquently argues that:
the pragmatist claims his theory to be true in the pragmatic sense of truth: it works, it clears up difficulties, removes obscurities, puts individuals into more experimental, less dogmatic, and less arbitrarily skeptical relations to life; aligns philosophic with scientific method; does away with self-made problems of epistemology; clarifies and reorganizes logical theory, etc. He is quite content to have the truth of his theory consist in its working in these various ways, and to leave to the intellectualist the proud possession of a static, unanalyzable, unverifiable, unworking property. (Dewey, 1910, par 164)
Both Dewey and design-based studies demand that knowledge be useful in practice. Thus, the next section turns to an applied example. I describe the application of design-based research to an innovation developed to support independent students at Athabasca University, Canada's Open University. The careful reader will note that, chronologically, much of the work described in this example began before common (or at least our own) exposure to the terminology of design-based research. We apologize for this rather retroactive fitting of the methodology to a project that began without the benefits of this design. To wait for a new example constructed a priori from design-based tenets would preclude its publication at this time. Despite this shortcoming, I hope that the reader will still derive benefit from the retrospective methodological overlay. It is heartening and useful to realize that elements of design-based research have been a part of much of our continuing work in the field of educational technology. However, we now have a more relevant, coherent and practical lens and language with which to analyze, describe and critique our research practice.
A continuing and expensive problem in distance education is the provision of effective and cost-efficient student support services. These services include personal, academic, administrative and financial assistance. There is considerable evidence linking the effectiveness of these services with a host of critical success indicators for distance education systems, including retention, satisfaction and successful completion of programs (Kenworthy, 2003; Phillips & Hawkins, 2003; Tait & Mills, 2003).
The traditional model for providing student support in individualized forms of distance education is to assign each student to an academic tutor for each course within the program of studies. The tutor is responsible for providing academic tutoring, advising and assessment, and is also expected to allow for the development of some type of personal relationship with the individual student. In addition, the tutor is the first contact for resolving any other type of student concern, or for referring it for more specialized assistance.
This traditional distance education solution is problematic for a number of reasons. First, many students rarely or never initiate contact with their tutors, though the institution pays these staff to "be available" (Coldeway, 1991). Second, it is nearly impossible for tutors to be adequately knowledgeable not only about the content matter of the course but also about the myriad administrative and institutional issues that are often of great concern to students (Phillips & Hawkins, 2003). Despite tutor training efforts and the provision of early, and often inadequate, information systems, the distributed and often part-time nature of the tutors' work confounds attempts to provide effective and ongoing in-service training; thus, tutors are often not adequately prepared to deal with many student concerns. Recent research on systems designed to build accessible knowledge management systems that support both students and tutors may offer long-term solutions for future interventions (Johnson & Barrett, 2003; Kenworthy, 2003). Finally, it is challenging to improve, or even to measure, the quality of tutor interactions and support, since the interactions are usually private and removed from institutional monitoring. These challenges are not unlike many of those inherent in providing support for retail customers, and thus both distance educators and many types of consumer sales and support organizations are innovating in ways designed to improve service while containing or reducing costs. The most common solution evolved to date is the telephone call centre.
Woudstra and his colleagues define a call centre as an innovation that combines machines and humans to provide, when needed, advice, information and related knowledge services to distributed clients (Woudstra, Huber, & Michalczuk, 2004).
To set the context for this study, I briefly overview the undergraduate Commerce and Administration program at Athabasca University. This program is the largest undergraduate program at the University, enrolling over 11,000 students annually. Unlike many distance education programs globally, the long distances from which our students enrol and their low concentration in any one location preclude face-to-face interaction. In addition, these are continuous-intake programs in which a student can enrol at any time and progress at their own pace (up to a two-year upper limit) through the course. Traditionally, at Athabasca University the first line of student support has been telephone or, more recently, email interaction with a tutor. These tutors are 'on call' for two hours a week for telephone interaction and are obliged to respond to emails within 48 hours. The intervention, introduced in 1994, was to create a call centre modeled on those that have become the mainstream means of customer support in the customer service industry. Three call centres now operate at Athabasca (a general information centre, a computer help desk and a tutorial service within the School of Commerce and Administration). The operational details of each centre are similar. Instead of having one day a week in which they can talk with a tutor, students can now call or email 60 hours a week and talk, not to a specific tutor, but to an undergraduate business advisor. This advisor likely does not know the student personally but, unlike the tutors, the advisors do know Athabasca University: its courses, curriculum, administrative requirements and the answers to the questions that students enrolled in business courses have been asking over the past 10 years.
In this section, I illustrate how the development of this intervention can be mapped to the four phases of a design-based research project. In the informed exploration stage, call centre theory and practice were reviewed and studied in their mostly consumer support applications. Interviews were undertaken with some of the key actors in existing call centres, and other sorts of qualitative data were collected. As the call centre iterates through various additional applications and is exported to other contexts, we have returned again to the exploration phase to learn of ideological concerns, morale problems, potential deskilling effects and other often unanticipated outcomes of other call centre applications (see, for example, Bain, Watson, Mulvey, Taylor, & Gall, 2002; Dilevko, 2001; Taylor, Hyman, Mulvey, & Bain, 2002). We are also cognizant of how student interaction with call centre personnel and technology is a form of substitution for interactions previously supported by other actors. These types of substitutions, especially those that are scalable, are of special interest to open universities like our own. I have argued previously (Anderson, 2003a, 2003b) that distance education delivery models that effectively support the substitution of student-content and student-student interaction for expensive student-teacher interaction are a necessary development, especially in a lifelong learning world that extends learning "beyond the course." This information, as well as our own growing understanding acquired in practice, is used to further enrich the literature for the benefit of others (see Adria & Chowdhury, 2002; Adria & Woudstra, 2001; Alpern & Carr, 2000; Woudstra, Huber & Michalczuk, 2004). The ideal that motivates the development of this design is to improve the cost-effectiveness of higher education. Although some see any effort to increase cost-effectiveness as an attack on public, and especially liberal, education, we see it as a critical component of our value as an open university. The rising costs of higher education within developed countries and the critical lack of opportunity to participate in any form of higher education in most developing countries convince us of the need to be constantly searching for ways to reduce the cost of formal education. This search may result in less funding being spent on teachers, tutors, course designers or other crucial components of our development and delivery system; yet to restrict our innovations to those that serve the interests of these staff, as opposed to learner interests, would be a dereliction of our duty.
In the second stage, enactment, Lotus Notes applications were constructed to provide the information database needed to answer typical questions and the tracking system used to monitor the types of questions asked. Unfortunately, we did not gather as much data on the production as we could have, and thus there is little data detailing costs, timelines and design specifications. Such data loss inevitably results in less capacity for the innovation to be replicated elsewhere. However, we have also found that commercial products built on the Lotus platform provide higher quality and more cost-effective tools than those we constructed for our initial development. Thus, we have iterated through and updated the call centre software over a number of installations, a strategy that is consistent with design-based innovation adoption and development. Finally, pilots were conducted and results were monitored for both pilot-study and regular students.
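Although the original Lotus Notes design was not fully documented, a minimal sketch can convey the kind of structures such a system involves: a record for each student contact and a small, searchable knowledge base of previously answered questions. The names and fields below are hypothetical illustrations, not the actual Athabasca University implementation.

```python
# Hypothetical sketch of a call-centre contact record and FAQ lookup.
# Names and fields are illustrative only.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class StudentContact:
    student_id: str
    channel: str               # "phone" or "email"
    topic: str                 # e.g., "exam request"
    question: str
    received: datetime
    resolved: Optional[datetime] = None
    escalated_to_expert: bool = False


class KnowledgeBase:
    """A minimal searchable store of previously answered questions."""

    def __init__(self):
        self._entries = []  # list of (question, answer) tuples

    def add(self, question, answer):
        self._entries.append((question, answer))

    def search(self, query):
        # Naive keyword match; a production system would use full-text search.
        terms = query.lower().split()
        return [(q, a) for q, a in self._entries
                if any(t in q.lower() for t in terms)]


# Example: an advisor logs a contact and checks the knowledge base.
kb = KnowledgeBase()
kb.add("How do I request an exam date?", "Submit the online exam request form.")
contact = StudentContact("s-001", "email", "exam request",
                         "How do I book my final exam?", datetime.now())
print(kb.search("exam"))
```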
The third stage, evaluation within the local context, consisted of both quantitative and qualitative data collection. First and most importantly, all communications (telephone as well as email) are tracked in a database. This database can be searched and interrogated by faculty and administrators online using a web browser, thereby allowing faculty to monitor, on a continuing or as-needed basis, the types of questions, queries and concerns of their current students. This data can also be monitored over successive years, thus gathering longitudinal data that can further inform our practice. This source of data is in marked contrast to the "black hole" into which student interactions with tutors fell in the past. Before the call centre, it proved very challenging to gather comprehensive data on the frequency, content, context and resolution of student concerns.
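For illustration only, the following sketch shows the kind of longitudinal query such a database makes possible, tallying question topics by year; the records and field names are invented and do not reflect the actual Athabasca University database schema.

```python
# Illustrative sketch of longitudinal monitoring of tracked student queries.
# The records and field names are invented for this example.
from collections import Counter, defaultdict

tracked_queries = [
    {"year": 2002, "topic": "exam booking"},
    {"year": 2002, "topic": "assignment extension"},
    {"year": 2003, "topic": "exam booking"},
    {"year": 2003, "topic": "exam booking"},
    {"year": 2003, "topic": "course materials"},
]

def topics_by_year(records):
    """Tally the topics raised each year so trends can be compared over time."""
    yearly = defaultdict(Counter)
    for record in records:
        yearly[record["year"]][record["topic"]] += 1
    return yearly

for year, counts in sorted(topics_by_year(tracked_queries).items()):
    print(year, counts.most_common(3))
```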
In the local evaluation stage of the design research, annual evaluations of student satisfaction with learner support services are analyzed to differentiate between students with traditional tutors and those assigned to call centres. These studies reveal no significant difference in overall satisfaction (Athabasca University, 2003). In addition, this innovation saves the School of Business over $100,000 a year (Woudstra, Huber & Michalczuk, 2004). However, it should be noted that some students (and tutors) miss the familiarity of having a single tutor assigned to a restricted number of students.
We have also learned that the call centre is a disruptive technology. Some of the tutors and their union have expressed concerns that their jobs have been reduced in scope and in resulting compensation. Currently, the call centres handle 80% of student concerns, and only 20% of questions are passed on to academic experts for reply. These and other questions are of course monitored, the time to resolution is tabulated, and the answers are then made available to both tutors and call centre advisors in a FAQ (frequently asked questions) file. Further design research work is needed to monitor the effect of this innovation on long-term completion rates, academic outcomes and the working conditions of academic, tutorial and advisory staff. The data collection incorporated into the system also encourages practitioners to monitor their own work. It provides a ready tool for the development of action research and more theoretical research questions and projects, since staff are able to easily monitor the effect on student queries of any further innovation in any part of the educational system that directly affects students.
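As a hypothetical illustration of this kind of self-monitoring, the sketch below computes the share of contacts resolved directly by advisors versus escalated to academic experts, and the average time to resolution; the sample data are invented.

```python
# Hypothetical illustration of monitoring advisor resolution rates and
# time to resolution from tracked contacts; the sample data are invented.
from datetime import datetime, timedelta

contacts = [
    {"escalated": False,
     "received": datetime(2004, 3, 1, 9, 0), "resolved": datetime(2004, 3, 1, 9, 12)},
    {"escalated": False,
     "received": datetime(2004, 3, 1, 10, 0), "resolved": datetime(2004, 3, 1, 10, 5)},
    {"escalated": True,
     "received": datetime(2004, 3, 2, 9, 0), "resolved": datetime(2004, 3, 3, 9, 0)},
]

# Share of contacts handled without escalation to an academic expert.
advisor_share = sum(not c["escalated"] for c in contacts) / len(contacts)
print(f"Resolved directly by advisors: {advisor_share:.0%}")

# Average elapsed time between receipt and resolution.
mean_resolution = sum((c["resolved"] - c["received"] for c in contacts),
                      timedelta()) / len(contacts)
print(f"Mean time to resolution: {mean_resolution}")
```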
The final, fourth "broader impact" phase of this design-based study is ongoing. The results of the initial project, as well as the effects of various iterations, continue to be gathered and published (Adria & Woudstra, 2001; Woudstra & Adria, 2003; Woudstra, Huber & Michalczuk, 2004). As mentioned earlier, we now have similar call centres operating in our computer help desk and our general university advisory service. We have also had visitors from other institutions who subsequently based their own distance education call centres on the Athabasca University model. It is useful to compare the type of call centre approach that we have developed with reports published from other educational applications (e.g., Hitch & MacBrayne, 2003; MacBrayne, 2003). We see the frontier of innovation as the degree to which the call centre can provide services beyond those associated with administration, moving toward effective academic and personal support. Finally, it is hoped that publication of the research results will lead to replication in different contexts, allowing further study and innovation development as the results of these innovations are shared throughout the distance education community.
To conclude, it is useful to detail the advantages and disadvantages of a design-based research approach to e-learning concerns. The most obvious advantage is the reintegration it brings between practice and theory and between researchers and practitioners. Design-based studies take place in situ and rely on the active input and participation of practitioners at all four stages. Secondly, design-based studies involve active intervention: we seek not only to understand, document and interpret, but also to change and improve educational practice and opportunity. Thirdly, design-based studies are not dependent upon a specific methodology or ideology. They are pragmatic adaptations to real problems that utilize whatever tools and techniques are available to enhance understanding and practice. Further, design-based studies could be described as methodologically neutral, though they are not methodologically neutered; they can use very rigorous qualitative and quantitative methods, as appropriate, within any of the four phases.
Design-based studies also present challenges. They are not short term, and they are often not as clear-cut and predefined as funding agencies or thesis supervision committees require. Secondly, design-based studies today lack a clear and coherent vocabulary that helps identify and justify their activities and results (Kelly, 2004). No single worldview or epistemological understanding of the way in which learning occurs guides either the design of the intervention or its assessment. Thus, new design-based researchers may feel awash in a sea of possibilities with no accurate charts to guide them. Hopefully these problems will be addressed as we further develop our capacity to innovate and improve education through well-funded and well-managed design-based research, empowering education research to be "research that makes a difference."
Note: Parts of this paper were contained in a keynote address to the 3rd Research Workshop of the European Association for Distance Education, Oldenburg, Germany, March 3-5, 2004.
Adria, M., & Chowdhury, S. (2002). Making room for the call center. Information Systems Management, 19(1), 71-80. Retrieved July 16, 2004 from http://www.auerbach-publications.com/eJournals/articles/article_synopsis.asp?id=31479
Adria, M., & Woudstra, A. (2001). Who's on the line? Managing student communications in distance learning using a one-window approach. Open Learning, 16(3), 249-261.
Alpern, M., & Carr, P. D. (2000). The service dilemma: A case for web-based customer service. 13th Biennial Conference of the International Telecommunications Society. International Telecommunications Society.
Anderson, T. (2003a). Getting the mix right again: An updated and theoretical rationale for interaction. International Review of Research in Open and Distance Learning, 4(2). Retrieved October 30, 2003 from http://www.irrodl.org/content/v4.2/anderson.html
Anderson, T. (2003b). Modes of interaction in distance education: Recent developments and research questions. In M. Moore & W. Anderson (Eds.), Handbook of Distance Education (pp. 129-144). Mahwah, NJ: Erlbaum.
Athabasca University. (2003). A comparative assessment of undergraduate tutorial delivery report. Athabasca: Athabasca University.
Bain, P., Watson, A., Mulvey, G., Taylor, P., & Gall, G. (2002). Taylorism, targets and the pursuit of quantity and quality by call centre management. New Technology, Work and Employment, 17, 170-185. Retrieved July 13, 2004 from http://www.hrm.strath.ac.uk/fow/publications%20pdf%20files/NTWE%202002%20Targets%20paper.pdf
Bannan-Ritland, B. (2003). The role of design in research: The integrative learning design framework. Educational Researcher, 32(1), 21-24. Retrieved January 23, 2004 from http://www.aera.net/pubs/er/toc/er3201.htm
Bereiter, C. (2002). Design research for sustained innovation. Cognitive Studies, Bulletin of the Japanese Cognitive Science Society, 9(3), 321-327. Retrieved September 1, 2004 from http://ikit.org/fulltext/2002Design_Research.pdf
Boydston, J. (1971). John Dewey: The Early Works. Carbondale: Southern Illinois University Press.
Branson, R. K. (1998). Teaching-centered schooling has reached its upper limit: It doesn't get any better than this. Current Directions in Psychological Science, 7(4), 126-135. Retrieved July 26, 2004 from http://www.cpt.fsu.edu/pdf/teaching.pdf
Brown, A. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141-178.
Burkhardt, H., & Schoenfeld, A. (2003). Improving education research: Towards a more useful, more influential and better-funded enterprise. Educational Researcher, 32(9), 3-14. Retrieved February 23, 2004 from http://www.aera.net/pubs/er/pdf/vol32_09/ERv32n9_pp03-14.pdf
Coldeway, D. (1991). Patterns of behaviour in individualized distance education courses. Research in Distance Education, 3(4), 6-10.
Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. The Journal of the Learning Sciences, 13(1), 15-42.
Dewey, J. (1910). A short catechism concerning truth. In J. Dewey (Ed.), The Influence of Darwin on Philosophy and Other Essays (pp. 154-168). New York: Henry Holt and Company. Retrieved August 13, 2004 from http://spartan.ac.brocku.ca/~lward/dewey/Dewey_1910b/Dewey_1910_06.html
Dewey, J. (1971a). Pedagogy as a university discipline. In J. Boydston (Ed.), The Early Works of John Dewey (pp. 281-290). Carbondale: Southern Illinois University Press.
Dewey, J. (1971b). The study of ethics: A syllabus. In J. Boydston (Ed.), The Early Works of John Dewey (p. 262). Carbondale: Southern Illinois University Press.
Dewey, J. (1998). Education as engineering. In L. Hickman & T. Alexander (Eds.), The Essential Dewey (pp. 270-273). Bloomington, IN: Indiana University Press.
Diesing, P. (1991). How Does Social Science Work? Reflections on Practice. Pittsburgh: University of Pittsburgh Press.
Dilevko, J. (2001). An ideological analysis of digital reference service models. Library Trends, 50(2), 218-244.
diSessa, A., & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. Journal of the Learning Sciences, 13(1), 77-103.
Friesen, N., & Anderson, T. (2004). Interaction for lifelong learning. British Journal of Educational Technology, 35(6), 679-688.
Hitch, L., & MacBrayne, R. (2003). A model for effectively supporting e-learning. The Technology Source, March. Retrieved July 16, 2004 from http://ts.mivu.org/default.asp?show=article&id=1016
Johnson, M., & Barrett, C. (2003). Addressing the learning skills needs of students at a distance: A dual mode approach. In A. Tait & R. Mills (Eds.), Rethinking learner support in distance education: Change and continuity in an international context (pp. 41-55). London: Routledge Falmer.
Kelly, A. (2004). Design research in education: Yes, but is it methodological? The Journal of the Learning Sciences, 13(1), 115-128.
Kenworthy, B. (2003). Supporting the student in new teaching and learning environments. In A. Tait & R. Mills (Eds.), Rethinking learner support in distance education: Change and continuity in an international context (pp. 55-63). London: Routledge Falmer.
MacBrayne, P. (2003). Role of call centers in student support. Western Cooperative for Education Technology. Retrieved September 16, 2004 from http://www.wcet.info/projects/laap/resources/call_centers.asp
National Research Council. (2002). Scientific Research in Education. Washington DC: National Academy Press. Retrieved September 1, 2004 from http://books.nap.edu/books/0309082919/html/
Phillips, A., & Hawkins, R. (2003). Blending the mix: The provision and integration of student support services in the networked age. Open Praxis, 1, 7-13.
Sabelli, N., & Dede, C. (2001). Integrating educational research and practice: Reconceptualizing goals and policies. Project ScienceSpace. Retrieved July 26, 2004 from http://www.virtual.gmu.edu/ss_research/cdpapers/policy.pdf
Shavelson, R., Phillips, D., Towne, L., & Feuer, M. (2003). On the science of educational design studies. Educational Researcher, 32(1). Retrieved January 23, 2004 from http://www.aera.net/pubs/er/toc/er3201.htm
Slavin, R. (2002). Evidence-based education policies: Transforming educational practice and research. Educational Researcher, 31(7), 15-21. Retrieved February 2002 from http://www.aera.net/pubs/er/pdf/vol31_07/AERA310703.pdf
Tait, A., & Mills, R. (2003). Rethinking learner support in distance education: Change and continuity in an international context. London: Routledge.
Taylor, P., Hyman, J., Mulvey, G., & Bain, P. (2002). Work organization, control and the experience of work in call centres. Work, Employment & Society, 16(1), 133-150. Retrieved July 13, 2004 from http://www.hrm.strath.ac.uk/fow/publications%20pdf%20files/WES%20journal%202002%20Work%20organisation%20and%20control.PDF
Tyack, D., & Cuban, L. (1995). Tinkering toward utopia. Cambridge, MA: Harvard University Press.
U.S. Department of Education. (2002, March 7). Strategic Plan 2002–2007. Washington, DC. Retrieved September 1, 2004 from http://www.ed.gov/pubs/stratplan2002-07/index.html
Woudstra, A., & Adria, M. (2003). Organizing for the new network and virtual forms of distance education. In M. Moore (Ed.), Handbook of distance education (pp. 531-47). Mahwah, NJ: Lawrence Erlbaum Associates.
Woudstra, A., Huber, C., & Michalczuk, K. (2004). Call centers in distance education. In T. Anderson & F. Elloumi (Eds.), Theory and practice of online learning (pp. 295-318). Athabasca: Athabasca University Press. Retrieved July 2004 from http://cde.athabascau.ca/online_book/ch12.html
© Canadian Journal of Learning and Technology