How Research Moves into Practice: A Preliminary Study of What Training Professionals Read, Hear, and Perceive

Saul Carliner, Regan Legassie, Shaun Belding, Hugh MacDonald, Ofelia Ribeiro, Lynn Johnston, Jane MacDonald, and Heidi Hehn

Authors

Saul Carliner is an Associate Professor of Educational Technology at Concordia University and a member of the board of the Canadian Society for Training and Development. Correspondence regarding this article can be addressed to: saulcarliner@gmail.com

Regan Legassie is Commandant of Training for the Canadian Forces Joint Support Group and a member of the board of the Canadian Society for Training and Development.

Shaun Belding is CEO of the Belding Group of Companies and a member of the board of the Canadian Society for Training and Development.

Hugh MacDonald is Principal in HR MacDonald: Training and Development, Inc. and a Chair of the board of the Canadian Society for Training and Development.

Ofelia Ribeiro is a PhD student at Concordia University.

Lynn Johnston is President of the Canadian Society for Training and Development.

Jane MacDonald is the Manager of Marketing and Events of the Canadian Society for Training and Development.

Heidi Hehn was Membership Coordinator of the Canadian Society for Training and Development at the time this article was written.

Abstract

In the growing body of research on the practice of training and development, several studies suggest that use of research-based findings in practice is low. The present study was designed to better understand the research-practice gap by exploring these questions: (1) Which published sources in the field are practicing professionals reading? How frequently do they read these materials? (2) Which conferences and meetings do practicing professionals attend? How frequently do they attend these events? (3) In what formats are research content most usable to practicing professionals? (4) What are practicing professionals’ general perceptions of research publications and presentations? Key findings point to publications having a wider reach among practicing professionals than conferences and, of those publications, professional magazines have a wider reach than peer-reviewed journals. In terms of the manner in which the content is presented, practicing professionals prefer case studies from the workplace over other types of content.

Résumé

Dans le corpus croissant de recherches portant sur la pratique de la formation et du perfectionnement, plusieurs études suggèrent une faible utilisation des résultats de recherche dans la pratique. La présente étude a été conçue afin de mieux comprendre l’écart entre la recherche et la pratique par l’examen des questions suivantes : (1) Quelles sources de publications du domaine les professionnels pratiquants lisent-ils? À quelle fréquence lisent-ils ces publications? (2) À quelles conférences et réunions les professionnels pratiquants assistent-ils? À quelle fréquence assistent-ils à ces événements? (3) Dans quels formats les contenus de recherche sont-ils le plus facilement utilisables par les professionnels pratiquants? (4) Quelles sont les perceptions générales des professionnels pratiquants envers les publications et présentations de recherche? Les résultats principaux indiquent que les publications rejoignent davantage de professionnels pratiquants que les conférences et que, parmi ces publications, les magazines spécialisés ont une portée plus vaste que les publications évaluées par les pairs. En ce qui concerne la manière dont le contenu est présenté, les professionnels pratiquants préfèrent les études de cas en milieu de travail aux autres types de contenu.

Background

In 2006, the Canadian Society for Training and Development (CSTD) published Review of the State of the Field of Workplace Learning: What We Know and What We Need to Know about Competencies, Diversity, E-Learning, and Human Performance Improvement, a review of the research in each of these areas, especially as they related to workplace learning and performance in Canada. Although the report was written in simple language (for the most part, at a Grade 12 reading level) and was available on the websites of both CSTD and the Canadian Council on Learning (CCL), the sponsor of the report, usage of the report was believed to be low even though the association delivered a series of webcasts to promote the findings. This belief was based on low enrolment in the webcasts (less than 1% of the membership in some, 2% in the largest) and the lack of feedback on the report.

Although disappointing, the low usage of the report was not surprising. Despite a growing body of research evidence underlying the practice of training and development, studies suggest that the use of research-based findings in practice is low. Indeed, the evidence-based practice movement in public education is a response to the low usage of research in these settings (Slavin, 2004).

More immediately, concern exists in the academic communities that conduct research on training and development (also called workplace learning and performance)—the disciplines of human resource development, adult education, and educational technology—about the extent to which their research findings are applied. In the field of educational technology, Clark and Estes (2002) take a prescriptive approach. Their book, Turning Research into Results, explains to practitioners how to review and apply research findings to get “results.” Others take a more descriptive approach. Concerned about the extent to which research-based guidelines in the field of Human Resource Management (which includes Human Resource Development) find their way into practice, Rynes, Colbert, and Brown (2002) examined HR managers’ and executives’ beliefs about 35 HR practices and compared those beliefs with the research evidence. They found that, on several of these practices, the beliefs of more than 50% of the participants were inconsistent with the findings of research. Exploring this issue further, Deadrick and Gibson (2007) found that part of the problem might result from a gap between the groups’ presumed primary choices of content: peer-reviewed journals or professional magazines.

This research-practice gap is of special concern in workplace learning because the research is conducted across so many disciplines, including adult education, educational technology, industrial psychology, human performance technology, and technical communication, yet none of these disciplines takes the lead in assessing the extent to which research findings on training make their way into practice. Given the experience with the Field Review, concern existed that research might not be reaching its intended audience through the traditional means of publications and conference presentations, or through the more recent addition of the Web, despite the emphasis placed on research dissemination in the Canadian research funding system. Indeed, most granting agencies, such as the Social Sciences and Humanities Research Council (SSHRC) and the Fonds québécois de la recherche sur la société et la culture (FQRSC), ask researchers to explain how they will disseminate their findings, and groups like SSHRC and the CCL see themselves as knowledge management agencies.

Specifically, questions had arisen about: (1) which published sources in the field practicing professionals read, and how frequently they read these materials; (2) which conferences and meetings practicing professionals attend, and how frequently they attend these events; (3) the formats in which research content is most usable to practicing professionals; and (4) practicing professionals’ general perceptions of research publications and presentations.

The study described in the rest of this article is intended to answer these questions. The next section presents the methodology, followed by the results. The closing section presents conclusions, describes the limitations of the study, and suggests future research.

Methodology

This was intended as an exploratory study from which future research might proceed. Its primary purpose was to gather descriptive information about the ways that training practitioners learn about research.

The research was conducted by a team from the Canadian Society for Training and Development (CSTD), the largest professional association serving trainers in Canada. CSTD is a global partner of the American Society for Training and Development, the largest association of training professionals in the world. CSTD already had considerable institutional knowledge about the journals, magazines, and conferences serving the training community in which basic and applied research on training is reported. As a result, we did not need to conduct interview-based or focus group research to discover these sources. Our focus was the extent to which these journals, magazines, and conferences are used by this population, so a survey that would provide such descriptive statistics seemed most appropriate.

A preliminary review located literature on the subject, but the body of literature is small, and most of the pertinent studies were published in journals on human resource management rather than in the more specific disciplines of human resource development or training. The literature review included a keyword search as well as manual reviews of each issue of Educational Technology Research and Development, Human Resource Development Quarterly, Human Resource Development International, and Performance Improvement Quarterly published between the first quarter of 2003 and the first quarter of 2008.

The intended participants of this survey were professional trainers working in the field, which includes instructors, instructional designers, planners for training departments, and managers of these professionals. Trainers work in government agencies, non-profit organizations and private corporations. Because of the limited resources for conducting the study, as well as the easy availability of the CSTD membership list, a convenience sample of CSTD members was used.

This use of a convenience sample raises two concerns about the extent to which the participants actually represent the target population. The first is that trainers who join a professional organization might feel a stronger affinity for the profession and, as a result, might engage in more professional development activity than trainers who do not become members. The second is that some members of CSTD identify themselves as aspiring or former trainers—that is, people who are not currently working in the field. The methodology did not provide a means of excluding their responses from the results.

Although a stand-alone survey was initially considered, the questions were incorporated into a membership survey that CSTD was already planning, to avoid over-surveying the membership. The survey was conducted electronically using Survey Monkey. In March 2008, 2,334 invitations were sent to all current members of CSTD, and a reminder was sent three weeks later. Four hundred thirty-two (432) people responded to the overall survey, an 18.5% response rate. A summary of responses was prepared from the online data.

Survey questions were validated through a review by members of the research team in advance of conducting the survey, and comments were incorporated into a revised draft of the survey. Members of this team have formal backgrounds in research, usability, and editing, as well as training and development. Simple descriptive statistics were reported, including counts, percentages and, when appropriate, means. A copy of the survey is provided in Appendix A.

Results

The next several sections present the results. After a brief note about the responses, data on the demographics of the participants are presented, followed by data about which print and web-based materials participants read and which meetings and conferences participants attend. Last, data about participants’ general motivation to track developments in research are presented.

About the Responses

For most of the questions that focused on professional development habits, approximately 75% of those participating in the larger CSTD membership survey responded. Response rates varied among questions, however; for most questions, between 313 and 337 individuals responded. Questions receiving a lower response will be identified. Although the section of the CSTD membership survey that considers professional development habits was tested internally (as described earlier), it was not field tested and, as a result, some questions arose about the responses. These, too, will be indicated.

Demographics of the Participants

The participants were predominantly female (68.5%), and the majority had completed a bachelor’s degree or additional education (1.5% of participants had completed a PhD, 32% had completed a master’s degree, 14.8% had completed some graduate work, and 35.6% had completed a bachelor’s degree).

Most participants were experienced in training and development; only 22.3% had worked in the profession for five or fewer years. Of the rest, 32.6% had worked in the profession for six to 10 years, 25.5% for 11 to 20 years, and 19.8% for 21 or more years.

Most participants were “captive employees”; only 17% were independent consultants. Of the rest, 30.9% worked in training departments of one to five people, 11.6% worked in training departments of six to 10 people, 16% worked in training departments of 11 or more people, and 13.1% worked within an organization but outside of its training department. Several of these “captive employees,” however, might work for training vendors, because the survey did not ask about the nature of the organization’s business.

In terms of certification, 7.8% had earned CSTD’s voluntary Certified Training and Development Professional (CTDP) designation and another 18.5% were in the process of earning it.

Which Printed and Web-Based Materials Participants Read

The first several questions explored the published sources that survey participants read, the frequency with which they read them, and some preferences regarding content from published sources. The survey separated these sources by type: professional magazines (both print and online), peer-reviewed journals, and websites and blogs (those that do not follow a more traditional magazine publication schedule). It closed with questions about preferences and beliefs regarding published content.

Professional Magazines Read by Participants

In terms of professional magazines, some of which are published as frequently as 10 times per year, we listed magazines specifically targeted to training professionals, including those published by professional associations like CSTD, and those published by for-profit organizations, like the e-Learning Guild and Nielsen Business Media. Table 1 shows the responses to the question.

Table 1. Readership of Professional Magazines*


* The survey tool automatically rounds to 1 decimal place, so percentages might exceed or miss 100% by 0.1%.
** Less than once a year means that the individual might have read the publication once or twice over the course of their career, but extremely infrequently.

The most widely read professional magazines were those published by CSTD: its bi-weekly e-newsletter (78.5% of participants read most or all issues) and the Canadian Learning Journal, its twice-a-year print magazine (67.1% of participants read most or all issues). This result was not a surprise, as the population surveyed was the CSTD membership.

The next most commonly read magazines were those with the longest publishing histories in the field: TRAINING, which has been published for over three decades, and T+D, the magazine of the American Society for Training and Development (ASTD), which has been published for over six decades. Following those were the magazines most likely to appeal to more experienced members of the field: the Harvard Business Review (not a training magazine per se, but included to get a sense of the readership of broader business publications; during the period of the survey, it published several articles on training), HR Professional (the trade newspaper of the Human Resources Professionals Association of Ontario), and e-Learning Solutions magazine, an online magazine for experienced e-learning developers published by the e-Learning Guild.

Overall, most or all issues of the magazines listed were read by 19.82% of respondents (an average of the combined responses in the two right-hand columns of Table 1). But that number is skewed by the extremely high readership of the two CSTD publications; if those are removed from consideration, respondents read most or all of the issues of the professional magazines 10.1% of the time. As noted earlier, readership of the two CSTD publications was higher than that of all the other publications. In contrast, 40% of respondents indicated that they had never read two of the other magazines, and 50% or more indicated that they had never read nine of the magazines. In fact, more than 70% of respondents indicated that they had never read four of those nine magazines.

Peer-Reviewed Publications Read by Participants

Readership of peer-reviewed publications was, on the whole, lower than that of professional magazines. The journals listed were published by the professional associations most likely to attract training practitioners and scholars, such as the Academy of Human Resource Development, the Academy of Management, and the Canadian Network for Innovation in Education. We omitted the Canadian Journal of the Study of Adult Education from this list because it was on a publication hiatus at the time of the survey and we were unsure of its future. Table 2 shows the responses to the question.

Table 2. Readership of Peer-Reviewed Journals*


* Note that the survey tool automatically rounds to 1 decimal place, so percentages might exceed or miss 100% by 0.1%.
** Less than once a year means that the individual might have read the publication once or twice over the course of their career, but extremely infrequently.

The most widely read journal was the Canadian Journal of Learning and Technology, of which 10.2% of participants read most or all issues, showing a continued preference for Canadian publications among the respondents.

The next most widely read publication was Performance Improvement Quarterly, of which 6.7% of participants read most or all issues, a higher readership than Performance Improvement, the magazine published by the same organization. Next on the list were HR Quarterly (the journal of the Academy of Management, of which 6.4% of participants read most or all issues) and Human Resource Development Quarterly (of which 5.8% of participants read most or all issues), the oldest of four journals published by the Academy of Human Resource Development. The least frequently read publication among those listed was the Sloan Business Review, which 3.2% of participants read two or three times per year. Like the Harvard Business Review in the list of magazines, this publication was included to get a small sense of the readership of broader business publications.

Overall, only 5.87% of respondents read most or all of the issues of these journals (an average of the combined responses in the two right-hand columns of Table 2, 2-3 Times a Year and All 4 Issues). This is a little more than half of the comparable readership of professional magazines when the two CSTD publications are excluded.

One concern about the responses to this question is whether readers confused some of the publication titles. Although HR Quarterly and HRD Quarterly were identified in the survey as publications of particular associations, questions about participation in other activities sponsored by those organizations ranked lower, raising the possibility that respondents confused the names.

Also of note was the number of people who indicated that they never read these publications. Of the six publications listed, 80% or more of the participants indicated “never” for five of them, and 72.6% said that they never read the Canadian Journal of Learning and Technology.

Blogs and Websites Read by Participants

The survey asked separately about checking blogs and websites. Only 7.9% said that they regularly check blogs. (Note, however, that the term “regularly” was not defined for respondents, so they supplied their own meaning when answering the question.) The blog most frequently mentioned was Stephen Downes’ blog (known as Stephen’s Web). Blogs by Jay Cross and Elliott Masie also received multiple mentions.

A majority of respondents (54.3%) said that they do not regularly check websites on training. Of the 45.7% who do, the most frequently named websites were those of ASTD and CSTD, followed by Google and Google Scholar; the sites of various provincial human resources groups, such as the Human Resources Professionals Association of Ontario (HRPAO, as it was then called; now the Human Resources Professionals Association, HRPA) and the Ordre des conseillers en ressources humaines et relations industrielles du Québec; and a few private providers, like Langevin (which offers an introductory course for new training professionals).

Preferences and Beliefs about Published Content

Although the majority of respondents check neither blogs nor websites regularly, the most preferred source of content was websites, selected by 47.7%. The next most preferred sources of content were magazines (37.9%) and journals (8%). The least preferred source of content was blogs (1.8%).

The most credible type of published content to respondents was case studies of personal experience (preferred by 53.2% of participants), followed by reports of formal research studies (28.1%) and news articles by staff of the publication or website (11.6%). Other types of credible content that participants named in their written responses included e-newsletters and e-magazines.

In written responses, participants provided reasons for these preferences. These included the lack of cost of online resources, information that addresses the needs of a unique situation, the presumed timeliness of online content and, the most commonly cited reason, easy access to information.

The choice between print and online content sparked passionate responses. Some found print easiest to access because they read professional materials while commuting; others found online access most convenient. Writing in favour of print, one respondent commented, “I like the visual aspect of holding a magazine, plus I can take it to lunch with me.” In contrast, another expressed a preference for online content, writing: “I find websites more current and can lead me to topics of interest at that moment more than the journal contents.” Still other respondents saw the benefits of both media. One commented, “I expect different information in each of the media listed - I would not go to a blog for peer-reviewed research, for example. I would not go to a journal for first-hand practical case studies and personal experiences.”

Which Meetings and Conferences Participants Attend

The next several questions explored the extent to which survey participants attend meetings and conferences—the other traditional way to transfer research findings—and some preferences regarding content from them. This section of the survey separated these events by type (professional meetings and conferences) and closed with questions regarding preferences and beliefs about meetings and events.

Attendance at Professional Meetings

In a separate question about attendance at CSTD chapter meetings, a majority of participants indicated that they attended at least one meeting per year: 12.7% attended four or more meetings per year and 38.9% attended one to three meetings per year. Still, a large percentage, 48.5%, never attended meetings.

42.6% of participants attended chapter meetings of other professional associations. Most frequently, these other associations focused on training and human resources, such as the British Columbia Human Resources Management Association (BC HRMA) and the Human Resources Professionals Association of Ontario (HRPAO) (the most frequently cited group). Some participants also attended meetings of professional associations in other fields, such as the Canadian Medical Association, International Association of Business Communicators, and the Project Management Institute.

Attendance at Conferences

In terms of conferences attended, the survey listed those specifically targeted to training professionals, including those produced by professional associations like CSTD, and those produced by for-profit organizations, like the e-Learning Guild and Nielsen Business Media.

Just as the most widely read professional magazines were those published by CSTD, the most widely attended event was the CSTD Annual Conference and Trade Show (which 37.3% had attended once or twice and 20.1% had attended three or more times). The second most attended event was the American Society for Training and Development International Conference and Exposition (which 14.2% had attended once or twice and 3% had attended three or more times). CSTD is a global partner of ASTD and actively promotes the ASTD conference. The third most attended event was the CSTD National Symposium, an event started in 2005, which 12.7% had attended once or twice and 1.7% had attended three or more times.

The next most attended events were those intended for practicing professionals, including (in order of participation) events by the e-Learning Guild, the ISPI (International Society for Performance Improvement) Annual Conference and Exposition, e-Learn, TechKnowledge (ASTD’s technology-focused conference), TRAINING/Spring and TRAINING/Fall (both produced by Nielsen Business Media), CLO (Chief Learning Officer), and ISPI’s fall symposium.

Less attended were the more academically focused events, none of which was attended by more than 1% of respondents (combining the responses of those who attended once or twice with those who attended three or more times). These conferences included the Canadian Association for Prior Learning Assessment Conference, the Canadian Network for Innovation in Education (CNIE) Conference (the new name of the merged Association for Media and Technology in Education in Canada (AMTEC) and Canadian Association for Distance Education (CADE)), the Academy of Management, and the Academy of Human Resource Development.

No one participating in the study had attended the American Educational Research Association Conference, reputed to be the largest event for research on learning.

Also of note was the low recognition rate of most of these events. Only three events seemed to be widely known. Of the rest, two were not known to 65% or more, five were not known to more than 70%, and six were not known to more than 80% of participants in the survey. Table 3 shows the responses to the question.

Table 3. Conferences Attended by Participants in the Survey


* Note that the survey tool automatically rounds to 1 decimal place, so percentages might exceed or miss 100% by 0.1%.

Credibility of Presentations at Meetings and Conferences

The most credible presentations at meetings and conferences were case studies of personal experiences reported by internal trainers (preferred by 43.9% of participants), followed by case studies of personal experiences reported by consultants (21.3%), industry reports by consultants and the staff of professional associations (13.5%), and reports of research by practicing researchers (12.5%).

Other formats of credible content named in write-in responses included case studies (like the ones named earlier) and “joint presentations of academics and practitioners.” One person commented, “I like personal real life experiences from whoever is sharing them - things I can use!” In fact, 70% of the written responses to this question mentioned the importance of being able to use the content. One participant noted, “More practical . . . includes strategies to overcome internal issues and politics.”

Participants’ Motivation to Track Developments in Research

The last group of questions explored the motivation to check for research-based content in the context of work, and the credibility of research-based content to participants.

When unsure of how to approach a training situation, survey participants responded that the type of information they are most likely to seek out includes:

97.7% of survey participants felt that they used research-based concepts on the job. Among the most frequently named research concepts were adult learning principles, Bloom’s taxonomy, coaching models, needs assessment, how learners learn, the Kirkpatrick methodology of training evaluation, leadership, and performance assessment. Only 15 of the 81 written responses to this question cited an academic source for the content. The rest cited professional sources such as CSTD and ASTD publications and events, as well as those of private providers like Langevin, TRAINING magazine, and TrainingOutsourcing.com.

To gauge the broader influence of research on the field, the survey asked participants to name the three most pressing issues facing the field of training and where they learned about these issues. The issues named clustered around the need for training evaluation (and the related issue of demonstrating the return on training investment), using learning technology (especially e-learning), and leadership of training groups (such as the perception of training in an organization, and securing resources for training). Participants learned about these issues through informal conversations with other trainers (40.4%), a publication or website (13.5%), or a conference (8.6%). But 37.6% said that they learned about these issues from other sources, most commonly their own experience.

A series of questions in this section explored perceptions about the relevance and usefulness of published information about research. 83.1% of participants responded that they generally find research studies in magazines, journals, blogs, and websites to be interesting and useful, and 74.7% said that they generally find these research topics to be relevant to their work.

In addition, 79.9% of participants responded that they generally find research studies in magazines, journals, blogs, and websites to be easy to understand. And if given a choice, 84.1% said that they would generally choose to read a research article in a magazine, journal, blog, or website.

A related series of questions in this section explored perceptions about the relevance and usefulness of information about research presented at conferences. 76.7% of participants responded that they generally find research presentations at meetings and conferences interesting and useful, and 72.5% said that they generally find these research topics to be relevant to their work. In addition, 66.3% of participants said that, if given a choice, they would generally choose to attend sessions on research at meetings and conferences.

Conclusions and Recommendations for Further Research

The results of the survey provide many insights, which are explored in this section. First, the key conclusions and their implications are presented, followed by the limitations of the study. In the final section, several suggestions are offered for future research.

Conclusions from the Data

In general, the results suggest that published content reaches more working professionals than content presented at conferences and meetings. Of the published content, professional magazines are more likely to be read by practicing professionals than peer-reviewed journals. Of the conferences, those intended for practicing professionals are better attended by that population than academic conferences. Most significantly, local content (Canadian publications and conferences) seems to get more attention from this population than content from outside the country.

In terms of content preferences and credibility, participants seem to prefer case studies over research reports and, within case studies, those reported by internal trainers rather than those reported by consultants. Written responses suggest that participants preferred research reports that contextualized research findings within their working situations. This is consistent with the way that researchers see case studies. For example, Yin (2003) commented that “case studies are a preferred strategy . . . when the focus is on a contemporary phenomenon within some real-life context” (p. 1).

Similarly, the data might suggest that retention of content learned at conferences is higher than retention of content learned through publications. Although fewer respondents named conferences than publications as the source of information about the three most pressing issues facing the field of training, the gap between the two was proportionally smaller than the gap between readership of magazines and attendance at conferences.

But the data also suggests that, outside of those provided by their immediate professional association, practicing professionals have little awareness of other publications, conferences, and events that might be relevant to their professional development.

Some results were consistent with empirical data from other sources. For example, the level of attendance at chapter meetings seems to be consistent with data found by the Association for Association Executives (Stolgitis, 1997) for all of its member organizations with affiliates.

In contrast, responses in one part of the survey conflicted, in some instances, with those in other parts. For example, less than 10% of respondents regularly checked blogs, and participants, on the whole, checked websites far less frequently than they read publications. Yet websites were named as the most preferred source of information. This response needs clarification, but it could be the result of the way questions were worded on this survey: the questions focused on ongoing habits, whereas participants might use the web to find and check facts on an as-needed basis, as questions arise in their work.

In addition, the readership of HRD Quarterly seemed surprisingly high considering the low responses to other questions about the Academy of Human Resource Development. A concern exists that the name of this publication might have been confused with other publications that have similar names, such as HR Magazine and HR Professional.

Similarly, some responses in the survey seemed to conflict with empirical data reported elsewhere. For example, only 8.6% of participants said that they relied on their own data and needs assessments to make professional decisions, but research conducted by Van Tiem (2004) and Rowland (1993) suggests that experts rely most on intuition when making decisions. Similarly, the most common written response to the question, “How did you find out about the three most pressing issues facing the field of training and development today (broader than the issues facing your employer)?” was personal experience. The difference in responses might be the result of semantics, but it might also indicate that more investigation of decision making by training professionals is needed.

Although the popular press suggests that blogs are an important source of information for all types of endeavours, the data from this study suggests that practicing professionals do not regularly check them. Only 7.9% of participants in the survey regularly checked blogs, and blogs were the preferred source of content for only 1.8% of participants.

In general then, the study seems to suggest that publications have a wider reach among practicing professionals than conferences and, of those publications, professional magazines have a wider reach than peer-reviewed journals. The study also suggests that “locally” published and presented research (in this case, local refers to Canadian and training-related) is more likely to be read than that produced by other organizations and that, outside of the immediate professional community, trainers have a limited awareness of other sources of content. Web-based content plays a role in professional life, but it is not a regular source of content (at least, at the time this study was conducted). In terms of the manner in which the content is presented, practicing professionals prefer case studies from the workplace over other types of content.

Implications

If these results were representative of the larger professional community (an issue discussed in the Limitations section below), they would have significant implications for those who fund research, as well as those who publish it. For those who fund research and require research dissemination as part of the process—including dissemination to those who might ultimately use the content—the results suggest that publishing the content in professional publications in the context of a case study would be especially useful.

In addition, if the finding that conferences are not widely attended were verified, funding organizations might require that at least one publication be included in the dissemination program for a proposed study—and might welcome dissemination programs that list few, if any, conferences. Furthermore, because professionals do not appear to widely read publications or attend events, the same content might need to be published and presented in several places to reach a wide audience.

This, in turn, suggests implications for those responsible for publishing magazines and journals, and producing conferences. In general, magazines and journals have preferred exclusive rights to publish content. However, in a highly fragmented market, publishers and producers might start considering, instead, co-publishing articles with related (but not necessarily competitive) publications. For example, CSTD might co-publish content with its counterparts in other countries.

Such cooperation is more likely among professional magazines than peer-reviewed journals, where a long-standing publication culture regards re-publication of material as ethically unacceptable, except in a few limited instances, such as a joint issue of a journal, like the one published in 2000 by the IEEE Professional Communication Society and the Society for Technical Communication. Should the data be verified, this ethic might need to be revisited in light of practical issues.

Similarly, all types of publications and conferences offer few, if any, case studies, even though these are overwhelmingly the most preferred sources of research content. If the results were validated in a larger study, funding agencies might consider how to raise the profile of case study research and reporting. (For example, even a quantitative study might be reported in terms of a “what if” scenario of a case study, showing how the results might be applied in a real-world context.)

In addition, the data suggests that blogs were not a significant source of information for practicing professionals at the time of the survey. If so, publishers of online content that includes blogs might reconsider the role of blogs in their overall publication plans. And the conflicting finding that the web was the most preferred source of research content suggests that data from research might need to be specially “packaged” for use in focused web searches. A website like usability.gov, which consolidates research on web usability into a series of research-based heuristics (each of which is rated based on the strength of the research evidence), provides one possible model.

Limitations

But the data from this study merely suggests these implications; limitations of the study prevent it from being used as a stand-alone source of data on which to base decisions. As noted earlier, a concern exists about the extent to which the participants represent all training professionals rather than merely those who belong to CSTD and chose to participate in the survey. A second limitation is that the study only looked at the use of publications and conferences on training. Yet the one question about readership of a business publication suggests that professionals might be reading such publications, and a future study might explore the use of content published outside the field of training. Third, the larger survey in which these questions were asked had 85 questions, admittedly a long survey, and some of those questions were unintentionally repetitious because different parts of the survey were written by different people. This could have caused response fatigue.

Fourth, the data is self-reported, and actual behaviour might differ from that indicated in the responses. In some instances, definitions were not provided, such as a definition of “regularly” when checking blogs and websites on training. Similarly, participants might have provided answers they believed to be socially acceptable, a conclusion that Rynes et al. (2002) reached in their study of a related population. More basically, concerns exist about possible confusion regarding the questions. For example, because the Canadian Network for Innovation in Education is new, the finding that so few people attended its conferences might result from a lack of familiarity with the new name rather than from not having attended earlier conferences of AMTEC and CADE. Similar confusion around the name of HR Quarterly also raises a concern about the quality of those responses. Although a formative evaluation of the questions reported here was conducted, it was informal and was not conducted within the context of the larger membership survey. In a future survey, a formal field test should be conducted.

Fifth, information about job role was not collected; therefore, we were not able to determine whether reading and attendance habits vary by this important demographic characteristic. In fact, because of some of the other limitations of the study, including issues of access to the data of the larger study, we did not run inferential statistics that might have indicated whether one or more sub-populations had a greater propensity to read the professional literature.

Suggestions for Future Research

To validate and extend the conclusions reached, some of which challenge current conventions for reporting research and transferring its results into practice, and to address the limitations of this research, a follow-up survey should, at the least, be conducted with a broader population of trainers in Canada, including those who do not belong to CSTD.

To address the self-reported nature of the data, observational studies might help validate those results that vary most widely from current conventions or that contradict other empirical data. Similarly, future studies need to investigate in more depth the use of web-based content, to resolve the contradictory results of this study.

In addition, future studies might explore differences in responses among different demographic groups within the profession, comparing participants’ responses by such characteristics as their number of years in the profession, whether or not they have formal academic education in the field, their role in the training department (such as instructor, instructional designer, or manager), and the type of industry in which they work.

These studies might also go further back, starting with interview-based research to generate grounded hypotheses about how professionals find research-based information for their jobs, and using observational studies not only to validate findings from any subsequent surveys, but also to gain insights into how to design research-based content in a way that practicing professionals are most likely to use.

With the data from these additional studies, research funders, publishers, and producers of research-related content should seriously address whichever of this study’s conclusions those additional data support.

References

Clark, R., & Estes, F. (2002). Turning research into results. Atlanta, GA: CEP Press.

Deadrick, D. L., & Gibson, P. A. (2007). An examination of the research–practice gap in HR: Comparing topics of interest to HR academics and HR professionals. Human Resource Management Review, 17(2), 131-139.

Rowland, G. (1993). Designing and instructional design. Educational Technology Research and Development, 41(1), 79-91.

Rynes, S. L., Colbert, A. E., & Brown, K. G. (2002). HR professionals' beliefs about effective human resource practices: Correspondence between research and practice. Human Resource Management, 41(2), 149-174.

Slavin, R. E. (2004). Education research can and must address "What works" questions. Educational Researcher, 33(1), 27-38.

Stolgitis, W. (1997). Participation by members in professional associations. Presented at the Society for Technical Communication 44th Annual Conference, Toronto, ON, May 11, 1997.

Van Tiem, D. M. (2004). Usage and expertise in performance technology practice: An empirical investigation. Performance Improvement Quarterly, 17(3), 23-44.

Yin, R. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage Publications.

Acknowledgement

This work was undertaken with the support of the Work and Learning Knowledge Centre of the Canadian Council on Learning, which bears no responsibility for its content.

Appendix A: Survey
