Mining reflective continuing medical education data for family physician learning needs
  1. Denice Lewis,
  2. Pierre Pluye,
  3. Charo Rodriguez and
  4. Roland Grad
  1. Department of Family Medicine, University of Ottawa, Ottawa, Canada
  2. Department of Family Medicine, McGill University, Montreal, Canada
  1. Author address for correspondence: Denice Lewis, Information Technology Primary Care Research Group, Department of Family Medicine, McGill University, 5858 Côte-des-Neiges, Suite 300, Montréal QC H3S 1Z1, Canada; denicelewis@hotmail.com

Abstract

A mixed methods study (sequential explanatory design) examined the potential of mining data generated by consumers of continuing medical education (CME) programs for use by developers of CME programs. Quantitative data generated by family physicians, through applying the information assessment method to CME content, were presented to key informants from the CME planning community in a qualitative description study. The data were found to have many potential applications, including supporting the creation of CME content, CME program planning, and personal learning portfolios.

  • clinical informatics
  • continuing medical education (CME)
  • needs assessment

This is an open access article distributed under the terms of the Creative Commons Attribution (CC BY 4.0) license: http://creativecommons.org/licenses/by/4.0/

Introduction

Participation in CME activities is a basic element of self-regulation in the medical profession. This is particularly important in a specialty as clinically broad-based and diverse in practice as family medicine. CME programs have traditionally based the assessment of learning needs on internally driven, unguided self-assessment. However, the literature on self-assessment supports the notion that people do not know what they do not know, which has left a paucity of trusted data for CME planning. We set out to assess the learning needs of family physicians (FPs) in the context of an online, accredited, reflective learning CME program.

Methods

Over a 22-week period, FP members of the College of Family Physicians of Canada were emailed e-Therapeutics+ 'Highlights': evidence-based treatment recommendations provided by the Canadian Pharmacists Association. Each highlight was linked to the validated information assessment method (IAM), a 4-construct, 23-item tool designed to stimulate reflective learning. In CME programs, IAM systematically documents reflection on the relevance, cognitive impact, use, and health outcomes of objects of information delivered by (or retrieved from) electronic knowledge resources. Details are provided in the work of Lewis.1
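
To make the structure of these data concrete, a minimal Python sketch of a single IAM rating record follows. The field names are hypothetical illustrations of the four constructs; the real instrument comprises 23 items, and its actual data schema is not reproduced here.

    from dataclasses import dataclass

    # Hypothetical record for one IAM rating of one highlight by one physician.
    # Field names are illustrative only; the real instrument has 23 items
    # spread across the four constructs named in the text.
    @dataclass
    class IAMRating:
        physician_id: str                # anonymized rater
        highlight_id: str                # the e-Therapeutics+ Highlight rated
        relevant_to_practice: bool       # construct 1: relevance
        learned_something_new: bool      # construct 2: cognitive impact
        motivated_to_learn_more: bool    # construct 2: cognitive impact
        will_use_this_information: bool  # construct 3: use
        expected_health_benefits: bool   # construct 4: health outcomes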

Results

A total of 3690 FPs participated, providing an average of 680 IAM ratings per highlight. For each highlight, we counted the number of positive responses to IAM items such as 'I learned something new' and 'I am motivated to learn more'. Two patterns emerged from the quantitative data: (i) little reported learning and low motivation to learn more despite high rated clinical relevance (e.g. the highlight on low back pain) and (ii) substantial reported learning and high motivation to learn more despite low clinical relevance (e.g. the highlight on lymphogranuloma venereum). When viewed through the lens of clinical relevance, the highlights with the highest proportions of 'I am motivated to learn more' ratings revealed learning needs.
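
To make the counting described above concrete, the sketch below computes per-highlight proportions of positive responses, assuming the ratings have been flattened into one table with one row per response. The column names and toy values are assumptions for illustration, not the study's data.

    import pandas as pd

    # Toy table: one row per IAM response (values are illustrative only).
    ratings = pd.DataFrame({
        "highlight": ["low_back_pain", "low_back_pain",
                      "lymphogranuloma_venereum", "lymphogranuloma_venereum"],
        "learned_something_new":   [False, False, True, True],
        "motivated_to_learn_more": [False, True, True, True],
    })

    # Per-highlight proportion of positive responses for each item; the mean
    # of a boolean column is the fraction of 'yes' answers.
    proportions = ratings.groupby("highlight")[
        ["learned_something_new", "motivated_to_learn_more"]
    ].mean()

    # Highlights with the highest 'motivated to learn more' proportions are
    # candidate learning needs, to be weighed against rated clinical relevance.
    print(proportions.sort_values("motivated_to_learn_more", ascending=False))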

Quantitative data from the IAM tool were then interpreted by six continuing professional development (CPD) experts during structured telephone interviews. These CPD key informants were impressed by the reflective component of both the program and the resulting needs assessment data; compared with the data produced by traditional CME needs assessments, they interpreted these data as higher in quality. The pattern seen for the low back pain highlight was the easiest to interpret: it likely represents a clinical topic in which therapies and best practices have remained stable for some time, and/or one in which the information contained in the highlight did not address learning needs.

Conclusion

Analyzing IAM ratings provides a novel way to capture the CME needs of a diverse group of FPs. Our findings may be applied in practice across several domains of CME. First, content providers for CME programs may use IAM data to edit their knowledge products to fit the voiced, clinically relevant educational needs of family physicians. Second, CME planners may incorporate IAM data based on the responses of FPs from a defined geographic area (e.g. by province) for the purposes of program development. Third, IAM ratings can be used to create self-learning portfolios for individual FPs.
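
As one possible realization of the second application, the sketch below extends the same kind of flat ratings table with each rater's province; again, column names and values are hypothetical.

    import pandas as pd

    # Hypothetical flat table of IAM responses, with the rater's province.
    ratings = pd.DataFrame({
        "province":  ["QC", "ON", "QC", "ON"],
        "highlight": ["low_back_pain", "low_back_pain",
                      "lymphogranuloma_venereum", "lymphogranuloma_venereum"],
        "motivated_to_learn_more": [False, True, True, True],
    })

    # Province-level proportions of 'motivated to learn more' ratings, one
    # possible input to regional CME program planning.
    regional = ratings.groupby(["province", "highlight"])[
        "motivated_to_learn_more"
    ].mean()
    print(regional)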

Acknowledgments

The authors would like to thank Carol Ann Repchinsky, Bernard Marlow and James Bonar for their guidance and access to the described CME program and its content.

Reference

  1.