Attending: Sean Hayes and Francis Kwakwa, co-chairs; Wendy Anson, Peter Greene, Emil Petrusa, Jeannie Poterucha, Andy Rabin, John Schatzer, Juliane Schneider, Valerie Smothers

Francis asked Valerie to summarize the current status of the group's work. Valerie summarized that the group has developed technical standards for aggregate evaluation data, including a description of the activity, learner demographics, survey data, and knowledge assessment data. The standard provides details on how many individuals chose a specific response, which enables broad research efforts in CME. The standard was approved as an American National Standard by ANSI in 2009. The standard has been implemented by the ACCME as part of their Program and Activity Reporting System (PARS).

Andy added that if you are only looking to analyze your own data, the standard offers little value. It comes into play when you are looking to aggregate data from multiple groups.

Andy showed the CECity 360 platform, which provides support for education development. Organizations use the tool to roll out learning to users, and it also provides management features. Much of what is driving this is the regulatory environment. The ACCME has adopted MEMS. As a result, CECity added a component for PARS reporting that collects some additional data that doesn't fit anywhere else. They also added a validation component.

Peter commented that the ACCME chose the bare minimum for PARS data. How would you handle a group of providers doing serious evaluation?

Andy added that some providers have simpler approaches. They haven't added the ability to read in a MEMS report, but they can launch a survey.

Emil asked if the system allows users to create surveys. Andy replied it does.

Sean commented that the standards forced people to think about what they are collecting. There is a lot of junk collected as well as unique datasets. The standard drives direction. Think about what you are collecting before you start.

The group discussed the focus group findings. Sean emphasized the need to summarize the focus group findings and send them to the Alliance for CME technology group, which Brian McGowan chairs. Groups worldwide are demanding aggregate evidence of CE efficacy. We had a small, unique sample. We can support Jack taking that to the committee. Making it an open letter may raise visibility.

Juliane commented that the Medical Library Association conference would be a good place to promote the standard. There are regional meetings as well, such as the North American Health Sciences Library meeting.

Sean commented that we should work with the Alliance for CME to conduct a survey before their meeting in January. That would provide an opportunity to get feedback on the change adoption curve: what are the barriers to implementation, etc. Sean recommended talking to Jack about sending a letter to the Alliance technology committee. He added that there would be groups internationally interested in knowing about the standard.

The survey should:

  1. Assess readiness to change/adopt
  2. Gauge awareness

Peter asked if the Alliance had a venue to bring together vendors. Andy replied no. He added that certification may provide benefits.

Sean went on to say that the FDA's REMS (Risk Evaluation and Mitigation Strategy) requirements oblige pharmaceutical companies to collect outcomes data on opiate education. In many cases this education may be delivered by independent CME providers, so a standard format for data collection would be essential.

Emil commented that the FDA could require collection of certain data points, as the ACCME does with PARS.