
Meeting Information


March 26, 2013



Call in Number

USA +1-203-418-3123



Attendees: Francis Kwakwa, Chair; Doris Auth, Jennifer Baumgartner, Tony D’Ambrosio, Pamela Ball, Stephanie Cordato, Brad Hill, Cynthia Kear, Ed Kennedy, Joanna Krause, Tom McKeithen, Raj Patel, Nikki Reyes, Valerie Smothers, Lorraine Spencer, John Sweeney, Emma Trucks, John West, and Julie White.

Agenda Items

  1.   Review minutes of last meeting

The minutes were approved as submitted.

  2.   Implementation update from CORE

Francis explained that CORE is the Collaborative for REMS Education and comprises ten organizations.  Two people from CORE were on the call, Tom McKeithen and Cynthia Kear.  Tom explained that the American Academy of Hospice and Palliative Medicine held a three-hour meeting on March 13th.  They had a good turnout, and 82 evaluation assessments were returned.  The knowledge and competence assessment results were good on both the pre- and post-test: there was an improvement from 57% correct on the pretest to 86% correct on the post-test, across 14 knowledge and competency questions.  Of interest to this group may be the demographic-type questions.  The audience was mostly physicians and advanced practice nurses, with 10% other, mostly nurses. 

As for the practice type questions, 58% fell into the other category.  This issue was discussed before: the practice type questions are only supposed to go to physicians, but because these questions were on the same sheet as the assessment, everyone had the chance to answer them, so results were from the whole group.  Of the 58% who indicated a practice type of other, virtually all put down palliative medicine.  25% indicated primary care, 6% indicated pain specialist, and 6% indicated non-pain specialist.  Nothing was unexpected, except that the issue of practice type remains: with a paper assessment there is no way to direct that question to certain professions.  Cynthia mentioned she was mindful of the amount of time this process takes, a nearly 3¾-hour process of pre- and post-testing.  CORE has a tremendous number of live activities that are 2-3 hours on average.  How much time does it take a learner to complete the pre- and post-test?  Tom noted there is a person dedicated to noting the “run” time.  There is a unique balance between a content-rich curriculum and on-the-spot assessment.  They will continue to pay attention to that. 

Francis asked if the pre- and post-tests are done at the same time as the activity.  Tom answered the pretest is done online; registrants were sent a link 3-4 days before the activity.  Valerie asked Tom about the four practice types: primary care, pain specialist, non-pain specialist, and other.  How will they use that data?  Tom answered they will use the data to look at each of these practice types separately and note differences from an outcomes, improvement, and needs-assessment point of view.  Capturing the nature of the other group may be another issue; they could be considered non-pain specialists.  They didn’t recognize themselves as non-pain specialists, but for reporting into the system they would be regarded as non-pain specialists. 

Francis asked if we should better explain what these categories are on the form so that learners can select them more accurately.  Tom thought we need to decide what we are looking for there; to him, if it were to say other specialist, that might capture more of what we’re looking for.  Francis commented that defining it would help people get out of the other category, and Tom thought that was possible.  Francis asked if the forms could be modified moving forward.  Tom stated the forms could be changed.  Francis asked if Valerie should work on the wording of that or leave it to providers.  Valerie suggested leaving it to the providers. 

Julie commented that they used the description that was referenced in the implementation guidelines, which are available on the website.

Linda asked Tom if the 82 people who completed the assessment completed a post-activity assessment or the pre-assessment; Tom answered it was the post-assessment, although numbers for the pre-assessment were fairly similar.  Linda asked if the forms referenced the definition of prescriber and whether audience members had concerns about that.  Tom commented they did include the question “Have you prescribed opioids in the past year?”  Cynthia also confirmed it was on there.  From the reporting they received from Julie at AAHPM, there didn’t seem to be any resistance to that question.  It was interesting to see the mix of audience members.  Julie noted they understand the primary audience, the ER/LA opioid prescribers.  If people are uncomfortable with their level of knowledge they may not be current prescribers; it was interesting to see how the data splits out. 

Julie commented that 49% or 47% of the people taking their course were not active prescribers.  Tom shared that 68 out of 82 had prescribed in the past year.  Francis asked if the evaluation was required.  Tom answered that attendees were strongly urged to complete it and given the time to do so.  Cynthia noted it was the white coat effect: the course director encourages them to do the evaluation and post-test.  They have a faculty training program to encourage completing evaluations from the podium.  Maybe a summary slide could be added to remind people that it is really important to fill out the assessment.  They are working closely with faculty training programs; assessment questions take a lot of time, and we need to encourage learners to complete them.   

  3.   Discuss feedback from RPC-accreditor calls

     a.  Fields for identifying duplicates, and potential changes to the implementation guidelines

Francis began by suggesting the ACCME could begin this discussion.  Ed mentioned he was on a call last week with staff of the RPC and Polaris to discuss fields describing REMS activities.  There were a few points of data they might want to add, such as activity date, location type, and activity identifier.  The discussion evolved into the need for collecting data to de-duplicate across accreditors and the fields that would best serve that purpose.  Ed recommended the group look at the activity ID as the primary key for determining uniqueness.  The RPC could assign a unique identifier to organizations putting on REMS activities, which can be referenced when providers pass data to accreditors and forwarded on to the RPC.  Valerie asked the group if they thought that sounded like a good approach.  Francis thought any kind of unique identifier would be helpful.  He asked how we could accomplish that: does every activity go through the RPC?  Linda commented that it would be possible for RPC-supported activities; other activities would not receive any commercial support.  If an activity doesn’t have an RPC identifier, how do you avoid double counting it? 

Valerie commented that the provider’s own identifier for the activity would be unique.  For example, if Johns Hopkins had a REMS-compliant offering that wasn’t funded by the RPC, they would include that identifier in the data sent to ACCME, ACPE, and any other accreditor; they would include the JHU ID each time.  Lorraine shared that that is what they do for PARS now: every record is tagged with an internal unique identifier, so it’s easy to identify the activity that way.  Francis asked whether we already have a unique activity ID created by the provider.  Lorraine said you could have duplication; the question that came up on that call was whether an activity offered both live and web-based would have separate and distinct activity numbers.  Lorraine said they would be different for her organization: enduring and live are two different activity types. 

Pam commented that some programs could be accredited by multiple organizations besides the ACCME, so having some kind of unique number is important.  Valerie agreed, noting that including the provider’s unique ID for the activity is very important.  She asked Pam if the AOA collects the ID created by the provider for the activity: if SOMA in Arizona is accredited by the AOA, do they include SOMA’s activity identifier?  Pam answered yes.  Valerie asked: if you pass on the SOMA ID, would that identify duplicates?  Pam stated it would have to be tagged or referenced to know that it came in from the ACCME first and then the AOA.  The AOA gives SOMA a unique identifying number and the ACCME gives them another number; if both are submitted, how do you identify the duplicate?  Valerie asked how the accredited provider is identified when reporting: when you send data to the RPC, do you include the name of the organization or a number?  It is problematic that different accreditors have different names for the same accredited provider.  Pam answered they talked about how to safeguard against that kind of thing; if the name of the organization is Johns Hopkins Inc. and we just put Johns Hopkins, it gets the same consideration and will not be doubly counted. 

Valerie noted that some of this is beyond the scope of MedBiquitous’ work; Polaris and the RPC are resolving these problems.  What do we need to put in our implementation guidelines so that Polaris can identify duplicates?  Ed had suggested activity date, location, type, and identifier.  Type may be problematic given that accreditors use different nomenclature to describe it.  The other fields are already in the guidelines.  Raj mentioned activity type, location, and activity ID are important.  If an ID is unique to the provider, then it’s unique no matter who they provide the data to; that is the best option.  Valerie asked if the issue is that there are providers that don’t include their own ID when they send accreditors data.  John stated that it is optional; in general they plan to make it a required element in REMS and have a uniqueness requirement as well.  Valerie thought that was helpful. 
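The de-duplication approach Raj describes, keying on an ID that is unique within each provider, can be sketched as follows. This is a hypothetical illustration only; the field names are invented for the example and are not taken from the Healthcare LOM specification or the implementation guidelines.

```python
# Hypothetical sketch of de-duplicating activity records reported by
# multiple accreditors. Field names are invented for illustration.

def dedupe(records):
    """Keep one record per activity, keyed on (provider, activity ID).

    The provider-generated ID is only guaranteed unique within that
    provider, so the provider must be part of the key.
    """
    seen = set()
    unique = []
    for rec in records:
        key = (rec["provider"], rec["activity_id"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# The same JHU activity reported through two accreditors counts once.
reports = [
    {"accreditor": "ACCME", "provider": "JHU", "activity_id": "12345678"},
    {"accreditor": "AOA",   "provider": "JHU", "activity_id": "12345678"},
    {"accreditor": "ACPE",  "provider": "JHU", "activity_id": "87654321"},
]
print(len(dedupe(reports)))  # prints 2
```

Because the key includes the provider, two providers that happen to reuse the same activity number are still counted as distinct activities.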

Francis asked if the unique ID is from the provider side: do you keep the provider ID or give the accreditor ID?  Valerie thought you probably do both.  John mentioned they keep the provider’s own identifier and assign their own ACCME identifier.  Raj asked Valerie if the start date and end date describe the reporting period.  Valerie noted that was correct.  Raj asked Valerie if she would expect multiple records to have the same reporting date.  Valerie said yes; there are also fields in Healthcare LOM that would meet the other requirements Ed brought up and could be captured in different fields.  They are already in the specification and could be added to the implementation guidelines; Raj could also add them to the spreadsheet.  Valerie will update the implementation guidelines to reflect the conversation and present them to the group for review. 

Francis thought activity type is problematic given the differences in nomenclature; he is not sure how we would resolve that.  Ed commented that he doesn’t think activity type should be used to identify duplicates, for the reasons identified.  

Raj provided a recap for the group: going forward, location, activity date, and the accreditor-generated provider ID would be added, as well as the provider-generated unique ID.  Valerie clarified that the provider-generated activity ID was optional as of a couple of weeks ago.  She noted the guidelines say it “may be included”; should that be changed to “must be included”?  Raj said yes.  Valerie asked the group what they thought about making the provider-generated identifier required.  Jennifer asked how the provider-generated ID is being generated.  Valerie answered that if Johns Hopkins says their ID number is 12345678, each activity would have a different ID; it is an activity ID, not a provider ID.  Jennifer thought they could accommodate that in their system; they just have to figure out how to manage it.  It also requires providers to generate a unique identifier for each activity.  Francis commented the provider activity ID should be required. 

Valerie asked about recommending a standard naming convention for identifying providers.  She doesn’t think it’s necessary if all providers send their unique ID for the activity; that keeps it simple and addresses the question of non-RPC-assigned activities.  Raj mentioned that as long as providers pass the same ID to all accreditors, that would help flag duplicative data from other accreditors.  Valerie suggested changing the activity ID description in the implementation guidelines from “may include” to “will include.”  She can also add guidance for using an RPC-generated ID and an accreditor-provided ID.  Pam questioned whether providers should use the same ID every time they submit.  Pam confirmed the AOA would have its own identifier for an activity from the provider.  Valerie summarized the changes she will be making: change the activity ID to a required field in the implementation guidelines, include the RPC-generated ID as an optional field, and include the accreditor-generated activity ID as an optional field.  Raj agreed.     
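The required/optional field changes Valerie summarized could be checked mechanically along these lines. The field names below are hypothetical placeholders for the example, not the specification's actual element names.

```python
# Hypothetical sketch of validating a report record against the updated
# guidelines: provider-generated activity ID required; RPC-generated and
# accreditor-generated IDs optional. Field names are invented.

REQUIRED = {"activity_id"}
OPTIONAL = {"rpc_id", "accreditor_activity_id"}

def missing_fields(record):
    """Return the required fields absent or empty in a report record."""
    return sorted(f for f in REQUIRED if not record.get(f))

print(missing_fields({"activity_id": "12345678", "rpc_id": "RPC-001"}))  # []
print(missing_fields({"rpc_id": "RPC-001"}))  # ['activity_id']
```

A record with only the optional IDs would fail validation, which reflects the decision to make the provider-generated activity ID mandatory.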

     b.  Fields for identifying duplicates, and potential changes to the implementation guidelines

     c.  Jointly accredited confusion

     d.  Specifying supporters other than the RPC

  4.   Discuss linking to REMS

  5.   Discuss commercial supporter tags

  6.   Discuss updated implementation guidelines

  7.   Discuss updated specification 


Action Items

  • Change the activity ID to a required field in the implementation guidelines
  • Include the RPC-generated ID as an optional field
  • Include an accreditor-generated activity ID as an optional field