January 3, 2012
8 PST/11 EST/16 GMT
Attending: Linda Lewin and Alan Schwartz, Co-Chairs; Susan Albright, Carol Carraccio, Mike Dugan, Simon Grant, Patricia Hicks, Kimberly Hoffman, William Iobst, John Jackson, Valerie Smothers, Kevin Souza, Janet Trial, and Sandra Waters
1 Review minutes
The minutes were accepted as submitted.
2 Discuss podcast and any preliminary feedback from medical students
Valerie summarized that she had sent the podcast and overview to the students in December as previously discussed. She noted the students had until January 13 to respond with feedback. She thanked everyone for the great feedback provided. She asked the group if anyone wanted to receive the link, and several members expressed interest; she then sent the link out to the working group. Valerie requested that edits to the podcast be sent to her; she will maintain a current version on the website. The request for feedback went to the OSR (the AAMC's student representative mailing list), AMSA (the American Medical Student Association), the AMA medical student section, and the Student National Medical Association.
Bill commented that Valerie had asked for feedback on Internal Medicine's perspective around the characterization of the subcompetencies. He thought they were wonderful and aligned nicely with the work in the Internal Medicine community; however, the terminology was confusing. He mentioned an article set to appear in the January issue of the New England Journal of Medicine that formally presents new GME accreditation standards addressing the concepts of milestones, EPAs, and narratives. The article will be coming out in the next few weeks and will be important for the group to review. Bill noted that the Internal Medicine community is not as organized as the pediatric community in developing competencies for training programs but has recently formed a working group to address definitions. He suggested adding representatives from this group to the working group and checking with Lee Berkowitz, who will have recommendations for additions. Carol agreed that was a great idea.
Kevin asked about the timeline for the standard and how it may change as a result of all this. Valerie shared that the original timeline was to have completed the standard by December, so we are clearly working on a different timeline now. She added that was appropriate given the complexity of the standard; the sponsors (NBME and AAMC) support extending the timeline to get the job done right. Kevin was concerned about pressure from the sponsors, and Valerie agreed to check with the sponsors about the timeline.
3 Discuss weighting of exam data
Alan asked Valerie to summarize the issue for the group. Valerie explained that on the last call, David Melamed had raised whether weighting of standardized exams should be incorporated in the standard. He proposed that USMLE account for 10% of medical knowledge scores; Valerie commented that such policy decisions are beyond the scope of MedBiquitous' work, but weighting could be incorporated in the standard. Valerie asked the group whether they wanted more details about the sources of competency data, and the consensus was to include information about the evidence. Simon recommended a text field to describe the various sources and weighting, and the group agreed that would be sufficient. Patty commented that allowing the user of the data to weight certain achievements more than others would be useful. Simon responded that this was an issue for software implementing the standard, not for the standard itself.
Patty asked what validity evidence would be available for each piece of data that is offered, and how we would share that information with people making decisions based on it. Simon noted that a useful tool requires a comparable scale, in terms of the values and scores attached to it; otherwise the data might be misleading. Alan added that the argument around accepting another institution's evidence involves much more than weighting. Alan recommended recording the group's decision and providing an opportunity to revisit the point when David is available to respond.
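The text-field approach the group settled on might look something like the following fragment of an XML-based achievement record. This is a hypothetical sketch only: the element names, attributes, and weighting figures are illustrative (the 10% USMLE figure echoes David's proposal) and are not drawn from the actual MedBiquitous schema.

```
<!-- Hypothetical sketch; element names are illustrative,
     not taken from the MedBiquitous standard. -->
<CompetencyAssessment>
  <Competency>Medical Knowledge</Competency>
  <Score scale="1-5">4</Score>
  <!-- Free-text evidence field: sources and any weighting are
       described for human readers, not machine processing. -->
  <Evidence>
    Composite of faculty observations (90%) and USMLE
    performance (10%), weighted per institutional policy.
  </Evidence>
</CompetencyAssessment>
```

Keeping the evidence description as free text, as Simon recommended, leaves weighting policy to each institution and to software built on the standard, rather than baking a machine-readable weighting scheme into the standard itself.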
4 Review updates to data analysis slides
Valerie explained that the slides with red asterisks in the title have changed. Slide twelve shows data about levels of supervision overlaid on EPA assessment data. The green line shows the supervisory levels, taken from the milestone Carol sent around on providing appropriate supervision. Linda questioned whether level of supervision was a factor of the learner or the observer. Carol answered that the level of supervision stays the same between points two and three; we don't know that advanced novice equates with direct responsiveness, and we don't know enough to equate those.
Valerie asked if it would be better to separate this out. Bill thought using competing frameworks is confusing. A supervisory framework resonates in terms of accreditation, but he is not sure we can equate a level of supervision with a level of performance. There is no general consensus to use the Dreyfus scale in a competency-based system (for example, general surgery doesn't use this scale). Valerie noted that the words in blue are changeable and asked whether it would be better to use different examples or labels for illustration purposes. Carol answered that they were using levels one through five. Bill mentioned they are getting away from the tendency to pick the category rather than the performance. Carol recommended providing the descriptions of the levels on mouseover. Valerie agreed to remove the labels from the performance scales and separate out the supervisory data.
Carol and Bill added that the MOC concept becomes more important after residency and would not be part of that scale; the scale is context dependent. John clarified that organizations have the option to define what the scale points mean. Bill asked whether this needs a more concise framing of what stage learners are in within their careers. His concern was for people coming out of GME; they may be ready for unsupervised practice in the larger specialty while completing a fellowship in a subspecialty for which they are not yet ready for unsupervised practice. Valerie will add something to make that clearer. Carol agreed it would be nice to have a continuum that builds on what their achievements have been in GME. Simon said an achievement needs two pieces of information: the level within the scheme and an ID. Bill said the data alone aren't enough; you must identify the program to which the data apply. A year later you could start a program in which you are not competent. We need to make competency clearer to avoid confusion in interpreting the data.
Valerie continued with changes on slides fifteen and sixteen, which show how you can drill down to subcompetencies from the EPA. Trajectory was added on slide eighteen at Kim's suggestion; you can see that the learner in this particular program was awarded entrustment for care of a healthy newborn. You can see other EPAs there as well. Alan suggested carrying over the discussion of slide eighteen to the next call.
Susan asked whether some examples for medical schools could be added instead of just residency. Valerie agreed that was a good suggestion. Carol commented that some of the slides could be used for medical students as well; on slide eight, essentially, levels one and two are medical students. Valerie asked for volunteers to work on slides showing medical school performance.
5 Open discussion
- Continue discussion of slide 18 on next call
- Have a text field for evidence in which the various sources and weightings may be indicated; do not strive for machine-readable weightings provided by the institution.
- Recommend a user-driven weighting system to those developing software based on the standard.
- Valerie will work with Bill Iobst to add members from the new Internal Medicine working group to the MedBiquitous working group. (They will check with Lee Berkowitz)
- Valerie will check with sponsors regarding our new timeline.
- Valerie will make the following changes to the ppt:
- Remove labels from performance scale on slides 12-17
- Separate the supervision data from the competency data instead of overlaying them
- Add data indicating the program and any other necessary context to EPA slides
- Valerie will ask for volunteers to help with creating more slides relevant to medical schools