
Meeting Information


September 27, 2011


8:00 PDT / 9:00 MDT / 10:00 CDT / 11:00 EDT / 16:00 BST

Attendees: Linda Lewin and Alan Schwartz – Co-Chairs; Carol Carraccio, Sharon Coull, Mike Dugan, Simon Grant, Steve Kenney, Scott Smith, Valerie Smothers, Janet Trial and Lori Troy.

Agenda Items

1 Review minutes of last meeting

The minutes were approved as submitted.

2 Review Educational trajectory

Linda commented that the Educational Trajectory is the last project the group took on, and reviewing it will allow us to see how the current project will meld with and build on the previous work.  She added that the visual presentation is a good tool to facilitate discussion about what the data will look like.

Linda continued with an explanation of the slide presentation.  The first slide shows the years John Doe was in medical school divided into quarters; the different lines show what he was doing. The black line is his coursework leading to the MD degree, and there are no gaps in coursework. The green line shows coursework beyond the MD degree. The blue lines indicate two different enrichment activities.  On the second slide, clicking the black line shows where he was getting a degree and a brief description of what he was doing, who wrote it, and where the information came from (his institution).  The blue line on the third slide shows what clinic he was working in, the dates he was there, and that the activity was unpaid.  There is also a description of what he did, where the information came from, and when it was added.  The fourth slide shows Mr. Doe having a brief experience in his second year when he went to China as an unpaid volunteer.  Slide five shows coursework outside of the primary degree, where the student received an MS in a joint program through UC Berkeley and UCSF.

If you click on the link “Outcome” on the next slide, you can see that a manuscript was submitted and attached as a product of his experience.  Slide seven shows an enrichment example, taking a smoking cessation class for the homeless; if you click on “Testimonial”, up comes a letter from the person he worked with saying that he did a great job.  The information is there, but not all of it is visible; you need to drill down to get more detail. Slide nine shows the publications that would tie back to the Educational Trajectory; the learner can provide more details via Reflections.  Linda noted the last slide shows how people would enter their publications, which reflects the underlying data structure.

She noted that the beauty of this is that it presents a simplified picture and a place to start.  It wasn’t easy to put together, but it’s a good example of what we might be shooting for.  Valerie added that the presentation served as a great communications tool for the working group. It also facilitated communications outside the working group.  The development of the presentation was an interactive and highly iterative process.  The slides describe the data structure in a way that everybody can understand.  We want to use the same kind of process for our educational achievement work.

Linda asked the group for feedback. Sharon stated this clarified things for her.  Mike thought the report was very user friendly and provided a nice summary. He added that there would be work to get this kind of drill-down from a data stream.  Valerie commented that you could characterize unique capabilities, and a summary of the learner could be characterized as a testimonial within the Educational Trajectory.  Linda questioned whether we could embed all educational achievements within the trajectory and not have to come up with a brand new structure.

Simon commented that the fundamental authority is the learner themselves, as opposed to achievement information, which comes from the institution.  Can we do it in the same format?  Valerie explained that we are early in the process of defining data requirements; as we understand more about them, we can see how that will mesh with the Educational Trajectory and other useful specifications out there.  Linda mentioned we would need to continue past medical school and think about what the data parts would be.  Lori commented this was a great way to represent an enormous amount of information.  She added that if the user isn’t interested in a learner’s pathway through medical school, she shouldn’t have to see it.  Linda agreed.

3 Discuss data analysis (and preliminary feedback)

Linda continued with the data analysis. These are visual examples showing what we are looking for.  Valerie mentioned this document is intended to be beaten up.  It is also intended to bring together various points we’ve discussed on the last call. The first things we know we want to capture are statements of competence:

1) Whether the person is competent with regard to a specific competency.

2) The person’s level of performance related to a specific competency.

3) The person’s level of performance related to a component of a competency.

Several of the examples were taken from pediatrics.  Carol mentioned that, in terms of goals, she hoped for bullet points two or three.  Then we need data as to what the judgment is based on and what it means.  Simon noted that “developing as expected” was not a simple thing.  Linda added that this gets to sources of evidence, which are also described.  Linda asked about the Competency Working Group and how that plays into educational achievement standards. Valerie commented that the Competency Working Group is developing a standard for expressing competency frameworks like the ACGME competencies and CanMEDS. We can’t cram everyone into one framework; however, we can give them a way to express what their framework is and tie achievements to competencies.  Simon mentioned they haven’t really thought about the question of levels yet. That work is pending.
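The three kinds of competence statements above could be sketched as a single data shape; a minimal sketch follows, assuming field names the group has not yet defined (all identifiers here are illustrative, not part of any specification):

```python
# Hypothetical sketch of the three kinds of competence statements.
# Field names and example values are assumptions for discussion only.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CompetenceStatement:
    learner: str
    competency: str                   # reference into a competency framework
    framework: str                    # e.g. "ACGME" or "CanMEDS" (assumed labels)
    component: Optional[str] = None   # set only for statement type 3
    competent: Optional[bool] = None  # type 1: yes/no judgment
    level: Optional[str] = None       # types 2 and 3: level of performance
    date: Optional[str] = None        # timing of the statement (per Linda's point)

# Type 1: competent or not with regard to a specific competency
s1 = CompetenceStatement("John Doe", "Professionalism", "ACGME", competent=True)

# Type 2: level of performance on a specific competency
s2 = CompetenceStatement("John Doe", "Professionalism", "ACGME",
                         level="exceeds expectations")

# Type 3: level of performance on a component of a competency
s3 = CompetenceStatement("John Doe", "Communication", "ACGME",
                         component="agenda setting",
                         level="developing as expected")
```

The optional `date` field reflects Linda’s later point that the timing of competency statements would need to be captured.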

Valerie asked Sharon how they are doing things in Dundee. Sharon mentioned they are moving from the Scottish Doctor Learning Outcomes to Tomorrow’s Doctors, the UK outcomes framework.  Valerie asked if performance levels are based on year of study. Sharon answered yes; students have to reach competency by passing all assessments before moving forward.  Linda stated we would have to capture the timing of competency statements.

Linda asked Valerie to distinguish between achievement and competency.  Simon answered that achievement is attributing a competency to a particular person, and competency is what you mean by the concept.  Valerie agreed.  Carol mentioned they are dealing with formative as well as summative assessment; they want to see the progression to competence.  She asked if competence is the end game or if there would be a way to indicate that a learner surpasses competence.  Valerie stated she hoped that the levels of performance would allow you to do that. You would be able to indicate high achievers as well as those struggling.

Linda asked if the competency work encompasses achievement.  Valerie gave an example: Professionalism, as defined by the ACGME, is a competency.  Saying that Linda exceeds expectations in Professionalism is an achievement. We can then provide evidence of that achievement.  Achievements are tied to competencies, but achievements relate to a specific learner’s accomplishment. Simon added that some achievements may be outside the scope of a formal competency framework.
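The achievement/competency distinction described above could be sketched as follows; the structure and all names are assumptions for discussion, not a specification:

```python
# Illustrative sketch: a competency is a concept defined in a framework;
# an achievement attributes that competency to a specific learner, with
# evidence. The identifier and evidence item below are invented examples.

competency = {
    "framework": "ACGME",
    "id": "professionalism",   # assumed identifier
    "label": "Professionalism",
}

achievement = {
    "learner": "Linda",
    "competency": competency,  # achievements are tied to competencies...
    "statement": "exceeds expectations",  # ...but describe one learner's accomplishment
    "evidence": ["faculty evaluation"],   # hypothetical evidence item
}
```

An achievement outside a formal framework, per Simon’s point, could simply carry `"competency": None`.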

Lori asked how granular the competencies were. For example, the CanMEDS framework defines enabling competencies and then defines specific behaviors for each enabling competency. Valerie commented that the competency framework specification allows for hierarchical relations, as in CanMEDS, but does not mandate how granular a competency framework can get; the specification can support any level of granularity.
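A nested tree is one way to picture the arbitrary granularity discussed above. The sketch below assumes a simple recursive structure; the node labels are invented examples, not actual CanMEDS content:

```python
# Sketch of a hierarchical competency framework of arbitrary depth.
# The tree structure is the point; the specific labels are invented.

framework = {
    "name": "CanMEDS (example)",
    "competencies": [
        {
            "label": "Communicator",                    # role-level competency
            "children": [
                {
                    "label": "Establish rapport",       # enabling competency (invented)
                    "children": [
                        {"label": "Greets the patient", "children": []},  # behavior (invented)
                    ],
                },
            ],
        },
    ],
}

def depth(nodes, level=1):
    """Return the maximum depth of the competency tree."""
    if not nodes:
        return level - 1
    return max(depth(n["children"], level + 1) for n in nodes)

print(depth(framework["competencies"]))  # prints 3
```

Because children nest recursively, the same shape handles a two-level framework or a five-level one; nothing in the structure fixes the granularity.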

Linda referred the group to the visual examples.  In the data analysis you can see examples of ways you could indicate achievement: the linear scale arrow could be moved to the right; the radar plot/spider web shows all the different competencies in one place with the ability to drill into the details; the checklist from Bob Englander indicates competent/not competent and the source of the data.  Linda asked the group for reactions.

Scott mentioned the first visual was what he was talking about, looking at exam and course grades; seeing the distribution via a histogram would be nice. He added that they use radar plots in evaluation of trainees. The small print indicates how the number is generated. They are one of nine Internal Medicine programs working with ABIM to develop milestones and Entrustable Professional Activities (EPAs).  They spent four years creating an agenda-setting observational evaluation tool to see how well trainees set the agenda with their patients. They found that agenda setting crossed five competencies and twelve milestones.  They expect trainees to progress from two to three, making the circle move outward over the course of the year.

Carol asked about individual EPAs.  Scott replied that eventually they will start adding EPAs as evidence. He clarified that the benchmarks change every year. They use the radar plots to see where strengths and weaknesses are and figure out a plan for how to address them.

Linda asked Scott if it has helped identify people having trouble.  Scott answered yes; you get used to the patterns quickly.  Valerie asked Scott if she could follow up with him after the call to learn how EPAs will fit in.  Scott agreed, adding that EPAs are a connection that works well for them. Carol asked how many EPAs there are for Internal Medicine. Scott answered there are four to five, including agenda setting.  They set a competency level that was a low bar.  The levels of competence/milestones help people get better even if they are already really good.  They are now using videos to train faculty on scoring.

4 Open discussion


Action Items

Valerie will follow up with Scott on EPAs.

Group members can contact Valerie with questions on the data analysis document or feedback regarding where they would like achievements to go.
