April 17, 2012
8 PDT/9 MDT/10 CDT/11 EDT/16 BST
Attending: Linda Lewin and Alan Schwartz, Co-Chairs; Susan Albright, Carol Carraccio, Simon Grant, David Melamed, Scott Smith, Valerie Smothers, and Jan Trial
1 Review minutes
The minutes were approved as submitted.
2 Review revised data model
Valerie continued the discussion of the revisions to the data model. The first page originally had a block for educational trajectory, which was renamed portfolio to allow for more flexibility. Simon commented that the more specific you are, the better. Valerie mentioned her discussion with Simon regarding a new format of Leap2A using microdata instead of XML. There is the potential ability to incorporate a data set or link out to a data set. Right now that is just a placeholder, but it’s a good opportunity for discussion. Simon mentioned it was great to see the portfolio integrated throughout the document; however, he didn’t think duplicating information was a good idea. He suggested thinking about scenarios for use of the information. You could link to a URL that gave the student’s view of this. Valerie mentioned the one thing that is important is the capability to reference specific pieces of the portfolio.
Valerie continued with the changes to the event on page two. Previously there was a box for continuing education credits, parallel to what exists in the Curriculum Inventory Working Group. Valerie discussed the draft data model with the Technical Steering Committee, and they raised concerns about having two ways to describe and transmit data about continuing education and the confusion that might result. The Activity Report standard already describes how to document maintenance of certification and CME activities. The Curriculum Inventory description is very different from how maintenance of certification is currently described: certifying boards ask whether the assessment was completed and whether you passed; they do not focus on scores or specific competency reporting. Carol commented that perspective may change. She had been talking to the Internal Medicine group about changing the framework and getting more granular around milestones and the kinds of data that are going to be important for verifying competency for certification. Valerie clarified that would apply to a resident trying to sit for the boards. That data would be coming from their residency program or from a data commons. You would still use the Educational Achievement specification for that interchange. She asked Carol whether, after the initial certification is received, there would be any tracking or reporting of maintenance of certification related to milestones. Carol mentioned that MOC is set up to have four parts that mirror the six ACGME competencies; any incorporation of milestones into that is years away. Valerie proposed updating the Activity Report specification to reflect that need when the need arises.
Susan asked if we would take it out of the Curriculum Inventory. Valerie explained they would have to talk to Terri Cameron, who is working at the AAMC on the curriculum inventory portal, and obtain her input. Valerie will have a conversation with Terri, Marc Triola, and Susan on this.
Valerie continued with the question on page two of the document regarding the learner’s ability to mark things as private. Valerie noted that at the NBME, either you share data or you don’t share it at all. She asked the group if there were specific things they wanted marked private. Linda commented that her group marks as private things that are more formative; formative assessments may not be validated. Carol mentioned the information is important for the learner, but people might be reluctant to be honest about strengths and weaknesses without a privacy feature. Not being honest and specific with feedback would be a disservice; if the feedback is not so terrific, the learner can use it to get better. Linda commented you should be sharing things that are validated and meaningful. For example, shelf exam scores are validated, and she would be happy to make them non-negotiable.
David expressed concern that every single data point would be exported into the system. Valerie asked if a learner reporting educational achievement data from the University of Maryland should be able to mark part of that data as private. Alan added that we had not discussed the program requiring learners to report something out; the learner chooses what to report out. David commented that was pushing the limits of administration; it can be extremely challenging. Linda asked if it was more an application question than a data element question. Valerie answered it could be a data question if the learner marks something as private through the e-folio connector, for example notes on their own performance. It is a matter of the level of granularity at which the learner selects data to report. At the University of Illinois at Chicago, for example, learner achievement data is recorded, and learners have been writing notes about their experiences. If a learner doesn’t want to share a reflection tied to an event, perhaps a business rule could exclude the reflection piece.
Linda commented that they would manually tell the system to omit formative assessment results. Valerie offered the suggestion of leaving it to the system to omit all notes and all formative assessments. Linda asked if there was a place to note whether information was formative or summative. Valerie noted that there is in the Curriculum Inventory specification; it is an attribute of the assessment method element. It is not depicted in the data model, but the assumption is that it would be there as an attribute. Valerie clarified that the group’s decision is that there is no privacy attribute; it is up to the system implementing the specification to give the learner the ability to omit reflections and formative assessments. Susan recommended stating this clearly in implementation guidelines and best practices. Valerie commented we have an implementers’ note with recommendations for implementers; she will add it.
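The approach described above can be sketched in an illustrative XML fragment. The element and attribute names below approximate the Curriculum Inventory’s assessment method element but are placeholders, not text from the published schema:

```xml
<!-- Illustrative only: an event with an assessment method carrying a
     formative/summative attribute. Per the group's decision there is no
     privacy attribute; an implementing system would simply omit
     reflections and formative results the learner chooses not to share. -->
<Event id="E001">
  <Title>Clinical Skills OSCE</Title>
  <AssessmentMethod purpose="Formative">OSCE</AssessmentMethod>
</Event>
```

Under this approach, privacy is a matter of what the exporting system includes, not a flag in the data itself.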
Alan asked if it made sense for notes to be attached to assessment results. Valerie commented that both notes and assessment results are attached to events. David shared that there are currently folks dealing with this at the graduate medical education level; program directors are making decisions about whether to make data public or not. Eventually, EPAs will be longitudinal from medical school to practice. If we make values optional, we are going to have breaks in the data and it will be fragmented. Where do we stand on what to make optional; do we need this data to build a robust dataset on the person?
Simon commented it depends on what the use of the data is. To make concrete the arguments for including privacy levels, we need to say we want to transfer bundles of information for mixed purposes. Why would you be doing that, and how? Valerie stated it is for mixed purposes; we have use cases for how the data is supposed to be used. Use cases include the learner using the data for their own improvement; an external reviewer looking at the data as part of a career transition, like applying to residency training or fellowship, or another professional transition; a researcher or administrator reviewing data looking for trends; and a mentor reviewing data to help the learner. The other use cases relate to authorization and learner control. Given those use cases, the learner is selecting the subset of data that gives them the privacy they need. However, to get back to David’s point, no program is going to let students submit whatever they please; there are pieces of data you will need to share. Valerie mentioned that is something that can be built into the system. Carol commented the student signs off on the type of information sent to the board. Valerie suggested not worrying about the privacy attribute for now; if any objections come up, we can deal with them.
Valerie continued with changes on page four. She noted this is the model of an EPA in a competency framework; there are still a couple of kinks to work out. The diagram shows a sequence block on skills assessment containing two events: an IV catheter observation and a clinical skills OSCE. The events reference competencies. The purple box shows a competency framework with a competency object called intravenous catheter insertion (this is the EPA). The performance framework is green; it has two performance levels: performs competently or does not perform competently. These tie to the competency framework and indicate the learner’s level of performance.
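The relationships in the page-four diagram might be sketched as follows; element names and identifiers are illustrative placeholders, not the published schema:

```xml
<!-- Illustrative sketch of the EPA model described above. -->
<SequenceBlock title="Skills Assessment">
  <Event id="E1" title="IV Catheter Observation">
    <CompetencyRef>urn:example:iv-catheter-insertion</CompetencyRef>
  </Event>
  <Event id="E2" title="Clinical Skills OSCE">
    <CompetencyRef>urn:example:iv-catheter-insertion</CompetencyRef>
  </Event>
</SequenceBlock>

<!-- Competency framework (purple box): the EPA as a competency object -->
<CompetencyObject id="urn:example:iv-catheter-insertion">
  <Title>Intravenous Catheter Insertion</Title>
</CompetencyObject>

<!-- Performance framework (green box): levels tied to the competency -->
<PerformanceFramework competencyRef="urn:example:iv-catheter-insertion">
  <PerformanceLevel>Performs competently</PerformanceLevel>
  <PerformanceLevel>Does not perform competently</PerformanceLevel>
</PerformanceFramework>
```

The key design point is that events reference competencies by identifier, so a learner’s performance level can be tied back to the EPA without duplicating the competency definition in each event.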
David liked this, and he noted it would be worth considering adding values to the observation and clinical skills assessments, similar to a credit score; points for different types of assessment would allow you to judge curricula against one another. Carol asked who was working on tools for EPAs to enable this kind of comparison. David mentioned Kelly Caverzagie in Internal Medicine has been working on EPAs. Scott mentioned two groups in Internal Medicine have been working to validate EPAs; at their annual meeting, 15 EPAs were presented for surgery.
Valerie continued with changes to the document on page five showing how individual assessments can be described. She suggested going over the specifications next time. Susan asked Valerie whether the medical schools know all this is happening. Valerie said there was an announcement at the GIR meeting and mentioned talking to our AAMC colleagues to determine when and how to make an announcement.
3 Overview of initial specification
4 Open discussion
There is no privacy attribute. It is up to the system implementing the specification to give the learner the ability to omit reflections and formative assessments, or to do so as a matter of course.
The implementation guide and recommendations will emphasize the importance of providing privacy for learner reflections and optionally formative assessment.
Valerie will talk with the AAMC about when and how to announce the specification work to medical schools so that they have ample time to implement.