Meeting Information

Date: February 7, 2012

Time: 8 PST / 10 CST / 11 EST / 16 GMT

Attending: Linda Lewin and Alan Schwartz, Co-Chairs; Susan Albright, Rachel Brown, Carol Carraccio, Maureen Garrity, Simon Grant, Kimberly Hoffman, Steve Kenney, Valerie Smothers, Kevin Souza, Loreen Troy, and Sandra Waters.

Agenda Items

1 Review minutes

Alan began the meeting with a review of the minutes.  The minutes were accepted as submitted.  

2 Discuss further medical student feedback

Valerie sent out a PowerPoint presentation on the preliminary feedback from medical students just before the call began. The AMA-MSS survey asked medical students what they thought about the ideas presented in the overview document we sent out. Slide one asked whether the data appears useful; 65.9% responded yes and 34.1% responded no. The AMA-MSS sent out its newsletter with a link to the survey yesterday, which will probably generate more feedback; the deadline is February 17, so the results should be available for the next call. They used the social networking sites Twitter and Facebook to gather the data shown here. Of the fifty people who responded to the survey, most were from Ohio.

Slide three summarizes student responses. Valerie commented that some of the responses pointed to the educational trajectory report. Some responses mentioned including much more information, such as scholarly achievements and MCAT scores. The fourth slide asked whether there was any data that was not pictured appropriately or that should be excluded. One respondent mentioned accessing the most recent data more easily. Another mentioned that the granularity of the data was too much for residency application purposes. Another comment was that the data metrics would require robust and timely reporting from medical schools; Valerie acknowledged that is where we need to go. There is a concern about the quality of the data and whether it is fair and accurate. One comment was that radar plots were too strong a visual image. Valerie asked if there were any questions or thoughts about the feedback.

Carol commented that she was struck by the references to college-level data; she is engaged with a group of people in Texas who are looking at creating a competency-based continuum between undergraduate and medical school curricula. Carol thinks we need to do more reaching back into the undergraduate years and allow people to showcase what they did then to get to medical school.

Scott Smith, from Boise, noted that the most striking comment, in his view, was that anything automatically viewable by a residency director should be excluded. That is counter to the purpose of the specification; it should be viewed as feedback for growth and improvement. He added that there was a misunderstanding about how the data is put in; it is done automatically by a system. He mentioned the comment about images of radar plots being too strong: they summarize strengths and weaknesses and help the student to see them. Clarity can frighten students. If they are doing well, it's great; if they are not doing well, it's scary. It's important not to make it punitive but to let it be for growth and improvement.

Valerie asked the group if they saw any changes we need to make based on these comments. Linda focused on the comments about what is viewable versus what is private. We may want to depict the ability to choose the right report at the beginning of the PowerPoint presentation. A residency application app would include some of the data but not all of it. She added that having a statement about accurate and standardized measurement was important and should be included. Rachel commented that may lead people to include data that is easily measured and leave out other important data.

Simon commented that the radar charts and the way the data is presented give an impression of how it is being used; he thought we need to make that really clear. It is a good opportunity to reflect on what the data is for and the context of use, or to be more explicit about how it will be used. He recommended adding examples of how it will be used.

Linda stated that in previous efolio discussions, the notion of a private area versus a public area was important; that concept has gone by the wayside. She recommended building on that idea. Simon agreed there needs to be a clearly distinguished way the information will be used. Kim noted much of the work to date has been focused on public-facing data; she would recommend more explicit information about the private space. Bob clarified that the private piece is distinguished by source verification; the public data will be person verified.

Simon commented that having only two thirds of respondents say the information is useful was disappointing. Valerie noted the process did make it clear it was for personal learning and self-improvement, useful for the learner, not for application to residency. Susan commented on the theme in the feedback that students want feedback, not judgment, at the end; she was not sure we can give them that in these standards. Valerie commented that the things she has heard that may have implications for the standard include the ability to indicate whether data is primary source verified or self-reported. Alan noted, more generally, the source. Sandra agreed.

Valerie agreed better indication of how the data is being used (a general personal use/private view as well as reporting "apps") would be beneficial. She commented that she was uncomfortable with the idea of adding a statement about accurate and standardized measurement; a text field where that can be described might be better. Linda commented that a nomogram and a text box describing the assessment methodology would be better. Scott commented it would be interesting if the efolio used different colors for private versus public data. He added that radar plots should include bulleted statements summarizing comments in written evaluations.

Linda thought the standard should be able to designate each item as sharable or not. Valerie commented that may be a system function more than a function of the standard; the line is not hard and fast. She was not sure we can say a given piece of data is always going to be private. Linda replied that if learners had the ability to mark something as visible only to themselves, they wouldn't feel the risk of having that data used when they didn't want it to be used. Simon recommended showing a "make private" button; having an example there would be powerful. Valerie will incorporate the changes and present on the next call.
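As a point of reference for the discussion above, the sketch below illustrates one way a data element could carry both a source indicator (primary source verified versus self-reported) and a learner-controlled visibility flag. It is a minimal, hypothetical illustration only; the names (AchievementItem, Source, Visibility) are assumptions for discussion and do not come from the specification.

    # Hypothetical sketch: an efolio item carrying the two indicators discussed above,
    # how the data was verified and whether the learner has marked it private.
    # All names and fields are illustrative assumptions, not part of any published spec.
    from dataclasses import dataclass
    from enum import Enum

    class Source(Enum):
        PRIMARY_SOURCE_VERIFIED = "primary-source-verified"  # entered/confirmed by the institution
        SELF_REPORTED = "self-reported"                       # entered by the learner

    class Visibility(Enum):
        PRIVATE = "private"   # learner-only view (the "make private" idea)
        SHARED = "shared"     # may appear in a reporting "app", e.g. a residency application view

    @dataclass
    class AchievementItem:
        name: str
        source: Source
        visibility: Visibility = Visibility.PRIVATE  # default to private until the learner shares it

    # Example: a self-reported item kept private, and a school-verified item shared for reporting.
    items = [
        AchievementItem("Research poster presentation", Source.SELF_REPORTED),
        AchievementItem("USMLE Step 1 passed", Source.PRIMARY_SOURCE_VERIFIED, Visibility.SHARED),
    ]
    shared_report = [i.name for i in items if i.visibility is Visibility.SHARED]
    print(shared_report)  # only items the learner chose to share reach the public-facing report

A default of private, with sharing as an explicit learner action, is one way to address Linda's concern about data being used when the learner did not intend it to be.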

3 Review updates to data analysis slides & report from UIC

Valerie mentioned there were a few new things to note on the data analysis slides. With the help of Jan and Maureen, she has added slides on what entrustable activities and skills would look like for medical students. Slide eleven shows activities that have been entrusted, arterial puncture and intravenous catheter, with details visible. Slide twelve shows the exam, the date it was passed, the exam score, and additional criteria; most schools use this model for clinical skills. Jan agreed that synthesized their conversation. The educational trajectory slide (slide 21) shows undergraduate medical education, with stars indicating where entrustment is awarded. Slide 23 includes additional exam data; all parts of the USMLE are now shown, with three-digit scores, two-digit scores, and standard deviation. Rachel commented that they are not publishing two-digit scores; they are only available upon special request, and certain licensing boards use the two-digit score. Valerie agreed to take out two-digit scores for USMLE.
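For illustration, the sketch below shows one hypothetical way the entrustment and exam data described on slides eleven and twelve could be represented as structured records (activity entrusted, exam passed, date, score, additional criteria). Field names and values are assumptions for discussion, not taken from the slides or the specification.

    # Hypothetical record structures mirroring the slide content discussed above:
    # an entrusted activity (slide 11) and the exam evidence behind it (slide 12).
    # Field names and example values are illustrative assumptions only.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class ExamResult:
        exam_name: str
        date_passed: date
        score: float                                   # scale is exam-specific (e.g. USMLE three-digit)
        additional_criteria: List[str] = field(default_factory=list)

    @dataclass
    class EntrustedActivity:
        activity: str                                  # e.g. "arterial puncture", "intravenous catheter"
        entrusted: bool
        evidence: List[ExamResult] = field(default_factory=list)

    # Example: an activity entrusted on the basis of a passed clinical skills exam.
    record = EntrustedActivity(
        activity="arterial puncture",
        entrusted=True,
        evidence=[ExamResult("M4 clinical skills exam", date(2011, 9, 1), 91.0,
                             ["minimum number of supervised procedures logged"])],
    )
    print(record.activity, "entrusted:", record.entrusted)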

Lori continued with an explanation of what she sent: the most current procedural skills list, developed over six years, annotated to draw attention to the procedures on the list that are not formally required on clerkships. Students have to log all of these within their procedure logs. All 15 are formally assessed in the M4 clinical exam that occurs early in the 4th year. Students don't have to enter these; it is done for them when they pass the examination. Lori will send Valerie two different reports: 1) an audit of the current management system, and 2) lists of procedures by student with approval dates certifying that they have satisfied requirements. They have had a lot of discussion about the minimum number of times a procedure should be performed for formal certification. Lori asked if the group would be interested in seeing a summary report, and the group was interested.

4 Open discussion

Decisions

Action Items

  • Valerie will edit the slides to reflect the discussion
  • Lori will send skill reports