June 20, 2017
Attending: Susan Albright, Hugh Stoddard, Co-Chairs; Sascha Cohen, Editor; Terri Cameron, Kristi Ferguson, Walter Fitz-William, Paul Schilling, Jenny Shaw, Valerie Smothers, and Julie Youm
1 Review minutes
The minutes were accepted as submitted. Sascha noted that UCSF is tracking only five Core EPAs. Valerie noted that Colorado has put its EPA tagging efforts on hold.
2 Discuss use of SPAs at Emory and how they are integrated with and tagged to the curriculum
Hugh explained that Emory is using the concept of Student Physician Activities (SPAs) as a central organizing principle for its curriculum. SPAs are program outcomes that resemble EPAs without the entrustment component. SPAs underlie what and how Emory teaches and assesses throughout the curriculum. Emory also tags the curriculum with organ systems, NBME topics, LCME topics, and patient conditions.
Hugh noted that the SPAs are more comprehensive than the AAMC Core EPAs. Everything a student does is tagged with eight categories, including SPAs and the four others above, so everything is aligned. This shows whether they are teaching the right things, teaching them well, and whether students are learning them.
Valerie asked what data under the framework are being sent to the AAMC and what commercial tool is used. Hugh answered that only the instructional methods and assessment methods are sent; they are using OASIS and entering the other fields as keywords. They spent a year and a half putting data into OASIS and finished just two weeks ago. Valerie asked the group to consider whether that data should be modeled in the standard and, if so, how to model it.
Terri noted that she had discussed including Core EPAs in the Curriculum Inventory with EPA pilot schools. The schools will document them as either expectations or keywords, using a consistent vocabulary: COREEPA01, etc. Susan questioned including attributes for EPAs in the standard. Hugh noted that Emory maps SPAs to PCRS; the same could be done for Core EPAs. He added that they want to do CQI in education. Paul mentioned that they map keywords to event objectives. Valerie thought expectations could be used to represent EPAs, with a category value indicating that an expectation is an EPA. Terri questioned whether a relationship could be shown between program-level expectations and EPAs. Valerie suggested further exploring EPA-to-program-objective mapping. Hugh supported tying EPAs to assessment only, noting that they are not the same as competencies. Terri noted that schools also want to see where students are being prepared for EPAs and where content relates to EPAs. Susan agreed with Hugh's concerns, noting that the term carries baggage for schools not using entrustment-based models. Valerie suggested focusing on Core EPA schools that have an interest in reporting, determining how to report Core EPAs, and obtaining consensus on those definitions.
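As a rough illustration of the keyword approach Terri described, a school might tag a curriculum event with the agreed vocabulary term. The element and attribute names below are illustrative assumptions only, not verbatim from the Curriculum Inventory specification:

```xml
<!-- Hypothetical sketch: tagging an event with a Core EPA keyword
     using the consistent vocabulary (COREEPA01, etc.) discussed above.
     Element and attribute names are illustrative, not taken from the
     Curriculum Inventory specification. -->
<Event id="E123">
  <Title>Transition to Clerkship: Oral Case Presentations</Title>
  <Keyword source="AAMC-Core-EPA">COREEPA01</Keyword>
</Event>
```

Under Valerie's alternative, the same vocabulary term would instead appear on a program-level expectation, with a category value marking the expectation as an EPA.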
3 Discuss use of EPAs at UC Irvine and how they are integrated with and tagged to the curriculum
Julie provided background on UC Irvine's current EPA mapping using Ilios, undertaken in preparation for a site visit. They added EPAs as topics in Ilios, and their programmers developed a basic reporting tool. They have since moved EPAs from topics to vocabularies. They are reviewing program objectives and moving reporting into Tableau; Ilios data push into Canvas. They are now looking to use EPAs as their program-level objectives and to map them to PCRS.
Valerie suggested using the term professional activities and asked whether that would remove concerns or muddy the waters further. Julie thought the term was too vague.
Valerie commented that, from a CI perspective, the most valuable reports come from working with an agreed-upon set of things that people map to: PCRS, instructional methods, assessment methods, and resources. Walter expressed concern that another layer would confuse people on the cusp of participating; he encouraged a transparent process with flexibility and an understanding of how it all builds together. Terri suggested a single upload on August 1, 2018, using keywords or expectations. Julie offered to add EPAs to PCRS, but Terri thought that would create confusion. Hugh added that schools are supposed to be evaluating each academic level as a whole, and the CI has been extremely helpful for that.
The following topics were tabled until next time:
- Review progress on API documentation
- Updates around conversations on tracks (tentative)