
Meeting Information

Date: July 12, 2011

Time: 8 PDT / 10 CDT / 11 EDT / 16 BST

Please note: the conferencing service will ask you to enter the pound sign. Press # for pound.
To mute, press *6.

Agenda Items

1 Review minutes of last meeting

Linda began the meeting with a review of the minutes.  There were no additions or corrections and the minutes were approved as submitted.

2 Discuss proposed use cases

Alan noted that Rachel and Carol posted comments on the wiki regarding the use cases. Anita requested clarification of who might be considered an administrator; she commented that the Program Director is an administrator, but at the medical school it might be the Dean of Student Affairs. Alan offered that the formal definition of an administrator would be an individual who examines educational achievement data for patterns or trends. Carol questioned whether the term administrator meant a professional person or an administrative assistant and suggested including examples (such as Program Director or Associate Dean). Valerie acknowledged that examples could be included and Anita agreed they would be helpful. Alan noted the importance of distinguishing the administrator from the external reviewer, who looks at individual learner data: the administrator reviews aggregate data, so adding the word aggregate to the definition would be helpful. Linda asked who that person would be after residency. Alan commented it could be a dean or a committee that oversees continuing medical education. Valerie clarified that she was currently editing the document and changing the definition of administrator to an individual who examines aggregate data (adding examples such as Program Director, Dean, or Committee Director).

Howard mentioned the definition looked like a researcher role and might need further clarification to avoid confusion. Alan shared an important point for the group to note: these aren't people but roles, and one individual may act under different titles. Valerie mentioned that, from her perspective, the difference is scope; administrators look at graduates, while researchers have no such restriction. Patty commented that the purpose of each is different: one makes decisions about programs, whereas the other may be interested in publishing. Linda suggested defining the administrator as looking for patterns or trends in program evaluation, whereas the researcher's role is dissemination of information. Valerie made the changes and saved them to the wiki.

Carol discussed the types of external reviewers and asked whether the data flow changes depending on the external reviewer. She commented that may be more granular than the group needs to get at this time: what level of detail is needed to support use case #3? Valerie suggested one option would be to outline what the data flow might look like for those different entities using the data for specific purposes in support of the learner. She noted that the specific use cases around credentialing and certification will impact the kind of data being collected. Patty clarified that the use cases would be used to point to certain data elements to support data flow; a grid could be used to determine what data would come over.

Alan mentioned that Rachel is concerned about authorization and who is able to obtain the data being presented. Valerie commented that the specification doesn't deal with authorization, but the system implementing the specification would. The specification addresses data being transferred from a local system to a central system; the central system would have to address authorization. Linda referred to use cases 2, 5, and 6, where it says the external reviewer is able to see a complete and up-to-date copy of the learner's educational achievement data or an authorized subset of the data. She questioned whether it would always be complete, or whether it should say the reviewer sees the part the learner released, at the learner's discretion. Valerie agreed to add "or an authorized subset of the data" to use cases 2, 5, and 6.
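The "authorized subset" idea in use cases 2, 5, and 6 could be sketched as a simple filter, where the learner's release list determines which achievement records an external reviewer receives. A minimal illustration only; the record fields, identifiers, and function name here are hypothetical and not part of the specification:

```python
# Hypothetical sketch: a learner releases only an authorized subset of
# their educational achievement data to an external reviewer.

def authorized_subset(achievements, released_ids):
    """Return only the records the learner has authorized for release.

    Records not in `released_ids` are simply absent from the result;
    nothing in the output indicates that data was withheld.
    """
    return [a for a in achievements if a["id"] in released_ids]

achievements = [
    {"id": "res-001", "phase": "residency", "item": "Procedure log"},
    {"id": "med-001", "phase": "medical school", "item": "Clerkship grade"},
]

# The learner shares residency data but not medical school data.
shared = authorized_subset(achievements, released_ids={"res-001"})
print([a["id"] for a in shared])  # prints ['res-001']
```

This mirrors the point raised later in the discussion: the reviewer sees what was released and has no signal that anything was omitted.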

Howard commented that use case 6 raises permission issues: is authorization required, or should it say the learner authorizes release of educational data, and does that mean all data or possibly a subset? The use case states the external reviewer is allowed to see an up-to-date copy. Valerie asked if language needed to be added to make that consistent.

Alan commented that Rachel raised an important point about provision of data to external reviewers: is it important to inform reviewers that they are only getting part of the data? Patty noted that, from a practical standpoint, medical boards have gotten clever; in some states residents must sign a waiver granting license boards access to any and all evaluations of resident training. If the external reviewer is a program director, the resident may want to share residency information but not medical school data. That could have an impact on the technical specifications. Linda asked how to account for selection bias: the learner will send the information they want seen, and nothing will be sent without the learner's permission. If other information is requested, can the learner decline to provide it, and would they then not be considered? Alan commented that when you ask for data you get only the data requested, and no other information that exists in the system about the learner; that should be specified in use case #2. Boards and residency programs want everybody to send them what they require, no more and no less.

Howard shared that one of the key things about portfolios used in the undergraduate selection process is that the learner creates a subset of the portfolio and a decision is made based on that subset (case 7, not case 2). Hilary described a scenario where you have a clerkship grade but don't know where the student falls in the aggregate of their school, because there is no grade distribution; it's hard to make an assessment based on that grade in isolation. You expect those aggregate data to be available for review. Linda proposed: what if that student has written feedback that would raise a red flag, would it be helpful to have that option? Hilary mentioned the goal is to protect the learner and have them be in control; we shouldn't know there is more data available. Patty provided an example of recruiting someone who had no obvious red flags in their file, but it later came out that the person had a serious mental illness. The dean suggested asking for evidence of things that weren't obvious to ask for, but that has legal implications. As a program director, you want to have the information to know a person is able to proceed. If the data set included summary elements, that may be helpful.

Lori mentioned the objective of writing the technical standards is to allow us to provide educational achievement data.  What would this profile look like? Valerie shared that is exactly what the role of this group is: to determine what the data will look like on a more granular scale, with specific tests and evidence of achievement of specific competencies.   

John questioned what the validation process for learner data would look like before the information is released to program directors. Valerie asked how that validation process works now. He mentioned that learners develop cover letters to present their own spin on achievements; does that constitute an educational achievement, or is it something else, like metadata? Alan commented it would be metadata. Linda commented that there needs to be a way to designate the source. Patty thought that was a good idea. She shared that now, when you report, you have a certificate in your file, but there is no unique identifier for that credit; it is validated at the institutional level, but there is no system in place to do cross-checks. Valerie asked whether self-reported data is something the group wants to capture in this specification. Linda commented that we won't know until we get into the next steps. Howard suggested including a field for the degree of validation.

John would like to see a review of mentor activity by the administrator included, to find out whether mentors are completing their tasks. He asked if that was beyond the scope of this discussion. Valerie agreed it was beyond the scope here but commented it was an interesting point to consider; having access to a mentor's comments or suggestions would be of interest to some parties. Alan mentioned he wouldn't capture mentor data unless it was included as educational achievement data. Linda commented that if a mentor gathers information from a learner and wants to respond, it should happen offline; it should not go through the repository.

3 Questions for consideration: what level of detail will support the use cases?

4 Open discussion

Kirk shared that they have an electronic portfolio system up and running, and it will be interesting to see how it will interface with a national standard. He asked who else is currently using an electronic portfolio to capture this information. Linda replied that Elaine from Cleveland Clinic is not on the call today, but they are using portfolios for learning and assessment. Valerie asked the group to share any other feedback on the wiki or listserv. The agenda and supporting information will be sent out before the next call.

Summary of discussion points 

  • Each role has a unique purpose reflected in the definition of that role: an administrator conducts program evaluation, a researcher is interested in disseminating knowledge, and an external reviewer evaluates the individual learner.
  • An individual may act in different roles at different times.
  • External reviewers will only see the data that learners choose to share with them. If data is omitted, nothing appears to indicate that data was omitted.
  • The learner's summary or preface to a specific educational achievement may be included as metadata about that achievement.
  • The source of the data must be identified.
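The summary points suggest the shape an achievement record might take: an identified source, a degree-of-validation field (as Howard suggested), and learner-supplied commentary carried as metadata rather than as an achievement itself. A minimal sketch, with hypothetical field names and values not drawn from the specification:

```python
# Hypothetical sketch of an educational achievement record reflecting the
# summary points: the data source is identified, the degree of validation
# is recorded, and a learner-written summary is carried as metadata.
from dataclasses import dataclass, field

@dataclass
class Achievement:
    id: str
    description: str
    source: str                 # who attested to this achievement
    validation: str             # e.g. "institution-verified", "self-reported"
    metadata: dict = field(default_factory=dict)  # learner summary, preface, etc.

record = Achievement(
    id="cert-042",
    description="Advanced cardiac life support certificate",
    source="Training institution",
    validation="institution-verified",
    metadata={"learner_summary": "Completed during PGY-2."},
)
print(record.validation)  # prints institution-verified
```

Keeping the learner's own framing in `metadata` preserves the group's distinction between attested achievement data and self-reported commentary.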

Action Items

Valerie will update the use cases based on the group's discussion.
