
Meeting Information


December 9, 2014


9 MST/10 CST/11 EST/16 GMT/17 CET/18 EET

Attending: Deb Burgess, Andy Johnson, Rosalyn Scott, Valerie Smothers, Michael Steele, Craig Wiggins, Luke Woodham. 

Agenda Items

1. Review minutes of last meeting

The minutes were approved.

2. Discuss VA XAPI project

Valerie provided a brief overview of the objectives of the VA's XAPI project.

  • Create Competency Frameworks for the analysis and reporting of learner outcomes related to code simulations.
  • Create recommendations for a standardized approach to tracking learner outcomes in multi-faceted simulation activities such as code simulations.

The goal is to leverage XAPI to bring together simulation data from across VA systems for analysis. Rosalyn added that serious medical games will be used by the VA medical centers as well. One is intended as a multiplayer game for training staff. There are two tele-ICU command centers, one in Ohio and one in Minnesota; both have significantly expanded the number of hospitals they cover. Team competencies could be replicated in the virtual environment. Because of the conversations around XAPI, the project plan now includes using XAPI to track activities in the virtual environment. She would recommend adding the company developing the games, ECS, to this working group and approaching them about becoming members of MedBiquitous. We can certainly say we are working on many activities that align with what they are doing for the VA.

Deb added that ECS is also a prime contractor for the Air Force simulation program. They are a good inroad to DoD. The competency framework would align with much DoD work.

Rosalyn gave Dr. Dominguez a proposal and suggested specific subject matter experts to participate in the project; we are hoping those could be provided by the VA. He is in DC this week. Next week, we will recontact him and say we are interested in moving forward.

3. Review use case 8, use case 11, and use case 13

Valerie made several updates based on the feedback of the last meeting.

4. Begin discussing data requirements for accepted use cases (i.e., exactly what do you want to track?)

The group began discussing what it would want to track using XAPI.

Rosalyn suggested a description of the activity, including the targeted audience, context, and where it was offered; if it is a simulation, what type of simulation; prerequisites; the system used; competencies addressed; objectives; assessment methods; learner evaluation of the course; facility; and unit within the facility. [Comment – much of this is metadata about the activity. How much of this would go in XAPI? Do we want to recreate Healthcare LOM in XAPI?]

For learner information: profession, specialty, level of maturity in a particular field, performance data, competency data, how the learners are related to one another. [Would this be captured in XAPI, or in some referenced document?]

Interactions: an indication that a learner has demonstrated a particular competency at a particular level, time to initiate a specific task, time to complete a specific task (or timestamps for the start and completion of each task), and score on an assessment.
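As an illustration of how one such interaction might be expressed, a minimal sketch of an xAPI-style statement assembled in Python is shown below. The verb URI is from the ADL verb registry; the actor email, activity identifiers, and the helper function itself are hypothetical placeholders, not anything the group has agreed on.

```python
from datetime import datetime, timezone

def make_statement(actor_email, verb_id, verb_name,
                   activity_id, activity_name,
                   duration=None, scaled_score=None):
    """Assemble a minimal xAPI-style statement as a Python dict.
    Emails, URIs, and this helper are illustrative placeholders."""
    stmt = {
        "actor": {"objectType": "Agent", "mbox": "mailto:" + actor_email},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {"id": activity_id,
                   "definition": {"name": {"en-US": activity_name}}},
        # xAPI timestamps use ISO 8601
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    result = {}
    if duration is not None:
        result["duration"] = duration  # ISO 8601 duration, e.g. "PT2M30S"
    if scaled_score is not None:
        result["score"] = {"scaled": scaled_score}
    if result:
        stmt["result"] = result
    return stmt

# Hypothetical example: a learner initiates defibrillation
# 2.5 minutes into a mock code simulation
stmt = make_statement(
    "learner@example.org",
    "http://adlnet.gov/expapi/verbs/initialized", "initialized",
    "http://example.org/activities/mock-code/defibrillation",
    "Defibrillation",
    duration="PT2M30S",
)
```

One such statement per task start and completion would cover the timestamp requirements above; the score field would carry assessment results.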

Rosalyn provided the following example. A patient has arrhythmia that requires defibrillation. If a learner takes 20 minutes to recognize that, the patient won’t do well.

Luke asked whether you would break it down into a series of tasks and have timestamps on each one that would allow you to derive time to initiate, time to complete, etc. Rosalyn replied it may not be a single order of activity. Luke commented that it doesn't imply an order; timestamps would allow you to build a map.

Michael commented that you would be collecting an activity stream: what the learner did and when they did it, in a general way. Analysis gets complex. The golden path is different for each scenario. It may be that of the first 20 steps, a learner must do steps 1, 5, and 9 before moving to step 21.
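Michael's golden-path point can be sketched as a simple check over a timestamped activity stream: given a set of prerequisite steps and a gated step, verify that each prerequisite was completed first. The function name is an illustrative assumption, and the integer timestamps stand in for the ISO 8601 timestamps real statements would carry.

```python
def prerequisites_met(stream, required, gate):
    """Given a timestamped activity stream (a list of (step_id, timestamp)
    pairs), check that every required step was first completed before the
    first occurrence of the gated step."""
    first_seen = {}
    for step, ts in stream:
        first_seen.setdefault(step, ts)
    if gate not in first_seen:
        return False  # the gated step was never reached
    gate_time = first_seen[gate]
    return all(step in first_seen and first_seen[step] < gate_time
               for step in required)

# Learner did steps 1, 5, 9, and 2, then moved to step 21
stream = [(1, 10), (5, 12), (9, 15), (2, 16), (21, 20)]
```

Because the check works from timestamps rather than list position, it tolerates tasks performed in any order, which matches Rosalyn's point that there may not be a single order of activity.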

Deb commented that we need more objective analysis of learners. We should create definitions of what constitutes novice and mastery for a particular competency in such a way that they can be applied to learner behaviors.

Michael commented that gets to mapping to the competency framework. What is required for analytics? You can use standard verbs and sub-metadata components to say "I did this," where "this" is a somewhat complex embedded thing. Is there a sense we can come up with a generic way to assess activity streams? Could a scoring rubric be used across multiple organizations?

Rosalyn replied that the VA needs to drive data from educational activities into a common database used for enterprise-level analysis. They need to see where the gaps in skills among workers are. There are a lot of educational activities going on, but no common way to capture this data. They are trying to use mock code activities as a model for how enterprise-level data could be collected around competence. They want to conduct analysis based on profession, facility, and unit within a facility.

Andy recommended breaking down requirements in terms of the "I did this" model and looking at the requirements for each piece.

Rosalyn added that ACLS currently requires recertification every 2 years. AHA will introduce a new method for ongoing certification, where the learner must demonstrate ongoing competence in a simulated or real environment before the 2-year mark.

Michael suggested we use ACLS training as a use case to work through. What is the decision rubric for ACLS? There are many factors; is that generalizable?

Rosalyn recommended sending around slides tying ACLS performance to a performance framework.

5. Open discussion


Action Items
