
Meeting Information

Date: November 21, 2014

Time: 9 PST / 10 MST / 11 CST / 12 EST / 17 GMT / 18 CET

Attending: Jay Anton, Erick Emde, Jason Haag, Gregg Lipson, JB McGee, Ellen Meiselman, Valerie Smothers, Michael Steele, Craig Wiggins, Luke Woodham. 

Agenda Items

The new members of the group each gave a brief introduction.  Jason is from ADL and the DoD; he is an instructional designer with an emphasis on mobile learning.  Gregg is a Senior Business Systems Analyst on the AAMC's Pivio Project; his background is in systems analysis, solution design, and data-driven applications.  Jay is the founder of IVIR Inc. and a former CTO of METI.

1 Review minutes

Minutes were approved as submitted. 

2 Continue use case review

Valerie provided an update on the previous use case discussions.  The group had discussed and recommended changes to use cases 1-3 and had briefly discussed use cases 5-7.  The group has not yet discussed use cases 8 and 11-13.  David raised questions about what to track in use case 3; he also had comments on the University of Michigan use cases and on the use case in which a clinician needs to use a device.

Use case 4 is tracking discrete actions within a simulation or clinical experience.  Valerie raised the question of whether this was integrated into use case 1.  Ellen agreed it was. 

Use case 7 is tracking supervisor/faculty signoff on competency achievement or non-achievement.  The group concurred that this was integrated into use case 5. 

Use case 9 is documenting resource use (paradata).  Valerie was unsure where to go with it; it could be part of use case 1 or 2.  Ellen asked for a definition of paradata.  Valerie clarified that paradata is data about how people are using a resource.  Craig commented that it is the context in which you are using information.  Ellen thought it should be a separate use case, and Erick agreed.  Erick and Craig will work with Valerie on developing this use case.

Valerie asked if use case 10, adaptive learning, was part of use case 11.  Erick agreed; it is not just data about the learner, but also what training they have done.  Ellen commented that you can access learner data, including performance and demographics.  Valerie will make the change.

Use case 8 is to associate supporting materials, including video, with individual and group experiences to support formative assessment and activity evaluation.  JB asked if that was the same as use case 3.  Valerie reviewed the actors and trigger events.  Ellen commented that it will have context automatically associated with a checklist; she provided an example of a nurse preceptor using a checklist.  Learners must also complete a learning module and a quiz or test to be considered competent.  Valerie asked if the application at Michigan allows preceptors to rate and record comments.  Ellen commented that the entire rating and feedback would be associated with a competency.  Valerie clarified that the preceptor application does the associating; in the other example, there is no preexisting, machine-readable relationship between the resource and the activity, and the learner is making that association.  Gregg added the example of sharing and commenting in the Pivio electronic portfolio: the primary e-portfolio items the learner can track, such as experience, education, and publications, can be added via a note in the system and then attached to one or more rich content items, such as links, videos, documents, and images.  Valerie will work on use case 8 for the next call. 

Use case 11 is to access learner data, including performance and demographics, and use it to direct learning.  Ellen requested that this be a separate use case.  Valerie will come back next time with more detailed information.  

Use case 12 is analyzing learning and performance data across systems, sites, and organizations.  One assumption is that the xAPI message indicates the system, site, and organization for the experience.  Valerie asked whether things like system, site, and organization are inherent in the xAPI message or would need to be added.  Ellen hoped lots of data would be added to statements, but the reality is that it may not happen; they often have to go back and make correlations with data from other systems after the fact.  Valerie suggested that data from another system could be joined with data in an LRS.  Valerie will add that assumption. 

Ellen commented that you could have a health care workers' learning exchange that is semi-de-identified and aggregated across institutions for specific types of learning.  There you would have to add data for additional context as you contribute to that exchange.  Valerie added that if collecting internally you may not have that context; if contributing to a large initiative, the context could be added.  Valerie changed the assumptions to include: the xAPI message contains sufficient data to join with another data source that contains the system, site, or organization for the experience.
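For illustration, below is a minimal sketch of an xAPI statement that would satisfy this assumption, carrying site and organization as context extensions.  The extension IRIs and all names are hypothetical placeholders, not part of the xAPI specification or anything the group has agreed on.

```python
# A sketch of an xAPI statement (expressed as a Python dict) that carries
# system, site, and organization as context data so an analytics system can
# join it with records from another source. The extension IRIs below are
# hypothetical placeholders, not defined by the xAPI specification.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.edu/activities/central-line-simulation",
        "definition": {"name": {"en-US": "Central line simulation"}},
    },
    "context": {
        "platform": "Example Simulation Center LMS",  # the originating system
        "extensions": {
            # Hypothetical IRIs identifying where the experience took place.
            "http://example.org/xapi/extensions/site": "University Hospital, Site A",
            "http://example.org/xapi/extensions/organization": "Example School of Medicine",
        },
    },
}
```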

Michael asked if portfolio standards are intended to do that.  Valerie answered that there are several.  The professional profile describes the individual health care professional in detail in terms of their name, address, contact information, licensure data, disciplinary actions, memberships, education, and training.  There may be other ways to describe a learner.  You could have data about the learner's organization in that profile; however, it may not help you locate where a particular activity took place.  Ellen commented that the standards are pretty free-form regarding metadata.  Valerie noted that the LRS and the analytic system may or may not be one and the same.  Ellen offered the exception that someone uses a standard that the analytic system hasn't predicted.  Valerie added the exception that an xAPI message uses extensions that the analytic system cannot interpret. 

For use case 13, Valerie mentioned that Apple has a health platform on its latest iPhone that is capable of sending data to several different electronic health records, including Epic.  Ellen commented that they imagine using patient data to match up physicians and patients, allowing patients to ask questions and allowing physicians to make changes to patients' plans.  Erick summarized it as a recommendation engine that also has human interaction to make changes to the recommendations.  Ellen commented that a good use for this is with diabetes patients.  Valerie thought it would be interesting to know if they were using Epic for this.  Gregg was unsure whether this was HIPAA compliant and whether there was a masking element for identifiers.  Valerie added that we don't want to mask; we want the system to take action to track their activities.  Ellen commented that the LRS is storage and analytics decisions are made outside the LRS, so it doesn't matter so much as long as the system can associate statements with a certain person.  It needs to be able to be tracked back to the patient.  Valerie agreed. 

Jay will review the use cases before the next meeting.  All of these issues address problems they have had.   

3 Open discussion

 

Decisions

Action Items

Erick and Craig will work with Valerie on Use Case 9.

Valerie will work on use cases 8 and 11.
