From AMSA (upon reviewing flash overview):
Here is feedback on the MedBiquitous work from some fellow students and me.
Slides 1, 2, 3 & 6 (Competency Hexagons + Sub-Competency displays (polygons/matrices)): We like the clear visual representation. From a learner perspective, it is great to have links to descriptions of the levels of achievement; as mentioned later, videos could be even better. The linkage of each point of the competency hexagons to either another intra-competency polygon (as shown for Interpersonal Skills) or a matrix (as shown for Medical Knowledge) is also great. As a learner, I would want to easily view the dates/occurrences of each of these assessments. I saw the date stamp at the bottom of each display; if the data really is that simple (all data points arriving with the most recent evaluation), the stamp would suffice, but when that isn't the case, a more detailed report of which data points came from when and where would also help. I also think it would be beneficial (if it fits with the competency data input, and/or may be worth considering as an input variable) to display the setting in which each competency was assessed (e.g., VA clinic, university hospital inpatient) and possibly the area of medicine (e.g., cardiology consults, general medicine clinic, surgical oncology service). Based on the timing of the assessments, the learner should be able to recall this kind of information in the short term (1-2 years), but in the long term it would be nice to have that kind of record as part of the system.
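One way to picture the suggestion above is a per-data-point record that carries its own date, setting, and service metadata. This is only an illustrative sketch; the field names are hypothetical and not part of any MedBiquitous schema.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch: one assessed data point behind a hexagon vertex,
# carrying the date/setting/service metadata suggested above.
# All field names are illustrative, not from any real specification.
@dataclass
class CompetencyDataPoint:
    competency: str       # e.g. "Interpersonal Skills"
    sub_competency: str   # a vertex on the intra-competency polygon
    level: int            # achievement level, linked to its description
    assessed_on: date     # when the evaluation occurred
    setting: str          # e.g. "VA clinic", "university hospital inpatient"
    service: str          # e.g. "cardiology consults", "surgical oncology service"

# Example: a display could group or filter points by setting and service.
point = CompetencyDataPoint(
    competency="Medical Knowledge",
    sub_competency="Diagnostic reasoning",
    level=3,
    assessed_on=date(2011, 4, 12),
    setting="university hospital inpatient",
    service="general medicine clinic",
)
```

With records like this, the date stamp on each display could expand into the more detailed when/where report described above.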
Slides 4 & 5: The single-competency display, with its link to the descriptions of the levels of competency, is again great, as are the bar and pie charts that show peer comparison and cohort data. I think having the descriptions of the level of achievement (in briefer verbiage if necessary, e.g., Proficient, Advanced) below the numbers on the bar chart would be helpful. And yes, video links from each of these to examples would be great too.
We felt that the challenge with intra-competency charting will be the variability in component descriptions between schools/residency programs, and allowing schools (if necessary) to input their own descriptions or requiring them to select pre-written descriptions that fit how they define a competency. It will be most helpful for students to see the full detail of a school's competency evaluation, not just a general competency score with no sub-competency/skill scores. We've seen the latter happen with other systems where two entities try to combine their work.
Slides 7-9: We think this 'skills sign-off' segment is great too. I would again encourage some kind of functionality that links each data point over time to the date/time/setting of that evaluation, not just the point at which a learner was judged proficient.
With specific skill proficiency in mind, I think it would be great if this could be applied to procedural skills (central lines, etc.). I know that many residency programs have their own process for tracking the number of procedures performed and when competency is reached. It could be beneficial for a graduating resident to have an online/digital system to access the history of procedures performed in residency and to record future procedures if they wanted. Since most entries in these systems are made by the resident, this could be a more user-interactive part of the technology standards.
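The procedure log described above could be sketched as a simple tally of resident-entered records against count-based thresholds. This is a minimal illustration only; the required counts here are made up, and real programs define their own criteria.

```python
from collections import Counter

# Hypothetical sketch of a resident-entered procedure log.
# Each entry is (procedure type, date string); thresholds are illustrative.
log = [
    ("central line", "2011-03-02"),
    ("central line", "2011-04-15"),
    ("lumbar puncture", "2011-04-20"),
]
REQUIRED = {"central line": 5, "lumbar puncture": 3}

# Tally procedures performed so far.
counts = Counter(proc for proc, _ in log)

# How many more of each procedure are needed before sign-off.
remaining = {proc: max(0, need - counts[proc]) for proc, need in REQUIRED.items()}
```

Keeping the date on each entry is what allows the sign-off display to show not just when proficiency was reached, but every evaluation along the way.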
Slides 10-12: We all thought this was important data to include and a nice way to display it.
A few other points about possible additional features to include:
This is less about the type of data included and more about functionality: although this technology standards program is being designed to allow a learner/MD to regularly view their assessments and then share them digitally with others when applying for jobs, etc., I think it would be helpful to have some kind of report-generating interface that lets a learner select which parts of their achievement record (e.g., competencies, standardized exams, clerkship evals) to include in a hard-copy report (PDF) that they could then print or email to a recruiter, fellowship director, etc.
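The selection step of such a report generator could look something like the sketch below. The section names are hypothetical examples drawn from the feedback above, and PDF rendering itself is out of scope here.

```python
# Hypothetical sketch of the report-selection step: the learner picks
# which sections of their achievement record go into a shareable report.
# Section names are illustrative only.
record = {
    "competencies": {...},
    "standardized_exams": {...},
    "clerkship_evals": {...},
    "procedure_log": {...},
}

def build_report(record, selected):
    """Return only the sections the learner chose to share, in their chosen order."""
    return {name: record[name] for name in selected if name in record}

# Example: a learner shares only competencies and exam scores with a recruiter.
report = build_report(record, ["competencies", "standardized_exams"])
```

The filtered result would then feed whatever rendering step produces the printable/emailable PDF.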
Finally, I think it would be very neat to have a feature that allows a learner to enter their own commentary on their performance, for example as a reflection on particular narrative commentary (clerkships) and/or to set goals for the future. The record could become even more complete by allowing students to put certain poor performances into context or to highlight good performances as career-defining. This commentary would bring life to the assessment/standards record and give residency/fellowship directors, and possibly even patients, the ability to see the MD behind the numbers. A feature like this could also encourage more adult learning, as reflection, interpretation, and goal-setting lead to greater emphasis on purposeful, lifelong learning. Whether someone uses this feature on their own is, unfortunately, another question; maybe with time it is something that medical schools, residency programs, and patients will expect or require.
I hope this helps your MedBiquitous developments. Please let me know if you have any questions about this feedback or if you'd like input on any other developments. Thank you for involving the student voice in this development.