
Meeting Information

Date: 18 June 2008
Time: 12 PM EDT/5 PM BST


Attending: Tim Willett, Co-chair; Mary Pat Aust, Peter Greene, Ronald Harden, Ted Hanss, Valerie Smothers

Agenda Items

1. Overview of in-person meeting (see presentation and minutes)

Tim asked Ronald if he recalled the Picker Institute paper that Lesley mentioned during the annual meeting. Ronald commented that the Picker paper he was thinking of did not compare frameworks. Rachel's Medical Teacher article on the Scottish Doctor and GMC learning outcomes did provide a systematic report of the comparisons. Ronald left Peter some copies of the third edition of Scottish Doctor, which is partly based on this comparison.

Tim recapped the discussion from the annual conference. Three concepts had emerged: the competency itself, relationships among competencies, and connections from competencies to other things (competencies in other frameworks, items in a curriculum, etc.). The group discussed the results of the survey and what we would want to capture in a competency definition, and suggested looking at some of the more popular competency frameworks; we'll monitor our development process against those frameworks. Tim suggested that if Mary Pat could identify a nursing framework, we could include that in the comparison. Mary Pat commented that ANCC has come out with a brand-new Competency Designation for CNE. Organizationally, state boards of nursing are looking at continued competence as a relicensure requirement; ANCC has started that dialog and is capturing data.

Ronald commented that there was a Picker report on nursing from North Carolina that included definitions of competency in nursing. Mary Pat added that competency is what they do; looking at the broader picture is important.

2. Plan for moving forward

Rosalyn, Tim, and Valerie spoke last week about moving forward. We are ready to start specification development. Today we want to discuss the competency itself: what data do you need to capture? Then we will talk about relationships and the information needed to link a competency to other resources. We will involve the Technical Steering Committee to provide assistance. Peter commented that on the Scottish Doctor site there is an outcome browser tool with levels, learning outcomes, identifiers, etc. Ronald commented that it may not have the most recent changes yet or interface with Tomorrow's Doctors.

3. Review of competency/IEEE comparison

Tim explained the background. Different outcome frameworks are at the top; rows show different types of data based on our survey. You can see whether a competency framework captures each kind of data. All frameworks capture the statement. Typing is important, too: is it a competency, outcome, learning objective, role, etc.? Many terms are used to describe items. The IEEE model does not have a notion of typing. A couple of outcome frameworks use descriptions; the IEEE model does have a description. CanMEDS does use multiple languages. None of the outcome frameworks has conditions of performance, but many survey respondents still wanted it. The same can be said for outcome criteria and performance criteria. Valerie explained that the IEEE statement element can be used for these; an attribute allows one to specify the type of statement. This gives a sense of what data is currently being published.
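To make the fields in the comparison grid concrete, the rows under discussion could be sketched as a simple record. This is an illustrative sketch only: the class and field names below are assumptions drawn from the discussion, not the actual IEEE data model or any published schema.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative sketch only: field names are assumptions drawn from the
# comparison grid, not the IEEE data model or any published schema.
@dataclass
class CompetencyDefinition:
    identifier: str                      # unique identifier for the competency
    statement: str                       # the competency statement itself (captured by all frameworks)
    statement_type: str = "competency"   # typing: "outcome", "learning objective", "role", etc.
    description: Optional[str] = None    # longer free-text description
    language: str = "en"                 # statement language (cf. CanMEDS multilingual support)
    conditions_of_performance: Optional[str] = None  # requested in the survey, absent from frameworks
    outcome_criteria: Optional[str] = None           # likewise requested but not yet published

c = CompetencyDefinition(
    identifier="example-1",
    statement="Communicate effectively with patients",
    statement_type="outcome",
)
print(c.statement_type)  # -> outcome
```

The optional fields reflect the grid's finding that some survey-requested data (conditions of performance, outcome criteria) is not currently published by any framework.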

Ronald commented that there are two gaps. One is the mapping: how do we communicate competencies to teachers and learners? There has not been much work in that area. The second gap is the relationships among competencies. Scottish Doctor does have hierarchical relationships; that needs to be embedded in the competency framework.

Tim agreed that we need to do the same type of review regarding the relationships among competencies; we are currently looking at the competency in isolation, even though competencies don't exist in isolation. With regard to representation, how do you display the data in a useful way? Is there a parallel process similar to the Virtual Patient working group's development of a player? That may not be captured in the spec. Valerie asked Ted if he had worked on or seen a competency viewer. Ted commented that UVA's tool was interesting; having a viewer would help them think through other parts of it. Peter added that there is a collection of efforts in other areas working on the display of complex interlinked resources, like browsers for SNOMED. There is also interesting work in social networks. In the online publishing world, browsing of related articles is similar; CNET had a news visualizer. Would mind maps help with display? They don't show different kinds of dependence. Ronald commented that one element that belongs in this stage is the notion of hierarchy.

The group agreed that relationships among competencies are fundamental. We can't come out with anything without having an idea of how they are represented. Even if we come out with something and find out that the IEEE is headed in a different direction, we need to represent hierarchies.
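The hierarchy and cross-framework mapping discussed above could be sketched as a simple tree of competency nodes. This is a hypothetical sketch: the class name, methods, and framework identifiers are assumptions for illustration, not part of any specification.

```python
# Hypothetical sketch of hierarchical relationships among competencies
# plus mappings to other frameworks; all names are illustrative assumptions.
class CompetencyNode:
    def __init__(self, identifier, statement):
        self.identifier = identifier
        self.statement = statement
        self.children = []   # hierarchical ("part of") relationships
        self.mappings = []   # links to competencies in other frameworks

    def add_child(self, node):
        self.children.append(node)
        return node

    def map_to(self, framework, identifier):
        """Record a cross-framework link, e.g. to a CanMEDS competency."""
        self.mappings.append((framework, identifier))

    def descendants(self):
        """Walk the hierarchy below this node, depth-first."""
        for child in self.children:
            yield child
            yield from child.descendants()

root = CompetencyNode("comm", "Communication")
sub = root.add_child(CompetencyNode("comm-1", "Communicate with patients"))
sub.map_to("CanMEDS", "communicator-1")
print([n.identifier for n in root.descendants()])  # -> ['comm-1']
```

A tree like this captures the Scottish Doctor-style hierarchy, while the mapping list addresses the separate gap of linking competencies across frameworks.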

Ronald commented that links between competencies and how we describe those would be very interesting. Concept mapping is an example.

Tim asked which fields are most important based on knowledge and experience, and which could or should be excluded. Ronald commented that assessment is a separate domain, but it wouldn't be inappropriate to have a recommended assessment method. Peter commented that we want to make sure we aren't including something related to relationships or connections. Tim clarified that the link between a specific assessment event and a competency is different from the link between an assessment method and a competency. Ronald added that the assessment method could help you better understand what the competence is. You could also include how you teach it, but that doesn't always enhance the description. Peter added that it would be fair to include descriptive information, but not the entire assessment activity. Mary Pat agreed: you don't need to see details, just recommended methods. That's pretty common to what they do in nursing.

Peter added that notions of conditions of performance showed up as a statement but didn't map to anything; that didn't seem to resonate with our needs. Mary Pat commented that she was surprised that outcome criteria weren't present in any of the frameworks. How do you know what you are measuring against? Peter commented that pay-for-performance measures would have outcome criteria; we are looking at earlier frameworks. Valerie recommended getting David Price's input as to what would be necessary from a certification board perspective.

Mary Pat recommended having common definitions for the terms we are using, such as outcome criteria. Tim commented that within nursing, competencies do state criteria as to what needs to happen in order to determine competence. Mary Pat added that it may be multiple things: multiple-choice questions, demonstration, successful completion three times, etc., or it may be as simple as self-assessment. Ron asked whether these are carryovers from objectives; it's older terminology. In that sense, should it be under supervision, on your own, in a rural area, etc.? Is it a fully equipped hospital setting, a simulation lab, etc.? There's also a level of mastery implied, for example, with or without supervision. Tim asked if outcome frameworks specify the expected level of mastery for different points on the continuum. Ronald commented that the nine abilities from Brown have three levels of competence, and gynecologists in the UK have competencies with different levels of mastery. Tim asked if level of mastery should be included. Valerie commented that we should take a look at something implementing levels of mastery. Ronald added that it is an important concept related to assessment levels; many criteria would vary depending on the level of mastery.
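Ronald's point that criteria vary by level of mastery could be sketched as criteria keyed to a level. The level names and criteria strings below are illustrative assumptions (loosely echoing the supervision distinction raised in the discussion), not drawn from any framework.

```python
from enum import Enum

# Illustrative only: level names are assumptions, not taken from Brown's
# nine abilities or any published framework.
class MasteryLevel(Enum):
    SUPERVISED = 1    # performs under supervision
    INDEPENDENT = 2   # performs without supervision

# Criteria vary by level, as noted in the discussion; these example
# criteria strings are hypothetical.
criteria = {
    MasteryLevel.SUPERVISED: "successful completion three times with supervisor sign-off",
    MasteryLevel.INDEPENDENT: "demonstration plus self-assessment",
}
print(criteria[MasteryLevel.SUPERVISED])
```

Keying criteria to a level like this would let one competency statement carry different expectations for different points on the continuum.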

We'll add to the grid and, for the next call, talk to the Technical Steering Committee about our technical approach and decide on some fields based on the analyzed frameworks and use cases. Ronald asked if we could have a discussion on the level of granularity of competencies on the next call.

4. Discuss how comparison relates to use cases

5. Discuss which components should be included in a specification

6. Open discussion

Decisions

Action Items

  • Valerie will contact David Price to discuss requirements for competency frameworks used for certification
  • Tim will include other competency frameworks in the comparison document.
  • Valerie will begin drafting an initial specification document