
Meeting Information

Date: November 19, 2008

Time: 4 PM GMT/11 AM EST/8 AM PST

Attending: Rosalyn Scott, co-chair; Susan Albright, Mary Pat Aust, Peter Greene, Isarin Sathitruangsak, Valerie Smothers

Agenda Items

1. Review of minutes of last 2 meetings (27 October and 7 October)

Rosalyn asked the group if they had any questions regarding the minutes of the past two meetings. Valerie added that the October 7 meeting had focused on exploring relationships among competencies in a framework, while the October 27 meeting had focused on the evolving white paper. Tim is currently working on a revised draft of this document.

Susan commented that she is having her first meeting tomorrow with the schools that are implementing competency frameworks. Following that process, she should be able to contribute to the white paper. Rosalyn commented that we would not make any major moves towards publication without Susan's input.

Both minutes were approved.

2. Summary of Technical Steering Committee call and recommendations

Rosalyn asked Valerie to review the technical steering committee call that had taken place. Valerie commented that she and Tim had spoken with the technical steering committee regarding some issues related to the competency architecture, particularly extra competency relationships. The technical steering committee recommended a separate mapping document and schema that would describe the relationships between competencies and external objects such as learning objects, items in curricula, and performance reports. Having the relationships defined separately from the competencies themselves would facilitate changes in future versions (additional types of relationships) and provide flexibility.
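As an illustration of the committee's recommendation, the sketch below keeps competency definitions and the mapping that relates them to external objects in separate structures. This is a minimal sketch in Python; all class and field names are assumptions for discussion, not part of any MedBiquitous schema.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch: competencies and the relationships that tie them to
# external objects (learning objects, curriculum items, performance reports)
# live in separate structures, so future versions can add relationship types
# without changing the competency definitions themselves.

@dataclass
class Competency:
    id: str                      # e.g. a URI identifying the competency
    title: str
    description: str = ""

@dataclass
class ExternalObject:
    id: str
    kind: str                    # e.g. "learning_object", "curriculum_item", "performance_report"

@dataclass
class Relationship:
    competency_id: str
    object_id: str
    relation_type: str           # e.g. "teaches", "assesses", "reports_on"

@dataclass
class MappingDocument:
    relationships: List[Relationship] = field(default_factory=list)

    def related_to(self, competency_id: str) -> List[Relationship]:
        """All relationships touching a given competency."""
        return [r for r in self.relationships if r.competency_id == competency_id]
```

Because the relationships live only in the mapping document, additional relationship types could be introduced later without touching the competency definitions, which is the flexibility the committee described.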

Susan commented that she liked the concept of the mapping document. Rosalyn added that it would provide a great deal of flexibility.

Valerie added that Tim had also brought up the subject of rules related to relationships. For example, if X is a sub competency of Y, does that mean that X and all other sub competencies of Y must be achieved in order for an individual to be competent in Y?
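One way to make the question concrete: the hypothetical rule below treats a parent competency as achieved only when every one of its sub-competencies has been achieved. Whether such a rule belongs in the standard, or is left to each framework, is the open question; the competency names are invented for illustration.

```python
# Hypothetical "all sub-competencies required" rule: an individual is competent
# in Y only if every sub-competency of Y has been achieved. Other settings
# might require only a subset, which is why the rule question matters.

def parent_achieved(sub_competencies: set, achieved: set) -> bool:
    return sub_competencies <= achieved   # subset test: every sub-competency achieved

sub_competencies = {"history_taking", "physical_exam", "differential_diagnosis"}
print(parent_achieved(sub_competencies, {"history_taking", "physical_exam"}))  # False
```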

Susan provided a use case for clarification. At Tufts, there is one competency exam with many subgroups. Is the question how much you have to pass in order to have passed that competency? Also, the medical school has master courses and sub courses. They are considering whether you can fail one subpart and still pass the whole. Valerie commented that that was indeed the question.

Mary Pat commented that those types of rules would vary depending on the setting and the complexity of the work. Susan and Rosalyn agreed. Such rules could vary within an institution by competency or healthcare field. Mary Pat added that we need to create a framework that helps people find competencies, so that they can locate what they need and mold it to fit their needs. Susan agreed, adding that the competency framework should allow educators to get a better picture of what they are doing.

Peter asked whether, if extra competency relationships are outside the framework, that means they are outside of the standard. Will the group be tackling that issue or not?

Rosalyn replied that she and Tim have decided that it is important to define both extra competency relationships and competency frameworks in parallel. They will be separate files, but parallel efforts.

Peter added that the discussions seem to focus on what is a rule and what is an interesting characteristic of the relationship. If something is very complex, that may be impossible to capture within the relationship alone. Tim's draft set of relationships is very interesting.

Rosalyn commented that the standard should address a marker of success: what do you have to do to be competent. She questioned whether that would be a rule or how that would fit into the standard. Valerie commented that that was an important issue that the group would need to consider as we built the architecture.

Susan described what one potential competency measurement might look like: you have to perform seven procedures, you have to perform them to a certain level, and you must complete all parts of one competency exam successfully. That is a complex measurement. Rosalyn commented that that is the ultimate goal. Susan said that it would help organizations to have clarity around the measurements themselves. If you are measuring diagnosis skills, would that be consistent across courses, and would you be able to see where and how you are measuring this competence?

Peter questioned whether that encompasses rules. Susan replied that she thought it did. Peter commented that each competency may have a list of fulfillment criteria. Valerie questioned whether that would be part of the competency definition or part of the mapping. Peter commented that he would need to see more rules to see how they would overlay the framework. CanMEDS and the other frameworks we have focused on do not get to that level of detail. He asked Susan if the rules she described were encoded in any way. Susan replied that she is not sure how they will use rules yet, but that the important point is that you will need to be able to have rules.
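To show how Susan's example might be expressed as the list of fulfillment criteria Peter describes, here is a hedged sketch; the criterion names and thresholds are invented for illustration and are not drawn from any existing framework.

```python
# Hypothetical fulfillment criteria for one competency, after Susan's example:
# perform at least seven procedures, reach a given level on the rating scale,
# and pass every part of the competency exam. Names and numbers are illustrative.

criteria = [
    ("procedures_completed", 7),   # at least seven procedures
    ("rating", 4),                 # minimum level on the rating scale
    ("exam_parts_passed", 3),      # all parts of the competency exam
]

def competency_met(evidence: dict, criteria: list) -> bool:
    """True only when every fulfillment criterion is satisfied."""
    return all(evidence.get(name, 0) >= minimum for name, minimum in criteria)

evidence = {"procedures_completed": 8, "rating": 4, "exam_parts_passed": 3}
print(competency_met(evidence, criteria))   # True
```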

Mary Pat added that not all organizations use grades to indicate competence. There may be some validation that the individual can do the work required, has the knowledge, etc. Rosalyn commented that there needs to be a way to express success.

Peter asked if it made sense to start with modeling the competencies and their intra competency relationships. The group agreed.

3. Discussion of Tufts database diagram

Susan commented that they worked with the dental school to understand what they have done, model it, and try to map it into the specification. The dental school has domains, which are competency categories such as ethics, professionalism, information management, and diagnosis. Each domain has a summary statement, then several competency statements, then learning outcomes related to each competency statement. Valerie questioned whether the relationships between learning outcomes and competencies were hierarchical. Susan replied that they were, but the same learning outcome may appear in many competency exams. Educators don't realize that they are measuring the same thing in different ways in different places. The competency exam is a measurement for a collection of learning outcomes. There is a rating scale (pass/fail, grades, etc.), and sometimes that rating scale is different for the same learning outcome in different exams. Rosalyn asked how it was implemented by year. Susan replied that the students carry folders with their competency exams and keep track of how they did on each one. You need to pass the exams to graduate. While diagnosis is measured in many exams, the relationships are not clear.
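A rough sketch of the hierarchy Susan described (domain, summary statement, competency statements, learning outcomes, with exams measuring collections of outcomes on a rating scale) may help; the class and field names below are assumptions, not the Tufts schema.

```python
from dataclasses import dataclass, field
from typing import List

# Rough sketch of the dental school structure as Susan described it.

@dataclass
class LearningOutcome:
    id: str
    statement: str

@dataclass
class CompetencyStatement:
    statement: str
    outcomes: List[LearningOutcome] = field(default_factory=list)

@dataclass
class Domain:                        # e.g. ethics, professionalism, diagnosis
    name: str
    summary: str
    competencies: List[CompetencyStatement] = field(default_factory=list)

@dataclass
class CompetencyExam:
    name: str
    rating_scale: str                # pass/fail, grades, etc.; may differ per exam
    measures: List[str] = field(default_factory=list)  # learning outcome ids;
                                     # the same outcome may appear in many exams
```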

They have been struggling with the fact that there are many ways to measure outcomes: exams, OSCEs, etc. All have relationships, and that is complex. Right now those relationships are not clear. They will send the next iteration of the diagram.

4. Discussion of data model mapping to existing framework

Valerie commented that Tim had developed these diagrams to show how our existing ideas about the data model and relationships map to existing competency frameworks. Rosalyn commented that within the Scottish Doctor learning outcomes there seems to be a great deal of information in one box. Valerie commented that the assessment methods could be listed separately. Because this is a high-level competency, it lists a pool from which to choose for lower-level competencies.

Susan commented that the diagram seems to be mixing apples and oranges. The top level ought to be a competency statement, and everything should flow from that. She added that learning objectives likely don't apply here.

5. Discussion of elements of a competency object

6. Open discussion

The group will move forward on the data model for competencies for the next call. Susan agreed to help with Tim's work.

Decisions

Action Items

The group will move forward on the data model for competencies on the next call.
