
WebCEX was developed by Christof Daetwyler, MD, and Gregory McGee at the Drexel University College of Medicine, jointly with Anthony Donato, MD, from Reading Hospital in Reading, PA.

What is WebCEX?

WebCEX is an online tool designed to allow the structured assessment of clinical competencies through direct observation. Its interface adapts automatically to the device being used - it works equally well on smartphones, tablets, and laptops - with particular emphasis on ease of use on smartphones, since these are available to the physician at any time. As a special feature, the tool not only records the scoring but also provides the assessor (and later the assessee) with tailored prompts for the feedback session, and even with learning assignments to address the deficiencies that were identified.

Learning Cycle




WebCEX integrates into a four-step learning cycle:

step 1: learning of facts
step 2: practice
step 3: assess
step 4: assign



During residency, senior faculty score residents' clinical competencies using the current American Board of Internal Medicine (ABIM) paper-based mini-clinical evaluation exercise (ABIM Mini-CEX) format. This is followed by a mentoring session in which the senior faculty uses the observations noted in the ABIM Mini-CEX to provide the assessee with feedback and an action plan for improving performance. Because of its comprehensive scale, the ABIM Mini-CEX must be completed at least 10 times to provide an accurate assessment. Even so, research studies have shown that the ratings obtained with the ABIM Mini-CEX are not very accurate.

To address the shortcomings of the ABIM Mini-CEX, Anthony Donato developed the paper-and-pencil “Minicard” format, which is small and easy to use and features structured prompts and behavioral anchors to improve the quality of the follow-up mentoring session. Anthony Donato and his colleagues validated their tool by showing that the “Minicard” yielded more observations, better inter-rater agreement, and more accurate performance ratings.

However, the currently used paper-and-pencil tools have limitations: the paper forms are not always at hand when an observation could be conducted, resulting in missed opportunities. And due to time constraints, not all of the possible prompts and anchored behaviors are covered during the feedback mentoring session, resulting in sub-optimal creation of learning opportunities.

In the fall of 2009, together with Anthony Donato, we began to develop “WebCEX”, a novel online tool that counters the limitations of the paper-and-pencil Mini-CEXs and is optimized for use on smartphones (standard equipment for today’s physicians). In addition to providing an interface for easily entering the data of a structured assessment, the tool automatically generates prompts for the follow-up mentoring session, and it creates a personalized web page for the assessee with precise learning assignments to address the deficiencies that were identified. The current version is editable and allows the creation of multiple different scoring lists, so it can be set up to assess (and document) not only comprehensive competencies as in the ABIM Mini-CEX, but also specific competencies that medical students must acquire during the clinical years.
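The mapping just described - from an editable scoring list to tailored feedback prompts and learning assignments - can be sketched in a few lines. This is a minimal illustration under assumed names (SCORING_LIST, score_observation, the example competencies and texts are all hypothetical), not WebCEX's actual data model or API:

```python
# Hypothetical sketch of a scoring list that pairs each competency with a
# feedback prompt and a learning assignment, as WebCEX is described to do.
# All names and texts here are illustrative, not WebCEX's actual content.
SCORING_LIST = {
    "history_taking": {
        "prompt": "Discuss how open-ended questions were used.",
        "assignment": "Review the module on patient-centered interviewing.",
    },
    "physical_exam": {
        "prompt": "Discuss the sequence and completeness of the exam.",
        "assignment": "Practice the focused cardiac exam checklist.",
    },
}

def score_observation(scores):
    """Given {competency: 'well'|'partial'|'poor'}, collect the tailored
    prompts and learning assignments for everything not performed well."""
    prompts, assignments = [], []
    for competency, rating in scores.items():
        if rating != "well":
            item = SCORING_LIST[competency]
            prompts.append(item["prompt"])
            assignments.append(item["assignment"])
    return prompts, assignments

# One observation: history taking went well, the physical exam only partially,
# so only the physical-exam prompt and assignment are surfaced.
prompts, assignments = score_observation(
    {"history_taking": "well", "physical_exam": "partial"}
)
```

Because the scoring list is plain data, an editable version (as the current WebCEX is described to be) would simply let faculty add or swap entries without touching the scoring logic.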

How does WebCEX work?

We prepared two videos that show WebCEX in action:

Next Steps

We are considering developing a graphical interface that would allow a more intuitive entry of the observation data.

Illustrations 2a and 2b: a mock-up of a visual interface at 480x320 pixels. Scrollbars and zoom and rotation controls give access to any point on the manikin’s body (2a). Clicking an area reveals a pop-up list of the exams that can be performed at that location (2b).

Illustrations 3a and 3b: clicking the exam that is being observed (3a) brings up a list of the micro skills that make up that exam skill. The micro-skills list is color coded: clicking or tapping the green part of a line colors the whole line green and tells the system that the skill was well performed (3b). Clicking the yellow part tells the system that the skill was partially performed; clicking the red part, that the skill was badly performed. The micro-skills assessment list in illustration 3b shows a situation where most extra-ocular movements were assessed correctly, with the exceptions that the corneal reflexes were not checked and the movement from down-right to extreme left was not performed entirely correctly.
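The three-state color scoring described above can be sketched as a small state machine: each micro skill is unscored until the observer taps the green, yellow, or red part of its line. This is a hedged illustration only - the class, method names, and example skills are assumptions, not WebCEX source code:

```python
# Illustrative sketch of the green/yellow/red micro-skill scoring.
# All names here are hypothetical, chosen for the example.
COLOR_MEANING = {"green": "well performed",
                 "yellow": "partially performed",
                 "red": "badly performed"}

class MicroSkillList:
    def __init__(self, skills):
        # Every micro skill starts unscored (no color selected yet).
        self.ratings = {skill: None for skill in skills}

    def tap(self, skill, color):
        """Simulate tapping the green/yellow/red part of a skill's line."""
        if color not in COLOR_MEANING:
            raise ValueError(f"unknown color: {color}")
        self.ratings[skill] = color

    def deficiencies(self):
        """Skills tapped yellow or red - candidates for feedback prompts."""
        return [s for s, c in self.ratings.items() if c in ("yellow", "red")]

# Mirroring the extra-ocular-movement example from illustration 3b:
eom = MicroSkillList(["check corneal reflexes",
                      "move gaze down-right to extreme left",
                      "track target in an H pattern"])
eom.tap("track target in an H pattern", "green")
eom.tap("check corneal reflexes", "red")
eom.tap("move gaze down-right to extreme left", "yellow")
```

Keeping each line's rating as a single color value is what makes the tap interaction fast on a phone: one touch both records the score and recolors the line.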

This site was last updated on 06/28/2011 by Christof.Daetwyler@DrexelMed.edu