Gynecologic surgeons need validated tools to assess performance and guide their efforts to maintain and improve their skills. C-SATS provides a crowdsourcing solution.
Dr Robinson is Associate Professor of Obstetrics and Gynecology at George Washington University, Washington, DC and AAGL MIGS Fellowship Director. He has received consulting fees from Bayer.
Dr Vargas is an AAGL/ASRM Fellow in Minimally Invasive Gynecologic Surgery at George Washington University, Washington, DC. She has no conflicts of interest to report with respect to the content of this article.
The rapid and widespread adoption of laparoscopy and robot-assisted laparoscopy for gynecologic procedures has challenged gynecologic surgeons to acquire and maintain specialized technical skills. Depth perception, psychomotor ability, and bimanual dexterity must be continuously refined to optimally perform laparoscopic surgery.
Furthermore, despite the enhancements of robotic surgery that help overcome the technical challenges of laparoscopy, a significant skill set is still necessary to attain proficiency. Gynecologic surgeons need validated tools to assess performance and guide their efforts to maintain and improve their skills.
Surgical residency training programs are shifting from subjective assessments with poor validity and reliability1 to objective and systematic approaches, such as the Objective Structured Assessment of Technical Skills (OSATS). OSATS has been validated for use in many surgical subspecialties,2-5 and evaluates skills such as respect for tissue, time and motion, instrument handling, and flow of operation (Table 1).
More recently, GOALS (Global Operative Assessment of Laparoscopic Skills) and GEARS (Global Evaluative Assessment of Robotic Skills) were validated6,7 and residency programs are using them for assessment of laparoscopic and robotic skills. GOALS and GEARS evaluate skills specifically needed to perform laparoscopy, such as depth perception, bimanual dexterity, efficiency, and tissue handling (Tables 2 and 3, next page). OSATS, GOALS, and GEARS are traditionally implemented in simulated or actual operating room environments and are generally used by educators to assess trainees. The translation of similar assessment tools for practicing surgeons has the potential to help ensure continual professional development and surgical proficiency, but implementation presents a number of challenges.
In the professional environment, surgeons may review videos of themselves performing surgery to assess their skills. Self-evaluation of video performance is vitally important for developing insight into areas in need of improvement, but is not objective.
Enlisting colleagues to review performance may increase objectivity, but it is tedious and time-consuming for the reviewer, and could lead to tension in the professional relationship if the feedback is not received well. Another option is to hire an outside professional, but this could be costly and requires confirming evaluator qualification and expertise.
Recently, an innovative approach to assessing the skills of practicing surgeons was introduced by C-SATS, Inc, a company founded by a team of surgeons, engineers, and biostatisticians in Seattle, Washington.
The C-SATS approach is novel in that it uses crowdsourcing, the practice of outsourcing tasks to an anonymous group of people typically identified through an online community.8 Common uses for crowdsourcing include software testing, dictionary compilation, and obtaining input for product development. More recently, crowdsourcing has been used for civic engagement in public policy initiatives, such as Iceland’s constitutional reform in 2011. The best-known crowdsourcing project is Wikipedia.
Using a combination of expert and surgically naïve crowd-worker assessments of surgical videos, C-SATS provides anonymous and almost immediate feedback about technical performance. The feedback is provided using validated objective scoring systems, such as GOALS and GEARS. Surgeons can upload videos to the C-SATS website, which is compliant with the Health Insurance Portability and Accountability Act (HIPAA), 24 hours a day, 7 days a week. Submitted videos must be unedited; C-SATS will guide surgeons in selecting a 10-minute segment for review.
Distinct panels of reviewers evaluate segments of the video. Statistical modeling is used to tabulate the overall performance score. Scores are also given for individual skills (ie, bimanual dexterity, depth perception, efficiency, and tissue handling). Reviewers provide comments to enhance the utility of the scores. In addition, the “digital performance dashboard,” used to provide the reviewers’ feedback, links to the associated video segments (Video 1 and Video 2).
The feedback is rapidly produced and easy to understand. Anonymity helps reduce the stress of the evaluation, allowing surgeons to focus on maintaining and improving their skills. To our knowledge, C-SATS is the only company that provides such a service. The company has taken an inventive approach to filling this void and is providing the service at the relatively affordable price of $100 per video.
The C-SATS system has been validated in a number of studies, in both simulation and wet-lab environments. Chen et al. showed that assessments of a single surgeon by 500 hired members of a surgically naïve crowd were equivalent to assessments provided by a panel of 10 surgical experts.9
Holst et al. subsequently showed that crowd-workers rated the performances of 2 residents and 2 attendings performing suturing tasks in the same order of skill level and with excellent inter-rater reliability when compared with expert faculty.10 Notably, in this study, assessments were available on average within 3 hours of video submission when completed by crowd-workers versus 26 days from video submission when completed by the group of expert surgeons.10
Through these studies, C-SATS also developed a validated process for vetting crowd-workers to enhance the reliability of the feedback. This process involves language assessments, screening video reviews, and intermittent questions to ensure the reviewer is actively paying attention during the video review.9-11
Thus far, the preponderance of evidence supports C-SATS’ methods for skills assessment in lab-based environments. Results are also promising for the operating room. A study published in The New England Journal of Medicine in 2013 found that expert video assessments of skills during the performance of bariatric surgery directly correlated with surgical outcomes,11 suggesting that video review of operating room performance using structured assessment tools is a valid approach.
More recently, data presented at the American Urological Association’s 2015 annual meeting using C-SATS assessments of radical prostatectomy showed that crowd-sourced evaluations agreed reliably with those of expert reviewers. Crowd-workers were also able to consistently detect surgeons in the bottom quartile of performance. These results are preliminary and research is ongoing.
References for validation studies and information about ongoing research projects are available at the C-SATS website.
In addition to the standard video assessment services for individual physicians, C-SATS develops services for training institutions and healthcare organizations in the United States and Canada. For example, C-SATS assessments have been used in residency programs to enhance the objectivity of the surgical skills evaluation process while reducing the burden on faculty surgeons. In addition, some training program directors are working with C-SATS to develop systems for linking trainee C-SATS scores to the Accreditation Council for Graduate Medical Education (ACGME) milestones.
C-SATS has also established services for streamlining credentialing and surgical privileging, as well as for evaluating potential hires. Healthcare organizations are utilizing C-SATS to develop performance measures to better allocate resources to quality improvement efforts. More recently, C-SATS has developed a partnership with an organization that will link scores to longitudinal outcomes, with the goal of guiding coverage and referral patterns to favor surgeons with the best surgical outcomes.
C-SATS also allows practicing physicians to obtain Continuing Medical Education (CME) credit, up to 20 credits, through 2 methods. The first is review of other surgeons’ video-recorded performances: surgeons select from a library of video-recorded surgeries and anonymously complete an evaluation of technical skills, with the opportunity to compare their own performance to others in the library.
The second method is a 3-stage process being implemented through the American Medical Association. The first stage simply involves the submission of 3 video performances for review; the surgeon receives standard feedback from C-SATS using the appropriate assessment tools. In the second stage, surgeons are challenged to identify the areas of their performance most in need of improvement through comparison with a video of an optimal performance. The surgeon then has the opportunity to implement new techniques and practices, ultimately submits a second batch of videos for review, and compares his or her performance between stages 1 and 2.
In stage 3, the surgeon submits a third batch of videos for review, compares results among all 3 stages, and reflects on the 3-stage process by completing a questionnaire. Ultimately, surgeons can obtain 5 CME credits for completing stage 1 or 2, and 20 CME credits for completing all 3 stages of the process. Specific information about the steps necessary to obtain CME credits is available at the C-SATS website.
In his piece “Personal Best,” Atul Gawande introduces the concept of coaching in medicine.12 He points out that nearly all elite athletes have coaches to help them play at their best. He translates this concept from the sports world to the operating room: no matter how well trained we are as physicians, we need guidance to maintain our best personal performance in the operating room. In an ever-evolving field, we are challenged to continuously learn and develop our capabilities.
In addition, as the medical culture progressively focuses on healthcare quality and performance measures, we are increasingly charged to be self-aware about our performance. C-SATS essentially provides an anonymous surgical coaching service for practicing physicians. It is the first company with such a concept, and it fills a void in assessment tools for physicians after initial certification. Its utility will likely continue to expand in areas such as hospital credentialing and recertification.
In this performance-focused era, we will likely see companies like C-SATS become increasingly recognized as important adjuncts for large-scale and individual development efforts to optimize outcomes by keeping surgeons operating at their best.
1. Mandel LP, Lentz GM, Goff BA. Teaching and evaluating surgical skills. Obstet Gynecol. 2000;95:783-785.
2. Goff B, Mandel L, Lentz G, et al. Assessment of resident surgical skills: is testing feasible? Am J Obstet Gynecol. 2005;192:1331-1338; discussion 1338-1340.
3. Lentz GM, Mandel LS, Goff BA. A six-year study of surgical teaching and skills evaluation for obstetric/gynecologic residents in porcine and inanimate surgical models. Am J Obstet Gynecol. 2005;193:2056-2061.
4. Swift SE, Carter JF. Institution and validation of an observed structured assessment of technical skills (OSATS) for obstetrics and gynecology residents and faculty. Am J Obstet Gynecol. 2006;195:617-621; discussion 621-623.
5. Martin JA, Regehr G, Reznick R, et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84:273-278.
6. Vassiliou MC, Feldman LS, Andrew CG, et al. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg. 2005;190:107-113.
7. Goh AC, Goldfarb DW, Sander JC, Miles BJ, Dunkin BJ. Global evaluative assessment of robotic skills: validation of a clinical assessment tool to measure robotic surgical skills. J Urol. 2012;187:247-252.
8. Holst D, Kowalewski TM, White LW, et al. Crowd-Sourced Assessment of Technical Skills (C-SATS): Differentiating Animate Surgical Skill Through the Wisdom of Crowds. J Endourol. 2015.
9. Chen C, White L, Kowalewski T, et al. Crowd-Sourced Assessment of Technical Skills: a novel method to evaluate surgical performance. J Surg Res. 2014;187:65-71.
10. Holst D, Kowalewski TM, White LW, et al. Crowd-Sourced Assessment of Technical Skills: An Adjunct to Urology Resident Surgical Simulation Training. J Endourol. 2014.
11. Birkmeyer JD, Finks JF, O'Reilly A, et al. Surgical skill and complication rates after bariatric surgery. N Engl J Med. 2013;369:1434-1442.
12. Gawande A. Personal best: top athletes and singers have coaches. Should you? The New Yorker. 2011.