Volume 4, Issue 1 (Continuously Updated 2021) | Func Disabil J 2021; 4(1): Article 36
Khazaei M, Abolghasemi J, Amiri Shavaki Y, Sedigh Maroufi S, Azari S, Jenabi M. Traditional treatment approach for disordered articulation of sounds: Evaluating clinical skills of Speech Language Pathology students by Direct Observation of Procedural Skills. Func Disabil J 2021; 4(1):36.
URL: http://fdj.iums.ac.ir/article-1-158-en.html
1- Department of Speech Therapy, School of Rehabilitation, Iran University of Medical Sciences, Tehran, Iran.
2- Department of Medical Education, School of Public Health, Iran University of Medical Sciences, Tehran, Iran.
3- Department of Medical Education, Iran University of Medical Sciences, Tehran, Iran.
Introduction
Medical education and the various health-related disciplines, including Speech Language Pathology (SLP), are part of the higher education system and deal directly with human life. One of the essential tasks of the country’s medical universities is to train human resources who can meet the community’s health needs with high quality [1]. Therefore, attention to the quality and quantity of education, and to improving both, will improve the quality of services in the country’s health sector [2].
In evaluating the quality of medical education, the quality of the input (students and faculty members), the training process (curricula, facilities, and evaluation methods), and the output (graduates) should all be considered [3]. Evaluation is an integral and essential part of the duties of any organization, especially one such as the Ministry of Health and Medical Education, which coordinates, plans, and implements medical and paramedical education and a wide range of health services [4]. Evaluation can move education from a static state onto a dynamic path and makes it possible to identify the strengths and weaknesses of the educational process from its results. Then, by strengthening the positive aspects and eliminating the shortcomings, planners can take appropriate steps toward educational transformation and reform. Evaluation also motivates students and helps teachers revise their activities. If this evaluation is accompanied by appropriate feedback, it can improve the learning of skills [5, 6, 7]. In clinical evaluation, the students’ way of dealing with patients and their mastery of the desired skills should be assessed. Learning these skills is essential for saving patients’ lives and promoting community health [8].
Despite these critical issues, few steps have been taken to modify traditional assessment methods. Research has shown that the evaluation methods used in most clinical courses are neither appropriate for the educational goals nor effective in measuring students’ clinical skills and performance. These methods cannot fully assess students; they only evaluate the result of students’ short-term study before the exam [9, 10]. Therefore, such exams only strengthen short-term knowledge. In this superficial approach to learning, students have no opportunity to diagnose their mistakes or try to correct them. However, one of the basic needs in clinical education is gaining the ability to deal with patients and manage disease [9, 11, 12, 13]. The implementation of such assessment methods also leads to student dissatisfaction, because they do not usually evaluate all of the students’ skills [14].
For years, specialists have been looking for ways to assess students’ clinical performance effectively [11]. Several methods have been designed for the clinical evaluation of students, including the portfolio, the Objective Structured Clinical Examination (OSCE), the mini Clinical Evaluation Exercise (mini-CEX), and the Direct Observation of Procedural Skills (DOPS) [15]. At present, the logbook and the OSCE are widely accepted [16]. The OSCE was used to assess clinical skills in several departments, including SLP, by Moradi et al. at Ahvaz Jundishapur University in 1996. Its strong points are as follows: 1) the questions are the same for all students, 2) examinees become aware of their own strengths and weaknesses, and 3) it reveals the strengths and weaknesses of the training. Its weaknesses are as follows: 1) performing the OSCE requires a great deal of time, facilities, and staff, 2) the test is long for both learners and examiners, 3) student stress during the test is high, and 4) it evaluates student performance only on the day of the test and takes no account of the student’s performance during the semester. Therefore, it seems necessary to study other methods of evaluation [17]. The DOPS test was designed to provide feedback during a practical procedure. Before the Royal College of Physicians in England introduced the DOPS test, there was no assessment tool with which examinees could be observed and evaluated while performing practical skills [18, 19]. DOPS is an excellent way to provide constructive feedback and to focus the student’s attention on the points needed to perform the desired skill, because evaluation that improves performance requires specific and timely feedback [20, 21, 22].
In a study conducted by Farajpour et al. at Mashhad Medical School to assess the satisfaction of medical interns and professors with the DOPS test, 60 medical interns were evaluated. Interns’ satisfaction was significantly higher than 50 (the expected average score), and the effect on education and learning received the highest score. Satisfaction from the examiners’ point of view was also significantly higher than 50. According to these results, the method can be used as a practical activity that supports better learning and the acceptance of professional responsibilities by medical students, with good satisfaction [23]. Kumar et al. conducted a study in India on the usefulness of DOPS in the evaluation of postgraduate students of obstetrics and gynecology. Their results showed that repeated DOPS, regardless of the teaching method, improved students’ skills and confidence in managing life-threatening emergency conditions [24]. In another study, conducted by Siau et al. in the UK to evaluate the validity of the DOPS test in 11 pediatric colonoscopy centers, 29 trainees participated. The results provide credible evidence of the validity of DOPS as an assessment tool in pediatric colonoscopy. DOPS is also valuable for measuring progress in clinical skills competence during clinical education [25].
According to the available research on DOPS evaluation, no study has addressed articulation disorders in the field of SLP. Given the importance and necessity of a valid test for evaluating students’ clinical performance and the lack of research in this field, we designed this study to assess the validity and reliability of the DOPS test in determining the clinical skills of SLP students at the Mashhad School of Paramedical Sciences.
Materials and Methods
This study was descriptive-analytic, noninterventional research; data were collected in the first and second semesters of the 2019-2020 academic year at Mashhad University of Medical Sciences. The sample consisted of 20 students in the SLP major who were taking their clinical courses. They were selected by the census method after the study had been explained to them and their consent obtained.
The DOPS is a clinical observation test specifically designed to assess procedural skills and provide feedback to students. The method requires direct observation of the learner while performing a procedure in a real environment and simultaneous evaluation by the professor using a checklist [26]. Using the DOPS method, the examiner attends to the essential points of the skill being evaluated. The progress of learners is assessed and tracked with forms and checklists. This method facilitates the provision of feedback to the learner; instead of general comments, feedback is based on real, observed behaviors [27, 28].
A list of the procedural skills of SLP in the assessment and treatment of speech sound disorders was provided to 10 SLP professors, who were asked to rate the items on the list in terms of the clinical importance of each skill. Based on the obtained rankings, their agreement, and the suggestion of the faculty members of the SLP group, the traditional treatment approach skill, which had the highest average grade, was selected as the test material. This step was performed with the opinions of experts familiar with clinical education, including faculty members of the SLP Departments of Iran University of Medical Sciences and Mashhad University of Medical Sciences, to enable accurate and correct judgment. Although Lawshe’s method suggests a minimum panel of 4 members, we invited as many professors as possible, because some professors might withdraw from the research and not return the questionnaire.
Then, the evaluation criteria of the procedure were compiled from the SLP reference books and the opinions of the faculty members of the universities mentioned above. An evaluation form was prepared, and the method of recording and documenting the results was specified. Each question was scored on a scale of 0 to 10 (unacceptable, below expectations, borderline, as expected, and above expectations) based on the student’s performance. This is a qualitative way of establishing content validity; to evaluate content validity quantitatively, the two indices of Content Validity Ratio (CVR) and Content Validity Index (CVI) were used. For the CVR, the necessity and usefulness of each item were examined; for the CVI, the simplicity, clarity, and relevance of the questions were examined. Face validity was assessed from the experts’ opinions on the impact score of the questions. The inter-rater reliability of the test was determined by using two raters.
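Although the calculation is not spelled out above, the standard Lawshe CVR formula and the item-level CVI formula, which are consistent with the thresholds reported in the Results, are

\[ \mathrm{CVR} = \frac{n_e - N/2}{N/2}, \qquad \mathrm{CVI} = \frac{n_r}{N}, \]

where \(N\) is the number of experts (10 in this study), \(n_e\) is the number of experts rating an item as essential, and \(n_r\) is the number rating it as relevant or completely relevant.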
The next step was the training of the two examiners, who were instructed and given a written guide on a) the clinical exam and how to perform the DOPS test, and b) how to score and how to use the checklists, with the necessary criteria for each part of the checklist, to increase the reliability and uniformity of the examiners’ judgments. To prepare the students, we held individual sessions in which they were informed about the purpose of the research, the evaluation process, its procedures, and the evaluators. Students entered the study with informed consent. The time of the test was determined by the students: whenever they felt they had the necessary competence, they asked the professors to evaluate their performance. The students were directly observed by the examiners while performing the skill. The examiners recorded their observations in a checklist, and appropriate feedback was provided to the students after the treatment session in a suitable environment, where their strengths and weaknesses were discussed. The time required for the observation and feedback phases was about 15 and 5 minutes, respectively.
Results
The study was conducted on 20 undergraduate SLP students of the Department of Speech Language Pathology at the Mashhad School of Paramedical Sciences. Two assessment forms were completed for each student; two assessors simultaneously observed, recorded, and completed these forms. The average time for each exam was 15 minutes, of which the final minutes were spent providing feedback to the student. The test was performed in the Ghaem and Dr Sheikh hospitals in Mashhad.
To establish the content validity of this test, we followed the Lawshe table: for a panel of 10 experts, the minimum acceptable content validity ratio is 0.62, and the ratio obtained was 0.62 or more. For the CVI, we divided the number of professors who declared an item relevant or completely relevant by the total number of experts; the resulting value was 0.8 or more (Table 1).
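As a hypothetical worked example at this panel size: if 9 of the 10 experts rated an item as essential, then

\[ \mathrm{CVR} = \frac{9 - 10/2}{10/2} = \frac{4}{5} = 0.80 \ge 0.62, \]

so the item would be retained; likewise, an item judged relevant by 8 of the 10 experts has \(\mathrm{CVI} = 8/10 = 0.8\).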


To obtain the structural validity of the DOPS test, we used the internal consistency method, in which the internal structure of the test is examined. The criterion used to check internal consistency was the total score of the test. The correlation between the score of each subtest and the total score was calculated, and any subtest that showed a small correlation with the total score was removed from the test. Test questions can also be used instead of subtests, in which case the correlations between the test questions and the total score are compared (Table 2).
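This item-total screen can be sketched in a few lines of Python. The data below are simulated purely for illustration (20 students by 9 checklist items, each item scored 0-10, mirroring the evaluation form); the corrected total (the total minus the item itself) is used so that an item is not correlated with itself:

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
# Simulated ratings: a latent ability per student plus item-level noise,
# rounded and clipped to the 0-10 checklist scale.
ability = rng.normal(5, 2, size=(20, 1))
scores = np.clip(np.rint(ability + rng.normal(0, 1.5, size=(20, 9))), 0, 10)
total = scores.sum(axis=1)

for item in range(scores.shape[1]):
    rest = total - scores[:, item]          # corrected item-total
    rho, p = spearmanr(scores[:, item], rest)
    print(f"question {item + 1}: rho = {rho:.3f} (p = {p:.3f})")
# Questions whose correlation with the rest of the test is low would be
# candidates for removal, as described above.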


Table 2 shows the correlation coefficient of each question on the traditional treatment approach skill with the total score. The highest correlations belong to questions 4 and 6 (r=0.788), followed by questions 5 and 9 (r=0.725), and the lowest to questions 1 and 8 (r=0.627).
The correlation coefficient between DOPS scores was calculated using the Spearman correlation coefficient (Table 3).


The face validity of the DOPS test in assessing procedural skills on real patients was confirmed according to the experts’ opinions. The impact score of the questions was calculated and was more than 1.5 for all of the questions; therefore, the face validity of the questions was confirmed, and all of them were included in the questionnaire.
To examine reliability among evaluators, two experts in the field of SLP examined the 20 students on the treatment skills of the traditional treatment approach for speech sound disorders. The results in Table 3 show the ICC (Intraclass Correlation Coefficient) values; the two evaluators had high agreement in scoring the learners (ICC>0.9).
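A minimal sketch of such an inter-rater analysis, using simulated scores for two raters and the intraclass_corr function of the pingouin package (the column names and data are illustrative, not the study’s):

import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
true_skill = rng.normal(70, 8, 20)          # latent performance of 20 students
df = pd.DataFrame({
    "student": np.tile(np.arange(20), 2),
    "rater": np.repeat(["A", "B"], 20),
    # each rater's score = latent performance + small rating noise
    "score": np.concatenate([true_skill + rng.normal(0, 2, 20),
                             true_skill + rng.normal(0, 2, 20)]),
})
icc = pg.intraclass_corr(data=df, targets="student",
                         raters="rater", ratings="score")
# ICC2: two-way random effects, absolute agreement, single rater
print(icc.set_index("Type").loc["ICC2", ["ICC", "CI95%"]])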
The Cronbach α coefficient was also calculated to evaluate reliability as internal consistency. The Cronbach α value for the traditional treatment approach test was 0.805, which confirms the internal consistency of DOPS for the traditional treatment approach test and the total DOPS score (Table 4).
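The Cronbach α reported here follows the standard definition

\[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right), \]

where \(k\) is the number of checklist items, \(\sigma^2_{Y_i}\) is the variance of the scores on item \(i\), and \(\sigma^2_X\) is the variance of the total scores; the obtained value of 0.805 exceeds the conventional 0.7 criterion.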


Discussion
The findings of this study confirm the validity and reliability of the DOPS test conducted with SLP students undergoing clinical internships. Face validity and content validity for assessing clinical skills through DOPS on real patients were verified according to the experts’ opinions. Similar studies, including a 2008 study by Wilkinson et al. at the Royal College of Physicians in England on the validity of the DOPS test, found that DOPS has high face validity [17, 29]. The findings of Barton et al. on colonoscopic skills, Brown and Doshi in psychiatry, the Mitchell et al. and Bari studies in radiology, the Hamilton et al. study on health care providers, and the studies of Shahid Hassan, Khoshrang et al., Kogan, Sahebalzamani, and Naeem are also consistent with this study [5, 22, 26, 30, 31, 32, 33, 34, 35].
Regarding structural validity, the results show a significant relationship between the DOPS test questions; therefore, the test has a good internal structure, which confirms its structural validity. This finding is consistent with the results of Siau et al. on endoscopic skills assessment and diagnostic gastroscopy skills assessment [36, 37]. It is also compatible with the results of Kuhpayehzadeh et al. in assessing the clinical skills of midwifery students and of Sahebalzamani et al. in assessing the clinical skills of nursing students [20, 22]. In an experimental study, Wilkinson et al. also confirmed the construct validity of the test by showing that higher-level trainees obtained higher scores [17, 29].
The study by Siau et al. confirms the validity and reliability of the DOPS test in assessing clinical proficiency in flexible sigmoidoscopy [38]. The study by Barton et al. also confirmed the validity and reliability of the DOPS test in assessing bowel cancer screening skills [30]. Likewise, the study of Roozbehani et al. at Iran University of Medical Sciences reported the validity and reliability of the DOPS test in assessing the clinical skills of audiology students [39].
In the present study, the reliability among testers was significant, and the Cronbach α of the DOPS test was in the appropriate range (α>0.7): the value of 0.805 for the traditional treatment approach test confirms the internal consistency of DOPS for this test and the total DOPS score.
This finding is consistent with the studies of Kuhpayehzadeh et al. [20] and Sahebalzamani et al. [22]. Weller et al., evaluating the reliability of the mini clinical evaluation exercise, reported high internal consistency (α=0.95) [28]. A study by Bould et al. at the Toronto Children’s Hospital shows that the DOPS test is highly valid and reliable in assessing anesthesia skills [27, 40].
In the study by Roozbehani et al., the validity and reliability of the DOPS test were investigated for assessing clinical audiology skills in the Faculty of Rehabilitation Sciences of Iran University of Medical Sciences. This study was performed on 25 audiology internship students. The content validity index of the DOPS test was more than 0.79, and the content validity ratio was more than 0.42. The Spearman correlation coefficients of adult hearing assessment skills, tinnitus assessment, and assessment of auditory evoked responses with the total DOPS score were 0.742, 0.704, and 0.785, respectively, indicating the optimal internal structure of the test and its structural validity (P<0.001). The reliability of the test was confirmed by a Cronbach α coefficient of 0.788. The lowest and highest inter-rater correlation coefficients were 0.504 and 0.837, respectively (significant in all cases) [39].
Based on the above findings, the DOPS test is valid and reliable for the objective assessment of clinical skills in the field of SLP and speech disorders, and its use, through direct observation and feedback, can improve procedural ability and knowledge. It helps SLP students develop clinical skills in the field of speech sound disorders, which can improve the quality of service delivery and patient satisfaction.
Conclusion 
According to this study, the use of DOPS for objective clinical skill assessment is valid and reliable. This test can be used to evaluate the clinical work of Speech Language Pathology students, provide feedback, identify strengths and weaknesses, and improve them.

Ethical Considerations
Compliance with ethical guidelines

This study was approved by the Ethics Committee of Iran University of Medical Sciences (Code: IR.IUMS.REC.1398.793).

Funding
The paper was extracted from the MSc. thesis of the first author at the Department of Speech Therapy, School of Rehabilitation Sciences, Iran University of Medical Sciences. 

Authors' contributions
Conceptualization, supervision: Younes Amiri Shavaki, Shahnam Sedigh Maroufi, Jamileh Abolghasemi, Mohammad-Sadegh Jenabi; Methodology: Jamileh Abolghasemi; Investigation, writing – review & editing: All authors; Writing – original draft: Matin Khazaei; Funding acquisition, Resources: Matin Khazaei, Mohammad-Sadegh Jenabi.

Conflict of interest
The authors declared no conflict of interest.

Acknowledgments
We sincerely thank the cooperation of Iran University of Medical Sciences, School of Rehabilitation Sciences, as well as the faculty members of Mashhad Paramedical School and the officials and staff of Ghaem and Dr. Sheikh hospitals of Mashhad.


References 
  1. Lincoln MA, Adamson BJ, Cant RV. The importance of managerial competencies for new graduates in speech pathology. Adv Speech Lang Pathol. 2001; 3(1):25-36. [DOI:10.3109/14417040109003706]
  2. Anbari Z, Ramezani M. The obstacles of clinical education and strategies for the improvement of quality of education at Arak University of Medical Sciences in 2008. AMUJ. 2010; 13(2):110-8.
  3. Seifuashemi M, Amin Beydokhti ME, Yazdiha MS, Nabavi M, Faranoosh M. [Internal evaluation as a means of promoting the quality of education in the department of pediatrics of Semnan university of medical sciences (Persian)]. Koomesh. 2001; 2(3):167-75. http://koomeshjournal.semums.ac.ir/article-1-310-en.pdf
  4. McGinn NF, Borden AM. Framing questions, constructing answers: linking research with education policy for developing countries. Cambridge, MA: Harvard University Press; 1995. https://www.amazon.com/Framing-Questions-Constructing-Answers-International/dp/0674317157
  5. Bari V. Direct observation of procedural skills in radiology. AJR. 2010; 195(1):W14-8. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.1057.352&rep=rep1&type=pdf
  6. Franko DL, Cousineau TM, Trant M, Green TC, Rancourt D, Thompson D, et al. Motivation, self-efficacy, physical activity and nutrition in college students: Randomized controlled trial of an internet-based education program. Prev Med. 2008; 47(4):369-77. https://www.sciencedirect.com/science/article/abs/pii/S0091743508003381
  7. Golmakani N, Yousefzadeh S. The midwifery students’ perspective about clinical evaluation based on log book. J Res Dev Nurs Midw. 2012; 9(1):103-11. http://nmj.goums.ac.ir/article-1-218-en.html
  8. Shah GS, Pouladi A, Bahram RM, Farhadifar F, Khatibi R. Evaluation of the effects of Direct Observation of Procedural Skills (DOPS) on clinical externship students’ learning level in the obstetrics ward of Kurdistan University of Medical Sciences. J Med Educ. 2009; 13(1-2):29-33. https://www.sid.ir/en/journal/ViewPaper.aspx?ID=144622
  9. Shirinbak I, Sefidi F, Sarchami R. [Comparison of traditional and Objective Structured Clinical Examination (OSCE) exams in terms of clinical skills assessment and attitudes of dental students of Qazvin Dental School (Persian)]. J Med Educ Curric Dev. 2014; 9(3):67-74. https://jmed.ssu.ac.ir/article-1-433-en.pdf
  10. Dailly I. Is the objective structured clinical examination an appropriate method of evaluation for paramedics? Paramedic Academy. 2001; 1(3):1-4.
  11. Nouhi E, Motesadi M, Haghdoust A. Clinical teachers’ viewpoints towards objective structured clinical examination in Kerman University of Medical Sciences. Iran J Med Edu. 2008; 8(19):113-19. https://www.sid.ir/en/journal/ViewPaper.aspx?ID=125108
  12. Rushforth HE. Objective Structured Clinical Examination (OSCE): Review of literature and implications for nursing education. Nurse Educ Today. 2007; 27(5):481-90. https://www.sciencedirect.com/science/article/abs/pii/S0260691706001389
  13. Schoonheim‐Klein M, Walmsley AD, Habets LL, Van Der Velden U, Manogue M. An implementation strategy for introducing an OSCE into a dental school. Eur J Dent Educ. 2005; 9(4):143-9. [DOI:10.1111/j.1600-0579.2005.00379.x]
  14. Tazakori Z, Mozafari N, Movahedpour A, Mazaheri E, Karim Elahi M, Mohamadi MA. Comparison of the views of nursing students and instructors on OSPE performance and common evaluation methods. In: Proceedings of the 7th National Congress on Medical Education; 2005 (Vol. 9).
  15. Heidari T. The effect of Portfolio’s evaluation on learning and satisfaction of midwifery students. AMUJ. 2010; 12(4):81-8. http://jams.arakmu.ac.ir/article-1-434-en.html
  16. Moorthy K, Munz Y, Sarker SK, Darzi A. Objective assessment of technical skills in surgery. BMJ. 2003; 327(7422):1032-7. [DOI:10.1136/bmj.327.7422.1032]
  17. Wilkinson JR, Crossley JG, Wragg A, Mills P, Cowan G, Wade W. Implementing workplace‐based assessment across the medical specialties in the United Kingdom. Med Edu. 2008; 42(4):364-73. [DOI:10.1111/j.1365-2923.2008.03010.x]
  18. Dornan T, Mann KV, Scherpbier AJ, Spencer JA. Medical education: Theory and practice. Amsterdam: Elsevier Health Sciences; 2011.
  19. Wragg A, Wade W, Fuller G, Cowan G, Mills P. Assessing the performance of specialist registrars. Clin Med. 2003; 3(2):131. [DOI:10.7861/clinmedicine.3-2-131] [PMCID] [PMID]
  20. Kouhpayezadeh J, Hemmati A, Baradaran HR, Mirhosseini F, Akbari H, Sarvieh M. Validity and reliability of direct observation of procedural skills in evaluating clinical skills of midwifery students of Kashan nursing and midwifery school. JSUMS. 2014; 21(1):145-54. https://www.sid.ir/en/journal/ViewPaper.aspx?ID=368248
  21. Nooreddini A, Sedaghat S, Sanagu A, Hoshyari H, Cheraghian B. Effect of clinical skills evaluation applied by Direct Observation Clinical Skills (DOPS) on the clinical performance of junior nursing students. J Res Dev Nurs Midw. 2015; 12(1):8-16. http://nmj.goums.ac.ir/article-1-700-en.html
  22. Sahebalzamani M, Farahani H. Validity and reliability of direct observation of procedural skills in evaluating the clinical skills of nursing students of Zahedan Nursing and Midwifery School. Zahedan J Res Med Sci. 2012; 14(2):76-81. https://www.sid.ir/en/journal/ViewPaper.aspx?id=261462
  23. Farajpour A, Amini M, Pishbin E, Mostafavian Z, Farmad SA. Using modified Direct Observation of Procedural Skills (DOPS) to assess undergraduate medical students. J Adv Med Educ Prof. 2018; 6(3):130-6. [PMCID] [PMID]
  24. Kumar N, Singh NK, Rudra S, Pathak S. Effect of formative evaluation using direct observation of procedural skills in assessment of postgraduate students of obstetrics and gynecology: Prospective study. J Adv Med Educ Prof. 2017; 5(1):1–5. [PMCID] [PMID]
  25. Siau K, Levi R, Iacucci M, Howarth L, Feeney M, Anderson JT, et al. Paediatric colonoscopy direct observation of procedural skills: Evidence of validity and competency development. J Pediatr Gastroenterol Nutr. 2019; 69(1):18-23. [DOI:10.1097/MPG.0000000000002321]
  26. Naeem N. Validity, reliability, feasibility, acceptability and educational impact of Direct Observation of Procedural Skills (DOPS). J Coll Physicians Surg Pak. 2013; 23(1):77-82. https://www.jcpsp.pk/archive/2013/Jan2013/17.pdf
  27. Bindal N, Goodyear H, Bindal T, Wall D. DOPS assessment: A study to evaluate the experience and opinions of trainees and assessors. Med Teach. 2013; 35(6):e1230-4. https://www.tandfonline.com/doi/abs/10.3109/0142159X.2012.746447
  28. Weller JM, Jolly B, Misur MP, Merry AF, Jones A, Crossley JM, et al. Mini-clinical evaluation exercise in anaesthesia training. Br J Anaesth. 2009; 102(5):633-41.
  29. Wilkinson J, Benjamin A, Wade W. Assessing the performance of doctors in training. BMJ (Clinical research ed.). 2003; 327(7416):s91-2. [PMID] [DOI: 10.1136/bmj.327.7416.s91]
  30. Barton JR, Corbett S, van der Vleuten CP, Programme EB. The validity and reliability of a Direct Observation of Procedural Skills assessment tool: Assessing colonoscopic skills of senior endoscopists. Gastrointest Endosc. 2012; 75(3):591-7. https://www.sciencedirect.com/science/article/abs/pii/S0016510711022875
  31. Brown N, Doshi M. Assessing professional and clinical competence: the way forward. Adv Psychiatr Treat. 2006; 12(2):81-9. [DOI:10.1192/apt.12.2.81]
  32. Hamilton KE, Coates V, Kelly B, Boore JR, Cundell JH, Gracey J, et al. Performance assessment in health care providers: A critical review of evidence and current practice. J Nurs Manag. 2007; 15(8):773-91. [DOI:10.1111/j.1365-2934.2007.00780.x]
  33. Hassan S. Faculty development: DOPS as workplace-based assessment. Edu Med J. 2011; 3(1):e32-43. http://eduimed.usm.my/EIMJ20110301/EIMJ20110301_05.pdf
  34. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA. 2009; 302(12):1316-26. [DOI:10.1001/jama.2009.1365]
  35. Mitchell C, Bhat S, Herbert A, Baker P. Workplace‐based assessments in Foundation Programme training: do trainees in difficulty use them differently? Med Edu. 2013; 47(3):292-300. [DOI:10.1111/medu.12113]
  36. Siau K, Crossley J, Dunckley P, Johnson G, Feeney M, Hawkes ND, et al. Direct Observation of Procedural Skills (DOPS) assessment in diagnostic gastroscopy: nationwide evidence of validity and competency development during training. Surg Endosc. 2020; 34(1):105-14. https://link.springer.com/article/10.1007/s00464-019-06737-7
  37. Siau K, Dunckley P, Valori R, Feeney M, Hawkes ND, Anderson JT,  et al. Changes in scoring of Direct Observation of Procedural Skills (DOPS) forms and the impact on competence assessment. Endoscopy. 2018; 50(08):770-8. [DOI:10.1055/a-0576-6667]
  38. Siau K, Crossley J, Dunckley P, Johnson G, Haycock A, Anderson JT, et al. Training and assessment in flexible sigmoidoscopy: Using a novel Direct Observation Of Procedural Skills (DOPS) assessment tool. JGLD. 2019; 28(1):33-40. [DOI: http://dx.doi.org/10.15403/jgld.2014.1121.281.nov]
  39. Alborzi R, Koohpayehzadeh J, Rouzbahani M. Validity and reliability of the persian version of direct observation of procedural skills tool in audiology. Sci J Rehab Med. 2021; 10(2):346-57. http://medrehab.sbmu.ac.ir/article_1101106_en.html
  40. Bould MD, Crabtree NA, Naik VN. Assessment of procedural skills in anaesthesia. Br J Anaesth. 2009; 103(4):472-83.
 
Type of Study: Research | Subject: Speech Therapy
Received: 2021/03/13 | Accepted: 2021/09/12 | Published: 2021/11/25
