Programmatic Assessment Prototype Design Using an Integrated System Thinking and Design Thinking Framework

Objective
To design a prototype programmatic assessment model by integrating system thinking and design thinking frameworks, and to explore barriers to its implementation in medical education.


Methods
The qualitative case-study was conducted at Bahria University Medical and Dental College, Karachi, from September to December 2022. The methodological approaches used were systems thinking and design thinking, and the philosophical paradigm was critical realism. Maximum variation purposive sampling was used to select faculty members and medical students. Data was collected in two sets of semi-structured interviews. The initial interviews corresponded to the empathise, define and ideate stages. After that, a low-fidelity prototype programmatic assessment design was created and presented to the participants. The second set of interviews focussed on feedback on the prototype programmatic assessment design. The pattern matching method was used for data analysis.


Results
Of the 65 subjects, 20(30.7%) were faculty members; 5(25%) lecturers and 5(25%) professors each from basic and clinical sciences. The remaining 45(69.2%) were students; 5(11.1%) from each year of the medical and dental streams. Initial interviews revealed that, after failure, students struggled to understand the cause of failure, as the assessment system lacked a narrative feedback mechanism and numbers or pass/fail decisions were not self-explanatory. Students lacked an attitude of continuous improvement; formative assessments were not taken seriously as they carried only 10% weightage. After the programmatic assessment design was presented to the participants, faculty members voiced concerns regarding faculty training, shortage of resources and legal issues related to its implementation. Students supported the design, especially the continued narrative feedback for their academic improvement. Portfolio development was considered a time-consuming task, and students had reservations regarding their ability to produce a quality assessment portfolio.


Conclusion
Implementation of programmatic assessment required faculty development and was resource-intensive, but stakeholders, especially students, were interested in its implementation in medical education.


Introduction
Assessment is the measurement of learning and a determinant of competence level. Inefficient assessment methodologies lead to incompetent medical graduates. Hence, a well-structured assessment system is required to promote learning and to measure the level of competence at every stage of undergraduate medical education. There are multiple tools for assessment, but all of them have limitations. The recently proposed design of programmatic assessment enhances both the learning function and the decision-making function of assessment, using well-researched principles of assessment.2,3 Instead of the traditional divide of assessment methods into formative and summative, programmatic assessment focusses on a continuum of stakes, ranging from low to high stakes in decision-making.2,3 Programmatic assessment includes multiple assessment strategies as well as a longitudinal view of learning progress and assessment in relation to certain learning outcomes or competencies.5,6 High-stake decisions require many data points.8,9 Final decisions should be made by mapping and aligning data points to the curriculum. Each assessment method and its content are purposefully chosen, with a clear justification for using that particular assessment in a certain part of the curriculum.10,11 The model of programmatic assessment is perceived as complex and theoretical by most educators.10,11 The current study was planned to design a prototype programmatic assessment model by integrating system thinking and design thinking frameworks, and to explore barriers to its implementation in medical education.

Materials and Methods
The qualitative case-study was conducted at Bahria University Medical and Dental College, Karachi, from September to December 2022. The study used the critical realist philosophical paradigm. Community of practice was used as the theoretical framework for guiding the interviews and analysing them. A community of practice refers to a group of individuals who share a common problem and work together towards its solution by mapping knowledge and identifying gaps.
A qualitative design was chosen for in-depth exploration of barriers to the implementation of the programme. The critical realist paradigm was used to understand the underlying causes that make implementation difficult in medical education. System thinking is an understanding of each component of a system, the relationships between components, and the factors affecting each component. Design thinking is a creative problem-solving approach which emphasises identification of a problem by stakeholders, and focusses on the needs and experiences of the stakeholders. The study integrated the '3P 6C' system thinking framework with the design thinking framework for needs assessment of programmatic assessment, and explored barriers to implementation by developing a prototype programmatic assessment design and presenting it to the stakeholders. System thinking believes in expansionism instead of reductionism, according to which every system consists of interconnected components, and changes made in one component can adversely affect other components. Changes made in the assessment strategy will therefore have consequences for teaching and learning strategies. In the 3P 6C system thinking framework, the 3P signifies personal (the individual learner), programme, and practice (the teaching and assessment methods, both explicit and hidden, that a learner might experience and practise). The three initial Cs are focussed on teaching activities and outcomes, and the next three Cs indicate assessment. Design thinking refers to identifying a problem empathically with stakeholders and taking their feedback on a prototype innovative design. It is a non-linear and flexible process consisting of five steps: empathise, define, ideate, prototype design presentation, and feedback. Empathy with the users' experience is at the core of design thinking.
Pattern matching was used for data analysis.12 If the empirical (observed) and predicted (theoretical) patterns appear congruent, this provides data analysis evidence that strengthens the research's internal validity.13 The sampling technique used in the study was maximum variation purposive sampling. The inclusion criteria were senior and junior faculty members from basic and clinical subjects. Medical and dental students from each year of the respective academic programmes were also included. Faculty members and students who did not give consent were excluded. Interviews were conducted in two phases after taking informed consent from all the subjects. The first phase of interviews focussed on problems in the existing assessment strategies. After the initial interviews, a prototype programmatic assessment design was developed by incorporating new ideas co-constructed with stakeholders. The prototype design was presented to all the participants. The second phase of interviews was conducted to take stakeholders' feedback on the prototype design. Interviews with students and faculty members were conducted in the college cafeteria during the lunch break. Data collection was done through two sets of semi-structured interviews, in alignment with the design thinking framework; interviews were audio-recorded and then transcribed. Data was collected until data saturation was achieved. Transcripts of the interviews were analysed using the pattern matching technique. After the interviews, the transcripts were analysed in stages of open, axial and selective coding, and themes were constructed. The generated themes were compared with the theoretical framework to analyse similarities and differences between the theoretical framework and the study outcome pattern.
Results
The first phase of interviews suggested ambiguity in results. After failure, students struggled to understand the cause of failure, as the assessment system lacked a narrative feedback mechanism and numbers or pass/fail decisions were not self-explanatory.
"My friend and me studied together from the same books, but I have no idea why there is so much difference in marks we secured in the final professional examination" Another element highlighted was lack of continuous improvement.Formative assessment was not considered important by students as they carried only 10% weightage.End-of-year summative assessment was considered most important by the students, but they found it difficult to cover extensive curriculum.
"The actual examination is end-of-term examination; that is what matters." Further, locus of control was identified.The students blamed their failure on external factors, like difficult examination content, strict marking, inappropriate assessment strategies and teachers' attitude.This demonstrated an external locus of control; a non-cognitive skill.
On their part, faculty members were satisfied with the present examination system.
The prototype (Table) incorporated multiple cycles of feedback to ensure that mentor and student agreed on the same point. The mentor would provide written narrative feedback on each assessment data point and on the portfolio. Reflection on the portfolio by students would enhance their learning abilities as well as portfolio quality. Longitudinal assessment of multiple assessment data points and mapping of data points were also part of the prototype. The final step would be a sequential judgement procedure by the assessment committee, compiling assessment data points and considering few data points for low-stake assessments and multiple data points for high-stake assessments.
In the second phase of interviews, the quality of the portfolio emerged as the most important theme for both faculty members and students, who showed concerns regarding evaluator bias.
Faculty members showed concerns regarding faculty training for programmatic assessment, and also identified shortage of resources for compiling data required for it.
"Faculty training and new resources will be required to implement this new system." Faculty members demanded specific guidelines to be given for structured feedback.The students supported the design, especially continued narrative feedback for their academic improvement.Portfolio development was considered a time-consuming task by most students and faculty members who also had reservations regarding their abilities to compile quality assessment portfolio.
"At least we will understand why we failed as numbers are not self-explanatory." The faculty members voiced concern on the legal aspect of institutional autonomy to implement programmatic assessment strategy.

Discussion
The study identified the experiences of faculty members and students regarding the existing assessment methods, and presented a prototype programmatic assessment design. The faculty members and students were not satisfied with the current assessment system. The feedback on the prototype programmatic assessment design was positive, but senior faculty members showed concern regarding the massive amount of information collected over a period of time and the difficulties of managing it with the existing resources. Collection of information through electronic portfolios is a solution to these anticipated concerns of faculty members. Portfolios allow periodic analyses of the students' competence development and reflections on learning goals.14 The foundation of programmatic assessment is information richness, and without rich assessment information, programmatic assessment will be unsuccessful.15,16 It is time to switch from quantitative to qualitative assessment feedback. Narrative feedback gives the learner a better picture of areas for improvement. Internal formative assessment carries only 10% weightage while 90% is allotted to the summative final examination, which is not justified. At least 40% weightage should be given to formative assessment for students to take it seriously and learn from the feedback given in narrative form, which will motivate the students to study all year long.17


Conclusion
Teaching strategies need collaboration, self-directed learning, constructivism and context. All of these can be achieved through problem-based learning. Similar attributes are needed for effective assessment as well. Therefore, programmatic assessment should be implemented in place of traditional assessment in medical education.

Disclaimer: None.
Conflict of Interest: None.
Source of Funding: None.


Table: Prototype programmatic assessment design.
1. Identification of outcomes and competencies to be assessed.
2. Identification of assessment tools, using multiple tools for each competency.
3. Portfolio preparation by students in collaboration with mentors, in alignment with specific assessment criteria, standardised by providing a checklist of required learning outcomes.
4. Continuous narrative feedback on each assessment to the individual student, with mentors identifying strengths and weaknesses.
5. Member checking by asking students their views on the strengths and weaknesses identified by the mentor, to ensure student satisfaction.
6. Final decisions by the programmatic assessment committee in collaboration with assigned mentors, considering few data points for low-stake assessments and many data points for high-stake assessments; quantitative scores and qualitative results integrated for the final decision.