Assessment Content

Learning Task 2 – Graduate Course EDER 602, Program Evaluation and Practice

Learning Activity Title: Completing an Incomplete Program Evaluation

Learning Activity Description:

In this hands-on activity, teams of 3-4 students (formed in Week 3) will each complete one of five available incomplete program evaluations, using an artificial intelligence program for data collection and analysis. This exercise will hone students' abilities in synthesizing information, conducting stakeholder interviews, analyzing qualitative data, deriving meaningful conclusions, and proposing actionable recommendations.

Objective:

This learning activity is designed to provide students with a realistic experience of completing a program evaluation process, from data collection to delivering insightful recommendations.

Instructions:

1. Provided Material: Five incomplete program evaluations will be made available to the class. Each team will choose one of these evaluations to complete. The chosen evaluation will include sections such as executive summary, introduction, evaluation purpose and questions, and methodology.

2. AI Interviews: Teams will use an artificial intelligence program configured to simulate the personas of relevant program stakeholders (these personas will be provided). Each team will design interview questions and conduct interviews with these AI personas to collect essential data for their chosen evaluation (an illustrative sketch of this step appears after this list).

3. Data Analysis: After conducting the AI interviews, teams will receive transcripts, which they must analyze carefully to extract data relevant to their evaluation questions.

4. Completion of the Report: Using the analyzed data, teams will then complete the remaining sections of their chosen program evaluation, which include:

a. Findings: Present the results of the evaluation, detailing the program's observed performance and impact on its target audience, organized by evaluation questions or program objectives.

b. Discussion/Interpretation of Findings: Discuss the significance and implications of the findings. Compare them with the program's expected outcomes or with findings from other similar programs.

c. Conclusions: Summarize the main conclusions drawn from the evaluation findings.

d. Recommendations: Based on the findings and conclusions, provide feasible suggestions for improving the program, ensuring that these are specific and linked directly to the evaluation findings.

e. Appendices: Include any useful supplementary information, such as interview transcripts and theme maps or tables.
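For teams curious about what the AI interview step (step 2) can look like behind the scenes, the following is a minimal, purely illustrative Python sketch of driving a stakeholder persona through a chat-style API. The course provides the actual interview tool, so the client library, model name, persona text, and questions below are assumptions for illustration only, not the course's configuration.

```python
# Purely illustrative: the course supplies the actual AI interview tool, and
# the client library, model name, persona, and questions here are assumptions.
import json

from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

persona = (
    "You are a classroom teacher who participated in the program being "
    "evaluated. Answer interview questions in character, drawing on that "
    "experience."
)

questions = [
    "How has the program changed your day-to-day practice?",
    "What barriers did you encounter when applying what you learned?",
]

history = [{"role": "system", "content": persona}]
transcript = []

for question in questions:
    history.append({"role": "user", "content": question})
    reply = client.chat.completions.create(model="gpt-4o", messages=history)
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    transcript.append({"question": question, "answer": answer})

# Save the transcript for the analysis step (step 3).
with open("persona_interview_transcript.json", "w") as f:
    json.dump(transcript, f, indent=2)
```

However the provided tool is actually accessed, the underlying pattern is the same: a persona prompt, a sequence of interview questions, and a saved transcript that feeds the data analysis step.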

Expected Deliverables:

Each team will submit a complete program evaluation report that encompasses all the above sections. The final report should be well-structured, concise, and professionally written, showcasing thoughtful analysis and practical recommendations. It must be approximately 10-15 pages and follow APA 7th edition format.

Note: Technical support will be available to help students understand and use the AI program. Teams are encouraged to familiarize themselves with the tool in advance to ensure smooth interviews.

Criteria For Assessment of Option 2 Learning Task 2

Each criterion is assessed at one of three levels: Exemplary (3), Proficient (2), or Developing (1).

Findings
Exemplary (3): The results of the evaluation are presented in a clear, organized, and comprehensive manner, demonstrating a deep understanding of the program’s performance and impact on its target audience.
Proficient (2): The results of the evaluation are presented in a clear and organized manner, demonstrating a good understanding of the program’s performance and impact on its target audience.
Developing (1): The results of the evaluation are presented in an incomplete or disorganized manner, demonstrating a limited understanding of the program’s performance and impact on its target audience.

Discussion/Interpretation of Findings
Exemplary (3): The significance and implications of the findings are discussed in depth, with insightful comparisons made to the program’s expected outcomes or with findings from other similar programs.
Proficient (2): The significance and implications of the findings are discussed, with comparisons made to the program’s expected outcomes or with findings from other similar programs.
Developing (1): The significance and implications of the findings are discussed in a limited or superficial manner, with few or no comparisons made to the program’s expected outcomes or with findings from other similar programs.

Conclusions
Exemplary (3): The main conclusions drawn from the evaluation findings are clear, insightful, and well-supported by the presented data.
Proficient (2): The main conclusions drawn from the evaluation findings are clear and supported by the presented data.
Developing (1): The main conclusions drawn from the evaluation findings are unclear or unsupported by the presented data.

Recommendations
Exemplary (3): The provided suggestions for improving the program are specific, feasible, and directly linked to the evaluation findings, demonstrating a thorough understanding of the program’s strengths and weaknesses.
Proficient (2): The provided suggestions for improving the program are specific and feasible, demonstrating a good understanding of the program’s strengths and weaknesses.
Developing (1): The provided suggestions for improving the program are vague or unrealistic, demonstrating a limited understanding of the program’s strengths and weaknesses.

Appendices
Exemplary (3): Supplementary information is included and presented in a clear and organized manner, enhancing the reader’s understanding of the evaluation process and findings.
Proficient (2): Supplementary information is included and presented in a clear manner, contributing to the reader’s understanding of the evaluation process and findings.
Developing (1): Supplementary information is missing, incomplete, or presented in a confusing manner, hindering the reader’s understanding of the evaluation process and findings.

Contributions
Exemplary (3): The contributions of each member to the project are clearly described and demonstrate a high level of collaboration and teamwork.
Proficient (2): The contributions of each member to the project are described and demonstrate a good level of collaboration and teamwork.
Developing (1): The contributions of each member to the project are unclear or do not demonstrate effective collaboration and teamwork.

Sample Incomplete Program Evaluation (to be completed by students)

Impact of Culturally Responsive Teaching Strategies on K-12 Teachers' Professional Development

Introduction

The primary focus of this program evaluation report is to assess the impact of Culturally Responsive Teaching Strategies (CRTS) on K-12 teachers' professional development. CRTS is a progressive educational program designed to enhance the skills and competencies of K-12 teachers in integrating culturally responsive teaching methods into their classrooms. By doing so, the program aims to promote an equitable and inclusive learning environment that caters to the diverse needs of students from various cultural, linguistic, and socio-economic backgrounds.

The objectives of the CRTS program revolve around the following key areas:

1. Enhancing teachers' cultural competence and understanding of the diverse backgrounds of their students.

2. Equipping teachers with effective strategies to engage students from diverse cultural backgrounds and create an inclusive learning environment.

3. Encouraging teachers to reflect on their own teaching practices and make necessary changes to cater to the needs of all students.

The purpose of this evaluation is to determine the extent to which the CRTS program has achieved its intended outcomes and to identify areas for improvement. This evaluation serves as a valuable tool for educators, administrators, and funders who are invested in the success of the program and are responsible for its ongoing development and support.

The CRTS program has been implemented in a context marked by increasing cultural diversity in schools, as well as growing awareness of the importance of addressing issues related to equity, social justice, and inclusion in education. Moreover, research has consistently shown that culturally responsive teaching practices contribute to improved student outcomes, particularly among students from historically underrepresented groups. Given this backdrop, the CRTS program is both timely and relevant, as it seeks to address one of the most pressing challenges facing the education sector today.

The potential impact of the CRTS program cannot be overstated. By fostering a more inclusive learning environment, the program has the potential to positively impact not only the students who directly benefit from culturally responsive teaching strategies, but also the wider school community and society at large. In this regard, the evaluation of the CRTS program serves as an essential mechanism for informing and refining its implementation, ensuring that it continues to contribute to the overarching goal of promoting equity and inclusion in education.

Methods

Research Design

The research design employed in this program evaluation is a mixed-methods approach, combining both quantitative and qualitative data collection and analysis techniques. This approach allows for a comprehensive assessment of the impact of the Culturally Responsive Teaching Strategies (CRTS) program on K-12 teachers' professional development, catering to the multiple objectives of the program.

Sampling Method

A stratified random sampling method was used to select research participants from the population of K-12 teachers who have participated in the CRTS program. This sampling method ensures that the selected participants represent various demographic characteristics, such as age, gender, teaching experience, and school type. This method also allows for the inclusion of teachers from diverse cultural, linguistic, and socio-economic backgrounds, reflecting the intended target group of the CRTS program.
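As a minimal sketch of how such a stratified draw can be carried out in practice, the pandas-based example below selects an equal number of teachers from each school-type stratum. The column names, strata, and sample sizes are illustrative assumptions, not the actual CRTS participant data.

```python
# Minimal sketch: column names, strata, and sample sizes are illustrative
# assumptions, not the actual CRTS participant roster.
import pandas as pd

# Hypothetical roster of CRTS participants with demographic attributes.
roster = pd.DataFrame(
    {
        "teacher_id": range(1, 13),
        "school_type": ["elementary"] * 6 + ["secondary"] * 6,
        "experience": ["0-5", "6-15", "16+"] * 4,
    }
)

# Draw the same number of participants from each school-type stratum so the
# sample reflects the demographic spread of the program population.
sample = roster.groupby("school_type").sample(n=3, random_state=42)

print(sample)
```

In the actual study, the strata would follow the demographic characteristics described above (age, gender, teaching experience, and school type) rather than the single stratum used in this sketch.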

Data Collection Tool

Semi-structured interviews were conducted with the selected participants to collect qualitative data on their experiences, perceptions, and practices related to the CRTS program. This data collection method was chosen because it allows for in-depth exploration of teachers' perspectives and insights, which are crucial for understanding the program's effectiveness and impact. Furthermore, interviews provide an opportunity for participants to reflect on their own teaching practices, a key objective of the CRTS program.

Data Analysis Techniques

Thematic analysis was used to analyze the qualitative data gathered from the interviews, focusing on identifying common themes and patterns related to the CRTS program's objectives. Additionally, descriptive statistics were employed to analyze the quantitative data collected in the form of demographic information and self-reported changes in teaching practices. By combining these analysis techniques, a comprehensive understanding of the program's impact on teachers' professional development was achieved.
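The following is a small, hypothetical sketch of how this combination of thematic tallies and descriptive statistics might be organized in Python. The theme labels, participant identifiers, and variables are assumptions made for illustration; they are not the study's actual codebook or dataset.

```python
# Illustrative sketch: theme labels, participant IDs, and variables are
# hypothetical, not the study's actual codebook or data.
from collections import Counter

import pandas as pd

# Coded interview excerpts: each entry pairs a participant with a theme code
# assigned during thematic analysis.
coded_excerpts = [
    ("T01", "increased cultural awareness"),
    ("T01", "changed lesson planning"),
    ("T02", "increased cultural awareness"),
    ("T03", "need for ongoing support"),
    ("T03", "changed lesson planning"),
]

# Frequency of each theme across the interviews.
theme_counts = Counter(theme for _, theme in coded_excerpts)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")

# Descriptive statistics on demographic information and self-reported change
# in teaching practice (hypothetical 1-5 scale).
survey = pd.DataFrame(
    {
        "years_experience": [4, 11, 23],
        "self_reported_change": [4, 5, 3],
    }
)
print(survey.describe())
```

The same pattern scales to the full dataset: theme frequencies summarize the qualitative strand, while the descriptive statistics summarize the quantitative strand, and the two are interpreted together.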

Ethical Considerations

To ensure the ethical conduct of the evaluation, informed consent was obtained from all research participants prior to their involvement in the study. Participants were assured of their confidentiality and anonymity, and that the data collected would be used solely for the purpose of the program evaluation. Additionally, participants were informed of their right to withdraw from the study at any time without any negative consequences.

Reliability and Validity

Several measures were taken to ensure the reliability and validity of the data collected. The interview questions were carefully designed to be clear, unbiased, and relevant to the program's objectives. Moreover, the use of a stratified random sampling method increased the external validity of the findings by ensuring the inclusion of diverse teacher participants. The use of multiple data analysis techniques also contributed to the trustworthiness of the findings, as it allowed for the triangulation of data from various sources and perspectives.

Suitability of Methods

The choice of methods in this evaluation is suitable for the context and goals of the CRTS program, as it allows for a comprehensive understanding of the program's effectiveness and impact on teachers' professional development. By employing a mixed-methods approach with a focus on qualitative data, this evaluation is well-positioned to capture the complex and nuanced experiences of teachers as they engage with the CRTS program and implement culturally responsive teaching practices in their classrooms. This alignment of methods with the objectives of the program and the evaluation ultimately ensures that the findings are relevant, reliable, and useful for informing the ongoing development and support of the CRTS program.

This work is marked with CC0 1.0 Universal