Phase 4: Analyze & Revise

This fourth phase of the TLA Framework focuses on analyzing and reviewing data on student outcomes after the implementation of enhanced teaching and learning strategies within the designated pathway(s). Campuses analyze their disaggregated data, determine where gaps persist in student outcomes, and identify strategies to revise course and program-level practices, in preparation for scaling their efforts.

Analyze/Revise

The twenty pilot community colleges utilized the VALUE Scoring Collaborative, an approach to the direct assessment of student learning that provides institutions with their own raw data, a series of scoring results, and templates showing aggregate results for all learning outcome components. VALUE reports also include data disaggregated by student demographics. These reports allow participants to contextualize the results at their institutions and benchmark them against existing rubric criteria.

Guiding Questions

Consider these guiding questions as you move through the Analyze & Revise phase.

  • What are the key advantages of using the VALUE rubric-based approach for authentic assessment?
  • How does the VALUE assessment process differ from other forms of direct assessment used at your institution?
  • What might participation in the VALUE Scoring Collaborative reveal about direct assessment practices that can be applied across pathways?
  • What are some key observations derived from your analysis of the data and reports produced by the direct assessment of student learning?
  • When coupled with disaggregated data on student outcomes (e.g., course grades, progression, completion), what do the assessment results indicate about persisting equity gaps in student learning?
  • How will you disseminate and discuss assessment results with other faculty, administrators, and leaders on campus?
  • What components of your preliminary action plan might be revised, based on the results from the direct assessment of student learning?
  • What course-level practices, pedagogies, and assignments may need to be revised, based on what you learned from the assessment results?

Analyzing Direct Assessment Results


The process of analyzing your direct assessment results takes time, capacity, and intentionality. It is important to consider the implications that these results may have at the course, program, and institutional levels, and to examine what the results reveal about persisting equity issues for student learning and success. It is also crucial that the assessment results are shared with key stakeholders on your campus through clear, effective methods. To help guide the twenty pilot campuses through the VALUE Scoring Collaborative’s process of analyzing assessment results, AAC&U’s Kate McConnell (Vice President for Curricular and Pedagogical Innovation and Executive Director of VALUE) developed and led a data analysis presentation that includes key questions, considerations, and recommendations for how to situate your direct assessment results within the unique context of your institution.


Campus Spotlights: Analyzing & Sharing Direct Assessment Results

  • When the Middlesex Community College team received their initial VALUE assessment reports, they were not surprised to see that a majority of the student work from their Information Technology (IT) course scored at the level 1 benchmark or level 2 milestone performance levels; because the artifacts were drawn from an introductory, first-semester course, this result was anticipated.

    However, when the data were further disaggregated by race/ethnicity, a different story emerged (a brief, illustrative sketch of this kind of disaggregation appears after these spotlights). Overall, the team found that work from BIPOC students outperformed work from White students in the IT program. At the level 1 benchmark performance level, 42 percent of White students’ work met the criteria, while 38 percent of BIPOC students’ work did. At the level 2 and 3 milestone performance levels, however, 62 percent of BIPOC students’ work met the criteria, compared with 58 percent of White students’ work.

    This revelation caused the Middlesex team to dig deeper into the assessment data and consider an additional perspective: gender. The team found that in the IT program, women of color did exceedingly well on the assignment submitted for scoring. The team will utilize the findings to help recruit more women of color into the IT program at Middlesex, by sharing the narrative and data that demonstrate their success in IT, a field in which women of color are grossly underrepresented.

    As a result of their project, the faculty within the IT pathway thought more intentionally about equity and what it means to be an equity-minded practitioner within their courses, within programs, at the institution, and within the surrounding community.

  • After receiving VALUE reports and scores, the Northeast Wisconsin Technical College (NWTC) project team organized a data debrief session to share the goals, process, and data results with colleagues and important stakeholders on campus.

    The project team’s data debrief session included reflection on the process of their project, guided by the following questions:

    • Where did we make the biggest change?
    • Why does it matter?
    • What did we anecdotally see because of the change?

    The team discussed what was learned after submitting student work to the VALUE Scoring Collaborative, and how assignments were intentionally redesigned and revised to consider the lessons learned from the VALUE results. For this team, these changes included:

    • Linking a service-learning assignment to the philosophy of nursing assignment
    • Recording the updated service-learning requirements via Mediasite
    • Updating the philosophy assignment and course rubric
    • Reevaluating the written assignment in alignment with the VALUE rubric, and collecting and scoring artifacts internally using knowledge from the online scorer training

    Finally, the session helped faculty interpret and understand the new results by walking through the VALUE scoring scale and reading through aggregate results for the 100+ artifacts scored using the Written Communication VALUE Rubric. The project team presented next steps, which included:

    • Further data evaluation
    • Separating the two VALUE submissions and comparing the results
    • Evaluating the writing levels of NWTC students compared to those at other institutions
    • Collaborating with colleagues on rubric development

    The session included opportunities for participating stakeholders to engage in breakout groups to discuss next steps and practice. Prompts for breakout groups to consider included:

    • What are three steps you will put into practice?
    • What will you need?
    • Who will you work with?
    • How will you share your learnings with other faculty?
    • As we enhance employability skills and equity, what role would you like to have?
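
The kind of disaggregation described in the Middlesex spotlight can be reproduced with a few lines of analysis once each scored artifact's VALUE score and student demographics sit in a single table. The sketch below, written in Python with pandas, is a minimal illustration under assumed inputs: the file name, column names, and demographic categories are hypothetical, not the actual format of a VALUE report.

```python
# Hypothetical sketch: disaggregate VALUE performance levels by student demographics.
# The file "value_scores.csv" and its columns (score, race_ethnicity, gender) are
# assumptions for illustration, not the real structure of a VALUE report.
import pandas as pd

# Each row is one scored artifact: a VALUE score (1 = benchmark, 2-3 = milestones,
# 4 = capstone) plus the demographics of the student who produced it.
scores = pd.read_csv("value_scores.csv")

# Share of artifacts at each performance level, disaggregated by race/ethnicity.
by_race = (
    scores.groupby("race_ethnicity")["score"]
          .value_counts(normalize=True)
          .mul(100)
          .round(1)
          .rename("percent_of_group")
          .reset_index()
)
print(by_race)

# Intersectional view: race/ethnicity and gender together, the cut that surfaced
# the strong performance of women of color in the Middlesex IT example.
by_race_gender = (
    scores.groupby(["race_ethnicity", "gender"])["score"]
          .value_counts(normalize=True)
          .mul(100)
          .round(1)
          .rename("percent_of_group")
          .reset_index()
)
print(by_race_gender)
```

The same pattern extends to other variables named in the key takeaways, such as Pell eligibility, by adding a column to the groupby.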

Lessons Learned: Aggregate Results


One of the benefits of the VALUE approach to direct assessment of student learning is that it provides aggregate results across all participating institutions, which can then be analyzed for patterns and gaps across student learning outcomes. Aggregate results from the twenty pilot community colleges highlighted several key takeaways about student learning at community colleges, learning within the Guided Pathways model, and faculty teaching practices. VALUE-able Assessment: Pragmatic Lessons Learned through Guided Pathways by Kate McConnell highlights the assessment results of the twenty pilot campuses and the implications/key takeaways from their aggregate reports.

Revising and Enhancing Practices

After reviewing and analyzing direct assessment data, campuses work to revise and improve practices implemented in earlier phases of the TLA Framework. Direct assessment data should reveal where gaps exist in student learning outcomes, and examination of disaggregated data can highlight the equity issues that persist among different student groups. With these considerations in mind, faculty work collaboratively to interrogate and enhance course-, program-, and institution-level practices (e.g., HIPs, program reviews, rubrics, professional development) to address gaps in the quality and equity of student learning.

Campus Spotlights: Revising Institutional Practices

  • After receiving their VALUE Scoring Collaborative results, the Miami Dade College team worked collaboratively to create a flexible, institution-informed model to enhance and revise common practices at the college. To prepare for scaling the work, the project team focused on implementing and revising practices across various institutional areas, including embedded assignments, assessment/evaluation, qualitative inquiry, data dialogues, and decision-making.

    The table below shows some of the practices that were revised and improved, based on the results received from direct assessment of student learning in the School of Justice at Miami Dade College:

    [Table: practices in the Miami Dade College School of Justice that were revised based on direct assessment results]


  • For the Salt Lake Community College (SLCC) team, VALUE Scoring Collaborative results demonstrated that most student work in their Critical Thinking sample received scores across Milestones 2 and 3. However, disaggregated VALUE data revealed persisting gaps in the learning outcome among student groups. After analyzing these disaggregated data, the SLCC team used their VALUE report to highlight the need to close these gaps and revise their existing HIPs, which had been implemented earlier in the project.

    For SLCC, this included enhancing the use of the signature assignment requirement in ePortfolios by making General Education (Gen Ed) and student learning outcomes more transparent and intentional. This practice would include scaffolding the signature assignment earlier and throughout the semester.

    Revisions also included the use of Gen Ed maps to create a rubric for evaluating signature assignments in Gen Ed courses and to assess how they align with specific learning outcomes. By providing structured criteria and focusing on specific designations, the Gen Ed maps will allow SLCC to assess student learning outcomes more effectively and efficiently across pathways and across student demographic groups.

[From our VALUE results] we learned a lot about some of the assignments in our pathway. Now, a lot of those assignments have been redesigned to impact critical thinking throughout our students' first semester.
– Faculty member from pilot campus, 2021

Key Takeaways

    • Be intentional about the process of reviewing your direct assessment results—collaborate with other faculty, staff, leaders, and students to make sense of the data.
    • Start to identify strategies and guiding questions for addressing the results through improving assignments, pedagogies, future assessment practices, etc.
    • Utilize cross-sectional data, intersectional disaggregation, and triangulated data to paint a clear picture of your results. Make sure to pay attention to any data outliers and what they may imply on pedagogical and systemic levels.
    • Consider the implications that direct assessment data have for your courses, programs, and institution as a whole—think about the implications for resources, policies, assignment design, teaching practices, and future assessment.
    • Consider what your assessment results tell you about students who are earlier or later in their college careers. Reflect on what still needs to be learned, where improvement is needed, where students excelled, and where students did not meet expectations.
    • Think about how your results fit into the larger institutional environment for assessment and the implications that this has for accountability and transparency.
    • Compare demographic data from your assessment sample to your institution’s overall demographics to gauge whether your sample is representative (a brief sketch of this check follows this list).
    • Pinpoint any noticeable disparities or patterns across student demographic groups including sex, race/ethnicity, and Pell eligibility.
      • Consider any other evidence on campus that might also point out existing equity gaps among student groups.
    • Reflect on the implications of these equity gaps and how they affect course, program, and institutional teaching practices, learning environments, and culture.
    • Consider the key campus stakeholders that would be most impacted by your assessment data—consider their position on campus, their influence on campus decision-making, and how they may disseminate the data themselves.
    • Be intentional about how you are sharing your assessment data—consider how you will communicate and deliver the results, which results will be most effective to share, and how you will convey next steps for creating change.
    • Consider hosting a data debrief session to break down your assessment results into digestible, understandable information for your intended audience. Show disaggregated data and visuals to help your audience understand any disparities or equity issues that exist.
    • Utilize direct assessment data to interrogate, revise, and scaffold course-, program-, and institution-level practices.
    • Examine disaggregated data by student demographics to better understand where equity issues still exist and how these affect student learning—utilize these data for a deeper inquiry into why and how these inequities in student learning exist and persist. Then, focus on revising and enhancing classroom practices, assignments, HIPs, assessment methods, and professional development/training to address the inequities.
    • Consider creating an informed decision-making framework, situated within the context and culture of your individual institution—involve campus leaders and stakeholders in conversations about direct assessment data and their relevance to student learning in order to inform future decision-making and policies.
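
One way to operationalize the representativeness check noted in the takeaways above is a simple side-by-side comparison of demographic shares in the scored sample and in institution-wide enrollment. The figures, file name, and category labels in this sketch are hypothetical.

```python
# Hypothetical sketch: compare the demographic mix of the scored sample with
# institution-wide enrollment to gauge whether the sample is representative.
import pandas as pd

# Institution-wide shares (hypothetical figures, in percent).
institution = pd.Series({"BIPOC": 48.0, "White": 52.0}, name="institution_pct")

# Demographic mix of the artifacts actually submitted for scoring
# ("value_scores.csv" is an assumed export with one row per scored artifact).
sample = (
    pd.read_csv("value_scores.csv")["race_ethnicity"]
      .value_counts(normalize=True)
      .mul(100)
      .rename("sample_pct")
)

comparison = pd.concat([sample, institution], axis=1)
comparison["gap_pct_points"] = comparison["sample_pct"] - comparison["institution_pct"]
print(comparison.round(1))

# Large gaps suggest the sample over- or under-represents a group, which should
# temper any equity conclusions drawn from the assessment results.
```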