Collection

Designing and Conducting Center Program Assessment

This collection contains resources for educational developers interested in conducting assessment and/or evaluation of their programs.

Updated January 2023
Lindsay Wheeler
Senior Associate Director & Associate Professor
Office of the Executive Vice President and Provost
01

Measuring Faculty Learning: Trends in the Assessment of Faculty Development

Stephen P. Hundley and Susan Kahn

This book chapter provides an overview of the research assessing educational development and provides ideas for future work.

Trends in Assessment

Stephen P. Hundley and Susan Kahn

The chapter contributors challenge faculty developers to move toward evidence-based practice in development interventions and discuss related needs for a theoretical foundation for faculty development activities, development of organizational cultures of learning, and methodologies for assessing learning outcomes of faculty who participate in faculty development.

02

Researching the Impact of Educational Development

To Improve the Academy

This article systematically reviews the literature on educational development interventions and their impact.

In this review of 138 studies on the impact of educational development practices, we discuss the idea of impact and summarize previous studies of this topic. We then present an overview of findings on impact from current studies of typical educational development activities: workshops, formal courses, communities of practice, consultation, mentoring, and awards and grants programs. We conclude that although the studies vary in quality, the sheer volume of results offers guidance for development practice.

03

Program Impact Evaluation Model

International Journal for Academic Development

This article presents a seminal framework for evaluating the impact of educational development programs based on six levels of outcomes.

Impact Evaluation of Educational Development Programmes

International Journal for Academic Development

Although most educational development professionals value the importance of monitoring their programme's impact, systematic evaluation is not common, and often relies on inference measures such as extent of participation and satisfaction. This paper discusses approaches to programme impact evaluation in terms of six possible points of focus: (1) participants' perceptions/satisfaction; (2) participants' beliefs about teaching and learning; (3) participants' teaching performance; (4) students' perceptions of staff's teaching performance; (5) students' learning; and (6) effects on the culture of the institution. Whatever focus is selected, it is important to address the following questions: (1) What is the intended impact? (2) Why evaluate? (3) When to evaluate? (4) Who evaluates? (5) How to evaluate? (6) Is the actual impact the same as the intended impact, and is the actual impact desirable? (7) Who should receive the results of the evaluation? (8) What will happen as a consequence? Based on these two sets of questions, a 6 × 8 matrix is proposed to guide the evaluation of educational development initiatives. It is argued that the approach to impact evaluation needs to be aligned with the focus of the desired change as well as the intervention strategies used to bring about such change.

04

Center Assessment Framework

Journal of Faculty Development

This article presents a framework for organizing and prioritizing center program assessment and attends to the multiple stakeholders of centers.

Assessment is a cyclical process within which educators construct outcomes, implement programs, assess constructs such as learning, evaluate results, and utilize results to craft stronger programs and services. Within educational and faculty development, assessment measures program impact on faculty, students, and/or institutional culture. Additionally, assessment activities support the scholarly dissemination of evidence-based and high-impact practices. Unfortunately, many centers of teaching and learning struggle to implement assessment practices that go beyond satisfaction-based program evaluations. This struggle can be attributable to multiple factors, including weak assessment infrastructure, shortsighted assessment goals, ill-conceived frameworks, and limited resources to do more than collect satisfaction data. We present an assessment framework that addresses these limitations by organizing center assessment efforts around faculty learning outcomes (FLOs). This framework focuses on the impact center programming has on faculty while also providing feedback on program quality and important information about institutional culture. More importantly, the multi-tiered FLO framework allows centers to systematically collect multiple data sources for each FLO. Comprehensive analysis of these FLO-driven data sources gives centers a new and more robust tool for understanding center effectiveness and evidencing the value of faculty development in higher education to diverse stakeholders.

05

Model for Evaluating Centers

To Improve the Academy

This article presents a model for evaluating centers using a four-phase approach that helps address the challenges of assessing center work.

This paper provides a program evaluation model, along with field-testing results, developed in response to the need for a model able to support systematic evaluation of centers for teaching and learning (CTLs). The model builds upon the author’s previous studies investigating the evaluation practices and struggles experienced at 53 CTLs. Findings from these studies attribute evaluation struggles to contextual issues involving evaluation capacity, ill-structured curricula, and ill-conceived evaluation frameworks. This field-tested Four-Phase Program Evaluation Model addresses these issues by approaching evaluation in a comprehensive manner that includes an evaluation capacity analysis, curricular conceptualization, evaluation planning, and plan implementation.

06

Program Assessment Guidelines

Rutgers Office of Teaching Evaluation and Assessment Research

This document provides a detailed description of best practices in program assessment. While tailored to departmental program assessment, the practices translate to educational development and are helpful for those wanting to learn more about program assessment.

Guidelines for Departmental Assessment Plans and Conducting Assessments

Rutgers Office of Teaching Evaluation and Assessment Research

A good departmental learning outcome assessment plan has the following characteristics:

The plan’s goals:

  • Ask important questions about student learning

  • Reflect the institution’s mission and learning goals

  • Support the stated departmental goals and objectives for learning

The plan’s methods:

  • Are collegial and collaborative

  • Are appropriate to the department’s learning outcome goals

The plan’s implementation, or 'closing the loop' procedures:

  • Are directly linked to decision-making about the program's curriculum

  • Lead directly to actions for improvement by the department or program

07

How to Evaluate Your Faculty Development Services

Academic Briefing

Faculty development emerged in the early 1970s, yet until recently there was little interest in evaluating the effectiveness of these efforts.

This brief blog post overviews the difference between assessment and evaluation and provides a description of one approach to evaluating program outcomes.

Faculty developers across the nation are working on developing methods to evaluate their services. In 2010, the 35th annual Professional and Organizational Development (POD) Network Conference identified assessing the impact of faculty development as a key priority. It was this growing demand that spawned my interest in conducting a 2007 statewide and a 2010 nationwide investigation of faculty development evaluation practices in the U.S. This article describes how to develop a customized evaluation plan based on your program’s structure, purpose, and desired results, drawing on contemporary practices discovered through this research.

08

Understanding Impact of Educational Development Interventions

International Journal for Academic Development

This award-winning article describes a robust and extensive assessment of a suite of center programs. It received the International Journal for Academic Development's 2021 Article of the Year award; in the citation below, the judges describe how the study contributes to educational development research and assessment practices.

ABSTRACT

This study explored three US educational development (ED) programs: a weeklong course design institute, a new faculty learning community (NFLC), and a STEM learning community (STEM-LC). We compared observed instruction and student achievement for 239 STEM undergraduate courses taught by instructors who had or had not engaged in ED. Courses taught by NFLC and STEM-LC instructors had significantly more learning-focused syllabi and active learning than courses taught by non-engaged instructors, controlling for class size and type. We conclude that instructors need support in implementing active learning to ensure all students benefit. Additional research is needed to explore ED and active learning.


JUDGES' CITATION

This paper bravely ventures into the difficult territory of seeking quantifiable data on academic development interventions—in other words, the kinds of studies that many university leaders demand to see in order to accept that developers' work is genuinely valuable. The paper has a great deal to offer academic development practice and offers a way forward for practitioners to develop rigorous evaluations of the initiatives we champion, which can be used to justify those initiatives and garner institutional support—especially important given the performance regimes that are increasingly becoming prevalent across the world. Using a range of statistical measures, the authors provide insights that will prove helpful to the academic development community, both in terms of findings and methods, and present a robust, comprehensive, and convincing study of the effects of academic development interventions on instructor practices and student learning. It is particularly pleasing to read that the work we as academic developers do can have positive outcomes for Underrepresented Minority students. It's been hard to measure this, and this article shows how we can; we should all be doing more of this kind of research. The paper offers developers a template for how future studies might be conducted, as well as some of the inherent difficulties involved in quantifying academic development interventions.


Want to recommend a resource to add to this collection? Send us an email.