What is assessment and what is the assessment cycle?
A quick Google search will uncover dozens if not hundreds of common-sense definitions of “assessment.” In this office, we think about assessment as the ongoing process of rigorous self-study that:
- documents good educational practice,
- helps faculty and staff create, revise, or enhance learning opportunities for students,
- informs students’ own understandings of their development,
- enables rich discussions of our mission and values as a learning community, and
- provides evidentiary support for external reports, including accreditation requirements and funding proposals.
The assessment cycle can be illustrated in a variety of ways (see these Google search results), but these visualizations are all based on the same idea: assessment starts with the articulation of objectives, moves through the collection, interpretation, and discussion of evidence, and then uses the findings to make informed plans for future teaching and learning. The process is iterative and introspective.
Writing and Organizing Student Learning Outcomes
Well-conceived and well-worded outcomes are the foundation of an effective assessment plan. They should represent and operationalize the program’s mission in clear, measurable statements of students’ attainment and learning progress.
- Duke University Student Learning Outcomes Synopsis PDF
- University of Wisconsin-Madison
- IUPUI
- Cal Poly
- Checklist for good learning objectives (JMU)
- Adelman, C. (2015, February). To imagine a verb: The language and syntax of learning outcomes statements (Occasional Paper No. 24). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).
A curriculum map or matrix is an illustration of student learning outcomes across a learning experience (whether a topic, a course, or a program of study). The curriculum map can be used to:
- Understand the learning journey across experiences
- Facilitate discussion among stakeholders
- Select and schedule appropriate assessment tasks (a brief example follows this list)
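To make the idea concrete, a curriculum map can be sketched as a simple matrix of courses by outcomes. The Python example below is a minimal sketch using hypothetical course and outcome names and the common Introduced/Reinforced/Mastered ("I"/"R"/"M") coding; it flags outcomes that never reach the "Mastered" level, one way a map helps select and schedule assessment tasks.

```python
# A minimal, hypothetical curriculum map: the course codes and outcome names
# are placeholders, and "I"/"R"/"M" mark where an outcome is Introduced,
# Reinforced, or Mastered.
curriculum_map = {
    "101 Intro":    {"Written communication": "I", "Quantitative reasoning": "I"},
    "250 Methods":  {"Written communication": "R", "Quantitative reasoning": "R",
                     "Disciplinary ethics": "I"},
    "490 Capstone": {"Written communication": "M", "Disciplinary ethics": "R"},
}

def coverage_gaps(cmap, outcomes):
    """List outcomes that never reach the 'Mastered' level anywhere in the map."""
    mastered = {o for levels in cmap.values() for o, lvl in levels.items() if lvl == "M"}
    return [o for o in outcomes if o not in mastered]

program_outcomes = ["Written communication", "Quantitative reasoning", "Disciplinary ethics"]
print(coverage_gaps(curriculum_map, program_outcomes))
# ['Quantitative reasoning', 'Disciplinary ethics']
```

In practice, programs often attach to each cell of such a matrix the specific assignments or assessment tasks that will serve as evidence.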
Resources:
- Summary
- National Institute for Learning Outcomes Assessment
- University of Illinois
- University of Cincinnati
- University of Massachusetts
DESIGNING OR CHOOSING MEASURES OR INSTRUMENTS
It is helpful to understand some of the terms assessment experts use to characterize and evaluate the suitability of an assessment measure.
- Understanding terms: Qualitative and Quantitative approaches, a literature review
- Understanding terms: Formative and Summative (Yale)
- Understanding terms: Direct and Indirect measurement (SMU)
- Understanding terms: High- and low-stakes assessment (JHU)
- Understanding terms: Authentic and performance-based assessment
There are many types of learning measures available to instructors and academic programs. The following sites and readings illustrate and describe the many options for measuring and collecting evidence of student learning; a brief item-analysis sketch follows the list.
- Penn State University
- University of Illinois
- Buffalo State College
- Clauser, J. C., & Hambleton, R. K. (2017). Item analysis for classroom assessments in higher education. In Handbook on measurement, assessment, and evaluation in higher education (pp. 355-369). Routledge.
- D'Sa, J. L., & Visbal-Dionaldo, M. L. (2017). Analysis of multiple choice questions: Item difficulty, discrimination index and distractor efficiency. International Journal of Nursing Education, 9(3). DOI: 10.5958/0974-9357.2017.00079.4
- DeMars, C. (2010). Item Response Theory. New York, NY: Oxford University Press.
- Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309-334.
- Quaigrain, K., & Arhin, A. K. (2017). Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation. Cogent Education, 4(1), 1301013. DOI: 10.1080/2331186X.2017.1301013
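To make the item-analysis readings above more concrete, here is a minimal Python sketch, using made-up 0/1 scored responses, of two classical statistics discussed in those sources: item difficulty (the proportion of examinees who answer an item correctly) and a simple upper-minus-lower discrimination index. It is an illustration only, not a recommended workflow.

```python
# Minimal classical item-analysis sketch using made-up 0/1 scored responses.
# Rows are students, columns are test items; 1 = correct, 0 = incorrect.
scores = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
]

def item_difficulty(rows, item):
    """Proportion of students answering the item correctly (higher = easier)."""
    return sum(row[item] for row in rows) / len(rows)

def discrimination_index(rows, item, fraction=0.27):
    """Difference in item difficulty between the top and bottom groups of
    students ranked by total score (the conventional upper/lower 27% split)."""
    ranked = sorted(rows, key=sum, reverse=True)
    k = max(1, int(len(rows) * fraction))
    upper, lower = ranked[:k], ranked[-k:]
    return item_difficulty(upper, item) - item_difficulty(lower, item)

for i in range(len(scores[0])):
    print(f"Item {i + 1}: difficulty = {item_difficulty(scores, i):.2f}, "
          f"discrimination = {discrimination_index(scores, i):.2f}")
```

The readings above treat these and related statistics (distractor efficiency, reliability, item response theory) in much greater depth.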
Standardized Tests Used in Trinity College through 2023 to study Curriculum 2000
Although exams and essays have traditionally been the most commonly used assessment techniques, there are a variety of other ways to collect useful feedback about teaching and evidence of student learning.
- Summary from Indiana University
MEASURE AND INSTRUMENT TYPES
- Improving test questions
- Selecting and designing instruments (James Madison University)
- Where to look for pre-existing instruments (James Madison University)
- Multiple-choice exams (Yale)
- An annotated bibliography of test development
- Benjamin, R., Miller, M. A., Rhodes, T. L., Banta, T. W., Pike, G. R., & Davies, G. (2012, September). The seven red herrings about standardized assessments in higher education. (Occasional Paper No. 15). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).
- See the peer-reviewed journal Assessing Writing
- Duke University Writing Studio and the Writing in the Disciplines program
- For questions about scaling essays in larger courses, contact Duke Learning Innovation & Lifetime Education for a consultation.
- Toolkits for project-based learning
- Carleton College Service Learning projects
- Evaluating capstone projects
- University of Warwick
- University of Nebraska
- Texas A&M University
- Carleton College
- University of New South Wales
There are many peer-reviewed articles and other publications supporting and critiquing course evaluation processes. The following selections focus on the use of evaluation results to inform teaching practice and program-level assessment.
Many assessment practitioners use the term “closing the loop” to describe the resolution of an assessment process, in which the interpretation of evidence leads to well-informed updates to the curriculum or educational practice. This stage usually involves sharing written reports and/or presentations with specific recommendations for action.
There are a variety of ways assessment findings can be shared with others. The suitability of the medium or venue depends on the confidentiality of the findings and the audience’s need-to-know.
Options for sharing findings include:
- Discussions within department or program meetings
- Essays published in peer-reviewed journals, trade publications, or popular media
- Newsletters within the department or program
- Summaries on the department’s or program’s website
- Student gatherings like the Majors Fair
Discussions among faculty and staff may be the most critical venue, as they invite real-time dialogue and deliberation. Moreover, this is the context in which decisions will be made about the curriculum, courses, pedagogy, and student support services. Acting upon information is an essential stage of good assessment.
Examples of decisions include:
- Changing program requirements
- Updating content of one course to prepare students for a subsequent course
- Offering additional training and support to TAs
- Seeking summer funding for updates to course pedagogy
- Adding an assessment measure to fill gaps in information
The term “closing the loop” is a bit of a misnomer because the assessment cycle never really closes. As we make evidence-guided adjustments to our work, we restart the process with new or revised learning outcomes and updated targets for student learning.
Research Utilization
- Research Utilization: An annotated bibliography
- Cummings, G. G., Estabrooks, C. A., Midodzi, W. K., Wallin, L., & Hayduk, L. (2007). Influence of organizational characteristics and context on research utilization. Nursing Research, 56(4), S24-S39.
- Estabrooks, C. A., Floyd, J. A., Scott-Findlay, S., O'Leary, K. A., & Gushta, M. (2003). Individual determinants of research utilization: A systematic review. Journal of Advanced Nursing, 43(5), 506-520.
- Weiss, C. H. (1979). The many meanings of research utilization. Public Administration Review, 39(5), 426-431.
- Weiss, C. H. (1993). Where politics and evaluation research meet. Evaluation Practice, 14(1), 93-106.
- Examples of effective use of assessment results
- Diery, A., Vogel, F., Knogler, M., & Seidel, T. (2020, June). Evidence-Based Practice in Higher Education: Teacher Educators' Attitudes, Challenges, and Uses. In Frontiers in Education (Vol. 5, p. 62). Frontiers.
- Fulcher, K. H., Smith, K. L., Sanchez, E. R., & Sanders, C. B. (2017). Needle in a Haystack: Finding Learning Improvement in Assessment Reports. Professional File. Article 141, Summer 2017. Association for Institutional Research.
- Huberman, M. (1994). Research utilization: The state of the art. Knowledge and Policy, 7(4), 13-33.
- Jankowski, N. (2021, January). Evidence-based storytelling in assessment. (Occasional Paper No. 50). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.
Assessment in Specific Learning Contexts
- Civic engagement
- Living-learning communities
- Undergraduate research
- Study abroad or away
- Finley, A. (2019, November). A comprehensive approach to assessment of high-impact practices (Occasional Paper No. 41). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).