User-Centered Assessment Design

An Integrated Methodology for Diverse Populations

Madhabi Chatterji

Hardcover
January 17, 2025
ISBN 9781462555482
Price: $90.00
446 Pages
Size: 7.375" x 9.25"
e-book
December 27, 2024
PDF
Price: $90.00
446 Pages
print + e-book
Hardcover + e-Book (PDF)
Price: $108.00 (list price $180.00)
446 Pages
A digital professor copy will be available on VitalSource when published.

I. Foundations

1. Foundational Concepts in Assessment Design

1.1 Chapter Overview

1.2 Assessments: Old and Emerging Traditions, a Starting Definition, and Some Distinctions

1.3 Viewpoints on Assessment, Measurement, Testing, and Evaluation

1.4 Role of Assessment in Scientific, Professional, and Practical Endeavors

1.5 Evaluating the Quality of Assessments and Construct Measures: Validity, Reliability, and Utility

1.6 Integrating Assessment Design, Validation, and Use: A User-Centered Process

1.7 Summary

2. Why Assess?: Measure-Based Inferences, Uses, Users, and Consequences

2.1 Chapter Overview

2.2 Back to the Future: Early Drivers, Milestones, and Consequences of Assessment Use

2.3 Modern Drivers and Consequences of Assessment Uses in Education

2.4 Modern Drivers and Consequences of Assessment Uses in Psychology, Health, Business, and Other Fields

2.5 Applying User-Centered Principles to Improve Practices

2.6 Summary

3. Whom to Assess? and How?: Specifying the Population and the Assessment Operations

3.1 Chapter Overview

3.2 Why Population Characteristics and the Socioecological Contexts of Assessments Matter

3.3 What Is Measurement Bias?: Case Studies and Hypothetical Illustrations

3.4 Selecting Assessment Operations for Diverse Populations and Multidisciplinary Constructs

3.5 Steps and Actions: Specifying Whom to Assess? and How? with the Process Model

3.6 Summary

II. Assessment Design

4. What to Assess?: Specifying the Domains for Constructs

4.1 Chapter Overview

4.2 Domain Sampling and Domain Specification: Functional Theory and Applied Illustrations

4.3 Construct Types, Domain Conceptualizations, and Structures

4.4 Domain Specification as a Part of the Process Model: Steps, Techniques, Guidelines, and Conventions

4.5 Content-Validating Specified Domains

4.6 Summary

5. Designing Assessments with Structured and Constructed-Response Items

5.1 Chapter Overview

5.2 Why the Mechanics of Item Construction Matter

5.3 Cognitive Constructs Measured Best with Structured- or Constructed-Response Items

5.4 Writing Structured-Response Items: Principles, Guidelines, and Applied Examples

5.5 Guidelines for Designing Constructed-Response and Essay Tasks

5.6 Instrument Assembly

5.7 An Application with the Process Model: A Case Study of Cognitively Based Item and Assessment Design to Foster Learning in Long Division

5.8 Summary

6. Designing Behavior-Based, Product-Based, and Portfolio-Based Assessments

6.1 Chapter Overview

6.2 Behavior-, Product-, and Portfolio-Based Assessments: Definitions, Examples, and Origins

6.3 Advantages of the Performance Assessment Format

6.4 Disadvantages of Performance Assessments: Human Vulnerabilities, Errors, and Biases

6.5 Three Case Studies: Applying the Process Model to Design and Validate Performance Assessments

6.6 Summary

7. Designing Survey-Based and Interview-Based Assessment Tools

7.1 Chapter Overview

7.2 Self-Report Instruments: Their Defining Properties and Common Applications

7.3 Historical Origins of Questionnaires and Attitude Surveys

7.4 Measurement Issues with the Self-Report Modality

7.5 General Design Guidelines for Self-Report Instruments

7.6 Ten More Guidelines for Writing Closed-Ended Survey Items

7.7 A Case Study: Applying the Process Model to Design Two Complementary Self-Report Tools

7.8 Summary

III. Validation and Use of Assessments

8. Analyzing Data from Assessments: A Statistics Refresher

8.1 Chapter Overview

8.2 Preparing for Data Analysis

8.3 Organizing the Data

8.4 Measures of Central Tendency

8.5 Measures of Variability

8.6 Graphical Displays of Data

8.7 The Standard Normal Distribution and Its Applications

8.8 Correlation Coefficients and Their Applications

8.9 Related Statistical Techniques

8.10 Summary

9. Improving the Inferential Utility of Assessment Results: Methods and Limitations

9.1 Chapter Overview

9.2 Frames of Reference and Derived Scores

9.3 Using Norms as the Frame of Reference

9.4 Using Criterion Scores or Standards as the Frame of Reference

9.5 Using Self as the Frame of Reference

9.6 Composite Scores

9.7 Grouped Scores, Equated Scales, and Linked Tests

9.8 Summary

10. A Unified Approach to Construct Validity and Validation: Theory to Evidence

10.1 Chapter Overview

10.2 Construct Validity: An Evolving Concept

10.3 Theoretical Foundations of the Unitarian View of Validation

10.4 Main Clusters and Types of Validity Evidence

10.5 Random Errors of Measurement and Types of Reliability Evidence

10.6 Utility of Measures, Assessments, and Assessment Systems

10.7 Unified Validation Plans

10.8 Summary

11. Empirical Methods of Validation

11.1 Chapter Overview

11.2 Planning Empirical Validation Studies

11.3 Evaluating Item Performance

11.4 Examining Fairness and Measurement Bias

11.5 Gathering Evidence of Content-Based Validity

11.6 Validating Response Processes: The Cognitive Interview

11.7 Gathering Correlational Evidence of Validity

11.8 Empirical Estimation of Reliability

11.9 Methods to Examine Utility

11.10 Evaluating the Evidence: The PSQI Case Revisited

11.11 Summary

12. User-Centered Assessment Design: Revisiting the Principles, Comparisons, and Conclusions

12.1 Chapter Overview

12.2 Applying the Principles Undergirding the Process Model: A Summary by Section

12.3 A User-Centered Design Process: Comparing the Old with the New

12.4 Extended Applications of the Process Model

12.5 The Process Model Compared to Existing Models of Assessment Design

12.6 Connecting the Process Model with the 2014 Standards

12.7 Summary

Glossary

References

Author Index

Subject Index

About the Author