Revisiting the design of selection systems (Sackett, 2023)


Reference Title: Revisiting the design of selection systems (Sackett, 2023)
Author: Sackett, Zhang, Berry, and Lievens
Publication Type: Journal Article
Publication Date: 2023
Course Level: Advanced
Credits: 2
Price: $20.00
About This Course: This article attempts to draw out the applied implications of Sackett et al. (2022).
  1. Overview of this CE Home-Study Program
  2. Information About the Course
    a. Educational Objectives
    b. Target Audience
    c. Schedule
    d. Cost and Refund/Cancellation Policy
    e. Author Credentials
    f. Number of CE Credits Offered
    g. Location and Format
    h. Detailed Description of Program Material
  3. Conflict of Interest Statement

1. Overview of this CE Home-Study Program

This home-study course entails the independent study of "Revisiting the design of selection systems in light of new findings regarding the validity of widely used predictors" (Sackett et al., 2023), followed by the completion of a multiple-choice test online. Participants who receive a passing grade of 75% or higher on the test will receive 2 CE credits. In accordance with guidance from the APA CE office, the number of attempts is limited to two. Participants who fail the test may retake it once at no additional charge and will receive CE credit if they pass.

A copy of the reading for this course is available for free download at the Article web site.

More detailed information on the content of this article is given in section 2.h below.

APR Testing Services is approved by the American Psychological Association to sponsor continuing education for psychologists. APR Testing Services maintains responsibility for this program and its content.

2.a Educational Objectives

Upon completion of this home study program, the participant will be able to:

  1. Explain the criticisms Sackett et al. (2023) make of prior approaches to applying corrections when estimating operational validity.
  2. Explain the recommendations Sackett et al. (2023) make regarding how to go about correcting validity estimates for unreliability in the criterion and for range restriction.
  3. Describe the relative contribution of various predictors based on old and new validity values.

2.b Target Audience

This CE program is intended for psychologists who hold a doctoral degree. This course may also be taken by other interested professionals (consultants, executives, upper-level managers).

2.c Schedule

Access to program registration and the post-test is available 24 hours a day, seven days a week.

2.d Cost and Refund/Cancellation Policy

The fee for this home-study program is $20, which is $10 per CE credit. The fee is fully refundable for 60 days or until the post-test is taken, whichever comes first.


2.e Author Credentials

The first author of the journal article you will read for this home-study course is Dr. Paul Sackett. Dr. Sackett is a professor of psychology at the University of Minnesota in Minneapolis, MN, and a past president of the Society for Industrial and Organizational Psychology (SIOP).

2.f Number of CE Credits Offered

Participants who complete this course by taking and passing the multiple-choice test will receive 2 CE credits.

2.g Location and Format

This activity requires independent home-based study of "Revisiting the design of selection systems in light of new findings regarding the validity of widely used predictors" (Sackett et al., 2023). Following completion of the reading material, participants complete an Internet-based multiple-choice post-test on the content of the material.

2.h Detailed Description of Program Material

Publication citation:

Sackett, Zhang, Berry, & Lievens (2023). Revisiting the design of selection systems in light of new findings regarding the validity of widely used predictors. Industrial and Organizational Psychology, 16, 283-300.

Article Abstract:

Sackett et al. (2022) identified previously unnoticed flaws in the way range restriction corrections have been applied in prior meta-analyses of personnel selection tools. They offered revised estimates of operational validity, which are often quite different from the prior estimates. The present paper attempts to draw out the applied implications of that work. We aim to (a) present a conceptual overview of the critique of prior approaches to correction, (b) outline the implications of this new perspective for the relative validity of different predictors and for the tradeoff between validity and diversity in selection system design, (c) highlight the need to attend to variability in meta-analytic validity estimates, rather than just the mean, (d) summarize reactions encountered to date to Sackett et al., and (e) offer a series of recommendations regarding how to go about correcting validity estimates for unreliability in the criterion and for range restriction in applied work.

3. Conflict of Interest Statement

APR Testing Services (APR) has no known conflict of interest with respect to this CE program. APR has not received any commercial support for this CE program.