DOI

10.17077/etd.obgaaqk2

Document Type

Dissertation

Date of Degree

Summer 2018

Access Restrictions

Access restricted until 08/31/2020

Degree Name

PhD (Doctor of Philosophy)

Degree In

Psychological and Quantitative Foundations

First Advisor

Altmaier, Elizabeth M.

Second Advisor

Moser, David J.

First Committee Member

Whiteside, Douglas M.

Second Committee Member

Welch, Catherine J.

Third Committee Member

Kivlighan, D. Martin

Fourth Committee Member

Ehly, Stewart

Abstract

The Personality Assessment Inventory (PAI) is a commonly used instrument in neuropsychological assessment; however, it lacks a symptom validity test (SVT) that is sensitive to cognitive response bias (also referred to as non-credible responding), as defined by performance on cognitive performance validity tests (PVTs). Therefore, the purpose of the present study was to derive from the PAI item pool a new SVT, the Cognitive Response Bias Scale (CRBS), that is sensitive to non-credible responding, and to provide initial validation evidence supporting its use in a clinical setting. The current study used an existing neuropsychological outpatient clinical database of 306 consecutive participants who completed the PAI and PVTs and met inclusion criteria. The CRBS was empirically derived from this database primarily within an Item Response Theory (IRT) framework.
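To make the IRT framework concrete: the graded response model used in the study gives, for each polytomous item, the probability of responding in each ordered category as a difference of cumulative logistic curves. The sketch below is illustrative only; the discrimination (a) and threshold (b) parameter values are hypothetical, not estimates from the dissertation's data.

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Graded response model: P(response = k | theta) for ordered categories.

    theta: latent trait level; a: item discrimination;
    thresholds: ordered category thresholds b_1 < ... < b_{K-1}.
    """
    def p_at_least(b):
        # Cumulative probability of responding in category k or higher.
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    # Boundary curves: P(X >= 0) = 1 and P(X >= K) = 0.
    cum = [1.0] + [p_at_least(b) for b in thresholds] + [0.0]
    # Category probability = difference of adjacent cumulative curves.
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

# Hypothetical 4-category item evaluated at a moderate trait level.
probs = grm_category_probs(theta=0.5, a=1.2, thresholds=[-1.0, 0.0, 1.5])
assert abs(sum(probs) - 1.0) < 1e-9  # probabilities span the categories
```

Because the thresholds are ordered, the category probabilities are guaranteed to be non-negative and sum to one.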

Of the 40 items initially examined, 10 were ultimately retained based on their empirical properties to form the CRBS. An examination of the internal structure of the CRBS indicated that eight of its items demonstrated good fit to the graded response IRT model. Overall scale reliability was good (Cronbach's alpha = 0.77) and commensurate with other SVTs. Examination of item content revealed that the CRBS consisted of items related to somatic complaints, psychological distress, and denial of fault. Items endorsed by participants exhibiting lower levels of non-credible responding consisted of vague and non-specific complaints, while participants with high levels of non-credible responding endorsed items indicating ongoing active pain and distress.
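The reliability statistic reported above, Cronbach's alpha, can be computed directly from an items-by-respondents score matrix. A minimal sketch follows; the response data are invented for illustration and do not come from the study.

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for a list of per-respondent item-score lists."""
    k = len(responses[0])                       # number of items
    items = list(zip(*responses))               # transpose: item -> scores
    item_vars = sum(pvariance(scores) for scores in items)
    total_var = pvariance([sum(r) for r in responses])
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 4-point (0-3) responses: five respondents, four items.
data = [
    [0, 1, 0, 1],
    [1, 1, 2, 1],
    [2, 2, 2, 3],
    [3, 2, 3, 3],
    [1, 0, 1, 1],
]
print(round(cronbach_alpha(data), 3))  # → 0.933
```

Alpha rises when items covary strongly relative to their individual variances, which is why it is read as an index of internal consistency.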

The CRBS displayed expected relationships with other measures, including high positive correlations with negative impression management (r = 0.73), depression (r = 0.78), anxiety (r = 0.78), and schizophrenia (r = 0.71). Moderate negative correlations were observed with positive impression management (r = -0.31) and treatment rejection (r = -0.42). Two hierarchical logistic regression models showed that the CRBS has significant predictive power above and beyond existing PAI SVTs and clinical scales in predicting PVT failure. The overall classification accuracy of the CRBS in detecting failure on multiple PVTs was comparable to other SVTs (area under the curve = 0.72), and it displayed moderate sensitivity (0.31) when specificity was high (0.96). These operating characteristics suggest that the CRBS is effective at ruling in the possibility of non-credible responding but not at ruling it out. The conservative recommended cut score was robust to effects of differential prediction due to gender and education. Given the extremely small sample subsets of forensic-only and non-Caucasian participants, future validation is required to establish reliable cut-offs when inferences based on comparisons to similar populations are desired.
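The sensitivity and specificity figures above follow from applying a cut score to scale scores and comparing the resulting flags against PVT failure. A minimal sketch of that computation, with entirely hypothetical scores, criterion labels, and cut score:

```python
def operating_characteristics(scores, pvt_fail, cut):
    """Sensitivity/specificity of flagging scores >= cut as non-credible.

    scores: scale scores; pvt_fail: True where the criterion (PVT failure)
    is present; cut: threshold at or above which a case is flagged.
    """
    tp = sum(s >= cut and f for s, f in zip(scores, pvt_fail))       # hits
    fn = sum(s < cut and f for s, f in zip(scores, pvt_fail))        # misses
    tn = sum(s < cut and not f for s, f in zip(scores, pvt_fail))    # correct rejections
    fp = sum(s >= cut and not f for s, f in zip(scores, pvt_fail))   # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical data: eight cases, cut score of 10.
scores = [4, 9, 12, 3, 11, 7, 2, 10]
pvt_fail = [False, True, True, False, False, True, False, True]
sens, spec = operating_characteristics(scores, pvt_fail, cut=10)
print(sens, spec)  # → 0.5 0.75
```

Raising the cut trades sensitivity for specificity, which is why a conservative (high-specificity) cut, like the one recommended for the CRBS, supports ruling in rather than ruling out non-credible responding.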

Results of the current study indicate that the CRBS has psychometric properties and clinical utility comparable to analogous SVTs in personality inventories similar to the PAI. Furthermore, the item content of the CRBS is consistent with and corroborates existing theory on non-credible responding and cognitive response bias. This study also demonstrated that a graded response IRT model can be useful in deriving and validating SVTs in the PAI, and that the graded response model provides novel insight into the nature of non-credible responding.

Keywords

Measure Development, Performance Validity Testing, Personality Assessment Inventory, Response Bias, Symptom Validity Testing

Pages

ix, 171 pages

Bibliography

Includes bibliographical references (pages 151-171).

Copyright

Copyright © 2018 Owen J. Gaasedelen

