Document Type

Dissertation

Date of Degree

Summer 2017

Degree Name

PhD (Doctor of Philosophy)

Degree In

Psychological and Quantitative Foundations

First Advisor

Matthew O'Brien

Second Advisor

Stewart Ehly

Abstract

Function-based experimental evaluation in the development of interventions and treatment plans continues to be under-utilized in school settings (Hanley, Iwata, & McCord, 2003), despite federal mandates to use functional behavioral assessments and positive behavioral interventions with students with disabilities (IDEIA, 2004). Gann, Ferro, Umbreit, and Liaupsin (2014) found that teachers prefer function-based interventions grounded in applied behavior analytic principles, when implemented effectively and with fidelity, over traditional classroom practices. As such, data collection and data analysis are critical to promoting valid, reliable, and socially acceptable behavior change. Some researchers have advocated for advanced technological or computerized tools to streamline and systematize data collection for more widespread use (Vollmer, Sloman, & Pipkin, 2008; Kahng & Iwata, 1998), but there is little in the literature to suggest which data collection modalities and methods might be most beneficial for school professionals who work with students who exhibit challenging behaviors. A few studies have shown positive effects of using electronic technology to collect behavioral data: researchers collecting direct observation data (Graylee et al., 2006) and measures of social skills development (Sarkar et al., 2006), as well as teachers implementing discrete trial training (Tarbox et al., 2010). Hunter (2003) found that teachers were more likely to use interventions that they liked and that were subjectively time-efficient; convenience and experience factors often played a role in resistance to implementing evidence-based interventions.

The purpose of Study One was to better understand the current data collection practices and needs of school professionals who frequently treat students with challenging behaviors in the classroom. The purpose of Study Two was to compare the efficiency of graphing with an iPad application, Catalyst®, against paper-and-pencil data collection among behavioral specialists (school professionals and behavioral clinicians), and to assess which preferences emerged after practice with both tools.

Study One’s results suggested interest in data collection tools that are time-efficient and that provide automated graphical or summarized data. Study Two’s findings suggested minimal difference in session-by-session graphing latencies between modalities, but school professionals exhibited a preference for the iPad application. School professionals’ self-reported acceptability ratings for each modality also showed initial corroboration with self-reported workload ratings associated with technology skills. Implications for data collection modality preference and use in school settings, as well as future directions for research, are discussed.

Keywords

Applied Behavior Analysis, Challenging Behavior, Data Collection, Function-Based Behavior Intervention, School Professional, Technology

Pages

xi, 109 pages

Bibliography

Includes bibliographical references (pages 101-109).

Copyright

Copyright © 2017 Jennifer Kathleen Andersen
