
    A comparison of paper-based data submission to remote data capture for minimizing data entry errors in cancer clinical research

    Author
    Meadows, Beverly Jane
    Advisor
    Mills, Mary Etta C.
    Date
    2006
    Type
    dissertation
    
    Abstract
    Background. Patient data are essential for judging the safety and efficacy of cancer clinical trials. The current process of paper-based data entry provides opportunities for incurring data discrepancies. Automated systems have shown potential to reduce the number of data entry errors and preserve the quality of clinical trial data. To test this potential, this study examined case report forms (CRFs) for differences in the proportion of discrepancies, and in the time to resolve those discrepancies, between paper-based data entry and Oracle® Clinical (OC) Remote Data Capture (RDC).

    Objective. The purpose of this study was to examine differences in the proportion of errors, and in the time to resolve specific errors, between a paper-based CRF and an electronic RDC format. Reason's conceptual framework of an error detection and recovery feedback loop guided this research, with the warning environmental cueing function providing feedback to the end user.

    Results. The sample consisted of 445 RDC and 445 paper-based CRFs submitted to the Cancer Trials Support Unit (CTSU) from March 12, 2004 through March 28, 2005. There was a significant reduction in the proportion of overall data discrepancies for RDC as compared to paper-based CRFs (46.5% vs. 31.7%, p < .001). Similar results were found for univariate (58.6% vs. 41.2%, p < .001) and multivariate (64% vs. 36%, p < .001) discrepancies. Of the 890 CRFs analyzed for this study, 509 (57.2%) had no discrepancies. For the 381 (42.8%) forms with discrepancies, there was no difference in the mean number of days to resolve discrepancies between RDC and paper-based forms (43 vs. 35). However, RDC had a greater proportion of resolved discrepancies (52% vs. 48%, p < .001).

    Conclusion. The results from this study supported Reason's concept of error detection and recovery. The RDC data entry format decreased overall, univariate, and multivariate data discrepancies for patient information collected on a colon cancer study; however, there was no difference in the timeline for discrepancy resolution between the two formats. Further studies are recommended to test alternate definitions of discrepancy-resolution time points. Results from this study can be generalized only to automated systems that use Oracle® Clinical with the instance configuration specific to the programmed edit checks used for the colon cancer study.
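    The p-values above compare discrepancy proportions between two equal-sized samples of 445 CRFs; a standard way to make such a comparison is a two-proportion z-test. The sketch below illustrates that calculation for the overall discrepancy figures. Note the counts (207 and 141) are back-calculated approximations from the reported percentages, not taken from the dissertation, and the dissertation does not state which test it used.

    ```python
    import math

    def two_proportion_z_test(x1, n1, x2, n2):
        """Two-sided z-test for the difference between two independent proportions."""
        p1, p2 = x1 / n1, x2 / n2
        pooled = (x1 + x2) / (n1 + n2)              # pooled proportion under H0
        se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail area
        return z, p_value

    # Hypothetical counts: ~46.5% of 445 forms vs. ~31.7% of 445 forms
    z, p = two_proportion_z_test(207, 445, 141, 445)
    print(f"z = {z:.2f}, p = {p:.2g}")  # p falls well below the .001 threshold
    ```

    With samples this size, a difference of roughly 15 percentage points yields a z-statistic near 4.5, consistent with the reported p < .001.
    
    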
    Description
    University of Maryland, Baltimore. Nursing. Ph.D. 2006
    Keyword
    Health Sciences, Nursing
    Information Science
    Health Sciences, Health Care Management
    remote data capture
    Electronic data processing--Data entry
    Data Accuracy
    Identifier to cite or link to this item
    http://hdl.handle.net/10713/1069
    Collections
    Theses and Dissertations All Schools
    Theses and Dissertations School of Nursing
