JRSA's IBR Resource Center
Overview of Data Quality Assessment Tools in Connecticut

The tools described in this section are in use in Connecticut today. These procedures have evolved through serendipity, deliberate searching, and borrowing from New York State. NIBRS, by design, sets higher standards for both the quality and the quantity of crime data. Those expectations drove the development of the Data Quality (DQ) assessment tools described here.

The sources of NIBRS data are wider and deeper than UCR's more selective and limited sources, which gives NIBRS greater credibility from the outset. The extensive FBI edits in NIBRS records management systems do help ensure a higher-quality outcome for crime data than UCR can provide. Yet many NIBRS users act as if the computerized incident-based reporting process eliminates the need for diligent oversight of data quality. While there is a basis for higher confidence in the NIBRS numbers, this is no reason to abandon vigilance and assume that perfection has been achieved with NIBRS. Every assessment tool in this section exists because errors were found in its domain, even though the data had already passed conventional FBI edits at a low error rate (under 2%).
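The conventional edits referred to above amount to field-level validation of each incident record, with the error rate measured as the share of records failing at least one check. The sketch below illustrates the idea in Python; the field names, the offense-code table, and the checks themselves are invented for the example and do not reflect the actual FBI edit specifications, which are far more extensive.

```python
# Illustrative (made-up) code table; real NIBRS uses FBI-defined offense codes.
VALID_OFFENSE_CODES = {"13A", "13B", "23F", "240"}

def edit_check(record):
    """Return a list of error messages for one (hypothetical) incident record."""
    errors = []
    if not record.get("incident_number"):
        errors.append("missing incident number")
    if record.get("offense_code") not in VALID_OFFENSE_CODES:
        errors.append("unknown offense code")
    age = record.get("victim_age")
    if age is not None and not (0 <= age <= 130):
        errors.append("victim age out of range")
    return errors

def error_rate(records):
    """Fraction of records failing at least one edit check."""
    failed = sum(1 for r in records if edit_check(r))
    return failed / len(records) if records else 0.0

# Four fabricated records: two pass, one has a bad offense code,
# one has an impossible victim age.
records = [
    {"incident_number": "CT-0001", "offense_code": "13A", "victim_age": 34},
    {"incident_number": "CT-0002", "offense_code": "ZZZ", "victim_age": 28},
    {"incident_number": "CT-0003", "offense_code": "240", "victim_age": 210},
    {"incident_number": "CT-0004", "offense_code": "23F", "victim_age": None},
]
print(f"edit-check error rate: {error_rate(records):.0%}")  # 2 of 4 fail: 50%
```

A real repository would run hundreds of such checks, including cross-field and cross-segment consistency rules, which is precisely why a sub-2% error rate under conventional edits can still conceal the domain-specific problems this section's tools are designed to catch.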

These procedures analyze systems and data from beginning to end, from the vendor software to the agency input to the output at the state repository, because we can and have missed problems at every point. When we assume that the vendor is doing it right, or that the agency is doing it right, or that the edits in the repository software are catching everything, then we find ourselves in difficulty. This "passing the buck" approach to data quality oversight results in all parties seeing only what they want to perceive; as a result, no one views the data in its full complexity. The conventional edits are necessary, but not sufficient, for weeding out all of the potential DQ problems.

Who is Responsible for Data Quality?

Vendors of records management systems with NIBRS capabilities want to sell their systems. Agencies want to get the data they need to do their jobs and pass the NIBRS data on to the state. The states want to gather the NIBRS data and pass it along to the FBI without undue difficulty. All parties involved have understandable and well-intended roles in the NIBRS process. But if the states do not take the initiative to assure the quality of the data at each step along the way, we cannot credibly claim the high ground for NIBRS data. If state agencies start producing reports of value to our contributors, those contributors will have a greater stake in high-quality data and will actively assist us in this aim. Vigilance and due diligence are necessary in this quest for quality NIBRS data, data which can then earn our highest confidence.

Acknowledgements to those diligent officers throughout Connecticut who discovered some of these DQ problems and assisted us in resolving them, and to Chris Mertens, Research Assistant in the Crimes Analysis Unit, for smart, diligent, and effective work in pursuit of better data quality.

Download Connecticut's Certification Status Checklist (.doc)
Download Connecticut's IBR Data Quality Update Table (.doc)