On January 8 and 9, 2013, the National Criminal Justice Association (NCJA), Bureau of Justice Assistance (BJA), and Justice Research and Statistics Association (JRSA) partnered to convene the Executive Session on Evidence-Based Policy and Practice. The meeting drew almost 100 attendees, including State Administering Agency (SAA) and Statistical Analysis Center (SAC) directors, and federal officials from the Office of Justice Programs and BJA. The agenda evolved from training and technical assistance provided by BJA, NCJA, and JRSA following an initial focus group of SAAs on evidence-based programs in January 2009; a series of regional conferences in 2010; delivery of technical assistance to more than 20 states; and surveys of SAAs and SAC directors concerning evidence-based programs. The agenda also highlighted BJA's priority of embedding evidence-based practices (EBPs) in all of its grant programs, and JRSA's focus on using data to support EBPs and data-driven decision making. Finally, the meeting provided a chance to enhance the dialogue between the SAAs and SACs on the use of EBPs.
After opening remarks from BJA Director Denise O'Donnell and NCJA Vice President Jeanne Smith, as well as introductions by NCJA Executive Director Cabell Cropper, the meeting began with a session on Challenges in Promoting Use of Data and Research in Criminal Justice Programs, in which Dr. Phelan Wyrick, Senior Advisor at the Office of Justice Programs (OJP), discussed the evidence-based movement. One of the central challenges to using evidence-based practices, he said, is defining what "evidence-based" means: the term must be expressible in everyday language, and it must mean the same thing to the researcher and the practitioner. He emphasized the importance of bringing social science into the picture because it provides a level of objectivity that is influential to policy makers and executives. Although EBPs are still an innovation, they have become more accessible and have helped to discredit some popular but ineffective programs. He concluded by discussing OJP's Evidence Integration Initiative (E2I), launched in 2009 with the goal of integrating evidence into justice programs, practices, and policy decisions. Two byproducts of this effort are the OJP Diagnostic Center and CrimeSolutions.gov.
The next session was on Using Data to Inform Evidence-Based Decision Making: The New York Experience. Presented by New York SAC Director Terry Salo, this panel focused on Operation Impact, a program of the New York State Division of Criminal Justice Services (DCJS) that supports strategic crime-fighting and violence reduction initiatives in the 17 counties outside New York City that account for 80% of the crime upstate and on Long Island. This initiative provides participating law enforcement agencies with the information, tools, and resources necessary to implement a data-driven approach to policing. Ms. Salo discussed the data collection and analysis requirements the Impact counties must meet, as well as the data reporting requirements, including monthly shooting incident and victim reports, which help DCJS track trends. She also reported on the Results First cost-benefit project, which uses cost-benefit analysis to estimate the impact of criminal justice programs or sentencing changes on public safety. New York is among the states using this consistent, formal cost-benefit methodology to predict which programs will achieve the best results at the lowest cost. She cautioned, though, that a technical cost-benefit model is meaningless without targeting (determining exactly what type of offender a program works best for); quality assurance; and evaluation or annual outcome studies of recidivism or other expected program results.
Stan Orchowsky, JRSA's Research Director, gave a short presentation on The Use of Research and Evidence-Based Practices by State Administering Agencies. He reported that, according to a 2011 JRSA survey of 43 SAA directors or their designees, 20% of the SAAs considered EBPs to be required by their agencies. Survey results indicated that a strong working relationship between the SAA and the SAC was important for implementing EBPs, and that it is important to designate funds for program evaluation. Barriers to the implementation of EBPs included budget cuts, a lack of free or easily accessible technical assistance for local subgrantees, a lack of stakeholder buy-in, and a lack of resources to ensure fidelity to the program model. Some of the presentations that followed addressed these four challenges.
Pennsylvania SAC Director Lee Ann Labecki discussed the work of the SAC in Setting a Direction for Evidence-Based Practice. Ms. Labecki said that, much like New York, Pennsylvania has a large number of law enforcement agencies; the question is how to get them to use data. One solution they devised was to create an EBP Toolbox with four components: 1) They used grant funds from the Bureau of Justice Statistics over the last three years to reconfigure their website and provide a data clearinghouse; 2) the offender population projections process was moved from the Department of Corrections back to the SAC and standardized, resulting in a more accurate forecast; 3) for counties that lack the capability to do sophisticated crime analysis, they created a Digital Dashboard that provides online access to analysis tools; and 4) they transformed the Electronic Grants Management System into a proactive performance measurement tool. Ms. Labecki noted that while she has a strong working relationship with the SAA and there is a firm commitment to EBPs, there are still challenges in the form of budget cuts, which translate into a lack of resources to implement effective evaluation and meet fidelity requirements, or to fully implement the research agenda.
SAC Director Phil Stevenson's presentation, The Role of Statistical Analysis Centers in Supporting State Administering Agencies' Implementation of Evidence-Based Practice and Policy, focused on how SACs can turn challenges into opportunities.
The next presentation was about the Ohio Consortium of Crime Science (OCCS). Kharlton Moore, Executive Director of the Ohio Office of Criminal Justice Services (OCJS), and SAC Director Lisa Shoaf discussed the establishment of an academic consortium by OCJS to provide evidence-based solutions to the real-world problems faced by local criminal justice agencies. The first of its kind in the nation, the OCCS will bring together social science researchers under one resource to conduct research, disseminate knowledge, and foster relationships among practitioners, policy makers, and academics. The benefit of having a science consortium is that small law enforcement agencies will have access to the same crime analysis, research, and evaluation resources and opportunities as larger agencies. The Consortium will provide training, technical assistance, evaluation, and consulting services directly to local criminal justice agencies that seek assistance in solving crime and criminal justice problems, and create academic-practitioner partnerships. (More information on the establishment of the OCCS is in the JRSA Forum, Vol. 30, No. 3.) The Consortium also will serve as a model for other states, help bridge the gap between academics and practitioners, promote evidence-based practices, help match expertise with need, and help solve local crime problems.
While the previous presentations focused on SAC efforts to build capacity within the state for evidence-based practice, Norb Federspiel, Executive Director of the West Virginia Division of Justice and Community Services (DJCS), discussed his efforts to establish a SAC with the capacity to do research and evaluation within the agency. He knew he had to hire a director who encouraged EBPs and could build the capacity of the agency from within to be able to do the strategic planning that the Byrne Grant program mandates. As part of this effort, DJCS recently designed a model program that assigns a SAC staff member to each grant program funded by the SAA to ensure performance measurement. To support this, the SAC held a meeting for grantees in the fall of 2012 with workshops on performance measurement, sustainability, and accountability. To maintain this model, Mr. Federspiel said he will need the political support of the governor and the Secretary of the Department of Public Safety.
West Virginia SAC Director Stephen Haas continued the presentation by explaining how he changed DJCS from a granting agency to a true planning agency.
After a day of discussions about different states' evidence-based programs and practices, Roger Przybylski, Founder and Consultant of RKC Group, gave a presentation on planning and implementation issues that must be considered when using the evidence-based approach. First he recommended two publications: 1) Effective Recidivism Reduction and Risk-Focused Prevention Programs: A Compendium of Evidence-Based Options for Preventing New and Persistent Criminal Behavior, and 2) Justice Research and Policy, Volume 14, No. 1, 2012, Special Issue on Evidence-Based Policy and Practice. He continued by saying that moving programs toward EBP requires planning, which is more important today than ever because crime control policy is spread across multiple agencies, causing fragmentation. SAAs are uniquely positioned to carry out strategic planning, and SACs are positioned to provide the necessary data. He explained three approaches to becoming more evidence-based: implement "certified" (brand-name) model programs; use generic interventions (e.g., drug courts, mentoring programs); or use practice guidelines derived from science.
Mr. Przybylski noted that implementation is a process, not a single event, and discussed four steps of the implementation process: Exploration (ends with a decision on whether or not to move forward), Installation (training, acquisition of equipment), Initial Implementation (initial change in practice; forces at play may include resistance or push-back), and Full Implementation (takes two to four years to achieve; occurs when a high number of practitioners proceed with a high degree of program fidelity). Finally, he said implementation must be supported at higher levels of leadership to overcome barriers, and programs that don't work must be eliminated.
The final session of the meeting included presentations on cost-benefit analysis from Tina Chiu, Director of Technical Assistance at the Vera Institute, and John Roman, Senior Fellow at the Urban Institute. Ms. Chiu discussed the National Cost-Benefit Knowledge Bank for Criminal Justice, a resource administered by Vera and funded by the Bureau of Justice Assistance to broaden and deepen the understanding and use of cost-benefit analysis in criminal justice. Mr. Roman defined cost-benefit analysis as an empirical approach designed to measure the economic impact of government programs, and discussed different ways of analyzing cost. The meeting ended with a discussion among attendees of the challenge of embedding EBP in grant-funded programs.
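To make the idea concrete, the arithmetic behind a basic benefit-cost comparison can be sketched as follows. This is an illustrative calculation only, not the methodology of Results First, Vera, or the Urban Institute: all figures (program cost, participant count, recidivism rates, and cost per avoided reoffense) are hypothetical, and real analyses rest on validated effect sizes and per-crime cost estimates.

```python
# Hypothetical inputs for a notional recidivism-reduction program.
program_cost = 500_000        # annual program cost (assumed)
participants = 200            # offenders served per year (assumed)
baseline_recidivism = 0.45    # expected reoffense rate without the program (assumed)
treated_recidivism = 0.35     # expected reoffense rate with the program (assumed)
cost_per_reoffense = 40_000   # avoided victim and system costs per reoffense (assumed)

# Estimated reoffenses avoided, and their monetized benefit.
reoffenses_avoided = participants * (baseline_recidivism - treated_recidivism)
benefits = reoffenses_avoided * cost_per_reoffense

# Two common summary figures: net benefit and the benefit-cost ratio.
net_benefit = benefits - program_cost
bc_ratio = benefits / program_cost

print(f"Reoffenses avoided: {reoffenses_avoided:.0f}")
print(f"Net benefit: ${net_benefit:,.0f}")
print(f"Benefit-cost ratio: {bc_ratio:.2f}")  # a ratio above 1.0 means benefits exceed costs
```

With these made-up numbers, the program avoids an estimated 20 reoffenses for a net benefit of $300,000 and a benefit-cost ratio of 1.6; a competing program can then be compared on the same terms, which is the point of a consistent cost-benefit methodology.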
Presentation slides are available at http://www.ncja.org/executive-session-jan-2013