JULY / AUGUST 2012 :: 73(4)
Promoting Healthy and Sustainable Communities


ORIGINAL ARTICLE

Evaluation of the North Carolina Violent Death Reporting System, 2009

Natalie J.M. Dailey, Tammy Norwood, Zack S. Moore, Aaron T. Fleischauer, Scott Proescholdbell

N C Med J. 2012;73(4):257-262.



Background Violence is a leading cause of death in North Carolina. The North Carolina Violent Death Reporting System (NC-VDRS) is part of the National Violent Death Reporting System (NVDRS), which monitors violent deaths and collects information about injuries and psychosocial contributors. Our objective was to describe and evaluate the quality, timeliness, and usefulness of the system.

Methods We used the Centers for Disease Control and Prevention’s guidelines for evaluating public health surveillance systems to assess the system. We performed subjective assessment of system attributes by reviewing system documents and interviewing stakeholders. We estimated NC-VDRS’s reporting completeness using a capture-recapture method.

Results Stakeholders considered data provided by NC-VDRS to be of high quality. Reporting to the national system has taken place before the specified 6-month and 18-month deadlines, but local stakeholder reports have been delayed up to 36 months. Stakeholders reported using NC-VDRS data for program planning and community education. The system is estimated to capture all NVDRS-defined cases, but law enforcement reports were available for only 61% of suicides.

Limitations The law enforcement agencies we interviewed may not be representative of all participating agencies in the state. Data sources used to assess completeness were not independent.

Conclusion NC-VDRS is useful and well accepted. However, completeness of suicide reporting is limited, and reporting to local stakeholders has been delayed. Addressing these limitations could improve the usefulness of the system for planning and for appropriately targeting violence prevention interventions.

Violence-related injuries are among the leading causes of death in the United States, resulting in approximately 50,000 deaths annually [1]. Homicide and suicide are the second leading causes of death among persons aged 15-24 years and 25-34 years, respectively [1]. During the period from 1999 through 2009, homicides and suicides were the second and third most common causes of death among North Carolinians aged 15-34 years [1].

The Centers for Disease Control and Prevention (CDC) began operating the National Violent Death Reporting System (NVDRS) in 2003 to provide public health and law enforcement officials, policymakers, and violence prevention groups with accurate, timely information for prevention planning [2]. NVDRS is a federally funded cooperative agreement between 18 state health departments and the CDC National Center for Injury Prevention and Control, Division of Violence Prevention. The North Carolina Violent Death Reporting System (NC-VDRS) began collecting and reporting data in 2004 and has operated continuously since then. From 2004 through 2009, it collected information on 10,751 deaths.

NVDRS defines violent death as death resulting from intentional use of physical force or power against oneself, another person, or a group or community [2]. Case definitions include codes specified by the International Statistical Classification of Diseases and Related Health Problems, 10th Revision (ICD-10) [2]. The linkage of NVDRS data sources, including death certificates, coroner or medical examiner records, and law enforcement reports, is unique among injury surveillance systems. NVDRS can therefore provide detailed information on circumstances surrounding multiple-death incidents (eg, murder-suicide, multiple homicides, multiple suicides, or homicide-legal intervention) by linking related deaths when fatal injuries occur within 24 hours of each other. Variables captured in NVDRS include but are not limited to injury location, weapon used, history of mental illness, toxicology, and other psychosocial factors.
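The 24-hour linkage rule can be illustrated with a short sketch (Python is used here purely for illustration; NVDRS exposes no such API). Real NVDRS linkage relies on investigation records establishing that deaths are related, with the 24-hour rule applied to injury times; this sketch, as a simplification, chains deaths on time proximity alone.

```python
from datetime import datetime, timedelta

def group_incidents(deaths, window_hours=24):
    """Chain deaths into one incident when each fatal injury occurs
    within `window_hours` of the previous death in the incident.
    Illustrative only: actual NVDRS linkage also requires that the
    deaths be related (e.g., a murder-suicide), not merely close in time."""
    deaths = sorted(deaths, key=lambda d: d["injury_time"])
    incidents = []
    for d in deaths:
        if incidents and (d["injury_time"] - incidents[-1][-1]["injury_time"]
                          <= timedelta(hours=window_hours)):
            incidents[-1].append(d)  # within the window: same incident
        else:
            incidents.append([d])    # otherwise start a new incident
    return incidents

# A hypothetical murder-suicide 2 hours apart links into one incident;
# an unrelated death 4 days later starts a new one.
deaths = [
    {"id": "homicide-1", "injury_time": datetime(2007, 3, 1, 10, 0)},
    {"id": "suicide-1",  "injury_time": datetime(2007, 3, 1, 12, 0)},
    {"id": "suicide-2",  "injury_time": datetime(2007, 3, 5, 9, 0)},
]
incidents = group_incidents(deaths)
```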

System Description
NC-VDRS links information from death certificates, medical examiner’s reports, and law enforcement reports (Figure 1). The NC-VDRS program manager downloads electronic death certificate data weekly and creates an electronic record for all certificates to which the state nosologist has assigned 1 of the ICD-10 codes for violent death. Death certificates are matched with data from the Office of the Chief Medical Examiner (OCME). Data regarding the victim’s occupation, educational status, any history of substance use or homelessness, injury type, and injury intent are collected manually by NC-VDRS abstractors from OCME records, including autopsy and toxicology reports. The abstractors are employed by the North Carolina Department of Health and Human Services and undergo 4 weeks of intensive training to learn how to interpret OCME reports and ICD-10 coding and to generate narrative descriptions of violent events. This intensive phase is followed by a year of training with weekly quality assurance checks and annual continuing education. After OCME records are abstracted, a request for information is sent to the law enforcement agency with jurisdiction over the case, which provides a paper or electronic report. NC-VDRS program staff members enter this information manually. Updated de-identified records are uploaded to NVDRS nightly.
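The nosologist's ICD-10 screen amounts to a range check on the underlying-cause code. The ranges below are the standard ICD-10 external-cause categories for suicide, homicide, undetermined intent, and legal intervention; the official NVDRS case definition is broader (it also covers, for example, unintentional firearm deaths), so treat this as an illustrative filter rather than the system's actual one.

```python
# Illustrative ICD-10 underlying-cause ranges for violent death; the
# official NVDRS case definition is broader (e.g., it also includes
# unintentional firearm deaths).
VIOLENT_DEATH_RANGES = [
    ("X60", "X84"),  # intentional self-harm (suicide)
    ("X85", "Y09"),  # assault (homicide)
    ("Y10", "Y34"),  # event of undetermined intent
    ("Y35", "Y35"),  # legal intervention
]

def is_violent_death(icd10_code):
    """Check the 3-character ICD-10 category against the ranges above.
    Plain string comparison works because letter + two-digit categories
    sort lexicographically (e.g., "X85" <= "Y01" <= "Y09")."""
    category = icd10_code[:3].upper()
    return any(lo <= category <= hi for lo, hi in VIOLENT_DEATH_RANGES)

is_violent_death("X72")    # firearm suicide: included
is_violent_death("I21.0")  # myocardial infarction: excluded
```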

The CDC requests information on 2 timelines. Demographic variables from the death certificate are due within 6 months of the death date. Other variables from OCME and law enforcement reports, including toxicology results, wound descriptions, mental illness history, and injury context and mechanism, are due within 18 months of the death date. Depending on the type of death, abstractors collect and enter different information from these reports about the circumstances associated with infliction of the fatal injury. Each calendar year is finalized for preparation of the published report approximately 18 months after the last day of that year; however, records are continuously updated as new information is received. NC-VDRS annual reports for 2005 through 2007 have been printed and disseminated to stakeholders. The CDC combines data from all participating states and releases an annual report.

Previous evaluations focused on the data collection process. The goal of this evaluation was to assess the quality, timeliness, and usefulness of NC-VDRS data and make recommendations to improve system function.

Methods
The system was assessed according to standard CDC guidelines for evaluating public health surveillance systems [3]. These guidelines outline the tasks that should be carried out as part of the evaluation, such as engaging stakeholders, describing the surveillance system, focusing the evaluation design, gathering credible evidence about the performance of the system, justifying and stating conclusions, making recommendations, and ensuring that evaluation findings are used and that lessons learned are shared [3]. The authors of the guidelines define attributes by which surveillance system performance should be judged and discuss ways in which these attributes might be assessed [3]. Our evaluation focused on data quality, acceptability, and timeliness and consisted of a review of system records, stakeholder interviews, and quantitative comparisons of data. We performed less extensive evaluation of the system’s usefulness, simplicity, flexibility, representativeness, and stability.

Records Review
We reviewed system documents, including communications between NVDRS and NC-VDRS, surveillance reports, other NC-VDRS publications, and publications citing NC-VDRS data.

Stakeholder Interviews
We interviewed informants from all identified stakeholder groups, including all past and present NC-VDRS program staff members and representatives from the State Center for Health Statistics (SCHS), the OCME, the State Bureau of Investigation (SBI), and local law enforcement agencies. We also interviewed NC-VDRS advisory board chairs and researchers and community leaders who have used NC-VDRS data. Interview questions were developed through consensus among the coauthors and consisted of open-ended questions regarding familiarity with the system, difficulties faced in collecting or using system data, impression of the system’s effectiveness, its best qualities, and areas needing improvement. Interview tools were tailored to each group. NC-VDRS program staff interview tools were further tailored based upon role of the interviewee (eg, primary investigator, program manager, data abstractor, and budget manager). For example, program staff members were asked, “Please walk me through the steps of gathering and entering data into the system,” and “How is quality control of the data performed?” Data providers were asked, “What resources from your department are required to participate in the system?” Potential data users were asked, “How have you used NC-VDRS data?” Interviews were administered from September 2009 through January 2010 by the lead author either in person or by telephone. We assured all respondents that responses would remain anonymous. Notes from the interviews were maintained in a locked cabinet to which only the lead author had access. This study underwent CDC human subjects review and, as a public health surveillance system evaluation, was determined to be nonresearch.

In an attempt to assess the perspectives of local law enforcement agencies, we selected 16 agencies from 453 departments statewide. These agencies were eligible for participation in NC-VDRS (that is, they represented a jurisdiction in which 1 or more deaths meeting NC-VDRS criteria had occurred since system inception) and included police (city) and sheriff (county) departments, rural and metropolitan jurisdictions, and large and small agencies. In an effort to have equal representation from 1 or more of each of these types, we contacted 10 of the selected agencies. We attempted to interview the agency personnel who provided data to NC-VDRS. If another agency data user was identified during the interview, we also attempted to interview that person. We attempted to contact each interviewee at least twice, either by e-mail or telephone.

Quantitative Data Comparison
For quantitative evaluation, we used NC-VDRS data from 2007, the most recent year for which complete data were available. Quantitative evaluation of data quality commonly includes calculation of sensitivity and positive predictive value. However, such calculation necessitates an external independent dataset containing the same information for comparison. Because NC-VDRS uses law enforcement reports, which are the only comprehensive source of data on suicides and homicides in the state, no independent data source for comparison exists. As a result, we did not calculate sensitivity and positive predictive value directly; instead, we estimated the true number of cases likely to have occurred by using a capture-recapture technique [4]. We compared 2007 NC-VDRS homicide data from Forsyth County, North Carolina, with 2007 homicide data that Winston-Salem State University (WSSU) researchers obtained independently from law enforcement agencies whose jurisdictions include Forsyth County. We also evaluated data completeness for each of 8 demographic variables (age, gender, race, Hispanic ethnicity, county of residence, date of injury, county of injury, and location type) in each type of data source (death certificates, medical examiner’s reports, and law enforcement reports) by determining the proportion of deaths in NC-VDRS for which the source had reported information about the variable. For law enforcement reports, we assessed data completeness separately for homicides and suicides. We used SAS 9.1.3 software (SAS Institute, Cary, North Carolina) to conduct the analysis and used Fisher’s exact test to test for significance of the difference between proportions, using death certificate data as the referent.
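The two-source capture-recapture estimate referenced above [4] reduces to the Lincoln-Petersen formula, sketched below. The counts are hypothetical, not the study's actual Forsyth County figures, but they reproduce the logic of the result: when every case in the independent source also appears in the surveillance system, estimated completeness is 100%.

```python
def lincoln_petersen(n1, n2, m):
    """Two-source capture-recapture (Lincoln-Petersen) estimate of the
    true number of cases: n1 cases in source A, n2 cases in source B,
    and m cases found in both. Assumes the two sources are independent --
    a caveat the paper itself notes for the NC-VDRS vs. WSSU comparison."""
    if m == 0:
        raise ValueError("no overlap between sources; estimate undefined")
    return n1 * n2 / m

# Hypothetical counts: surveillance captures 20 homicides, an independent
# source captures 18, and all 18 appear in both. The estimated total is
# 20 * 18 / 18 = 20, so estimated completeness is 20/20 = 100%.
n_hat = lincoln_petersen(20, 18, 18)
completeness = 20 / n_hat
```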

Results
Records Review and Stakeholder Interviews
We interviewed 23 stakeholders, including 12 current or former NC-VDRS program staff members, 8 data providers (1 from SCHS, 1 from OCME, 1 from SBI, and 5 from local law enforcement agencies), and 3 data users (1 researcher and 2 community leaders). With the exception of law enforcement participants, all stakeholders who were approached agreed to be interviewed. Table 1 summarizes responses concerning NC-VDRS performance with regard to all 8 attributes assessed. Results for usefulness, data quality, acceptability, and timeliness are described in detail below, while results for simplicity, flexibility, representativeness, and stability can be found in Table 1.

Usefulness. A public health surveillance system is considered useful if it contributes to the prevention and control of adverse health-related events or improves understanding of the implications of these events [3]. Overall, stakeholders reported that NC-VDRS was useful and acknowledged using the data in a variety of ways. Community organizations described citing data in grant applications and using data to plan programming. One community organization reported having shifted their programming focus to suicide instead of homicide prevention because of NC-VDRS data. A representative from another community organization stated that the information provided by NC-VDRS is “crucial to evaluation of work and allocation of funding and human resources.” Additionally, a law enforcement agency engaged in preventive policing reported using NC-VDRS data to “understand tactically the profile of an individual who will be a victim of homicide to address planning for prevention of violence.” Other law enforcement agencies reported providing the annual report to community partners to assist with prevention strategies.

Data quality (completeness and validity). Data quality reflects the completeness and validity of the data recorded in the surveillance system [3]. Program staff reviews 100% of system events for internal data consistency among 3 sources: death certificates, medical examiner reports, and law enforcement reports. Ten of the 23 stakeholders interviewed explicitly stated that NC-VDRS provides high-quality, trustworthy data. One interviewee stated that NC-VDRS provides “the most up-to-date data, is the easiest to access, and is the only statewide data available in North Carolina.” Data providers consider the quality of NC-VDRS data adequate to use for quality control purposes for their own data. SCHS and OCME staff reported having noted incongruities between their records and NC-VDRS records (eg, coding of intent or manner of death), which led to corrections in the source data. SBI reported using NC-VDRS data to assess completeness of local law enforcement agency reporting. The only concern about data quality, raised by only 1 stakeholder, was that data about the circumstances of suicides are less complete than data about the circumstances of homicides.

Acceptability. Acceptability reflects the willingness of persons and organizations “to provide accurate, consistent, complete, and timely data” and depends on additional factors, including statutory requirements and ease of participation [3]. All stakeholders stated that violent death is of public health importance, and most reported believing that NC-VDRS has potential to effect change in the community. Stakeholders reported that the system responded positively to suggestions and comments about making data more accessible or understandable. State statute does not mandate NC-VDRS reporting; however, deaths “resulting from violence, poisoning, accident, suicide or homicide” must be reported to the medical examiner [5]. Although OCME and SCHS staff indicated that the time burden required for participation in the system was minimal, law enforcement cited a time burden ranging from 1 to 20 hours per year. Data reporting costs also varied among data providers. The OCME and the SCHS receive funding to support their participation, which is further facilitated by automated reporting mechanisms. In contrast, law enforcement agencies do not receive funding for participation. Reporting ease and time burden depend on the number of deaths investigated within the jurisdiction and on data organization. For example, few law enforcement agencies can search and provide data electronically; most perform these duties by hand.

Data providers specifically mentioned factors that could adversely affect acceptability. One reported doubting the system’s ability to effect change in the community, citing a lack of visible contribution to public policy changes; others brought up delayed surveillance report dissemination. Among data providers, only law enforcement reported knowledge of system data use in the community.

Timeliness. Timeliness reflects the speed between steps in a public health surveillance system [3]. NC-VDRS has consistently reported data to the CDC well before the established 6-month and 18-month deadlines. However, the first local stakeholder annual report was not released until 36 months after the reporting year’s end. Local stakeholder report timeliness has steadily improved: a 2007 provisional report was released in November 2009 and finalized in June 2010, 30 months after the year’s end, and a 2008 provisional report was sent to stakeholders in September 2010.

Quantitative Evaluation
Data quality (completeness and validity). All homicides identified by WSSU were present in NC-VDRS. Two additional homicides that occurred in Forsyth County were present in NC-VDRS but not in the data received from WSSU. Based on this information, NC-VDRS, which initiates case finding from death certificates, is estimated to have identified 100% of the homicides that took place in Forsyth County in 2007.

Overall, medical examiner data most reliably provided demographic information, and law enforcement report data did so least reliably (Table 2). Among NC-VDRS deaths in 2007, death certificate data contributed information on the 8 demographic variables examined between 69% of the time (date of injury) and 100% of the time (gender); medical examiner reports provided this information 97% (race) to 100% (gender) of the time. In contrast, law enforcement reports provided information on these 8 demographic variables only 71% to 72% of the time. Inclusion of law enforcement report data differed by manner of death: 89% to 91% of NC-VDRS homicides included law enforcement report data for all 8 demographic variables, whereas only 61% of suicides did.
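The homicide-suicide difference in law enforcement reporting is the kind of comparison the Methods section describes testing with Fisher's exact test. The authors used SAS; the sketch below implements the standard two-sided test from scratch in Python, and the denominators are hypothetical, chosen only to be consistent with the reported percentages (the paper does not give raw counts).

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of every table with the same
    margins that is no more probable than the observed table."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    def prob(k):  # probability that the top-left cell equals k
        return comb(row1, k) * comb(n - row1, col1 - k) / comb(n, col1)
    p_obs = prob(a)
    lo, hi = max(0, col1 - (c + d)), min(row1, col1)
    return sum(p for p in (prob(k) for k in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

# Hypothetical denominators consistent with the reported percentages:
# ~90% of homicides vs. 61% of suicides with complete law enforcement
# data on all 8 demographic variables.
homicides = (450, 50)   # complete, incomplete (90% of a notional 500)
suicides = (549, 351)   # complete, incomplete (61% of a notional 900)
p = fisher_exact_2x2(homicides[0], homicides[1], suicides[0], suicides[1])
# p falls far below 0.05, consistent with a real difference by manner of death
```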

Discussion
NC-VDRS brings together data sources that have not traditionally been linked to provide comprehensive information regarding demographic characteristics, types of injuries, toxicology, weapon types, and circumstances surrounding violent deaths. This information is otherwise unavailable in North Carolina and contributes to national surveillance efforts. This evaluation of NC-VDRS suggests that the system is useful, is accepted widely, provides high-quality data reliably, and reports data to the national system in a timely manner. Surveillance systems with these qualities are considered useful for public health action [3].

The evaluation also revealed ways that NC-VDRS could improve (eg, by decreasing the time required for local stakeholder data dissemination and by improving the completeness of law enforcement suicide reports). Posting preliminary electronic reports within 18 months of the calendar year end might allow for wider dissemination and more efficient use of limited resources by avoiding printing costs. Funding to support data dissemination could also increase NC-VDRS impact on violence prevention.

In our evaluation, law enforcement data, particularly suicide reports, were less complete, which may be a quantitative clue to system acceptability. To decrease barriers to law enforcement participation, NC-VDRS staff members actively contact law enforcement agencies in whose jurisdiction a violent death has occurred and have made educational presentations about NC-VDRS at law enforcement meetings. More law enforcement agencies have participated every year since 2004. Because suicide has consistently been the most common manner of violent death in North Carolina, complete suicide data are vital to improving system usefulness [6-9]. To improve completeness, NC-VDRS created a suicide and homicide investigation pocket card for law enforcement investigators, which lists the circumstances of interest.

Certain limitations should be considered when interpreting our findings. First, because each NVDRS participant state has its own unique infrastructure, our findings are unlikely to generalize to other states. Additionally, although we attempted to contact all groups of involved stakeholders, our overall numbers were low. We interviewed all stakeholders currently involved with NC-VDRS from OCME and SCHS, as well as all current and past program staff members. However, our sample of law enforcement stakeholders was small and was not chosen randomly; also, we were unable to interview anyone at several of the agencies we attempted to contact. Because this was the first attempt to obtain systematic feedback from local law enforcement agencies, we chose an open-ended interview format. This format allows for gathering detailed information but limits the number of persons from whom that information may be gathered. As a result, the information obtained from these interviews may not be representative of all law enforcement agencies in the state or even of those participating in NC-VDRS. The difficulty we had in interviewing even those agencies that we approached is indicative of the challenge NC-VDRS continues to face in engaging law enforcement. Future evaluations may gather more representative data from law enforcement by making use of improving connections and by using a survey format designed to encourage broader participation.

Additionally, only a few data users were interviewed. This paucity reflects the fact that, prior to the time of our evaluation, NC-VDRS data had not been widely used. However, these data are being used increasingly. The North Carolina Institute of Medicine (NCIOM) [10] has used NC-VDRS data in developing the Healthy North Carolina 2020 injury goals and objectives, and a number of publications have utilized NC-VDRS data to educate academic and medical communities on the nature of violence in the state [11–14]. As the number of NC-VDRS data users increases, future evaluations might benefit from using a survey format to gather information from these stakeholders as well.

Finally, we did not account for cases that do not result in law enforcement report filing, such as a death resulting from an injury sustained several years earlier, which could have resulted in underestimation of the completeness of law enforcement reporting. Furthermore, the method we used to estimate reporting completeness was designed for use with independent data sources. NC-VDRS data and WSSU data were not completely independent, because both were obtained from the same law enforcement agencies, albeit at different times. Our results could overestimate reporting completeness.

Overall, our evaluation determined that NC-VDRS provides stakeholders with useful, high-quality data. NC-VDRS and NVDRS offer an opportunity to more completely define factors associated with violence. By combining information from death certificates with medical examiner and law enforcement reports and by linking information from related deaths, it may be possible to gain new information about demographic groups most affected by violent death, types of injuries sustained, and social factors surrounding such deaths. With increased resources for rapid data dissemination and improved suicide report completeness, NC-VDRS can supply information vital to developing new, more effective strategies for preventing violent death.

Acknowledgments
Financial support. This work was supported through funding by the Centers for Disease Control and Prevention.

Disclaimer. The findings and conclusions of this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention.

Potential conflicts of interest. All authors have no relevant conflicts of interest.

References
1. Injury Prevention and Control Data and Statistics. Web-based Injury Statistics Query and Reporting System (WISQARS). Centers for Disease Control and Prevention Web site. http://www.cdc.gov/injury/wisqars/index.html. Accessed July 13, 2012.

2. Karch DL, Dahlberg LL, Patel N. Surveillance for violent deaths–National Violent Death Reporting System, 16 states, 2007. MMWR Surveill Summ. 2010;59(4):1–50.

3. German RR, Lee LM, Horan JM, Milstein RL, Pertowski CA, Waller MN. Updated guidelines for evaluating public health surveillance systems: recommendations from the guidelines working group. MMWR Recomm Rep. 2001;50(RR-13):1–35.

4. Ackman DM, Birkhead G, Flynn M. Assessment of surveillance for meningococcal disease in New York State, 1991. Am J Epidemiol. 1996;144(1):78–82.

5. Medical Examiner Jurisdiction, NCGS §130A-383 (1955).

6. The North Carolina Violent Death Reporting System: Provisional Data Release for January to June, 2004. Technical Report 1.0. Raleigh, NC: Injury and Violence Prevention Branch, North Carolina Department of Health and Human Services; 2005. http://www.injuryfreenc.ncdhhs.gov/About/NCVDRSJanJuneProvisional2004.pdf. Accessed May 31, 2012.

7. North Carolina Violent Death Reporting System: Annual Report 2005. Raleigh, NC: Injury and Violence Prevention Branch, Division of Public Health, North Carolina Department of Health and Human Services; 2008. http://www.injuryfreenc.ncdhhs.gov/About/2005NVDRSReport.pdf. Accessed May 31, 2012.

8. North Carolina Violent Death Reporting System: Annual Report 2006. Raleigh, NC: Injury and Violence Prevention Branch, Division of Public Health, North Carolina Department of Health and Human Services; 2009. http://www.injuryfreenc.ncdhhs.gov/About/2006NVDRSReport.pdf. Accessed May 31, 2012.

9. North Carolina Violent Death Reporting System November 2009: Provisional Report 2007. Raleigh, NC: Injury and Violence Prevention Branch, Division of Public Health, North Carolina Department of Health and Human Services; 2009. http://www.injuryfreenc.ncdhhs.gov/DataSurveillance/NVDRSProvisionalReport2007.pdf. Accessed May 31, 2012.

10. North Carolina Institute of Medicine (NCIOM). Healthy North Carolina 2020: A Better State of Health. Morrisville, NC: NCIOM; 2011. http://www.nciom.org/wp-content/uploads/2011/01/HNC2020_FINAL-March-revised.pdf. Accessed May 31, 2012.

11. Samandari G, Martin SL, Kupper LL, Schiro S, Norwood T, Avery M. Are pregnant and postpartum women at increased risk for violent death? Suicide and homicide findings from North Carolina. Matern Child Health J. 2011;15(5):660–669.

12. Samandari G, Martin SL, Schiro S. Homicide among pregnant and postpartum women in the United States: a review of the literature. Trauma Violence Abuse. 2010;11(1):42–54.

13. Madkour AS, Martin SL, Halpern CT, Schoenbach VJ. Area disadvantage and intimate partner homicide: an ecological analysis of North Carolina counties, 2004–2006. Violence Vict. 2010;25(3):363–377.

14. Coyne-Beasley T, Lees AC. Fatal and nonfatal firearm injuries in North Carolina. N C Med J. 2010;71(6):565–568.


Natalie J. M. Dailey, MD Epidemic Intelligence Service officer, Centers for Disease Control and Prevention, Atlanta, Georgia; North Carolina Department of Health and Human Services, Raleigh, North Carolina.
Tammy Norwood program manager, North Carolina Violent Death Reporting System, North Carolina Department of Health and Human Services, Raleigh, North Carolina.
Zack S. Moore, MD medical epidemiologist, Division of Public Health, North Carolina Department of Health and Human Services, Raleigh, North Carolina.
Aaron T. Fleischauer, PhD career epidemiology field officer, Office of Public Health Preparedness and Response, Centers for Disease Control and Prevention, Atlanta, Georgia; North Carolina Department of Health and Human Services, Raleigh, North Carolina.
Scott Proescholdbell, MPH head, Injury Epidemiology and Surveillance Unit, Injury and Violence Prevention Branch, Division of Public Health, North Carolina Department of Health and Human Services, Raleigh, North Carolina.

Address correspondence to Natalie Dailey Garnes, 1102 Bates St, Feigin Center, Suite 1120, Houston, TX 77030 (dailey@bcm.edu).