We are in the process of analyzing the results of our research. Here is how you can help.

Want to start helping our latest research effort right away? Follow this link for instructions on volunteering with our content analysis study.

The Puppycide Database Project has completed the first comprehensive review of the information compiled in our database over the last 18 months. With the help of our volunteers, this new publication draws on the details of more than 1,500 incidents of police lethal force against animals from across the United States; it is the first survey focused specifically on police killings of animals. We are very close to releasing this document to the public.

In order to facilitate the use of this research, we must provide more than just our data and the trends we have found in it. We also have to provide data *about* our data. This post will briefly describe what this means and what you can do to help us complete our latest publication.

Fundamentally, the Puppycide Database is an observational study that uses a survey as the primary means of data collection. The survey consists of our online form, which volunteers use to submit information about police use of force toward animals. While the form only asks questions about facts that are objectively verifiable, all surveys present room for error and bias. Mistakes can be made filling out the form. The form could even be filled out with deliberately fake and misleading information. Even if we completely replaced the form with a computer program that collects data automatically, that program could still contain errors or exhibit behavior that imperceptibly mirrors the bias of its designers.

Fortunately, the reality of bias and error does not present an insurmountable obstacle to scientific research, and the Puppycide Database Project is far from the first to confront it. To provide an objective means of interpreting the results of surveys and the contents of documents, researchers increasingly rely on a diverse range of techniques collectively referred to as content analysis, and expect the results of such an analysis to be provided as part of publication. When research is peer-reviewed, the results of the content analysis offer an efficient means of checking the veracity of a paper's findings (much easier than redoing an entire survey, for example). An inability to reproduce the results of a content analysis usually indicates larger problems.

For the Puppycide Database, we have opted to rely on a reliability measure used in content analysis called Krippendorff's alpha coefficient. While the principles underlying Krippendorff's alpha can be quite complex, the method required to calculate it is straightforward, and the significance of the results is self-evident even to those who aren't familiar with statistics.
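
For readers who want to see the mechanics, the core of the calculation fits in one line: alpha = 1 - (observed disagreement / expected disagreement). Alpha is 1 when every volunteer gives the same answers, and it falls toward 0 as agreement drops to what random answering would produce. Below is a minimal sketch of that calculation for nominal (multiple-choice) answers, written in Python. It illustrates the published formula only; it is not the project's actual analysis code, and the function name is our own.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal (multiple-choice) answers.

    units: a list of "units" (e.g. one survey question for one incident),
    each itself a list of the answers volunteers gave. Missing answers
    are simply left out; a unit with fewer than two answers carries no
    reliability information, so it is skipped.
    """
    # Build the coincidence matrix: every ordered pair of answers within
    # a unit, weighted by 1 / (m - 1), where m is the unit's answer count.
    coincidences = Counter()
    for unit in units:
        m = len(unit)
        if m < 2:
            continue
        for a, b in permutations(unit, 2):
            coincidences[(a, b)] += 1.0 / (m - 1)

    n = sum(coincidences.values())  # total pairable answers
    if n == 0:
        return None  # nothing to compare

    # Marginal totals for each answer category.
    marginals = Counter()
    for (a, _b), weight in coincidences.items():
        marginals[a] += weight

    # Observed disagreement: the combined weight of mismatched pairs.
    d_observed = sum(w for (a, b), w in coincidences.items() if a != b)
    # Expected disagreement if the same answers were paired at random.
    d_expected = sum(marginals[a] * marginals[b]
                     for a in marginals for b in marginals if a != b) / (n - 1)

    if d_expected == 0:
        return 1.0  # only one answer category ever appears
    return 1.0 - d_observed / d_expected
```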

The process we are using to determine Krippendorff's alpha for the Puppycide Database begins by having volunteers complete our survey form for an incident that already exists in our database. This is the part of the process where we need your help. Each volunteer is given an article describing a police shooting of a dog or a similar use of force, along with a summary of the incident, and answers 10-20 multiple-choice questions about the article. Reading one article and answering the questions typically takes less than 5 minutes.

Once enough volunteers have reviewed enough submissions, we will compare the answers on the new forms with the answers already in the database for the same incidents. Finally, we will compute a series of Krippendorff's alpha coefficients from those comparisons and include the figures in our publication. That's it!
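
To make the comparison step concrete, here is how the sketch from earlier might be applied to a single question, pooling the original database answer with the new volunteer reviews for each incident. The question, answers, and numbers here are hypothetical; the 0.80 cutoff is the threshold often cited as indicating reliable data.

```python
# Hypothetical data for one question ("Was the animal killed?") across
# four incidents; each inner list pools the original submission with
# the new volunteer reviews of the same incident.
answers = [
    ["yes", "yes", "yes", "yes"],  # incident 1: unanimous
    ["yes", "yes", "no", "yes"],   # incident 2: one disagreement
    ["no", "no", "no"],            # incident 3: one review missing
    ["yes", "no", "no", "yes"],    # incident 4: evenly split
]

alpha = krippendorff_alpha_nominal(answers)
print(round(alpha, 2))  # ~0.40 -- far below the ~0.80 often cited as
                        # reliable, so this question would need a look
```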

Having several individuals complete the same forms makes intuitive sense as a way to uncover problems with both specific questions in the survey and specific submissions to it. If, for example, four out of five readers answer questions 1 through 40 with answer A while one reader answers the same questions with answer B, it is worth looking into why that one reader answered so differently. Alternatively, if five out of five readers are unable to determine the answer to question 10 but can answer every other question, perhaps there is something wrong with question 10. The degree to which multiple volunteers' submissions of the same form coincide with one another is referred to as inter-rater reliability.
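
Even before computing alpha, a naive check makes both patterns above easy to spot: for each question, look at the share of readers who gave the most common answer. The helper below and its data are hypothetical, and this simple score ignores chance agreement, which is exactly what Krippendorff's alpha corrects for.

```python
from collections import Counter

def majority_agreement(answers_by_question):
    """Share of readers giving the most common answer, per question."""
    return {
        q: Counter(ans).most_common(1)[0][1] / len(ans)
        for q, ans in answers_by_question.items() if ans
    }

# Five hypothetical readers reviewing the same article:
print(majority_agreement({
    "q1": ["A", "A", "A", "A", "B"],   # 0.8: one outlier reader
    "q2": ["C", "C", "C", "C", "C"],   # 1.0: unanimous
    "q10": ["A", "B", "C", "D", "E"],  # 0.2: the question may be flawed
}))
```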

Content analysis allows us to assess the statistical strength not just of our entire survey and its results, but also of individual questions and individual volunteer submissions. This sort of tool is particularly important to the Puppycide Database Project because our surveys are anonymous, protecting our sources from retaliation - no small concern, since our sources include both owners of pets who have been shot by police and police officers who have been involved in or witnessed uses of lethal force. The reliability of a volunteer's answers must be judged on the answers themselves, not the identity of the volunteer.

Any topic related to police behavior inevitably carries political connotations, and a savvy public increasingly views commentary on politically charged subjects as necessarily subjective and agenda-driven. Including verifiable and reproducible analysis with our research can only reinforce the point that studies of police use of force are as capable of objectivity as any other field of social science.