One Norm, Two Standards: Realizing Transparency in Qualitative Political Science

January 01, 2015

By Andrew Moravcsik

Editor’s Note: this post is contributed by Andrew Moravcsik, Professor of Politics and Public Policy and Director of the European Union Program at Princeton University. This posting was updated at 11:00 PM CDT 1/3/2015 at the request of the author.

Quantitative and qualitative political scientists are currently working closely together, and with their counterparts in other disciplines, to render social science research more transparent. The Data Access and Research Transparency (DA-RT) initiative launched by the American Political Science Association (APSA) has resulted in new professional responsibility norms mandating transparency. To realize this goal, APSA co-hosted a workshop several months ago assembling thirteen editors of top political science journals of all types—including American Political Science Review, International Security, PS, American Journal of Political Science, and Comparative Political Studies. The editors issued a joint public statement committing their journals to implement new qualitative and quantitative transparency standards by January 2016. The statement is already gaining new adherents from journals outside the original group.[i] A number of other institutions, including the Qualitative Data Repository at the Institute for Qualitative and Multi-Method Research (IQMR), the National Science Foundation (NSF), the Social Science Research Council (SSRC), and the Berkeley Initiative for Transparency in the Social Sciences (BITSS), are mounting efforts in this area as well.[ii] Social science transparency is an idea whose time has come.

This paper addresses one perceived constraint on this trend: the lack of broad understanding (outside of those closely involved) about exactly what transparency implies for qualitative political science. Quantitative analysts have more self-awareness in this domain. They generally understand enhanced transparency to mean “more of the same,” that is, wider adoption of rules (already in place at some journals as a precondition for review or publication) requiring that data be deposited in a database at a trusted third-party repository, together with a digitally automated analytical protocol.

Yet what is the equivalent default transparency scheme for qualitative researchers, most of whom conduct process-tracing analyses of case studies? Finding an effective means to promote qualitative transparency is likely to have a deeper impact on political science than improving quantitative transparency—or, indeed, any other immediate methodological reform. This is true for three reasons. First, the majority of researchers in political science—over 90% in a sub-discipline like International Relations—employ qualitative methods.[iii] Second, little has been done so far to promote qualitative transparency, so any change is likely to have a greater marginal impact. Third, for any changes at the disciplinary level to succeed, the active involvement of non-quantitative scholars will be required.

The answer may seem straightforward. Qualitative transparency ought to be enhanced by familiar means: either by generalizing conventional citation currently employed by qualitative analysts, or by introducing the same centralized data archiving employed by quantitative analysts. Yet scholars have recently come to understand—for reasons this essay will discuss in detail—that for most qualitative political science, neither of these solutions offers a practical means to enhance transparency. More innovative use of digital technology is required. A consensus is forming that the most workable and effective default transparency standard for qualitative political science is Active Citation: a system of digitally-enabled citations linked to annotated excerpts from original sources.

To explain how scholars reached these conclusions, this essay proceeds in four sections. The first section presents a consensual definition of research transparency and explains why almost all political scientists accept it. The second sets forth essential criteria for judging how this transparency norm should be implemented and applied within specific research communities. The third explains why the best-known approaches—conventional citation, hyperlinks and digital archiving—fail to meet these criteria for most qualitative political science today. The fourth section concludes by describing Active Citation in detail and explaining why it best fulfills the essential criteria of an applied standard.

What is Research Transparency?

“Research transparency” designates a disciplinary norm whereby scholars publicize the process by which they generate empirical data and analysis.[iv] It obliges scholars to present the evidence, analysis and methodological choices they use to reach research conclusions about the social world—in plain English, to “show their work.” Recent scholarship has refined this definition by isolating three dimensions of research transparency.[v] The first, data transparency, obliges social scientists to publicize the evidence on which their research rests. The second dimension, analytic transparency, obliges social scientists to publicize how they measure, code, interpret, and analyze that data. The third dimension, process transparency, obliges social scientists to publicize the broader set of research design choices that gave rise to the particular combination of data, theories, and methods they employ.

Unlike almost any other methodological ideal, transparency unifies rather than divides social scientists across the full range of disciplines, epistemologies, methods, theories and substantive interests. Transparency enjoys this consensual status because it constitutes social science as a legitimate collective activity. To publicize data, theory and methods is to fulfill a basic ethical responsibility to act toward other scholars with openness and honesty. Underlying this responsibility is the realization that scholars are humans whose research choices are inevitably subject, sometimes unconsciously, to arbitrary interests, commitments and biases. To be part of a common scholarly conversation requires, therefore, that one admit fallibility and pay respect to readers and potential interlocutors by opening the choices one makes to meaningful public discussion and debate.[vi] The Ethics Committee of the American Political Science Association (APSA)—which represents a very broad range of methods and interests—recently recognized this by expanding the professional responsibilities political scientists share to include data access and research transparency.[vii]

Social scientific transparency is essential for a more pragmatic reason as well, namely that the legitimacy and credibility of academic findings (inside and outside of academia) rests in large part on the belief that they result from well-informed scholarly discussion and debate.[viii] The recent discovery of a large number of non-replicable positive results in the natural and social sciences has called this legitimacy and credibility into question. Transparency offers one check on this tendency by inviting scholars to investigate, replicate, critique, extend and reuse the data, analysis and methods that their colleagues have published. When new work appears, other scholars are inspired to accept, reject or improve its findings. Citizens, private organizations, sponsoring bodies, and public decision makers can evaluate and apply the results, feeding back their experiences to researchers as new data and questions. Scholars are trained to contribute to the advancement of this collective enterprise and are recognized and rewarded for doing so well. They challenge, extend, or borrow from prior data, analysis and methods to move in innovative directions, thereby renewing the flow of research and starting anew. Transparency not only permits this cycle of research to take place, but displays it publicly, enhancing its credibility and legitimacy—thereby ultimately justifying society’s investment in it.

Principled and Pragmatic Criteria for Implementing Qualitative Transparency

Almost all social scientists recognize transparency as an abstract good. Yet political science is divided into diverse research communities with different methods, theories and substantive interests. It would be inappropriate for all to seek to realize transparency by applying the same concrete rules. Different styles of research require different procedures.[ix] To understand how contemporary qualitative political scientists can best achieve transparency, we must understand two aspects of qualitative political science. One is the basic “process-tracing” epistemology most qualitative political scientists employ. The other is the set of everyday practical constraints they face.

The “Process-Tracing” Epistemology of Most Qualitative Political Science

Most qualitative political science researchers employ some form of “process-tracing” to investigate a few cases intensively. Comparing across cases (let alone analyzing a large number of qualitative observations in a database) plays a secondary role. Process-tracing analysts focus primarily within single cases, citing individual pieces of textual and factual evidence to infer whether initial conditions and subsequent events actually took place, and to establish the existence of (and relative importance of) descriptive or causal processes that may link cause and effect. Evidence appears as a set of heterogeneous insights or pieces of data about a causal mechanism and its context (“causal process observations”) linked to specific successive points in a narrative or causal sequence. This inferential structure differs from that found in most quantitative political science, where data is typically analyzed as “part of a larger, systematized array of observations” (“dataset observations”) describing cross-case variation.[x]

There are many reasons—ranging from scholarly, normative or policy interest in a particular case to unit heterogeneity or uniqueness—why a scholar might legitimately wish to focus on internal validity in this way rather than on average outcomes across a larger population.[xi] One methodological advantage of the process-tracing mode of qualitative analysis is that researchers can exercise a unique degree of interpretive subtlety in assigning an appropriate role, weight, and meaning to each piece of causal process evidence, depending on its position in the underlying temporal narrative or causal sequence and on its basic reliability in context. For example, they may view a particular fact as a necessary condition (a “hoop test”), a sufficient condition (a “smoking gun” test), INUS (an insufficient but necessary part of a condition which is itself unnecessary but sufficient for the result), SUIN (a sufficient but unnecessary part of a factor that is insufficient but necessary for an outcome), a probabilistic link (a “straw in the wind”), or another type of condition.[xii] Within each category, a test may be viewed as more or less probative, based on the nature of the test, the particulars of the text, the reliability of the evidence, the broader political, social and cultural context, and the Bayesian priors in the literature. This nuanced but rigorous, discursively contextualized, source-by-source interpretation, which is highly prized in fields such as history and law, contrasts with the norm, more common in quantitative political science, of imposing uniform rules that weight all data in a general database equally or randomly.
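
To see what “more or less probative” means in Bayesian terms, consider a stylized example; the numbers below are illustrative assumptions, not values drawn from the literature cited above. Suppose a scholar holds a prior belief P(H) = 0.5 in hypothesis H, and a “smoking gun” document E would turn up in the archive with probability P(E|H) = 0.4 if H is true but only P(E|¬H) = 0.02 if it is false. Finding E then updates belief sharply:

\[
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \lnot H)\,P(\lnot H)} = \frac{0.4 \times 0.5}{0.4 \times 0.5 + 0.02 \times 0.5} \approx 0.95
\]

Failing to find E, by contrast, lowers the posterior only to about 0.38, since the document might well be missing even if H is true: a smoking gun confirms strongly when present but disconfirms weakly when absent. A “hoop test” reverses this asymmetry, so passing it helps little while failing it is close to fatal.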

Pragmatic Constraints on Process-Tracing Research

Perfect research transparency seems attractive in theory. Yet five practical considerations limit the feasibility and desirability of any effort to maximize the evidentiary, analytic and procedural information that any scholar, qualitative or quantitative, reveals.

  • Intellectual property law imposes limits on the secondary use of published material: in most countries a modest quotation can be employed for non-profit or scholarly purposes, but entire published documents cannot be reproduced at will. Even sources formally in the public domain (such as archival documents) are often subject to informal restrictions, and any scholar who disseminated them wholesale might find it difficult to research a related topic again in the future.
  • Confidentiality for human subject protection comes into play when scholars agree (either informally or as part of a formal research design or Institutional Review Board-approved arrangement) to keep information confidential. Some such information cannot be cited at all; some has to be sanitized.
  • An unreasonable logistical burden may arise if scholars are obliged to deposit, sanitize or reproduce massive quantities of data in qualitative form, particularly if they did not expect to do so when collecting the data.
  • Scholars should expect to enjoy a reasonable first-use right to exploit newly collected data and analysis before other scholars can access it. This helps incentivize and fairly reward scholars, particularly younger ones, who collect and analyze data in new ways.
  • Publication procedures and formats are often costly to change, especially in the short-term.[xiii]

What Should We Look for in an Applied Transparency Standard?

For qualitative process-tracing to be intelligible, credible and legitimate, data and analysis must be transparent and the major research choices must be justified. The unique interpretive flexibility qualitative scholars enjoy only increases these transparency requirements. Any appropriate and workable standard of qualitative transparency must be suited to this type of research. First, it must preserve the basic narrative “process-tracing” structure of presentation: scholars should provide readers with the data and the analytical interpretation of each piece of evidence in context, along with a methodological justification for their selection. Second, readers must be able to move efficiently, in real time, from a point in the main narrative directly to the source and its analysis, and back again—a function traditionally carried out by footnotes and endnotes. Third, analytic transparency provisions must permit scholars to explain the interpretive choices they have made with regard to each piece of evidence. All this must take place within the real-world constraints set by intellectual property law, human subject protection, logistics, first-use rights and existing publication formats.

Can Existing Transparency Instruments Enhance Qualitative Research Transparency?

This section examines three relatively familiar options for enhancing social scientific transparency: conventional citations, hyperlinks to external web sources and data archiving. Though all are useful in specific circumstances, none offers a workable default standard for qualitative political scientists given the constraints discussed above.

Conventional Citation

The transparency standard employed today in political science, conventional citation, is manifestly unable to provide even minimal qualitative research transparency.[xiv] To be sure, the potential exists. Legal academics and historians employ “best practices,” such as ubiquitous discursive footnotes containing long, annotated quotations and a disciplinary discourse in which other scholars often challenge textual claims. Law reviews permit readers to scan at a glance the main argument, quotations from sources and the basis of the interpretation. Historical journals note the exact location, nature and interpretation of a document, often backed by quotations.

Yet in political science, qualitative transparency has been rendered all but impossible by recent trends in the formatting of journals and books: notably, ever tighter word limits and so-called scientific citations designed to accommodate references only to secondary literature rather than to data. Political science has regressed: decades ago, discursive long-form citations permitted political scientists to achieve greater research transparency than they can today. This trend is unlikely to reverse. The results have been profound, extending far beyond format. Political science has seen a decay in the humanistic culture of appreciation, debate, reuse, extension and deepening of text-based empirical understanding—and an atrophy of the skills needed to engage in such debates, such as linguistic, functional, regional, policy and historical knowledge, skills that remain prized and commonplace in legal academia, history and other disciplines. Only in small pockets of regional, historical or policy analysis linked by common linguistic, functional or historical knowledge do qualitative transparency and depth of debate persist. This is a major reason for the crisis in qualitative political science today.

Two Digital Alternatives: External Hyperlinks and Data Archiving

If fundamental reform of conventional citation practice is unlikely, then effective enhancement of research transparency will require intensive application of digital technology. In recent years, political scientists have considered the digital alternatives. Two possibilities mentioned most often—external hyperlinks and data archiving—prove on close inspection to be too narrowly and selectively applicable to be broad default instruments to enhance qualitative transparency.

The first digital option is to hyperlink to external web-based sources. Hyperlinks are now ubiquitous in journalism, policy analysis, social media, government reports and law, as well as in scholarly disciplines (such as medicine) in which researchers mainly reference other scholars, not data. A few areas of political science are analogous. Yet most sources political scientists cite—including the bulk of archival materials, other primary documents, pamphlets and informally published material, books published in the past 70 years, raw interview transcripts, ethnographic field materials, photographs, diagrams and drawings—are unavailable electronically. Even when sources exist online, external hyperlinks offer only data access, not analytic or process transparency. We learn what a scholar cited but not why or how. In any case, as we shall see below, other transparency instruments subsume the advantages of hyperlinks.

The second digital option is to archive data in a single unified database. Data archiving has obvious advantages: it is consistent with common practice in quantitative analysis; it builds a procedural bulwark against selection bias (cherry-picking) of favorable quotations and documents; and, with some further technological innovation, it might even permit specific textual passages to be linked conveniently to individual citations in a paper. Certainly data archiving is essential to preserve and publicize complete collections of new and unique field data, such as original interviews, ethnographic notes, primary document collections, field research material, and web scrapes—an important responsibility for social scientists. Data archiving is also the best means to present qualitative data that is analyzed as a single set of “dataset observations,” as in some content, ethnographic or survey analyses. Databases such as Atlas, Access, Filemaker, and Endnote now offer promising and innovative ways to store, array and manipulate textual data, which are particularly useful in research designs that emphasize macro-comparative inquiry, systematic coding, content analysis, or the weighing of a large number of sources to estimate relatively few, carefully predefined variables—often in “mixed-method” research designs.[xv] These considerations have recently led political scientists to launch data repositories for textual material, notably the Qualitative Data Repository established with NSF funding at Syracuse University.[xvi]

Yet, however useful it may be for these specific purposes, data archiving is not an efficient format for enhancing general qualitative research transparency, because it copes poorly with the needs of the modal qualitative researcher, who employs process tracing to analyze causal process observations. The reasons are numerous. First, intellectual property law imposes narrow de jure and de facto limits on reproducing textual sources, including almost all secondary material published in the past seventy years and a surprising portion of archival government documents, private primary material, web text, commercial material and artistic and visual products. Second, many sources (even if legally in the public domain) cannot be publicized due to confidentiality agreements and human subject protections enforced by Institutional Review Boards (IRBs). These include many interview transcripts, ethnographic observations, and other unpublished material. Even when partial publication is possible, the cost and risk of sanitizing the material appropriately can be prohibitively high, where sanitizing is possible at all.[xvii] Third, the logistical burden of data archiving imposes additional limits. In theory, addressing “cherry-picking” by archiving all the qualitative data a scholar examines seems attractive, yet it is almost always impractical. Serious primary-source analysts sift through orders of magnitude more source material than they peruse intensively, and peruse orders of magnitude more source material than is important enough to cite. Almost all the discarded material is uninteresting. Sanitizing, depositing and reading tens or hundreds of times more material than is relevant would create prohibitive logistical burdens for scholar and reader alike. Finally, even if all or some relevant data is archived efficiently, archiving does little to enhance analytic transparency. As with hyperlinking, we learn what a scholar cited but not why. Overall, these limitations mean that data archiving may be a first-best transparency approach for qualitative evidence that is arrayed and analyzed in the form of dataset observations, but it is sub-optimal for the modal process-tracing form of analysis.

A Practical yet Effective Standard: Active Citation

Conventional citation, hyperlinks and data archiving, we have seen, have inherent weaknesses as general transparency standards. A new and innovative approach is required. Scholars are converging on the view that the most promising and practical default standard for enhancing qualitative transparency is Active Citation (AC): a system of digitally-enabled citations linked to annotated excerpts from original sources.

In the AC format, any citation to a contestable empirical claim is hyperlinked to an entry in an appendix to the scholarly work, known as the “Transparency Appendix” (TRAX).[xviii] Each TRAX entry contains four elements, the first three required and the last one optional:

  1. a short excerpt from the source (presumptively 50-100 words long);
  2. an annotation explaining how the source supports the underlying claim in the main text (of a length at the author’s discretion);
  3. the full citation;
  4. optionally, a scan of or link to the full source.

The first entry of the TRAX is, exceptionally, reserved for information on general issues of process transparency. Other TRAX entries can easily be adapted to presenting sources in visual, audio, cinematic, graphic, and other media. AC can be employed in almost any form of scholarly work: unpublished papers, manuscripts submitted for publication, online journal articles, and e-books, or as separate online appendices to printed journals and books. Examples of AC can be found on various demonstration websites and in journal articles.
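
To make the format concrete, here is a minimal sketch of how a single TRAX entry might be represented digitally. The record type, field names and example content are hypothetical illustrations, not part of the AC specification or of any existing tool:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class TraxEntry:
        """One Transparency Appendix (TRAX) entry (hypothetical representation)."""
        excerpt: str                       # 1. short source excerpt, presumptively 50-100 words
        annotation: str                    # 2. how the source supports the claim; any length
        citation: str                      # 3. full citation, so the entry is self-contained
        source_link: Optional[str] = None  # 4. optional scan of, or link to, the full source

    # Example with invented content, illustrating the four elements:
    entry = TraxEntry(
        excerpt="The minister wrote that no agreement was possible before the election...",
        annotation="Supports the claim in the main text that domestic timing, not "
                   "external pressure, delayed the negotiations.",
        citation="Minister X to Ambassador Y, 3 March 1955, National Archives, Box 12.",
        source_link=None,  # a scan could be attached where law and confidentiality permit
    )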

AC is an innovative yet pragmatic standard. It significantly enhances research transparency while respecting and managing legal, administrative, logistical and commercial constraints. The relatively short length of the source excerpt assures basic data transparency, including some interpretive context, while avoiding most legal, human subject, logistical, and first-use constraints.[xix] (Where such constraints remain, they trump the requirement to publicize data, but scholars may employ an intermediate solution, such as providing a brief summary of the data.) The optional scan or link offers the possibility of referencing a complete source, when feasible—thereby subsuming the primary advantage of hyperlinking and data archiving. The annotation delivers basic analytic transparency by offering researchers an opportunity to explain how the source supports the main text—with the length remaining at the discretion of the author. The exceptional first entry enhances process transparency by providing an open-ended way of addressing research design, selection bias, and other general methodological concerns that remain insufficiently explained in the main text, footnotes, empirical entries, or other active citations. The full citation assures that each TRAX entry is entirely self-contained, which facilitates convenient downloading into word-processing, bibliographical and database software. This contributes over time to the creation of a networked database of annotated quotations, again subsuming an advantage of data archiving at lower logistical cost. Finally, AC achieves these goals while only minimally disrupting current publishing practices. Except for the hyperlinks, existing formats and word limits remain unchanged (the TRAX, like all appendices, lies outside word limits). Journals would surely elect to include active footnotes only in electronic formats. Editors need not enforce ex post archiving requirements.

AC is a carefully crafted compromise reflecting years of refinement based on numerous methodological articles, workshops, discussions with editors, interdisciplinary meetings, consultations with potential funders, instructional sessions with graduate students, and, perhaps most importantly, consultation sessions with researchers of all methodological types. Underneath the digital technology and unfamiliar format lies an essentially conservative and cautious proposal. Interpretivists and traditional historians will note that it recreates, in digital form, the discursive citation practices they have long employed. Quantitative researchers will note that it adheres to the shared norm of placing data and analysis in a single third-party “database,” though it interprets that term very differently than conventional statistical researchers do.

For these reasons, AC has emerged as the most credible candidate to be the default transparency standard for qualitative papers, articles and, eventually, books in political science. It is rapidly sinking disciplinary roots. AC has been elaborated and taught in published articles, at disciplinary and interdisciplinary conferences, at training institutes and in graduate seminars. The NSF-funded Qualitative Data Repository has commissioned ten “active citation compilations” by leading scholars of international relations and comparative politics, who are retrofitting classic and forthcoming articles or chapters to the active citation format. Software developers are creating tools to assist in preparing TRAXs, in particular add-ons that automate the formatting of transparency appendices and individual entries in popular word processing programs. The new APSA ethics and professional responsibility rules recommend AC, as does, most importantly, the APSA-sponsored common statement in which 15 journals (so far) have jointly committed to move to enhanced transparency in January 2016.[xx]
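
As a sketch of what such a formatting tool might produce, the function below renders entries (reusing the hypothetical TraxEntry record sketched earlier) into a hyperlinked HTML appendix. The layout and anchor scheme are assumptions for illustration, not the output of any actual add-on:

    def render_trax(entries: list["TraxEntry"]) -> str:
        """Render TRAX entries as an HTML appendix; in-text citations can
        hyperlink to each entry via its anchor (e.g. href="#trax-2")."""
        parts = ["<h2>Transparency Appendix (TRAX)</h2>"]
        for i, e in enumerate(entries, start=1):
            parts.append(f'<div id="trax-{i}">')
            parts.append(f"<p><b>[{i}]</b> \u201c{e.excerpt}\u201d</p>")  # 1. excerpt
            parts.append(f"<p><i>Annotation:</i> {e.annotation}</p>")     # 2. analysis
            parts.append(f"<p><i>Source:</i> {e.citation}</p>")           # 3. citation
            if e.source_link:                                             # 4. optional link
                parts.append(f'<p><a href="{e.source_link}">Full source</a></p>')
            parts.append("</div>")
        return "\n".join(parts)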

Conclusion: A Transparent Future

At first glance, transparency may appear to be an unwelcome burden on qualitative researchers. Yet in many respects it should be seen instead as a rare and valuable opportunity. The cost of implementing active citation is relatively low, particularly when one knows in advance that it is expected.[xxi] After all, legal scholars and many historians—not to mention political scientists in generations past—have long done something similar as a matter of course, and without the assistance of word processors, hand-held cameras and the web, which ease archival and textual preservation and recognition. More importantly, greater transparency offers political scientists large individual and collective benefits. It provides an opportunity to demonstrate clearly, and to be rewarded for, scholarly excellence. This in turn is likely to incentivize scholars to invest in relevant skills, which include interpretive subtlety in reading texts, process-tracing and case selection techniques, and deep linguistic, area studies, historical, functional and policy knowledge. These skills will be in greater demand not just to conduct research, but to referee and debate it. Perhaps most important, the networked pool of transparent data, analysis and methods in active citations will constitute an ever-expanding public good for researchers, who can appreciate, critique, extend and reuse that data and analysis, just as quantitative scholars make low-cost use of existing datasets and analytical techniques.

The trend toward more open research is not just desirable; it is inevitable. In almost every walk of life—from science and government to shopping and dating—the wave of digital transparency has proven irresistible. The question facing qualitative political scientists is no longer whether they will be swept along in this trend. The question is when and how. Our responsibility is to fashion norms that acknowledge and respect the epistemological priorities and practices qualitative political scientists share.

Footnotes

[i] http://www.dartstatement.org/

[ii] http://bitss.org/; https://qdr.syr.edu/

[iii] Daniel Maliniak, Susan Peterson, and Michael J. Tierney, TRIP around the World: Teaching, Research and Policy Views of International Relations Faculty in 20 Countries (Williamsburg, Virginia: Institute for the Theory and Practice of International Relations, College of William and Mary, 2012), Charts 28–30, 42, 57, at http://www.wm.edu/offices/itpir/_documents/trip/trip_around_the_world_2011.pdf.

[iv] Gary King, “Replication, Replication,” PS: Political Science and Politics (September 1995), p. 444.

[v] This tripartite distinction revises that found in Arthur Lupia and Colin Elman, “Openness in Political Science: Data Access and Research Transparency,” PS: Political Science & Politics (January 2014): 19–42, appendices A and B.

[vi] Brian A. Nosek, Jeffrey R. Spies, and Matt Motyl, “Scientific Utopia: II. Restructuring Incentives and Practices to Promote Truth over Publishability,” Perspectives on Psychological Science (2012) 7(6): 615–631.

[vii] APSA, Guidelines for Qualitative Transparency (2013); see fn. v.

[viii] For a more complete discussion of this topic and others in this essay, see Andrew Moravcsik, “Trust, yet Verify: The Transparency Revolution and Qualitative International Relations,” Security Studies 23, no. 4 (December 2014) (2014b): 663–688; “Transparency: The Revolution in Qualitative Political Science,” PS: Political Science & Politics 47, no. 1 (January 2014) (2014a): 48–53; “Active Citation and Qualitative Political Science,” Qualitative & Multi-Method Research 10, no. 1 (Spring 2012); and “Active Citation: A Precondition for Replicable Qualitative Research,” PS: Political Science & Politics 43, no. 1 (January 2010): 29–35.

[ix] Lupia and Elman, fn. v.

[x] This distinction was originally introduced by Henry Brady, David Collier and Justin Seawright, subsequently adopted by James Mahoney, Gary Goertz and many others, and has now become canonical. The causal process mode of inferring causality, in Brady and Collier’s words, “contributes a different kind of leverage in causal inference…A causal-process observation may [give] insight into causal mechanisms, insight that is essential to causal assessment and is an indispensable alternative and/or supplement to correlation-based causal inference.” Henry Brady and David Collier, Rethinking Social Inquiry: Diverse Tools, Shared Standards (Lanham, MD: Rowman & Littlefield, 2010), pp. 252–53.

[xi] See Moravcsik 2014b for a discussion.

[xii] Stephen Van Evera, Guide to Methods for Students of Political Science (Ithaca, NY: Cornell University Press, 1997), 32; James Mahoney, “The Logic of Process-Tracing Tests in the Social Sciences,” Sociological Methods & Research 41, no. 4 (2012): 570–597; David Collier, “Understanding Process Tracing,” PS: Political Science & Politics 44, no. 4 (2011): 823–30; Peter A. Hall, “Systematic Process Analysis: When and How to Use It,” European Management Review 3 (2006): 24–31.

[xiii] For previous discussions of these factors, see Moravcsik 2010, 2012, 2014a, 2014b.

[xiv] See Moravcsik 2010 for a lengthier discussion of this point.

[xv] Evan S. Lieberman, “Bridging the Qualitative-Quantitative Divide: Best Practices in the Development of Historically Oriented Replication Databases,” Annual Review of Political Science 13 (2010): 37–59.

[xvi] For the link, see fn. ii.

[xvii] It is striking, for example, that even quantitative political science journals that pride themselves on a rigorous standard of “replicability” often apply that standard only to quantitative data, and resolutely refuse to publicize the qualitative data that is coded to create it—precisely for these reasons.

[xviii] For a longer discussion of AC, with examples, see Moravcsik 2014b.

[xix] In the US, quoting fifty to one hundred words for non-profit scholarly use lies within the customary “fair use” exemption, except for artistic products. Most other countries have similar legal practices. This length is also relatively easy to scan, logistically feasible to sanitize for human subject protection, and poses less of a threat to the “first use” exploitation rights of new data.

[xx] For a link, see fn. i.

[xxi] For a detailed argument and evidence that the costs are low, see Moravcsik 2012 and 2014b. Even those who note that retrofitting articles to the standard is time-consuming acknowledge that, with advance knowledge, the workload diminishes considerably. See Elizabeth Saunders, “Transparency without Tears: A Pragmatic Approach to Transparent Security Studies Research,” Security Studies 23, no. 4 (December 2014): 689–698.