Research data can be suppressed in various ways, including organizational secrecy, defamation law and refusal to reply to queries. In a broader sense, methods of suppression include pressures not to do research in the first place and attacks on scientists who produce unwelcome data. The context of this sort of suppression includes individual self-interest, vested interests, and paradigms. Suppressing research data can be either compatible with or contrary to accountability, depending on the constituencies involved. Ways to challenge suppression of research data include individual requests, exposés, refusal to suppress, publicity, creating new data, and social movements.
Keywords: suppression; research data; openness; accountability
One of the commonly expounded principles for the practice of science is openness. In writings on science, the dynamism of fact-gathering, theory-testing, and discovery is often said to depend on results being freely shared within the scientific community. This is necessary so that researchers can build on others' results; otherwise, the efforts of earlier workers are wasted as each generation of researchers must start from scratch. As Meadows (1974: 36) puts it: "scientific research is potentially open to a much more wasteful duplication of effort than other types of research. This constitutes one reason why a knowledge of new research should be diffused as quickly and widely as possible amongst the scientific community." Access to other scientists' work is also essential so that it can be scrutinized, tested, and, if unsatisfactory, modified or rejected. Openness is said to be at the foundation of science as a universalistic enterprise that can generate shared understandings not tainted by parochial distortions.
One of Robert Merton's classic norms of science was communism (also called communalism or communality), the others being universalism, disinterestedness, and organized skepticism (Merton, 1949; see also Barber, 1952; Storer, 1966). By communism, Merton referred to common ownership of the findings of science, which are communally produced. Merton provided no evidence that scientists actually adhere to the norm of communism in their work, and it might be said that his norms were prescription masquerading as description. There have been many critiques of the functionalist approach to science based on norms (e.g. Mulkay, 1976). Mitroff (1974a, 1974b), on the basis of an empirical study of scientists' behavior, proposed that science could just as well be described by counternorms -- such as particularism, secrecy (solitariness), and organized dogmatism -- as by the standard norms. In spite of the theoretical and empirical weakness of describing science in terms of norms, the norms themselves -- of which the norm of communism is relevant here -- retain rhetorical support among many scientists, as does what Mitroff (1974a) calls the "storybook image of science," of which norms are a central part.
Looking outside the scientific community, openness in science is valued by various groups that are concerned about the impact of science on society, such as environmentalists who seek data about ecological impacts of development proposals. The norm of communism refers to sharing of knowledge within the scientific community; the term openness is used here to extend the norm of communism to nonscientists.
The discussions about norms have focused on science, but the same issues are relevant to other fields of research, including engineering, social sciences, and humanities. In this paper, the discussion is about research in general, though many of the examples are from science.
Suppression of data is a frontal challenge to openness in research. The word suppression here implies an active process to prevent data from being created, made available, or given suitable recognition. It is thus different from neglect, incompetence, or other inadvertent ways by which research data may become unavailable.
In spite of the rhetoric of openness in research, the practice is often quite different. There are numerous examples of suppression, including pressures not to undertake research in the first place, institutional controls on dissemination of data, and attacks on researchers who produce unwelcome results. A few types of suppression are severely stigmatized, such as research fraud that has the effect of distorting or submerging accurate data. Other types of suppression do not evoke universal condemnation, but may generate concern on a case-by-case basis, such as the use of defamation law to prevent publication. Finally, there are some types of suppression that are commonplace and widely accepted, such as secrecy of research undertaken under the aegis of national security.
This article gives an overview of the topic. In section 2, various methods of suppressing research data are described, under the categories of preventing creation of data, controlling data, blocking dissemination of data, and distorting data. Section 3 deals with the context of suppression: several perspectives on research are outlined and examined for their value in understanding data suppression. Section 4 looks at suppression using the lens of accountability, pointing out that the rhetoric of accountability can be used to either justify or condemn suppression. Section 5 covers a range of responses to suppression, in all cases seeking to make data more freely available. The final section introduces some definitions and discusses data suppression in the light of a signal transmission theory of research communication.
Other contributions to this special issue of Accountability in Research explore particular types and areas of suppression in much greater depth (Colquhoun and Wilson, 1999; Hess, 1999; Macdonald and Hellgren, 1999; Samuels, 1999; Thompson, 1999). They are cited in appropriate contexts.
The word "data" normally refers to unprocessed observations, whereas "information" is data that has been processed, classified into categories, or organized in some way. However, the expression "research data," used here, seldom refers to data in the strict sense (which might be called "raw data"), since most observations from research are processed in some fashion before being made available to others; a more appropriate expression might be "research information," but this is not a common usage. Both data and information may be contrasted with "knowledge," which is facts and principles believed to be true. Thus, in talking of suppression of research data, there is no assumption that the data is correct.
It is convenient to classify methods of suppression in terms of a number of processes to which research data may be subject. These are called here creation, control, dissemination, and distortion, and are dealt with in sections 2.1 to 2.4. In each case, there are several types of suppression. Attacks on researchers, which can have the effect of suppressing research data, are covered in section 2.5, followed by a summary.
Normally, the idea of suppressing data implies that the data exists. But a simple extrapolation of the concept leads to the idea that suppression may be responsible for research not being done in the first place. In other words, if research could be done on a particular topic, but isn't, it is worth exploring the reasons. If there are active measures taken against doing the research, it may be plausible to speak of suppressing (potential) research data. In the case of cancer research, interest groups, such as tobacco, asbestos, and pharmaceutical companies, have a stake in areas of ignorance, either in not investigating certain areas where the results might be unwelcome or in using ignorance (i.e., the need for more research) as a justification for not taking action that would threaten their interests (Proctor, 1995; see also Hess, 1999).
Going one step deeper into noncreation of data is the possibility that research is not done because no one has any interest in doing it, in turn because the incentives favor research elsewhere. For example, there is vast military funding for weapons, but none for technology that specifically supports nonviolent defense, which relies on methods such as strikes, boycotts, and noncooperation and which can be supported by systems of communication, transportation, energy, and so forth that are difficult for an aggressor to control (Martin, 1992a). Not only does the military have no interest in this alternative, but few proponents of nonviolent defense have paid any attention to how this alternative can be supported by appropriate technology. In this case, there is no active suppression, such as withdrawing funding or simply declining to supply it. Rather, the domination of one way of thinking about defense has been so pervasive that alternatives are not envisaged. It might be said that the sway or hegemony of the military model is so great that there is little or no thought about undertaking research to support a different model.
Once created, data is a potential threat to individuals and groups that may not welcome the uses to which it could be put. Under control of data are several means by which data is kept under wraps, without an opportunity for outsiders to obtain it or perhaps even to know it exists.
A primary means of controlling data is secrecy. Organizations that undertake in-house research may keep secret the results and sometimes even the fact that research has been done. This mostly happens in large organizations that have the capacity to undertake research, notably government and corporate bodies. By its nature, secret research is difficult to study. Many examples only become known later when revelations are made as a result of voluntary admissions, leaks, freedom of information requests, and exposés. Examples include some of the research carried out by tobacco companies into the hazards of smoking (Glantz et al., 1996; Kluger, 1996) and studies of the toxic effects of chemicals, such as pesticides, carried out by their manufacturers (Epstein, 1989; Hileman, 1998).
Sometimes organizational secrecy is backed by legislative or legal means. Military research carried out by governments or corporations is often secret, with penalties for those who reveal information about it. This is standard in most countries with a military research budget. Those who reveal military secrets to the enemy are branded traitors and may receive severe prison sentences. A well-known example is the secrecy surrounding nuclear weapons development and, in the case of U.S. nuclear secrets, the conviction of a few individuals who allegedly gave secret information to the Soviet Union (Stern, 1969). Military secrecy is normally justified on the grounds of national security, raising the question of what degree of secrecy strikes the appropriate balance between preventing the enemy from gaining an advantage on the one hand and fostering scientific advance and maintaining civil liberties on the other (Gellhorn, 1950; Lasswell, 1950; Relyea, 1994).
The information that is classified as secret can change over time. In 1962, the U.S. Atomic Energy Commission published a book with massive amounts of information about nuclear weapons (Glasstone, 1962) -- much of which would have been classified in earlier years -- and updated it as time went on. At any given time, the more recently developed information is guarded most carefully. Publication of technical information, such as about the effects of nuclear weapons, often is easier than publication of information that throws a bad light on an organization or its members, as illustrated in Soviet cover-ups of nuclear disasters (Medvedev, 1979; Medvedev, 1991) and U.S. military secrecy about the effect of U.S. nuclear weapons testing on health (e.g., Ensign and Alcahy, 1994).
In most countries, secrecy is so tight that little is known publicly about any aspect of secret research. In the US, more is known about the existence of secret research, even if the research methods and findings themselves are secret. Even in the US, though, there is a black budget for the Defense Department of many billions of dollars for carrying out research and other functions which are secret even from legislators (Weiner, 1990). The existence of entire enterprises may be kept secret. For many years, the U.S. National Security Agency (NSA) was virtually unknown, though its budget is many times greater than that of the more notorious Central Intelligence Agency (Bamford, 1983). Yet there are other agencies even more secret than the NSA.
One area that is kept especially secret is the methods of operation of agencies that protect governments from external and internal threats. Again, more is known about this in the US than in most other countries (Halperin et al., 1976). The Federal Bureau of Investigation is subject to freedom of information requests; a common reason for denying access is that information would reveal the methods used by the agency for spying. In general, it can be said that special attempts are made to keep secret the methods of maintaining secrecy (see Cowan et al., 1974).
Corporations that are developing a new product or process maintain secrecy in order to stop other corporations from gaining the benefit of their breakthrough or just the hard work of collecting data. Corporations can legally protect their secrets through a form of intellectual property called trade secrets. However, registering information as a trade secret is often secondary to the primary means of maintaining control, which is organizational secrecy itself.
As well as governments and corporations, some other organizations may undertake research that is kept secret, including churches, trade unions, political parties, and environmental groups. For example, an opinion poll commissioned by a political party may be kept secret, especially if it reveals information that might be used against the party or its leaders.
A rather different type of control over research data is when organizations or individuals prevent others from undertaking research about themselves. Corporations and governments are major forces in society, so it is to be expected that social researchers would want to learn about them by studying documents and interviewing workers. Many organizations do what they can to prevent such research or put it on a tight leash (Macdonald and Hellgren, 1999). When sociologist Robert Jackall sought to undertake ethnographic studies of corporate culture, his biggest challenge was to find corporations that would give him access. He was not undertaking an exposé and did not plan to reveal the name of any company that he studied. Even so, he spent many months and enormous effort revising and refining his pitch to corporate executives, and was rejected dozens of times, before he was able to gain access to a company (Jackall, 1988, pp. 13-16). In this and other such cases, the organization is controlling information about itself, not the information that it holds such as trade secrets.
Individuals may also seek to control information about themselves. Some individuals have personal archives, which they may keep secret or give access only to official biographers. In addition, individuals may refuse to give interviews to nonauthorized investigators or may insist that certain topics remain off limits.
A drastic method of preventing others from obtaining research data is to destroy it. The amount of such destruction is unknown, for it is seldom openly admitted. A reasonable presumption is that targets for destruction include:
Controlling data, as discussed in the previous section, refers to stopping data from getting out. In contrast, to speak of blocking dissemination of data assumes that data is available to some degree or in one form or another but that efforts are made to prevent full or wider disclosure or distribution.
One approach is to not tell anyone that research has been done or that it is available. For example, government reports may be released without any publicity, so that only those in the know are aware of how to obtain them. A common technique is to not reply to requests for information. Academic researchers undertaking studies that are not classified or proprietary would, on receiving a request from another researcher, be expected to supply data associated with published work, especially if it did not require enormous work to do so. By simply not replying to such requests, or replying only to some requests, access to data is restricted (see Thompson, 1999).
Intellectual property includes copyrights, patents, plant breeders' rights, and some other types. In principle, intellectual property means that information is freely available but that residual rights are retained by the owner, such as the right to restrain copying for profit. In practice, intellectual property law can be used to stop dissemination of data. For example, copyright law was used to block publication of a book reproducing Australian government documents about foreign affairs that the government found embarrassing (Munster, 1982). Companies, for example in the radio and electricity sectors, have bought up patents in order to prevent technological innovations that might threaten their business (Dunford, 1987). Patenting of life forms is a major point of debate in biotechnology, with many scientists and corporations seeking to control the uses of research as a means to make a profit.
Defamation law is another legal means that can be used to block dissemination of data or discussion of theories. In the case of the theory that AIDS resulted from contaminated polio vaccines used in Africa in the late 1950s, an action for defamation by the developer of the vaccine in question, Hilary Koprowski, had the effect of inhibiting discussion and further publication about the theory (Curtis, 1995). Alexandra de Blas, a student at the University of Tasmania, wrote an honors thesis about the environmental impact of mining operations by a company. The company, Mt Lyell Mining and Railway, threatened to sue for defamation if the thesis was published (de Blas, 1994). Walter W. Stewart and Ned Feder, noted as scientific fraud-busters, were threatened with numerous defamation suits concerning their attempts to publish an article about scientific fraud by John Darsee and about related issues such as Darsee's coauthors and honorary authorship in general (LaFollette, 1992: 8-13). A paper by Adil Shamoo surveying psychiatric research on drugs and schizophrenia was accepted for publication in the Journal of Clinical Ethics but then not published due to threats of a libel suit from a researcher whose work was cited in the paper (Shamoo, 1997). In recent decades, defamation suits have been used widely in the US to intimidate citizens from speaking out about property developments, environmental hazards, corruption, and other issues, in what have become known as Strategic Lawsuits Against Public Participation, or SLAPPs (Pring and Canan, 1996). Although most SLAPPs neither target researchers nor concern research data, the risk of a SLAPP may, on occasion, deter researchers or editors from publishing findings.
Dissemination of data can be blocked by hostile editors or referees. Many rejections can be justified on the grounds of quality control, of course, but some rejections may be forms of censorship, for example when profluoridationists reject articles they perceive as threatening to fluoridation (see Diesendorf and Diesendorf, 1997) or when John Garcia, recipient of awards for his pioneering work on taste aversion learning, was blocked from core psychology journals for nearly two decades (Lubek and Apfelbaum, 1987; Revusky, 1977). More generally, there is considerable evidence that many editors and referees are hostile to papers that challenge prevailing beliefs (Armstrong, 1996, 1997; Campanario, 1995; Epstein, 1990; Horrobin, 1990; Lang, 1998; Mahoney, 1976, 1979; Thompson, 1999). The result in some cases can be that publication of innovative ideas, and data that backs them, is delayed or blocked.
Another means of restricting dissemination of research data is to charge a high price for it. Governments and corporations may charge high prices for data on the grounds of cost-recovery or the need to make a profit, though critics may allege that another motivation is to make it difficult for groups and individuals with little money to obtain the data. Whatever the motivation, high prices do restrict data to those who can afford it. There are cases in which government-generated data relevant to environmental or health impacts or planning decisions has been available only at a high cost, making it expensive for large campaigning groups such as Greenpeace and unavailable in practice to local citizen groups (Tickell, 1998). In some cases, charges for data obtained through freedom-of-information requests are quite high, again reducing availability. In order to challenge costings, it may be necessary to go to court, again raising the cost of obtaining data. The net result of cost barriers may be a reluctance to continue pursuing the data, even though it is available in principle.
Some cases reveal attempts to move data from the public domain to organizational control. Dr Thomas Mancuso, an epidemiologist at the University of Pittsburgh, undertook long-term studies of the health effects of low-level ionizing radiation, funded by the U.S. Atomic Energy Commission (AEC) beginning in 1965. In 1974, the AEC put pressure on Mancuso to repudiate claims by another researcher that the cancer rate among workers at the AEC's Hanford facility had increased. Mancuso refused. The AEC then arranged a review of Mancuso's study. Although four of the six reviewers supported Mancuso's work, the AEC transferred it to a private research lab, Battelle West, under the supervision of one of the critical reviewers, a former AEC employee (Bross, 1981: 217-222). Mancuso's opportunity to continue the research and publish the results was curtailed, with the prospect that the new home for the research project would have less interest in publishing certain types of results.
Ethical considerations can serve to restrict dissemination of data. For example, records on hospital patients, records of interviews, referees' reports, and any number of other types of data may be restricted in distribution because of concerns about privacy or promises of confidentiality. It can be argued that restricting distribution of data for ethical reasons is entirely proper, but it is also possible that in some cases ethical grounds are used as an excuse to refuse access to data for other reasons. Lewis (1975), given access to referees' reports for his research on professional evaluation in higher education, produced a myth-shattering account of selection procedures; this suggests that an incentive may exist to discourage more research along such lines. Ethics committees also have an impact on research at the point of creation, and it is possible that some suppression of unwelcome research occurs at this point.
One solution in some such cases is to make the data available only in a form which protects the identity of individuals or other information that is considered private. This raises the wider issue of selective availability of data: some data is available and some is not. Again, this may be seen as entirely proper or it may be interpreted as censorship.
Most of the examples so far are based on the presumption that research data is either available or not available and that suppression occurs by preventing creation or dissemination of data. However, another possibility is that information is available in distorted form.
In many government and corporate research organizations, a researcher who writes a report is required to submit it for scrutiny by superiors before it can be published. This might be justified on the grounds of national security, commercial confidentiality, or quality control. If superiors (possibly including referees) so demand, the report may be edited before publication by removal of data, alteration of data, deletion or rewriting of text, and any of a host of other processes of textual transformation. It can extend to removing or changing particular words and to omitting references to particular authors or publications. Authors are used to the sorts of changes that can be requested or imposed by editors. To speak of censorship or of suppressing research data in this context is to refer to such gatekeeping and filtering when it cannot be justified by quality control and when it is imposed on authors.
Although conversations with government and corporate scientists suggest that this sort of censorship is standard practice (Martin, 1997), there are relatively few documented cases. Many scientists accept the conditions under which they work and so have no incentive to expose the way their work is edited for publication. Others are not willing to challenge censorship, since it might jeopardize their jobs or working conditions. Allegations were made that papers on forestry were censored within the Western Australia Department of Conservation and Land Management to remove findings unwelcome to management (Schultz, 1993); the official response was that this editing was just peer review (Armstrong, 1993). In the case of the fluoridation trial in Hastings, New Zealand, Colquhoun and Wilson (1999) reveal that important information about experimental protocol was withheld at the time. Samuels (1999) argues that the glutamate industry has both suppressed information and disseminated misinformation.
Most researchers are alert to organizational cues about what is permitted and expected. Rather than put themselves through a tedious and perhaps unpleasant process of having to rewrite their papers or even redo their experiments, they may anticipate the sorts of objections and concerns of organizational gatekeepers and write their papers so that no changes are needed. In such cases, it might be said that there is no censorship or, alternatively, that researchers are engaging in self-censorship. This is a difficult issue to study, since by its nature there is little data on the process or impacts of self-censorship, and researchers themselves may not be conscious of the process.
Another way by which data can be distorted is through various forms of fraud, including reporting only favorable findings, altering data, and manufacturing data without observations (Bell, 1992; Broad and Wade, 1982; Kohn, 1986). Although cases of major fraud seem relatively rare, various forms of misrepresentation -- such as not reporting errors in experimental procedure, gentle massaging of data to make it look more respectable, and including citations in order to please likely referees -- are so common as to be almost standard practice (Martin, 1992b). Whether categorized as fraud or treated as acceptable research practice, manipulations of data before publication have the potential to give a distorted picture.
Plagiarism is considered a serious violation of scholarly ethics, but its impact on research data is less clear. When someone takes another's research data and publishes it under their own name, the data itself is still available. The only bit of data that is distorted is the authorship of the work (assuming that the research data was not subject to distortion for other reasons). From the point of view of availability of research data, it might seem that plagiarism is not of concern. However, the alteration of authorship is of concern to scientometricians, researchers who study patterns of research through data on authorship and texts. Plagiarism is also of concern when considering accountability for research data.
Another distortion in research data can occur when there are differences in acceptance rates for different types of findings, even though the quality of the work in other regards, such as methodology, is the same. An example, noted earlier, is when it is harder to publish findings that challenge current theoretical expectations than to publish findings that conform to expectations. In such cases, although the research data for any individual published paper may be available, at a meta-level there may be distortions due to an imbalance of certain types of findings. As a consequence, a researcher doing a meta-analysis of findings in the field may come up with a different result than would have been the case if all data (subject to quality considerations alone) had been published.
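To make this meta-level distortion concrete, the following minimal simulation (not from the original article; all numbers are hypothetical assumptions chosen purely for illustration) sketches how a naive pooled average over published studies can drift away from the true effect when findings in one direction are more likely to be accepted.

```python
# Hypothetical illustration of publication bias distorting a pooled estimate.
# Assumes a simple model: each study measures a true effect plus Gaussian noise,
# and "positive" results are more likely to be published than others.

import random
import statistics

random.seed(1)

TRUE_EFFECT = 0.1    # assumed true effect size
STUDY_ERROR = 0.3    # assumed standard error of each study's estimate
N_STUDIES = 1000     # hypothetical number of studies actually conducted

# Every study conducted, whether or not it is eventually published.
all_results = [random.gauss(TRUE_EFFECT, STUDY_ERROR) for _ in range(N_STUDIES)]

# Suppose results above a conventional threshold are always published,
# while the remainder are published only 30% of the time.
published = [r for r in all_results if r > 0.2 or random.random() < 0.3]

print(f"Mean of all studies conducted:  {statistics.mean(all_results):.3f}")
print(f"Mean of published studies only: {statistics.mean(published):.3f}")
# The second average is biased upward even though every published study
# reported its own data accurately; the distortion exists only at the meta-level.
```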
Censorship, fraud, and publication biases are ways in which the availability of research data can be distorted. A different process is distortion of the perception of research data rather than distortion of the data itself. In other words, data is openly available, but efforts are made to shape people's perception of it. Some methods include ignoring the data; attacking it as the product of faulty observation, poor methodology, or fraud; submerging it in a mass of apparently contrary data; and diverting attention away from it.
The field of anomalistics illustrates some of these processes. Anomalies are observations or data that do not conform to current scientific theories or expectations, such as fossils that do not fit the geological record (Fort, 1975). The field of parapsychology fits in here since, for example, the laws of physics are not considered to allow precognition, psychokinesis, or telepathy. One common response to reports of anomalies is to ignore them, assuming that they are incorrect or unimportant. Another response is to attack them as the result of faulty observation, poor methodology, or fraud. A third way by which perception of data on psychic phenomena is distorted is through being lost in a mass of apparently contrary data. Since most scientific research assumes that psychic phenomena do not occur, the scientific literature is filled with experimental results that systematically exclude any consideration of psychic effects. (If, for example, scientists are capable of unconsciously influencing experimental results through their own psychic powers, then changes in experimental protocols would be in order, but this is virtually never done. Few natural scientists even use blind or double-blind methods (Sheldrake, 1998).) Finally, attention to reports of anomalies can be diverted by reports exposing fraud or some other inadequacy. Skeptics who focus on shortcomings in some work in fringe areas -- for example by exposing amateur spoonbenders as fakers -- divert attention from other data that might be worthy of more serious attention.
There are many ways of presenting data, including in tables, graphs, lists, and discussions. Not every bit of data can be presented, so some selections must be made. Often this is by aggregating data in certain ways. In these and other processes of representing data, transformation of data occurs. In some cases, it may be argued that this transformation is sufficiently great or so inappropriate as to be called distortion. For example, some analyses of the health hazards associated with different energy sources have been criticized (Holdren, 1982; Russell and Ferguson, 1980) as seriously misleading.
There is quite a difference between distortion of data, as in censorship or fraud, and distortion of the perception of data. It can be argued that there is no such thing as undistorted perception of data (Hesse, 1974), and indeed that the scholarly processes of testing hypotheses, replicating results, peer review, and the like are intended to influence perception of data, though with the aim of improving understanding of the world. The point here is that in cataloging potential methods of suppressing research data, it is worthwhile looking at cases in which data is nominally available but is discredited or submerged in a welter of contrary data. This is analogous to the way that some news stories are censored in the sense that, although they are published in some outlets, they are not taken up by the mainstream media to the extent that their social importance might suggest (Herman and Chomsky, 1988; Phillips and Project Censored, 1998). It might also be considered analogous to the routine process of making news, which distorts the stories that do appear in accordance with the priority of news values such as an emphasis on conflict and personalities (Bennett, 1988; Tiffen, 1989).
It is worth mentioning an intriguing angle on the issue of distorted data: much information collected by spy organizations is filled with inaccuracies (Kimball, 1983; Mitgang, 1988; Thomas, 1981), arguably due to the bureaucratic systems of secrecy that surround collection and verification of information. However, this information is accessible only within the spy agencies themselves, except when occasionally revealed, usually many years later. Secrecy in this case both limits public access to data -- a problem more widely recognized through the difficulty in accessing one's personal health records or credit ratings -- and provides a seriously distorted picture to those who do have access. Generalizing from this example, it may be asked whether other forms of secrecy and suppression create a distorted picture of the world for those who have access to the information.
Researchers who do research that is unwelcome to certain groups, or who speak out about social issues, sometimes come under attack. Some of the methods used to attack dissenting scholars include ostracism, petty harassment, withdrawal of research grants, blocking of appointments or promotions, punitive transfers, reprimands, demotions, spreading of rumors, dismissal, and blacklisting (Deyo et al., 1997; Glazer and Glazer, 1989; Hess, 1999; Martin, 1996a, 1997; Martin et al., 1986). In many cases, attacks on researchers have the consequence of suppressing research data.
For 20 years, Dr John Coulter was a medical researcher at the Institute of Medical and Veterinary Science in Adelaide, South Australia, where he established and ran a laboratory for testing environmental mutagens, namely substances that cause genetic mutations and are often implicated in causing cancer. He was outspoken on issues of public health and the environment. After some of his public statements about the hazards of chemicals (not made in his work capacity), complaints from chemical manufacturers were made to the head of the Institute. In 1980, Coulter tested a chemical, ethylene oxide, that was used for sterilization at the Institute, and found it to be mutagenic. He released his preliminary report to the workers using the chemical as well as to the Institute director. Subsequently he was dismissed from his position and the environmental mutagens laboratory was shut down (Martin et al., 1986: 123-163).
Dr Melvin Reuber was head of the Experimental Pathology Laboratory at the Frederick Cancer Research Center, part of the National Cancer Institute in the United States. He was the author of numerous studies of the cancer-causing properties of pesticides, studies which were used by opponents of pesticides. In 1981, the director of the center, who previously had given Reuber the highest commendations, issued a harsh reprimand questioning his research. The substance of the reprimand was published in Pesticide & Toxic Chemical News (1981), a newsletter of the petrochemical industry. Reuber resigned under the stress (Martin 1996b; Schneider 1982).
In both these cases, scientists who were undertaking research that was unwelcome in some quarters came under attack. This had several consequences. One is that their research was terminated at an individual level; neither scientist returned to research. With the shutting down of the environmental mutagens testing laboratory, ongoing research in the area was also terminated. Thus the attacks could be said to have the effect of suppressing research data, in the sense that data these scientists might otherwise have been expected to produce was not created. A second consequence is that their energies were turned to court battles in the following years rather than research. A third consequence is that their credibility as researchers was questioned, both by direct statements -- most notably the publication of the reprimand of Reuber, which was used to discredit his work around the country and beyond -- and by loss of their positions as researchers at significant scientific facilities. This served to discredit the data that they had produced, at least in some people's eyes. Finally, the public spectacle of a scientist coming under attack can serve as a warning to other scientists about the risks of undertaking certain types of research or making certain types of public statements. Specifically, attacks may serve to inhibit other scientists from producing or disseminating certain types of research data. In summary, attacks on scientists can have a multitude of impacts, some of which are inhibition, discouragement, or direct blocking of the creation or dissemination of research data.
Table 1 lists various methods raised so far that can cause suppression of research data, noting for each one whether it principally operates via creation, control, dissemination, or distortion of the data.
Table 1. Methods of suppressing research data and the process through which each principally operates

Method                           Principal process
no interest in research topic    creation
no funding                       creation
organizational secrecy           control
legislated secrecy               control
refusal to reply                 dissemination
destruction                      control
defamation                       dissemination
intellectual property            dissemination
cost of data                     dissemination
censorship                       distortion
fraud                            distortion
publication biases               distortion
ethics codes                     all four processes (see text)
discrediting of data             distortion
diversionary data                distortion
attacks on researchers           all four processes (see text)
This table should be treated as an aid to thinking on the issues rather than a definitive statement. In principle, nearly every method could have impacts on all four areas of creation, control, dissemination, and distortion. For example: the high cost of purchasing some data might inhibit creation of further data; lack of funding for some areas in a sense distorts the profile of research data that is published.
The classification of impacts into the categories of creation, control, dissemination, and distortion is arbitrary. It is especially useful in distinguishing between methods that fall mainly into one category but not so useful in illuminating methods that fit into all the categories, in particular ethics codes and attacks on researchers.
Section 6 will address some questions concerning the classification of methods of suppressing research data and the assessment of claims that suppression occurs. The next section, though, considers how to explain suppression.
Why does suppression of research data occur? What are the motivations of suppressors? What social conditions or social structures facilitate, accommodate, or inhibit suppression? Tackling these sorts of questions quickly leads into general analysis of the dynamics of research and society. The social context for suppression is, obviously, an enormous topic. Here, an outline is given of a number of perspectives on research, commenting on the value of each for pursuing an understanding of suppressing research data. This outline does not claim to be comprehensive, and is intended only to introduce some directions for study.
Most writings on the nature and operation of research do not refer to cases of suppression, whether using the word suppression or not, and do not conceptualize the phenomenon (Martin, forthcoming). The implicit assumption in this literature is that suppression does not occur or, if it does, it is an anomalous occurrence that is incidental to the central dynamics of research. Many cases can be explained as proper behavior and following policy rather than suppression. Policies such as organizational secrecy are either not conceptualized as suppression, not treated as problematical, or not addressed at all.
In this perspective, cases of suppression are attributed to individual flaws, analogous to the way that many commentators treat cases of scientific fraud as due to the failings of particular individuals. Suppression in this case is treated at the level of individual psychology, as when scholars' refusal to supply data about their own research is attributed to their own personal agendas or shortcomings. There are several motivations that can be advanced to explain suppression of research data, including:
Since this approach operates at the level of psychology, only some types of suppression are considered; systemic suppression, such as that due to lack of funding or organizational secrecy, is seldom considered, except when particular cases deviate from the usual pattern. In some types of cases, suppression can even be endorsed, as in a celebratory account of a scientist's success in making a discovery and patenting it to block others from exploiting it.
An overly committed scholar might also fail to report conflicting data due to the unconscious error of overlooking it. This is indeed suppression at the level of psychology.
Many types of suppression can be linked to the presence of interests in the production of knowledge. In brief, interests in this context are things that individuals or groups have to gain or lose by availability of research data. There is a body of analysis of science that looks at the role of interests in shaping research policy, research practices, and scientific knowledge (Barnes, 1977; Boffey, 1975; Dickson, 1984; Noble, 1977; Primack and von Hippel, 1974).
The most important interest groups impacting on research are government bodies, corporations, professions, and elite researchers. For example, states have backed nuclear power and are linked to suppression of antinuclear scientists; chemical companies have promoted pesticides and are linked to suppression of scientists critical of pesticides; the dental profession has backed fluoridation and is linked to suppression of antifluoridation scientists. In each of these cases, hierarchy in science provides a means by which state, corporate, and professional interest groups can influence day-to-day research (Martin, forthcoming).
Interests analysis provides a straightforward way to understand several methods of suppressing research data. States and corporations typically seek to fund research that has direct or potential benefits to them. For example, pharmaceutical companies have an interest in funding research into drugs that they can patent but not research into freely available substances. Energy companies and allied government agencies undermine development of renewable energy by buying up solar energy patents and companies, cutting funding of renewable energy research, and maintaining policies that advantage coal, oil, and nuclear power (Berman and OConnor, 1996; Reece, 1979). Organizational secrecy is seen by managers as a way of serving the interests of the organization. The expansion of intellectual property rights is promoted by those states and corporations with the most to gain (Drahos, 1996). State elites invoke national security to maintain secrecy in order to protect state interests against enemy states or internal critics. Individual researchers may restrict access to data in order to gain an edge on rivals.
Interests analysis provides a way to unify understanding of a mix of methods of suppression, as in the case of a corporation, government agency, industry, or profession that is involved in any or all of falsifying data, manipulating results, using funding to set research agendas and capture university scientists, using the courts to stop challengers, using silencing agreements to gag critics, using public relations and front groups to confuse the public, and attacking unwelcome results (Epstein, 1978; Fagin et al., 1996; Hess, 1999; Insight Team, 1979; Moore, 1995; Nader and Smith, 1996; Samuels, 1999).
Interests analysis is open to the criticism that it relies on attribution, by the person undertaking the interests analysis, of interests to particular groups. In spite of this and other limitations, interests analysis provides a useful framework for understanding many of the methods of suppressing research data.
The idea of paradigms, worldviews, or belief systems can be used to explain some instances of suppressing research data. When a community of researchers sees things through the lens of a common set of assumptions and undertakes its research using a standard set of practices, then data and claims that fall outside the common assumptions and standard practices may be ignored, discarded, or dismissed. This is the fate of anomalies within Kuhn's (1970) classic picture of scientific fields in the grip of paradigms.
The idea of paradigms helps explain why some fields are not funded or investigated: funders and researchers simply do not believe they are worth studying. Anomalous data, if it cannot be ignored, may be attacked and discredited because it is a threat to the paradigm. Scientists who challenge a paradigm may encounter hostility and suppression. Thompson's (1999) experiences in dealing with quantum entanglement seem to reflect the difficulties of confronting an entrenched belief system; likewise, Colquhoun and Wilson's (1999) study suggests the existence of a deep-seated belief in fluoridation by some researchers (see also Colquhoun, 1990).
Whether or not the contentious concept of paradigm is invoked, beliefs certainly play a role in many types of suppression. For example, organizational secrecy is typically associated with a belief that openness is harmful to the organization and perhaps society. Support for intellectual property is linked to beliefs in rights over the products of one's labor and in social benefits from granting monopoly privilege. Interest groups typically believe fiercely in their own cause, and often this is linked to a wider set of beliefs about knowledge and appropriate behavior.
Society can be analyzed in terms of social structures such as capitalism, patriarchy, bureaucracy, and the family. Research can be similarly analyzed, for example in terms of capitalist drives for profit and social control (Rose and Rose, 1976a, 1976b). This has similarities to interests analysis, except that structural analysis tends to look at broader and more pervasive systems of power whereas interests analysis more commonly looks at particular industries or groups. It is quite possible to mesh interests analysis and structural analysis (e.g. Noble, 1977).
In understanding suppression of research data, structural analysis is most useful in capturing broad patterns. For example, bureaucracy, namely the organization of work in a system based on hierarchy, division of labor, and standard rules, has a tendency to be built on secrecy, since control over information is one of the key ways by which bureaucratic elites maintain power (Hummel, 1977; Perrow, 1979). Therefore, it can be expected that in any organization that operates as a bureaucracy -- including government departments, large corporations, and some large university research laboratories -- there would be pressures to control research data. This may help to explain resistance to freedom of information requests and the imposition of high costs for providing data.
Many instances of suppressing research data, such as a scholar's refusal to supply data, are too specific to be easily grasped by structural analysis. What structural analysis can do is provide an understanding of the wider contours of power that provide incentives and opportunities for either obtaining data or making it difficult to get.
No single approach to analyzing research provides an ideal way of explaining every method of suppressing research data. Hence it makes sense to use different approaches, or a combination of approaches, as appropriate. Structural analysis can help in understanding patterns of incentives for dealing with data. Belief systems are created within these patterns and help to shape them. Interests help to explain the dynamics of particular industries and organizations. Psychology can help to explain what happens at the level of individuals, who also have interests and are shaped by belief systems and social structures. Individual actions often reflect broader patterns of power but sometimes operate in the face of them. Finally, the option of disregarding evidence of suppression, while not useful for explaining suppression, nevertheless can be a useful reminder that other explanations, not invoking suppression, deserve a hearing.
The concept of accountability can be used in various ways, including to justify secrecy and suppression, to justify harassment of researchers, and to justify openness. Here, some of these uses of the concept of accountability are outlined in relation to methods of suppressing research data.
For most of the time in day-to-day work, researchers are not called upon to justify the systems in which they work, including systems responsible for suppressing research data such as funding priorities, organizational secrecy, or the cost of data. However, when challenges are made to business as usual, the concept of accountability can be used to justify these practices. Three common methods for doing this involve invoking accountability to employers, invoking accountability to the law, and restricting the idea of accountability to the quality of research.
Most researchers are employees, whether of government bodies, corporations, or universities. Employees are expected to be accountable to employers, either formally as workers who are subject to the managerial hierarchy or professionally in terms of an expectation of loyalty and appropriate behavior. Researchers who are employees can therefore justify their acquiescence in policies and practices that suppress research data by referring to their accountability to employers. This applies, for example, when research is not done because no funding is supplied, when organizational secrecy prevents research from being published or supplied, and when managers take active steps to discredit data or attack dissidents.
Legal accountability can also be invoked to justify suppression, as in the cases of official secrets acts and intellectual property law.
Researchers can simply deny responsibility for the uses of their research, saying that their job is to do good research, while applications are the responsibility of managers, politicians, and others. This defense of acquiescence restricts accountability to quality assessments by peers. It does not endorse suppression but does nothing to counter it.
The rhetoric of accountability can sometimes be used to harass or undermine a researcher. This is most obvious when bosses demand accountability from employees, which can include audits, surveillance, and data monitoring systems, sometimes routinely applied and sometimes targeted at employees who are considered a threat. However, even when openness is the ostensible aim, accountability can be used to justify harassment or attack, as in the case of excessive or unending demands for data, or freedom of information requests, that seriously hinder a researcher's work. The process of discovery in a defamation action might seem on the surface to be compatible with a commitment to openness about data, but can be part of a wider context of using defamation actions to discourage research or publication on certain topics. Journalist Chris Nicholls, who exposed political corruption in South Australia, was charged with contempt of court for refusing to name his sources and went to prison (Nicholls, 1994), a case where ostensible concern about openness in sources served as a screen for attacking someone who had revealed information threatening to powerful interests. Because accountability is widely seen to be a good thing, it is to be expected that individuals and groups will seek to use the rhetoric of accountability to justify their actions. Thus, it is important to take a close look at who is accountable to whom and at the implications of the system of accountability.
There are several ways to use the concept of accountability to justify openness and to counter justifications for suppression. The first is grounded in Merton's norm of communism: an accountability to scholarly peers to make results available to the research community, which is a collective enterprise that thrives when those who draw from it, for inspiration and validation, return to it their findings for others to build on. The implication is that researchers have an obligation to publish their findings, to engage in dialog, and to respond to reasonable requests for data. This level of accountability to peers implies rejecting controls on data such as organizational secrecy and intellectual property. Even the standard system of anonymous peer review, in which the identity of referees is secret, reduces accountability to peers (Horrobin, 1974).
A second way to justify openness is to refer to accountability to groups in the wider community (beyond the research community), or to society as a whole. When results have implications for members of the community, then it can be considered that researchers have an obligation to make those results available. Obvious examples include research that reveals hazards to the environment and research about the health benefits of particular substances. Accountability to sectors of the community who favor increased efforts for nonmilitary approaches to security might suggest that information about military research be made available so that an informed debate about it can occur (and even that campaigns can be mounted against it) and so that areas where no research is occurring can be made apparent.
Since different groups in the community have different values and goals, it is seldom possible to definitively specify the public interest. Instead, there are typically competing and overlapping interests within the community, which lead to different conclusions about the appropriate stance in relation to research data. For example, groups supporting the current system of military defense could argue that secrecy in military research organizations benefits society and overrides other values such as openness. Others would argue that military secrecy, or at least a great deal of it, is harmful to society and that greater accountability to outside scrutiny would actually strengthen security generally and perhaps even military effectiveness.
It is possible to pursue the issue of diversity to greater lengths, noting that values and interests are not pre-given but are constructed in specific situations and depend on a range of contingencies, and that concepts such as the public interest or openness are constructions that reflect an attempt to foreclose debate over the very areas that need examination. This postmodernist approach is valuable in highlighting the differences and varied circumstances that are obscured in conventional accounts. However, if the concept of accountability is to be anything more than a rhetorical ploy used in particular ways in particular circumstances, and is to have some general level of applicability, then the postmodern interest in particularity, difference, and contingency is better used as an input into a wider picture involving patterns of power and collective goals than as the central focus of attention.
When suppression of research data is considered acceptable, there is no need to take any action. However, when it is considered inappropriate, the issue arises of how to respond to it. Here, various responses are outlined, most of which are based on the assumption that openness is a good thing either in itself or to serve particular individual or social goals.
An immediate and straightforward way to deal with unavailability of research data is to request it from the relevant researcher or organization. There need not be any presumption that suppression is involved, since the unavailability might be due to oversight, shortage of resources, or just lack of a request. Individual efforts in principle can lead to wider changes, since if more individuals request data, then expectations may change about supplying it. However, individual requests have little prospect of changing patterns of suppression such as organizational secrecy.
Another approach, available in some circumstances, is to use freedom of information (FOI) legislation to request data. This can sometimes succeed in obtaining data, as in the case of New Zealand government archives dealing with fluoridation trials (Colquhoun and Wilson, 1999). In any case, use of FOI sends a signal about the seriousness of the request, which cannot be ignored as easily as a personal request. (FOI normally applies only to government bodies, and its reach may be restricted when government agencies are corporatized.)
If data is being suppressed due to organizational secrecy, censorship, or high costs, this can be directly countered by making it available. There are two basic approaches here: leaking and exposing. The leaker is typically an insider with routine access to data who, in violation of laws, regulations, or common practice, makes it available to outsiders. For example, thousands of pages of confidential internal memoranda from the tobacco companies Brown & Williamson and its parent BAT Industries were sent in 1994 to Stanton Glantz, medical researcher and smoking opponent, by an anonymous source (Glantz et al., 1996).
Leaking is a standard process in the operations of governments; material may be leaked to interested groups, such as social welfare bodies or opposition political parties, and to the media. When leaks threaten a powerful group such as senior managers or politicians, there may be energetic efforts to find and punish the leaker. When senior managers or politicians themselves leak information to the media, this is often considered just a part of the routine management of public opinion. In either case, information that would otherwise be secret or expensive to obtain is made available to a wider audience. However, selective leaking -- revealing just part of the picture -- can contribute to another type of suppression, namely distortion through the circulation of diversionary data or of data intended to discredit other data.
An exposé is similar to a leak except that it is usually produced by an outsider who, through searching archives, interviewing workers, or other investigative techniques, reveals information that is otherwise unavailable. Investigative journalists make their reputations through exposés, which can be in fields as diverse as arms sales and the funeral business. In many cases, exposés reveal information of great interest to researchers. For example, Masson (1984), who had access to the Sigmund Freud archives, revealed previously concealed information about the development of Freud's theories. (After some of his findings were reported, a wave of protest led to his dismissal from the archives.) Tom Curtis (1992) interviewed polio pioneers and leading researchers to obtain information about a theory of the origin of AIDS. Nicky Hager (1996), a veteran peace activist and researcher, revealed highly secret information about electronic spying by major governments by piecing together the scarce public documents and contacting numerous spy agency employees, winning enough of their confidence to obtain further information. Howard Morland, using publicly available information, revealed the secret of the H-bomb in an article intended to appear in The Progressive. The US government used a court order to prevent publication, but after months of legal struggle and the publication of similar information elsewhere, the government abandoned the case, after which the article was published (Morland, 1979, 1981). Many exposés by outsiders depend on leaks from insiders (e.g. Toohey and Wilkinson, 1987).
When a researcher or manager is expected or required to suppress data, this can be challenged simply by not doing it, that is, by refusing. For example:
Refusal to suppress data is important at an immediate level, in freeing up particular information or preventing attacks on particular researchers, and at a more general level, in providing an example to others of tolerance, sharing, and openness. In some cases, norms of openness prevail even in the face of the law, as in the widespread violation of copyright law by professors who photocopy articles and portions of books (Ellickson, 1991: 258-264). Because refusal to suppress data is seldom visible -- it is often a lack of action rather than an action -- it appears not to have been studied. It can be argued that it is one of the most important responses to suppression of research data.
Much suppression takes place in secret. For example, when a researcher is chastised, threatened, or formally reprimanded by superiors for speaking out, neither the researcher nor the superiors may seek publicity. The researcher may be embarrassed or not want to cause more trouble, and the superiors may not wish to draw attention to their action. Suppressing research data thus can be a doubly secret process, in that both the data and the suppression remain secret.
Suppression is harder to carry out when it is visible, at least in cases where the action can be interpreted by outsiders as improper or harmful. It is much easier to avoid publicity when blocking the appointment of a little-known dissident than when dismissing a prominent one from a secure post. Exposing suppression is therefore a potent challenge to suppression itself.
When McDonald's decided to sue several members of the anarchist group London Greenpeace for defamation over their production of a leaflet about McDonald's, two of them, Helen Steel and Dave Morris, decided to fight in court rather than acquiesce. This eventually led to a massive defense campaign, including a McLibel web site containing vast amounts of information about McDonald's as well as about the defamation case (Vidal, 1997). By both exposing the suppression and making more information available, the McLibel campaign caused enormous damage to the reputation of McDonald's and sent a message to other corporations that might have been contemplating using defamation actions to suppress criticism.
If data is not available, one response is to create new data: for example, to redo experiments, to undertake new surveys, or to work out an existing technological design by reverse engineering. To get around the constraints of intellectual property, corporations spend enormous amounts of money developing products that are sufficiently new to avoid infringing patents and copyrights, as in the case of pharmaceutical drugs and computer software. The Free Software Foundation encourages the development of software that anyone is free to use, modify, and share. Creating new data is typically laborious and expensive; it can be considered one of the costs of suppression.
Since the late 1960s, a number of groups have been set up to draw attention to the misuses of science, such as the British Society for Social Responsibility in Science, Scientists and Engineers for Social and Political Action, and Scientists for Global Responsibility. These groups have raised concerns about the uses, directions, and beneficiaries of science, dealing for example with workplace hazards, exploitation of Third World peoples, and more generally the use of science and technology for profit and social control (Arditti et al., 1980). Although suppressing research data has not been a central focus of these groups, it has often been a relevant factor in their campaigns. One of their central concerns has been priorities in science funding and research, for example the contrast between the vast funding for military research and development and the little available for renewable energy and other appropriate technologies. The movement for social responsibility in science thus has been important in putting on the agenda the issue of socially important areas of research that are suppressed by being starved of funding.
Other social movements provide a challenge to suppression, typically on particular issues. The environmental movement, for example, has challenged secrecy in government departments, dealt with defamation actions that deter public debate on environmental issues, exposed censorship, and been the subject of sophisticated corporate campaigns to mislead the public about environmental impacts (Beder, 1997; Stauber and Rampton, 1995). The alternative health movement can provide resources to support the development and testing of cancer therapies that are suppressed by mainstream medicine (Hess, 1999). There are also groups whose activities are directly relevant to particular types of suppression, such as free speech committees, civil liberties councils, whistleblower groups, and human rights organizations that challenge censorship and oppose attacks on dissidents. These sorts of groups often are reservoirs of experience in challenging suppression. There remains much to be learned from such groups by those opposing suppression of research data.
There is evidence that a range of methods is used to suppress research data: preventing its creation, controlling it, blocking its dissemination, or distorting it. There is also evidence of attempts to challenge this suppression in a variety of ways. Hence, while Merton's (1949) norm of communism may not be a good empirical description of the operation of research, the norm, and the related value of openness, can serve as a means for understanding struggles over research. The concepts of openness and suppression help to unify understanding of a wide variety of policies and practices, including funding, secrecy, intellectual property, and attacks on researchers, as summarized earlier in Table 1.
In practical terms, an appeal to openness can be used as a means of pursuing the creation of, or access to, data, and reference to suppression can be used as a means of criticizing certain controls over data. In this sense, the concepts of openness and suppression operate as rhetorical tools in struggles over data, as well as aids to understanding the dynamics of research.
To speak of suppressing research data is not to say that it is necessarily a good or bad thing. If, from a person's point of view, producing data or making it available has undesirable effects, then that person may find suppressing the data to be an appropriate act. For example, data on dissidents might be used by a repressive regime for arrests or executions, or data on small hazards from a beneficial chemical might be used by critics to prevent its availability. Obviously, evaluating instances of suppressing research data depends on the values of the evaluator.
Nevertheless, the word "suppression" has a negative connotation, as do "censorship" and "secrecy", and those who engage in activities that might be described by these words often prefer alternative language such as "quality control" and "confidentiality". Since to label an act as suppression is to make an implied value judgement, namely that the act should not occur or should be condemned, it makes sense to provide an appropriate definition of suppression. Here is one attempt.
It is worth examining a few elements in this definition. The definition of direct suppression, by referring to "action taken by interested parties", assumes the existence of those who may be called suppressors. However, this need not be a conscious process on their part. The expression "in conflict with norms of research practice" can help to distinguish suppression from actions that are widely accepted as ethical. For example, in evaluating research grant applications, giving a low score may have the effect of preventing the creation of research data. If the evaluation is carried out on grounds of merit -- a norm of research practice -- then it should not be called suppression. On the other hand, if the evaluation is carried out in violation of conventional norms, or the decision about funding conflicts with peer assessments, as in the Mancuso case described in section 2.3, then suppression may be involved. The expression "norms of research practice" is deliberately vague. Within a military research facility, for example, censorship of research is the norm and no suppression occurs unless the censorship is above and beyond what is standard in the facility. However, from the point of view of researchers in what might be called the open component of the research system, military censorship is a form of suppression. This is somewhat like the way Abraham (1993, 1994, 1995) defines bias in research, namely in relation to commonly accepted knowledge and standards of logic.
Another portion of the definition refers to "action ... taken by a person or group with an interest in the outcome". The concept of interest here refers to a financial, bureaucratic, career, or ideological stake, and is to be distinguished from altruism or adherence to universalistic norms. What must be involved for suppression to occur is self-interest at some level. For example, a medical researcher might decline to reveal patient details on the grounds of privacy; if this is of no particular benefit to the researcher, then direct suppression cannot be said to occur. However, if the medical researcher keeps certain patient details confidential that could be released according to ethics codes and which help the researcher keep ahead of rivals or hide questionable activities, then suppression might be said to be involved. In some cases, suppression may occur even though actions can be justified on other grounds; in such cases, the actions can be said to be overdetermined.
The definition of indirect suppression of research data is much broader, invoking a social structural dimension to the process. Whether the concept of indirect suppression is useful remains to be determined. As noted above, this is an initial attempt to define suppression of research data, and the definition no doubt has inadequacies for some purposes.
Defining suppression is one issue; deciding whether it has occurred is another. This is a major topic in itself, which can only be mentioned here. There are a few convenient indicators that suggest suppression may be occurring.
Ultimately, there is no way to definitively prove suppression. Almost never does anyone admit to suppressing research data; when data is kept secret or its dissemination blocked, this is either not admitted or is justified on grounds that are widely considered legitimate. However, although showing the existence of suppression is usually difficult, it is still worthwhile investigating cases and providing evidence that suggests, to a greater or lesser degree, that suppression is involved.
The different ways in which research data can be suppressed, indicated by the categories of creation, control, dissemination, and distortion, suggest an analogy in communication theory. The classical mathematical theory of communication is encapsulated in Figure 1 (Shannon and Weaver, 1949).
In the case of research data, the information source may be taken to be the researcher and the destination may be another researcher or someone else interested in the results. There are various transmitters and receivers along the way, such as the researcher's eyes and fingers, computers, and scholarly journals. Thinking more broadly, the researcher's organization might be said to be a transmitter too. Finally, the noise source can include everything from problems in computer file conversion to organizational arrangements that distort communication.
The limitations of this model of communication, which can be called signal transmission theory, are well known. They include difficulties in dealing with interactive communication, with the meaning of messages (rather than just the quantity of information), and with the social context (such as organizational culture). Nevertheless, for all its limitations, signal transmission theory can be used to bring out insights. Leiss (1994) shows how the theory has been adapted to study communication of information about health and environmental risks, by speaking of problems of communication associated with the source, channel, message, and so forth. With a little adaptation, the same process can be applied to communication of research data: see Figure 2.
Methods of suppressing research data can be mapped onto this model.
In this picture, suppression is a problem of communication, and different types of suppression operate at different stages of the communication process.
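As a purely illustrative sketch, the mapping can be rendered as a simple lookup from stages of the communication process to the kinds of suppression discussed above. The stage labels and the assignment of methods to stages below are assumptions made for the purpose of illustration rather than a reproduction of Figure 2.

# Illustrative sketch only: stages of the signal transmission model paired
# with types of suppression discussed in the text. The stage labels and
# assignments are indicative, not definitive.
SUPPRESSION_BY_STAGE = {
    "source (researcher)": [
        "research prevented by lack of funding",
        "pressure not to do the research",
    ],
    "transmitter (researcher, organization)": [
        "organizational secrecy",
        "censorship of reports",
    ],
    "channel (journals, media, archives)": [
        "blocked publication",
        "high cost of access",
    ],
    "receiver/destination (other researchers, public)": [
        "refusal to respond to requests",
        "defamation threats against those using the data",
    ],
    "noise": [
        "diversionary data",
        "data intended to discredit other data",
    ],
}

def stages_for(method):
    """Return the stages at which a given method of suppression operates."""
    return [stage for stage, methods in SUPPRESSION_BY_STAGE.items()
            if method in methods]

print(stages_for("organizational secrecy"))

Such a rendering adds nothing analytically, but it makes explicit that each method of suppression is located at a particular point in the communication process.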
This model has the advantage of highlighting some assumptions underlying the usual conception of research data. Note that in signal transmission theory, the information being communicated is treated as unproblematic. The whole issue of systems of meaning embedded in signs and social relations, which is the subject of study in the field of semiotics, is entirely missing. This is congruent with the usual positivist assumption in science that data is an unproblematical reflection of the reality of nature. Scientists collect data and report it for others to consume.
The limitations of the positivist conception of data have been expounded from a number of different perspectives, including the sociology of scientific knowledge, which explores the social processes involved in constructing both theories and facts (Barnes, 1974; Bloor, 1976; Mulkay, 1979). This approach rejects the earlier sociology of science based on norms of scientific practice (Merton, 1973). While the sociology of scientific knowledge is useful for gaining insights into certain aspects of science, for understanding suppression it can still be useful to revisit earlier models of science, such as that represented in Figure 2, while keeping their limitations in mind.
Methods of challenging suppression of research data can also be mapped onto the model in Figure 2.
These points suggest that if suppression is conceived as a problem in communication of research data, then there are several different types of responses: dealing with a specific bottleneck, setting up alternative communication systems (sources, transmitters, channels), and exposing failures of the system. These responses are primarily aimed at improving access to information; the problem of distortion -- a message problem -- is seldom directly addressed. In principle, one redress for distortion is the availability of correct information. However, this may be too little and too late, as in the case of the leaked documents about tobacco company research on the hazards of cigarettes, which came decades after the information would have had its biggest impact and even today must compete with the industry's promotion of its own agenda (Glantz et al., 1996).
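As with the methods of suppression, this grouping of responses can be rendered, purely for illustration, as a lookup from the three types of response to examples drawn from the earlier discussion; again the assignments are indicative only, not a definitive classification.

# Illustrative sketch only: the three types of response identified above,
# each paired with examples from the earlier discussion.
RESPONSES_BY_TYPE = {
    "dealing with a specific bottleneck": [
        "individual requests for data",
        "freedom of information requests",
    ],
    "setting up alternative communication systems": [
        "leaking",
        "exposés",
        "creating new data",
    ],
    "exposing failures of the system": [
        "publicity about suppression",
        "social movement campaigns",
    ],
}

for response_type, examples in RESPONSES_BY_TYPE.items():
    print(response_type + ": " + ", ".join(examples))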
While the communication model for understanding suppression of research data can provide insight and help to operationalize understanding of obstacles to the goal of openness, it gives little guidance on the ease or difficulty in opposing suppression. To get a handle on the forces behind various types of suppression, analyses of interests, belief systems, or structures (section 3) are most likely to be helpful.
Finally, and not least, there is the issue of the desirability of suppression. Value judgments are obviously involved here. In most discussions, the phenomena here called suppression are either not discussed or not questioned, with the implicit value judgment being that there is no problem. Conceptualizing suppression and examining the evidence are two vital steps for those who wish to change this situation.
Kate Bowles, Mark Diesendorf, Don Eldridge, Stewart Russell, and Wendy Varney made valuable comments on a draft of this paper. I also thank the many individuals who have given me advice, critique, insights, and references on suppression over the years.
Special thanks go to Kate Bowles, who commented as a referee on all the papers in this special issue of Accountability in Research.
Abraham, J. (1993) Scientific standards and institutional interests: carcinogenic risk assessment of Benoxaprofen in the UK and US. Social Studies of Science 23:387-444.
Abraham, J. (1994) Bias in science and medical knowledge: the Opren controversy. Sociology 28:717-736.
Abraham, J. (1995) Science, Politics and the Pharmaceutical Industry: Controversy and Bias in Drug Regulation. London: ICL Press.
Arditti, R., Brennan, P., and Cavrak, S. (eds) (1980) Science and Liberation. Boston: South End Press.
Armstrong, J. (1993) Remain CALM -- its only peer review at work. Search 24(4):98-100.
Armstrong, J.S. (1996) The ombudsman: management folklore and management science -- on portfolio planning, escalation bias, and such. Interfaces 26(4):25-55.
Armstrong, J.S. (1997) Peer review for journals: evidence on quality control, fairness, and innovation. Science and Engineering Ethics 3:63-84.
Bamford, J. (1983) The Puzzle Palace. New York: Penguin.
Barber, B. (1952) Science and the Social Order. Glencoe, Illinois: Free Press.
Barnes, B. (1974) Scientific Knowledge and Sociological Theory. London: Routledge and Kegan Paul.
Barnes, B. (1977) Interests and the Growth of Knowledge. London: Routledge and Kegan Paul.
Beder, S. (1997) Global Spin: The Corporate Assault on Environmentalism. Melbourne: Scribe.
Bell, R. (1992) Impure Science: Fraud, Compromise and Political Influence in Scientific Research. New York: Wiley.
Bennett, W.L. (1988) News: The Politics of Illusion. New York: Longman.
Berman, D.M. and O'Connor, J.T. (1996) Who Owns the Sun? People, Politics, and the Struggle for a Solar Economy. White River Junction, Vermont: Chelsea Green.
Bloor, D. (1976) Knowledge and Social Imagery. London: Routledge and Kegan Paul.
Boffey, P.M. (1975) The Brain Bank of America: An Inquiry into the Politics of Science. New York: McGraw-Hill.
Broad, W. and Wade, N. (1982) Betrayers of the Truth: Fraud and Deceit in the Halls of Science. New York: Simon and Schuster.
Bross, I.D.J. (1981) Scientific Strategies to Save your Life. New York: Marcel Dekker.
Campanario, J.M. (1995) On influential books and journal articles initially rejected because of negative referees evaluations. Science Communication 16(3):304-325.
Colquhoun, J. (1990) Flawed foundation: a re-examination of the scientific basis for a dental benefit from fluoridation. Community Health Studies 14(3):288-296.
Colquhoun, J. and Wilson, B. (1999) The lost control and other mysteries: further revelations on New Zealand's fluoridation trial. Accountability in Research 6(4): 373-394.
Cowan, P., Egleson, N., and Hentoff, N. (1974) State Secrets: Police Surveillance in America. New York: Holt, Rinehart, and Winston.
Curtis, M.K. (1995) Monkey trials: science, defamation, and the suppression of dissent. William & Mary Bill of Rights Journal 4(2):507-593.
Curtis, T. (1992) The origin of AIDS. Rolling Stone 19 March:54-61, 106-108.
De Blas, A. (1994) Environmental Effects of Mount Lyell Operations on Macquarie Harbour and Strahan. Sydney: Australian Centre for Independent Journalism, University of Technology, Sydney.
Deyo, R.A., Psaty, B.M., Simon, G., Wagner, E.H., and Omenn, G.S. (1997) The messenger under attack -- intimidation of researchers by special-interest groups. New England Journal of Medicine 336 (17 April):1176-1180.
Dickson, D. (1984) The New Politics of Science. New York: Pantheon.
Diesendorf, M. and Diesendorf, A. (1997) Suppression by medical journals of a warning about overdosing formula-fed infants with fluoride. Accountability in Research 5:225-237.
Drahos, P. (1996) Global property rights in information: the story of TRIPS at the GATT. Prometheus 13(1):6-19.
Dunford, R. (1987) The suppression of technology as a strategy for controlling resource dependence. Administrative Science Quarterly 32:512-525.
Ellickson, R.C. (1991) Order Without Law: How Neighbors Settle Disputes. Cambridge, Massachusetts: Harvard University Press.
Ensign, T. and Alcalay, G. (1994) Duck and cover(up): U.S. radiation testing on humans. CovertAction Quarterly 59 (Summer):28-35, 65.
Epstein, S.S. (1978) The Politics of Cancer. San Francisco: Sierra Club Books.
Epstein, S.S. (1989) Corporate crime: can we trust industry-derived safety studies? Ecologist 19(1):23-30.
Epstein, W.M. (1990) Confirmational response bias among social work journals. Science, Technology, & Human Values 15:9-38.
Fagin, D., Lavelle, M. and the Center for Public Integrity (1996) Toxic Deception: How the Chemical Industry Manipulates Science, Bends the Law, and Endangers Your Health. Secaucus, New Jersey: Carol.
Fort, C. (1975) The Complete Books of Charles Fort. New York: Dover.
Freeman, L.J. (1981) Nuclear Witnesses: Insiders Speak Out. New York: Norton.
Gellhorn, W. (1950) Security, Loyalty, and Science. Ithaca: Cornell University Press.
Glantz, S.A., Slade, J., Bero, L.A., Hanauer, P. and Barnes, D.E. (1996) The Cigarette Papers. Berkeley: University of California Press.
Glasstone, S. (ed) (1962) The Effects of Atomic Weapons. Washington, DC: United States Atomic Energy Commission.
Glazer, M.P., and Glazer, P.M. (1989) The Whistleblowers: Exposing Corruption in Government and Industry. New York: Basic Books.
Hager, N. (1996) Secret Power: New Zealand's Role in the International Spy Network. Nelson, New Zealand: Craig Potton.
Halperin, M.H., Berman, J.J., Borosage, R.L., and Marwick, C.M. (1976) The Lawless State: The Crimes of the U.S. Intelligence Agencies. Harmondsworth: Penguin.
Herman, E.S. and Chomsky, N. (1988) Manufacturing Consent: The Political Economy of the Mass Media. New York: Pantheon.
Hess, D.J. (1992) Disciplining heterodoxy, circumventing discipline: parapsychology, anthropologically. In: Hess, D. and Layne, L. (eds). Knowledge and Society: The Anthropology of Science and Technology, Vol. 9. Greenwich, CT: JAI Press, pp. 223-252.
Hess, D.J. (1999) Suppression, bias, and selection in science: the case of cancer research. Accountability in Research 6(4): 245-257.
Hesse, M. (1974) The Structure of Scientific Inference. London: Macmillan.
Hileman, B. (1998) Industrys privacy rights: is science shortchanged? Chemical & Engineering News 76 (17 August):36.
Holdren, J.P. (1982) Energy hazards: what to measure, what to compare. Technology Review 85(3):33-38, 74-75.
Horrobin, D.F. (1974) Referees and research administrators: barriers to scientific research? British Medical Journal 2 (27 April):216-218.
Horrobin, D.F. (1990) The philosophical basis of peer review and the suppression of innovation. Journal of the American Medical Association 263(10):1438-1441.
Hummel, R.P. (1977) The Bureaucratic Experience. New York: St. Martin's Press.
The Insight Team of The Sunday Times (Knightley, P., Evans, H., Potter, E., and Wallace, M.) (1979) Suffer the Children: The Story of Thalidomide. London: André Deutsch.
Jackall, R. (1988) Moral Mazes: The World of Corporate Managers. Oxford: Oxford University Press.
Kimball, P. (1983) The File. San Diego: Harcourt Brace Jovanovich.
Kluger, R. (1996) Ashes to Ashes: America's Hundred-Year Cigarette War, the Public Health, and the Unabashed Triumph of Philip Morris. New York: Knopf.
Kohn, A. (1986) False Prophets. Oxford: Basil Blackwell.
Kuhn, T.S. (1970) The Structure of Scientific Revolutions, 2d ed. Chicago: University of Chicago Press.
LaFollette, M.C. (1992) Stealing into Print: Fraud, Plagiarism, and Misconduct in Scientific Publishing. Berkeley: University of California Press.
Lang, S. (1998) Challenges. New York: Springer.
Lasswell, H.D. (1950) National Security and Individual Freedom. New York: McGraw-Hill.
Leiss, W. (1994) Risk communication and public knowledge. In: Crowley, D. and Mitchell, D. (eds). Communication Theory Today. Cambridge: Polity, pp. 127-139.
Lewis, L.S. (1975) Scaling the Ivory Tower: Merit and its Limits in Academic Careers. Baltimore: Johns Hopkins University Press.
Lubek, I. and Apfelbaum, E. (1987) Neo-behaviorism and the Garcia effect: a social psychology of science approach to the history of a paradigm clash. In: Ash, M.G. and Woodward, W.R. (eds). Psychology in Twentieth-Century Thought and Society. Cambridge: Cambridge University Press, pp. 59-91.
Macdonald, S. and Hellgren, B. (1999) Supping with a short spoon: suppression inherent in research methodology. Accountability in Research 6(4): 227-243.
Mahoney, M.J. (1976) Scientist as Subject: The Psychological Imperative. Cambridge, Massachusetts: Ballinger.
Mahoney, M.J. (1979) Psychology of the scientist: an evaluative review. Social Studies of Science 9:349-375.
Martin, B. (1986) Nuclear suppression. Science and Public Policy 13:312-320.
Martin, B. (1991) Scientific Knowledge in Controversy: The Social Dynamics of the Fluoridation Debate. Albany: State University of New York Press.
Martin, B. (1992a) Science for non-violent struggle. Science and Public Policy 19:55-58.
Martin, B. (1992b) Scientific fraud and the power structure of science. Prometheus 10:83-98.
Martin, B. (ed) (1996a) Confronting the Experts. Albany: State University of New York Press.
Martin, B. (1996b) Critics of pesticides: whistleblowing or suppression of dissent? Philosophy and Social Action 22(3):33-55.
Martin, B. (1997) Suppression Stories. Wollongong: Fund for Intellectual Dissent.
Martin, B. (forthcoming) Suppression of dissent in science. Research in Social Problems and Public Policy.
Martin, B., Baker, C.M.A., Manwell, C., and Pugh, C. (eds) (1986) Intellectual Suppression: Australian Case Histories, Analysis and Responses. Sydney: Angus and Robertson.
Masson, J.M. (1984) Freud: The Assault on Truth. Freud's Suppression of the Seduction Theory. London: Faber and Faber.
Meadows, A.J. (1974) Communication in Science. London: Butterworths.
Medvedev, G. (1991) The Truth about Chernobyl. New York: BasicBooks.
Medvedev, Z.A. (1979) Nuclear Disaster in the Urals. New York: Norton.
Merton, R.K. (1949) Science and democratic social structure. In: Merton, Robert K., Social Theory and Social Structure. Glencoe, Illinois: Free Press, pp. 307-316.
Merton, R.K. (1973) The Sociology of Science: Theoretical and Empirical Investigations. Chicago: University of Chicago Press.
Mitgang, H. (1988) Dangerous Dossiers: Exposing the Secret War against America's Greatest Authors. New York: Donald I. Fine.
Mitroff, I.I. (1974a) The Subjective Side of Science: A Philosophical Inquiry into the Psychology of the Apollo Moon Scientists. Amsterdam: Elsevier.
Mitroff, I.I. (1974b) Norms and counter-norms in a select group of the Apollo moon scientists: a case study of the ambivalence of scientists. American Sociological Review 39:579-595.
Moore, T.J. (1995) Deadly Medicine: Why Tens of Thousands of Heart Patients Died in America's Worst Drug Disaster. New York: Simon and Schuster.
Morland, H. (1979) The H-bomb secret. Progressive 43(11):14-23.
Morland, H. (1981) The Secret That Exploded. New York: Random House.
Moss, R.W. (1996) The Cancer Industry. New York: Equinox Press.
Mulkay, M. (1976) Norms and ideology in science. Social Science Information 15:637-656.
Mulkay, M. (1979) Science and the Sociology of Knowledge. London: Allen and Unwin.
Munster, G. (1982) Secrets of State: A Detailed Assessment of the Book They Banned. Sydney: Walsh & Munster.
Nader, R. and Smith, W.J. (1996) No Contest: Corporate Lawyers and the Perversion of Justice in America. New York: Random House.
Nicholls, C. (1994) Whistling in the Dark. Canberra: Deakin.
Noble, D. (1977) America by Design: Science, Technology and the Rise of Corporate Capitalism. New York: Knopf.
Perrow, C. (1979) Complex Organizations: A Critical Essay. Glenview, Illinois: Scott, Foresman.
Pesticide & Toxic Chemical News (1981) Dr. Mel Reuber, pathologist, gets sharp censure, warning from his supervisor. (25 April):22-23.
Phillips, P. and Project Censored (1998) Censored 1998: The News That Didn't Make the News -- The Year's Top 25 Censored Stories. New York: Seven Stories.
Primack, J. and von Hippel, F. (1974) Advice and Dissent: Scientists in the Political Arena. New York: Basic Books.
Pring, G.W. and Canan, P. (1996) SLAPPs: Getting Sued for Speaking Out. Philadelphia: Temple University Press.
Proctor, R.N. (1995) Cancer Wars: How Politics Shapes What We Know and Don't Know about Cancer. New York: BasicBooks.
Reece, R. (1979) The Sun Betrayed: A Report on the Corporate Seizure of U.S. Solar Energy Development. Boston: South End Press.
Relyea, H.C. (1994) Silencing Science: National Security Controls and Scientific Communication. Norwood, New Jersey: Ablex.
Revusky, S. (1977) Interference with progress by the scientific establishment: examples from flavor aversion learning. In: Milgram, N.W., Krames, L. and Alloway, T.M. (eds). Food Aversion Learning. New York: Plenum, pp. 53-71.
Rose, H. and Rose, S. (eds) (1976a) The Political Economy of Science: Ideology of/in the Natural Sciences. London: Macmillan.
Rose, H., and Rose, S. (eds) (1976b) The Radicalisation of Science: Ideology of/in the Natural Sciences. London: Macmillan.
Russell, S. and Ferguson, R.A.D. (1980) Assessing the health costs of fuel systems. Science and Public Policy 7(5):365-376.
Samuels, A. (1999) The toxicity/safety of processed free glutamic acid (MSG): a study in suppression of information. Accountability in Research 6(4): 259-310.
Schneider, K. (1982) Hard times: government scientists fall victim to the administrations policy to silence debate. Amicus Journal Fall:22-31.
Schultz, B. (1993) The censorship and suppression of scientific research. Search 24(4):93-97.
Shamoo, A.E. (1997) Attempts at suppressing data. Professional Ethics Review 10(1).
Shannon, C.E. and Weaver, W. (1949) The Mathematical Theory of Communication. Urbana: University of Illinois Press.
Sheldrake, R. (1998) Experimenter effects in scientific research: how widely are they neglected? Journal of Scientific Exploration 12(1):73-78.
Stauber, J. and Rampton, S. (1995) Toxic Sludge Is Good for You: Lies, Damn Lies and the Public Relations Industry. Monroe, Maine: Common Courage Press.
Stern, P.M. (1969) The Oppenheimer Case: Security on Trial. New York: Harper & Row.
Storer, N.W. (1966) The Social System of Science. New York: Holt, Rinehart and Winston.
Thomas, J. (1981) Class, state, and political surveillance: liberal democracy and structural contradictions. Insurgent Sociologist 10 (4):47-58.
Thompson, C. (1999) The tangled methods of quantum entanglement experiments. Accountability in Research 6 (4): 311-332; http://users.aber.ac.uk/cat/Tangled/tangled.html
Tickell, O. (1998) Dirty secrets. New Scientist 159 (29 August):18-19.
Tiffen, R. (1989) News and Power. Sydney: Allen and Unwin.
Toohey, B. and Wilkinson, M. (1987) The Book of Leaks: Exposés in Defence of the Public's Right to Know. Sydney: Angus & Robertson.
Van den Bosch, R. (1978) The Pesticide Conspiracy. Garden City, New York: Doubleday.
Vidal, J. (1997) McLibel. London: Macmillan.
Waldbott, G.L. (1965) A Struggle with Titans. New York: Carlton Press.
Weiner, T. (1990) Blank Check: The Pentagon's Black Budget. New York: Warner.