Playing by the rules – Scientific misconduct in a legal perspective
Michael 2007;4:35–42
A simple lesson to learn from the recent Norwegian research scandal is that there are rules that need to be observed and appreciated. This requires knowledge, understanding and awareness both at the individual level and institutional level.
Given the increasingly complex framework for research, this may sound like a tall order, but it is nevertheless a reasonable one. Contrary to popular belief, rules are not meant to be an inappropriate hindrance to good research. They are meant to foster good research. Ethical, professional and legally acceptable research is crucial for public trust and the legitimacy of science.
Fortunately the awareness of and attitude towards this normative framework is changing. The recent case has speeded things up in Norway, and it has certainly made it easier to explain why we do have and must have rules. For in order to play by the rules, one must know the rules.
This paper concentrates on the rules and regulations governing medical and health-related research in general, in the wake of what is hereinafter called the Norwegian research scandal. Three questions can be raised:
Are there rules?
Is there a problem with regard to the rules and regulations?
If so, what should be done to address the problem?
Are there rules?
In March 2006, I was asked to talk about whether or not fraud in science is illegal. I was a bit surprised by that request. Is anyone in doubt, I thought.
My answer was of course a simple but clear yes. There are rules. Medical and health-related research is subject to a multitude of rules, just like any other activity (1, 2) (tables 1, 2, 3).
There are a variety of behavioural norms governing the conduct of scientists, from social and ethical norms to more specific and binding professional and legal norms. These norms may be unwritten (e.g. custom-based) or text-based. They concern anything from prior ethical review, choice of method, risk assessment, consent and confidentiality to publication and authorship. And fraud in science is immoral and illegal, as in any other sector. Moreover, we all have a duty of care: ethically, professionally and legally. Breaches of that duty may, for example, give rise to liability for negligence.
Additionally, there are established agencies and procedures to oversee and ensure that the rules are observed. These may be internal or external, and may operate before or after the research is conducted. An internal review board at a research institution is one example. Multidisciplinary ethics committees and governmental agencies, such as data inspectorates and health authorities, are other examples. Peer review by scientific journals may also be added. Courts and investigating agencies have a more reactive role in this control and quality assurance system.
Tables 1, 2 and 3 (which are incomplete) also illustrate that the regulation of medical research has been increasing at the international level during the past decade, as it has on the national level in several countries. Furthermore, we see an ongoing shift from professional guidelines to statutory rules (2).
A broad comparative analysis of common basic principles in this field revealed an anticipated intimate relationship between ethics, professional guidelines, and the law (2). Together these norms create a normative framework which any researcher is expected to know and adhere to.
The intention of this framework is to protect human subjects and to state which behaviour is acceptable and which is not; hence, to foster ethical and professional research. At the same time, the intention is to prevent unethical, unprofessional and bad science. Thus the framework is meant to benefit the interests of human subjects, society, science, and scientists. The Declaration of Helsinki and the Oviedo Convention both state that the «interests and welfare of the human being shall prevail over the sole interest of society or science» (3, 4 see Article 2). The rules are, however, obviously not meant to hinder good research, although some researchers seemingly suspect them of precisely that.
Generally speaking, regulations are, in this as in any other field, intended «to codify accepted modes of behaviour; good law is then facilitative, not prohibitive» (5). It is important to stress that although the framework is meant to be guiding, many of the current rules, certainly the legal ones, are binding. Thus, despite the fundamental character of academic freedom, compliance with the existing framework is by no means voluntary for researchers (4 see Article 15). Bluntly speaking: researchers can appreciate and adapt, or close their eyes and hope for the best.
Another simple point to be made is that there is a gradual scale of wrongdoing. Deviations from existing rules occur in many shades – from the trivial to the conspicuous (6). A similar scale may describe the gradual degree of guilt – from honest errors via indifference and carelessness to intentional fraud. These are all wrongdoings, i.e. unwanted behaviour. Even unintended wrongdoings may be blameworthy and may, for example, give rise to liability. However good the intentions, indifference to or ignorance of the law is seldom a valid excuse in a court of law; nor should it be within the scientific community.
Is there a problem with regard to the rules and regulations? – a case study
The Investigating Commission’s Report
The Norwegian research scandal may be illustrative of existing challenges or problems with regard to the rules and regulation of medical research.
The Commission concluded in its investigative report that «the bulk of Jon Sudbø’s scientific publications, are invalid due to the fabrication and manipulation of the underlying data material» (7 p. 5). Furthermore the Commission «…found that there are no reasons to believe that other persons than Jon Sudbø, either intentionally or with gross negligence, have contributed to the fabrication of data or committed similar gross and serious breaches of good scientific practice» (7 p. 117). However, the Commission «discovered a series of minor breaches, which in aggregate have contributed to a system in which the breaches of good scientific practice have been allowed to increase without being discovered earlier» (7 p. 106).
The Commission observed that «…co-authors mainly appeared as subsuppliers or as senior guarantors…» (7 p. 99). Although that may be legitimate, «…there are, as the Commission sees it, certain descriptions in the articles which more people should have reacted to. This may be co-authors, supervisors, superiors, critics, colleagues and others» (7 p. 99). The Commission went as far as stating that «[r]esearchers associated with the department indeed seem to have had a relatively relaxed relationship with the formalities. This applies in relation to the retrieval, delivery and treatment of human biological material and sensitive patient information, recommendations from the Regional Committee for Medical Research Ethics, and licenses for data processing and dispensation from the duty of secrecy» (7 p. 98). In its concluding remarks the Commission states: «A general characteristic seems to be that many of the co-authors did not have a very conscious relationship to the responsibility inherent in being listed as a co-author of a scientific publication. In other words, they have taken this role and responsibility too lightly» (7 p. 118). The researcher’s supervisor was, however, the only individual named and blamed for negligence in the report. The Commission also criticized the primary institution, mainly for:
«Insufficient advance control and organization of Sudbø’s PhD project, including specification of distribution of responsibility.
Insufficient training and consciousness-raising of Sudbø and other employees about the rules for handling patient material, advance assessment of research projects and authorship.
Insufficient management and routines for discovering and handling deviations from internal instructions, etc.» (7 p. 114–5).
Three additional research institutions were also criticized for breaches of confidentiality when handing out sensitive patient material and data without patient consent and/or the necessary permissions.
Finally, the Commission observed that scientific journals could probably have done more to include the co-authors and make them more conscious of their responsibilities.
The failure in all segments from bottom to top added up to what the Commission calls a systematic fault – a malfunction of the research community at large. Thus, interestingly, from a legal perspective, the Commission asserted that it was not «the lack of rules which is the problem, but rather the individual researcher’s and institution’s knowledge and practicing of the rules which actually exists (p. 106)… [and] a lack of measures to prevent breaches of good scientific practice through the implementation of simple and effective routines» (7 p. 107).
Complex, inaccessible, or too rigid rules?
Lack of knowledge of, and negligence or indifference in practicing, existing rules is worrying. One might ask whether this lack of knowledge and awareness is due to rules that are complex, inaccessible, or too rigid.
As shown, the regulatory framework is indeed complex and somewhat inaccessible (tables 1, 2, 3). In Norway, a government-appointed committee (the Nylenna committee) undertook an investigation of the Norwegian framework for medical and health-related research (8). The Nylenna committee recommended a simplification and improvement of the Norwegian framework in order to make it more comprehensible and accessible.
However, it must be noted that complexity is hardly ever an acceptable excuse for not knowing at least the basics. The Investigative Commission stated, for example, that the «prohibition against improper manipulation and fabrication of data is embedded in rules that all researchers must be assumed to be well acquainted with» (7 p. 106). The same can be said about other basic rules governing research. Furthermore, saying that the rules are too rigid – or, even worse, not likeable – is also a rather poor excuse for negligence.
A troubling awareness or attitude towards the framework?
Since lack of knowledge appears to be only part of the problem, the Commission’s report can be read as suggesting that there is a troubling awareness of, or even attitude towards, existing rules within the scientific community. The Commission noted that testimonies indicated «… a disturbing lack of awareness of the prevailing rules for good research practice. This applies in particular to rules on secrecy, protection of personal data, authorship and advance assessments of research projects …» (7 p. 114).
Although adherence to the rules is first and foremost an individual responsibility, the Commission also stresses the institutional responsibility when it states that «there has been a lack of measures to prevent breaches of good scientific practice through the implementation of simple and effective routines» (7 p. 107). In this regard, the Commission goes as far as stating that «… the deviations to a certain degree must have been known to and therefore apparently accepted by management» (7 p. 110).
These observations by the Commission indicate a problem of awareness and attitude towards the framework at all levels. Moreover, they pose the question: do we all actually understand and value the governing and facilitating function of the existing normative framework?
What should be done?
The Commission’s investigative report echoes the findings of a broader survey of US scientists (n = 3 247), which revealed that 33 % of the respondents had engaged in conduct likely to be sanctionable (9). This finding led to the conclusion that «…mundane ‘regular’ misbehaviours represent greater threats to the scientific enterprise than those caused by high-profile misconduct cases such as fraud.» Thus the «bad apple» theory/excuse must be abandoned and replaced by a culture of prevention and increased awareness at all levels.
Not unexpectedly, one of the Commission’s recommendations is that «Research institutions must to a larger extent make all researchers and supervisors aware of the prevailing rules and the liability attached to breaches of the rules.» (7 p. 119). From a legal perspective this implies education, implementation and a certain degree of follow up (control). These are preventive measures and should of course be aimed at the management and all researchers, not only the «bad apples» (the others).
Scientists cannot be expected to be professional lawyers, able to manoeuvre in a complex framework and bureaucracy by themselves. That takes time and demands special qualifications. This is why it is an institutional responsibility to make the rules readily available to researchers and to facilitate effective and professional research. Simple checklists, adequate training and accessible assistance when more complicated issues need to be addressed appear necessary. An adequate and proper quality assurance system is obviously mandatory in professional institutions responsible for research on human subjects and sensitive material.
Several research institutions in Norway have already adopted such measures, and the Commission notes optimistically in its report that «The medical research community is in a transition phase as regards the organization and formalities relating to medical research» (7 p.115). An additional point is accountability. Laboratory personnel, project leaders, authors, co-authors, supervisors, and management, from bottom to top, must be aware of their responsibility.
Acknowledgement
A special thanks to Magne Nylenna and Anders Ekbom for valuable discussions when preparing this manuscript.
References
1. Simonsen S, Nylenna M. Helseforskningsrett [Biomedical Research Law]. Oslo: Gyldendal, 2005.
2. Simonsen S, Nylenna M. Basic ethical, professional and legal principles of biomedical research. Scand J Work Environ Health 2006; Suppl 2:5–14.
3. World Medical Association. Declaration of Helsinki. Ethical Principles for Medical Research Involving Human Subjects, 1964.
4. Council of Europe. Convention for the Protection of Human Rights and Dignity of the Human Being with regard to the Application of Biology and Medicine: Convention on Human Rights and Biomedicine. Oviedo, 4.IV.1997.
5. Morris J. Law, Politics and the Use of Force. In: Baylis J, Cohen E, Gray C, Wirtz J, eds. Strategy in the Contemporary World. Oxford: Oxford University Press, 2002: 66–91.
6. Nylenna M, Simonsen S. Scientific misconduct: a new strategy for prevention. Lancet 2006;367:1882–4.
7. Report from the Investigation Commission appointed by Rikshospitalet–Radiumhospitalet MC and the University of Oslo, January 18, 2006. http://www.rikshospitalet.no/content/res_bibl/6876.pdf (June 30, 2006; accessed Nov 14, 2006). [Translated version. Only the Norwegian text is authentic.]
8. Official Norwegian Report NOU 2005:1. God forskning – bedre helse [Good Research – Better Health]. Oslo: Statens forvaltningstjeneste, 2005.
9. Martinson BC, Anderson MS, de Vries R. Scientists behaving badly. Nature 2005;435:737–8.