Managing Scientific Conduct

Discussions of misconduct in science have become prevalent in the literature. Within the past few years, many more instances of misconduct have come under scrutiny than ever before. A search on the Internet using the keywords “science fraud” yielded over ten thousand citations. The general public has become increasingly aware of these activities. The question is: do we need to regulate and manage scientific conduct? Do scientists need a professional code of ethics?

Scientific research, like other human activities, is built on a foundation of trust. Scientists trust that the results reported by others are valid. Society trusts that the results of research reflect an honest attempt by scientists to describe the world accurately and without bias. But this trust will continue only if the scientific community devotes itself to transmitting and enforcing ethical scientific conduct.

The self-regulating process has stimulated great scientific discoveries over the past centuries. In the past, young scientists learned the ethics of research largely through informal means – by working with senior scientists and watching how they dealt with ethical questions. That tradition is still vitally important. But science has become very complex; it is closely intertwined with other social activities, especially commerce. A more formal means of enforcing research ethics is needed – a professional code of ethics. Such a code would provide a common standard for judging whether a scientific activity is right or wrong. It would provide a means to insulate scientific activity from other social pressures. It would provide a firm foundation upon which science is built.

We need a formal code of professional ethics. The American Association for the Advancement of Science has offered the following definition: “Professional Ethics refers to those principles that are intended to define the rights and responsibilities of scientists in their relationship with each other and with other parties including employers, research subjects, clients, students, etc.” (Chalk, Frankel, and Chafer). General moral norms specify that scientists must play by the same fundamental rules that apply to society at large. Professionals typically hold a certain power advantage over others, one that must be moderated and restrained. People with such authority should be reminded frequently of their professional code of conduct, and if they violate that code, they should be punished, sometimes severely.

As depicted in the book “The Frankenstein Syndrome,” “there is unquestionably a tendency among scientists to ignore or minimize dangers growing out of scientific activity” (Rollin 70); “[they] tend to be cavalier about the dangers emerging from science and technology” (Rollin 71). “The Frankenstein Syndrome” expresses a deep and pervasive fear of the consequences of unrestrained scientific and technological development. While the two major sources of this fear are nuclear and genetic technologies, other technologies are significant contributors as well. One recent example of what technology can do was a miracle that happened on Christmas Day, 1993: a 59-year-old British woman gave birth to twins and became the oldest woman on record to have a child (Smith A-1). How did this happen? Through modern technology – a test-tube pregnancy, in which she was artificially impregnated with eggs from a younger woman. The case could raise complex social and economic issues, since the woman will be 69 when her children are only 10 years old. This kind of practice has raised many social and medical questions, including whether such procedures are fair to the children, whether they are the best use of limited medical resources, and whether they pose too high a risk to older women. None of these questions has a simple answer. We have to formulate general guidelines to answer these questions and regulate these activities.

Moreover, there are two types of scientific misconduct – negligence and deliberate dishonesty. Negligence occurs when scientists provide erroneous information with no intent to deceive. Deliberate dishonesty involves an intentional attempt by a scientist to mislead. No matter what kind of misconduct it is, the net result is the same: wrong information is provided to the scientific community and the public. The two are equally deleterious, according to Schmaus: “…negligent, careless, sloppy, and reckless work [are] just as much a violation of moral duty as fraud. The potentially disastrous effects for science and society that may accrue from false information are the same regardless of the intention of the author. Erroneous data reported from the testing of new drugs, for instance, can be dangerous whether they are a consequence of unintentional negligence or deliberate fraud” (12). In both cases, time is wasted by those who attempt to reproduce experiments that offer no chance of success, by those who must carefully scrutinize questionable results, and by those who must participate in outside investigations and hearings into such matters. In both cases, the time lost could have been better allocated to potentially fruitful research. Severe punishment for offenders would discourage such misconduct.

Perhaps even more troubling is the damage inflicted upon the reputation of science. Just as society must be able to trust its police officers, firefighters, and doctors, it must be able to trust its scientists as well. It looks very bad for police officers everywhere when a few of their own physically attack a motorist late at night on a California interstate highway. It looks very bad for doctors everywhere when a doctor uses his own sperm to impregnate as many as seventy-five women at a fertility clinic (The Economist A27). Obviously, it looks very bad for scientists everywhere when a few of their colleagues tell the world that something can be accomplished when it cannot. Finally, it looks bad for all scientists when a number of their colleagues adamantly defend forged data toward which they should have exhibited skepticism. If scientists want to enjoy the privilege of self-regulation, they must have a professional code of ethics to enforce it.

In conclusion, scientists need a well-defined and clearly written professional code of conduct. Scientists should know the rules and the nature of their punishment if they fail to abide by their code of conduct. They should be frequently reminded of their professional obligations, formally or informally. A professional code of conduct will provide a means of protection to both the scientific community and the general public.

Works Cited

“Cutting Out the Middle Man: Fertility Fraud.” The Economist. 4 Jan. 1992: A27.

Rollin, Bernard. The Frankenstein Syndrome: Ethical and Social Issues in the Genetic Engineering of Animals. Cambridge University Press, 1995.

Chalk, R., M. S. Frankel, and S. B. Chafer. “AAAS Professional Ethics Project.” American Association for the Advancement of Science, 1980.

Schmaus, W. “Fraud and the Norms of Science.” Science, Technology & Human Values. Fall 1983: 12.

Smith, Hebert J. “Woman, 59, Gives Birth.” The Roanoke Times. 28 Dec. 1993: A-1.
