Friday, November 20, 2015
Science Rules *Bill Nye Voice*
In class, we discussed the morality of synthetic biology and its dual-use potential. The key issue is the unknown: the potential to create a Jurassic Park-like world or a deadly virus. We found ourselves evaluating the morality by the result, a consequentialist approach that does not completely address the moral issue.
Creation for the sake of creation is risky, but it can be considered art. Because science is rooted in innovation and discovery, I do not think it is immoral. There need to be guidelines for containment and lifespan control, but in a regulated environment, I do not see issues with synthetic biology.
I agree with the idea that every synthesizer should be traced and whoever purchased the genome should be held accountable in case there is inappropriate use.
As far as modifying diseases goes, I struggle to reach a conclusion. I would eliminate deafness, blindness, and depression if I had the choice, but that eliminates some people's free will. Suppose nine people wish their blindness away while one appreciates the experience and wants to keep the condition, and a virus is discovered that will cure all blindness: who chooses the final result? Is the free will of one individual more important than the majority's desire to see?
Another tough aspect is that our definition of disease changes with time. For example, homosexuality was once classified as a disease. Often, the people creating the standards simply do not understand someone with characteristics different from their own.
I have a few qualms with deeming synthetic biology immoral under the guise of "playing God." As humans, we create and destroy living things constantly, and we are inconsistent in our theories about who or what can be killed. For example, we treat some animals like family members while eating animals of similar size and sentience. As for creation, most people eat genetically modified organisms daily. Some people avoid GMOs for health, environmental, or moral reasons, but for the most part, they are widely accepted. One quote that stuck out to me in the film Transcendence was Dr. Will Caster's reply to an accusation of playing God: "Isn't that what man has always done?" Humans create and destroy; it is difficult to determine what makes synthetic biology different. There will always be risks in creation, but the unknown should not hinder discovery.
It really is hard to determine which side to take. There is a thin line between the two, but there is definitely a line there. At one time I felt we would be playing God by trying to achieve AI, but as far as creating a cure, I can't really say that. If it cures something and the majority feels it is only right, then why not? Yes, we are mostly playing God by creating and destroying, but we were all made in his image. It really is hard to determine. In my opinion, it all depends on majority rule when dealing with certain issues.
This is a very difficult topic to discuss when it comes to morality. I believe that playing God is morally wrong, but don't we already do that when we genetically alter plants? I think that if we created an organism that completely took over the native species of an environment and caused mass extinction, we would be morally responsible for it. When creating new organisms, we have no idea what the outcome might be. I think that creating things, like art, is okay for the most part. But when it comes to creating things that have a high chance of turning out badly, we probably shouldn't do it. If we think that an experiment might harm us or the environment, is it immoral to go through with it? I feel that in almost every case the answer should be no, but there are those rare cases where the product might cure cancer. Then I feel the experiment might be worth the risk. When considering the morality of these decisions, we may need to focus on evaluating the risks associated with each choice.
I mostly agree with Stephanie, and I like your example that plays devil's advocate. It's interesting how we as humans pick and choose what is and is not morally acceptable to manipulate. I think one of the biggest problems with this discussion is whether modifying diseases would ultimately improve someone's life or quality of life, and if so, who says one person has more authority than another to make that decision? Setting that aside, I love the idea of being able to modify diseases for what we think might be the "greater" good, and I would personally like to take a stab at Alzheimer's Disease, but I do think this ability might ultimately be a detriment to society.
I personally believe that humans always play God. We always try to change the world and invent new things, and we destroy things just as often as we create. With that being said, I do not think we should simply stop. Playing God seems to be instilled in us; we couldn't stop if we wanted to. Although creation can sometimes bring about bad, I believe we need it to bring about the good and balance everything out. Without creation, we would not be as advanced as we are today. As you stated previously, there are risks involved, but those risks shouldn't prevent us from discovering new things. I strongly believe that without creation, we cannot have progress.
I disagree with the phrase "playing God" because it is human nature to be curious and investigative and to want to learn, create, and grow. I don't see anything wrong with creating something new, even if it's just for the sake of creating. Isn't that what we do with, for example, Legos? I understand the scale is different between creating something with Legos and creating something with genetic material, but the principle is basically the same. I think what matters most are intentions and methods. If you are playing with potentially dangerous material (for example, creating dinosaurs), you should have safety measures in place for every scenario you can predict. Obviously, this isn't as easy with cutting-edge technology like nanotechnology, because we have no idea what the risks are, but you should have some sort of measure in place for imaginable scenarios. As for intentions, I do think those matter, a lot. Creating something solely for the purpose of destroying lives, human or otherwise, or the environment is wrong on a fundamental level. If you create something that can be used that way, but that was not your intention, I think that absolves you of some guilt. Not all, because however the technology you create is used is on you, but you wouldn't be the only, or even the most, guilty one.