We are truly living in a remarkable era of biotechnological progress. Emerging nanotechnologies and immunotherapies offer the possibility of the targeted destruction of cancerous cells. 3-D printing of living cells is on the horizon, engendering the hope of a future with fully printed organs. And simultaneous advances in neuroscience and bioengineering have given rise to promising research and development of "electroceuticals" (neuromodulatory devices that may alleviate the symptoms of, or even cure, inflammatory diseases) and neural bypass machines that enable prosthetics to be controlled by the mind in an almost natural way.
Tremendous as these possibilities are, they are just the tip of the iceberg. Other therapies offer the prospect of preventing disease before it occurs and curing further maladies, but with the caveat of altering who we fundamentally are.
Let's imagine for a second that these technologies were available tomorrow. If I were to offer you a guarantee, through genetic editing, that your baby would have a reduced risk of a disease that runs in your family, or even a disease that you did not know your family carries, would you take it? What if I were to offer you a means for your mother with Alzheimer's to regain some of her mental capacities via a brain chip implant? And what if I were able to cure your cousin's hematologic disorder through the administration of synthetic blood?
These questions were previously relegated to the realm of science fiction and philosophy, but everyday Americans have now chimed in through a recent survey conducted by Pew Research Center, informing clinicians of the conversations they are likely to have as these technologies become available. As a side note, the NIH announced this past August its intention to lift its ban on funding research involving chimeric (part human, part animal) embryos, which makes such inquiry particularly relevant.
The Pew survey revealed that 50 percent of U.S. adults said they would not want genetic editing, and 68 percent were worried about it, even if it gave their future child a much reduced disease risk. Furthermore, 66 percent of U.S. adults said that they would not want anything implanted in their brain to improve their cognitive abilities, and 69 percent were worried about it. And finally, 63 percent of Americans said that they would not want "synthetic blood for much improved physical abilities." Interestingly, 81 percent of U.S. adults do believe that artificially made organs will be available for routine transplant, 66 percent believe that there will be "cures for most forms of cancer," and 54 percent believe that computer chips will be routinely embedded in our bodies. So why the apparent discord between the belief that we are marching toward progress and the reluctance to make the changes that would get us there?
To investigate this further, Pew conducted follow-up surveys, which suggest that Americans are wary of such enhancements on religious grounds, fearing the "unnatural"; worried that these technologies would become available before they were thoroughly validated; and frightened by the prospect of widening inequality.
In the aforementioned case of reducing serious illnesses in children through genetic editing, 54 percent of people thought it would be an appropriate use of technology if it resulted in a person "always equally as healthy as the average person today," 52 percent thought it would be appropriate if it resulted in people "much healthier than the average person today," and 42 percent thought it would be appropriate if it resulted in "people far healthier than any human known to date."
Similar distributions occurred, for the same parameters, in the cases of brain implants improving cognitive abilities and synthetic blood substitutes improving physical abilities. Meanwhile, 73 percent of U.S. adults believe gene editing will be available before it is fully tested or its implications understood; 74 percent say this about brain implants, and 73 percent share this view about synthetic blood. Individuals who view themselves as highly religious are disproportionately more likely to view these enhancements as encroaching on nature. Most interesting for health care, people who identify themselves as having medium (52 percent) or low (72 percent) religious commitment believe that genetic editing to give babies a much reduced disease risk is permissible.
So, taken altogether, what do the attitudes of the American people mean for biotechnology researchers and the clinicians of the not-so-distant future? First, it is abundantly clear that any technology that has the potential to "alter" who we are in the fundamental sense and results in a permanent change to the individual, even in a therapeutic sense, will have to be outcomes-researched into oblivion. This will likely mean multiple randomized controlled trials and longitudinal studies that provide incontrovertible proof that these biotechnologies will not have severe side effects or consequences that could alter our perceptions of ourselves as individuals, others' perceptions of us as individuals, or our perception of ourselves as a human race.
Second, a stringent regulatory system will have to be implemented, establishing strict medical criteria for the administration and use of these technologies as therapy. In addition, money must be made available so that the sick of every socioeconomic class have access to such biotechnologies if they meet those criteria, in order to rightly relieve fears of inequality.
And most important, there will have to be a cultural shift in the mindset of the public, informed by a thorough national, or even global, discourse on the ethics of such procedures, before we administer therapies that push the boundaries of the unknown.
Samir Shah is a medical student who can be reached on Twitter @SamirxShah.
Image credit: Shutterstock.com