7
Ethical, Legal, and Social Issues
Highlights
- The use of neurostimulation, including off-label use, is rapidly expanding, without a full understanding of safety and efficacy. (Farah, Pascual-Leone, and others)
- Non-invasive neuromodulation has the potential to cause not only physical harms but also non-physical harms. (Parens)
- The safety and efficacy of long-term stimulation is not well understood. (Farah)
- The involuntary or coercive use of neuromodulation presents many ethical concerns. (Chandler)
- The do-it-yourself movement raises questions about the responsibility of researchers to educate the public. (Maslen)
NOTE: The points in this list were made by the individual speakers identified above; they are not intended to reflect a consensus among workshop participants.
Ethics in the context of neuromodulation extends far beyond what Aristotle would have recognized as classical ethics issues in his day, said Hank Greely. With regard to neuromodulation, the topic spans ethical, legal, social, and even political implications: indeed, all things in society that affect the use and potential misuse of these devices now and in the future. For example, Alvaro Pascual-Leone mentioned the reality that off-label application of neurostimulation is rapidly expanding, without examination or a full understanding of safety and efficacy implications. Patients are making devices, buying devices, and getting clinicians to
prescribe devices; companies are developing new consumer-targeted devices with non-medical aims that ultimately get leveraged into the medical setting.
Erik Parens, senior research scholar at The Hastings Center, focused his comments on what he called non-physical harms, that is, how a technology might do harm not to our bodies, but to us as human beings. He cited four major concerns: inauthenticity, complicity, mechanization, and inequality. The first, inauthenticity, threatens to separate us from who we really are or how the world really is. Complicity relates to how these technologies could be used so people can live up to social norms that may be problematic, such as the idea that making money is the greatest good. Mechanization refers to the concern that these technologies could make us think of ourselves as machines that need fixing rather than persons who need and want engagement. Finally, these technologies may exacerbate inequality by providing advantages only to those who have the resources to access the technology. Social inequality is bad for the health of society as well as for the health of individuals, particularly those at the bottom, said Parens.
Parens said that the distinction between treatment and enhancement is abstract and fuzzy, and also unavoidable and potentially useful if we’re trying to articulate what, for example, should go into a basic package of medical care. Since the 1990s, enhancement has been defined in contrast with treatment, where treatment restores normal functioning and enhancement produces something better than normal functioning. Yet, because there is no bright biological line between normal and better than normal functioning, there is no bright line between treatment and enhancement. Even if there were such a bright biological line, it would not follow that there is a bright ethical line. It does not follow from the fact that an intervention is an enhancement that it is unethical; to reach that conclusion, one would need to be explicit about additional reasons regarding, for example, the likelihood of harms, either physical or nonphysical. Nor does it follow from the fact that an intervention changes the
brain directly (as with neuromodulation) rather than indirectly (as with traditional education) that it is unethical; to reach that conclusion, one would again need additional reasons—perhaps about the different values embodied in the direct versus indirect means of achieving the desired end.
Nobody is against true enhancement, said Parens. People are, however, opposed to things that purport to deliver a benefit but in fact cause harm. For example, soma, the drug in Aldous Huxley’s Brave New World, was supposed to give people the experience of happiness but did so in the absence of engagement in the kinds of activities that normally make human beings feel happy. Parens suggested that people do not object to soma because it is an enhancement, but because it is not.
Martha Farah, Walter H. Annenberg Professor of Natural Sciences at the University of Pennsylvania, cast ethical concerns into four overlapping categories: safety, efficacy, freedom, and fairness, focusing her comments on safety and efficacy. It seems clear that a single session of TMS or tDCS is safe if done properly, she said, although much less is known about their use in repeated sessions over months or years, which is how they will be used for treatment and enhancement. Through empirical experience, the field may eventually arrive at some knowledge of how risky or safe these devices are. Alternatively, a deep understanding of the mechanisms by which these devices work could point to theoretically possible downsides of these methods. Farah suggested that we do not have either the necessary experience or a firm grasp of mechanism.
Roi Cohen Kadosh noted that further study is needed on the long-term safety of neurostimulation and its impact on the developing brain. He added that there can be trade-offs with neurostimulation; that is, it may improve some cognitive processes while worsening others. Farah commented that this finding should come as no surprise given what we know about how neurons that fire together wire together and about the competitive nature of plasticity. However, there is much that we don’t know. She suggested that animal research may be one place to begin to understand the long-term physiological effects of neurostimulation.
Atul Pande raised the concern that, in the case of low field magnetic stimulation (LFMS), it may be possible to compact the technology enough so that it can be used easily at home. But, he asked, if it is prescribed for, say, 20 minutes, what might the effect be if a person uses it
for 8 hours? Indeed, the safety of unsupervised use was mentioned by several participants as a poorly addressed area of concern. Ana Maiques said the short-term safety of tDCS in adults is relatively well established, but less is known about the long-term safety or the use of tDCS in children or other special populations. Neuroelectrics’ solution to this concern is to permit sales only to individuals who have a prescription from their physician, and to control the device through the cloud, so that the device can only be activated for the number of minutes prescribed.
An additional concern raised by several participants is that devices marketed for enhancement of well-being are not currently regulated as medical devices in either Europe or the United States, and are subject only to general, non-specific safety requirements. Hannah Maslen, Postdoctoral Research Fellow in Ethics at the Oxford Center for Neuroethics, showed some examples of the claims manufacturers are making about their devices. While noting that the devices are not FDA approved, manufacturers often include statements such as “it has been tested to all of the required standards,” or “scientific papers are being published every day and the results are incredible.” In some cases, they include disclaimers suggesting that the data presented may be inaccurate and encouraging users to do their own research.
With regard to efficacy, Farah noted that we are also in the very early days of figuring out whether repeated long-term brain stimulation is helpful. Some studies indicate that it is, particularly in combination with training and maybe in combination with drugs or other modalities. Early studies by many labs suggested that a single treatment with tDCS or TMS enhanced cognitive processes such as working memory, but subsequent meta-analyses of multiple studies found systematically smaller and smaller effect sizes, in some cases zero (Brunoni and Vanderhasselt, 2014; Horvath et al., 2015). Several participants strongly urged caution in interpreting these meta-analyses because of the many types of heterogeneity across studies. This phenomenon may, in part, reflect publication bias, wherein only studies that show an effect are published. Another possible reason for the lack of demonstrated effect in meta-analyses is the heterogeneity of subjects, stimulation parameters, and assessments, all of which result in substantial noise. Farah’s group recently completed a meta-analysis of tDCS studies of working memory, correcting for publication bias and with a fairly homogeneous set of parameters. It showed small effects, some but not all of which were statistically reliable. She noted, however, that this analysis included a range of healthy normal individuals with no classification by ability level or genotype.
Farah suggested that a communal effort is needed to improve what we can learn from the research, ideally pre-registering studies (as done with pharmaceutical trials), archiving null results in a “file drawer” repository, and encouraging studies with higher power to overcome the factors that limit the conclusions that can be drawn from existing studies.
FREEDOM/COERCION/INVOLUNTARY USE
Involuntary or coercive uses of non-invasive neuromodulation, applied for the purpose of changing behavior or gaining compliance with socially accepted norms, present additional complex ethical challenges, although data supporting these uses are sparse, according to Jennifer Chandler, professor of law at the University of Ottawa. One recent paper showed that application of transcranial alternating current stimulation (tACS) to the right dorsolateral prefrontal cortex (DLPFC) reduced aggressive behavior in men (Dambacher et al., 2015), and another showed that tDCS of the right lateral prefrontal cortex (rLPFC) increased compliance with social norms in a computerized simulation (Ruff et al., 2013).
Chandler illustrated the ethical issues presented by two contrasting hypothetical cases: one involving parents who want their child to undergo non-invasive brain stimulation in order to improve the child’s academic or physical performance, and another involving criminal offenders who are offered a reduced sentence if they undergo neuromodulation. Both of these cases raise numerous ethical concerns related to issues of safety, efficacy, justice, fairness, self-identity, and authenticity. In the first case, the primary consideration would be the best interests of the child; in the second case, the issues are less clear: is the intent to punish or to treat the offender? Chandler said that in the criminal realm, forensic psychiatrists are bound by what is in the best interest of the offender; however, this is also open to interpretation.
In cases involving enhancement in children, one must consider the definition of benefit; that is, is it in the best interest of the child to satisfy the expectations and demands of parents, schools, peers, or society in general? For instance, children may be better off if improvements in their behavior cause their parents to have less stress or their teachers or peers to like them better, but are those reasons enough to subject the child to a procedure with potentially negative consequences? Similarly with criminals, the object of rehabilitation is often compliance with social norms, yet by whose definition are these social norms established? Chandler pointed to the example of the mathematician Alan Turing, who was subjected to antilibidinal drugs, or chemical castration, at a time when homosexuality was considered a criminal offense.
Another concern is that blaming the brain for a social or behavioral “problem” may have self-fulfilling prophecy effects, affecting motivation, self-efficacy, and locus of control by convincing the person that their brain is “broken,” which can result in unanticipated behavioral consequences, noted Chandler. For example, studies have shown that a disbelief in free will can increase aggression and reduce helpfulness (Baumeister et al., 2009). In the criminal context, blaming the brain can support a perpetrator’s belief that he or she is not responsible for his or her acts, perhaps undermining efforts at rehabilitation.
The schematics and directions for building a tDCS device can be easily found on the Internet, and the parts can be purchased for about $25, Greely explained. Thus, it is no surprise that the DIY tDCS movement is rapidly expanding, said Maslen. One of the richest sources of information for the community is the Reddit tDCS forum.1 While contributors to the forum include many people who base their comments on what they glean from scientific papers, a number of people also come to the forum with comments indicating a lack of understanding of tDCS and describing uses that appear to be unsafe or dangerous. Greely added that one of his graduate students performed an analysis of the Reddit tDCS forum (Jwa, 2015). Anna Wexler, a doctoral student in the Department of Science, Technology, and Society at the Massachusetts Institute of Technology, said she is also doing research on the DIY community using qualitative, in-depth interviews.
Given that people are experimenting with these devices and the near impossibility of preventing this experimentation, Maslen asked whether researchers have a responsibility to laypersons who appropriate their research for parallel purposes. Should appropriation of research be explicitly considered by ethics committees when researchers obtain ethical approval? Should research results be made freely available in order to better inform those engaging in DIY practices?
__________________
1See http://www.reddit.com/r/tDCS (accessed July 1, 2015).
Questions were raised about the obligation of scientists to better educate consumers. Should they, for example, provide a lay summary in their publications to avoid misinterpretation and misuse of the technology by individuals who may lack the scientific background to understand the technical details of the paper? Or should they work with the DIY community to provide expert commentary on questions that arise? Maslen said she could imagine some sort of public engagement initiative to set up such a community. However, other participants raised potential issues of liability. Greely took this one step further, asking to what extent scientists doing their research should think about the possible downstream negative effects, including nefarious or unsafe use by the DIY community. Interestingly, at least in the United States, institutional review boards are forbidden from considering social harms according to the Common Rule, he said.