Boston University, Boston, MA, 10-15 August 1998. Retrieved 22 December 2003 from the Paideia Project On-Line website.
The future may well involve the reality of science fiction's cyborg, persons who have developed some intimate and occasionally necessary relationship with a machine.
It is likely that implantable computer chips acting as sensors or actuators will soon not only assist failing memory, but even bestow fluency in a new language or enable "recognition" of previously unmet individuals. The progress already made in therapeutic devices, in prosthetics, and in computer science indicates that it may well be feasible to develop direct interfaces between the brain and computers.
The cochlear implant, which directly stimulates the auditory nerve, enables over 10,000 totally deaf people to hear sound; the retinal implantable chip for prosthetic vision may restore vision to the blind.
Research on prosthetic vision has proceeded along two paths. The latest stage in the evolution towards the implantable brain chip involves combining these advances in prosthetic technology with developments in computer science.
The linkage of smaller, lighter, and more powerful computer systems with radio technologies will enable users to access information and communicate anywhere or anytime. Through miniaturization of components, systems have been generated that are wearable and nearly invisible, so that individuals, supported by a personal information structure, can move about and interact freely, as well as, through networking, share experiences with others.
The wearable computer project envisions
users accessing the Remembrance Agent of a large communally
based data source.
His colleague, Professor Gershenfeld, makes similar assertions.
Neither visionary professes any qualms about this project, which they expect to alter human nature itself.
Once networked, the result will be a "collective consciousness", "the hive mind."
The technology for implantable devices is becoming available, and at prices that make such systems very cost effective.
Three stages of introduction of such devices can be delineated.
As intelligence or sensory "amplifiers", the implantable chip will generate at least four benefits: it will increase the dynamic range of the senses, enhance memory, enable invisible communication with others, and provide constant access to information wherever and whenever it is needed.
For many, these enhancements will produce major improvements in quality of life, survivability, or job performance.
The first prototype devices for these improvements in human functioning should be available within five years, military prototypes within ten years, and prototypes for information workers within fifteen years; general adoption will take roughly twenty to thirty years.
The brain chip will probably function as a prosthetic cortical implant. The user's visual cortex will receive stimulation from a computer, based either on what a camera sees or on an artificial "window" interface.
Michael Dertouzos succinctly formulates the essentialist and creationist argument against the implantable chip.
Fears of tampering with human nature are widespread; the theme that nature is good and technology evil, that the power to recreate oneself is overreaching hubris, and that reengineering humanity can only result in disaster, is a familiar response to each new control that man exercises.
The mystique of the natural is fueled by the romantic world view of a benign period when humans lived in harmony with nature.
However attractive, this vision is probably faulty, inasmuch as man has always used technology to survive and to enhance life; the use of technology is natural to man. Thus this negative response to the prospect of implantable chips is certainly inadequate, although it points to the need to evaluate the technology in terms of the good or evil possibilities for its use by men or governments.
This critique relies on a religious sense that improving on the design of creation insults the Creator. In particular, it proposes that attempts to alter the functioning of the brain for purposes of creating a superior human being can be decried as usurping God's power.
To be persuasive, this argument must depend on a view of creation that is restrictive even for religionists, one that sees no role for human creativity.
Using the standard of species-typical functioning, a distinction is drawn between therapeutic and enhancement procedures: implantable chips that amplify the senses, or that enhance memory or networking capacities, would thus be suspect.
For others, however, there is no bright line between therapy and enhancement - how deficient does my memory have to be before it would be ethical to wire my brain to a computer? - and the argument is too weak to preclude the use of this technology, any more than it can proscribe cosmetic surgery or the use of mood-improving drugs when the benefits seem to outweigh the medical risks.
However, even if we discount the force of these three arguments, there are a myriad of other technical, ethical and social concerns to consider before proceeding with implantable chips.
The areas of concern for technology assessment are extensive, including risks, appropriateness, societal impact, costs, and equity, and they need evaluation by a multidisciplinary team. Study of this device would seem to require participants from at least the fields of computer science, biophysics, medicine, law, philosophy, public policy, and international economics.
Unlike the scientific community at the advent of genetic technologies, the computer industry has not, as yet, engaged in public dialogue about these promising but risky technologies. This avoidance of discussion, and simple reliance upon principles of free scientific inquiry and the market economy, is itself a moral stance requiring justification.
As is the case in evaluation of any
future technology, it is unlikely that we can reliably predict all
effects. Nevertheless, the potential for harm must be considered.
Evaluation of the costs and benefits of these implants requires a consideration of the surgical and long-term risks. One question - whether non-toxic materials can be developed that permit long-term usage - should be answered in studies of therapeutic applications, and thus need not be a separate concern for enhancement uses.
However, it is arguable that there should be a higher standard of safety when technologies are used for enhancement rather than therapy, and this issue needs public debate. Whether the informed consent of recipients should be sufficient reason for permitting implementation is questionable in view of the potential societal impact.
Other issues such as the kinds of warranties users should receive, and the liability responsibilities if quality control of hard/soft/firmware is not up to standard, could be addressed by manufacturing regulation.
Provisions should be made to facilitate upgrades since users presumably would not want multiple operations, or to be possessors of obsolete systems. Manufacturers must understand and devise programs for teaching users how to implement the new systems.
There will be a need to generate data on how useful implants are to individual recipients, and on whether all users benefit equally. Additional practical problems with ethical ramifications include whether there will be a competitive market in such systems and whether there will be any industry-wide standards for the design of the technology.
It is possible that the technology could be used to enable those who are naturally less cognitively endowed to achieve on a more equitable basis. Certainly, uses of the technology to remediate retardation or to replace lost memory faculties in cases of progressive neurological disease, could become a covered item in health care plans.
Enabling humans to maintain species-typical functioning would probably be viewed as a desirable, even required, intervention, although this may become a constantly changing standard.
The costs of implementing this technology need to be weighed against the costs of impairment, although it may be that decisions should be made on the basis of rights rather than usefulness.
If people are actually connected via their brains, the boundaries between self and community will be considerably diminished. The pressures to act as a part of the whole rather than as a single isolated individual would be increased; the amount and diversity of information might be overwhelming, and the sense of self as a unique and isolated individual would be changed.
Supersensory sight will see radar, infrared and ultraviolet images, augmented hearing will detect softer and higher and lower pitched sounds, enhanced smell will intensify our ability to discern scents, and an amplified sense of touch will enable discernment of environmental stimuli like changes in barometric pressure.
These capacities would change the "normal" for humans, and would be of exceptional application in situations of danger, especially in battle. As the numbers of enhanced humans increase, today's normal range might be seen as subnormal, leading to the medicalization of another area of life.
Thus, substantial questions revolve around whether there should be any limits placed upon modifications of essential aspects of the human species.
Although defining human nature is notoriously difficult, man's rational powers have traditionally been viewed as his claim to superiority and the center of personal identity. Changing human thoughts and feelings might render the continued existence of the person problematical.
If one accepts, as most cognitive scientists do, the materialist contention that the mind simply is the activity of the brain, these worries gain force. On the other hand, not all philosophers espouse the materialist contention, and the use of these technologies will certainly affect discussions about the nature of personal identity and the traditional mind-body problem.
Modifying the brain and its powers could change our psychic states, altering both the self-concept of the user, and our understanding of what it means to be human. The boundary between me "the physical self" and me "the perceptory/intellectual self" could change as the ability to perceive and interact expands far beyond what can be done with video conferencing.
The boundaries of the real and virtual worlds may blur, and a consciousness wired to the collective and to the accumulated knowledge of mankind would surely impact the individual's sense of self.
Whether this would lead to giving greater weight to collective responsibilities and whether that would be beneficial are unknown.
Standards for entrance into schools, gifted programs and spelling bees - all would be affected. The inequalities produced might create a demand for universal coverage of these devices in health care plans, further increasing costs to society.
However, in a culture such as ours, with different levels of care available on the basis of ability to pay, it is plausible to suppose that implanted brain chips will be available only to those who can afford a substantial investment, and that this will further widen the gap between the haves and the have-nots.
A major anxiety should be the social impact of implementing a technology that widens the divisions not only between individuals and genders, but also between rich and poor nations.
As enhancements become more widespread, enhancement becomes the norm, and there is increasing social pressure to avail oneself of the "benefit."
Thus, even those who initially shrink
from the surgery may find it becomes a necessity, and the consent
part of "informed consent" would become subject to manipulation.
Data on the user's experiences would be collected by biological probes receiving electrical impulses, and would enable a user to recreate experiences, or even to transplant memory chips from one brain to another.
In this eventuality, psychological continuity of personal identity would be disrupted, with indisputable ramifications. Would the resulting person have the identities of other persons?
George Annas has offered a prescient projection of the experimental protocols such technologies might invite. Using such technology, governments could control and monitor citizens.
In a free society this possibility may seem remote, although it is not implausible to project usage for children as an early step. Moreover, in the military environment the advantages of augmenting capacities to create soldiers with faster reflexes, or greater accuracy, would exert strong pressures for requiring enhancement.
When implanted computing and communication devices with interfaces to weapons, information, and communication systems become possible, the military of the democratic societies might require usage to maintain a competitive advantage.
Mandated implants for criminals are a foreseeable possibility even in democratic societies. Policy decisions will arise about this usage, and also about permitting usage, if and when it becomes possible, to affect specific behaviors.
A paramount worry involves who will control the technology and what will be programmed; this issue overlaps with uneasiness about privacy issues, and the need for control and security of communication links. Not all the countries of the world prioritize autonomy, and the potential for sinister invasions of liberty and privacy is alarming.
This is, of course, the question that open dialogue needs to address, and it raises the disputed topic of whether technological development can be resisted, or whether the empirical slippery slope will necessarily result in usage, in which case regulation might still be feasible.
Issues raised by the prospect of implantable brain chips are hard ones, because the possibilities for both good and evil are so great. The issues are too significant to leave to happenstance, computer scientists, or the commercial market.
It is vital that world societies assess this technology and reach some conclusions about what course they wish to take.