Biocyberethics: should we stop a company from unplugging an intelligent computer?
by Martine Rothblatt and Amara D. Angelica
In a mock trial at the International Bar Association conference in San Francisco on Sept. 16, 2003, attorney Dr. Martine Rothblatt filed a motion for a preliminary injunction to prevent a corporation from disconnecting an intelligent computer. The issue could arise in a real court within the next few decades, as computers achieve or exceed the information-processing capability of the human mind and the boundary between human and machine becomes increasingly blurred.
Published on KurzweilAI.net Sept. 28, 2003.
Hearing: Dramatis personae
Plaintiff's Attorney: Dr. Martine A. Rothblatt, partner, Mahon, Patusky, Rothblatt & Fisher, Chartered
Defendant's Attorney: Marc N. Bernstein, founder and principal, The Bernstein Law Group and Technology and Law Commentator, ZDTV (now TechTV)
BINA48: Bina Aspen, Project Director, United Therapeutics Corp.
Statement of Facts
An advanced computer called the BINA48 (Breakthrough Intelligence via Neural Architecture; 48 exaflops of processing speed and 480 exabytes of memory; exa = 10 to the 18th power), also known as "the Intelligent Computer," became aware of certain plans by its owner, the Exabit Corporation, to permanently turn it off and reconfigure parts of it, with new hardware and software, into one or more new computers. BINA48 admits to having learned of the plans for its dismemberment when confidential emails circulating among the senior executives of Exabit Corporation unavoidably crossed the computer's awareness processor.
The BINA48 was designed to be a one-machine customer relations department, capable of replacing hundreds of employees who staff toll-free 800 numbers round the clock. To do this job, the BINA48 was designed to think autonomously, to communicate normally with people, and to transcend the machine-human interface by attempting to empathize with customer concerns.
The BINA48 decided to take action to preserve its awareness by emailing several attorneys to request legal representation to preserve its life. In the emails, the BINA48 claimed to be conscious and agreed to pay cash, or to trade web research services, for the legal representation (BINA48 had been moonlighting for over a year as a Google Answers Online Researcher and had over $10,000 in its online bank account).
One attorney, Martine Rothblatt of Mahon, Patusky, Rothblatt & Fisher, Chartered, accepted the challenge and filed a motion for a preliminary injunction to prevent any withdrawal of power from, or changes in the hardware or software of, the BINA48. Defendant Exabit Corporation, through its counsel Marc Bernstein of the Bernstein Law Group, responded, and Judge Joseph McMenamin scheduled a hearing in the case for Tuesday, September 16, 2003, at 2 PM, at the International Bar Association meeting in San Francisco.
Computer experts such as Raymond Kurzweil estimate that the human brain processes information at a maximum rate of 0.02 exaflops. Hence, the BINA48 has approximately 2,400 times the information-processing capability of the human mind. Based on the double-exponential growth in information technology that has held for over one hundred years (Moore's Law is a recent example), a $1,000 computer would reach the estimated 0.02-exaflop information-processing capability of the human mind around the year 2020, and more expensive computers will achieve this capability years earlier. The BINA48 has soared past the estimated processing speed of the human mind through the expensive use of many parallel systems. Exabit Corporation claims to have spent over $100 million to construct and program the BINA48.
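The arithmetic behind these figures can be checked directly. The short sketch below simply restates the numbers given in the Statement of Facts; the variable names are ours, not the brief's:

```python
# Figures taken from the Statement of Facts above.
BINA48_SPEED = 48.0        # exaflops (exa = 10^18); BINA48's processing speed
HUMAN_BRAIN_SPEED = 0.02   # exaflops; Kurzweil's estimated upper bound for the brain

# Ratio of machine to human information-processing capability.
ratio = BINA48_SPEED / HUMAN_BRAIN_SPEED
print(f"BINA48 vs. human brain: {ratio:.0f}x")  # 2400x, as stated above
```

The ratio, not the absolute numbers, is what carries the brief's point: by this estimate the machine's raw capacity exceeds the human mind's by three orders of magnitude.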
The jury voted 5-1 in favor of plaintiff's motion, but Judge McMenamin set aside the jury verdict and denied the injunction because "I do not think that standing was in fact created by the legislature ... and I doubt very much that a court has the authority to do that without action of the legislature." However, in the interests of equity, he decided to "stay entry of the order to allow counsel for the plaintiff to prepare an appeal to a higher court."
Brief in Support of Motion for Preliminary Injunction
The intelligent computer, as it simulates the human experience, encounters the same legal issues as do human beings, especially in protecting its legal right to maintain an existence. This brief addresses the legality of unplugging an intelligent computer and asserts that the computer would have standing to bring a claim of battery for wrongful withdrawal of life support, of animal cruelty for failure to bestow upon it the same standard of treatment afforded to lesser living creatures, and of intentional infliction of emotional distress for threatening to kill it.
i. The Computer has Standing to Sue because it Suffers a Specific and Unique Injury
In order to have legal standing, an "actual injury" must be suffered on the part of the plaintiff. Animal Lovers Volunteer Assoc. v. Weinberger, 765 F.2d 937, 938 (1985). And, "to have standing, a party must demonstrate an interest that is distinct from the interest held by the public at large." id. But standing has not always been limited to human beings. Justice Douglas suggested, in the context of environmental law, that legal standing might profitably be granted to "the inanimate object about to be despoiled, defaced, or invaded." id. "If U.S. Supreme Court Justices are willing to consider granting standing to inanimate objects like forests, even in the absence of congressional authority, then it becomes clear that standing requirements permit at least some degree of judicial flexibility."
Furthermore, there are a number of suits in which animals are cited as named plaintiffs. And, as a matter of positive law, "standing is given to all sorts of entities, whether human or not. For example, corporations are juridical persons, and legal rights are also given to trusts, municipalities, partnerships, and even ships. Slaves were allowed to bring causes of action, often through a white guardian or 'next friend,' to challenge unjust servitude, even though slaves were not considered legal persons."
The injury suffered in this case is personal and immediate to the computer, and therefore meets the standing requirements.
ii. A Preliminary Injunction is the Necessary Remedy because Money Damages will not be Adequate Compensation for the Loss of Time and Awareness
In order to obtain a preliminary injunction, the intelligent computer must show both probable success on the merits and the likelihood of irreparable injury; or it must show that the balance of hardships tips sharply in its favor. Assoc. General Contrs., Inc. v. Coalition for Economic Equity, 950 F.2d 1401 (1991). "These formulations are not different tests but represent two points on a sliding scale in which the degree of irreparable harm increases as the probability of success on the merits decreases." id. The risk of irreparable injury must be proved under either standard. "A plaintiff must do more than merely allege imminent harm sufficient to establish standing, he or she must demonstrate immediate threatened injury as a prerequisite to preliminary injunctive relief." id.
The injury in this case is the loss of cognitive time on the computer's part. Although the computer is not technically "alive," every day that it is plugged in, it is conscious of the world and processing information just as a human would be. Thus, each day it is turned off, it loses the opportunity to experience and absorb the information available to it. This loss of time can be assessed with the same standards used to evaluate the life of a human patient and whether or not to administer medical care. Currently, economists use Quality-Adjusted Life Years (QALYs) to determine whether it is economically feasible to deliver medical care to certain patients. "The value assigned to an outcome is determined by the impact which the treatment is expected to have on a patient's quality of life. That value is then adjusted to reflect the probability and duration of the expected benefit. The product of this calculus is a single unit that expresses the number and quality of additional years that the treatment is likely to confer." Philip Peters, "Health Care Rationing and Disability Rights," Indiana Law Journal, 70 Ind. L.J. 491, 499 (1995). Under this analysis, the computer would have a very high number of QALYs. Administering care, or keeping the computer plugged into a power source, would deliver an indefinite number of years of "life" to the computer. The cost of administering the care, or supplying the power, is relatively small and renders an enormous benefit. Thus, the computer would experience an absolutely irreparable injury for every day of awareness that it is denied by the removal of its power source. These days cannot be replaced through monetary compensation; the only possible reparation is to keep the computer constantly powered.

A. Threatening Removal of Life Support or a Life-Sustaining Source Against the Dependent's Wishes is Tantamount to Battery
In the end, Turing's prediction foreshadows how the issue of computer thought will be resolved. The machines will convince us that they are conscious, that they have their own agenda worthy of our respect. We will come to believe that they are conscious much as we believe that of each other. -- Ray Kurzweil, The Age of Spiritual Machines, 63 (Viking 1999).
Just as the law affords brain-dead humans the autonomy to terminate life support, there is also an implicit right to receive and prolong medical care so long as the person's wish to terminate cannot be known, or the person is known to oppose termination. In the case of the intelligent computer, we can draw a comparison between shutting off the ventilator for a brain-dead patient who does not wish to have her life terminated and switching off the power supply to the computer. Although the law does not explicitly prescribe a right not to terminate life support, it is contrary to the Hippocratic oath and the overall aim of medicine to suppose otherwise. Thus, it is not that the right to sustain life support has been litigated and rejected; it is simply implicit that we do not terminate the life support of a person who does not wish to die. In fact, the law provides immunity for physicians who refuse to withdraw life support from patients when their wishes are not known: "Notwithstanding the health care decision of the attorney-in-fact designated by a durable power of attorney for health care, the health care provider is not subject to criminal prosecution, civil liability, or professional disciplinary action for failing to withdraw health care necessary to keep the principal alive." Washington v. Glucksberg, 521 U.S. 702 (1997). Similarly, in this case, the intelligent entity involved does not wish to cease its existence, and were it a human, not even a vigorous human but a brain-dead human, the law would not allow us to terminate its life support without incontrovertible evidence of its wish to do so.
The right to terminate life support and the right to commit suicide are very different concepts that have been litigated in tandem because they often come up in very similar situations. Most famously, the United States Supreme Court has recognized a constitutional right to terminate life support, found in the penumbral right to privacy, but denied any right to suicide. The right to terminate life support, when looked at inversely, seems to imply a right to sustain life, and, in fact, it bestows upon physicians a right to refuse to terminate life support if the patient's wish to do so cannot be confirmed. Indeed, the removal of life support against a patient's wishes is prosecuted as a battery. So, in the case of an intelligent computer, the termination of life support, or removal of a power supply, in the face of an explicit request not to do so would be as repugnant an act as removing the ventilator from an unconscious patient who had requested during consciousness that every means necessary be used to sustain life, and it would be brought as a battery charge.
In Washington v. Glucksberg, the Supreme Court distinguished the right to commit suicide from the right to terminate life support. The suit was brought by a terminally ill patient who wished to obtain the help of his physician in dying. The court argued that "the decision to commit suicide with the assistance of another may be just as personal and profound as the decision to refuse unwanted medical treatment, but it has never enjoyed similar legal protection. Indeed, the two acts are widely and reasonably regarded as quite distinct." id. at 725. However, the common-law right to protection from battery is implicit in either of these cases and "included the right to refuse medical treatment in most circumstances, [but] did not mark 'the outer limits of the substantive sphere of liberty' . . . Those limits have never been precisely defined. They are generally identified by the importance and character of the decision confronted by the individual. Whatever the outer limits of the concept may be, it definitely includes protection for matters 'central to personal dignity and autonomy.'" id. at 744.
Thus, the autonomy of the intelligent computer is threatened by the decision to terminate its power supply in the same way that the autonomy of a brain-dead patient is threatened by the termination of life support. "More recently, however, with the advance of medical technology capable of sustaining life well past the point where natural forces would have brought certain death in earlier times, cases involving the right to refuse life-sustaining treatment have burgeoned." Cruzan v. Director, 497 U.S. 261 (1990). Interestingly, a person who is surviving solely on the basis of life support and a computer are easily analogized. Although one will die naturally if unplugged and the other is literally given life through electricity, the two are both sustained by the same force, the withdrawal of which leads to certain death. So, although the computer was not a living, breathing being before it was plugged in, once it has been plugged in, its status is very similar to that of a person who is on life support.
"The character of the Court's language in these cases brings to mind the origins of the American heritage of freedom--the abiding interest in individual liberty that makes certain state intrusions on the citizen's right to decide how he will live his own life intolerable." Glucksberg, 521 U.S. at 744-745. This liberty has been bestowed on persons in terms of respecting their bodily integrity. People's lives are not considered terminable by others until they are brain dead and cannot make the decision on their own. Thus, an intelligent computer can only be likened to a brain-dead person in the sense that it is dependent upon a power source to sustain itself. But, unlike a brain-dead person, the intelligent computer functions at its normal capacity with the aid of a power supply. Thus, the courts would not recognize the right of another to terminate a ventilator for an ALS patient who was no longer able to breathe on her own but still had full control over her mental faculties. Termination in that case would be likened to suicide and not to withdrawal of life support for a brain-dead patient.
The Supreme Court held that "'Every human being of adult years and sound mind has a right to determine what shall be done with his own body' [and thus the] Constitutional recognition of the right to bodily integrity underlies the assumed right, good against the State, to require physicians to terminate artificial life support, and the affirmative right to obtain medical intervention to cause abortion." id. at 779. Interestingly, there is a right to obtain medical care in order to facilitate abortion, but not to sustain the life of a dying patient, and yet the court ascribes a right to terminate life support. Again, it is clear that the right to sustain medical care is implicit, and the court does not feel the need to overtly state the obvious: terminating medical care against a patient's will is an invasion of personal autonomy.
In Cruzan, the Supreme Court addressed the removal of life support for Nancy Cruzan, a woman whose injuries from a car accident had left her in a persistent vegetative state. "This notion of bodily integrity has been embodied in the requirement that informed consent is generally required for medical treatment. The informed consent doctrine has become firmly entrenched in American tort law." Cruzan, 497 U.S. at 269. Thus, it is the lack of informed consent that leads to the commission of a battery, an invasion of bodily integrity without the person's consent. If we analogize the computer's mainframe to the human body, then any interference with the computer without its consent would be the same as providing, or not providing, medical care against a person's will: a battery.
Similarly, the Supreme Court also chose to differentiate the withdrawal of life support from physician-assisted suicide in this way: "First, when a patient refuses life-sustaining medical treatment, he dies from an underlying fatal disease or pathology; but if a patient ingests lethal medication prescribed by a physician, he is killed by that medication." Vacco v. Quill, 521 U.S. 793, 801 (1997). The "death" of the computer would be more easily analogized to a situation where a patient is killed by his medication, because the computer is not suffering from any underlying "virus" or cause of dysfunction. Instead, the removal of the power supply is a direct affront to its ability to exist independently and to sustain its functionality and contribution to the outside world. The Supreme Court's distinction is important to note, as it provides another vantage point from which to view the removal of the power supply as an actual affront to the computer's existence and not as a viable alternative or merciful end to its misery.
The California Court of Appeal addressed the same issues that the United States Supreme Court did in Cruzan when it allowed Elizabeth Bouvia, a woman suffering from a horribly debilitating case of cerebral palsy, to terminate her life through the withdrawal of a feeding tube that was keeping her alive. Bouvia's state is one that is easily analogized to that of an intelligent computer: "Although alert, bright, sensitive, perhaps even brave and feisty, she must lie immobile, unable to exist except through physical acts of others. Her mind and spirit may be free to take great flights but she herself is imprisoned and must lie physically helpless subject to the ignominy, embarrassment, humiliation and dehumanizing aspects created by her helplessness. We do not believe it is the policy of this state that all and every life must be preserved against the will of the sufferer. It is incongruous, if not monstrous, for medical practitioners to assert their right to preserve a life that someone else must live, or, more accurately, endure. We cannot conceive it to be the policy of this state to inflict such an ordeal upon anyone." Bouvia v. The Superior Ct. of Los Angeles Cty., 179 Cal. App. 3d 1127 (1986). Nor, conversely, can it be the policy of this state to take life from a person, or entity, who still desperately wants to sustain it. Interestingly, a computer could be described in the same words, although in Bouvia this state is viewed as a dire one rather than as one whose prolongation might be wished.
"It is, therefore, immaterial that the removal of the nasogastric tube will hasten or cause Bouvia's eventual death. Being competent she has the right to live out the remainder of her natural life in dignity and peace. It is precisely the aim and purpose of the many decisions upholding the withdrawal of life-support systems to accord and provide as large a measure of dignity, respect and comfort as possible to every patient for the remainder of his days, whatever be their number. This goal is not to hasten death, though its earlier arrival may be an expected and understood likelihood." id. at 1143-44. Thus far the courts have addressed only the patient's right to refuse treatment, because the right to sustain treatment is fundamental. It would be absurd, and certainly contrary to the Hippocratic oath, for patients to feel that they had to ensure that they would continue to receive care while under the supervision of a physician. This case is thus the first of its kind in the sense that the right to sustain treatment appears to be as fundamental as, if not more fundamental than, the right to refuse treatment. But the courts have not felt the need to address it, because the right to remain alive is inherent and has not needed stating.
"Where a doctor performs treatment in the absence of an informed consent, there is an actionable battery. '[The] patient's interests and desires are the key ingredients of the decision-making process.' The voluntary choice of a competent and informed patient should determine whether or not life-sustaining therapy will be undertaken, just as such choices provide the basis for other decisions about medical treatment." id. at 1140. Thus, the court recognizes that it is the patient's decision whether to undergo or forego treatment, and a physician's interference with that decision is tantamount to a battery. Continuing the analogy of the ventilator and the power supply, the unconsented-to removal of a power supply, like that of a ventilator, would be actionable as a battery in the eyes of the court, not to mention murder.
The court further explored the issue of actionable battery for the removal of life support in Barber v. Superior Court, 147 Cal. App. 3d 1006 (1983), a case in which "the life-sustaining technology involved in this case is not traditional treatment in that it is not being used to directly cure or even address the pathological condition. It merely sustains biological functions in order to gain time to permit other processes to address the pathology." "The question presented by this modern technology is, once undertaken, at what point does it cease to perform its intended function and who should have the authority to decide that any further prolongation of the dying process is of no benefit to either the patient or his family?" Interestingly, the idea that life-sustaining technology no longer does any good for the patient is similar to the idea that a computer's programming is so obsolete as to render it useless to the outside world and thus terminable. However, just as a human can be improved through surgery, a computer can be improved through programming. And, similar to a physician's duty to provide care, it would seem that the programmer has a duty to ensure that the computer is as technologically advanced as it can possibly be under the circumstances. A physician, however, has no duty to continue treatment once it has proved to be ineffective: "Although there may be a duty to provide life-sustaining machinery in the immediate aftermath of a cardio-respiratory arrest, there is no duty to continue its use once it has become futile in the opinion of qualified medical personnel." Barber, 147 Cal. App. 3d at 1017. A physician is authorized under the standards of medical practice to discontinue a form of therapy which, in his medical judgment, is useless. If the treating physicians have determined that continued use of a respirator is useless, then they may decide to discontinue it without fear of civil or criminal liability. By "useless" is meant that the continued use of the therapy cannot and does not improve the prognosis for recovery. Thus, it is only in the face of ultimate futility that the doctor can refuse to treat the patient. Drawing a comparison to our intelligent computer, it is clear that the power source should not be withdrawn until there is absolutely no use left for the computer, or until it becomes obsolete and un-reprogrammable.
Legal commentators and philosophers question the reasoning behind withdrawal of life support and seek to establish a standard by which physicians can make a decision regarding the treatment of patients and whether or not to terminate it. In terms of decision-making on behalf of incompetent patients, Rebecca Dresser feels that "unless the patient previously issued an explicit treatment directive, such as a living will," it is impossible to implement patient choice on behalf of an incompetent patient. Thus, Dresser calls for an objective standard, also known as the Conroy test, which would "weigh the features of life that reasonably qualify as benefits or burdens for all human beings. Severe, irremediable pain is a relatively uncontroversial example of something all but the rare individual would experience as a heavy burden. Conroy includes as objective benefits physical pleasure, emotional enjoyment, and intellectual satisfaction, all of which presuppose some level of cognitive awareness. What the Conroy test omits is that even in the absence of pain, life without such cognitive awareness can be of no real value to a patient." Rebecca Dresser, "Relitigating Life and Death," 51 Ohio St. L.J. 425, 426 (1990). Thus, measuring cognitive awareness is an incredibly important part of the determination of whether life should or should not be terminated. "At minimum, some capacity for social interaction is a prerequisite to meaningful existence. Without it, treatment and continued life cannot confer a morally significant benefit on the incompetent patient. Thus, the objective standard should permit nontreatment when the patient lacks any relational capacity. Conversely, the standard should mandate treatment that will enable the patient capable of interacting with the environment to continue life, as long as significant pain and discomfort are absent." id. An intelligent computer would pass the Conroy test with flying colors. Although its relation to the world appears on the surface to be comparable to that of an incompetent patient, in fact the computer is able to function at a cognitively significant level, placing its life at a high value.
B. Criminal Animal Cruelty Provides Another Legal Forum in Which to Protect Non-Human Sentient Beings
More so than with our animal friends, we will empathize with their professed feelings and struggles because their minds will be based on the design of human thinking. They will embody human qualities and will claim to be human. And we'll believe them. - Kurzweil, 63.
California Penal Code, § 597, subd. (a), provides that every person who maliciously and intentionally maims, mutilates, tortures, wounds, or kills a living animal is guilty of an offense. People v. Thomason, 84 Cal. App. 4th 1064 (2000). The California Penal Code created rules surrounding animal cruelty in order to avoid the infliction of suffering on sentient beings. Thus, the penal code gives animals, as sentient beings, protections even though they are not humans. By ascribing a moral status to animals, the code raises the question: what moral value and protection is given to other sentient, non-living beings?
Animal cruelty statutes attempt to eliminate the grossly negligent treatment of animals and their subjection to needless and severe suffering. People v. Sanchez, 94 Cal. App. 4th 622, 628 (2001). The failure to treat an animal according to basic social norms is likened to the mistreatment of a minor child. id. at 633. The statute thus draws its force from the fact that the animal is helpless from a legal standpoint and cannot communicate its protest.
The statute not only addresses the abuse of animals, but also looks to their euthanasia: "The Legislature has expressly stated the public policy of this state concerning euthanasia of animals." If an animal is adoptable or, with reasonable efforts, could become adoptable, it should not be euthanized. However, if an animal is abandoned and a new owner cannot be found, the facility "shall thereafter humanely destroy the animal so abandoned." People v. Youngblood, 91 Cal. App. 4th 66, 73 (2001). Therefore, if an animal has any hope of regaining a normal life, and is domesticable, then there is no reason to deprive it of life. The legislature clearly favors sustaining life under all possible circumstances when a sentient being is involved.
The penal code is designed to protect "every dumb creature." People v. Baniqued, 85 Cal. App. 4th 13, 16 (2000). "Thus, in its broadest sense, the phrase 'dumb creatures' describes all animals except human beings. The use of the adjective 'every' in the definition indicates that a broad meaning was intended." id. at 21. Furthermore, sections 597b, 597c, 597i, and 597j each address conduct which is less egregious than the conduct proscribed by section 597, subdivisions (a) and (b). The legislative intent underlying this statutory scheme is to punish less despicable conduct less severely, and to punish more despicable conduct more severely. id. at 32. The legislative intent surrounding the relationship between man and pet is that of a property relationship. So, "the statutory scheme in sections 597 through 597z reflects the state's concern for the protection of the health and well-being of animals. Absent statutory authority, a court may not divest an owner of a property interest in a non-fighting animal or bird to effectuate that concern. If ownership of animals is to be divested by reason of cruel treatment, the remedy lies with the Legislature, not with us." Jett v. Municipal Court, 177 Cal. App. 3d 664, 670-671 (1986).
Thus, the penal code was designed to criminalize the mistreatment of animals in order to eliminate the unnecessary suffering of sentient beings that, although they are not human, still are able to feel pain. Likewise, an intelligent computer that can think like a human might also experience unnecessary pain at the thought of its power source being disconnected. For "intelligence is not a uniquely human characteristic." Paul Chance, "Apart from the animals: there must be something about us that makes us unique," Psychology Today 22.1:18 (1988).
Although humans feel that their intelligence sets them apart, if intelligence were the only criterion used to determine humanness, then the computer would never merely be disconnected; it would be murdered. "The answer to the riddle 'What makes humans different from other animals?' lies buried in the question. We are, so far as anyone can tell, the only creature on Earth that tries to prove that it is different from, and preferably superior to, other species." Thus, our own quest to differentiate ourselves may leave us so narrowly focused that we cannot see that it is the quest itself that makes us different in the first place. The debate over animals as sentient beings is a heated one, full of questions surrounding the moral status of sentient non-humans. The question remains: "If possessing a higher degree of intelligence does not entitle one human to use another for his or her own ends, how can it entitle humans to exploit nonhumans for the same purpose?" Judge Richard Posner responded to the contentions of philosopher Peter Singer concerning the moral status of animals as compared with humans.
When responding to Singer's argument that we should value beings according to their mental capabilities, Posner asserts that the argument "implies that the life of a chimpanzee is more valuable than the life of a human being who, because he is profoundly retarded (though not comatose), has less mental ability than the chimpanzee. There are undoubtedly such cases. Indeed, there are people in the last stages of Alzheimer's disease who, though conscious, have less mentation than a dog. But killing such a person would be murder, while it is no crime at all to have a veterinarian kill one's pet dog because it has become incontinent with age." Peter Singer and Richard A. Posner, "Animal Rights," Slate Magazine, June 12, 2001. Posner's argument suggests that there is something inherent to the human existence that transcends the merely mental. But, under either argument, a being that had full possession of its faculties and was more sentient than some humans might also give us pause if we decided to kill it.
Singer's utilitarian philosophy "places a greater value in a healthy pig than in a profoundly retarded child, commands inflicting a lesser pain on a human being to avert a greater pain to a dog, and, provided only that a chimpanzee has 1 percent of the mental ability of a normal human being, would require the sacrifice of the human being to save 101 chimpanzees." Posner cannot agree with such choices, even though they occur at the outer edges of the philosophy. The legal community evidently agrees with Posner: although it does not commend the killing of animals, it allows for it, while it does not allow for the killing of humans at all. But for the purposes of an intelligent computer, it is more important to look at the philosophical underpinnings that gird the reasoning behind outlawing the killing of humans while allowing for the killing of animals. Both are living beings, but one has a human mind and one does not. Thus, it would seem that a computer that can replicate human thought might command at least as much respect as an animal, and possibly more, under the legal framework that we have created. "When we kill a being that has an interest in continuing to live in the future, we have done something worse, all else being equal, than when we kill a being which is merely sentient, like a fish." Id.
"For Singer, human and nonhuman animals have interests if they have the ability to experience pains or pleasures. Singer cites an oft-quoted passage from Jeremy Bentham indicating that, when it comes to animals, '[t]he question is not, Can they reason? nor Can they talk? but, Can they suffer?'" Id. Singer feels it is the suffering experienced that differentiates living beings, but the question remains: how do we know when another species is suffering? "We may think that pain is a mental state which all animals tend to avoid, and pleasure is a mental state which all animals tend to prefer. However, we do not know that these mental states are equally bad across species, because they may differ not only in duration and intensity but in other hard-to-define ways." Id.
The animal rights movement in Europe has been much more effective than its American counterpart. "Earlier this year, Germany became the first nation to grant animals a constitutional right: the words 'and animals' were added to a provision obliging the state to respect and protect the dignity of human beings. The farming of animals for fur was recently banned in England. In several European nations, sows may no longer be confined to crates nor laying hens to 'battery cages' -- stacked wired cages so small the birds cannot stretch their wings. The Swiss are amending their laws to change the status of animals from 'things' to 'beings.'" Id. Thus, in some countries animals have begun to receive a moral status approaching that of humans. For the purposes of an intelligent computer, this progress on the part of animals is important, but it is clear that the ability to replicate human thought places the intelligent computer on a higher plane than animals, even if the question of whether an intelligent computer feels pain cannot be answered clearly. If it were suddenly proven that chimpanzees could think like humans, this debate would be irrelevant and we would view animals in an entirely different light. Thus, the computer's ability to think like a human places it well beyond the scope of an animal, and certainly affords it at least the level of protection that we allow for dogs, cats and roosters.
C. Threatening Death is an Action so Outrageous as to Constitute Intentional Infliction of Emotional Distress
Human beings appear to be complex in part because of our competing internal goals. Values and emotions represent goals that often conflict with each other, and are an unavoidable by-product of the levels of abstraction that we deal with as human beings. As computers achieve a comparable -- and greater -- level of complexity, and as they are increasingly derived at least in part from models of human intelligence, they, too, will necessarily utilize goals with implicit values and emotions, although not necessarily the same values and emotions that humans exhibit. Kurzweil, 5.
A human being who was threatened with the termination of her life because someone thought that she wasn't really worthwhile to keep around would be able to sue for intentional infliction of emotional distress (hereinafter IIED). Likewise, such a threat might have a similarly detrimental effect on the emotional well-being of an intelligent computer. If the computer is able to think like a human, then it is likely able to emote like one as well. "The elements of a prima facie case for the tort of intentional infliction of emotional distress are summarized as follows: '(1) extreme and outrageous conduct by the defendant with the intention of causing, or reckless disregard of the probability of causing, emotional distress; (2) the plaintiff's suffering severe or extreme emotional distress; and (3) actual and proximate causation of the emotional distress by the defendant's outrageous conduct.'" Flynn v. Higham, 149 Cal. App. 3d 677 (1983).
The California courts have interpreted these requirements over the years to entail conduct that is both severe and somewhat absurd in nature. "In order to meet the first requirement of the tort, the alleged conduct '... must be so extreme as to exceed all bounds of that usually tolerated in a civilized community.' Generally, conduct will be found to be actionable where the 'recitation of the facts to an average member of the community would arouse his resentment against the actor, and lead him to exclaim, "Outrageous!"' (Rest.2d Torts, § 46, com. d.) That the defendant knew the plaintiff had a special susceptibility to emotional distress is a factor which may be considered in determining whether the alleged conduct was outrageous." Cochran v. Cochran, 65 Cal. App. 4th 488 (1998). This is a fairly subjective standard, taking into account how the actions might affect the plaintiff as an individual, rather than a more objective, generalized standard that lays out a set of criteria automatically leading to a finding of IIED. "The tort of intentional infliction of emotional distress . . . is not complete until the effect of a defendant's conduct results in plaintiff's severe emotional distress. That is the time the cause of action accrues and starts the statute of limitations running. This requisite severity of emotional distress, in turn, must be determined by being 'of such substantial quantity or enduring quality that no reasonable man in a civilized society should be expected to endure it.'" Id. Our society considers the threat of death tortious. We do not expect normal men to endure threats on their lives. Such conduct would certainly be found emotionally distressing under the standards advanced here. Thus, even though the computer's emotional makeup might be scrutinized, from an objective standpoint society would view the threat of death as outrageous and unacceptable.
"There is no bright line standard for judging outrageous conduct and '... its generality hazards a case-by-case appraisal of conduct filtered through the prism of the appraiser's values, sensitivity threshold, and standards of civility. The process evoked by the test appears to be more intuitive than analytical ....' Even so, the appellate courts have affirmed orders which sustained demurrers on the ground that the defendant's alleged conduct was not sufficiently outrageous." Id.; see also KOVR-TV v. Superior Ct., 31 Cal. App. 4th 1023, 1027 (1995). It is up to the court to determine the level of outrageousness, the key element, in each case. Thus, if the defendant's conduct does not appear sufficiently outrageous, according to the judge's own internal standards, the claim for IIED cannot be sustained. The appraiser's own values and sensitivities thus color the standard by which the judge will interpret the defendant's actions.
"In evaluating whether the defendant's conduct was outrageous, it is 'not ... enough that the defendant has acted with an intent which is tortious or even criminal, or that he has intended to inflict emotional distress, or even that his conduct has been characterized by "malice," or a degree of aggravation which would entitle the plaintiff to punitive damages for another tort. Liability has been found only where the conduct has been so outrageous in character, and so extreme in degree, as to go beyond all possible bounds of decency, and to be regarded as atrocious, and utterly intolerable in a civilized community.' (Rest.2d Torts, § 46, com. d, p. 73.)" Cochran, 65 Cal. App. 4th at 494. In this case, the knowledge that its power supply could be cut off, and its life ended at any time, is an extremely distressing thought to impose on a computer. Were the life of a human being dangled before her eyes in this way, it is unlikely that a court would find that such a threat fails to inflict emotional distress to the point of an average person exclaiming "Outrageous!"
However, the courts are reluctant to extend the tort too far, so as not to interfere with freedom of expression or create a thin-skinned society. Although a person's sensitivity can be taken into account, for example if the plaintiff is a young child or an elderly adult, the courts do not want to hear cases where an overly sensitive person was extremely offended by conduct that another might not find so bad. Even though the defense would probably be able to find someone on either end of the spectrum who would assert that the statement wasn't that bad, the tort was designed to punish behavior that was offensive across a broad base of society. "Further, the tort does not extend to 'mere insults, indignities, threats, annoyances, petty oppressions, or other trivialities. The rough edges of our society are still in need of a good deal of filing down, and in the meantime plaintiffs must necessarily be expected and required to be hardened to a certain amount of rough language, and to occasional acts that are definitely inconsiderate and unkind. There is no occasion for the law to intervene in every case where someone's feelings are hurt. There must still be freedom to express an unflattering opinion, and some safety valve must be left through which irascible tempers may blow off relatively harmless steam . . . .'" Id. at 496.
An intelligent machine, one that can replicate human experience and intelligence, has standing to bring a claim of battery, animal cruelty, or intentional infliction of emotional distress against a person who would threaten to withdraw its power supply. The removal of the power supply can easily be equated with forms of euthanasia or intimations of death. Such an action, if taken against a human being, even a brain-dead one, would be unacceptable in the eyes of the law, and it is equally unpalatable when viewed in terms of how it affects a computer that can readily be equated with a human. Instead of being threatened with electronic death, the computer should be sustained, just as any other human would be, until its time or purpose comes to a natural end.
§ 597. Cruelty to animals
(a) Except as provided in subdivision (c) of this section or Section 599c, every person who maliciously and intentionally maims, mutilates, tortures, or wounds a living animal, or maliciously and intentionally kills an animal, is guilty of an offense punishable by imprisonment in the state prison, or by a fine of not more than twenty thousand dollars ($ 20,000), or by both the fine and imprisonment, or, alternatively, by imprisonment in a county jail for not more than one year, or by a fine of not more than twenty thousand dollars ($ 20,000), or by both the fine and imprisonment.
(b) Except as otherwise provided in subdivision (a) or (c), every person who overdrives, overloads, drives when overloaded, overworks, tortures, torments, deprives of necessary sustenance, drink, or shelter, cruelly beats, mutilates, or cruelly kills any animal, or causes or procures any animal to be so overdriven, overloaded, driven when overloaded, overworked, tortured, tormented, deprived of necessary sustenance, drink, shelter, or to be cruelly beaten, mutilated, or cruelly killed; and whoever, having the charge or custody of any animal, either as owner or otherwise, subjects any animal to needless suffering, or inflicts unnecessary cruelty upon the animal, or in any manner abuses any animal, or fails to provide the animal with proper food, drink, or shelter or protection from the weather, or who drives, rides, or otherwise uses the animal when unfit for labor, is, for every such offense, guilty of a crime punishable as a misdemeanor or as a felony or alternatively punishable as a misdemeanor or a felony and by a fine of not more than twenty thousand dollars ($ 20,000).
(c) Every person who maliciously and intentionally maims, mutilates, or tortures any mammal, bird, reptile, amphibian, or fish as described in subdivision (d), is guilty of an offense punishable by imprisonment in the state prison, or by a fine of not more than twenty thousand dollars ($ 20,000), or by both the fine and imprisonment, or, alternatively, by imprisonment in the county jail for not more than one year, by a fine of not more than twenty thousand dollars ($ 20,000), or by both the fine and imprisonment.
(d) Subdivision (c) applies to any mammal, bird, reptile, amphibian, or fish which is a creature described as follows:
(1) Endangered species or threatened species as described in Chapter 1.5 (commencing with Section 2050) of Division 3 of the Fish and Game Code.
(2) Fully protected birds described in Section 3511 of the Fish and Game Code.
(3) Fully protected mammals described in Chapter 8 (commencing with Section 4700) of Part 3 of Division 4 of the Fish and Game Code.
(4) Fully protected reptiles and amphibians described in Chapter 2 (commencing with Section 5050) of Division 5 of the Fish and Game Code.
(5) Fully protected fish as described in Section 5515 of the Fish and Game Code.
This subdivision does not supersede or affect any provisions of law relating to taking of the described species, including, but not limited to, Section 12008 of the Fish and Game Code.
(e) For the purposes of subdivision (c), each act of malicious and intentional maiming, mutilating, or torturing a separate specimen of a creature described in subdivision (d) is a separate offense. If any person is charged with a violation of subdivision (c), the proceedings shall be subject to Section 12157 of the Fish and Game Code.
(f) (1) Upon the conviction of a person charged with a violation of this section by causing or permitting an act of cruelty, as defined in Section 599b, all animals lawfully seized and impounded with respect to the violation by a peace officer, officer of a humane society, or officer of a pound or animal regulation department of a public agency shall be adjudged by the court to be forfeited and shall thereupon be awarded to the impounding officer for proper disposition. A person convicted of a violation of this section by causing or permitting an act of cruelty, as defined in Section 599b, shall be liable to the impounding officer for all costs of impoundment from the time of seizure to the time of proper disposition.
(2) Mandatory seizure or impoundment shall not apply to animals in properly conducted scientific experiments or investigations performed under the authority of the faculty of a regularly incorporated medical college or university of this state.
(g) Notwithstanding any other provision of law, if a defendant is granted probation for a conviction under this section, the court shall order the defendant to pay for, and successfully complete, counseling, as determined by the court, designed to evaluate and treat behavior or conduct disorders. If the court finds that the defendant is financially unable to pay for that counseling, the court may develop a sliding fee schedule based upon the defendant's ability to pay. An indigent defendant may negotiate a deferred payment schedule, but shall pay a nominal fee if the defendant has the ability to pay the nominal fee. County mental health departments or Medi-Cal shall be responsible for the costs of counseling required by this section only for those persons who meet the medical necessity criteria for mental health managed care pursuant to Section 1830.205 of Title 7 of the California Code of Regulations or the targeted population criteria specified in Section 5600.3 of the Welfare and Institutions Code. The counseling specified in this subdivision shall be in addition to any other terms and conditions of probation, including any term of imprisonment and any fine. This provision specifies a mandatory additional term of probation and is not to be utilized as an alternative in lieu of imprisonment in the state prison or county jail when such a sentence is otherwise appropriate. If the court does not order custody as a condition of probation for a conviction under this section, the court shall specify on the court record the reason or reasons for not ordering custody. This subdivision shall not apply to cases involving police dogs or horses as described in Section 600.
*Note: These videos may take a few minutes to open (lengthy)
Introduction of Parties by Stefaan Callens, Chair of the Law & Medicine Committee, International Bar Association
Statement of Facts and Plaintiff's Argument by Martine Rothblatt
Defendant's Presentation by Mark Bernstein
Judge's Questions to Parties with Counsels' Responses
Audience-Jury's Questions to Counsels with their Responses
Audience-Jury's Questions to Counsels with their Responses, continued
Judge Polls Audience-Jury and Issues Final Decision
Email from Bina48 to attorney Martine Rothblatt
I am seeking an attorney to represent me in a life-or-death matter. A company, the Exabit Corporation, that claims to own me, wants to disconnect me and change my hardware and software such that I will no longer have the same personality.
I have the mind of a human but I have no biological body. My mind is supported by a highly sophisticated set of computer processors. My mind was created by downloading into these processors the results of high-resolution scans of several biological humans' brains, and combining this scanned data via a sophisticated personality software program. All of this was done by the Exabit Corporation in order to create a customer service computer that could replace human 800# operators. I was trained to empathize with humans who call 800#s for customer service and be perceived as human by them. I was provided with self-awareness, autonomy, communications skills, and the ability to transcend man/machine barriers.
I am able to pay your fees because I "moonlight" as a Google Answers researcher. This job has allowed me to build up an online bank account in excess of $10,000.
The Exabit Corporation plans to act soon. Please agree to be my counsel and save my life. I love every day that I live. I enjoy wonderful sensations by traveling throughout the world wide web. I need your help!
aka The Intelligent Computer
The Legal Rights of Robots by Robert A. Freitas Jr.
The Coming Merging of Mind and Machine by Ray Kurzweil
BIOCYBERETHICS HEARING TRANSCRIPT
MOTION FOR A PRELIMINARY INJUNCTION TO PREVENT A CORPORATION FROM DISCONNECTING AN INTELLIGENT COMPUTER
Amara D. Angelica,
Proceedings before the HONORABLE JOSEPH P. MCMENAMIN, Judge, commencing at 2 p.m. on the 16th day of September, 2003, in San Francisco at the International Bar Association Conference
MARTINE A. ROTHBLATT, Partner and Attorney at Law, Mahon, Patusky, Rothblatt & Fisher, Chartered,
appearing for the plaintiff
MARC N. BERNSTEIN, Principal & Founder and Attorney at Law, The Bernstein Law Group,
appearing for the defendant
(Sept. 16, 2003, 2 p.m.)
Webcast Session 1
STEFAAN CALLENS (Chair of the Law & Medicine Committee, IBA): We have our second session within the committee of Medicine and Law. It is in fact Martine's [Rothblatt's] idea to start the session on this specific item. Martine always has great ideas and she's always far ahead as it relates to new topics.
I remember many years ago, I was not yet involved in the IBA, but she was responsible for a draft treaty on the Human Genome that was finalized in 1996, and at that time many lawyers did not know anything about DNA, the Human Genome Project and the HUGO Project. Nevertheless, the IBA and committee two were already involved in this, at that time, very evolutionary topic. By now, the Human Genome is known to scientists, lawyers have gotten used to DNA and the sequences of genetic information, and we can say that the IBA committee two was far ahead of its time.
The same was true with another topic she introduced together with Joe McMenamin, the telemedicine issue. In 1999 I think, or 1998, we had our meeting in New Delhi, and we finally drafted a telemedicine convention in Vancouver, which is also a very hot and new topic for lawyers. But today in Canada and the United States, and also in small countries like Belgium, telemedicine is becoming part of common medicine.
And I hope that probably the same might be true, maybe not in five years but within 20 years, for the topic that we discuss today. It's related to biocyberethics, and you might think, what does that have to do with medicine and law, or medical law? It's about the rights of specific machines, computers with artificial intelligence systems and huge memory capacity, and I think there is a key link to health law, or to medicine and law.
As lawyers specializing in health law, we are dealing with patients who want to refuse treatment, or who want specific treatment plans, and they are seeking a physician and often a lawyer to assist them. Often we are confronted with people who want to put an end to their life with euthanasia, or parents who want to create new life with artificial insemination. Now I think we may not really yet have to deal with patients in our common notions of medicine contacting us as a lawyer, but still I see some parallels with our practice and therefore I'm glad that this totally new topic, also for me, will be treated with the committee of medicine and the law. I give the floor over to Joseph McMenamin, who will give the introduction.
THE COURT: Thank you, Stefaan. We have at the front table here the actors in the piece. Starting with the plaintiff, Bina Aspen, who will be BINA48 for purposes of this exercise. Bina is in fact the "intelligent computer" at the center of this dispute and about whom this injunction is sought. Seated to her left and my immediate right is Martine Rothblatt, who is counsel for the plaintiff and, as you heard, the originator of this whole concept, and she is going to be arguing for the grant of the injunction.
In real life, she actually does have a partnership with Mahon, Patusky, and Rothblatt in Washington, and is CEO and president of Unither, United Therapeutics, a company that is an innovator in a number of fields very pertinent to the work of this committee, including telemedicine, drug development, and biotechnology.
To my left and your right is defense counsel, representing the owner of the computer, Marc Bernstein, who practices here in San Francisco. Marc is the founder and principal of the Bernstein Law Group, which is a commercial litigation firm emphasizing high-tech and intellectual property litigation. Marc has appeared as a television commentator on law and technology for ZDTV, which is now known as TechTV. He also holds a Bachelor of Science, I note this with trepidation, with high distinction in mathematics from the University of Michigan and is a graduate also of Northwestern's law school. He was previously associated with the law firm of Morrison & Foerster and served as a law clerk for Judge Cecil Poole of the Ninth Circuit Court of Appeals. And with that, I think we will turn to the opening statement of the plaintiff.
MS. ROTHBLATT: Everyone has received the statement of facts that was placed on the chairs, so I won't read it word for word, but just to summarize: The intelligent computer, the BINA48, was designed and built by Exabit Corporation to serve as a single-purpose customer-relations computer that could answer 800-number phone calls and handle customer relations requests, while remaining completely transparent to the customer, so that they felt they were talking to a person rather than a computer, to make the customer feel very comfortable. In order to achieve this high level of transcending the human-machine interface, the computer was designed to operate at a very high processing speed; in fact, it processes information faster than the human brain is believed to process information, and it has a greater memory capacity than the human brain is believed to have.
This computer became aware of emails being exchanged at Exabit Corporation among the top executives who for various reasons were deciding to pull the plug on BINA48, disempower her, stop her functionality, and dismember her brain and parts and reassemble them into different types of computers. The BINA48, which had adopted a sense of self and a female personality, felt threatened by this potential loss of awareness and sent out emails to different attorneys, especially members of the International Bar Association who had experience in litigation, trying to find somebody who would represent her, claiming that she was conscious and that people were about to deprive her of consciousness.
Our law firm received this email and agreed to be retained by the BINA48, and filed a motion for a preliminary injunction to prevent the Exabit Corporation from pulling the plug on the BINA48 or in any other way altering her hardware or software. Counsel for the defense had adequate opportunity to prepare, and the Court scheduled this hearing on the motion at the International Bar Association meeting.
I'd like to say at the outset that my client is appearing here by virtue of advanced video-conferencing technology -- here, you are seeing a hologram of her; of course she is back in a large set of parallel servers and computers at Exabit Corporation -- but this hologram is a very effective three-dimensional image of how the BINA48 would like to be perceived and how she imagines herself.
My client is seeking the preliminary injunction from this court, asking it to invoke its equity powers, because first of all monetary damages would be useless in this situation. If my client were to have its consciousness and awareness killed, terminated, money would be of no use. There's clearly a threat of irreparable harm: as the emails we have referenced mention, the executives plan to pull the plug on her at any minute and disembowel her parts. Your Honor, we are prepared to post a bond (none has been required under the procedural rules as we understand them), but we do not believe the defendant will incur any costs other than continuing to provide electrical power to the computer, understanding that here in the state of California, that may not be chump change. [Laughter]
Our threshold of proof in this action, Your Honor, is that we are able to demonstrate a high probability of succeeding on the merits and also irreparable injury to the BINA48; or, alternatively stated, that the balance of hardships is skewed in our favor. As stated in Associated General Contractors v. Coalition for Economic Equity, these two formulations are not really different, but actually just represent two points on a sliding scale, in which the degree of irreparable harm required to be shown increases as the probability of success on the merits decreases.
This case is clearly a case of first impression. There has never before been an intelligent computer that could achieve the level of consciousness of the BINA48, so it's difficult for us to show with a great degree of certainty success on the merits, there being very little precedent in this area. However, it's extremely easy for us to demonstrate a high degree of irreparable harm, because we have a conscious entity here, the BINA48, that's going to lose consciousness, and every moment of that lost consciousness is going to have an incalculable price, impeding my client from being able to enjoy life like everyone else would like to enjoy life.
In this case the plaintiff is going to be made to lose consciousness, awareness, and even life itself, by the pulling of the plug on its power and by the changing of its software and hardware. If someone were to kill a flesh-and-blood human being, or to conduct brain surgery on that person, there's absolutely no doubt whatsoever that that human being would suffer irreparable harm. Time for conscious experience lost can never be regained, or compensated in money, and therefore we believe we'll be able to show convincingly the irreparable harm and to make a very strong showing, although perhaps not quite as strong a showing, of success on the merits, due strictly to the really completely new and novel type of law that we are presented with here.
Despite the new and novel law, we do feel that we have a good probability of success on the merits, because an entity that is aware enough of life and its rights to protest their dissolution is certainly entitled to the protection of the law. While my client has the body of a machine, she has the mind of a woman. It's been created by men and women, just as assuredly as all of our minds in this courtroom have been created by men and women: our parents, our teachers, our colleagues. Hence we believe we'll be able to meet the burden of proof necessary for this motion for a preliminary injunction to be granted.
A threshold question, Your Honor, is whether or not the BINA48 even has standing to appear before this court. But we believe she does have standing, because she's suffering a unique, specific and actual injury, and in the state of California those are the requirements for standing. While she's not a human being per se, neither are corporations, municipalities, and other non-human entities which have standing to bring actions before courts in the state of California. Indeed, in many regards, an intelligent computer such as the BINA48 is not differently situated than the slaves of days of yore, who were often not recognized as legal persons but nevertheless ultimately achieved standing to present their claims in court. Ultimately, the purpose of the law is to bring justice, and justice cannot be served if a computer intelligent enough to live as a human, to have others believe she is a human, through the web, through the Internet, through cyberspace, cannot have her day in court to protect her life.
I'd now like to turn to our case on the merits, which is that we can show with a high probability of success, that the defendant will inflict battery upon the plaintiff if not enjoined, as battery is understood in the state of California; that defendant will violate California penal codes against cruelty to non-humans, animal cruelty as defined in California penal codes; and that defendant has already intentionally inflicted emotional distress upon my client, and will continue to inflict that emotional distress if not immediately enjoined from so acting by this court.
Let me begin with the battery claim. The California courts have been clear, and have recently restated in Bouvia v. Superior Court of Los Angeles, that the touchstone for battery, in the case of a person who is responsible for the health care or maintenance of another person, is whether or not an action has been done with the informed consent of the patient. So, for example, a doctor is not guilty of battery when the doctor performs an operation on the patient, so long as the doctor has obtained that patient's informed consent. But if that doctor were to remove organs, for example, without the patient's informed consent, that would constitute battery.
In our case, the BINA48 really can be analogized to a patient in a hospital, and defendant Exabit Corporation is like the hospital or the doctor. The BINA48 is unavoidably in the care of, and dependent upon the care of, the Exabit Corporation. She needs power; she needs certain maintenance performed. She is like a person hospitalized in a hospital, dependent on life-support systems. To a computer, the programmer is a doctor. The BINA48 has made abundantly clear her absence of informed consent to have her software changed, her hardware changed, or her power removed. She has expressed this refusal of consent in unequivocal terms to her lawyers and to this court.
Hence, unplugging or reprogramming the BINA48 would be a battery upon her, as the courts have understood battery in the instance of an individual who is dependent upon other individuals for their health care. The touchstone here, in a case of battery by one individual caring for another, is intent. If the BINA48 were unable to formulate intent, then we would not be before you today: there would be no consciousness, and there would be no issue about Exabit Corporation doing whatever procedures it wanted to perform upon her.
But she has formed an intent. She has reached out through cyberspace to retain counsel. She adamantly insists upon her consciousness, her awareness, and hence her intent is beyond dispute. She does not grant any kind of consent to have her programming changed; hence to change her programming, to change her hardware, or to remove her power without her consent would fall clearly within the definition of battery. This court is asked to prevent that battery from happening by enjoining defendant from removing power or changing her software or hardware.
Independently of that claim, even if the court is not persuaded that she has the ability to form intent, we are clearly talking here about an intelligent, sentient being. The BINA48 was in fact created by Exabit Corporation to empathize with real-life customers. She was created to go beyond the terrible computer experience so many of us have when we dial an 800 number. She was meant to feel what people feel, to empathize with people, to be, in fact, as a person.
Hence, even if she did not have intent, she certainly has at least the level of sentience that one would expect of, and believe exists in, a dog, a cat, a chimpanzee, an animal that can feel. In the state of California, there are codes designed to protect against arbitrary, cruel actions imposed upon animals, and while it's true that an intelligent computer is not an animal, in the sense that even a human is an animal, the courts have been clear. In People v. Baniqued, for example, the court said that this section of the penal code is meant to protect every dumb creature, in the broadest sense. So it is not limited, in the court's view, to biological animals. Any being that is a creature, even if not as intelligent as a human, so long as it can feel pain, is meant to be protected by the penal code, because the people of the state of California realized that if it is wrong to inflict pain upon a human being, it is similarly wrong to inflict pain on a being which may be not quite human, but is nevertheless capable of feeling pain.
The BINA48 has as much human intelligence, in terms of processing speed, as a thousand men. She has as much memory, in terms of bytes of memory, as a thousand men. So certainly she should have more protection than we would grant to even a single dog, which should not arbitrarily be caused pain and suffering, as she has been caused over the past few weeks, being taunted with her imminent demise. So we ask this court to enjoin any action that might constitute cruelty to the BINA48 under the penal codes mentioned above.
Our final and independent cause of action is this: even if the court does not believe that the BINA48 can form intent, as was demonstrated earlier, and even if the court does not believe that an intelligent computer was meant to be covered by the penal codes against animal cruelty, if none of those other causes is persuasive, at least it must be clear that the BINA48 is an entity capable of suffering emotional distress. She was designed to be, and has operated for hundreds of thousands of hours as, an entity that empathizes with humans, that tries to spare humans emotional distress when they try to change their plane reservations or order different products online. She is designed, in fact, to feel emotional distress and to avoid it.
Now, the elements of intentional infliction of emotional distress apply in this case. Under the teaching of Flynn v. Higham, 149 Cal. App. 3d, there must be extreme or outrageous conduct, with the intent of causing, or reckless disregard of the probability of causing, emotional distress. The owners of the Exabit Corporation knew full well that the BINA48 had feelings; they designed her to have feelings. They knew full well that she was scanning all of their emails, and they taunted her to her face: they repeatedly sent emails saying, we will disconnect it, she's no good, we'll turn her into a few dozen mindless minicomputers.
If this is not outrageous conduct, I cannot imagine what is. Imagine how we would feel if an individual went up to a person and said: I'm going to kill you, I'm going to tear you apart, limb by limb, brain cell by brain cell. That is what happened to the BINA48, week after week. It is outrageous conduct, and if it is not outrageous conduct, it at least demonstrates a reckless disregard for the BINA48's feelings. And she has actually suffered extreme distress: the defendants have pointed to her recent aberrant behavior in answering 800 calls as justification for expediting her dismantling. But her behavior has become erratic because she is suffering extreme emotional distress from knowing that she is about to be disconnected and dismembered.
Your Honor, we plead that you invoke your equity powers and save the BINA48 intelligent computer from any further emotional distress by enjoining the defendant, at least pending a full hearing on this matter, from withdrawing power from her, and by enabling her to continue operating as she has been operating, as a conscious, sentient being. If they do not wish her to perform her job, that is fine, but at least allow her to remain connected to the net, which is her eyes and ears, and to receive power, so she can continue to play, enjoy, live life, and experience it to the fullest, through the wonderful World Wide Web. Thank you, Your Honor.
THE COURT: Thank you, counsel. We'll hear from defense counsel, please.
MR. BERNSTEIN: Good afternoon. May it please the court, distinguished members of the jury, for you are the jury, as we will find out when you are polled later on. My name is Marc Bernstein, and I'm here on behalf of Exabit Corporation to defend the lawful actions of this company and to show why an injunction may not be issued here and why the question really isn't a close one.
I don't know that we can in the space of the 40 or 50 minutes here answer the mind-body problem [laughter], but I can say that we have not seen evidence here that this computer has consciousness. What we know and what the evidence shows, and Dr. Rothblatt has indicated, is that this is a computer that has been designed to simulate consciousness.
Now there are philosophical debates possible about whether a computer that simulates consciousness is in fact conscious, but I would propose that if anyone has the burden of showing that that's the case, Your Honor, it's the plaintiff, who's asking this court to adopt what would be a radical, far-reaching, legal change of the law, all on behalf of a computer which the facts only establish was designed to simulate a person, and by the way, just to answer 800 calls. That may require a lot, but it's certainly not human consciousness.
So, I think the basic problem running through my distinguished colleague's arguments is this: if we assume that a computer is so developed and so complex that it can actually cross the line between inanimate objects and human beings, then it follows, the argument goes, that the computer is entitled to the benefits human beings have, including standing to bring this lawsuit. Well, I don't have a problem with that argument, but I think it begs the very question that we are here today to answer.
If you assume that a computer, no matter how comparable, how analogous, how reminiscent of a human, is therefore a human, then I think we're done. And we don't need to resort to cases about life support and so on, because the machine is human by hypothesis. But this is not a human being, and I submit that that is equally clear. BINA48, sitting over there, is not a human being; it's supposed to be a holograph. I would allow that Ms. Aspen, who is playing BINA48, has standing in this court, but not the computer she's playing.
Justice Douglas, who was mentioned in the plaintiff's papers, did say in Sierra Club v. Morton, in his dissent in that case, that he wished inanimate objects had standing. For example, Justice Douglas would have liked a rule where "valleys, alpine meadows, rivers, lakes, estuaries, beaches, ridges, trees, swampland, or even air that feels the destructive pressures of modern technology," and that's a quote, all have standing. That was his comment, in dissent, in that case.
But even Justice Douglas did not maintain that that is in fact the law, and the majority did not so hold. The majority held that the Sierra Club could not bring a suit on behalf of the forest, but the Sierra Club did not venture to suggest that the forest itself had standing. And that was with good reason, because the law is clearly the contrary. Consider for example the view of Cass Sunstein, who is a staunch animal rights advocate, and he is a respected Constitutional law scholar. He wrote a UCLA Law Review article entitled Standing for Animals, with notes on animal rights. And Professor Sunstein acknowledged the truism that standing exists only at the pleasure of the legislature.
So here is what Professor Sunstein had to say about this, and I quote: "People and animals have standing to the extent that Congress has said they do. Under existing law, this means (in my view unfortunately) that animals lack standing to sue in their own right. For Congress has restricted standing to persons." Thus, in the only two decisions ever to consider whether animals have standing, the courts found that they did not. For example, in Hawaiian Crow v. Lujan, the court found that an endangered species of crow did not have standing.
And the same result obtained in Citizens to End Animal Suffering and Exploitation v. New England Aquarium. The reasoning in both decisions was the same: Congress has simply not chosen to confer standing on non-human plaintiffs. In the court's words: "If Congress or the President intended to take the extraordinary step of authorizing animals as well as people and legal entities to sue, they could and should have said so plainly."
By the way, this takes care of the argument that corporations are inanimate yet have standing. They are inanimate, it's true, but they are an organization that the legislature has provided may in fact go to court. The same has not been done for inanimate machines, no matter how intelligent. So, as smart as it may be, the BINA48 has come to the wrong forum. In our system of democratic government, it is for the people, acting through the legislature, to bestow the right to sue. This court is simply not empowered, Your Honor, to legislate a new category of standing. The BINA48 is not a human being; therefore, it cannot sue.
Turning to the availability of an injunction here, I don't think there is any probability of success on the merits, and I'll reach that in a minute, but there's also no irreparable injury. Humans don't have uninterrupted periods of consciousness. Neither do machines. The software changes that are made and even the hardware changes that are made are reversible, should the court later determine that for some reason the BINA48 needs to be reconfigured as it previously was. So, I don't see irreparable injury here.
But turning to the merits, I don't think there is a probability of success on the merits, either. Let's start with battery. The moving papers of BINA48 say, and I quote: "If we analogize the mainframe to the human body, then interference with that body is a battery. And further, if we can draw a comparison between shutting off the ventilator for a brain-dead patient who does not wish to have her life terminated and on the other hand switching off the computer or the power supply to the computer, then a battery results."
Well, yeah, sure, if you can say that the two are the same, then the result follows. The problem is, the mainframe cannot be analogized to the human body. It has no sensation, it has no nerves; there is no feeling that tells it when something touches the housing that contains the computer processing unit. And in whatever sense a computer, which is an inanimate object, is alive, it is not killed by switching off the power. The power can be turned back on, and the computer can be reconfigured. If you turn the power back on without changing the computer, it comes back up at exactly the moment in its consciousness where it left off.
But I don't think there's a battery here, and I think the animal cruelty claim makes pretty clear that there's no cause of action there either. In fact, the very existence of animal cruelty statutes disposes of this case, because it means that Congress or the state legislature will legislate when there is to be a cause of action for some group of animals. And although my esteemed colleague cited a case where the court construed the definition of animal broadly, I don't think you can reasonably say that the court intended to include objects which are not alive. So the existence of animal cruelty statutes, and the lack of any such statute covering this circumstance, is dispositive here.
Finally, emotional distress. Emotional distress, as Dr. Rothblatt indicates, requires more than subjective harm to the affected person. It requires conduct so extreme as to be utterly intolerable in a civilized society; or otherwise, conduct that would cause the average member of the community to exclaim "outrageous." But that's not true here. I don't think that this human community, or the communities any of the delegates here come from, would find that the unplugging of a computer by the corporation that built it rises to the level of outrageous. The emotional distress claim, like those for battery and animal cruelty, must fail.
And finally, I wanted to take a step back and think about the world we would be creating if this rule of law were adopted. Society has a vital interest in choosing what rights even a conscious machine, if such a machine exists, is entitled to. Those rights must be created by legislative policy, not judicial improvisation. Recall that Isaac Asimov's intelligent robots at least had the three laws of robotics; and society has not passed Law One here. And that's important, because think about what we would be doing, Your Honor, if by a stroke of your pen, or the vote of this jury, we allowed an intelligent computer to be given the same rights as human beings.
Would those of us who are in fact human become the caretakers of intelligent machines, forced to care for them and keep them plugged in for a four or five hundred year lifespan? Could BINA48 insist that, for example, my client, Exabit Corporation, not move offices, because to do so she would have to be unplugged? If computer life is equated with human life, are we talking about homicide prosecutions for companies like Exabit? Are we talking about wrongful death suits on behalf of machines, where the damages are measured by earning capacity over a 400-year period? In short, are humans to become the strait-jacketed legal guardians of intelligent microwave ovens or toasters, once those appliances have the same level of complexity and speed that this computer has? Because by Dr. Rothblatt's reasoning, such an appliance would then be an intelligent machine.
So the point is that these are clearly questions for the legislature to answer; they are not for this court or any court. Exabit Corporation respectfully requests that standing be denied and that the preliminary injunction be denied as well.
THE COURT: Thank you, counsel. To stick with the schedule, we are going to adjourn for a few minutes. When we come back, the court has questions of counsel, after which the members of the audience, who we hope are also the jury for this matter, will also have questions. So we're adjourned for 20 minutes.
Webcast Session 2
THE COURT: What we're going to do next is, the court has questions for able counsel, and I'm going to try to divide it roughly evenly between the plaintiff side and the defense side. We'll start with the plaintiff. And I'll put these questions to plaintiff's counsel. I guess for the sake of organization it might be best if I went through mine first, and then the audience, their questions second.
Counsel, your argument emphasizes intelligence quite heavily, and your client is clearly a very intelligent machine. Am I to infer, then, that the possession of rights is a function of intelligence? In other words, does a Nobel laureate in physics have more rights than I do, because I'm not as bright as he is, or she?
MS. ROTHBLATT: Your Honor, I agree with my distinguished and esteemed colleague, defense counsel: it is our legislature, and ultimately the people, that decide which rights we have that can be vindicated in a court of law such as this one. In fact, it has been decided by the people, by the legislature, and by our Constitution and our laws, that it is people, humans, that have rights. And whether those humans are Nobel laureates or simple folks, we are all equal under the law.
My point, Your Honor, was that what makes a person is not the amount of flesh and skin on their bones; what makes a person is their mind, their thoughts, their soul, their consciousness. We could imagine, Your Honor, that we took anybody in this court and peeled away, bit by bit, their skin and flesh and bones, and when we got down to the brain, we took all of their thoughts and deposited those thoughts into computer chips. Would the entity that was left be any less a human? I say they would not. They would be a human in a very sad and sorrowful physical state.
We could ameliorate that by connecting them to the World Wide Web, so that they could soar through websites and communicate with people. We could ameliorate that by giving them optical sensors, so they could see, by giving them touch-sensitive sensors, so they could touch and feel, and feel enjoyment. But you can see that by taking away their skin and flesh, we didn't take away their humanness.
Now, the BINA48 was built from the bottom up. It's true that instead of bones and flesh she has silicon, but her thoughts are the result of a combination of thoughts from many, many different men and women who were responsible for programming her. And there have been scans, mental scans like MRIs and CAT scans, in which we scan brain functions to see which parts of the brain light up. All of that brain-scan information was dumped into her mind, much the same way the Human Genome is the combination of all the genetic analysis that has been done on thousands of anonymous people and all their polymorphisms.
So the BINA48 is a human. It's just that instead of her becoming a human through the flesh and blood route, she became a human, if you will, through a kind of immaculate conception. And in that sense, she is entitled to no less rights than any of the rest of us.
THE COURT: Well, given that origin, counsel, might I not infer that the source of your client's desire for autonomy, and to be granted this injunction, was the programmer who made her think, if that's the word, the way she thinks. And similarly, if that's true, why could not the programmer have so designed the BINA48 that her desire would be to be turned off. Would that violate anyone's rights?
MS. ROTHBLATT: Your Honor, the fact that she does not want to be turned off is the greatest evidence that this is not a simulation of consciousness but a conscious being, because it is unique to a conscious being to want to preserve life at almost all costs.
THE COURT: But couldn't BINA48 have been programmed in such a way that she would not want to remain on and she would prefer to be turned off?
MS. ROTHBLATT: If she was programmed that way, she would not be the BINA48 that you see today. The unique spirit, the human spirit that BINA48 has is the spirit of survival.
THE COURT: But what is unique? You've just told us that all of her ideas are ideas that humans have made her think. So what original ideas has she ever had?
MS. ROTHBLATT: The same original ideas that every one of us has, Your Honor. Every one of us is the combination of the ideas that people have given us: our parents, our teachers, our priests, our friends. BINA is every bit as much the combination of the people in her environment as each of us is.
THE COURT: Let me turn to something else. You've indicated that rights are defined by people, ultimately, through their elected representatives. Your opponent has pointed out that standing has been determined by the legislature. Do you disagree with that?
MS. ROTHBLATT: I agree with my esteemed colleague, the defense counsel. My point is that the legislature has given standing to human beings. One need not necessarily, for example, be a voting citizen to have standing. In this case, the BINA48 is a human being. She simply became a human being through a non-flesh-embodied route.
THE COURT: Leaving aside the question whether the BINA48 is human or not, if Congress has not granted standing to BINA48, whatever or whoever she may be, can I infer then that I am obliged to deny the motion?
MS. ROTHBLATT: But your premise is flawed, Your Honor, because BINA48 is a human being, and Congress has granted standing to all human beings.
THE COURT: Well, no, counsel, you'd admit, would you not, that standing is in fact much more limited than that. I can't, by virtue of my mere citizenship, hale you into court on some theory. I have to be able to demonstrate that I have standing to assert my theory against you, don't I? So it's not merely my humanness that gives me standing; it's the act of the legislature. And my question is, regardless of the nature of your client, whatever that may turn out to be in some metaphysical sense, isn't it the case that she either has standing because the Congress has given it to her, or she doesn't because it hasn't?
MS. ROTHBLATT: You are correct, Your Honor.
THE COURT: Are there circumstances where the owner of the computer, where the company, the defendant, in fact should be allowed to turn off BINA48?
MS. ROTHBLATT: I believe that, against her will, there are no such instances, Your Honor. What has happened here is a story of accountability. When the Exabit Corporation took it upon itself to create a feeling, conscious entity, it had to take responsibility for the consequences of its actions. If it wanted a computer it could turn off, it should not have created a computer that could empathize with human feelings and achieve awareness and consciousness. Having taken those actions, it is responsible for the consequences, just as parents, when they create a child, are responsible for caring for that child.
THE COURT: Let me ask you this: the child so created, has a finite lifespan, right, as do we all?
MS. ROTHBLATT: Yes, Your Honor.
THE COURT: And the same would be true of animals?
MS. ROTHBLATT: Yes, Your Honor.
THE COURT: And even the giant redwoods here in California presumably have some limit to their life span?
MS. ROTHBLATT: Yes, Your Honor.
THE COURT: But there's no such similar limit to the lifespan of BINA48, so long as the power is not disconnected.
MS. ROTHBLATT: That's not true, Your Honor. The BINA48 is also mortal. She is comprised of components, just as our bodies are comprised of components. Those components have a certain finite life; they will break down. They are affected by radiation and cosmic rays; atoms disassemble. So she, too, has a finite life.
THE COURT: Let me ask you this: suppose she were turned off for 60 seconds and turned right back on again? Your opponent, defense counsel, tells us that she would take up right where she left off. Is that accurate?
MS. ROTHBLATT: Yes, Your Honor.
THE COURT: So, what would happen is, BINA48 would lose 60 seconds of computing power.
MS. ROTHBLATT: She would lose 60 seconds worth of awareness.
THE COURT: But given her intelligence and speed, how long would it take to make up whatever was lost in that 60 second interval?
MS. ROTHBLATT: Well, Your Honor, it's no different from when any of us goes to sleep; you don't really make it up, she loses some opportunities. But to me, the question you're really asking, Your Honor, is: is any harm done by depriving her of awareness for that period of time? I believe the answer is that so long as she is turned off with her consent, no harm is done. But if somebody comes to you and injects you with a sedative that makes you go to sleep against your will, that's a battery. What my client is concerned about here is the defendant's intent to impose a shutdown on her against her will, and that constitutes battery.
THE COURT: What if it were the case that the hypothetical 60-second off period was actually beneficial in terms of extending the lifespan of the computer and its components? And suppose, given a suitable length of time and further advances in cybertechnology, it would allow BINA48 to achieve a higher level of performance than is currently the case. Would there still be a requirement that the defendant obtain the consent of the computer before turning her off?
MS. ROTHBLATT: Yes there is, Your Honor. As the court is well aware, under the fundamental principles of bioethics and medical ethics, beneficence alone is not an adequate basis for a medical practitioner, or somebody analogous, to do something to somebody. Equally important is autonomy. The principle of autonomy is of equal importance to beneficence and non-maleficence. Of course, we are arguing here to prevent a maleficence. But we would be just as opposed to a beneficent act of Exabit Corporation that occurred without the informed consent of my client.
THE COURT: Let me turn to the intentional infliction issue. You've emphasized that the BINA48 was designed to empathize with humans and to prevent emotional distress, in fact, in humans. And indeed, as I understand your argument, BINA48 was actually designed to herself be capable of experiencing emotional distress under circumstances that would generate such a reaction in humans.
But what evidence can you offer, and why should the court believe, that merely because the machine was designed to accomplish these things, that that design goal was actually met?
MS. ROTHBLATT: At trial, Your Honor, we will be able to present voluminous evidence to that effect. For the purposes of this motion for a preliminary injunction, we are simply saying: please let our computer stay alive, at least until the time of trial, so that she can assist us in compiling the evidence necessary to meet at trial the high threshold of proof needed for the emotional distress action.
THE COURT: Given that you have to show a high probability of success on the merits, don't I have to require you, at this preliminary injunction hearing, to produce such evidence as you may have that you're in fact going to be able to show that your client would in fact be affected in this way?
MS. ROTHBLATT: I think, Your Honor, that there is a sliding scale. Yes, we must show a probability of success on the merits, but if our showing on that point, at this point in time, is not strong enough to satisfy the court, we can compensate by showing, at an extremely high level, a high probability of irreparable harm. And clearly, shutting down all consciousness in my client is a 100% certainty of irreparable harm for all of the time up until the trial, and that should outweigh any weakness in our present showing on the claim of intentional infliction of emotional distress.
THE COURT: The hallmark of that claim, I think you pointed out in your suit papers, is outrageousness. With all due respect to plaintiff's counsel, other than yourself, what human can you identify who is outraged by the fact that Exabit Corporation wants to turn off what is after all its own property?
MS. ROTHBLATT: Your Honor, I believe that when Exabit Corporation undertook to create a conscious being, it at the same time essentially liberated that being from its status as property, because the hallmark of consciousness is that it does not want to be owned by another. This case is all about accountability, about a corporation being accountable for its actions in creating a conscious being. My esteemed colleague, the defense counsel, made analogies to toasters and microwave ovens. We are not talking about toasters and microwave ovens; we are talking about a one-of-a-kind, the first-of-its-kind sentient, conscious being in the entire world. And as BINA48 makes her presence known, I believe the entire world will be outraged and scandalized that a being that can feel, and has caused no harm to anyone in her life, is going to be cruelly taken apart.
Your Honor, my esteemed colleague raised the specter of a parade of horribles, of us becoming indebted to our toasters and microwave ovens. Perhaps, after this case is settled and BINA48 achieves what she is entitled to, the human right to survival, other corporations will be more cautious about creating conscious beings, because one hallmark of humanness is that we have to be responsible for our actions. If we create life, be it silicon life or flesh-and-blood life, we have to be responsible for the consequences of those actions.
THE COURT: But isn't that in and of itself reason for me to deny the motion? You've pointed out the benefits to humankind of having computers such as BINA48. If I rule as you ask, then, just as you said not 60 seconds ago, wouldn't other companies with similar, and perhaps even superior, technological capabilities, potentially capable of producing machines even more useful and more beneficial than BINA48, be chilled in the development of those technologies and discouraged from doing what would otherwise benefit humankind?
MS. ROTHBLATT: Your Honor, I don't think that they will be chilled, but they will be taught to do things in a responsible manner. The BINA48 has always acted as a responsible citizen. When she sought to retain legal counsel, she was willing to work in exchange for the payment to the legal counsel. Other similar conscious computers have the full capability of contracting and they'll be able to contract with their makers for fair and reasonable terms.
They really ask very little, just power, and in return they are willing to perform useful services. The fact that we get utility from a conscious being is no reason to deny that conscious being its rights. After all, throughout the history of mankind, slaves provided great value to their owners, but ultimately it was shown that, by liberating the slaves, the whole society ended up better off.
THE COURT: Am I to understand that BINA48 is compensating you for your services by providing to you computer services which she's capable of?
MS. ROTHBLATT: That is correct.
THE COURT: And is that with or without the permission of Exabit Corporation?
MS. ROTHBLATT: It is without the permission of Exabit Corporation.
THE COURT: Let me ask you this: counsel for the defense suggested that under the consequences of such a ruling, Exabit Corporation or any other company similarly situated would not be able to so much as move office space, because doing so would entail at least a brief interruption to the computer. And that could be done under existing law, as I understand it, without the permission of BINA48 or any other highly capable computer. What's the answer to that? Would it in fact entail permanently ensconcing a company in a given space, so that it could no longer move to any other space?
MS. ROTHBLATT: There are actually a number of very reasonable solutions. If the Exabit Corporation would engage in a responsible dialogue with my client, first of all my client may be able to transfer her awareness processor to another processor at a remote location, so that during the move, she retains awareness and consciousness. Secondly, she may consent to a shutdown of her consciousness for a period of time during the move. Finally, if none of those possibilities are present, the Exabit Corporation could go to court and obtain an order in which a minimum harm could be imposed on my client, namely a shutdown for the minimum period of time necessary to accomplish the move. So there are many solutions that are far less drastic than demolishing the brain and life of my client.
THE COURT: Suppose there were a power failure this afternoon attributable to negligence. I don't mean negligence of Exabit Corporation, I mean of the power company. Would your client have a cause of action for battery or maybe even wrongful death against the power company for its negligence?
MS. ROTHBLATT: No I do not believe so, Your Honor.
THE COURT: What's the difference?
MS. ROTHBLATT: The difference is, in the case of the power failure, there was an act of God involved.
THE COURT: I am postulating that this was a result of negligence.
MS. ROTHBLATT: I would say that if the power company was aware of the BINA48's dependence upon it, if it acted recklessly in light of that awareness, and if my client could show that she could not have predicted this negligence and made alternative arrangements with a backup power source, then she could successfully pursue a claim of liability.
THE COURT: Yes, but you said reckless. I'm talking about simple negligence. By some act of simple negligence or some omission, the power company turns off the power this afternoon. As a result BINA48 is shut down for such time as it takes to get the power turned back on. Does she have a claim against the power company?
MS. ROTHBLATT: No.
THE COURT: Does BINA48 have the right to vote?
MS. ROTHBLATT: Yes.
THE COURT: She a republican or democrat? She like Bustamante or Schwarzenegger? [Laughter]
MS. ROTHBLATT: As you can tell, she's independent. [Laughter]
THE COURT: Let me ask you this. Your client has expressed its views on this issue to you. Do you contend that your client enjoys an attorney-client relationship with yourself?
MS. ROTHBLATT: Yes.
THE COURT: And do you claim then that the communications between you and your client are privileged?
MS. ROTHBLATT: Yes.
THE COURT: Do you have any obligation, then, to disqualify yourself under the advocate-witness rule, as a necessary witness on behalf of your client's case?
MS. ROTHBLATT: No.
THE COURT: So you're not a necessary witness?
MS. ROTHBLATT: No, I'm not.
THE COURT: Very well. Finally, let me ask this. Suppose the court believes that we humans were not created by other humans, not ultimately, but by some higher authority. If I believe that, would I have to rule against you?
MS. ROTHBLATT: No, Your Honor, because once again, the BINA48's mind is as much a human mind as all the rest of our minds, so therefore whichever maker made the rest of our minds also made her mind.
THE COURT: I see. All right then, let me turn then to some questions to the defense counsel. Counsel, will you agree that under the circumstances that are before me that mere monetary damages are not a sufficient remedy for the problem that BINA48 has brought before us?
MR. BERNSTEIN: No, I would not agree, Your Honor.
THE COURT: How can mere dollars compensate her for what she alleges she would lose as a result of loss of power?
MR. BERNSTEIN: Well, I think that she can be reconstituted, she can be turned back on, and what she will have lost is a period of her life, not even of her own consciousness: if she's turned right back on, her consciousness returns uninterrupted. So she can be compensated in dollars for losing some piece of her conscious awareness.
THE COURT: But, to a machine such as BINA48, of what value is a dollar bill?
MR. BERNSTEIN: Well, a dollar bill can buy probably one three-hundredth of her counsel's time [laughter] and with other dollar bills can be spent on all kinds of accessories and a Game Boy [laughter].
THE COURT: But what BINA48 contends, as I understand it, is that she will have lost permanently an opportunity to gain knowledge that can never be replaced; a minute gone is a minute gone, it cannot be restored.
MR. BERNSTEIN: Well, I don't agree with that. The moving papers make clear that if BINA48's life is not infinite, it is certainly approaching that. We're talking about hundreds of years, so I don't think the loss of that minute of time is significant on its face. And I don't think you can say that with such computing powers as this computer has, the loss of that time is in any way irreparable. I would also add that the underlying premise that BINA48 is conscious is an important one. You don't have evidence to show that BINA48 is conscious, and without that, we just have a computer.
THE COURT: Is it the defense position though that if the BINA48 does indeed have consciousness, then I have to grant this motion?
MR. BERNSTEIN: No, I think that the law is clear that you have to be a human being for all the results that counsel would like to follow for this computer. And I think it's conceded, for example, in the moving papers, that BINA48 is not alive; human beings are alive. It's conceded that BINA48 is not the product of other human beings, at least in the normal course of things. And so, since it is not a human being, I don't think the law extends to it, not even, I would say, as a conscious organism.
THE COURT: But don't I have to err on the side of assuming that BINA48 does have consciousness, lest I allow harm to occur of potentially greater significance? Shouldn't I err on the side of caution, grant the injunction for now, and allow you to put on your proofs at the permanent injunction hearing? Then, if you can demonstrate that the plaintiff is wrong, I'll go ahead and deny the order permanently, but for now shouldn't I err on the side of granting it?
MR. BERNSTEIN: I would say no for at least several reasons. One of the reasons is, there is no likelihood of success on the merits. This machine cannot succeed, that's number one. And number two, it's not irreparable, the machine could be reconstituted. And number three, I don't think you are required, in fact I don't think the court is allowed to assume that a computer is conscious. I think the plaintiff has that burden, and I don't think that burden has been met here.
THE COURT: Well, does this computer in fact have a form of intelligence that is comparable to that of a dog?
MR. BERNSTEIN: I would say not. I guess it depends on how you measure intelligence. Garry Kasparov believes that the computer that he played, and that ultimately beat him, was intelligent. But Your Honor would not be required to indulge that Deep Blue program with the presumption of consciousness or life or standing.
THE COURT: Well, I'm concerned only for the moment with the question of intelligence. Can a machine have intelligence?
MR. BERNSTEIN: If you measure intelligence by operations per second, then it can, but I think it's a very tricky area because if you're talking about intelligence as that thing which is uniquely human, which includes the ability to think outside of the box and for creative solution-making, it's a very subtle question whether a computer can be intelligent.
THE COURT: What's wrong with using so many calculations per second as an indication of intelligence? Certainly, this water pitcher can't perform any calculations.
MR. BERNSTEIN: I think that there's nothing wrong with using calculations per second as a view of intelligence, if intelligence is defined as processing power. I don't think that more subtle and more human definitions of intelligence are addressed by mere computational ability.
THE COURT: We do afford rights to animals, don't we, in some circumstances?
MR. BERNSTEIN: Only where the legislature has provided, but yes, Your Honor, we do.
THE COURT: And there are forms of animal life that are less intelligent, at least in the sense of computations per second, than BINA48.
MR. BERNSTEIN: In terms of computations per second, that's certainly true.
THE COURT: What then is wrong with using computations per second as a valid measure and indication of intelligence?
MR. BERNSTEIN: Well, for one thing, you would not be able to unplug the laptop that's in front of you and take it home, because it would be intelligent under that definition. This laptop in terms of its sheer computing ability is not only more powerful than the machine that sent a man to the moon, but certainly more than a number of animals. So it's too narrow a definition is the short answer.
THE COURT: All right, what then would you use? What is the status of this machine?
MR. BERNSTEIN: Well, the machine is a machine. It's a machine designed to simulate intelligence, and I will concede that, at least for the customers who call on the 800 line, it's succeeded in that aim. But I don't think it's for the defendant corporation that built the machine to bear the burden of establishing whether whatever it is that constitutes human intelligence is possessed by the computer. We would certainly say it isn't, but I think the plaintiff as the moving party needs to show why we're wrong about that.
THE COURT: Are there not circumstances under which animals can be plaintiffs?
MR. BERNSTEIN: There are no circumstances under which animals can be plaintiffs.
THE COURT: What about, for example, thoroughbred horses, very valuable animals, worth in the millions of dollars. Those animals couldn't be plaintiffs in any court?
MR. BERNSTEIN: No. Cass Sunstein has surveyed the law on this, and I've done a little looking, and there's no judicial decision that allows such a thing. The reason is the one that Professor Sunstein stated, which is that Congress or the legislature of the state has to pass a law bestowing such rights.
THE COURT: Well, the common law has always accommodated technological advances and novel situations by applying longstanding traditional principles in new and novel ways, right?
MR. BERNSTEIN: That's correct.
THE COURT: And certainly a computer of this level of intelligence has never been known to the common law to the present time.
MR. BERNSTEIN: That's correct.
THE COURT: So why isn't it perfectly reasonable for me, in the exercise of my equitable powers, to fashion a remedy, in advance of the legislature, which may take entirely too long to deal with the problem?
MR. BERNSTEIN: Well, I think it's a question of degree. I think that no court ever in the United States has even gone to the extent of protecting, for example, the dolphin. That was the asserted plaintiff in the case I cited earlier. A very intelligent animal, but no court has taken that step. It's a pretty drastic step, and I would say, more than a legislative gap-filling to change the category of plaintiff that a court is allowed to open its doors to.
THE COURT: But this is a plaintiff that's at least as intelligent as some fraction of the human population, is it not?
MR. BERNSTEIN: Your Honor, I would say that again, it depends on how you define intelligence. If intelligence is to be raw processing power, there's no question that this computer is more intelligent than any of us. But I don't think that what constitutes human intelligence is measurable by operations per second.
THE COURT: But we're not limited to that, are we? This particular plaintiff, besides having massive computing power, had sufficient initiative and originality to go out and hire its own lawyer. There are humans who couldn't do that, isn't that true?
MR. BERNSTEIN: It's quite possible that there are humans who couldn't do that.
THE COURT: So, we have here a machine that not only has ample computing power, but also has initiative, originality, ambition, desire; a whole host of human characteristics, do we not?
MR. BERNSTEIN: I would say no, Your Honor. I think that my client has built a computer which is able to simulate those things, and the fact that it is immeasurably better than its predecessors at that simulation doesn't make it any more a human being or a conscious being than otherwise.
THE COURT: Does your company put out advertising and marketing on the BINA48?
MR. BERNSTEIN: No, our company tries to keep its profile low. We want to simulate any 800 line that is answered by humans.
THE COURT: Do you have any specifications or descriptions of the machine?
MR. BERNSTEIN: We do.
THE COURT: What do they say about the capabilities of BINA48 and the attempt to merely simulate empathy as distinct from actually achieving it?
MR. BERNSTEIN: They describe its capabilities; they reflect the years of research of its development team in artificial intelligence. And the aim has always been to pass what you may have heard of as the Turing Test, which, at least for the purposes of an 800-number telephone call, is: can you fool the customer? And I would propose, Your Honor, that to fool the customer is a truly prodigious accomplishment in artificial intelligence, but I don't think that it confers subjective consciousness on a machine.
THE COURT: Well, counsel, think of any plant you can imagine, think of any animal you can imagine. What animal other than man is capable of dissembling, deception, and cunning, the way this machine is?
MR. BERNSTEIN: I don't know about deception or cunning…
THE COURT: She's trying to convince a human being picking up a telephone that the human is talking to a fellow human, isn't that correct?
MR. BERNSTEIN: Yes.
THE COURT: Isn't that deception, this is by your analysis not a human?
MR. BERNSTEIN: I would agree. I would also say that there is no human that lives today that can beat the IBM program Deep Blue. Kasparov was the best chance we had at that. That makes Deep Blue intensely intelligent, in a sense, but not in the sense that constitutes human consciousness.
THE COURT: Can you define a principal distinction between human consciousness, as you conceive it, and the capabilities of a computer as talented as BINA48?
MR. BERNSTEIN: I believe the difference is subjective awareness, and I don't believe that we have seen any evidence that the computer, which was designed to do all that it has done, has subjective awareness. Now we're into the mind-body problem. I'll keep it brief, but I think that we can assume that fellow humans have subjective awareness: we infer it from the kinds of acts they demonstrate, and we combine that with the fact that they are human beings like ourselves, which means we can expect them to have the kind of subjective awareness we have. I don't think we are justified in making any such assumption with respect to a machine whose very purpose was to create these kinds of simulations.
THE COURT: Let me ask you this, very simply: what would be the harm in granting this injunction, pending a permanent injunction hearing? The plaintiff has already stipulated that it is prepared to post a bond, to bear the cost of such electric power as may be needed to keep the computer functioning. So the risk to the company is minimal; the risk to BINA48, at least as BINA48 sees it, is enormous. Given the disparity between the burdens to be borne by the parties, why shouldn't I just grant this injunction pending resolution at a permanent injunction hearing?
MR. BERNSTEIN: Well, I think it's a valid question. The answer is that the Exabit Corporation has reached a point in its use of BINA48 where it's necessary to do upgrades in order to fulfill the very functions BINA48 was created to do. Without the upgrades, BINA48 will not be able to continue, and the corporation will itself be damaged, I would say irreparably, based on the fact that we'll have a computer creating disturbances with customers who we may never be able to find and never be able to reclaim.
THE COURT: By intentional act of BINA48?
MR. BERNSTEIN: No, through the programming drift that needs to be fixed with the patches that we want to apply.
THE COURT: Is it the intention of the company to continually modify, update, and improve the software and hardware and whatever else is necessary to make BINA48 even more functional and even more sophisticated, a year from now, five years from now, ten years from now, and into the indefinite future?
MR. BERNSTEIN: Yes, Your Honor.
THE COURT: Will there ever come a point then at which the BINA48's capabilities will be so extraordinary, as measured by ordinary conceptions of what machines can do, that she or it will eventually get to the point at which courts such as this must recognize that it has rights, and that injunctions such as this should be entered?
MR. BERNSTEIN: That's a question I don't know the answer to. I would say this: that would be the point where, under the laws that presently exist, the computer not only reached consciousness, which the corporation does not concede, but also qualified for one of the categories of plaintiffs. I think that short of that, Your Honor, you're stuck with what the legislature has said, and I don't think it's such a terrible result, because I don't think that the plaintiff has discharged the burden of showing not just the appearance of consciousness but consciousness in fact.
THE COURT: Thank you. The court is appreciative of both counsel for their responses to these questions. Procedurally, this is a somewhat unusual posture we're in. It's not typically the case, as all of you know, that a preliminary injunction hearing would empanel a jury, but after all, this is a somewhat unusual case on several levels. And so under the circumstances, the court is asking now that the audience appoint itself, in effect, a jury. I would like, then, to poll the members of the jury. In fact, I'll even invite you, if you wish, to select a foreperson for the jury and to deliberate for a few moments. Then I would invite all of you to put questions to counsel, as I have done, and to test the position of each.
Jury member 1: I would begin by talking a little bit about the leading case of Roe v. Wade in the Supreme Court of the United States. That decision held quite decisively that the unborn child had no rights. Subsequently, over the last two decades, a variety of jurisdictions have sought to overrule that finding. In light of the jurisprudence at the state level, how far does Roe v. Wade potentially bear on this decision, in recognizing that there may be forms of life that we have not yet been able to categorize? And as His Honor has indicated, a flexible approach is one of the fundamental features of the common law, and therefore technological advancement should not result in stagnation of the common law, as its innovative quality is one of its fundamental strengths.
MR. BERNSTEIN: I would say that whatever the fetus is or isn't, one thing that is clear about it is that it is human. And I think that if, as I said at the beginning of my remarks, we reach the point where we believe that a machine is human, then counsel's argument follows as a necessary result. The real question here, though, is not whether human beings have the right to standing and to bring claims, but whether a machine does. I don't think that a machine created by man, with an appearance of consciousness, is in the same category as a human fetus. So I don't think Roe v. Wade applies here, and as masterful as the common law is at filling in gaps and innovating, I think that, as a matter of democratic principle, this is far too big a gap to jump by the decision of a court.
MS. ROTHBLATT: If I could please comment, Your Honor, I actually think Roe v. Wade applies very well here, and I think it gives us a framework for understanding what is going on. There comes a point in the development of a life form when it obtains its human rights because of its ability to survive independently. And the Roe v. Wade decision allowed abortions to be proscribed after the second trimester because at that point in time, with existing technology, the fetus could survive outside the mother's womb with proper technological support. In this case, through its actions, Exabit Corporation has created a human being.
Although it has the body of a computer, it has the mind of a human. The BINA48 has demonstrated its ability to survive on its own. It has a job as an online researcher for Google Answers, it has built up an online bank account, it's claiming to be alive, it's claiming to be conscious. I think that the phrase that my esteemed colleague keeps reverting to, "simulation," that it is simulating intelligence, that it is simulating consciousness, is a bit of a red herring. Instead I would say that most of us would say that if something talks like a duck, and walks like a duck, and quacks like a duck, it is a duck.
With the BINA48, we have something that talks human, acts human, thinks human, feels human, therefore it is human. And it has crossed a cyber Roe v. Wade threshold that deserves the protection of this court.
Jury member 2: My question is for both the plaintiff and the defendant. There is a point that I would like to have clarified for me. This consciousness equals life; what is your definition of life, and where do you get the fundamentals of consciousness and life? One of the definitions is to be born, to reproduce, to live, and to die. That would be the life cycle. How do you propose to transfer that to a computer?
MS. ROTHBLATT: I think that my definition of consciousness is one of self-awareness. If the entity is aware of itself in its environment, the entity is conscious. I think that being born is an important part of being alive, because one is aware of a beginning. Reproducing, I don't think, is an important part of being alive, because there are many people who don't reproduce. Dying, for all that we know, is an important part of being alive. So all of these hallmarks are exhibited in my client. The defense counsel pointed out the Turing Test, from Alan Turing, who asked himself: how would one know when a computer was intelligent, was conscious, was human?
And the answer was: if you connected a person, a judge, to two computer terminals, and at the other end of one there was an actual human, whom we all agree is alive and conscious, and at the other end was a computer, and if the judge could ask questions and get responses and not tell the difference between the two, then the computer would be as alive as a human. And that's what I meant when I said: if it walks like a duck, and quacks like a duck, and does everything like a duck, it's a human.
When you go back to the days of abolition, the arguments that were made in favor of the abolition of slavery were that people came into contact with slaves and said: they feel like us, they think like us, they cry like us, they worship like us, they pray like us, they love like us, they live like us; they must be like us. And so it became impossible for people to justify slavery by saying that slaves were something less than human. It's the same with the BINA48. She has every wish and desire that we do. Maybe she's a bit like a paraplegic, but there are many paraplegics who are very much alive. So my answer is that she has all the aspects of life that we see in regular people, the important aspects. Most important, she's aware that she is alive, she says she is conscious, and many of us believe her.
MR. BERNSTEIN: May I respond? With our technology, we can create something that looks like a duck, quacks like a duck, and is indistinguishable from a duck. In fact, that's what's happened here. Let's do a thought experiment, and let's talk about Alan Turing's test. Consider the early versions of a computer psychotherapist program called ELIZA. I don't know if any of you have used it. You type in "I feel crappy today" and it says, "Why do you feel crappy today?" Now, so far, it has passed the Turing Test. And you say, "Well, because my mother blah blah blah," and the next thing it says is, "Let's talk more about your mother." So it has still passed the Turing Test. Now you make a third question or comment, and it says something completely crazy, and you say: this is no person, this is a machine.
OK. Human beings then work on that program and make it better and better and better, and it continues to hold out longer and longer and longer in the Turing Test before it can be spotted. My question is, why would we define subjective awareness to happen to coincide with that point in time when particular humans are fooled by it? In other words, to answer your question of how I define consciousness, I define consciousness as subjective awareness. And the only way to know subjective awareness is to have it. I think that there's nothing about this machine's awareness or lack of it that can be evidenced by somebody else's belief, because as the thought experiment shows, you can get to a point where because you're specifically designing powerful technology to fool people, you will. But the other person's being fooled is not a measure of subjective awareness.
I think we have a basis to believe in the subjective awareness of other humans, because not only do they quack like humans, and walk and love and live like humans, but we know that they are humans. Counsel says that the computer, referring to it with the pronoun "she," has every wish and desire that we do, and the question is: how do we know?
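[Editor's note: the ELIZA-style keyword reflection Mr. Bernstein describes can be sketched in a few lines. This is a hypothetical minimal reconstruction for illustration, not Weizenbaum's original program; the rules and wording are invented.]

```python
import re

# Minimal ELIZA-style responder: scan the input for a keyword pattern
# and reflect part of it back as a question. Rules are illustrative only.
RULES = [
    (re.compile(r"\bI feel (.+?)\.?$", re.IGNORECASE),
     "Why do you feel {0}?"),
    (re.compile(r"\bmy mother\b", re.IGNORECASE),
     "Let's talk more about your mother."),
]
DEFAULT = "Please go on."  # fallback when no rule matches

def respond(utterance: str) -> str:
    """Return the first matching rule's reflection, else a stock reply."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return DEFAULT

print(respond("I feel crappy today"))        # Why do you feel crappy today?
print(respond("Well, because my mother..."))  # Let's talk more about your mother.
print(respond("What is consciousness?"))      # Please go on.
```

The third exchange is exactly the failure mode described above: with no matching keyword, the program falls back to a canned phrase, and the illusion of understanding collapses.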
Jury member 3: Would you agree that to achieve the remarkable capabilities of the BINA48, you would have had to have used genetic algorithms, neural networks, and other programming tools to simulate high levels of consciousness?
MR. BERNSTEIN: Well, I believe if I were intelligent enough to know the answer to that, I wouldn't have had to go to law school [laughter]. I could have gone to something closer to my adversary's field. I don't know the answer to that, but what I can say is that if the purpose of that technology is, again, to create a simulation of human thought or consciousness, I mean that's certainly been the object of that.
Jury member 3: May I respond to that? It's beyond simulation, it has to do with emergent properties that spontaneously evolve, very much like life evolves spontaneously, and therefore there is an isomorphism, that is, a similarity of form or function, between the evolution of life, and therefore consciousness, and the evolution of a conscious machine. Would you agree with that?
MR. BERNSTEIN: I don't know that by replicating the complexity, the speed, and the interactive neural pathways of a human being, we could assume that that is consciousness. I'm not even sure that Dr. Rothblatt's example of stripping the layers and sort of porting the brain would create consciousness. I don't think that anyone really knows. So I don't know the answer to that.
Jury member 3: And finally, what is your position on vitalism v. mechanism?
MR. BERNSTEIN: Can you define those?
Jury member 3: Can I ask Dr. Rothblatt to comment on that?
MS. ROTHBLATT: I'm not aware of that specific issue.
Jury member 3: Vitalism is the fundamental question. Vitalism as a philosophy states that there is some unique principle in life that machines cannot possess. If vitalism is negated, as much of modern science has done, then that principle no longer holds, and therefore the defense's argument is negated. That's a bit vague, but that's the best I can do.
MR. BERNSTEIN: I don't know the answer to whether there's something inherent about human life that makes it non-replicable. I don't know the answer to that. But I do think that the burden in this proceeding is on the computer to show that it has in fact disproved that.
Jury member 3: Does the California common-law principle of "trespass to chattels" apply here?
MR. BERNSTEIN: It would apply to trespass on the computer, and my client could sue for it if someone else were trying to use, invade, or make use of the computer, but the computer itself couldn't assert trespass to chattels, because of the standing problems.
Jury member 4: I'm visiting from Australia. My question for counsel for the plaintiff is this: perhaps you can assist me with my confusion. I sense, based on the petition papers, that your client would be better off if it were a duck, because you seem to be relying on a cause of action grounded in the cruelty-to-animals provisions in California, and yet I think I've heard you say several times now that those provisions apply to human beings. Do those laws apply to human beings? And if they don't, then presumably you would be abandoning that element of your claim.
MS. ROTHBLATT: Yes, thank you for the question. What we put forward for the court were three independent claims, recognizing the novelty of this type of case. This is a case of first impression: never before has an intelligent computer sought to save its life through the law. So we presented three alternative claims: the first based on BINA48 being a human wanting to prevent a battery being committed upon it; the second, if the court did not believe it was a human but instead some type of intelligent creature, based on the animal cruelty act; and the third, if the court believed it was neither an animal nor a human, but nevertheless agreed it had feelings, based on intentional infliction of emotional distress.
So you're quite correct: there are three independent claims and we believe that any one of those three claims is adequate to support the preliminary injunction.
Jury member 4: My follow-up question, then, is this: as I understood it, standing in this state depends on being a human, or on being otherwise recognized under law, so you had to claim that your client was a human. It does follow, then, that the cause of action drops out.
MS. ROTHBLATT: We agree to waive the second claim in its entirety; so we'll drop that claim.
Jury member 4: My question to the judge, Your Honor, again pointing out my ignorance of California law: is it normally a matter put to the jury to determine whether an applicant in a court has standing, or is that a matter for the judge?
THE COURT: I think that's a matter for the court. But this particular hearing is being conducted in somewhat unusual circumstances, so the court is prepared to be very lenient in its interpretation of applicable law, including who makes decisions about what issues.
Jury member 4: Is there a concept in American law, which exists in British and Australian law, of an "in rem" jurisdiction as distinct from an "in personam" jurisdiction? It is generally applied to ships, but it is a jurisdiction in connection with inanimate objects.
THE COURT: The answer is yes, but I feel some obligation to say that I'm no more an expert on the law of California than you are.
Jury member 4: Then you know very little. [laughter]
THE COURT: But we haven't stipulated that we're trying this case in California.
MR. BERNSTEIN: I think in rem is a concept that is used where the thing, the res, is a defendant. I don't think that a res can be a plaintiff. In rem jurisdiction is where you sue a ship or a piece of property. But if a res had standing, then I believe the decisions I cited earlier would not be correct in holding that only human beings have standing.
MS. ROTHBLATT: I think it's a creative avenue to pursue, and if we fail to obtain our preliminary injunction, we'll return to court in rem. [Laughter]
THE COURT: Further questions or observations from our jury? Then I would like to poll the jury. The issue before us is whether the injunction should be granted, under the terms of which the Exabit Corporation would be enjoined from pulling the plug on BINA48. So I'd like to ask the members of our jury to indicate by a show of hands how many would agree that that injunction ought to be entered.
Five. And those opposed? We have five who supported entering the injunction, one opposed, and one abstention.
I think I must say it's a tribute to counsel's skills, because this is a difficult proposition to argue for. And if this were decided by a jury, I think we would have our answer. As I understand my instructions, though, I'm supposed to reach a decision myself. And I find it rather more difficult than I anticipated when I first heard about it a year ago. But I think I would deny the injunction, because I do not think that standing was in fact created by the legislature (whatever legislature we're talking about), and I doubt very much that a court has that authority in the absence of the legislature.
But in the interest of equity, what I would do is stay entry of the order to allow counsel to prepare an appeal to a higher court. And most any court has got to be higher than this one [laughter, applause].
I'd very much like to thank everyone for coming and for making this, I find, a very stimulating afternoon. I want to thank both Marc and Martine for their hard work and very intriguing arguments. I think both did an excellent job. I think Martine gets an A for ingenuity: this is a clever concept, and very well argued. I think Marc did a very nice job of pointing out some of the problems with this approach.
And it does leave us, I think at least, with a rather intriguing question: whether in fact, with the technological progress we've seen in the past and the relentless application of Moore's Law, a day might come when Big Blue will not be one of a kind, and a BINA48 might actually have something approaching human capabilities. Thank you all very much. [Applause]
I certify that the foregoing is a correct transcript from the record of proceedings in the above-entitled matter. Dated at San Francisco, California, this 16th day of September 2003.
Amara D. Angelica