
2002 | Book

The Dynamics of Judicial Proof

Computation, Logic, and Common Sense

Edited by: Professor Marilyn MacCrimmon, Professor Peter Tillers

Publisher: Physica-Verlag HD

Book series: Studies in Fuzziness and Soft Computing


About this book

Fact finding in judicial proceedings is a dynamic process. This collection of papers considers whether computational methods or other formal logical methods developed in disciplines such as artificial intelligence, decision theory, and probability theory can facilitate the study and management of dynamic evidentiary and inferential processes in litigation. The papers gathered here have several epicenters, including (i) the dynamics of judicial proof, (ii) the relationship between artificial intelligence or formal analysis and "common sense," (iii) the logic of factual inference, including (a) the relationship between causality and inference and (b) the relationship between language and factual inference, (iv) the logic of discovery, including the role of abduction and serendipity in the process of investigation and proof of factual matters, and (v) the relationship between decision and inference.

Table of contents

Frontmatter

Introduction

Frontmatter
Making Sense of the Process of Proof in Litigation
Abstract
A new type of scholarship about evidence in litigation first surfaced about thirty years ago.1 By the mid-1980s, this new species of evidence scholarship — the “New Evidence Scholarship”2 (“NES”) — had grown substantially; by then it had become a major international movement. Today, NES is a mature field of scholarship — so much so that the label “new” may now be a misnomer.
Peter Tillers

Common Sense Reasoning

Frontmatter
Artificial Intelligence, Mindreading, and Reasoning in Law
Abstract
One aspect of legal reasoning is the act of working out another party’s mental states (their beliefs, intentions, etc.) and assessing how their reasoning proceeds given various conditions. This process of “mindreading” would ideally be achievable by means of a strict system of rules allowing us, in a neat and logical way, to determine what is going on, or what will go on, in another party’s mind. We argue, however, that commonsense reasoning, and mindreading in particular, are not adequately described in this way: they involve features of uncertainty, defeasibility, vagueness, and even inconsistency that are not characteristic of an adequate formal system. We contend that mindreading is achieved, at least in part, through “mental simulation,” involving, in addition, nested levels of uncertainty and defeasibility. In this way, one party temporarily puts himself or herself in the other party’s shoes, without relying wholly on a neat and explicit system of rules.
John A. Barnden, Donald M. Peterson
Common Sense, Rationality and the Legal Process
Abstract
Our organizer and moderator, Peter Tillers, asked that I address the topic of common sense, rationality, and the legal process. As you know, everything in the law is controversial, and thus it is not surprising that there are at least two possible explanations for why he did this. One school of thought holds that Professor Tillers believes that I am the walking embodiment of common sense and rationality, and thus that my remarks here, whatever the content, would be an exemplar of just what he asked me to talk about. The other school of thought holds just the opposite: that the reason for the request was that the probability of my saying anything commonsensical, indeed, maybe even coherent, was just about zero. This state of affairs may seem problematic, but in fact it is a testament to the deep insight of our esteemed organizer (this being perhaps the only true proposition in this entire string of sounds that I am now emitting!), because in either case these remarks may be an exemplar, even though there may be disagreement as to what they are exemplifying. Leave it to Professor Tillers to detect Chomsky-like deep structures to our inquiries and to find an efficient way to put the matter before the assembly. By the way, although I certainly would not suggest that I know which of these schools is correct, I do think there is some good evidence before us. Who in his right mind, in the midst of the intellectual feast that we are consuming, would agree to give any remarks that would disturb the consumption of the well-deserved nutritional feast before you, the likely result of which is surely to be either intellectual or physiological indigestion, or both? And while I am on the topic of people being in a tough spot, we should all extend our deepest sympathies to Professor MacCrimmon who will soon be called upon to try to make some sense of all this. Heroic duty if ever there were any.
Ronald J. Allen
What Is “Common” about Common Sense? Cautionary Tales for Travelers Crossing Disciplinary Boundaries
Abstract
Understanding the process of proof in judicial decision making necessarily involves an appreciation of the operation of commonsense knowledge and reasoning in fact determination. Throughout the history of Western civilization, commentators have recognized the crucial role of common sense in our understanding of the world.3 The central role of common sense is further highlighted by the attempts of artificial intelligence (“AI”) to build machines that can see, move, and act. These efforts confirm that we cannot navigate through life without tacitly drawing upon common sense. Steven Pinker illustrates:
You know when Irving puts the dog in the car, it is no longer in the yard. When Edna goes to church, her head goes with her. If Doug is in the house, he must have gone in through some opening unless he was born there and never left. If Sheila is alive at 9 A.M. and is alive at 5 P.M., she was also alive at noon. Zebras in the wild never wear underwear.4
Marilyn MacCrimmon

Fuzzy and Rough Logic

Frontmatter
From Computing with Numbers to Computing with Words: From Manipulation of Measurements to Manipulation of Perceptions
Abstract
Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language, e.g., small, large, far, heavy, not very likely, the price of gas is low and declining, Berkeley is near San Francisco, it is very unlikely that there will be a significant increase in the price of oil in the near future, etc. Computing with words is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Familiar examples of such tasks are parking a car, driving in heavy traffic, playing golf, riding a bicycle, understanding speech and summarizing a story. Underlying this remarkable capability is the brain’s crucial ability to manipulate perceptions — perceptions of distance, size, weight, color, speed, time, direction, force, number, truth, likelihood and other characteristics of physical and mental objects. Manipulation of perceptions plays a key role in human recognition, decision and execution processes. As a methodology, computing with words provides a foundation for a computational theory of perceptions — a theory which may have an important bearing on how humans make — and machines might make — perception-based rational decisions in an environment of imprecision, uncertainty and partial truth.
Lotfi A. Zadeh
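The abstract contains no code, but the core move of grading a word by a fuzzy membership function is easy to sketch. The following Python fragment is a toy illustration only; the function shape, cutoffs, and distances are invented here, not drawn from Zadeh's text.

```python
# Toy "computing with words" sketch: the word "near" becomes a fuzzy
# membership function returning a degree of compatibility in [0, 1].
# The trapezoidal shape and the 20 km / 100 km cutoffs are invented.

def near(distance_km: float) -> float:
    """Degree to which a distance counts as 'near'."""
    if distance_km <= 20:
        return 1.0
    if distance_km >= 100:
        return 0.0
    return (100 - distance_km) / 80  # linear falloff between the cutoffs

# "Berkeley is near San Francisco" (roughly 20 km apart):
print(near(20))   # 1.0 -> fully compatible with "near"
print(near(60))   # 0.5 -> partially "near"
print(near(120))  # 0.0 -> not "near"
```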
Fuzzy Logic and Its Application to Legal Reasoning — A Comment to Professor Zadeh
Abstract
Fuzzy logic, introduced by Professor Lotfi Zadeh more than thirty-five years ago, has become an established analytical tool in a variety of disciplines, including the study of legal reasoning. Fuzziness supplies a useful instrument for modeling various modes of uncertainty and vagueness not dealt with by traditional probability calculus, which relies on the age-old structure of two-valued logic. The latter typically handles the uncertainty emanating from lack of information about the subject of interest, but has proved a poor method for modeling semantic vagueness. Fuzzy set theory,1 and the logic constructed on its foundations, were designed to remedy these shortcomings.
Ron A. Shapira
A Primer on Rough Sets: A New Approach to Drawing Conclusions from Data
Abstract
Rough set theory is a new mathematical approach to the analysis of vague and uncertain data. This Article explains the basic concepts of the theory through a simple tutorial example and briefly outlines the application of the method to drawing conclusions from factual data. The approach presented here can be used in some kinds of legal reasoning.
Zdzislaw Pawlak
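As a rough illustration of the tutorial idea (the toy data below are invented, not the example in the chapter), the lower and upper approximations can be computed directly from the equivalence classes of objects that share the same attribute values:

```python
# Rough-set sketch: objects indistinguishable by their attributes form
# equivalence classes; a target concept is approximated from below (classes
# wholly inside it) and from above (classes that merely touch it).

objects = {  # object -> attribute values; data invented for illustration
    "p1": ("high", "yes"), "p2": ("high", "yes"),
    "p3": ("low", "no"),   "p4": ("high", "no"),
    "p5": ("low", "no"),
}

classes = {}
for obj, attrs in objects.items():
    classes.setdefault(attrs, set()).add(obj)

target = {"p1", "p2", "p3"}  # the concept to be approximated

lower = {o for c in classes.values() if c <= target for o in c}
upper = {o for c in classes.values() if c & target for o in c}

print(sorted(lower))          # ['p1', 'p2']: certainly in the concept
print(sorted(upper))          # ['p1', 'p2', 'p3', 'p5']: possibly in it
print(sorted(upper - lower))  # ['p3', 'p5']: the boundary (the "rough" part)
```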

The Structure of Factual Inference in Judicial Settings

Frontmatter
Alternative Views of Argument Construction from a Mass of Evidence
Abstract
Work is now progressing on attempts to close a very important technology gap. We are still far better at gathering, transmitting, storing, and retrieving information than we are at drawing defensible conclusions from this information. This gap is as apparent in law as in any other context in which conclusions are reached based on masses of evidence. Some of the work being done to close this gap involves study of what are now called “inference networks.” Such networks are representations for complex probabilistic reasoning tasks often based on masses of evidence. However, this approach is not at all new. Some of it dates from the early twentieth century in the work of John H. Wigmore, whose name is certainly well known among legal scholars and practitioners on this side of the Atlantic. It happens that Wigmore was the very first person to study complex inference networks in any systematic way. Until quite recently, Wigmore’s work on inference networks was not taken seriously within his own field of law, and it has been almost entirely unnoticed by persons in other disciplines in which there is equal concern about complex probabilistic reasoning tasks based on masses of evidence.
David A. Schum
Explaining Relevance
Abstract
It is currently possible to build computational models of single-case legal evidential reasoning, and many software packages are available on the market that are suitable for the task.1 It is true that every legal case is different, but we can reasonably hope to find general abstract structures, “consolidative models,”2 that can be applied to different cases whenever the same kind of evidence is dealt with.
Paolo Garbolino
Theories of Uncertainty: Explaining the Possible Sources of Error in Inferences
Abstract
A central task in legal factfinding is evaluating the warrant for a finding or the soundness of an inference from the evidentiary propositions to a conclusion. This task is especially difficult when there is much at stake, but the evidence is incomplete and the soundness of the inference is uncertain. Analyses of how to improve such inferences have been made at various levels of generality, and for different types of evidence. For example, one general problem is distinguishing “scientific knowledge” from “junk science,” as required for admissibility in judicial proceedings under Federal Rule of Evidence 702, following the Daubert v. Merrell Dow Pharmaceuticals, Inc.1 decision.2 Another general problem is evaluating inferences about unique historical events, the kind of factfinding necessary in criminal cases.3 As opposed to such general problems, some theorists address only particular areas where inferences are difficult in law, such as the “lost chance” cases,4 cases involving “indeterminate plaintiffs,”5 inferences from “naked statistical evidence,”6 or inferences based on DNA identification.7 Such problems of correct inference cannot be solved purely by formal logic, nor can theorists merely duplicate the role of the factfinder by evaluating the specific evidence in a particular case. To be useful as theories of inference, accounts can be neither too general nor too specific. They must provide useful models for handling recurring types of inference in situations where findings must be warranted by incomplete evidence.8
Vern R. Walker
Models of Data Generation vs. Models of Events that Generate Data
Abstract
Let me begin by describing a very general model of a data-generating process. An experimental psychologist studying human decision making decides to obtain data bearing on some hypothesis or other. To record these data, he prepares a blank contingency table with its compartment-defining boundaries and internal subdivisions in place, but with no observations recorded. As data collection proceeds, the cells of the table provide loci for recording the data. By counting the tallies, he can assess numbers that seem to be useful estimators of various probabilities, and ratios of such numbers that are often even more useful. I shall call this blank table the “statistician’s model of data.” The underlying notion is that a data-generating process produces the observations, and we use those observations to infer properties of that process. This model is as appropriate to what Bayesian statisticians do as to what classical ones do. Nothing important changes if we deal with continuous rather than discrete observations — measures rather than counts.
Ward Edwards
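A worked toy version of that blank table, with invented tallies: once the cells are filled, relative frequencies estimate probabilities, and ratios of such numbers (here a likelihood ratio) are often the more useful quantities, as the abstract notes.

```python
# Contingency-table sketch: tallies for hypothesis H against datum D.
# All counts are invented for illustration.

counts = {
    ("H", "D"): 30, ("H", "not D"): 10,
    ("not H", "D"): 15, ("not H", "not D"): 45,
}
n = sum(counts.values())  # 100 observations in total

p_H_and_D = counts[("H", "D")] / n  # joint relative frequency: 0.30
p_D_given_H = counts[("H", "D")] / (counts[("H", "D")] + counts[("H", "not D")])                  # 0.75
p_D_given_not_H = counts[("not H", "D")] / (counts[("not H", "D")] + counts[("not H", "not D")])  # 0.25

likelihood_ratio = p_D_given_H / p_D_given_not_H
print(likelihood_ratio)  # 3.0: observing D favors H three to one
```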

Dynamic Inference and Choice in Dynamic Environments

Frontmatter
Action and Procedure in Reasoning
Abstract
Meaningful comparisons between “logic” and “legal reasoning” must evolve with their relata. In this Article, I explain some basics of “logical dynamics,” a current procedure-oriented view of reasoning and other cognitive tasks, using games as a model for many-agent interaction. Against this background, I speculate about possible new connections between logical dynamics and legal reasoning.
Johan van Benthem
Decision Analysis and Law
Abstract
In the context of this conference, I have been asked to comment briefly on the role that decision analysis might play in the various problems faced by the legal profession. While this topic could probably fill several books, I think a few remarks here may be helpful.
Ronald A. Howard

Abductive Inference

Frontmatter
Serendipity and Abduction in Proofs, Presumptions and Emerging Laws
Abstract
Serendipity, in science, is the ability to discover, invent, create, or imagine a finding — a hypothesis, an explanation, a rule, a theory, a law — without deliberately having looked for it. This aptitude involves an ability to give a justifiable interpretation of unexpected, incomprehensible, or unqualifiable facts within a given reference system. There are numerous examples of serendipity, not only in science, but also in technology and art. Such fortuitous findings are generally thought to be the result of a “chance observation” or the result of emerging norms that may evolve in a more or less chaotic way through an underlying principle that can be discovered through a meticulous interpretation of the data.
Pek van Andel, Danièle Bourcier
On the Proof Dynamics of Inference to the Best Explanation
Abstract
“Inference to the best explanation” — here called “abduction” — is a distinctive and recognizable pattern of evidential reasoning. It is ubiquitous at or near the surface of typical arguments offered in judicial and scientific contexts, and in ordinary life. It is part of “commonsense logic.” An abductive argument is open to attack in characteristic ways, and may be defended in characteristic ways by supporting arguments. Abductive arguments are fallible, but there are only a small number of ways in which they can go wrong. This analysis provides a framework for justification, criticism, and dialogue concerning the evaluation of evidence. It should be helpful in the law of evidence and in the training of investigators.
It is useful to distinguish among the primary meanings of “abduction.” It may refer to a static pattern of argumentation or justification, or to the dynamic inferential processes whereby explanatory hypotheses are generated and evaluated, or to the processes of constructing arguments, including abductive arguments. It is also useful to recognize that best explanations are almost always composite hypotheses or “theories” composed of many parts.
It is desirable to adopt reasoning strategies that produce conclusions that can in principle be justified. Even better are reasoning strategies in which the confidence in a conclusion arrives bearing a structure of argumentation that can be critically examined, and in which little or no additional work is required to extract, organize, or economize arguments. Reasoning strategies for composing, criticizing, and revising hypotheses, and for justifying conclusions, can be investigated scientifically by implementing such strategies in software and testing their performance.
This essay describes a specific reasoning strategy (simplified and idealized from natural reasoning) in which each part of a conclusion formed by using the strategy has a structure of justifications that is strong, but fallible. This strategy has been implemented and performs well in testing. Some possible lessons are drawn for human theory formation, judicial argumentation, and the conduct of investigations. Finally, a definition is suggested for the evidentiary standard "beyond a reasonable doubt."
John R. Josephson
Species of Abductive Reasoning in Fact Investigation in Law
Abstract
Imaginative reasoning is as vital in law as it is in any other discipline. During fact investigation, hypotheses, in the form of possible charges or complaints, must be generated or discovered, as well as evidence bearing on these hypotheses. During the later process of proof, arguments in defense of the relevance, credibility, and probative force of offered evidence on hypotheses must also be generated. In no context known to me are hypotheses, evidence, and arguments linking them supplied at the outset for investigators and attorneys. These ingredients must be generated by imaginative or creative thinking. How we are able to generate new ideas has been an object of study for millennia. In spite of this, our imaginative and creative reasoning abilities are not well understood. There is considerable debate about the forms of reasoning that take place as we generate new ideas and evidential tests of them. This Article concerns a form of reasoning called “abduction,” which was suggested over a century ago by the American philosopher Charles S. Peirce as a reasoning mechanism underlying imaginative and creative thought. Most of us hear about two forms of reasoning: (1) deduction, showing that something is necessarily true, and (2) induction, showing that something is probably true. There is reason to believe that new ideas, in the form of hypotheses, may not be generated by induction or deduction. In this Article I will suggest that there are several species of abductive reasoning by which we show that something is possibly or plausibly true. I relate these species of abduction to intellectual tasks performed by investigators during fact investigation and to those performed by advocates during the process of proof. Consideration of these species of abductive reasoning exposes the richness of this important discovery-related activity.
David A. Schum
Abductive Reasoning in Law: Taxonomy and Inference to the Best Explanation
Abstract
Following the positivistic philosophy of Karl Popper and Hans Reichenbach,1 many traditionalist thinkers on rational proof in law still assume a sharp distinction between “the context of discovery” and “the context of justification.” These traditionalists regard the context of justification as the proper province of legal reasoning. Justification deals with the analysis and appraisal of decisions, judgments, arguments, and verdicts once they are already “on the table.” Thus, questions about the rational adequacy of a judge’s verdict, or about a police decision to charge a suspect, or about the viability of a case that the District Attorney chooses to prosecute are all important to traditional theories. However, questions about discovery2 play little or no role in many accounts of evidential reasoning in law. The traditionalist does not claim that discovery and imagination are unimportant. Rather, her claim is that a theory of legal reasoning should concern itself only with the logic of rational arguments. Imagination and discovery should be left to the psychologist. The legal theorist Neil MacCormick states this traditional stance very succinctly: “[I]n relation to legal reasoning, the process which is worth studying is the process of argumentation as a process of justification.”3
Kola Abimbola

From Theory to Practice: “Intelligent” Procedures for Drawing Inferences in Static and Dynamic Legal Environments

Frontmatter
Computational Inference for Evidential Reasoning in Support of Judicial Proof
Abstract
The process of judicial proof accrues evidence to confirm or deny hypotheses about world events relevant to a legal case. Software applications that seek to support this process must provide the user with sophisticated capabilities to manipulate evidential reasoning for legal cases. This requires computational techniques to represent the actors, entities, events, and context of world situations, to structure alternative hypotheses interpreting the evidence, and to execute processes that draw inferences about the truth of hypotheses by assessing the relevance and weight of evidence to confirm or deny those hypotheses. Bayesian inference networks are combined with knowledge representations from artificial intelligence to structure and analyze evidential argumentation. The infamous 1994 Raddad murder trial in Nice, France, provides a backdrop against which we illustrate the application of these techniques to evidential reasoning in support of judicial proof.
Tod S. Levitt, Kathryn Blackmond Laskey
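The chapter's machinery is far richer than this, but the basic update a Bayesian inference network performs at each node can be sketched in a few lines of Python. The priors and likelihoods below are invented and have no connection to the facts of the Raddad case.

```python
# Two-node Bayesian update: hypothesis node -> evidence node.
# Numbers are invented for illustration.

prior = {"guilty": 0.5, "not_guilty": 0.5}
likelihood = {"guilty": 0.8, "not_guilty": 0.2}  # P(evidence | hypothesis)

def posterior(prior, likelihood):
    joint = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(joint.values())  # normalizing constant, P(evidence)
    return {h: p / z for h, p in joint.items()}

print(posterior(prior, likelihood))
# {'guilty': 0.8, 'not_guilty': 0.2}: the weight of the evidence shifts belief
```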
Logical Argumentation, Abduction and Bayesian Decision Theory: A Bayesian Approach to Logical Arguments and Its Application to Legal Evidential Reasoning
Abstract
There are good normative arguments for using Bayesian decision theory for deciding what to do. However, there are also good arguments for using logic where we want formal semantics for a language, and where we want to use the structure of logical argumentation with logical variables to represent multiple individuals (things). This Article shows how decision theory and logical argumentation can be combined into a coherent framework. The Independent Choice Logic (“ICL”) can be viewed as a first-order representation of belief networks with conditional probability tables represented as first-order rules, or as an abductive/argument-based logic with probabilities over assumables. Intuitively, we can use the logic to model causally (in terms of logic programs with assumables). By abducing explanations of the evidence, we can predict what follows from those explanations. As well as abduction to the best explanation(s), from which we can bound probabilities, we can also do marginalization to reduce the detail of arguments. Tillers’ example of judicial proof in the quandary of Able Attorney is used to show how the framework could be used for legal reasoning. The code to run this example is available from the author’s website.
David Poole
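A toy rendering of one idea from the abstract, probabilities over assumables: this is not Poole's ICL implementation, and the rules, assumables, and numbers are all invented for the sketch.

```python
# Abduction sketch: explanations of a query are sets of independent
# assumables; each explanation's probability is the product of its parts.

from math import prod

p_assumable = {"motive": 0.3, "opportunity": 0.6, "alibi_fabricated": 0.1}

explanations = [  # minimal explanations of the query "suspect_guilty"
    {"motive", "opportunity"},
    {"alibi_fabricated", "opportunity"},
]

def explanation_prob(expl):
    return prod(p_assumable[a] for a in expl)  # independence of assumables

for e in explanations:
    print(sorted(e), explanation_prob(e))
# ['motive', 'opportunity'] 0.18
# ['alibi_fabricated', 'opportunity'] 0.06

# The explanations overlap (both require "opportunity"), so the exact query
# probability uses inclusion-exclusion: 0.18 + 0.06 - 0.3*0.1*0.6 = 0.222.
```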
Structured Deliberation for Dynamic Uncertain Inference
Abstract
Dynamic uncertain inference is the formation of opinions based upon evidence or argument whose availability is neither disclosed to the analyst in advance nor disclosed all at once. Normative accounts of belief change, which work well when the analyst has prior notice of well-designed experiments and their possible outcomes, may not be applicable to less tidy occasions of inference. In addition, there is the clerical challenge of keeping track of what has been observed, what relates to what, and how. This Article begins with a discussion of subjective valuation in general. An approach to deliberation, similar to what is practiced in the multiattribute utility modeling community, is then suggested for dynamic credibility assessment. Features of the proposed technique are explained through their application to a celebrated French murder investigation. The method presented here may be reconciled with Bayesian belief models by noting that the latter lack a consensus view of how stable beliefs form in the first place. Thus, the ideas discussed here may be taken as an account of original belief formation, and so complementary rather than antagonistic to subjective probability methods.
Paul Snow, Marianne Belis
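The flavor of the multiattribute style mentioned in the abstract can be suggested with a weighted additive score. The attributes, weights, and scores below are invented; the chapter's deliberation technique is more elaborate than this.

```python
# Toy multiattribute credibility score: each attribute of a witness's account
# is scored on [0, 1] and combined by fixed importance weights (invented).

weights = {"consistency": 0.40, "corroboration": 0.35, "demeanor": 0.25}

def credibility(scores):
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(weights[a] * scores[a] for a in weights)

witness = {"consistency": 0.9, "corroboration": 0.5, "demeanor": 0.7}
print(credibility(witness))  # ~0.71
```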

Judicial Proof and Economic Rationality

Frontmatter
Saving Desdemona
Abstract
Both Richard Posner, in his recent article An Economic Approach to the Law of Evidence,1 and Craig Callen, in Costs, Benefits, Hearsay and Evidence,2 stress a possible important justification for admissibility rules (and hearsay rules in particular): their relationship to the efficiency of interpersonal transactions. Posner endeavors to demonstrate a partial correspondence between the law of evidence and economic efficiency. Many rules of evidence, he asserts, are meant to repair identifiable market failures typical of the adversarial system, where the process of searching for information relevant to the decision is largely privatized. Considering the aggregate social value of a correct judgment on the one hand, and the costs of processing information on the other, the parties in an adversarial system may expend too many resources in gathering and presenting evidence in some cases, and too few in others. The law of evidence is therefore called upon to decrease incentives for excessive search of evidence in the former cases, and to increase such incentives in the latter. Callen corroborates and extends Posner’s insight by using findings of the cognitive sciences and studying the efficient exploitation of limited cognitive resources.
Ron A. Shapira
Othello Could Not Optimize: Economics, Hearsay, and Less Adversary Systems
Abstract
A symposium about the relationship of artificial intelligence and evidence law is at the cutting edge of scholarship on the theory of evidence law; bringing economics into it might seem to be too much of a good thing. Economic models are relevant here because, like computer programs, they depend on formal logic. Overreliance on formal logic, and on a concomitant unrealistic assumption about our cognitive capacity, may lead formal models — whether computer programs, equations, or economic analyses — to fail to capture important aspects of human intelligence, and, accordingly, to mislead those who rely on them for an understanding of how humans can, or should, resolve issues of fact.
Craig R. Callen

Causality

Frontmatter
Causality and Responsibility
Abstract
Attribution of responsibility, whether as blame or praise, always rests on a claim about causation. In order to be responsible for an event, a person must take an action that causes the event. This creates a philosophical puzzle. As philosophers, we tend to be skeptical about claims of causal knowledge. No matter how thoroughly we have studied a phenomenon, we say, we may not have gotten to the bottom of it and really understood its causal structure. But as judges and juries, we are prepared to conclude that a certain action has caused a certain result. How can we reconcile our philosophical and scientific skepticism with our practical credulity?
Glenn Shafer
Liability for Increased Risk of Harm: A Lawyer’s Response to Professor Shafer
Abstract
Tort doctrine, which insists on proof of causation by a preponderance of the evidence, frustrates two of tort law’s principal objectives — deterrence of harmful behavior and the facilitation of corrective justice — when applied to cases in which causation is extraordinarily difficult, if not impossible, to determine. Causation problems are particularly complex in cases where plaintiffs allege that their injuries result from exposure to drugs or other chemicals. Professor Shafer’s suggestion, which advocates allowing such plaintiffs to recover simply on a showing of increased risk of injury, is a provocative attempt to correct the inadequacies posed by current doctrine, and is an inspiring starting point for rethinking whether and to what extent tort doctrine must change.
Melanie B. Leslie
Metadata
Title
The Dynamics of Judicial Proof
Edited by
Professor Marilyn MacCrimmon
Professor Peter Tillers
Copyright year
2002
Publisher
Physica-Verlag HD
Electronic ISBN
978-3-7908-1792-8
Print ISBN
978-3-662-00323-7
DOI
https://doi.org/10.1007/978-3-7908-1792-8