It seems fitting, during the world's incarceration due to a pandemic, to examine more closely post-humanism and the possibility, or probability, that this hand has already been played and we are on replay, existing within an ancestor computer simulation. This Part 3 completes the Introduction of Nick Bostrom's paper, "Are You Living in a Computer Simulation?"
Continuation of Introduction:
Apart from the interest this thesis may hold for those engaged in futuristic speculation, there are also more purely theoretical rewards. The argument provides a stimulus for formulating some methodological and metaphysical questions, and it suggests naturalistic analogies to certain traditional conceptions, which some may find amusing or thought-provoking.
The structure of the paper is as follows.
First, we formulate an assumption that we need to import from the philosophy of mind* in order to get the argument started.
Second, we consider some empirical reasons for thinking that running vastly many simulations of the human mind would be within the capacity of a future civilization that has developed many of those technologies that can already be shown to be compatible with known physical laws and engineering constraints. This part is not philosophically necessary, but it provides an incentive for paying attention to the rest.
Then follows the core of the argument, which makes use of some simple probability theory*, and a section providing support for a weak indifference principle* that the argument employs.
Lastly, we discuss some interpretations of the disjunction, mentioned in the Abstract, that forms the conclusion of the simulation argument.
Here are some definitions of terms to help us along:
- Indifference principle: a rule that tells us how to assign probabilities when we don't have any special knowledge of a situation. The rule states that each possibility should be assigned an equal probability – assuming there is no reason for preferring one over another.
- Probability theory: a branch of mathematics concerned with the analysis of random phenomena. The outcome of a random event cannot be determined before it occurs, but it may be any one of several possible outcomes. The actual outcome is considered to be determined by chance.
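The indifference principle above can be made concrete with a small sketch. The function name and the die example are illustrative choices, not anything from Bostrom's paper; the point is simply that, absent any reason to favor one possibility, each of n possibilities receives probability 1/n:

```python
from fractions import Fraction

def indifference(possibilities):
    """Assign each possibility an equal probability (principle of indifference)."""
    n = len(possibilities)
    return {p: Fraction(1, n) for p in possibilities}

# With no special knowledge, each face of a six-sided die gets 1/6.
probs = indifference([1, 2, 3, 4, 5, 6])
print(probs[1])             # 1/6
print(sum(probs.values()))  # 1
```

Exact fractions are used rather than floats so that the probabilities sum to exactly 1, which is the constraint the principle must satisfy.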
II. THE ASSUMPTION OF SUBSTRATE-INDEPENDENCE
A common assumption in the philosophy of mind is that of substrate*-independence: the idea that mental states can supervene on any of a broad class of physical substrates. Provided a system implements the right sort of computational structures and processes, it can be associated with conscious experiences. It is not an essential property of consciousness that it is implemented on carbon-based biological neural networks inside a cranium: silicon-based processors inside a computer could in principle do the trick as well.
Arguments for this thesis have been given in the literature, and although it is not entirely uncontroversial, we shall here take it as a given.
The argument we shall present does not, however, depend on any very strong version of functionalism or computationalism. For example, we need not assume that the thesis of substrate-independence is necessarily true (either analytically or metaphysically*) – just that, in fact, a computer running a suitable program would be conscious. Moreover, we need not assume that in order to create a mind on a computer it would be sufficient to program it in such a way that it behaves like a human in all situations, including passing the Turing test*, etc. We need only the weaker assumption that it would suffice for the generation of subjective experiences that the computational processes of a human brain are structurally replicated in suitably fine-grained detail, such as on the level of individual synapses. This attenuated version of substrate-independence is quite widely accepted.
Neurotransmitters, nerve growth factors, and other chemicals that are smaller than a synapse clearly play a role in human cognition and learning. The substrate-independence thesis is not that the effects of these chemicals are small or irrelevant, but rather that they affect subjective experience only via their direct or indirect influence on computational activities. For example, if there can be no difference in subjective experience without there also being a difference in synaptic discharges, then the requisite detail of simulation is at the synaptic level (or higher).
- Metaphysical: derived from the Greek ta meta ta physika ("after the things of nature"), referring to an idea, doctrine, or possible reality outside of human sense perception. In modern philosophical terminology, metaphysics refers to the study of what cannot be reached through objective study of material reality.
- Substrate: the surface or material on or from which an organism lives, grows, or obtains its nourishment. Here, the term is used more broadly for the physical medium – biological neurons or silicon processors, for example – on which a computational process runs.
- Turing Test: a method of inquiry in artificial intelligence for determining whether or not a computer is capable of thinking like a human being. The test is named after Alan Turing, the mathematician who proposed it in 1950.
Next, Part 4: the technological limits of computation…