A Metaphysics Of Distinction, Performance And Practice
Michael R Finch
This paper won me the Fellowship of the International Society For Philosophers. It is a longer, more academic and philosophically tighter essay than the three above, but is based on them.

Contents

00 - Introduction
01 - Logic
02 - Reason
03 - No Reason
04 - Plurality
05 - Memes
06 - Metaphors
07 - Categories
08 - Relativism
09 - Nothing
10 - Unsaying
11 - Self-Reference and Contradiction
12 - Everything
13 - Two Realities
14 - Nowhere and Nothing
15 - The Practice
16 - Thinking Bodily
17 - Beyond The Self
Acknowledgements
Notes

 

0 - Introduction

No thinking person can escape metaphysics. It is the 'first philosophy' (Aristotle) which attempts to find the 'first principles' (Plato) of what is, including the question 'what is?' itself, and the questioner, and who the questioner really is...etc. It is to follow the seemingly endless chain of 'why?' questions to the end; or if there is no end, to understand and assimilate that fact. It is an attempt to cast the net of human understanding as wide as possible and as all-encompassing as possible.

Metaphysics includes all other inquiries, in the sense that at the edge of your chosen area of inquiry there are questions about what lies beyond, and about the context in which your chosen inquiry exists and has purpose. Metaphysics is the only area of inquiry which you cannot step outside of in this way, since it is an attempt to understand everything as a coherent whole. If you consider that to be an overly ambitious task, that is itself a metaphysical position.

To deride or ignore metaphysics, then, is simply to abort this quest for the ultimate explanation. This may well be justified, and any ultimate explanation may be unattainable, but then the 'why?' questions must continue a little further, since you then need to say why the ultimate is unattainable - which again is metaphysics.

Since I find I cannot escape metaphysics, I have decided to embrace it. I thus outline in this paper an attempt to place the human situation (to be specific, my situation) in as wide and all-inclusive a context as possible.

 

1 - Logic

I suggest that the western rational endeavour over the last 2500 years is driven by, and has at its heart, the human ability to create distinctions. Once we can divide our domain of discourse into distinct entities, then we can reason and think logically about it. And without dividing our subject matter into distinct entities, we cannot think about it, at least not logically.

We think of an entity X as being 'distinct', or distinguishable from other entities in its domain, if at least the following hold in the appropriate time period:

1) X is X
2) X is not not X
3) An entity is either X or not X.

If X is a proposition, then these three become the 'laws of thought' upon which classical logic is based - the laws of identity (if X then X), non-contradiction (X cannot be true and false), and the excluded middle (X is true or false)[1]. But X can be a part of any domain that one wishes to think logically about: X can also be a predicate or property, an element of a set (Zermelo-Fraenkel set theory[2]), an everyday physical object, a thought, an idea, a concept or even a feeling. Note that the distinction must hold over a suitable time period, or timelessly in the case of formal or abstract entities.
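In standard propositional notation, the three laws read:

    X \to X \quad \text{(identity)}, \qquad \neg(X \land \neg X) \quad \text{(non-contradiction)}, \qquad X \lor \neg X \quad \text{(excluded middle)}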

In this paper I will use 'logic' to denote thinking in such a way that the objects of thought are distinct in the above sense - often called classical logic, Aristotelian logic or commonsense logic. Lakoff and Johnson[3] call it 'container logic', suggesting that we arrive at these ideas using the metaphor of placing things in a container. If entity X is in container Y which is in container Z then X is also in container Z. This idea can be formalized in Johnston diagrams (formally equivalent to propositional logic) or Euler or Venn diagrams (set theory). But the root idea is still one of distinction - X is either in a container or not, there is no third alternative (in Latin tertium non datur, a third option is not given, the classical name for the excluded middle).
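As a minimal sketch (the sets and their members are my own invented illustration, not from the argument), container logic can be played out with Python sets: membership is bivalent, and containment is transitive.

    # Container logic with Python sets: X in Y, Y in Z, therefore X in Z.
    X = {"robin"}
    Y = X | {"eagle"}             # container Y holds everything in X
    Z = Y | {"ostrich"}           # container Z holds everything in Y
    assert X <= Y and Y <= Z      # X inside Y, Y inside Z
    assert X <= Z                 # hence X inside Z
    print("robin" in Z)           # True  - it is in the container
    print("penguin" in Z)         # False - or it is not: tertium non datur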

There are many logics which do not accept these laws of thought (often called, rather dismissively, 'deviant' logics, or extended logics[4]). They usually do not accept (3), the principle of bivalence, and are called many-valued logics, having three or more truth values - or even an infinite number of truth values, as in fuzzy logic and fuzzy set theory.[5] Other such logics do not accept (2) and hold that some contradictions are true (called 'dialetheic' logics[6]). And there are many others - fractal logic, quantum logic, intuitionistic logic, 'included middle' logic, modal logic, doxastic logic, deontic logic, paraconsistent logic, anti-realist logic - to mention just a few.
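A small sketch of what an infinity of truth values looks like, using the standard Zadeh operators of fuzzy logic (the 0.4 figure is my own illustration): truth ranges over [0, 1] rather than the two values of bivalence.

    # Fuzzy connectives: negation, conjunction (min), disjunction (max).
    def f_not(a): return 1.0 - a
    def f_and(a, b): return min(a, b)
    def f_or(a, b): return max(a, b)

    raining = 0.4  # 'it is raining' taken as 0.4 true (a mild drizzle)
    print(f_or(raining, f_not(raining)))   # 0.6, not 1: excluded middle fails
    print(f_and(raining, f_not(raining)))  # 0.4, not 0: non-contradiction fails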

Most of these other logics are trying to deal with the fact that the world we live in does not come in distinct, separate and delimited things already packaged for us to think about. I discuss this further below, but it seems that classical commonsense logic that obeys the laws of thought is privileged - it is certainly commonsensical. To even think about all these other logics you need to think using classical logic; it is a basic human tool - even for our distant ancestor, the bead was either in the basket or it was not. The act of making a distinction, seeing figure and ground - whether drawing a line in the sand, seeing an apple in your hand, using numbers, doing set theory, or defining your concepts - is a basic human act[7].

Whether distinction exists out there in the world, and we just recognise it, or whether it is something we impose on the world, or a mixture of both, I maintain that the binary view is fundamental. Although we might want to unify the diversity we see around us, and posit a unity accordingly - a God, ultimate reality, Being, the One, the Good - which can explain everything, it seems to me a more natural place to start is 'two' rather than 'one'. This principle of drawing a boundary, of creating two, a distinction, a binary view, will be a recurring theme throughout this paper.

So I apply the bivalence of classical logic to all logics, arriving at classical logic and the laws of thought on the one hand, and all the logics which do not obey those laws on the other. As has been wittily remarked, you either accept the law of the excluded middle, or you don't[8].

The first specific formulation of distinction as non-contradiction is attributed to Parmenides: 'Never will this prevail, that what is not, is'. But as the basic principle of clear thinking, it is Plato and Aristotle who argue in many places for such laws of thought[9]. There is no question that as a mode of thinking, commonsense logic is immensely powerful. Humans must have instinctively recognised this since the dawn of self-consciousness, but its power only became obvious, and was only harnessed, when the Greek philosophers made the laws of thought explicit and showed how they could be used. The rest is history, one might say.

 

2 - Reason

In trying to apply commonsense logic to everyday events, the entities we deal with or think about are often not distinct. We meet apparent contradiction, in the logical sense, many times a day.

A common example in many logic books is, for some reason, the statement 'Either it is raining or it is not' (at the same place and time). But in practice, if I am asked 'is it raining?' it is usually because it is not clear if it is raining or not, and I usually express this by answering with a contradiction 'it is raining and it isn't (at this same time and place)', and the person I am speaking to understands my answer.

Because logical thought is so powerful, one would like to be able to dissolve this apparent contradiction and regain a domain of distinct entities about which one can again think logically. In this case, it is easy to do, since we then say that the definition of 'it is raining' is not sharp enough, and when I say 'it is and it isn't raining' all I am really saying is that I am not clear whether the mild drizzle is to be defined as rain or not. Once I give the concept of 'rain' a crisp boundary then it either is, or it is not, raining, and I can no longer maintain that it both is and is not raining.
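A toy sketch of this 'definition' strategy: impose a crisp, arbitrary threshold (the 0.1 mm/h figure is invented purely for illustration), and 'it is raining' becomes bivalent again.

    # A crisp definition dissolves 'it is raining and it isn't'.
    def is_raining(precip_mm_per_hour: float) -> bool:
        THRESHOLD = 0.1  # below this, drizzle is defined as 'not rain'
        return precip_mm_per_hour >= THRESHOLD

    print(is_raining(0.05))  # False - the drizzle case is now decided
    print(is_raining(2.0))   # True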

I will define 'reason', or being rational, as the process of dividing up the domain of discourse into distinctions so that I can talk or think logically about it. Typically humans use three strategies for doing this:

-- analysis, or taking things apart until we find constituents that we can take to be distinct;
-- abstraction, or omitting or ignoring the messiness of the world so that we can replace things that are not distinct (or that we cannot see as distinct) with distinct concepts;
-- definition, where we arbitrarily impose distinction by our choice of terms.

'Life is both a joy and a sorrow' is not logical and hence not rational. But I can analyze, abstract or define this statement in many different ways to bring it into the domain of logic with distinct entities. That is itself rationality, and it allows me to reason further using the distinct entities I have chosen to analyze, abstract or define 'life' into. There is of course considerable latitude and skill in this process.

Aristotle is often considered the originator of logic, but he was much more than that. He also showed that for analysis to succeed the distinctions must be made 'in the same respect' and 'at the same time'[10], and he applied his new skill to form concepts and classifications in many areas, particularly biology, to which he could then apply logic. Since there is often a large number of ways in which division or distinction can be made, reasoning is itself not a logical process, and can almost be considered an art form - hence the phrase the 'art of reason'.

In practice, there is a wide spectrum of senses in which 'reason' is used. At the most restrictive end, reason is considered almost synonymous with logic. At the other end of the spectrum is perhaps the ancient Greek concept of reason, which Nettleship defines as 'that in man which enables him to live for something'[11]. It is what allows a person to identify and set appropriate goals, and then focus their life on achieving them. Reason has also been defined as that which makes human beings unique, the 'rational animal'; or alternatively, as the ability to talk to ourselves in our head (a definition I rather like).

As I say, I will use 'reason' somewhere in the mid-range of this spectrum of meanings, which I think is how the concept is normally used - not as synonymous with logic, but as dividing or imposing distinction on what I am thinking about so that I can think about it logically. Interestingly, I am doing this to 'reason' itself: I am giving the term 'reason' or 'rationality' a precise definition, so that I can think logically and reason about it.

Note that reason is not merely the creating of distinctions in order to use logic. It is doing so skillfully, in order to move towards meaningful results in the domain of interest, considering context and integrating fundamentals, which is why reason is an art and not a mechanical skill. Certainly there is a wide scope in many domains for how the distinctions are initially drawn, which includes the first principles required for any logical enterprise. Again, it was Aristotle who first spelt this out: "It is not everything that can be proved, otherwise the chain of proof would be endless. You must begin somewhere, and you begin with things admitted but undemonstrable."[12]

Reason is sometimes thought of, or even defined, as general and objective, non-local and non-relative. As Nagel puts it: "To reason is to think systematically in ways anyone looking over my shoulder ought to be able to recognise as correct."[13] I think this follows from my definition. I would put it this way: Surely the drawing of distinctions in this world (physical and mental) is universal and general. In that sense classical logic is universal and general, and also objective since the mental manipulation of such delimited entities is universal, and as objects of thought they are manipulated in ways that are agreed upon everywhere as correct (everyone understands the logic of containers). Reason as I define it, the skillful drawing of such distinctions, is also I believe a general human ability. But how one creates the distinctions in the first place, that is not general and often not objective either. I will say more on this below.

Reason is often contrasted with faith. Faith too makes distinctions, and argues logically with them. I would characterise the difference as being one of degree only - that faith 'gives up', as it were, too early. Faith posits starting principles arbitrarily (whether assumed by society, written in a holy book, passed down from a visionary, or merely tradition). Reason, on the other hand, is always pushing for further analysis or abstraction, not accepting the holy book or traditional story on those grounds alone, but testing whether further distinctions and entities can be rationally justified. This process probably has to come to a stop somewhere, and indeed it is one of the themes of this paper that it does. But it does so only after it is clear that all distinguishing must come to a halt, and that halting is itself a rational process, and indeed a rational necessity.

In that sense, if there are limits to reason (and I believe that there are, as I hope to show below), they can be approached from within reason itself. We are, after all, rational animals.

 

3 - No Reason

Reason has been hugely successful in increasing our understanding of the natural world, and much else besides. Mathematics, science, technology are all the result of the skillful use of reason. Even the other pillar of science, dispassionate observation, is in a sense a part of reason, as it is a method of dividing the confused mass of everyday things and events into distinct entities which can then be logically manipulated. The tag 'divide and conquer' is surely the motto of our rational endeavour over the past centuries.

But there are, of course, many domains where we do not use reason - either because we consider it of no advantage to do the analysis or abstraction to create the distinct entities, or because it is just inappropriate in some way. My example above of 'it is raining and it is not' is in the first category - in ordinary discourse it is well understood what this means, and to define or refine our terms for logical purposes is of no value, unless I am reporting to the meteorological office, for instance. Much of our everyday discourse is like this: we speak in ways which violate the laws of thought, but assume that, if it were necessary, we could refine our thoughts and speech - analyze, abstract and define - so as to create the crispness and distinction necessary for logic.

Likewise my sentence 'life is both a joy and a sorrow', if encountered in poetry or in some artistic setting, would be considered inappropriate to rationalize. Perhaps if I said it to my therapist, she might want me to rationalize it, but in many settings, artistic and otherwise, we are happy to stay in the non-rational. Many situations even depend on the non-rational; for example, telling a joke. Most jokes rely on a contrast between the rational and non-rational for their effect, and they also rely on timing and the performance of the joke-teller. Try explaining a joke rationally to someone who does not get the joke's point!

Fortunately, language seems able to serve our need for communication both rationally and non-rationally. Under the spell and seduction of rationality, much recent Western philosophy has considered language only rationally (called, appropriately, 'analytic philosophy'). However, Wittgenstein in Philosophical Investigations piles example upon example of language usage that does not map to distinct entities (physical or mental), and so is neither rational nor logical.

So in everyday life and discourse, in expressing our feelings to one another in language, or in thinking to ourselves, we are often not logical, nor do we need to be. Much of the time we assume that we could rationally analyze, abstract or define our language to be logical if necessary, but there is no value in doing so; the effort to think in that way is not worth the cost. Sometimes we are by choice non-rational (reading poetry, telling jokes); and sometimes, even when we are talking or thinking with agreed distinct entities, we manipulate our objects of thought in ways that are simply not correct, either deliberately or through sloppy thinking, which we term illogical.

There are also some subject-matters where there is no agreement as to what the relevant distinctions might be. In most situations this is not the case, and it is either obvious or easily agreed what the useful distinctions are - whether for the prehistoric woman putting her beads in the basket, the numbers in arithmetic, the bricks and materials to build a house, the appointments I make in my diary, or the facts in a court case (perhaps these last are not 'easily' agreed, but once the facts are agreed the logic of the case is usually straightforward).

Most examples I can think of where it remains contentious over time where the distinctions should be drawn, with substantial disagreement even among experts in the subject-matter, come from philosophy. This leads me on to a fundamental question: are there any domains of discourse that simply cannot be divided, or cannot have distinctions maintained in them? In other words, is there anything we are aware of that cannot, as a matter of principle, be analyzed, abstracted or defined in a meaningful manner? If so, then we would need to accept that whatever it is could not be spoken or thought about logically, and we could not reason about it - although we almost certainly could and would reason about why we could not reason about it (as indeed I am doing here).

And of course if there were such domains in philosophy, it would explain why philosophers have often been in disagreement about where to draw the distinctions, the 'first principles', with which to think and philosophise.

 

4 - Plurality

The polite term for philosophers being forever in disagreement is 'plurality', and the result 'philosophical pluralism' - not the cultural pluralism of ethnic, racial, religious, literary or political diversity - but pluralism as a plurality of philosophical systems or views.

The name Pluralism was first suggested in 1882 by William James[14], and was developed by Richard McKeon in the 1940s and 50s. McKeon[15] wrote a series of papers analyzing historically the large number of such philosophical views over the last 2500 years, showing that the differences were all due to the distinctions made by the various philosophers - their principles, methods, interpretations, and selection processes. Since McKeon was a scholar of Aristotle, he saw these as being analogous to the four causes, but that is not necessary to follow his analysis.

McKeon distinguished philosophical semantics[16] from philosophical inquiry - roughly equivalent to my idea of creating, and being clear about, the distinctions you think with, and then thinking with them. One of the most often-used words in McKeon's writing is 'ambiguity'; he showed that many philosophical disagreements arise from ambiguity, and that the paramount aim for any serious philosophical inquiry is to resolve ambiguity at the outset - precisely what I mean by drawing or agreeing your distinctions before you start thinking with them.

First, there is what it is you can make distinctions about (McKeon used the term 'selection', as in what you select as your entities) - there are again four of these: things, thoughts, words and actions. McKeon claims these four are exhaustive, but on this point I think he is wrong. I would like to add a fifth: feeling, or felt-sense (unfortunately this would spoil McKeon's penchant for arranging his many classifications into groups of four!). I discuss this later. I also think there is room for a sixth: process (as in Whitehead's 'process philosophy').

Secondly, there is how you can create your distinctions. Again, there are four of these, what he calls 'modes of thought'. One is to divide into ultimate constituents, and construct your philosophy from them (for example the atomism of Democritus). The opposite is to distinguish an ultimate principle, and assimilate your philosophy into it (he gives Plato as an example, which I think is a little unfair, as Plato in my reading is much more than merely the purveyor of Forms and the Ideal). The third is a process of resolution, where your problems are in the 'middle region', and your distinctions are driven by the aim of resolution (Aristotle being the obvious example here). The fourth mode of discrimination is perspective, whereby things 'out there' are made distinct from the distinctions 'in here' - Protagoras is the Greek exemplar of this 'mode of thought'.

McKeon's ideas are very detailed and prolific. I find that they all come together in a grand synthesis, but it takes a lot of work to read and understand him to get to this point. Each of my summary paragraphs above on McKeon's philosophical semantics could be expanded to fill a whole book (and has been by other commentators), and there is much in his thinking that I have not touched on (for example the historical and cultural milieu in which one makes one's distinctions).

My main point in introducing McKeon, albeit briefly, is to show that once you accept the fundamental importance of distinction, and the meaning of logic and reason that follow, there is much that has been written to support this thesis, and practical ways to help one work with it. You might call it the art of distinguishing your distinctions. This has been of great practical benefit to me personally.

For as long as I have been reading philosophy, over many years, I have always been struck by a recurring phenomenon: I often find myself taking on and accepting the views of the philosopher I have read most recently. He (usually it is a he) expresses and argues his view as if it is the right view, and I am usually persuaded by his argument and agree that his view is correct, sane and sensible.

For example, I read philosophers like Nagel robustly asserting that there is an objective reality or truth 'out there', independent of any human mind. I find their argument compelling, and not just rationally so, but emotionally satisfying as well. I read in them about an 'easy target'[17] called Rorty, whose feeble arguments against this position they easily defeat. Trying to be fair-minded, I then pick up Rorty and read him in his own words, and I am amazed by the sense and obvious clarity of his arguments. I finish his book realising that there is no practical difference between the activities of aligning myself with what my fellow human beings agree to be real, and with aligning myself with some objective reality out there independent of any human view. Therefore I see that the compelling case for objective reality that had me spellbound previously has no practical consequences, and I am grateful to Rorty for pointing this out so forcefully.

This scenario does not play out with just modern philosophers. I read both Rorty and Nagel, and many others, all using the adjective 'Platonic' as if it were an insult, and the terms 'platonist' and 'naive' as being virtually synonymous. However, when I turn to Plato himself, and read him in his own words (or a good English translation), I find a depth and subtlety beyond anything I read in modern philosophers. Whitehead's claim that all western philosophy is a footnote to Plato then makes good sense.

Before I came across McKeon's forceful explanation of philosophical pluralism, I thought I must just be hopelessly naive to be always taking on the opinions of the last philosopher I had read. For me, McKeon has made it philosophically respectable to be more tolerant of others' views. By having sympathy for a philosopher and trying to stand in his shoes, as it were, then I may be led to a wider view of the issue than I had previously, and have my understanding expanded as a result. I still exercise my critical faculty, and I may reject that philosopher's views, but only after giving him a sympathetic and fair hearing, trying to understand the situation he was in to create the distinctions that he did and so formulate the views that he did.

As Keats wrote: 'Axioms in philosophy are not axioms until they are proved upon our pulses; we read fine things but never feel them to the full until we have gone the same steps as the author'.[18]

 

5 - Memes

In addition to McKeon's formal and exhaustive methodology for distinction-making in all its aspects, in this section and the next I look briefly at two recent and popular explanations for philosophical pluralism, or the fact that as humans we can and do create our distinctions in different ways, and so end up with such differing philosophies.

First, the Memetic Drive of Susan Blackmore. One of the simplest but most profound scientific ideas is Universal Darwinism, whereby if anything replicates with fidelity but some variation, in large quantities, and is selected by its environment, then the result must be ever-increasing complexity and organisation. The gene is clearly one such replicator, responsible for the evolution of the forms and apparent design of life. In 1976 Richard Dawkins suggested that once human beings were able to imitate (true imitation, rather than learning or social contagion[19]), a new replicator was let loose on the planet, which he called the 'meme'. Blackmore has shown persuasively that memetic co-evolution with genes can explain, at least in principle, human brains, language, much of our behaviour including altruism, our ideas and even the thoughts we have and why we have them.
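Universal Darwinism is, at bottom, an algorithm, and a toy sketch can make that concrete. The following is in the spirit of Dawkins' famous 'weasel' program (my own illustration, not Blackmore's): high-fidelity replication, a little variation, and selection by the 'environment' (here, closeness to a target string) are enough to accumulate organisation.

    import random

    TARGET = "METHINKS IT IS LIKE A WEASEL"
    CHARS = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

    def fitness(s: str) -> int:
        # the 'environment': how well a copy matches the target
        return sum(a == b for a, b in zip(s, TARGET))

    def replicate(s: str, rate: float = 0.04) -> str:
        # copy with fidelity, but occasional variation (mutation)
        return "".join(random.choice(CHARS) if random.random() < rate else c
                       for c in s)

    best = "".join(random.choice(CHARS) for _ in TARGET)
    generation = 0
    while fitness(best) < len(TARGET):
        offspring = [replicate(best) for _ in range(100)]  # large quantities
        best = max(offspring + [best], key=fitness)        # selection
        generation += 1
    print(generation, best)

No foresight or design enters anywhere in the loop; organisation emerges from replication, variation and selection alone.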

Thus philosophies can be thought of as memeplexes. In its radical form, this is all the explanation that is needed for philosophies to be created and sustained, a view which most philosophers would resist. However, just as blind evolution driven by the gene replicator can explain all the wondrous life forms we see around us (the eye, the brain etc), Blackmore maintains that blind evolution of the meme replicator in our brains can similarly explain all the cultures and philosophical systems we have. Such a conclusion may be as hard to accept as it was (and still is) to accept that all life, with its apparent organisation, is due to genetic evolution alone, with no God or 'intelligent design' needed.

Not all philosophers are as radical as Blackmore with memetics, for instance Dennett proposes a somewhat diluted version of memetic evolution which does not characterise it as 'blind' in the sense that Blackmore does. Whichever view is correct, and whether it is the whole explanation, or a contributing factor only, memetic drive is in my view a powerful reason why we see such diversity of philosophies and philosophical systems.

 

6 - Metaphors

Second, the Conceptual Metaphor analysis of Lakoff and Johnson[20]. We think with literal concepts (basic-level, spatial, sensorimotor) and with metaphorical concepts, but we typically do not realise how embedded in our thought our metaphorical concepts are, and how sparse our literal concepts are. Lakoff and Johnson give hundreds of examples of ordinary phrases that at first sight we would not consider metaphorical, but which on closer reflection clearly are. For instance, 'these colors are similar' is literal; but 'these colors are close' is metaphorical, where we are mapping the similarity between the two colors to the literal domain of spatial proximity.

The source domain is usually a literal concept; the target domain is what we experience. Examples: knowing is seeing, time is motion, help is support, purposes are destinations, more is up, important is big, affection is warmth, bad is stinky. Many of these metaphorical concepts are universal, not because they exist in some Platonic world, but because as humans we all have the same neural wiring and basic bodily experiences, and so we generate the same embodied metaphors.

Lakoff and Johnson's main thesis is that we cannot think effectively without metaphor; we use it to reason with (in my terminology, we use it to form our distinctions). Thinking using literal concepts only is extremely hard to do, and at best would be only minimal and unsophisticated.

For any abstract concept, we usually have a literal core, and several metaphorical mappings (sometimes inconsistent with each other). They give the example of 'love' - this is a basic-level experience, with some literal structure (a lover, beloved, relationship with, perhaps a start and end). But to talk and reason effectively about love we use many conceptual metaphors (eg a journey, physical force, illness, magic, madness, union, closeness, nurturance, giving of oneself, heat, parts of a single object). They are so entwined that we cannot say the concept of love is independent of the metaphors of love; without the metaphors we are left with an impoverished skeleton.

Most concepts used in philosophy have a small literal core surrounded by a large number of metaphors - Lakoff and Johnson show this in detail for Love, Time, Causality, Mind, Self, Morality and the thinking of several philosophers (Pre-Socratics, Plato, Aristotle, Descartes, Kant and others, particularly Analytic Philosophy). Most philosophical systems consist of taking one set of metaphors as true and literal, and reasoning from them. Since we cannot think and reason without using conceptual metaphors (their roots are embodied in us), philosophy must be pluralistic. Few philosophers seem to be aware that they reason from metaphor (as opposed to using it to illustrate their already reasoned points); an example of one of the few would be Aquinas with his 'analogy of being'.

Lakoff and Johnson draw on the latest scientific research in the cognitive fields. You can hold that philosophy is above and beyond scientific findings - either as a prejudice, or as a reasoned argument - but certainly recent research has thrown up questions and viewpoints which I think philosophy needs to answer or at least address, certainly not ignore.

Their three main conclusions are: One, thought and mental activity are mostly unconscious, and we have no access to them (thus the conclusions of the armchair philosopher relying on introspection alone must be very limited). Two, the mind is inherently embodied (not in the weak sense that we need a body to have a mind, but in the strong sense that concepts are neural and sensorimotor); thus reason is shaped by the body (my translation: our ability to see and create distinctions is shaped by the body). And three, the point I have discussed above: abstract concepts are largely metaphorical, and we are mostly unaware of this.

 

7 - Categories

One of the early researchers in this field was Eleanor Rosch, who questioned the philosophical assumption that we form a category by abstraction, by admitting certain properties of what we are categorising and omitting others, so that we can define what is in the category. For instance a creature is a 'bird' if it has a beak and feathers, which is then our 'definition' of a bird.

In the 1970s Rosch carried out experiments to test this assumption - to see how we actually, in fact, make categories. For example, she gave students a list of birds and asked them to rate how well each word typified its category, on a scale of 1 to 7 (1 = an excellent instance of 'bird', 7 = hardly an instance at all). Typical results were[21]:

robin 1.1
eagle 1.2
wren 1.4
ostrich 3.3
chicken 3.8
bat 5.8

Not only did the results match basic intuition, but there was a high degree of correlation between subjects' ratings. We regard a robin as a better bird than a seagull, and a seagull as a better bird than a penguin. Such experiments have been repeated many times since then[22], with similar results.

Rosch called the best examples of a category 'prototypes' [23] (the prototype of a bird is thus like a robin), and concluded that we do not include X as a bird by definition (eg has beak and feathers), but by how close we judge it to the prototype - rather like Wittgenstein's 'family resemblance'.
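A hedged sketch of the contrast (the features and numbers are invented purely for illustration): prototype-based categorisation grades membership by resemblance to a prototype, where the definitional view would simply decide it.

    # Graded membership: typicality is mean agreement with the prototype.
    PROTOTYPE_BIRD = {"feathers": 1.0, "flies": 1.0, "sings": 1.0, "small": 1.0}

    def typicality(candidate: dict) -> float:
        return sum(min(candidate.get(f, 0.0), w)
                   for f, w in PROTOTYPE_BIRD.items()) / len(PROTOTYPE_BIRD)

    robin   = {"feathers": 1.0, "flies": 1.0, "sings": 1.0, "small": 1.0}
    ostrich = {"feathers": 1.0, "flies": 0.0, "sings": 0.0, "small": 0.0}
    print(typicality(robin))    # 1.0  - prototypical
    print(typicality(ostrich))  # 0.25 - a bird, but a poor example

Note that a definition by 'beak and feathers' would give robin and ostrich exactly the same status; the graded score mirrors Rosch's 1-to-7 ratings instead.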

Categories also form a hierarchy - superordinate, basic, subordinate - basic being the largest category that has a prototype, where we can have one concrete image representing the whole category; at which category members have similarly perceived overall shapes; and at which a person uses similar motor actions for interacting with category members.

Examples of basic-level categories might be 'elephant', 'table' or 'car'. Going up the hierarchy to a superordinate category is less easy to imagine - I can imagine a table, but not a general piece of furniture; or a car, but not a generalised vehicle. Similarly, going down the hierarchy to a subordinate category, it is less easy to imagine different species of elephant or types of car, unless I am an expert or enthusiast.

Basic-level categories thus have priority over superordinate or subordinate levels. They are named and understood earlier by children, enter a language earlier in its history, have shorter primary lexemes (words), and are identified faster. They are the level at which most of our knowledge is organised, and include not just objects but also basic-level actions (eg swimming, walking, grasping), social concepts (eg families), social actions (eg arguing), and emotions (eg happiness, anger, sadness). Lakoff and Johnson include basic-level in their 'literal' (along with spatial awareness and concepts from sensorimotor events).

I include these last three sections - memes, metaphors and categories - because I think that if distinction-making is at the heart of logic and rationality, then we need to be aware of how we do in fact distinguish, and to think about how we can improve the process. I do not insist on them, although I find them useful. They also help me to understand plurality in action, and to flesh out McKeon's rather austere vision.

 

8 - Relativism

The explanations above of the Pluralist phenomenon are relative and depend upon the humanity of philosophers: on history and the assumptions of the philosopher (McKeon's Semantics), on the culture and mental life of the philosopher (Blackmore's Memetic Drive), and on the embodiment of the philosopher and his choices of metaphor (Lakoff and Johnson).

My argument so far has been that philosophers make their distinctions in different ways, for a number of different reasons, and for different purposes, and so build the smorgasbord of philosophies we have. If a set of distinctions for a particular domain of inquiry is generally regarded as successful and of value, then that way of investigating that domain prevails. Historically, that usually meant that domain was taken away from philosophy and became a science (consider what was included in 'philosophy' in Newton's time).

Philosophical Pluralism in the sense I mean it is contrasted with the usual singular agreement in other areas of rational inquiry. For instance, the ancient Greek theories of physics are clearly wrong, and now of only historical interest; Greek mathematics is clearly correct and is still part of mathematics today. Yet Greek philosophies are still being argued over, 2,500 years later, with no clear outcome. Is there something about Philosophy, either its method or its subject matter, which prevents us coming to any certain conclusions? Or is there something about being human which prevents arriving at the truth in philosophical inquiry? Or has the truth been arrived at, and amongst the many pluralities a subset is in fact correct, and the rest wrong, even though there is no consensus which is which? Or are we condemned to live in a world where our understanding of the most important issues is forever relative? I divide the world my way, you divide the world your way, and who is to say which is correct?

Part of the problem is that for the out-and-out Pluralist, any proposed answer to the question 'Which one is objectively true?' is merely one more dish on the buffet table of available philosophies, and takes its place amongst the other philosophies that are part of the plurality one is trying to assess. So any attempt to resolve a pluralist view means getting outside of it in some sense, being able to look at the available choices from a vantage point that cannot be included or gathered into the pluralist frame. But if my analysis is correct, we cannot do this rationally. Whatever distinctions I draw or create to resolve this issue will be another set of distinctions to be added to the plurality.

As I indicated previously, the way I intend to approach this issue is from within. I suggest that there are domains of inquiry in which it is inherently impossible to define meaningful distinctions - that is to say, distinctions which are stable and suitable material for logical thought.

 

9 - Nothing

What might the candidates for a distinctionless realm be? Well, I cannot draw any distinctions in nothing - there is nothing to be distinguished from anything else in nothing. However, as I have stated it, I don't think that gets us very far - what can I do with my distinctionless nothing? But let me approach this same idea obliquely, stating it differently.

Another candidate for a distinctionless realm might be everything - I can imagine positing a wholeness, a unity, a metaphysical All or One in which there are no distinctions. But there are certainly distinctions in my part of the world, physically and mentally, so the distinctionless All would not be all, after all. And if this distinctionless realm were a One, it would not reach my distinction-filled part of the world, so it would need to be at least a two.

Perhaps this does not matter, and I can accept a local distinctionless realm, rather like the surface of a sphere perhaps. Limited to the sphere's surface, there is no distinction, no privileged direction or position, no boundary. But what advantage could I gain from a distinction-free locality, that from my viewpoint would be just another distinct object in my world of distinction and multiplicity, like the sphere's surface would be from my three-dimensional point of view?

But what if I could not step out of the surface of the sphere, and yet still in some sense be aware of the three dimensions the surface is embedded in? This sounds impossible, but I suggest that there is a realm which is exactly like this, and it is I.

There are many linguistic uses of the personal pronoun 'I': I-indexical and I-self are the two most obvious. I-indexical is an uncontroversial placeholder for the speaker. I-self is what I normally think of as 'I' or me. It sits behind my eyes and is the agent deciding what I do. Asked 'Do you have a body, or are you a body?' most people would respond 'I have a body'. That is I-self speaking. I-self describes who we think we are, and would include our personality, our likes and dislikes, our hopes, fears and dreams, what we think, how we think - everything that we consider defines our essential being. The literature on the uses of 'I' is extensive, and there are many arguments from a linguistic use to a metaphysical ontology[24].

To get at what I mean, I will not argue from a linguistic use of 'I', but rather describe two brief experiments that can be made while reading this paper. First, the metaphysical 'I' that I am attempting to point to is that sense of 'I' that I cannot step outside of. In that sense I cannot, of course, point to it; I can only know it by being it - hence the need for experiment.

The first experiment then is simply to separate out what I am aware of, from the 'I' that is being aware. I can imaginatively do this by holding the content of my awareness at arm's length, as it were. By holding my hand out, I see it out there as an object of my awareness. In a similar manner, you can objectify your social category, your personality, your self-image, idle thoughts, passionate thoughts - in fact anything you are currently aware of. You may think that all you are aware of is some content or another; in which case, can you describe it or even observe it? If you can, then there must be a separation between what is being observed and the observer.

The problem here is that whenever you try to observe this 'I' it recedes away from you, since of course it is what is doing the observing - rather like trying to see if the fridge light goes off when you close the door: however quickly you open the door, or however slowly you close it, you will always see the fridge light on. What is needed is a kind of epoche - a suspension of the usual attitude to mental contents (outward or inward), a reorienting of attention, and a letting go[25]. Some people get it first time, some people need considerable practice, but over time it is possible to get close to emptying your core subjectivity of all content, holding it at arm's length, and seeing that separation - rather like looking at those computer-generated 3-D pictures that look like meaningless coloured patterns, where if you get the right kind of vision (looking through the picture with a soft gaze, focusing on a point just beyond) suddenly a vivid 3-D image bursts into view.

How to characterise this core subjectivity that you cannot observe or point to, but can only experience by being it? Three words come close for me: it is 'I', it is awareness, and it is nothing. It cannot not be 'I', as I cannot get outside it or escape it, except by ceasing to be I - temporarily or permanently. If I can make the distinction between awareness and what I am aware of, then I can also call it 'awareness' - the problem here is that most phenomenological writing about awareness does not make this distinction between awareness as a blank canvas and the contents of awareness as images on that canvas. But even the canvas metaphor breaks down; the point about the 'I' that I am trying to delineate is that it cannot be delineated, there is no referent.

So it is nothing. If I can point to anything, anything at all, and say 'that's I' or 'that is part of I' then I am still pointing to what I am aware of. Even the pronoun 'it' in 'it is nothing' is problematic. I discuss below how we might talk about that which cannot be referenced, but for now I will call this 'I' that is nothing 'I-nothing'.

I-nothing is certainly distinctionless, being nothing, so I have answered my own question at the start of this section - 'what might a distinctionless realm be?' - and have found something that cannot be discussed logically. But the answer appears to come at considerable cost, since it raises a host of further questions. Three obvious ones are: First, how do I even talk or write about I-nothing if I cannot reference it? Secondly, to myself, emanating out of I-nothing as it were into the rest of myself and this world, I-nothing means everything - to lose I-nothing at death, say, is not to lose nothing but to lose everything. Thirdly, how does I-nothing interact with this objective world - what is its relation to it?

Before I attempt to answer these questions, and others, I will describe the second experiment which gives a flavour of what I am trying to point to without actually pointing (which I cannot do).

Consider looking at yourself in a mirror. You will see yourself objectively, how others see you, and you will see yourself as having two eyes through which you see, just like everyone else in the world except the blind and partially sighted. Presuming that you did not check the last paragraph by actually looking in a mirror - since you could easily imagine what I described and give your assent - now comes the actual experiment you will need to make to follow me.

Without moving from where you are, what on present evidence alone are you looking out of? It is certainly not two small eyes. You will see yourself as looking out of one large round and crystal clear window, the further side of this window being the words of this essay, on computer screen or on paper, the landscape or roomscape behind and to either side, perhaps your two arms and hands emerging from the bottom of this window, and also at the window's bottom a faint fuzzy pink shape, sometimes to the left, sometimes to the right, which you objectively recognise as the tip of your nose.

You can trace the edge of this window with two fingers, even while still reading this. Hold your two index fingers up straight in front of you, about a foot away. Now move them slowly apart, and while still reading these words, see where they disappear. Most people have to spread their arms quite wide before it becomes hard to see both fingers at the same time peripherally. That is the left and right edge of this window. Now you can slowly move your fingers first down, and then up, keeping them just in sight while still reading these words, to trace the circumference of this window. It is a surprisingly large circle. It traces a window with no interface, yet which clearly has a 'this side' where you are, and a 'further side' where the world is. Your two pinholes have become one large, matchless, single eye. And yet I suggest that 'this side' where you are is nothing - there is a void, an emptiness, nothing in fact, on your side of the single eye. There may be feelings and sensations, but this experiment is focused on vision only, and your vision originates in what is, to you at this time, nothing.

When doing this and other 'headless experiments', as Douglas Harding calls them[26], many people dismiss them as just demonstrating a difference in perspective, and nothing more. They do experience the subjective viewpoint (everyone experiences seeing the world through one large window, and not two little peepholes); but they think objectively, and thus say that objectively ('really') they are seeing the world through two smallish eyes, and that to point out otherwise is a childish game - that the one large eye is merely illusion, appearance, a difference in perspective only.

Yes, it is childish. For the first three years or so of our life, as a baby or toddler, we viewed the world in exactly this way. If we looked in a mirror, we only saw another toddler. At about four years of age we started looking in a mirror and seeing ourselves. Psychologists call this 'developing a theory of mind', the ability to objectify ourselves and others, to see others as having a mind that is similar to ours, and to make a distinction between the world and I.

We are by now, as adults, so used to thinking objectively (seeing ourselves in the world), yet experiencing subjectively (looking out of I-nothing), that we find it hard to appreciate the difference, and have got used to the essential dissonance that is called 'growing up'.

Note that I am not presenting these experiments as an argument - after all, it is contingent how one looks out on the world (or in at oneself). They are attempts to experience the flavour of nothingness behind everything we say, do, think, or are aware of, so that the interested reader can sense what it is like to think the way I am thinking in this paper. I consider them counterparts to Heidegger's conceptual framework of being as defined against nothing - a way to experience the nothingness behind being, as it were, as opposed to only thinking it.

 

10 - Unsaying

How can we talk or write about I-nothing? Any statement of the form 'X is nothing' inevitably leads to a reifying of X as an object in the world of distinctions. There are at least four responses to this dilemma. One: ignore it. Two: silence (not necessarily the same as ignoring it[27]). Three: accept the reification and try to talk or write logically about it - which means either trying to draw distinctions in the essentially distinctionless, or else setting up a proxy realm in which one can draw distinctions, and then thinking you are talking logically about nothing when you are only discussing the proxy.

In this paper, using the terms in the way I have been using them, this third response is not an option - I simply cannot reason or discuss logically that in which there are no distinctions - either given or my own creation. Note that in this section, as in the previous one, I-nothing is how I experience my core subjectivity from within, which can only be experienced by being it. Living in the world, I can and do objectify myself; I discuss this below. But here I am asking the question how to discuss I myself as I-nothing, that I cannot step outside of.

For me, that only leaves the fourth response as a possibility - to talk about it but to do so non-rationally. Clearly I am writing about it, and equally clearly it is distinctionless, and so I cannot do so logically. Assuming that I want to talk or write meaningfully, how can I do that non-logically?

Plotinus was faced with the same dilemma when writing about the One, which was also distinctionless and could not be named, yet contained everything. He used apophasis in a novel way to accomplish this[28]. Apophasis was a classical Greek term for a move in rhetoric where you draw attention to something while claiming not to ('I think my opponent is fine and upstanding; I will not dwell on the absurd accusations of his drunkenness...'). The term was later used by medieval theologians to characterise the via negativa or negative theology. I use it in neither of these ways, but as Plotinus did, as an 'unsaying' (apo = away, phasis = saying; saying away, or unsaying).

In apophatic discourse, a sentence or statement does not stand on its own, but posits a seeming contradiction or illogicality, which a succeeding sentence has to 'unsay'. Plotinus gave the example of a translucent sphere at the centre of which is a glowing mass, lighting up the sphere. The speaker then reaches in and removes the glowing mass, yet preserves the illumination of the sphere[29]. So in my example, the sentence 'X is nothing' would need to be followed by the speaker or writer reaching back into that sentence, as it were, and removing the referent 'X' in an attempt to prevent reification. That action itself would probably need to be unsaid as well. Note that I say 'seeming contradiction or illogicality', since the essence of what is being spoken about is that it has no distinctions, so statements about it can be neither logical nor illogical - although each sentence taken on its own, out of context, appears illogical. (The medieval mystic Meister Eckhart, whose German sermons were strongly apophatic, used this as his main defence in his trial for heresy - that the inquisitors were taking his sentences one by one out of context, when they were meant to be spoken in sequence as a connected narrative.)

So apophatic discourse, in the sense Plotinus means it, is dependent upon time; it is a performance, a spoken or written sequence in which each utterance refers back to previous utterances to unsay something, and in so doing leads an appreciative audience, who understand what is going on, to a deeper understanding. Commentators on apophatic writings (including medieval inquisitors) usually try to fill in the open referent, to say what the author 'really' meant to say (but didn't) - to reason; in other words, to translate into logical terms using distinctions which cannot hold.

An analogy is hearing a poem being read. Trying to explain or comment on a poem in a logical manner often comes across as very dry, and misses the point completely. Even reading a poem quietly to oneself can be hard going. But if you hear that poem being read out, by a good reader who understands the poem (often the poet themself), then in that performance, where the poem's theme is developed, often in non-logical ways, you are, as a listener, forced to hear in real time (no eyes skimming the page ahead). And in that performance you can often hear real meaning. My school sometimes engaged professional poets to visit and read their poems - and there are still some poems I remember vividly from hearing them read in those performances.

My contention in this paper is that metaphysics, as the 'first philosophy' (Aristotle) or the 'science of first principles' (Bradley's translation of Plato), can only go so far using reason. As you approach the 'first principles' - whether reality, the infinite, God, everything, the One, or I-nothing - distinctions become vaguer and more difficult to maintain, and ambiguity harder to remove, and so reason (which I define as the ability to create distinctions suitable for logic) begins to flounder and finds it more difficult to proceed. I have suggested above that this is the fundamental reason for the plurality of philosophies and metaphysical systems we have on offer. It is why Carnap called metaphysics 'sterile and useless' - and relying only on reason, perhaps it is.

I therefore maintain that while reason needs to be pushed as far as it will go - it cannot and should not be abandoned - nevertheless there is a need for metaphysical performance as well, in the sense I have outlined above. Metaphysics should not only be reasoned; it should also be performed.

Of course, many philosophers since Plotinus have used some version of apophasis - medieval mystics and theologians in the pseudo-Dionysian tradition, and in modern times Kierkegaard, and Heidegger and Derrida with their technique of erasure. One can even make a case that Wittgenstein wrote in this way, with his distinction between showing and saying. It is often pointed out that Wittgenstein managed in the Tractatus to say a lot of things which he claimed could only be shown and not said (Russell even mentioned the point in his introduction to the English edition!) I think Wittgenstein's point is really that there are things you cannot say logically (in my sense, container logic) but can be shown by saying non-logically, or apophatically - 'whereof one cannot speak logically, thereof one can only speak apophatically' is my rewriting of the last sentence of the Tractatus.

To return to the 'experiments' in section 9 above: it is one thing to read them cold - to attempt to remove all content of awareness from the sense of 'I' by reading about it, even if you are conscientiously trying to do it at the same time. But just as with live poetry readings, such experiments done as a performance are something different. I have myself been in several group sessions where the speaker gives an apophatic performance (using neither of those words)[26], and through spoken suggestions (many non-logical in the sense above) I have found it considerably easier to create a separation between awareness and the contents of awareness, taking a backward step into my own observing, and becoming 'I' as aware nothingness - being 'brought thereby into perplexity and wonderment', as Plotinus phrases it.

 

11 - Self-Reference and Contradiction

The previous two sections began with the question of what might be a good candidate for a realm of non-distinction. While there are no distinctions in nothing, a local 'nothing' would itself be merely an object to me. But if I could not step outside this 'nothing' (local or not), then it would not be an object to me, and it would qualify as a distinctionless realm.

However, by calling I-nothing 'local', I am already objectifying it. While some religious or mystical systems might not accept this, and demand the 'purity' of dwelling only in I-nothing (or their equivalent), it seems to me that a meaningful metaphysics must take account of the world that I find myself in. This leads us straight into self-reference: I, out of I-nothing, see a world; but this world also includes I in it.

Self-reference is often introduced by way of paradox; as Oscar Wilde wrote: '...the way of paradoxes is the way of truth. To test Reality we must see it on the tightrope'.[30]

The Liar paradox is at least 2500 years old[31], but is most often written in its modern formulation as 'this sentence is false'. There are many similar paradoxes in the same family (my favourite is: This sentance has threee errors.) They are remarkably robust, in that most attempts to resolve them either lead to further paradox, or can be easily circumvented. For instance, a common approach is to hold that the Liar paradox is meaningless, in which case you can simply present a strengthened Liar: This sentence is false or meaningless. (or 'This sentence is either false, or neither true nor false, or meaningless')
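A minimal sketch of why the Liar resists a classical truth value (the encoding is my own illustration): the sentence asserts its own falsehood, so a consistent assignment would have to make its value equal to the negation of its value, and no bivalent assignment does.

    # Search for a consistent classical truth value for the Liar.
    def consistent(value: bool) -> bool:
        asserted_content = not value   # 'this sentence is false'
        return value == asserted_content

    print(consistent(True))   # False - if true, it is false
    print(consistent(False))  # False - if false, it is true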

Other attempts to 'solve' paradoxes like this, built as they are on negation and self-reference, reformulate them to avoid the self-reference. So for instance 'This sentence is false' becomes 'This sentence cannot be proven in system X'. The latter formulation still behaves paradoxically within system X, but, depending upon the particular scheme of the solution, it is not a paradox when viewed from some other system outside X.
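The reformulated sentence is essentially Gödel's (again my notation, a sketch): a sentence G of system X that asserts its own unprovability in X,

```latex
G \iff \neg \mathrm{Prov}_X(\ulcorner G \urcorner)
```

If X is consistent, G is true but unprovable in X; the tension is felt inside X, while from a stronger system outside X the sentence is simply true.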

Many people find such solutions inadequate, and I am one. When you read 'this sentence is false', unless you are philosophically inoculated or jaded, there is an aporia, an agony of how to go on, as Plotinus put it, a 'birth pang of signification'[32] - what at first you took for distinct units, this sentence and its property of being false, you see on reflection cannot function as distinct units. It is often held that the referent of a self-reference does not exist, but this is just my point - it does exist, but is then immediately denied. That is the paradox and the aporia, as if we are back in apophatic mode. We can only speak the Liar-type paradox as an apophatic performance, precisely because its distinctions are not stable.

'This sentence is false' as a sentence is a very small unit in the world as a whole. Self-reference becomes more aporia-like the bigger the thing being referenced is, and you cannot get much bigger than Russell's set, the set of all sets that are not members of themselves.

In 1902, just as Gottlob Frege was finishing his grand attempt to derive all of mathematics (and more) from logical principles based on set theory, Bertrand Russell wrote to him with what is now known as 'Russell's Paradox' (the German mathematician Georg Cantor had already noticed closely related paradoxes in his set theory a few years earlier).

The paradox boils down to the fact that sets lead to trouble when they are too big and self-referential. The set of all sets leads to paradox, as does Russell's formulation, the set of all sets that are not members of themselves: it both does and does not contain itself. This is remarkable for a number of reasons - one being that it is one of the very few deep and fundamental results of mathematics which needs no technical knowledge of mathematics to understand.
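Indeed the whole derivation fits in two lines; as a sketch in modern set-builder notation:

```latex
R = \{\, x \mid x \notin x \,\}
% Now ask whether R satisfies its own defining condition:
R \in R \iff R \notin R
```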

Mathematics tries to get round this by making a distinction between sets and classes - basically, any 'set' which would be paradoxical is called a 'class' instead. Many mathematicians, myself included, find this ad hoc distinction unsatisfactory.

There is now a substantial body of literature, fronted by Graham Priest[33], showing persuasively that human thinking cannot escape self-referential contradiction. In fact, most of western philosophy since the Greeks has been grappling with apparent contradictions and how to avoid them (only Hegel accepted them), under the assumption that an unsolved paradox indicated there must be something wrong or missing in the philosophy. These range from the Liar paradox, through Aristotle's prime matter (it exists and it doesn't), Kant's noumena (which can't be described but which Kant describes) and the antinomies, to modern times with all the paradoxes of set theory, Russell's paradox, and Wittgenstein saying very eloquently what cannot be said.

Priest piles example upon example of paradoxes, and forcefully shows why 'solutions' to the robust paradoxes fail, or merely relocate the paradox or contradiction elsewhere. Priest maintains that the underlying reason for this is that reality is contradictory, so it is unsurprising that thinking at the limit about reality must also be contradictory. To be precise, his schema is that all the paradoxes rest on the fact that we can hold totalities such that some object must both be contained within the totality and cannot be.[34] And the totality need not be infinite or ultimate (although it often is): the oldest recorded paradox, the Liar in its many forms, is strictly local and finite, as are others (e.g. Berry's paradox).

Many attempts to defuse the paradoxes simply say that any totality that yields such a paradox cannot or must not be held as a totality (the mathematical set/class division above is one such attempt). But we can and do hold such totalities - that is the paradox! In my language, we cannot in those cases maintain stable distinctions.

 

12 - Everything

If Priest is right, and I believe he is, then we have genuine contradictions. But I would express it differently from him: given what I have said above, the fact that classical logic yields genuine contradictions which are true but not explosive must mean that the domain of everything distinct, or even potentially distinct, is strictly smaller, in the mathematical sense, than the domain of everything. We can even allow the distinct things to be infinite in number and my argument still holds: the analogy I would make is with the set of all rational numbers being strictly smaller than the set of all real numbers, using Cantor's notion of cardinality.

I have an image in my mind of reality as a continuum of non-distinction, overlaid with a grid of distinction. We may be able to conceptualise and distinguish ad infinitum, but what is essentially indistinguishable is always much more, and always in between, as it were - rather as between any two rational numbers, however close, there are non-rational reals (in fact infinitely many of them, and indeed an uncountable infinity of them!).
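Stated with Cantor's cardinalities (standard results, quoted here only to gloss the image):

```latex
|\mathbb{Q}| = \aleph_0 \;<\; 2^{\aleph_0} = |\mathbb{R}|
% and the 'always in between' point: every open interval (a,b), however small,
% contains only countably many rationals but uncountably many irrationals:
|(a,b) \cap \mathbb{Q}| = \aleph_0, \qquad |(a,b) \setminus \mathbb{Q}| = 2^{\aleph_0}
```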

'Everything' sounds comfortably unifying, a single word signifying all things as one concept; but Priest's work and Russell's paradox show clearly that you cannot argue anything from it. Just as a 'set' is a collection of distinct things, if a set is too big and paradoxically self-referential then you cannot maintain its distinctions in a stable manner, and so cannot argue anything from it either.

The question then becomes, how and in what way can we divide 'everything' into concepts to talk logically about it? Returning to my fundamental act of creating distinction with which I began this paper, where can I draw a distinction in everything?

For myself, the most obvious distinction is that between the world and I. My dictionary gives eight different meanings of 'world', of which the first is 'Everything that exists anywhere'; so the world includes me, as part of everything, and in calling it the 'world' I am already struggling to make my distinction. I take the metaphysical endeavour, then, to be reasoning about the world and I.

I am part of the world, yet I am trying to see the world-and-I as a whole from outside. There seems to me something impossible about this endeavour. How can I arrive at an understanding of a world that includes an understanding of I, I being within the world having that very understanding? This is not quite Colin McGinn's cognitive closure[35], but is analogous to it, and can perhaps even be derived from it. It is one way of stating the mind-body problem.

So I take the world-and-I together to be another distinctionless realm. It is not really distinctionless, of course, since I have already made a distinction between the world and I, and I make many further distinctions both in the world and in myself. But it is self-reference on a heroic scale, and by virtue of that it is distinctionless as a whole. What I mean by that is that I can take the world out there and reason about it (form distinctions with which I can think logically); and I can look at myself (that is, I-self, not I-nothing) at arm's length, as it were, as an object, and reason about that and about the various mental events I observe in that object. But when I look at the world-and-I combined, then no distinctions are stable - I make distinctions, but they are all transient and in tension, each thought undoing and de-referencing the one preceding it. My logic searches for fixed referents, but because I am in the world I am thinking about, they always recede in a perpetual regress.

Perlis has called this 'strong self-reference', where the 'self' doing the referencing is a conscious agent with 'a robust self-concept as part of the agent's world model'.[36] I take his work to imply that while paradoxes such as the strengthened Liar (formal self-reference) are symptoms of a genuine metaphysical fact (crisp distinctions are in principle not always possible), the strong self-reference involving the world and I (I as both I-self and I-nothing) makes this metaphysical fact fundamental and overwhelming.

Many of the 'deviant' logics that I mention in section 1 (and other extended disciplines, such as the Polycontexturality of Gotthard Günther[37]) are attempts to deal with this fact of a subject reasoning in a subject-and-object context and the resulting self-reference. I personally find these logics complex, confusing, arbitrary, and too many in number. I prefer the simplicity of Priest's dialetheism and of the binary view, the tertium non datur - I repeat the self-referential quip: you either accept the binary view or you don't.

 

13 - Two Realities

Let me summarise at this point what I am trying to say. Philosophers in the West have for the last 2,500 years pursued metaphysics (the 'first philosophy', or study of first principles) rationally. I take this to mean the search for, or drawing of, meaningful distinctions in the domain of discourse with which one can think logically. However, because of the subject-matter of the 'first philosophy', we are forced up against domains where such distinctions cannot be drawn (at least not the meaningful and stable distinctions necessary for logical thought).

This is the basic reason why metaphysics has spawned such a plurality of systems, leading many modern philosophers to spurn it as a discipline - at least, that metaphysics which pushes its 'why?' questions to the limit and the ultimate (as opposed to, say, Aristotelian metaphysics, which analyses in the 'middle region').

I identify two such 'realms' where the distinctions necessary for logical thought cannot be maintained: nothing and everything. The 'nothing' of interest to me is my own sense of 'I' once I take the time, trouble and discipline to separate out all contents of awareness from awareness itself, and experience the nothingness from which I am (which cannot be experienced except by being it). The 'everything' is the world with I in it, where strong self-reference prevents my distinctions being stable enough for logical thought.

It is of course quite possible to slice up everything, the world-and-I, so that we are left with domains within the whole in which we can make stable and fruitful distinctions. Science has effectively cut the 'I' from the world, and drawn extremely fruitful distinctions in what is left (the world 'out there'). In fact, I would go further - those sciences are most successful where the 'I' has been cut out completely, such as physics and chemistry, the so-called 'hard sciences'. The more that science needs to consider I's, the less successful it is as a science (for example the 'soft', human or social sciences).

Is there any other way of cutting up the world-and-I fruitfully? From a personal perspective, I believe there is: it is the Two Realities approach of Klempner's 'Naive Metaphysics'[38]. Klempner has argued his case meticulously in a traditional metaphysical manner, and I consider the result and main thesis quite radical. I will not reproduce his arguments, or even precis them, but unashamedly pick out from his work the points I wish to use here.

The main thesis is that there are two realities, objective and subjective. The part I find radical is that every thing is in both realities. The tree I am looking at is in an objective reality, or perhaps I should say 'the' objective reality, which we all share. But the tree is also in my subjective reality, in my terms I might say a content of my awareness or I-nothing (my apophatic comment here would of course be that 'my' awareness is not mine, but me).

This approach is much more than some form of perspectivism - the thing-in-itself and my perception of it. As I understand Klempner, we are not talking of two things, two entities - the 'real' object out there and my perception 'in here' - but of one thing, one object, in two realities. In some ways it seems counter-intuitive; for example, the two realities are not disjoint. One might say it is a trade-off: you swallow an uncommon view of 'everything' up front, and in return many trains of thought that one would like to hold, but could not otherwise hold together, can then be held comfortably as a whole. I give some examples below in my 'practice'.

As an aside, I have to say here that I am impressed with the latest theories of vision based on change-blindness, in which the mental 'in-here' perceptive image of what is out there is downgraded.[39] Rather than our keeping a detailed mental representation of what we see in our mind or brain, the change-blindness experiments seem to suggest that there is actually only what is 'out there', and what we have in our mind is not a detailed representation but a very sketchy outline. As our eye moves over a scene, our brains are constantly doing a 'just-in-time compilation', as it were (to use a computing analogy), creating the scene for us from the objective. Anyone who has used computers extensively knows how expensive it is to keep an image (like a bitmap) in memory. Why would the brain do that when the real scene is available 'for free'? If this is correct, then Aristotle was right - we are seeing directly what is out there, not a representation in here of what is out there. The light reaching our retina, stimulating the optic nerve which sets our neurons going, is not creating a one-to-one representational mapping, but is simply how we do, as a matter of fact, see what is out there.
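The arithmetic behind the computing analogy is easy to check; here is a minimal sketch (the figures are my own illustrative assumptions, not from the vision literature):

```python
# Cost of holding one uncompressed 'mental bitmap' of a scene in memory,
# versus a sketchy outline plus re-sampling the world 'just in time'.
width, height = 1920, 1080        # a modest scene, in pixels (assumed)
bytes_per_pixel = 3               # 24-bit colour

bitmap_bytes = width * height * bytes_per_pixel
print(f"full bitmap: {bitmap_bytes / 1e6:.1f} MB")        # ~6.2 MB per glance

outline_bytes = 100 * 32          # say, a hundred 32-byte feature notes (assumed)
print(f"sketchy outline: {outline_bytes / 1e3:.1f} KB")   # ~3.2 KB
```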

What advantages does Klempner's account have over the more usual perspectival account? As I say, I will not attempt to duplicate Klempner's detailed reasoning nor his conclusions. For my purposes, the advantages to the thesis I am developing in this paper are many, and I hope the remainder of this paper will bring them out.

First and foremost, it is a grand, bold idea which seems to me to accord much more closely with how I am and how things are. The division between the world and I seems immense, dramatic, fundamental, intense even - to reduce my fundamental sense of I-ness to neural activity in the brain of one among 6,544,631,748 people (the population of the world at 10:59 GMT on September 17, 2006[40]) seems almost insulting to me. When I die, the universe will cease to exist for me - surely that terrible fact is more deserving of a whole reality than of a mere perspective?

Of course, I am being emotional, but why not? My existence and my death are emotional topics. This highlights one of the main issues when discussing the objective/subjective divide, or mind-body problem. To me, my death is the biggest event in my life; to others, in the grand objective scheme of things, it will mean nothing (except hopefully to my close family).

The problem of what consciousness or awareness is, or how it can exist, has come to be called the 'hard problem'. In this paper I am seeking a metaphysical synthesis, and so cannot even begin to discuss how awareness might arise, except to note that having a body and brain are clearly necessary conditions. Equally clearly, consciousness, awareness and my sense of I are closely related, but for the reasons I have given above I prefer to think in terms of 'I'. In consciousness studies, however, 'I' is usually taken to be what I call I-self, although a number of thinkers in the consciousness field discuss 'I' in terms of what I have called I-nothing.[41]

I personally believe that consciousness will always be a 'hard problem' if the solution is looked for in the world-and-I with primacy given to the world, as objective reality 'out there'. The beauty of Klempner's view is that the fundamental distinction created in the world-and-I puts the two realities on an equal footing, as it were. Note that I am not arguing for the primacy of the subjective over the objective (and neither, I believe, is Klempner). Certainly the world, and the objective view of it, must be accommodated in any meaningful metaphysics.

 

14 - Nowhere and Nothing

The terms 'objective' and 'subjective' are unfashionable in current philosophy, but I have a suspicion they are outmoded simply because they pinpoint uncomfortably the essence of the mind-body problem, which is much discussed currently and without resolution - another manifestation of philosophical plurality.

Because of this, and also because they are hard to define precisely in a way everybody agrees on, I will restate the view I am edging towards in my own terms. Nagel has famously characterised the objective view as the 'view from nowhere', and I have argued above that I as a subject am viewing from nothing. So you could say that the objective and subjective viewpoints are views from nowhere and from nothing respectively.

Although this might be an eye-catching juxtaposition (Americans might call it 'cute'), I think it captures the essence of how things really are. If metaphysics is the study of 'first principles', then you surely cannot go any further than nowhere and nothing.

I hope I have given a flavour in this paper of why I think I-nothing is metaphysically fundamental: it is experientially verifiable; it is rational in the sense of being approached rationally (in terms of my analysis of reason and distinctions); it is emotionally satisfying (at least, I find it so); and it is a framework for practical results, as I show below. As a metaphysical first principle it is also beyond the mind-body problem, in the sense that any conscious entity anywhere, including a spirit without a body, would be viewing the world and themselves from out of I-nothing. And no conscious entity anywhere, of any sort - real or imaginable - can view the world from any viewpoint other than I-nothing.

The objective view is then truly a view from nowhere, since no one can ever, under any circumstances whatsoever, step out from I-nothing and obtain a different viewpoint. The objective view is a human construct then, a creation of I-self, and cumulatively added to by many I-selves to create our own world-picture, as well as the world-picture western science has given us.

As another aside, several researchers now believe that the world-picture building activity of I-self (I am using my terminology here) is a comparatively recent development that happened around 4000 BCE with the extensive desertification of what is now the Sahara and central Asia. Taylor and others[42] argue convincingly for this thesis, showing both why objectification intensified in those conditions, and what its results have been - on the plus side reason, technology and science; on the negative side conflict, violence, and the oppression of peoples (of women by men, and of men by other men). In other words, there was a true Golden Age pre-4000BCE, and a genuine Fall around that time, starting with the empire building of Egypt and Sumer, which has continued to this day.

Klempner's objective and subjective realities then become, in my terminology, the realities of viewing the world from I-nowhere and of viewing the world from I-nothing. Of course, I-nowhere sees the view from I-nothing as not primary at all, but as dependent on physical objective things like brains and bodies - which puts us straight into strong self-reference on a heroic scale, to use my previous phrase. It also highlights the question of what you accept as your 'first principles': what is your metaphysics attempting to understand? What would count as success in the metaphysical endeavour? The answer is usually expressed as explaining reality, or explaining what is. But what is what is? Much of traditional metaphysics assumes the primary ontology must be of matter and substance. But not all; you can take Whitehead's and Gendlin's view that processes are primary. Or you can take the view that the two 'viewpoints' are primary, or even that the two 'places' in which you must be situated to have those viewpoints (nowhere and nothing) are primary. (My placing certain terms in quotes can be taken as a shorthand apophatic marker.)

My position is that the creation and existence of matter and substance is a scientific question. Certainly there is much scientific work being done currently on zero ontology - the creation of substance from quantum fluctuations in the vacuum, or even from nothing (no space and time either; see the latest work on loop quantum gravity[43]). I view metaphysics as an attempt to cast the net as wide as possible, as it were, further than the existence of matter, to find the most inclusive container possible, so that not only 'what is' is contained in it, but also the question 'what is?', and the questioner, and the awareness that the questioner has, and the origin of that awareness.

From the framework I outline, you can work with a variety of ontologies: my choice in this paper is to take I-nowhere and I-nothing as the most first of first principles, and see what follows.

 

15 - The Practice

The title of this paper is A Metaphysics of Distinction, Performance and Practice. I have assumed that the creating of distinction is a fundamental human act (physical and mental), and I characterise our rationality as the endeavour to create such distinctions meaningfully. I then characterise metaphysics as pushing this endeavour to its utmost, to the 'first principles', and finding distinctions break down in the limit. I suggest two such limits are nowhere and nothing, and so to talk or write about them meaningfully, we have to use language in a way that is beyond logic, which I suggest is apophatic in the sense that Plotinus used the term, and that the use of language in such a way is best characterised as a performance.

It remains finally to show how metaphysics can be practised within the framework I have created. First and foremost, I regard metaphysics as practical - at least, the metaphysics I am trying to present here. It is an attempt to follow the seemingly endless chain of 'why?' questions to the end; or if there is no end, to be comfortable with that as a fact. (Explaining something to a child, who answers 'why?' to every answer, can be infuriating - that is metaphysics in the rough!)

In order to use practically what I have outlined, one must feel the reality of it. Viewing such a metaphysics from I-nowhere only is not enough, it must also be seen from I-nothing. In other words, the metaphysics must be applied to itself. Nagel writes that it is 'usually a good strategy to ask whether a general claim about truth or meaning applies to itself' - and in this case I am saying that not only is it a 'good strategy', it is in fact essential.

As I have written above, it is fairly easy, possibly with a little guidance, to have at least a glimpse of the nothingness behind everything we see, do or feel. Since we are all, without exception, looking out of one large eye or window, we can note that fact and see where it leads - particularly what is on our side of that window. It is hard to do only because of the objective habit; it is not hard in any physical sense. (One can do similar experiments with feeling and thinking[26].) And likewise, even though the objective habit is strong in us, we can take that to the limit too: be objective about our own objectivity, and see that since I cannot escape I-nothing, my objectivity must be a view from I-nowhere.

The practice then can be summed up as taking my normal, everyday, workaday view of whatever is in front of me (physically or mentally) and consciously separating that commonsense view out into views from I-nothing and I-nowhere simultaneously. It is a singular fact that this can be done. I take strong concentration on an object as an analogy: usually we think of concentration as exclusive, as focusing all our mental energies on one thing, to the exclusion of all else. But in fact concentration is not like that. The more we concentrate on something, the more we have the ability to be aware of our own concentration, in addition to the object we are concentrating on.

An example might be a musician, playing a piece with rapture, totally absorbed in it. And yet there is some part of her attention which can be aware of her own playing, noting tweaks and adjustments to her technique for next time. Or an athlete 'in the zone', absorbed in his performance, and yet still able to note objectively how he is running in the race, in addition to being absorbed in his own body's functioning and the joy of it. This ability to concentrate on the concentration itself, as well as on the object of concentration, is the basis of some meditation styles[44].

I have found that the effects of consciously bringing my metaphysics into everyday life in this way are remarkable, both in my thinking (generating and understanding new concepts) and in my own inner life generally. These are the topics of the final two sections of this paper.

 

16 - Thinking Bodily

Much of our thinking needs to be logical. As I have expressed above, there is a place for non-logical thought and expression, for instance in the performance of the arts, and also in the performance of apophatic expression of metaphysical first principles, where stable distinctions are not maintainable. But I think our default mode of trying to think and express ourselves should be, and needs to be, logical. If that proves to be difficult in any domain of discourse, then our first step should surely be to reason further, and to try to find suitable distinctions with which to take our thought forward logically. As I have said, that may prove to be impossible, and I believe it is so for the limit-seeking type of metaphysics in the final analysis; but in many, if not most, cases it is not.

Clear distinctions and logical thought do not always need to proceed from other distinctions and logic, however. As a practice, it is possible to delve into the distinctionless, the non-logical, and emerge with clearer and more precise distinctions and logic. An analogy can be made with mathematics and imaginary numbers. In the early 1500s some Italian mathematicians (Cardano, Tartaglia and del Ferro, for instance) were trying to solve the general cubic equation. In certain cases the only way to do so was to work through imaginary (complex) numbers, which were then considered strange, the 'devil's work', madness even. But by following the calculation into this 'madness', you could emerge with respectable whole-number solutions that could be discovered in no other way.
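Bombelli's classic worked case makes this concrete (a sketch of the standard textbook example, not the author's own): for x³ = 15x + 4, Cardano's formula forces a detour through √(-121), yet the two complex cube roots sum to the respectable whole-number root 4.

```python
import cmath

# Cardano's formula for the depressed cubic x^3 = p*x + q, with p = 15, q = 4.
p, q = 15, 4
disc = cmath.sqrt((q / 2) ** 2 - (p / 3) ** 3)   # sqrt(4 - 125) = 11i: the 'madness'

u = (q / 2 + disc) ** (1 / 3)    # principal cube root of 2 + 11i, which is 2 + i
v = (q / 2 - disc) ** (1 / 3)    # principal cube root of 2 - 11i, which is 2 - i

x = u + v                        # the imaginary parts cancel
print(x)                         # ~(4+0j), and indeed 4**3 == 15*4 + 4
```

The intermediate quantities are irreducibly complex, but the destination is a plain natural number - exactly the pattern of delving into the non-logical and emerging with clear distinctions.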

Most people's thoughts are for much of the time idling away in an unstructured and chaotic fashion, from which clear and concise concepts can and do emerge. But here I am suggesting an actual practice for doing this in a disciplined and intentional manner.

To appreciate this practice, it is necessary to heal the split that is common in western intellectual life between the body and the mind. In sections 6 and 7 above I mention Lakoff and Johnson's summary of the latest cognitive science research, particularly that the mind is inherently embodied, and not in the weak sense that we need a body to have a mind, but in the strong sense that concepts are neural and sensorimotor. We are not minds inhabiting a body, nor bodies having a mind, but we are one thing, a mind-body.

As this is a paper on metaphysics, I will not rely on the latest science (which of course is disputed anyway), nor on everyday experience which seems also to deny such a split - even the way we talk does so, such as 'butterflies in the stomach' or 'throat tight with fear'. But I will take it as a metaphysical principle that our body is much more than a flesh-and-blood container for the mind, and that we can think with its sensations and visceral feelings.

The practice is then to view a bodily sensation, a tightness in the stomach say, from the perspective of I-nothing and I-nowhere (either simultaneously, or zig-zagging between the two in a mutually reinforcing epigenesis), while relating it to the area of thought I am trying to think further in. This is not the place to go into the details, but I believe this process is very similar to the practices of Focusing and Thinking at the Edge developed by Gene Gendlin from his Philosophy of the Implicit[45].

Gendlin often gives the example of a poet struggling to write the next line of a poem. There are typically many words or next lines that come to mind, but the interesting thing Gendlin points out is that only one is 'just right'. It is as if the body can hold the poem as a whole, and through bodily sensation vets each suggested next line, finally giving assent to the one that feels right. Another example he gives is the sort of visceral feeling you may have during the day of something not quite right, a gnawing in the stomach that tells you you have forgotten to do something you should have done.

By attending to these physical sensations in a disciplined manner, Gendlin claims, I can react to my situation in a more wholesome and integrated way, and can think better. To think 'better' means to use better-crafted and more pertinent concepts. As most thinkers would understand it, such concepts come from an objective-wards stance: trying to get more objective, obtaining a more inclusive view of all aspects of the topic so as to sharpen the concepts (viewing from I-nowhere). But I have observed in myself that if at the same time I can move subjective-wards, view from I-nothing, and in particular feel how my body holds the issue, as it were, then the interaction between the more subjective and the more objective viewpoints can combine to give a conceptual structure that is both more powerful and more genuine (I use 'genuine' in the sense of Gendlin's 'carrying forward').

 

17 - Beyond The Self

When we use the pronoun 'I' to refer to ourselves (beyond its use as an indexical or placeholder), I suggest there are three main things we can be referring to: I-body, I-self and I-nothing (taking as given the usual apophatic qualifications to calling I-nothing a 'thing', or something that can be referred to at all).

As I have quoted above, the results of cognitive science, as summarised by Lakoff and Johnson, suggest that the body-mind is one unit - what a human being is. When I use 'I' as I-body, I am really meaning I-body-and-mind-both, but I-body is shorter and more succinct. I take the distinction we make between body and mind to be useful to an extent, but taken too far it is artificial, and is perhaps the most damaging part of the worldview we have from western thinking. I have outlined very sketchily in my previous section how one can think more efficiently using body-and-mind as one unit.

I have written above about I-nothing, which I consider the metaphysical limit of subjectivity, and which can be experienced to some extent by anyone willing to take a short period of time to see it (or more accurately, to be it). I-body (strictly, body-and-mind) seems to me also ontologically fundamental, at least to any real live human being. But between the two hovers this other 'I' construct, which I call I-self. This final section resolves the 'self' into I-body and I-nothing, using my framework.

First, to most people, I-self is the most basic. I-self inhabits the body in some sense (Ryle's famous phrase 'the ghost in the machine'), like a homunculus sitting behind my eyes somewhere controlling what I think, what I see, and what I do. Lakoff and Johnson in fact have a whole chapter on the various metaphors we use for our self and how we think about I-self[46]; and while it may be currently popular in philosophy to consider the self as some sort of narrative we create[47], I believe that my emphasis here on a practice based on a metaphysics is a little different.

Blackmore has a thought experiment[19]: you are given a choice... Either you will have your body completely swapped for another body but keep your inner conscious self (I-self), or you will have your inner self swapped for another unspecified self and keep the body. Blackmore suggests that most people would choose the first option, keeping the inner self, because for most people I-self is the most real. I would choose the opposite. Since no experiment can oust me from I-nothing, then if I-body is not changed either (Blackmore's second option) I have all that is necessary for my identity, in all senses of the word. In other words, I-self is a construct of I-nothing and I-body, and is ontologically subordinate to them.

I have mentioned above Blackmore's attempt to derive, at least in principle, much of our mental inner life from meme evolution. She maintains that our sense of self is also a large complex of memes, one which has evolved because the memes it hosts are thereby protected and propagated. This is not the place to explain her views in detail, or to argue for them, only to say that I find her explanation persuasive. She furthermore claims to experience this in her practice of Zen meditation, where the ceaseless chatter of her mind (random memes) quietens, and her sense of self is loosened.

I too practise a meditation, and have done so as a committed practice since 1968. My views on meditation, and the style of meditation I practise, have evolved and changed considerably over the course of nearly forty years[48]. I have recently felt the need to place my practice in context, and once embarked on that project, I saw that I would only be satisfied with the widest and most inclusive context possible - which, to my mind, is close to being a definition of metaphysics.

This widest and most inclusive context ('life, the universe, and everything', in Douglas Adams's famous phrase) is what I have tried to present in this paper: a metaphysics which is both rational (in that it has followed reason to what I consider the utmost, and comes out the other side, as it were) and also experiential (apophatic qualification of course required here).

Blackmore (and Buddhism) claim that there is no self at all. I am not sure what to make of this claim, and I suspect, in true pluralistic style, that much of the controversy over this and related issues is due to a lack of clear distinctions. My metaphysics and my own meditation practice show me clearly that I-self, while certainly real in the sense of existing, is constructed from the views from I-nothing and from I-nowhere; and by consciously separating out those views, it can be seen for the construct it is, with all the sanity and wholeness that that implies.

The practice is then the same in essence as above. It is to view a mental object in the same way as a physical object, simultaneously from I-nothing and I-nowhere (or zig-zagging between the two viewpoints), seeing the same object in Klempner's two worlds at once.

Consider my getting angry. In some cases anger is appropriate, but more often it is inappropriate and results from my misreading the situation in some way. My response these days, if I can, is to apply this binocular vision, as it were, and to view my anger as a mental event in both I-nothing's world and I-nowhere's. In both cases I have to step outside it, but in two directions at the same time. Viewed from I-nothing, it is contained in my awareness, like a picture projected on a screen, but I am myself situated in I-nothing and am aware of both the separation from and the closeness of my anger. It is to be aware of the is-ness of my anger.

To see my anger from I-nowhere, I have to step outside it and see it as one element in the objective world. It will typically be the biggest and most unruly element in that space, but this nevertheless loosens my identification with it, and is a start. I also try to move even more objective-wards and, if I can, see the situation from a viewpoint other than my own, which if successful defuses my anger some more.

In the paragraphs above I used the phrase 'if I can' several times. This is of course the problem with such homely and common-sense advice: 'calm down, it is not what it looks like', or 'look on the bright side of things', or 'cheer up' when I am sad. Obviously when I am sad I want to cheer up, and I know I want to cheer up, so just telling me to cheer up is worse than useless; it is intensely annoying. I don't need to be told to cheer up when I am sad, as if I had forgotten; I want to know how to cheer up. It is the same with my anger - I typically know that it would be to my advantage if I could objectify, but the problem is that I am in the grip of it, and cannot or will not do so, even though part of me knows that would be best.

That is why I also need to subjectify - go 'nose to nose' with my anger, in Jon Kabat-Zinn's marvellous phrase - feel it through and through, sink into it. Again, if I only do this, if I only subjectify, I will quite likely simply reinforce it and encourage it to overwhelm me. But if I do both together, subjectify and objectify at the same time, look at my anger with binocular vision, through one lens of up close and personal from I-nothing, and one lens of including it as an element in a wider context from I-nowhere, then I have found that something almost magical can happen. True wisdom and stability can emerge which would be much harder to obtain through one viewpoint alone.

In fact, I can go a lot further than this if I wish to. I can even love other people, and perhaps if I take it even further I can love myself. As my thinking has developed along the lines of this essay, I have come to almost define loving others as simply seeing them through this binocular vision. The same paradigm holds as I have described above. As I interact with someone, there is typically a whole series of viewpoints involved, averaging out to a general view of them. I separate out this viewpoint to view them as objectively as I can, and simultaneously as subjectively as I can, and I see them through my large single eye, out there, but also filling up my field of vision in a way that does not conflict or clash with my emotions and personal issues.

I see them like I see myself in the mirror, human-shaped with two eyes. If my view of myself is out there too, as I see myself in the mirror, then there is the potential for a clash of two similar entities. But if I don't exist as a two-eyed being out there, one among many, but as a singular being in here, including the world and the human I am facing within my large matchless single eye, then there is nothing for that human out there to clash with, and true love is possible.

 

Acknowledgements

Much of my experience of I-nothing is due to Douglas Harding, and my being able to philosophise about it is due to Josiah Hinks. Geoffrey Klempner has given me his time in commenting on this paper. I am grateful to all three, and to countless others who have shaped my thought from interactions that I can now no longer remember.

 

Notes

[1] Much has been written about the Laws of Thought. The phrase became particularly popular with George Boole's 1854 treatise on logic: An Investigation of the Laws of Thought. [resume]

[2] Zermelo-Fraenkel set theory (with the axiom of choice) was created by Zermelo, and later refined by Fraenkel and Skolem, to avoid the paradoxes of Cantor's 'naive' set theory. It consists of a single primitive ontological notion, that of set, and a single ontological assumption, namely that all distinct entities in the universe of discourse (usually taken to be all mathematical objects, but sometimes all objects, real or imagined, physical or mental) are sets. There is a single primitive binary relation, set membership. [resume]

[3] George Lakoff and Mark Johnson 'Philosophy in the Flesh' (Basic Books, Perseus, 1999) [resume]

[4] Haack, S. (1996). Deviant Logic, Fuzzy Logic: Beyond the Formalism. Chicago: The University of Chicago Press. [resume]

[5] For an excellent introduction to fuzzy logic and set theory, and the issues surrounding gradations of truth, see 'Fuzzy Logic' by Daniel McNeill and Paul Freiberger (Touchstone, Simon and Schuster 1993) [resume]

[6] Graham Priest, JC Beall, and Bradley Armour-Garb (eds.), The Law of Non-Contradiction: New Philosophical Essays, Oxford University Press, 2004. [resume]

[7] An almost mystical significance to drawing a distinction is the basis of G. Spencer-Brown's 'Laws of Form', first published in 1969 (Crown Publishers, 1972). I have been thinking in the way I describe in this section for many years, and in 1971 I was discussing my idea of distinguishing as a fundamental human act with Jerry Ravetz, then Reader in the History and Philosophy of Science at Leeds University, who knew Spencer-Brown and told me about his 'Laws of Form', which had then only just been published. Apart from the fundamental significance of creating a distinction, my ideas and those of Spencer-Brown have little in common. [resume]

[8] Lew Lefton, quoted in J. de Pillis '777 Mathematical Conversation Starters' (The Mathematical Association of America, 2002) p.156. He actually said 'you either believe in the law of the excluded middle, or you don't'; I have changed 'believe' to 'accept'. [resume]

[9] For instance, Plato argues for them in the Sophist 237A, and the Republic 4:436b. Aristotle argues for them in, for example, Metaphysics G, 3&4; De Interpretatione 11, 21a32-33; Topics IV 1, 121a22-4; and Sophistical Refutations 5, 167a1-6. [resume]

[10] "The same attribute cannot at the same time belong and not belong to the same subject and in the same respect." Aristotle: Metaphysics G, 3,1005b18-20. [resume]

[11] R.L.Nettleship 'Lectures on the Republic of Plato' (Macmillan 1901) p.220 [resume]

[12] Aristotle: Metaphysics (trans. Hippocrates G. Apostle, Bloomington, Indiana University Press, 1966, p.60) [resume]

[13] Thomas Nagel: The Last Word (Oxford University Press, 1997) p.5 [resume]

[14] Quoted in David H. Richter 'Pluralism at the Millennium', online at http://qcpages.qc.cuny.edu/ENGLISH/Staff/richter/Pluralism.html [resume]

[15] Richard McKeon (died 1985) was a philosopher who deserves to be better known. His students have many stories about him, and many of them became famous in their own right, such as Rorty, Pirsig, Susan Sontag and Gene Gendlin. According to a charming eulogy by one of his former students (http://net-prophet.net/mckeon/mckeon.htm), he was possibly the widest-read philosopher of the last century; this student found a book in the library by a Greek philosopher which had been read only once, by McKeon, in its entire 40-year history there. [resume]

[16] McKeon's Philosophical Semantics changed over time, and there are several different, but almost equivalent, formulations of it. All the basic papers around the subject are collected in 'Selected Writings of Richard McKeon; Vol 1 ' eds. Z.K.McKeon and W.G.Swenson (University of Chicago Press, 1998). An online version is at http://net-prophet.net/mckeon/17/2_text.htm. [resume]

[17] Thomas Nagel: The Last Word (Oxford University Press, 1997) p.28 [resume]

[18] John Keats, letter to J H Reynolds May 3 1818 [resume]

[19] Susan Blackmore makes the human ability to imitate the cornerstone of her book The Meme Machine (Oxford University Press 1999). She distinguishes carefully between true imitation, contagion and social learning. Although true imitation can occur in animals (eg parrots imitating human speech), it is only common and basic in humans. [resume]

[20] George Lakoff and Mark Johnson 'Philosophy in the Flesh' (Basic Books, Perseus, 1999) and a website devoted to maintaining their work: http://zakros.ucsd.edu/~trohrer/metaphor/metaphor.htm [resume]

[21] Eleanor Rosch in Timothy Moore ed. 'Cognitive Development and the Acquisition of Language' New York, Academic Press, 1973, pp 111-144 [resume]

[22] A comprehensive list of references is given in Lakoff & Johnson (note [20] above), reference list A4, p.588. [resume]

[23] Eleanor Rosch in 'Cognition and Categorization' Hillsdale, NJ, Lawrence Erlbaum Associates, 1978, pp 27-48 [resume]

[24] The literature on metaphysical, mystical, and consciousness conclusions derived from the use of 'I' is large. See for example the references in: Louise Röska-Hardy "'I' and the First Person Perspective" online at http://www.bu.edu/wcp/Papers/Lang/LangRosk.htm, or just skim Google! [resume]

[25] Epoche is a term introduced by the Pyrrhonian sceptics, and adopted by Husserl and the phenomenologists. It means different things to different people. Montaigne defined it as 'I sustain, I do not move...a pure, total and perfect stay and suspension of our judgement'. N.Depraz, F J Varela and P.Vermersch in 'On Becoming Aware: A pragmatics of experiencing' (John Benjamins, Amsterdam and Philadelphia, 2003) describe what they call the 'basic cycle of epoche' (p.24ff) comprising the three iterative steps I allude to. [resume]

[26] Douglas Harding, still alive and teaching at 97, sometimes calls his way of inquiry 'headlessness': http://www.headless.org/English/main.html I read his best-known book 'On Having No Head: Zen and the Rediscovery of the Obvious' (US: Inner Directions Foundation, 2002; UK: Arkana, 1991) on a plane journey, and during the flight looked out of the plane window, mentally rehearsing the 'experiments', and I instantly 'got it'. There was (and is) no doubt whatsoever that at heart I am nothing, the closer I look, the more obvious it is that there is nothing there! I am grateful to Douglas for this insight (which is not really an insight, of course!). The phrase 'aware nothingness' is his. [resume]

[27] This is St Augustine's response: 'This contradiction is to be passed over in silence rather than resolved verbally' - Augustine: 'On Christian Doctrine' trans. D W Robertson Jr (Bobbs-Merrill, Indianapolis, 1958) p.11. Also of course Wittgenstein famously ended his Tractatus with the words 'Whereof one cannot speak, thereof one must be silent.' [resume]

[28] Many of the ideas in this 'Unsaying' section are from Michael Sells 'Mystical Languages of Unsaying' (University of Chicago Press, 1994), and his paper 'Apophasis in Plotinus: A Critical Approach' (Harvard Theological Review 78:3-4 (1985)) pp.47-65. I find most academic studies on mysticism and theology tedious; for me they miss the point. Michael Sells is one of the very few academics whom I find helpful in this area, and who has original perspectives on the texts. [resume]

[29] Plotinus, Enneads 6.4.7.32-38 [resume]

[30] Oscar Wilde, Mr Erskine in 'The Picture of Dorian Gray' (1891) chapter 3. [resume]

[31] Epimenides, a Cretan, reportedly wrote in the 6th century BC 'The Cretans are always liars', which is mentioned in the Bible (Titus 1:12) although it looks like Paul misunderstood it completely. The oldest written record of the Liar paradox is by Eubulides of Miletus in the 4th century BC. A good review of the Liar paradox, both historical and current, with many references, is at: http://www.iep.utm.edu/p/par-liar.htm [resume]

[32] Plotinus, Enneads 5.5.6.23 [resume]

[33] Graham Priest 'Beyond the Limits of Thought' (2nd edition, Clarendon, Oxford, 2002) [resume]

[34] Priest presents his 'schema' in several different formulations. Ref [33] above contains some of them. [resume]

[35] Colin McGinn first published his 'cognitive closure' argument in Mind, 1989. It is an argument that we cannot understand our own consciousness using that consciousness. I say it is 'analogous' to my position because I am not discussing consciousness per se, but the metaphysical I. Clearly the two are related, and I discuss this later in the main text. [resume]

[36] Don Perlis: 'Consciousness as self-function' (Journal of Consciousness Studies, Volume 4, Issue 5/6, 1997) pp.509-25. Perlis's work has been criticised, particularly by Damjan Bojadziev (Journal of Consciousness Studies, Volume 7, No. 5, May 2000) citing Gödel and self-referential sentences, but Perlis responded in the same issue. To me Perlis's point is undeniable - there is something fundamentally different about me, a conscious agent involved in self-reference, compared with the formal self-reference of, say, a Gödel sentence or the Liar. [resume]

[37] Gotthard Günther 'Idea and Outline of a Non-Aristotelian Logic', 1959 (Idee und Grundriss einer nicht-Aristotelischen Logik). [resume]

[38] Geoffrey Klempner 'Naive Metaphysics' (Avebury 1994, but now a free online book, downloadable at: http://www.philosophypathways.com/download/metaphysics.zip) [resume]

[39] Blackmore et al 'Is the richness of our visual world an illusion? - Trans-saccadic memory for complex scenes' online at http://www.susanblackmore.co.uk/Articles/perc1995.htm. For a comprehensive list of online references to this latest thinking on vision, see the references on David Chalmers's page at: http://consc.net/online3.html#change [resume]

[40] U.S. Census Bureau, online at http://www.census.gov/ipc/www/world.html [resume]

[41] See for instance the work of psychologist Arthur J. Deikman 'I = Awareness' (Journal of Consciousness Studies, 3, No. 4, 1996), pp. 350-6; and Robert K.C. Forman 'What Does Mysticism Have To Teach Us About Consciousness?' (Journal of Consciousness Studies, 5, No.2, 1998), pp. 185-201 [resume]

[42] Steve Taylor 'The Fall' (O Books, 2005). A more extreme version of Taylor's hypothesis exists, spear-headed by J.Jaynes 'The Origins of Consciousness in the Breakdown of the Bicameral Mind' (Pelican 1976), which maintains that before 4000 BCE there was no reason or objectification, people 'reasoned' by hearing voices in their heads. Taylor writes in terms of a process of increasing objectification. [resume]

[43] For a popular exposition, see 'Out of the Void' and references therein (New Scientist, Aug 12 2006, p.5 and pp.28-31) - online http://www.newscientist.com/contents/issue/2564.html Alex Vilenkin has also written much about the creation of the universe out of nothing following his seminal paper 'Predictions from Quantum Cosmology' in Physical Review Letters 74 846 (1995) [resume]

[44] The Theravada Buddhist tradition of jhana, for example. The Pali Canon gives the example of a person lying down and also sitting up and still being aware of himself lying down. (Anguttara Nikaya V.28). And then standing up and being aware of himself both lying down and sitting, etc. These verses are normally translated as if it is a different person sitting to the one lying down, but the Pali also allows for the former interpretation which I believe is correct. In any case, it is certainly used as a metaphor for jhana. See my online article: http://mikefinch.com/md/bud/bm.htm (See other articles on the same site for my view that the 'jhana' of the Pali Canon is quite different from the meanings of 'jhana' as written about in the Buddhist commentaries, which to me are quite weird, but unfortunately prevalent in western Buddhism). [resume]

[45] Eugene Gendlin has written many books and papers. Many of these papers are online, and for a bibliography and an outline of Gene's philosophical stance start with http://www.focusing.org/philo.html. I have found the best book to be a collection of articles by his critics, each of which has a response by Gene, with an introduction by Gene summarising his thinking as a whole: 'Language Beyond Postmodernism: Saying and Thinking in Gendlin's Philosophy', edited by David Michael Levin (Evanston: Northwestern University Press). I have met Gene, and found that he truly walks his talk, meaning that his Focusing is a practical practice (many practices in my experience are not practical!). It was due to my interactions with him, and particularly with Josiah Hincks, who showed me how to practise Gene's philosophy, that my own ideas became clearer. [resume]

[46] Note [3] above, chapter 13. The preceding chapter, 12, on the metaphors for the Mind, covers much of this area too. Unfortunately they confusingly use the word 'subject' to cover much of what I mean by I-self. I think some of their 'subject' is I-self, and possibly some is also I-nothing. [resume]

[47] See for example: Raymond Martin and John Barresi 'The Rise and Fall of Soul and Self: An Intellectual History of Personal Identity' (Columbia University Press 2006) [resume]

[48] I write about my meditation practice on my website, and a collection of my essays on the topics of this paper is at http://mikefinch.com/md/sy/intro.htm. [resume]

[Return to list of Synthesis articles]    [next article] Last revised Sep 17 2006

Copyright © 2001 - 2016 Michael R Finch
All Rights Reserved