Cosmology, Quantum Mechanics & Consciousness Message Board

Consciousness, by Andrew

A former member
Post #: 58
Hi Andrew,

We disagree again, but quite subtly.

More specifically, I think you are right but irrelevant.

I think you are right that QM uncertainty does mean that the Universe is indeed non-deterministic, but even on that point I am not absolutely certain.


However:

I think we can envision a genuinely deterministic Universe - one in which the Greeks were right and the atom was unsplittable and QM did not exist. In such a Universe I believe that consciousness et al. would still emerge. In my view the unpredictability offered by non-analytic systems is sufficient.

If I am right on this point, then we can safely ignore QM effects and treat the Universe as deterministic as per the definition supplied by Ian and Camilla. In this case your view is indeed correct but, to my mind, irrelevant.

On the other hand, if I am wrong, then QM uncertainty may be critical to our discussion. Both Ian and Camilla regularly assert that I am just plain wrong, but as far as I can see their arguments so far amount only to "proof by blatant assertion". I eagerly await some cogent exposition of the flaws in my thinking...



Peter
Camilla M.
user 7151822
London, GB
Post #: 18
There are some controversial claims in your home-made definitions.
I am sure you would rather I said nothing at all on this thread but these points might be helpful.
I can tell you what I won't be doing, though: spending several messages on rule-setting, or trawling through them for conventions on underlining and italicising, which has reached the point of mania by now!

I doubt if there is a single animal on the planet, however small and primitive, that does not process sensory information to form some kind of mental representation of its environment,

This is not agreed by all parties, by any means. Ever since a famous book on AI about human cognition and computing, there has been the idea that external relations (and, for humankind, socialisation) are the main ordering mechanism. It was generally called connectionism. The point was that there was no content in the form of mental representation. I suppose databases in the form of biological memory kept everything materialistic.

There is plenty of modern evidence supporting the view that movement and the motor cortex are the most important part of learning, contrary perhaps to our intuitions to do with pictorial representation. This was prominent in the recent Marcus du Sautoy programme about robot learning.

Then you say:

The real question of course is whether human consciousness is qualitatively different from awareness as defined above. That must be so, since I do believe that my state of conscious awareness is quite different from the awareness possessed by a sea slug (though the alternative has sometimes been argued). I don't believe that a sea slug forms a mental picture of the world the way that I do. But I do expect that higher animals such as birds and mammals have mental pictures. So it seems to me that consciousness is a higher level of awareness in which the mental representation of the environment becomes a mental picture.

Ah, I see why you might be agreeing with Peter that consciousness is more awareness - but with a picture 'content'.
The point I was going to make, however, is that your "That must be so" above is unargued for. Just because you choose to believe it is no reason for anyone else to do so.

Ditto, the expectation that higher animals have mental pictures.
On the connectionist account they probably would not, additionally because of the lack of symbol and language manipulation. I am just saying that what you've assumed is not universally accepted.
I don't myself believe that animals are incapable of thought and some kind of symbolisation, but this is based upon a theory that their thinking, and ours, is rehearsed movement. It's performative. The picture part would not be thinking in my lexicon and remains to be explained.

The last part that could be problematic is:
2. Consciousness is very much about deciding physical actions. (Your actions while asleep are involuntary.)
This notion of consciousness is right at the interface between the senses and the responses in physical action.
3. Self-awareness is merely a by-product of consciousness in which the self forms part of the mental picture.


2) Neuroscience is currently revealing that consciousness may not be involved in any decision making.
In sleep we usually have sleep paralysis to stop us wandering, but some thoughts and decisions seem to go on.
3) Self-awareness can be construed as part of "access consciousness", i.e. just recursion in ordinary awareness. As above, this does not rely on mental pictures or content.

Don't shoot the messenger, I'm simply reporting what's commonly out there.

A former member
Post #: 59
Dear Camilla,


Fascinating.

Andrew: I doubt if there is a single animal on the planet, however small and primitive, that does not process sensory information to form some kind of mental representation of its environment,

Camilla: This is not agreed by all parties, by any means. Ever since a famous book on AI about human cognition and computing, there has been the idea that external relations (and, for humankind, socialisation) are the main ordering mechanism. It was generally called connectionism. The point was that there was no content in the form of mental representation. I suppose databases in the form of biological memory kept everything materialistic.

There is plenty of modern evidence supporting the view that movement and the motor cortex are the most important part of learning, contrary perhaps to our intuitions to do with pictorial representation. This was prominent in the recent Marcus du Sautoy programme about robot learning.


I am struggling to understand what a brain that fails to form some mental representation of its environment is doing.

If the brain is performing any computation at all that in any way models the state of the entity of which that brain is a part, plus its environment, then it must form some kind of mental representation of its environment by virtue of the definition of mental representation.

Or has the term mental representation been loaded with some additional meaning or requirement which might make this apparent truism false?




Then you say:

The real question of course is whether human consciousness is qualitatively different from awareness as defined above. That must be so, since I do believe that my state of conscious awareness is quite different from the awareness possessed by a sea slug (though the alternative has sometimes been argued). I don't believe that a sea slug forms a mental picture of the world the way that I do. But I do expect that higher animals such as birds and mammals have mental pictures. So it seems to me that consciousness is a higher level of awareness in which the mental representation of the environment becomes a mental picture.

Ah, I see why you might be agreeing with Peter that consciousness is more awareness - but with a picture 'content'.
but as I understand it Andrew does not agree with me on this point, lol

The point I was going to make, however, is that your "That must be so" above is unargued for. Just because you choose to believe it is no reason for anyone else to do so.

Indeed, this is precisely where Andrew and I differ. I do not believe that "my state of conscious awareness is quite different from the awareness possessed by a sea slug" except by virtue of the fact that my brain has massively more power than a sea slug's and that this quantitative difference creates a (possibly only apparent) qualitative difference.


Ditto, the expectation that higher animals have mental pictures.
On the connectionist account they probably would not, additionally because of the lack of symbol and language manipulation. I am just saying that what you've assumed is not universally accepted.
I don't myself believe that animals are incapable of thought and some kind of symbolisation, but this is based upon a theory that their thinking, and ours, is rehearsed movement. It's performative. The picture part would not be thinking in my lexicon and remains to be explained.

This concept of a mental picture does trouble me. If there is any suggestion that it is indeed "pictorial" then I fear we are trying to report on a p-conscious event. As it happens I don't think that's what Andrew meant, but as I'm guessing I'll say no more for now.

The last part that could be problematic is:
2. Consciousness is very much about deciding physical actions. (Your actions while asleep are involuntary.)
This notion of consciousness is right at the interface between the senses and the responses in physical action.
3. Self-awareness is merely a by-product of consciousness in which the self forms part of the mental picture.


2) Neuroscience is currently revealing that consciousness may not be involved in any decision making.
In sleep we usually have sleep paralysis to stop us wandering, but some thoughts and decisions seem to go on.
3) Self-awareness can be construed as part of "access consciousness", i.e. just recursion in ordinary awareness. As above, this does not rely on mental pictures or content.


In order for Neuroscience to reveal anything of the sort it would need to have a very rigorous definition of consciousness. The whole point of this discussion is that we do not possess any such definition and we are seeking one. What definition of consciousness is employed in Neuroscience to reach this result?

I think Andrew is completely right on point 3.



Peter
A former member
Post #: 84
Peter: I agree with your comments in your last post except at 'report on a p-conscious event'. The house rules on this thread apply to everyone and you will have to re-phrase in plain English if you wish me to reply. Apologies but I am not going to search through the huge message list to check exactly what your phrase means.

Camilla: by all means let's check out the definitions. Specific comments as follows:

(a) You say that some researchers claim that some animals do not carry any form of mental representation of their environment. I doubt that! What about some specifics? Are we talking perhaps of ant colonies for example? Is the suggestion being made that the colony works purely by chemical messengers in the air driving individual actions and thereby maintaining the colony? If so then my response is to point out that each ant, in order to perform its job within the colony, must have the ability to receive some kind of internal nerve signal from its chemical sensor and then process that signal into an appropriate instruction to perform body movements. There you have an example of the most primitive mental representation of the environment to which my definition refers. Food ahead, colony behind, etc. With that perspective my definition is pretty robust, is it not?

(b) As for the relative consciousness of a sea slug and a human, Peter is saying that he differs from the sea slug quantitatively, and I say that I differ from the sea slug (possibly therefore also from Peter!) qualitatively, whilst you say that I have not argued my point. Quite so; let me do so now. It seems absolutely crucial to this whole discussion.

To recap, awareness is defined here as internal mental representation of external environment. This is about digital representation in a computer or sub-conscious knowledge in a person. I distinguish consciousness in terms of an additional feature which has been described as pictorial, involving a vivid mental picture rather than a digital or sub-conscious repository of data. But for us consciousness is more than pictorial. It involves the five senses of sight, sound, smell, taste and touch. I envisage that the brain contains the organic equivalent of a 3d cinema screen on which it projects the real-time movie of what is seen by our eyes. It contains another 'screen' which collates all the inputs from the ears and delivers a nice surround sound experience to go with the visual screen. The other senses likewise, in their own ways. But vision is our primary sense and that is the focus of our consciousness. For dogs their mental picture of the world may be dominated by the sense of smell. For bats, the mental picture is created from echo sounds, and so on.

So we must have at least five mental 'screens' which present the results of processing the raw data from all our sensory organs. There must be more than five however, because our conscious experience involves recognition of objects within view, names, associations, connections, the whole panoply.

Basically the brain has to boil down a huge volume of parallel sensory inputs to a few key screens and issues. From an evolutionary perspective the task would often be to decide in a split second between fight or flight. Consciousness is to do with turning all that parallel data, in stages, into sequential decisions. This applies for animals and humans equally, at this stage of description.
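
To show nothing more than the shape of the data flow I have in mind (a toy sketch in Python, not a claim about how neurons actually manage it; every name and number is invented for the example), something like this would do:

    # Toy illustration of "many parallel sensory inputs boiled down, in stages,
    # to a single sequential decision". All names and figures are made up;
    # nothing is claimed about real neural processing.

    def summarise(sense_name, readings):
        """Stage 1: each sense reduces its raw parallel data to one 'screen' summary."""
        return {"sense": sense_name, "salience": max(readings)}

    def decide(screens):
        """Stage 2: the per-sense screens are compared and a single action is issued."""
        most_urgent = max(screens, key=lambda s: s["salience"])
        return "flee" if most_urgent["salience"] > 0.8 else "carry on"

    # A volume of parallel raw input arriving per sense (made-up figures).
    raw = {
        "vision":  [0.1, 0.9, 0.3],   # something looms in the visual field
        "hearing": [0.2, 0.4],
        "smell":   [0.1],
    }

    screens = [summarise(name, readings) for name, readings in raw.items()]
    print(decide(screens))            # one sequential decision: "flee"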

Now you may say that I have just invented all that; it sounds plausible but I have proved nothing. Peter, it seems, does not view consciousness anything like this way. But here is the simple proof: we have dreams! A vivid visual dream proves that the mind does have that internal screen for the sense of vision. If you dream in sound then you have demonstrated that your brain has an internal 'screen' for sound. This I think is what people don't understand, and what has Ian tearing out his hair because he cannot get them to understand.

Of course, to prove that I am right and Peter is wrong on this point I need to demonstrate that the sea slug does not share my sort of conscious experience. Well, I don't expect that sea slugs dream, do you? Their nervous systems are too primitive. Maybe there is some research to identify which animals dream and which do not. I have seen reports that at least some animals such as dogs and rats do dream, so this is an area amenable to investigation. It is obvious, is it not, that the most primitive animals cannot have the mental apparatus that we have in order to generate conscious experience of our sort. Those animals are strictly sub-conscious automatons (no Latin here!). As indeed are we when we sleep and are not dreaming.

So my point has now been argued.

(c) You say that neuroscience is suggesting that consciousness is not involved in decision making. As I said to Ian recently, the research in this area, and the implication that decisions are made earlier than conscious awareness of taking the decision, are far from conclusive. On the contrary there is a very strong, I would say compelling, argument that evolution and survival of the fittest have produced consciousness as the best basis for animals to make fast effective decisions. I expect it will be found that instinctive decision making by animals and humans alike is sometimes, maybe often, based upon subconscious processing of sensory data. For example warnings of extreme danger could be sent from deep within the visual cortex well before the picture has been fully presented to the conscious 'screen'. I also expect that sometimes a decision to act will not be taken until the full conscious picture has been generated, because of the need to evaluate the situation. So it's not a simple matter; simple generalisations or assertions are premature.

In any case we humans have the extra ability to think about our own thoughts and feelings and to make considered, not instinctive, decisions. In such cases it is reasonably clear that conscious thought precedes decision. At any rate I reckon that Einstein did his work that way round!
A former member
Post #: 64
Hi Andrew,

I stand corrected. I will try to do better. lol


Peter: I agree with your comments in your last post except at 'report on a p-conscious event'. The house rules on this thread apply to everyone and you will have to re-phrase in plain English if you wish me to reply. Apologies but I am not going to search through the huge message list to check exactly what your phrase means.


I originally said:

This concept of a mental picture does trouble me. If there is any suggestion that it is indeed "pictorial" then I fear we are trying to report on a p-conscious event. As it happens I don't think that's what Andrew meant, but as I'm guessing I'll say no more for now.

so I was referring to your phrase "mental picture" and to Camilla's comments on it. You use that phrase to try to explain your idea of how consciousness differs from awareness, which is a point where you and I differ, so I find it harder to understand what you are really trying to say. If we take that phrase literally, which is how Camilla seemed to be taking it, it does look as if you are trying to suggest that consciousness necessarily equates to thinking in pictures. I was trying to make two points:
1. Though I didn't understand what you were trying to say, I doubted that you were suggesting that "consciousness necessarily equates to thinking in pictures".
2. But if that is what you were suggesting, I felt that it was close to trying to do what we have all agreed (I think) is impossible, namely to objectively describe the sort of sensations and subjective experiences that occur inside our minds.


Peter

Ian B.
user 10895495
London, GB
Post #: 115

Hello Andrew. Exciting development! I'm replying to your first mail in the thread:


OK Ian, about your question 'What is consciousness?' I will try to tackle this using plain English at all times ..

(Duly chastised .. )

and if anyone does not understand my meaning they should say so please. Equally if I do not understand what anyone else is saying I will ask for full clarification in terms of plain English before we can proceed.

Right then, here are some home-made definitions:

An 'information processor' is defined as any kind of entity that accepts data or physical inputs, that processes that data and uses the results of that processing to generate data or physical outputs. (Thus, both computers and humans fit the definition.)

Agreed.

An 'autonomous information processor' is defined as an information processor that acts without external programming or control of its ongoing processes. (Adaptive neural networks are intended to fit this definition.)

Agreed.

An 'automaton' is defined as an autonomous information processor whose inputs and outputs refer to its own environment. (So inputs correspond to sensory inputs and outputs correspond to directed physical actions. Humans and animals fit this definition, but probably no robots at present. Future robots that are unprogrammed, instead fully relying on adaptive neural networks, would qualify.)

Agreed.

An automaton is defined to be 'aware' if it applies its processes to formulate and maintain an internal representation of the world in order to inform its output choices.

Good. You’re drawing the essential distinction between some system – whether human or other advanced animal on the one hand, or instead some very advanced technology on the other – supporting some internally, physically wired up intellectual construct, rather than it being conscious.
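
Purely to illustrate how your four definitions nest inside one another (a toy sketch in Python; every class and name in it is my own invention, and no claim is made about real brains or robots):

    # Toy sketch of the four nested definitions. Illustrative only.

    class InformationProcessor:
        """Accepts inputs, processes them, and produces outputs."""
        def process(self, inputs):
            return list(inputs)                    # placeholder transformation

    class AutonomousProcessor(InformationProcessor):
        """Runs its ongoing processing without external programming or control."""
        def step(self, inputs):
            return self.process(inputs)            # no outside agent steers this step

    class Automaton(AutonomousProcessor):
        """Inputs are sensory readings of its own environment; outputs are actions."""
        def act(self, sensory_inputs):
            return self.step(sensory_inputs)

    class AwareAutomaton(Automaton):
        """Maintains an internal representation of the world and uses it to choose actions."""
        def __init__(self):
            self.world_model = {}                  # the 'internal representation'
        def act(self, sensory_inputs):
            for position, reading in enumerate(sensory_inputs):
                self.world_model[position] = reading          # keep the representation current
            return self.step(list(self.world_model.values())) # actions informed by the model

    slug = AwareAutomaton()
    print(slug.act([0.2, 0.7]))                    # outputs chosen via the world model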

Software, of course, embodies our decisions-in-advance as to how we – or in “expert systems”, the most able domain-specialists on the planet – would deal with certain contingencies if they were to arise. (The effectiveness with which such contingencies are actually anticipated is a good indication of the calibre of both the software writing and the prior think-tanking.)

That is – as briefly argued for in my own paper – thinking is a class of virtual operations. It is a series of rehearsals as to how we would meet situations were they to arise. That is to say, again, that, ironically, thinking “belongs to the external world” rather than “to us”. Most of the templates for the styles of thinking of each of us as individuals are supplied courtesy of the overlapping cultures in which each of us is embedded. We draw on the accumulated expertise of at least tens of thousands of years of the experiences, the trial-and-error responses, and, since, say, the Greeks, the cultivated capacity to think serially, hierarchically, building on the work of previous generations in a highly constrained, disciplined way, on the basis of definitions, postulates, axioms and proven theorems. The ideal model for this is of course mathematics, but it’s also clearly present in board games such as chess, and, allied to a systematic observation and classification of how the non-human world works, the “game” becomes that of the natural sciences.

It is only the fact that we are able voluntarily to “stop thinking aloud” – which restraint we develop out of social necessity as very young children – which makes us consider thinking to be “inner” and “private” rather than an attempt made by some physical system to construct some near-isomorphic representation of some domain of behaviour outside itself, but this possibility of deliberate non-disclosure is only a very recent cultural mutation. It is an accident, a contingency. It doesn’t impinge on the deeper logic of the distinction between consciousness, the vehicle, and thinking, which can be only a part of the contents of consciousness, some of the time. The fact that these intellectual constructs – which we truncate according to the contexts of the situations in which we think them and which we somewhat arbitrarily label “thoughts” – can be communicated either by speech, text or otherwise shows that they are not “private” in the sense in which an examination of true consciousness itself discloses to us that it in itself – sensory consciousness, phenomenal consciousness, your own poetically described “inner mental illumination”, Andrew – is private, since (to labour the point yet again, for which I apologise) a moment’s reflection shows us that we cannot communicate the particular “flavour” or “colour” of even a single one of any of our multifarious internal sensory representations, such as “seeing green” or “smelling fresh coffee”. These latter are definitely not intellectual constructs. They can only be referred to, rather than reported, in Peter’s definitional sense. One can only succeed in communicating that one has just smelled freshly roasted coffee, and assume that one’s interlocutor can make sense of, even try to imagine, what you yourself are experiencing on the basis of their own near-isomorphic experiences. I would claim, rather, that they are the foundational building blocks from which intellectual constructs arise in the first place.

.. Which is why they cannot be communicated!


This definition attributes 'awareness' to all animals. I doubt if there is a single animal on the planet, however small and primitive, that does not process sensory information to form some kind of mental representation of its environment, in order to enhance the survival prospects for its genes.

Agreed. (Not that such animals “know” that they are attempting to enhance the survival prospects of their genes!)

The real question of course is whether human consciousness is qualitatively different from awareness as defined above.

Clearly “the collective human intellect” – only a minute proportion of which can be masterfully known to each one of us – is incomparably different from and richer than that of any other species, which is why we are so in awe of ourselves, but, again, I must stress that all this is an accident of the geologically recent, explosive growth in human knowledge. It could not have developed without the prior basis of conscious, sensory awareness, and it’s by no means clear where the evolutionary “downward cut-off point” lies. It presumably arose at different times within different phylogenies, thus arising independently in evolutionary terms again and again and again, like the eye, for instance.

The irony of the mystery of consciousness is precisely (I maintain) that it’s an entirely “animal”, “basic” phenomenon which is in a sense “too far below the waterline” for us to comprehend easily. It certainly does not consist in our abilities to contemplate the works of the composers of classical music, or to understand the proof of Fermat’s Last Theorem, to play a winning chess strategy, or to formulate approaches to the solution of the problem of consciousness!

[Continued .. ]

Ian B.
user 10895495
London, GB
Post #: 116

[Continued .. ]

That must be so, since I do believe that my state of conscious awareness is quite different from the awareness possessed by a sea slug (though the alternative has sometimes been argued). I don't believe that a sea slug forms a mental picture of the world the way that I do.

Right, Andrew, I’m with you so far, although you might have been slightly dismayed at my swift marginalisation of the assumed centrality of human culture to the question of the nature of consciousness, but I feel obliged to point out that it’s your – entirely unaware, I’m sure – smuggling in of the not-even-concept “mental” which is IMV one of the biggest bars against thinkers – and particularly philosophers-of-mind in general! – succeeding in getting even a preliminary handle on how to insinuate one’s feelers into the problem sufficient unto thinking out any scientifically acceptable solution. Science after all ultimately deals in physical measurables (I’m saying this as a biologist, for Pete’s sake!) and scientific understanding ultimately resides only in being able to equate without either deficit or residue one set of such measurables with any other relevant set, to be able to navigate around the relevant theoretical landscape whilst simultaneously being able to track and anticipate the physical circumstances which form such theories’ subject matter. I appreciate that you last time found my disparaging of “mentality” repulsive and assumed perhaps that I was deliberately missing the point for ideological reasons, but I wasn’t. I naturally concede your point that, surely, every competent speaker of English immediately knows what one is getting at whenever one uses such terms as “the mind” or “mental”, but I’m claiming that that’s not "really" the case. I am saying in all seriousness that talk of “mentality” is effectively as pernicious as talk of “spirituality”. What are these people actually talking about? Nothing! Mentalese talk does indeed make sense in social, colloquial, scientifically uninteresting contexts, but no further. Perhaps that is only due to the fact that English in particular uses a wide range of phrases for use in utterly trivial contexts such as “I’ve made up my mind” or “I’ve a good mind to do (so and so)”

But I do expect that higher animals such as birds and mammals have mental pictures.

Yes I entirely agree that their “inner conscious worlds” are as rich and mysterious as that of each one of us. They too exemplify The Hard Problem!

So it seems to me that consciousness is a higher level of awareness in which the mental representation of the environment becomes a mental picture. Of course this definition is woolly. The key question is whether there is any qualitative difference between awareness and consciousness.

Well I think the 2 of us – and certainly also Camilla – concur in thinking that there simply is no continuum between chalk and cheese! If there is smooth continuity between the extremes of mere Turing computability “at one end” and fully-fledged bells, whistles, agonising pains and falling in love at the other, then each extreme must somehow – Yin and Ya-ang style, man (picture me smoking a joint) – contain an element of the other. Clearly consciousness is all about qualities (secondary qualities, so I claim) whereas if you casually talk of “mental representation” then what are you actually getting at? You may think that it’s intuitively obvious because of the prevalence, as said, of everyday talk about “minds” and “mental” and so on, but I can assure you that it isn’t, because what is a “representation” in any case if it’s not all bright lights and screaming sirens and all the rest? Culturally speaking, it could be just some static artefact such as a photograph or a picture hanging in an art gallery, or a movie or video but, most typically and abstractly, it will turn out to be text, and how do we perceive “text”, other than as a visual concatenation of symbols filled in by one hue standing out against a uniform background of some other hue? (Mutatis mutandis with soundtracks and conversations in general, and for the Braille reader!)

.. And so if as I think I’m correct in assuming that you’re denying that representation-in-consciousness is either text, hexadecimal bits, or patterns of reflective paint then what is it? The mere labelling of it as “mental” simply because it hasn’t (yet) been analysed in physical terms serves IMV only to confuse, rather than enlighten, although I think that I can dimly see that where other thinkers have pursued such a quest and encountered the same wall of disbelieving puzzlement then it perhaps becomes apparent why some (especially mathematicians) think that there is instead some supposedly “abstract” realm of representation, and the road becomes cleared for the acceptance of some form of Platonistic realism. Apologies proffered now for the only technical philosophical terminology to be deployed within this entire winding diatribe!


Does the mental picture indicate a novel phenomenon? Is there a more precise definition? Those are the pertinent questions.
Ian B.
user 10895495
London, GB
Post #: 117

PART 3:

Four points to note about the definition:

1. Consciousness is very much about sensory inputs. (Your senses are dormant when you sleep.)

Not in general, but the inferences to be drawn on their basis become radically altered during REM sleep. You are of course absolutely right in implicating sensation as the cardinal feature of consciousness!

2. Consciousness is very much about deciding physical actions. (Your actions while asleep are involuntary.)

As said elsewhere, although the distinction between voluntary and involuntary seems clear cut in everyday parlance, I think that deeper down it’s very misleading, but I don’t intend to pursue tangential argumentation here. I think, rather, that in the context which you have indicated consciousness senses intentions and enables ratification thereof, or not (=feedback).

3. Self-awareness is merely a by-product of consciousness in which the self forms part of the mental picture.

I completely agree.

4. Human consciousness is distinguished by our ability to think in terms of symbolic representations of complicated concepts, most notably in the form of words and language. That separates us from all or nearly all animals but this distinction does not seem to be problematic. So I am talking about the broader notion of consciousness as described above, the sort that applies similarly in humans and all the higher animals. That is what I mean by the word.

.. And there unfortunately you’ve completely lost me. Consciousness is no more about “thinking” than it is about scratching one’s chin, or surfboarding. I am sincerely puzzled, Andrew, that first you draw the distinction, and immediately afterwards you re-elide it! So when you refer to “the broader notion of consciousness” I take it that you’re reverting to more neutral talk about representation which in itself remains uncommitted to more specific interpretation (?)

This notion of consciousness is right at the interface between the senses and the responses in physical action. It exists in the ever-present moment, the mysterious 'now'.

I completely agree!

Next I am going to develop the idea that consciousness and free will are very closely related, almost two sides of the same coin. I am sure that you are waiting, Ian, for me to develop a theme that you can critique, so in due course I hope to oblige. I will not forget that your interest lies in the so-called hard problem which is best expressed in terms of 'how do our brains give us the experience of colour?' Please be patient while I work towards that (and work out my position on it!)

Sincerest thanks for your courteous consideration of others’ obsessions, Andrew! I now feel guilty for having so heavily launched into critique even at your phase of preliminary description of the problem as you see it. I await developments with keen interest.

To be continued in the next day or so.........
Ian B.
user 10895495
London, GB
Post #: 118

Absolutely spot-on succinct articulation of THE PROBLEM, Andrew. Thank you. We should return to dwell repeatedly on the issue (IMV) until anyone who "just doesn't see the problem" unequivocally winds up understanding it! Camilla and I are just about knackered by now. Maybe you'd like to take the reins on this specific issue? (Alright, I appreciate the fact that it's your thread in any case.):


>"Now, I think consciousness is an evolved feature of our minds that improves upon this sub-conscious way of life. It enables better instant decisions for survival. The problem which Ian has emphasised and which I have tried to define above is that we do not know how the mind generates that conscious form of mental experience, as in the experience of the colour red, as distinct from the unconscious form that any of today's computers exhibit. A computer will record and process the wavelength of the light and it will assign a digital code to the colour red, but it will not share our conscious experience of red. Nor do we know how to program that experience. That is where the problem lies. If, Peter or anyone else reading this, if you still do not get the issue then we should stay on discussion of this point until you do!"
Ian B.
user 10895495
London, GB
Post #: 119

Ah, Peter, so you do after all "see the problem". I think that I must have consistently misinterpreted you right from the start. You are I think saying that you don't see any problem in explaining "freewill" in terms of current knowledge (in principle) (?)

Neither do I, of course, but for yet another reason. For me it is simply a non-issue, a bastard beast-of-burden born of excessively theological rumination on the perceived obligation of moral responsibility:


>"1. Though I didn't understand what you were trying to say, I doubted that you were suggesting that "consciousness necessarily equates to thinking in pictures".

2. But if that is what you were suggesting, I felt that it was close to trying to do what we have all agreed (I think) is impossible, namely to objectively describe the sort of sensations and subjective experiences that occur inside our minds."