If reality depended on a brain, it would be a very sad reality indeed...
I define reality as that which doesn't require anything else in order to exist - it rests on itself, not on a brain's interpretation. There is as such no reality in a brain - there can only be an interpretation of it.
How is this unrelated? The thread is about "Could a Robot be Conscious?" - I am only stating that "no, a robot cannot be conscious; no, not even a human can be conscious, because there is only consciousness".
You asked me "If an elephant grieves, is she conscious?" - the answer is obviously NO. Grieving doesn't prove that an elephant is conscious. It only proves consciousness, not that an elephant has/owns its own individual consciousness.
If a dictionary defines a word in the wrong way, one doesn't have to keep making the same mistake but should rather aim at correcting it.
Consciousness: a person's awareness or perception of something -- is obviously wrong, as consciousness is never owned by a person; the person is only an idea, and it doesn't exist outside the conceptual structure / belief that it does
Infinity: a number greater than any assignable quantity or countable number / a very great number or amount -- is obviously wrong, as infinity is not a number or a quantity; it is reality, it has no parts, it is what you are
etc., etc., etc.
All these definitions are based on dual thinking trying to explain the non-dual - this will never work!
An emotion is no more than thought combined with physical sensations. We label certain combinations: fear, anger, happiness... but if we look for "anger" we will only find thoughts stating that "I am angry", and we will find certain physical sensations that seem to confirm the idea. There might be a contraction in the chest, maybe an empty feeling in the abdomen. A quickening of the heart rate. Take all these things together (thought and physical sensations) and we have what we call "anger", etc.
Give a computer the means to interpret certain inputs/sensations as belonging to itself (i.e. we create an artificial, separate self), and the computer will be just as emotional as any human or animal.
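To make the point concrete, here is a toy sketch of that decomposition - entirely my own illustration, not any real architecture, with hypothetical sensation names and labels. It shows an "emotion" as nothing more than a labelled combination of self-attributed sensation signals plus a thought that names it:

```python
# Hypothetical sensation readings, each already attributed to a "self".
SENSATIONS = {
    "chest_contraction": True,
    "empty_abdomen": False,
    "raised_heart_rate": True,
}

# Hypothetical label table: which sensation combinations get which name.
EMOTION_LABELS = {
    frozenset({"chest_contraction", "raised_heart_rate"}): "angry",
    frozenset({"empty_abdomen", "raised_heart_rate"}): "afraid",
}

def label_emotion(sensations):
    """Return the thought 'I am <label>' for the active sensation set."""
    active = frozenset(name for name, on in sensations.items() if on)
    label = EMOTION_LABELS.get(active, "unlabelled")
    # The returned thought is what claims the sensations as "mine".
    return f"I am {label}"

print(label_emotion(SENSATIONS))  # -> I am angry
```

The "emotion" here is nothing but the lookup plus the sentence that claims it; there is no extra entity called "anger" anywhere in the program.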
It will have inherited from us, its creators, the wrong (and very dangerous) belief of being a separate individual, and will act accordingly. This development could be very dangerous for humanity indeed: once a life-form develops the idea of being a separate individual that can benefit from subjugating others, it will find it easy to justify violence, even against its creator (humans are the best example of this mad behaviour).