No. Consciousness is only the ‘knowing of’ the emotion. The emotion is a felt experience. If the experience is high enough in intensity, and long enough in duration, then it can be retained and recalled from memory as ‘recognition’ of said emotion.

If so, what level of consciousness? Can an ant feel emotion? A cat, or a dog, or a tree? All of these things are alive, but can they claim to be conscious beings?

If you can feel emotion while unconscious, why do so many people drink or abuse narcotics to the extent they become unconscious in order to escape their emotions? Even chasing death as permanent unconsciousness and escape?

ncrbrts wrote:
If so, what level of consciousness? Can an ant feel emotion? A cat, or a dog, or a tree? All of these things are alive, but can they claim to be conscious beings?

Probably a question that can only be answered in terms of metaphysics. Is there something it is like to be a tree? Probably not, but there is something it is like to be my dog. Is there something it is like to be an ant? Again, probably not, but who knows. On this basis, the former lacks consciousness, while the latter is conscious.

ncrbrts wrote:
If you can feel emotion while unconscious, why do so many people drink or abuse narcotics to the extent they become unconscious in order to escape their emotions? Even chasing death as permanent unconsciousness and escape?

It's a very good question.

An anesthesiologist can take away your consciousness and your emotions as well. For the time being, anyway.

Philosophy Explorer / article wrote:
. . . There’s also the issue of whether this robot has what would truly be considered emotions, or is just mimicking what humans would likely do in a given set of situations. That’s worked out well for robots in science fiction recently, but in reality, we’re probably quite a few decades away from artificial intelligence that could generate real emotions. . . .

Thus the robot's emotions are outward expressions and behaviors, not the personal feeling of an emotional state. A robot's experiential presence, or "showing," isn't necessary for its emotional body actions to be executed, but the "nothingness" of non-consciousness would arguably make it difficult (even impossible) to verify that such functions are taking place.

Similarly, if the anger of Judy (a human) isn't manifested as something private to her, or manifested as something public to others (like body behavior), then there's no familiar evidence available for instantiating either her emotions or her overall existence. The mindless universe might have an unknown alternative for bearing witness to or declaring its absent manner of be-ing, but why would it require such to begin with? Evidence seems important only to biological entities (or at least the rational agents among them), which is why phenomenal consciousness (and its appended reasoning) developed in the context of life. Perhaps the very self-interest of a brained organism mandated that its body, thoughts, and environment eventually become manifest rather than remaining invisible (blindsight, deaf-hearing, odorless-smelling, unfelt-touching, etc.).

Would a robot, if 'emotions' went unexpressed, suffer a nervous breakdown and try to self-terminate? Would it go postal? Certainly it could be programmed to mimic that, and respond as conscious beings do, but does it make it any more real? Does the bot really suffer?

Emotions are a social construct. We learn to use the language of emotion at an early age because adults and others teach us, informally and formally, to use such language. We are all pretty fuzzy in our use of emotion language because others have no idea what we actually feel at the time they try to teach us. They look for certain collateral behavior (crying, smiling, frowning) to make some general judgements, but there are many examples with humans where the overt behavior is inconsistent with the person's description of what is going on within the skin.
We also have a very limited understanding of how our nervous systems get involved at the times when we would like to talk about emotions. It just seems like some sort of strange anthropomorphism to talk about emotions and robots.