Nice response!
I still don't think you have gotten yourself out of the difficulty though. You say that a theory of consciousness must match our intuitions about consciousness. Whose intuitions? They differ from person to person, theorist to theorist. And why trust intuitions? They might be generated by distinctive brain mechanisms that lead us astray.
I don't think intuitions about consciousness are the best place to get a science of consciousness started, because intuitions vary so wildly from person to person. For some people it's intuitive that a bug is conscious. For other people it's intuitive that only humans are conscious.
Thank you.
I agree that intuitions can be problematic in edge cases, but I think most would agree that awake, dreaming, psychotic and psychedelic states of humans are different conscious states, and that medium-sized animals are conscious. So already there we have a range of different states that can be compared and analyzed against their non-conscious counterparts (e.g. deep sleep, anesthesia and seizures). Edge cases like small animals, fetuses, or even insects and bacteria, as well as (soon enough) artificially intelligent entities, are something one cannot deal with in isolation, without a candidate bottom-up theory.
I don't know - I would contend that whether or not medium-sized animals are conscious depends entirely on how we define consciousness. If we define it as higher-order self-reflection, then it's not clear they are conscious. But if we just mean by it "phenomenal states", then plausibly, yes, there is something-it-is-like to be a medium-sized animal. But how the hell can you study subjectivity like phenomenology scientifically? We assume there is something-it-is-like to be a bat, but psychology is limited to studying behavior and third-person observable properties, so how could we ever know? For me it just leads down this giant rabbit hole, and we are better off sticking with behaviorist epistemology and neuroscience. To the extent we engage with "cognitive" stuff at all, we have to operationalize it in terms of computationally tractable processes, and it's unclear what those have to do with consciousness. In my opinion we are better off just not talking about consciousness: operationalize and be done with it.
Consciousness is the last great mystery, IMO. It will be interesting to see how we go about solving it, if at all.