But the question is, what qualia are these people actually experiencing? Are the vision people actually seeing? Or are they subconsciously aware of what they are 'seeing'? IIRC, there was a TED talk a while back with the same idea, but they did colors->sound I think. I'm wondering if this is actually useful, or if it's just subconscious interaction with the senses. Makes a huge difference.
In the talk Eagleman seems to suggest that the visual cortex is flexible and can adapt to sensory input from other organs. I'm not sure to what degree these people experience vision the way we do with our eyes, I imagine it's quite different, but I'd be interested to find out more. We're obviously only seeing the tip of the iceberg when it comes to what will eventually be possible for human sense expansion. I don't think it's just a subconscious interaction; that would massively reduce its functional potential.
I mean, I'm insanely interested in implants and various other 'sensory expansion'. But when I read up on things like the magnet implant, all I can think is that it's just plain old touch but with the familiarity of feeling magnets. Same goes for the vision. All I can think is that it's just whatever system they use (touch/hearing/etc) but just changed a bit to be able to recognize other data. Which is still cool, but not really a new sense.
Have you heard of Neil Harbisson? He is the best example of someone who has explored "new senses" through merger with technology.
Yeah, I saw his TED talk a while back. Looking at it again, it seems he's sort of confirmed my suspicion. It sounds like he's aware he's hearing color, rather than experiencing some genuinely new perception. He actually commented on how he's come to conflate sound and color (that is, regular sounds now have color for him). Which tells me it's not actually a new sense, but an old sense used in a new way. It's really baffling. I'm wondering how intuitive it is for him to just look at something and say what sound it is, or to hear something and go "yeah, that's yellow/orange/red/violet." In the talk he really only went over how "color is sound to me, lol" and did a few examples of how he interacts with things, not so much the sensory side itself.
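Just to make the mechanism concrete: the heart of a device like his is basically a one-dimensional mapping from hue to pitch. A rough sketch in Python (the real sonochromatic scale he uses is his own; the function name and frequency range here are just illustrative assumptions):

    import colorsys

    def hue_to_tone(r, g, b, f_low=220.0, f_high=880.0):
        # Project hue (one sensory dimension) onto pitch (another).
        # Not Harbisson's actual scale, just the general idea.
        h, _, _ = colorsys.rgb_to_hls(r / 255, g / 255, b / 255)  # hue in [0, 1)
        return f_low * (f_high / f_low) ** h  # log-spaced, like musical pitch

    print(hue_to_tone(255, 0, 0))  # red -> 220 Hz
    print(hue_to_tone(0, 0, 255))  # blue -> roughly 554 Hz

The mapping itself is trivial; the interesting part is his report that after years of wearing it, ordinary sounds started carrying color for him, not just the other way around.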
What are some actual practical applications for this, outside of allowing the blind/deaf to see/hear? The potential is obviously massive, but I'm drawing a blank when it comes to what I could use this for.
Imagine we hooked up some false nerves so that carbon monoxide triggers the same smell response that the odorant in natural gas does. Imagine we hooked up extra sensors into human vision that made certain paints glow brighter, or let us see infrared light, perhaps even toggling between colors at will. Imagine we hooked up a balance sensor that overrides the ear and provides true balance, rather than the fluid-based sensors inside our ears that get screwed up when we spin around a bit. And that's just working within the framework of our current senses!
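None of those need exotic neural interfaces to prototype, either; the plain sensory-substitution version is just "read a sensor, push the value out through a sense you already have." A toy sketch, where read_co_ppm() and buzz() are hypothetical stand-ins for whatever CO sensor and vibration motor you'd actually wire up:

    import time

    CO_FULL_SCALE_PPM = 400.0  # arbitrary scaling assumption

    def co_to_duty(ppm):
        # Squash a CO concentration into a 0..1 vibration duty cycle.
        return min(max(ppm / CO_FULL_SCALE_PPM, 0.0), 1.0)

    def run(read_co_ppm, buzz, period_s=0.5):
        # Poll the sensor and drive the motor: more CO -> longer buzz each
        # cycle, the same trick as mapping sound onto a vibrating vest.
        while True:
            duty = co_to_duty(read_co_ppm())
            buzz(duty * period_s)                # vibrate for this long
            time.sleep((1.0 - duty) * period_s)  # rest for the remainder

The open question from upthread still applies, though: with enough hours wearing it, does that buzzing ever stop feeling like touch and start feeling like smelling CO?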