#132: The rambling producer

Transcript
Speaker B:Welcome to Blind Guys Chat. Where Oren. Emile.
Speaker A:Hello.
Speaker B:Jan Bloom. Hello, I'm Mohammed Lashear.
Speaker A:Hi there.
Speaker B:Talk about the A to Z of life.
Speaker C:Well, hello, ladies and gentlemen. You're very welcome to episode 132 of Blind Guys Chat. Now, Clodagh, stop laughing, you. People will think that this podcast is in some way funny. We're going to be talking to Mr. Peter.
Speaker A:No, wrong. I don't have sound effects, unfortunately.
Speaker C:All right, I'm not gonna. You go, you go.
Speaker A:His name is Peter. Ruf. Ruf. Ruf.
Speaker D:Yeah. And you know, anyway, he's going to.
Speaker C:Be putting some electrodes into Jan's brain later on.
Speaker B:That could be useful.
Speaker E:Yeah, that'd be very useful.
Speaker D:Good for you. I think so too.
Speaker C:It's an interview really well worth listening to. Myself and Clodagh weren't present because we were on leave. We were taking a holiday. But the guys did a fantastic job on this interview, and it's a really interesting one. But first I want to talk about Meta.
Speaker D:Oh, you want to go for a Meta? Are you using it, by the way, still?
Speaker C:I am using it, yeah. Even though my dog ran me into a wall earlier on and my glasses are now a little bit more crooked than they were.
Speaker A:Oh no, put those glasses on Larry next time so they can tell him, you know, there's a wall, you need to watch out.
Speaker D:Yeah.
Speaker C:They can help him be even more distracted. Oh, look over there. I can see better over there.
Speaker D:I'm still very annoyed with Meta, you know, because I had it for one day. It popped up spontaneously, and after that it has been gone forever since.
Speaker A:No, it might come back again. Like they're going to release the glasses in the Netherlands. So it should be.
Speaker B:They were just testing or something. Yeah.
Speaker C:What's happened while we've been away in Barcelona? Meta are gonna open up an SDK for applications to become available on the glasses.
Speaker D:Yeah.
Speaker C:And one of them is going to be Seeing AI, I read.
Speaker D:Yeah, that's right.
Speaker C:They'll take advantage of it. What else do you know, guys?
Speaker A:Well, essentially not much else, because Meta is being relatively tight-lipped about this. They're going to release these glasses, or at least the SDK on the glasses, for specific developers in the coming months, and then it will be generally available to any developer later, in 2026. I believe it's in spring, or maybe February, possibly January if we're really lucky, but the SDK is going to become available to all developers. Of course, you have to get through Meta QA, I believe, before you can publish. But it should hopefully be possible, for example, for guys like Aira to get directly on the glasses now, so that you don't have to fiddle with WhatsApp. Of course, Be My Eyes is already on there, but possibly Envision could be on those glasses, Seeing AI will be on there, and maybe Be My Eyes can actually put Be My AI on those glasses as well. So it's very exciting.
Speaker D:I wonder, though, since you need to switch. And what I learned is that it's the speakers and the camera, but not really the microphone yet. So you cannot really interact with it by voice. That's a little bit disappointing, because if you could open it up also for Siri or whatever, you know, that would be nice.
Speaker A:Yeah, but there's almost no manufacturer that does that. Of course, Android is open source, so if you make your own headset, you can build a version of Android where you can say, hey, Bixby. Sorry for all the Samsung fans out there; they're obviously talking to the cartoon they're watching. But anyway, that's great news. But on the iPhone it doesn't work; on Meta, of course, it doesn't work; for Amazon it doesn't work. If you've got an Amazon speaker, you can't call Siri on an Amazon speaker. So that's.
Speaker D:But I was curious how you could easily switch, you know, from one app to the other from Seeing AI to Be My Eyes, for example, or Be My Eyes.
Speaker A:How does that work now? Because there are already apps on there. Like, Be My Eyes is already on there.
Speaker B:Like a wake phrase or whatever? That works for one app but doesn't work for another app, right?
Speaker D:Yeah, I don't know. We'll see. But Be My Eyes now, Mo, is only for the assistants that you can call.
Speaker A:Okay, I know.
Speaker D:And that is so. Then you can open it up by voice. You can say, call a volunteer or something.
Speaker A:Oh, interesting. Okay.
Speaker D:Yeah.
Speaker E:Yeah.
Speaker C:It'd be nice, because, to be honest, I'm using Seeing AI a lot more on the phone to ask it to describe an image, rather than Be My Eyes.
Speaker A:Oh, interesting. I use Be My Eyes and the ChatGPT live video. Yeah, I use that all the time. But, for example, when I turn on my oven, taking a picture with Be My Eyes or Seeing AI.
Speaker D:Your oven.
Speaker A:Yeah, I need to look to put in stuff, to heat it and stuff.
Speaker D:Yeah, yeah. That is, for you, an undeveloped area, of course.
Speaker A:Yeah. Just ask Claudia, she'll explain.
Speaker D:He will burn his fingers, you know.
Speaker A:But in order to set the timer right, I use the live video, because I can keep shouting at it: what's the timer set to now? What's it set to now?
Speaker B:What's it set to now?
Speaker A:What's it set to now? Are we there yet? Are we there yet? And it just stays so nice. It never shouts at me. It's really good. ChatGPT is my best friend. It is.
Speaker B:We were your best friends.
Speaker C:So is there any.
Speaker A:No, you are, you are, you are.
Speaker C:Is there any talk about, like, I would assume Envision will take advantage of this. Is there any talk about who else might take advantage of the SDK and become available on the glasses?
Speaker A:Not that I've read yet. We'll presumably get news about this sporadically whenever the SDK comes out. But for now, I think Meta is only really interested in talking about the partners that they will make it available to in early access, so to speak.
Speaker B:By invitation only type thing.
Speaker A:Yeah, by invitation. And I think most developers right now are not really into announcing something already if they don't know what the SDK will look like, how much work it will be to support, and all that stuff. Because, you know, committing to something right now without having all that information is going to be a little bit difficult.
Speaker D:It is a very good development.
Speaker C:I think so, because I'll tell you. While we were in Barcelona, they released a new update on Meta AI which, if you physically switch it on in the settings, allows for more description. And one of the things that I was reading in the release was that if you asked it, for example, look and tell me what's in front of me, and if there is a door in front of you, it will tell you there is a door in front of you and the handle of the door is on the right or the left-hand side. It will give you that detail. Now, however, I have had no success with this, because every time it said, you know, there's a sliding door in front of you and the handle is on the right-hand side, and I'm saying, no, it's on the left-hand side, and I'm getting sick of this, where it goes, yes, sorry, it's on the left-hand side.
Speaker D:I think it's mirrored then.
Speaker A:Yeah, maybe. But ChatGPT does this, Gemini does this, all of them. When you tell them they're wrong, they're like, oh, I'm so sorry, of course I'm wrong and you are right. Even when they are right. And every now and then, they will insist. You'll be like, no, you're wrong, and they'll go, no, no, I'm not.
Speaker C:An interesting thing I had today: I was sitting in the garden, because it was a lovely day here at home, and I was sitting in the garden with Larry. And we have these drones now flying over. From Russia? No, from the man delivering food. And three drones.
Speaker D:Really?
Speaker C:Yeah, yeah. And three drones came over the house, in the vicinity of the house, maybe, I don't know, 100 yards away, 200 yards away. And I said to Meta, you know, look and tell me what you see. And it said, I see a garden with a lovely blue sky, and blah, blah, blah. And I deliberately tilted my head up to where I heard the sound of the drone coming from. And it said nothing. And then I said, is there a drone in the air? And it said, yes, there is a drone in the air. And I would have thought, well, if you're giving me a more detailed description, like, there's a beautiful blue sky and there are no clouds, why wouldn't you also say there is a drone in the air? I had to prompt it because of what I was hearing, which I'm just not as enthused by.
Speaker A:It isn't very good anyway, to be honest. Like, it's okay. It's not bad, but it's not the best AI out there. There is better. It's one of the reasons why Zuckerberg was paying hundreds of millions of dollars per AI researcher to come to Meta: because they want to leap ahead. But right now, what they have is good. It's not bad, but it's not the best.
Speaker D:Yeah, okay.
Speaker E:Okay.
Speaker C:Yeah, okay. All right. Shall we hear from our guest?
Speaker D:Yep.
Speaker A:Yes, let's. Welcome to the podcast, Professor Pieter Roelfsema. He works at the Dutch Herseninstituut, which is the Netherlands Institute for Neuroscience. He will talk to us about prostheses to be able to see. Maybe one day we will. We'll see. Of course, there's been a lot of news in this area lately. Elon Musk is doing quite a bit of work, but there's also work going on in Australia, and I believe there's work going on somewhere in Europe. And there are, of course, a lot of societal discussions. Would you take a prosthesis like this? And when? How big should the risk be? How much risk are you willing to take? Today we're going to talk about the hard science, because I'm very interested in knowing how far we are, what we do and what we don't know, what our capabilities are, and what the near future will bring. Welcome to the podcast.
Speaker D:Yeah, welcome.
Speaker E:Thank you, thank you.
Speaker D:Where do we call you from? Because one of our main items in the podcast is always the weather, and we are always curious where you're calling from, and also what the weather is right now.
Speaker E:I live in Amstelveen, which is next to Amsterdam, and the weather is cloudy and I just walked the dog, caught a little bit of rain, but now it's dry again.
Speaker D:Okay. It is not a guide dog, then?
Speaker E:No, no, I can see normally.
Speaker D:Okay. That's, you know, Oren and me, we have guide dogs, so we're always interested in what kind of dogs, et cetera. But that's true.
Speaker A:Well, it wouldn't be good if, you know, the professor who was making eye prostheses needed one.
Speaker D:Oh, yeah, yeah, that's true. That's also true. Yeah, yeah. You know, puppy guide dog.
Speaker A:Maybe not the best person to create them. Otherwise I'd go into that field. Professor, tell us: if I would ask you what the state of the art currently is with the work that you're doing right now, how would you describe it?
Speaker E:So I think we are making progress at the moment. About five years ago, we published an article where we showed that it's possible to get a sort of very elementary form of vision back with electrical stimulation, so with a visual brain prosthesis. But there are still some problems to be solved, and we, and also other labs in the world, are working hard to solve those problems.
Speaker D:Can you describe, for example, what is then simulating our eyes?
Speaker E:What we are doing is we implant electrodes, so wires, basically, into the visual cortex. So the eye projects through a small relay station and then to the visual cortex. And the visual cortex is in the back of the brain and it has a map of space. So two points in the outside world that are near to each other are also connected to brain cells that are nearby in this region, the primary visual cortex. So it's a very systematic map of the outside world onto the cortex. And we know from previous work that if you place an electrode, so a wire, in that map, and if you stimulate that wire, then somebody will see a dot of light, and he will see that dot of light in the position in the outside world that would normally be mapped by that place in the cortex. And that also works in people who have been blind for many years. Now, if you have one electrode, the dot of light will always be at the same position. But if you place, say, 1,000 electrodes, they will end up in different positions in the map. And so you can then create 1,000 dots of light that all have their own position. And you can work with it like a matrix board above the highway. Or it also looks a little bit like a braille display, right? So you have dots there as well. And you can just create patterns by switching on multiple electrodes, and that then looks like vision. For instance, if you switch on a set of dots that have the shape of a particular letter, say the letter T, then a person sees a pattern of dots that looks like a T. And that is something that we demonstrated is possible in monkeys. We tested that it works, and we worked together with a researcher in Spain, Eduardo Fernández, who did this in four patients, blind people. And they had been blind for several years. So the first person, Berna Gómez, she's now quite famous in Spain because she was all over the news at some point. She had 100 electrodes.
So not 1,000, but 100. And when they electrically stimulated one of these electrodes, then she indeed saw a dot of light. And she was also able to see some very simple patterns. But the number of patterns that you can create if you have only 100 of these electrodes is somewhat limited. So that is something that people are working on: improving the technology.
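The professor's "matrix board" picture, a coarse grid of independently addressable dots of light, is easy to simulate. Below is a minimal Python sketch; the 7x7 grid size and the hard-coded letter shape are illustrative assumptions, not the real electrode geometry.

```python
# Illustrative sketch: electrodes as a coarse grid of on/off "phosphenes".
# The 7x7 grid and the letter shape are assumptions for illustration only.

GRID = 7  # assumed number of phosphene rows/columns

def letter_T(grid=GRID):
    """Return the set of (row, col) electrodes to switch on for a 'T'."""
    on = set()
    for col in range(grid):
        on.add((0, col))          # top bar of the T
    for row in range(grid):
        on.add((row, grid // 2))  # vertical stem of the T
    return on

def render(on, grid=GRID):
    """Render the stimulated pattern as text: '*' = dot of light."""
    return "\n".join(
        "".join("*" if (r, c) in on else "." for c in range(grid))
        for r in range(grid)
    )

print(render(letter_T()))
```

Running it prints a T traced in asterisks, the same way switching on a subset of electrodes would trace a letter in phosphenes.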
Speaker A:So your visual cortex is almost like a screen, then, where the real world gets mapped onto pixels. And you can stimulate those pixels independent of your eye.
Speaker E:Exactly. In many people who are blind, the eye is so severely damaged that there is no useful information coming from the eye to the brain. And in many forms of eye disease, there's also no information in the optic nerve anymore, because the cells that make the connections through the optic nerve are actually sitting in the eye. There are some diseases where these cells are still intact, but in the large majority of diseases that cause blindness, these cells also die off. So if you then want to create a prosthesis, then you really have to go into the brain and put your wires in the brain.
Speaker D:So to say, if I look, and I have my vision on a monitor screen, do I then see, in pixels, the lines of the outside of the screen?
Speaker E:It's not a regular square, because you don't address all the cells in the visual cortex. You are only connected to, say, 1,000 of them.
Speaker D:Yeah.
Speaker E:And in normal vision, there would be about 1 million or so. So it's actually still very poor. But if you have 1,000 of these electrodes in, it's probably useful. And the estimates are that if you have 1,000 of these pixels, you have more independence. So if you go outside and you go to an unfamiliar place, the prediction is that you should really gain some functionality. But it's still not comparable to the vision of people who don't have an eye disease.
Speaker A:Can we make colors with the electrodes or is it only white or white light or nothing?
Speaker E:That's a very good question. It's going to take a while before it becomes color. The reason is that if you put an electrode in the brain, then you're not stimulating one cell; depending on how much current you put in, you're stimulating 10 or 100 or maybe 1,000 cells. And the cells in the visual cortex that are responsible for seeing different colors are all intermingled. The red cells and the blue cells and the green cells, they are all intermingled. The electrical current is very nonspecific, so it's not possible to apply a current such that you would only stimulate the red cells or only stimulate the blue cells. Now, if you stimulate very weakly, and in the literature that has been done in blind people, where they just ask, what do you see? Then they sometimes see a little bit of color, say it's a bit reddish. But if they then increase the current, it tends to become white or yellowish. And the explanation would be: if you stimulate very weakly, maybe there are slightly more red cells than blue cells and green cells near the electrode. But the moment you start to stimulate strongly, then you basically activate all of them, and the color then becomes more white.
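That explanation, a weak current recruiting a biased handful of intermingled color cells versus a strong current recruiting all of them, can be mimicked with a toy simulation. Everything here (cell counts, positions, and using radius as a stand-in for current strength) is invented for illustration.

```python
import random

random.seed(0)

# Toy model: color-selective cells are intermingled at random positions
# around the electrode (positions in micrometers; all numbers invented).
CELLS = [(random.uniform(-100, 100), random.uniform(-100, 100),
          random.choice(["red", "green", "blue"])) for _ in range(300)]

def perceived_mix(radius_um):
    """Fraction of each color among cells within the stimulated radius.
    More current -> larger radius -> a more even mix -> whiter percept."""
    hit = [c for x, y, c in CELLS if (x * x + y * y) ** 0.5 <= radius_um]
    n = len(hit) or 1
    return {c: hit.count(c) / n for c in ("red", "green", "blue")}

print("weak current  :", perceived_mix(15))
print("strong current:", perceived_mix(100))
```

With a small radius the mix depends on which few cells happen to sit next to the electrode; with a large radius the three fractions approach one third each, which is the "everything activates, so it looks white" regime described above.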
Speaker A:So it's really only stark lines that you could see then.
Speaker D:Yeah, it's.
Speaker E:Yes.
Speaker D:Like bits, a zero or a one. It is yes or no.
Speaker E:Yeah. I mean, you could stimulate a bit stronger, and then this dot becomes more visible, so to say. But people are still thinking about what is the best way to compress what you see. Right. So the future user of the visual brain prosthesis would wear a camera. You can have these cameras that are embedded in glasses, and so you get the camera feed. The camera, of course, has very good vision, maybe even better than normal human vision; it has many pixels. But if you have only 1,000 electrodes, you have to do something with this very rich picture to make the best of 1,000 pixels. I think with artificial intelligence, the problem is becoming easier, because the camera feed will have objects that are interesting for the user. So suppose you're in traffic. You want to know where the obstacles are. You want to know where the other people in traffic are. So vehicles and pedestrians and cyclists.
Speaker C:Yeah.
Speaker E:So you would probably make those visible, but you would not make visible things that are irrelevant. Say the trees that are not in your way, or maybe the commercial boards. So all the things that would distract you, you would just not show in the prosthetic image. And then you might be doing a bit better than if you would try to present everything at the same time. That's just not going to happen.
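As a rough sketch of that compression step, the code below block-averages a high-resolution grayscale frame down to a 25 x 40 grid (1,000 "electrodes") and only switches on blocks that an assumed object detector has marked as relevant. The frame size, grid shape, threshold, and mask are all invented for illustration.

```python
import numpy as np

def compress_to_electrodes(frame, mask, grid=(25, 40), thresh=0.5):
    """Reduce a camera frame to a coarse binary electrode pattern.

    frame: 2-D float array in [0, 1] (grayscale camera image)
    mask:  2-D bool array, True where an (assumed) object detector
           says the content is relevant to the user
    grid:  electrode layout, here 25 x 40 = 1,000 electrodes (assumed)
    """
    h, w = frame.shape
    gh, gw = grid
    out = np.zeros(grid, dtype=bool)
    for i in range(gh):
        for j in range(gw):
            block = np.s_[i * h // gh:(i + 1) * h // gh,
                          j * w // gw:(j + 1) * w // gw]
            # Switch the electrode on only if the block is bright
            # AND contains detector-flagged, relevant content.
            if mask[block].any() and frame[block].mean() > thresh:
                out[i, j] = True
    return out

# Toy example: a bright "pedestrian" stripe on a dark background.
frame = np.zeros((480, 640))
frame[100:400, 300:340] = 1.0          # bright vertical shape
mask = np.zeros((480, 640), dtype=bool)
mask[100:400, 300:340] = True          # detector marks it as relevant
pattern = compress_to_electrodes(frame, mask)
print(pattern.sum(), "of", pattern.size, "electrodes on")
```

Irrelevant bright regions (trees, billboards) would simply have `mask` set to False there, so they never reach the electrodes, mirroring the filtering idea in the interview.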
Speaker D:So you can really fine tune.
Speaker A:Makes me think of tactile graphics, John.
Speaker D:Yeah, yeah, that was my thinking as well.
Speaker A:This is how tactile graphic production works, because, of course, with your hands you can't feel the fine details or the colors either. And so you have to strip down the image to make everything clearer. The object of the image, the reason why you would look at the image, needs to pop out more, and everything else needs to be made more generic and less present, so that you can feel the image.
Speaker E:I think there are some differences with tactile. With tactile, of course, you have to serially scan the picture with your fingers, and in vision, things are, of course, more simultaneous. So you see all the pixels at the same time, so there's not so much scanning. Of course, normally you scan with your eye movements, so that is still something that we would also need to take into account.
Speaker D:So if you move your eyes left and right, up and down, you will see different things. So the refresh rate is instant?
Speaker E:That is what people are thinking about. So researchers propose that this would be the best. And we know that if you stimulate the visual cortex and you look straight ahead, then you have this dot of light in a particular location in the outside world. But if you turn your eyes, actually, it shifts. It shifts with your eyes, which is very interesting. So you could also use that. So you could use the eyes of a blind person, because many blind people, they can still look at something, although they don't see anything anymore. You can turn your eyes, then you could still use your eyes to point in the part of the outside world where you would like to have the best picture. And you could also use that to improve the experience, is the idea. Right. This still needs to be tested.
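The gaze-contingent idea can be sketched as follows: since each electrode's dot of light sits at a fixed offset relative to the gaze direction, an encoder that knows the eye tracker's reading must subtract it before choosing an electrode for a world-fixed target. The electrode map, the coordinates in degrees of visual angle, and the tolerance below are all hypothetical.

```python
# Minimal sketch of gaze compensation. Coordinates are in degrees of
# visual angle; the electrode map and gaze values are invented.

# Each electrode evokes a dot at a fixed *retinotopic* position
# (relative to where the eye points); here a tiny assumed map.
ELECTRODE_MAP = {
    "e1": (2.0, 1.0),
    "e2": (4.0, -1.0),
    "e3": (6.0, 0.0),
}

def electrode_for_target(target_xy, gaze_xy, tol=1.5):
    """Pick the electrode whose dot would land nearest target_xy,
    given the current gaze direction from an eye tracker.

    The dot's world position = gaze direction + retinotopic offset,
    so the needed retinotopic offset = target - gaze."""
    tx, ty = target_xy
    gx, gy = gaze_xy
    want = (tx - gx, ty - gy)
    best, dist = None, float("inf")
    for name, (ex, ey) in ELECTRODE_MAP.items():
        d = ((ex - want[0]) ** 2 + (ey - want[1]) ** 2) ** 0.5
        if d < dist:
            best, dist = name, d
    return best if dist <= tol else None  # None: no suitable electrode

# Looking straight ahead, a target 2 degrees right and 1 up maps to e1...
print(electrode_for_target((2.0, 1.0), (0.0, 0.0)))
# ...but after a 2-degree rightward eye movement, the same world target
# needs a different (here: no suitable) electrode.
print(electrode_for_target((2.0, 1.0), (2.0, 0.0)))
```

This is why the phosphene "shifts with your eyes": without the subtraction, the same electrode would paint the dot in a different part of the world after every eye movement.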
Speaker A:I confess to watching some of your presentations on YouTube as well. And one of the issues that I saw you describing, and this is, I think, a year and a half ago now, was that the brain doesn't really like electrodes. And so what it will do is encase the electrodes to protect the brain from them, because they're from the outside. And so it will encase them in what looks a little bit like scar tissue. And so I'm wondering, are there ideas to get rid of that problem? Are there electrodes that don't elicit this kind of response from the brain?
Speaker E:Yes. So, as I said a bit earlier, we tested this idea with 1,000 electrodes in two monkeys. And we looked at what happens to the region where we had these electrodes, to see whether this is a technology that you would like to use in people. And we found indeed what you described. These electrodes were completely encapsulated by tissue. We call it fibrous tissue. That is the response of the body basically not liking those electrodes. And I think the explanation is that the electrodes we used in those two monkeys were very stiff. They were made of silicon. Silicon is the same stuff that computer chips are made of, so it's a very, very hard material. And these chips are actually in the brain, between the brain cells, but they are super hard. And the brain cells always move a little bit, also during the night. The brain kind of moves a little bit, and with the day and night rhythm the brain also shrinks and then inflates a little bit, maybe less than a percent. But these things happen. And of course, the silicon doesn't follow that, right? The silicon is just hard. So the idea is that there's some mechanical pressure all the time against the cells, and that gives rise to this response, that the brain really doesn't like those electrodes. So then it builds up all this encapsulation with this fibrous tissue, and the brain cells locally even die or are pushed aside, so that you completely lose the connection between the electrodes and the brain cells that you want to stimulate. But the solution seems to be that nowadays you have these very flexible electrodes, and they are made of polymers. It looks a little bit like the plastic foil that you have in the kitchen, kitchen foil. So it's super thin and super flexible.
And the nice aspect of it is that you can make those foils, those very flexible electrodes in the clean room, actually the same clean room that they use to make computer chips. And you can then make very, very precise electrodes, so very precise connections. And that is something that seems to work really well. There are now some studies out that show that this tissue response, this gliosis, is much less with these flexible electrodes.
Speaker D:How is the combination? Because you implant a camera inside your eyes or something, do I understand correctly? And you also implant something in the brain.
Speaker E:So you don't need to implant the camera in the eyes. You can just.
Speaker D:Or you use an extended camera on your glasses or something.
Speaker E:On glasses, yeah. Of course, if you then want to know where somebody's looking, you should also have an eye tracker. But nowadays you can buy these eye trackers that know where somebody's looking. This is also a camera that looks at the eye and then infers where somebody's looking. And then, in the brain, you really have to open the skull, or you have to find a way to bring these electrodes into the brain. Right. So, yeah, I just mentioned this patient in Spain.
Speaker D:Yeah.
Speaker E:And they really had to make a small hole in her skull and then put the electrodes in the brain, and then close the skull again, of course, and everything.
Speaker D:Is it a dangerous operation?
Speaker E:There are definitely some risks. Now, opening somebody's skull is what neurosurgeons do for a living, so in that sense it's also not super dangerous. But of course, every surgery that you undergo carries a certain risk. So therefore it's also important for the researchers, and maybe later the companies that are going to develop those prostheses, to ensure that the risks are minimized and patient safety is at the forefront of the development.
Speaker A:But of course, it also won't be very easy to upgrade these then, because if a system comes out later where they have a million electrodes, well, you're not going to rip out your 10,000-electrode prosthesis to put in a million, unless there's a very good reason to do so. Right.
Speaker E:The systems that we are now going to develop may have 1,000 or 2,000 electrodes, and maybe in five or ten years there will be better systems with 10,000 electrodes, or maybe 100,000 electrodes, who knows? And that is how technology develops, right? The iPhones that we have now are not comparable to the iPhones that were first on the market. Now, it's not entirely true that you are stuck: if you implant, say, one of the earlier systems, it's quite conceivable that later you could implant a better system on top of it. But of course, you'd like to start with the best possible system. So there's always going to be a trade-off: do I still want to wait for the next version of the technology, or do I want a prosthesis now? Right now it's not even a question, right? There's nothing on the market. That will take at least two or three years, and then people will still be testing these prostheses in blind volunteers. So at some point we will also be looking for volunteers. Not yet, but maybe in one or two years. And only if you then successfully carry out clinical trials, and maybe you need to do a smaller study and later a somewhat larger study, only then can you get a real product on the market, and then you can also ask for reimbursement. So there is still quite some time before we are there. If everything goes well and you're most optimistic, it will be eight to ten years before there's something on the market.
Speaker D:And when you're, for example, a little bit. Well, I am not totally blind, so I have only a light perception. When I would go for this system, do I then switch off all the light perception as well? Or is it in addition to my current vision?
Speaker E:The systems that are now being developed will not interfere with what you have.
Speaker A:Oh, that's very interesting.
Speaker D:Yeah.
Speaker A:Because I think about this economically. Right. If you can, for example, sell this to the military so that they can communicate with one another without, you know, without having to write something down, you can just write it down in someone's brain. I would think that they would buy that. So maybe that can bring the price down.
Speaker E:We are not yet working on that.
Speaker D:Yeah. Anyway, how does it then affect your lifestyle? Do you have a kind of battery set built in? Do you need to carry something? Or is it comparable to, for example, what hearing aids have, you know, where they can click something on with a magnet, or you can switch it on or off?
Speaker E:I think all these are possible. If you compare it to the auditory implants, it might be that the amount of current is quite comparable to that of an auditory implant. So a cochlear implant, okay. In that case, the similar battery systems could be used. But maybe you need to also have a bit more communication than in an auditory implant. And so then maybe it requires a little bit more energy. So these things are still being investigated. So for me, that's difficult to predict.
Speaker A:And an interesting other thing is that this first implant will really only work for either really young children or people who have been able to see, right? Because your visual cortex gets repurposed by your brain if you're born blind, is that right?
Speaker E:If you are born blind and you don't use your visual cortex, your visual brain, for vision, then we know from scientific studies that these cells are going to do something completely different that has nothing to do with vision. And if you then, at a later age, so after this development, would try to connect the camera to those brain cells that are doing something completely different, it just won't work. So that's not going to be possible. So the first people who will be eligible, who may benefit from the prosthesis, will definitely be people who have seen, who have had their normal development. Most of this period when your visual cortex learns to see, so to say, is the first five, maybe eight years, and then the system is more or less fully developed. So if you have had these, say, first eight years of visual experience and you become blind at a later age, then you could be a candidate for a visual brain prosthesis, but not if you are born blind. Something similar happened with cochlear implants. Cochlear implants for people who are deaf have now existed for probably more than 30 years. And the first people who got a cochlear implant were people who became deaf at a later age. So their auditory cortex was fully developed, and when they then got a cochlear implant, that worked, right? And of course, they had to develop the technology. And now it's 30 years later, so it really works quite well. People can hear speech. There are also some limitations, because they don't appreciate music so much, but speech is really comparably good. And now they are so confident in the technology that they are implanting babies, babies who are actually before this whole development. So they implant babies of nine months. And these babies, the only auditory input they have is from their cochlear implant.
So they will actually most likely become much better users of the cochlear implant than people who became deaf at, say, age 50 or 60. Because the people who became deaf at age 50 or 60 have actually had a very different input, right, the normal ear. But the babies only ever experience the cochlear implant, so their whole brain gets tuned to the cochlear implant. Now, if you go back to the visual prosthesis, we can of course extrapolate a little bit. Right now we are still making the first baby steps. And maybe in 20 or 30 years this field has something that has already worked for 20 or 30 years. And if that's the case, then maybe people by then will also have the confidence to start implanting babies. But I think we are really 30 years behind what happened in the cochlear implant field.
Speaker A:I have a question. So as a researcher working on this, which finding has surprised you the most and why?
Speaker E:One aspect that we have not covered, but that is interesting and mind-boggling: we are now stimulating the parts of the visual brain that still have this map-like structure. If you stimulate there, somebody sees a pixel. Now, if you go deeper into the brain, you'll also find brain cells that are tuned to, and represent, more abstract concepts like bicycles or cats or cars. And there has been one study there that I thought was really intriguing, where they stimulated those brain cells. They found a place in the brain that is dedicated to seeing faces. There was somebody who had epilepsy, and they wanted to know what part of the brain was responsible for the epilepsy, so they placed many electrodes on the surface of the brain, and one of these electrodes turned out to be on top of that region that is responsible for seeing faces. When they stimulated that electrode and the patient was just looking straight ahead, say at a white wall, he saw nothing interesting. So that sounds a bit disappointing. But the interesting thing comes now. When they placed a basketball in front of the person and stimulated the electrode, the basketball started to look like a face. So the patient said, hey, the basketball becomes a face. I see eyes, I see a nose. I thought it was super remarkable. And the same happened when they put a cardboard box in front of the patient. You can even find those movies online. The patient also said, oh, the box starts to look like a face too. So there's this intriguing part of the brain where you have these higher-level representations. And I find it super interesting to think about as well.
Speaker A:Well, speaking of AI, then. Right. In that case you don't just reduce the image; you might be able to stimulate the correct parts of the brain so that you maybe don't need a million pixels in order to see quite complex objects, because you can sort of make the brain aware of what something is without it being able to see all the details. So that might be interesting.
Speaker E:Yes, absolutely. That is a direction one could think in. But right now I think working with the pixels at the low level is a bit more promising and easier to accomplish, so this would be an add-on to that. Of course, I could imagine that in later years people would also be interested in trying to stimulate those brain regions in the context of a visual brain prosthesis.
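As a rough illustration of the "working with the pixels at the low level" idea Peter describes, here is a toy Python sketch of how a camera image might be reduced to a coarse grid of on/off phosphenes before stimulation. This is purely illustrative: the function name, grid size, and thresholding scheme are our own assumptions, not the lab's actual pipeline.

```python
# Toy sketch (illustrative only): reduce a camera image to a small grid of
# on/off "phosphenes". Real prostheses use far more sophisticated processing.

def phosphene_map(image, grid=4, threshold=0.5):
    """Block-average a 2D list of brightness values (0.0-1.0) down to a
    grid x grid pattern, then threshold each cell to on (1) or off (0)."""
    h, w = len(image), len(image[0])
    bh, bw = h // grid, w // grid  # pixels per phosphene cell
    pattern = []
    for gy in range(grid):
        row = []
        for gx in range(grid):
            # Average the brightness of all pixels falling in this cell.
            block = [image[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            mean = sum(block) / len(block)
            row.append(1 if mean >= threshold else 0)
        pattern.append(row)
    return pattern

# Example: a 64x64 image whose left half is bright and right half is dark
# collapses to a 4x4 pattern with the two left columns "on".
image = [[1.0] * 32 + [0.0] * 32 for _ in range(64)]
print(phosphene_map(image, grid=4))  # → [[1, 1, 0, 0], [1, 1, 0, 0], ...]
```

The point of the sketch is the massive information reduction involved: millions of camera pixels have to be mapped onto at most a few thousand electrodes, which is why higher-level "concept" stimulation is an appealing long-term add-on.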
Speaker D:If I may ask, for which kinds of patients would this system work? If you suffer from RP or MD, for instance? You said that you need to have some brain power left, in a way. But what can be damaged, and what cannot be damaged, for it to be a success?
Speaker E:In principle, this could work for every form of blindness, except if it is due to damage to the visual cortex itself. But if you look at percentages, only a minority of people who are blind are blind because of a problem in the cortex. Many people who are blind are blind because of an eye disease.
Speaker D:Yeah, yeah.
Speaker E:And so all the people who have an eye disease and who have seen, of course, at an earlier point in life. I think it's also important that somebody is really very blind, so there's not a lot of residual vision, because definitely in the first patients who are going to be implanted, you don't want to run the risk of damaging something that is still somewhat functional.
Speaker D:Yeah.
Speaker E:So I think the first experiences will be.
Speaker D:With total blind people.
Speaker E:Done with people who are completely blind, or who have a form that is categorized as bare light perception. I think these are the most likely early candidates to test these prosthesis developments.
Speaker A:Well, Peter, Professor Roelfsema, thank you so much for joining us today. We could talk to you for two hours more, but.
Speaker D:Yeah, nice.
Speaker A:We only have an hour-long podcast and we need to give Oren some time to ramble too. So this is about as far as we can get. Well, thank you so much for joining us, and we hope to see you back someday, to hear about new scientific discoveries and new capabilities as time goes by.
Speaker E:I want to say thanks for inviting me. It was a great pleasure.
Speaker A:Thank you.
Speaker E:And I'll be happy to reconnect when the time is right. Thank you.
Speaker C:That was very insightful. Well, I don't even know if I'm going to get a medical license. I think I'll just go for it. And we need some volunteers for Blind Guys Chat so we can put some.
Speaker D:We will, we will watch in your head.
Speaker A:I volunteer Stuart. We're just going.
Speaker D:We'll throw him under the bus.
Speaker A:You know, everyone, we're going to kidnap him after a treat and just plant some electrodes.
Speaker C:Still not going to work for Stuart.
Speaker D:Oh, yeah, yeah, yeah, yeah. He has everything.
Speaker A:Come on, Stuart, be helpful, please.
Speaker B:In fairness, he has prosthetic eyes. He doesn't have.
Speaker C:It's interesting to see what's going to happen with this. And as he points out, like with the cochlear implants that have, you know, been running for 30 years, they're now able to give cochlear implants to newborn babies.
Speaker D:It's amazing. Yeah, yeah.
Speaker C:Will this happen with this technology for newborn babies who have sight conditions? It's an interesting space, but I won't be holding my breath anyway, for me or you or Mo.
Speaker B:You're very cynical, Oren.
Speaker C:Well, I'm not being cynical. No, I mean, in fairness to him, he indicates that it's very much at the beginning of development. I think for people in the future who have eye conditions, this could be the answer. But you're talking maybe eight to ten years before you begin to see something.
Speaker A:So don't hold your breath anyway because you'd be holding your breath for eight years. That's not very smart.
Speaker D:Would be nice and quiet, you know.
Speaker B:Yes, well, wouldn't make for good podcast.
Speaker C:I should take offense with Mr. Mohammed Lashear, who said at the very end of that interview that we need to leave some time for Oren to ramble on a bit.
Speaker A:I mean, aren't you now doing just that? Oren?
Speaker D:It's proven.
Speaker C:Enough of this rambling. I'll ask my dear wife: has she got any emails?
Speaker B:I do, I do, I do. We have an email from. I don't know if he's a new listener, but he is certainly contacting us for the first time. And his name is Rene Ludwig.
Speaker D:Hey, Rene. That is our colleague from Germany, and we spoke with. Funny. You know this thing in Eastern Germany, where they invited blind people to come? We had them on the show.
Speaker B:Oh, did we?
Speaker D:Yeah.
Speaker B:Sorry. Sorry, Rene.
Speaker D:I didn't realize you were you. You know.
Speaker A:Hello.
Speaker D:Hello.
Speaker B:Yes. Will he like that comparison or not? I don't know.
Speaker E:I don't know.
Speaker A:We can ask him. Well, too late.
Speaker C:What's he saying?
Speaker B:Rene says: Hi Jan, Mo and the rest of the BGC team. That's us. Or you and me, right?
Speaker C:We're just the others.
Speaker B:We're just the remainders. It's okay because they're colleagues. So that's all right.
Speaker D:It's the truth, you know.
Speaker A:It's all right, Oren.
Speaker B:We won't take it personally.
Speaker E:Don't worry.
Speaker D:I just listened to you're on holiday.
Speaker B:Yeah, that's it. We didn't count. I just listened to the latest episode of BGC and enjoyed it very much. Especially when Mo was talking about his trip to Berlin, which is about 250km distance from Chemnitz, where I live.
Speaker D:You see?
Speaker B:Yes, Chemnitz. At the end you talked about traveling with travel agencies for blind people and about travel eyes in particular. My partner Anya and I did two trips with them which were great. Of course it depends on the group members, but we were lucky in both cases. You do not have the same guiding person during the whole trip. Part of their concept.
Speaker C:Oh, I thought you did.
Speaker B:Yeah, me too. This is really interesting. Part of their concept is that they switch couples every day, so everybody gets to know each other. And if two people don't get along with each other, it's only for one day.
Speaker A:All right, that is actually smart.
Speaker C:So there's no additional swingers thing going on here, is there?
Speaker B:No, I think that's really good. So maybe you could read my comment to the audience in the next episode, if you like. Well, there you go, Rene. In fact, you sent this on the 13th of September, so I think we missed the last episode. Sorry about that. Greetings from Germany, Rene says. Thank you so much, Rene.
Speaker D:Yeah, yeah, yeah.
Speaker B:Vielen Dank! Great. So, yeah, that sounds really good, I have to say. And obviously it depends on who's there. If you're traveling in a group and there's like one annoying person in the group, you know, that can happen sometimes.
Speaker D:It can happen. You know it will happen. You know it will happen.
Speaker C:I think if we went we'd be the annoying group.
Speaker B:We probably would be the annoying group.
Speaker D:You know yourself very well.
Speaker E:I don't know.
Speaker A:Maybe he was talking about all of us, actually. No, no, no.
Speaker B:He's talking about you and me. He's talking about himself and me. I think that's what it is. Can I interject briefly on something? Nothing to do with anything at all.
Speaker C:All right.
Speaker B:Oren got a few new T-shirts before his holiday, and people were stopping him on the street and laughing.
Speaker D:Really? Tell me, so what is on the. What is printed on it?
Speaker B:What was the first one, Oren?
Speaker C:So. Well, the ones you got me.
Speaker D:The.
Speaker C:The first one that people liked, it seemed. Well, there.
Speaker D:There were.
Speaker C:There were a lot of Americans in Barcelona, and it was almost like they all knew each other. And they were kind of going around to their other friends saying, you should see the blind guy, because he's got a great T-shirt. And the T-shirt said: Yes, yes, I've tried glasses.
Speaker D:Yes.
Speaker B:They loved it. They were, like, standing around laughing.
Speaker C:The other one I've got says: Sorry, I didn't see you there.
Speaker D:Oh, my God.
Speaker B:Which is like, you know, in case he bumps into anybody.
Speaker C:But again, I gave my cane a good thrashing, because Larry wasn't with us. I did almost manage to break the cane at one stage.
Speaker B:He bent it quite severely a couple of times.
Speaker D:Yeah, okay.
Speaker C:Good old Shiroski.
Speaker A:Yeah, it's. It's not a helicopter propeller. You can't fly with it.
Speaker E:No, no.
Speaker D:Disappointing. Disappointing.
Speaker A:I know.
Speaker D:Yeah.
Speaker B:It did a good job, though. And he brought a spare, actually. Inspired by you and your stories of being on holidays without.
Speaker A:Yeah.
Speaker D:With a broom, you know. Then you end up waving a broom around.
Speaker B:So we actually brought two canes on holidays with us, but we only needed the one, so that was good. It's like, you know, belt and braces approach, you know, making sure you're covered. So there you go.
Speaker D:That's true. All right, all right, guys, this is enough.
Speaker C:Yeah, that's the end of the show. If you've got any comments, please do write to [email protected] and let us know how you're getting on with your Meta glasses or anything. We'd love to hear from you. Maybe next time we'll talk about our holiday a little bit. Yeah, the pros and cons of using canes in Barcelona. Okay, folks, bye.
Speaker A:Bye.
Speaker D:Okay, bye.
Speaker A:Bye, bye.
This week we’re talking to Professor Pieter Roelfsema from the Netherlands Institute for Neuroscience in Amsterdam. He is chatting with the lads about a research project looking at direct electrical stimulation of the brain via cortical visual neuroprostheses, which is a promising approach to restore basic sight for the visually impaired by inducing a percept of localised light called ‘phosphenes’. We are offering Jan as a guinea pig for this one!
Meta are opening up an SDK (Software Development Kit, for those of us who are less technical) so external software developers can develop and add their applications’ functionality to the Meta glasses. Is this a good thing? What do you think? Let us know at [email protected].
We've got an email from friend of the podcast, Rene Ludwig, who is a 'TravelEyes' user and thinks it's a great service.
So, connect those electrodes directly into your ears and prepare to listen to Óran rambling on and on!
Disclaimer: Ramble tags would like to disassociate themselves from any rambling producers.
Links for this show:
· Professor Roelfsema’s paper on his sight restoration research: https://pubmed.ncbi.nlm.nih.gov/40578385/#:~:text=Direct%20electrical%20stimulation%20of%20the,localized%20light%20called%20'phosphenes
· TravelEyes: https://www.traveleyes-international.com/
· Ramble Tags: https://rambletag.co.uk/
Support Blind Guys Chat by contributing to their tip jar: https://tips.pinecast.com/jar/blind-guys-chat