#115: Super-advanced autocomplete
Transcript
Speaker B:Welcome to Blind Guys Chat where this guy, Oren O'Neill.
Speaker C:Hello. And this guy, Jan Bloom.
Speaker D:Hello.
Speaker C:Talk about the A to Z of life. Well, hello ladies and gentlemen and you're very welcome to the first episode of 2025, episode 115. Peugeot did not make a Peugeot 115. But I was thinking this morning, Jan, there was a Renault 5.
Speaker B:That's true.
Speaker C:I think we can, I think we.
Speaker E:As well was a Renault 4.
Speaker C:Yeah.
Speaker E:But we also have the Cobra 20 already, what I learned, you know, the firework that's really making noise here.
Speaker C:Okay, well you're very, very welcome to the show. This is going to be a special show, because we have a panel of, oh, I don't know, will we call them experts? Okay, for the next hour we'll call them experts. We have a panel of experts and we're going to have a discussion about AI, because as you know, myself and Jan got Meta glasses before Christmas. And while we both had success for a while with what I call the look and tell, or look and describe, feature with the glasses, it went away fairly quickly, even when I had my phone signed up to a VPN. Jan the lucky ducker has just told me that he's got it back since Friday.
Speaker E:But I keep my fingers crossed, guys, because David Renstrom, you know, the guy who started all this happening, he sent me the email, was I still enjoying it, but I had to reconnect and all that and try again. And then, well, I was really fed up, to be honest. But I tried it and then it worked. So it was really funny. I had Spotify and Chesm also connected, but those two are gone already. So I don't know how long it will last.
Speaker C:So let's introduce our panel.
Speaker F:Yep.
Speaker C:So we've got Brian Dalton from Dublin. Brian is a former international women's weightlifter, and he became the Miss Christmas Island winner in 1983. He's now wanted by the FBI in connection with failure to stop one of his drones at the intersection of Sunset and Hollywood Boulevard last week, and he's in hiding because of that. But you're very welcome, Brian. Thanks for joining the crew.
Speaker E:Thanks.
Speaker A:Amelia Norridge. Great to be here.
Speaker C:We have Mr. David Renstrom from Sweden. David is Sweden's answer to ABBA, and David is a self-proclaimed taunter of all those people who are vegan. And to demonstrate what David does on a daily basis: he usually sits outside his house on his veranda and he eats a steak dinner in front of white polar bears, because polar bears, as you know, only eat vegan. Now, he doesn't do it for the black polar bears, because he's not racist. But you're welcome, David.
Speaker D:Thank you.
Speaker C:Thank you. Jesse Weinholt is the former Chancellor of Germany. Yes, he was the Chancellor from 1998 to 2037. He's currently wanted on a fraud conspiracy for trying to defraud the German people of 201 pink ducks and four psychotic turkeys. And he's currently living in hiding with the former King of Spain, King Juan Carlos. So you're very welcome, Jesse.
Speaker F:Thank you. Thank you for the fabulous introduction.
Speaker C:Josh O'Connor, who you haven't heard of before, because more or less everybody else has been on before. First time on the radio. You're very welcome, Josh. Josh is the CEO and head of Inter Axis Sole Trading plc Ltd, his own one-man company. And Josh is the former headmistress of the Taiwanese Bluegrass Ballet Company. But unfortunately, Josh is also in exile because, at the end of one of their concerts in Taiwan, he suggested to his colleagues that we should all go for a Chinese takeaway. So you're very welcome, Josh. And the first time on the podcast, you probably never want to come on again, but you are very welcome for the moment.
Speaker G:Thanks, Oren. All I can say is you've been following me, man. That's it.
Speaker E:Yep. What an intro.
Speaker F:Yeah.
Speaker C:So our last participant is Moroccan-born Brazilian pole dancer Mohammed Laachir. Mohammed is a firm believer that boxer shorts should only be worn by the very, very old, or penguins. And unfortunately, Mohammed... well, not unfortunately, but he is still looking for the perfect woman. We're going to get you a mail-order bride, Mohammed, and she's going to be coming to you shortly in the post. Yes, she's a blind Barbie and she's going to be perfect for you, because she won't be able to see you. So that is our panel, ladies and gentlemen.
Speaker B:Wait, wait, wait. I want to know, is she wearing boxer shorts? Because I'm not marrying an old woman. That's the only thing that's important. That'll be very disappointing. I'd be very angry. You don't mail me an old bride. It's unfair.
Speaker C:Anyway, you're all very welcome and thank you very much for coming. We specifically wanted to hold this roundtable on AI, and we wanted to do it last year, but I think we just couldn't find the time. But I think we've got the right panel to talk about AI, which is artificial insemination. So those of you who thought we were going to talk about artificial intelligence, you might as well just hang up now or just stop the podcast, because we're talking about artificial insemination. But no, seriously, you're all very welcome. I want to talk a little bit about my gripe with Meta. You already know my view: the look and describe feature, which is of huge benefit to those of us who are blind or low vision, is something Meta really should have brought to Europe, and it's a travesty that they haven't, because from what I've heard across Facebook and from various people in the States and Canada and Australia, it's really a great feature and I would love to be able to have it on my phone. Now, there are some tricks using VPNs, and maybe we'll talk to David about that later on to see if he's got any new tips. But what seemed to happen towards the end of last year is, somehow, I don't know if it was Meta or if it was the country you're in, but the VPN that you were using to try and trick Meta into saying, no, actually I'm in Canada or I'm in Australia, so that look and describe would work, began to fail all over Europe. And some people still have success with the describe feature, some don't. So we don't know what the hell is going on. Brian and Jesse and Mohammed have Envision glasses, which are, if we're right, about 10 times the price of Meta glasses. And my burning question for the guys, which they can answer later, you don't have to answer it now, is: is there a look and describe or look and tell feature within the Envision glasses, and does it work all the time, are there any problems with it, etc.? Maybe, Jesse, you've probably seen, I would imagine, the Meta sunglasses. Could you basically describe the physical difference between the Meta glasses and the Envision glasses?
Speaker F:Yeah, definitely. So I've been using the Envision glasses for, I think, a few years now, and I've also had a play with the Meta glasses at my work. So I think it's a good thing to compare the two. With the Envision glasses, the most notable difference is they don't have lenses covering your eyes in the default configuration. There's a small overhead display thingy just above your eyes where the camera is located. The camera sticks out a little bit, and the left arm of the glasses is very, very thin. There's no electronics at all in the left part of the glasses. All the electronics are in the right part: there's a small touchpad on the right side of the glasses, there's a microphone, there's a small battery, and there's a charging port. And that's all on the right side of the Envision glasses. You could also connect different frames to the Envision glasses, which will make them look more like traditional glasses, more like the Meta glasses and other sunglasses. But especially in the default configuration, well, they're Google Glass, and they look a bit sci-fi-ish. I wouldn't say sci-fi as in it would look very weird to walk with these on the street, but with the Envision glasses you can definitely tell that there's some tech going on. While when I was wearing the Meta glasses, people just didn't notice at all that there was any tech inside the sunglasses. People would only look at me weird for wearing sunglasses in winter. But that's a whole different story.
Speaker E:When the sun is shining. It's not.
Speaker C:The sun is shining.
Speaker F:Yeah, that's the case. Well, that's the case like one day in 365 days in the Netherlands. So.
Speaker E:No. Get it.
Speaker C:Yeah. Mohammed, how do you feel if you're out and about wearing these glasses?
Speaker B:So I remember, as a teenager, I think everybody has maybe had that feeling where they were just a little bit reluctant to use something like a cane unless they had to. I kept away from the cane for a long time, and then when I started using it, it became very apparent that I lost a lot without the cane and I probably hindered myself a lot without the cane. And so when I realized that, I decided to stop feeling self-conscious about using tech that helps. I don't really care anymore. So with the Envision glasses, I don't really care either, I just wear them. I have to admit, I only just got my Envision glasses, and so I'm probably the least knowledgeable about them of the people here, compared to Brian and especially Jesse. But you can use them for a lot of things. There are exploration modes where they'll tell you: oh, there is a person at 5 o'clock, or there's a door at 2 o'clock. The reason that is useful is because it seems not to require Internet access for that; it's very fast and it's very useful. So the Envision glasses have a lot of modes that are quick and geared towards someone who is blind, and that you can use very fast, with very fast feedback.
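To make the exploration mode concrete: Envision hasn't published how it maps detections to clock positions, but given a detector that reports a horizontal angle relative to straight ahead, the announcement Mohammed describes is a small conversion. A minimal sketch in Python; the function name and the angle convention are illustrative assumptions, not Envision's actual API:

```python
def angle_to_clock(angle_deg: float) -> str:
    """Map a horizontal bearing to a clock-face direction.

    Assumed convention (not Envision's documented API): 0 degrees is
    straight ahead (12 o'clock) and positive angles run clockwise,
    i.e. towards the wearer's right.
    """
    angle = angle_deg % 360          # normalise so -90 (hard left) -> 270
    hour = round(angle / 30) % 12    # each clock hour spans 30 degrees
    return f"{12 if hour == 0 else hour} o'clock"

# A detector reporting a door 60 degrees to the right and a person
# 150 degrees round would announce exactly what Mohammed quotes:
print("door at", angle_to_clock(60))      # door at 2 o'clock
print("person at", angle_to_clock(150))   # person at 5 o'clock
```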
Speaker E:And you switch modes via your voice or via tapping or something?
Speaker B:No, via tapping. So you have to tap, yes. And they're audible. The Envision glasses are very audible as compared to the Meta glasses, so that's also a difference. I think the major difference between them in terms of functionality is that the Envision glasses were really built for blind people, and it shows in some of the functionality that they have. And they have the look and describe feature as well, which is called Ally.
Speaker C:So Brian, have you used your Envision glasses outdoors, or are you using them mainly indoors?
Speaker A:I've mainly used them indoors. Just to come back to the look and feel, I suppose, for a second before we get into that one. So you're absolutely right, the default configuration has no lenses in the glass part. And I'm always very conscious when I purchase a product: how does it look visually on me? You know, do I look like something out of Star Trek, or do I look as normal as it gets? I saw the Envision glasses for the first time in the Netherlands, with a really good friend of mine who had them, and I wanted them there and then. One of the things that you could do, as described, was change the frames on them, which makes them look more like glasses. And the visual information that I got back from my own human personal assistants, nothing to do with AI, but they are my eyes, arms and legs in a human way, and we'll get onto AI in a second, was that visually they look fine as glasses. The issue that my human PA had was the placement of the camera, because it's placed on the right-hand side, as Jesse described; that's where the guts, the whole electronics of the glasses, are, and that's where the camera is. And my PA felt that the camera would be best placed in the centre. Use cases for them, what do I do? You can do facial recognition, so you can train the glasses to recognize up to five different faces. This is amazing. So you can train the glasses to recognize someone's face. What you have to do at the moment is have five different pictures of one person with different facial expressions. And then when that person is in the room, you can, as Mohammed said, ask Envision to tell you if that person is in the room, or the glasses will tell you that such-and-such has walked into the room, if they're in your facial recognition. You can also ask it to describe their facial expression: are they happy, are they sad, what's their facial expression and what do they look like.
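What Brian describes, training on a handful of photos and then spotting the person in a camera frame, matches the usual small-scale face recognition pattern: compute one embedding per training photo, then compare new faces against the stored set. A minimal sketch using the open-source face_recognition Python library; the library choice and file names are assumptions, as Envision hasn't said what it uses internally:

```python
import face_recognition

# "Train": encode five photos of the same person with different facial
# expressions, as Brian describes. The paths are placeholders.
known_encodings = []
for path in ["ann_1.jpg", "ann_2.jpg", "ann_3.jpg", "ann_4.jpg", "ann_5.jpg"]:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:                      # skip photos where no face was found
        known_encodings.append(encodings[0])

# Recognise: check whether that person appears in a new camera frame.
frame = face_recognition.load_image_file("room.jpg")
for candidate in face_recognition.face_encodings(frame):
    if any(face_recognition.compare_faces(known_encodings, candidate)):
        print("Ann has walked into the room")
        break
```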
Speaker C:Let's talk about this Ally app that kind of runs in conjunction with the glasses. I think it can run separately to the glasses, but let's talk a little bit about what it does for the glasses. Does it enhance the way the glasses work, or is it just a separate app that's on your phone?
Speaker F:It actually is both. So Envision Ally started as an app on your iPhone, and since that time it has rolled out to the Envision glasses, Android and the web, so you can even use it on your computer. And what it does is, it is an AI assistant that can take pictures, similar to how Meta can take pictures. But there are two things that set it apart from, say, Meta or other AI assistants. The first is that it's audio to audio, which means that you can interrupt it, you can talk over it, you can tell it to simplify something, and it also replies very quickly. And it's very easy to use, because it analyzes your audio directly. Meta will analyze the audio, convert it to text, then convert the text to a question, send the question off to the server, answer it, and convert the answer back to audio. What Envision Ally does is go from audio directly to audio. The second thing that sets it apart is that you can personalize it. So you could, for example, tell your Envision Ally that you're the commander of some sci-fi Star Trek airship. And yes, I've tried this, it does work. And: please address me in a certain tone. It's brilliant. It's also possible to make it sarcastic. You can make it anything you want, and you can even tell it, like, hey, if I ask about obstacles, only reply with 'there are obstacles' or 'there are no obstacles'. You can basically set up your whole personal AI preferences, and that's what sets it apart. For now, you can only do that on the phone; on your glasses it's just a regular AI that doesn't really have too many annoying personality traits. But it's really a unique take on AI, and I think it's most comparable to, maybe we can discuss this later, the new Gemini Flash and GPT-4o that do video. Although Ally does not do video yet; it only does pictures as of now.
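Jesse's latency point is easiest to see side by side. In a chained design each stage has to finish before the next starts, so the delays add up and an interruption means restarting the chain; a speech-to-speech model handles audio end to end and can stream. The sketch below only simulates the stages with sleeps; the stage names and timings are illustrative assumptions, not Meta's or Envision's actual internals:

```python
import time

# Stub stages with made-up latencies, purely to illustrate the stacking.
def speech_to_text(audio):
    time.sleep(0.5)                    # transcription
    return "what is in front of me"

def llm_answer(question):
    time.sleep(1.0)                    # server round trip
    return "A table with two cups on it."

def text_to_speech(text):
    time.sleep(0.5)                    # synthesis
    return f"<audio: {text}>"

# Chained pipeline, as Jesse describes the Meta flow: the user hears
# nothing until all three stages are done, roughly 2 seconds here.
def chained_assistant(audio):
    return text_to_speech(llm_answer(speech_to_text(audio)))

start = time.time()
chained_assistant("<mic input>")
print(f"chained reply after {time.time() - start:.1f}s")
```

An audio-to-audio model, as Jesse describes Ally, consumes and produces audio within a single model, so it can start streaming a reply almost immediately and be interrupted or told to simplify mid-sentence.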
Speaker C:Let's bring David Renstrom in here, because, David, this all kind of kicked off because you were good enough to give us a demonstration and talk about the Meta glasses, because you bought them in June. From what you've heard from these three guys at the moment, and you haven't heard a lot, I understand that, about the Envision glasses, but do you feel that you still made the right decision in buying the Meta Ray Bans?
Speaker D:Yeah. Well, first of all, I should probably mention that I work for an accessibility company in Sweden.
Speaker C:Have you told them that?
Speaker E:Insyn. I can tell it.
Speaker D:So we have. Actually.
Speaker E:We are not commercial.
Speaker D:We are very commercial.
Speaker E:Yeah, yeah, yeah. But we are. This podcast is not at all. So we can.
Speaker D:No worries. Yeah, but, but anyway, yeah, I didn't say the name. You said the name of the company.
Speaker E:Yeah, yeah, yeah, yeah, yeah, yeah, yeah. I get money from Dick, you know.
Speaker D:Yeah, but anyway, so we have actually advertised the Envision glasses for a while, like, I don't remember, two years or something like that. And it has been a hard sell, actually, because of the price tag, obviously, and the fact that you have to have a good motivation. You need government money to basically fund it. There are some people buying it out of pocket, but that's not very common. Back to the question: currently I'm not regretting using the Meta glasses. I think they are just at the beginning. I mean, they are improving all the time. Yes, it's very annoying that they keep limiting the use in large regions of the world, but they are still improving. I've listened to an interview with someone working at Meta, and they say that actually working on accessibility implementations is very popular within the company. So I think they are going to come out with new stuff. And there are rumors that they are working on live video descriptions for the glasses, so that's very interesting, when that comes. I'm not that comfortable with putting on the Envision glasses. The way they look now, they look strange. They don't look like anything I would want to wear in public, actually. I'm sorry, but that's the way I feel. But when I wear the sunglasses, the Meta glasses, it feels like these are normal glasses. If people notice them, they just say, like, oh, you're looking cool today. That's basically what they say, usually. I understand that sometimes you have to trade coolness for accessibility; I mean, that's the way it is sometimes. But I don't feel that way right now. And I should say that we have canceled advertising the Envision glasses. We are not distributing them anymore. Our company is distributing the Meta glasses instead, because basically our boss is so impressed by Ray Ban that he wants to be a distributor in Sweden. So that's what we are now.
Speaker C:To be selling a product that's 10 times the price of something you can buy for €300 or €400 is absolutely outrageous. I'm not sure where I'm getting 10 times the value in the AI. I still wouldn't buy a pair of Envision glasses with my own money.
Speaker B:So let's start with the cost from Envision. Envision has an app that does pretty much the same thing as the glasses do, for free. Yes, I do think that price point is too high. But also remember, they've been selling these glasses for over seven years now, or maybe six years, I don't know exactly how long, and the Meta glasses are only just now coming on the scene. The way the Meta glasses have been built is with a huge supply chain and a much bigger, wider market, which allowed them to produce the glasses at much, much lower cost, and that allows them to sell them for way cheaper. I don't think that Envision is paying, let's say, €100 or $100 per pair and then tacking on this price tag because of accessibility. I do not believe that at all. I think it's the difference of what market you're addressing and how well you can manufacture, how big the batches are that you manufacture, and therefore how high the price is per pair. And I think Envision, at the moment, aren't even manufacturing their own glasses. They're using glasses that are off the shelf, that have already existed for quite a while, and the stock is finite. Also, when those glasses become more ubiquitous, Envision will probably become an app on one of those glasses, to make them more accessibility-focused. And then you'll see the price coming down hugely, because it's already free on the phone.
Speaker A:I absolutely agree. There is a massive price tag on it. And I think it's really a good point that you made, Mohammed, around the way they're marketed versus the way the Meta glasses are marketed. There are different levels of pricing as well, just to say. There's one called the Home Edition, for example, and there's one called the Enterprise Edition. If you go for the cheaper option, and I did the maths on this before I shelled out the money, you do have to pay for an upgrade every year to get the latest and greatest features that they roll out. Or you can leave it as it is and, you know, keep what you get; you get, I think, free upgrades for a year. But there are different levels around price as well. So you don't have to shell out the, you know, top-dollar price initially if you don't want to. You can go for the basic edition and work from there. And they will also work with you if you do want to change from the Home Edition to the Professional or Enterprise Edition. So they do give some options there.
Speaker F:As well. Right now, I mean, if you look at Envision, they have to pay a hefty price tag to Google for their hardware, which is also quite outdated hardware right now compared to the Meta hardware. And they have to do their software development for a very small number of units. So that's really tricky.
Speaker B:Now think about what's going to happen once those glasses become like phones. So Meta will have glasses, Google will have glasses, Apple will have glasses, Microsoft will have glasses. Everyone and their mother will have glasses, and they'll sell them to everyone who wants to buy them. And these things are going to become commodity products, which is where the market seems to be going right now. What's going to happen is that Envision, a company that's already very used to building accessibility software for glasses, is going to convert to an app on any of those glasses and give you the functionality that's currently on the Envision glasses for a way reduced price on the new commodity-style glasses. I will tell you, Oren, if you do not want to be an early adopter of this kind of tech, do not buy the Envision glasses. This is expensive early-adopter tech that is actually maturing right now at a dizzying rate, and that works very, very well. But it is still expensive, and the costs will come down.
Speaker F:I think that's the same thing we see with all kinds of tech. I mean, when phones just started, they matured at a crazy rate, and that was super fast. And now phones are mature. Same goes for wireless headphones and Apple Watches and smartwatches and all that. And the same will happen for glasses.
Speaker C:Let me bring Josh in here, because you've been very quiet, Josh. But, Josh, as I was saying before we went on air, before we started recording, I was listening to a podcast recently where it was kind of really pointed out that in terms of AI, in terms of where we are right now, it's really just the tip of the iceberg. And I really have a query about what people who are blind or vision impaired should be expecting, or should be thinking, about AI at the moment, because I don't think AI is anywhere near where I would want it right now. What's your feeling on all of this?
Speaker G:Well, first of all, thank you. The discussion has been really, really interesting, and obviously it's quite specific around the Envision stuff versus the Meta stuff, and there are, you know, pros and cons to both technologies. But what struck me from listening to it, and it's fantastic to be a part of this, to hear this, is that this is a kind of an age-old thing, in a way, in the whole area of assistive technologies in particular, because assistive technologies have been notoriously expensive forever. And I do think it's a question of scale; that's a part of it, because you do have smaller companies who are trying to innovate in a particular space, and that's expensive, and then the market that they're selling to is smaller, and so therefore it's expensive all round. And then what happens? You have a technology that's, like, ubiquitous as such, that comes along and changes things. And that was very much, I know for me, the iPhone, for example. When that hit, I mean, that, you know, has a ton of assistive technology in it, and it was basically a piece of glass. You know, someone had told me, hey, my blind friends are going to be using a thing with a piece of glass to communicate and email. And that's basically it. And how many buttons does it have? Oh, it has one. Oh, no way that's going to happen. And then here you are, right? And then there was no more Symbian on Nokia and all of that, back in the day. And so you have a lot of these technologies that mature or are absorbed into more ubiquitous technology which is used by everybody. I think it was Mohammed who made the point, or David, apologies if I get this wrong, but they made the point about how sometimes this stuff can be out of this world, or you're sticking out, or you're screaming, I have a disability, or something like this. It looks remedial, right? And you don't want that. You want something that is a part of what other folks have, and it's kind of cool or whatnot. But in terms of AI, the whole artificial insemination piece, obviously it's in a casino phase right now. There are these push technologies which are being brought out by these big tech companies, and there's a certain degree of throwing out enough stuff to see what sticks. And it depends to some degree on to what degree some of these folks are sincerely trying to meet people's user needs. To what degree are they trying to gouge, make money, cash in? All of these things are a big part of this whole discussion. And there are some folks who are developing technologies where, okay, it might be a bit expensive, but you know what, their heart's in the right place and they're trying to do the best they can to innovate and to push things along, and then maybe those things will get absorbed into an ecosystem like the Google thing or the Apple thing or, God love us, the Meta thing. I don't know.
Speaker A:But you know, AI is a game changer. If we take JAWS, for example, the Picture Smart feature within JAWS, which, again, has improved greatly over the last, say, six months. Three things that I can do right now, just off the top of my head. Somebody can send me a picture in WhatsApp. And this was done a couple of months ago: a group of mates were in Australia and they were sending me photographs. And if they don't put a caption on that picture, I have no idea, as no blind person will have, what's in that picture. So you can save the picture on your PC, run it through Picture Smart in JAWS, or any of these apps, Be My AI, Envision, whatever app you choose to use, and you'll get a text description of that picture. And I remember sending it back to them, and they were blown away by the description that I was getting. You can also use Picture Smart in a PowerPoint presentation. If you're being shown a graph or an org chart, you can get a description of that graph or org chart, and you can actually ask Picture Smart different questions that you may have around it, to get information on it. The same with a YouTube video: you can have a YouTube video up and you'll get a text description of that YouTube video. And something we all love to do, shopping online, which has been a barrier for me and others, not only in terms of accessibility on websites, but because lots of websites have pictures, because we live in a visual world. And I would use my human PA, my eyes, whatever you want to call them, to describe pictures. And now I can go onto a website and pull up a desk or whatever it is that I want, and get a description of that image, which will obviously determine whether I'm going to buy it or not.
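Freedom Scientific hasn't published Picture Smart's internals, but the workflow Brian describes, sending a saved image to a vision-capable model and asking follow-up questions, is the standard pattern. A minimal sketch with the OpenAI Python SDK; the model choice and prompt are assumptions, not what Picture Smart actually runs:

```python
import base64
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Encode the photo saved from WhatsApp as base64 for the API.
with open("photo_from_australia.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Describe this photo in detail for a blind user."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```

Follow-up questions, the part Brian highlights, are just further user messages appended to the same conversation before calling the API again.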
Speaker G:You know what it sounds like to me? It's the tech that works. And this is one of the brilliant things that Apple did back in the day when they created VoiceOver: they understood user needs. And I think, like, you know, when tech is elegant and simple and just works, there's an awful lot of complexity behind the scenes that goes into making things simple. And the stuff that really works, right, is built around a clear user need. And that's the problem a little bit with some of the tech landscape now: it's hard to tell apart those who are, like, seeing a marketing opportunity. We see it in accessibility work now: there are all of these companies popping up with AI-powered accessibility tools or overlays and all of this kind of stuff. And a lot of it's just a bit shit, really. It's just a bit shit. Like, I mean, let's just be honest about it. But then there are things which are pushing the envelope. There are things which are innovative. They mightn't be totally there yet, but they've got potential to be really, really great, or they are meeting real user needs. And I think, from the perspective of developing a specification, if a technical requirement is going to be fit for purpose, it has to be built on, you know, clear user needs.
Speaker D:What's going to be interesting in the future is how much more AI stuff can be done locally, because now lots of AI stuff is being done on the server side, and I mean, that's costing a lot. That's costing a lot for the companies providing it. I can only hope that, for example, Be My Eyes will always stay free, or rather that the Be My AI feature will stay free, because I use that quite a lot. Even though I have the glasses, I sometimes use the Be My AI feature. And I mean, it costs a lot to provide energy for this, right?
Speaker G:Yeah, yeah, the servers, the environmental cost of it. Right. Sorry to interrupt, you really hit a button there. I heard the other day that for every 100 words that ChatGPT generates, it uses about 3 liters of water to cool the servers. For every 100 words.
Speaker D:I can understand that. So, I mean, we have already seen some new chipsets coming out for phones and laptops with some kind of local AI support. Apple has it, Microsoft has it. And I mean, this is probably the future: more stuff is going to happen locally, and that means probably faster responses, and also that the costs and environmental costs come down.
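As a rough illustration of David's point about local AI, small vision models can already run fully offline on an ordinary laptop. A sketch using the Hugging Face transformers library with the BLIP captioning model, one plausible choice among many, not something any of the products discussed here is known to ship:

```python
from transformers import pipeline

# The model downloads once, then every query runs on-device:
# no server round trip and no per-query cost, which is David's point.
captioner = pipeline("image-to-text",
                     model="Salesforce/blip-image-captioning-base")

result = captioner("fridge_contents.jpg")
print(result[0]["generated_text"])  # e.g. "a refrigerator filled with food"
```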
Speaker B:Good point. A quick thing that I want to throw in here, and I keenly feel this as a developer who works on AI products all the time. I showed you all FS Companion, and of course we're talking about Picture Smart, and I've been involved in all these products and features. What you as a developer are keenly aware of, always, is that the thing that you're putting out right now is good, otherwise you wouldn't put it out. But in your head you have a thousand ways to improve it already. Right? And so it's good enough to put out now, because it's already helping. But there are so many things swirling around in your mind, like, oh, if you could do this, oh, if it was a little bit faster, oh, we could maybe add this thing or that thing or the other thing. I live alone, and Be My AI is a friend. Picture Smart is a friend. All of those things are extremely useful, because I don't want to be calling my mom every time I need something from the fridge. Right? It's easier to use Be My AI. What we're being sold is rather a puppy, right? It has to grow, we have to nurture it. And it's still, you know, it still pees on the floor, but...
Speaker G:AI can do that.
Speaker B:Wow.
Speaker E:Hold on a second.
Speaker B:And, yeah, so I think we're getting there. And if you think about it, Oren, when we released Picture Smart, right, in 2023, the first iteration, you couldn't chat with it. And every single email we got from people was: we want to chat with it, we want to chat with it, we want to chat with it. And a couple months later, people were able to chat with it. So these things improve, and it goes a little bit slowly, so you don't notice, but you'll find yourself using AI all the time for things very soon, I think. I firmly believe that.
Speaker E:Yeah.
Speaker D:Actually, I think in the future we won't really think about AI the way we do now, because now it's something special; in the future it will be something integrated into everything.
Speaker A:Yeah, that's a brilliant point. Yeah, really good. Yeah, absolutely.
Speaker C:Yeah. So, Jesse, you've been very quiet. Throw your hat in the ring there.
Speaker F:Yeah, I was being polite, but I think Mohammed made a really, really good point. And I think also we as users of the AI should really think about that: when we use AI, we are using something that's under heavy development, and we should provide feedback to the developers that are willing to listen. And I know that a lot of developers of this stuff are willing to listen, are willing to make a difference and make it better. And I think the good thing is that a lot of these tools are software, so you don't necessarily have to buy thousands of dollars worth of equipment to start using the AI. You can usually use it on your phone, your computer, the tech you already own. And you can really see the developments in software. And I think it will be up to us as users to filter the nonsense, the marketing, from the actual good improvements, of which there are so many. And like Mohammed said, I use AI every day already, and some tools are a gimmick. Some tools I say, like, okay, this is cool, it's cool that it can do this, but it's not really useful in my day-to-day life. But then I look at it again after three, four months and I say, like, hey, they have improved so much. They've improved a lot. And now it has become an invaluable tool.
Speaker C:I'm going back to Meta, even though we all seem to hate Meta. Given that Meta did not release the look and tell, the look and describe, feature in Europe, I think it would have been appropriate for Meta to announce that on their website, in their advertising. To say: if you are blind, if you're low vision, this is what these glasses will do for you in Europe right now. And I know plenty of people who got these glasses, these Meta glasses, because they were a very affordable price, got them for Christmas, and then they discovered: oh, this feature, which I could find really useful, doesn't work in my country.
Speaker G:Do you feel, Oren, they weren't clear about that?
Speaker C:Yeah, yeah, I do, I do. Because when I went into a physical shop, I was talking to, you know, a physical person who was trying to sell us these glasses. And she went through her demonstration, and Clodagh was impressed and...
Speaker E:And did she show you the AI features then?
Speaker C:No, she didn't. And that's the thing, because she was just a Ray Ban salesperson.
Speaker G:Yeah.
Speaker C:So, you know, I was the one who had to go in and say, oh, I believe there's AI in these. Oh, yes, there is, and it's a great function. But actually, what I should have done was query her more and say, well, okay, show me the look and describe. Let's see this work, because I'm not buying them otherwise. And I think Meta, in that sense, I think Meta sold Europe a pup, which was to say: here are these really cool glasses. And everybody in the blind and low vision community went, oh, these are great and I can afford these. And some people bought them without knowing that the look and describe feature did not work.
Speaker G:Yeah, a bit like iOS without VoiceOver or something, right?
Speaker C:Yeah, absolutely.
Speaker E:Yeah, yeah, yeah, that's good.
Speaker F:Yeah, yeah. Well, iOS and Apple.
Speaker C:So is there a case that we need, as a community, to go to these companies, whether it be Google or Apple or Meta or whoever, and say: you need to be a lot more transparent, in the same way that EU regulations in the last few years have come in to say you must show exactly what ingredients are in the product that I'm buying at the grocery store?
Speaker A:But you see, this is where accessibility plays a part, and it needs to be considered everywhere you go. So most people, I would think, who buy the Meta glasses are using them to take videos. You've got your Instagrammers, and you see them all over the place, stopping to take their photos and their videos. They're not thinking about the look and describe feature, Oren, which is important to you as a blind person.
Speaker C:No, absolutely, I agree with you.
Speaker E:But we should stand up also then as a blind community, because, David said...
Speaker C:Earlier on, in the previous podcast, David said, and I'm sure he's right, that of the people who want to work for Meta now, most of them want to work in the accessibility area in Meta. Now, if they want to work in the accessibility area in Meta, why aren't they standing up in Meta and saying: hold on a second, we need to tell the community that we're selling to... okay, we know we're going to sell more of these glasses to sighted people than we are to blind and low vision people, but we need to be clear about that. And I would think that if I want to work in accessibility, and I say I want to work in Meta, I want to be clear in my own head that if something is wrong, if the way we're advertising, the way we're selling the product, is not right, then you have to stand up in these companies and say: I don't believe this is the right way to inform, or not inform, people.
Speaker G:I have a question, guys as well.
Speaker E:Yeah, me too.
Speaker D:Sorry.
Speaker G:Yeah, and you go first and then I'll go.
Speaker E:Okay. But I want to make a statement, I think, also. Mohammed said earlier, before the recording, that Meta made a statement already years ago that they won't release the AI features in Europe, since they are blocked for certain...
Speaker B:Only the visual models.
Speaker E:Yeah, yeah.
Speaker A:Only.
Speaker B:Yeah, it's all about visual.
Speaker E:Exactly. But then we also need to make the statement that it would be really very important for us, and it's really a great benefit, and we should then also ask the EU to stand up, or to make a separate decision for us or whatever, you know, so that it would be available.
Speaker B:I think they actually did announce it. I think the only problem is that they didn't announce it hard enough, because...
Speaker E:Nobody paid attention to it.
Speaker C:Nobody paid attention.
Speaker B:Yeah. They did tell us that none of their visual models were ever coming to Europe for that.
Speaker E:Yeah.
Speaker B:And that means also the visual models in the Meta glasses. Now, they weren't specific enough, I do agree with that, Oren. But if you go to Meta and you level this complaint at them, I think that's what they're going to come back to you with. They're going to say, you know, we've told you: no visual models in Europe. Where I'm going to do a little bit of Meta bashing, I think, is this. I do not think the European Commission should be doing anything at all. Like, we as European voters think that this privacy is very important, which we do, because we elected politicians that actually enacted it, and there's broad support for this sort of regulation in Europe. That's just a fact. There's no way around that. So that is something that has been democratically decided, and I stand behind that. And one of the things that I also would like to point out is that none of the other companies are doing this. Not Google, not even Apple, which has already said that Apple Intelligence will come to Europe in April. So it's later than the US, but it's not like Meta, who's completely withholding everything.
Speaker G:Yeah.
Speaker B:From the EU.
Speaker G:I wanted to ask about, what is the thing with Apple Intelligence? Mohammed, could you go over that? What, have they throttled it, or have they decided not to release it? What's the craic?
Speaker B:They didn't release it along with the release in the US. They're going to release it in April or May in Europe, in the EU. So it's going to come to Europe and the EU, it's just a little bit later. And that's, I think, because they want to cross all their T's and dot all their I's with respect to all the regulations. Because it's not just the GDPR, it's also the AI Act, actually, that came into effect last year, that they want to make sure they comply with. ChatGPT does the same thing: OpenAI will sometimes release features a little bit later in Europe as compared to the rest, but they will come to Europe all the same. Google does the same thing; actually, Google typically even releases their features at the same time in Europe as they do in the rest of the world. I've not seen a significant European delay from Google up to this point. Now, Meta is actually different. What Meta is saying is: for the foreseeable future. So no date, nothing else. And they haven't moved on this for quite a long time. Apple did this the same way at first as well, 'for the foreseeable future', but Apple has actually made strides towards getting this stuff available in the EU. Not the same for Meta. Meta is still holding firm. And the reason why is not necessarily because they're afraid of all those regulations, because the other AI companies are releasing. It's that Meta is making, I think, I feel, a political statement, which is: we do not like the AI Act, we do not like the GDPR as it is right now. We think it restricts us too much. So we will not come to Europe, and we will actually therefore make people in Europe lobby the European Commission for us, in order to get rid of some of those regulations.
Speaker G:And we throttle our technology to people with disabilities.
Speaker B:I think that this is more political than anything to do with the law. Because what Meta wants to do is release their visual AI stuff, so recognizing people, training on pictures, and all that good stuff. They want to release that in Facebook and Instagram; they want people to be able to actually use AI to touch up their videos in Instagram. They want all of that stuff. And there is actually where the real rub happens, where they really rub up against the AI Act and the GDPR in Europe: it's in that Instagram, WhatsApp, Facebook part.
Speaker G:What I'm hearing is that there is this amazing technology which has the capacity to make people's lives better, to augment particular physical or sensory characteristics of people in a way which will let them do things they couldn't do before. Okay, so we have this kind of thing going on with XR as well, for example. This is maybe why I'm here, for the existential bit, because this is really, really interesting.
Speaker C:You're seeing it, sister.
Speaker G:Yeah, so this is the thing. What we have is, we've got a need for data sets which can be fed back, abstracted, and then given as accessible outputs to people with disabilities. We have the technology that can do it, right, essentially. But to do that, it needs to have access to broad data sets. It needs to be able to access information, make some kind of semantic sense of it, and then feed it back as an accessible output to its users. I think what's happened to some degree here is that, unfortunately, it sounds like people with disabilities are at the kind of last part of the road, way down the table, way down the food chain, so to speak, in terms of what some people might have been doing, based on their previous behaviors, with their access to these data sets. Right? So we don't want to name names, we don't want to say who any of these folks are. But with the best will in the world, we can't have it both ways, you know. We can't have it both ways. There has to be some kind of middle road where, you know, folks are good actors, their hearts are in the right place, they can make some money out of this, fine, but not exploit those data sets to the nth degree. I mean, that's why we're in this GDPR mess in the first place, right? Isn't it? Because we have, you know, folks who basically did that. And to some degree, you know, maybe it's good that there are other players in the space who are doing this stuff. I mean, that's like, for example, with open source technologies. Open source technologies are there, and we love them, because essentially they're seen as alternatives to proprietary, closed-behind-the-door, for-profit things. So maybe this is a case where those things are kind of needed. It's very idealistic, though, to kind of constantly adhere to open source, hanging your hat on open source saving the day. But to some degree, I hate to see, for example, people with disabilities being at the end of the road in all of this, if you know what I mean. Particularly when there are these technologies which are fantastic, which are very enabling, but they are being restricted because of previous exploitations in the past of these large data sets, something like that, you know.
Speaker E:Yeah, yeah, okay, good point.
Speaker D:But can I share a very short story about what happened at our company? Because, you know, we are eight employees at our company, and two of us are completely blind, one is partially sighted, and the others are fully sighted. And we have had these discussions about AI and regulations over the last year, a little bit, every now and then. And basically two of our colleagues, our sighted colleagues, were saying: oh yes, it's good that they are regulating, that they are putting the brakes on AI so it doesn't evolve too fast. We want to control this, so it's very controlled, so nothing bad happens. And after we had this discussion, I asked my other blind colleague: why do I get upset when they say that? And, well, of course: because they don't need it for their accessibility. Yeah, exactly. And I said, yeah, of course. That's the reason.
Speaker G:I hear you, I hear you.
Speaker C:I mean, yeah, that's just typical. Well folks, that's it for the show this week. We hope you have enjoyed listening to our panel discussion on AI, and I'd like to just thank Mohammed and you, Josh, and Jesse and David and Brian for taking part. It was great to have you. Thanks so much, guys. And we will see you in two weeks' time. Bye.
Hello our fancy AI bots, and welcome to episode 115. This week we are talking about AI, and we have a fantastic panel of experts to bring us through the topic of AI and accessibility. We will be talking about Meta Ray Bans, Envision glasses, Apple Intelligence and even some existential questions.
The panel (Brian Dalton, David Renstrom, Jesse Weinholt, Josh O'Connor and Mohammed Laachir) are well-placed to give their own views on current products, future products, and you never know, you might get some ideas on dinner if we can work out how to get our smart glasses to read a menu.
So, strip down to your Cyborg inner self, and prepare for judgement day by listening to the number 1 podcast from the future: Blind Guys Chat!
11 out of 15 AI bots prefer it to a slap in the face.
Support Blind Guys Chat by contributing to their tip jar: https://tips.pinecast.com/jar/blind-guys-chat