Blind Bargains

#CSUNATC20 Audio: Smart Glasses Are The Next Step For Envision AI


It has been almost a year since we had Karthik Kannan, Cofounder and Chief Engineer of Envision AI, on the program to talk about pricing plans and the release of the Android version of the app. So, with a new product announcement and preorder campaign underway, Shelly swung by the booth on the emptier-than-normal CSUN Exhibit Hall floor to test out the new Envision AI Glasses. The interview covers topics such as why Google Glass v2 was chosen as the first delivery system beyond smartphones, how the company is taking a platform-agnostic approach to future versions of the app, and the drive to work with others in the A.T. space to adopt more wearable tech approaches. And don't miss Shelly's real-time demo using the glasses to read nearby objects and text. To learn more about the glasses, or the app for iOS and Android, visit the Envision AI website.

CSUN 2020 coverage is brought to you by AFB AccessWorld.

For the latest news and accessibility information on mainstream and access technology, Apple, Google, Microsoft, and Amazon offerings, access technology book reviews, and mobile apps, and how they can enhance entertainment, education and employment, log on to AccessWorld, the American Foundation for the Blind's free, monthly, online technology magazine. Visit www.afb.org/aw.

Transcript

We strive to provide an accurate transcription, though errors may occur.

Transcribed by Grecia Ramirez

Direct from Anaheim, it’s blindbargains.com coverage of CSUN 2020, brought to you by AFB AccessWorld.
For the latest news and accessibility information on mainstream and access technology; Apple, Google, Microsoft, and Amazon offerings; access technology book reviews and mobile apps and how they can enhance entertainment, education, and employment, log onto AccessWorld, the American Foundation for the Blind’s free monthly online technology magazine, www.afb.org/aw.
Now, here’s Shelly Brisbin.
SHELLY BRISBIN: Welcome to Blind Bargains coverage of CSUN 2020. I’m Shelly Brisbin, and I’m talking with somebody a lot of you out there might be interested in hearing from. He is Karthik Kannan from Envision AI. And you may have heard that there are new glasses from Envision, but I think we’re going to get started and get a – sort of a general explanation of what Envision AI is all about and then find out what’s new.
Hi, Karthik. How are you?
KARTHIK KANNAN: I’m good, Shelly. Thank you so much for having us on the Blind Bargains podcast. We’ve been big followers of Blind Bargains from the – since the beginning.
So Envision AI is basically an app that helps people with a visual impairment to live more independently. What it does is you can take images of things around you, and it can extract information from those images and then speak them out to you. So, for example, if you’d like to read text or recognize faces or recognize objects, the app is able to do that for you, and it’s available on both iOS and Android.
We’re here at CSUN because we’re launching the Envision AI app on the new Google Glass 2. So you’d be able to do everything that you do with the Envision app, but completely hands free and in a totally unobtrusive manner, by just wearing the Google Glass and then doing all the functions of the Envision app.
SB: Is the app that you are adding Google Glass to different from the Envision AI app that people may already be familiar with or is it updated or –
KK: Well, it is a bit different, as in, you know, the app runs completely on the Google Glass, and it’s totally stand-alone. On the other hand, it is also very familiar because it has all the exact features of the Envision AI app that people really love. So there is instant text, there is read documents, a describe scene. So all of these features are things that users of the app are very familiar with, and that’s also going to be on the glasses.
SB: So talk about Google Glass 2. People may not have seen or certainly physically touched Google Glass. How big is it? How heavy is it? What’s it like to wear Google Glass?
KK: So the Google Glass is very much like any other pair of spectacles. It’s pretty lightweight, so you can wear it like how you wear any other pair of spectacles. On the right side of the Google Glass is basically where there is a touchscreen, the battery, the speaker, and the microphone.
So the touchscreen’s located near your temple, and then you’d be able to swipe like how you do with either TalkBack or VoiceOver on your phone. You'd be able to swipe through the different options, you’d be able to double-tap to go into a particular option and then take an image or start reading things around you or start finding objects around you and so on.
So there is the speaker that is basically located near your ear. So that speaks out all the information to you as you -- you know, as the Google Glass processes it. And there’s also a microphone that’s built into the Glass itself.
So the glasses can be connected to Bluetooth headphones if you have Bluetooth headphones, and they can also be, you know, connected to USB-C headphones and so on.
SB: Okay. So any way that you can hear from your phone, whether it’s through wired headphones or Bluetooth headphones, the Glass would transfer?
KK: Exactly. So it’s very similar to the way you connect your Bluetooth headphones or your wired headphones to your phone. You can just plug them in or pair them, and then you’d be able to hear all the output on them as well.
SB: Is this available now, or when are you going to have it available?
KK: Yes, it is available now, and it’s available at a special price as well. So we launched the Envision Glasses for preorders this Saturday – last Saturday. And it’s available on our website. So when people go on our website, they’d be able to buy the glasses. And we’re shipping to people across the world. We’re starting with – the price is around €1500 to begin with, and that’s a special preorder price. And it’s going to be retailing at €1800.
SB: Is it available in the U.S.? You mentioned Euros. So is it available for purchase for U.S. customers?
KK: Yes, it is available for purchase for U.S. customers and also for our customers across the world.
SB: Okay. Great. But – so whatever the translation for Euros at any given time, that’s approximately what it would cost. So it’s available now. Have you been beta testing this? Have people been actually using it out in the field for a while?
KK: Oh, yes. We’ve actually been beta testing it for the last few weeks. I mean, in fact, the glasses have been something that we’ve been working on for almost a year now. So we’ve been beta testing the Envision app on the Google Glass for the past few weeks. We’ve had people use the glasses with them 24/7, and we’ve been able to, like, work with them to iron out the issues. And we’re here at CSUN to basically, you know, show the glasses to more people and get their feedback and get them to test it out.
SB: Now, you say it’s for blind and visually impaired people. What’s the experience like of somebody who has low-vision? Are they going to not be interacting with the glasses at all? Is it expected that you would be using VoiceOver or TalkBack primarily, or is there any advantage for somebody with some vision?
KK: So for people with low-vision, we’re working with some of our low-vision users to see how we can adapt the Google Glass display to basically make it easier for them to access information around them. So for example, the Google Glass display could double up as a magnifier; right, so people – for people with low-vision. And for people with – who are blind and visually impaired, we do have a version of TalkBack on the glasses, which will basically speak out the information to them as and when they’re using it, just like how they do with the smartphone.
SB: So that’s separate from the phone? So you could use the glasses without your phone, or how does that work?
KK: Yes. The glasses can be used completely independently from your phone. We also have an option where you can pair the glasses with your smartphone. You know, you could pair it with the Envision app and you could control various settings of the glasses directly from the app. So, for example, if you want to increase the volume or decrease it, maybe change the TTS engines -- so you could do all of that stuff once you pair the glasses with your phone.
SB: Now, let’s talk about that. What kind of voices are available? How many languages?
KK: So at the moment, we support 60 different languages. And we’re trying to work with different TTS engine providers. So there’s going to be the Google TTS. That’s going to be there for sure. But we’re also trying to work with other TTS providers to get them onto the glasses so that, you know, people can have a lot more languages, a lot more voices and stuff like that. So we’re talking to people like the Vocalizer guys, we’re talking to Acapela. So we’re talking to a few people to get them on board on the Glass as well.
SB: But it’s all self-voicing? I’m not – or am I able to use the TalkBack or the VoiceOver voice that I’m used to?
KK: It is, at the moment, self-voicing because the Google Glass doesn’t necessarily support VoiceOver or TalkBack, so we’re not able to, sort of, port the voices. But if you’re someone who’s using TalkBack, which uses the Google TTS, there should be some familiarity with the voices.
SB: Okay. We’re going to go out for a demo in a little bit, but tell people where they can find out more information and purchase the glasses if they want.
KK: Sure. So you can find more information on our website. So – and you can also purchase it on our website. The URL is letsenvision.com/glasses. So you can go there and purchase the glasses.
SB: So you would purchase the glasses, and then you would be able to download the app and add it to your phone if you wanted to use it that way?
KK: Exactly. So you could purchase the glasses now. It’s out for preorders. There’s a special price, like I mentioned, and we’re going to start shipping the glasses to customers sometime around July, August.
SB: Great. Karthik, thank you so much.
KK: Yeah. Thank you so much, Shelly.

(Demo.)

SB: I’m wearing the glasses. The left – it’s a pair of glasses. It’s a pair of spectacles that don’t have lenses in them, obviously, and then it’s a thin wire frame that wraps around the left side of my head. And then, I have just the glasses over my head, and then on the right side is a long piece of plastic that goes all the way to my – the back of my head, and I just heard a noise in my ear.
KK: So basically, as soon as you open up the app -- open up the glasses, sorry -- the app loads automatically. So you can just – if I can have your finger for a second.
SB: Sure.
KK: So this is basically the touch area --
SB: Okay.
KK: -- right? And you can just swipe through.
SB: So I can swipe --
COMPUTERIZED VOICE: Read documents.
SB: Read documents.
KK: Oh.
SB: Oh, yeah.
COMPUTERIZED VOICE: Describe scene. Video call. Find object.
KK: So you could swipe back.
COMPUTERIZED VOICE: Video call.
SB: Right.
COMPUTERIZED VOICE: Describe scene.
SB: Let’s do that. Let’s describe scene.
KK: Yeah. So you double-tap on it now.
COMPUTERIZED VOICE: Double-tap to take a picture. Swipe down to –
SB: Oh. Sorry. I single-tapped.
KK: Yeah. You double-tap again.
SB: Double-tap.
KK: Yeah.
SB: Okay. Swipe down – do I –
KK: So now –
SB: So I wait for it to process.
KK: Yeah.
COMPUTERIZED VOICE: A man wearing a blue shirt.
KK: Yeah. That’s me. I’m just standing in front of you.
SB: There you go. You’re standing to my – to the front and to the right.
KK: Yeah. Let’s – let’s –
SB: Let’s look around you. Yeah. I’ll take – let me hold that. There we go.
KK: Yeah. So you can swipe down again.
SB: Okay.
KK: So you can just swipe down again, so –
COMPUTERIZED VOICE: A man wearing –
SB: Oops.
KK: Oh.
SB: Oh. It’ll take another picture.
KK: Yeah. If you just swipe down once again, yeah. You can just take another picture.
SB: Yeah. So I’m going to look this way.
KK: Okay.
COMPUTERIZED VOICE: A group of people standing in a room.
SB: Okay.
KK: Yeah. A group of people standing in a room, so –
SB: There we go. It’s a very big room, but yeah. Okay.
KK: Yeah.
SB: So let me – so – I don’t know what else -- okay. There’s doors across the way.
KK: Yeah.
SB: -- let’s see how it –
KK: So if you swipe down again –
COMPUTERIZED VOICE: -- standing in a room. Describe scene.
KK: Oh. You double-tap again.
SB: Oh.
COMPUTERIZED VOICE: Double-tap to take a picture.
KK: You can take a picture.
COMPUTERIZED VOICE: Swipe down to exit. A group of people in a room.
KK: Yeah. It’s a group of people in a room.
SB: Okay.
KK: So that’s – that’s basically with describe scene.
SB: Right.
KK: And you go back to the main menu, you can just swipe down.
SB: Yeah.
COMPUTERIZED VOICE: A group of people in a room.
SB: Oh, it’s still –
KK: Yeah.
COMPUTERIZED VOICE: Describe scene.
SB: Okay.
KK: Okay. So you could just swipe forward.
SB: Swipe forward. Okay.
COMPUTERIZED VOICE: Video call.
SB: There we go.
KK: Or swipe back, sorry.
COMPUTERIZED VOICE: Describe scene. Read documents.
KK: And we could go to “read documents.”
SB: Okay.
KK: And I have an item, and you can basically go ahead and try –
SB: Right.
KK: -- to just take a photo of it.
SB: Okay. I’m going to have to have you hold that. Okay.
KK: Yeah.
SB: So do I double-tap?
KK: Yeah. You double-tap.
COMPUTERIZED VOICE: Double-tap to take a picture.
KK: Yeah. Double-tap again.
COMPUTERIZED VOICE: Swipe down to exit.
SB: It’s kind of dark here. Is it going to be okay with that?
KK: Yeah. That’s going to be fine.
COMPUTERIZED VOICE: 35th annual.
KK: So you can swipe through like how you do with -- oh no.
SB: Oh.
KK: Sorry. So you take – double-tap to take a picture again.
SB: Okay.
KK: So you swipe forward or backward to be able to scroll through the text.
SB: Okay. After it’s done; right? I have to wait.
KK: Yeah.
COMPUTERIZED VOICE: 35th annual –
KK: Yeah. Now swipe forward.
COMPUTERIZED VOICE: March 9 to 13 – O-R R-S CSUN T-E-C-M-N.
SB: So it’s going to read a line at a time or a word at a time or –
KK: It reads basically, like a paragraph at a time, so you’re –
SB: Oh, okay.
KK: -- you’re just holding the first page of the –
SB: Yeah.
KK: -- CSUN Assistive Technology manual –
SB: So – oh, well there’s a – yeah. I was going to say, there’s not – I was going to find where there might be a discrete paragraph for it because –
KK: Yeah. Yeah.
SB: -- there’s only like a table. So anyway, so that – and then, can I change the granularity of it? Could I have it play a paragraph at a time or sentence at a time or –
KK: Yeah. So by default, it plays a paragraph at a time. So you can go ahead and, you know, have it read paragraph at a time or you can do it like how you do with VoiceOver, so –
SB: Right.
KK: -- you can say I want to read a line or a word at a time. So those are things that you can configure the way you want to read it. And also the reading speed and the pitch and things like that.
SB: So the camera is on the right edge of the glasses, so I guess if I’m looking at something, I need to be conscious of that rather than like, looking straight ahead from my nose; right? Is that right?
KK: That is – so, you’re right. The camera is on the right. But then the way the camera is placed, it’s placed looking more towards the center.
SB: I see. It’s angled.
KK: Yeah.
SB: Okay.
KK: It’s angled. So it’s angled in a way that – it looks like it’s at the center, so technically, you can just hold it like how you’d hold it like, you know, if there was a camera sitting on the nose –
SB: Got it.
KK: -- on your nose.
SB: Yeah.
KK: And then just take a picture and then it would be able to go ahead and capture things automatically for you.
SB: Okay. So are there some other functions we can look at? We’ve done – so I have documents.
COMPUTERIZED VOICE: CSUN –
KK: Yeah. And you can also look at, for example, like, instant text –
SB: Okay.
KK: So you can just go ahead and swipe down. Yeah.
COMPUTERIZED VOICE: CSUN –
KK: Oh. Swipe down again.
SB: It’s still reading.
COMPUTERIZED VOICE: T-E-C-M-N.
SB: It’s still trying to read. Do I need to – how do I get out of it?
KK: Yeah. Just swipe down should be – yeah. Swipe down again.
SB: So you can tell what I’m doing? Are you seeing something on a display? Are you –
KK: Yeah. I can see –
SB: “Read documents,” it says.
KK: -- a little bit of what you’re doing. So now, let’s try instant text.
COMPUTERIZED VOICE: Instant text.
SB: “Instant text.” There we go.
KK: So this is very much similar to what it is in the app. So you could just start reading all kinds of text around you.
SB: Okay.
KK: So you would just –
SB: I’ve probably got some text. Oh. Like a sign or something.
KK: Yeah. Like a sign. So you just double-tap –
SB: Let’s see.
KK: -- and it starts to read.
SB: Let’s go find something to read. There’s a big display sign over here. Okay. Here we go. Well, this is sideways. How is it going to do on that text that’s –
KK: It should be good. It should –
SB: All right.
KK: -- it should – let’s give it a shot. I think you’re –
SB: I’m tapping in the wrong place.
KK: -- you double-tap.
COMPUTERIZED VOICE: -- documents.
KK: Oh. You’re looking at “Read documents.”
SB: Oh. I went back to "Read documents.” No. Don’t.
COMPUTERIZED VOICE: Instant text.
KK: Yeah. Double-tap again.
SB: Here we go.
KK: I think – yeah.
SB: I’m making the mistake – I’m turning my head, but that’s not what I want to do. Okay.
KK: Yeah. So you can just double-tap.
SB: Oh.
KK: No. Hmm. Yeah. Okay.
SB: Here we go.
KK: Now you’re – now, you’re starting to read.
COMPUTERIZED VOICE: -- BES – I – CSUH, Crawfurd Tech.
KK: It says “Crawford tech.”
COMPUTERIZED VOICE: CSUN – 10 Crawford Tech.
SB: So what’s nice about that --
COMPUTERIZED VOICE: CSUN – CSUN.
SB: Okay. Stop. It’s keeping --
KK: So you just swipe down.
COMPUTERIZED VOICE: -- Crawford Tech.
SB: Come on. Stop it.
COMPUTERIZED VOICE: Instant text.
SB: There we go.
KK: Yeah.
SB: So I should describe the sign. So it’s -- there’s vertical wording at the top, but then most of -- where it says Crawford Tech –
KK: Yeah.
SB: -- that’s –
KK: -- horizontal.
SB: Well, it’s vertical, but it’s –
KK: Yeah. It’s vertical.
SB: -- yeah. I said that wrong. So it’s not to the left and right, it’s up and down, which is nice. It seemed like it could do that.
KK: Yeah. So it’s able to automatically understand the orientation –
SB: Yeah.
KK: -- and then then, you know, see that, okay. This is in vertical, so I’m going to just basically flip it and then read it directly, so it’s possible to do that.
And also, something that might be interesting for your international listeners is the fact that this can read in over 60 different languages –
SB: Right.
KK: -- including Chinese, Hebrew, Arabic, and so on. So – yeah. It’s really useful for people who are -- like, for example, we have a lot of Japanese users. And this is something that they can buy and make use of because it’s able to read Japanese really well, like how it reads English.
SB: Does it do translations? So if I’ve read that Crawford Tech sign but I’m a Japanese speaker, would I get -- well, that’s not a good example because it’s a proper name, but –
KK: Yeah.
SB: -- like, would it translate languages?
KK: Yes. There is also a possibility to do that. So you also have the option to translate text that you read. So if you’re in a foreign country and you want to read a menu and have it translated for you, the glasses can do that as well.
SB: Nice. Okay.
KK: Yeah. Yeah. And something that’s also, like, really exciting for us that I probably should have mentioned, you know, earlier as well is the fact that this is basically a platform for other apps to also come on to the Google Glass like what we’re doing. So we’re basically building a platform, where, for example, we could have folks like Be My Eyes or Aira also come on to the glasses. So we’re at CSUN trying to discuss this with them as well. And we have meetings set up, and we should be – we’ll see if we can get them on board the glasses as well. So that should really broaden what the glasses can do. So it’s not just, you know, the Envision app, but then you also have other apps that you might want to use as well.
SB: So your platform is not the AI that we’ve been describing, the Envision AI AI. It is the ability to run this kind of software on the Google Glass platform.
KK: Yeah. More broadly, it’s like being able to run a platform – it’s basically a platform for smart glasses.
SB: Right.
KK: Right? So it’s an AI-powered platform for smart glasses, so the Envision AI app is an app showcasing that platform, and we’re also talking to other developers who our users are really interested in to get them onto the – onto these as well.
SB: So in theory, if there were another smart glasses platform like Horizon or some other glasses, you could make your software work on that platform also?
KK: Totally. Totally.
SB: Yeah.
KK: One of the key things is that, you know, we’re starting with the Google Glass today because that is the most powerful smart glasses out there, but tomorrow, if, say, Amazon or even Apple –
SB: Yeah.
KK: -- comes out with smart glasses, we’re more than prepared to be able to take our software and put them onto those glasses. So we’re trying to be a very platform-agnostic company. So we’re trying to be on as many platforms as we can.
SB: Are Google Glass widely enough available that when you start taking preorders – you say you’re not going to ship until July or August, but are people who want these going to be able to get them in the quantities that you – I mean, especially with what’s going on in the world right now. I don’t know where these are manufactured, but what are your issues for availability of the hardware?
KK: So we don’t foresee any issues with the availability of the hardware because we’ve been in touch with Google over this for a long time now, and we’re very closely partnering with them. So I think we’d be able to easily, you know, hit our preorder numbers and be able to supply the glasses starting from August.
SB: Okay.
KK: Yeah.
SB: And who do you think this is the right fit for? Obviously, there are lots of different assistive technologies out there and there are lots of people with blindness and visual impairments, but what kind of a person do you anticipate would be a good fit for this product right now?
KK: So that’s a really good question. I think what we’re trying to aim for is a user who is quite active in their lifestyle. So someone who probably works at an office, or someone who is studying as a student, and someone who has to basically read a lot of text or be able to go out and about and do things. This is aimed at them. But we’ve also built it to be very user friendly for even older audiences as well. So that’s what we’re going for to begin with.
SB: And I should say they’re very lightweight. I mean, it’s –
KK: Yeah.
SB: -- like wearing a slightly heavier pair of glasses and -- you’d want to be careful, but it’s not like wearing a headset or even any – I’ve worn heavier pairs of glasses. I mean, it’s pretty – what’s –
KK: Yes.
SB: -- what’s the battery life like for the Google Glass with this platform running on it?
KK: Yeah. So it is like you mentioned, you know, very lightweight. So some of the users that we’ve been testing with told us that they don’t really remember wearing the glasses at all. So they get used to it after a while. The battery life on these is about seven hours. So if you put them on standby, you can, you know, go up to about 10 to 12 hours as a whole. But you should be able to get a solid five- to seven-hour usage with the glasses. And we’re also optimizing the battery life even more, and by the time it ships, you should be able to get five to seven hours easily.
SB: Great. What haven’t we talked about? What haven’t I asked that we need to know about?
KK: Nothing that I can –
SB: Okay.
KK: -- that I can think of.
SB: Great.
KK: So I think we did cover the partnerships, we covered the pricing --
SB: Yup.
KK: -- availability – yeah. I think that’s –
SB: Tell me again the website information because they might want – Patrick might want to put this on at the end of the demo, so let me just ask you again to –
KK: Yeah.
SB: -- tell us where people can learn more and buy these glasses if they want.
KK: Sure. So you can learn more about the glasses at www.letsenvision.com/glasses.
SB: Excellent. Karthik, thank you so much for your time.
KK: Yeah. Thank you so much, Shelly.
For more exclusive audio coverage, visit blindbargains.com or download the Blind Bargains app for your iOS or Android device. Blind Bargains audio coverage is presented by the A T Guys, online at atguys.com.
This has been another Blind Bargains audio podcast. Visit blindbargains.com for the latest deals, news, and exclusive content. This podcast may not be retransmitted, sold, or reproduced without the express written permission of A T Guys.
Copyright 2020.


Listen to the File


File size: 14.3MB
Length: 20:14

Check out our audio index for more exclusive content
Blind Bargains Audio RSS Feed

This content is the property of Blind Bargains and may not be redistributed without permission. If you wish to link to this content, please do not link to the audio files directly.

Category: Shows


Joe Steinkamp is no stranger to the world of technology, having been a user of video magnification and blindness-related electronic devices since 1979. Joe has worked in radio, retail management, and Vocational Rehabilitation for blind and low vision individuals in Texas. He has been writing about the A.T. industry for 15 years and podcasting about it for almost a decade.


Copyright 2006-2024, A T Guys, LLC.