Lecture: Smartphone Accessibility in Low Vision

Smartphone technology continues to evolve and provides improved access for people with vision impairment and blindness. This course will review the current accessibility features of iPhone and Android devices. It will also discuss some of the most popular apps used by people with vision impairment.

Lecturer: Dr. Alexis Malkin, Optometrist, New England College of Optometry (NECO), Boston, Massachusetts, USA

Transcript

DR MALKIN: Good morning! Or good afternoon, depending on where you are. Welcome to the current Orbis webinar. We’re going to be talking about some general low vision technology, but particularly focused on smartphone technology. Please put your questions into the chat, the Q and A, and we will get to those at the end. So let’s begin, because there’s a lot of content to cover. And thank you, Orbis, again, for having me. I’m faculty at New England College of Optometry, and I really enjoy doing these technology lectures, especially for Orbis. So we’re gonna begin with a poll. Just to get a sense of who’s here. So how do you serve patients with vision impairment? How are you connected to the vision impairment community? All right. So if we can see those poll results, we’ve got mostly optometrists and ophthalmologists, but some vision rehab teachers and some who are listed as other. Perfect. Give me one second. All right. So this will be a quick intro, probably review for a lot of you, since we have so many eyecare providers. But I just wanted to quickly touch on the categories of low vision technology. So remember that there’s kind of your traditional video or digital magnification, like CCTV. And you can see those pictured on the right. There’s computer-based technology, both built-in accessibility, as well as software you can add onto your computer. There’s smartphone and app-based technology. That’s where we’re gonna spend most of our time today. There are augmented reality devices, which we may touch on a little bit, depending on time. There’s some specific technology available for students. There is sensory substitution, so using tactile or entirely audio-based technology. And then there’s some mainstream tech with applications in low vision, which again we may touch on briefly today. So let’s begin by talking about smartphones and app-based technology. And we’ll talk a bit about Android, a bit about iPhone, try to do a few demos. And show you how some of these things work. So let’s go to our second poll. What type of cell phone do you use? So mostly Android. And I’m finding that that tends to be pretty typical. I think this is a slightly higher iPhone-using group than what the worldwide stats would show us. I’m an iPhone user, but we will actually spend quite a bit of time on both today. That way you can make sure that you are informed when you’re working with your patients. So before we even get into smartphones, don’t forget that actually there are some really simple cell phones out there that have some accessibility. So there are flip phones that have large buttons, that have tactile information, and that have voice activation. So if you have a really tech-phobic patient, they may go with something really, really simple. Just for emergency calling. And I would say pay attention to what your patient’s interest is. They may think they can’t use a smartphone because of their vision impairment. So it may not actually be that they’re tech-phobic. It may be that they just didn’t realize that there were so many accessibility features available. And tech fears are not necessarily based on age. I have young patients who really only want to use a simple phone. And I have older adults who are more than willing to use smartphones and often know at least as much about the phone as I do, if not more. So when we think about Android accessibility, maybe five or six years ago, we weren’t seeing as much in terms of what the Android phone could do for people with vision impairment.
Android very quickly caught up to what Apple was doing. And in many ways, it’s become even easier. Because you can actually customize your Android phone. You can now actually have it connect to a Braille display. That is new in the last couple of years. So when we think about Android accessibility, we’re gonna talk about a lot of different features. This is an image of the accessibility menu. So just showing the different features you can turn on. The great thing about Android is that, as we saw from our stats, you know, 70% of the people on this Zoom are Android users. And like I said, I think you’ll find that with your patients as well. That more people are using Android than iPhone. So I’m really happy that they’ve caught up on the accessibility features. So first let’s talk about how you modify the display. So when you talk about accessibility, you’re thinking about what you can do in terms of vision, but you’re also thinking about what kind of auditory features are there, or what kind of sensory substitution. So in terms of display modifications, on an Android phone, you can increase the overall size of the display. So all of the icons, all of the information gets larger. You can increase the individual font sizes. You can use magnification. You can have it on full-time, so in addition to the display being magnified, everything that you do when you open the apps will be magnified as well. Or you can turn it on as a shortcut. Android has added the shortcuts in, over the last couple of years. And that’s been really helpful for people who don’t necessarily need it magnified all the time. Many people can navigate their home screen. They know what the mail app looks like, or the weather app. It’s when they get into the details that they need to turn on that magnification. Android does have the triple tap feature to magnify. So three quick taps on the screen then zoom in, if you have the magnification turned on. And they say that the magnification goes up to about 8x. Obviously we know that that’s going to be screen-size dependent. So 8x on a 4-inch phone is going to be different in terms of ultimate size as compared to a 5 or a 6-inch phone, or compared to a tablet. They do have a feature where you can increase the contrast of the text. So higher contrast text can be available. I’ve played with that a little bit. I haven’t noticed huge improvements. And it’s still under development. And the same is true for the iPhone. That high contrast feature on the text just doesn’t seem to make that much difference. Android phones also have color inversion. And they have the dark mode. So kind of reducing the overall light coming out of the phone. Though that doesn’t happen in all apps. So if we go back to that original image and that accessibility… So you get to that screen by going to your general menu. It looks like the little gear. You go into your — click on the settings, and then you go into general and accessibility. And you can start to make those adjustments. I often show patients where this is. And give them a printout. But I’ve learned that they don’t always want you to adjust it for them. Because then they get home and you may have turned something on that they didn’t want, and they get kind of stuck. But some really nice overall adjustments that are built into Android. And the inverted colors are quite helpful. When we get to iPhone, we’ll talk about the differences between what the iPhone inversion looks like and the Android.
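For readers who also build or prescribe apps, here is a minimal Kotlin sketch, using standard Android framework calls, of how an app can read the settings described above: the user’s font scale, dark mode, and whether a service like TalkBack is running. The function name and printed strings are illustrative, not from the lecture.

```kotlin
import android.content.Context
import android.content.res.Configuration
import android.view.accessibility.AccessibilityManager

// Illustrative helper: report the accessibility-related settings discussed
// above. Only the framework calls are real APIs; the naming is our own.
fun reportAccessibilityState(context: Context) {
    val config = context.resources.configuration

    // The "Font size" setting: 1.0 is the default. Text sized in sp units
    // scales by this factor automatically, which is why apps should avoid
    // hard-coding text sizes in px or dp.
    val fontScale = config.fontScale

    // Dark theme state, per the dark mode discussed above.
    val darkMode = (config.uiMode and Configuration.UI_MODE_NIGHT_MASK) ==
        Configuration.UI_MODE_NIGHT_YES

    val am = context.getSystemService(Context.ACCESSIBILITY_SERVICE)
            as AccessibilityManager

    // isEnabled: some accessibility service is running (e.g. TalkBack).
    // isTouchExplorationEnabled: explore-by-touch is active.
    println(
        "fontScale=$fontScale darkMode=$darkMode " +
        "service=${am.isEnabled} touchExploration=${am.isTouchExplorationEnabled}"
    )
}
```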
That inversion may be the only place where iPhone is kind of still winning, in my mind. In addition to what you can do with the display on the Android phone, you can use their TalkBack feature. That’s their voice output. Again, a few years ago, I was not quite thrilled with the way TalkBack worked. I didn’t think it was a great option for people who really relied on auditory output. It’s pretty impressive now. They’ve really worked hard and had feedback from people with vision impairment. So the TalkBack feature is highly customizable. It’s not just an on or off. You can do it that way, for your patients who don’t want a lot of complication. But for your sophisticated screen reader users, they can really make this work the way they want it to work. As I mentioned early on, it has Braille keyboard connectivity. That is a newer feature. It connects through Bluetooth. So a great option for your Braille users, for their long-term reading. On the original Android devices, everything was a single-finger gesture, which was a little bit harder for people to work with, just moving one finger on the screen. But on the newer updates, you can now use multi-finger gestures. You’ll have to look up with your patient which version of the Android software they’re running, if you want to advise them about how to do the gestures. Typically my strategy with gestures is to encourage the patient to kind of watch some YouTube videos and do some of the tutorials built into their phone to get comfortable with the gestures. I will also refer to either vision rehab teaching, assistive tech, or occupational therapy, to do that extra training, if they need it. One of the key features with TalkBack is that it actually has a screen search. So you can ask it to find a specific phrase on the screen, so that it will highlight it, and then the patient can get into that area. So they’re not scrolling through auditorily all of the ads on a web page, for example. You can actually switch languages with gestures. So bilingual patients or people who speak more than two languages can easily switch back and forth, depending on what they’re looking at. And it has the hide and show screen feature. So someone who is using primarily auditory may not need their screen on. So they may not want somebody who is sitting on the bus next to them to see what’s on the screen. They’ve got their headphones plugged in and they can just use a gesture to hide the screen when they’re looking at their banking, because it’s all auditory. They don’t need that vision input. In addition to those customizable features, there are actually really good reading controls available with TalkBack. So you can set your TalkBack to read character by character. If you’re entering a username and password, that’s a really helpful way to do it. You can have it read word by word. You can set it to read just one line of text at a time. I think most commonly, people have it set to read a paragraph at a time. But you can also have it jump from one heading to the next if the website is encoded properly. When we talk about good accessibility of websites, one of the things is making sure that the text is encoded as headings, versus content, to allow screen readers to jump more easily for people, in the way that if you are using vision, you are able to quickly scroll visually from one heading to the next. And again, if the website is encoded properly, you can use TalkBack to jump to buttons or text boxes or check boxes. So you don’t have to have it read every single item on the screen.
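What “encoded properly” means has a direct analogue in native Android apps. A hedged Kotlin sketch, assuming AndroidX Core is on the classpath; the views and label text are hypothetical. Marking a title as a heading lets TalkBack users jump heading to heading, the same way properly tagged h1-h6 elements work on a website.

```kotlin
import android.view.View
import android.widget.ImageButton
import android.widget.TextView
import androidx.core.view.ViewCompat

// Hypothetical screen: give TalkBack the structure it needs to navigate.
fun markUpForTalkBack(
    sectionTitle: TextView,
    sendButton: ImageButton,
    decorativeDivider: View
) {
    // Lets TalkBack users jump from heading to heading, the native
    // equivalent of proper h1-h6 markup on a website.
    ViewCompat.setAccessibilityHeading(sectionTitle, true)

    // An icon-only control is silent to a screen reader without a label.
    sendButton.contentDescription = "Send message"

    // Purely decorative views should be skipped rather than announced.
    decorativeDivider.importantForAccessibility =
        View.IMPORTANT_FOR_ACCESSIBILITY_NO
}
```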
These are the more advanced uses of TalkBack. And patients can become quite efficient in doing this. So it really gives them a good option. And you can jump from link to link. So if you’re doing a Google search, you can have it just read each link, rather than the description underneath the link. And then of course, like all of the voice features, you can change the rate at which TalkBack is speaking. In addition to the visual display modifications and the speech, you can actually adjust a couple of other things in the Android phone that I think are very helpful. One that is particularly helpful for our patients is you can adjust what’s called the time to take action. So when you are logging in, if you have something like two-factor verification on, and you get that text message with the code you have to enter, it often pops up and goes away very, very quickly. Sometimes the phone is smart enough and asks if you want to fill that code in. But not always. So you can actually slow down how long it takes for that pop-up to go away. So you don’t have to navigate into your text messages and find it. You can let it stay on the screen a little bit longer. And there’s, I believe, five different settings there, for how long those pop-ups stay on the screen. So I thought that that was a really nice thing that Android has added in, and I have not seen that on the iPhone. (There’s a short sketch of how apps can honor that timeout setting after this paragraph.) In addition, Android has added what they call Lookout. Which is where the phone uses AI to give you information about your surroundings. So it has five different modes. It has a text mode. An explore mode. A food labels mode. A document mode. And a currency mode. Right now, the currency is US only. Which is unfortunate. But there are other apps that can do other currencies. But the other four modes are really helpful. We’re gonna be talking about Seeing AI in a few minutes, which is an app that does many of these things. And the biggest complaint we’ve had with Seeing AI is that it’s not available on Android. So it’s a wonderful, sophisticated app. But people who are Android users have never been able to use it. So now that Android added this Lookout feature, they’re starting to compete with Seeing AI. And that’s a pretty fantastic option for your patients. I would say that the text and the documents are probably the most helpful. Food labels are really particular. You know. It’s not always going to be able to read it. It can’t always identify certain bar codes. It’s very dependent on what’s in their database. And that explore mode only gives you so much information. I’ll show you how some of it works in the iPhone. But it’ll describe perhaps that I’m sitting in an office with some pictures behind me. That there’s bright light coming in. If you have a number of people attending a lecture, that AI can say: Likely in a room with people attending a lecture. Type of thing. But it’s not gonna give you a lot of detail. And it’s still pretty early in development. Lookout is available in multiple languages. I posted here what the most current list of languages is. I’d like to see that list grow longer. But it’s not bad. They’re moving in the right direction, with the languages that can communicate that Lookout information. And I imagine they will add more currencies, besides US, once they get a little bit more sophisticated in the software. And they do encourage you to set your country and your language to get the most effectiveness, especially with things like food labels and document reading. That way you actually get it read accurately in the language.
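The “time to take action” setting is also exposed to app developers, so an app’s own pop-ups can respect it. A minimal sketch, assuming Android API 29 or later; the five-second base duration is just an example.

```kotlin
import android.content.Context
import android.view.accessibility.AccessibilityManager

// Ask the system how long transient UI (a code pop-up, a snackbar) should
// stay on screen, given the user's "Time to take action" setting (API 29+).
fun popupTimeoutMillis(context: Context): Int {
    val am = context.getSystemService(Context.ACCESSIBILITY_SERVICE)
            as AccessibilityManager
    // 5000 ms is our hypothetical default; the flags tell the system the
    // pop-up contains text to read and controls to act on, so it can
    // stretch the timeout to match the user's setting.
    return am.getRecommendedTimeoutMillis(
        5_000,
        AccessibilityManager.FLAG_CONTENT_TEXT or
            AccessibilityManager.FLAG_CONTENT_CONTROLS
    )
}
```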
So one of the negatives about Android accessibility is that it does vary a bit by device and software version. That’s true of kind of any technology. You need to have the latest software to have the latest accessibility. I did put the website here, but essentially you go to support.google.com/accessibility. And you pull up Android. And it will walk you through every feature that’s available. So it will let you know when things have been updated. It’s typically where I send my patients if they want the detailed information. I also have a handout with the really brief steps for how to get into the accessibility mode. Android has also added the Android Accessibility Suite, which you can download. So here in the US, many patients are able to receive an Android phone, if they have Medicare or Medicaid. So either of the government insurances. That often qualifies them for a free cell phone. When they receive that Android phone, it’s often a very basic model, without full accessibility loaded in. So they can actually go to the Google Play Store, and download the Accessibility Suite, in most cases. And have the ability to do the enlargement. That was not available a couple of years ago. So often patients with those free phones were stuck. And unable to actually do anything beyond some very basic font size modifications. So that’s a really nice feature that they’ve added as well. So let’s shift over to iPhone. And you’ll see that accessibility is quite similar. Really the two major phone platforms have worked really hard to kind of compete with each other, and as one adds some sort of tech feature, then usually within a few months, the other has that feature added in. It’s nice to see that there is competition in that market. It’s not always something we see in low vision. So we’ll start just like we did with Android. So iPhones — you can do a lot, in terms of display modifications. So you can change the color filters of your home screen, as well as the button shapes. I haven’t really found a good use for changing the button shapes, when we’re talking about vision impairment. But they offer that feature. The text and the fonts can be enlarged and made bolder. Much like the Android phones, I’m not sure that the boldness is that effective for people with vision impairment. Obviously the larger font size can be very helpful. And on the iPhone, that enlarged text will carry over to any apps that support it. That is not quite as seamless on Androids right now. But it’s getting better. My favorite feature of the iPhone is that it actually has a magnifier built into it. And it works much like a portable video magnifier, or like one of the magnification apps. So I’m gonna demonstrate that for you in a couple of minutes. It’s not quite as good as a portable video magnifier. But if someone already owns an iPhone, it’s a great option for them. The iPhone also has dark mode. And they have smart inversion, what Apple calls Smart Invert, instead of just general inversion. And this I think is a really, really good feature. Like I mentioned with Android, the smart inversion is one of the few places where I actually see a difference. So the iPhone will interpret images, and so it will not invert the pictures. It used to invert everything, and the Android phone for the most part still inverts everything. But now it smart inverts. So it tries to invert only the text. And it just makes it a little bit more of a pleasant experience for someone.
They don’t have to keep flipping back and forth between inverted and not inverted. To see pictures and things like that. And much like the Android, they can do overall screen zoom. So I’m gonna share my screen here. And see if I can show you… I’m just gonna grab a couple of things to show you. How the magnifier works. So give me a moment while I stop my share, and then reshare for you. It takes just a second here to connect. All right. So you’ve now got my phone up on the screen. And you can see I’m in dark mode. And I’ve got my settings there. Much like the Android, I can go into accessibility. It doesn’t want to show you that on the screen sharing. But I can make the changes. And then I can go into my magnifier. Let’s see if it wants to… It doesn’t seem to want to show you my magnifier. Let’s try that one more time. All right. So I’ve got it in the magnifier mode. You can actually see I’ve got it on reverse contrast. In sort of a grayscale right now. And it’s showing you my keyboard. And I can actually enlarge or not enlarge that. And I can flip my color. I can change my brightness. And it has a little slider to get to whatever color you prefer. So that’s no filter. Grayscale. A bunch of choices your patients probably won’t want. But inverted and regular are probably the most effective. If I hold it up to my PowerPoint slides, you can see the inversion is quite good. And I can use it with no filter here, if I wanted to look, for example, at, like, a glasses cleaning spray. It lets me enlarge. And zoom out. So we’ll stop that share and go back to the slides. There we go. So I really like that magnifier. But it takes a little bit more physical coordination. So you need a patient who is able to manipulate that. And they do find it harder to work with than your typical video magnifier. And I think it’s not quite as easy even as some of the apps you can put on the phone. But if they own an iPhone or an iPad, it’s free and it’s built in. So that’s a kind of helpful tool. So let’s go on to some of the speech. So as you remember, on Android, that was called TalkBack. On the iPhone, it’s called VoiceOver. They claim that it’s the most sophisticated screen reader available for smartphones. I think Android would argue that they are catching up very quickly. Probably the biggest difference with the iPhone screen reader compared to the Android is that it has rotor controls. So you can use the phone like a trackpad, instead of only gestures. So that makes navigating a web page a bit easier for someone with vision impairment. You can use Braille screen input on the iPhone screen. And like Android, you can connect to Braille keyboards and Braille displays. So really good options for people who use Braille. You can use Speak Screen, where you just swipe down, and it has the phone read what’s on the current screen. That’s available in 35 languages. So just a bit ahead of the number of languages that Android had in their features, but close. And like TalkBack, you can control the pace of reading very precisely. I didn’t want to go through all of those controls, because we talked about that with Android, but the same features are available on the iPhone. You can jump heading to heading, you can go paragraph by paragraph, you can control what you want read and in what order. In addition, the iTunes Store and Apple TV have audio descriptions available for shows and movies. It’s not on everything. But there is a decent amount of audio description available. And the iPhone has photo and image descriptions.
So if you knew you had taken a photo or someone had emailed you a photo or texted you a photo and you saved it, as you scroll through with VoiceOver, it will say: Okay. That’s a photo of… You know, a child doing XYZ. Or that’s a photo of two women standing outside with a mountain background. So they’ll give that information. Again, it’s not perfect. This is relying on AI to interpret the image and give that information. But it is there on iPhones. Takes a little bit to learn to access it. And then for the iPhone, you can just go to Apple.com, to accessibility. Much like Android, it has a pretty comprehensive guide. I find that actually the Android guide is a little bit easier to navigate, because it seems to be designed for people using screen readers. It’s pretty much all text. The Apple Accessibility Guide is sort of broken down into icons and images. And I’m not sure how easy it is to navigate that with a screen reader. Or as someone without vision. But they both have pretty comprehensive guides. So that’s the basics on what’s built in to the Android phone and to the iPhone. And now let’s talk about what you can do, as you add apps on. I think the built-in accessibility is actually quite good and far more advanced than we’ve ever seen. But there are still places where having an app is helpful. So patients often ask about: Which magnifier app should I get? There’s 80 of them, when I go into the Google Play Store or the App Store. And I don’t even know what I should do and what I should get. So the things that I talk to my patients about are asking them to figure out: Does the app do more than just access the light and the camera? Does it do font adjustments or contrast adjustments? Like I showed you with that built-in magnifier on the iPhone. Those are features that are worth having. For the most part, I tell people to look for apps that are free. I have not found there’s a whole lot of value in spending money on a magnifier app. The free ones work quite well. So there are a few different ones that I tend to steer patients towards, just because I’ve worked with them, and I know that they do a little bit more than just magnify. My personal favorite is SuperVision+, and that’s because it has an image stabilizer in it. It does have some contrast adjustments. And it works quite well. When I show patients the built-in magnifier and I show them SuperVision+, they often find that SuperVision+ is a little bit easier to navigate. It’s free. There’s lots of other free apps available. I make no money off of recommending SuperVision+. I just think it happens to be a good magnifier app. And I find when my patients are coming in and asking me for guidance, they don’t want me to give them a list of ten different apps. They want to know one or two that they can look at and try and they can figure out if they’ll work. So I try to keep a favorite list. I test them out. I get feedback from patients. And I update my favorites as time goes on. But SuperVision+ works quite well. So I’ll do a quick demo. Again, assuming that my screen share works. It’s been cooperating. So that’s good. So I’ll do a new share. Okay. So SuperVision+. You can see it’s that top left app. And I’ve got it inverted right now. I’m gonna flip it to regular. We can go back to my glasses cleaning spray. And you can see it’s a relatively stable image. You guys are seeing an enlarged version. It’s a little bit shaky when it gets sent over Zoom. But it actually is quite good image stabilization. 
The lightning bolt will turn a light on, if you’re in a dark room. You can freeze the image, and then take a screenshot of it and then save it, if you would like. So this is a little bit more stabilized. And I can actually invert the way that the image is. I’m not sure that that’s super helpful. But you can flip the image entirely around an axis. That’s on a tiny bit of a delay there. So pretty easy to use. Pretty good zoom. Good scroll. You can even see all the dust on my keyboard. And we’ll go back to the other… Share here. So I think that SuperVision+ — like I said, really, really easy to use. My patients tend to do well with that. There are a number of apps you can add on that read things out loud. This is just a list of some of them. They all have their quirks. They all have pros and cons to them. But you can try out Zoom Reader, Voice Dream Reader, Text Detective. Any of them are okay. Honestly, I feel like with Android’s new Lookout included in the phone, where it can do some of that text-to-speech, and with iPhone accessibility and our ability to add on Seeing AI, which is one of the apps on the iPhone, I’m not sure you really need to add on another one of these types of apps. But they are available if you decide that you want something like that. There are a variety of apps available for daily living. So there’s the LookTel Money Reader and the IDEAL Currency Identifier. Those do multiple currencies. The IDEAL Currency Identifier is Android-only. So that’s a way to get around the US-currency-only limitation of the Android. And then EyeNote and the US Currency Reader are developed by the US Bureau of Engraving and Printing. So those you can only use for US currency. But there are some different apps that allow for currency reading. There are some different color identification apps. I always give my patients some caveats with these. Because they’re not usually actually designed for people with vision impairment. They’re often designed to identify paint chips. So they’ll give you a really complicated description of the color. Because it’s matching it up to a paint chip. So the simpler versions of the apps tend to be the best. Those give you just, like, red versus brown versus green. If you turn on the sophisticated colors or the more complicated colors, it actually will give you a paint chip tone, and in the image I have here, Red Robin, that’s not so bad. That gives you a pretty good description. Whereas if you were to get a paint chip description, you may not even know what color it’s referring to. I very occasionally have patients who use these product identifiers or QR readers for their items in their home. But you can put your own QR codes on items. So if you… You’re always looking for a particular reference book, and you want a QR code on it, or a particular… jacket in your closet, and you want to label it that way, you can actually put like a tag, a talking tag on it. And scan that QR code. (There’s a sketch of that scanning step after this paragraph.) I think there is a role. I think some people use them occasionally. But you’ve got to be pretty tech savvy and willing to label your items and then use those QR codes. And then there are some apps with multiple functions. Like I said, both Android and iPhone are adding a lot of this in on their own. So you don’t necessarily need the apps as much as you once did.
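For the talking-tag idea, here is a hedged Kotlin sketch of the scanning side, assuming Google’s ML Kit barcode-scanning library and Android’s built-in TextToSpeech are available; the class name and the convention of storing the label as the QR payload are our own illustration, not any specific product’s design.

```kotlin
import android.content.Context
import android.graphics.Bitmap
import android.speech.tts.TextToSpeech
import com.google.mlkit.vision.barcode.BarcodeScanning
import com.google.mlkit.vision.common.InputImage

// Scan a photo of a home-made QR label and speak its text aloud.
// Assumes the QR payload is the plain-text label, e.g. "Navy blue jacket".
class TalkingTagReader(context: Context) {
    private val scanner = BarcodeScanning.getClient()
    private var tts: TextToSpeech? = null

    init {
        // TextToSpeech initializes asynchronously; disable it on failure.
        tts = TextToSpeech(context) { status ->
            if (status != TextToSpeech.SUCCESS) tts = null
        }
    }

    fun readTag(photo: Bitmap) {
        val image = InputImage.fromBitmap(photo, 0) // 0 = no rotation
        scanner.process(image)
            .addOnSuccessListener { barcodes ->
                val label = barcodes.firstOrNull()?.rawValue ?: "No tag found"
                tts?.speak(label, TextToSpeech.QUEUE_FLUSH, null, "tag")
            }
            .addOnFailureListener {
                tts?.speak("Could not scan tag", TextToSpeech.QUEUE_FLUSH, null, "tag")
            }
    }
}
```

In practice the bitmap would come from the camera, and the printed QR labels can carry whatever text the user finds useful.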
It’s been pretty interesting to watch the technology update: we went from having to have an entire separate device from our smartphone, to needing to spend money on some apps on our smartphone, to the smartphone really just building in what we want to use, right into the device. So a lot of users still use the KNFB Reader Mobile App. That app is $99 US. So it’s one of the most expensive apps that’s out there. It’s a really, really sophisticated text reader and document reader. It has pretty impressive capabilities in terms of text-to-speech. So people do still like it. I haven’t worked with the newer Android document reading enough to say that Android can absolutely replace what the KNFB Reader Mobile App does. But that’s something that I’m gonna start practicing with and seeing. If — you know, can we get people with an Android phone to avoid spending $100 US on an app? I love what they’ve done with the KNFB Reader App. And it’s not that I don’t want to support the app development, but when we’re working with people with low vision, free is always better. In addition to the KNFB Reader, for iPhones there is an app called Seeing AI, which is free. Aipoly Vision and Envision AI — those are hit or miss. Sometimes they’re really, really good, and sometimes I can’t get the apps to turn on at all. So I tend to steer my patients towards Seeing AI if they’re iPhone users and the KNFB Reader if they are Android users. So here’s a screenshot of the KNFB Reader Mobile App. So you can see that it has kind of large icons to help stabilize the image and capture the image. In the bottom right corner of that left picture, it says: Batch off. So you could be scanning multiple pages at once, if you turned that on. It has the flash so you can turn on a light. And you can take a picture, versus just having it read what’s there. And then really sophisticated ways that you can modify the text as it’s reading out loud. So it is a nice feature. But like I said, that $99 is sometimes cost-prohibitive for our patients. And then Seeing AI is free. So we will demo that. I’m just gonna grab a couple of things, so it can read them out loud. I will do this new share. We’ll connect it one more time. So if I go back in my apps and I go to Seeing AI…

>> Short text. Vail Valley’s finest luxury vacation rentals.

DR MALKIN: All right. So short text. It just read that really, really quickly. I could scan it…

>> Vail Valley Getaway. Vail Valley’s Finest Luxury Vacation Rentals. 888-617…

DR MALKIN: So you can see when I did the short text…

>> Visible.

DR MALKIN: One moment. Let me tell it to stop talking. There we go. It didn’t pick up the information from the image at the top here. But once I scanned it, it actually captured the entire thing. It has a bar code reader. It has a currency identifier. And it can give me scene descriptions. So this is similar to what the Android Lookout feature has. So if I click on that, and I hold this up… I’m just gonna take a picture of this printer and see what it does.

>> Processing. Probably a black printer on a table.

DR MALKIN: So not bad. And it tells you how confident it is. So probably a black printer on a table.

>> Processing. Two items detected. Move your finger over the screen to explore. Stove. Contains text. Dr. Alexis Malkin.

DR MALKIN: So in the very bottom corner of this image, it’s actually picking up my name. Which is pretty impressive. And you can move your finger around and see what it picked up in the image. And we’ll go back to the slides. So Seeing AI is a really, really useful app. Patients love it. It’s free. But you have to have an iPhone. So it has its limitations there. But like I said, if you have an Android phone, play with that Lookout feature. See how it works, and see if that’s something you can introduce your patients to. And then these are some apps that rely on sighted users. So I’m actually gonna skip the Aira demo today and just talk to you about it. But there’s Aira, Be My Eyes, and BeSpecular. So Aira connects you to a paid employee of Aira. It only works in a few countries, so it’s not something you can use everywhere. Really cool visual interpretation service. You can have a few short calls for free, but typically if people are using Aira, they’re actually getting a paid description. In addition, Aira has free services in certain places. So right now, they’re doing a promotion with Starbucks. So if you go to any Starbucks in the US, you can connect to Aira for free, have a visual interpreter explain to you where the line is forming. You know, what’s available in the refrigerator. What is new on the specials menu. Those kinds of things. Be My Eyes is a free app that connects you to volunteers. And you connect in, and you ask them questions. So someone with vision impairment can call the volunteer and can say: Can you tell me what that picture is? It’s on my desk. Or I got a piece of mail and I’m not sure it’s for me. And they can read you the envelope or they can read you that piece of mail. So those are useful for some patients. I do spend a good amount of time kind of reassuring patients about privacy policies, especially with Aira. They have published privacy policies. Be My Eyes is entirely volunteer-run. So you are taking some risk, if you’re gonna have them try to read sensitive documents. But it’s a pretty amazing volunteer-run app for non-sensitive documents. So I use it. I’ve helped people kind of reset their air conditioners, find a phone case, pick out a color of something in a store. There are some neat ways to do it. In addition to what we’ve already talked about, for those of you in the US, you can connect to the Library of Congress’s Talking Books app. So that you don’t have to have a Talking Books player. The National Federation of the Blind has their Newsline app, where you can listen to a huge number of different newspapers read out loud through an app. And then, you know, you can really just take advantage of everything that’s built into the phone. So a lot of my patients don’t use any of the sophisticated features. They actually just use the camera. They take a picture of something, and they zoom in. Or they call someone on FaceTime and say: Hey, can you tell me what I’m looking at? So you can keep it pretty simple or you can make it fairly advanced. So in the final few minutes we have this morning, I want to shift away from the traditional thought of smartphones and talk about head-mounted displays, because a lot of these actually use smartphone technology as well. So it’s connecting to the smartphone, but through an augmented reality or virtual reality type of device. So our final poll question for this talk: Do you have access to head-mounted displays in your clinic? All right. So most people do not. So we won’t spend too long here.
We’ll kind of talk about it briefly and then we’ll get to the question and answer. Because I think that will be really beneficial for everyone as well. So just a quick history on head-mounted displays. So they started in 1994 with the Low Vision Enhancement System, or the LVES. That is Dr. Bob Massof in the photo there, holding the original LVES. He developed that with NASA and Johns Hopkins. But it was pretty limited. It was low resolution. It was heavy. It was big. It had a lot of lag time. And it was very expensive. You’ll see as I talk about the current head-mounted displays, a lot of those are continued issues. So in 2018, this paper by Deemer et al. was published, showing what was available on the market. And the head-mounted displays ranged from about $2,000 US up to $10,000 US. Still a huge range of weight, and of resolution. So some of them are standalone devices, and some of them, like the IrisVision over here on the end, use a smartphone. So IrisVision uses an Android phone as its platform. Which is how it connects to the smartphone technology lecture. Patients are starting to ask more about the head-mounted displays. In part because they see the YouTube videos of people getting them. And they hear about them. Or a friend sends them a link and says: Oh, this might be the Holy Grail. This might actually fix your vision problem. So you really need to counsel your patients appropriately. And I really tend to steer them towards some of the easier-to-access technology, that built-in technology on their phones, before we go to something that’s gonna cost a huge amount of money. So you want to think about battery life and weight and updates. I do like ones that are using a smartphone, because you can update it more regularly, without having to change out the entire hardware system. But how is it magnifying? What’s it doing beyond what they can access through a different type of device? One of the issues that we’re facing, especially in the US, is that low vision devices can get regulatory approval without any kind of clinical data. So you don’t have to study it, the way that you would with a new intraocular lens. You can just get regulatory approval. And I think even more of a concern is what I just mentioned. That the marketing and the human interest stories can be really, really deceptive. So I have patients come in all the time, where they’ve printed out an article, and say: Oh, here it says that that blind person saw their child for the first time. So I need that device. And I find that people are not aware of standard low vision devices. So they haven’t even seen a monocular telescope. But they want kind of this high technology item. It’s kind of overwhelming and expensive and perhaps difficult to learn to use. When a monocular telescope might help them accomplish their goal in a much more cost-effective and efficient way. It’s not just about cost. But it might get them to what they want to do more quickly and more easily. And like I said, many of the features in these augmented reality devices are on smartphones. It’s just not quite as refined as what you can get in a head-mounted display. I use head-mounted displays pretty regularly in my clinic. But I like patients to know all of their options. I like them to see: Here’s the simple way to accomplish that task. Here’s what your own phone can already do. And here’s what this kind of advanced technology can do. And is there a role to integrate that? So lots of different devices out there.
They’re all still fairly noticeable. They are not just kind of built into your glasses, the way that a lot of people are hoping. And like I mentioned, they’re still heavy. They still have battery life limitations. And they just don’t necessarily do what those human interest stories would suggest. But keep watching. Keep seeing what they’re doing. A lot of my patients will buy their own VR headset. You know, one of those Google boxes, or Amazon makes one. And just put their own smartphone in it. And now they have their Android phone, with its Lookout feature, doing some things that are similar to what these augmented reality devices do. Not quite getting all of the features. But doing a lot of connectivity. A lot of visual enhancement. For a much lower cost, and kind of low risk, in terms of trying that out. In addition to those augmented reality devices, there is the OrCam, which does just clip on to glasses. But the OrCam is only text-to-speech. It can do some facial recognition and product recognition. But you aren’t getting a visual image. You’re only getting some auditory information from it. It does have a currency identifier. It is something some patients like to use. I always say that there’s always the right device for the right patient. So there’s no one piece of technology that is perfect for everybody. And like I said at the beginning, just kind of briefly mentioning some of the mainstream technology, a lot of our patients here are using things like the Amazon Echo, or the Google Home. They can do voice activation to make calls. They can have it read them the news. Have it order things off of Amazon for them. They’re pretty inexpensive devices. And they are a way to add that mainstream technology into what patients are doing. The future… We don’t know where we’re gonna go. But I think every time I prepare this lecture, I see new updates to what smartphone technology can do. And I’m really impressed with the direction that things have gone. We also know there are retinal chips being implanted. There are cortical implants. There’s a lot of technology being worked on to help our patients with vision impairment. I’m not going to go through the Argus today, because I wanted to make sure we have time for questions. But that is a retinal implant. And I think we just need to keep watching, to see: What can we end up getting into our smart device? Can we encourage patients to start with that part of our technology and work with it? And then long-term, perhaps we will be at a level where these retinal implants and cortical implants and augmented reality displays become truly a part of low vision practice for every patient. I think right now, we’re still at the right device for the right patient, depending on their goals and their needs. So I will pause or stop here. And we can go to the questions. Because we just have a few minutes left. All right. So I’m gonna go into the Q and A and try to answer some different questions. So one of the questions is: Is there a game app for low vision patients? So there are a lot of different games that are actually accessible for low vision patients. I follow an iPhone and Android group on Facebook, and there’s a lot of conversation about which game apps are most accessible. So I don’t have any specific ones to give you. But you can direct your patients to some of those groups. Like AppleVis or some of the Facebook groups where people with vision impairment talk about those. And another question is about sharing the patient handouts.
I can share a version of the patient handouts with just kind of the step-by-step for what you get into, to turn on the accessibility. We should be able to upload that when we upload the PowerPoint and the recording. For turning on the magnifier, that is in the accessibility settings, and if you’re using the iPhone, this will be in the handout. But you go in, and it says “magnifier”, and you just switch it to turn it on. But we can give you the handout on that. Or if you go to Apple’s website, it will walk you through that as well. I’m just looking to see here. There’s a question about using some of these apps at school for writing. So some of the devices can actually do a good amount of recognition of handwriting. The Seeing AI app does recognize handwriting. But it’s still not great. There is a comment that unfortunately some of the magnifier devices aren’t available in low income countries. And that is definitely true. I think that’s where the Android phones having so much built in has really changed the accessibility of this technology. So because you can use what’s built into an Android phone, that’s fairly accessible. I find that when I’ve worked with patients in other countries, many have access to a smartphone. Even a very basic Android phone. That is one way that it can be really helpful. So that they don’t have to see technology that they can’t use. There’s a question about head-mounted displays and augmented reality in more detail. That will probably be another talk. Because that’s probably a separate hour on its own. So perhaps for the next Orbis webinar, we can do the head-mounted displays. The same is true — someone asked a question about bioptics. So those I would count as kind of your optical low vision technology. So that’s something that we can talk about in other places. And some of these we may have to answer… I’m just looking… Some we’ll do through an emailed answer. Let me just mark through so I give you all the information you are looking for. So there is a question about children using head-mounted devices to attend school. So I think it really depends on where the person is in school. So some of my patients do use them. Typically it’s high school or university. I have not found that it’s needed for younger children. I actually try to focus more on using something like a Mac Connect. That’s a CCTV that the student has at their desk. And then they can use that to focus on the board, or use a monocular telescope to focus on the board. But I have had patients use head-mounted displays in high school, in university, and in other settings, where they’ve found it to be really helpful. They’re not gonna walk around wearing it. They would only put it on at their desk. But it really does depend on the support of the environment that they’re in. There are some questions about training patients. It’s really variable how much time it takes to train patients who are older or younger. So some of my patients pick up on the accessibility features in one visit, and some we really need to do three, four, five visits with occupational therapy, vision rehab teaching, or other technology training programs to get them to learn to use some of the devices. So it’s really variable. It can be five minutes. It can be more than five hours of training to get people using it. And there’s a question about magnification alone being a bit of a limitation. So most of the programs, unfortunately, don’t kind of work around scotomas.
Other than just by enlarging and adding contrast. But there are some programs really trying to figure out if they can be used with eye tracking. If we can direct an image to a better part of the retina. But right now, most things are still relying on just contrast and on magnification to get around the scotomas. And it does work somewhat well. It’s not perfect. So we are getting very close to the end of our time. So I’m not sure if… Lawrence, if we want to wrap up or if we want to keep answering questions. Or just do the questions through email. Whichever way you want to do it.

>> So it’s whatever you’re comfortable with. We have maybe five more minutes, if you want to answer a few more. And then you can do the rest by email.

DR MALKIN: Perfect. Okay. I’ll answer a couple more. So there’s a question about whether there are smartphones available for totally blind patients. So there are. I mean, you can use either the iPhone or Android if you’re totally blind. They both do have — like I said, that fully integrated speech output, as well as some Braille connectivity. If they are a Braille user. And I have patients that use both, who are totally blind. It’s really patient preference as to which features they prefer. Android phones tend to be less expensive. And so it’s really great that they have caught up with some of the technology. Okay. Let me just scroll through. I just want to try to answer as many as I can. There’s a question about adding in languages that are not included in the apps. On the iPhone, you cannot add any languages in. Android phones are — a lot of it is open source. So I mean, if you could do the coding, you might be able to do the update. I’m not sure. I haven’t had anyone actually try to update the language on an Android phone. But they are adding languages on a regular basis. So perhaps send in requests to Google or to Apple and say: Hey, this language is really important. Can you work on adding that? But I haven’t tried to do it myself. And I would say it’s definitely gonna be easier with an Android than it is with an iPhone. So there are some questions here that I think are definitely conversations for other topics. Kind of looking at children with special needs. And some of those — I think that will be another Orbis session on another day. The iPhone magnification — they actually won’t tell you what their max mag is, because they have so many different screen sizes. But I think it’s… I would guess it’s pretty similar to the Android. Where they advertise 8x, but that’s without specifying screen size. So the larger — if you’re on a large iPad, the iPad Pro, you can magnify to a much higher level. And… Someone asked a question here about Parkinson’s and some of the difficulty using mobile phones. What I recommend here is they do actually sell some stands for the phones. So it makes it a little bit less mobile. But if they want to use it as their magnifier in the home, or as their technology, getting some sort of universal stand, where they can clamp the phone on to a desk or a table, then their hands are free, and they don’t have to deal with the shakiness as much. They can use it like a CCTV, and they can have that clamp. Like a universal phone clamp. And then their hands are free to be able to use the magnification features. It’s a really good question about kind of the shakiness and working with patients. It’s why many of them like the other tech better. But there are some workarounds that vision rehab teachers, occupational therapists, or you can help them with. All right. I got through about half the questions. But I think we will wrap up there. Because we are running up against the end of our time. But I am happy to post my answers. I think in previous Orbis lectures, I have kind of sent in my answers, and then we can post that with the lecture on the website. So thank you, everyone. And we will actually answer more questions remotely.

May 25, 2021

Last Updated: October 31, 2022
