2021-10-02

This transcript is made possible thanks to funding from InternetNZ.

Jonathan Mosen:             I’m Jonathan Mosen, and this is Mosen At Large, the show that’s got the blind community talking. Today, Aira and GoodMaps have struck a deal. And the former Sendero GPS mobile apps are free and have come to Android. We have all the details. Plus there’s more on iOS 15 and adventures in Android.

In 2019, Aira announced the acquisition of Sendero GPS apps. The concept was that you would be able to use GPS to get you pretty close to your destination, and then an Aira agent to perhaps help you get into the building and navigate around that building. And while there were beta releases in the early days, an Aira GPS product hasn’t been released. And now, Aira and GoodMaps are announcing that the former Sendero GPS technology has arrived at a new destination.

Now, when blindness GPS products are being discussed, Mike May is generally never too far away, and I’m also joined by Troy Otillio from Aira, who’s their CEO. Really good to have you both here, thanks for coming on the podcast. Can you talk us through the timeline here, Troy, and what happened with the Sendero GPS acquisition?

Troy Otillio:                         Sure. Well, first off, it’s great to be here, Jonathan, thanks for hosting. And it’s great to be on the line here with Mike May. Mike and I were recently in Tampa together at my first conference post pandemic, or I guess we’re not quite post pandemic, but in a phase when we could travel, which does remind us about navigation. Because even for myself and being sighted, I think we all find travel and navigation challenging, right? You got to get to someplace quickly and there’s a lot of information to consume.

But the situation with this product was we had all the intention to… and we did acquire it and began augmenting it to make it more seamless and compatible with the Aira experience. Yet, about the time we were getting ready to release, back in 2020, was right at the early phases of the pandemic. And we just didn’t believe it was the right time to lean into navigation and outdoor activities. We just felt it would send the wrong message.

And we had also re-prioritized Aira. And for those of you who don’t know, it was about that time there was a management change. And in fact, it was in February of 2020, right before the pandemic, I became the third CEO of Aira. Of course, I was formerly the chief operating officer, I’ve been here since the beginning when Suman started the company, and through Mike Randall as CEO.

But I took over the reins to drive the company in a more sustainable manner, and then the pandemic hit. So you got to realize the context, new leadership and new vision for the company, as well as the pandemic. And with all those things going on, we just didn’t feel it was the right time to release the product. And since then, GoodMaps appeared.

And the more we thought about what we’re good at, which is visual interpretation, not just in a navigation experience, but any task, that’s really our core, and GPS navigation apps while useful and valuable, are not our core. And GoodMaps came along, and having known Mike since the early days, and the folks at GoodMaps, we just thought it was the right thing to do to transfer a super-valuable asset to a company whose expertise is in navigation through an app.

Jonathan Mosen:             There’s nothing like becoming a CEO at the beginning of a pandemic, I totally relate to this. It does tend to throw one’s priorities somewhat all over the place. So Mike, you’ve been associated with the GPS a very long time, and you must be really chuffed, I guess, by how the GPS technology has evolved from those early days, when you’d have to carry a backpack full of technology and a laptop with very minimal battery life, to what we can do now.

GoodMaps has been US-focused for the most part up until now. Can you just give us a bit of background on what GoodMaps is, where it sprang from and what its mission is?

Mike May:                           Yeah, Jonathan, thanks for having me on here. And to put this into a bigger context, I’ve always been a proponent of the fact that we need an accessible toolbox, and not one solution fits everybody all of the time in every situation. So that’s why it’s really wonderful to team up with other tools like Aira, because sometimes you need visual assistance, sometimes you don’t, sometimes you need navigation. And as a blind person, I’ve been looking for more location information and better navigation all of my life. So when GPS came along, that was a tool that I certainly focused on, and wanted to develop for myself and for the blind community.

Fast forward to 2017, ’18, when we closed Sendero and wanted the products that we developed to keep going for those users who had them. Aira took over the mobile products and worked on those for a couple of years, which I was beta testing.

Then I joined GoodMaps, which was only formed in 2019, and it really made sense to look at all the best ways to improve outdoor navigation, as well as the indoor navigation that they were focusing on. The ideal solution really needs to be a combined outdoor and indoor product. So in my conversations and interactions with Troy, we said, look, GoodMaps is navigation-focused, maybe it makes more sense for GoodMaps to take the app, release it, and make two major changes.

One is, to make it free because the previous apps were always subscription-based. And the other was to add Android. Those are the two big changes along with a number of other things under the hood that will be available upon release.

Jonathan Mosen:             Will this be your typical, what we used to call Sendero GPS? Or, are you also including some of the indoor navigation features that GoodMaps has been working on?

Mike May:                           The new app is going to be called GoodMaps Outdoors. And it replaces the three apps that were named in different countries, Seeing Eye GPS in North America, RNIB Navigator in the UK, and Guide Dogs NSW in that region of Australia. So Outdoors will be the new app that will cover all of those features that were previously in the app, and then with the addition of some improvements.

Jonathan Mosen:             What will happen to those who have a current subscription?

Mike May:                           It depends on the version that they have. But if they have auto updates on, in many cases, the update will happen automatically, and in other cases, they’ll be prompted, do you want to update? Certainly, if they aren’t sure, they can always go to the App Store, put in GoodMaps Outdoors, which is GoodMaps as one word, then a space, then Outdoors, and hopefully that will land them on the right app to download and replace their old one.

Jonathan Mosen:             How have people paid for this in the past? Do they use an in-app subscription with iOS and will that be canceled in due time?

Mike May:                           There’s a couple of different solutions depending on the country. There was a one-month trial version that was something like $4.99. There was a lifetime subscription on the Seeing Eye GPS XT, and I think that was only in the US for $199. Australia had some special pricing that was around $10 US. I don’t remember exactly what the situation was in the UK.

Jonathan Mosen:             You may get a bit of flak, I guess, from people who’ve paid $199 lifetime subscription for the Seeing Eye GPS app, and now it’s going to be free to everybody. Is there a way of handling that, or is that just the evolution of apps and how things go sometimes?

Mike May:                           Well, the good news is they don’t lose anything, but this is part of the same philosophy you might think of when minimum wage is raised. And the guys who were making $13 an hour now get 15, but the guys who were making 17, they stay at 17, it’s just an annoying fact. The solution that I’m thinking of right now is that we’re going to create something we call, let’s say the GoodMaps Outdoors circle or round table, something of that nature. And we will invite anybody who’s paid for the app who wishes to join that inner circle. They can make feature suggestions, they can beta test. And I think that will really benefit them, benefit us, and recognize that they have paid into this.

Because the new model, with the free Outdoors app, the way to support it is to ask for sponsors. And we have two committed sponsors so far, The Seeing Eye organization in the US, and Guide Dogs Australia. And we hope to have others, and they will have a seat at that table to make feature suggestions and to help fund the Outdoors app so that we continue to improve it.

Jonathan Mosen:             Do you see any ongoing relationship here, Troy, between what GoodMaps is doing and what Aira are doing? Because presumably, there’s still that synergy that got you interested in this in the first place. The idea that because human agents have to be paid, and therefore that cost must be passed onto the consumer, unless there are again, sponsors of some kind. It does make sense to use GPS where you can, and then have an Aira agent come along to help you in those situations where GPS isn’t the best solution.

Troy Otillio:                         Yeah, first off, one of the reasons we decided to make this change was in part because the GoodMaps team, including Mike May, are just an outstanding organization focused on our key use case. Like Mike said, there is no one tool, I think, for anybody for any situation, yet that’s our focus.

And in my view of this industry, partnering is so key. There are relatively few commercial vendors who are supplying tools and capabilities. And I see a future where there’s more partnership between all companies, because you want to make these experiences more seamless. And I think Mike can expound on this. I took note of something he said recently at the conference wrap, but it is true. I think removing friction is what is key to making everyone’s life and experience better.

And in the case where you’re going to use multiple tools, how do you reduce that friction? Now, you started on the partnership side, but I first want to emphasize that GoodMaps and Aira are committed to continuing to partner and look at ways to more seamlessly integrate these experiences. So that as you’re using, let’s say, indoor maps or outdoor maps, and you hit that moment where you want something described, either for entertainment purposes or because it’s critical information you want to gather, yes, you have all the GPS coordinates, you have all the navigation, but there’s that different need. How can we make that as seamless as possible?

So we’re going to definitely look at how we can partner together. Things we’ve talked about is, as GoodMaps… And I’m sure Mike can talk more about this, but as GoodMaps begins to map more indoor locations, and this is no commitment on either side. But we’ve talked about what if the Aira agent could get that indoor map?

As you know, we navigate people all over the place. And if we don’t have GPS, we rely on the video feed, which provides an experience. But when we have a map, a moving map, when we’re outdoors and have GPS, the agent is just more efficient and can provide more information. So what if we could make use of the indoor maps that GoodMaps has created? So we look forward to collaborating with them on all of these advancements as our roadmaps can afford.

And then as it relates to partnerships, certainly, we’re having a lot of success with industry partners who are paying for Aira for their employees, or for their students at universities. Or, we love to talk about Starbucks because I think it’s a very leading use case. So in any Starbucks location today in the US, you have Aira for free, just like you do at Target and Bank of America.

And I think there’s an appetite out there, I know there’s an appetite out there for commercial organizations, government organizations to provide and pay for services like Aira, like GoodMaps, whether it’s to enhance their brand, simply provide a better user experience, or in some cases, reduce the effort that it would take otherwise to facilitate that use case.

Jonathan Mosen:             You make a very good point about the friction. And it seems to me, if I’m understanding correctly, what we will have is, we’ll have an outdoor GPS app from GoodMaps, as well as another one, and the Aira app. So it will be wonderful to see in the future, just a single app where when you get to a point where you might need an Aira agent, you might be able to call them from that app.

Mike May:                           Yeah, and Jonathan, in the GoodMaps Outdoors app, there’s a button to launch the Aira app, and there’s also a place to search for nearby free Aira Access Points. And I think it’s also important to recognize that GoodMaps has a very simple, basic app that was released one year ago, GoodMaps Explore, that’s focusing on indoor navigation, but it does have outdoor.

And one of the reasons that it’s exciting to have this new app is that it all of a sudden fast-forwards us to having comprehensive outdoor features that a lot of the medium to advanced travelers would like to have. There’s a number of people who are lamenting that Nearby Explorer has not been continued and updated. And it has a lot of wonderful features. And I think in GoodMaps Outdoors, we cover most of those features.

And there are some like breadcrumb mode and built-in turn-by-turn navigation that are unique to this app only. So we fast forward… Those apps are based on eight, nine years of user input. And all of a sudden, we can take advantage of that and make it available for free worldwide.

Jonathan Mosen:             Right. So you’ve anticipated my next question, which is where is this available? Because am I correct in saying that the current GoodMaps Explore app is US-only right now?

Mike May:                           Explore is available in the US, Canada and the UK. It’s being beta tested in Australia and should be available there shortly. I hope New Zealand’s not too far behind. That app will be rolled out a little bit more slowly in different parts of the world. Our intention is to try to turn on GoodMaps Outdoors in as many countries as possible, early on.

Jonathan Mosen:             That’s very significant because you and I go back a long way. And we remember the BrailleNote days and it cost a lot. You’d get your BrailleNote, and then you’d get the Sendero GPS on the BrailleNote. And it was a pretty pricey proposition; many people really just couldn’t get there without some sort of funding from government or some other source. And so the fact that you can go to an App Store and download this thing, with all of those years of experience and blindness-specific navigation, is a very significant development.

Mike May:                           Yeah, back in 1997, ’98, when I was forming Sendero and took the concept from Arkenstone, Jim Fruchterman said, “The ultimate day is when GPS is free and it’s ubiquitous.” And I think with this announcement on your podcast, we’re crossing over to that success point.

Jonathan Mosen:             Why not integrate the indoor stuff with this as well? I haven’t experienced myself what GoodMaps is doing with indoor navigation, but I have heard a couple of interviews about it, and it sounds really intriguing. Do you intend to perhaps integrate all of that into this app, or where will that go?

Mike May:                           A lot of it just has to do with the code base. Explore is in an updated native code base that can try to keep Android and iOS simultaneous. And of course, the Seeing Eye app, although it’s been updated over the years, goes back to its formation in, I think, 2012 or ’13, and so it would have to be revamped. Getting the two to work together is not something that’s off the table, but for the moment… Particularly these days, people run multiple apps at the same time anyway, so it may not really be an issue.

Jonathan Mosen:             And while I have you there, can we talk a little bit about this indoor navigation work that GoodMaps has been doing for those not familiar? Am I correct in saying that LiDAR is playing a significant role in this work?

Mike May:                           LiDAR and cameras have played a significant role in mapping the building. So this is done with a commercial-grade, high-powered LiDAR, not the kind that’s built into your phone, and the building is mapped. And then you can use your phone. You don’t need to have LiDAR; you can use any iPhone or Android phone, or most of them, I think back to the iPhone 7 and Android 10. And you can then, with your camera exposed, walk around, and your position is referenced against that cloud map that was created originally by the LiDAR and camera.

Jonathan Mosen:             Are you concerned about the variance in standards for indoor navigation? Ever since I’ve been using GPS technology, indoor navigation has been this holy grail. And we know of course, that GPS is not reliable indoors and people have been trying to find a way to crack this. But there’s a lot of disparate technologies being offered as a solution to this without one emerging as the victor yet.

Mike May:                           Yeah. Outdoors, we have free worldwide GNSS, as they call it now, because there are multiple forms of GPS. And it’s free. How do you have that indoors? Well, nothing has reached that level yet. People have been working on different indoor positioning techniques for years, from fluorescent lights to Wi-Fi, most recently beacons, and some places will have Bluetooth beacons for indoor navigation. The latest is LiDAR and camera. I saw something in the Wall Street Journal yesterday, talking about radar as coming into its own.

So I think these things will evolve, but the fact that you don’t have to put anything in the infrastructure, is really critical to scaling this kind of navigation throughout many, many buildings. One of the other things that we’ll be doing at GoodMaps is trying to work with other companies because many airports are already mapped. And so rather than mapping those ourselves, we would want to work with the originators of those maps, and make them accessible. There’s a lot of collaboration that will help to scale indoor navigation over the next several years.

Jonathan Mosen:             Do you see this as rival technology, Troy, or complementary technology? I think for example, when I was doing a lot of international travel, being able to use Aira at an airport was just such a game changer. If you’ve got an app where you’re not paying by the minute or whatever, that can give you some information of this nature, is that a threat or do you see them coexisting in some way?

Troy Otillio:                         No, they’re coexisting, Jonathan. First, I always have to remind folks that Aira did start off as a paid-only service. Our mission is frankly, to provide professional, quality visual interpretation at the lowest cost, if not free, for everyone. And we’re getting there by having these sponsors. So even in the US, we’re up to roughly 45 airports, despite the fact there is a pandemic, and obviously airport travel is down and budgets are tighter.

But if you’re in an airport, it’s free. And whether or not you are, you can make use of the five-minute daily call, but ultimately we look to reduce the price or have it covered by organizations. And what we see in airports, or any kind of travel, is that they’re complementary. So yes, you can get from point A to point B, and you will know about the waypoints and maybe the stores and the details. But sometimes, people want more information or augmented information, such as how many people are in the terminal, or what the teleprompter or the LCD screen above says about departing flights, or how crowded is that restaurant? And the list goes on.

And so I think if someone wants to get from point A to point B, they should use the best tool and they should use all the tools that they think are necessary. And I think it’s great to have options, we don’t see it as a threat. Another fact that many people don’t realize, would you believe that pre pandemic, only about 12 to 15% of Aira calls were for navigation? So it isn’t our primary use case, even though it’s a super valuable use case.

About 30% of our calls are related to online, or tasks on the computer where there’s a bit of inaccessible UI or experience, so that’s a huge use case. We have a lot of reading, like reading paper mail or prescriptions, or cooking. So there’s so many things you can do with Aira. And users, as they begin to explore all those use cases find even more than we’ve even thought of.

So navigation’s a core use case, and we’re there to support that use case. And as it becomes easier with apps from GoodMaps, whether indoor or outdoor, we just see it as a way to get more people out and about. And if you’re more active, we know you’ll ultimately probably look to use more Aira. So I think it’s actually a stimulant from just a pure Aira perspective.

Jonathan Mosen:             And Aira has significantly simplified the product line, I guess, both from the customer and Aira’s point of view, by not having the glasses. And people are coming up with all sorts of solutions to deal with that in terms of camera attachments, that sort of thing.

Do you think there will be a time when some form of glasses will work with Aira again? There’s been a lot of speculation that perhaps Envision would be a good partner for Aira. We know that Apple’s dabbling in this space and they’re probably going to come out with a product sometime, but we don’t know what that’s going to look like. Has Aira lost something by losing the glasses?

Troy Otillio:                         I’m a pretty upfront person. I know that that use case was very valuable to a subset of Aira users who are very passionate about the hands-free use case. It wasn’t sustainable for Aira, especially back last February, because of the high cost of manufacturing, and the high cost of shipping. And when we looked at the data, only a small percentage of our customers would continue using the glasses beyond the first or second month.

And as we looked into it, looked into its causes, a lot of it has to do with the technology and form factor, just having to have an extra device, maybe having… In our older use cases, having to have it charged, cables connecting the glasses to the device that we chose as a controller, there were a lot of challenges. But what I see and what we’ve talked about going forward is allowing Aira to make use of any connected camera.

Envision is an option that we’re looking at, but let’s just think about all the cameras that are out there, what you might call spy cameras. Mike and I have been talking about this. If you go on Amazon, there are all kinds of low-cost Wi-Fi connected cameras with battery life that stream. What if you could buy one of those and attach it with Velcro to your hat or whatever you need to? What if you could use… Like you said, there are so many more glasses coming out.

I think the future that I see is Aira… Again, it’s focusing on its core, which is trained professional agents. We’re not a hardware expert, and I think our new strategy is to leverage devices that are out there. And certainly, when you’re looking for a dominant player that’s going to provide at low cost, whether it’s glasses or, as I mentioned, so many spy cameras, one could imagine putting Aira in between your Ring doorbell, let’s say, that provides a video feed, so the agent could describe who’s at the front door.

So the future is making use of more cameras, more devices in a way that allows the individual to choose and doesn’t force Aira to get into the business of hardware and integration which takes us out of our core domain.

Jonathan Mosen:             So for that to be really viable, it sounds like you’d need quite a robust API, ideally. Is that something that you’re thinking about so that people can just integrate an Aira button into whatever technology that they’re producing?

Troy Otillio:                         Yeah, we announced at NFB, our larger roadmap which includes the development of what you might call a true SDK, so that the vision is that anybody could recreate an Aira experience by just using our APIs to stream video, to log in, to do the things you need to do. And so the old Aira was more of a vertical app without a proper SDK, which prevented what you’re exactly talking about.

But we hired a new CTO early this year, named Vinu Somayaji, who comes with years and years of experience. And this is something his team is actively working on literally today, is the new Aira SDK, which allows us to consider those kinds of outcomes with additional cameras.

Jonathan Mosen:             So from your point of view, Mike, with the GoodMaps Outdoors app, does that mean that potentially we might see a day when that Aira button will do more than just shout out to the Aira app? It could potentially have that SDK somehow integrated. Is that something that’s feasible in the future?

Mike May:                           Yeah, I sure hope so, because right now it’s a multi-step process. You have to launch the app, and then once the Aira app is open, then you got to click on that button. So it’s not a big deal, but it certainly would be nice and seamless if our software says, okay, you are 50 feet from your destination, do you want to switch to a visual agent? And pop us over to an Aira agent.

In the GoodMaps Explore app, there’s a Be My Eyes button that works in somewhat the same way. We also have the same issue with looking for a hands-free camera solution. For the indoor navigation, you need to have the camera exposed and vertical. So either in a hand or a pocket, on a lanyard, or in a pouch. And so I’ve been very curious about different glasses which seem like the logical place to have a camera. I have a couple that I’m playing with.

I have learned that Apple does not have a mechanism for passing through the video, so you can’t replace the internal camera with an external camera. So the solution would have to bypass it and go straight to some URL on the internet. And for ARKit, which drives part of our indoor navigation technology, there is no hands-free solution to use a camera for that. So at the moment, you’ve got to have your camera out for indoor navigation.

Jonathan Mosen:             When are we going to get these apps in the store, these new GoodMaps Outdoors apps?

Mike May:                           They should be in the store as we speak. We never know-

Jonathan Mosen:             You can’t get better than that.

Mike May:                           Yeah. You never know the final gyrations of the Play Store or the App Store, whether and when things will get approved. But the goal is to have it come out as we’re talking.

Jonathan Mosen:             So that’s iOS and Android right away?

Mike May:                           Yes.

Jonathan Mosen:             The Android users will be very happy about that, because often they feel like it takes a little longer for them to be considered. Now, if I can go back to square one because some of us have been using the products that you’ve been involved in and that this technology is fundamentally based on for years and years, and we take it for granted. But if people have not used this technology before, what is the big deal? What will this app be offering that perhaps they might not find in other free blindness navigation solutions?

Mike May:                           Well Jonathan, I’m a proponent of any navigation is better than no navigation, so use whatever app you like, or you’re familiar with, or you can afford. Nearby Explorer is a great app, and the Seeing Eye apps have been great; they’ll be a lot better now that they’re free. BlindSquare, Lazarillo, there’s a number of others, so use any of them. Or sometimes use multiple ones together because they each have a strength.

Soundscape is a wonderful app; it’s the only one with the spatial navigation, which I think is an awesome feature. GoodMaps Outdoors has built-in turn-by-turn routing with a lot of prompts and a lot of verbosity, which is considerably better than what you’re going to get in terms of turning information with Apple Maps or Google Maps. I think that’s a real strength.

If you want to create a breadcrumb route, or as we call it a waypoint route, I think this is the only accessible app that does that. So great if you’re camping or if you’re in some place where… Let’s say a campus where there aren’t streets and you want to map a specific route, you drop your breadcrumbs along the way. Those are a couple of the things.

Jonathan Mosen:             Right, the turn by turn is significant too, isn’t it? Because I guess there are a couple of more solutions that offer that now, but the idea that you can get that really clear, blindness specific turn by turn is quite a big value add.

Mike May:                           It is, there’s a lot of prompting. We found that when people make a turn, the question going through their head is, did I turn in the right place? And you don’t get that from Apple or Google; you do from this app. It tells you, continue straight, your next turn is in so many meters. And then the same thing indoors. Some of the indoor navigation apps don’t actually have turn by turn; they have point-by-point navigation. And that’s quite different indoors, because you don’t have named streets or hallways to navigate down. You need to be coached, particularly in a wide hallway, like in an airport. You may have to go a few feet to the left or the right to find your next turn.

Jonathan Mosen:             And with adding that indoor navigation scenario, does that mean that for the GoodMaps Indoor technology that you’re working on to work, that a professional needs to go ahead and map each building? Or do you anticipate that it will get to the point where somebody who’s got some tech skills could go ahead and map a building say in advance when they know that blind people are visiting?

Mike May:                           Well, currently the mapping process doesn’t have to be done by a professional in terms of O&M; it’s just a professional mapper who knows how to capture 360 degrees of their environment, how to walk through, and what features to capture or not. And then there’s processing that goes on on the backend when this is created, before it’s released to the user. It doesn’t take any particular mapping expertise, it’s just a matter of capturing that 3D cloud image in an effective way.

Jonathan Mosen:             There was a time Troy, and I remember CSUN presentations relating to this, where Aira was thinking, look, we’re getting so much information, data is king, and we’ve got all this data on the way that blind people like to work in different situations, the challenges that lack of sight causes, and that could have some relevance in an AI context. What is Aira’s current thinking on that?

Troy Otillio:                         I still think that’s an important asset that we have and something to be leveraged. Again, being a more practical CEO, I think the use cases where that can exceed the agent experience are few at this point, or expensive to get to.

Gosh, we have over 5 million sessions at this point, and that’s a lot of data. But when you think about that in the narrow use cases, I think we’re still building up that dataset. And here’s the good news, it’s the wonderful thing about technology. The cost to process AI both from a compute perspective as well as the software engineering effort, that continues to drop, and we could talk a long time about why that is. It’s just everything from Moore’s Law that says chips are faster and cheaper over time, memory’s faster and cheaper. Cloud computing is more ubiquitous.

All those things are in our favor. I think right now, Aira… I don’t think right now our focus is on that core use case and the SDK. We’re getting to some better call experiences; some things on the roadmap include the ability for you to schedule a call, or to receive a call back if you call at certain times when agents are really busy. We’re focusing on a lot of those key use cases that are near and dear to our primary visual interpretation service, and then as we grow and as we gain more commercial partners, that puts us in a position to make value out of that data that we have captured.

But presently on our roadmap we don’t have any large AI roadmap items. Even though we have lots of ambition and lots of ideas about what we can do, there’s nothing in the next 12 months that I see coming out that would be what you’re referring to: something where an agent, like we used to call this agent Chloe, like Siri, will be able to act in place of an Aira agent or as an augmentation to an Aira agent.

What you may see is some upfront coordination before the call starts, where we might offer some optional prompts, whether voice or otherwise, for you to describe what you intend to do so that we can match you to the best agent and we can prepare that agent so there’s less upfront time for you to describe the context of your call, and use those minutes and get right to the task.

And we’ll be applying AI in that place. So I think AI is just another tool that any smart technologist is integrating where you can get value, but the future of an AI driven agent versus a human agent I think is still a little far off.

Jonathan Mosen:             I missed the presentation where you went through the roadmap, so you may well have talked about this. But what about other platforms? I note for example that Amazon has announced another pretty impressive sounding Echo device with a camera that’s designed to be in your kitchen, and so the idea that you might be able to just summon an Aira agent with a command and hold your thing up that you want to inquire about with this big device that’s on your wall in the kitchen, it’s attractive. Google have similar devices. And then of course there’s PC and Mac. Do you envisage Aira coming to other platforms?

Troy Otillio:                         Jonathan, have you been reading my email? Let’s see, what can I talk about? For one, the foundation is this Aira SDK that is actively under development. That’s a foundational piece that is going to enable us to connect and adapt to more devices and more integrations. We also announced on our roadmap something that goes back to our primary use case these days, our majority use case, which is computer-based tasks. And so we know that users want to call Aira. They don’t want to have to pick up their phone to call the agent.

For those of you who don’t know about this capability, Aira has the ability for an agent to remotely view your desktop or your phone, and if you want them to, even remotely control the desktop. And that’s a very popular activity. And so what our users told us, and it makes total sense, is like, “I just want to hit a button while I’m in a document, while I’m doing something, and boom, instantly that Aira agent is there talking to me and I’m able to instantly share my screen, share documents back and forth. I want that to be seamless and quick.”

And so we are also working on and will deliver in the next three months, a desktop version of Aira, where you don’t even need a phone ever. You can use it on the phone but you can also literally use it right there on your Mac or your PC. And that’s going to be a really awesome use case, because I think the appetite is there but there’s still a little bit of friction for some to take the multiple steps you need to share desktop or share documents.

So that will be I think an exciting day. And again, that’s centered also around our SDK development. And from that, then we can look to I think that next level of where can we deploy Aira, in what use cases I think? I have both a… And I better not say it too loud, sorry, near one of the devices.

But I think anytime you want information, I think the internet and AI is great. Sometimes you want a human who you trust, who knows about you because you have a profile, and for those devices with a camera. Why not use that? Why get out the phone? Why not just say, “Hey, favorite platform, ask Aira to call an agent.”

And then you can hold up, like, “Is this tomato soup or is this… What’s in this can?” Or any number of questions that you might have for example in the kitchen.

Jonathan Mosen:             Yes, I look forward to throwing the tomato soup out as soon as possible. So that’s good. We have strayed all over the map here as it were, but I think the important thing that I’m trying to emphasize with doing that is that we’ve got a pragmatic decision that’s been made here. And I think when we look at your tenure, Troy, as CEO of Aira, pragmatism would be the word that I would use, that you have defined what Aira’s core business is.

And you look at GoodMaps and we know what their core business is, and so people are if I might use the expression sticking to their knitting with this announcement, and blind people are better off as a result because people are concentrating, focusing on what they’re good at.

And you can clearly see from both of you that there are plans for the future that will benefit blind people considerably. I think when we look at the fact that the Cadillac as it were of GPS technology is now free to most people who want it, and obviously those markets are going to be expanding quite quickly, this is a very significant announcement. And you must feel some sense of pride about this Mike, because it’s what you’ve been aiming for for what? A quarter of a century?

Mike May:                           Yeah, I can’t believe it’s circled back, and here we are again, and I’ve got the opportunity to keep working on it and try to make it better, and to solicit feedback from other users to make it better. It’s really exciting.

In the AI discussion point, one of the things that I think is very useful is crowdsourcing. And this is something that we had back as far as the BrailleNote days, the user’s ability to record a point of interest wherever you want, and then to share that with the crowd, with the community. That’s in the GoodMaps Outdoors app, you have that ability to record points and share them. And if you want to take it a step further, what we don’t have yet is the idea of crowdsourcing where people walk. So if you’re in a casino, which are awfully often hideous places to try to get around for anybody, blind or sighted.

Jonathan Mosen:             Why would you pick a casino of all places?

Mike May:                           Because it’s so hard.

Jonathan Mosen:             Okay.

Mike May:                           But people walk in certain patterns.

Jonathan Mosen:             And they rob you blind too.

Mike May:                           Blind drunk. Maybe we can get into that terminology discussion. But if you can track where they are you can crowdsource routes, and that’s part of enriching and making indoor experiences even better. When you’re in a museum, if you want to know details about a particular exhibit you’re at, that’s already something that’s on our near term roadmap, to add that ability to punch a button and hear the MP3 file describing the exhibit.

Jonathan Mosen:             Is there any way that people can find out more information about this and keep in touch with what GoodMaps is doing? I take it that there will be options in the app to do that.

Mike May:                           Yes, absolutely, the various links will be there. Safest bet right now is to go to Goodmaps.com and there’ll be a link for Outdoors. It’ll take you to the page with the FAQs and a very comprehensive user manual. I know people don’t do those much anymore, but it’s a legacy from the past. And so lots of information will be on that Outdoors link within the GoodMaps page.

Jonathan Mosen:             Well, I really appreciate you both coming on the podcast and letting listeners know about this. It’s going to be a significant talking point in the blind community. And thank you both, not just for your time, but your pragmatism and your ongoing work to make life better for us. So it’s been a great opportunity to talk with you both.

Mike May:                           Thanks, Jonathan. You’re always at the forefront of these things, so glad to share the stage with you.

Troy Otillio:                         Likewise, it’s always a privilege to talk with industry veterans like you Jonathan, and Mike. And I’m humbled often to be in your presence, but it’s even more exciting to be working closely with GoodMaps. And I do look forward to ongoing partnership, not just a one-time trading of an enhanced app. And it’s going to be exciting to see what even comes next.

Speaker 1:                           Like the show? Then why not like it on Facebook too. Get upcoming show announcements, useful links, and a bit of conversation. Head on over now to facebook.com/mosenatlarge. That’s facebook.com/ M O S E N at large, to stay connected between episodes.

Randy Shelton:                  Hi Jonathan, this is Randy Shelton, and I wanted to comment on iOS 15. I am very impressed with this update, I think it’s the best we’ve had from Apple in quite a while. 14 wasn’t too bad, but this one is fantastic.

I think the features… The two features I enjoy the most are the ability to put your notifications in a summary, and Focus. I’ve been waiting for both of those quite a while and it’s so nice to be able to schedule a time to browse my notifications. It’s nice to be able to talk on the phone or read a book and not be overwhelmed by notifications when I’m done.

I also really like the FaceTime features, I haven’t had a chance to play with sending a link yet because I haven’t had a chance to talk with anybody who uses Android or their PC. Most of my family and friends are Apple users, but still it’s a great feature.

And I like the fact that when you’re in a FaceTime call, you get a sound now when someone enters or leaves, and it also shows you who has come and gone. Just a very nice setup. And I personally think that FaceTime is clearer now than it was.

Overall, it’s a very impressive update. I haven’t done a lot with Safari yet, I did take a look at it, and I don’t mind that the address bar’s on the bottom. I think I’m going to leave it that way for a while and then I may change it back. We’ll see.

As far as the Apple Watch goes, I really didn’t expect to be able to update my Series 3. I was shocked that they decided to continue that for another year. The update was very smooth for me. Last year, I had one heck of a time updating. I had to totally clean out the watch, reset it. It took hours for me to update. It was a mess, but this year, it was seamless. I was very impressed. And I still am considering getting a new one. I wasn’t going to last year. I said, “Oh, I’ll never buy another Apple Watch after the hassles of trying to update this one, it’s just not worth it.”

But I do use it, especially for my workouts, and I like getting the notifications on the watch too. And it just makes it so much more convenient when you’re doing a workout to be able to have the watch on your wrist, the phone across the room, and it all transfers over to the phone later.

So I probably will get a new watch later this year, early next, because I’m sure they won’t support the Series 3 after this year.

Christopher Wright:        Hey Jonathan, just a couple of thoughts on iOS 15. I’m running it on an iPhone 6S. It’s actually working quite well. The only gripe I have is the amount of storage it uses. And apparently since, I want to say iOS 13 or 14, iOS is using a crazy amount of storage, not only on the operating system but on what it categorizes as other data, which I guess is anything from downloaded voices to logs and a bunch of other things that it needs to have running.

But yeah, so on a 16 gigabyte phone, it’s using about seven gigs for the operating system and almost five gigs for apparently this other data. So it’s using 12 gigabytes of my storage and I have a little less than four gigs left, which is quite insane. I’m not sure how they justify that.

So I have two or three apps on there, and I don’t want to put too much more on there. So maybe it’s just time for me to get a new phone. But yeah, the storage consumption is ridiculous, and I’m not sure if there’s a reason for that or if it’s just buggy. But yeah, that’s been my gripe with it thus far.

The other minor thing is the interaction mode in VoiceOver, which is interesting, but it’s not consistent, in that it doesn’t behave the way that I would expect it to in all apps. And it’s a little confusing when you can touch inside of containers and then it traps your focus inside of those containers until you stop interacting. So I think it needs more polish.

But overall, it’s a decent release. I will be curious to see when… I believe in 15.1, they’re going to add the ability to play audio and video over FaceTime. I’d be curious to see how that works and if you can take advantage of that in other applications, like… I don’t know, Bard or Audible, that would be cool.

Oh, actually I did try FaceTime, and apparently the microphone modes and all the cool audio features of FaceTime are restricted to the iPhone XR and newer, which is a bit disappointing. And the audio wasn’t that great when I tested it on my computer, but okay, that’s cool that you can finally make links and you can have people join you in calls that are not Apple users.

Steven Jolly:                       Greetings Jonathan, and to everyone listening, I’m responding to a query from recent correspondent, Mary Anne. She encountered a problem which gave me trouble for some time a few months ago. It concerns the pesky speaking of every character even when punctuation level is set appropriately. The solution for me lay buried in the rotor, in the activities item. Many would be familiar with activities, a nice feature that came I think with iOS 13.

It allows the user to configure various VoiceOver variables, such as voice and speech rate, as preferences for certain environments, such as reading via a browser. These activities can be incorporated into the rotor. An activity which I think comes with iOS is Programming, for the benefit, I suspect, of developers working with application code.

It reads everything, including punctuation and special characters. What happened to me, and I suspect could be Mary Anne’s issue, is that the activities rotor item is selected and is inadvertently pointing to the aforementioned Programming activity. To resolve this, I suggest making sure the punctuation level is set appropriately, and then checking out the rotor. And if the rotor item activities is selected, make sure it’s not pointing to Programming.

And then you could even de-select or remove the activities item from the rotor if not interested in being able to use some other activity. So maybe give that approach a shot Mary Anne, and good luck. Best to all Steven Jolly, Melbourne, Australia.

Jonathan Mosen:             Thank you very much, Steven, that is indeed the magic trick. And I want to thank the many people who wrote in on this, others who left contributions included Aditia, James Odell, Ashley Malone, and Keith Renpaul, and quite a few others as well.

So thank you to all who took the time to answer that one. I don’t have activities on the rotor because I don’t really use them very much. I am likely to use them a bit more now that there are a few more things that matter to me, but I don’t often change the speech rate or punctuation, or some of those early things that were available in activities. So I didn’t have it on the rotor.

Of course now, if you want to be safe, you can add it to the quick settings and not the rotor. And you can, if you never use the programming activity, delete it, so you don’t get into that bind again.

I think though that what this illustrates is that there is a bit of an issue with the rotor randomly selecting certain things. And I’m not sure what determines when something random gets selected. So you can have your rotor set to navigate by characters or words, for example, and then you might be hooning around your home screen. And I understand why at that point actions get selected, because VoiceOver will say “actions available” if you’ve got this setting on, and you know you can flick up and down to choose from those actions.

But something then seems to happen when you’ve gone away from an environment where actions are no longer available. Your rotor doesn’t revert to what you had set it to before, like characters or words, it seems to end up in some random place. And I remember for the longest time, one of the problems I had with this was that it would randomly change languages. I’d flick down thinking, the last thing I set this rotor to was words, and suddenly I was changing language.

And so I think I fixed that by moving the language item somewhere else on the rotor. But this does seem to be a bit of a design issue with the rotor.

But thank you to all who wrote in about this. As soon as I got the first contribution, which was quite early after the podcast was published with the solution, I wrote to Mary Anne because I didn’t want to have her in agony for a week with her phone speaking all this punctuation. And she tried it and it worked, and she was really grateful as well.

Holger is writing in and says, “Hello Jonathan, using iPhone 12 Pro with iOS 15 and an Apple Watch Series 5 with watchOS 8, VO does not read the message when the screen is locked.” I think that’s the screen of the iPhone. “I know I got a message due to getting haptic feedback on my watch. When not using the watch, VO does read the message on the lock screen. I checked settings and messages, notifications and all settings that are supposed to be on are on. I do get all other notifications on the lock screen.”

Thanks for writing in Holger, certainly if your phone is locked, you should get a ping on your Apple Watch if the settings are set up by default. But yes, when you unlock the phone, I would expect the message to be there on your lock screen. I don’t know why this is happening. But perhaps others can comment on it if they’ve seen it. And maybe you can contact Apple support and see if you can log a bug.

Here’s Patrick’s email, which says, “Hi Jonathan, thanks for your efforts with your podcast. I’m in it for the tech bits, but good to get the rest, especially regarding advocacy stuff, that is very much me in my workplace.”

“I guess we all have to do this, even though we are not informed and definitely not trained. I write regarding the WALTR 2 app you mentioned, I cannot find the app in the app store. Have I picked this up right?”

Yes you have, but it’s not an app for your phone. It’s an app for your PC or your Mac, because the purpose of WALTR… It’s now called WALTR Pro by the way, they’ve upgraded from WALTR 2 to WALTR Pro. And WALTR is spelled W A L T R. The purpose of it is to replace iTunes on your PC.

So maybe there’s less of a need for it now on the Mac possibly, but the purpose of it is to copy material from a PC or a Mac to your iPhone. And the really cool thing about this is that when you copy Windows Explorer style or Finder style using WALTR 2, it automatically puts the material you copy in the right place.

So if you copy a whole bunch of music, it’ll appear in the music app. If you copy a video, it’ll appear in the appropriate app. So it’s pretty straightforward, but you do need to purchase it and download it for your computer. So to do that, you can go to softorino.com, which is the name of the company that produces this. That is spelled S O F T O R I N O, softorino.com. And that is the new WALTR Pro that is now available. So I hope that helps.

Kathaleen:                          Hello, I hope there isn’t too much background noise on this recording. I have three parrots here and there are some people talking in the next room, so maybe not ideal recording conditions.

But what it was was I wanted to delete some of my homepages or hide some of them, and I heard that it’s possible to do this in iOS 15. And I just received an iPhone 13 Pro Max yesterday, and I’ve not updated my old phone to iOS 15. So I had no idea how to do this.

My old phone was running iOS 13. It was an iPhone 7, so I could have updated, but I wanted to wait and just get the new phone and update then.

So I was on my home screen and I was putting it into edit mode, and I was hunting about trying to work out, how do I do this? And then at one point I was on the first page of the home screen, and when I touched the little control for the pages, the one that tells you to swipe up or down to go through the pages, VoiceOver said to me, “double tap to hide”. And I double tapped it and it showed me all these options of what pages were visible.

And I discovered that if I flicked my finger on, say, page five, say I wanted to delete page five or hide it, if I flicked my finger on it, it said delete. So I suppose you would double tap it then and it would delete it, but I just wanted to hide some of mine. So I just pressed each one and made sure it said hidden, the ones that I wanted to hide. But it wasn’t immediately obvious how to do that, and it was just sort of luck, I suppose, that I was on the first page when I tried.

So if you’re on any of the other pages, it doesn’t say double tap to hide when you’re scrolling through the pages. It just has that option on the first one for some reason. And I thought maybe it would be something that other people had come across and wanted to know how to do.

Jonathan Mosen:             See that is news you can use from Kathaleen. Thank you so much for that because it may well be one of those features that people have missed if they have turned their hints off. What I always do when I get a new version of iOS is turn the hints back on for a while. Just in case there are any changes because when you turn hints off, you think, I’ve got this. I’ve been using iPhone for a long time. The hints are just verbiage. They’re just distraction and I get that. But every so often when you get a new version of iOS, something new has come along and that’s a really good little tip. But most important of all, much more important than any of that. Do those parrots talk? It’s the male ones that talk, isn’t it? I think that’s right.

Isn’t it the male ones that talk or is that budgies? I don’t know. I don’t know a lot about birds. I have a friend who knows everything about birds. I know very little, but if you could make one of those parrots say “Mosen at Large is great,” or something. Can you train it and then get that recorded? I would really appreciate that, a parrot endorsement for Mosen at Large, my life will be complete. This one is a bit of a developing story, and I’m going to read one of a number of emails I’ve received on this. This is the most comprehensive and it comes from Robert Kinget.

He says, “I’m writing to you for advocacy guidance and to alert your listeners as well. On September the 29th, I asked Siri to check my email, an action it had no problem doing previously. This time, the assistant said it could not do this. Perplexed, I tried it on a device running iOS 14. The same result occurred. Siri’s actions were drastically sliced in half. It could no longer read text messages, check voicemail and more. I did some digging after I encountered this only to discover many blind people having the same issue. I tried contacting Apple’s accessibility department. I asked if its servers were getting an upgrade or a migration. They told me as I’m sure they’ve told others that those functions were purposefully disabled server wide. I could get no explanation beyond that. Other than the representative directing me to the VoiceOver guide on the web. I am friends with many blind seniors and young people that use Siri as an efficiency tool in the Apple ecosystem. What other advocacy steps would you suggest?” asks Robert.

Thanks very much for getting in touch Robert. Good to hear from you again, and also thanks to Rebecca and Marisa who sent me the MacRumors article, which I actually have read. Isn’t it interesting how so many of these Apple related websites just copy off one another? It’s quite hilarious. And the MacRumors’ angle that was taken here was very interesting. MacRumors chose to treat this as a blindness story. Now, I find that incredibly ironic when you consider some of the really serious VoiceOver bugs that we have had to contend with over the years, without any kind of help or comment from the Apple tech press. And this one, which is actually in my view, a mainstream story gets touted as a blindness story. And they’re going on about how blind people depend on Siri. And it makes me think that they don’t understand the difference between Siri and VoiceOver. A lot of people that I talk to… I get an Uber driver or something and they say, “How are you using your phone? Oh you must be using Siri.”

They don’t know about VoiceOver and what it does, but you would expect a site like MacRumors to get it. The reason why I make this point is because I think if this is some sort of permanent change, then we are most likely to get good results if we bear in mind that all sorts of people use Siri. And this was borne out in the comments on the MacRumors story, actually, where there were people who said, hang on, this isn’t just a blindness issue. I use Siri to get various things that now are being disabled. I have to say, I can confirm the email thing having gone. Mine is still reading text messages. I can’t comment either way about the voicemail. I’ve got no reason to disbelieve anybody. Here in New Zealand we don’t have Visual Voicemail. No carrier here uses Apple’s version of Visual Voicemail.

But nevertheless, there appears no doubt that Siri’s functionality has degraded and that it is server side because it’s also affecting iOS 14. It’s not just an iOS 15 change. Now it took some days for Apple’s PR people to respond to MacRumors, and finally they did, and essentially said, “We’re aware of the issue.” What you got Robert is a bit more detailed than what they got. The implication of “We’re aware of the issue,” suggests that it was some sort of accidental thing or planned maintenance, as you were suggesting, Robert. If this is in fact permanent as the individual from Apple accessibility has told you, then we’re dealing with something completely different. And I think it’s important that Apple come clean and tell us exactly what they’re doing. Is this a bug? Do they intend to have it fixed? Or is there some sort of issue that has meant that they have now disabled these features and that they consider those features to be disabled permanently?

Because if that’s the case, I think Apple owes everybody, not just blind people, but everyone who uses Siri a damn good explanation as to why they’ve done what they’ve done. And if we can have more facts, then we know whether there should be any kind of advocacy effort, like a petition or something of that nature. And let’s keep in mind that this is a wider issue than blind people. Sometimes I think it’s easy for Apple to ignore the blind community, we’re such a tiny subset of their user base. And while it is true that probably a tiny minority of Siri users go into this level of detail of complexity with their Siri, I would say that a lot more than just blind people are using it. And that gives us more hope of getting a resolution. So I think we need a few more facts before we know how to deal with this one.

Speaker 3:                           (singing)

Jonathan Mosen:             Are you ready? Are you ready? Ooh. And now it’s time for more Adventures in Android.

Yes. The last Adventures in Android really were incorporated into the interview that I did on last week’s Mosen at Large with the famous Ed Green. But I do have some other things to tell you. I am really enjoying having a play with this Android device. As I mentioned in that interview, I am finding it really viable. Very interesting. I have installed 1Password on my Samsung Galaxy S21 running Android 11. For those not familiar, we have covered 1Password on past episodes of Mosen at Large. This is the password manager. It not only stores passwords, it stores credit card information, personal notes, and it’s cross platform. So I have 1Password on all my devices. It’s on the Chromebook, it’s on Windows, it’s on the Mac, it’s on iOS. And now it’s on this Android as well. So the cool thing is that when you save a password there, it is on all of these devices. Setting up 1Password was pretty straightforward.

You scan a QR code. You can also type in a very long code, which does remind me to mention to you that I have got my Bluetooth keyboard set up, a Logitech Bluetooth keyboard set up with this S21. There’s a very rich range of Bluetooth commands available. And I did allude to that in my chat with Ed last week. For the most part, 1Password is working really well and doing what is intended, because I don’t really remember any of my passwords anymore. They are unique for each site and they’re between 10 and 24 characters long. So that’s really essential to get 1Password up and running for me. There are some apps where it doesn’t seem to work and it may be ignorance on my part, but the Twitter app appears to be one of those places where I had to do a bit of messing around and copying the password to the clipboard and pasting it in.

So I’m not sure whether it’s as robust as it is in iOS, where you can pretty much get 1Password to come up everywhere. I think the people who make 1Password, AgileBits, are very steeped in the Apple ecosystem, so that could explain it, but we’ll see how we go. It’s early days and I’m still learning. And I have to say, I haven’t had as much time to play with this device this week because of work commitments. Speaking of Twitter, I don’t like the official Twitter app on Android. It suffers from this thing that I was talking about with LinkedIn on iOS, where you swipe through your tweets and there’s another button related to each tweet that you have to swipe through. So essentially each tweet is taking two swipes. There may be a way to reduce that, and if any expert Android user has the magic secret, I would be interested in that.

I did go through accessibility settings and various other things where I thought it might live, but wasn’t successful in finding anything to stop each tweet taking two swipes. When I last had a look at Android, I know that Tweetings was a very popular third party Twitter client on that platform. And I will install Tweetings and give that a look because I don’t use the official Twitter app on iOS, either. My big beef, as I have said several times is that I like to be able to return to my place and work my way up when I go back into Twitter. I have a list of priority tweets that I never like to miss. And with Twitterrific I can start there and continue to work my way up and it’s no fuss. I can just continue to read my Twitter stream whenever I get a chance.

If anyone knows of a client in Android where you can do that, where you can quit the app, come back after a few hours and return exactly to where you were before and work your way up sequentially, I would be really interested. But I know that Tweetings comes highly recommended so I will get back to that. One of the things that was mentioned in last week’s interview with Ed and it’s a big selling point of Android, we have talked about this before on the show, is that when you find a situation that you wish your phone behaved differently, there’s normally a way to get your phone to behave in that different way. And one example of this is that I wanted my Galaxy S21 side button to activate the Google Assistant by holding it down in the same way that you do on your iPhone for Siri.

Now you can have Samsung’s Bixby assistant do this, but there didn’t seem to be a way built into the phone that let you do the same for the Google Assistant. There are other ways, I mean, there are gestures you can do, but I wanted this on the side button. And I found that by fossicking around on Google and then going to the Play store I was able to install a little utility that allowed me to map the side button to do exactly this. So now when I double tap the side button, it actually launches the Google Assistant and I can speak a query to it. That said, a couple of things to note. First, I did need sighted assistance to get that utility up and running; it wasn’t accessible to get through the initial setup stages. And second, if I were to switch to the Commentary Screen Reader, I know that that function is actually built into CSR.

So I am interested in having a good play with this. And speaking of speaking, one of the things I found frustrating about this whole Android experience is that when I invoke the Google Assistant, TalkBack is chatting away while I’m trying to talk to the Google Assistant. And I know that you can deal with this by turning the volume down a little bit on the screen reader and things like that. But I do find it extraordinary that after all these years, TalkBack hasn’t found a way to mute speech while the Google Assistant is listening for input. I would put this one in the bleeding obvious category and it surprises me that we are still dealing with that. But as I keep being told by many people about a lot of things lately, it is what it is. Now, I know that there’ll be some people who’ve never used an Android device in their lives, and there’ll be other people who know a lot more about this than me.

It’s their daily device and they’re kind of following along in my journey. So I’m trying to meet both audiences. Let me say a couple of things for people who have never touched an Android device. One of the things that I really do appreciate about the QWERTY keyboard, which at least comes by default on my Samsung device, is that the keyboard has a number row. You’ll be amazed how much time that saves you, just not having to do what you have to do on an iPhone’s virtual keyboard: go into the numbers option, type your number and then switch back to the letters. The number row is just right there, at least on the keyboard that is on my Galaxy S21. So I certainly appreciate that. And while we are on the subject of keyboards, I do want to say a big thank you to Satra Indra who let me know that you can calibrate the Braille keyboard.

If you are rocking Braille screen input on your iThing, you’ll know that you can press dots 4, 5, 6, and 1, 2, 3 to essentially align the keyboard with where your fingers are resting. And then you can Braille at a good clip and it’s very reliable. I was trying to find a way of doing this on the TalkBack built in Braille keyboard, and couldn’t find how you did it, but Satra told me that what you do is you rest all six fingers on the screen, just hold them there for a while, and you’ll hear a couple of tones and then you’ll get the confirmation that the keyboard has understood what you’re doing. And the dots are all calibrated. You’re good to go. It makes a big difference actually, in terms of reliability and just not having to be quite so precise about where you are positioning your fingers. The fact that you can press all six dots at the same time is a win for Android.

You cannot do that on an iPhone. I mentioned briefly with Ed last week that when you’re using TalkBack, it’s a feature of that screen reader that when you’re in edit fields, you can use the volume controls to navigate around. In terms of what the screen reader speaks, it works the same way as Windows does. So when you switch to iOS or Mac, many people do get confused by what VoiceOver chooses to speak and when, as you navigate around. And we did a whole section on this, trying to demystify it and explain it, and you do get used to it. It’s second nature for me now. But when you use Android, you will find that as you navigate around, it feels like Windows. And if you like the way that Windows screen readers speak what’s under the cursor, you’ll be right at home when you use Android.

But where I’m going with this in the context of Braille screen input is that you can still use those volume buttons while the Braille keyboard is active. That is a big win for Android. Now, here’s something that I have talked to a couple of people about that nobody else seems to be experiencing; this is unique to the Mosen variant of the Samsung Galaxy S21, apparently. Every so often when I push the little button at the bottom of the screen to bring up my list of available keyboards, what you get is a series of radio buttons, and you choose the radio button for the keyboard you want. That’s all very well and good, but every so often the radio buttons don’t speak a text label. They just say radio button and they’re silent. And you have to go from memory as to what the keyboard is that you select.

This actually happened to me the very first time I changed to the Braille keyboard. And the only way I was able to deal with it was to audition each one, to double tap each radio button and find out what it did. That was not a good experience. Since then it’s probably happened one or two times out of several dozen times. So it’s an intermittent thing and I haven’t yet found what causes it. Another thing that somebody who has never used an Android device may not appreciate is the seamless integration that exists between the Play store on your PC and the Play store on Android. And what I mean by that is that I can go to play.google.com, where I’m logged in on my Google account, and I can search for an app or something else. And when it comes up, it actually tells me what devices I own that are compatible with this app.

This is actually proving to be very interesting because I also do have a Chromebook, and one day I am actually going to get around to playing you those reviews that I’ve already recorded of the Chromebook, because I think ChromeOS is an interesting platform. It certainly becomes even more interesting and viable if you are in the Google ecosystem. In the same way that if you have an iPhone and you have a Mac, there are advantages, there are compatibilities, there are synergies. The same is true with ChromeOS. So if you go and search for an app on the Play store on your PC, it’ll tell you which of your devices it’s compatible with. And then you can go ahead and install, right from your PC, any app that you want on any device that you have. And if your device is on, it’s pretty instant. The app gets pushed to your phone.

If your device is not on, then the next time it’s on, if you have told the Play store site on your PC to install an app on a particular device, it will just go ahead and install it. When Windows 11 gets the Android support, this will also be really good to play with because presumably you will be able to install those apps that are compatible with the Android capability of Windows 11. So things are starting to get quite exciting in this space. Now this is definitely a little thing, but there is a song, it’s an old, old song, that says little things mean a lot. And this is just a little thing that I like, and you may not know about it if you’ve never used an Android device. There’s a button on your home screen. You can double tap that and you can go into your recent list of apps.

This is the equivalent in Android of the iOS app switcher. And when you go into this list of recent apps, there is a close all button. Now there used to be, a very long time ago, a close all button in iOS. I can’t remember when they took it away. Maybe it was iOS 5 or thereabouts, but it was certainly a long time ago that they got rid of the close all button. And if you listen to some Apple people, they say there’s no need to go and close apps from the app switcher. Fear not, they say; Apple has this wonderful memory management thing that means you shouldn’t need to do it. Well, that may well be the case but in my experience, when you close apps, you definitely save battery life. There are some apps that are not behaving well, doing things in the background.

And I find that closing them really can help get a lot of battery life out of your day. So I routinely close the apps that I’m not using anymore. And it’s a fairly time consuming process because you’ve got to close each one individually. On Android, just double tap that close all apps button and they are all closed. Another thing that I’m finding really good is the refund policy in the Play store. It gives me the freedom to try a range of apps and just decide, whether it be for accessibility reasons or just because I don’t like the app, that I can return it as long as I’m reasonably quick about it. The app just disappears from my home screen and I get a full refund. It’s a pretty no questions asked kind of policy really. Much less of a bother than trying to get a refund for an iOS app.

A couple of summers ago, I was looking to improve the way that I track macronutrients. If you’ve been listening to the show for a while I have been on an incredible health kick over the last few years and feel heaps better as a result. So a couple of summers ago I auditioned a lot of apps on my iPhone and processing the refunds for those apps that just didn’t work for me or weren’t accessible was a very labor intensive manual process. And you always felt that Apple was doing you a favor refunding you your money. I mean, it’s my money damn it. But with the Android thing, it is just much more straightforward. I do now have a couple of USB C to 3.5 adapters, ironically enough, made by Apple. They’re the ones that I just happened to be able to get hold of.

So I can now run a cable from my hearing aids to my Galaxy S21. That does make things easier, but it’s nowhere near as easy as switching on my Made for iPhone hearing aids and just having them connect to my iPhone. So that’s a real consideration for me if I was ever to consider making this my primary device, which is certainly a lot more viable than it was, but it will never happen as long as Braille is in the state that it’s in on Android. On that subject, I have yet to spin up BRLTTY, which is a third party Braille screen reader that’s been around for a long time. It started life on Linux, it’s available on Windows, and it’s developed by a blind developer and has been a part of the open source community for a while. I don’t think it supports the Mantis.

I don’t know that for sure, but I do have the Focus 40 Blue. I did have a quick look at the command set. And one of the things that is slightly frustrating for me is that it sounds like it doesn’t support what I would consider to be the traditional command set. For example, dots 1, 2, 3 chord to get to the top of something dots 4, 5, 6 chord to get to the bottom of something. I noted that there is an L chord command that performs a specific function on BRLTTY that is not your traditional dot 1, 2, 3 chord command. Now it might be possible to remap all those functions, but I think if BRLTTY wants to go mainstream in the Braille world, then it would be good to have a command set that emulates those almost standard functions that you can find on pretty much any Braille device.

So that might not necessarily be a show stopper. You can get used to anything, but maybe it’s the fact that it’s a Mantis that I’m using. For now, I do still have the Focus 40 Blue fifth generation, so I can try and spin that up at some point, when I get around to having a look at BRLTTY. I also want to investigate the really cool technology that Samsung has, where you can use your phone with the PC, not just through the Your Phone app on Windows, which is also pretty good, but you can control your Samsung, have it display on your monitor, use all your keyboard commands that you’re used to if you’re using a Bluetooth keyboard, and really just run this thing from your PC. So there’s a lot still for me to explore, but those are my adventures in Android this week. I am certainly having fun with this thing and enjoying it and being impressed by it much more than I was expecting.

Speaker 1:                           Be the first to know what’s coming in the next episode of Mosen at Large. Opt into the Mosen media list and receive a brief email on what’s coming so you can get your contribution in ahead of the show. You can stop receiving emails anytime. To join, send a blank email to media-subscribe@mosen.org. That’s media dash subscribe at M-O-S-E-N dot org. Stay in the know with Mosen at Large.

Shaun:                                  Thank you for having Ed Green on the podcast. I hope I got his name correctly; he’s from Blind Android Users. It’s good to know where Android is now, to be able to compare it to where it has been in the past and make sure that we’re accurate. I still question how good Android would be for someone who is brand new to the touchscreen, although they do have the TalkBack tutorial, which is a huge help. And yes, we know Apple does not have such a thing. So, that could be a good help. But one of the things people tend to forget to ask is, are there people around me that actually know this device, whether it be a phone or Braille display or computer program, or whatever? Are there people around me that actually know it, and don’t mind helping? I had somebody recently buy a new device that is fairly new to the market and come to me and say, “Hey, I need some coaching on using this.”

Well, it’s not a device I’ve ever used and I didn’t know anyone else who used it. Luckily, it’s similar to a different device, so the audio documentation for that one would help at least with some of the fundamentals of the particular device she had. But my point remains that you need to make sure and ask, is there someone who can get me out of a jam if I need it? You don’t want to buy a piece of technology and find that, oh, if I’m not good at learning from a manual, now I have no one to learn from that’s not the manual. Also, for those who are transitioning between Android and iOS, for whatever reason, one of the biggest challenges that I noticed is that the two operating systems interpret gestures differently, in the sense that with iOS, the flicks can be very small and minute.

And with Android the swipes, and notice that TalkBack is very deliberate in calling them swipes, need to be a bit more broad. And especially if we’re talking about the gesture down and to the right to get the TalkBack menu, you really do need to almost feel like you’re conducting an orchestra. I’m doing it with my other hand right now, as I’m sitting here. And it does need to be exaggerated a bit more than you’re used to if you’re an iPhone user. So, that may help you if you are someone who ends up getting a phone that doesn’t have the multi-finger gestures. And certainly for the notification gesture, if you swipe down from the top of the screen with two fingers, that definitely does work. I’m not as sure about the other gestures that Ed mentioned, but the notification one definitely works. I’ve used that on my Android 7 unlocked phone and my BrailleNote Touch, which runs Android 8.

Jonathan Mosen:             That was Shaun. The beginning of his message got cut off when I received it. But I know it’s Shaun because I never forget a face. Interesting observations regarding the gestures. And I would say that this is a function of the hardware, not of the operating system. So in the years that I’ve used various Android devices in attempts to check it out, I’ve definitely seen a lot of variability, with the exception of the taps, which feel different to me in terms of their tolerance level on my S21. I will say that flicking left or right or swiping left and right, and iOS does tend to use those terms interchangeably, feels about the same on either device for me. So, that may vary depending on the hardware that you choose to buy.

This email comes from Andrew Walker who says, “Hello, Jonathan. It is with a little trepidation that I offer some observations about Android. It seems to me to be difficult to pass any comment without upsetting someone. I am both an Apple and an Android user, although most of the time my prime device has been an iPhone. I currently have an iPhone SE 2020, but also have a Motorola One Action, which I bought for 140 pounds last year. And this has recently received the Android 11 update. When I bought the Motorola, it looked like Android 11 would deliver those much wanted multi finger gestures. But as you have found, they are confined to Pixel and Samsung devices. Nonetheless, I have managed to use the right angle gestures reliably. So I am not too concerned about this. The right angle gestures seem to be a lot more forgiving than they were.”

And I would frame them more as curved gestures myself. There have been recent additions to TalkBack which allow a gesture to enable some voice commands for various accessibility features. It is possible even for devices running Android 10 to enable dictation in edit boxes by saying “type” followed by the desired text. Not as slick as the iPhone two-finger double-tap to start voice dictation, but it makes life a lot easier. Even on older devices, you can accomplish several tasks using these TalkBack voice commands, including changing granularity and some edit functions. Clearly I would have liked my Motorola to have had multi-finger gestures, but I have become used to these new features, which I find helpful. Even on this low end phone, the majority of my day-to-day tasks are easy to accomplish, for me at least.

As you have found, Lookout is a great app and is snappy, even on devices of low specifications. I cannot agree that Pixel and Samsung phones are always the right phones for people as suggested by one of your contributors. If that is his choice, then fine for him. Personally, if I was spending the money to buy a Pixel, then I would instead buy an iPhone. I have a Xiaomi phone as well, which runs Android 10. And to me, it is a good choice at a low price. These are of course my choices and I respect the choices others make, but I am wary of generalizations about which phones should be purchased over others. The starting point to me is what the phone is to be used for and whether the phone can do what is asked of it in a way the user finds acceptable.

Price does become a factor for many and it is possible to find Android phones for about a hundred pounds which will run Lookout and other apps. Clearly the better and faster the phone, the better the experience, so there may be trade-offs to be made in this regard. After listening to the most recent podcast, I have carried out an experiment. I have an Amazon Fire 8 Tablet, which does run on Android 9. I have unlocked it and installed the Google Play Store app on it. From there, I have installed the Lookout app on it and to my surprise, this works quite well. Now, bear in mind that the Amazon Fire 8 Tablet only has two gigabytes of RAM. And I would describe it as the slowest device I own. And the camera is beautiful. Even so, I have just been reading text off packages with it in my kitchen, and it seems to work well. Bear in mind that I bought this device on Amazon Prime Day for 40 pounds.

The great thing about these Fire tablets is that they run VoiceView as a screen reader, which supports, well, multi-finger gestures. Critical for me, it has the magic tap, two finger double-tap, for playing and pausing media. My next step was to install my favorite app for playing audio books, Smart Audio Book Reader. I put in a 512 gigabyte SD card and loaded it up with books. And I am now convinced that this will be my device for reading books at night in bed. Smart Audio Book Reader has a great feature on the sleep timer. The sleep timer only kicks in if the device is not moved, and if the sleep timer starts, the sleep function is indicated by a decreasing volume. A quick flick will reset the sleep timer. I am not suggesting that someone should use a Fire tablet of this specification as their main device, but it has a long battery life and stereo speakers.

Good enough for speech. I have also just installed Vocalizer TTS on the Fire tablet and it appears to run Dolphin EasyReader to play EPUBs and other text-based books very well. The VoiceView screen reader is a bit different, but no major difficulty. I tried to install TalkBack, but I could not get it to install. My main reason for trying these things was to see if it would be possible to turn a 40 pound device into a machine for basic reading tasks and playing media with multi-finger gestures. Well, my conclusion so far is yes, although I have only had a little time to try these things out. I really don’t know why I am so excited about this since I was only inspired to try it out after listening to your podcast, which included your interview with Ed Green. I knew that the Fire tablets support multi-finger gestures in VoiceView, and I wanted to see if standard Play Store apps would work.

I really don’t understand why multi-finger gestures are only available on some phones when they can be made to work on Android 9, forked or not. I will end my ramblings here. I hope that your reintroduction to Android continues to be better than anticipated. Thank you, Andrew. Good luck with your experiments because experiments are fun. Now let’s go to Germany and hear from Renee. And I know this because the email begins, “Hi Jonathan. This is Renee from Germany. I am a regular listener of your podcast and want to thank you for producing so many interesting episodes. Since I have been an Android user for quite a few years now, I was happy to hear that you gave it a try. I have used a OnePlus 8T for a year now. This phone also has a fingerprint sensor, which is located under the display, but after maybe five or six tries, I had no issues locating and using it.

It is located above the USB C port, which makes it quite easy to find. It provides tactile feedback too, just as the Samsung Galaxy S21 also provides tactile feedback when you touch it.” Renee continues. “Since you talked a lot about the missing multi-finger gestures on devices other than Samsung and Pixel, I thought I would give some information about the alternative screen reader to TalkBack, which is called Commentary Screen Reader. Of course, there might be users who will not give this a try because the latest version is only available from GitHub and not the Google Play Store. And it is made by a Chinese developer. So it could theoretically do unwanted things with your private data. For me, the many features plus responsiveness were more important. So I tried it and even paid for the premium version. After some weeks of using it, I decided to do the German localization, and I update it as soon as a new version is released by Lin, the app’s developer.

This screen reader provides multi-finger gestures on every phone running Android 11. It responds even faster than TalkBack, which was the main point for me to use it as my main screen reader. It allows users to assign gestures for almost anything, including starting any app which is installed on the phone. You can also assign these to hardware buttons, such as long press or double/triple press of volume up/down. It has built in OCR, captcha recognition, translation features for the currently focused element, and a built-in voice assistant which lets you create custom commands and also supports the Lua scripting language for creating additional functions. For example, the user is able to create an auto click function for any button on a screen, which can be mapped to any given gesture. Gestures can be assigned globally or per app. You can set up a second TTS which reads notifications and screen reader related messages.

There are many different sound themes available for the screen reader sounds, including several VoiceOver sound themes. This is not really important, but nice to have. Of course it also has some disadvantages. For example, it has no built-in Braille keyboard yet. You have to use third party apps like Advanced Braille Keyboard for that purpose. Mariam Mohsen, that’s spelled M-O-H-S-E-N, from the Blind Android podcast did a series of short demonstrations. Users who want to know more about Commentary Screen Reader could listen to them. Please let me know if you or other listeners have more questions related to Commentary Screen Reader. I am happy to answer them, even though there are still features I haven’t explored myself yet.” Thank you very much for this, Renee. I will include the link to Commentary Screen Reader in the show notes, since you have provided it.

And at some point I will definitely give this a try. This sounds like the most incredible mobile screen reader I’ve ever heard of. It really does sound impressive. It is a shame that the Braille input isn’t in there when there are so many other powerful features, but perhaps that will come later. It would also be good if he eventually gets to supporting Braille devices within the screen reader. If you could have a powerful screen reader like this, that had Braille built in, wow, that would be compelling, but it sounds pretty compelling anyway. So I appreciate the summary of all that it can do.

To Mexico we go for this email from Gera, who says “Jonathan, firstly, very awesome segment devoted to Android on episode 150. I wish this would have been available back in April of 2015 when, for reasons of economy, I had no choice but to switch from an iPhone 3GS with iOS 6.1.3, I think it was, to an Android Motorola Moto G2 running Android 5.0.2. Because I mistakenly came to Android from iOS thinking that everything worked the same in terms of apps and the OS itself, but on a cheaper phone, you can imagine my initial frustrations were endless.

Apps like TuneIn Radio, so accessible with iOS, were accessible enough with Android, though there were some unlabeled buttons to cope with. I don’t know if this changed over these last six years; hopefully it did. Not everything was frustrating though. I was able to take advantage of widgets and stereo recording way before iOS implemented them. Thus the Motorola technology for the price was definitely worth it. The problem, as mentioned above, was overall the general workings of Android and not having enough of an open mind or some sort of step-by-step podcast series to start learning it. In terms of apps I loved on Android:

I’d like to really recommend Amazing Audio Recorder, whose developer really takes accessibility seriously. Thus, it would be neat if you could have him on your show and also demo this recording app. The aspect of Amazing Audio Recorder I loved, apart from its accessibility and ease of use and its ability to use the Moto’s stereo recording capabilities, was that you could install a companion app, apart from the main app, with which you could assign gestures on the fly to pause and stop. A feature which the Audio Memos iOS app, as I recently discovered, has via the two finger double-tap, which pauses and resumes recording. Good luck in your Android journey and again, great podcasts.”

A few weeks ago, Debee Armstrong was asking about the slate and stylus and how you pick up your pace with that. We’ve got a couple of suggestions here, and Lena says, “The podcast was wonderful. I am a passionate slate and stylus user. Congratulations to Debee for practicing and developing a useful and fun skill. Yes, we can write as fast as sighted folks. We’ve tested this at our community college. We can draw good pictures too. I’ve been using a slate and stylus for a long time and I write really fast. It is truly my pen. Here are my suggestions. Do not sharpen the stylus. It may poke through the paper. When that happens, dots are unpleasant to read and do not hold up well. Try a variety of slates and styluses. Yes, that is an acceptable plural these days.” Well, I don’t like it, Lena. People have got no standards.

Anyway, continue. “Some people like a large handled stylus, others like the saddle shape, and some prefer the pin style. My favorite stylus is the smallest round handled wooden one, which has the flat spots so that the stylus is less likely to roll away. I like stili; I am a Latin lover.” Yeah, that’s better. “Slates come in a variety of sizes, shapes and materials. I prefer the metal slate with the pins up, six lines and 20 cells. Using paper in landscape orientation, I can write two columns on each page. To save time when writing Braille for myself, I use a lot of the contractions from Grade 3, all of the contractions from UEB and a few that I made up. I also omit the capital sign. I prepare the paper ahead of time so that moving the slate is faster. The holes are already there.

If I am writing something that only has to last a few years, I save time by using lighter white paper. I like 32 pound bright white paper from Staples or HP. Both brands hold the dots well and function well in the printer and typewriter. When writing something that will be used a lot, I use 50 to 60 pound bright white smooth cardstock. It is less expensive than official Braille paper. You may wonder why a totally blind person cares about the color. Two reasons: the bright white papers and card stocks tend to be very smooth, which gives a nice reading experience, and the dots show up better. If a sighted person is looking, the dots can even be photographed with an iPhone. When I want to illustrate something to a sighted person though, colored paper gives better contrast, especially if I am drawing pictures.

So Debee, keep practicing and try different slate, stylus and paper combinations.” Thank you, Lena. What a fantastic message. And to Moose Jaw, Canada, we go, where Kelly Sapergia says, “Hi, Jonathan, it was great hearing Debee Armstrong’s email regarding her experience with the slate and stylus. I’ve been using this excellent writing tool since I rediscovered it when I was 18. I was also taught how to use it when I was in Grade 2 or 3 at school. For whatever reason, I never used it during class, possibly because the teachers may have felt that it was quite slow when taking notes compared to the Perkins Brailler. My mom, however, kept encouraging me to use it, and would read articles about its many benefits from publications like Future Reflections, the NFB’s magazine for parents of blind children. I was stubborn though, and refused to use it as I felt it was indeed too slow to write with and didn’t need it because of all the technology I had at the time.

And so it stayed in a Ziploc bag in a cupboard in our living room, lonely and unloved, cue the sad violin music,” says Kelly, “though mom kept encouraging me to at least give it a try, telling me that my speed would increase the more I used it. One day when I was 18, I came across it while looking for something else. This time I decided to go for it. And after experimenting with it at our kitchen table, while writing on a blank recipe card mum gave me, I instantly realized what a useful device it really is. You could say I’m now making up for lost time. As a matter of fact, I still have and continue to use that first slate and stylus today. Slates and styluses have helped me out of a jam on more than one occasion. I remember trying various methods of taking notes while performing music at senior centers here in Moose Jaw.

I took my iPhone once, but was scared that I was going to drop it and found it rather awkward to type on. This was before I found out about Braille screen input, but I’m still reluctant to use it during performances. Next, I tried my Olympus LS-7 digital recorder that I’ve had since 2011. The recordings came out fine, but it was awkward going through an entire performance at a later time to find the name of a tune someone had requested. I then remembered the slate and stylus and started taking it with me in a clipboard that stores both my Braille song list and some blank index cards for the slate. The slate fits nicely in a pocket inside the clipboard’s cover, and the stylus is in my shirt pocket. Wow, what a difference! I currently have at least five slates, as I like having backups just in case, and numerous styluses. Three of my slates are pocket ones with six lines of 19 cells each, which is perfect for the above scenario.

The last one I bought is the Genesis Slate by APH, which also has five lines on the other side, making it possible to write in what’s known as Interline Braille on both sides of a card. They also have an interpoint slate, which I may get at some point. The other two are what are called standard slates, with four lines of 28 cells each, perfect for use with eight and a half by three and a half inch paper. My favorite stylus is what’s known as a safety stylus. The tip is effectively screwed into the stylus so that you normally can’t write with it. When you do want to write something, all you have to do is turn the tip until it comes out, at which point it can be reversed and put back into the stylus. When you’re done, you just follow the same procedure as before. This way, you don’t have to worry about poking yourself with it when it’s in a pocket.

A few things I like about the slate and stylus are: it’s portable and can be taken anywhere; it’s usually fairly cheap (the ones I purchase are around $9 to $15, though there are some more expensive ones); it’s much quieter than a Perkins Brailler; you can write on just about any kind of paper, not just Braille paper; and most importantly, batteries are not required. I’ll admit that I’m not a super fast writer with it, but I’ve noticed that my speed has begun to gradually increase the more I use it. It’s been a real lifesaver as well, as my Perkins Brailler has begun to act up, with the carriage lever, for example, refusing to move unless I press the space bar twice, or else not moving until I force it. All this is why I’m a firm believer in using the slate and stylus, like my mom was when I was growing up.

I encourage everyone to give it a try if you haven’t already. You never know when it will come in handy.” Thank you so much, Kelly, another great message on this. About a week and a bit ago, Microsoft had an event where they announced new devices, including Surfaces and a new Surface Book Studio, I think it’s called. Their product names are very confusing. And Rebecca has commented on this. She says, although the product line didn’t excite me, I’m impressed with Microsoft’s public commitment to accessibility, and some of the simple things introduced in their new tablet, i.e. a lanyard on the kickstand. I’m not giving up on Windows anytime soon because of its functionality and Microsoft’s philosophy.

This email is anonymous, and it says, “I have a friend who is very sensitive towards disability, and more so about my blindness. Although I’ve encouraged her curiosity, she always cries after asking me something or thinking about my blindness. Being a science enthusiast, I can’t help reacting when I see that kind of backward perception from that friend, who is a scientist. How should I deal with that situation? It is happening often, and it interferes with our relationship quite a lot. Once, when I confronted her and quite harshly reminded her that it’s hard for me to tolerate her perspective, she had a panic attack. Is this something I can better understand?” I have actually seen this once, when I was a kid and somebody just cried and cried and sort of hugged me and cried because I was blind, and the other time was a weird one.

And I’ll have to get Bonnie to remind me of all the details, but we were on holiday with the children, and this woman, who was one of the people who owned the place we were staying at, came and gave us something. I’m trying to remember what it was that she gave us. And she just said we were a lovely family and it was wonderful how we all sort of functioned, and she burst into tears. It was quite moving, in a way. I didn’t find it offensive. It was odd that she was just moved by two blind parents with four sighted children. But if you’ve got a colleague or a friend that you work with a lot who keeps bursting into tears about your blindness, I don’t know how you would deal with that. I can understand why it would make you uncomfortable. Obviously, you don’t want to be pitied. You are a scientist, obviously. You’re well educated. You’re well adapted to your blindness. You’re just getting on with your thing.

If she’s having a panic attack about you confronting her about the issue, then it sounds like we just need to be a little bit sensitive to, and compassionate about, any mental health issues or something like that that might be going on in her situation. I don’t know what to advise you on this, except to perhaps gently make the point that you’re getting on with life. You’re loving life. You don’t believe your blindness is anything to cry about, so nor should she, and maybe it will just come right over time. I don’t know. What do others think about this? What would you advise in a situation like this, where you’ve got a colleague who bursts into tears because you’re blind? I’d love to hear from you. So if you have any comments you want to contribute to the show, drop me an email, written down or with an audio attachment, to Jonathan, J-O-N-A-T-H-A-N, @mushroomfm.com. If you’d rather call in, use the listener number in the United States, 8-6-4-6-0-6-6-7-3-6.
