iSee - Getting the Most out of Your Apple Products from a Blind Person’s Perspective

By David Woodbridge

First Edition 2013

Table Of Contents

Chapter 1: Introduction

Chapter 2: Apple’s Accessible Product Lines

Chapter 3: Mac Accessibility Overview

Chapter 4: Getting Started with your Mac using VoiceOver

Chapter 5: Mac/VoiceOver Keyboard Commands and Gestures

Chapter 6: Shared Built-in Mac and iOS Apps

Chapter 7: My Favourite Mac App Store Apps

Chapter 8: My Favourite 3rd Party Mac apps

Chapter 9: Accessibility iOS Overview

Chapter 10: Getting Started with your iPhone, iPod touch or iPad using VoiceOver

Chapter 11: iOS/VoiceOver Bluetooth Keyboard Commands and Gestures

Chapter 12: What iOS device is best?

Chapter 13: My Favourite iOS Apps

Chapter 14: Hardware Bits and Pieces that I have found Useful

Chapter 15: Switching from Microsoft Windows to OS X

Chapter 16: Resources

Chapter 17: Bringing It All Together: My Family and Apple

Chapter 1: Introduction

This book came about because I have been writing about Apple accessibility since 2009 and have ended up with a collection of articles spanning the Apple product line, along with all of my audio demos.

Recently I thought, why not update these articles and put them into a book, both to share my tips with others and to make a place where I, and others, can go to find tips on getting the most out of our Apple devices.

But I guess the biggest reason for doing this is that Apple's gear just works for me out of the box for speech output, and using and maintaining all of these devices for myself and my family is only possible because it is accessible.

So it’s also a way of saying thanks to Apple for having this commitment to accessibility for all.

So here it is, and I hope you get some useful information out of reading this book.

A Little Bit About Me.

The following bio is what I give to anyone who asks, so please forgive the third person narrative:

David Woodbridge is a Senior Adaptive Technology Consultant at Vision Australia where he has worked since 1990. Over this time he has assisted people who are blind or vision impaired in their home, education, and work settings to take advantage of the benefits of using assistive technology.

In the last five years, he has also been involved with evaluating technology for use by people who are blind or vision impaired, covering both low and high tech equipment (from Microsoft, Google, Nokia, and Apple). David is also one of the key spokespersons for Vision Australia on technology.

David has been using the Apple platform since 2008 evaluating it for low vision and blind users covering desktop, mobile, apps/software, hardware, and the Apple TV.

He has been an Apple Ambassador for Apple Australia since 2009, working with a group of other Ambassadors/Apple Distinguished Educators (ADEs) covering the range of Apple’s accessibility solutions throughout Australia.

David has been involved with the beta testing of OS X with Apple US for Snow Leopard (10.6), Lion (10.7), Mountain Lion (10.8), and Mavericks (10.9).

He regularly reports accessibility-related issues across all of Apple’s product line to accessibility@apple.com, in particular for VoiceOver on Apple’s mobile platform (iOS), which has supported accessibility since the iPhone 3GS in 2009.

David produces a range of podcasts covering Apple and other technologies which are distributed on his own iSee podcast, Vision Australia AT Podcasts page, Applevis podcasts, and heard on the ACB (American Council for the Blind) Main Menu Technology show. In addition, David is also one of the editors on the http://www.applevis.com website.

He has a regular Talking Tech program which can be heard every Tuesday at 4:30 Eastern Summer Time on Vision Australia Radio Melbourne, and stories supporting the program can be accessed on his own blog at iSee - David Woodbridge Technology Blog.

He has also spoken on various radio stations concerning technology for blind or low vision including 2GB in Sydney, 2RPH in Sydney and 4RPH in Adelaide, and ABC Radio in Queensland.

David has presented at various conferences (including Spectronics in 2010 and 2012), conducted training workshops on the use of Apple Technologies (including Royal New Zealand Foundation for the Blind Learning about Apple Accessibility 2011, and the use of iPads with speech/Braille Tasmania 2013), and has been written up in a number of articles including “Putting the I back in I Devices” in November 2012 which was listed on Apple’s Hot News. At the Spectronics conference in 2010, David presented an unofficial launch of the iPad when it was first available in Australia.

David lost his sight when he was 8 years old and had to learn Braille. Since then, he completed high school, went to Sydney University where he received a Social Work degree, spent 4 years in drug and alcohol counselling, and moved into his current job.

As a person who is blind and a user of the technology that he recommends to others, David believes he is well situated to assess the strengths and shortcomings of the assistive technology he comes across in both his professional and personal life, in particular mainstream technology that is accessible.

Connect: Follow me on Twitter at @dwoodbridge, where I post about articles of interest in relation to Apple and other assistive technologies.

Subscribe to my Apple and Other Technologies podcast at http://davidwoodbr.podbean.com or via iTunes.

Subscribe to or access my blog: iSee - David Woodbridge Technology Blog.

Email me at: davidw9@me.com

Check out this news article on my use of Apple technology:

My Trip Through Time With Adaptive Technology

This section outlines my personal experience with adaptive technology: from when my sight deteriorated so that I had to start learning Braille while boarding at the Royal Institute for Deaf and Blind Children (RIDBC) at North Rocks (1972–1978); mainstreaming into Northmead High School (1979–1981); Sydney University for a Bachelor of Social Work (1982–1985); my first job as a telephone drug and alcohol counsellor at St Vincent’s Hospital Alcohol and Drug Information Service (ADIS) (1986–1990); complete loss of sight in 1988; and my current job at what was then the Royal Blind Society (RBS) of NSW (now Vision Australia) from 1990 to 2013.

The chapter is divided up into the above time periods: boarding school and high school, university, my first job as a drug and alcohol counsellor, my second and current job as a technology consultant, and a conclusion.

It’s all focussed on my trip through time with assistive technology, so there is no mention of the rest of my life in regard to personal details.

Some of the software and devices I talk about may not have arrived on the scene exactly in the year that I remember, so apologies in advance for this.

BOARDING SCHOOL AND HIGH SCHOOL

In 1972, my eyesight deteriorated to the point that I was no longer able to see the board in class, despite glasses. The decision was made to send me to boarding school at the Royal Institute for Deaf and Blind Children. During my time at boarding school at North Rocks in Sydney, and moving onto high school at Northmead High School in Sydney, I was introduced to a number of different types of technology, some of which I never thought I’d use when I got older.

The first thing I had to learn to use when I started at the school at North Rocks was the Perkins Brailler. This was, and still is, a manual Braille writer. All my classroom work was done on the Perkins and, luckily, my teacher could sight-read Braille. All the books I had to read for school were also in Braille.

I can remember in my second year, that I had to learn to touch type on a manual typewriter and asking my teacher, “what was the use of using the typewriter when I couldn’t see what I was typing?” I can’t remember her response now, but I’m certainly glad that I stuck with it. The odd thing is that once I learnt to touch type at North Rocks, I didn’t use this skill again until my third year at university.

For recreational reading, the boarding house had a number of Mark IV talking book players. These were something like a rather large wooden box with a speaker, on top of which you put a big metal 16-track cassette tape and listened to the talking book. A bit clunky, but it worked. Still, I started to ask why I couldn’t read directly what everybody else reads. Every time I heard the date when the talking book was recorded, which seemed a long time in the past to my young self, I felt sad that, apparently, being blind meant I always had to get things that were old and out of date.

There was also this amazing electronic games console which for the life of me I can’t remember the name of and which would be fantastic for children who are blind today. It was a rectangular box. On the top in the middle you had a numeric keypad exactly the same as our telephone keypads today. On each side of this keypad there were a few more buttons. On the left and right edges you had (for want of a better word) paddles that you could push forward or back in a track. There were about ten games that you could play on this console, but I can only remember a few now.

The games that stick in my memory were Tennis and Pigeon Shooting. In Tennis, wearing headphones, you heard the ball represented by a rising or falling tone, and with the paddle you had to match the tone of the paddle to that of the ball: of course, once the tones matched you got a point. In Pigeon Shooting, a voice would say a sequence of numbers and you had to press the last number in the sequence (like 1, 5, 9 or 4, 5, 6), and when you got it right you’d hear the gun go off. Just writing about this device reminds me how much I loved playing that thing. Perhaps readers of this will know the name and share it back to me.

The other two items that have stuck in my mind about my time at the boarding school were, firstly, the size of the encyclopaedia in the library, Braille volumes of which filled an entire wall. I’m probably exaggerating a bit, but I think there were over 100 volumes. Secondly, the Little Oxford Concise dictionary in Braille was 16 volumes; I would have hated to see what the full version would have been: probably a small forest.

In 1978 I was shown the Sonic Glasses, an electronic travel aid based on ultrasound waves, like some devices today such as the Mini Guide. While listening to the sound coming back to the glasses through little ear plugs, you could detect the distance of an object and get to know the composition of that object. I remember thinking that a glass window sounded very different to a brick wall. At the time I felt that having something constantly making noise in my ears would distract me from using my primary mobility aid, which was the white cane. Nevertheless, it was another neat bit of technology and I was pleased that people had thought to show it to me.

When I went to Northmead High School, I still had my trusty Perkins Brailler. Unfortunately for me, the teachers at the school could not read Braille (sight or otherwise). When I completed my classroom/homework, tests etc, I had to give the work to my Itinerant Support Teacher who would write over the Braille in print thereby allowing the classroom teacher to read and mark my work. I always felt somewhat frustrated that I never got my comments from the teacher at the same time as all the other students. Of course, all my textbooks were also in Braille and fairly cumbersome to cart around. As all my textbooks were done for the year, any additional reading was out of the question.

From about year 10 onwards at school, I wanted two things very much: to read print directly, and be able to give my work to people directly and get feedback straightaway, as everyone else did.

My first wish sort of came true in late year 10 when I was introduced to the Optacon at the Royal Blind Society at Enfield. Funnily enough, it wasn’t my school work which prompted this opportunity, but my wish to become a better sail boat crew person, and hence read books on sailing.

The Optacon was a device with a camera which you tracked along a line of print; it brought up the shape of the characters on a set of vibrating reeds, which you read through touch. I found learning the system quite challenging and a bit frustrating as I had forgotten to some extent the shape of the print letters and punctuation. Also, by this time I was quite happily using Grade II (contracted) Braille, and trying to work out what a word such as “one” was supposed to be was a pain. For those of you that don’t know, in Braille “one” is dot 5 and the letter o. Oddly, when I went from print to Braille, and then Braille to typing on a typewriter, I can still remember the male teacher at the time calling me an idiot because I couldn’t spell the word “one”.

When I was sixteen in 1980, I went to the Royal Blind Society (now known as Vision Australia) for one of those overall assessments that tries to determine your strengths and skills and from this works out what path you are most likely to take. My most likely path appeared to be working back on the farm doing farm things. Only problem was, my parents didn’t have a farm, I’d never been on a farm and I certainly didn’t want to do any “farm things”. I knew what I wanted to do, and it wasn’t anything to do with getting closer to nature. However, every time I brought up the thing I wanted to do (computer science), I was met with caring but negative comments. So I learnt to keep my mouth shut about my dream and wait: one day.

When I was finishing the Higher School Certificate (HSC) in 1981 and looking at what to do at university, I made the mistake, yet again, of opening my mouth and sharing my dream, with exactly the same results I had met previously. As a consequence I did Social Work rather than Computer Science.

It just goes to show you can’t thoroughly destroy a dream if it’s powerful enough, as I am now living my dream, not so much as a computer scientist but as a technologist, which in my book is pretty good, thank you very much.

Looking back to 1981, I’m not sure if the technology then would have been able to support me in doing computer science with respect to accessing the computer systems. But a little part of me still feels like I should have at least given it a go. However, when you’re only 17 and getting told by people who have your best interests at heart, it’s hard to argue. As someone once said (or maybe I’ve just made it up), “Sometimes it’s not the things that happen along the way that are important, but the fact you got there in the end”.

UNIVERSITY

By the time I got to university (1982), my eyesight had deteriorated to the point of complete blindness. I could only determine the difference between light and dark.

I commenced my four year stint at Sydney University in 1982 still holding my trusty Perkins Brailler (now 10 years old).

As I couldn’t really use the Perkins in lectures due to the noise of the Brailler, I used a four-track cassette tape recorder to record all my lectures and tutorials. I would then scuttle back to the library to translate what was on the tape into Braille on the Perkins: a very time consuming process. In these early years at uni, my life seemed to be split between going to lectures/tutorials and spending time in the library transcribing.

One good thing at least: all my textbooks were on cassette tapes, which I stored where I lived in cassette drawers. These tapes came from Student Services of the Royal Blind Society and I would not have been able to study if not for this service. Of course, I didn’t use this service for additional reading. For this I had several volunteer personal readers who used to spend quite a lot of time with me in the library reading documents out to me whilst I took notes and recorded the sessions.

The first of two fairly major technology related events that occurred in my third year at uni was commencing a one year computer science course at Macquarie University, specifically designed for blind or low vision students and run by Professor Ron Atchison. Four other blind and low vision participants and I attended lectures and computer labs on a weekly basis throughout the year.

Professor Atchison’s wife assisted me in the labs to learn computer programming and she was a tremendous help in assisting me to complete the course. At the end of the year I was the only one standing, as it were, and earned myself an A+ on course completion. I did feel like jumping into the mythical TARDIS, going back in time, and waving my result in front of those folks who said it couldn’t be done and that I didn’t have the aptitude for it. To this day, I really appreciate the time that Professor Atchison and his wife put into making my dream become a reality. I think the computer I used back then was an Apricot computer with an external Votrax serial synthesiser. I remember that every time I turned the synthesiser on it said “error 7”, and I never found out what that meant. If anyone knows, let me know. Whilst doing the course, I had the opportunity to use an IBM electronic golf ball typewriter that Professor Atchison had developed with speech output. The way I seem to remember it working was that you could correct any word on the line you were typing through speech feedback and then press the enter key to type out your line to the paper. Pretty exciting stuff at the time, till I arrived at the second significant event of the 1984 calendar year.

My second significant event came when I purchased my first computer: an Apple IIe with 64K of RAM, dual 128K floppy disk drives, a 9 pin dot matrix printer, and a very high speed modem racing along at 300bps. I have no idea what the screen was, but I guess a black and white 12-inch monitor which you could run in either 80 or 40 columns. To make it talk, the Apple was purchased with an Echo II synthesiser with TexTalker and a number of talking programs. My first talking program was Word Talk, a talking word processor with no spell check. To spell check my documents, I had to run a separate talking program which I purchased soon after called Sensible Speller. In those days, you could only run one program at a time, hence the jumping between Word Talk and Sensible Speller.

Armed with my accessible computer, I was now able to keep my notes on floppy, and write up my assignments and print them out to hand in. Unfortunately my first attempt at doing this sort of failed because when I handed the lecturer my print out, it was his unfortunate task to tell me that the pages were blank: the ink had run out. I remember him saying that this was probably the best excuse he had heard about not handing in an essay on time.

After I settled down with the computer, I got the “I want access to information” bug. My world had just opened up, and the days of accessing out of date content were potentially over. So I spent a weekend transcribing the Australia Post Code book from Braille, typing it into the Apple IIe: very odd, but I could look up any post code I wanted in a couple of seconds and do it electronically. I know it was already in Braille, but there you are: no comment.

One of my practicals in third year at uni was at a welfare agency where the referral database was on print cards and somewhat out of date. I set myself the task of updating their referral information and giving them a nice new shiny referral book. With the assistance of Word Talk I did indeed accomplish this task. When I think back on the limitations of the Word Talk program compared to what I use now, I still can’t quite believe that I managed to produce a professionally laid out referral book for the agency.

With the Apple IIe, I also got a modem. My time on the modem was mainly spent ringing up Bulletin Boards (BBS) and sending and receiving email. The only thing I liked about the modem was that when the phone rang, the ring tone of the phone attached to the modem sounded like a cute little cricket.

Oddly now in 2013, I can run the Apple IIe with the Echo II synthesiser in an emulator on my Mac Air with lots of talking programs: bit of a trip back in time. But no cricket.

The rest of the time at uni passed fairly uneventfully and most things were covered with the use of cassette tape textbooks, personal readers, the tape recorder, the Perkins Brailler, and of course the Apple IIe. When I finished uni I sold off all my storage cassette cabinets, which I think from memory could hold about 2000 tapes.

By the end of uni, I no longer had light perception; I was now completely blind, and all I now see is grey. I think it’s grey: I can’t really remember what colours look like anymore.

MY FIRST JOB AS A DRUG AND ALCOHOL COUNSELLOR

Yes, I still had the Perkins Brailler (now 13 years old), the tape recorder, and the Apple IIe, and these were extremely useful in doing my job.

The way I initially got access to the Alcohol and Drug Information Service (ADIS) computer database was not through the use of the Apple, but through some clever programming from Phillip (sorry, I can’t remember his last name) from the Garvan Institute next door, and the use of the DECTalk Classic synthesiser. This thing was quite large: 60cm by 30cm by 15cm. The dumb terminals that ADIS used, which consisted of a keyboard and a monitor linked via serial to the PDP-11 minicomputer upstairs, were patched into the DECTalk Classic. Phil then wrote some software that allowed me to review each line on the screen from line 1 to 24 and repeat each line if required. I soon got to memorise which line specific information was on in a database record. For example, line 5 was the telephone number of the agency I was looking at. I can’t quite remember how I did my database searching, but somehow it all worked. Of course I couldn’t review each line by word or character, but it gave me access, which at that time was all that counted.

I remember ringing up the Royal Blind Society and asking if there was any other way for me to get access to the information in the database. The answer came back: no. This inspired me to think outside the square and change that No to a Yes. I approached the Commonwealth Rehabilitation Service (CRS) to see if I could get my hands on some equipment which might assist me in achieving what I wanted.

CRS purchased two products for me. The first of these was another Apple IIe, which I used in quite an unexpected way to gain proper access to the work database. I went out and purchased another talking program called ProTerm, which was a telecommunications program. I then set up ProTerm to capture any data coming in through the serial port and save it onto a floppy. Luckily, by this time the floppies I was using were, I think, about 800K in capacity. I then instructed the PDP-11 to treat my Apple as a printer and print out (or dump) the entire database to my system. After this was done, it was just a quick job of loading the document into my word processor. I used the find function to find records in the database file. As this wasn’t live data, I had to update my copy of the database at least once a week to make sure I had all the current changes. Of course, being a young smarty pants, I couldn’t resist ringing the RBS back and telling them that I had solved the problem, thank you very much.

I also used the Apple IIe to print out my stat sheets for the day and any other information that my manager required.

The second bit of equipment that was very useful in the workplace was the Braille to print device which attached to my Perkins Brailler. This device attached to the bottom of the Perkins once the bottom cover of the Perkins was removed. As keys were depressed on the Perkins, this would cause springs to be pushed down and, with the aid of a bit of electronics, produce print characters which would then be sent off to a 9 pin dot matrix printer. The upshot of this was that anything I brailled on the Perkins I could have a print copy of to give to other people to read: this was extremely useful in the workplace.

One other little device which snuck into my ever increasing pile of assistive technology was a light probe. This device sounds a tone when a light source is detected. I used this useful little device to find out which line was in use or which line was ringing on my telephone. Before this, I just had to hit one of the 4 telephone line buttons until I got the line that was ringing: very hit and miss and not very efficient. Oddly enough, I now have a Light Detector app on my iPhone these days.

So between the Apple IIe, the DECTalk classic, the Braille to Print, and the light probe, I had all my job tasks covered.

I think it was at about this time that I began putting the pedal to the metal in moving towards becoming a technologist for adaptive technology for the blind or low vision. It was in 1990 that I was asked to apply for the job of Technology Resource Officer at the Royal Blind Society. This could have been to stop me calling and bugging them.

MY SECOND JOB: RBS AND NOW VISION AUSTRALIA AS A TECHNOLOGY CONSULTANT

Once I left ADIS (1990), the Apple IIes were consigned to the big computer room in the sky. There wasn’t much need for the Braille & Print, the DECTalk Classic or the light probe at RBS: but the Perkins still came in handy as a backup (now 17 years old).

I commenced at RBS in June 1990 as a Technology Resource Officer. This job was to assess, recommend, install, and give basic support to clients of RBS across the areas of home, education, and employment. I was finally in my dream job.

In order to do the job, it was necessary for us to evaluate, or should I say “play” with, the adaptive technology and relevant PC hardware, software, and peripherals of the time to best fit our clients’ needs. This is also true today. To enable me to carry out the job when I first started, I had a modified Toshiba laptop called the PC Plus, which was basically RAM and a 3.5 inch 720KB floppy disk drive running a talking program called Keysoft, which contained built-in applications such as a word processor, calculator, file manager, etc.

I also had a Braille & Speak, which was a little note taker with a Braille input keyboard and speech output, in which you created text files to store all your information. This was a very quick and easy device to use. If I still had it today, I would keep using it as it was just quick: turn it on, input a note, and turn it off (no mucking around).

My need for a laptop and note taker to enable me to do my job hasn’t changed to this day. From 1990, the laptops became more powerful and moved away from MS-DOS and up the Microsoft Windows tree. I now use both a Mac and a Windows laptop at work, with most of my research, podcasting, social media etc. being done on the Mac (including writing this Multi-Touch book, smile).

I have only been through 4 note takers, counting the Braille & Speak. The second was the Braille Lite, which was a Braille & Speak with a 40 cell refreshable Braille display, and which I only stopped using in 2003. The third was the PAC Mate with a 40 cell refreshable Braille display (2004), based on Windows Mobile and a screen reader, which I stopped using in 2009 when it was replaced by the iPhone 3GS and later the iPad in 2010 (updated versions of which I am still using today as my main note taking devices). Although, at the time of writing (2013), I now have a Braille Sense U2 which I use as a notetaker and as a Braille display for my iOS devices and Macs.

I should mention at this time, the Perkins Brailler was sadly laid to rest. After 20 years of loyal service, I donated the little fella back to the Royal Blind Society, to hopefully gain a second life with another young hopeful.

I had my first talking Nokia phone in 2002, and went through a number of different Nokia handsets up to 2009, at which time Apple introduced the iPhone 3GS with VoiceOver and my Nokia phone stopped being used. Up until the talking Nokia, landlines were my main form of communication: the normal keypad phones, and before that the dreaded rotary style telephones. With the rotary phone I had to count the holes manually to know what number I was dialling, which used to take a very long time.

I remember, in the week I started at the RBS, playing with the Macintosh SE running System 6.0.7 with OutSpoken, which was a screen reader for the Mac developed by Berkeley Systems in the US. Amazingly, it actually used the sound chip in the Mac itself for its synthesiser, unlike IBM compatible screen readers at the time. It was actually a great experience to use a Graphical User Interface (GUI) with a screen reader, which I didn’t get to do with Microsoft Windows 3.1 until several years later.

I actually purchased my own Macintosh LC 520 a few years later in 1993 for home, but it wasn’t until 2010 that I purchased my next Mac (an iMac), and subsequently a MacBook Pro and a MacBook Air.

In 1993, work purchased a Macintosh LC 475; in 1996, a Power Macintosh; and in 1998, a PowerBook. I seem to remember that these Macs were mainly used to demonstrate to low vision folks either CloseView (which came with the OS) or InLarge from Berkeley Systems. 2000 to 2005 was quiet on the Mac side of things. I believe OutSpoken stopped being developed after System 9, and it wasn’t until Apple itself introduced VoiceOver in OS X 10.4 (Tiger) that blind or low vision folks again had access to the OS, which I started using in 2005.
To support the help desk function at Vision Australia in assisting people using the Mac (which could now include screen reader users), a Mac mini was purchased. I’d have to say that the years between 2005 and 2009 were a bit thin on the ground (yes, again) as far as the uptake of the Mac was concerned amongst the blind or low vision community.

It wasn’t until 2009 when I was asked by Apple Australia to join a group of similarly minded folks in evaluating, supporting, and training in the use of Apple products, that my interest got captured by what Apple was doing in the Accessibility space, and I am still enthusiastic about Apple’s commitment 5 years later.

Also in 2009, Vision Australia obtained 10 Mac minis to support the technology trainers in various offices, and then later on quite a number of iPhones, iPods, iPads, and iPad minis. Of course, the uptake of the various iOS devices (iPhone, iPod touch and iPad) has been tremendous. In the last couple of years, interest in the Mac has very much increased as well.

My job in 2013 at Vision Australia is to help run the Adaptive Technology Help-desk, conduct workshops on adaptive technology, present at conferences, produce fact sheets, record and distribute podcasts, and evaluate equipment. I also present on a weekly technology radio program, “Talking Tech”, as part of Vision Australia radio. My direct client work is more when required to support other staff these days.

CONCLUSION

Oddly, I started using Apple products in 1984. I had a break from Apple after System 9, the last OS that OutSpoken supported, and started getting back into the Mac when I was first asked to support a person who was blind in using Mac OS X Tiger in 2005. I gave up my Nokia talking phone in 2009 for the iPhone 3GS with VoiceOver. Then, after this, came a gradual uptake of the other accessible Apple products such as the iPad and the Apple TV.

Interesting that I started off with Apple just under 30 years ago, and now I’m using Apple again. My motivation for writing this chapter concerning my experiences with adaptive technology was to see where technology has come from, where it is today, and perhaps to allow speculation on where it will go tomorrow.

Besides my constant wish to have accessibility mainstreamed (which Apple is doing, and dare I say it, Google and Microsoft as well), I also want to see more everyday devices made accessible. Hopefully I’ll be around to see it happen.

Ok, So What’s in this Book?

I know you can check out the contents, but here is a quick summary of what you will find in this book.

I won’t list them here, but each chapter has a number of sub-sections which you can work out from the chapter names (if not, read the contents smile).

I’ve also tried to make the chapters somewhat independent of each other so you can just go to the chapter that is of particular interest to you.

The main chapters are:

About this book.

Apple’s Accessible Product Line.

Mac Accessibility Overview.

Getting Started with your Mac Using VoiceOver.

Mac/VoiceOver Keyboard Commands and Gestures.

Shared Built-in Mac and iOS Apps.

My Favourite Mac App Store Apps.

My Favourite 3rd party Mac Apps.

Accessibility iOS Overview.

Getting Started with your iPhone, iPod touch or iPad Using VoiceOver.

iOS/VoiceOver Bluetooth Keyboard Commands and Gestures.

What iOS device is best?

My Favourite iOS Apps

Hardware bits and pieces that I have found useful.

Switching from Microsoft Windows to Mac OS X.

Resources.

Bringing It All Together: My Family and Apple.

By the way, in case you were wondering, I did indeed create this book using iBooks Author on the Mac using VoiceOver. Setting up the chapter and section structure was reasonably easy once I understood how the iBooks Author templates worked. Inserting text into the appropriate chapter, section, and the actual text content was more challenging, as each text area was not identified specifically by VoiceOver. One thing I did miss, which VoiceOver does very well in other applications on the Mac, is telling me when I have spelled a word incorrectly, either due to typos or my tendency to still spell words phonetically.

Adding audio content to the book was not achievable with VoiceOver; for this reason, I decided to go ahead with this text version of the book. When adding audio content becomes accessible, I will update the book to include my audio demos. I did get some sighted assistance for adding the movie of the Mac SE startup boot sound, images in some of the chapters, and of course my dreaded activity: proofreading.

Creating a free iTunes account to publish the book, and using iTunes Producer to actually publish the book to the iTunes Store, was also relatively painless with a few suggestions from my sighted conspirator (smile). Given all this, I still feel like I wrote and published the book myself.

Chapter 2: Apple’s Accessible Product Line

On the whole, most of the Apple product line is accessible. The following is a list of the products, each with a link to the Apple website for more information. I’ve snuck in the iPod classic, which is not truly accessible, but you can still certainly use it without sight.

Later chapters will go into the iPod touch, iPad, iPhone and the Mac in more detail; in the later sections of this chapter I’ll deal with the iPod shuffle, iPod nano, and the Apple TV.

iPod Shuffle

VoiceOver supported: it takes its speech files from the Mac or Windows PC.

iPod Shuffle web link

iPod nano

VoiceOver supported: it takes its speech files from the Mac or Windows PC.

iPod nano web link

iPod classic

OK, so as I said above, not strictly accessible (no speech). However (and yes, it sounds odd until you have tried it yourself), if you count the clicks when you are navigating the menus with the good old fashioned click wheel, you can indeed navigate and play your music, audio books, movies, and TV shows.

iPod classic web link

iPod touch

Full VoiceOver and other accessible options, including Bluetooth Braille and Bluetooth keyboard support.

iPod touch web link

iPad Air, iPad 2 and iPad mini

Full VoiceOver and other accessible options, including Bluetooth Braille and Bluetooth keyboard support.

iPad web link

iPhone 4s, 5c and 5s

Full VoiceOver and other accessible options, including Bluetooth Braille and Bluetooth keyboard support.

iPhone web link

Apple TV

VoiceOver supported, ability to navigate via Bluetooth keyboard. Low vision style options also available.

Apple TV web link

Mac: MacBook Air, MacBook Pro, MacBook Pro with Retina display, Mac mini, iMac, and Mac Pro

Full VoiceOver and other accessible options, including Bluetooth Braille and Bluetooth keyboard support.

Mac web link

iPod shuffle

Before I get into the ins and outs of the iPod shuffle, let me give you a physical description so you will know what it looks like.

Physical description

Top edge, from left to right: 3.5mm earphone/USB jack, mode button, and shuffle switch. Right edge: no controls. Bottom edge: no controls. Left edge: no controls. Back: clothing clip. Front: a raised round button, with the top of the circle being volume up, the bottom volume down, the left previous track and the right next track, and the indented middle of the circle being play/pause. Remember, there is no internal speaker.

Now on to the ins and outs as it were.

The iPod shuffle is a very cheap entry level audiobook reader. Oh and it plays music as well (smile).

The iPod shuffle is very small, has no screen, is controlled by physical buttons, and the titles of the music tracks or audio books are spoken out via text to speech.

You can easily start/stop your media playing, switch between sequential (one track after another in the correct order) or shuffle (random playing of tracks), switch between play lists, and check how much battery you have left. Oh and of course, turn it off.

Rather than just trying to dump all of your media onto the iPod shuffle via iTunes (which probably won’t fit: hint hint hint), in the Music and Books tabs you can selectively choose which items you want to put onto the iPod shuffle.

Whilst I’m talking about listening to music, audio books, and play lists: on the Mac, remember that you can use Add to iTunes as a Spoken Track from the Services menu in any application where you can highlight text, to convert the selected text into spoken audio files which are automatically added to the Spoken Text play list. This means that you can listen to any text info away from your Mac. The Services menu is located in the context menu, which VoiceOver users can bring up by pressing VO+Shift+M. You will find the play list selection table, where you can choose the Spoken Text play list, in iTunes within the Music tab.
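
For the more technically minded, a similar result can be scripted outside of iTunes with the Mac’s built-in “say” command, which can turn a text file into an audio file that you then add to iTunes yourself. This is only a minimal sketch of an alternative approach, not the Add to iTunes as a Spoken Track service itself, and the file names notes.txt and notes.aiff (and the Karen voice) are just examples:

    # A minimal sketch (Python) of turning a text file into spoken audio on a Mac.
    # It calls the built-in "say" command; notes.txt, notes.aiff and the Karen voice
    # are example values only - substitute your own file names and an installed voice.
    import subprocess

    def text_file_to_audio(text_path, audio_path, voice="Karen"):
        # -v picks the voice, -f reads the text from a file,
        # and -o writes the spoken result to an audio file instead of the speakers.
        subprocess.run(["say", "-v", voice, "-f", text_path, "-o", audio_path], check=True)

    if __name__ == "__main__":
        text_file_to_audio("notes.txt", "notes.aiff")

The resulting audio file can then be dragged into iTunes and synced to the iPod shuffle like any other track.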

Some of the reasons why I like using this snazzy little device are that it has a clip that can be attached to my clothing, it has physical controls which are easy to locate and use, it can be used as a USB stick, you can recharge the internal battery from your computer or a USB style wall charger (my iPhone charger), and it saves me getting more expensive items out of my bag when travelling on the train, for security peace of mind.

You will need to Enable Disk Use (USB stick type functionality) when you plug the iPod shuffle into iTunes on your Mac or Windows PC if you want to use it as a USB stick. Just go to your device in the source list and, in the Summary tab, tick or check Enable Disk Use.

One thing to think about here: if you are going to share the iPod shuffle as a USB stick between a Mac and a Windows PC, make sure it is formatted via Windows. If it is formatted Mac only, a Windows PC will not be able to copy files onto it, but you will still be able to use it as an iPod shuffle: i.e. copy music and audiobooks onto it. Even if you don’t own a Windows machine at home, you still may want to share files with other folks who do use Windows.

The format function is in the Summary tab in iTunes. Keep in mind that whilst you will be able to list all the files that you have copied over to the iPod shuffle as your USB stick, you will not, by default, be able to see anything else (the content synced via iTunes).

When you are just using the iPod shuffle as an iPod shuffle, as it were (i.e. you didn’t enable disk use), and you have it plugged into your Mac, you can just pull it physically straight out. However, when you have it enabled as a USB stick, you will have to eject it as you would any USB stick. This makes sure that all files are written to the device properly before you physically remove it from your Mac.
For those that may have forgotten the keyboard short-cut for the Eject command, it’s Command+E when you have selected the iPod shuffle on your desktop.
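
For those comfortable with a little scripting, the same safe eject can also be done from the command line with the diskutil tool. The sketch below is just an illustration; the volume name IPOD is a placeholder for whatever you have named your own iPod shuffle:

    # A minimal sketch (Python) of safely ejecting a mounted volume on a Mac.
    # "IPOD" is a placeholder volume name - use the name of your own iPod shuffle.
    import subprocess

    def eject_volume(volume_name):
        # diskutil eject unmounts the volume and ejects it, so it is safe to unplug.
        subprocess.run(["diskutil", "eject", f"/Volumes/{volume_name}"], check=True)

    if __name__ == "__main__":
        eject_volume("IPOD")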

Several things to keep in mind when using the iPod shuffle are that it is 2GB, you will need to wear headphones as it doesn’t have an internal speaker, the USB connection to a Mac or PC is via a very small 3.5mm connector cable (which is not used by any other Apple product, so don’t lose it), and the speech output that tells you what track you are playing etc. takes the synthesiser voice from the Mac or Windows PC.

Two things that I like to do if I don’t want to wear headphones with the iPod shuffle are to plug it into my treadmill speakers via the audio cable that came with my treadmill, or to use the direct play mode on my AQ Audio smart speakers, which I mention in Hardware Bits and Pieces that I have found Useful.

Again, speaking about not having to use headphones: if I want to check the battery status of the iPod shuffle and I have it plugged into my Mac, I can check the battery level on the device context menu in the source list in iTunes. Remember, to bring up the context menu for VoiceOver users, it’s VO+Shift+M when you’re on your device name in the iTunes source list.

Speaking of battery level, you should get about 15 hours of continuous listening pleasure out of the iPod shuffle.

I often recommend the iPod shuffle for use in schools for students to listen to audio books or content from Add to iTunes as a Spoken Track, as there is nothing else on the device (besides what has been put on it) for the student to get distracted by.

Can I just say here that putting content into audio format is not just for folks who may be blind, but for anyone who may have a print disability or prefers to listen to content rather than visually reading it.

I should just remind you here that the iPod shuffle cannot update its own software, which is possible with all of the iOS devices (iPod touch, iPad/iPad mini, and the iPhone); to update it you will need to connect it to iTunes on your Mac or PC and, in the Summary tab for the device, choose Check for Update.

To sum up, the iPod shuffle works quickly with its physical controls, and I find it to be very useful and handy, particularly when running on my treadmill at home and I don’t (for a change) want to use Zombies, Run! on my iPhone.
All in all, a great little device.

For audio orientation and use of this device - go to my podcast website http://davidwoodbr.podbean.com

iPod nano

Now for the ins and outs of the iPod nano.

The iPod nano is just a great all round device for listening to audio books and listening to music. Oddly enough, it reminds me of holding a little iPhone in my hand, as the controls are pretty much in the same place.

The version of VoiceOver on the iPod nano feels like using VoiceOver on the full iOS devices, and has the same gestures for navigating the device. In addition, because of the Home button, you can toggle VoiceOver by pressing the Home button 3 times (sound similar to any iOS devices that you may know? smile).

For low vision folks, you can also invert the colours on the screen to make things a bit easier to see. As with the iPod shuffle, rather than just trying to dump all of your media onto the iPod nano via iTunes, in the Music and Books tabs you can selectively choose which items you want to put onto the iPod nano.

Whilst I’m talking about listening to music and audio books: remember that you can use Add to iTunes as a Spoken Track from the Services menu in any application on your Mac where you can highlight text, to convert the selected text into spoken audio files which are automatically added to the Spoken Text play list. This means that you can listen to any text info away from your Mac. The Services menu is located in the context menu, which VoiceOver users can bring up by pressing VO+Shift+M. You will find the play list selection table, where you can choose the Spoken Text play list, in iTunes within the Music tab.

Some of the reasons why I like using this great device are that I can link it up to a Bluetooth speaker, listen to the built-in FM radio, use the pedometer, use it as a USB stick, and recharge the internal battery from my computer or a USB style wall charger (my iPhone charger), and it saves me getting more expensive items out of my bag when travelling on the train, for security peace of mind.

Since the iPod nano has the Lightning connector, I can use the same Lightning cables that I use for my iPhone or iPad/iPad mini: so I always have a spare cable and do not live in fear of losing a device specific cable, as is the case with the iPod shuffle.

You will need to Enable Disk Use (USB stick type functionality) when you plug the iPod nano into iTunes on your Mac or PC if you want to use it as a USB stick. Just go to your device in the source list and, in the Summary tab, tick or check Enable Disk Use.

One thing to think about here: if you are going to share the iPod nano as a USB stick between a Mac and a Windows PC, make sure it is formatted via Windows. If it is formatted Mac only, a Windows PC will not be able to copy files onto it, but you will still be able to use it as an iPod nano: i.e. copy music and audiobooks onto it. Even if you don’t own a Windows machine at home, you still may want to share files with other folks who do use Windows.

The format function is in the Summary tab in iTunes. Keep in mind that whilst you will be able to list all the files that you have copied over to the iPod nano as your USB stick, you will not, by default, be able to see anything else (the content synced via iTunes).

When you are just using the iPod nano as an iPod nano, as it were (i.e. you didn’t enable disk use), and you have it plugged into your Mac, you can just pull it physically straight out. However, when you have it enabled as a USB stick, you will have to eject it as you would any USB stick. This makes sure that all files are written to the device properly before you physically remove it from your Mac.
For those that may have forgotten the keyboard short-cut for the Eject command, it’s Command+E when you have selected the iPod nano on your desktop.

Two things that I like to do if I don’t want to wear headphones or use my various Bluetooth speakers with the iPod nano are to plug it into my treadmill speakers via the audio cable that came with my treadmill, or to use the direct play mode on my AQ Audio smart speakers, which I mention in Hardware Bits and Pieces That I Have Found Useful.

Again, speaking about not having to use headphones: if I want to check the battery status of the iPod nano and I have it plugged into my Mac, I can check the battery level on the device context menu in the source list in iTunes. Remember, to bring up the context menu for VoiceOver users, it’s VO+Shift+M when you’re on your device name in the iTunes source list.

Speaking of battery level, you should get about 15 hours of continuous listening pleasure out of the iPod nano.

Can I just say here that putting content into audio format is not just for folks who may be blind, but for anyone who may have a print disability or prefers to listen to content rather than visually reading it.

I should just remind you here that the iPod nano cannot update its own software, which is possible with all of the iOS devices (iPod touch, iPad/iPad mini, and the iPhone); to update it you will need to connect it to iTunes on your Mac or PC and, in the Summary tab for the device, choose Check for Update.

To sum up, the iPod nano works efficiently with the touch screen using VoiceOver, I can quickly toggle VoiceOver on and off by pressing the Home button 3 times (like on the other iOS devices), and I find it to be very useful and handy, particularly when running on my treadmill at home and I don’t (for a change) want to use Zombies, Run! on my iPhone.
All in all, a great device.

For audio orientation and use of this device - go to my podcast website http://davidwoodbr.podbean.com

iPod classic

Before I get into the reasons why you can still really use the iPod classic, let me give you a physical description so you will know what it looks like.

Physical description

Top edge: on the left, a 3.5mm headphone jack, and on the right, a rectangular power on/off button. Right edge: no controls. Bottom edge: 30 pin connector. Left edge: no controls. Back: no controls. Front: about the top half is taken up by the screen (and no, it’s not a touch screen; it’s just for looking), and then the very large and slightly raised click wheel with the select/play/pause button in the middle. The case of the iPod classic is plastic.

OK, so it’s not strictly an accessible iPod in the true sense of the word, but if you listen to my audio demo, you can indeed navigate the iPod classic via the click wheel (and yes, I know it sounds odd) by counting the clicks when navigating the menus: it does work.

If you enable disk use via iTunes on your Mac or PC for the iPod classic, you can use the 160GB hard drive as storage for all your other files.

In case you’re wondering how you check the battery level on the iPod classic since it doesn’t talk: when you plug it into iTunes, the device item in the source list gives you the current battery charge. Also, you’ll know what is on your iPod classic, as you can control what goes onto it in the way of music, movies, TV shows, and audio books.

For audio orientation and use of this device - go to my podcast website http://davidwoodbr.podbean.com

iPod touch and iPhone

As I will be covering the iPod touch and iPhone in more detail later on, let me just share these main points:

iPod touch wifi only with 16GB or 64GB.

iPhone 5c wifi/cellular 16/32GB, and iPhone 5s wifi/cellular 16/32/64GB.

The iPhone 5s currently has the fingerprint sensor.

iPhone is the only iOS device that actually vibrates.

Siri runs on all of these devices.

VoiceOver and the other accessibility options work as they do on the iPad.

This sounds a bit odd, but one of the reasons I like the iPhone is that the grill at the top of the phone (not present on the iPod touch) makes it just that little bit easier to drag my finger down from the grill and locate the status line when using VoiceOver.

Please read my chapters on getting started with your iPod touch, and getting started with your iPhone.

For audio orientation and use of these devices - go to my podcast website http://davidwoodbr.podbean.com

iPad

As I will be covering the iPad in more detail later on, let me just share these main points:

Currently the iPad 2, iPad Air, iPad mini, and iPad mini retina.

iPad 2 and iPad Air: 9.7 inch screens; iPad mini/retina mini: 7.9 inch screen.

iPad 2: wifi only; iPad Air and iPad mini/retina mini: wifi or wifi/cellular models.

iPad 2 16GB; iPad Air 16/32/64/128GB; iPad mini 16GB; and iPad mini retina 16/32/64/128GB. Siri is available on the iPad and iPad mini.

VoiceOver and the other accessibility options perform the same way as on the iPhone or iPod touch.

The multi-tasking gestures certainly speed things up with switching between running apps.

Please read my chapter on getting started with the iPad.

For audio orientation and use of these devices - go to my podcast website http://davidwoodbr.podbean.com

Apple TV

I haven’t got a specific chapter on the Apple TV; I more or less address using it throughout the book, so I will go into it a bit more here.

For those that are not sure what the Apple TV actually does: it is a box that you plug into your TV set, which is then connected to the internet via a wired network or Wi-Fi network. It plays content from your iTunes account (movies, TV shows, music etc.) over the internet, or from a local machine on the same Wi-Fi network that the Apple TV is connected to. There are also app icons on the home screen of the Apple TV that access their own content.

Besides the home screen icons, you have the menu at the top of the screen which you use to navigate to your Movies etc, access your computer (of course on the same Wi-Fi network), and access settings for the Apple TV.

With the introduction of iTunes Radio in Australia in 2014, Settings has been bumped off the menu and now is with the rest of the icons on the home screen unless you have it hidden.

The home screen icons can be moved around or hidden so that you don’t need to navigate certain icons that you don’t use.

As with iOS (iPhone, iPod touch, iPad) and OS X (Mac), you can set Parental Controls to restrict what items your children have access to and what content they can watch or listen to.

As I have two separate Apple IDs that I access (my own for work and the family’s Apple ID), the Apple TV allows me to switch between Apple IDs and play the content from either account.

If you need to sign out, add or switch to another Apple ID, go into Settings, iTunes Store, Apple IDs and make your selection: if you already have more than one Apple ID in this menu, you will find these at the bottom of the menu.

I love Apple TV, so much so that I have 4 in my house. I highly recommend you take the time to check it out; it is one of my favourite Apple devices. Oh, and remember, it is all accessible with VoiceOver.

Physical Description

The Apple TV is basically a small flat square box. On the back, you have your power socket, HDMI port, optical audio port (which I only use in the lounge room connected to my stereo system), and a network port.

The remote that comes with the Apple TV is very simple with a round raised button at the top with an indent in the middle. The top, bottom, left and right of the raised round button are your arrow keys (up, down, left, right), and the indented button is your Select button. Below this button are two buttons: left button is the Menu button, and the right button is the Play/Pause/Power button. All buttons are very tactile and easy to locate.

VoiceOver on Apple TV

As with all Apple products, you can toggle speech (VoiceOver) on when you first set up the Apple TV by pressing the Play/Pause/Power button on the remote 3 times: very similar to pressing the Home button on one of the iOS devices 3 times to toggle VoiceOver on or off.

If you want to increase the rate of the VoiceOver speech, go to Settings, General, Accessibility, Speech Rate (this toggles through slow, normal, fast, and very fast).

You can select a different voice (language) for VoiceOver to speak with. As I am in Australia, I went into Settings, General, Language, and chose the English (Australia) voice (Karen). This means that my Mac, iOS devices, and Apple TV are all speaking the same language, so to speak.

Because I have sighted children who don’t always want to hear VoiceOver talking, I have selected the Accessibility Short-Cut that allows me to toggle VoiceOver on or off via the Menu button. You can set this in Settings, General, Accessibility, Accessibility Short-Cut (an on or off toggle).
Another good thing about this option is that you can use it to quickly go back to the main menu when you’re deep in other menus. To toggle VoiceOver, just hold down the Menu button for about 2 seconds; VoiceOver will be the first option if you’re not in a sub-menu (otherwise the first option is Return to Main Menu, with VoiceOver being the second option, which you can get to by pressing the Down arrow on the remote), then press Play/Pause to turn VoiceOver off (repeat the steps to turn VoiceOver back on). If you want to go back to a previous menu, just press the Menu button once without holding it in.

While I’m sort of talking about using the Apple remote: you can also use it to play/pause music on your Mac, and to increase or decrease the system volume. Having both the Mac and the Apple TV respond to the remote at the same time can be a bit of a pest, particularly if you’re working on the Mac. If your Mac does this, you can disable the remote option on your Mac by going into System Preferences, Security and Privacy, General tab, Advanced, and disabling the remote control option.

The Apple TV can also be navigated, with VoiceOver, using a Bluetooth keyboard which is paired to your Apple TV in the normal way: i.e. the pairing is not VoiceOver specific. Pair the keyboard in Settings, General, Bluetooth. Tip: turn Bluetooth on in this menu so the Apple TV begins scanning for your Bluetooth keyboard. The arrow keys on the Bluetooth keyboard work as you would expect (performing the same actions as on the remote), and the Escape key on the keyboard acts as the Menu button on the remote. The greatest benefit of using the Bluetooth keyboard with the Apple TV is, of course, replacing the need to use the remote to navigate the on-screen keyboard, where you move with the arrow keys and select each letter with the Play/Pause button: doable, but a lot faster using a physical keyboard.

The Apple TV can AirPlay to other AirPlay devices (such as my AQ Audio smart speakers), and when I am listening to the cricket on my iPhone, I can AirPlay the audio to the Apple TV, which leaves VoiceOver speaking on the iPhone. This makes things a bit easier by separating the VoiceOver speech from the cricket audio.

The menu which lists your AirPlay speakers is in Settings, AirPlay, and the speakers are listed at the bottom of the menu. Like on the iPhone, when AirPlaying from the Apple TV to another AirPlay device, VoiceOver speech goes through the local speakers where the Apple TV is connected.

Styles for low vision, including contrast, text font etc., can be found along with other options in Settings, General, Accessibility, Styles. This is where you can set specific screen colours to make it easier for you to read the menus, etc.

For audio orientation and use of this device - go to my podcast website http://davidwoodbr.podbean.com

I have podcasts covering an Apple TV overview, using the Apple TV with a Bluetooth keyboard, and the Apple TV with AirPlay speakers and low vision options.

Mac

As I will be covering the Mac in more details later on, let me just share these main points:

The Mac mini is a good entry level computer if you don’t need to worry about a screen, as all you get when you buy a Mac mini is the Mac mini: no screen, mouse or keyboard. For VoiceOver users, just grab a Bluetooth keyboard and a Bluetooth Magic Trackpad. It is mains powered only, so it is not a portable solution.

I have both a MacBook Air and a MacBook Pro for different reasons. My MacBook Air, due to its light weight and great battery life, is the machine I use most of the time. For a bit more grunt for audio editing etc., I tend to use the MacBook Pro.

The iMac, with its 21 inch or 27 inch screen, is great as a family computer and for watching movies or TV shows on.

As I don’t do video editing etc., there is no need for me to have a Mac Pro.

Voice output (VoiceOver), voice input (voice dictation), and the other accessibility options work fine on all of the Macs.

Please see my chapter on getting started with your Mac. For audio orientation and use of these devices - go to my podcast website http://davidwoodbr.podbean.com

Chapter 3: Mac Accessibility Overview

Following is a list of the accessibility options found on the Mac.

The short-cut key Option+Command+F5 will bring up the Accessibility Options dialog, which includes: Enable Zoom using keyboard short-cuts, Enable Zoom using scroll gestures, Enable VoiceOver, Enable Sticky Keys, Enable Slow Keys, Enable Mouse Keys, Invert Display Colours, a Contrast adjuster (slider), Keyboard Short-Cuts, Preferences (which takes you to the System Preferences Accessibility panel), and the Done button.

The nice thing about this Accessibility Options panel is that it will speak out loud, whether VoiceOver is on or not, as you press the Tab or Shift+Tab keys to navigate the possible options.

Before getting into the list, a trick I always do on a Mac to see what accessibility options may be running
