A11YSD October 2023 Video Transcript
Speaker 1: Thank you.
Speaker 2: Thank you for joining us for our December Accessibility San Diego Inclusive Design Meetup. I’m Chris, this is Harris, and we are your hosts. Thanks, everybody, for coming out. It’s great to see the group thriving and growing, and thanks to our viewers in the home audience.
Speaker 2: So we’re about to kick off our lightning talks, and we’ve got an interesting range of topics for you tonight. Harris and I are both speaking too. I don’t know if that’s a plus or a minus, but there you go.
Speaker 2: Let’s see. Next slide. Yes, would you be my slidesman? Thanks. Go ahead. Okay, just a couple things to cover. If you know anybody who would be interested in sponsoring the group, please send them our way.
Speaker 2: If you’re interested in giving a lightning talk at a future meetup, we’d love to have you. We like to range from advanced topics to simple, dev, UX, business, marketing, anything really related to accessibility is fair game.
Speaker 2: What else? If you have an accessibility question, reach out to us. Anything else you’d like, you can catch us after the presentations. Or there’s all these different ways you could get in touch with us.
Speaker 2: a11ysd.com, I think, links to pretty much all the others. Let’s see. So we’d like to thank our sponsors first before we get started with the talks. So first, we’re going to have Brent say a few words.
Speaker 3: Hey there, my name is Brent Summers. I’m the director of marketing here at Blink. Blink is a UX research and design company. We work with other companies to make their products easier to use. If you’ve used an Amazon or Microsoft product, Facebook, the Oculus Go or Quest, or ordered a Starbucks on the mobile app, we helped with that.
Speaker 3: In 2008, we did something with Domino’s called the Pizza Tracker. You get the idea. Coming up in 2020, we are hosting a major conference in Seattle. There was a flyer in your seat. If you would like to attend that conference, it’s 65 sessions over three days.
Speaker 3: It’s a really big conference. It’ll be 600 people this year. It’s our eighth year doing that conference in Seattle. And I’m going to play just a quick video, which is a highlight reel of some people speaking about UX at last year’s conference.
Speaker 3: I thought it was on autoplay.
Speaker 4: UX matters because design is in everything that we do.
Speaker 5: User experience matters because great products and services can improve people’s lives. UX matters because it’s the connection to the human. How do we treat the consumer? How do we make them feel? The world can become a better place when we design things that are useful, that people can connect with, or connect to other people with.
Speaker 6: Technology is evolving so quickly. It’s a way for us to connect with people and to preserve the humanity in the products and experiences that we build. Making sure the design of every product and every experience is really relevant for users no matter where they are.
Speaker 7: Awesome, thank you for that, Brent. You did miss out on an opportunity to get a little Oprah on and say, everyone look under your seat, there’s a flyer.
Speaker 8: Um
Speaker 7: But yeah, I work for Deque. They’re one of our first sponsors. We just went beta with a product called Axe Pro. It’s built on top of the pretty popular framework Axe Core. It’s free to use right now, so please sign up and tell me how terrible my software is.
Speaker 7: That’d be awesome. We’re trying to find bugs and we’re trying to figure all that stuff out. So big thanks to Deque. Find all the Deque swag over at the table.
Speaker 2: And the company I’m with is Level Access. They’re another sponsor. We provide a full range of accessibility products and services. And we are still hiring aggressively. If you know anyone in roles, I think there’s sales, marketing, accessibility consultants, developers, anybody that is interested and passionate about accessibility and wants to get involved, come see me after the talks.
Speaker 2: And I’d love to get them connected with the company. Our next meetup, save the date: Thursday, February 13th. We haven’t planned anything yet, but that will be basically our one-year anniversary of the group.
Speaker 2: So it’s going to be something special. OK, and then we are about to bring up our first speaker. Here. Gina is a technical support engineer, junior developer at, is it, Sochi? Sorry, a freelance copywriter and a volunteer at Girls in Tech San Diego.
Speaker 2: She’s passionate about accessibility and empowering others through education and community development. And she blogs at codecopycoffee.com. Gina.
Speaker 4: Thank you so much. So the first thing I’d like to do is get a feel for the room. So how many of you are new developers? Okay. How about more seasoned, experienced developers? And where are my non-developers at?
Speaker 4: Awesome. So maybe I’m naive, but I’d like to think that everyone can get something out of this presentation. So for the new folks, I’m hoping that you get techniques for making your code more accessible.
Speaker 4: The more seasoned folks, hopefully a refresher on the basics. And you non-developers, an increased understanding and awareness of the challenges that certain users face when using the web, and what can be done about it. Because the more people who have this understanding and are talking about it, the more likely it is to change, and the more accessible a web we get for everybody.
Speaker 4: So what do I mean when I’m talking about accessibility? Usually when we talk about web accessibility, it’s in terms of the acronym POUR. Your content should be perceivable, operable, understandable, and robust.
Speaker 4: And really to put it more simply, the more users who are able to engage with your content, regardless of their ability, disability, or circumstances, the more accessible your content is. So how can you achieve that as a developer?
Speaker 4: My first tip is use semantic HTML, which is just a fancy way of saying use those HTML5 tags. And that’s important because screen readers use those tags in order to understand the layout of your web page and present it clearly and effectively to the folks using them.
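A minimal sketch of what this tip looks like in markup (this example is illustrative, not from the talk’s slides):

```html
<!-- Anonymous divs tell a screen reader nothing about page structure: -->
<div class="header">...</div>
<div class="nav">...</div>
<div class="content">...</div>

<!-- Semantic HTML5 tags expose the same regions as named landmarks: -->
<header>Site title and logo</header>
<nav>Primary navigation links</nav>
<main>The main content of the page</main>
<footer>Contact and copyright information</footer>
```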
Speaker 4: Another feature of screen readers is that they group all of the links on a page together. So that brings us to my second tip, which is to use descriptive text for links. So if you’re using a screen reader and you’re hearing, click this, click here, click me, you really have no idea what you’re doing when you click those links.
Speaker 4: So having text like click here to download a PDF or click here to go to the Smithsonian’s website makes it a lot easier for the folks using a screen reader to know what they’re getting into before they click a button.
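For example (hypothetical markup, not from the slides), compare a bare “click here” link with one whose text describes its destination:

```html
<!-- Heard out of context in a screen reader's links list,
     this is meaningless: -->
<a href="report.pdf">Click here</a>

<!-- Descriptive link text makes sense on its own: -->
<a href="report.pdf">Download the annual report (PDF)</a>
<a href="https://www.si.edu/">Visit the Smithsonian's website</a>
```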
Speaker 4: Tip number three is to make sure that your content is able to be navigated and interacted with effectively using only a keyboard. So that’s for those folks who have either temporary or not so temporary mobility issues.
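One common pitfall behind this tip, sketched in hypothetical markup: a div with a click handler is invisible to the Tab key, while a native button is keyboard-operable for free:

```html
<!-- Not focusable with Tab, and Enter/Space do nothing: -->
<div onclick="save()">Save</div>

<!-- Focusable, and activates with Enter and Space, no extra code: -->
<button type="button" onclick="save()">Save</button>
```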
Speaker 4: You can see the code for the form and what it looks like on the page up here. Tip number four is to provide captions or transcripts for audio and video, and make sure they make sense. So this is pretty intuitive, right?
Speaker 4: If you have some audio or some video, you should add captions. However, if you’ve never watched a YouTube video with the audio off, using the built-in YouTube captions, I encourage you to go home and do that for no more than five minutes, because you’ll wanna throw your computer out a window.
Speaker 4: They stink for the most part. Speech-to-text has a long way to go, especially if the presenter has an accent. My partner has a British accent, and speech-to-text is not great for him. Also, there is Able Player, which you can download.
Speaker 4: It’s a fully accessible cross browser media player with keyboard controls. And it’s also open source and it supports a variety of languages. So if there’s a language that you speak that is not supported yet, get in there, help them out.
Speaker 4: Tip number five is to minimize distractions such as popups and animation and use clear consistent UI. So we’ve all been to those websites where there’s a million popups and then this flashing out over here and then something down at the bottom that’s moving.
Speaker 4: And that’s gonna make it really difficult for folks with cognitive disabilities to follow and understand your content. And it also stinks for people like me, whose eyes get really tired by the end of the day, even though I’m wearing my blue light glasses.
Speaker 4: The flashing is still a problem. So you’re implementing all of these tips. How do you know if you’re doing a good job? Test, test, test. There are resources online and there are also tools for testing.
Speaker 4: For instance, a color contrast tool will tell you if your colors have enough contrast for folks with varying levels of vision. And with the WAVE accessibility tool, you can actually put a web page in there, and it will run through and flag issues with accessibility.
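Those contrast checkers are built on the WCAG contrast-ratio formula, which is simple enough to sketch directly (the helper names here are mine; the math follows the WCAG definitions):

```javascript
// Relative luminance of an sRGB color given as [r, g, b] in 0-255,
// per the WCAG 2.x definition.
function relativeLuminance([r, g, b]) {
  const [R, G, B] = [r, g, b].map((v) => {
    const c = v / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

// Contrast ratio between two colors: (lighter + 0.05) / (darker + 0.05).
// WCAG AA asks for at least 4.5:1 for normal-size text.
function contrastRatio(colorA, colorB) {
  const [lighter, darker] = [relativeLuminance(colorA), relativeLuminance(colorB)]
    .sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

console.log(contrastRatio([0, 0, 0], [255, 255, 255])); // ≈ 21, the maximum
```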
Speaker 4: But really, the most important testing you can do is user testing, because the feedback you get from the folks who actually face these challenges every day is more valuable than anything, because they can tell you if what you’re doing is helpful and if it allows them to engage effectively and comfortably with your content.
Speaker 4: It shouldn’t be a struggle. So, by show of hands, how did I do? Who learned something or got a refresher on something? Awesome, then I did my job. And this is me, so if you have any questions, wanna connect, feel free to find me everywhere I am on the web.
Speaker 4: Thank you guys so much for coming to my talk, and thank you so much to all of the sponsors here for having me, I really appreciate it.
Speaker 2: Thank you, Gina. Oh, okay. Next up, we have a man who needs no introduction since he was up here earlier. Now, Harris Schneiderman is a software developer with a strong passion for accessibility. He’s a member of the W3C ARIA working group, co-organizer of this group, and works for Deque Systems.
Speaker 2: He enjoys working on open source projects, aimed at making the web a better place for everyone. He’s going to build a fully tested, accessible, reusable component in approximately 10 minutes. Thank you, Harris.
Speaker 7: Give or take, preferably give five minutes to that. Thank you, thank you. Oh yeah. And give it up for Mac who’s going to be holding this for me because I need to type. Actually, I don’t need to type till the end so I’ll flag you or something.
Speaker 8: All right, well…
Speaker 7: I hope the timer didn’t already start. And I’m struggling to find the window that I was using. Did it get?
Speaker 8: Oh, that’s okay. Here it is, cool. All right, so let’s get started.
Speaker 7: And like Chris just said, we’re going to be building a fully tested, accessible, reusable menu button. And how we’re going to do that is we’re going to gather some requirements from something called the ARIA authoring practices.
Speaker 7: They kind of document the ins and outs of implementing an accessible widget. I chose a somewhat simple one, but it’s fun because it has a lot of keyboard interactivity. So we’re going to do a menu button, which is a button that when clicked, it has a little drop down menu, and you can arrow through the options.
Speaker 7: So the path we’re going to follow is something called test driven development. So I’m going to take those. I’m going to literally copy and paste stuff out of the authoring practices and turn those into test cases.
Speaker 7: And then I’m going to fast forward and not live code it, but fast forward my branch to a state where I have all the tests written out, but I will not have written out the component itself. So all the tests are going to be failing.
Speaker 7: And then together, we’re going to build the menu button, and slowly we’ll start to see all my unit test passing. And once I get all green check marks, I’ll know that I’ve at least implemented all my accessibility requirements.
Speaker 7: So I’m going to go over to the thing that I struggled to find, the ARIA authoring practices. All right, so we’re looking at the menu button section. This document gives you a brief description of the component and some common conventions for it.
Speaker 7: So if you’re uncertain of whether to use this component, you can read up on that. Right away, they throw you into some examples, which if you’re a developer or a designer, it’s really nice to be able to point at some good examples of an accessible version of whatever widget you’re looking at.
Speaker 7: And they also get into the keyboard interaction. So stuff like what happens when you press Enter on the button or Down Arrow on the menu options. And then they go through and map out all the various roles, states, and properties, or the ARIA attributes that you have to use.
Speaker 7: So I’m going to get right into one specific example. This flavor, if you will, of a menu button is going to shift focus to the menu items themselves, and we’ll be able to arrow through them. Other implementations include using aria-activedescendant, which actually leaves focus on the button itself.
Speaker 7: But we’re not going to get into that yet. So the way I interpret this information that we’re looking at, which is kind of a table -like thing with a key and then what that key does, I treat this kind of like a spec, which is like, down arrow, space, or Enter, it opens the menu, and shifts focus to the first menu item.
Speaker 7: Up arrow, opens the menu as well, and shifts focus to the last menu item. So I’ve actually gone ahead and translated all those into unit tests. And cool, let’s look at that. I could start using a mic holder now.
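That spec table translates almost mechanically into code. As a rough sketch (names here are illustrative, not from Harris’s actual repository), the trigger button’s keyboard behavior is just a key-to-action mapping:

```javascript
// Which keys open the menu from the trigger button, and which menu
// item should receive focus, per the ARIA authoring practices table.
function triggerKeyAction(key, itemCount) {
  switch (key) {
    case 'ArrowDown':
    case ' ':
    case 'Enter':
      return { open: true, focusIndex: 0 };             // first menu item
    case 'ArrowUp':
      return { open: true, focusIndex: itemCount - 1 }; // last menu item
    default:
      return { open: false, focusIndex: null };
  }
}

console.log(triggerKeyAction('ArrowUp', 5)); // { open: true, focusIndex: 4 }
```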
Speaker 7: All right, so I have a bunch of tests written out, and they should look very familiar, because it’s literally, like I said, copied and pasted from that document. So I kind of, I describe some context.
Speaker 7: We have a trigger button. When down arrow is pressed, we’ll open that menu, just like I just went over that. And so I’ve mapped out all the various requirements I have, and I’ll show you all of them right now by running my tests.
Speaker 7: And I put to do on it, which means they’re just going to get skipped. So look at that, I have like an awesome spec of all the requirements I need to meet in order to make an accessible menu button. So I’m going to fast forward to a version of this code where tests are actually written out.
Speaker 7: And what that looks like, I’ll try to describe a couple of them, we don’t have a ton of time. So for that example where the down arrow on the trigger button opens the menu, shifts focus to the first menu item.
Speaker 7: What we can do is use this simulant library. These tests are running in something called jsdom, which is kind of like a lightweight, fake, browser-like context where you can do stuff like query for elements, attach event listeners, and fire events on elements.
Speaker 7: So what I’ve done here is I’ve created a down arrow key down event and I’m firing it on the button and then I’m making an assertion based on the requirement that I got from the spec. So I say expect the menu to have this active class and expect the currently focused element to be the first menu item in the menu.
Speaker 7: And I do all sorts of things just like that. Some of the more interesting ones on the attribute side are actually a little simpler. So the button must have aria-haspopup equals true, so that assistive technology knows that when it’s clicked, it can say, hey, this has a popup.
Speaker 7: So that’s a really simple test case to implement. We’re expecting the return value from getAttribute('aria-haspopup') to be true. And there’s all sorts of tests like that, but I want to get down into the fun live coding part.
Speaker 7: So I’m going to cheat again and fast forward a bit more. And let’s run, actually, I got ahead of myself. I’m going to go back to that branch, and I’ll show you all the tests failing, because that really brings me down, which
Speaker 7: inspires me to want to make these tests pass. All right, so I have all sorts of errors, nothing but read across the board, but that’s fine, that’s the whole idea of test -driven development. It allows me to focus on the desired features, which I actually just stole from the ARIA authoring practices, and I’m not really thinking ahead of time about how I’m gonna implement it, so it actually allows me to write better, cleaner tests just based on the desired functionality rather than knowing ahead of time what the implementation looks like and kind of cheating and cutting corners, which we all do.
Speaker 7: So this is a good practice. So where was I before? I’m gonna go check out this branch. All right, so I’ve cheated, I’ve said that already. So I’m gonna run the test again, there’s a little bit of code written, so we’re gonna have a few tests passing.
Speaker 7: Let me go back to the summary. So I’ve written out some basic stuff where I’m handling some initial attributes. What that looks like in code is this: I have a menu button class, and I’ve just done some basic stuff, like defining some element references I’m gonna be using throughout the class and setting up those initial attributes, like aria-haspopup, and aria-controls on the button, which establishes a relationship between the button and the menu itself.
Speaker 7: Inversely, we have the menu labelled by the button with aria-labelledby, so that when a screen reader enters the list, we get to hear the button’s text, so you kind of know which menu you’re on. And then I’ve just bound a bunch of event listeners that do nothing right now, because that’s kind of the fun part here: I’m gonna start writing the key down stuff.
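That initial wiring can be summarized as a plain data sketch (the IDs, the function name, and the initial aria-expanded value are my assumptions for illustration; the real code sets these on DOM nodes):

```javascript
// The starting ARIA attributes for a menu button: aria-haspopup and
// aria-controls on the button, aria-labelledby (and role) on the menu.
function initialAttributes(buttonId, menuId) {
  return {
    button: {
      'aria-haspopup': 'true',     // tells assistive tech a popup will open
      'aria-controls': menuId,     // button -> menu relationship
      'aria-expanded': 'false',    // assumed initial state: collapsed
    },
    menu: {
      role: 'menu',
      'aria-labelledby': buttonId, // the menu announces the button's label
    },
  };
}
```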
Speaker 7: So I’m actually gonna run these tests in watch mode. Oops, there we go. So let’s start with the first failing test: down arrow opens the menu, moves focus to the first menu item. You’re probably sick of hearing me say that.
Speaker 7: So I’m gonna go to my on button key down method and make this pass. So I’m gonna, oh, sorry. I’m going to call this open menu method, which that’s where I’m gonna kind of generically handle the opening of the menu as it sounds.
Speaker 7: So let’s see, what all do I need to do? I need to make the menu be displayed and I’ve made a decision up front to just use this active class and I’ll let CSS handle that. But that’s kind of beside the point.
Speaker 7: So I’m gonna do is add the class. And remember the second part was that I’m gonna shift focus to the first menu item. So I’ve already set up all those element references. I have an array of menu items.
Speaker 7: So it’s really as simple as me saying focus the first one. I’m gonna hit save and I’m expecting to see a couple more tests passing. Hey, look at that. We have down arrow and space passing. We’re well on our way to an accessible menu button.
Speaker 7: Next up, I have clicking and pressing enter and up arrow. Those are both failing. So when I click the button or when I press enter, it should do exactly what the space bar and down arrow do. So I’ve already basically written that, right?
Speaker 7: So what I can do is on button click, call this dot open menu. Yay, I have another test passing. So the last one here with the keyboard logic for specifically the trigger button is that when I press up arrow, it should open the menu but move focus to the last menu item.
Speaker 7: So it’s very similar to what I’ve written so far in open menu except for the second line where I’m going to focus the menu item. I need to focus a different menu item. So what I will do is give myself a parameter that defaults to zero because that’s what I did for the other three, but I’ll be able to pass an alternate value to that for my specific use case.
Speaker 7: So for arrow up, I’m gonna call this dot open menu and pass in the index of the last menu item. So that would be menu items dot length minus one. So let’s see if we get another pass. Yay, so all of my key events for the button are passing.
Speaker 7: I’m gaining some confidence, I’m feeling better about myself. Now it looks like I’m not managing the stateful attribute called aria-expanded. As it sounds, I’m gonna set that to true when the menu’s open and false when the menu’s collapsed.
Speaker 7: So let’s make that update, and it looks like I’m just missing it here in this open menu method. All I have to do now is update that button’s attribute. So I’m gonna call setAttribute aria-expanded and set it to true.
Speaker 7: And now that I’m thinking about it, I know I’m gonna need a close button, a close method. So I’m gonna just cheat and put that there while I’m thinking about it. But we’ll see, it looks like all of the button tests are passing and I’m still good on time.
Speaker 7: This is awesome. Now I’m gonna move on to the keyboard events of the menu itself, which are a lot more fun. Cause if you remember, I don’t think I described it. So up arrow and down arrow are going to traverse through the menu items.
Speaker 7: So if I press up, I will obviously go to the previous menu item unless I’m on the first one, in which case we’ll cycle back to the last menu item. And inversely, for down arrow, I’m going to go to the next menu item unless I’m on the last one, we’ll go to the first.
Speaker 7: So it creates this kind of circular focus sequence. So it’s kind of fun to write. So let’s just start with which ones are failing. So enter and escape. They both do the same exact thing. They close the menu and set focus to the trigger of the menu.
Speaker 7: So I’m going to write that out. I’ve grouped them together already. Like I said, I cheated a lot to make this easy. So I’m going to call this.closeMenu. But I’ve only done the aria-expanded attribute manipulation.
Speaker 7: So I need to actually close the menu. So I’ll copy and paste this classList.add and change it to remove, so that will close it. And instead of focusing that menu item, I’m going to focus the button. So I’m actually able to steal a lot of stuff from my open menu method.
Speaker 8: All right.
Speaker 7: Oops, that’s the wrong one. OK, there we go. All right, now I’m going to write arrow down and arrow up. So I just described that behavior, so I’m going to try to map that out. So arrow down, remember, we’re going to the next item unless we’re on the last.
Speaker 7: So what I’m going to do is get the currently focused index within the array of menu items. So that is menuItems.indexOf(e.target). So what that means, for people who don’t write code, is I’m looking in a list of menu items for the event target, which is the menu item in which I pressed the down arrow in this case.
Speaker 7: So I’m going to use that information to decide if I’m at the end of the line, if I need to go back up to the top. So let’s do that. So this dot, oh, VS Code. All right, this.menuItems. And now I’m going to do a fancy ternary to decide: if I’m at the end, I’ll go to the top.
Speaker 7: If not, I’ll go to the next. So if current is this.menuItems.length minus one, that means I’m on the last one, so go to the first. Otherwise, go to the next, which in this case is just current plus one.
Speaker 7: And I’m going to take that and focus it. So let’s see if arrow down works yet. Yay, let’s do the arrow up now. And I really like cheating and stealing code for myself. So I’m going to do that. So we’re going to just flip a couple of things around.
Speaker 7: So instead of if current is the last, I’m going to say if it’s the first, focus the last. Otherwise, go to the previous one. Yay. So now I have the Home and End hotkeys left. Home should move focus, no matter where you are in the menu, to the first menu item.
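The wrap-around behavior just described reduces to two one-line ternaries. A DOM-free sketch (the function names are mine, for illustration), which also covers Home and End, since those are just the fixed indexes zero and length minus one:

```javascript
// Circular focus sequence for the menu items: down arrow moves to the
// next item and wraps last -> first; up arrow wraps first -> last.
function nextIndex(current, length) {
  return current === length - 1 ? 0 : current + 1;
}
function prevIndex(current, length) {
  return current === 0 ? length - 1 : current - 1;
}
```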
Speaker 7: And end should do the opposite no matter where you are, move it to the last. So let’s do that real quick. That one’s going to be really easy, because we have an array of menu items. So I can just call focus on which ones I want.
Speaker 7: So for Home, I’m going to take menuItems, the first one, and call focus. And then I’m going to do a lot of copying, because I don’t feel like typing. Instead of the first, I’m going to go last. I’m going to hit Save.
Speaker 7: And if I have all greens, yay. Thanks, Mac. So as you can see, actually writing these tests upfront made it really easy for me to code it out. I was a little rushed because I’m doing a lightning talk.
Speaker 7: But it really is beneficial. It’s kind of sometimes not exactly the most fun practice to follow. But I urge everyone to try this out if you’re a developer and you’re developing new content, because it really does have an impact on how rapidly you can develop a prototype.
Speaker 7: If you have a finite list of requirements that you need to meet, you know exactly what you need to do. You’re not going to over engineer anything. And you’ll be left with something nice and accessible.
Speaker 7: So just before I hand the mic off, I want to show you what this looks like. And there’s no styles written or anything, but I have a little demo page of what we just built. And I should run the demo server.
Speaker 8: So, yarn start?
Speaker 7: Yarn dev, I forget what I name my script sometimes. All right, so when I refresh local host, I’ll have this actions menu button. I’m gonna use just my keyboard, I just focused it. I hit space bar, the little menu is open and I can press my arrow keys to the day’s end and I’m moving around.
Speaker 7: So that’s it, thank you for listening to building a reusable component in 15 minutes. It’s hot up in the spotlight, I’m sweating. So next up, we have Christopher Land, who is a manager of accessibility services, like he said, with Level Access, and the co-organizer with me of A11YSD.
Speaker 7: He has experience in training, coding, design, UX, and enterprise systems. He sees digital accessibility as crucial in providing people with disabilities an unprecedented level of independence. And we’re really happy to have him here tonight.
Speaker 2: Okay, good evening. So I’m going to talk to you a little bit about accessibility, big data, AI, and machine learning. So necessity is the mother of invention. What we’re gonna look at first is some cases where helping people overcome accessibility issues has helped push innovation and technology.
Speaker 2: And then these things end up benefiting everyone. So first up, we’ve got a early typewriter that was designed in 1808 by an Italian inventor named Pellegrino Turi. And he designed it specifically so that he could exchange letters with a blind friend from school.
Speaker 2: So this enabled her to touch the keys, find out what letters she was pressing, and send legible letters to friends. Text messaging was initially developed to help people with low hearing or no hearing.
Speaker 2: Now it is the preferred form of communication for, like, billions of people. And auto-suggest and auto-correct were first designed to help users that either had cognitive disabilities or mobility disabilities, where typing took more effort.
Speaker 2: And again, now it’s helpful to everybody, right? Here we have the first commercially available OCR machine, that’s optical character recognition: the Kurzweil Reading Machine from 1974. And the first one that Ray Kurzweil put together, he gifted to Stevie Wonder.
Speaker 2: And there’s this touching video of Stevie Wonder accepting it and he’s crying because it’s literally the first time in his life he’ll be able to read a book alone, like in the privacy of his own home.
Speaker 2: And now, what used to take about two suitcases’ worth of equipment can just be loaded into your phone, so you can take a picture of text and it will read it right out to you. We’ve got lots of advances in AI-powered voice recognition and control, and it’s especially helpful to users with mobility impairments.
Speaker 2: Here we can see a device mounted to a motorized wheelchair, and this really opens up the world to people that have a high degree of paralysis, maybe from the neck down, they’re able to go online, pay their bills, write letters.
Speaker 2: And then even now we’ve got these AIs, Alexa and Siri and so forth, that are getting better and better all the time. So definitely a big help. So machine learning and big data allow the computer to go ahead and write alt text for images and videos, and this has rolled out already to Facebook.
Speaker 2: So if you are uploading images now, if you take a look at the code, Facebook is already just writing alt text for you. So this happens to say, image may contain one person smiling, standing in stripes.
Speaker 2: So okay, she’s not wearing stripes, but you can see where the computer got that wrong. This technology is just getting better and better and more accurate. So we’ve got trained AI algorithms that we’re teaching to read lips, and they’ve actually crossed the threshold where the AI lip-reading machines are more accurate than human lip-readers.
Speaker 2: So that’s gonna help with live captioning and videos, kind of like what we’re doing at the bottom of the screen here tonight. If we had an AI helper reading our speaker’s lips, these could be even more accurate.
Speaker 2: So I checked it out, but there’s no apps available yet in the app store for this. This’ll be great for people with low or no hearing, and it’ll also be really cool for spying on people from a distance away and hearing what they’re saying about you.
Speaker 2: So furthermore, we’ve got AI that can understand language enough to break it down into more simple language, and this is definitely a big benefit to folks with cognitive or attention -based disabilities, but it’s also gonna help people with, whether they’re doing English or another language as a second language, or it could just help people when they’ve got a headache or they’ve had a long day and they’re really tired and they just want it simplified.
Speaker 2: So this all sounds great, right? It’s very promising, these technologies are gonna help users with disabilities, and it’s gonna also then help everybody else, right? And now is the part where we go to the dark side of technology threats.
Speaker 2: So technology itself is not good or bad, it’s amoral. It really depends on who’s using it. So a lot of these things that are being developed to help people can also be used by people with not very good intentions.
Speaker 2: So we’ve got a robot bear over here that is being built in Japan to help elderly people with disabilities get in and out of bed, or help them around the house. But with only a few modifications and some nefarious intentions, in the hands of somebody that’s not trying to help people, the technology can have a negative impact.
Speaker 2: So that sounds a little hyperbolic. But here’s a case: this is a robot created for defusing bombs, to keep law enforcement officers out of harm’s way. They can send this in, and if they cut the wrong wires, the robot blows up but no humans get hurt.
Speaker 2: Well, in 2016, there was an incident with a sniper in Dallas. So law enforcement, the guy was shooting at police. So the police said, aha, let’s strap C4 to this thing and just roll it in there. And that’s exactly what they did.
Speaker 2: So I’m not going to get into the moral waters of that. But that’s the first instance we have of police using a robot to kill a suspect. So I don’t know, do you think that’ll be the last case of that?
Speaker 2: You know? Okay, so you might be saying, wait, what does this have to do with accessibility? And you’d be right to ask that. I’m going to pull it back from the inevitable destruction of humans by robots and talk about something more relevant which is discrimination.
Speaker 2: AI and machine learning systems basically developing bias or prejudice. And we already have that with these systems. This has come up many times in regards to race, gender, and age. So what we have here: this young man is trying to upload his picture to some sort of online service, and it says at the top, the photo you want to upload does not meet our criteria because the subject’s eyes are closed. But his eyes aren’t closed; he’s Asian.
Speaker 2: And so one of the problems is a lot of people designing these systems are young white males. And if they’re not thinking inclusively from the start, there’s a big risk of coming up with systems that are biased.
Speaker 2: So machine learning is going to be based on patterns and statistics. So if the data you’re using is skewed and it’s not diverse, the machine is going to learn bias. By its nature, machine learning is using statistics to find patterns, and by nature it’s going to minimize outliers.
Speaker 2: So in this example, it’s not a perfect illustration, but if you train your machine to recognize faces based on the data set on the left, just a lot of white faces, you’re going to have a completely different outcome than if you train it on the data set on the right.
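[Editor’s sketch: the “minimizing outliers” point above can be shown with a toy model in a few lines of Python. Everything here — the labels, the counts, the degenerate “model” — is invented for illustration; real systems fail more subtly, but in the same direction.]

```python
# Sketch: why training on skewed data "minimizes outliers".
# A trivial model that only learns the most common label -- the degenerate
# limit of optimizing average accuracy on a skewed dataset.
from collections import Counter

def train_majority_model(labels):
    """Return a 'model' that predicts the most common training label."""
    most_common_label, _count = Counter(labels).most_common(1)[0]
    return lambda _sample: most_common_label

# 95 samples from group A, only 5 from group B: the "pattern" is group A.
training_labels = ["group_A"] * 95 + ["group_B"] * 5
model = train_majority_model(training_labels)

# Average accuracy looks great (95%)...
samples = range(100)  # stand-in inputs; the toy model ignores them
acc = sum(model(x) == y for x, y in zip(samples, training_labels)) / 100
print(acc)  # 0.95
# ...while every single group_B sample is misclassified: the minority
# group has been statistically "minimized" away.
```

The same dynamic, with a real classifier and real faces, is how an under-represented group ends up with dramatically worse error rates.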
Speaker 2: So even more challenging than race, gender, and age is bias against folks with disabilities. This is for a couple of reasons. Disability is not always shared; it’s not always obvious to people. And when we take into account all the different disabilities, there’s just a lot of diversity within that group.
Speaker 2: Okay, so bias is a problem and a risk. Another risk I wanna go over is big data. And what’s going on right now is companies that do marketing and social media are collecting vast amounts of data on all of us.
Speaker 2: And using this data, what these companies have the potential to do is effectively out people with disabilities, based on their online trace, their online behavior. And I guess this isn’t bad if maybe you’re using it to market new assistive technology specifically to someone, but you can also market to people’s vulnerabilities and take advantage of them.
Speaker 2: And we’ve got in place HIPAA, the Health Insurance Portability and Accountability Act. That’s the, when you go to the doctor, you’re protected. The privacy of your health information is protected.
Speaker 2: The problem with HIPAA is it only applies to healthcare and healthcare related industries. So Facebook, social media, they have no obligation to protect people’s health status. Yeah, so there’s no protection.
Speaker 2: And case in point, I don’t know if you can see this interface too well, but this is Facebook’s advertiser interface. And you can choose your location, you can choose what types of groups to include, and you can choose what types of groups to exclude.
Speaker 2: So in this case, you can just go in and say, yeah, let’s not show my ad to people with disabilities. I really don’t, yeah, I’m not feeling it. So they got in trouble with the Department of Housing and Urban Development because that’s literally what happened.
Speaker 2: Someone that was doing retail leasing went in. I don’t know, maybe it was a case of a historic building that lacked elevators or something. I can only, I didn’t look that deep into it, but they were able to turn that off.
Speaker 2: And so you could lock people with disabilities out of jobs, opportunities. The potential there is nasty. Another problem here is AI is used more and more for screening, especially first level screening for things like health insurance, life insurance, jobs, college applications.
Speaker 2: And the threat I wanna point out here is, you might be using an AI at the top of the funnel. This represents a funnel for screening job applicants, right? And if the AI is biased, people are getting cut out of the top for whatever group they’re in or because they have a disability. Imagine if, say, you take a short quiz to get past that and you get graded, and maybe part of the grade is how long you take to do the quiz.
Speaker 2: So I can imagine a scenario where somebody using assistive technology, or somebody with a mobility impairment, just takes longer to do the test. So there’s a possibility here of accidentally screening people out before they’ve even talked to a real person.
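[Editor’s sketch: the timed-quiz scenario above is easy to make concrete. The scoring function, weights, and threshold below are all hypothetical — invented purely to show how a speed penalty screens out a candidate with identical accuracy.]

```python
# Hypothetical screening score: accuracy minus a speed penalty.
# A common but risky design -- every number here is made up.

def screening_score(correct_answers: int, minutes_taken: float) -> float:
    """Grade a quiz: 10 points per correct answer, minus 2 per minute."""
    return correct_answers * 10 - minutes_taken * 2

PASS_THRESHOLD = 70

# Two candidates with identical accuracy (9/10 correct). One uses
# assistive technology (e.g. switch input) and simply needs more time.
typical_user = screening_score(correct_answers=9, minutes_taken=10)  # 90 - 20 = 70
at_user = screening_score(correct_answers=9, minutes_taken=25)       # 90 - 50 = 40

print(typical_user >= PASS_THRESHOLD)  # True  -> advances
print(at_user >= PASS_THRESHOLD)       # False -> screened out, same accuracy
```

Nothing in this rule mentions disability, yet it filters on a proxy for it — which is exactly how “neutral” automated screening goes wrong.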
Speaker 2: In fact, I can get hyperbolic again. Think globally with, say, China’s social credit system. You guys are familiar with this, where people’s online behavior is tracked and things are reported to neighbors. The government is essentially tracking people, and these young women are holding up their current social credit scores.
Speaker 2: I’m not really sure whether there’s a stigma around disability in China, but there certainly are stigmas of varying severity against people with different disabilities in different parts of the world.
Speaker 2: So I can just imagine a pretty dystopian scene where people get sidelined programmatically by AIs. I watched too much Black Mirror, if that didn’t come through. So what do we do about this? So two things I wanna point out, we need to expand the legal protection to protect folks.
Speaker 2: And this is because the ADA and HIPAA were written before this was a risk. So HIPAA should probably just be expanded: hey, you don’t have to be healthcare. Anybody that has the big-data ability to out something like your health status, we need to protect people from that.
Speaker 2: And then we need to educate AI developers because you can design these things to not have that bias, but you need to know the problem and you need to know how to fix it or address it. So using diverse training data sets, making sure you’ve included all the groups that represent human diversity.
Speaker 2: And I did find this tool, it seemed pretty cool. It’s out from IBM, it’s called AI Fairness 360. I’m not quite smart enough to use it or write AI or anything like that, but it seems like it goes through and runs some advanced data analysis. And it’s open source; IBM built it and it’s free.
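[Editor’s sketch: one of the fairness metrics a toolkit like IBM’s open-source AI Fairness 360 reports is disparate impact, and its core idea fits in a few lines of plain Python. The applicant counts below are invented for illustration.]

```python
# Minimal sketch of the disparate impact metric: the ratio of selection
# rates between an unprivileged and a privileged group. A common rule of
# thumb (the "four-fifths rule") flags values below 0.8.

def disparate_impact(selected_minority: int, total_minority: int,
                     selected_majority: int, total_majority: int) -> float:
    """Selection rate of the unprivileged group over the privileged group."""
    rate_minority = selected_minority / total_minority
    rate_majority = selected_majority / total_majority
    return rate_minority / rate_majority

# Say an AI screener advanced 20 of 100 applicants with disabilities,
# but 50 of 100 applicants without.
di = disparate_impact(20, 100, 50, 100)
print(di)        # 0.4
print(di < 0.8)  # True -> this screener warrants a bias audit
```

The real library computes this (and many other metrics) over labeled datasets rather than raw counts, but the auditing idea is the same: measure outcomes per group, then compare.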
Speaker 2: So there are people aware of this and addressing it, and we just need things to move more in that direction. Yeah. Thank you. Okay. Last but not least, we have Jana Kimmel. Health, wellness, and accessibility have driven Jana’s work for the past 25 years, from designing accessible clothing to creating patient-centered websites at Blue Cross Blue Shield.
Speaker 2: Jana has made it her mission to create better user experiences. She’s had the privilege of doing research at Intel, Microsoft, Cambia Health, Dexcom, and others. And she is currently a senior UX researcher at Dexcom in Portland, Oregon.
Speaker 2: Thanks for flying down Jana.
Speaker 1: It’s always good to live in Portland but work for a company that has headquarters in a sunny climate. So thanks for inviting me here today. Do we have the notes version on here? You have the notes screen here.
Speaker 1: Oh, there we go, thank you. And the clicker is this. All right. So initially I had a talk all set up, and then our legal department got ahold of it and changed my mind. So we’re gonna talk today about some work I did back in 2006 when I was with Intel.
Speaker 1: And to start with, I’d like to ask you to just really quickly, if you can, reach over to somebody nearby and just shake their hand. And then, if you’re able, just sit in your chair and kind of stamp your feet.
Speaker 1: And then finally, you can just quietly think and postulate something about theoretical physics in your head. A little harder. So although he couldn’t shake your hand and couldn’t stamp his feet, Stephen Hawking was one of the greatest minds of our time.
Speaker 1: And tonight I’d like to talk with you about a few of the simple solutions to a seemingly complex problem, proving that accessibility doesn’t have to be rocket science or even theoretical physics. And I’ve got some quotes in here and as you can see, those quotes are direct from Stephen Hawking’s interviews.
Speaker 1: So as I mentioned, in 2006 I had this awesome opportunity to go out and meet with Stephen and some of his caregivers. I had started a conversation; I kind of worked my way onto a team that was building his Intel computer.
Speaker 1: And one of his caregivers said, hey, it would be really awesome if you could fly over to London and then take a flight back with us and see what happens to both Stephen himself and the technology and what happens on the other end because it’s not good.
Speaker 1: And unfortunately, because of time and budget, that timing didn’t work out. So instead, we discussed meeting down at UCLA, where Stephen was gonna be that spring, and set up to meet there. And what you see there is kind of a joke that they had. I have very few of the images from this project this many years later, but this was a great photo of what they call his entourage: multiple caregivers, multiple devices, lots of things to keep him safe and alive.
Speaker 1: So down in LA, I had the opportunity to walk actually into his office down there, which was pretty exciting, talked a little bit with him and again, much longer with David, his caregiver. We started having a conversation about his needs, what’s working around the Intel provided computer, what isn’t working.
Speaker 1: And the first thing that came up was the battery that Stephen uses. And that battery is the source of his communication with the rest of the world. So he has one active battery on the computer and then one backup battery, because again, without that battery, he has absolutely no way to communicate.
Speaker 1: And unfortunately, there was no really great way to carry that battery. So observation number one was what do we do to make sure that that battery is with him and safe at all times? So later that evening, we started walking back towards the house where Stephen was staying.
Speaker 1: What he does is he has about six grad students that double as caregivers. And they were kind enough to invite me to dinner, to kind of see how everything works in real time and enjoy a nice dinner. And along the way, we were walking down a path in the area toward the home he was staying in.
Speaker 1: And there were trees similar to this, and kind of the dappling effect that you see, the light, the dark, the light, the dark. And we’re walking along and I kind of started hearing some beeps and boops and things that didn’t really make a whole lot of sense.
Speaker 1: And I asked his caregiver, you know, what’s going on? Is the computer okay? And he said, yeah, you know. I don’t know if you’ve ever seen it, but Stephen Hawking has like a cheek switch on his glasses.
Speaker 1: And what he uses is very simple; one of the few muscle movements he had left at the end was the ability to move his cheek up and down. And so when he was passing through light and dark, it was reading that as his cheek moving up and down.
Speaker 1: So the second observation was that there’s a real problem with this cheek switch. There’s no pause button and no way for him to turn it off momentarily. And there’s a danger in turning it off for good because again, there’s no other way to communicate.
Speaker 1: So, note taken, and we continued on our way. We get to the house and open the door, and there’s this awesome smell of like garlic and pasta and butter. These grad students really knew how to cook. It was great.
Speaker 1: But it also gave me an opportunity while they were cooking to talk to some of the folks and talk to some of those caregivers and learn a little bit more about what wasn’t working for them. And I started talking with a woman named Nikki and she said, you know what?
Speaker 1: What’s really difficult for me is the fact that I’m not a computer hardware person. I’m a theoretical physicist and it’s my job if his computer stops working, it’s my job to fix it. That’s not a great match.
Speaker 1: And what happens the most is that a plug gets unplugged. And honestly, we had amazing photos that I don’t have for you here today, but just picture in your head a very complex computer, lots of ports, lots of places for things to plug in.
Speaker 1: And most of the caregivers didn’t really know where to plug things in, so they would call for tech support. So third observation, we need to do something about making it really easy to plug these cables back in.
Speaker 1: So once back at the office, we started thinking about how to simplify some of these problems. The pack you see on the left is similar to the battery pack that I ended up sewing. My first career was in sewing and costume design.
Speaker 1: So I built him a neoprene bag that was able to carry that battery behind his wheelchair, safely with him at all times. That was probably the easiest solution. In the middle you can see that cheek switch; unfortunately, due to time and my definite lack of engineering expertise, that did not get solved at the time.
Speaker 1: And then the final solution for his computer issues was nail polish. We had a cable, we had a port, and anybody who had a computer in the late 90s saw color coded ports, but we hadn’t thought to do that originally.
Speaker 1: So the engineer took a couple different colors of nail polish, put it on the cable, put it on the ports, and voila, the calls for tech support, you wouldn’t believe how much they went down. It was fabulous.
Speaker 1: All because of nail polish. So what I learned from this, and what I hope you’ll take away from this, is that accessibility doesn’t have to be as complicated as computer science or as theoretical physics.
Speaker 1: Sometimes there’s a really simple fix that can significantly improve a complicated experience. So now this amazing man was able to continue sharing his universe with the rest of the world. Thank you.
Speaker 2: Well, thanks everybody. That’s gonna conclude our talks for the evening. But we have a little bit more time, so please feel free to grab a piece of pizza, grab a drink if you like, talk to each other, talk to any of us who spoke tonight if you have any questions.
Speaker 2: Thanks everybody for coming. We hope to see you at the next one.