W3C ARIA-AT: Screen Readers, Interoperability, and a New Era of Web Accessibility
[Music]

But first, a brief video introducing the W3C ARIA-AT project.

On a vintage computer, web pages cycle through; some work and some show incompatibility errors. Back in the 1990s, web browsers were all over the place. A website that worked in Netscape didn't always work in Internet Explorer, or the other way around. Only big companies with dedicated teams could afford to make websites work in multiple browsers. On a split screen, several hands type and one uses a mouse. To fix this, thousands of people from dozens of internet companies worked for over a decade to standardize the web. It was a massive investment of effort that improved the state of browser interoperability, but there are still lots of compatibility bugs. Back on the monitor, an error message reads, "This page best viewed in Netscape." So in 2012, a group of people started the Web Platform Tests project, or WPT for short. Today you can go to wpt.fyi and see the results of millions of small tests that run every day to make sure websites render consistently across all platforms. On the WPT website, a cursor moves over a browser interoperability graph. When you use a website and it works the way you expect, it's because we have web standards and automated testing technology. The reason we're here today is because assistive technology users were not included in this push for interoperability. An animation of a giant sphere pushing clusters of shapes into the far corner of the screen.

Lewis, a light tan-skinned blind man, smiles at us. My name is Lewis, pronouns he/him/his. I am from Southern California. Palm trees sway in the wind, and there is a close-up of someone strumming an acoustic guitar. I play guitar in a musical duo, and I am proud to be an accessibility tester. Behind me is a desk with a technology setup: a keyboard connected to a desktop, a monitor, and a laptop. The experience for a screen reader user can vary dramatically from one screen reader to the next. In practical terms, this means that I can spend a morning trying out different screen reader and browser combinations to overcome whatever inaccessible content I'm dealing with that day, or sometimes, depending on how the content is rendered, I can't get past it at all. And I'm somebody with the tools to experiment and figure out these accessibility issues. Now I'm going to show you how two screen readers voice the web differently. The first screen reader reads the page: "Delicious pizza that you can get from an accessible site when you need a quick... blank. List of pizza options, region. Checkbox not checked, Pepperoni. Checked. Not checked. Checked." Now I'll be using the screen reader on the laptop. The second screen reader announces: "Pepperoni, off. You are currently on a switch."

Screen reader logos appear on screen: VoiceOver, TalkBack, JAWS, Narrator, NVDA, VoiceView, and Orca. There are many screen readers, with hundreds of interpretation differences like the one Lewis just shared, and these differences can exclude people. An animation of shapes dropping onto a conveyor belt and moving right. Developers who can afford it test across screen reader and browser combinations, but the way screen readers and browsers interpret web code changes all the time. Shape clusters jump in and out of their place in line. To make the web truly equitable and inclusive, assistive technologies need to work with web code in predictable ways. This is why we started the Accessible Rich Internet Applications and Assistive Technologies Community Group, ARIA-AT for short. Here we are on the ARIA-AT website. In ARIA-AT, we are bringing together a community of accessibility experts, user advocates, assistive technology vendors, and web developers to create what we call assistive technology interoperability.
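For readers who want to picture what a demo like Lewis's is exercising, here is a minimal sketch, an editorial illustration rather than the actual demo page, of a custom checkbox group similar to the pizza-topping example. The role and aria-checked attributes are what screen readers turn into announcements like "checkbox, not checked, Pepperoni"; how each screen reader words that announcement is exactly the kind of difference the project catalogs. The group label and topping names are made up for illustration.

```typescript
// Minimal custom checkbox group, similar in spirit to the pizza demo.
// The role and aria-checked attributes are what screen readers voice;
// a native <input type="checkbox"> would get these semantics for free.
const group = document.createElement('div');
group.setAttribute('role', 'group');
group.setAttribute('aria-label', 'Pizza toppings');

for (const topping of ['Pepperoni', 'Mushrooms', 'Olives']) {
  const box = document.createElement('div');
  box.setAttribute('role', 'checkbox');      // announced as "checkbox"
  box.setAttribute('aria-checked', 'false'); // announced as "not checked"
  box.tabIndex = 0;                          // make it keyboard focusable
  box.textContent = topping;

  const toggle = () => {
    const next = box.getAttribute('aria-checked') === 'true' ? 'false' : 'true';
    box.setAttribute('aria-checked', next);  // what gets spoken here varies by screen reader
  };
  box.addEventListener('click', toggle);
  box.addEventListener('keydown', (event) => {
    if (event.key === ' ') {
      event.preventDefault();
      toggle();
    }
  });
  group.appendChild(box);
}
document.body.appendChild(group);
```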
Several different shapes intersect with each other to form one big shape; the big shape shifts around, trying out different combinations. We are testing common ARIA patterns and collaborating with screen reader companies to resolve inconsistencies. In the future, we envision developers will be able to focus on accessible experience design, and assistive technology users will finally enjoy a web platform designed for them. The shape grows bigger and envelops the screen. Visit us at aria-at.w3.org to read our interoperability reports, learn more about our automated testing technology, or even join the group. A person uses their laptop at a dining room table; they click on "Join Community Group" and a webcam turns on. We'd love to have you contribute your expertise. Big spheres and streams of smaller shapes all flow organically in the same direction. Let's make sure that web platform reliability includes assistive technology users. We build better when we build for everyone.

Well, hi everyone. I am so thankful we had that video prepared by Matt's team, because it really breaks down this interesting topic we'll be digging into today. I'm Caroline Desrosiers, founder and CEO of Scribely, and today we're going to be talking about how screen readers are working right now across browsers and devices, and about a W3C project years in the making that promises to create a better path forward for assistive technologies. We'll learn more about how this project solves some critical inefficiencies with screen reader testing, and about its potential to open doors for future innovation.

On today's panel we have Mike Shebanek, head of accessibility at Meta; Matt King, accessibility technical program manager at Meta; and Michael Fairchild, senior accessibility consultant at Deque. Matt and Michael are the co-chairs of the W3C ARIA and Assistive Technologies Community Group behind this project. In a little bit we're going to dig into the history behind how this community group started and also find out what happens next for this initiative. But first, a question for the panel: can you talk a little bit more about some of the key points from that intro video? What is assistive technology interoperability, exactly, and why should those of us who use or test with assistive technologies care about it?

Yeah, this is Matt, and I would love to talk about that a little bit, especially since I am a screen reader user and have been for decades. I think this is something that almost every screen reader user is familiar with, even if they don't recognize it as what we're calling assistive technology interoperability. Basically, interoperability ("inter" meaning between) means I can go from one screen reader to another screen reader, or from one browser to another browser, and things should basically work the same way: things that I expect to be there and be announced are there, they're announced, and I can access them. So many of us are familiar with the scenario where, gosh, I was on this site last week and it was working; what's going on this week that it's not working for me? It doesn't seem like the website has changed. And so then you go and try it with a different screen reader, and now it's actually working for you, and you think, huh, what's going on here? There are so many different things that can change that can cause that kind of problem, and because there are so many different things that can change, we have to have different ways of testing to ensure that the things that could change, the things that could break it for the user, don't.
And that's what this is all about. We've just kind of assumed this is how life is: sometimes it works with one screen reader and not another. It doesn't have to be that way. Even the people who are testing and trying to build websites every day, and make them accessible for anybody using a screen reader, are running into this problem all the time. They thought they could use a specific web component, they get it working with one screen reader, and then it doesn't work with the other. Now we're creating what's really almost an impossible situation for web developers trying to keep things working well for anybody using a screen reader.

Yeah, Matt, I hear this all the time. As head of accessibility at multiple different companies in my career, people ask me all the time: why isn't it better? Why is this so hard? Why can't companies do this? It seems straightforward: you make a website. I do this every day; I go to a website in my browser and it works. But when I use a screen reader, it's not like that at all. So we're feeling that pain as an industry, we're feeling that pain as end users, and we now have a way of addressing it. But really, we want to help people understand why it is that way, and what has or hasn't happened that's led us to this point.

Right, so it does sound like there are a lot of pain points and things standing in the way for web developers right now. Does this mean that it's currently impossible to make a website that actually works with assistive technologies? Can you tell us what it's like for developers right now working through those issues?

Yeah, this is Michael; I can try to answer that. It is possible right now to make websites that conform to accessibility standards, but compliance doesn't require good support in all screen reader and browser combinations. For teams that want to provide great, inclusive experiences beyond that minimal compliance, it can be possible to make some of those experiences work well for a lot of assistive technologies, but doing so requires a lot of manual testing in many different assistive technologies, browsers, and operating systems, and it creates a huge matrix of testing that needs to be done. It often requires a very high level of skill for those teams to test all of those combinations and to understand what the expectations really are in each of those individual assistive technologies and what support really means. This often means that teams end up making non-standard tweaks, or hacks, to the code to achieve good support, and these hacks are often brittle and break over time as browsers and screen readers change, often leading to worse experiences. In some cases it's even impossible to achieve good support with all assistive technologies. So this boils down to a lot of overhead in terms of time, cost, and frustration for teams that want to provide good support and inclusive experiences; it results in a fragile ecosystem that's prone to breaking; and most importantly, it means the exclusion of people who use these assistive technologies.
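As a purely hypothetical illustration of the kind of brittle workaround Michael is describing (not code from any panelist's product, and not a recommendation), a team might detect a particular browser and adjust ARIA attributes to paper over how one screen reader and browser pairing announces a control. The selector and label below are invented; the point is that the fix is tied to a user-agent string and to today's behavior of both products, so it quietly breaks or becomes wrong when either one changes.

```typescript
// Hypothetical, brittle workaround: override the accessible name of a control
// only when a particular browser is detected, because one screen reader/browser
// pairing mis-announces it today. The data-id selector and label text are
// invented for illustration.
const saveControl = document.querySelector('[data-id="save"]') as HTMLElement | null;

if (saveControl && /Firefox\//.test(navigator.userAgent)) {
  // Papers over today's announcement in one pairing; when the browser or the
  // screen reader updates, this override can become redundant or even harmful.
  saveControl.setAttribute('aria-label', 'Save draft');
}
```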
Right, so it sounds like a lot of developers, as Matt said, have kind of accepted that this is just the way things are, but it really doesn't have to be, and there are a lot of practical implications at play here. So what is this all about? Can one of you explain some of the reasoning behind Meta's decision to focus energy on this project in particular?

Yeah, you know, we feel this pain from both sides. We have people at our company like Matt who use screen readers every day, and it's amazing to watch him switch from one to the other to the other and then say, it works here but doesn't work there, and if we make this fix it breaks the other one. There's that frustration of, what choice do we make? And then there's the other side, where we only have so much time and so many people, even with as many resources as we have. How do we cover all the combinations? As Michael was saying, the number of combinations of device, operating system, web browser, and screen reader grows so fast, and there's more coming; there will be more in the future. It's getting out of control for us. So it's not a trivial thing for developers who want to do the right thing, like us and others, to sit down and say, well, just test it and it'll work across all these different combinations. We literally have to test every single one.

We also hear this from our end users. We serve over three billion people in the world on Meta products, across the globe, on multiple different platforms: mobile, desktop, and others. They have the same frustration: I come with the product I can afford, that I have access to, which we also know is not always the latest and greatest, because not everybody has the money to upgrade to the latest thing the moment it comes out. How come it doesn't work for my version? So even the variations of not just the platform but the versions you're on, or the age of the device, all grow that matrix until it becomes basically overwhelming, and that's where companies find themselves. Whether it's our company or another company, whether you're a small developer with three people or a mid-sized company or an enterprise, you're facing this same giant matrix, and guaranteeing a good accessible experience across all those combinations, as Matt said, is kind of impossible. So we all do the best we can, and what ends up resulting is that people who use screen readers say, I guess this is just the way it is, because no one has been able to wrangle this and figure it out until now.

So, theoretically, the dream behind this project sounds great, because with this initiative we now have what we need to make screen readers actually work more consistently and more intuitively, and I think that's what we expect from technology these days. However, as you mentioned, Mike, there are so many different kinds of assistive technologies and so many different kinds of screen readers. With all the options we have right now, can we still say that interoperability for assistive technology is realistic and achievable? So my question is: do you believe it's possible, and what will it take to get us there?

I'll let Matt and Michael address what it will take to get us there, but in terms of whether it's possible: it absolutely is possible. If we roll the clock back and think about what the web itself was like in the early 2000s, you had to have a certain web browser to go to certain websites, and if you had the wrong one you couldn't load that page, you couldn't perform that transaction or send that message or get information. We've kind of forgotten that that's really how the web started. It took some really dedicated people, and in a lot of ways the work of the W3C, to propose the idea of interoperability and standards, something everyone could agree on so people could have more choice in the browser and device they wanted to use.
That has resulted in a plethora of really cool, different browsers and really cool, different platforms that still work together. So we have the uniqueness that we want, we have the choice that we want, and it's allowed companies to be really innovative and build lots of cool features without having to spend all of their time just making things work and making them interoperable. That's what we're proposing here with this project: we can now come to agreement on what the standard should be, so we can cover it quickly and then spend all of our time and money and resources and creativity on the really cool, innovative stuff that's going to push us forward. Matt or Michael, do you want to talk more about what it will take to get us there?

Yeah. So if you think about what it took to get there for browsers (and by the way, a lot of the same people who worked on making this happen for browsers are helping us make it happen for assistive technologies), it kind of boils down to getting hundreds if not thousands of little agreements in place: this is what ought to happen in this scenario, and this is what ought to happen in that scenario. We have to have consensus across all the stakeholders. And then, once we have that consensus, we have to have the machinery, the infrastructure, in place to run a process and a program where we can get everything that needs to be tested, tested, track all that data, highlight the problems, and get the problems resolved. That's a lot of work to do and a lot of data to track, so there's a big infrastructure that has to be built, and technologies have to be invented. Right now there is no way to run automated screen reader tests in a standardized way. You can't just say, I have a machine that's going to run a test; now plug in any screen reader you want, and let's see if it works with this screen reader. You can't do that today. We can do that with browsers; we have technologies for doing that with browsers. We don't have technologies for doing that with screen readers or any other assistive technology.

I think it amazes most people to learn that even for just the traditional web browser, there are literally millions of tests run every single day in public. You can go find these sites and check it out, to ensure that browsers are working correctly with internet standards like HTML and CSS. That happens all the time in the web space; it doesn't really happen effectively at all in the accessibility and screen reader space. And I think that's the kind of automation you're referring to. If we can start to make those robust tools and agreements, that automation will help us be so much more consistent and so much quicker, and deliver so much better experiences. It's a giant step forward. The accessibility world missed its moment; the web world had it, we saw it happen, it was there, and we need to catch up to it. That's really what this project is about: bringing this up to speed in a lot of ways.

Yeah, essentially every time a browser changes, or every time a screen reader changes, or every time there's a change in a standard that affects either one of those, every single test needs to be rerun. That's a lot of test running that has to happen, and that's why we need machinery and technology to help us do it.
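To make that contrast concrete, here is a minimal sketch of the kind of browser automation that already exists today, using the selenium-webdriver package and the standardized WebDriver protocol: load a page, press a key, and check the resulting DOM state. The URL and selector are placeholders. The commented-out final step, reading back what a screen reader actually spoke, is the part that has no standardized equivalent today; the screenReader object is hypothetical, and filling that gap is what the ARIA-AT automation work is aiming at.

```typescript
import { Builder, By, Key } from 'selenium-webdriver';

// Browsers already have standardized automation (WebDriver), so this much is
// routine today: drive the page with a real browser and assert on its state.
async function checkboxTogglesWithSpace(url: string): Promise<void> {
  const driver = await new Builder().forBrowser('firefox').build();
  try {
    await driver.get(url);
    const box = await driver.findElement(By.css('[role="checkbox"]'));
    await box.sendKeys(Key.SPACE); // toggle the custom checkbox from the keyboard
    const state = await box.getAttribute('aria-checked');
    if (state !== 'true') {
      throw new Error(`expected aria-checked="true", got ${state}`);
    }

    // What is missing is a standardized way to drive the assistive technology
    // itself and capture its output. Nothing like the call below exists in a
    // standard form today; `screenReader` is hypothetical:
    //   const spoken = await screenReader.lastSpokenPhrase();
    //   ...assert that `spoken` conveys the role "checkbox" and the state "checked"
  } finally {
    await driver.quit();
  }
}
```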
Got it. Well, this all sounds really positive and encouraging, but I'm sure there are skeptics out there at this point. So, does this mean that all screen readers actually have to be the same? And what would you say to those who are concerned that this might negatively impact competition and innovation within the assistive technologies industry?

That's a great question. The goal is not to have identical output between screen readers. The goal is to have equivalent meaning and robust support of the expectations in the standards. Once you standardize the fundamentals of a technology, as Matt just alluded to, it's easier for innovation to blossom and for people to embrace that technology. The web itself is evidence of that, as we just described. Matt, do you have anything else to add?

Yeah, I would say that I understand the skepticism, because people love the way their screen reader works and they want it to keep working the way it works. We're not trying to change the things that actually work. We're just trying, when something doesn't work, to figure out what the right thing to have happen is within the context of that given screen reader. That's why this takes a long time, and why there's a lot of negotiation that has to go on. But the thing is, in the past there may have been discussion of this, there may have been negotiation, like, we think this is what screen readers ought to do, but there wasn't any way to codify the output of those discussions and then make sure that the thing we agreed should happen keeps on happening as all the technologies change.

Yeah, so Michael and Matt, you guys have been working in the working groups of the W3C, the standards body, to develop all of this infrastructure and these definitions to actually make this possible. That's what's been going on for the last few years. But now we're at a different place, aren't we?

Yeah, now we're at the place where we have a good chunk of this infrastructure developed. We have what we call a working mode in place, where we know how we want to interface with the different stakeholders, and we have agreements with a lot of people about how to work. But now it's time for us to really start processing all of this data and running the tests. We have a lot of tests written; I think as of today we have tests drafted for 30 common web design patterns, and that represents something like 4,000 tests, somewhere in that neighborhood. These are all proposals at this point; we have to begin that long, arduous process of getting industry-wide consensus on them.

And in addition to that, we are also making good progress on creating the technology to automate all of those tests. We have a prototype of a driver to automate this on Windows, we're starting to explore it on other platforms, and we're starting to push forward standardization of that automation.

The automation is really amazing when you stop to think about what it means, because if you're a small web development shop, the power of that automation is the same as for a company like Meta. We're all running the same automation at that point, and we can all guarantee this is going to work the way it's supposed to work. Suddenly all boats rise with the tide: everybody gets to be better, everybody gets to be more consistent. And so, again, we can focus on making really cool websites that do different things, we can focus on more content, we can focus on all the stuff we want to focus on,
knowing that it will also be equally accessible, which is what we're trying to achieve. So the promise of automation around this is, I think, maybe one of the most significant opportunities this presents.

Got it. So let's get down to it, because I'm really curious about the answer to this question. We're of course in 2021; WCAG is more than 20 years old, and screen readers are more than 30 years old. If this technology is so fundamental to the type of work that we're doing and to what we expect from our technology, I'm wondering: why hasn't it happened yet? Why do we have interoperability for everyone who uses a browser, but we don't have it for people who rely on assistive technologies? Maybe we can get into some of the history behind all of this.

Yeah, it's kind of a long, complicated question, but without going too deeply into it, I think the first thing we have to realize is just the sheer volume of work required to get this done. We've already alluded to the amount of investment that had to happen to make Web Platform Tests work for browsers; that took many years, and it's ongoing. So there had to be somebody at some point who made a decision, like, we just have to get down to business and do this, right? That had to happen. But even before that, the standards that make it possible: the Accessible Rich Internet Applications standard itself only came into being in 2014. People should understand that that standard is something different from WCAG. WCAG is like the building code that says put a ramp on the building, and ARIA is the standard that says how you build that ramp. We didn't have that until 2014. Before ARIA, there wasn't any way to say that this element on the page is a button, this element on the page is a link, and this one is a checkbox, unless you were using out-of-the-box, what we call pure vanilla, HTML. But if you were building websites the way most modern websites are built, there's a lot of other fancy stuff on that website that there was no way to make accessible without that ARIA standard.
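To ground that point, here is a small editorial sketch, not code from the panel, of the difference Matt is describing. A native HTML button exposes its role, name, focusability, and keyboard activation for free; a generic element exposes nothing to assistive technology until ARIA attributes and scripting supply those semantics by hand. The button label is made up for illustration.

```typescript
// "Pure vanilla" HTML: role, accessible name, focus, and keyboard activation
// all come built in with the native element.
const native = document.createElement('button');
native.textContent = 'Add to cart'; // announced roughly as "Add to cart, button"

// A generic element conveys nothing on its own. With ARIA, the role can be
// declared, but focus handling and keyboard activation still have to be wired
// up by hand, which is where inconsistencies between screen readers creep in.
const custom = document.createElement('div');
custom.textContent = 'Add to cart';
custom.setAttribute('role', 'button'); // exposes the "button" role to assistive tech
custom.tabIndex = 0;                   // makes it focusable like a real button
custom.addEventListener('keydown', (event) => {
  if (event.key === 'Enter' || event.key === ' ') {
    event.preventDefault();
    custom.click(); // activate on Enter or Space, as a native button would
  }
});

document.body.append(native, custom);
```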
Yeah, and while screen readers have been around for 30 years, they haven't been around for 30 years on mobile devices. We think of our world today as if we've always had smartphones, right? How did we ever live without them? But there was a day, not that long ago, when we didn't even have smartphones, and even when smartphones came out, a screen reader didn't come out on the first one; it took a while to figure that out too. So it does take some time. It's been a dream for a long time, and it has just needed time to incubate, and that's what's been happening. That's really why we're excited about this, and you can hear the enthusiasm in our voices. We're at that moment where it's really time to present this and encourage people, because there's enough here for people to jump in on and take part in and actually make this real. So it's pretty exciting in that respect.

Right, like you said, Mike, they've been around for 30 years, but I think it's really important for people to understand that for the first 15, at least 20, years, screen readers were really a lot more of a hack than anything else. They literally read the text being sent to the screen; they would suck data out of video drivers and things like that. They weren't what we would call engineered approaches. You couldn't really design accessibility back in the early days of Windows and so forth, not until some standards, accessibility APIs as we call them, came into place where it was possible for a screen reader to know what was on the screen without guessing. Now they can know what's on the screen without guessing. We have all those standards and technology in place, and now we can actually test to make sure everything is doing the right thing.

Yeah, I remember, I had a hand in helping develop the VoiceOver screen reader for Mac OS, and that was only in 2005. That was really one of the first efforts to actually build a screen reader into the operating system as if it belonged, and not just let someone hack into the system to figure out what to say or do, or to enable people to interact with a computer. So to your point, Matt, I think that's a good one. It seems like we've been at this a lot longer than maybe we have been. So we're catching up, and we're catching up fast, right?

Yeah. It seems like the stars have finally aligned, perhaps, and that was part of the drive towards pursuing this project in the first place: we finally have what we need to do it. So I guess my next question would be, what comes next? What can we expect, what are the milestones ahead, and how long will this initiative take to roll out?

So, we have a 2023 goal that's pretty big. We want three desktop screen readers and two screen readers on mobile operating systems to be really well positioned for nearly full interoperability by the end of 2023. That means that all of what we call accessibility semantics that are defined for HTML and for ARIA will have at least proposed tests, and we'll be running all those thousands of proposed tests every time any one of those technologies changes.
We'll know what all the bugs are, and by that time we will also have in place all of the processes and the consensus, and we will have enough leverage to fix most of them. I can't say that we'll be at the point where all five of those are really fully interoperable at the end of 2023, but our goal is to have all the machinery in place at that time to make sure that it will happen and that it is inevitable. And this is a never-ending thing. That's only five screen readers; in fact, right now we're just focused on the three desktop screen readers, that's it, because if we can't do it there, we know we can't do it anywhere else. So we're starting there, and then we're growing, and then it will go to more screen readers and then to more kinds of assistive technologies. But that 2023 goal is a big, giant milestone for us, and we're really pushing hard to get there.

Fantastic. Well, I feel like there's so much we didn't get to cover today, and I'm really curious to know how we go about making this happen in practice and what kind of impact this is ultimately going to have on accessibility moving forward. So how can people listening today learn more at this point?

Yeah, so that's a deep topic; it will take a while to cover, and there are a lot of different angles. We have another session tomorrow on the schedule, and we'll cover that topic in more depth and dive into some of the more technical details.

Okay, great. So, final thoughts today: what's your main takeaway on why this initiative is so important, and why should people believe in it?

Well, it certainly can be done. We've seen this happen with the web and with other big projects, where standards came along and allowed interoperability, and then suddenly those ecosystems flourished. That's what we're proposing here. We want people to grab onto that vision, that things can and should be much better than they are. In a lot of ways we're just calling on everyone to say, don't put up with this anymore. It shouldn't be that you have to know how to use three or four screen readers on three or four different devices with different operating systems just to get your work done, just to go to school and learn, just to stay in touch with your friends. You should be able to pick the one that suits you best, enjoy it, have it work, and know it's going to work before you even try it. That's the call to action here, that's the excitement around this, and it's a massive step change for accessibility to become more formal, more consistent, and higher quality across the board. So for us it's a very exciting moment, and we're really encouraging people: if you're one of those developers, if you're one of those platform owners, if you build a web browser, even if you're just an end user of a screen reader, it's time to start saying we want more and we want it to be better, and we have a way to do this. Let's get on board, let's work together, and let's make it the world we want it to be.

Yeah, it is going to take a community and a lot of work and effort, and we know that there are going to be some more bumps along the way; it's not going to be clear, smooth sailing. But we really believe that this isn't actually optional at this point, because if we're ever going to have a web where it is possible to build for every single user, then we just have to have this. It's a big investment and a lot of work.
It might seem, compared to the number of people involved, like what we get out of it is disproportionate. It's not. People matter; every one of us matters. The investment is important, and it needs to be made.

[Music]

Yeah, I'll just go ahead. I was going to say, I was thinking about your question, Caroline; it's such a good one, and I'm sure there are people out there saying this seems impossible, it's never happened, it probably will never happen. I just want to encourage people and say this: when Apple proposed to make a smartphone out of glass, the entirety of the vision loss community said, that's impossible; we will never be able to touch that glass and understand what's under our finger and be able to use that device. And yet, when you go around today, almost everyone who is blind and uses technology has a smartphone, and it's glass. It was figured out, and it works even better than what people had before. So I think we're really encouraged by breakthrough moments like that, and we're feeling like this is that breakthrough moment for web browsing and for assistive tech interoperability. We want to paint that picture for people, to say this can really happen; we've experienced these moments before, and this is one of them. That's why we're excited to share it.

Yeah, I think Mike and Matt said it best: let's make the web more stable and more inclusive. Well, this was a wonderful discussion and an absolute pleasure to moderate today. Thank you, Matt, Mike, and Michael, for sharing your work and your ideas; I'm very much looking forward to learning more and watching this project unfold. So, bottom line, I feel like we need to continue to find ways to make screen readers a great experience, so that people have technology that's actually working for them to achieve what they set out to do. We hope this panel has provided everyone with an introduction to this project and the possibilities for the future. Please bring your questions to tomorrow's session, which will be an interactive deep dive into the technical aspects of how this all works and how we can make it real together. We hope to see you there.

[Music]