The Screen Reader Interoperability Gap – A Hidden Systemic Barrier


[Music]

[On a vintage computer, web pages cycle through; some work and some show incompatibility errors.] Back in the 1990s, web browsers were all over the place. A website that worked in Netscape didn't always work in Internet Explorer, or the other way around. Only big companies with dedicated teams could afford to make websites work in multiple browsers. [On a split screen, several hands type and one uses a mouse.] To fix this, thousands of people from dozens of internet companies worked for over a decade to standardize the web. It was a massive investment of effort that improved the state of browser interoperability, but there were still lots of compatibility bugs. [Back on the monitor, an error message reads "This page best viewed in Netscape."] So in 2012, a group of people started the web-platform-tests project, or WPT for short. Today you can go to wpt.fyi and see the results of millions of small tests that run every day to make sure websites render consistently across all platforms. [On the WPT website, a cursor moves over a browser interoperability graph.] When you use a website and it works the way you expect, it's because we have web standards and automated testing technology.
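Those small tests are themselves just tiny web pages. As a rough, illustrative sketch (not a test copied from the actual suite), a web-platform-tests check is typically an HTML file that loads the project's testharness.js library and makes one small assertion:

    <!DOCTYPE html>
    <title>The hidden attribute maps to display: none</title>
    <script src="/resources/testharness.js"></script>
    <script src="/resources/testharnessreport.js"></script>
    <div id="target" hidden></div>
    <script>
      test(() => {
        // Every browser is expected to render a hidden element as display: none.
        const style = getComputedStyle(document.getElementById("target"));
        assert_equals(style.display, "none");
      }, "hidden elements are not rendered");
    </script>

Thousands of files like this, run automatically in every major browser, are what produce the interoperability results shown on wpt.fyi.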
The reason we're here today is that assistive technology users were not included in this push for interoperability. [An animation of a giant sphere pushing clusters of shapes into the far corner of the screen. Lewis, a light tan-skinned blind man, smiles at us.]

My name is Lewis, pronouns he/him/his. I am from Southern California. [Palm trees sway in the wind. A close-up of someone strumming an acoustic guitar.] I play guitar in a musical duo, and I am proud to be an accessibility tester. Behind me is a desk with a technology setup: a keyboard connected to a desktop, a monitor, and a laptop. The experience for a screen reader user can vary dramatically from one screen reader to the next. In practical terms, this means that I can spend a morning trying out different screen reader and browser combinations to overcome whatever inaccessible content I'm dealing with that day, or sometimes, depending on how the content is rendered, I can't get past it at all. And I'm somebody with the tools to experiment and figure out the accessibility issues. Now I'm going to show you how two screen readers voice the web differently. [The first screen reader reads:] "Delicious pizza that you can get from an accessible site when you need a quick... blank. List of pizza options, region. Checkbox not checked, Pepperoni. Checked. Not checked. Checked." Now I'll be using the screen reader on the laptop. [It reads:] "Pepperoni, off. You are currently on a switch, on/off."

[Screen reader logos appear on screen: VoiceOver, TalkBack, JAWS, Narrator, NVDA, VoiceView, and Orca.] There are many screen readers, with hundreds of interpretation differences like the one Lewis just shared. These differences can exclude people.
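The demo page itself is not shown in the video, but a labelled group of checkboxes like the pizza options Lewis navigates is typically marked up along the following lines. This is a minimal illustrative sketch, not the actual demo markup; the point is that two screen readers can voice the same markup in noticeably different ways.

    <div role="group" aria-label="Pizza options">
      <ul>
        <!-- Each option is a native checkbox inside a list item. Screen readers
             differ in how, and in what order, they announce the group name, the
             list, and each checkbox's role, state, and label. -->
        <li><label><input type="checkbox" checked> Pepperoni</label></li>
        <li><label><input type="checkbox"> Mushrooms</label></li>
        <li><label><input type="checkbox" checked> Extra cheese</label></li>
      </ul>
    </div>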
[An animation of shapes dropping onto a conveyor belt and moving right.] Developers who can afford it test across screen reader and browser combinations, but the way screen readers and browsers interpret web code changes all the time. [Shape clusters jump in and out of their place in line.] To make the web truly equitable and inclusive, assistive technologies need to work with web code in predictable ways. This is why we started the Accessible Rich Internet Applications and Assistive Technologies Community Group, ARIA-AT for short. [Here we are on the ARIA-AT website.] In ARIA-AT we are bringing together a community of accessibility experts, user advocates, assistive technology vendors, and web developers to create what we call assistive technology interoperability. [Several different shapes intersect with each other to form one big shape. The big shape shifts around, trying out different combinations.] We are testing common ARIA patterns and collaborating with screen reader companies to resolve inconsistencies. In the future, we envision developers will be able to focus on accessible experience design, and assistive technology users will finally enjoy a web platform designed for them. [The shape grows bigger and envelops the screen.] Visit us at aria-at.w3.org to read our interoperability reports, learn more about our automated testing technology, or even join the group. [A person uses their laptop at a dining room table. They click on "Join community group" and a webcam turns on.] We'd love to have you contribute your expertise. [Big spheres and streams of smaller shapes all flow organically in the same direction.] Let's make sure that web platform reliability includes assistive technology users. We build better when we build for everyone.

The Screen Reader Interoperability Gap: A Hidden Systemic Barrier. Speakers: Brett Lewis, Senior Software Engineer, Vispero; Isabelle Del Castillo, Access Technology Specialist, Prime Access Consulting; Matt King, Accessibility Technical Program Manager, Meta. Moderator: Caroline Desrosiers, Founder and CEO, Scribely.

Thanks, Alice, and happy to be here at Sight Tech Global. Welcome, everyone, to our session, "The Screen Reader Interoperability Gap: A Hidden Systemic Barrier." Hopefully that intro video provided all of you with a sense of what the work of the ARIA-AT Community Group is all about. Today we're going to be catching up with some of the members of the ARIA-AT Community Group and getting an update on the progress they've made and where they're going next. The ARIA-AT Community Group officially launched at Sight Tech Global about two years ago, in 2021, so I'm really excited to hear about all the progress since then. Let's do a quick round of intros. Could you each please tell us what your day job is and what your connection to the ARIA-AT project is? Isabelle, let's start with you.

Hi, I'm Isabelle. I work for Prime Access Consulting, PAC for short, and basically my day job is to create and write test plans for the ARIA-AT project. I also do some baking on the side.

Great. And Brett?

Hi, really happy to be here today as part of this panel. I'm a senior software engineer at Vispero working on JAWS, and I've been interacting with the ARIA-AT group, trying to make sure that JAWS supports everything that is required for these tests and providing feedback on how the tests and their recommendations have been received.

Right, that leaves us with Matt. Over to you.

Thank you, Caroline. I'm a technical program manager at Meta on our central accessibility team, and I'm one of the chairs of the ARIA and Assistive Technologies Community Group. I helped get it off the ground.

Wonderful. Happy to have you all collected here as a group to really dig into what's going on with ARIA-AT, so let's get into some questions. What exactly is screen reader interoperability, and why is it so essential to digital inclusion? Matt, maybe we'll start with you.

Yeah, I would like to start by talking about how people who do not need a screen reader can use the web, and what they have that people who need a screen reader don't have. What most people can do is go to just about any website using any browser, and it basically works. The people who made that website don't really care whether different people are using different browsers, because they can safely assume that things are mostly going to work. They are benefiting from what is called browser interoperability. "Inter" means between or among, so the concept of browser interoperability is simply that users can move from one browser to another and the website is still operable. This is a really big deal. It's really important, because before browser interoperability was a given, just sort of part of the air that we breathe, developing a website that worked for everyone was really complicated. It was pretty expensive, and not everybody could do it, so the web was a lot less open and free, and certainly a lot less equitable. Unfortunately, those of us who are screen reader users don't have access to that same kind of equity, because we don't have interoperability. It makes a big difference which screen reader a web developer uses when they're developing their website, and this has major negative ramifications for web developers, for screen reader developers, and, most importantly, for users. Sites that we would consider accessible are in many cases really only accessible to what I call the elite disabled. For people who are blind, that means people who have multiple devices, multiple screen readers, and, most importantly, the skills, training, knowledge, and experience to work around the kinds of problems posed by the lack of interoperability.

Right, so screen reader interoperability is extremely important for usability and for accessibility. Everyone from developers to users is paying a price because screen reader behavior is just not sufficiently predictable. Let's try to dig into this and understand a little bit more. Isabelle, in your work you help clients build accessible websites. In your experience, what are the everyday consequences for website developers in today's world, where screen reader interoperability is not a given?
Firstly, there's a perceived loss of creativity, and sometimes the perception is accurate. Designers and developers feel that they can't use, or just don't know whether they can use, certain patterns because those patterns might not be compatible with all screen readers. It makes for a negative perception of accessibility, as if it isn't adapting to how products are really made. And then there are the additional costs of training, testing, maintenance, and so on. People are having to learn and stay updated on how ARIA and other specifications are supported by different screen readers, understand the nuances of how various screen readers work, determine whether what they've developed is meeting user expectations, and create fragile workarounds. It's a lot, and mostly misplaced. The testing is not being done by the disabled people who have decades of informed experience of how this tech should work, but by people who are just trying their best to understand the basics. This could be an argument for more employment among disabled users who rely on this access technology every day, but in practice that's not happening, and many disabled people don't really want to be shoehorned into accessibility work anyway. It takes too much time and too much money, and all of this contributes to accessibility being sidelined and deprioritized. Quite often, disabled users can't expect an equitable experience at launch, at product or feature launch time, because too much of the testing and fixing is happening afterwards. It's the classic domino effect: you spend time remediating something that went out before, which takes resources away from the new things being built.

Thank you, Isabelle. So the benefits of interoperability to site developers are pretty clear: there are definitely ways to save time, save resources, and also pave the way for innovations to happen. Brett, it is not very obvious how a world where your competitors' screen readers behave in ways that are much the same as your product would be advantageous to you. Could you explain that a bit more?

Yeah, this is a really interesting question. At Vispero, one of the things we are hoping to do with every release of JAWS is provide innovation in how we support things, and customization. Our goal is to give users as much opportunity as possible to tailor their screen reader experience to what works for them. Do they want to hear something? Do they not want to hear something? Are there ways we can make it easier for them to access information in a way that's efficient and not time-consuming? We offer customization; with JAWS scripting you can do things that you can't do with other screen readers, and so on. All of these things allow us to differentiate ourselves from our competitors. Isabelle touched on this a little earlier in her response when she was talking about the costs of testing. The big thing that all of this ARIA-AT effort allows us to do as a screen reader developer is to rely on its testing for basic functionality. The work we've always done with every release of JAWS to make sure we're not regressing, that something hasn't broken accidentally, can easily be done by this suite of ARIA-AT tests. We can say to ourselves with confidence that we know we have met the basic requirements, and we don't necessarily have to do separate testing, which right now is a manual process but in the future with ARIA-AT will be an automated process, so we can take those results and rely on them. What this means is that we can free up resources that would have been spent on all of that internal testing and put them toward providing the enhanced experience for our users that I talked about at the beginning: innovative new features, efficiency, customization, all of those things. So it really should enable us to provide more for our users if we have this kind of basic testing provided by ARIA-AT.

Right, so this standardization of testing also helps you on your side with saving time and resources, and with innovating as well. It sounds like everything comes together under a unified effort. Let's switch gears a little bit and talk about users. All three of you are screen reader users, and given how important this technology is to you, I'd imagine you have some thoughts on why interoperability is so important to the individual user. Isabelle, let's start with you.

Well, uncertainty is a big factor. Will this website work tomorrow when my browser installs an update? Will I find myself needing an alternative screen reader, or assistance, to complete a task I could do just fine before? Not everybody has the choice and the tools to deal with those things, and even if they do, it's not exactly welcome.

Great. And Brett, how would you respond to this question?

It's funny, I had written down an answer to this, and then I was looking at some of the things we've actually fixed as part of the ARIA-AT project, and it reminded me of what I think is the biggest benefit I've seen. One of the things that's really hard for a screen reader user to know is the things that they haven't heard. In other words, where there's an error and your screen reader leaves something out, it's often really difficult as a screen reader user to know that you've missed something. One of the benefits is that we now know, for example, that if you navigate into a radio group, you're always going to hear the name of the radio group associated with it. That ability to not miss things is, I think, what this work has really done the most so far, and what it has the potential to do in the future, for me.
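As a concrete illustration of the behavior Brett describes, here is a minimal sketch of a labelled radio group. The names and options are illustrative (the ARIA-AT test material is based on ARIA Authoring Practices Guide examples, which use custom, scripted radio buttons rather than native inputs); the tested behavior is that the group's accessible name, "Pizza crust" here, is conveyed when a user navigates into the group, not just the label of the individual radio button that receives focus.

    <h3 id="crust-label">Pizza crust</h3>
    <div role="radiogroup" aria-labelledby="crust-label">
      <!-- Native radio buttons inside a container that names the group. -->
      <label><input type="radio" name="crust" checked> Regular crust</label>
      <label><input type="radio" name="crust"> Deep dish</label>
      <label><input type="radio" name="crust"> Thin crust</label>
    </div>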
Matt, any comments on this one?

Yeah, I would say that for me personally, besides the big equity gap problem that we talked about earlier, the effect of the lack of interoperability on usability is really, personally, very frustrating. Ironically, a lot of the work that web developers do to try to improve the usability of their websites actually makes it a lot worse for people who are using screen readers. A really good, pretty common example of this is that there is code web developers can put into a web page that tells the screen reader how a table is sorted. But very often, because screen readers don't all respond to that code in the same way, web developers will go ahead and put in what they think are super helpful comments: they'll add extra sentences that tell you, for each column, how it's sorted, how you can change the sort, and what you need to do to activate the button that changes the sort. All of that extra wording is baked into the design of the site, so it's something you can't turn off, and it can make reading the table a lot harder. I'm hoping that in the future, when screen readers behave more consistently, developers won't feel like they need to do this kind of thing so much, and we should be able to have much more usable experiences across the board.
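The code Matt is referring to is presumably the aria-sort attribute on a sortable column header. A minimal sketch of that pattern (illustrative, not taken from any particular site) follows. With the sort state exposed declaratively, a screen reader can announce something like "sorted ascending" on the header itself, and the page does not need the extra explanatory sentences Matt describes.

    <table>
      <caption>Employees</caption>
      <thead>
        <tr>
          <!-- aria-sort tells assistive technology how this column is currently
               sorted; the button is the control that toggles the sort. -->
          <th aria-sort="ascending"><button>Last name</button></th>
          <th aria-sort="none"><button>First name</button></th>
        </tr>
      </thead>
      <tbody>
        <tr><td>Adams</td><td>Riley</td></tr>
        <tr><td>Baker</td><td>Jordan</td></tr>
      </tbody>
    </table>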
Yeah, so it sounds like the developers who are taking steps to improve accessibility with the best of intentions would really welcome something like this, because they'd have an exact framework and guideline to follow to achieve that result. Great. So let's learn a bit more about the actual work that needs to be done to make this whole thing a reality. I know this project has been in the works for close to five years. Changing one product is hard enough, as we know; changing the behavior of products across an industry is clearly a big challenge. I understand that a first step is to define the behaviors we want through some sort of standardized testing, and the question is how exactly we do that. Isabelle, can you speak to this one?

In some ways, we are doing the work that users unwillingly take part in every day: assessing current screen reader support levels for patterns, finding the gaps, and asserting what people need in the real world. I suppose the difference is that via this project we have an opportunity to exhaustively establish and formalize those things, in an effort to solve many of the problems we've already covered. It always starts with the creation of a test plan targeting a specific functionality pattern. That's a collection of testing tasks representing how users are likely to interact with it, the commands they might use, and the results they should be able to expect. Those expectations are rated based on likely user impact, distinguishing between things assistive technology must do to create an accessible experience, things it ideally should do, and extra things it may do to help a user out with added convenience. It doesn't stop there, though. Everybody who then works through that test plan to record results may have ideas on how to add to it or make it better and more applicable to users. It's a lot of work, not only to thoroughly anticipate and catalog what users need, but also to comprehensively evaluate the degree to which those needs are currently being met, and to record the results carefully, because integrity is important: we want our data to stand up to scrutiny as well as being useful.

Thank you. Brett, as someone working on screen reader code, what do you then do with the tests that Isabelle and the community group are delivering? Why don't you take that question.

Sure. They've not only delivered us the tests, they've delivered us some of the test results. When we first started getting the test results, we took those and basically fixed JAWS: there were areas where we just weren't meeting all of the requirements, so we would create issues internally, address them, and get fixes out as soon as possible. One example of this, I mentioned the radio groups earlier: one of the things we weren't doing was announcing the name of the radio group when navigating into it using a quick navigation key. This was a pretty important improvement. We were announcing it if there was a focus change, for example if a user tabbed in there, but we weren't otherwise, so it really helped us improve the experience for all of our users right there. One of the other things we found is that sometimes we take the tests and the test results, we say here's a gap that we need to fill in JAWS, we implement what the tests recommend, we roll it out to our users, and what we find is that it's an experience that, for unanticipated reasons, doesn't work for our users as a requirement. An example of this: we had a disclosure pattern where we were supposed to speak list items as you navigate into them using quick navigation keys. We made this change and rolled it out to our beta testers, and we found that on the Audible website, for example, when you do a search, all of the individual search results were inside list items, so as you navigate through the page, every single time you'd navigate to a search result you'd hear too much information. We sent this feedback back to the ARIA-AT group, and they revised the tests, because they're screen reader users and they understand the complexities and why this might not work as a requirement. So really both of those things: implementing fixes based on what the tests identified, and also providing feedback to the ARIA-AT group on what tests work, what should be required, and what should maybe be an optional addition to the test.
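A minimal sketch of the kind of structure Brett is describing (illustrative, not Audible's actual markup): when every search result is its own list item, announcing the surrounding list item each time a quick navigation key lands on a heading or link inside a result adds repeated context on every jump.

    <ul>
      <!-- One list item per search result. Quick navigation keys typically jump
           between the headings or links inside these items, so repeating the
           "list item" context on each jump quickly becomes verbose. -->
      <li>
        <h3><a href="/title-1">First audiobook title</a></h3>
        <p>Author, narrator, length, rating...</p>
      </li>
      <li>
        <h3><a href="/title-2">Second audiobook title</a></h3>
        <p>Author, narrator, length, rating...</p>
      </li>
    </ul>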
So it sounds like we end up with tests that reflect a reasonable set of basic behaviors, and if those tests cover screen reader behavior across whatever is possible for site developers to build, that sounds like a lot of data. Matt, how much data is there exactly, and how are you managing it?

Yeah, it's a lot of data, and it's only getting bigger. We currently have draft test plans for 35 examples of common UI patterns. We get these examples from the ARIA Authoring Practices Guide, and each of those 35 test plans may have 10 to 40 tests in it, so that's hundreds of tests across the plans. Then, for each test in every plan, we have to map the screen reader commands that can be used to execute that test, so we end up with thousands of assertions just with the amount of data we have right now. Over time we'll cover more of the APG. Right now we're only covering three screen readers, and we're going to cover more, and in the future we hope to cover other kinds of assistive technologies, not just screen readers, so the amount of data is going to grow a lot. We have been building the infrastructure it takes to manage all this data and to manage the consensus process for all these tests when we get the kind of feedback that Brett and Isabelle were talking about from members of the community group and from screen reader developers. Doing all of that is a lot of work, and it doesn't come for free. We've been contracting with Bocoup to build the infrastructure; they are the right people for the job because they are some of the same people who have also worked on browser interoperability infrastructure.

And how do you keep all of this current, in terms of the test plans and the reports? Don't you need to rerun them over and over again as screen readers change?

Yeah, that's exactly what we need to do. Initially, everything has to be run by a human and validated by a human; there's no getting around that. But once we have defined expected results for every single test, there isn't any reason why a machine couldn't rerun it, except for the fact that there was no standard way of doing this. So one of the things we're doing is developing something called AT Driver, a standardized API for driving screen readers. Once we have that, the same code can be used to run tests for any screen reader that implements the API. Recently we've made quite a lot of progress in this space, and we have started to rerun some of the tests using automation and to generate some of the reports that way.
Automation, so important. So the project is in year five at this point, and you have clearly made gains in test writing and automation. What other big wins do you have, and what is the outlook for 2024? Matt, over to you.

Yeah, I would say the other biggest win we had this year is that we launched AT support tables in the Authoring Practices Guide. Now, when somebody goes to an example of how to code a pattern, for some of them they can see the level of support for that pattern. This is a really big deal for us, because it exposes the data in ways that we hope are going to be generally useful to a broader audience.

Great. And how can people help at this point? Isabelle, maybe you can comment on that.

Yes. Well, we've already talked about the level of investment faced by individual companies and product creators, and essentially this project is taking that on at scale. We are always in need of experienced screen reader users to review and run tests, people to propose improvements to those tests and to the tools developed by the project, and generally we are looking to expand our reach and the diversity of viewpoints. We really encourage you to reach out if you have any interest in what we're trying to do, even if you're not exactly sure what you could offer, because there is plenty to be done.

And we are hoping to get a lot more data into those tables. In 2024 we're hoping to get several dozen of those AT support tables out there and to incorporate more of the feedback we've received from the community, so there is a lot of work to do in that space.

Certainly sounds like it. And yes, please get in touch with this group; it's certainly an important mission that you are all pursuing with this project. I'd like to do a quick round of final thoughts from all of you about what's next for the ARIA-AT project, or maybe additional calls to action that you'd like to make. Brett, let's start with you.

I think the promise of this project is really hard to overstate, particularly with the impending automation. I think we're really going to see an explosion in the benefit from this as automation becomes a reality and as the number of tests increases; it's really going to provide benefit for all screen reader users. Vispero is very committed to this in the long run. It's one of the things I hear about all the time, which is great, and it's probably part of my performance review as well. There will be long-term commitment to this project from JAWS and Vispero, and I think the benefit is just going to increase.

Thanks. And Isabelle, final thoughts?

Yeah, I just want to say that the aim of this project is not to make all screen readers the same, or to have websites and apps be carbon copies of each other. Quite the opposite: we want uniqueness to always shine through while being backed by consistent support. That way, experiences can be created that legitimately stand out while being inclusive and better for everyone.

Matt, what's next for the future?

Well, I just can't stop dreaming of the day when it really is possible for web developers to build experiences that are truly usable for people with disabilities, and accessibility is really inclusive of all blind people. I don't think that's possible until screen reader interoperability is part of the air we breathe, just like browser interoperability. When we have that, then I think it will be possible for accessibility to not just be limited to the elite disabled; it should be equally available to everyone.

Well, thank you so much to our panelists for a really great session, and to everyone in the audience for attending. It's so exciting to hear about the progress that's been made. Clearly there is a lot of work ahead, but thank you all for sharing a bit more about your work. I'm sure everyone has questions for our panelists, so please bring those to the Q&A session. Thanks again, and back to the team at Sight Tech Global.

[Music]

2023-12-16
