Listener Kenn wanted to know about the process of designing assistive and accessible technologies. Technology has the potential to assist people with disabilities, but designing such tech requires a lot of consideration.
Get in touch with technology with TechStuff from HowStuffWorks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks, and I love all things tech. And one of my favorite things about technology is how we can provide people with capabilities they otherwise might not possess or might not be able to tap into. For people from all walks of life, it can open up access to incredible volumes of knowledge and expertise. It can boost our ability to do everything from traveling faster to figuring out what book we should read next. And it can enhance the lives of people who live with injuries, disabilities, or other factors that make our world a little more challenging to navigate for them than for other people. Which brings us to today's topic, which is a request from TechStuff listener Kenn. Kenn asked me this quite some time ago, so I apologize for how long it's taken me to get to it. But Kenn wanted to know about the development, design, and deployment of accessible technology. That is, technology meant to provide accessibility to people who might otherwise have a challenge accessing that technology. They might have a visual impairment or hearing impairment, or a health condition or injury that prevents them from accessing technology the way other people do. So this episode is all about making tech accessible, and this is just an overview, because accessible tech covers an enormous spectrum of approaches and technologies. I may go into further detail on some of these in future episodes, but this is a very high-level look at the whole idea of accessible technology, with just a few examples thrown in. I want to stress that this is something that can be really difficult to do well. 
And part of that is just because the designers and engineers who are responsible for making the hardware and software we rely upon might not have the same challenges as some of the people who will be using that technology, and so those designers and engineers might make assumptions about how that tech needs to be used. Sometimes those assumptions can exclude people who have these issues, these disabilities, these challenges. I don't want to classify everything as a problem; I'm just talking about people who are unable to access the technology the way everyone else tends to do. The intent these designers have is to help, and even when they're trying to design assistive technology specifically for that purpose, the execution can sometimes be clunky or inelegant, or it fails to make the technology more accessible at all. I think it's of critical importance that designers work with people whose life experience mirrors that of those who actually do have those challenges. So if you're trying to design accessibility technology to help people who have visual impairments, it's important to have people working with you on the development who also have those visual impairments, to give you real-time feedback on designs and to make sure the technology you're creating actually does what was intended. In the United States, there are no laws that mandate accessibility in tech hardware. In general, no company is legally required to make certain the technology it creates is accessible to those who are differently abled, and there's a lack of standards and expertise in the field, a lack of awareness and education, and that complicates matters. It is a little different if you happen to work for the federal government in the United States; there, there is a legal basis for accessibility, but only if you're working for the federal government. 
Section five-oh-eight, Standards for Electronic and Information Technology, states that, quote, when federal agencies develop, procure, maintain, or use electronic and information technology, federal employees with disabilities have access to and use of information and data that is comparable to the access and use by federal employees who are not individuals with disabilities, unless an undue burden would be imposed on the agency, end quote. So it's saying that anytime you're going to be incorporating technology into a federal agency, there must be consideration put toward making that technology accessible to people who have disabilities. But that puts the burden more on the agency than on the companies making the technology, right? Unless the federal agency is making its own technology, it's outsourcing that. And when it outsources it, the agency has to take a very close look at the different competing products on the market, and it may very well mean that the federal agency will choose a product not because it was the best cost for what they needed to do, but because it fit this other requirement better than the competitors did. It does create an incentive. But this is just a policy that only applies to people who work for the federal government. It doesn't apply to state governments or local governments, and it doesn't apply to consumer products in general. But things are not dire. There are a lot of major companies working very hard to make certain that the technology they create is accessible to people of all abilities. More on those efforts in a bit. One other thing I want to mention is that accessible technology is distantly related to another concept called regionalization. This refers to adapting technology so it is meaningful and helpful to people in different parts of the world. 
Again, people who speak a different language than the developers do, or who live in a culture that has different values and therefore different needs and different challenges, require special consideration. Technology is not always a one-size-fits-all solution. Sometimes you have to tweak things for the end user. And I'll probably do a dedicated episode about regionalization in the future, because there are a lot of different aspects to that as well, from software to hardware. But let's go back to accessibility. I'd like to quote the organization EARN, which stands for the Employer Assistance and Resource Network on Disability Inclusion, as they lay out on their web page about technological accessibility: quote, when talking about technology, accessible means tools that can be used successfully by people with a wide range of abilities and disabilities. When technology is accessible, each user is able to interact with it in ways that work best for him or her. Accessible technology is either directly accessible, whereby it is usable without additional assistive technology (AT), or it is AT-compatible. For example, a mobile smartphone with a built-in screen reader is directly accessible, whereas a website that can be navigated effectively by people with visual impairments using a screen reader is assistive technology compatible. Now, another thing I want to mention early on is another tool called Edge, which in North America is a technology assessment tool developed by the Urban Libraries Council. The ULC creates a place for libraries to share and learn from best practices in numerous fields that relate to libraries and communities. And to all you librarians out there: thank you. You guys are awesome. And hey, a former TechStuff host is among your number; one of our former TechStuff hosts is a librarian. So back to Edge, and to talk about what's important here for this podcast. 
The Edge Toolkit includes many different benchmarks to guide libraries in making the best use of technology in order to help communities, and that includes everything from training people to become more tech savvy, to increasing digital literacy, to providing access to the Internet and other valuable services that are absolutely crucial for interacting with society today. It's very easy to get left behind if you are not of a particular demographic, right? I come from a very rural part of Georgia. I live in Atlanta now, but I come from a very rural community. When I was growing up, it was extremely rural, to the point where we had cattle farms and chicken farms around us. I was in a little subdivision, so it wasn't like I was miles away from everybody, but a lot of people didn't have access to technology. And granted, the Internet was not really a big thing when I was growing up; I'm old enough to predate when the Internet was prevalent. But even to this day, people in those communities don't necessarily have access on a daily basis to the technology that a lot of us take for granted, and their local libraries act as a community center and a point of contact for that kind of technology. Libraries can also serve as a way for people to become more educated about that technology so that they can make meaningful contributions in a world that is increasingly reliant on that tech. And so libraries are one of those points where people aren't being left behind, because they have access to those libraries. This is Jonathan on a soapbox saying libraries are really important. It's more than just checking out books, which, by the way, is also really important. Anyway, let's get back to the benchmarks. There's a specific one, Benchmark Eleven, that's about accessibility with technology. The core statement for Benchmark Eleven is: libraries ensure participation in digital technology for people with disabilities. 
So to meet that benchmark, libraries should in fact provide access to technology for people who have disabilities. The library must have at least one public terminal that has assistive technology incorporated for the visually impaired. It has to have a public terminal that is accessible for people who have motor and dexterity impairments. It needs to have a workstation that can accommodate people who are in a wheelchair or other mobility vehicle, that kind of thing, et cetera. So it needs to have these points of access for people who otherwise would be left out. And I think it's valuable to look at the Edge tool, not just to acknowledge how important libraries are (by now it's clear that I think they are very important), but because the benchmarks lay out goals; they get the conversation rolling about making technology accessible to all people. Now, some areas of tech are ahead of the game compared to others, like the World Wide Web. The World Wide Web Consortium, or W3C, formed back in nineteen ninety-four with the goal of developing standardized protocols for the evolution of the Web, to make sure it moved forward in a way that was easily distributed and developed so that you didn't have a bunch of splintered experiences. They wanted something that was going to be interoperable across the entire Web. One of the four domains it covers is the Web Accessibility Initiative, or WAI, and that has five primary purposes: to ensure that web tech supports accessibility, to develop guidelines for accessibility, to develop tools to evaluate and facilitate accessibility, to conduct education and outreach, and finally to coordinate with research and development. As such, the W3C has issued guidelines for web content accessibility to help ensure that developers make sites and services that are in fact accessible to a broad range of people. And those guidelines are pretty exhaustive. 
They have sixty-five checkpoints, and each checkpoint has an assigned priority between one and three. A Priority One checkpoint is one that tackles a barrier that would make access impossible for one or more groups of people, so it identifies something that could potentially make technology completely inaccessible to an entire group. Priorities Two and Three mark challenges that would make it difficult, but not necessarily impossible, for one or more groups to access the web content. And this is a big issue in general, just the idea of accessibility. A two thousand eleven report from the University of New Hampshire's Institute on Disability found that people with disabilities make up nineteen percent of the United States population. That report points out that if people with disabilities were recognized as a minority group, they would actually be the largest minority group in the United States. That represents a large population of people, and without these considerations for accessibility, nearly one fifth of all citizens of the United States would be left out of being able to access that technology. So that's why this is of critical importance. When I come back, I'll give you some more examples of the types of assistive technology and accessible technology that are out there. But first, let's take a quick break to thank our sponsor. For people who have visual impairments, there are a variety of assistive technologies, and they range from ways to help people with limited vision to ways to help people who have no sight at all. One of the simpler implementations is the screen magnification system. These are digital screen magnification systems that enlarge text and graphics on the computer screen, making it easier for people who have impaired vision to read and see things. So it acts like a magnifying glass. 
Typically, a user will move a cursor over the page using keyboard strokes or a mouse, and wherever the cursor is, that's the area that will increase in size to make it easier to see, so you can scan across the page as if you're using a magnifying glass to read text on a piece of paper. Some systems also include an option to scroll down the page at a predefined rate, so that you don't have to sit there and babysit the mouse and/or keyboard and move the cursor manually. It'll move automatically at a speed you've already determined, so that you can just read along. Many will also offer a contrast adjustment so that the highlighted or magnified text or image stands out more dramatically against the background. Here in Georgia, our electronic voting systems have that kind of capability built into them, which is a good thing. But pretty much everything else about those electronic voting kiosks isn't so great. They are a very old and vulnerable technology, but that's a matter for another podcast. The accessibility features of that technology are good, so I want to stress it's the rest of it that may not be so great. Operating systems like Windows (and it's not just Windows) have accessibility features that can adjust font size, resolution, scroll bars, icons, color scheme, and more. macOS has a similar series of options, all designed to make use of that technology easier for people who have vision impairments. Windows also has a screen magnifier feature built in. It's not the only operating system to do this, but again, it has that. There are also screen reader technologies; essentially, that comes down to a text-to-speech program that can read out text to a user. I've definitely engaged that by accident in the past. 
I found it pretty interesting that it was an integrated feature. Now, I don't need a screen reader, but the fact that it was built into the operating system did impress me. It meant that the designers were thinking ahead and wanted that capability to be standard, and not something a user would need to add on as an option. Apple has a feature called VoiceOver. It's frequently cited as an innovative assistive technology. VoiceOver gives auditory descriptions of what is on a screen, including all the different on-screen elements such as on-screen controls, and it can work with apps as well. That requires developers to actually tag the various elements in their apps so that the VoiceOver function can effectively relay that information to the user. So it's kind of like meta information, specific meta information for VoiceOver. For example, a play button in an app that would initiate a sound would need an appropriate tag so that the VoiceOver system could alert the user about the function and relative position of the button. Then there are things like refreshable braille displays, which provide access to information on a computer display through braille. And this time we're talking about a display that's not a screen. It's typically a flat surface that has little holes through which retractable, rounded pins can extend up and form characters in braille. A typical braille display can show around eighty characters at a time. It's refreshable because the pins can fall or rise and form different characters, and they'll do this as you move a cursor around the screen. So you can scan over a screen, these characters will change under your fingertips, and you can read braille that way. The history of braille, by the way, is pretty darn interesting all on its own, so here's a quick tangent about the history of Braille. 
Louis Braille was a young man in France, born in eighteen oh nine, who suffered a childhood injury at the age of three that left him blind. He attended school, and he would listen very carefully to lessons. He was a good student, and he went on to attend the Royal Institution for Blind Youths in Paris. That school owned a few books that had raised print on them. The raised print was created by stamping pages with copper wire that had been bent into the shapes of letters, so it was the letters of the alphabet embossed, essentially, on pieces of paper. Braille heard a story about a soldier named Captain Charles Barbier de la Serre. He was a captain in the French army who had developed a system of writing for soldiers so that they could send and read messages even at nighttime without turning on any lights, because a light would give away your position, which in wartime is not great. This system used raised dots and dashes on paper; you would create these raised elements, and then by running your finger across them you could read the message. So Braille decided to adopt that method when developing his own system of writing. In eighteen twenty-nine, Louis Braille published the first book written in braille, and he would later add more characters to his system to incorporate things like musical notation and mathematics, and the system would be tweaked over the decades that followed. It really only began to get widespread adoption well after his death in eighteen fifty-two. But flash forward to nineteen seventy, when a company called Papenmeier and a man named Dr. Werner Boldt of Dortmund University in Germany created a refreshable braille display called Braillex, like braille with an X at the end. It was a standalone system for information storage and retrieval, not a computer peripheral in itself. It was a system that would allow you to create information and store it in braille, and to retrieve information and have it displayed in braille. 
When writing on the device, you would assign a code word to the information, essentially a file name, so you could call up a code word, or file name, and then retrieve the information and read it in braille. It originally used an audio cassette data storage system, so it would store everything on magnetic tape, and later models would upgrade to a floppy disk system. Later still, it would include a serial port, which would allow the user to connect the device to other peripherals like a computer system or a braille embosser, or braille printer, so you could actually print your work out on paper. The first refreshable braille display for users in the United States was the VersaBraille, which did not launch until nineteen eighty-two. The National Federation of the Blind released NFBTRANS, a braille translation program, in nineteen eighty. That program could translate text into braille and send it to a special braille embosser, or braille printer. The original program sold for hundreds of dollars when it first came out, but later its utility was considered so important that the NFB released the software to the public domain. Dr. Kenneth Jernigan opened the International Braille and Technology Center for the Blind at the National Federation of the Blind headquarters, and that center evaluates assistive technology for the blind; it puts the technology to the test to make certain that it does in fact help, as opposed to becoming a frustration or an impediment to using technology. And as you would expect, this technology found its way into other form factors, including portable devices. In fact, it found its way into portable devices really early. In nineteen eighty-seven, the Braille 'n Speak became an early forerunner of the personal digital assistant, and this was all for the blind. 
This became the first of a class of devices called note takers, and it had a speaker and a six-key braille keyboard. It was made in Germany, and it could speak to you; because it was made in Germany, the synthesized speech actually had a bit of a German accent, which is kind of interesting. I watched a video that played some of the audio from one of these devices. Typing would involve pressing key combinations to access options and type in letters, and it could save information and bring it back to you later. And this was all before personal digital assistants would emerge for sighted users. There have been many note takers since that one way back then, and some cool pieces of technology have incorporated braille displays over the years. One recent one I've seen that I thought was pretty interesting was the Dot Watch. It's a smart watch with its own refreshable braille display. It looks like a big white watch face with little round holes cut into it, through which the pins can extend and create the various characters in braille. The watch can thus display the time, and it can alert you to messages; it can actually spell out messages. It's like a smart watch, and it's pretty nifty in concept. I have no idea how well it performs in the wild. I don't know if it works as well as the concept would have you think, but I love the idea. I hope it works as well as the videos suggest it does, because I think it's super cool to have a braille smart watch. Well, I have a lot more to talk about with accessibility technology, but first let's take another quick break to thank our sponsors. For the hard of hearing and the Deaf community, assistive technology can include things like captions and transcripts. Automatic systems that generate captions and transcripts do exist. 
They typically use speech recognition technology to analyze speech and try to create text based on it. They are of variable quality. I've seen some caption programs that are pretty good, and I've seen others that end up telling a story that turns out to be very different from the one being delivered by the audio, which could be funny while you're watching if you're able to hear the difference: you're reading the text, you're seeing where it's not matching up, and you're thinking, oh, that's kind of funny. But if that's the way you get the information, it's really frustrating. It is not accessible, it is not inclusive. It's excluding you, and I can't imagine how upsetting that's got to be after a while, when you realize, as you're reading, that there's no way that's what that person is saying, because it doesn't make any sense. But those are technologies that are getting wider use, and as speech recognition gets better, I expect that we'll see those error rates go down and it will become a less frustrating experience. In the meantime, though, that is a problem. Now, granted, there are plenty of services out there that actually use human beings to create transcripts and captions, where someone listens to the content and actually types it out, and they tend to be a little more reliable in general, depending upon the content they're listening to, the dialect or accent being spoken, and the home dialect or accent of the person doing the transcription, because sometimes you listen to someone from a different part of the country or a different part of the world, and it becomes a real challenge to figure out what the heck they're saying. I should know; I married a woman from Philadelphia, and there are days where we just don't understand each other. She talks faster than I can hear. To be fair, we Southerners are slow-talking people. 
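By the way, the "error rates" I mentioned for automatic captions have a standard measure in speech recognition: the word error rate, which counts how many word substitutions, insertions, and deletions it takes to turn the machine's caption into what was actually said, divided by the length of the true transcript. Here's a minimal sketch in Python; the function and the example phrases are just my own illustration, not code from any particular captioning product:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word error rate: edit distance over words, divided by reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words (classic dynamic program)
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting i words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting j words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,         # deletion
                d[i][j - 1] + 1,         # insertion
                d[i - 1][j - 1] + cost,  # substitution or match
            )
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# One wrong word out of four gives a 25% word error rate:
print(word_error_rate("turn on the lights", "turn on the light"))  # 0.25
```

A caption track that "tells a different story" would score high on this measure, and the steady improvement in speech recognition shows up directly as this number going down.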
Another technology that is often used for accessibility is haptic feedback. Haptics refers to feedback that engages the sense of touch. Typically it involves a rotating weight that's slightly off center, so that when it rotates, it creates a vibration or rumble sensation. This is the technology inside a phone that creates the vibrate feature, and the rumble feature in a video game controller uses the same sort of thing. That's a very common use of haptics, and they're often incorporated into things like robotic prosthetics, where they help give pressure feedback to users so that they can get a sense of important information, such as how hard a robotic hand is gripping an object. So say you have a robotic prosthetic, a robotic arm, and it gives you the capability of picking up different objects, a whole variety of things at a variety of weights. Ideally, you want a robotic prosthetic that is as capable as the average human arm, so you need some sort of feedback to let you know how hard you are gripping a thing, so that you're not going to cause damage to it. If you're picking up, let's say, a lightbulb, you don't want to squeeze it so hard that you break it. But if you're picking up a brick, you want to make sure the grip is strong enough to actually lift the brick. So haptics are frequently used as a feedback mechanism to alert the user to how much force is being exerted by that robotic prosthetic. They're also used in systems to help visually impaired people navigate through different environments. I've seen some videos of this. I'm not aware of any technology that's out in the wild beyond the prototype stage, but I've seen this approach, and I think it's fascinating. 
These systems hook haptic feedback up to optical sensors, cameras essentially, so a wearer who has a vision impairment can walk around, and whenever they get close to an obstacle that's picked up by the optical sensors, they get feedback: a vibration that tells them, oh, you're getting close to something, maybe a table or some stairs. It's sort of like Spidey sense, in a way, which I think is pretty interesting. But again, I'm not aware of any technology in the wild that people have easy access to; I have only seen laboratory videos of this sort of thing. There's so much more we could talk about. I haven't really touched on mobility issues. There are many technologies designed to help people who have challenges in moving on their own to operate technology. These range from Bluetooth controllers that can map app commands to simple switches that someone might operate with a foot or a finger, or even their mouth. These sorts of devices do exist, and mapping them to apps allows people to navigate that technology, which improves accessibility. There are other types of apparatus that people might depend upon, including things like mouth sticks or head pointers, which can allow them to interface with computers and other devices. There's eye-tracking technology that can allow someone with mobility issues to take control of a device using their eyes, to navigate or to type out messages. And there are brain-computer interface designs, which are the next step out: they would interface directly with the brain to send commands from someone's brain to a computer system to create some sort of effect. Those are still fairly rare, and they're still very much in the testing phase for a lot of different use cases. 
But we've seen some incredible advances in that area. It does require a very long session of training, not just for the person who's using the technology, but for the technology itself, so it can interpret the person's commands, because when our brains send commands, they are not doing it in a precisely identical way every time. So it actually requires quite a bit of adaptation and tweaking to get the system tuned just right to the specific user, but it does have the potential to really improve accessibility. It's really exciting stuff. Some R&D labs are incorporating various ways to simulate these challenges so that a person, a designer or an engineer, who may not have these sorts of disabilities can simulate them and see what it's like to interact with their technology under that simulated disability. So, for example, you might have gloves that limit hand mobility, or you might have goggles that reproduce color blindness, and then you try to navigate your product, your technology, and you see where the challenges may be, and you think, oh, we need to fix this, because I didn't realize that with this design, people who have color blindness can't effectively see a very important element on the website; they're not going to be able to differentiate these colors I've picked out. So it may require either creating an alternative so that people who are color blind are able to see it, or just changing the design entirely. All of it is meant to get designers to think outside of their own experience when they're creating products like hardware and software, to make sure those products are accessible to that larger user base. There's an organization that I think is really important and is going to help a lot, called Teach Access. It's working to create tools and curricula to introduce accessible design education in academia. 
Ultimately, the goal is to create educational courses that would be part of your design education when you're starting to go into this sort of field, so that you learn about accessible design while you're still in school. The group hopes to encourage the engineers and designers of tomorrow to take accessibility into consideration as a core design principle, not as something you add on at the very end, but as something that's incorporated from the beginning. The group's website asserts that a lack of understanding about basic accessibility issues presents a huge challenge today, and that by creating these curricula, these courses that can be incorporated into a student's work, much of that can be mended. And this goes back to what I was saying at the beginning of the show: designers draw primarily upon their own life experience when they're building their tools and their products, and it takes a lot of discipline and effort to step outside of your own experience and create stuff that's for everyone, not just people who happen to think and behave the way you do. And again, this isn't necessarily coming from any sort of exclusionary mindset; it's not like people are trying to create stuff that excludes others. You can probably think of a few different companies that have made products that appeal to a certain niche audience that happens to be similar to the people who made those products. For example, I'm going to throw Google under the bus here. Now, I'm an Android user. 
I use a lot of Google products, and I've come to the conclusion that many of their products, especially their web-based products, were made by engineers for engineers, and that people who were not of the engineering mindset may have found some of those products and services challenging to navigate, or counterintuitive, and as a result those various products and services didn't get widespread adoption. It's become kind of a longstanding joke that Google would introduce a new online product and then, within three or four years, discontinue it because of lack of adoption. But I think a large part of that was because some of the design, while it made perfect sense to the people who were making the product, did not necessarily make sense to other people with different experiences. So it wasn't that the products were bad; it's just that they were not accessible to everybody. Teach Access has grown out of efforts from accessibility teams that were founded at Yahoo and Facebook, but now there are lots and lots of companies and institutions that are part of it, like Stanford, Microsoft, Twitter, Adobe, Google, LinkedIn, AT&T; a whole bunch of different companies and organizations are part of this initiative. It's exciting to see companies take such an active role in a holistic approach to increasing accessibility. Tackling it through education and incorporating it into fundamental design strategies is a really good start. And there are two really big reasons that I am so passionate about this. I know I've talked about this kind of thing in a very touchy-feely way that some people might find off-putting, but frankly, I don't care, because I think there are two things that really drive my passion here. First, there's compassion. I want inclusion, so that people are not frustrated or isolated because of the design of technology.
I want them to be able to enjoy the benefits of technology, to flourish in part because technology is allowing them to do so in ways they could not otherwise. I want that to be the case. And from a selfish perspective, I want it because you can't even imagine the contributions people might be able to make if they just have access to technology in a meaningful way. You never know who's going to come up with the next incredible innovation or the next scientific principle. Imagine if Stephen Hawking had been silenced by his ALS, if he had been rendered incapable of communicating without the use of technology, or even with the use of technology. If he could not communicate, it would not just have been a terrible personal tragedy for him, which of course it would be, and for his family; it would also have left a gap in scientific discourse that would have been a terrible detriment for everybody. So I think accessible technology is important for everyone, not just the people who directly benefit from it, because we can all stand to benefit in the long run in ways that we can't imagine. Here's a simple and somewhat silly example, silly in the sense of the aid that it gave everybody. Back in Kalamazoo, Michigan, the city began to alter its sidewalks. They created some of the first curb cuts. These are the sloped sections on a curb that allow someone in a wheelchair to move from street level up to sidewalk level so they can cross streets. That part is not the silly part; that part is incredibly important. It gives people in wheelchairs the capability of navigating around their city in a way they otherwise would have problems doing.
That part is important. But the part that's a little more silly is that those same ramps allowed for other uses. If you happen to travel around on a scooter and you don't want to go in the street, and there are no local laws that prevent you from riding on sidewalks with a scooter, then these slopes allow you to do that. And parents pushing strollers can move from street to sidewalk. Those were not the primary intent of those curb cuts, but they are other benefits that rose up from that program. So there can be benefits beyond the primary intended purpose of accessible technology, both from the technology standpoint and for the people who get to take advantage of it. That's why I think it's an important concept and an important endeavor to pursue. Now, obviously we could talk in greater detail about all these different types of technology, and probably in the future I will cover them in more depth. Eye-tracking technology I've talked about before in the past, but going into greater detail with that would be great, because the tech has improved significantly since I first started playing around with it a few years ago, and it'd be great to talk about the implications of that. But we've kind of run out of time, so I want to thank Ken again for the suggestion to do accessible technology. And if you guys have any suggestions for future episodes of TechStuff, whether it's a type of tech, a company, a person in tech, maybe someone I should interview, let me know. Send me an email. The address is techstuff@howstuffworks.com. Remember, you can also get in touch with me on Twitter and Facebook; the handle for both of those is TechStuff HSW. And don't forget, if you haven't yet picked up your amazing Ada Lovelace T-shirt, you should go to the TechStuff store over at teepublic.com/techstuff. That's t-e-e-public dot com slash techstuff.
We've got all sorts of designs that can go on anything from a T-shirt to a tote bag, to a coffee mug, to a phone case; lots of different stuff, stickers too. Every purchase you make helps out the show a little bit, plus you get cool stuff, so go check it out. And don't forget to follow us on Instagram. That's it for me. I'll talk to you again really soon. For more on this and thousands of other topics, visit howstuffworks.com.