[ MUSIC ] Speaker 1: Oh, I love that. Ailsa Leen: Okay, great. Hello, everybody.
Thank you so much for coming to an 8:30 session on, I think, the last day of Ignite. I don't know. It's the last day for me.
But we're really excited to be here, to be bringing this session about inclusion to Ignite audiences. I know you've been here for a couple days already, heard a lot of things about AI and building with AI and all the great tools. But we're here to talk about who gets to build this AI and how we can make AI development more inclusive.
So I'm from Microsoft. Our presenters will introduce themselves in a second. But at Microsoft, we've been doing some great work with our friends at EY on making AI development more inclusive.
And we're excited to share it today. So my name is Ailsa Leen. I'm a principal program manager on the Azure AI team.
I am an Asian woman wearing a black t-shirt today. My pronouns are she and her. Heather Tartaglia: Right.
Thanks, Ailsa, and good morning, everyone. By way of introduction, Heather Tartaglia. I am with EY, and I'm EY's Neurodiverse Center of Excellence Global and America's Implementation Lead.
That's a mouthful. But what that really means is that I help internally at EY as well as with our amazing clients, likely many of you in the audience today, on their own neuro-inclusion journeys. I'm proudly wearing an EY-Microsoft branded zip-up today, a black zip-up.
I have long dark black hair. My pronouns are she and her. David Mondello: Hello, everyone.
Welcome to our session this morning. My name is David Mondello, and I'm a Design Researcher at Microsoft. I am an Italian American man wearing a brown jacket today, and my pronouns are he/him.
John Franzen: Hello, everyone. My name is John Franzen. I'm a Supervising Associate and a Technologist with EY's Neurodiversity program.
And my pronouns are he/him. My visual description, I'm a white man wearing a navy-blue polo. Ailsa Leen: Great.
Well, we're here today to talk about inclusion, like I said. And I think inclusion is so imperative in the AI space in particular. As we know, for the last few years, if not more, AI has been really changing our society and our daily lives.
And so the impact of it is so vast and far reaching. And therefore, it's so important that the opportunity that gets unlocked by AI is something everyone can leverage. That we design the society of today and tomorrow to include rather than exclude.
And that benefits everyone. That benefits all of us, not just marginalized groups. And we can create more creative ideas if people are brought into the AI economy.
We can create more accommodating and more navigable products and spaces. And so one example that I use a lot is Teams captions, captions in Microsoft Teams. And those are something that is so imperative for deaf and hard-of-hearing participants to follow along during meetings and participate.
But it's also something probably almost everyone in this room has used at some point to help make it easier for them to follow the meetings as well. So Heather, what about you? How do you see the need for AI and inclusion and all those things at EY?
Heather Tartaglia: Yeah, Ailsa, thank you for that. And by the way, you'll quickly find out that I'm extremely passionate about this subject. And like Ailsa, I also use captions.
It's extremely important as part of my day-to-day. But from an EY perspective, we really think about inclusion as a catalyst for innovation. So, when you think about disruptive technology and where the world is headed from an AI perspective, the power of thinking differently is critical now more than ever.
And from an EY perspective, we really focus on how we can capture all of the unique differences and strengths that we bring to drive innovation and apply creativity. And so it's how do you create the psychological safety, the safe environments, to tap into that level of creativity and inclusion, and ultimately drive innovation. And it's not masking or shaving down those differences but really harnessing the power of all of our differences.
Ailsa Leen: Yeah, thanks, Heather. So I know that you work at EY, in particular on neurodiversity. So I wanted to start us off for the audience and just everyone with some common definitions.
So could you tell us how you define neurodiversity at EY? Heather Tartaglia: Yeah. Thank you for that.
I think it's really important that we all start out with some kind of standard and unifying language. And so when we talk about neurodiversity, it's really an umbrella term. It means everyone.
And this really highlights that our brains all think differently. Our motivations, our interests, the way that we think or socialize may all be different. And now, within neurodiversity, there are two types of cognitive profiles.
Neurotypical is one cognitive profile, the quote-unquote "majority": the way that individuals tend to think, socialize, and so on, and the way that society would expect. Whereas neurodivergent individuals are hardwired with a permanent cognitive difference, and they think differently.
And this is a beautiful thing because, as I talked about a bit earlier, when we talk about how do we tap into all of our unique strengths and how do we drive innovation, it's really critical that we think differently. We're not going to solve the most complex problems if we all think the same. One other key piece, I mentioned majority.
When we think about the global population, an estimated 15 to 22% of the world's population is neurodivergent. Last year, new research was shared. And it's really staggering: 53% of the Gen Z population, around 13- to 27-year-olds, identify as being neurodivergent.
So that majority is now changing as you think about the workforce that's entering now for the next 10 years and our future leaders and managers. Ailsa Leen: That's such a cool statistic. We've practiced this, we've done talks before.
And every time you share that statistic, I just think it's so cool, Heather, because it really shows it's not a minority anymore, unlike the neurominority that we usually talk about. Once we get to a majority, it's just acknowledging that everyone has differences. Please tell me more specifically, though, about what you do at EY, because I know you do some very cool work here.
Heather Tartaglia: Yeah. Maybe I'm a little bit biased, but I think it's pretty cool. So about nine years ago, we launched our EY's Neurodiverse Centers of Excellence, and you can see on the screen here.
And we really started with the ambition to harness the power of thinking differently. I should say that I do not sit in a DE&I, HR, or talent role.
I actually come from the business side. And so we really set out to harness, from a business perspective, how we could drive innovation and tap into emerging technology skill sets that could ultimately solve our business problems differently, but also our clients' business problems differently. And since launching our neurodiversity program, in which we're leveraging the talents of truly integrated neurodiverse teams, both neurotypical and neurodivergent profiles, we are now in 25 cities and 15 countries around the globe, and expanding.
So I'm super proud of that. And you'll see some of those data points on this screen. But we've seen tremendous value creation, whether we talk about innovation, process efficiency, employee pride, or a sense of belonging, which is really important. We've seen $1 billion of value creation internally, for our clients, and for our communities.
Ailsa Leen: Great. Thank you for sharing, Heather. And so to bring this back to why we're here together, talking about neurodiversity and talking about AI, you probably are aware that we announced the Azure AI Foundry at Ignite this year.
So Azure AI Foundry is our unified platform at Microsoft to design, customize, and manage AI solutions. So when our team came up with this concept, when we started working on it, we knew it was so important that anyone, any developer with any ability, cognitive ability, physical ability, would be able to build and leverage Microsoft's AI stack. We're building the tool.
We want it to be the go-to tool for the AI of today and tomorrow. So everyone needs to be able to build with the AI of today and tomorrow. So the team did quite a lot of work to make sure that AI Foundry was accessible for assistive technologies in particular: getting feedback from blind developers and developers with mobility disabilities, who tested the product, found so many bugs, and helped us fix them.
And so we got to a place that, it's always a journey, but we feel like we're at a pretty good place with assistive technology support. But we knew we wanted to go a little bit further. That's one portion of developers out there.
But as Heather was talking about, neurodiversity is another area that is growing, and one we also think is quite correlated with the technology sector. A lot of neurodivergent strengths correlate really well with roles in building and developing AI. And so we knew that we wanted to work with EY.
They've been longstanding partners in the neurodiversity space with us. And we basically came together to get feedback from neurodivergent technologists on how we could improve the portal experience at Azure AI Foundry for everyone. So what I'm going to do is show a little quick video that sums up some of the work that we've done.
And then we'll have a little chat about that after. So yeah. That's a little bit about what we did.
Essentially, Microsoft and EY came together to brainstorm and develop improvements. And that resulted in the notification experience that we showed in the video today. The thing that I really loved about this project was being able to bring not just myself and the folks presenting today but a really large number of the product team, the engineers, product managers, and designers who worked on AI Foundry, together with users, with our developers who are interested in AI.
They want to learn more about it. Some of them have been building for a while. We all came together, mostly virtually.
It looks more collaborative in the video, but it was such an energizing time. And I really think more product teams, whether it's inclusive design or just good design and research, should come together with these groups more often. So that was a great experience for me.
And the other thing was we had such a great opportunity to learn from you on how to do so in a neuro-inclusive way. The way that we collaborate, building pauses into our design (inaudible) and spaces. We really had to be more prepared as well and know what we were going to do each day.
Because sometimes, I'll have to admit, I run meetings a bit more on the fly than we did with you all, where we prepared our participants and colleagues accordingly. So, yeah, it pushed us to be better. That was how it was for me.
How was it for you? Heather Tartaglia: Ailsa, that's wonderful to hear. We've been working on this over the past year, and it was such an exciting collaboration.
We really could not have been more excited to collaborate with you and David and obviously John and our broader team that is not represented here today. But ultimately, as we think about this project, it really aligned to EY's core principles as we think about inclusive design. And those three core principles are accessibility, usability, and co-creation.
And so we were so excited that this project really aligned with those three elements and with how you can incorporate neuro-inclusion into inclusive design more holistically. So we are super proud of the results and of how this is ultimately going to help more technologists enter the room and drive more creativity for the product. It's an entire segment of the world's technologists that I think will allow for better use of the product.
So we are really excited about that. Ailsa Leen: Yeah. Great.
Well, I'd love to invite David and John, John, you saw in the video, to come and talk a little bit more about the details of that work and what went into it. David Mondello: Great. Thank you so much, Ailsa and Heather.
And as you guys so nicely laid out, EY and Microsoft really approached the design process with a shared understanding of the differences that people have and how that can inform innovation. And so we're going to talk a little bit now about the specifics of how we translated some of those shared values into a deep multi-month collaboration. So stepping back a little bit, during the public preview version of AI Foundry, we received lots of feedback that the experience was a little too complex.
So we were trying to think about who might be able to help us work through some of these issues of complexity. And we specifically wanted to partner with neurodiverse developers who might experience the cognitive strain even more acutely than the general population of developers. And so by working to resolve the complexity for this one group, we hope to then extend a simpler product experience to all developers.
And solving for one and extending to many is one of the core principles of inclusive design here at Microsoft. So once EY came on board, it really provided a unique opportunity, an extended opportunity to inform the broad arc of the product development process. So it wasn't just involving neurodiverse developers at the start to learn a little bit more about where they were running into issues.
It wasn't bringing them in in the middle just to get some ideas around how to design solutions a little bit better for them. And it wasn't at the end, after everything had already been built, and we're just looking to evaluate the product. But it was really stitching all those different phases together and having an extended collaboration.
So the inclusivity was really built into the very fabric of the collaboration. So I want to bring John in now. And, John, when you first heard about the opportunity to partner with Microsoft on co-designing the Azure AI Foundry, what was it about the opportunity that made you want to participate?
John Franzen: Well, two things. First, like you all saw in the video, it was an opportunity for us to talk about neuro-inclusion and raise awareness of it. Second, on a more personal level, it was an opportunity for me to learn a new emerging technology.
Just as a bit of background for me, I've done a fair amount of work with several previous emerging technologies like robotics process automation and blockchain. So I also saw this as a good opportunity to learn another new emerging technology. David Mondello: Mm-hmm.
Yeah, and I think a lot of developers are in the same spot when it comes to developing with generative AI. So your curiosity and your clear-eyed perspective were really critical in helping to ground the collaboration. John Franzen: Thank you.
David Mondello: So to kick us off, one of the first things we wanted to do was to get a little bit more granular about what was contributing to the perceived complexity. And so the main question we were trying to answer was around what parts of the public preview experience were putting the greatest strain on cognitive demands like focus, communication, learning, decision-making, and memory. And to explore these questions, we had our co-designers from EY build a chatbot on their own over several days using the public preview version of Azure AI Foundry.
And you can see here in the video some of the steps that they went through, including grounding responses in a set of relevant documents, manually and systematically evaluating the responses, and then ultimately deploying a web app. After each step in the process, we had the co-designers fill out structured reflections about their experience. And so, John, I'm wondering if you could reflect a little bit about that first phase of the collaboration.
And what about your initial development experience in Azure AI Foundry worked well for you? John Franzen: So two things. First, with AI Foundry itself, I found that once I got over the learning curve, it had a lot going for it.
It was very straightforward, with only a few hang-ups. Second, our EY developers had a lot of support from the Microsoft development team. David Mondello: Got it.
So that's good to hear. I'm wondering if, in addition to some of the things that worked well for you, what were some of the things that you remember being kind of frustrating about the experience when you first started? John Franzen: Yeah.
So early on, when I was starting to work on my AI chatbot, I ran into a problem. Just for context, in order to set up an AI chatbot, you need to select a region where it's deployed. I didn't know that.
And I kept selecting a region where the account already had another bot deployed. So it wouldn't let me set up my bot. But it wasn't telling me enough information about what was going on.
So I ended up having to talk directly with the development team to figure out why it wasn't working. David Mondello: Mm-hmm. Yeah, so just understanding a little bit more about what's going wrong and how you might be able to take practical actions (inaudible).
And then the final thing is, having gone through the process of trying to stand up your own and deploy your own chatbot, did you have a sense of what sort of improvements you wanted to see in the Foundry? John Franzen: Yes. So obviously, as you might expect, most of the improvements that I wanted to see were things to make it easier to understand but also more accessible.
Just as an example, going back to the region issue I had to work through, part of the reason why that was frustrating to me was because I wasn't getting enough information from Azure AI Foundry. It wasn't telling me something like, "This region is unavailable. Please select a different region."
Yeah. Had I been told that earlier, it would have helped me solve the problem faster. And it would have gotten me back on track with building my AI chatbot.
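The fix John describes, surfacing an actionable message instead of a silent failure, can be sketched roughly as follows. This is a hypothetical illustration only, not the real Azure AI Foundry API: the `select_region` helper, the `RegionConflictError` type, and the region data are all invented for the example.

```python
class RegionConflictError(Exception):
    """Raised when a requested region is blocked by an existing deployment."""

def select_region(requested, existing_deployments, available_regions):
    """Validate a deployment region, failing with an actionable message.

    existing_deployments maps region name -> deployment already on the
    account (the silent conflict John ran into).
    """
    if requested in existing_deployments:
        # Name the conflict and suggest concrete alternatives instead of
        # failing silently -- the behavior John wanted to see.
        alternatives = [r for r in available_regions
                        if r not in existing_deployments]
        raise RegionConflictError(
            f"Region '{requested}' is unavailable: it already hosts "
            f"deployment '{existing_deployments[requested]}'. "
            f"Please select a different region, e.g. {alternatives}."
        )
    return requested

# 'eastus' already hosts a bot, so the caller is told why the
# deployment is blocked and what to try instead.
try:
    select_region("eastus",
                  existing_deployments={"eastus": "support-bot"},
                  available_regions=["eastus", "westus2", "swedencentral"])
except RegionConflictError as err:
    print(err)
```

The point of the sketch is simply that an error which names the conflicting resource and offers valid alternatives would have gotten John back on track without a call to the development team.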
David Mondello: Yeah. So some of what John is describing from his own experience, we heard from the broader set of co-designers that we worked with from EY. And some of the trends that stood out from this phase of the collaboration were that structured learning was not very well supported within the Foundry.
So it was really hard to discover, especially for people who are new to developing with generative AI, how all the various pieces fit together, specifically a lot of the different features and functionality. It was hard to understand a clear workflow and figure out how all those things came together. The other thing that we learned at this phase, when we were looking across the different cognitive demands, was that focus was relatively well supported.
And we knew that there were going to be more things added to the Foundry, and we wanted to ensure that the ability to focus while using it was preserved. And so we wanted to keep that in mind as we went to the next phase of the collaboration, which we'll talk about now, and that's the design sprint.
So we had these insights from our co-designers. We also had an understanding of the product roadmap and the product priorities. And so we did an intensive three-day design sprint.
And it was focused on three primary topics. So there was one around getting started, so having clear learning paths about developing with generative AI in Azure AI Foundry. The second, and we'll talk a little bit more about this one in depth, is around notifications.
And we saw that in the video. So preserving context and communicating high-severity issues in an actionable way. Finally, transitioning between code and UI.
So just helping developers better understand the trade-offs of selecting where they're going to do their work. Each day of the design sprint, the co-designers and Microsoft product builders shared time together. So in the mornings, both groups were present, and the product builders from Microsoft had a chance to ask questions of our co-designers and also get feedback on designs.
In the afternoons, product builders from Microsoft would iterate based off of the feedback. And we did that throughout the week, finally settling on three concrete design directions to start to build into the product. So, John, I'm going to ask you to reflect a little bit about your experience in the design sprint.
I think this was a forum that you hadn't been familiar with before. And so when you went into it, what were some of your expectations about what you were going to be doing? John Franzen: Yeah.
So I knew there would be some back and forth with the developer team. And I knew that they would be taking a lot of our feedback into consideration. Other than that, I went in just about ready for anything.
David Mondello: Mm-hmm. That's great. And I'm also curious to hear from you what it was like to work side-by-side with product builders from Microsoft.
What was that process like for you? John Franzen: I thought it was a good experience. The product builders, like David said, they showed me several different mock-ups about how they wanted to implement our feedback.
And they asked me for my input on each one of them. So it was a recurring process of looking at one mock-up and being asked, how does this look? Okay.
How does this other mock-up look? How do the two of them compare? Are there elements of one that you like better than the other?
So I could tell that they took our feedback seriously and gave it a lot of consideration. David Mondello: I'm glad to hear you feel that way about working so closely with our product builders. Lastly, I was wondering how well you felt the feedback that you did provide made its way into the designs that you saw.
John Franzen: Yeah. Thank you. So most of my feedback was about the notifications over troubleshooting, once again going back to the regions situation I had to deal with.
So what I wanted to see were notifications that could help you figure out why something wasn't working and get you on the right track but also notifications that didn't distract you from what you're working on. Some neurodiverse individuals have a hard time dealing with distractions like that. So now there's a sidebar with notifications that you can open up to check.
And many of those notifications are actionable. And so, as an example, if you're trying to deploy your AI chatbot, you can just open up the sidebar and check the progress. And the benefit here is that if you're easily distracted, you can just wait to open up that sidebar until after you're done with your current task and then open up the sidebar and get caught up on things going on in there.
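The non-interruptive pattern John describes, where notifications queue quietly in a sidebar until the user chooses to read them, can be modeled in miniature. This is an illustrative sketch, not Foundry's actual implementation; the class and method names are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Notification:
    message: str
    actionable: bool = False  # e.g. "deployment failed: pick another region"

@dataclass
class NotificationSidebar:
    """Collects notifications without interrupting the current task."""
    _queue: list = field(default_factory=list)

    def post(self, note: Notification) -> None:
        # No pop-up, no focus steal: the note just waits in the sidebar.
        self._queue.append(note)

    def unread_count(self) -> int:
        return len(self._queue)

    def open(self) -> list:
        # The user drains the queue when they decide they're ready.
        notes, self._queue = self._queue, []
        return notes

sidebar = NotificationSidebar()
sidebar.post(Notification("Deployment 'chatbot-1' in progress"))
sidebar.post(Notification("Deployment failed in eastus", actionable=True))
# User finishes their current task first, then checks the sidebar:
for note in sidebar.open():
    print(note.message)
```

The design choice being illustrated is that delivery and attention are decoupled: nothing is lost while the user stays on task, and actionable items are still there when they open the sidebar.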
David Mondello: Okay. Well, again, I'm glad to hear that, some of the feedback that you provided, you saw that and reflected in the designs that we iterated on throughout the week. So a couple of thoughts about the design sprint and what we heard from some of the other participants, and again, echoed from some of what John has already discussed.
But from other co-designers from EY, we heard that -- and this is a quote -- "Seeing our thoughts and ideas evolve into practical solutions for neurodiverse individuals was quite rewarding and inspiring." And then on the product side from Microsoft, the sprint really helped underscore and personalize the different needs to consider when designing AI Foundry.
And often, researching, designing, and building are done in sequence, in somewhat isolated stages. But in the sprint, product builders had direct and dynamic access to users of the product, leading to a more parallel product development approach. It also provided the space to explore ideas or directions that would have been a little more difficult to pursue during the regular rhythm of product development.
So those are some of the things we heard from the co-designers and from the product builders that participated in the sprint. So we had these ideas coming out of the sprint, and we had a clear direction. But we still had to do the work of embedding the design ideas into the actual product, so putting the outputs into the product.
To help ensure that the spirit of what came out of the sprint actually landed in the GA experience, we had our co-designers review early builds featuring the improvements that they helped envision. We're going to talk a lot about notifications here. But I'm wondering, John, if you could describe what it felt like to see something that you helped shape in the process of the design sprint actually land in one of the early builds?
John Franzen: Yeah. So I thought it was rewarding to see our feedback be implemented into Azure AI Foundry. It told us that they really did take our feedback seriously.
And it was also humbling in a way that they would trust us this much with their product. David Mondello: Well, we were only able to get it to that spot with all the very useful contributions that you and all the rest of the co-designers provided us with. So in addition to the feedback that we got from our co-designers, this phase also featured insights from neurotypical developers.
And what we found was our explorations that came out of the design sprint with our co-designers also provided a more focused and actionable experience for neurotypical developers as well. And again, this is addressing that core inclusive design principle of solving for one and extending to many. So again, we talked a lot about notifications in this session.
It's the first of the design ideas to land in the product. But other insights from this collaboration are continuing to inform current and upcoming features, including how guidance and learning opportunities show up within Azure AI Foundry. So, John, I'm going to ask you to reflect a little bit on kind of the whole experience now.
So, as you look back across all these various activities, what did you personally find most valuable? John Franzen: So the first thing, as surprising as this might sound, I think the first thing that I found valuable was actually my initial difficulties in getting my AI chatbot working. Yeah, so just for context, I'm the kind of person who can just sit down and devote a lot of time and energy into working on something and figuring out all the ins and outs of how it works.
And in going through this, I learned a lot about how AI chatbots are supposed to work. Then the second thing I found valuable was actually just being involved in the process from start to finish. It was really rewarding to see how Azure AI Foundry went from what we had started with to what it is now.
David Mondello: It's good to hear that you felt kind of involved throughout the entire process. And I think you're coming away from the collaboration with some new tools in your toolbox as a developer and getting some exposure to the capabilities of AI Foundry. So now that you're on the back side of this process, this collaboration that we did, John, could you talk a little bit about how you see the different pieces fitting together?
So when we started, it maybe was a little bit more abstract. But now that you've seen how everything kind of fits together, I'm wondering if anything's a little bit clearer than when you first started out. John Franzen: Sure.
So, at the start, it felt a bit like beta testing. We had some instructions on what to do. And we were being recorded as we did those tasks for the development team so that they could see how we did, see our progress.
So it wasn't until we started getting to the second and third sprints that things really started falling into place, and it became clear that the builders wanted to see how we think about and look at Azure AI Foundry so that they could make it more neuro-inclusive. David Mondello: It's good to hear that the process of co-designing AI Foundry allowed you to directly shape the experience in a neuro-inclusive way. And we're very grateful for your involvement and continued involvement.
So lastly, John, I'm wondering, if there's others here today that are thinking about taking a similar approach within their own organization, is there any advice that you would give them on how to effectively co-design with neurodiverse individuals? John Franzen: Yeah, so two pieces of advice. The first one is be willing to make adjustments.
As has been said several times here and at several other events, neurodiverse people think differently, and sometimes they have different needs. As an example of this, several of our EY offices have quiet rooms with dimmed lights and reduced environmental stimulation. And it's helpful for neurodiverse individuals who get overstimulated.
They can just go there, calm down, and focus on their work. My second piece of advice is to listen to them. All of our developers and technologists at EY's Neurodiverse Center of Excellence brought technology experience to this project.
And we all had valuable feedback that helped make Azure AI Foundry a better product. Yeah. So just to recap, first, be willing to make adjustments.
And second, listen to your neurodiverse colleagues. David Mondello: Okay. Well, thank you, John, for your contributions throughout this collaboration, including the way you've helped us tell the story.
I've learned a lot from you and everyone at EY's Neurodiverse Center of Excellence about how to expand my own research practice. So thank you again for helping to broaden our approach to product development and to create more neuro-inclusive experiences. So I'm going to hand it back over to Ailsa now.
I think she's going to do some wrap-up slides, and then I think we'll have some time for questions. Ailsa Leen: Cool, so don't worry, everyone. I'm not going to take too long wrapping us up.
And I loved -- thanks, John, for sharing your two tips at the end. I think those were two really important points. And it made me think of one that I wanted to add, which was on the Azure AI Foundry side.
On our team, we had a neurodivergent developer on the Azure AI Foundry team who helped shape this experience with us. So having folks on the product team who could help make sure that the way we were designing that co-design process was inclusive, not just our partners at EY but someone sitting on the product team, made a big difference for us. So big call out to diverse teams from within.
But yeah, basically, my wrap-up is this slide. We have so much good content for anyone who's interested in doing inclusive design projects and co-design at inclusive.microsoft.design. And actually, David and I pretty much used this website to figure out and plan our design sprints. There are great resources in there.
We relied very heavily on the Inclusive Design for Cognition Guidebook that you can see in the screenshot of this webpage. So David had talked about the different cognitive demands. And when we were doing the research with the EY co-designers, we were looking for issues with demands such as learning, focus, memory, and a few others that I can't remember offhand.
But we took all of that from the Cognition Guidebook and learned more about these demands and how to find out more about them. There's also a pretty cool little mini booklet on inclusive AI. So that's another relevant and interesting one to look at.
But big call out for the inclusive design assets on this website. And I think you can find a lot there on how to do these kind of things yourself. Love to encourage people to do things like this themselves.
And yeah, that's us. Final thing, we have one more day of Ignite, and there are a few more accessibility-related sessions. So these are things you can attend later today.
And otherwise, I'll pause for questions and maybe we can bring the rest of the folks up. I think we have about eight minutes. So, if anyone wants to come to the mic or just put their hand up.
Yeah, thank you for listening. Speaker 2: Hi. Thank you for explaining how your process went.
My question is for folks who are building with Azure AI Foundry, do you have any tips in terms of how they can build more inclusively using that tool? Ailsa Leen: That is an interesting one. I think I could have a go.
So the Azure AI Foundry has -- I guess the first thing I would say is anything that you're building with generative AI, therefore with the Foundry, has such strong potential to be an inclusive and accessible tool. Of course, you need to make sure your UI is accessible, screen reader accessible, all those things. But don't forget about the power of generative AI itself as an assistive technology.
So there's been some really great sessions about this at Ignite earlier as well. And so I almost wonder if Anna was asking a leading question because she did a wonderful session on how Copilots can benefit accessibility. And I think, do you remember the stat that EY has, and it's something like 95% of neurodivergent individuals called Copilot an assistive technology?
Heather Tartaglia: Yes, that's exactly right. Ailsa Leen: Yeah, so inherently, anything you're building that is this conversational assistive tool can help people simplify content, get a recap of the meeting. All of these kinds of things are inclusive tools.
So just to kind of short answer the question, I feel like anything you do with generative AI has great potential for inclusion. And then we also have some multimodal abilities in Foundry. You can use our speech services, vision services.
Those are great places to add accessibility into anything you build as well. Generative AI can describe screens now, describe images, videos even. You can add voice to your chatbots in Copilot, and that's another great inclusion tool.
So yeah, lots to be done. Thanks for the question. Anyone want to add anything?
Sorry. David Mondello: You did a good job. Heather Tartaglia: Well said.
David Mondello: Yeah. Speaker 3: Morning. This could be a bit of a difficult question.
I noticed the chatbot removed your speech impediment. How do you feel about that? So when it was doing the translation up here, your speech came out as normal speech.
It removed your stutter. John Franzen: Well, as for me, I'm fine with that. If it helps people understand what I'm trying to say, then I'm perfectly fine with it.
Speaker 3: Cool. Thank you. Appreciate it.
Ailsa Leen: Cool. Do we have any questions in the chat, Courtney, or nothing too much? Heather Tartaglia: We will also be around for the next five minutes.
If there are no questions and you want to connect separately, we're here. Ailsa Leen: Yeah. David Mondello: Thank you, everybody, for coming.
John Franzen: Thank you.