[Settings]: When a tooltip is visible on the screen, Voice Access is not able to access any control in the Terminal app #18271

Closed
opened 2026-01-31 06:08:49 +00:00 by claunia · 6 comments

Originally created by @ghost on GitHub (Aug 23, 2022).

Windows Terminal version

1.15.2003.0

Windows build number

10.0.25179.1000

Other Software

Test Environment:
OS: Windows 11 22H2 OS Build 25179.1000
App: Windows Terminal Preview
AT: Voice Access

Steps to reproduce

Repro Steps:

  1. Open Windows Terminal.
  2. Open Voice Access AT.
  3. Inside the Settings > Startup page, use the 'Expand Launch size' command; when the command executes, a tooltip will appear.
  4. While the tooltip is visible on screen, try to access the app using the available commands without dismissing the tooltip (a quick UIA check for this is sketched below the steps).
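
Since Voice Access drives apps through UI Automation, one way to check step 4 from code is the minimal sketch below. This is our illustration, not part of the original report: it assumes only the stock IUIAutomation COM API and that Windows Terminal is the foreground window. Run it once with the tooltip open and once after dismissing it, and compare what an AT can see.

```cpp
// Illustrative harness (not from the report): count the UIA descendants of
// the foreground window -- the same tree Voice Access walks. If the open
// tooltip disrupts the tree, the reachable elements will differ between a
// run with the tooltip visible and a run without it.
#include <windows.h>
#include <uiautomation.h>
#include <cstdio>

int main() {
    CoInitializeEx(nullptr, COINIT_MULTITHREADED);

    IUIAutomation* uia = nullptr;
    CoCreateInstance(__uuidof(CUIAutomation), nullptr, CLSCTX_INPROC_SERVER,
                     __uuidof(IUIAutomation), reinterpret_cast<void**>(&uia));
    if (!uia) return 1;

    // Assumption: Windows Terminal (settings page open) is in the foreground.
    IUIAutomationElement* root = nullptr;
    uia->ElementFromHandle(GetForegroundWindow(), &root);
    if (!root) return 1;

    IUIAutomationCondition* all = nullptr;
    uia->CreateTrueCondition(&all);

    IUIAutomationElementArray* found = nullptr;
    root->FindAll(TreeScope_Descendants, all, &found);

    int count = 0;
    if (found) found->get_Length(&count);
    std::printf("UIA descendants visible to an AT: %d\n", count);

    if (found) found->Release();
    all->Release();
    root->Release();
    uia->Release();
    CoUninitialize();
}
```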

User Experience:
People with limited mobility who use Voice Access will be impacted here, as they may not be able to use controls and may be confused about why.

Guideline Reference:
EN 301 549 V3.2.1: 11.06.02 No disruption of accessibility features

Attachments:
[Tooltip.zip](https://github.com/microsoft/terminal/files/9400232/Tooltip.zip)

Attachment Note: Unable to inspect the Terminal Preview app with the AI and Inspector tools.

Expected Behavior

When a tooltip is visible on the screen, Voice Access users should be able to access all controls in the Terminal app.

Actual Behavior

When a tooltip is visible on the screen, Voice Access is not able to access any control in the Terminal app; nothing happens when users issue the 'Click control name', 'Show numbers', etc. commands.

Observation:
Voice Access users can access controls after dismissing the tooltip, or by using the 'Show grid' command.

To learn more about Voice Access and how an application can support its different modes, refer to https://www.osgwiki.com/wiki/VoiceAccess_Testing

Voice Access Commands page: https://sway.office.com/Q5JvoIGBI5glyoKw?ref=Link


@zadjii-msft commented on GitHub (Aug 23, 2022):

This definitely isn't specific to one single control - the mouse was left on top of the "launch size" expando, and that caused the tooltip to appear over it.

Now, the bigger question here is: how should Voice Access interact with XAML tooltips? Clearly it can't walk around the tooltips in the UI tree. I suspect this needs to be moved way upstream, either into WinUI itself or into Voice Access. @carlos-zamora, thoughts?
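
For reference, "walking around" tooltips is expressible in UIA itself: a client can build a tree walker whose condition excludes ToolTip elements. The sketch below is ours, not Voice Access's actual implementation; it assumes only standard IUIAutomation calls, and whether Voice Access can or should do this internally is exactly the open question above.

```cpp
// Sketch: a UIA TreeWalker that filters out ToolTip elements, so that
// navigation skips any open tooltip instead of getting stuck on it.
#include <windows.h>
#include <uiautomation.h>

IUIAutomationTreeWalker* MakeTooltipSkippingWalker(IUIAutomation* uia) {
    // Condition: element's control type is ToolTip.
    VARIANT type;
    type.vt = VT_I4;
    type.lVal = UIA_ToolTipControlTypeId;

    IUIAutomationCondition* isTooltip = nullptr;
    uia->CreatePropertyCondition(UIA_ControlTypePropertyId, type, &isTooltip);

    // Negate it: only visit elements that are NOT tooltips.
    IUIAutomationCondition* notTooltip = nullptr;
    uia->CreateNotCondition(isTooltip, &notTooltip);

    IUIAutomationTreeWalker* walker = nullptr;
    uia->CreateTreeWalker(notTooltip, &walker);

    isTooltip->Release();
    notTooltip->Release();
    return walker;
}
```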


@carlos-zamora commented on GitHub (Oct 11, 2022):

Yup, this is a XAML Islands bug. I've filed https://github.com/microsoft/microsoft-ui-xaml/issues/7815 with the XAML team.

CC @v-rpundir


@ghost commented on GitHub (Oct 19, 2022):

Tracking ID: [Bug 41849571](https://microsoft.visualstudio.com/DefaultCollection/OS/_workitems/edit/41849571): [XAML Controls Gallery -> ToolTip]: When tooltip is visible on screen, unable to access app using available commands ('Click control name' and 'show numbers' etc.) without dismissing the tooltip.

FYI: we do not track XAML bugs on GitHub, per the XAML testing team's guidelines.


@carlos-zamora commented on GitHub (Oct 19, 2022):

Closing now that the bug has been accepted.


@damian-666 commented on GitHub (Aug 3, 2023):

Hey, Carlos, just an idea. From a quick look over this repo, it seems like you're pushing this forward more than anyone else, so I want to run another approach by you. I see you work for Microsoft, and this isn't really an open-source project, and I just wanted to throw ideas at you. I'm a user; I'll use whatever works for me. I've been using Windows for 99% of my career. I don't want to use Ubuntu and Julia, but I think I have to for a scientific project that could have a huge impact. But I just noticed that PowerToys has an OCR feature (the kind Google has always had) that could let you go to, say, a video game with rendered buttons, get the text from those buttons, and then let you press them by voice, so you wouldn't have to do all the work you've done with all these different UI protocols.

What I think you would do for XAML, or for anything, if you can get into the visual tree somehow (I don't even know how to do it): find a button via its name and localization, put the words you want to say into a dictionary, and limit the language model's domain so that it guesses first that you might be referring to those buttons when you're in that context, before just putting up numbers or letters. What I would suggest: I'm going to look at my screen, and if I see a button I'm going to say its name. If I see a tooltip, I might remember what was on it and use that the next time I want to invoke the command; I might not even be looking at my screen by then. So I really think some of the work being done is redundant, and maybe you just do the OCR. I know it's a bit more expensive, but when you feed it into the speech-to-text AI first, it gets cheaper and more accurate, and it will get implemented faster; and the faster and better it's out there, the more you'll save the planet 20,000 hours of mucking around clicking and mousing. I know what it's like working in a huge company, especially on UI: everyone likes their own ideas better. But what's great about this is that you can leave the messy UI as it used to be, put this layer over it, and people will discover it, and then you'll have something like the feature search Visual Studio Preview has. I don't know if I mentioned this before, but that combines the discoverability of menus and big dialog boxes with the random access of commands, without having to remember them. It's a sort of visual completion for UI, without putting numbers on the whole page, that goes deep into nested tabs and dialogs, and does command, query, and control. After a while you won't even have to look; if you can get it multiplexed, have it talk back. This has such great potential; it's very exciting. This probably doesn't belong here; it belongs under ideas. Somebody's going to run with this thing and push it forward, or it's going to get stalled. Usually UI is a mess because no one can come to a consensus on it, but I think this really does have the potential for consensus: no matter how bad the UI is, you just tell your computer what you want to do and it figures it out, because the AI is getting really, really good. The biggest risk is that the engineering effort may be huge, but the least amount of work is the best, and that's why I'm making this suggestion: it's not a bug or a complaint I'm logging here, it's a suggestion on how to consolidate some of the implementation. I don't know where it goes, so if you want, copy-paste it and forward it to some of the people in engineering and see what they think; it might save them some work. The most important thing is context sensitivity: those LLMs need to be domain-specific models. You're doing natural-language understanding and you're also doing speech understanding, and those need to be combined. There are papers on it; Bard is doing it; you guys can do it.

That's basically the only problem I'm having with it: I'm talking in a very specific context, and it's guessing words I would never say in that context. So I have to speak very clearly, and I shouldn't have to. Sometimes it's guessing something that's right on my screen; it just shouldn't do that. And it needs hot words, and you have to be careful with those; there are safety issues involved. A good assistant doesn't ask you for confirmation all the time: if you're doing something you normally do, or if it can be undone, it just goes ahead and does it. So picture the best personal assistant you could have: pretend you can't even type and someone is typing for you. How would they most annoy you? Besides spying on you, trying to sell you junk, offering you products, or interrupting your conversations, as Siri has done.

As for what people are working on right now: I'm not talking about superconductivity, I'm talking about superluminal space travel. Those are the people I talk to, and they're not all crazy; there are theoretical possibilities we haven't thought of; still, 1918, Emmy Noether and the conservation of momentum, and 't Hooft, the Nobel laureate who is the father of the holographic principle. These are people I want to help get the simulations they need working, and I'm stumped with typing. That's why I'm here bugging you. I think we can pound this out and get it done. Let me know if there's anything I can do to help; I'll look over everything you've done and see if I can find any bottlenecks or redundancies. I'm not here to troll or to annoy you. I know this is in the wrong place; maybe it needs to be copy-pasted into a discussion or a feature request, but no one's going to look at that. I know how these organizations run: some of the best product designers resign, and that's it. Sometimes it's the bullies who get their way and drive how things change, and the most aggressive people aren't necessarily the strongest people in your organization; that's what usually goes wrong. I'm trusting that you aren't going to be intimidated by any of those kinds of people.

It's too many cooks in the kitchen and such, but there are people in theoretical physics who need to be using computers right now, and I'm not talking about materials science or cheap or free energy; I'm talking about space transport, and not to Mars, somewhere nice. So I'm trying to make this happen faster. I promised I would push for this, and that's why I'm here being annoying and putting things in the wrong place. Sorry, I know this should be under enhancements or whatever, but I think it should be addressed before features are implemented that might be subsumed later by doing it 100 percent now. "Go big or go home," etc. I don't want to touch my keyboard or mouse again. We can do this. Sorry, I just get excited, because my hands are arthritic and my back is broken, and I'm spending too much time engaged with my computer now; holding this mic isn't a good idea, and my hands went numb, so I'm going to find a better mic. It's very important to have a cardioid directional mic that you speak directly into; noise is a big deal, and you need a clean signal. I'm using a $200 preamp and mic; I don't know if you're getting the same results as me, but I'm getting very, very good results given that it's not using context yet, and I'm very impressed. Things have changed since the last time I looked at this feature, when I thought: no way, I'm too old for this, I can't be coding anymore. None of the people I talk to are going to code, and it takes 50 years to get to the level of theoretical physics where you can figure out something like what we're trying to figure out. No one's got time to type. And no one can hire, either; no one can even afford interns or freelancers to do this. They all have to be professors with graduate students, so none of our best theoretical physicists have the money and the time to write simulations. They need funding and a lot of graduate students to do this stuff, it never gets finished, and so we don't get it done. We're stuck on Earth using rockets, and Mars isn't a great place to aspire to for a second home.


@carlos-zamora commented on GitHub (Aug 3, 2023):

Hi @damian-666, I suggest you submit this feedback to the relevant team (sounds like Voice Access, maybe) via Feedback Hub. Here's a link: https://support.microsoft.com/en-us/windows/send-feedback-to-microsoft-with-the-feedback-hub-app-f59187f8-8739-22d6-ba93-f66612949332

That'll make sure it gets to the right place :)

Reference: starred/terminal#18271