electron: [Windows] Menus don't follow native keyboard conventions

I created the Quick Start app and ran it in Windows 8.1 Professional (64-bit) with "electron-prebuilt": "^0.30.4".

Here are some ways in which Electron menus, used from the keyboard, don't behave like native Windows menus:

  1. Pressing Alt by itself should select the first menu from the menu bar. (Electron just flashes the accelerator keys at you.)
  2. Similarly, pressing and releasing Alt then pressing a shortcut key should bring that menu up (Electron requires you to hold Alt while pressing the shortcut key, per item #1).
  3. When you use the keyboard to choose a menu, the first item in the menu should be pre-selected. (Electron holds the menu open, but no item is selected.)
  4. When you have a menu selected, pressing ← or → should take you to the first item in the menu to the left/right, respectively. (Electron ignores the left and right arrow keys.)
  5. When you have a menu selected, pressing ↑ should highlight the last item in the menu. (Electron highlights the first.)
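The native behavior described above can be sketched as a tiny state machine. This is plain JavaScript for illustration only; `MenuBarModel` and its methods are hypothetical names, not Electron or Win32 API:

```javascript
// Hypothetical model of native Windows menu-bar keyboard behavior.
// None of these names are real Electron or Win32 APIs.
class MenuBarModel {
  constructor(menus) {
    this.menus = menus;   // e.g. [{ name: 'File', items: ['New', 'Open'] }, ...]
    this.menuIndex = -1;  // which top-level menu is active (-1 = none)
    this.itemIndex = -1;  // which item is highlighted (-1 = none)
  }
  // Items 1-2: a full Alt press-and-release activates the first menu,
  // so a following shortcut key works without holding Alt.
  altReleased() { this.menuIndex = 0; this.itemIndex = -1; }
  // Item 3: choosing a menu from the keyboard pre-selects its first item.
  openMenu(i) { this.menuIndex = i; this.itemIndex = 0; }
  // Item 4: Left/Right jump to the first item of the adjacent menu.
  arrowRight() {
    if (this.menuIndex < 0) return;
    this.openMenu((this.menuIndex + 1) % this.menus.length);
  }
  arrowLeft() {
    if (this.menuIndex < 0) return;
    this.openMenu((this.menuIndex + this.menus.length - 1) % this.menus.length);
  }
  // Item 5: Up with nothing selected highlights the *last* item.
  arrowUp() {
    if (this.menuIndex < 0) return;
    const count = this.menus[this.menuIndex].items.length;
    this.itemIndex = this.itemIndex <= 0 ? count - 1 : this.itemIndex - 1;
  }
}
```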

About this issue

  • State: closed
  • Created 9 years ago
  • Reactions: 42
  • Comments: 37 (4 by maintainers)

Most upvoted comments

There is precedent for both—XUL famously reimplemented the entire UI framework on every platform it runs on, while Qt provides a meticulous API around the native menus.

Native menus are easier to make work with assistive technologies, etc., since they conform to the existing OS APIs. Native menus are also more likely to work with wacky configurations (custom themes, RTL languages, OS-level extensions, programmability, etc.)

The advantage of emulating the native behavior is that it allows per-app customization using web technologies, which is the main purpose of the Electron project.

Personally, I lean toward implementing the “known” Windows behavior, since it serves the purpose of the project better.

This is bugging me in VS Code too. I’m used to pressing and releasing Alt before pressing a letter for a menu. Works this way in all apps I use but not Code.

I don’t care whether the implementation is native or not, as long as it behaves the same. A user shouldn’t have to remember to press keys differently to do something like open the File menu.

As a user of assistive technology, I feel that implementing known OS behavior is the best route. Trying to have native menus on each platform could be a maintainability nightmare.

Would you guys accept a pull request for this?

@ckerr , can this be reopened? Left/right keys in the menus still don’t work on Linux (I’ve tested it on VSCode, Slack and Skype)

Is there any progress on this? I’m blind, and I’m developing an app with Electron. The problem is that I can’t access the menu bar.

I did some research into this a few weeks ago but got kind of stuck. I’ll post what I have so far.

Starting with the source files mentioned by @MarshallOfSound:

Possibly relevant MenuViews class that also inherits from Menu (see below for more about Views)

The most relevant-looking files are the *View classes in Chromium’s views namespace and the Electron classes (atom namespace) that inherit from them. Chromium’s Views system is used on Windows and I think also Linux.

Some potentially relevant-looking classes from Electron’s views-related code (atom/browser/ui/views):

Also potentially useful is Electron’s RootView, which owns the MenuBar object and checks whether Alt is pressed to decide whether to hide or show the menu.

Chromium’s FocusManager looked potentially relevant for saving/restoring focus when moving back/forth between the menu and the main window, but I’m not sure about that.

The comments in Chromium’s MenuItemView header (menu_item_view.h) look interesting:

Note, that as menus try not to steal focus from the hosting window child views do not actually get focus. Instead |SetHotTracked| is used as the user navigates around.

Thus SetHotTracked() may be more relevant than using Chromium’s FocusManager, but I’m really not sure.
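As a rough sketch of that distinction, here is the hot-tracking idea in plain JavaScript standing in for the C++. `MenuHost` and its members are made-up names modeled on the Chromium comment, not real Chromium API: keyboard focus stays on the host window while a separate "hot" index tracks the highlighted item.

```javascript
// Illustrative model of the "hot tracking" idea from Chromium's
// menu_item_view.h comment. These names are hypothetical, not Chromium API.
class MenuHost {
  constructor(itemCount) {
    this.itemCount = itemCount;
    this.focusedView = 'host-window'; // real keyboard focus never moves
    this.hotIndex = -1;               // the visually highlighted item
  }
  // Navigation moves the highlight without stealing focus from the window.
  setHotTracked(i) { this.hotIndex = i; }
  nextItem() { this.setHotTracked((this.hotIndex + 1) % this.itemCount); }
}
```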

That’s all I got for now. I’ll do more research/poking around the code when I can.

Would you guys accept a pull request for this?

You would make us all very happy 🙌

So… this issue is just sitting there. Has any more thought or work been put into a fix for this?

I have recently tried Visual Studio code. The good news is that the menu items are actually visible to NVDA if you know advanced features. I’m not saying they’re usable. The process of actually getting to them was incredibly, incredibly involved. But they show up as labeled buttons so all that’s needed is:

  • When alt is pressed and released, the menu bar gains focus.
  • When escape is pressed and the menu bar is focused, focus goes back to where it was before.
  • Left and right arrows move between the menus and enter and down arrow both open them.
  • The menus announce the accelerator key.

I saw in another issue that other menus don’t do this, which is true. But they do give you the first letter mnemonic. Since this menu is not actually a native menu and screen readers aren’t going to do this, they need to be labeled with the key. It’s not always the first letter.

Yet again Electron has come up, this time in a work-related context, and yet again I’ve found this issue. The Accessibility guide should be mentioning that menus have no discoverability and that users of AT may be unable to interact with menus. It’s possible to guess that apps will have standard menus, but enumerating the list of available menus isn’t currently possible save for the most advanced of screen reader users.

The functionality whereby Alt moves focus to the menu is how the standard discovery process works for blind users of Windows applications. It’s not a screen-reader-provided feature. The alternative is to use NVDA’s object navigation or the equivalent. Screen readers can expose the accessibility tree as an actual tree, and users who know to crawl it can find the menus, but that’s the only way to discover that there’s even a menu bar in the first place. None of the common workarounds I can think of an intermediate screen-reader user trying (e.g. focusing the system menu with Alt+Space and then using the left/right arrow keys, which works in some apps) apply here.

for screen-reader users

It is important to note that this issue, with equal importance, goes beyond screen-reader users.

Any progress on this?

I’m now seriously considering Electron for an app. But not having an accessible menu is a deal breaker.

I am guessing emulating the OS-specific behavior would be better in this case. It is less of a hassle to keep up to date, and I can confirm assistive technologies can work with the current menu implementation itself, as demonstrated in the VS Code project. I do think this is something that deserves attention though; I didn’t even know pressing Alt flashes the accelerator keys at you, since I’m fully blind. I just press the accelerator keys I’m used to and hope for the best 😃

A new ticket probably needs to be opened, since there’s no reply on this one.

Please reopen this.

Hi, this has been closed, but the comment above says it is fixed for VS Code, not Electron. Can this be reopened if it’s not working in Electron?

Cheers S

Have to admit I was surprised to encounter this issue today in an electron app, only to find out here that there’s no official fix/suggested approach yet.

Any progress on this item?

It’s really counter-intuitive, and I keep losing code and having to undo to get it back when doing things like saving using standard keyboard key presses ingrained for 25 years. OK, it might be Windows specific, and there might be work-arounds, but nothing stops that initial code corruption requiring a correction hundreds of times a day, when you forget you happen to be in THAT ONE application that doesn’t support the standard keyboard interface.

@kastwey The work-around is to use ALT-F, ALT-E, ALT-S, ALT-V, ALT-G, ALT-D, ALT-T, and ALT-H.
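For context on why those Alt+letter combinations work at all: Electron’s menu template API supports Windows-style mnemonics via `&` in top-level labels (this is documented Electron behavior). The sketch below uses generic example labels, not VS Code’s actual menu; the template is plain data, so the Electron calls are left commented out:

```javascript
// Menu template using '&' mnemonics, which is what makes Alt+F, Alt+E, etc.
// open the corresponding menus on Windows. Example labels only.
const template = [
  { label: '&File', submenu: [{ role: 'quit' }] },
  { label: '&Edit', submenu: [{ role: 'undo' }, { role: 'redo' }] },
  { label: '&View', submenu: [{ role: 'reload' }] },
  { label: '&Help', submenu: [{ label: 'About' }] },
];
// In an Electron main process you would then do:
// const { Menu } = require('electron');
// Menu.setApplicationMenu(Menu.buildFromTemplate(template));
```

Note that this only gives you the hold-Alt shortcuts; it does not fix the press-and-release-Alt behavior this issue is about.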

  3. When you use the keyboard to choose a menu, the first item in the menu should be pre-selected. (Electron holds the menu open, but no item is selected.)
  4. When you have a menu selected, pressing ← or → should take you to the first item in the menu to the left/right, respectively. (Electron ignores the left and right arrow keys.)

I’m not sure if I should open a separate issue, items 3 and 4 are a problem for Linux too. Confirmed earlier today with latest atom.x86_64.rpm installed on Fedora 24.

Are we sure that screen readers can interact with the top level of the menu? This seems like it would be hard to test without keyboard support in the first place. If they can, i.e. if arrowing is going to announce the menu names, then that’s fine. Otherwise, native menus need to be considered. The alternative is going into accessibility API implementation land, and that’s probably harder. Me and a couple other people I know are blind and would like to use Electron, but this is a pretty big deal. Menu bars need to function properly.

I’m interested in tackling this issue. I’ve been reading the Electron and Chromium docs and starting to look at the code. I use Win 10 Pro and VS 2017 with a screen reader (mostly NVDA) and screen magnifier.

Does anyone on the team have any suggestions on where in the code to start looking for how to fix this? So far I’ve been looking at the files mentioned earlier by @MarshallOfSound as well as other menu-related code and working on wrapping my head around it all.

@devil418 it also hampers users who cannot use a pointing device.

Any progress on this? This is a very important issue for screen-reader users and one of the only ones stopping us from using electron-based apps.

FWIW, the way this works on Windows is that application components are revealed to the system and exposed to MSAA. Assistive tech grabs onto that information and does what it needs to, such as announcing it to the user. On Windows, you can use Inspect32 (if you have the Windows SDK installed) to see which roles are exposed to MSAA. Mac does it basically the same way, but I’m not sure of the names.

This wouldn’t fix the arrow key issue.