There are important parts of the EventGhost UI that aren't usable by blind people relying on screen readers such as NVDA or JAWS: groups of settings are announced only as "tab panel," and there's no way to use the Tab key to navigate within those panels. For example, the Add Action dialog exposes only the name field, the Test and OK buttons, and tabbing into a "panel" and back out of it; the emulated mouse movement settings read no text at all. I now recall that this was the barrier to my using EventGhost ten or so years ago as well. I wish the interface could be made keyboard-accessible, which would probably minimize the difficulties I have using JAWS.
Alternatively, the XML configuration file seems to contain everything, and I'd just as soon edit it directly. That would require exhaustive syntax documentation for all the parameters, though, which I haven't been able to find and which would be daunting for some of the more complex events and actions. The Python documentation suggests configuring an action in the GUI and then pasting it into an editor to see its parameters. The only remedy I can imagine here is for one of you to contribute a USB-UIRT XML log so that I can work out the syntax from it. My goal for EventGhost is a fairly simple, set-it-and-forget-it collection of remote-control actions with keyboard bindings and IR transmission.
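For context, going by my own saved file, the tree seems to be shaped roughly like this. This is only a sketch from memory, not documentation: the exact element names, attributes, and action syntax may differ by version, and the macro and event names here ("Volume Up", "HID.VolumeUp") are made-up placeholders:

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<EventGhost>
    <!-- "Volume Up" is a hypothetical macro name for illustration -->
    <Macro Name="Volume Up">
        <!-- the event that triggers the macro; "HID.VolumeUp" is a guess -->
        <Event Name="HID.VolumeUp" />
        <!-- actions appear to be stored as Python-style calls,
             which is what the "paste into an editor" tip exposes -->
        <Action>
            EventGhost.ShowOSD(u'Volume Up')
        </Action>
    </Macro>
</EventGhost>
```

If that's close to right, then a logged USB-UIRT example would mainly need to show me the `<Action>` call syntax and parameter order for the IR transmit actions.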
I'd appreciate comments on whether either of these approaches is feasible. Thanks.