Display/Screen/Fold info from (accessibility) service


I'm trying to adapt my accessibility-service app (Quick Cursor) to the Surface Duo's foldable screens and ran into some inconveniences and issues:

 

Sample code: https://github.com/micku7zu/SurfaceDuoDispathGestureBug

Sample video: https://streamable.com/z7cgub

 

In the video there are two buttons that dispatch clicks at coordinates (300, 300) and (1350+300, 300). As you can see, the buttons are positioned at (400, 100) and (800, 100). When only the right screen is in use, the buttons are positioned correctly on the screen, but the dispatched click coordinates are not: clicking at (300, 300) lands on the left screen (which is turned off).
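For context, the taps are dispatched with the standard AccessibilityService.dispatchGesture API (API 24+). A minimal sketch of such a call (method name and timing values are my own, coordinates as in the video):

```java
// Inside an AccessibilityService subclass (sketch).
private void dispatchClickAt(float x, float y) {
    Path path = new Path();
    path.moveTo(x, y); // tap location in screen coordinates

    GestureDescription.StrokeDescription stroke =
            new GestureDescription.StrokeDescription(path, 0 /* startTime */, 50 /* durationMs */);
    GestureDescription gesture = new GestureDescription.Builder()
            .addStroke(stroke)
            .build();

    // On Surface Duo these coordinates appear to be interpreted relative to
    // the full (spanned) display, even when only the right screen is active.
    dispatchGesture(gesture, null /* callback */, null /* handler */);
}
```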

 

If this is not a bug and is in fact intended, then there should be a way to detect which screen is in use from an (accessibility) service context, so the app can adjust its coordinates. All the SDK libraries for display/fold info are based on Activities, and none expose this info to a service.

 

  • The other inconvenience is that there is currently no straightforward way to detect the current state of the fold/display from an (accessibility) service (or I didn't find any). All the Surface Duo SDK libraries are based on UI Activities. I understand that the helper libraries focus on UI Activities, but accessibility services shouldn't be left out; there are users who need these services to use the device more easily.

A solution that works on most foldable devices (from my app user feedback):

// Query the real display size, including system decorations:
WindowManager windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
Display display = windowManager.getDefaultDisplay();
Point displaySize = new Point();
display.getRealSize(displaySize); // e.g. 1350x1800 single-screen, 2700x1800 spanned

This doesn't work on the Surface Duo as (in my opinion) it should. The Display returned by DisplayManager or WindowManager is never updated: there is only one Display reporting the same values in every fold state. On other foldable devices these display values are updated, or a list of displays is exposed.
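For comparison, a service context can also enumerate displays through DisplayManager; on some foldables this list reflects the fold state, but on the Surface Duo, as described above, it reportedly returns a single unchanging entry. A sketch, assuming a plain service Context:

```java
// Sketch: list every display visible to a (service) Context via DisplayManager.
DisplayManager displayManager =
        (DisplayManager) context.getSystemService(Context.DISPLAY_SERVICE);
for (Display d : displayManager.getDisplays()) {
    Point size = new Point();
    d.getRealSize(size); // physical size including decorations
    Log.d("Displays", "id=" + d.getDisplayId() + " size=" + size.x + "x" + size.y);
}
```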

 

Some practical examples that are really hard to do in an accessibility service without a UI Activity:

  1. How to get the current resolution (1350x1800 or 2700x1800)?
  2. How to get the hinge rect?
  3. How to get which part of the display is active (left or right)?
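On API 30+, one avenue worth trying from a service context is WindowManager's WindowMetrics, which does not require an Activity. I have not verified this on Surface Duo, so treat the behavior as an assumption:

```java
// Sketch (API 30+): query window bounds without an Activity.
// Whether these bounds actually track the Surface Duo's span state when
// queried from a service context is an assumption to verify.
WindowManager wm = context.getSystemService(WindowManager.class);
Rect maxBounds = wm.getMaximumWindowMetrics().getBounds(); // largest possible area
Rect curBounds = wm.getCurrentWindowMetrics().getBounds(); // current window area
Log.d("Metrics", "max=" + maxBounds + " current=" + curBounds);
```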

 

I've made some ugly hacks to work around the above problems on the Surface Duo, but this shouldn't be necessary. I haven't tested on all foldable devices, but from my app's user feedback the issues currently appear only on the Surface Duo; other foldables work fine without any special handling.

 

I haven't yet fixed the left/right screen detection to adjust the coordinates for the right screen, because I'm tired of looking for hacks. There is probably some sketchy way to detect it, but I'm out of ideas for the moment.
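For what it's worth, once you do know that only the right screen is active, the coordinate adjustment itself is plain arithmetic using the single-screen width from the resolutions above (1350x1800 single, 2700x1800 spanned). The helper below and its names are hypothetical, not any Surface Duo API:

```java
// Hypothetical helper: shift an x coordinate onto the right screen when only
// the right screen is active. SINGLE_SCREEN_WIDTH comes from the Surface Duo
// resolutions mentioned above (1350x1800 single screen, 2700x1800 spanned).
final class DuoCoords {
    static final int SINGLE_SCREEN_WIDTH = 1350;

    static int adjustX(int x, boolean rightScreenOnly) {
        // (300, 300) becomes (1650, 300) when only the right screen is in use.
        return rightScreenOnly ? x + SINGLE_SCREEN_WIDTH : x;
    }
}
```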

 

Any idea how I could get over these issues more easily?

 

Thanks!

1 Reply
Thanks for breaking out the two issues. I'll share internally and follow up with you here.