I have an app that uses two processes to show UI components: one for the main app, which uses regular activities, and one whose UI consists of overlays (living in a Service) that are added to the WindowManager through the ViewManager API. I want to test the interaction between the overlays and the app using UiAutomator.
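For context, the overlays are added roughly like this (a minimal sketch; `OverlayService`, the layout parameters, and the placeholder view are illustrative, not the app's actual code):

```java
import android.app.Service;
import android.content.Intent;
import android.graphics.PixelFormat;
import android.os.Build;
import android.os.IBinder;
import android.view.Gravity;
import android.view.View;
import android.view.WindowManager;

public class OverlayService extends Service {
    private View overlay;

    @Override
    public void onCreate() {
        super.onCreate();
        WindowManager wm = (WindowManager) getSystemService(WINDOW_SERVICE);
        // TYPE_APPLICATION_OVERLAY on O and later, TYPE_PHONE on older releases
        int type = Build.VERSION.SDK_INT >= Build.VERSION_CODES.O
                ? WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY
                : WindowManager.LayoutParams.TYPE_PHONE;
        WindowManager.LayoutParams params = new WindowManager.LayoutParams(
                WindowManager.LayoutParams.WRAP_CONTENT,
                WindowManager.LayoutParams.WRAP_CONTENT,
                type,
                WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                PixelFormat.TRANSLUCENT);
        params.gravity = Gravity.TOP | Gravity.START;
        overlay = new View(this);    // the real overlay view hierarchy goes here
        wm.addView(overlay, params); // ViewManager#addView
    }

    @Override
    public void onDestroy() {
        ((WindowManager) getSystemService(WINDOW_SERVICE)).removeView(overlay);
        super.onDestroy();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
```

This runs in the second process (declared with `android:process` on the service in the manifest).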
The overlay views/windows, as described in this question, are difficult to access directly. If I move the overlays to the main process, I can access the views by calling methods on WindowManagerGlobal using reflection, as Appium does here, but if they're running in the overlay process the calls to WindowManagerGlobal do not return the views. In any case, the test can't interact with the views directly, since only the thread that created a view is allowed to touch it; the views can only be reached through the accessibility framework, which is what UiAutomator builds on.
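The reflection approach looks roughly like the sketch below (the `mViews` field name is an internal detail that varies across Android versions, so this is an assumption, not a stable API). Since `WindowManagerGlobal` is a per-process singleton, this only ever sees views created in the process it runs in, which is why it returns nothing for overlays living in the other process:

```java
import android.view.View;
import java.lang.reflect.Field;
import java.lang.reflect.Method;
import java.util.List;

public final class WindowInspector {
    private WindowInspector() {}

    /**
     * Returns the root views that WindowManagerGlobal tracks
     * for the *current* process only.
     */
    @SuppressWarnings("unchecked")
    public static List<View> getRootViews() throws Exception {
        Class<?> wmgClass = Class.forName("android.view.WindowManagerGlobal");
        Method getInstance = wmgClass.getMethod("getInstance");
        Object wmg = getInstance.invoke(null);
        // "mViews" is a private ArrayList<View> on recent releases;
        // older versions expose a View[] instead.
        Field viewsField = wmgClass.getDeclaredField("mViews");
        viewsField.setAccessible(true);
        return (List<View>) viewsField.get(wmg);
    }
}
```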
Is there any way I can expose the overlay views through the Accessibility framework so that UiAutomator can find them, even though they're running in a different process?
I had a look at multiprocess Espresso, which sounded very promising since it could potentially bridge the gap between the two processes, but Espresso cannot find the overlay views. Is it possible to change the way the overlays are added, perhaps by providing some sort of root window for that process, in a way that would let UiAutomator, and possibly even Espresso, find the overlay views?
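For reference, this is roughly how I enabled multiprocess Espresso (a sketch based on the documented setup; the package name is a placeholder, and it requires API 26+ plus the `androidx.test.espresso:espresso-remote` artifact):

```xml
<!-- androidTest AndroidManifest.xml -->
<instrumentation
    android:name="androidx.test.runner.AndroidJUnitRunner"
    android:targetPackage="com.example.app"
    android:targetProcesses="*" />
```

`android:targetProcesses="*"` instruments all of the app's processes rather than just the default one, but even so Espresso's view matchers come up empty for the overlay windows.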
If I dump the view hierarchy using UiDevice, I do see the system decor views on the same level as the app's views. I had a look at the SystemUI code where its views are added to the WindowManager, and it's very similar to how my app adds its overlays; the only difference I can see is the window layer type. It would be interesting to know how the system UI components are injected, for lack of a better word, into that hierarchy, and whether an app can do the same with its own overlays.
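This is how the dump mentioned above was produced. `UiDevice#dumpWindowHierarchy` writes an XML tree of every window the accessibility framework exposes, which is why the system decor views show up alongside the app's (the output path here is just an example):

```java
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.uiautomator.UiDevice;
import java.io.File;
import java.io.IOException;

public class HierarchyDumper {
    public static void dump() throws IOException {
        UiDevice device = UiDevice.getInstance(
                InstrumentationRegistry.getInstrumentation());
        // Writes an XML snapshot of all accessible windows and views
        device.dumpWindowHierarchy(new File("/sdcard/window_dump.xml"));
    }
}
```

The overlay windows from the second process are absent from that XML, while SystemUI's windows are not.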