Latest Discussions
Is the TwoPaneBackStackEntry composable list of routes incompatible with arguments / URI routes?
Hello, I am doing some tests with the SDK and have hit a roadblock. My application currently uses route arguments to look up item details, but navigating to one of them throws the following exception:

java.lang.IllegalArgumentException: Invalid route item/4, not present in list of routes [home?, items?, item/{itemId}, ...]

I am unsure whether this is because the list of routes (listOf(...)) cannot take those argument values into account, or whether it is an error on my end. Could you please clarify? Thank you for all your hard work!

Papes96, Apr 08, 2023
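For reference, the route-argument pattern described above looks roughly like the following in plain Jetpack Navigation Compose, where the template "item/{itemId}" is declared once and concrete values such as "item/4" are resolved against it at navigation time. This is only a sketch of the general pattern, not of the Surface Duo TwoPane navigation API itself; whether that API matches concrete routes against the declared list in the same way is exactly what the question asks, and all names below are illustrative.

// Minimal sketch of route arguments with plain Jetpack Navigation Compose.
// It illustrates the pattern from the question; it does not use the Surface
// Duo TwoPane navigation API, whose matching behaviour is the open question.
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.navigation.NavType
import androidx.navigation.compose.NavHost
import androidx.navigation.compose.composable
import androidx.navigation.compose.rememberNavController
import androidx.navigation.navArgument

@Composable
fun ItemNavGraph() {
    val navController = rememberNavController()
    NavHost(navController = navController, startDestination = "items") {
        composable("items") {
            // List screen; tapping an item would call
            // navController.navigate("item/4"), which is resolved against
            // the "item/{itemId}" template rather than matched literally.
            Text("Items")
        }
        composable(
            route = "item/{itemId}",
            arguments = listOf(navArgument("itemId") { type = NavType.IntType })
        ) { backStackEntry ->
            val itemId = backStackEntry.arguments?.getInt("itemId")
            Text("Item $itemId")
        }
    }
}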
Looking for MVP 22/23 recommendations (who to follow)
Hi team, the new MVP year has started and I wanted to ask whether any of you have recommendations on who to follow on Twitch, GitHub, Twitter, etc. I know there is no MVP program related to the Surface Duo community, but maybe there are some Xamarin or other MVPs who also cover this topic 🙂 Thanks a lot in advance, and let's grow this community!

tscholze, Jul 22, 2022
More control over the placement of activities on the screens
Hello again. I'm in the process of adapting my launcher app for the Surface Duo. My idea is to use the left screen as the primary screen and the right screen as the secondary screen (it will display the drawer). In general it works well, but I have two problems:
1. I want applications to always run on the secondary screen. I tried using the Intent.FLAG_ACTIVITY_LAUNCH_ADJACENT flag, but it doesn't always give the right result (I guess because the launcher itself takes up both screens); a small sketch of this approach follows below.
2. I would like the launcher to adapt its interface depending on which screen the other application is running on (roughly like Microsoft Launcher does with the dock), but I don't see an API that tells me which screen is currently occupied by another application.

EvgenyZobnin, Apr 25, 2022
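As a point of reference for item 1 above, a minimal sketch of the launch-adjacent approach could look like this, assuming the launcher already knows the target package name. FLAG_ACTIVITY_NEW_TASK is combined with the adjacent flag because the target runs in its own task; whether this reliably lands on the secondary screen while the launcher itself spans both screens is precisely the open question.

// Sketch only: launch another app's main activity on the adjacent screen.
// The helper name and the idea of resolving the launch intent from a package
// name are assumptions about how a launcher might wire this up.
import android.content.Context
import android.content.Intent

fun launchOnAdjacentScreen(context: Context, packageName: String) {
    val intent = context.packageManager.getLaunchIntentForPackage(packageName) ?: return
    intent.addFlags(
        Intent.FLAG_ACTIVITY_LAUNCH_ADJACENT or
        Intent.FLAG_ACTIVITY_NEW_TASK
    )
    context.startActivity(intent)
}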
Unable to launch Surface Duo emulator in Windows 11
Here are my environments:
- Surface Book 2 running Windows 11
- Surface Book 3 running Windows 10
I have the Surface Duo emulator installed on both systems. On the Surface Book 3 the emulator starts fine. On the Surface Book 2 I can't launch the emulator (the taskbar briefly shows an app launching, but it quickly closes again). I don't know enough to figure this issue out, and I've scoured the web for an answer without finding one. A possibly related issue: I will eventually use this emulator with Visual Studio. On my Surface Book 3 I can run and debug my Android project from Visual Studio using the Surface Duo emulator; on the Surface Book 2 I can't even start an Android emulator for the same project. I've found numerous complaints about running Android emulators on Windows 11, but I'm not sure they relate to the Surface Duo emulator (I figure that if I can get the Surface Duo emulator working, it might fix the general Android emulator issue as well). I've tried all the suggested fixes for the Android issue (disabling Hyper-V, updating the Android emulator to version 31.2, etc.), but nothing is working. Can someone here suggest another way to get this emulator working in Windows 11?

samurai2083, Apr 08, 2022
Unable to access internet on Edge
Hi team, I am not able to access the internet using Edge on the Surface Duo emulator (Android 11 version). The browser stays in a loading state and never reaches a web page. Since Edge is the only browser that comes with the emulator, I am unable to download another browser to access the internet. Please help me with this situation.

Vedant_Thakkar, Mar 08, 2022
Magic Trackpad 2 and multitouch in Android emulator
In reference to https://devblogs.microsoft.com/surface-duo/android-emulator-multi-touch-support/: I was trying to use the app I develop (Ink&Paper) on the Android emulator with a Cintiq 16 on a Mac. Everything works except the Magic Trackpad 2, which shows unexpected behavior. I don't know if it's a device-specific issue, but I think it more likely affects all trackpads. It would be nice if this were implemented correctly (it works well in BlueStacks, but that is a completely different type of emulator). Regards.

Francesco_Paolo_Torre, Nov 03, 2021
Will the Surface Duo (2) run the Android 12L variant?
Hi team, I'm just trying to understand what's up with Android 12L and why it is another flavor rather than just a feature of Android 12 - sorry if I have misunderstood it. Does anyone know whether the Surface Duo (2) will get Android 12 or the L version? And if so, am I correct that it makes no difference for developers? Thanks for helping me understand the new Android 12 versions! - Toby

tscholze, Oct 29, 2021
FLAG_LAYOUT_NO_LIMITS is not respected in single-screen mode
The flag https://developer.android.com/reference/android/view/WindowManager.LayoutParams#FLAG_LAYOUT_NO_LIMITS is not respected in single-screen mode, both on a real device and in the emulator. This was reported by one of my app's users and can be replicated very easily in the emulator. Placing a View with FLAG_LAYOUT_NO_LIMITS outside the screen boundaries does not work in single-screen mode (the screenshots in the original post show this). When both screens are visible, the blue rectangle (a view) is correctly placed outside the screen; when one screen is closed, it is constrained to the screen. Video demo of the problem: https://streamable.com/2tgevf

micku7zu, Sep 22, 2021
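A minimal repro in the spirit of the description above might look like the following; the activity name, view size, and offset are arbitrary choices, and the expectation is that the blue box remains partly off-screen when FLAG_LAYOUT_NO_LIMITS is honored.

// Repro sketch: a window flagged with FLAG_LAYOUT_NO_LIMITS and a view pushed
// past the screen edge. Per the report above, the view is clipped in
// single-screen mode on Surface Duo but not when the app spans both screens.
import android.app.Activity
import android.graphics.Color
import android.os.Bundle
import android.view.View
import android.view.WindowManager
import android.widget.FrameLayout

class NoLimitsActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        window.addFlags(WindowManager.LayoutParams.FLAG_LAYOUT_NO_LIMITS)

        val root = FrameLayout(this)
        val box = View(this).apply {
            setBackgroundColor(Color.BLUE)
            layoutParams = FrameLayout.LayoutParams(400, 400)
            translationX = -200f // push the view halfway past the left edge
        }
        root.addView(box)
        setContentView(root)
    }
}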
Display/Screen/Fold info from an (accessibility) service
I'm trying to adapt my accessibility service app (https://play.google.com/store/apps/details?id=com.quickcursor) to the Surface Duo foldable screens and found some inconveniences and issues:

https://developer.android.com/reference/android/accessibilityservice/AccessibilityService#dispatchGesture(android.accessibilityservice.GestureDescription,%20android.accessibilityservice.AccessibilityService.GestureResultCallback,%20android.os.Handler) does not respect the screen coordinates when the device is using the right screen in single-screen mode. Sample code: https://github.com/micku7zu/SurfaceDuoDispathGestureBug Sample video: https://streamable.com/z7cgub In the video there are two buttons that dispatch clicks at coordinates (300, 300) and (1350+300, 300). The buttons themselves are positioned at (400, 100) and (800, 100), and when only the right screen is used they are positioned correctly on that screen, but the dispatched click coordinates are not: clicking at (300, 300) lands on the left screen (which is turned off). If this is not a bug but intended behavior, then there should be a way to detect which screen is in use from an (accessibility) service context so the app can adjust its coordinates. All the SDK libraries for display/fold info are based on Activities, with no information available from a service.

The other inconvenience is that there is currently no straightforward way to detect the current fold/display state from an (accessibility) service (or I didn't find one). All the Duo SDK libraries are based on UI Activities. I understand that the helper libraries focus on UI Activities, but accessibility services shouldn't be left out; there are users who need these services to use the device more easily.

A solution that works on most foldable devices (based on my app users' feedback):

WindowManager windowManager = (WindowManager) context.getSystemService(Context.WINDOW_SERVICE);
Display display = windowManager.getDefaultDisplay();
Point displaySize = new Point();
display.getRealSize(displaySize);

does not work as it should on the Surface Duo (in my opinion). The display returned by DisplayManager or WindowManager is not updated: there is only one display with the same values in every possible fold state. On other foldable devices these display values are updated, or a list of displays is exposed.

Some practical examples that are really hard to do in an accessibility service without a UI Activity:
- How do I get the current resolution (1350x1800 or 2700x1800)?
- How do I get the hinge rect?
- How do I get which part of the display is active (left or right)?

I've made some ugly hacks for the Surface Duo to work around the problems above, but that shouldn't be necessary. I haven't tested on all foldable devices, but so far, based on user feedback, these issues only occur on the Surface Duo; other foldable devices work fine without any special handling. I haven't yet fixed the left/right screen detection to adjust the coordinates for the right screen, because I'm tired of looking for hacks. There is probably a sketchy way to detect it, but I'm out of ideas for the moment. Any idea how I could get over these issues more easily? Thanks!

micku7zu, Sep 21, 2021
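For illustration, the Activity-free pattern under discussion (querying DisplayManager directly from the accessibility service) could be sketched as below. Per the report above, on the Surface Duo these values reportedly do not change with the fold state, so this shows the approach in question rather than a confirmed workaround; the service class name and log tag are made up.

// Sketch: enumerate displays from an accessibility service, without an
// Activity. On Surface Duo the reported behaviour is that these values stay
// the same across fold states, which is the core of the complaint above.
import android.accessibilityservice.AccessibilityService
import android.content.Context
import android.graphics.Point
import android.hardware.display.DisplayManager
import android.util.Log
import android.view.accessibility.AccessibilityEvent

class DisplayProbeService : AccessibilityService() {
    override fun onServiceConnected() {
        val displayManager = getSystemService(Context.DISPLAY_SERVICE) as DisplayManager
        for (display in displayManager.displays) {
            val size = Point()
            display.getRealSize(size) // deprecated since API 31, still works
            Log.d("DisplayProbe", "display=${display.displayId} size=${size.x}x${size.y}")
        }
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) = Unit
    override fun onInterrupt() = Unit
}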