Hello, I use the latest Edge Dev on macOS on a MacBook Pro 15" (2018). Edge is very nice, but it has one important "bug" that affects almost all Chromium-based browsers: on some pages it uses the dedicated GPU (Radeon) instead of the integrated Intel GPU, even on battery. Examples: Google Maps, Messenger, Facebook. I think it's related to SVG operations, but I'm not sure.
It causes a big battery drain and reduces working time. I know of two exceptions: Safari (but that one is WebKit-based) and... Opera. These two browsers handle all of this fine using only the Intel GPU. Opera has a special option in its preferences to enable the dedicated GPU, and it's disabled by default.
I use a small app called gSwitch (https://github.com/CodySchrank/gSwitch) that lets you control dedicated/integrated GPU switching and also shows the current status. Please see the screenshots below: in the first you can see that Edge forces the Mac onto the dedicated GPU while Opera doesn't. The second shows the mentioned setting in Opera.
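If you don't want to install gSwitch, you can also see which GPU is currently driving the display from the output of `system_profiler SPDisplaysDataType`: on dual-GPU MacBook Pros the internal panel (with its `Resolution:` line) is listed only under the GPU that is active. Here is a minimal sketch that parses that output; the indentation levels are an assumption based on typical `system_profiler` text output and may need adjusting:

```python
import subprocess

def active_gpu(profile_text: str) -> str:
    """Given `system_profiler SPDisplaysDataType` text output, return the
    name of the GPU section that has an attached display (the active GPU).
    Assumes GPU names appear as headings indented by 4 spaces, which is
    the usual layout of system_profiler's text output."""
    current = None
    for line in profile_text.splitlines():
        stripped = line.strip()
        indent = len(line) - len(line.lstrip())
        if indent == 4 and stripped.endswith(":"):
            current = stripped.rstrip(":")  # remember the GPU heading
        if "Resolution:" in stripped and current:
            return current  # the display sits under this GPU
    return "unknown"

if __name__ == "__main__":
    # macOS only: ask system_profiler for the graphics report
    out = subprocess.run(
        ["system_profiler", "SPDisplaysDataType"],
        capture_output=True, text=True,
    ).stdout
    print(active_gpu(out))
```

Running this while Edge is open should print the Radeon's name; with only Opera or Safari open it should print the Intel GPU.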
I think it should be possible to add a similar option to Edge, and it would improve battery life in this browser a lot. There is no control panel on macOS to allow or disallow apps from using the dedicated GPU, like the Nvidia control panel on Windows, so this is extremely important.
@HotCakeX No, macOS doesn't have any panel/options for that. There is only an option to disable automatic GPU switching and force the Radeon to be used. As a result, it's extremely important that apps themselves avoid using the Radeon on battery.
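For what it's worth, the mechanism macOS does offer is per-app, not system-wide: an app that sets the documented `NSSupportsAutomaticGraphicsSwitching` key to true in its `Info.plist` tells macOS it can stay on the integrated GPU. Whether Edge sets this key, and whether Chromium then requests the discrete GPU anyway, is beyond this sketch, but you can at least check any installed bundle for the key:

```python
import plistlib
from pathlib import Path

def supports_auto_switching(app_bundle: str) -> bool:
    """Return True if the app bundle's Info.plist declares
    NSSupportsAutomaticGraphicsSwitching = true, i.e. it opts into
    staying on the integrated GPU when possible."""
    plist_path = Path(app_bundle) / "Contents" / "Info.plist"
    with open(plist_path, "rb") as f:
        info = plistlib.load(f)
    return bool(info.get("NSSupportsAutomaticGraphicsSwitching", False))

if __name__ == "__main__":
    # Example path; adjust to wherever Edge Dev is installed
    print(supports_auto_switching("/Applications/Microsoft Edge Dev.app"))
```

Note that some bundles ship binary plists; `plistlib` handles both the XML and binary formats.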
@Lukasamd The same happens on my 2017 MBP 15". Opera, Safari and Firefox do not use the high-performance GPU. This is a Chromium-related issue that many devs are complaining about. There were some rumors that Opera actually blacklists the autodetected high-performance GPU to prevent it from being used, which is dirty but basically the only remaining solution, since Chromium removed the API/parameters for disabling the discrete GPU.
I can't really say what is happening, but all Chromium browsers except Opera are currently using the discrete GPU. This is madness.
My personal observations show that the discrete GPU wakes up when you play H.264 video, probably for hardware-accelerated decoding. This seems like insanity, though, since Intel 6xx-series GPUs, and even older ones going back to Sandy Bridge, are perfectly capable of decoding H.264 on their own.