Forum Discussion
Edge can't use the mic on the Google Translate page
- Jan 13, 2020
My guess is that all these tools use the WebSpeechApi, which only works in Google Chrome/Chromium (at least the recognition part), because Google uses its own cloud servers for the audio transcription. Since the new Edge is basically Chromium, it probably reports that the WebSpeechApi is supported and then breaks. The API currently behaves very strangely in Edge.
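Roughly, the kind of check these pages do looks something like this (just my own sketch, not the actual code of Google's sites; the webkitSpeechRecognition detection is an assumption about how they feature-detect it):

```typescript
// Sketch of typical Web Speech API feature detection and use in a page.
// In Chromium-based Edge the constructor exists, so the check passes and
// the mic button is shown, but recognition then fails at runtime because
// the Google-hosted transcription backend isn't there.
const SpeechRecognitionCtor =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

if (SpeechRecognitionCtor) {
  const recognition = new SpeechRecognitionCtor();
  recognition.lang = "en-US";

  recognition.onresult = (event: any) => {
    // In Chrome the transcript comes back from Google's servers.
    console.log("Transcript:", event.results[0][0].transcript);
  };

  recognition.onerror = (event: any) => {
    // In Edge this fires instead, with a rather unhelpful error.
    console.error("Speech recognition error:", event.error);
  };

  recognition.start();
} else {
  // Firefox ends up here, which is why Google just hides the mic button there.
  console.log("No speech recognition constructor; hide the microphone UI.");
}
```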
Presumably this is the same issue as the microphone not working for Google Search, i.e. at https://www.google.co.uk/ ?
Update please? This issue also affects the Google homepage (https://www.google.com/) when using Microsoft Edge and prevents the microphone icon from appearing. If I access the page in Chrome, there is a microphone in the search bar that you can click to dictate search queries. If I access the page in Edge, the microphone is not visible. I have tried using the Windows key + H command, but it is pretty laggy and buggy. I have clicked the lock in the address bar to change the site permissions and attempted to grant access to the microphone, yet it does not give me that option (it only shows the ability to enable Adobe Flash). Finally, I have enabled microphone access through the Windows 10 Settings / Privacy menu, but that has no effect either.
Microsoft, when can we expect this bug to be fixed? I feel that Edge is vastly superior to Chrome and would like to continue using it as my primary browser. However, I don't want to have to open up a Chrome window just to use the microphone dictation feature.
current version:
Microsoft Edge 41.16299.15.0
Microsoft EdgeHTML 16.16299
- florianSB, Jun 22, 2020 (Brass Contributor)
I think it's important to understand that there are two things at work here.
The first one is a bug related to the WebSpeechAPI in the browser. The trivial fix would be to return a correct error message indicating that the WebSpeechAPI is not functional. For all the websites using this feature, like Google Search, Google Translate, Duolingo etc., this means they could check for speech input support properly and act accordingly (Google usually just hides the microphone button, as it does in Firefox).
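To make that concrete, here is a rough sketch of what a proper error signal would let a website do (hideMicrophoneButton and the .mic-button selector are placeholders I made up, not anything from Google's actual pages):

```typescript
// Sketch: hide the mic UI when recognition is either missing or reported
// as non-functional by the browser.
function hideMicrophoneButton(): void {
  document.querySelector<HTMLElement>(".mic-button")?.style.setProperty("display", "none");
}

const Ctor =
  (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

if (!Ctor) {
  // No constructor at all (e.g. Firefox): Google already handles this case.
  hideMicrophoneButton();
} else {
  const recognition = new Ctor();

  recognition.onerror = (event: any) => {
    // With a correct error code, the site could tell "user denied the mic"
    // apart from "recognition service unavailable" and only hide the button
    // in the latter case.
    if (event.error === "service-not-allowed" || event.error === "network") {
      hideMicrophoneButton();
    }
  };
}
```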
The second one is a feature, and it's up to Microsoft to decide whether they want to implement it or not. Technically speaking, it means connecting their Windows speech recognition to the WebSpeechAPI of the browser. Since they already offer easy access to the speech interface (Windows + H), I don't see any reason why they would want to block this ... so I guess it's just a matter of priorities at the moment 😉
- Micheal_Espinola, Jun 22, 2020 (Copper Contributor)
florianSB, the thing to understand here is that it's ridiculous for a browser to restrict how you can or cannot use your microphone with a website. This has been an issue for far too long without any real response from Microsoft.
I continue to not use Edge because of this. I'm not going to make my daily life more cumbersome because my browser won't let me use my microphone with a search engine. It's ridiculous.
- florianSB, Jun 23, 2020 (Brass Contributor)
It's frustrating for me too, but Microsoft does not "restrict how you can or cannot use your microphone with a web site". It's not the website that does the speech recognition, it's Google. The website just uses this Google service that is integrated into Chrome for free! If Microsoft decides to host servers for this (which they actually already do for Cortana), you get the same feature. If Google decides to take their servers down, none of these websites will work in Chrome anymore.
I'm not trying to defend anyone here, but I think it's important to understand that you actually talk to Google when you use speech recognition in Chrome, not to the website host. It costs millions of dollars to host this infrastructure, which is also the reason you'll probably not see it in Firefox for a long time.