Media Services
33 Topics

Former Employer Abuse
My former employer, Albert Williams, president of American Security Force Inc., keeps adding my Outlook accounts, computers, and mobile devices to the company's Azure cloud, even though I left the company more than a year ago. What can I do to remove myself from his grip? Does Microsoft have a solution against abusive employers?

Can we legally use Azure service icons in a third-party tool?
Hello MS Team, We are building an Azure automation tool for our client to create landing zones and deploy various Azure services, e.g. Storage Account, Key Vault, VM, APIM, Azure SQL, etc. Please let us know if we can legally use the Azure service icons in our automation tool. Regards, Param

Linking scanner & Azure
Hi all, I have a question regarding a scanner (device) & Azure. I have a user who wants to scan numerous documents, and when he scans them, they should go directly to a folder located on Azure. I presume this would involve Azure Blob Storage or an Azure file share? I am missing the link: how would we set it up on both sides? I mean, what are the requirements? Kind regards, Dino

Default Security permissions for Audiocodes Mediant in Azure
After troubleshooting an issue with our OVOC server not being able to access an SBC in Azure, and modifying a few firewall restrictions, we discovered the default network port permissions on Azure. We added SNMP and allowed OVOC to talk to the SBC. This is a quick example of the rule we added; instead of using "any" as the source, we modified it to match the IP address of the OVOC server. Hope this helps someone else when spinning up an AudioCodes SBC within Azure.

How to feed back the results from a batch?
Hi! I have tried contacting Azure technicians; one said that paid support might be able to answer, but they don't know the answer either. My project is to scan 1 GB of unstructured text, quickly, by splitting it across ~128 vCPUs, checking each 10 MB chunk for word matches, and then feeding all the results back to one location to tally up the match counts. The question is: is there a bottleneck delay, and if so, how long? I would need all cores to send their results back and meet up within a few seconds. Further, what if the 128 vCPUs can't even send results back to one location? Or they can, but they share the same bus/RAM, so the individual 10 MB checks behave like 500 MB checks and are basically no faster?

Introducing an experimental content-aware encoding in Azure Media Services
Overview

In order to prepare content for delivery by adaptive bitrate streaming, the video needs to be encoded at multiple bitrates (high to low). To ensure graceful degradation of quality as the bitrate is lowered, the resolution of the video is lowered as well. This results in a so-called encoding ladder – a table of resolutions and bitrates, as you can see in some of our fixed encoding presets, such as "H264MultipleBitrate1080p".

Interest in moving beyond a one-preset-fits-all-videos approach increased after Netflix published their blog in December 2015. Since then, multiple solutions for content-aware encoding have been released in the marketplace – see this article for an overview. The idea is to be aware of the content – to customize or tune the encoding ladder to the complexity of the individual video. At each resolution, there is a bitrate beyond which any increase in quality is not perceptible – the encoder operates at this optimal bitrate value. The next level of optimization is to select the resolutions based on the content – for example, a video of a PowerPoint presentation does not benefit from going below 720p. Going further, the encoder can be tasked to optimize the settings for each shot within the video – Netflix described such an approach in 2018.

In early 2017, we released the Adaptive Streaming preset to address the problem of variability in the quality and resolution of source videos. Our customers had a varying mix of content – some at 1080p, others at 720p, and a few at SD and lower resolutions. Further, not all source content was a high-quality mezzanine from a film or TV studio. The Adaptive Streaming preset addressed these problems by ensuring that the bitrate ladder never exceeds the resolution or the average bitrate of the input mezzanine.
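To make the "capping" idea concrete, here is a minimal sketch in Python. The rung values below are invented for illustration and are not the actual preset's ladder; the logic simply drops any rung that exceeds the source's resolution or average bitrate:

```python
# Illustrative sketch of the capping behind the Adaptive Streaming preset:
# start from a fixed ladder and drop any rung that exceeds the source video's
# resolution or average bitrate. Rung values here are made up for the example.

# (height_in_pixels, bitrate_in_kbps) pairs, highest quality first
DEFAULT_LADDER = [
    (1080, 6000),
    (720, 3400),
    (480, 1600),
    (360, 800),
    (240, 400),
]

def capped_ladder(source_height, source_bitrate_kbps, ladder=DEFAULT_LADDER):
    """Keep only rungs that do not exceed the source resolution or bitrate,
    so the output never promises quality the input mezzanine doesn't have."""
    rungs = [(h, b) for (h, b) in ladder
             if h <= source_height and b <= source_bitrate_kbps]
    # Always keep at least the lowest rung so something is produced.
    return rungs or [ladder[-1]]

# A 720p source at 4500 kbps gets only the 720p-and-below rungs.
print(capped_ladder(720, 4500))  # [(720, 3400), (480, 1600), (360, 800), (240, 400)]
```

With this scheme, a low-quality SD upload is never inflated into a fake 1080p rendition, which is exactly the variability problem the preset was built to handle.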
The experimental content-aware encoding preset extends that mechanism by incorporating custom logic that lets the encoder seek the optimal bitrate value for a given resolution, but without requiring extensive computational analysis. The net result is that this new preset produces output that has a lower bitrate than the Adaptive Streaming preset, but at a higher quality. See the sample graphs below, which show the comparison using quality metrics like PSNR and VMAF. The source was created by concatenating short clips of high-complexity shots from movies and TV shows, intended to stress the encoder. By definition, this preset produces results that vary from content to content – it also means that for some content, there may not be a significant reduction in bitrate or improvement in quality. The preset is currently tuned for high-complexity, high-quality source videos (movies, TV shows). Work is in progress to adapt it to low-complexity content (e.g. PowerPoint presentations), as well as to poorer-quality videos. This preset also uses the same set of resolutions as the Adaptive Streaming preset – we are working on methods to select the minimal set of resolutions based on the content.

Using the experimental preset

You can create Transforms that use this preset as follows. If using a tutorial like this one, you can update the code as follows:

```csharp
TransformOutput[] output = new TransformOutput[]
{
    new TransformOutput
    {
        // The preset for the Transform is set to one of the Media Services built-in sample presets.
        // You can customize the encoding settings by changing this to use the "StandardEncoderPreset" class.
        Preset = new BuiltInStandardEncoderPreset()
        {
            // This sample uses the new experimental preset for content-aware encoding
            PresetName = EncoderNamedPreset.ContentAwareEncodingExperimental
        }
    }
};
```

Next steps

Now that you have learned about this new option for optimizing your videos, we invite you to try it out.
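If you prefer the Azure CLI over the .NET SDK, a Transform using this preset can be created along these lines (a sketch only; the resource group, account, and transform names are placeholders you would substitute with your own):

```shell
# Create a Transform whose single output uses the experimental
# content-aware encoding preset. Placeholders in <> must be replaced.
az ams transform create \
  --resource-group <resource-group> \
  --account-name <account-name> \
  --name myContentAwareTransform \
  --preset ContentAwareEncodingExperimental
```

Jobs submitted against this Transform then produce the content-tuned ladder described above.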
You can send us feedback via the link below, or engage with us more directly at amsved@microsoft.com.

Introducing live transcriptions support in Azure Media Services
Azure Media Services provides a platform which you can use to ingest, transcode, and dynamically package and encrypt your live video feed(s) for delivery via industry-standard protocols like HLS and MPEG-DASH. Live transcription is a new feature in our v3 APIs, with which you can enhance the streams delivered to your viewers with machine-generated text that is transcribed from the spoken words in the audio feed. When you publish your live stream using MPEG-DASH, then along with video and audio, our service also delivers the transcribed text in IMSC1.1-compatible TTML, packaged into MPEG-4 Part 30 (ISO/IEC 14496-30) fragments. You can then play back this video+audio+text stream using a new build of Azure Media Player. The transcription relies on the Speech-to-Text feature of Cognitive Services. This new feature is being demonstrated at the NAB 2019 trade show at the Microsoft booth #SL6716. Following the show, we are planning to make live transcriptions available for a private preview, for a period of 3 to 4 weeks in May 2019. We are looking for customers and partners who are already using our service for streaming live events, preferably with our v3 APIs, and who can dedicate a few viewers to watch the stream live (not on-demand) and provide feedback. If you are interested in participating, please fill out this form. We will get in touch with the selected participants in the first week of May 2019.

How to launch an ML Studio web app in an Azure Request-Response ML web service?
Hi, I saw this video and it was pretty easy to launch: https://www.youtube.com/watch?v=c-BtoTdbjuQ. But in the new version of the Azure web app for RRS, I get as far as creating and running the service, but I don't see where I should put my API key and URL. I searched the documentation for setting up Azure ML in a web app as in the video, but I did not find anything. Any help getting my web app functioning like in the video would be very helpful. Thank you.

What's new in Azure Media Services video processing
Developers and media companies trust and rely on Azure Media Services for the ability to encode, protect, index, and deliver videos at scale. This week we are proud to announce several enhancements to Media Services, including the general availability of the new Azure Media Services v3 API, as well as updates to Azure Media Player. Read about it in the Azure blog.