Media Services
Linking scanner & Azure
Hi all, I have a question about linking a scanner (device) to Azure. I have a user who wants to scan numerous documents and have them go directly to a folder located on Azure. I presume this would involve Azure Blob Storage or an Azure file share? I am missing the link: how would we set it up on both sides, and what are the requirements? Kind regards, Dino

Introducing live transcriptions support in Azure Media Services
Azure Media Services provides a platform you can use to ingest, transcode, and dynamically package and encrypt your live video feeds for delivery via industry-standard protocols like HLS and MPEG-DASH. Live Transcriptions is a new feature in our v3 APIs with which you can enhance the streams delivered to your viewers with machine-generated text transcribed from the spoken words in the audio feed. When you publish your live stream using MPEG-DASH, along with video and audio, our service will also deliver the transcribed text in IMSC1.1-compatible TTML, packaged into MPEG-4 Part 30 (ISO/IEC 14496-30) fragments. You can then play back this video+audio+text stream using a new build of Azure Media Player. The transcription relies on the Speech-To-Text feature of Cognitive Services.

This new feature is being demonstrated at the NAB 2019 trade show at the Microsoft booth #SL6716. Following the show, we are planning to make Live Transcriptions available for a private preview during a period of 3 to 4 weeks in May 2019. We are looking for customers and partners who are already using our service for streaming live events, preferably with our v3 APIs, and who can dedicate a few viewers to watch the stream live (not on-demand) and provide feedback. If you are interested in participating, please fill out this form. We will get in touch with the selected participants in the first week of May 2019.

Introducing an experimental content-aware encoding in Azure Media Services
Overview

In order to prepare content for delivery by adaptive bitrate streaming, the video needs to be encoded at multiple bitrates (high to low). To ensure graceful degradation of quality, as the bitrate is lowered, so is the resolution of the video. This results in a so-called encoding ladder: a table of resolutions and bitrates, as you can see in some of our fixed encoding presets, such as "H264MultipleBitrate1080p".

Interest in moving beyond a one-preset-fits-all-videos approach increased after Netflix published their blog in December 2015. Since then, multiple solutions for content-aware encoding have been released in the marketplace; see this article for an overview. The idea is to be aware of the content, customizing or tuning the encoding ladder to the complexity of the individual video. At each resolution, there is a bitrate beyond which any increase in quality is not perceptible, and the encoder operates at this optimal bitrate value. The next level of optimization is to select the resolutions based on the content; for example, a video of a PowerPoint presentation does not benefit from going below 720p. Going further, the encoder can be tasked to optimize the settings for each shot within the video; Netflix described such an approach in 2018.

In early 2017, we released the Adaptive Streaming preset to address the problem of the variability in the quality and resolution of the source videos. Our customers had a varying mix of content: some at 1080p, others at 720p, and a few at SD and lower resolutions. Further, not all source content was a high-quality mezzanine from a film or TV studio. The Adaptive Streaming preset addressed these problems by ensuring that the bitrate ladder never exceeds the resolution or the average bitrate of the input mezzanine.
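The capping behavior described above can be sketched in a few lines. This is an illustrative sketch only: the ladder values below are hypothetical and are not the actual table used by the Adaptive Streaming preset.

```python
# Hypothetical fixed ladder of (height, bitrate_kbps) rungs, high to low.
# These numbers are for illustration and are not the service's real preset.
LADDER = [
    (1080, 6000), (720, 3400), (540, 2000),
    (480, 1500), (360, 980), (270, 600),
]

def capped_ladder(source_height, source_bitrate_kbps):
    """Keep only the rungs at or below the source's resolution and bitrate."""
    rungs = [(h, b) for h, b in LADDER
             if h <= source_height and b <= source_bitrate_kbps]
    # Always keep at least the lowest rung so some output is produced.
    return rungs or [LADDER[-1]]
```

For a 720p source at 4 Mbps, the 1080p rung is dropped and the ladder tops out at 720p, which is the behavior the preset guarantees.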
The experimental content-aware encoding preset extends that mechanism by incorporating custom logic that lets the encoder seek the optimal bitrate value for a given resolution, but without requiring extensive computational analysis. The net result is that this new preset produces output that has a lower bitrate than the Adaptive Streaming preset, but at a higher quality. See the sample graphs below, which show the comparison using quality metrics like PSNR and VMAF. The source was created by concatenating short clips of high-complexity shots from movies and TV shows, intended to stress the encoder. By definition, this preset produces results that vary from content to content; it also means that for some content, there may not be a significant reduction in bitrate or improvement in quality. The preset is currently tuned for high-complexity, high-quality source videos (movies, TV shows). Work is in progress to adapt it to low-complexity content (e.g. PowerPoint presentations), as well as to poorer-quality videos. This preset also uses the same set of resolutions as the Adaptive Streaming preset; we are working on methods to select the minimal set of resolutions based on the content.

Using the experimental preset

You can create Transforms that use this preset as follows. If you are using a tutorial like this one, you can update the code as follows:

    TransformOutput[] output = new TransformOutput[]
    {
        new TransformOutput
        {
            // The preset for the Transform is set to one of Media Services built-in sample presets.
            // You can customize the encoding settings by changing this to use the "StandardEncoderPreset" class.
            Preset = new BuiltInStandardEncoderPreset()
            {
                // This sample uses the new experimental preset for content-aware encoding
                PresetName = EncoderNamedPreset.ContentAwareEncodingExperimental
            }
        }
    };

Next steps

Now that you have learned about this new option for optimizing your videos, we invite you to try it out.
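For reference, the PSNR metric used in the comparison graphs above is a standard measure derived from the mean squared error between the source and the encoded frame. A minimal standalone sketch (this is the textbook formula, not part of the Media Services API):

```python
import math

def psnr(mse, max_val=255.0):
    """Peak signal-to-noise ratio in dB for a given mean squared error,
    where max_val is the maximum possible pixel value (255 for 8-bit video)."""
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * math.log10((max_val ** 2) / mse)
```

Lower MSE (less distortion) yields a higher PSNR, which is why a preset that holds quality while cutting bitrate shows up as a favorable PSNR-versus-bitrate curve.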
You can send us feedback via the link below, or engage with us more directly at amsved@microsoft.com.

Use Azure Media Services with PowerApps
You can now build PowerApps with media hosted on Azure Media Services.

1. Create a new Azure Media Services account, if you don't have one already.
2. From your Azure Media Services account, locate and publish your video assets from Settings > Assets.
3. Encode your videos.
4. After the videos are published, copy the manifest URLs.
5. Start the streaming endpoint of your service, if it is not already running.

If you haven't tried PowerApps, you can always sign up for a trial using your work or school account. Download PowerApps Studio from the Windows Store. Alternatively, log in to the PowerApps portal and choose New app. For more step-by-step instructions, read about it on the Azure blog.

Embed Video Indexer insights in your website
Video Indexer embeddable widgets are a great way to start adding AI insights to your videos. Whether you want to add deep search ability to your published videos or let your users engage more with the video content on your website, you can easily achieve that by using the embed option in the Video Indexer web application or by using the Video Indexer API. To get started embedding Video Indexer insights in your website, you must have a registered account. If you don't have an account, you can easily sign in to Video Indexer using a Microsoft, Google, LinkedIn, or Azure Active Directory account and have one generated for you. Read more about it on the Azure blog.

Getting Started with the Video Indexer API
Earlier this month at BUILD 2017, we announced the public preview of Video Indexer as part of Microsoft Cognitive Services. Video Indexer enables customers with digital video and audio content to automatically extract metadata and use it to build intelligent, innovative applications. You can quickly sign up for Video Indexer and try the service out for free during our preview period. On top of using the portal, developers can easily build custom apps using the Video Indexer API. In this blog, I will walk you through an example of using the Video Indexer API to search for a keyword, phrase, or detected person's name across all public videos in your account, as well as the sample videos, and then to get the deep insights from one of the videos in the search results. Read about it on the Azure blog.

Enabling Azure CDN from Azure web app and storage account portal extension
Enabling CDN for your Azure workflow is easier than ever with this new integration. You can now enable and manage CDN for your Azure web app or Azure storage account without leaving the portal experience. When you have a website, a storage account for downloads, or a streaming endpoint for your media event, you may want to add CDN to your solution for scalability and better performance; this integration makes the CDN enablement experience easy for these Azure workflows. When you create a CDN endpoint from the Azure portal CDN extension, you can choose an "origin type" that lists all the available Azure web apps, storage accounts, and cloud services within your subscription. To enhance the integration, we started with CDN integration for Azure Media Services: from the Azure Media Services portal extension, you can enable CDN for your streaming endpoint with one click. Now we have extended this integration to web apps and storage accounts. Read about it on the Azure blog.

Announcing HTTP/2 support for all Azure CDN customers
In August 2016, we announced HTTP/2 support for Azure CDN from Akamai. We are pleased to announce that HTTP/2 is now also available for all customers using Azure CDN from Verizon. No further action is required from customers: HTTP/2 is on by default for all existing and new Azure CDN profiles, with no additional fees. HTTP/2 is designed to improve webpage loading speed and optimize the user experience. You will start enjoying the benefits of HTTP/2 today, without needing to update any of your code base! Read about it on the Azure blog.

Default Security permissions for Audiocodes Mediant in Azure
After troubleshooting an issue with our OVOC server not being able to access an SBC in Azure, and after modifying a few firewall restrictions, we discovered the default network port permissions on Azure. We added SNMP and allowed OVOC to talk to the SBC. This is a quick example of the rule we added; instead of using "any" as the source, we modified it to match the IP address of the OVOC server. Hope this helps someone else when spinning up an AudioCodes SBC within Azure.

Video Indexer – General availability and beyond
Earlier today, we announced the general availability (GA) of Video Indexer. This means that our customers can count on all the metadata goodness of Video Indexer to always be available for them to use when running their business. However, GA is not the only Video Indexer announcement we have for you. In the time since we released Video Indexer to public preview in May 2018, we never stopped innovating, and we have added a wealth of new capabilities to make Video Indexer more insightful and effective for your video and audio needs. Read about it in the Azure blog.