media services
Linking scanner & Azure
Hi all, I have a question about connecting a scanner (device) to Azure. I have a user who wants to scan numerous documents and have them go directly to a folder located in Azure. I presume this would involve Azure Blob Storage or an Azure file share? I am missing the link for how we would set it up on both sides. What are the requirements? Kind regards, Dino
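One common pattern, sketched below, is to have the scanner save into a local folder and run a small watcher that pushes new files into Azure Blob Storage with the Azure.Storage.Blobs SDK; mounting an Azure Files share over SMB (via the portal's "Connect" script on the file share) is the other typical option and needs no custom code if the scanner supports scan-to-SMB. The connection string, container name, and scan folder below are placeholders for illustration.

using System;
using System.IO;
using Azure.Storage.Blobs;

class ScanUploader
{
    static void Main()
    {
        // Placeholders: storage account connection string, target container,
        // and the folder the scanner writes to.
        var container = new BlobContainerClient(
            "<storage-connection-string>", "scanned-documents");
        container.CreateIfNotExists();

        var scanFolder = @"C:\Scans";
        using var watcher = new FileSystemWatcher(scanFolder, "*.pdf");
        watcher.Created += (sender, e) =>
        {
            // Upload each newly scanned file; overwrite if it is re-scanned.
            // (A production version would wait for the scanner to finish writing.)
            container.GetBlobClient(e.Name).Upload(e.FullPath, overwrite: true);
            Console.WriteLine($"Uploaded {e.Name}");
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for scans. Press Enter to exit.");
        Console.ReadLine();
    }
}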
Windows Media Player "class not registered"
Hello folks, whenever I try to play videos with Windows Media Player it always says "class not registered". I've tried everything, including "regsvr32 quartz.dll", and that didn't resolve it.
Method 2: Re-register the files related to Windows Media Player. To do this, follow the steps below.
a) Click Start and type cmd in the search bar, then right-click the Command Prompt icon in the programs area and click Run as administrator.
b) At the command prompt, type regsvr32 wmp.dll and press Enter.
c) Type regsvr32 jscript.dll and press Enter.
d) Type regsvr32 vbscript.dll and press Enter.
e) Restart the computer and then check if the issue is resolved.
Method 3: Disable and re-enable Windows Media Player from Windows Features and check if it helps.
a) Go to Start.
b) Type Turn Windows features on or off and open it.
c) Uncheck Windows Media Player under Media Features and click OK.
d) Restart your computer.
e) Open Turn Windows features on or off again.
f) Enable Windows Media Player under Media Features and click OK.
g) Restart your computer.
Please help me with this issue. Thanks.
Introducing Live Video Analytics from Azure Media Services - Now in preview
Azure Media Services is pleased to announce the public preview of a new platform capability called "Live Video Analytics", or LVA for short. LVA provides a platform for you to build hybrid applications with video analytics capabilities. The platform offers the capability to capture, record, and analyze live video and to publish the results (which could be video and/or video analytics) to Azure services in the cloud and/or on the edge.
Introducing live transcriptions support in Azure Media Services
Azure Media Services provides a live streaming platform (https://docs.microsoft.com/en-us/azure/media-services/latest/live-streaming-overview) which you can use to ingest, transcode, and dynamically package and encrypt your live video feed(s) for delivery via industry-standard protocols like HLS and MPEG-DASH. Live transcription is a new feature in our v3 APIs with which you can enhance the streams delivered to your viewers with machine-generated text that is transcribed from spoken words in the audio feed. When you publish your live stream using MPEG-DASH, then along with video and audio, our service will also deliver the transcribed text in IMSC1.1-compatible TTML, packaged into MPEG-4 Part 30 (ISO/IEC 14496-30) fragments. You can then play back this video+audio+text stream using a new build of Azure Media Player (http://amp.azure.net/libs/amp/latest/docs/index.html). The transcription relies on the speech-to-text feature of Cognitive Services (https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-to-text). This new feature is being demonstrated at the NAB Show (https://www.nabshow.com/) at the Microsoft booth #SL6716. Following the show, we are planning to make live transcriptions available for a private preview, during a period of 3 to 4 weeks in May 2019. We are looking for customers and partners who are already using our service for streaming live events, preferably with our v3 APIs, and who can dedicate a few viewers to watch the stream live (not on-demand) and provide feedback. If you are interested in participating, please fill out the form at https://aka.ms/livetranscription-earlyaccess. We will get in touch with the selected participants in the first week of May 2019.
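For readers on the v3 APIs, enabling transcription is surfaced on the live event itself. Below is a minimal sketch using the Microsoft.Azure.Management.Media .NET SDK; the Transcriptions/LiveEventTranscription names are taken from the current SDK and may differ in the preview build, so treat this as illustrative rather than as the definitive setup.

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Azure.Management.Media;
using Microsoft.Azure.Management.Media.Models;

static async Task CreateTranscribingLiveEventAsync(
    IAzureMediaServicesClient client, string resourceGroup, string accountName)
{
    // Define a live event that requests machine-generated transcripts for
    // English audio in addition to the video and audio tracks.
    var liveEvent = new LiveEvent(
        location: "West US 2",
        input: new LiveEventInput(streamingProtocol: LiveEventInputProtocol.RTMP),
        transcriptions: new List<LiveEventTranscription>
        {
            new LiveEventTranscription(language: "en-US")
        });

    // Create the live event without auto-starting it (a running event incurs billing).
    await client.LiveEvents.CreateAsync(
        resourceGroup, accountName, "myLiveEvent", liveEvent, autoStart: false);
}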
Introducing an experimental content-aware encoding in Azure Media Services
Overview
In order to prepare content for delivery by adaptive bitrate streaming (https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming), the video needs to be encoded at multiple bitrates (high to low). To ensure graceful degradation of quality, as the bitrate is lowered, so is the resolution of the video. This results in a so-called encoding ladder – a table of resolutions and bitrates, as you can see in some of our fixed encoding presets, such as "H264 Multiple Bitrate 1080p" (https://docs.microsoft.com/en-us/azure/media-services/previous/media-services-mes-preset-h264-multiple-bitrate-1080p). Interest in moving beyond a one-preset-fits-all-videos approach increased after Netflix published its per-title encode optimization work (https://medium.com/netflix-techblog/per-title-encode-optimization-7e99442b62a2) in December 2015. Since then, multiple solutions for content-aware encoding have been released in the marketplace – see the Streaming Media buyer's guide to per-title encoding (https://www.streamingmedia.com/Articles/Editorial/Featured-Articles/Buyers-Guide-to-Per-Title-Encoding-130676.aspx) for an overview. The idea is to be aware of the content – to customize or tune the encoding ladder to the complexity of the individual video. At each resolution, there is a bitrate beyond which any increase in quality is not perceptible – the encoder operates at this optimal bitrate value. The next level of optimization is to select the resolutions based on the content – for example, a video of a PowerPoint presentation does not benefit from going below 720p. Going further, the encoder can be tasked with optimizing the settings for each shot within the video – Netflix described such optimized shot-based encodes (https://medium.com/netflix-techblog/optimized-shot-based-encodes-now-streaming-4b9464204830) in 2018. In early 2017, we released the Adaptive Streaming preset (https://docs.microsoft.com/en-us/azure/media-services/latest/autogen-bitrate-ladder) to address the variability in the quality and resolution of source videos. Our customers had a varying mix of content – some at 1080p, others at 720p, and a few at SD and lower resolutions. Further, not all source content consisted of high-quality mezzanines from film or TV studios. The Adaptive Streaming preset addressed these problems by ensuring that the bitrate ladder never exceeds the resolution or the average bitrate of the input mezzanine. The experimental content-aware encoding preset extends that mechanism by incorporating custom logic that lets the encoder seek the optimal bitrate value for a given resolution, but without requiring extensive computational analysis. The net result is that this new preset produces an output with a lower bitrate than the Adaptive Streaming preset, but at a higher quality. See the sample graphs below, which show the comparison using quality metrics like PSNR (https://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio) and VMAF (https://en.wikipedia.org/wiki/Video_Multimethod_Assessment_Fusion). The source was created by concatenating short clips of high-complexity shots from movies and TV shows, intended to stress the encoder. By definition, this preset produces results that vary from content to content – it also means that for some content there may not be a significant reduction in bitrate or improvement in quality. The preset is currently tuned for high-complexity, high-quality source videos (movies, TV shows). Work is in progress to adapt it to low-complexity content (e.g., PowerPoint presentations), as well as to poorer-quality videos.
This preset also uses the same set of resolutions as the Adaptive Streaming preset – we are working on methods to select the minimal set of resolutions based on the content.
Using the experimental preset
You can create Transforms that use this preset as follows. If you are following a tutorial such as this one (https://docs.microsoft.com/en-us/azure/media-services/latest/stream-files-tutorial-with-api), you can update the code as follows:

TransformOutput[] output = new TransformOutput[]
{
    new TransformOutput
    {
        // The preset for the Transform is set to one of the Media Services built-in sample presets.
        // You can customize the encoding settings by changing this to use the "StandardEncoderPreset" class.
        Preset = new BuiltInStandardEncoderPreset()
        {
            // This sample uses the new experimental preset for content-aware encoding
            PresetName = EncoderNamedPreset.ContentAwareEncodingExperimental
        }
    }
};

Next steps
Now that you have learned about this new option for optimizing your videos, we invite you to try it out. You can send us feedback via the link below, or engage us more directly at amsved@microsoft.com.
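The snippet above only defines the transform outputs; the transform still needs to be registered with your account before jobs can reference it. A minimal continuation, assuming the same authenticated client, resourceGroup, and accountName variables used in the referenced tutorial (the transform name here is just an example):

// Create (or update) a Transform that uses the experimental preset; encoding Jobs
// can then be submitted against it by name.
string transformName = "ContentAwareEncodingExperimentalTransform";

Transform transform = await client.Transforms.CreateOrUpdateAsync(
    resourceGroup,
    accountName,
    transformName,
    output);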
Use Azure Media Services with PowerApps
You can now build PowerApps with media hosted on Azure Media Services.
1. Create a new Azure Media Services account, if you don't have one already.
2. From your Azure Media Services account, locate your video assets under Settings > Assets, encode your videos, and publish them.
3. After the videos are published, copy the manifest URLs.
4. Start the streaming endpoint of your service, if it is not running already.
If you haven't tried PowerApps, you can always sign up for a trial using your work or school account. Download PowerApps Studio from the Windows Store, or alternatively, log in to the PowerApps portal and choose New app. For more step-by-step instructions, read about it on the Azure blog.
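If you prefer to script the streaming-endpoint step rather than click through the portal, the v3 .NET SDK exposes it directly. A minimal sketch, assuming an authenticated client and that your account uses the default endpoint name (this post predates the v3 API, so the portal flow above remains the documented path):

using System.Threading.Tasks;
using Microsoft.Azure.Management.Media;

static async Task StartDefaultStreamingEndpointAsync(
    IAzureMediaServicesClient client, string resourceGroup, string accountName)
{
    // "default" is the streaming endpoint each Media Services account is created with;
    // manifest URLs only resolve while a streaming endpoint is running.
    await client.StreamingEndpoints.StartAsync(resourceGroup, accountName, "default");
}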
Embed Video Indexer insights in your website
Video Indexer embeddable widgets are a great way to start adding AI insights to your videos. Whether you want to add deep search over your published videos or let your users engage more with the video content on your website, you can easily achieve that by using the embed option in the Video Indexer web application or by using the Video Indexer API. To get started embedding Video Indexer insights in your website, you must have a registered account. If you don't have an account, you can easily sign in to Video Indexer using a Microsoft, Google, LinkedIn, or Azure Active Directory account and have one generated for you. Read more about it on the Azure blog.
Getting Started with the Video Indexer API
Earlier this month at BUILD 2017, we announced the public preview of Video Indexer as part of Microsoft Cognitive Services. Video Indexer enables customers with digital video and audio content to automatically extract metadata and use it to build intelligent, innovative applications. You can quickly sign up for Video Indexer and try the service out for free during our preview period. On top of using the portal, developers can easily build custom apps using the Video Indexer API. In this blog, I will walk you through an example of using the Video Indexer API to search for a keyword, phrase, or detected person's name across all public videos in your account (as well as the sample videos), and then to get the deep insights from one of the videos in the search results. Read about it on the Azure blog.
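As a rough illustration of the search-then-get-insights flow described above, the sketch below calls the Video Indexer REST API with HttpClient. The host, routes, and query parameters reflect the current api.videoindexer.ai surface rather than the 2017 preview endpoints, and the location, account ID, token, and video ID values are placeholders, so verify the exact paths against the API reference before relying on them.

using System;
using System.Net.Http;
using System.Threading.Tasks;

class VideoIndexerSearch
{
    static async Task Main()
    {
        // Placeholders: location, account ID, and an account access token
        // obtained from the Video Indexer authorization API.
        var location = "trial";
        var accountId = "<account-id>";
        var accessToken = "<access-token>";

        using var http = new HttpClient();

        // 1) Search across the account's videos for a keyword, phrase, or person's name.
        var searchUrl = $"https://api.videoindexer.ai/{location}/Accounts/{accountId}" +
                        $"/Videos/Search?query=satya&accessToken={accessToken}";
        Console.WriteLine(await http.GetStringAsync(searchUrl));

        // 2) Given a videoId from the search results, pull the full insights (the index).
        var videoId = "<video-id-from-search-results>";
        var indexUrl = $"https://api.videoindexer.ai/{location}/Accounts/{accountId}" +
                       $"/Videos/{videoId}/Index?accessToken={accessToken}";
        Console.WriteLine(await http.GetStringAsync(indexUrl));
    }
}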
Enabling Azure CDN from Azure web app and storage account portal extension
Enabling CDN for your Azure workflow is easier than ever with this new integration. You can now enable and manage CDN for your Azure web app or Azure storage account without leaving the portal experience. When you have a website, a storage account serving downloads, or a streaming endpoint for your media event, you may want to add CDN to your solution for scalability and better performance; this integration makes CDN enablement easy for these Azure workflows. When you create a CDN endpoint from the Azure portal CDN extension, you can choose an "origin type" that lists all the available Azure web apps, storage accounts, and cloud services within your subscription. We started this integration with Azure Media Services: from the Azure Media Services portal extension, you can enable CDN for your streaming endpoint with one click. We have now extended this integration to web apps and storage accounts. Read about it on the Azure blog.
Announcing HTTP/2 support for all Azure CDN customers
In August 2016, we announced HTTP/2 support for Azure CDN from Akamai. We are pleased to announce that HTTP/2 is now also available for all customers using Azure CDN from Verizon. No further action is required from customers: HTTP/2 is on by default for all existing and new Azure CDN profiles, with no additional fees. HTTP/2 is designed to improve webpage loading speed and optimize the user experience. You can start enjoying the benefits of HTTP/2 today without updating any of your code base. Read about it on the Azure blog.