Ignite 2021: Scale up with the latest enhancements of Azure Video Analyzer for Media (AVAM)
Published Nov 02 2021

Why is the number ‘3’ important? Azure Video Analyzer for Media (AVAM), formerly known as Video Indexer, celebrates 3 years as a generally available cloud solution for analyzing media files. AVAM is an Azure Applied AI solution with 33 machine learning-based vision and audio models that extract significant business insights from your media files. We are happy to announce that this 3-year-old product is now ready to scale up thanks to a set of innovative enhancements. We believe that any digital media asset library in the world can now get the most valuable insights with less than 3 days of effort. As an example, we now enable ARM-based account management (and it takes only 3 minutes to create a new account).

New Azure Video Analyzer for Media enhancements as of Microsoft Ignite Fall 2021

We see huge growth in video file consumption, accelerated by Covid, which results in a clear need for deep and robust analysis of video and audio libraries. With the Fall Ignite 2021 release, we provide the following enhancements:

  1. Public preview of Azure Video Analyzer for Media account management based on Azure Resource Manager (ARM)
  2. Global expansion to 6 new regions and 3 new source languages
  3. NPM package for our Insights widgets
  4. Automatic scaling of Media reserved units

In addition, we are proud to share a sneak peek at the AVAM collaboration with Streamland Media in the post-production space.

Here are the details:

[1] Public Preview: Manage your Azure Video Analyzer for Media account based on ARM

Azure Resource Manager (ARM) is the deployment and management service for Azure. It provides a management layer that enables you to create, update, and delete resources in your Azure account, and to use management features such as role-based access control (RBAC), locks, and tags to secure and organize your resources after deployment. Azure Video Analyzer for Media introduces a public preview of ARM-based account management (learn more about ARM).

As an account owner, you can now centralize your Azure resource management and manage your account using the command line and ARM templates, with a no-code option to create the account in the Azure Portal (aka ‘Ibiza’). ARM-based account management is a significant milestone: going forward, our customers will benefit from more secure, standard ARM-dependent Azure features such as private links, service tags, customer-managed keys, and deeper integration with Azure. For all the details, sample code, examples, and how to connect your existing account, I invite you to read Itai’s blog.
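To give a sense of what ARM-based management looks like in practice, here is a minimal sketch of deploying an account from an ARM template with the general-purpose @azure/arm-resources SDK. The template file name, resource group, and parameter names below are placeholders, not the official quickstart; take the actual template schema and end-to-end walkthrough from Itai’s blog.

```typescript
// Minimal sketch: deploying an ARM template that creates a Video Analyzer for Media
// account via the general-purpose @azure/arm-resources SDK. The template file,
// resource group, and parameter names are placeholders - use the template from
// the official quickstart for the real schema.
import { readFileSync } from "fs";
import { DefaultAzureCredential } from "@azure/identity";
import { ResourceManagementClient } from "@azure/arm-resources";

const subscriptionId = "<your-subscription-id>"; // placeholder
const resourceGroup = "my-media-rg";             // placeholder
// Hypothetical local copy of the account ARM template:
const template = JSON.parse(readFileSync("avam-account.template.json", "utf8"));

async function deployAvamAccount(): Promise<void> {
  const client = new ResourceManagementClient(new DefaultAzureCredential(), subscriptionId);

  // Incremental deployment: only resources defined in the template are added or updated.
  const result = await client.deployments.beginCreateOrUpdateAndWait(
    resourceGroup,
    "avam-account-deployment",
    {
      properties: {
        mode: "Incremental",
        template,
        parameters: { accountName: { value: "my-avam-account" } }, // hypothetical template parameter
      },
    }
  );

  console.log(`Deployment finished: ${result.properties?.provisioningState}`);
}

deployAvamAccount().catch(console.error);
```

The same template can of course be deployed from the Azure CLI or the portal; the point is that the account now behaves like any other ARM resource in your subscription.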

 

Join our preview: We invite existing customers who have a classic account and are already subscribed to Azure to join our preview program by connecting their account data to a new ARM-based account and benefiting from Azure CLI, templates, secure deployments, and more. Bill Gates frequently talks about the need for constant self-improvement: “We all need people who will give us feedback. That’s how we improve.” In that spirit, we are continuously improving and would love to hear your feedback. Please contact us if you are interested, or simply follow the instructions in this blog.

[2] Global Expansion: Video Analyzer for Media deployed in six new regions and supporting 3 additional source languages

Starting July 2021, you can create an Azure Video Analyzer for Media paid account in the France Central, Central US, Brazil South, West Central US, Korea Central, and Japan West regions. The full list of supported regions per product is available here.

 

By the end of December 2021, we will also enable 3 additional source languages: Persian (Iran) fa-IR, Hebrew (Israel) he-IL, and Portuguese (Portugal) pt-PT. If a language available in the Azure speech-to-text service is not yet enabled in Azure Video Analyzer for Media and is critical for your business, please contact our product group at visupport@microsoft.com. Setting the source language improves transcription: for example, selecting Hebrew as the source language before uploading and indexing a new video or audio file allows you to translate, search, and also get translated insights.
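As a quick illustration, here is a minimal sketch of indexing a video with an explicit source language through the Video Analyzer for Media REST API (the Upload Video operation). The location, account ID, access token, and video URL are placeholders; check the API reference for the full list of upload parameters.

```typescript
// Minimal sketch: upload and index a video with Hebrew (he-IL) as the source language,
// instead of relying on automatic language detection. All identifiers are placeholders.
const location = "trial";                     // placeholder account location
const accountId = "<your-account-id>";        // placeholder
const accessToken = "<account-access-token>"; // placeholder, obtained from the Get Access Token API

async function uploadWithSourceLanguage(videoUrl: string): Promise<void> {
  const params = new URLSearchParams({
    name: "my-hebrew-video",
    videoUrl,
    language: "he-IL", // explicit source language for transcription
    accessToken,
  });

  const response = await fetch(
    `https://api.videoindexer.ai/${location}/Accounts/${accountId}/Videos?${params}`,
    { method: "POST" }
  );

  const video = await response.json();
  console.log(`Indexing started, video id: ${video.id}`);
}
```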

Figure 1: Screenshot from the Azure Video Analyzer for Media web application with Hebrew enabled as the source language

 

[3] Embed and customize widgets in your app using the new NPM package

What’s the value of our widgets? We provide a way to embed web components (widgets) in your business application, so you do not need to reinvent the wheel and can simply leverage the UI parts we generate for you. As a solution developer, you can embed three types of widgets into your apps: Cognitive Insights, Player, and Editor. More information is available here.
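For context, this is roughly what the classic iframe-based embedding looks like, the approach the new package replaces. It is a minimal sketch: the embed URL pattern and query parameters reflect the widget embedding documentation at the time of writing and may change, and the account ID, video ID, and access token are placeholders.

```typescript
// Minimal sketch of the classic iframe embedding of the Cognitive Insights widget.
// All identifiers are placeholders; see the widget embedding docs for the exact URL format.
function embedInsightsIframe(container: HTMLElement): void {
  const accountId = "<your-account-id>"; // placeholder
  const videoId = "<your-video-id>";     // placeholder
  const accessToken = "<access-token>";  // placeholder, required for private videos

  const iframe = document.createElement("iframe");
  iframe.width = "580";
  iframe.height = "780";
  iframe.allowFullscreen = true;
  iframe.src =
    `https://www.videoindexer.ai/embed/insights/${accountId}/${videoId}/` +
    `?accessToken=${accessToken}&locale=en`;

  container.appendChild(iframe);
}
```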

 

Using our new JavaScript SDK (npm package) @azure/video-analyzer-for-media-widgets, you can add our insights widgets to your app and customize them to your needs with full flexibility. Instead of adding an iframe element to embed the insights widget, with this new package you can easily embed and communicate between our widgets. Customizing your widget is only supported in this package - all in one place! The package gives you full flexibility to bring your own custom models and data and integrate them into the insights widget, customize the styling and colors, and edit the data we extract. Get all the details and a working end-to-end example in Nofar’s blog.
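The sketch below is illustrative only: the class name, option shape, and render call are assumptions meant to convey the embedding flow, not the exact API surface. Take the real types and methods from the @azure/video-analyzer-for-media-widgets README and the working example in Nofar’s blog.

```typescript
// Illustrative sketch only - the export name, constructor arguments, and render()
// call are assumptions; consult the package README for the actual API.
import { InsightsWidget } from "@azure/video-analyzer-for-media-widgets"; // assumed export

const insightsWidget = new InsightsWidget(
  "insights-container",        // id of a div in your page (placeholder)
  { width: 580, height: 780 }, // assumed widget options
  {
    accountId: "<your-account-id>", // placeholder
    videoId: "<your-video-id>",     // placeholder
    accessToken: "<access-token>",  // placeholder
  }
);

insightsWidget.render(); // assumed method that mounts the widget into the container
```

The key difference from the iframe approach is that the widget is a first-class component in your page, so you can pass it custom data, restyle it, and wire it to the Player widget rather than treating it as an opaque embedded page.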

 

Figure 2: Three different examples of a customized insights widget

 

[4] Automatic scaling of media reserved units

Starting August 1st, 2021, Azure Video Analyzer for Media enabled auto scaling of Media Reserved Units (MRUs) by Azure Media Services. As a result, you no longer need to manage MRUs through Azure Video Analyzer for Media. Because scaling adjusts automatically to your business needs, this allows for cost optimization, and in many cases a price reduction.

 

Forward looking: what about additional models?

In the next quarter, we plan to release additional models in preview, such as person clothing detection, and to improve the accuracy of both the observed people model and audio event detection. These updates respond to customer feedback and digital media scenarios such as “find a person with a red shirt”, easily building curated pre-production content for content producers, or identifying laughter and hand clapping for accessibility scenarios.

 

Sneak peek at the AVAM collaboration with Streamland in the post-production space

AVAM and Streamland Media’s technology team have been collaborating closely over the past few months to ideate solutions that drive innovation in the post-production space. Working closely with Jeremy Stapleton (Global Head of Software) and Aline Shaw (Director of Product Innovation), the AVAM team is developing slate detection functionality that will allow the algorithm to automatically identify slates, as well as the writing and timecode on them.

The collaboration has focused on defining the specific requirements, including the metadata to be extracted from the slates, shaping the user experience, and reviewing the results.

We plan to release the new capabilities by mid-2022: clapper board detection and metadata extraction, color bar detection, and textless slate detection, including scene matching.

Upon release, this functionality will also be available in Scout, Streamland’s review and approval software. Today, Scout already leverages AVAM in a multitude of ways, such as searching through the media library by labels and faces, eliminating the need to manually scan through hours and hours of content.

This is an exciting partnership that helps expand the usability and functionality of AVAM across multiple post-production workflows. Combined with PULSE, Streamland’s Azure-enabled original camera file asset management tool, Scout and AVAM provide a suite of virtual tools to drive efficiencies for the creative community.

More to come!

 

Figure 3: Example of a color bar

 

Join us and share your feedback

 

For those of you who are new to our technology, we encourage you to get started today with our helpful resources.