
Jul 20, 2015
Why Doesn’t Anyone Label The Audio?
How do you tag your audio? Find out how standards are simplifying the internationalization of content and how they cope with identifying languages. Standardized audio and subtitle metadata may be a tiny addition to your assets, but over time it helps you find, use and monetize versioned content with almost no effort at all.

The great thing about language is that it allows us to exchange ideas and concepts, and hopefully to build a business by doing so. With the growing number of multi-platform delivery opportunities, and with rising bandwidths and channel densities, content owners have an ever-larger opportunity to create revenue with their content. Successfully exploiting that opportunity means tailoring each version of the content to its audience, reducing friction and increasing the enjoyment of the viewer or listener. The blockbuster movie community has known for a long time that efficiently versioning a movie and its collection of trailers on a territory-by-territory basis can make a significant difference to the number of people who watch it. I believe we are entering an era where turbo-charging versioning efficiency is going to be a dominant differentiator for media companies.

To reduce the cost of versioning and to make life simpler for the creative, human processes, it is necessary to automate the processes that can be done by machines (or, in our case, software). To a company that deals with video, every issue looks like a video issue. The processes for segmenting video content and replacing elements are pretty well understood, and organizations like the UK's DPP have created standards for interchanging that segmentation information.

In today’s blog, I'm going to assume that the video issues are largely understood and look at a “simple” issue that two customers approached me about here at the SMPTE Australia show.

Right now, there are many more languages spoken on the planet than there are scripts for writing them down, and there are many more scripts than there are countries in the world. This makes the labeling of languages and scripts an interesting challenge for any media company, as the variables are virtually endless. Many schemes are used for labeling audio, and anyone entering the industry would naturally assume that there must be some sort of global tag that everyone uses for identification ... right?

Wrong.

Traditionally, TV stations, broadcasters, content creators and others have created content for a specific market. Broadcasters, distributors, aggregators and others have sent their content to territories with only a handful of languages to cope with, so proprietary solutions for “track tagging” have usually been developed and deployed.

The compelling business need to streamline and standardize the labeling of audio channels hasn’t really existed until now. The internationalization of distribution compels us to find an agreed way in which labeling can be done.

Thankfully, someone got there before the media folks. The internet community has been here before - and quite recently. The internet standard RFC 5646 (BCP 47) is very thorough and copes with the identification of primary languages as well as dialects, extinct languages and constructed languages such as Klingon. With such a comprehensive and interoperable specification already used to deliver web content to billions of devices every day, you'd think that any media system designer worth his or her salt would have this document in their favorites list for regular look-up.

You'd think ...
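For anyone who does keep it handy: an RFC 5646 tag is just a hyphen-separated list of subtags. The minimal Python sketch below splits simple tags into language, script and region subtags. It deliberately ignores variants, extensions and grandfathered tags, so treat it as an illustration of the subtag idea rather than a full BCP 47 parser.

```python
import re

# A deliberately simplified view of an RFC 5646 / BCP 47 tag:
# language[-script][-region], ignoring variants, extensions and
# grandfathered tags that the full grammar also allows.
TAG_RE = re.compile(
    r"^(?P<language>[A-Za-z]{2,3})"          # ISO 639 language subtag, e.g. "en", "tlh"
    r"(?:-(?P<script>[A-Za-z]{4}))?"         # ISO 15924 script subtag, e.g. "Hant"
    r"(?:-(?P<region>[A-Za-z]{2}|\d{3}))?$"  # ISO 3166 region or UN M.49 code
)

def parse_tag(tag: str) -> dict:
    """Split a simple language tag into its subtags (case-normalized)."""
    m = TAG_RE.match(tag)
    if not m:
        raise ValueError(f"not a simple language-script-region tag: {tag!r}")
    language = m.group("language").lower()
    script = m.group("script")
    region = m.group("region")
    return {
        "language": language,
        "script": script.title() if script else None,
        "region": region.upper() if region else None,
    }

if __name__ == "__main__":
    for tag in ("en-US", "en-GB", "pt-BR", "zh-Hant-TW", "tlh"):
        print(tag, "->", parse_tag(tag))
```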

The MXF community knows a good thing when it sees it, so when it comes to a standardized way to tag tracks in MXF, the SMPTE standard ST 377-4 uses RFC 5646 as its vocabulary for labeling. ST 377-4 additionally recognizes that each channel of an audio mix might contain a different language. Each channel might also belong to a group intended as a stereo group, a surround-sound group, or a mono group of a single channel. This hard grouping defines the relationship between channels that should not be split. Going further, ST 377-4 defines groups of groups that are used as metadata to enable easy versioning, so that, for example, a French group might consist of a French stereo group, a clean M&E surround mix and a French mono audio description channel.
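As an illustration of that grouping model, the sketch below expresses the French example with a few Python dataclasses. The class names mirror the concepts (channels, soundfield groups, groups of soundfield groups) but are invented for this example; they are not the actual ST 377-4 structures or anyone's API. The "zxx" tag used for the M&E group is the standard subtag for "no linguistic content".

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Channel:
    # One audio channel with its RFC 5646 language tag, e.g. "fr" or "en-GB"
    label: str          # e.g. "Left", "Right", "Mono"
    language: str

@dataclass
class SoundfieldGroup:
    # A hard grouping of channels that must not be split (stereo, 5.1, mono, ...)
    kind: str           # e.g. "stereo", "5.1", "mono"
    channels: List[Channel]

@dataclass
class GroupOfSoundfieldGroups:
    # A versioning-oriented grouping, e.g. everything needed for a French version
    name: str
    groups: List[SoundfieldGroup] = field(default_factory=list)

# The French example from the paragraph above, expressed with these classes:
french_version = GroupOfSoundfieldGroups(
    name="French programme audio",
    groups=[
        SoundfieldGroup("stereo", [Channel("Left", "fr"), Channel("Right", "fr")]),
        SoundfieldGroup("5.1", [Channel(c, "zxx") for c in
                                ("L", "R", "C", "LFE", "Ls", "Rs")]),  # clean M&E, no language
        SoundfieldGroup("mono", [Channel("Mono", "fr")]),              # audio description
    ],
)
```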

Reality

ST 377-4 with RFC 5646 solves a difficult problem in a simple and elegant way. Until now, it has been easier for media companies to do their own thing and invent proprietary metadata vocabularies and labeling methods rather than use a standard. Today, to get cost-effective interoperability, we are starting to rely on standards more and more, so that we don't have to bear the cost of an endless number of proprietary connectors to make things work.

As you see more versions of more programs being created, spare a thought for the future costs and revenues of the media that needs to be exchanged. A little up-front, standardized metadata builds the launch ramp for a searchable, accessible library of internationalized content. Standardized audio and subtitle metadata may be a tiny addition to your assets, but over time it helps you find, use and monetize versioned content with almost no effort at all. Take action now and learn the difference between en-US and en-GB. It's more than just spelling.
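If your tracks carry proper tags, picking the right one for a viewer becomes a mechanical operation. The sketch below is a simplified version of the RFC 4647 "lookup" matching scheme, assuming plain language-region tags; it also shows why en-GB and en-US are not interchangeable labels.

```python
def lookup(preferred: str, available: list[str]) -> str | None:
    """Simplified RFC 4647 lookup: try the full tag, then progressively
    strip subtags from the right until something matches."""
    available_lower = {tag.lower(): tag for tag in available}
    candidate = preferred.lower()
    while candidate:
        if candidate in available_lower:
            return available_lower[candidate]
        if "-" not in candidate:
            return None
        candidate = candidate.rsplit("-", 1)[0]   # e.g. en-GB-oxendict -> en-GB -> en
    return None

tracks = ["en-US", "fr", "pt-BR", "es-419"]
print(lookup("en-GB", tracks))   # -> None: only en-US is available, not plain "en"
print(lookup("fr-CA", tracks))   # -> "fr": falls back to the base language
```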

YOU MAY ALSO LIKE...
Dalet Showcases Plug-and-Play IMF Workflows at NAB 2018
Dalet, a leading provider of solutions and services for broadcasters and content professionals, extends the Interoperable Master Format (IMF) capabilities and collaborative workflows within the new Dalet Galaxy five solution. Shown at NAB 2018 on booth SL8010, the Dalet Galaxy five component-based approach to IMF provides an exceptional user experience for maximum efficiency – simplifying IMF production and delivery workflows with visually intuitive graphical tools and automation. “The overall change in moving from traditional workflow to Dalet’s IMF component-based approach is more than a technical uplift, it's a change in the business value model,” states Matthieu Fasani, product manager, Dalet. “All too often a facility can be perceived as a provider of tapes and files that satisfy individual creative projects, but the real goal here is to move into a position of being able to fulfil a continuous business need by providing a technical back office and, potentially, a store-front for fulfilling the national and international distribution requirements. Dalet enables this by demystifying the complexity of IMF, industrializing the process with a component-based approach that is incredibly flexible and quick to adopt across organizations.” Dalet has been pioneering component-based workflows and the IMF workflow industrialization with Dalet Galaxy, its flagship MAM and Workflow Orchestration platform, creating real business opportunities for global distribution. “In a traditional workflow you need to re-render your entire project several times throughout its lifecycle including every version you need to distribute,” explains Arnaud Elnecave, vice president, marketing, Dalet. “The Dalet Galaxy five data model and toolset supports comprehensive component-based workflows. This means you never render, you change only the specific elements that need to be modified or added supplements within a project or collection, eliminating the need to duplicate media files. This is a much more flexible approach that offers significant savings on time and resources for production and delivery. This will become more and more critical for efficient program versioning and distribution workflows where various essences are produced, transformed, arranged, packaged and exchanged repeatedly between media properties.” NAB attendees can book a private briefing here to learn more about Dalet Galaxy five IMF and other component-based workflows. 
New IMF Workflow Features of Dalet Galaxy five
New features, introduced with Dalet Galaxy five and shown at NAB 2018, include native IMF support and a Video Timeline view to speed approvals and production:
- Native support for IMF assets, such as the Composition Playlist (CPL) and the Output Profile List (OPL), comes as default alongside objects like video, audio or images when you open Dalet Galaxy
- Generating proxies for preview of track files and CPLs in Dalet WebSpace
- Visualizing asset relationships (between track files and CPLs, CPLs and OPLs, and between CPLs belonging to the same title)
- Automating the creation of new versions by leveraging the Workflow Orchestration platform capabilities
- Automatically connecting CPLs using EIDR (Entertainment Identifier Registry) and ISAN (International Standard Audiovisual Number) identifiers
- Orchestrating the rendering of compositions and the generation of new IMPs, manually or automatically, based on configurable business rules
A Great User Experience
“The IMF standard brings huge benefits at several key steps of the media supply and distribution chain, but at the cost of some complexity. In order for users to leverage all of its benefits, IMF workflows need to be intuitive and easy to manage,” adds Fasani. “That’s exactly what Dalet Galaxy five is introducing. The way the Dalet user experience exposes IMF assets for manipulation is so unique, it will change the way people perceive and understand IMF. In just one or two days you can set up sophisticated workflows to receive IMF packages, review the content, trigger export to Netflix or any distributor you wish, and simplify end-to-end IMF workflows for all of your team. We are replacing complexity with usability.”
Dalet Galaxy five introduces a new set of tools within Dalet WebSpace specifically designed for preview, review and approval of all objects of an IMF package, including the CPL, with a glance at the Video Timeline available on any web client. There is no need for an extensive mastering tool. Simply import the package content into Galaxy, automatically generate proxies and extend collaborative review and approval workflows. The Bundle Object groups all the components that belong to the same title under one umbrella. Inside the Bundle Editor, users will find the lifecycle of component-based workflows presented in a simple, visual layout:
- The Media Tab lists all the component track files (video, audio, subtitles/captions)
- The Production Tab lists all the CPLs/OPLs, which describe how components have been assembled and should be transformed to an output deliverable
- The Version Tab exposes the renderings of CPLs/OPLs into final deliverables for consumption by end users (AS-11 for linear playout, HLS/DASH for OTT distribution)
- The Packages Tab tracks how individual components have been received/delivered and allows users to generate new packages with simple drag-and-drop operations
Extensive Third-Party Integration
Dalet works with the IMF ecosystem of vendors (mastering tools from Marquise Technologies, Rohde & Schwarz and Colorfront; QC tools from Netflix Photon, Interra Systems and IRT) to ensure robust end-to-end IMF workflows, including receiving IMF packages, referencing IMF assets in a central repository and leveraging IMF metadata, both technical and editorial, to automate processes at scale and expose the relevant information to the appropriate users.
Learn more about IMF and Dalet Component-Based Workflows
Click here to visit the dedicated webpage.
Watch the Video Presentation Learn More About IMF and Dalet component-based workflows at Dalet Pulse Join 250+ media and technology executives at Dalet Pulse to discover latest innovations from Dalet and technology partners. Dalet Pulse will take place on April 8th at the SLS Hotel in Las Vegas. To register, please click here. Schedule a Private Press Briefing with Dalet at NAB 2018 Members of the NAB registered press are invited to request a private press briefing with Dalet at NAB 2018. For more information, please contact Alex Molina at alex@zazilmediagroup.com. About Dalet Digital Media Systems Dalet solutions and services enable media organizations to create, manage and distribute content faster and more efficiently, fully maximizing the value of assets. Dalet products are built on three distinct platforms that, when combined, form versatile business solutions that power end-to-end workflows for news, sports, program preparation, production, archive and radio. Individually, Dalet platforms and products offer targeted applications with key capabilities to address critical media workflow functions such as ingest, QC, edit, transcode and multiplatform distribution. The foundation for Dalet productivity-enhancing workflow solutions, Dalet Galaxy is the enterprise Media Asset Management (MAM) & Orchestration platform that unifies the content chain by managing assets, metadata, workflows and processes across multiple and diverse production and distribution systems. Specially tailored for news and media workflows, this unique technology platform helps broadcasters and media professionals increase productivity while providing operational and business visibility. Dalet AmberFin is the high-quality, scalable transcoding platform with fully integrated ingest, mastering, QC and review functionalities, enabling facilities to make great pictures in a scalable, reliable and interoperable way. Addressing the demanding needs of studio production, multi-camera ingest, sports logging and highlights production, the innovative Dalet Brio video server platform combines density and cost-effectiveness with high reliability. Adopted by leading broadcasters, Dalet Cube is a suite of applications to create, manage and deliver graphics in a newsroom scenario. Dalet supports customers from the initial planning stages to well beyond project execution. Our global presence includes 17 offices strategically located throughout Europe, the Middle East, Asia Pacific, North America and South America, and a network of more than 60 professional partners serving 87 countries worldwide. This collective experience and knowledge enables our customers to realize potential increases in productivity, efficiency and value of their assets. The comprehensive Dalet Care program ensures deployments remain up and running with 24/7 support 365 days a year. Dalet systems are used around the world by many thousands of individual users at hundreds of TV and Radio content producers, including public broadcasters (ABS-CBN, BBC, CBC, DR, FMM, France TV, RAI, RFI, Russia Today, RT Malaysia, VOA), commercial networks and operators (Canal+, FOX, eTV, MBC Dubai, MediaCorp, Mediaset, Orange, Time Warner Cable, Warner Bros, Sirius XM Radio), and government organizations (Canadian House of Commons, Australian Parliament and UK Parliament). Dalet is traded on the NYSE-EURONEXT stock exchange (Eurolist C): ISIN: FR0011026749, Bloomberg DLT:FP, Reuters: DALE.PA. Dalet® is a registered trademark of Dalet Digital Media Systems. 
All other products and trademarks mentioned herein belong to their respective owners.
IMF, the Age of Reason – Key Takeaways from HPA Tech Retreat
Some say the annual HPA Tech Retreat in Palm Desert, CA shapes the agenda for the rest of the year in the media industry, and as it relates to program distribution, 2018 is no exception. We’ll likely look back at 2018 as the year that IMF (the Interoperable Master Format) went from an emerging capability to full-blown industry reality. IMF is a file-based framework for the exchange and processing of multiple content versions of the same high-quality finished work. One of the primary benefits of this innovation is that version files reference media from the original master, limiting storage overhead and delivery time. This year’s HPA Tech Retreat hosted an Innovation Zone with a focus on IMF, where multiple vendors showed how interoperable IMF has become, with mastering, management, QC and playout all working together. IMF moved beyond talk at HPA 2018; it became a working set of production-ready capabilities.
Taking a step back…
The transition to file-based media brought a massive transformation in the production and post-production business, removing the restrictions of real-time processing from workflows and introducing a real boost in efficiency. In the 10 years since, however, collaboration, the workflows themselves, and media operations have remained relatively similar, even while the demands have started to change dramatically…
We live in a multi-multi world!
In listening to our customers and talking with many others in the market, we’ve been hearing that media businesses are getting squeezed by the increase in multi-platform distribution. While the creation of new outlets is a boon for the business as a whole, the acceleration of release windows, coupled with the increasing number of outlets, is leading to some interesting considerations:
- How do I create more versions for less money per version?
- Is it possible to increase version creation and distribution outlet deliveries automatically?
- Can my single-purpose work tracking system and silos of media processing equipment work together?
- How do I bring my creative, production and marketing teams together to share information and media to avoid duplication and increase our offerings?
- What do I do with IMF?
Seriously, what can be done about creating more versions?
At Dalet, we’re big believers in automated processing, driven by recipes designed by the people who know the business best. If you start with operators and administrators looking at the overall customer work that has been coming in, it is possible to build media recipes that can be automatically fulfilled as the media supply chain operates, including:
- Video
- Audio languages
- Subtitle languages
- Metadata transformations
- Packaging instructions
Once recipes are defined, it really becomes as simple as connecting media processes and work operations to fulfill the recipe and create versions that go beyond simple transcodes and include everything necessary, including the proper metadata, images and folder/ZIP structures to make a delivery to Netflix, Comcast, Hulu or others. Of course, a modern system that can handle the creation and implementation of these recipes and integrate with the equipment already onsite is key to making this work well. Dalet Galaxy has both the back-end and the user tools that enable your operators to define the best workflows and recipes for your business.
Moving toward automated version creation
To start, it really takes a firm evaluation of what is happening ‘down on the farm’, so to speak.
In a lot of our discussions, we’re hearing about processes and systems that worked really well when they were implemented 10 years ago but just aren’t designed to keep up today, particularly single-purpose work order tracking systems that don’t have any integration with media processing systems. So, while it can look like there is a lot of automation available to the operation, there is really just a lot of manual interaction required by operators to kick off the automated processes, and when order volume increases, it is difficult to get the supply chain ramped up, since it is dependent on human interaction. These types of scenarios are best solved by media orchestration and management solutions (also known as business process management, BPM, and workflow orchestration), which allow a single platform to ride on top of the existing media processes, interacting with the post operation’s existing equipment, so that, for example, media files are automatically categorized for work, assigned to the proper tasks, sent into a craft edit environment for final processing, transcoded and sent to the proper distribution channel.
Of course, these orchestrators can’t run completely automatically, as the post business relies on a high degree of review and approval processes for human operators to inspect media as it progresses through the supply chain, and the best systems provide user task panes with customized user interfaces to not only fulfill the manual review, but also use the operators’ input to continue down the proper automated path. For example, if a review reveals that the subtitles received for the content are actually Cantonese instead of Mandarin, the workflow engine should allow the operator to send a request back to the subtitle provider to redeliver and wait until the updated content arrives. By the same token, content that is deemed ready to proceed should be automatically transcoded to the proper formats, packaged with the metadata, and sent to the required distribution outlets, based on the original work order, all without any further operator interaction. The two keys to being more efficient here are:
- Implementing a system that can drive your media supply chain automatically, end-to-end, utilizing your existing equipment
- Ensuring that this system can also interact with your users and use their input to drive automated media processes
It is certainly possible to house a central catalog or small Media Asset Management (MAM) system to pull this off, and then join that with your existing work order system to try to get these teams together. What we’ve seen, though, is that because operators have to use different systems to move between receiving work, actually doing work, and then marking work complete, they end up stuck with a disjointed setup like this. Even worse, the systems purchased are rarely capable of handling user access control lists so that each team is able to find what it needs without disrupting the others’ work. The solution here is to make sure that you have a central system that can handle user permissions, assign access to media when and where it is necessary, and allow teams to receive and perform work all from one interface.
Enter IMF
The IMF standard allows the creation of a single original version with a full set of media essence, where each additional distribution receives a supplemental package that contains just the differences of that version from the original.
In the case where we need to deliver a French subtitled version of the original movie, it is only necessary to send the French subtitles and a composition that references the original version’s media assets. Similarly, if we need to create a full French language version, we only need to deliver the titles, credits and dub video sequences, the French language audio track and, again, a composition that references the original media. Consequently, storage needs are massively reduced, sometimes by as much as 90% over legacy version creation! IMF also saves on QC cost and improves workflow automation. However, to make the most of IMF, media organizations need a powerful, IMF-aware asset management and orchestration platform. Indeed, as each of those supplemental versions gets delivered as a new folder in the mastering structure, organizations can end up with multiple sub-folders for each additional version in a nested structure that becomes very difficult to manage, instead of one nice flat file structure.
IMF seems to be the train that isn’t stopping. The support from Netflix has ensured that the entire industry pays attention and considers its use in their operations. With NABA and the DPP also on board with IMF for broadcast and online, the world is continuing to move toward component media workflows in a big way. Component media workflows are great in theory; you get to retain your media essence files in the original component format, mixing and matching them as necessary to create the versions for production, without wasting time or storage space conforming interleaved files. In practice, managing component media, including IMF, has become a serious challenge. At Dalet, we’ve seen everything from complex folder structures to multi-tab spreadsheets used to track which packages belong with which titles from multiple providers. We believe managing IMF should be easier than that, so we’ve been building native IMF management into our MAM and Orchestration platform, Dalet Galaxy, to make sure that facilities can easily match up IMF packages and create the deliverables they need (IMF or otherwise). This is why Dalet has developed dedicated features to make IMF management as simple as possible, with two platforms available for two different use cases:
- Dalet AmberFin, for customers that need IMF processing: IMP generation, validation, CPL transcoding and delivery.
- Dalet Galaxy, for customers that need enterprise-scale IMF management and orchestration.
Of course, Dalet Galaxy and Dalet AmberFin work perfectly together, with Dalet offering a single unified solution that allows you to create IMF, ingest IMF titles, manage the relationships between original and supplemental versions, and deliver IMF packages for your distribution needs. For our customers that need to submit IMF to Netflix, Dalet AmberFin’s IMF workflow allows you to create a simple IMP, with all required files, from any input. Dalet AmberFin will take care of the JPEG 2000 transcoding, the audio track labeling, the Timed Text Markup Language (TTML) creation and the IMF metadata, as well as submission of the package to Netflix Photon and, coming soon, delivery directly to Netflix Backlot, all in one orchestrated workflow. For those who are wondering, AmberFin can be used for ProRes IMF too! Dalet AmberFin can create complex CPL packages as well, and since it works as a distributed transcoder, it can scale both on premises and in a cloud environment.
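To picture the supplemental-package idea described above, here is a small illustrative Python model of an original version plus a French-subtitle supplemental package. It is not IMF XML (real CPLs and packing lists are defined by the SMPTE ST 2067 family and use UUIDs); it only shows how a new composition can reference essence that never leaves the original package.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Asset:
    asset_id: str      # in real IMF this would be a UUID
    kind: str          # "video", "audio", "subtitles", ...
    in_package: str    # which IMP physically carries the file

@dataclass
class Composition:
    # Stand-in for a CPL: an ordered list of asset references, not the media itself
    title: str
    references: List[str] = field(default_factory=list)

# Original version: carries all of the heavy media essence.
original_assets = [
    Asset("video-ov", "video", "imp-original"),
    Asset("audio-en", "audio", "imp-original"),
]
original_cpl = Composition("Feature (original)", ["video-ov", "audio-en"])

# Supplemental package: only the French subtitles travel; video and audio
# are referenced from the original package, which is where the saving comes from.
supplemental_assets = [Asset("subs-fr", "subtitles", "imp-supplemental-fr")]
french_cpl = Composition("Feature (fr subtitled)", ["video-ov", "audio-en", "subs-fr"])

# What actually has to be shipped for the French subtitled version:
print([a.asset_id for a in supplemental_assets])   # -> ['subs-fr']
# What the new composition plays out, resolved across both packages:
print(french_cpl.references)                       # -> ['video-ov', 'audio-en', 'subs-fr']
```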
Customers with large quantities of existing IMF content can use the Dalet Galaxy platform to tie original versions and supplemental packages together, giving a simple browse and search capability for titles with multiple versions. Dalet Galaxy harnesses the power of Dalet AmberFin to parse the IMF packages and retains all essential packaging information, including the Composition Playlist (CPL), in the title versions. The Dalet Context Map allows a visualization of the relationships between the different tracks, the different CPLs and an IMF title. Never before has there been a tool specifically built for versions like this in an asset management system, and Dalet built this specifically for versioning customers like studios and post-production houses. Delivering an IMP from Dalet Galaxy is as simple as selecting the desired version. The IMP package will be created by resolving all media references in the CPL for the selected version, and these packages can be distributed over any of the existing connectors built into Galaxy, including Aspera, Signiant, FileCatalyst, Amazon S3, Azure Blob Storage, FTP and network file systems. Users can also use the workflow engine to drive a rendering of a composition for iTunes delivery as ProRes, or any other format deliverable that you might need in your operation.
What about the investment? That really depends on the vendor a facility works with. At Dalet, we work with two kinds of operations:
- the ones that love to get the cost out of the way up front and then amortize that cost over time on their projects, and
- the ones that need systems that can be priced out per project, on a lease or SaaS basis.
For example, we know that in the post-production world there are only a handful of facilities that fit in the first category, so we’ve implemented methods to bring our media systems into their environments as OpEx, either inside their facility, in a public cloud such as Amazon Web Services (AWS) or Azure, or as a hybrid of the two. With Dalet IMF solutions, media organizations have the ability to fulfill the entire promise of IMF. Not only can they save storage space by moving to supplemental versions rather than storing the entire package, they can also target distribution outlets with IMF requirements (such as Netflix) while retaining the capability to deliver the same content to any other vendors.
Want to go further?
Download the Dalet IMF White Paper
Learn more about Dalet Versioning & IMF solutions
Contact us and plan a Dalet IMF demo
Dalet Galaxy Federates IMF Interoperability Showcase at 2018 HPA Tech Retreat
Dalet, a leading provider of solutions and services for broadcasters and content professionals, will showcase concrete IMF production and delivery workflows at the 2018 Hollywood Post Alliance (HPA) Tech Retreat's Innovation Zone. The special HPA showcase organized by the worldwide IMF User Group, an HPA open forum for the community of end users and implementers of the IMF family of standards, will feature member companies including Colorfront, Marquise Technologies and Rohde & Schwarz demonstrating typical IMF scenarios with Dalet Galaxy media asset management (MAM) and orchestration at the core of the interoperability showcase. “Turning IMF from a promise to an actual production-ready implementation is a great step for the industry as a whole. The adoption of the standard will become more and more critical for content originators and distributors to optimize workflows in today’s explosive film and broadcast market. Dalet has committed to this standard and initiative from the get-go. We are glad to spearhead and enable this new generation of workflows and format,” comments Matthieu Fasani, product line manager for Dalet MAM & SaaS solutions. “There are real substantive business benefits in adopting IMF. Simplified distribution, optimized delivery, reduced storage footprint, shorter lead time for product turnaround and reduced QC costs allow content producers to industrialize their productions and scale delivery to meet demands. Deploying IMF-proven solutions is game-changing, and now is the time for broadcasters and content producers to make the leap.” The only MAM vendor participating in the IMF User Group showcase in the HPA Innovation Zone, Dalet will demonstrate with member companies an end-to-end IMF workflow including receiving IMF packages; referencing IMF assets in a central repository; leveraging IMF metadata, both technical and editorial, to search on all assets and data; generating proxies for preview of track files and CPLs in Dalet WebSpace; visualizing asset relationships (track files and CPLs, CPLs and OPLs); automating the creation of new versions; and connecting CPLs using EIDR (Entertainment Identifier Registry) and ISAN (International Standard Audiovisual Number) IDs. HPA attendees can book a meeting with Dalet to learn more about IMF workflows via: http://www.dalet.com/events/hpa-tech-retreat-2018.
About Dalet Advancements in IMF Workflows
Dalet has been pioneering IMF workflow industrialization with Dalet Galaxy, a MAM and Orchestration platform that features a comprehensive and intuitive set of tools to manage IMF packages at the production, distribution and contribution stages of a media business. Users can easily import and preview IMF packages natively in Dalet WebSpace, visualize IMF structures and components with Dalet Context Maps, build versions and supplemental packages with Dalet Track Stack and Dalet Version Editor, and eventually wrap compositions for distribution using a connector to an IMF-compliant transcode farm, such as Dalet AmberFin. A white paper on how Dalet IMF Workflow solutions can benefit your facility is available here: www.dalet.com/white-paper/imf.
About the IMF User Group
The Interoperable Master Format (IMF) User Group (UG) is a forum for the worldwide community of end users and implementers of the IMF family of standards. The IMF UG brings together content owners, service providers, retailers and equipment/software vendors to enhance and promote the use of IMF globally, across domains of applications.
The IMF UG discusses technical operational issues that arise in practical implementation, conducts interoperability testing, develops best practices, and seeks to broaden the awareness of IMF. The IMF UG is organized under the umbrella of the HPA.
Metadata is “The New Gold”!
Plain and simple. Some might categorize the above statement as an exaggeration; nevertheless, we persist! A number of technology trends got the lion’s share of buzz and display at IBC’s 50th edition: cloud and smart hybrid infrastructure, video over IP, workflow orchestration and automation, and, last but not least, big data, machine learning and artificial intelligence (AI). As game-changing as they can be, we believe that these are in fact enablers of a much bigger trend: better serving and engaging with audiences.
Content discovery for consumers
It is well documented that the success of online video relies, in part, on metadata. Metadata-centric workflows give viewers the freedom to become more engaged. They can discover and explore more content, navigating directly to the most interesting scenes (including the intent of the scene, e.g. serene, suspense, etc.). Publishers can fully monetize viewer habits and experiences in the most effective way possible with a Media Asset Management (MAM) & Orchestration platform that allows end-to-end authoring and management of asset metadata in their production workflow. For on-demand video consumption, accurate description of content is key to helping recommendation engines narrow down to more relevant suggestions. For ad-based or product placement models, metadata helps define the optimal in-stream video insertion points, allowing publishers greater control and flexibility with their advertising strategies. Scene metadata such as character name, player name, topic and keyword becomes key. The more accurate and rich the description of these insertion points, the better advertisers can pick the slots that fit both their target audience and the brand experience they are looking to create.
Metadata-driven operations & workflows
Today, metadata is also at the heart of the orchestration of workflows and the automation of processes, both of which are becoming increasingly important to streamline and scale production and distribution operations. Any process and/or action in the chain of operations can be triggered by any change or fulfillment of a condition on any of the fields of the data model. These configurable metadata-driven workflows are extremely powerful. While the industry has moved away from the simplicity of "one profile per customer", we can today create an environment where a single workflow can produce all the desired outputs just by changing the metadata that initiates a particular process.
Managing complex media objects
Metadata is core to structuring and managing complex media objects and their relations, enabling operations like versioning and the packaging of track stacks or compositions. To enable better collaboration and flawless content transformation and curation, organizations need to disintermediate the supply chain. One of the key concepts is to avoid conforming a project or a composition until it actually needs to be packaged for an output. To enable this, media management platforms need to handle complex objects seamlessly so that users can work directly with the elementary components of a media package or project. This gives them the ability to manipulate, transform, share and repurpose content in a very efficient and agile way - a concept called transclusion. The Dalet Galaxy platform’s user tools, data model and workflow engine offer a robust and configurable framework to power these types of operations, and they support the industry's latest standards such as IMF, DPP and many others.
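As a toy illustration of the metadata-driven triggering described above, the sketch below re-evaluates a set of rules whenever a metadata field changes. The rule, field and asset names are invented for the example; this is not Dalet Galaxy's workflow engine or its API.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    condition: Callable[[Dict], bool]   # predicate over the asset's metadata fields
    action: Callable[[Dict], None]      # process to trigger when the predicate holds

def on_metadata_change(asset: Dict, rules: List[Rule]) -> None:
    """Re-evaluate every rule whenever any metadata field changes."""
    for rule in rules:
        if rule.condition(asset):
            rule.action(asset)

rules = [
    Rule(
        name="send to subtitling when QC passes and no captions exist",
        condition=lambda a: a.get("qc_status") == "passed" and not a.get("subtitle_languages"),
        action=lambda a: print(f"-> create subtitling task for {a['title']}"),
    ),
]

asset = {"title": "Episode 101", "qc_status": "passed", "subtitle_languages": []}
on_metadata_change(asset, rules)   # prints: -> create subtitling task for Episode 101
```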
Augmenting media tagging & operations with AI
Video is a complex medium that requires both human-authored metadata, which is flexible and traditionally more accurate, and automatically created metadata, which is growing quickly thanks to AI-powered services. AI is indeed a key next step for the media industry. At IBC 2017, Dalet showcased first prototypes connecting a selection of AI engines to the Dalet Galaxy platform in order to build new services that range from simple, automated content indexing and metadata generation all the way to smart assistants that augment user operations. Deployable on-premises or in the cloud, these services produce time-correlated, multi-dimensional metadata from audio and video data, unlocking new insights. Coupling these services with the Dalet Galaxy platform provides broadcasters and media organizations with an end-to-end solution that will serve as an enabler to capture tomorrow's business opportunities and generate new benefits.
Hollywood’s Colortime Facility Industrializes Client Content Services with Dalet Workflow Orchestration and Media Asset Management for Post-Production
Dalet Digital Media Systems, a leading provider of solutions and services for media organizations, is taking on the post-production market with Colortime, a Burbank, California-based post-production facility that specializes in feature film and television episodic editorial, dailies, VFX, color correction and content management services for a global customer base. Like many post facilities, Colortime is faced with an increasing number of versions to produce while simultaneously reducing the time to deliver those versions to various content channels. Taking a different approach to the traditional workarounds and spreadsheet management tools, industry veteran Moshe Barkat turned to Dalet for its expertise in workflow automation and media asset management (MAM), selecting the open, ready-made Dalet Galaxy platform. The new workflow, which will include Dalet Galaxy MAM and Workflow Engine, Dalet WebSpace, Dalet Storyboarder and Dalet Xtend for Adobe® Premiere® Pro, aims to maximize efficiency for managing client content and package delivery by connecting post-production and archives as well as managing work orders for file delivery and automating the corresponding transcodes.
“We are managing a massive amount of content for a growing client base and required a solution that would provide both content organization features and automation capabilities with high scalability. We found the Dalet Galaxy platform delivers on all of these needs,” states Moshe Barkat, CEO of Colortime. “For the many clients and brands that we work with, it is critical that the content is organized in a manner that allows both Colortime and the client – or even the client’s client – to efficiently access it and distribute it in any format they require during or after post.” Moshe elaborates, “Dalet Galaxy will serve as the foundation to efficiently organize and easily retrieve the vast number of files we store across post-production storage servers and archives. It will also allow us to simplify distribution, by creating automation profiles for the many transcode and package assembly jobs we do at Colortime, saving us quite a bit of time, whether it is a final file delivery or pulling content from the archive in a particular format that a client needs.”
Dalet Galaxy will manage content ingest and metadata indexing onto Colortime storage and archive servers, creating a central content repository for a 360-degree view into assets. Purpose-built applications such as Dalet WebSpace will allow staff to quickly browse the repository and review work in progress or archived content from any laptop or workstation. Dalet Storyboarder will provide simple cuts and assembly capabilities, allowing non-editing staff to storyboard and share concepts. Dalet Xtend for Adobe Premiere will connect editors to the main content pool with bi-directional metadata tracking across all systems, keeping content easily searchable throughout the post process.
“From workgroup to enterprise implementations, Dalet continues to prove that its renowned MAM and media workflow technology - honed over years of project experience - is capable of assisting customers not only in broadcast operations, but also in multi-screen distribution and post-production,” states Simon Adler, General Manager, West Coast USA & Canada, Dalet. “Our continued investments in R&D allow our offer to remain flexible and constantly adapt to new market demands.
As the chain between content producers and the audiences keeps getting more integrated and faster, post-production and media services facilities have a tremendous new business opportunity to grasp, provided they build their operations on top of the right technology foundation. We see this project as a key reference in this new generation.” The Dalet Workflow Engine will seamlessly integrate with Colortime’s existing Dalet AmberFin installation, nearly eliminating the need to manually transcode files and organize final packages. “We will use the Dalet Workflow Engine to automate the many different client file delivery requirements,” says Bill Womack, CTO of Colortime. “Combined with our Dalet AmberFin installation, Dalet will facilitate creation-on-demand for many customer deliverables that require transcoding. And from a top-down view, the Workflow Engine will let us define certain governance or rules to manage work orders and orchestrate the creation of packages for each client.” Bill explains the complexity in automating this aspect of the business: “The automation process is rules-based and unique to each client, which you don’t typically see in a television or broadcast environment, as they’re only developing content for their own proprietary channels. They don’t have the same breadth of workflows or depth of rules that we are working with. Being able to dial into each client and define a profile innovates the way we deliver content and customer service.”
CCW, SOA, FIMS and the King & Queen of the Media Industry
All-Star Panel Sessions at CCW 2014
The NAB-backed CCW held some impressive panels, and our own Stephane Guez (Dalet CTO) and Luc Comeau (Dalet Business Development Manager) participated in two of the show’s hot topics.
MAM, It’s All About Good Vocabulary – Luc Comeau, Senior Business Development Manager
The saying goes, “behind every great man, there is a greater woman.” Within the panel – “Content Acquisition and Management Platform: A Service-Oriented Approach” – there was a lot of talk about content being king. In my view, then, metadata is his queen. Metadata gives you information that a MAM can capitalize on and allows you to build the workflow to enable your business vision. Done correctly, an enterprise MAM will give you visibility into the entire organization, allowing you to better orchestrate both the technical and human processes. Because at the end of the day, it’s the visibility of the entire organization that allows you to make better decisions, like whether or not you need to make a change or adapt your infrastructure to accommodate new workflows. In our session, the conversation very quickly headed towards the topic of interoperability. Your MAM must have a common language to interface with all the players. If it doesn’t, you will spend an enormous amount of time translating so these players can work together. And if the need arises, as it usually does, to replace one component with another that speaks a foreign language, well then, you are back to square one. A common framework will ensure a smooth sequence through production and distribution. A common framework, perhaps, such as FIMS…
The One Thing Everyone Needs to Know About FIMS – Stephane Guez, Dalet CTO
I was invited by Janet Gardner, president of Perspective Media Group, Inc., to participate in the FIMS (Framework for Interoperable Media Services) conference panel she moderated at CCW 2014. The session featured Loic Barbou, chair of the FIMS Technical Board, Jacki Guerra, VP, Media Asset Services for A+E Networks, and Roman Mackiewicz, CIO Media Group at Bloomberg – two broadcasters that are deploying FIMS-compliant infrastructures. The aim of the session was to get the broadcasters’ points of view on their usage of the FIMS standard. The FIMS project was initiated to define standards that enable media systems to be built using a Service-Oriented Architecture (SOA). FIMS has enormous potential benefits for both media organizations and the vendors/manufacturers that supply them, defining common interfaces for archetypal media operations such as capture, transfer, transform, store and QC. Global standardization of these interfaces will enable us, as an industry, to respond more quickly and cost-effectively to innovation and the constantly evolving needs and demands of media consumers. Having begun in December 2009, the FIMS project is about to enter its 6th year, but the immense scale of the task is abundantly clear, with the general opinion of the panelists being that we are at the beginning of a movement – still very much a work in progress with a lot of work ahead of us. One thing, however, was very clear from the discussion: broadcasters need to be the main driver for FIMS. In doing so, they will find there are challenges and trade-offs. FIMS cannot be adopted overnight. There are many existing, complex installations that rely on non-FIMS equipment. It will take some time before these systems can be converted to a FIMS-compliant infrastructure.
Along with the technology change, there is the need to evolve the culture. For many, FIMS will put IT at the center of their production. This is a different world and skill set, and many organizations will need to adapt both their workforce and their workflows to truly reap the advantages of FIMS.
More Secrets of Metadata
Followers of Bruce’s Shorts may remember an early episode on the Secrets of Metadata where I talked about concentrating on the metadata for your business, because it adds the value that you need. It seems the world is catching on to the idea of the business value of metadata, and I don’t even have to wrestle a snake to explain it! Over the last 10 years of professional media file-based workflows, there have been many attempts at creating standardized metadata schemes. A lot of these have been generated by technologists trying to do the right thing or trying to fix a particular technical problem. Many of the initiatives have suffered from limited deployment and limited adoption because the fundamental questions they were asking centered on technology and not the business application. If you center your metadata around a business application, then you automatically take into account the workflows required to create, clean, validate, transport, store and consume that metadata. If you center the metadata around the technology, then some or all of those aspects are forgotten – and that’s where the adoption of metadata standards falls down. Why? It’s quite simple. Accurate metadata can drive business decisions that in turn improve efficiency and cover the cost of the metadata creation.
Many years ago, I was presenting with the head of a well-known post house in London. He stood on stage and said in his best Australian accent: “I hate metadata. You guys want me to make accurate, human-oriented metadata in my facility for no cost, so that you guys can increase your profits at my expense.” Actually, he used many shorter words that I’m not able to repeat here. The message that he gave is still completely valid today: if you’re going to create accurate metadata, then who is going to consume it? If the answer is no one, ever, then you’re doing something that costs money for no results. That approach does not lead to a good long-term business. If the metadata is consumed within your own organization, then you ask the question: “Does it automate one or many processes downstream?” The automation might be a simple error check, a codec choice, an email generation or a target for a search query. The more consuming processes there are for a metadata field, the more valuable it can become. If the metadata is consumed in a different organization, then you have added value to the content by creating metadata. The value might be expressed in financial terms or in goodwill terms, but fundamentally a commercial transaction is taking place through the creation of that metadata.
The UK’s Digital Production Partnership and the IRT in Germany have both made great progress towards defining just enough metadata to reduce friction in B2B (business-to-business) file transfer in the broadcast world. CableLabs continues to do the same for the cable world, and standards bodies such as SMPTE are working with the EBU to make a core metadata definition that accelerates B2B e-commerce-type applications. I would love to say that we’ve cracked the professional metadata problem, but the reality is that we’re still halfway through the journey. I honestly don’t know how many standards we need. A single standard that covers every media application will be too big and unwieldy. A different standard for each B2B transaction type will cost too much to implement and sustain.
I’m thinking we’ll be somewhere between these two extremes in the “Goldilocks zone,” where there are just enough schemas and the implementation cost is justified by the returns that a small number of standards can bring. As a Media Asset Management company, we spend our daily lives wrestling with the complexities of metadata. I live in hope that at least the B2B transaction element of that metadata will one day be as easy to author and as interoperable as a web page. Until then, why not check out the power of search from Luc’s blog. Without good metadata, it would be a lot less exciting.
Life before and after DPP (Digital Production Partnership)
People who know me will be aware that file-based workflows are a passion of mine. Ten years ago I was co-author of the MXF (Material Exchange Format) specification, and ever since I have been engaged in taking this neat SMPTE standard and using it to create a business platform for media enterprises of every size and scale. This is why I’m so excited by the Digital Production Partnership (DPP): it represents the first ratified national Application Specification of the MXF standard and is set to revolutionize the way that media facilities and broadcasters work. To explain what I mean, let’s compare life with a DPP ecosystem to life without. Less pain to feel the gain
In a standardized DPP world, there would be a limited amount of pain and cost felt by everybody, but this would be shared equally amongst the organizations involved, and it would be a limited cost incurred only once. After this point, our industry has a fantastic common interchange format to help encourage partnerships and build businesses. In an unstandardized world, where different facilities have decided to use different tools and variants of MXF or other formats, the major cost becomes the lack of third-party interoperability. Each time content is exchanged between different facilities, a media transcode or rewrap into the receiving format is required. This means that all vendors in all the facilities must ultimately support all the file formats and metadata. The engineering required to implement and test takes time and costs money on an ongoing basis. Interoperable metadata helps the content creator
In a world that has adopted DPP, media and metadata interoperability is not an issue, since the format is built on a strong, detailed common interchange specification. In this homogeneous scenario, the resources that would have been spent on interoperability engineering can be used in more creative and productive ways, such as programme making. Programme making is a process where most broadcasters utilise external resources. In a world without DPP, whenever a broadcaster or production facility receives a new file from an external facility, such as a post house, the question must be asked whether this file meets the requirements of their in-house standard. That evaluation process can lead to extra QC costs, in addition to possible media ingest, transcoding, conformance and metadata re-keying costs that need to be taken into account.

Building a business platform
This heterogeneous environment is an issue not just for interaction with external facilities: often different departments within the same major broadcaster adopt slightly different file standards and metadata, making interoperability a big issue for them too. As a result, today only about 70 per cent of transactions within companies are file-based; the remainder still employ tape. The figure is much lower where external agencies are involved: there, only 10 to 15 per cent of transactions are file-based. The essence of the problem is the lack of a common interchange format to enable these transactions. DPP is the first open, public interchange format specifically designed to address this issue, and it is intended to turn today’s trickle of file-based exchanges into an 80 per cent flood in the shortest possible time. To find out more about DPP, and how it can transform the way your operation works and your effectiveness working with other organizations, read AmberFin’s White Paper on DPP.
Could your MXF files be infected with a virus?
We all know not to click on shifty-looking attachments in emails, or to download files from dubious websites, but as file delivery of media increases, should we be worried about viruses in media files? In the case of the common computer virus, the answer is “probably not”: the structure of media files, and of the applications used to parse or open MXF, QuickTime and other files, does not make a “good” host for this type of virus. Compared to an executable or any kind of XML-based file, media files are very specific in their structure and purpose, containing only metadata, video and audio, with each appropriately labeled element sent to the applicable decoder. Any labels that are not understood or supported by the parser are simply ignored.

However, this behavior of ignoring unsupported or unrecognized labels facilitates the existence of “dark metadata,” and this is a potential area of weakness in the broadcast chain. Dark metadata isn’t necessarily as menacing as the name suggests. It is most commonly used by media equipment and software vendors to store proprietary metadata that can inform dynamic processes downstream – for example, to change the aspect ratio conversion mode during up- or down-conversion, or the audio routing in a playout video server. When you know what dark metadata you have, where it is and what it means, it can add value to the workflow chain. Since dark metadata will usually be ignored by parsers that don’t understand or support the proprietary data it carries, it can also be passed through the media lifecycle in a completely harmless way.

However, if you are not aware of the existence of dark metadata, or of the values it carries, then there is a risk that processes in the media path could be modified or activated unintentionally and unexpectedly. In that case the media is, in some sense, carrying a virus, and in the worst case this could result in lost revenue. The anti-virus software installed on your home or work PC isn’t going to be much help here, but there are simple steps you can take to ensure that you don’t fall foul of “unknown unknowns”:

1. Implement a “normalization” stage at the entry point for media into your workflow. You can read other articles in this blog about the benefits of using a mezzanine file format, but even if files are delivered in the same format you use in-house, a simple re-wrapping process to “clean” and normalize the files can be a very lightweight step that adds little or no latency to the workflow.

2. Talk to your suppliers and vendors to make sure you’re aware of any proprietary metadata that may be passed into your workflow.

3. If you have an automated file-QC tool, check whether it has a “dark metadata” test and switch it on. Unless you definitely use proprietary metadata in your workflow, this won’t generate false positives and shouldn’t add any significant length to the test plan.

We’ll be looking at some other security concerns in future blogs, but as long as you know your dark metadata, there’s little risk of viral infection from media files.
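To make the idea of a “dark metadata” check a little more concrete, here is a minimal sketch of the kind of scan such a QC test might perform. MXF is KLV-coded (SMPTE 336M): each top-level element is a 16-byte Universal Label key, a BER-encoded length and then a value, so a naive scan can walk the keys and flag any it has never seen. The KNOWN_KEYS set is a hypothetical placeholder you would populate from reference files your workflow already trusts, and the sketch assumes the file has no run-in before the header partition; it is an illustration, not a substitute for a proper QC tool.

# Minimal, illustrative "dark metadata" scan over a KLV-coded MXF file.
# KNOWN_KEYS is a placeholder: populate it with the hex of the 16-byte ULs
# your workflow expects, and treat anything else as a candidate for review.

import sys

KNOWN_KEYS = set()   # hex strings of the 16-byte keys you expect to see


def read_ber_length(f):
    """Read a BER-encoded length: short form (< 0x80) or long form (N bytes)."""
    first = f.read(1)
    if not first:
        return None
    if first[0] < 0x80:
        return first[0]
    return int.from_bytes(f.read(first[0] & 0x7F), "big")


def scan_for_unknown_keys(path):
    """Walk the top-level KLV triplets and collect keys not in KNOWN_KEYS."""
    unknown = set()
    with open(path, "rb") as f:
        while True:
            key = f.read(16)               # every triplet starts with a 16-byte UL
            if len(key) < 16:
                break                      # end of file (or a truncated triplet)
            length = read_ber_length(f)
            if length is None:
                break
            if key.hex() not in KNOWN_KEYS:
                unknown.add(key.hex())
            f.seek(length, 1)              # skip the value; only the keys matter here
    return unknown


if __name__ == "__main__":
    for k in sorted(scan_for_unknown_keys(sys.argv[1])):
        print("unrecognised KLV key:", k)

A real “dark metadata” test would also look inside the header metadata local sets and compare against the SMPTE registries, but even a key-level scan like this gives you a quick inventory of what is travelling in your files.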
Media Asset Data Models in MAM Systems – An Evolutionary View
As I was working on a presentation I gave recently on the data model our Dalet Galaxy MAM system is built upon, I realized that looking at the evolution of this data model was a nice way of explaining it, so it only made sense to share it with a wider audience. By illustrating how media assets are tracked and cataloged within a MAM system, and how that model has changed significantly over time, I hope to provide a deeper understanding of the changing needs in our industry and how we can not only continue to address these needs, but also begin to predict and plan for new ones.

Let’s take a look at what I call the “Dark Ages of MAM,” when our operations were almost exclusively tape-based and there was no real MAM system. What we had were tapes with stickers (“metadata”) on them or, best-case scenario, a tape management database.

Figure 1 - The Dark Ages - Tapes and stickers

Figure 2 - The Stone Age - One file at a time

Then, as professional media operations started to introduce file-based workflows, we saw the first MAM systems appear, i.e., a digital catalog to organize and track your media assets. This is a time I call the “Stone Age.” The asset was represented in a very simple way: one descriptive metadata set that pointed to one media file (audio or video). This was intended to allow users to search for and find their media assets, along with some information about them.

Figure 3 - The Iron Age - Multiple versions of the file

Then things got a bit more complex in the “Iron Age.” We no longer had a single file attached to a metadata record. You needed multiple versions of that media asset, in multiple formats; let’s say one version for proxy viewing, and a few different versions for archiving, distribution to FTP or web sites, and so on.

And then again, as time went by, things became even more advanced, and we reached what I would call the “Industrial Age.” The asset was no longer just a single media file; it became a combination of many individual building blocks, with a master video track, individual audio tracks for multiple languages, caption or subtitle files, and even secondary video files and still images. From these you then had to create different “virtual versions,” each with a different subset of files and its own specific metadata, in order to manage and track delivery to the many new linear and non-linear platforms. And of course, all of these needed to be linked in order to track the various relationships.

This “Industrial Age,” as I like to call it, is the time we are in today. The complex data model I describe above allows us to automate production and delivery workflows in an efficient way, by building media production factories for delivering multilingual, multiplatform content. And since a number of standards have recently emerged for delivering these complex bundles (AS-02 and IMF, for example), we have really reached a point where the full preparation, assembly and delivery workflows can be highly optimized.

Figure 4 – The Industrial Age - An asset is more than just one media file but a bundle with various versions derived from it

As the term “evolution” would imply, this “Industrial Age” is just another phase in the progression of MAM platforms, which are only going to become more advanced and more complex in the future. The next challenge for MAM platforms (or, more accurately, the engineers who develop them) will be to include in their data model all the new requirements and paradigms of social media platforms and semantic technologies.
The MAM data model will need to be aware not only of what’s happening inside the media factory but also of everything happening in the whole wide world of the semantic web. For us, this will be the next step in this long journey of constantly evolving our products’ data models. We have already begun the process, and it looks like it’s going to be a lot of fun for our engineering teams ;-).

Figure 5 – The Networked Age – Metadata relations will include Social Media and Semantic technologies
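To make the stages above a bit more tangible, here is an illustrative sketch in Python of how the shape of an “asset” changes from one era to the next. This is emphatically not the Dalet Galaxy schema; every class and field name is hypothetical, chosen only to show the Stone Age record, the Iron Age renditions and the Industrial Age bundle with virtual versions.

# Illustrative data-model sketch: all names and fields are hypothetical.

from dataclasses import dataclass, field
from typing import Dict, List, Optional


# "Stone Age": one descriptive metadata record pointing at one media file.
@dataclass
class SimpleAsset:
    title: str
    media_path: str


# "Iron Age": the same record now points at several renditions of the asset.
@dataclass
class VersionedAsset:
    title: str
    renditions: Dict[str, str]        # e.g. {"proxy": "...", "archive": "..."}


# "Industrial Age": the asset is a bundle of components plus virtual versions.
@dataclass
class Component:
    kind: str                         # "video", "audio", "subtitle", "image"
    language: Optional[str]
    path: str


@dataclass
class VirtualVersion:
    name: str                         # e.g. "French broadcast", "Web, subtitled"
    component_ids: List[int]          # indices into the bundle's component list
    metadata: Dict[str, str] = field(default_factory=dict)


@dataclass
class AssetBundle:
    title: str
    components: List[Component]
    versions: List[VirtualVersion]


# One master video, two audio languages, one subtitle file, and two virtual
# versions assembled from different subsets of those components.
bundle = AssetBundle(
    title="Documentary X",
    components=[
        Component("video", None, "master_v1.mxf"),
        Component("audio", "en", "audio_en.wav"),
        Component("audio", "fr", "audio_fr.wav"),
        Component("subtitle", "fr", "subs_fr.stl"),
    ],
    versions=[
        VirtualVersion("English broadcast", [0, 1]),
        VirtualVersion("French web", [0, 2, 3], {"platform": "vod"}),
    ],
)

The design point is the last class: once the asset is a bundle of linked components rather than a single file, versioning for a new platform or language becomes a matter of adding a small VirtualVersion record instead of duplicating media.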
5 Reasons why we need more than ultra HD to save TV
If you were lucky (or unlucky) enough to get to CES in Las Vegas this year, then you will know that UHD (Ultra High Definition TV) was the talking point of the show. By and large, the staff on the booths were there to sell UHD TVs as pieces of furniture, and few of them knew the techno-commercial difficulties of putting great pictures onto those big, bright, curved(?) and really, really thin displays. In my upcoming webinar on the 29th January I will be looking into the future and predicting some of the topics that I think will need to be addressed over the next few years if TV as we know it is to survive.

1. Interoperability

The number of screens and display devices is increasing. The amount of content available for viewing is going up, but the number of viewers is not changing greatly. This means that we either have to extract more revenue from each user or reduce the cost of making that content. Having systems that don’t effectively inter-operate adds cost, wastes time and delivers no value to the consumer. Essence interoperability (video and audio) is gradually improving, thanks to education campaigns (from AmberFin and others) as well as vendors with proprietary formats reverting to open standards because the cost of maintaining the proprietary formats is too great. Metadata interoperability is the next BIG THING. Tune in to the webinar to discover the truth about essence interoperability, and then imagine how much unnecessary cost exists in the broken metadata flows between companies and between departments.

2. Interlace must die

UHD may be the next big thing, but just like HDTV it is going to have to show a lot of old content to be a success. Flick through the channels tonight and ask yourself, “How much of this content was shot and displayed progressively?” On a conventional TV channel the answer is probably “none.” Showing progressive content on a progressive screen via an interlaced TV value chain is nuts: it reduces quality and increases bitrate. Anyone looking at some of the poor pictures shown at CES will recognise the signs of demonstrations conceived by marketers who did not understand the effects of interlace on an end-to-end chain. Re-using old content involves up-scaling and deinterlacing existing content, 90% of which is interlaced. In the webinar, I’ll use AmberFin’s experience in making the world’s finest progressive pictures to explain why interlace is evil and what you can do about it.

3. Automating infrastructure

Reducing costs means spending money on the things that are important, and balancing expenditure between what is important today and what is important tomorrow. There is no point in investing money in MAM and automation if your infrastructure won’t support it and give you the flexibility you need; you’ll end up redesigning your automation strategy forever. The folks behind xkcd.com explain this much more succinctly and cleverly than I ever could. In the webinar, I’ll explain the difference between the various virtualization techniques and why they’re important.

4. Trust, confidence & QC

More and more automation brings efficiency, cost savings and scale, but it also means that a lot of the visibility of content is lost. Test and measurement give you the metrics that tell you about that content. Quality Control gives you decisions that can be used to change your Quality Assurance processes. These processes in turn allow your business to deliver a media product with the right technical quality for the creative quality your business is based on.
So here’s the crunch. The more you automate, the less you interact with the media, and the more you have to trust the metadata and pre-existing knowledge about the media. How do you know it’s right? How do you know that the trust you place in that media is well founded? For example: a stranger walks up to you in the street and offers you a glass of water. Would you drink it? Probably not. If that person were your favourite TV star, with a camera crew filming you, would you drink it now? Probably. Trust means a lot in life and in business. I’ll explore more of this in the webinar.

5. Separating the pipe from the content

If, like me, you’re seeing more grey hair appearing on the barber’s floor with each visit, then you may remember the good old days when the capture standard (PAL) was the same as the contribution standard (PAL) and the mixing desk standard (PAL) and the editing standard (PAL) and the playout standard (PAL) and the transmission standard (PAL). Today we could have a capture format (RED), a contribution standard (Aspera FASP), a mixing desk standard (HDSDI), an editing standard (MXF DNxHD), a playout standard (XDCAM-HDSDI) and a transmission standard (DVB-T2) that are all different. The world is moving to IP. What does that mean? How does it behave? A quick primer on the basics will be included in the webinar.

Why not sign up below before it’s too late? Places are limited, and I know it will be a good one. Register for our next webinar on Wednesday 29th January at 1pm GMT, 2pm CET, 8am EST, 5am PST, or 5pm GMT, 6pm CET, 12pm EST, 9am PST.

‘til next time. I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
Does it take you 1/2 million years to test your workflow?
It is now obligatory to start every broadcast technology blog post, article or presentation with a statement reminding us that we live in a multi-format, multi-platform world, where consumers want to view the content they choose, when they want it, where they want it, on the device they want. However, unlike other marketing platitudes, this one is actually true. Many of us in this industry spend our days trying to develop infrastructures that will allow us to deliver content to different platforms, ageing prematurely in the process because, to be honest, it’s a really hard thing to do.

So why is it so hard? Let me explain. For each device, you have to define the resolution: a new iPad has more pixels than HDTV, for example (2048 wide), and is 4:3 aspect ratio. Android phones have different screen sizes and resolutions. Don’t even get me started on interlaced versus progressive. That video has to be encoded using the appropriate codec, and of course different devices use different codecs. Along with the pictures there will be sound, which could be mono, stereo or surround, and the surround could be 5.1, 7.1 or something more exotic. The sound could be encoded in a number of different ways, with digital audio sampling at 44.1kHz or 48kHz and a whole range of bit depths. Then the audio and video need to be brought together with the appropriate metadata in a wrapper, and the wrapper needs to be put into a delivery stream. If it is for mobile use, we now routinely adopt one of the three different adaptive bitrate formats, which means essentially that we have to encode the content at several data rates for the target device to switch between.

If you want to achieve the admirable aim of making your content available on all common platforms, then you have to take into consideration every combination of resolution, video codec, audio codec, track layout, timecode options, metadata and ancillary data formats and bitrate options. This is a very large number. And it does not stop there. That is only the output side. What about the input? How many input formats do you have to support? Are you getting SD and HD originals? What about 2K and, in the not too distant future, 4K originated material? If you are producing in-house, you may have ARRI raw and REDCODE (R3D) files floating around. The content will arrive in different forms, on different platforms, with different codecs and in different wrappers; we are on to the third revision of the basic MXF specification, for example. Any given end-to-end workflow could involve many, many thousands of input-to-output processes, each with its own special variants of audio, video, control and metadata formats, wrappers and bitrates. Each time a new input or output type is defined, the number increases many-fold.

Quality Control

All of which is just mind-boggling. Until you consider quality control. If you were to test, in real time, every variant of, say, a three-minute pop video, it would take a couple of hundred years. This is clearly not going to happen. It’s all right, I hear you say: all we need to do is define a test matrix so that we know we can transform content from any source to any destination. If the test matrix works, then we know that real content will work, too. Well, up to a point.
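To get a feel for how quickly the combinations multiply, here is a toy back-of-envelope sketch. The counts are invented purely for illustration and are not the figures behind the calculation referred to below; the point is the multiplication, not the particular numbers. With these made-up factors you land at roughly the “couple of hundred years” order mentioned above, and adding server options and per-provider variants pushes the total far further.

# Toy back-of-envelope sketch of the combinatorial explosion in format testing.
# All counts below are invented for illustration only.

input_formats     = 200    # source wrappers x codecs x resolutions you might receive
output_profiles   = 500    # delivery targets: device x codec x bitrate ladder
audio_layouts     = 24     # mono, stereo, 5.1, 7.1, track-order variants, ...
metadata_variants = 16     # timecode, captions, ancillary data options, ...

paths = input_formats * output_profiles * audio_layouts * metadata_variants
minutes_per_test = 3       # real-time check of a three-minute pop video

total_minutes = paths * minutes_per_test
total_years = total_minutes / (60 * 24 * 365)

print(f"{paths:,} distinct workflow paths to test")
print(f"~{total_years:,.0f} years of real-time testing")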
I have done the calculations on this: to complete a test matrix that really does cover every conceivable input format, through every server option, to every delivery format for every service provider, on every variant of essence and metadata, is likely to take you half a million years. Maybe a bit more. So are you going to start at workflow path one and test every case, working until some time after the sun explodes? Of course not. But what is the solution? Do you just ignore all the possible content flows and focus on the relatively few that make you money? Do you accept standardized processing, which may make you look just like your competitors, or do you implement something special for key workflows even though the cost of doing it, and of testing it, may be significant?

We have never had to face these questions before. Apart from one pass through a standards converter for content to cross the Atlantic, everything worked pretty much the same way. Now we have to consider tough questions about guaranteeing the quality of experience, and make difficult commercial judgments on the right way to go.

If you want to find out more about how to solve your interoperability dilemma, why don’t you register for our next webinar on Wednesday 29th January at 1pm GMT, 2pm CET, 8am EST, 5am PST, or 5pm GMT, 6pm CET, 12pm EST, 9am PST? I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?