
Jun 10, 2015
5 reasons why media delivery standards might be good for your business
I am sure that, like me, you have been to a restaurant with a group where everyone orders from the set menu EXCEPT for that one person who orders the exotic, freshly prepared fugu, which requires an extra 30 minutes of preparation by a licensed fugu chef so that the customers don't die eating it. Restaurant etiquette means that every main course is served at the same time, forcing everyone to sit hungry, waiting for the special case. And if you split the bill equally, the special case is subsidised by the people who wanted the set meal. Does this model relate to the media industry? Is there a cost for being special? How can we reduce that cost? What gets done with the cost savings? How can you help? Fortunately, those 5 questions lead into 5 reasons why delivery standards might be a good idea.

1. The set meal is more efficient than the à la carte menu

I must confess that when I write this blog while hungry, there will be a lot of food analogies. I'm quite simple really. In the "set meal" case, you can see how it is easier for the kitchen to make a large volume of the most common meal and to deliver it more quickly and accurately than a large number of individual dishes. In the file delivery world, the same is true. By restricting the number of choices to a common subset that meets a general business need, it is a lot easier to test the implementations from multiple vendors and to ensure that interoperability is maximised for minimum cost. In a world where every customer can choose a different mix of codecs, audio layouts and subtitle and caption formats, you quickly end up with an untestable mess. In that chaotic world, you will also get a lot of rejects. It always surprises me how few companies have any way of measuring the cost of those rejects, even though they are known to cause pain in the workflow. A standardised, business-oriented delivery specification should help to reduce all of these problems.
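To make the "common subset" idea more concrete, here is a minimal sketch of what an automated delivery gate could look like once the allowed options are written down as a specification. It is purely illustrative: the property names and allowed values below are invented for this example and are not taken from any published delivery specification.

```python
# Illustrative only: these allowed values are invented, not a real delivery spec.
DELIVERY_SPEC = {
    "video_codec":    {"AVC-Intra 100", "XDCAM HD422"},
    "frame_rate":     {"25"},
    "audio_layout":   {"stereo", "5.1+stereo"},
    "caption_format": {"EBU-TT"},
}

def check_delivery(file_metadata):
    """Return the reasons a file would be rejected (an empty list means accepted)."""
    rejects = []
    for prop, allowed in DELIVERY_SPEC.items():
        value = file_metadata.get(prop)
        if value not in allowed:
            rejects.append(f"{prop}={value!r} is not in the allowed set {sorted(allowed)}")
    return rejects

# A file that made a "special" caption choice gets flagged, and counting these
# reject reasons is one simple way of measuring the cost of rejects.
incoming = {"video_codec": "XDCAM HD422", "frame_rate": "25",
            "audio_layout": "stereo", "caption_format": "SCC"}
print(check_delivery(incoming))
```

The point is not the particular values but that a small, agreed set of options is trivially machine-checkable, which is what makes automated acceptance and rejection handling practical.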

2. Is there a cost for being special?

I often hear the statement: "It's only an internal format, so we don't need to use a standard." The justification is often that the company can react more quickly and cheaply. Unfortunately, every decision has a lifespan. These short-term special decisions often start with a single vendor implementing the special internal format. Time passes, then a second vendor implements it, then a third. Ultimately, the custom engineering cost of the special internal format is paid 3 or 4 times over with different vendors. Finally, the original equipment reaches end of life and the whole archive has to be migrated. This is often the most costly part of the life cycle, as the obsolete special internal format is carefully converted into something new and hopefully more interchangeable. Is there a cost for being special? Oh yes, and it is often paid over and over again.

3. How can we reduce costs?

The usual way to reduce costs is to increase automation and to increase "lights out" operation. In the file delivery world, this means automation of transcode AND metadata handling AND QC AND workflow. At Dalet and AmberFin, all these skills are well understood and mastered. The cost savings come about when the number of variables in the system is reduced and the reliability increases. Limiting the choices of metadata, QC metrics, transcode options and workflow branches increases the likelihood of success. Learning from the experience of the Digital Production Partnership in the UK, it seems that tailoring a specific set of QC tests to a standardised delivery specification with standardised metadata will increase efficiency and reduce costs. The Joint Task Force on File Formats and Media Interoperability is building on the UK's experience to create an American standard that will continue to deliver these savings.
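As a rough sketch of what "tailoring a specific set of QC tests to a standardised delivery specification" might look like in an automated workflow, the example below maps a named specification to a short, fixed list of QC checks so that only the relevant tests are run. The specification name, test names and parameters are hypothetical and do not describe the DPP's or anyone else's actual test plan.

```python
# Hypothetical QC plan keyed by delivery specification; names and numbers are invented.
QC_PLAN_FOR_SPEC = {
    "house-delivery-v1": [
        ("loudness",     {"target_lufs": -23.0, "tolerance_lu": 1.0}),
        ("video_levels", {"range": "broadcast"}),
        ("caption_sync", {"max_offset_ms": 40}),
    ],
}

def run_qc(spec_name, run_test):
    """Run only the QC tests that the named delivery spec calls for."""
    results = {}
    for test_name, params in QC_PLAN_FOR_SPEC[spec_name]:
        results[test_name] = run_test(test_name, params)  # True means the test passed
    return results

# A stub test runner stands in for a real QC engine here.
print(run_qc("house-delivery-v1", lambda name, params: True))
```

Because the test list is fixed by the specification rather than negotiated per customer, the same plan can be reused for every delivery, which is where the reliability and cost savings come from.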

4. What gets done with the cost savings?

The nice thing about the open standards approach is that the savings are shared between the vendors who make the software (they don't have to spend as much money testing special formats) and the owners of that software (who spend less time and effort on on-boarding, interoperability testing and regression testing when they upgrade software versions).

5. How can you help?

The easiest way is to add your user requirements to the Joint Task Force on File Formats and Media Interoperability list. These user requirements will be used to prioritise the standardisation work and help deliver a technical solution to a commercial problem.

For an overview of some of the thinking behind the technology, you could check out my NAB2014 video on the subject, or the presentation given by Clyde Smith of Fox.

Until next time. 

 
