
Jun 20, 2015
Reinheitsgebot: A clear and positive influence on the definition of European media file exchange and delivery formats
By taking a look at Reinheitsgebot – the “German Beer Purity Law” – we examine how restricting a “recipe” to specific “ingredients” can result in consistent “flavours” of beer – or in this case, media. Find out why the very definition of file formats for exchange and delivery in the media industry has everything to do with the purity, or quality, of media files.


It doesn’t take much research into either Reinheitsgebot or file specifications to realise that this title is almost complete nonsense. When Reinheitsgebot, aka the “German Beer Purity Law,” was first endorsed by the duchy of Bavaria 499 years ago (23rd April 1516) it actually had nothing to do with the purity of beer and everything to do with the price of bread – banning the use of wheat in beer to ensure that there was no competition between brewers and bakers for limited supply.

Reinheitsgebot has come to represent a mark of quality in beer and something that German brewers are very proud of, but as the law spread across what is now modern Germany in the 16th century, it actually led to the disappearance of many highly regarded regional specialities and variations.

By contrast, the definition of file formats for exchange and delivery in the media industry has everything to do with the purity, or quality, of media files – indeed, the initiative that has led to the publication of the ARD-ZDF MXF Profiles in the German-speaking community was led by the group looking at quality control and management.

This has represented a fairly significant change in mind-set in our approach to QC. Within reason, the file format should not really affect the “quality” of the media (assuming sufficient bit-rate). However, to have a consistent file-QC process, you need to start with consistent files, and the simplest way to do this is to restrict the “ingredients” in order to deliver a consistent “flavour” of file. By restricting the variations, we considerably simplify QC processes, mitigate risk of both QC and workflow errors occurring downstream, and reduce the cost of implementation through decreased on-boarding requirements.

This point is critical, and for illustration, one need only refer to the results of the IRT MXF plug-fest that takes place each year. At the 2014 event, outputs and interoperability of 24 products from 14 vendors, restricted to four common essence types and two wrapper types, were tested.

Even with these restrictions, a total of 4,439 tests were conducted. Assuming each test takes an average of 60 seconds, that equates to very nearly two whole man-weeks of testing before we even consider workflow-breaking issues such as timecode support, frame accuracy, audio/video offset, etc.
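As a back-of-the-envelope check, the scale of that testing burden is easy to reproduce (the 60-seconds-per-test figure and the 40-hour working week are illustrative assumptions, not IRT numbers):

```python
# Rough cost of the 2014 IRT MXF plug-fest test matrix.
# Per-test duration and week length are illustrative assumptions.
tests = 4439
seconds_per_test = 60

total_hours = tests * seconds_per_test / 3600   # ≈ 74 hours
work_weeks = total_hours / 40                   # 40-hour working weeks

print(f"{total_hours:.0f} hours ≈ {work_weeks:.2f} work-weeks")
# → 74 hours ≈ 1.85 work-weeks
```

And this counts only the interoperability matrix itself, before any of the workflow-level checks mentioned above.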

Constrained media file specifications equate to far fewer variations, simplifying the on-boarding process and enabling media organizations to easily facilitate thorough automated and human QC, while focusing on the quality of the media, not the interoperability of the file.

However, the file specifications themselves may not completely answer all our problems. Referring back to the German beer market, despite the regulation being lifted in 1988 following a ruling by the European Court of Justice, many breweries and beers still claim compliance with Reinheitsgebot, even though very, very few beers actually do. We have two issues in media that are equivalent – future proofing and compliance.

When introduced, Reinheitsgebot specified three permitted ingredients – water, barley and hops. Unknowingly, however, brewers were adding another ingredient – either natural airborne yeast, or yeast cultivated from previous brews, a necessary addition for the fermentation process. Without launching into a convoluted discussion about “unknown unknowns,” from this we learn that we have to accept the extreme difficulties of scoping future requirements.

Reinheitsgebot was replaced in 1993 by the Provisional German Beer Law, allowing for ingredients such as yeast and wheat, without which the famous Weissbier (wheat beer) would not exist – one of the German beer industry’s biggest exports. Globally, this has led to much confusion over what Reinheitsgebot compliance means, especially with many wheat beers claiming adherence. In the media industry, the UK DPP launched a compliance program run by the AMWA, but there are many more companies claiming compliance than appear on the official list.

While I suspect that many beers have been consumed in the writing of media file specifications, in reality it is unlikely that the story of the German beer purity law has had much impact – it may still have some lessons to teach us though.

And now, time for a beer! Cheers!

Note: this article also appeared in the June 2015 issue of TV Technology Europe

Life before and after DPP (Digital Production Partnership)
People that know me will be aware that file-based workflows are a passion of mine. Ten years ago I was co-author of the MXF (Material Exchange Format) specification, and ever since I have been engaged in taking this neat SMPTE standard and using it to create a business platform for media enterprises of every size and scale. This is why I’m so excited by the Digital Production Partnership (DPP): it represents the first ratified national Application Specification of the MXF standard and is set to revolutionize the way that media facilities and broadcasters work. To explain what I mean, let’s compare life with a DPP ecosystem to life without.

Less pain to feel the gain
In a standardized DPP world, there would be a limited amount of pain and cost felt by everybody, but this would be shared equally amongst the organizations involved and it would be a limited cost, incurred only once. After this point, our industry has a fantastic common interchange format to help encourage partnerships and build businesses. In an unstandardized world, where different facilities have decided to use different tools and variants of MXF or other formats, the major cost becomes the lack of third-party interoperability. Each time content is exchanged between different facilities, a media transcode or rewrap into that format is required. This means that all vendors in all the facilities will ultimately have to support all the file formats and metadata. The engineering required to implement and test takes time and costs money on an on-going basis.

Interoperable metadata helps the content creator
In a world that has adopted DPP, media and metadata interoperability is not an issue since the format is built on a strong, detailed common interchange specification. In this homogeneous scenario the resources that would have been used in the interoperability engineering process can be used in more creative and productive ways, such as programme making. Programme making is a process where most broadcasters utilise external resources. In a world without DPP, whenever a broadcaster or production facility receives a new file from an external facility, such as a post house, the question must be asked whether this file meets the requirements of their in-house standard. That evaluation process can lead to extra QC costs, in addition to possible media ingest, transcoding, conformance and metadata re-keying costs that need to be taken into account.

Building a business platform
This heterogeneous environment is an issue not just for interaction with external facilities: often different departments within the same major broadcaster will adopt slightly different file standards and metadata, making interoperability a big issue for them. As a result, today only about 70 per cent of transactions within companies are file-based – the remainder employ tape. The proportion is much lower where external agencies are involved – here, only 10–15 per cent of transactions are file-based. The essence of the problem is the lack of a common interchange format to enable these transactions. DPP is the first open public interchange format that is specifically designed to address this issue. DPP is intended to transform today’s 20 per cent trickle into an 80 per cent flood in the shortest time. To find out more about DPP, how it can transform the way your operation works and your effectiveness working with other organizations, read AmberFin’s White Paper on DPP.
5 reasons why media delivery standards might be good for your business
I am sure that, like me, you have been to a restaurant in a group where everyone orders from the set menu EXCEPT for that one person who orders the exotic, freshly prepared fugu, which requires an extra 30 minutes of preparation from a licensed fugu chef so that the customers don't die eating it. Restaurant etiquette means that our main courses are served at the same time, forcing everyone to spend a long time hungry, waiting for the special case. And if you split the bill equally, the special case becomes subsidised by the people wanting the set meal. Does this model relate to the media industry? Is there a cost for being special? How can we reduce that cost? What gets done with the cost savings? How can you help? Fortunately, those 5 questions lead into 5 reasons why delivery standards might be a good idea.

1. The set meal is more efficient than the à la carte

I must confess that when I write this blog while hungry there will be a lot of food analogies. I'm quite simple really. In the "set meal" case, you can see how it's easier for the kitchen to make a large volume of the most common meal and to deliver it more quickly and accurately than a large number of individual cases. In the file delivery world, the same is true. By restricting the number of choices to a common subset that meets a general business need, it is a lot easier to test the implementations by multiple vendors and to ensure that interoperability is maximised for minimum cost. In a world where every customer can choose a different mix of codecs, audio layouts, and subtitle & caption formats, you quickly end up with an untestable mess. In that chaotic world, you will also get a lot of rejects. It always surprises me how few companies have any way of measuring the cost of those rejects, even though they are known to cause pain in the workflow. A standardised, business-oriented delivery specification should help to reduce all of these problems.

2. Is there a cost for being special?
I often hear the statement: "It's only an internal format – we don't need to use a standard." The justification is often that the company can react more quickly and cheaply. Unfortunately, every decision has a lifespan. These short-term special decisions often start with a single vendor implementing the special internal format. Time passes, and then a second vendor implements it, then a third. Ultimately, the custom engineering cost of the special internal format is paid 3 or 4 times with different vendors. Finally, the original equipment will reach end of life and the whole archive will have to be migrated. This is often the most costly part of the life cycle, as the obsolete special internal format is carefully converted into something new and hopefully more interchangeable. Is there a cost of being special? Oh yes, and it is often paid over and over again.

3. How can we reduce costs?

The usual way to reduce costs is to increase automation and "lights out" operation. In the file delivery world, this means automation of transcode AND metadata handling AND QC AND workflow. At Dalet and AmberFin, all these skills are well understood and mastered. The cost savings come about when the number of variables in the system is reduced and the reliability increases. Limiting the choices on metadata, QC metrics, transcode options and workflow branches increases the likelihood of success. Learning from the experiences of the Digital Production Partnership in the UK, it seems that tailoring a specific set of QC tests to a standardised delivery specification with standardised metadata will increase efficiency and reduce costs. The Joint Task Force on File Formats and Media Interoperability is building on the UK's experience to create an American standard that will continue to deliver these savings.

4. What gets done with the cost savings?
The nice thing about the open standards approach is that the savings are shared between the vendors who make the software (they don't have to spend as much money testing special formats) and the owners of that software (who spend less time and effort on-boarding, interoperability testing and regression testing when they upgrade software versions).

5. How can you help?

The easiest way is to add your user requirements to the Joint Task Force on File Formats and Media Interoperability list. These user requirements will be used to prioritise the standardisation work and help deliver a technical solution to a commercial problem. For an overview of some of the thinking behind the technology, you could check out my NAB2014 video on the subject, or the presentation given by Clyde Smith of Fox. Until next time.
How to bring standards to your organisation
Back in the 1990s, I was told of an old maxim: "If you can't win the marketplace, win the standard." I thought that this was a cynical approach to standardisation until we looked through some examples of different markets where there are a small number of dominant players (e.g., CPUs for desktop PCs, GPU cards, tablet / smartphone OS) versus markets where there is enforced cooperation (Wi-Fi devices, network cabling, telephone equipment, USB connectivity). So, how does this affect technology in the media industry, and how can you use the power of standards in your organisation?

It seems that the media technology industry hasn't made its mind up about what's best. We have come from a history that is strong in standardisation (SDI, colour spaces, sampling grids, etc.), and this has created a TV and film environment where the interchange of live or streaming content works quite well, although maybe not as cheaply and cleanly as we would like. When the material is offline or file-based, there are many more options. Some of them are single-vendor dominant (like QuickTime), some are standards-led (like MXF), some are open source (Ogg, Theora) and others are proprietary (LXF, FLV).

Over any long timeframe, commercial strength beats technical strength. This guiding principle should help explain the dynamics of some of the choices made by organisations. Over the last 10 years, we have seen QuickTime chosen as an interchange format where short-term "I want it working and I want it now" decisions have been dominant. In other scenarios – as in the case of "I am generating thousands of assets a month and I want to still use them in six years' time when Apple decides that wearables are more important than tablets" – MXF is often the standard of choice.
Looking into the future, we can see that there are a number of disruptive technologies that could impact decision-making and dramatically change the economics of the media supply chain:

IP transport (instead of SDI)
High Dynamic Range (HDR) video
4K (or higher) resolution video
Wide colour space video
HEVC encoding for distribution
High / mixed frame rate production
Time Labelling as a replacement for timecode
Specifications for managing workflows

Some of these are clearly cooperative markets where long-term commercial reality will be a major force in the final outcome (e.g., IP transport). Other technologies could go either way – you could imagine a dominant camera manufacturer “winning” the high / mixed frame rate production world with a sexy new sensor. Actually, I don't think this will happen because we are up against the laws of physics, but you never know – there are lots of clever people out there!

This leads us to the question of how you might get your organisation ahead of the game in these or other new technology areas. In some ways being active in a new standard is quite simple – you just need to take part. This can be costly unless you focus on the right technology and standards body for your organisation. You can participate directly or hire a consultant to do this speciality work for you. Listening, learning and getting the inside track on new technology is simply a matter of turning up and taking notes. Guiding the standards and exerting influence requires a contributor who is skilled in the technology as well as the arts of politics and process. For this reason, there are a number of consultants who specialise in this tricky but commercially important area of our business. Once you know “who” will participate, you also need to know “where” and “how.” Different standards organisations have different specialties.
The ITU will work on the underlying definition of colour primaries for Ultra High Definition, SMPTE will define how those media files are carried and transported, and MPEG will define how they are used during encoding for final delivery. Figuring out which standards body is best suited for the economic interests of your organisation requires a clear understanding of your organisation’s economics and some vision about how exerting influence will improve those economics. Although a fun topic, it's a little outside today's scope! So how do you bring standards to your organisation?

Step 1: join in and listen
Step 2: determine whether or not exerting influence is to your advantage
Step 3: actively contribute
Step 4: sit back and enjoy the fruits of your labour

For more on the topic, don't forget to listen to our webinars! Coming soon, I'll be talking about Business Process Management and standards – and why they matter. Until the next one...
The other day, a member of our talented development team commented, quite accurately, that every time we return from an NAB Show, we nearly always refer to it as the biggest, busiest and best NAB ever. If you’ve ever watched or read one of my presentations or blogs on workflow, you may recollect that I’m a fan of the Toyota Production System and the “Kaizen” concept of continuous improvement. However, I do confess that, following my colleague’s observation, I momentarily felt a certain amount of pressure to come back from NAB 2015 with evidence that it really was bigger, busier and better than previous years.

Earlier today I was talking to the editor of one of our excellent industry magazines about the most likely themes and trends for this year’s show and something struck me. Although I’m not much of a fan of “buzzword bingo,” given the host of announcements we at Dalet have for this year’s show, I’d place a bet on us sweeping the board. Even before the show, we’ll bring UHD to Dalet AmberFin – supporting UHD inputs in our next release at the end of March. By decoupling format from transport mechanism, Video over IP is one of the most revolutionary changes to the industry in some time, and our Dalet Brio video server platform is spearheading that charge. Building on all of this, Dalet Galaxy, our media asset management platform, continues to facilitate and enhance collaborative workflows with new features for user interaction and geographically dispersed operations – I can barely contain myself from mentioning the “C” word! It doesn’t stop there though. Back in September, we got quite emotional about being one of the first vendors to have a product certified for the creation of UK DPP files. The DPP has led the way in specifying standards and operational guidelines for file delivery and, as other regions have followed, Dalet has been right there supporting them.
Demonstrating our continued commitment to international standards that improve, ease and simplify the lives of our customers, we’ve now implemented the FIMS capture service in the Brio video server. I believe that initiatives like FIMS become ever more important as the video world increasingly leverages IT technology and, specifically, interaction between control and capture devices as we move to an era of hybrid SDI and IP acquisition. Despite regulatory rulings in the US and elsewhere, captioning and subtitling technology has seen little innovation in the last few years. Since Dalet and AmberFin came together a year ago, we’ve really focused on this as an area where our knowledge and expertise can benefit the industry as a whole. We’re now ready to show you what we’ve been up to and how we can simplify captioning workflows and bring them into multi-platform, multi-version workflows in an effective and efficient way. You’re probably aware of the Dalet Academy, which was launched with much fanfare in January this year. The response from the wider industry has simply been immense, and we now have many thousands of followers subscribed to the Bruce’s Shorts videos and reading our educational blog. For NAB 2015, we’ll be donning our robes and mortarboards to bring the Dalet Academy to the stage, live on our booth (SL4525). Bruce will be there – in his actual shorts – to present special live editions of the video series with support from other Dalet and industry experts for more short seminars. All of the presentations at the show will be followed by a special round-table discussion (limited seating). And while you’re keeping your media knowledge in good shape, there will also be an opportunity to win prizes that are sure to keep you in good shape too! 
To make sure the excitement doesn’t overwhelm too much, we’re keeping a couple of bits of news to ourselves until the show itself, but if you want to find out more on any of the topics I’ve touched on here, be sure to get in touch, book an appointment, or read more on our dedicated NAB page. As for our development team – sorry guys, I can already tell you that this year is going to be the biggest, busiest and best NAB Show so far!
Could your MXF files be infected with a virus?
We all know not to click on those shifty-looking attachments in emails, or to download files from dubious websites, but as file delivery of media increases, should we be worried about viruses in media files? In the case of the common computer virus, the answer is “probably not” – the structure of media files, and the applications used to parse or open MXF, QuickTime and other files, do not make “good” hosts for this type of virus. Compared to an executable or any kind of XML-based file, media files are very specific in their structure and purpose – containing only metadata, video and audio – with any appropriately labeled element sent to the applicable decoder. Any labels that are not understood or supported by the parser are simply ignored.

However, this behavior of ignoring unsupported or unrecognized labels facilitates the existence of “dark metadata,” and this is a potential area of weakness in the broadcast chain. Dark metadata isn’t necessarily as menacing as the name might suggest, and is most commonly used by media equipment and software vendors to store proprietary metadata that can be used downstream to inform dynamic processes – for example, to change the aspect ratio conversion mode during up- or down-conversion, or audio routing in a playout video server. When you know what dark metadata you have, where it is and what it means, it can add value to the workflow chain. Since dark metadata will usually be ignored by parsers that don’t understand or support the proprietary data it carries, it can also be passed through the media lifecycle in a completely harmless way.

However, if you are not aware of the existence of dark metadata and/or the values of the data it carries, then there is a risk that processes in the media path could be modified or activated unintentionally and unexpectedly. In this case, the media is in some way carrying a virus and, in the worst case, could result in lost revenue.
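To make the “ignore what you don’t recognise” behaviour concrete, here is a minimal sketch of walking the top-level KLV triplets of an MXF-style stream – 16-byte key, BER-encoded length, value – and flagging any keys that are not on an allow-list, which is essentially what a “dark metadata” check does. This is an illustration rather than production code, and the allow-list of “known” keys is entirely hypothetical:

```python
import io

def scan_klv(stream, known_keys):
    """Walk top-level KLV triplets and report keys not in known_keys.

    Each triplet is a 16-byte key, a BER-encoded length, then the value.
    Unknown keys are exactly what a parser would silently skip over,
    i.e. candidate "dark metadata".
    """
    dark = []
    while True:
        key = stream.read(16)
        if len(key) < 16:            # end of stream
            break
        first = stream.read(1)
        if not first:
            break
        b = first[0]
        if b < 0x80:                 # short-form BER length
            length = b
        else:                        # long form: low 7 bits = byte count
            length = int.from_bytes(stream.read(b & 0x7F), "big")
        if key.hex() not in known_keys:
            dark.append((key.hex(), length))
        stream.seek(length, 1)       # skip the value, just like a real parser
    return dark

# Two synthetic triplets: one on the allow-list, one not.
k1 = bytes.fromhex("060e2b34") + bytes(12)
k2 = bytes.fromhex("060e2b34") + bytes(11) + b"\x01"
data = k1 + b"\x04" + b"abcd" + k2 + b"\x81\x05" + b"hello"
print(scan_klv(io.BytesIO(data), {k1.hex()}))   # reports only k2
```

A real MXF parser has to deal with partitions, index tables and nested sets, but the skip-what-you-don’t-know mechanism is the same at every level.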
The anti-virus software installed on your home or work PC isn’t going to be much help in this instance, but there are simple steps that can be taken to ensure that you don’t fall foul of “unknown unknowns”:

1. Implement a “normalization” stage at the entry point for media into your workflow. You can read other articles in this blog about the benefits of using a mezzanine file format, but even if files are delivered in the same format you use in-house, a simple re-wrapping process to “clean” and normalize the files can be a very lightweight process that adds little or no latency to the workflow.

2. Talk to your suppliers and vendors to make sure you’re aware of any proprietary metadata that may be being passed into your workflow.

3. If you have an automated file-QC tool, check whether it has a “dark metadata” test and switch it on – unless you deliberately use proprietary metadata in your workflow, this won’t generate false positives and shouldn’t add any significant length to the test plan.

We’ll be looking at some of the other security concerns in future blogs, but as long as you know your dark metadata, there’s little risk of viral infection from media files.
HPA: Mapping the Future, One Pixel at a Time
I love the HPA Tech Retreat. It is the most thought-provoking conference of the year, one where you're guaranteed to learn something new, meet interesting people and get a preview of the ideas that will shape the future of the industry. Here are the six most interesting things I learned this year.

Collaborating competitors can affect opinions

At this year’s HPA Tech Retreat, I had the honour of presenting a paper with John Pallett from Telestream. Despite the fact that our products compete in the marketplace, we felt it important to collaborate and educate the world on the subject of fractional frame rates. 30 minutes of deep math on drop-frame timecode would have been a little dry, so we took some lessons from great comedy double acts and kept the audience laughing, while at the same time pointing out the hidden costs and pitfalls of fractional frame rates that most people miss. We also showed that there is a commercial inertia in the industry, which means the frame rate 29.97i will be with us for a very long time. In addition to formal presentations, HPA also features breakfast round tables, where each table discusses a single topic. I hosted two great round tables, with John as a guest host on one, where the groundswell of opinion seems to be that enforcing integer frame rates above 59.94fps is practical, and any resulting technical issues can be solved – as long as they are known.

I will never be smart enough to design a lens

Larry Thorpe of Canon gave an outstanding presentation of the design process for their latest zoom lens. The requirements at first seemed impossible: design a 4K lens with long zoom range that is light, physically compact, and free from aberrations to meet the high demands of 4K production. He showed pictures of lens groupings and then explained why they couldn't be used because of the size and weight constraints. He went on to show light ray plots and the long list of lens defects that they were battling against.
By the end of the process, most members of the audience were staring with awe at the finished lens, because the design process seemed to be magical. I think that I will stick to the relative simplicity of improving the world's file-based interoperability.

Solar flares affect your productions

We've all seen camera footage with stuck or lit pixels and, like most people, we probably assumed that they were a result of manufacturing defects or physical damage. Joel Ordesky of Court Five Productions presented a fascinating paper on the effects of gamma photons, which, when passing through a camera’s sensor, can permanently impair individual pixels. This is something that cannot be protected against unless you do all of your shooting underground in a lead-lined bunker. Joel presented some interesting correlations between sunspot activity and lit pixels appearing in his hire stock, and then showed how careful black balance procedures can reduce the visibility of the issue.

UHD is coming – honest

The HPA Tech Retreat saw a huge range of papers on Ultra High Definition (UHD) issues and their impacts. These ranged from sensors to colour representation to display processing, compression, high frame rates and a slew of other issues. I think that everyone in the audience recognised the inevitability of UHD and that the initial offering will be UHDTV featuring resolution improvements. This is largely driven by the fact that UHD screens seem to be profitable for manufacturers; soon enough they will be the only options available at your local tech store (that’s just good business!). The displays are arriving before the rest of the ecosystem is ready (a bit like HDTV), but it also seems that most of the audience feels better colour and high dynamic range (HDR) is a more compelling offering than more pixels. For me, the best demonstration of this was the laser projector showing scenes in the true BT.2020 wide colour range.
First we saw the well-known HDTV Rec.709 colour range and everything looked normal. Next up was the same scene in BT.2020 – and it was stunning. Back to Rec.709, and the scene that looked just fine only seconds before now appeared washed out and unsatisfactory. I think HDR and rich colours will be addictive. Once you've seen well-shot, full-colour scenes, you won't want to go back to Rec.709. The future is looking very colourful.

Women are making more of an impact in the industry (Hooray!)

There were three all-women panels at this year's HPA, none of which were on the subject of women in the industry. This was a stark contrast to the view of women in the industry as shown in a 1930s documentary of the SMPTE Conference, where men with cigars dominated the proceedings and women were reduced to participating in the chattering social scene. This contrast was beautifully and ironically highlighted by Barbara Lange (Executive Director of SMPTE) and Wendy Aylesworth (President of SMPTE 2005-2015), who hosted their panel in bathrobes with martini glasses, while explaining the achievements of the society over the year. If you haven't yet contributed to the SMPTE documentary film project or the SMPTE centennial fund, it's time to do so now. These funds will help support the next, diverse generation of stars.

IMF and DPP are a symbiotic pair

One of the most interesting panels was on the Interoperable Master Format (IMF) and the Digital Production Partnership (DPP) interchange format (and yes, this was in fact one of my panels!). One format’s purpose is to distribute a bundle of files representing several versions of one title. The other is designed to create a finished, single file with ingest-ready metadata, where the file can be moved to playout with virtually no changes. Both formats have a strong foothold in the life cycle of any title and are likely to form the strongest symbiotic relationship as we move into the future.
One thing that I pointed out to the audience is that the DPP has done a huge amount of work educating UK production and postproduction houses about the change management that is required for file-based delivery. They have written a wonderful FREE guide that you can download from their website. All in all, the HPA Tech Retreat is a wonderful event with so much information flowing that it takes weeks to absorb it all. I must confess, though, that one of the highlights for me was being able to cycle up the mountain every morning before breakfast. It meant that I could go back for seconds of all the wonderful cake that was on offer. Happy days! Until next time – don't forget about our UHD webinar, happening today. If you didn’t sign up in time, drop us a line and ask for a re-run. The more people that ask, the more likely that we'll do it!
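As a footnote to the fractional frame-rate talk mentioned above: the “deep math” of drop-frame timecode fits in a few lines. This is a sketch of the standard 29.97 fps drop-frame rule (frame labels ;00 and ;01 are skipped at the start of every minute except minutes divisible by ten) – an illustration, not a reference implementation:

```python
def frames_to_df(frame_number):
    """Convert a frame count at 29.97 fps to a drop-frame timecode label."""
    FRAMES_PER_MIN = 1798     # 30*60 minus the 2 dropped frame labels
    FRAMES_PER_10MIN = 17982  # 9 "drop" minutes + 1 full minute of 1800

    tens, rem = divmod(frame_number, FRAMES_PER_10MIN)
    # Re-insert the dropped labels so the remaining math is plain base-30.
    frame_number += 18 * tens
    if rem > 2:
        frame_number += 2 * ((rem - 2) // FRAMES_PER_MIN)
    ff = frame_number % 30
    ss = (frame_number // 30) % 60
    mm = (frame_number // 1800) % 60
    hh = (frame_number // 108000) % 24
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(frames_to_df(1800))    # first frame of minute 1 → "00:01:00;02"
print(frames_to_df(107892))  # one hour of real time   → "01:00:00;00"
```

Note that no frames of video are ever dropped – only labels are skipped – which is precisely the kind of hidden subtlety that keeps fractional frame rates expensive.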
5 Reasons why we need more than Ultra HD to save TV
If you were lucky (or unlucky) enough to get to CES in Las Vegas this year, then you will know that UHD (Ultra High Definition TV) was the talking point of the show. By and large the staff on the booths were there to sell UHD TVs as pieces of furniture, and few of them knew the techno-commercial difficulties of putting great pictures onto those big, bright, curved(?) and really, really thin displays. In my upcoming webinar on the 29th January I will be looking into the future and predicting some of the topics that I think will need to be addressed over the next few years if TV as we know it is to survive.

1. Interoperability

The number of screens and display devices is increasing. The amount of content available for viewing is going up, but the number of viewers is not changing greatly. This means that we either have to extract more revenue from each user or reduce the cost of making that content. Having systems that don't effectively interoperate adds cost, wastes time and delivers no value to the consumer. Essence interoperability (video & audio) is gradually improving thanks to education campaigns (from AmberFin and others) as well as vendors with proprietary formats reverting to open standards because the cost of maintaining the proprietary formats is too great. Metadata interoperability is the next BIG THING. Tune in to the webinar to discover the truth about essence interoperability, and then imagine how much unnecessary cost exists in the broken metadata flows that exist between companies and between departments.

2. Interlace must die

UHD may be the next big thing, but just like HDTV it is going to have to show a lot of old content to be a success. Flick through the channels tonight and ask yourself "How much of this content was shot & displayed progressively?" On a conventional TV channel the answer is probably "none". Showing progressive content on a progressive screen via an interlaced TV value chain is nuts. It reduces quality and increases bitrate.
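To see why an interlaced chain is so unkind to moving pictures, here is a tiny, deliberately simplified Python model (the line counts and field timing are illustrative assumptions, not a real video pipeline): an interlaced frame weaves together two fields captured at different moments, so any motion between those fields leaves a "comb" in the woven frame.

```python
# Hypothetical toy model: a "frame" is just the horizontal position of a
# moving object as seen on each of 4 picture lines.

def progressive_capture(times):
    # Progressive: every line of a frame sees the object at the same instant.
    return [[t] * 4 for t in times]

def interlaced_capture(times):
    # Interlaced: even lines are sampled at time t, odd lines at time t+1.
    # Weaving the two fields into one frame mixes two moments in time.
    frames = []
    for t in range(0, len(times), 2):
        even, odd = times[t], times[t + 1]
        frames.append([even, odd, even, odd])  # the woven, "combed" frame
    return frames

motion = [0, 1, 2, 3]  # object moves 1 unit per field period
print(progressive_capture([0, 2]))  # clean: [[0, 0, 0, 0], [2, 2, 2, 2]]
print(interlaced_capture(motion))   # combed: [[0, 1, 0, 1], [2, 3, 2, 3]]
```

The combed frames are exactly what a deinterlacer has to untangle before the content can be shown on a progressive UHD panel, which is why the quality of that step matters so much.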
Anyone looking at some of the poor pictures shown at CES will recognise the signs of demonstrations conceived by marketers who did not understand the effects of interlace on an end-to-end chain. Re-using old content involves up-scaling & deinterlacing existing content – 90% of which is interlaced. In the webinar, I'll use AmberFin's experience in making the world's finest progressive pictures to explain why interlace is evil and what you can do about it.

3. Automating infrastructure

Reducing costs means spending money on the things that are important and balancing expenditure between what is important today and what is important tomorrow. There is no point in investing money in MAMs and automation if your infrastructure won't support it and give you the flexibility you need. You'll end up redesigning your automation strategy forever. The folks behind explain this much more succinctly and cleverly than I could ever do. In the webinar, I'll explain the difference between different virtualization techniques and why they're important.

4. Trust, confidence & QC

More and more automation brings efficiency, cost savings and scale, but also means that a lot of the visibility of content is lost. Test and measurement gives you the metrics to know about that content. Quality Control gives you decisions that can be used to change your Quality Assurance processes. These processes in turn allow your business to deliver media product that delivers the right technical quality for the creative quality your business is based on.

So here's the crunch. The more you automate, the less you interact with the media, and the more you have to trust the metadata and pre-existing knowledge about the media. How do you know it's right? How do you know that the trust you have in that media is founded? For example: a stranger walks up to you in the street and offers you a glass of water. Would you drink it? Probably not.
If that person was your favourite TV star with a camera crew filming you – would you drink it now? Probably. Trust means a lot in life and in business. I'll explore more of this in the webinar.

5. Separating the pipe from the content

If, like me, you're seeing more grey hair appearing on the barber's floor with each visit, then you may remember the good old days when the capture standard (PAL) was the same as the contribution standard (PAL) and the mixing desk standard (PAL) and the editing standard (PAL) and the playout standard (PAL) and the transmission standard (PAL). Today we could have a capture format (RED), a contribution standard (Aspera FASP), a mixing desk standard (HDSDI), an editing standard (MXF DNxHD), a playout standard (XDCAM-HDSDI) and a transmission standard (DVB-T2) that are all different. The world is moving to IP. What does that mean? How does it behave? A quick primer on the basics will be included in the webinar.

Why not sign up below before it's too late? Places are limited – I know it will be a good one. Register for our next webinar on: Wednesday 29th January at: 1pm GMT, 2pm CET, 8am EST, 5am PST OR 5pm GMT, 6pm CET, 12pm EST, 9am PST. 'til next time.

I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
Does it take you 1/2 million years to test your workflow?
It is now obligatory to start every broadcast technology blog post, article or presentation with a statement reminding us that we are now living in a multi-format, multi-platform world, where consumers want to view the content they choose, when they want it, where they want it, on the device they want. However, unlike other marketing platitudes, this one is actually true: many of us in this industry spend our days trying to develop infrastructures that will allow us to deliver content to different platforms, ageing prematurely in the process because, to be honest, it's a really hard thing to do.

So why is it so hard? Let me explain. For each device, you have to define the resolution: a new iPad has more pixels than HDTV, for example (2048 wide), and is 4:3 aspect ratio. Android phones have different screen sizes and resolutions. Don't even get me started on interlaced versus progressive. That video has to be encoded using the appropriate codec – and of course different devices use different codecs.

Along with the pictures there will be sound, which could be in mono, stereo or surround sound – which in turn could be 5.1, 7.1 or something more exotic. The sound could be encoded in a number of different ways. Digital audio sampling could be at 44.1kHz or 48kHz and a whole range of bit depths. Then the audio and video need to be brought together with the appropriate metadata in a wrapper. The wrapper needs to be put into a delivery stream. If it is for mobile use, we now routinely adopt one of the three different adaptive bitrate formats, which means essentially we have to encode the content at three different data rates for the target device to switch between.

If you want to achieve the admirable aim of making your content available on all common platforms, then you have to take into consideration every combination of resolution, video codec, audio codec, track layout, timecode options, metadata and ancillary data formats and bitrate options.
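The scale of that combination can be sketched with some back-of-envelope Python. Every option count below is an illustrative assumption, not a survey of real deployments, but the multiplication is the point:

```python
# Assumed, illustrative option counts for each delivery parameter.
resolutions   = 6   # SD, HD, iPad-class, assorted phone screens, 2K, 4K
video_codecs  = 5
audio_codecs  = 4
track_layouts = 4   # mono, stereo, 5.1, 7.1
sample_rates  = 2   # 44.1 kHz, 48 kHz
wrappers      = 4
abr_rungs     = 3   # three data rates per adaptive-bitrate target

outputs = (resolutions * video_codecs * audio_codecs *
           track_layouts * sample_rates * wrappers * abr_rungs)
print(outputs)            # 11520 distinct output variants

# And that is only the output side: multiply by the source formats you accept.
input_formats = 20        # another assumption
print(input_formats * outputs)  # 230400 input-to-output paths
```

Tweak any of the assumed counts and the product still lands in the tens or hundreds of thousands of paths, which is the real message.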
This is a very large number. And it does not stop there. That is only the output side. What about the input? How many input formats do you have to support? Are you getting SD and HD originals? What about 2K and, in the not too distant future, 4K originated material? If you are producing in-house, you may have ARRI raw and REDCODE (R3D) files floating around. The content will arrive in different forms, on different platforms, with different codecs and in different wrappers. We are on to the third revision of the basic MXF specification, for example. Any given end-to-end workflow could involve many, many thousands of input-to-output processes, each with their own special variants of audio, video, control and metadata formats, wrappers and bitrates. Each time a new input or output type is defined, the number increases many-fold.

Quality Control

All of which is just mind-boggling. Until you consider quality control. If you were to test, in real time, every variant of, say, a three minute pop video, it would take a couple of hundred years. This is clearly not going to happen. It's all right, I hear you say. All we need do is define a test matrix so that we know we can transform content from any source to any destination. If the test matrix works, then we know that real content will work, too. Well, up to a point. I have done the calculations on this and, to complete a test matrix that really does cover every conceivable input format, through every server option, to every delivery format for every service provider, on every variant of essence and metadata, it is likely to take you half a million years. Maybe a bit more.

So are you going to start at workflow path one and test every case, working until some time after the sun explodes? Of course not. But what is the solution? Do you just ignore all the possible content flows and focus on the relatively few that make you money?
Do you accept standardized processing, which may make you look just like your competitors; or do you implement something special for key workflows even though the cost of doing it – and testing it – may be significant? We have never had to face these questions before. Apart from one pass through a standards converter for content to cross the Atlantic, everything worked pretty much the same way. Now we have to consider tough questions about guaranteeing the quality of experience, and make difficult commercial judgments on the right way to go.

If you want to find out more about how to solve your interoperability dilemma, why don't you register for our next webinar on: Wednesday 29th January at: 1pm GMT, 2pm CET, 8am EST, 5am PST OR 5pm GMT, 6pm CET, 12pm EST, 9am PST.

I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
Digital Production Partnership (DPP) - A Broadcaster's Perspective
Recently, we staged a webinar at AmberFin in partnership with ATG Broadcast which focussed on the Digital Production Partnership (DPP) file standards. We had a number of contributors giving the perspectives of a service provider, a media facility and a broadcaster. The broadcaster's perspective was provided by Shane Tucker from UK broadcaster Channel 4. The broadcaster does not produce content itself; however, it does commission a great deal of content. Channel 4 is not alone in the UK commissioning market – together with the BBC and ITV, Shane explained how it identified the need for a joined-up approach.

DPP - Strength through a unified approach
The UK broadcasters that established DPP have a desire to establish the means for shared learning and best practice with other UK broadcasters. Also, if UK broadcasters are unified in their adoption of digital file-based workflows, they can exert a greater influence outside of the UK. Both of these opinions were confirmed by Shane Tucker in the webinar. Another key advantage for our industry focuses on production companies. Major production companies supply several UK broadcasters and it will be a big benefit to them if we can standardise on one media interchange standard.

What benefits does DPP bring to Channel 4?
To start with, DPP just makes sense. This view was confirmed by Shane Tucker, who said in the webinar that DPP represents a common file format based on established standards (MXF, SMPTE, EBU) and has been established in conjunction with major UK broadcasters, screen & production companies. Furthermore, Shane highlighted that DPP offers a common descriptive metadata schema. The ability to access, process and automate metadata within digital file-based workflows is so important. It creates improved efficiency associated with automated workflows between broadcasters and their trusted suppliers. It cuts down on the need for data re-entry and speeds up material transfer and processing from delivery to playout/CDN, avoiding unnecessary transcoding.

Automated QC workflows
Another advantage of DPP is the potential to capitalise on automatic QC workflows with the production company or facility. In the webinar, Shane Tucker pointed out that there is a strong likelihood that QC processes will have been performed at numerous stages in the workflow before the media file reaches Channel 4, so any further QC cycles are unnecessary.

Shane concluded his contribution to the webinar by highlighting a number of challenges that remain in the successful adoption of DPP, not least the support needed from equipment vendors. At AmberFin, we recognise this and we're straining every sinew in our efforts to support this fantastic UK initiative. To see the webinar in full, please click the "Watch the Webinar" button below.
Why is Hollywood interested in DPP?
The Digital Production Partnership (DPP) is a thoroughly British concept. Its major sponsors include the BBC, ITV, Channel 4 and Sky. From its earliest inception, DPP was an initiative forged by the British broadcast industry, for the British broadcast industry. So why is it that in my day-to-day dialogue with executives in and around the Hollywood studio community, DPP is frequently raised and I'm asked for regular updates on its progression?

Without doubt, AmberFin has been a big supporter of the DPP initiative from day one. With people on our team such as my colleague Bruce Devlin – who is co-author of SMPTE's MXF specification – we really know our onions when it comes to intra- and inter-company file-based media workflows, so who better to ask than AmberFin? But I believe this US interest in an inherently UK initiative goes much deeper than technical curiosity and is fuelled by commercial considerations on a global scale.

Can the DPP initiative support a global industry?

As we all know, today's broadcast media market is entirely global. International markets offer tremendous opportunities for broadcasters to maximise their revenues. The BBC's Top Gear programme, for example, has become a global phenomenon and is watched by more than 350 million viewers in 170 countries – each broadcaster in each market having their own delivery specification. Without doubt, there is a significant amount of duplicated pain associated with the necessary testing of delivery specs – especially when upgrading software and systems. In order to maximise global revenues, more and more versions of material are created to satisfy this demand. Standards such as SMPTE's IMF (Interoperable Master Format) address the need to "publish" multiple versions of a media asset for shipping to the 'despatch' company, which is typically a post house.
Now standards such as DPP in the UK and the tongue-tripping, US-originated Joint Task Force on File Formats and Media Interoperability (JTFFFMI) provide delivery specifications for a commercially significant territory. The underlying ethos of both of these initiatives is that they reduce overall system complexity without affecting business flexibility, and through this they make very sound commercial sense for anybody involved in this market, on either side of the Mill Pond.

JTFFFMI – now there's an acronym to be proud of!

The JTFFFMI is a new industry group that has recently been formed by AMWA, SMPTE, NABA, the 4A's, the ANA, the EBU and the IABM. The kick-off meeting took place last week in New York and I attended. So far, it is too early to tell if any of the practices and specifications deployed by the DPP will be utilised in the US but, without doubt, the DPP is providing a valuable testing ground and insight as to the best practices and potential pitfalls that manifest themselves in this area. Having been involved with the DPP from the earliest stages, I applaud their approach. The DPP's intention was never to be dictatorial and impose its preferred flavour of MXF. It has created a forum where all interested parties, including broadcasters, facilities and equipment vendors, can come together and evolve a set of application specifications that are most appropriate for the UK.

What will happen on 15 October 2014?

Without doubt, not everything that is appropriate to the UK will necessarily be equally appropriate to other regional markets, but what the DPP has achieved is to break the back of this herculean challenge and create a blueprint that will help other territories develop their own application specifications for MXF-based workflows. This time, it really is rocket science, and this is why Hollywood, and indeed many other areas of North America, are closely watching the development and deployment of DPP.
DPP Deadline Day – 15th October 2014 – is fast approaching. The world is watching to see what will happen – will they be watching your organization? If you would like more information about the DPP, then a very good starting point is AmberFin’s Technical White Paper, which you can download for free here -
QC control within file-based workflows – an EBU update
At AmberFin, Quality Control (QC) has always been close to our hearts. Media files must always be fit for purpose – when they are not, they quickly become toxic and can be highly destructive within any file-based workflow. The EBU shares our point of view – QC is key. "Broadcasters moving to file-based production facilities have to consider how to use automated Quality Control (QC) systems. Manual quality control is simply not adequate anymore and it does not scale," so says the EBU. The EBU recognised QC as an important topic for the media industry in 2010. In 2011 it started an EBU Strategic Programme on Quality Control, with the aim to collect requirements and experiences and to create recommendations for broadcasters implementing file-based QC in their facilities.

So what is QC all about?

In an earlier blog, I questioned whether the broadcast industry really understands what Media Asset Management (MAM) systems are, and here too in QC, I fear that some vendors are muddying the waters. In Quality Control the big clue is in the word "control", and the role played by QC systems must not be confused with Test & Measurement (T&M) or Quality Assurance (QA) systems. In short, QC requires careful T&M of media parameters as a starting point, but then the user needs to analyse the T&M data in order to make decisions (control) before modifying processes and making other decisions to assure quality is maintained throughout a file-based workflow.

The role of EBU in bringing stability to QC

To do this consistently, we need all the technology vendors to measure and report on these measures in the same way. The EBU QC project has succeeded in defining a set of metrics that can be consistently measured and reported on. The full range of QC criteria has been presented by the EBU as a periodic table which would grace (and fill) any chemistry lab wall.
But the great standardisation work does not stop there – within the UK-based Digital Production Partnership (DPP), much more work has been done to identify the optimum number of tests required to prove that a DPP compliant file is, indeed, compliant. The DPP's aim is to take the completed EBU definitions and create a minimum set of tests and tolerance levels required to deliver a compliant DPP AS-11 file to UK broadcasters. The timeline for implementation will depend on the outputs of the EBU group, but publication is likely to be during spring this year.

Over time, the UK's broadcasters will move away from performing a full QC check on all delivered programmes, and rely on a spot check. A spot check, as opposed to full QC, is a technical video and audio check for every programme at the start, mid-point and end. It also includes checks on key metadata such as SOM, duration, identifiers etc. Production companies will be required to deliver their compliant files along with a valid QC report, as has previously been the case with the PSE report.

Why not engage with the EBU project online?

The important thing in all of this work is that the recommendations that are made are based on the largest collective point of view possible. The EBU really appreciates input from as many people and organizations as possible. If you are a broadcaster, an SI, or indeed a QC product provider, then why not get involved – go to the EBU's website and follow the instructions to provide your feedback to this important initiative. Also, if you want to know more about the underlying principles of QC within file-based workflows, you can download AmberFin's free technical White Paper.
Mixed cadence, a broadcaster’s view
Sadly, nothing in life is perfect and this is as applicable in the broadcast sector as anywhere else. Despite the best laid plans, what starts out as something good will over time fall foul of simple time and commercial pressure. As soon as compromises are made, the potential for problems – either immediate or further down the workflow – is created. The media value chain is long and the lifetime of media can also be long. In today's busy world, there just isn't the time or money to keep material in its native format throughout a broadcast facility. The consequence is that a US broadcaster or media company will normalize all of its content to a standard mezzanine form on ingest. Commonly this will be a format like XDCAM HD or AVC-Intra or DNxHD or ProRes, but in nearly all cases the frame rate will be 29.97.

How deeply seated is the mixed cadence problem?

Providing that the only thing that happens next is simply the playout of the media, then all is well. Standard, reliable, known ways of inserting a 2:3 cadence are in common use in ingest devices, transcoders etc., and those devices, by and large, are quite good at it. Life, however, is never that simple. Many processes are required before the content airs, depending on the material and the territory. Whereas a few years ago these processes may have been done on a live video stream, there is a trend for these processes to happen in the editing software:

- Edit for duration
- Transcode on the way into and/or out of an edit platform
- Overlay a logo
- Edit for censorship
- Insert black segments / slugs for adverts
- Squeeze & tease
- New credit roll
- Transcode on the way into an archive
- Transcode on the way out of an archive (maybe as a partial restore)

So we now have a sensibly made movie that has become a video sequence with a 2:3 cadence, where the chance of 2:2 cadence video edits, effects and overlays corrupting the underlying 2:3 cadence has dramatically increased. In short, your perfect film has turned into a video nasty.
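The mechanics can be sketched in a few lines of Python. This is a deliberately simplified model (the frame labels and the naive removal loop are illustrative assumptions, not production code), but it shows why pulldown removal is trivial only while the cadence stays regular:

```python
def pulldown_23(film_frames):
    """2:3 pulldown: repeat film frames as 2, 3, 2, 3, ... video fields,
    so 4 film frames (24 fps) become 10 fields (30/1.001 fps video)."""
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (2 if i % 2 == 0 else 3))
    return fields

def remove_pulldown(fields):
    """Recover the film frames - valid ONLY for a regular 2:3 pattern."""
    frames, i, step = [], 0, 0
    while i < len(fields):
        frames.append(fields[i])
        i += 2 if step % 2 == 0 else 3  # stride through the known cadence
        step += 1
    return frames

cadenced = pulldown_23(list("ABCD"))
print(cadenced)                   # ['A','A','B','B','B','C','C','D','D','D']
print(remove_pulldown(cadenced))  # ['A', 'B', 'C', 'D'] - trivial when regular
```

A single video-rate edit, insert or overlay shifts the pattern, the fixed 2, 3 stride no longer lines up with the repeats, and this simple removal breaks down. That is exactly the "video nasty" problem: real cadence correctors must detect and track an irregular pattern rather than assume it.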
Commercial pressures lead to rushed decisions

Again, if all you're going to do is play it out at 30/1.001 fps then any problems are generally invisible. Life, today, is getting more complicated and there is huge commercial pressure to quickly and automatically push that content onto a multi-platform distribution system, or move the content to another territory for re-broadcast. The lifecycle of the media becomes more erratic now. Sometimes there will be an attempt to recover the original 24fps material, often there will be scaling to change resolution, sometimes there will be scaling to change aspect ratio, sometimes there will be a standards conversion to a new frame rate, sometimes there will be further editing. This is where the problems really start.

Compression algorithms work best when content is predictable. A regular 2:3 sequence should be easier to compress than a 2:3 sequence with breaks, hiccups and video overlays. Removing a 2:3 sequence is trivial if the sequence is regular. If, however, the sequence is irregular, with video overlays on portions of the content and with video inserts, then getting back to the "original" 24fps becomes very difficult. Failing to handle the cadence correctly has a knock-on effect on every downstream process, because once upstream problems have been "stamped into" the content by a compression stage, removing them becomes more and more difficult.

Mixed cadence – keep the solution simple

Fixing cadence issues assumes that you know what problem you want to fix. There are some very complex, and costly, cadence correction solutions on the market, but you don't need, or want, something that is too complicated. In an ideal world, your operators need a solution with a GUI that is intuitive and easy to operate. Today, more flexible post-production workflows make this mixed cadence challenge a more common occurrence. You need a solution that fixes the business problem and delivers your perfect film back from the video nasty.
At AmberFin, we have looked at the various techniques involved in removing cadence problems and in changing frame rates in general. We have developed a method of adaptively switching between different conversion mechanisms, which provides the ability for user control of conversion policy on a file-by-file basis as well as the ability to review automated decisions within a QC environment. To learn more about this really neat, business-focused mixed cadence solution, you can download our free White Paper.

I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?