
Jan 21, 2015
Getting back to basics


Regular readers of this blog may well have noticed a small change in the last week or so – the blog has moved. As part of the launch of the new dalet.com, we have also merged the AmberFin and Dalet blogs to a single new location.
 
This got me thinking about how the blog got started, what we wanted to do with it and how those plans have evolved. 
 
The first post in this blog is dated 4th March 2013 – almost two years ago. Since then, the blog has grown massively in terms of followers, posts, and even the number of contributors and the breadth of their knowledge.
 
What hasn’t changed is the intention. In that first post, Bruce wrote that the “blog will not be a platform for us to repurpose press releases, datasheets and corporate brochures, and we promise we will not write endless, self-serving product posts.” 
 
A quick review of the 154 posts since Bruce made that promise suggests that, with the exception of some shameless self-promotion back in May, we’ve kept our word.
 
Our motivation for launching the blog was simple: “The rate of technological change in the broadcast and film markets continues unabated yet nobody has the time or resources to train their employees. As a result there is a real need for people to educate themselves to keep up to date in their current role and to advance their career.”
 
Since the blog launched, a number of industry organizations have done a great job in formalizing their educational and training programs and making courses available to businesses and individuals, but that statement and motivation still stands true today.
 
“When we say we want to help educate technology professionals in the broadcast and film market, we mean it.”
 
As this “mini re-launch” of our blog coincides with our planning for 2015, we are looking at how we can do even more to support that goal. Here are our ideas so far:

  • More contributors. Not just more, but contributors from all areas of the industry. Starting in the new year, we will be inviting end users, our partners and other independent industry professionals to bring their expertise to the blog – expanding the breadth and depth of knowledge we are able to provide.
  • Regional views. While all media organizations face broadly the same challenges, there are undoubtedly some variations across the globe. Dalet has 18 offices worldwide offering unique insights into localized issues, solutions and opportunities – we’ll be exploring some of those viewpoints and looking at how they might benefit the global market.
  • Reader interaction. Education should be interactive, and we have a number of initiatives in the works to enable this, starting now with suggestions for blog topics. If there’s a topic you’d like us to cover in this blog, contact us or tweet your idea with #DaletBlog.

Over the next few weeks, we’ll be revisiting the topics of some of those early blog posts to see how and where the industry has moved on – which of those topics are now “non-issues” and which are still causing us headaches.
 
As we said in March ’13: To your workflow success! 

Dalet Appoints Patricio Cummins as General Manager of Dalet Asia-Pacific
Dalet, a leading provider of solutions and services for broadcasters and content professionals, today announced the appointment of Patricio Cummins as General Manager of Dalet Asia-Pacific (APAC). Based out of the Dalet regional headquarters located in Singapore, Cummins will be responsible for Dalet sales, project and customer success teams across the APAC territory. Cummins, who joined Dalet through the acquisition of the Ooyala Flex Media Platform business, was previously vice president of sales for Ooyala Asia-Pacific and Japan (APJ).

“Patricio joins the Dalet team with two decades of experience in the broadcast, media and telecommunications industries and a proven track record of successfully developing new business and expanding into new markets across Asia Pacific. He is a well-prepared leader who brings expertise, enthusiasm and a fresh perspective to the team,” states Stéphane Schlayen, chief operating officer, Dalet. “Both Dalet and Ooyala have prestigious references in Asia Pacific that, when merged, have an even more promising potential under the guidance of Patricio. We wish him great success in his new endeavor.”

An IABM APAC Council Member, Cummins has held key positions with Ooyala since 2014, driving customer adoption and managing service deployments, first in the Latin America region, then in APJ. His tech-savvy leadership has helped broadcasters, corporate brands, telcos, leagues, and sports teams modernize their content supply chains and reduce the time to launch personalized multi-platform experiences.

Cummins succeeds Cesar Camacho, who has stepped into a new role at Dalet as Head of Business Development for Latin America. Schlayen concludes, “I want to personally thank Cesar for the dedication he has put into managing the Dalet business across the APAC region. His contribution was instrumental in driving our business development, growth and customer success. I am confident he will bring the same level of commitment and achievements to the Latin American market.”

Meet Patricio Cummins and Dalet @ IBC2019
IBC2019 attendees can book an appointment to meet with Patricio Cummins or have a private demonstration or workflow consultation with a Dalet expert to learn more about the latest products and solutions at https://www.dalet.com/events/ibc-show-2019. Press can contact Alex Molina at alex@zazilmediagroup.com to schedule a media briefing.

Better Together - Join us for a Very Special Dalet Pulse Event @ IBC2019!
This IBC2019, the Dalet Pulse media innovation summit will expand its platform to include Ooyala. Celebrating the joining of two great media teams and technologies, the Dalet Pulse theme this year, Better Together, will give attendees a chance to learn about the extended product portfolio and how it helps leading media organizations develop agile content supply chains, deliver unique content experiences to multi-platform audiences, and increase revenues with Dalet solutions and partner technologies. It’s also a unique opportunity to meet the expanded team.

Thursday, 12 September
Pompstation, Amsterdam
Keynote: 17:30 - 19:00
Party: 19:00 - 22:00
Register now via https://www.dalet.com/events/dalet-pulse-ibc-2019.

About Dalet Digital Media Systems
Dalet solutions and services enable media organisations to create, manage and distribute content faster and more efficiently, fully maximising the value of assets. Based on an agile foundation, Dalet offers rich collaborative tools empowering end-to-end workflows for news, sports, program preparation, post-production, archives and enterprise content management, radio, education, governments and institutions. Dalet platforms are scalable and modular. They offer targeted applications with key capabilities to address critical functions of small to large media operations - such as planning, workflow orchestration, ingest, cataloguing, editing, chat & notifications, transcoding, playout automation, multi-platform distribution and analytics.

In July 2019, Dalet announced the acquisition of the Ooyala Flex Media Platform business. An acceleration of the company’s mission, the move brings tremendous value to existing Dalet and Ooyala customers, opening vast opportunities for OTT & digital distribution.

Dalet solutions and services are used around the world at hundreds of content producers and distributors, including public broadcasters (BBC, CBC, France TV, RAI, TV2 Denmark, RFI, Russia Today, RT Malaysia, SBS Australia, VOA), commercial networks and operators (Canal+, FOX, MBC Dubai, Mediacorp, Fox Sports Australia, Turner Asia, Mediaset, Orange, Charter Spectrum, Warner Bros, Sirius XM Radio), sporting organisations (National Rugby League, FIVB, LFP) and government organisations (UK Parliament, NATO, United Nations, Veterans Affairs, NASA).

Dalet is traded on the NYSE-EURONEXT stock exchange (Eurolist C): ISIN: FR0011026749, Bloomberg DLT:FP, Reuters: DALE.PA. Dalet® is a registered trademark of Dalet Digital Media Systems. All other products and trademarks mentioned herein belong to their respective owners.
25 Years of Dalet
In 1990, six friends from engineering and business school formed Dalet, a company that pioneered the first audio software and centralized database solution for the radio industry. Canada’s national radio channel, CBC Radio, became the first Dalet system deployed with a centralized catalogue and, throughout the 90s, we expanded across Europe, Asia and the Americas, providing solutions for radio and newsrooms so successfully that in many cases the same software is still in use today. Radio continues to be an important part of our business, with large customers such as Voice of America and SiriusXM Radio relying on Dalet radio solutions.

One of the six founders, Stéphane Guez, reminisces: “We knew we wanted to start a company together, but at the beginning, we weren’t entirely sure what that company was going to be. Spread across the northern hemisphere, we’d spend hours talking on the phone with one another. Well before Skype even existed and before cell phones had come to the masses, we’d find ways to connect via telephone, even though we were flat broke and didn't have access to our own lines. When we realized what it was that we wanted to create, there was no turning back.”

It wasn’t long before we realized the potential of our approach beyond radio and, in our 10th year, we began extending our software to establish a comprehensive solution for television news. That same year, in June (2000), Dalet also became a publicly traded company on the Paris Bourse (Euronext Paris). By 2002, we had created an end-to-end news production system – incorporating NRCS (newsroom computer system), ingest tools, video production features, and playout control with archive capabilities – which was rapidly adopted by the industry’s most forward-thinking broadcasters, including NBC, Prime TV, and Russia Today, to name a few.

The first decade of this century saw immense changes in the media industry, with the monumental shift to file-based workflows. With this came the growing need for flexible and comprehensive media asset management (MAM) solutions, a trend that we had identified and were well positioned to address with our background in news and radio. In fact, in 2009, we were honored with the IBC Innovation Award in the Content Management category, having provided RTBF with a highly flexible and scalable tapeless workflow facilitating production across news, program and sports operations, from ingest to playout to archive.

In our 20th year (2010), Dalet completed the strategic acquisition of Italian company Gruppo TNT. With their Brio video server platform, Gruppo TNT had already experienced great success in their domestic market, but Dalet saw the potential in this technology, highly complementary to our own, as the next generation of video servers on the global market. Not only has the Brio augmented our MAM-driven solutions, it has also, on its own merits, become the cornerstone of ingest and studio infrastructure at some of the world’s most prestigious media facilities.

“Growing from an idea between six friends into a global business has not been without its challenges,” Michael Elhadad, another of the original six, notes. “We had to take a lot of chances and make decisions based solely on our vision of the future. We’ve had our fair share of disagreements throughout that process! It’s also been extremely rewarding to see the results of those decisions and the success that’s come thanks to the many exceptional people we’ve worked with over the years.”

Looking to repeat the success of the Gruppo TNT acquisition, and to further complement the now well-established 4th generation of our MAM platform, Dalet Galaxy, in April of 2014 UK-based AmberFin joined the Dalet family. Well known for its high-quality transcode and file-based frame rate conversion products, AmberFin brings expertise in media formats and processing that, combined with the workflow and media management experience of Dalet, is truly exciting and already proving beneficial for our customers.

Over a quarter of a century, from humble beginnings, we have become a truly international organization, proudly supporting our customers with software-based solutions that have innovated and will continue to evolve in response to an ever-changing media economy. We especially want to thank the many individuals who have contributed to the success and growth of this company – naturally, all our past and present colleagues at Dalet; our partners, who have challenged us along the way; and, of course, our customers, who we exist to serve but who have also provided their invaluable wisdom to help better our offerings. In our 25th year, and as we look to the next 25 years, we will use those secure foundations to continue firmly on that path, working in close partnership with our customers to embark on new journeys and reimagine the media enterprise.
An Amsterdam Education! … No, Not That Type of Education
Maybe it’s a result of having two teachers as parents, but I am passionate about education and, particularly, education in our industry. Technology and innovation move forward so fast in our business that even as a seasoned industry professional it can sometimes be tricky to keep pace. That’s why I’m so excited to be doing something a little different with the Dalet Theater at IBC this year – here’s what we’ve got going on.

Dalet @ IBC
One of the primary reasons for visiting the IBC Show is to find out what’s new. Each morning, about an hour after the show opens, we will host a short presentation to explore all the key announcements that Dalet is making at IBC. Whatever your reasons for visiting IBC, this is a great opportunity to find out what’s new.

Bruce’s (Orange) Shorts
After a short break, Bruce Devlin (aka Mr. MXF) will be back on stage to preview a brand new series of Bruce’s Shorts, due out later this year. Every day at 13:00 and 16:00 Bruce will present two short seminars on new technologies and trends.

Partners with Dalet
Across the globe, Dalet works with a number of distributors and resellers who package Dalet solutions and applications with other tools to meet the needs of their geographies. We’ve invited some of our partners to talk about how they’ve used Dalet and other technologies to address the needs of their regions (12:00).

Product Focus
If you want to know a little bit more about Dalet products and give your feet a bit of a rest, at 14:00 each day we’ll be focusing in on part of the Dalet portfolio. Click here to see what’s on when!

Case Studies
There’s no better way to learn than from someone else’s success. We will feature a number of case studies at 15:00, followed by Q&A, based on the most cutting-edge deployments of the past year.

Dalet Keynote
The big one… each day of the show (Friday through Monday), at 17:00, we’ve partnered with industry giants, including Adobe, Quantum and others, to bring you the Dalet Keynotes, which will focus on the biggest challenges facing our industry today. There will also be some light refreshments and an opportunity to network with speakers and peers after the presentation. We’re expecting standing-room-only for the Dalet Keynote sessions, so register your interest (Dalet+Adobe; Dalet+Quantum) and we’ll do our best to save you a seat.

It’s going to be an amazing lineup with something for everybody – be sure to check the full Dalet Theater schedule and stop by the stand during the show for the latest additions and updates. Of course, if you want to talk one-on-one with a Dalet solutions expert or have an in-depth demo tailored to your requirements, you can click here to book a meeting with us at the show. We'll be in hall 8, stand 8.B77.

We can’t wait to see you there – but if you’re more of a planner and want to know what to expect elsewhere on the Dalet stand, visit our dedicated IBC page on the Dalet website. Who knows, you might even stumble across some intriguing bits of information or a clue (or two) for what we might be announcing at the show (hint, hint!). We’re looking forward to seeing you in Amsterdam! Until then…
Pictionary, Standards and MXF Interoperability
Four weeks ago, I posted in this blog about the IRT MXF plugfest, the new MXF profiles that were published in Germany this year by ARD and ZDF, and how these new profiles would bring forth a new era in interoperability. This week, the first results of that plugfest, and reaction from some of the end users and vendors, were presented at a conference on file-based production also hosted by the IRT in Munich.

As usual, the results were fascinating. As with all statistics, they could be manipulated to back up any point you wanted to make, but for me there were a couple of highlights. First, as mentioned in my last post, this was the 9th such MXF plugfest, so we have a good historic dataset. Comparing previous years, there is an obvious and steady increase in both the interoperability of exchanged files and compliance with MXF specifications. For most of the codecs and variants tested by the 15 vendors who took part, over 90% of files are now exchanging successfully (up from 70-80% five or more years ago). In one case, the new ARD-ZDF MXF profile HDF03a, 100% of the files submitted interchanged successfully.

Interestingly, those same files all failed a standards compliance test using the IRT MXF analyser. This highlights one of the difficulties the industry faces today with file interoperability, even with constrained specifications such as the AMWA Application Specifications and the ARD-ZDF MXF profiles. The IRT MXF analyser categorises test results as pass, fail, or with warning. It is notable that all files with MPEG-2 essence (e.g. XDCAM HD) either failed or had warnings, while AVC-Intra and DNx files each had a significant number that passed. However, when it came to interoperability, the differences between the codecs were much less obvious.

One theory is that because MPEG-2 in MXF is the oldest and most widely used MXF variant, it has become a near de facto standard that enables a reasonably high degree of interoperability – despite the fact that most of these files are not compliant with the specifications. I mentioned in my previous post that the new ARD-ZDF profiles have accommodated this deviation from specification in legacy files by specifying broader decode parameters than encode parameters. This was the focus of my presentation at the conference this week, illustrated through the use of children’s toys and the game of Pictionary.

However, the additional decoder requirements specified are not without issue. For example, it’s certainly impractical, if not impossible, to test all the potential variations covered by the broader decoder specification, given that it would be difficult to find test sources that exercise all the possible combinations of deviation from the encoder specification. In another area, while the profile says that the decoder should be able to accommodate files with ancillary data tracks, there is no guidance as to what should be done with the ancillary data, should it be present. As a vendor, that’s particularly problematic when trying to build a standard product for all markets, where the requirements in such areas may vary by region.

Overall, though, while there are improvements that can, and will, be made, it’s clear that for vendors and end users alike the new profiles are a big step forward, and media facilities in Germany are likely to start seeing the benefit in the next 6-12 months. Exciting times lie ahead.
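The divergence described above – files that interchange perfectly yet fail strict compliance checking – is easy to quantify. Here is a minimal Python sketch of that tally; the data structure and sample values are illustrative only, not the IRT analyser's actual output format:

```python
from dataclasses import dataclass

@dataclass
class FileResult:
    codec: str
    interchanged: bool   # did the file exchange successfully between vendors?
    compliance: str      # "pass", "fail" or "warning" (IRT-style categories)

def rates(results):
    """Return (interoperability %, strict compliance %) for a batch of files."""
    n = len(results)
    interop = sum(r.interchanged for r in results) / n * 100
    compliant = sum(r.compliance == "pass" for r in results) / n * 100
    return interop, compliant

# Hypothetical batch mirroring the HDF03a observation: every file
# interchanges successfully, yet none passes strict compliance.
batch = [FileResult("HDF03a", True, "warning") for _ in range(5)]
print(rates(batch))  # (100.0, 0.0)
```

Separating the two metrics like this makes the point of the plugfest data concrete: interoperability and specification compliance are related but independent measurements.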
3 ways to fix QC Errors – Part 2 – What the DPP is doing about QC
Recently I spoke at a symposium on media QC run by the ARD-ZDF Medien-Akademie and IRT in Munich, Germany. Andy Quested of the BBC, who spoke on behalf of the EBU, opened his presentation by asking how many of the 150 or so representatives of German-language broadcasters in the audience were actually using automated QC in their workflows. Despite most of those in attendance having purchased and commissioned automated QC systems, it was possible to count those responding positively on one hand.

In a previous blog post I wrote about how automated QC systems were under-utilized and suggested three simple steps that can be taken to reduce the number of QC errors in a typical workflow. In following up on that post, here is how the work of the UK’s Digital Production Partnership (DPP) and the EBU QC group reflects these suggestions.

Reducing the number of QC tests
When the EBU QC group started looking at automated QC tools, they counted a staggering 471 different QC tests. By rationalizing the differently named or similar tests and removing those deemed unnecessary, the list was whittled down and turned into the periodic table of QC – now containing just over 100 different tests. This is still a large number, so the DPP has reduced it to a list of about 40 critical tests for file delivery. The failure action for each of these tests has also been identified as either an absolute requirement (must pass) or a technical or editorial warning.

QC test visualization
Each test in the EBU periodic table of QC has been categorized into one of four groups:

  • Regulatory – making sure that the media conforms to regulations or legislation such as the CALM Act in the US or EBU R128 in Europe. A failure here may not actually mean that the quality of the media is poor.
  • Absolute – physical parameters that can be measured against a published standard or recommendation.
  • Objective – parameters that can be measured, but for which there is no published standard to describe what is or isn’t acceptable. Often, pass/fails in this category will require human judgment.
  • Subjective – artifacts in video and audio that require human eyes and ears to detect.

The last two categories in particular require the QC events to be presented to operators in a way that enables effective evaluation.

EBU focuses on how to QC the workflow
The work of the EBU group is ongoing. Having now defined a common set of QC tests and categories, the current and future work is focused on QC workflows and on developing KPIs (Key Performance Indicators) that will demonstrate exactly how efficient media workflows are with regard to QC. This is a key area and one where the EBU is well positioned to see this initiative come to fruition. As the EBU has stated, “Broadcasters moving to file-based production facilities have to consider how to use automated Quality Control (QC) systems. Manual quality control is simply not adequate anymore and it does not scale.” The EBU recognised QC as a key topic for the media industry in 2010 and, in 2011, started an EBU Strategic Programme on Quality Control, with the aim of collecting requirements and experiences and creating recommendations for broadcasters implementing file-based QC in their facilities.

I left Munich with the clear impression that real momentum is being generated by organizations such as the EBU and DPP in the field of media quality control. It is reassuring when you see that what you have been advising customers for years is supported by leading broadcast industry bodies – QC is key! At AmberFin, QC has been a passion of ours for many years. To understand our approach to this critical component of file-based workflows, why not download our free white paper on the issue?

I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
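The four EBU categories and the DPP-style failure actions described above lend themselves to a simple data model. The sketch below is purely illustrative – the test names and field names are invented, not the real EBU periodic table or the DPP delivery list:

```python
# Each QC test carries an EBU-style category and a DPP-style failure action.
EBU_CATEGORIES = {"regulatory", "absolute", "objective", "subjective"}

tests = [
    {"name": "R128 loudness",   "category": "regulatory", "action": "must_pass"},
    {"name": "Video bit depth", "category": "absolute",   "action": "must_pass"},
    {"name": "Black frames",    "category": "objective",  "action": "technical_warning"},
    {"name": "Mosquito noise",  "category": "subjective", "action": "editorial_warning"},
]
assert all(t["category"] in EBU_CATEGORIES for t in tests)

def needs_human_review(test):
    """Objective and subjective results require operator judgment."""
    return test["category"] in {"objective", "subjective"}

# Absolute requirements block delivery; the rest go to an operator queue.
hard_failures = [t["name"] for t in tests if t["action"] == "must_pass"]
review_queue  = [t["name"] for t in tests if needs_human_review(t)]
print(hard_failures)  # ['R128 loudness', 'Video bit depth']
print(review_queue)   # ['Black frames', 'Mosquito noise']
```

Partitioning the tests this way reflects the point made above: only some results can be decided by a machine, and the rest need to be surfaced for human evaluation.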
When is a workflow not a workflow? Can airports learn from modern media workflows?
Passing through Frankfurt airport last week, I was reminded of the chaos at Amsterdam Schiphol airport when returning from IBC earlier this year. Like many airports, Frankfurt and Schiphol have replaced friendly-faced check-in clerks with automated check-in and bag drop. As visitors returning from the conference and exhibitions queued up to use the shiny new automated bag drop, what started as friendly chatter about the previous five days’ events turned to increasingly vocal demonstrations about the delays the new system was causing. The delays were largely caused by bags that slightly exceeded the weight or size limits, or were simply the wrong shape to fit the uniform dimensions of the drop-off – problems that a small amount of human judgment would have easily resolved. Eventually, a large team of KLM staff was dispatched to the scene to calm the mounting insurrection, help reduce the increasing delays and ensure people caught their flights.

Workflow automation does not always increase efficiency and throughput
It seems mad that a system billed as expediting the check-in process for customers and reducing costs for the airline actually had the opposite effect – but we are in danger of doing something very similar in the media industry. From the airline’s perspective, the process of checking in a passenger and their baggage is actually very similar to the process of ingesting media. Before online check-in and automated bag drops, a check-in clerk would have verified a passenger’s ID, issued their boarding pass, asked the appropriate security questions and weighed and checked their baggage.

Can we replace men with machines in media workflows?
In a traditional ingest scenario, we would have taken a tape, placed it in a VTR, visually verified the content and checked that it was successfully written to disk. Whether or not QC was formally a part of ingest, a human operator was likely to be interacting in some way with the media and able to apply judgment as to whether there was any issue with it. With automation in media systems as advanced as it is, it is possible to pass media through a workflow without a human ever viewing it end-to-end. Much like in an airport, if everything about the passengers and their baggage is within the defined constraints, the process will be quick and efficient – issues only arise when there is an exception: when the passenger’s bag is a kilo overweight, or the media file fails an automated QC.

Combining automation with a human touch
The challenge we have to face in the media industry, as file-based delivery increases and SDI disappears, is how we handle these exceptions in the workflow in a fast and effective way, combining automation with the human touch to ensure the quality of our output. In order to do this, we need to unify manual and automated QC through a single interface that enables users both to make judgments on automated measurements and to add commentary to QC reports. Taking this approach ensures that media “failed” by automated QC can quickly move on (or back) in the workflow and, where an error has been “over-ruled” by a human, the certificate of trust can follow the content. Once trusted, the media should pass through the rest of the workflow without issue before flying off into the sunset.

At AmberFin, we have learned that whilst automation is good, there is still an important place for human intervention in media workflows. I can’t help wondering how long it will take – and how many travelers’ journeys will be affected – before the airlines come to the same conclusion. If you would like to learn more about AmberFin’s unique approach to enterprise-class workflow automation, please get in touch. I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
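The "certificate of trust" idea above can be sketched as a small data structure in which an operator's over-rule travels with the automated result. This is a hypothetical illustration of the concept, not AmberFin's actual implementation:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class QCEvent:
    test: str
    result: str                     # "pass" or "fail" from the automated tool
    override: Optional[str] = None  # operator decision, if any ("accept"/"reject")
    comment: str = ""

@dataclass
class QCReport:
    media_id: str
    events: List[QCEvent] = field(default_factory=list)

    def trusted(self):
        """Media is trusted if every automated failure was over-ruled
        by an operator; the report then follows the content downstream."""
        return all(e.result == "pass" or e.override == "accept"
                   for e in self.events)

report = QCReport("clip-001")
report.events.append(QCEvent("audio peak", "fail",
                             override="accept",
                             comment="Normalised to R128 upstream; peak flag spurious."))
print(report.trusted())  # True
```

Because the operator's judgment and commentary are stored alongside the automated measurement, the same report can act as the exception queue for humans and as the trust record that accompanies the media through the rest of the workflow.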
Three steps to QC heaven – Practical hints to fix file-based QC errors
As previously written about on this blog, automated Quality Control (QC) within file-based production facilities has been regarded as a key issue for a number of years. Back in 2010, the EBU recognized QC as a key topic for the media industry and has subsequently stated that manual quality control processes are simply not adequate anymore and do not scale. So, you could be forgiven for thinking that this would have heralded a boom period for QC tool manufacturers. However, if you look at this market more carefully, that prediction does not appear accurate. Following impressive launches and demonstrations at NAB and IBC in 2006, the potential savings in op-ex and gains in efficiency that automated QC tools offered grabbed the attention of budgeting and planning teams in media facilities worldwide. But nearly eight years later, and despite some really significant advances in the functionality, accuracy and performance of these tools, sadly, many of the automated QC tools bought and installed lie dormant or, at best, under-utilized. The most frequently given reason for this is simply that the systems would generate so many errors across so many metrics that it was nearly impossible for a piece of media to successfully pass.

At AmberFin, we hate waste and love efficiency, so here are three simple steps to fix QC errors and make the best use of your automated QC.

1. Turn off the QC tests
No, really! Perhaps not all of them, but work out which ones are actually going to identify real problems downstream in the workflow or presentation of the media, and turn off the remainder. Just last week we were talking to a customer who was having problems with every piece of media failing QC due to audio peak levels. Clearly, there could have been an issue here, but the previous step in the workflow was to normalize the audio to meet EBU R128 loudness specifications, which it did – so the peak level errors were not only spurious, but the test itself unnecessary.

2. Visualize it!
If you take the event data generated by an automated QC tool and present it in a clear, interactive way, it becomes much quicker and easier for operators to make sound judgments and distinguish real errors from marginal issues or “false positives” / “false negatives”. This is why AmberFin created UQC and uses it to validate our own ingest and transcode tools in iCR. The timeline gives a clear view of any problems detected and, alongside video and audio playback, makes it considerably faster and more efficient to identify genuine problems.

3. QC the workflow
Toyota gained a reputation for building high-quality cars at a low price. Their QC process did not involve a single gigantic QC operation at the end of the production line. They implemented a production system where the processes themselves were checked – the theory being that if you start with the right input and have the right processes, then the output will also be right. We can implement the same idea in media workflows by identifying issues introduced in the workflow and fixing the workflow, rather than fixing individual items of media. This should, in turn, reduce the number of error events reported by automated QC tools and further increase efficiency.

Don't let your automated QC tool sit idle!
If you have an automated QC tool sitting idle and unloved, why not try these three easy steps to get closer to those promised savings and gains? If you are still trying to get your head around this important issue, then you can learn a great deal if you download AmberFin’s QC white paper – Unified Quality Control from AmberFin.
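Step 1 above – turning off tests that an upstream process already makes redundant – can be expressed as a simple filter over a QC profile. The test and process names below are invented for illustration; a real product's configuration would look different:

```python
# A QC profile mapping test names to whether they are enabled.
qc_profile = {
    "audio_peak_level": True,
    "r128_loudness":    True,
    "video_black":      True,
}

# Properties already guaranteed by an upstream step (here, a loudness
# normaliser ran before QC), so re-testing them only produces spurious errors.
upstream_guarantees = {"audio_peak_level", "r128_loudness"}

# Disable any enabled test whose property is guaranteed upstream.
pruned = {test: enabled and test not in upstream_guarantees
          for test, enabled in qc_profile.items()}
print([t for t, on in pruned.items() if on])  # ['video_black']
```

The same filter generalises to step 3: as each workflow process is verified, the properties it guarantees move into the "upstream guarantees" set and the downstream QC profile shrinks accordingly.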
The sidecar is dead – long live the sidecar
The DPP (Digital Production Partnership) has “no intention of taking over Europe”, let alone the world, however that has not stopped the world looking on with great interest and, in most cases, with great admiration. Models established for the UK media industry by the DPP will undoubtedly be adopted across the globe and that makes the announcements made during the DPP event at IBC highly significant: Notably, the host of updates to the file delivery specification, which will be the preferred method of delivery to UK Broadcasters from 1st October 14, includes, perhaps controversially, the deprecation of the XML sidecar. The DPP Technical Standards for file delivery, and the AMWA AS-11 specification on which they are based, specify that the descriptive metadata shall be stored within the MXF media file. Previous versions of the DPP specification have also included the requirement for an XML sidecar, carrying the same descriptive metadata, resulting in a duplication of the metadata. Removing the requirement for the XML sidecar greatly simplifies management and manipulation of the media as the descriptive metadata is no longer stored in multiple locations. A single storage location for the metadata facilitates easier interchange and interoperability and reduces the risk of erroneous or incomplete metadata. However, many file-based delivery operations have become dependent on XML sidecars to ‘register’ the receipt of media. This sidecar-driven registration of the media file is unlikely to go away for some time, but the inclusion of the DPP metadata within the media file itself means that the sidecar can become focused on transactional and operational, e.g. QC (quality control) metadata, which have equal inherent value as descriptive metadata (in some cases having a direct relationship to revenue) but are of a much more transient nature. The perpetual nature of descriptive metadata means that it’s natural home is within the media file. 
Until such time as an infrastructure exists for the exchange of transactional metadata associated with the transfer of media files between facilities, the only practical home for this data is in a sidecar. For now at least, the sidecar lives on!
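To illustrate the split this model implies — descriptive metadata living inside the MXF file, transient transactional and QC data in a sidecar — here is a minimal Python sketch that reads a hypothetical transactional sidecar. The element names and file name are illustrative only and are not taken from any DPP or AS-11 schema:

```python
import xml.etree.ElementTree as ET

# A hypothetical transactional/QC sidecar. Descriptive metadata (title,
# synopsis, etc.) would live inside the MXF file itself under the AS-11
# model, so only transient, per-transfer data appears here.
SIDECAR_XML = """
<Sidecar>
  <MediaFile>programme_as11.mxf</MediaFile>
  <Transfer status="complete" checksum="9f86d081884c7d65"/>
  <QC>
    <Check name="loudness" result="pass"/>
    <Check name="video-black" result="pass"/>
  </QC>
</Sidecar>
"""

def parse_sidecar(xml_text):
    """Return the referenced media file and its QC results."""
    root = ET.fromstring(xml_text)
    media = root.findtext("MediaFile")
    qc = {c.get("name"): c.get("result") for c in root.findall("./QC/Check")}
    return media, qc

media, qc = parse_sidecar(SIDECAR_XML)
print(media)           # programme_as11.mxf
print(qc["loudness"])  # pass
```

A receiving facility could use exactly this kind of lightweight parse to ‘register’ an incoming file, while all the perpetual descriptive metadata stays with the media itself.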
Five Things that you Need to Know about DPP
When organizations use the term ‘revolutionary’ to describe a concept, I find that it's normally a pretty good cue to turn off and move on to the next thing. Usually, puffed-up descriptions conceal a flaky or fundamentally compromised proposition. The first thing I can tell you about the Digital Production Partnership (DPP) is that it is not snake oil.

Efficient & cost-effective

DPP is a real-life platform that offers broadcasters and facilities of all types the potential to revolutionize their production workflows. It enables organizations to adopt digital file-based workflows in ways that are both efficient and cost-effective, and to use file-based technologies for intra- and inter-company media transfers. It finally consigns the ‘sneaker-net’ to the rubbish heap of history.

An industry-funded initiative
So, the next big question is who is behind DPP – is it the brainchild of some multi-national corporation, developed to encourage broadcasters to buy more kit? No, just the opposite. DPP is an initiative formed by the UK’s public service broadcasters to help producers and broadcasters maximise the benefits of digital production. The partnership is funded by the BBC, ITV and Channel 4, with representation from Channel 5, Sky, S4C and the independent sector on its working groups. DPP draws on industry experts from the worlds of technology and broadcast production to help fulfil its remit.

Building on the success of the Material Exchange Format (MXF)
Today, DPP is unique. It has taken all the hard work that went into creating the SMPTE MXF specification nearly 10 years ago and developed a set of Application Specifications for the UK industry that transform this technical standard into a real-world business platform. Looking at it from an international perspective, DPP is the first of these Application Specifications to receive national-scale adoption. I'm pretty certain that it won’t be the last.

DPP is already a winner
DPP has been successful in establishing a road map for digital production in the UK. It provides a framework that enables the UK industry to come together and share best practice in digital production, helping producers and broadcasters maximize the potential of the digital revolution. It also leads the standardization of technical and metadata requirements within the UK, helping to ensure digital video content can be easily and cost-effectively distributed to audiences via multiple platforms.

Strong vendor support

DPP is supported by many of the leading broadcast technology vendors. At a recent DPP Vendor Day, I counted 13 manufacturers present in the room – all co-operating to develop a harmonized digital file-based working environment. At AmberFin, we’re proud to say that we are at the leading edge of this cross-industry co-operation. Already, we have introduced a family of new DPP-compliant media ingest, media transcode, playback and quality control products that will, for the first time, provide broadcasters and content owners with efficient, targeted and cost-effective production tools.

At AmberFin, we whole-heartedly support the DPP initiative here in the UK. We believe it has the ability to transform the UK broadcast industry. Furthermore, we believe it provides a blueprint that could be easily adopted in many other international markets. If you're not up to speed on DPP, then I recommend having a good read of our white paper and then checking out the DPP website (the URL is in the white paper).
Why do we all need Broadcast IT Training?
When I'm standing in front of 100 engineers all expecting words of wisdom from me, it gives me a few moments to reflect on the fact that there are many individuals in the audience who know a LOT more than me about the subjects that I am about to deliver. In fact, it's remarkable that most lectures I give are to people who already have a lot of knowledge!

The most recent lecture series was delivered on behalf of SMPTE for the SMPTE regional seminars on the topic of file-based working. We covered a large range of topics including:

  • video basics
  • file basics
  • transfer basics
  • database and identification basics
  • how to glue workflows together
  • how to optimise transcodes

Everyone in the audience learned something, and EVERY INSTRUCTOR learned something as a result of the Q&A sessions. In many ways it was disappointing that there were only 100 engineers listening. It was obvious from the audience that the information covered was vital to the running of their media business.

In a world where the business rules are constantly changing and we need to use technology to keep our businesses running, the most valuable resource in a media company is still the people. The people who understand VIDEO and AUDIO and METADATA and STORAGE and NETWORKS and DATABASES and SYSTEMS ADMIN and the BUSINESS are like gold dust and command both respect and decent salaries. It came as a surprise, therefore, that one of the SMPTE seminars had to be cancelled because only 9 people had registered.

I learn a lot by sharing my knowledge with others, and I often feel that I need a bigger brain to hold all the facts inside. I hope you gain knowledge and maybe a little wealth from the knowledge shared in the AmberFin Academy. Great to have you on-board!
How to Maximize your time in Sports Bars and Airport Lounges with Closed Captions
If, like me, you spend far more time in airports than is good for you, then you will be familiar with the television sets dotted around the lounges, largely silent but with the subtitles or closed captions on. Usually tuned to a news program, the captions themselves become hypnotic, and you cannot help but read them. I’m told that the same thing also happens in sports bars, but obviously I have far less practical knowledge of such establishments myself. It seems that the mere appearance of the words forces you to read.

This phenomenon was first formally observed by Brij Kothari, an Indian then studying at Cornell in the States. He was trying to learn Spanish, but the local cinemas that showed films from Spain put English subtitles on them, which made it much harder to hear the original language. He realized that if they had Spanish captions it would be much easier to learn the language, with the written script reinforcing the sound of the spoken words. “Then it occurred to me that if all Indian television programming in Hindi was subtitled in Hindi, India would become literate faster,” recalled Kothari, now a professor at the Indian Institute of Management.

Today one of the most popular programs on Indian television is the Sunday night sing-along: Bollywood hits with same-language subtitles. Not only do people read, listen, sing and learn, but children copy the lyrics down so they can sing them with their friends later. This karaoke-for-literacy effort reaches 200 million viewers a week. In the last nine years, functional literacy in the areas covered has more than doubled. A researcher focusing on one particular town found that newspaper reading has risen by more than 50%, so the population is better informed. Women are now capable of reading bus timetables, so social mobility is boosted. Literacy is liberating in so many ways.

Now here is the killer message: this does not only work in developing countries.
Research in the USA by Nielsen’s ORG Center for Social Research found that same-language subtitling doubles the number of functional readers among primary school children. Across the developed world there is a huge number of adults who, while not illiterate, cannot read fluently. According to the World Literacy Foundation, one in five adults in the UK struggles with basic reading. If they do not feel they can pick up a newspaper, or read a bus timetable, they cannot take a full role in society.

I’m not suggesting that same-language subtitling of every MTV broadcast is the complete solution. But it looks like it would help, and with today’s technology it is a low-cost win. Literate viewers are more responsive to advertising, too, so there are potential returns.

So next time you find yourself in an airport, or even a bar, remember that same-language subtitling is not just for those who cannot hear the words, for whatever reason. It could – and should – be changing people’s lives.
Bringing archives into your news workflows
In modern newsrooms, speed and accuracy are everything. The difference between breaking a news story and being an “also-ran” can come down to the efficiency of your workflow in the newsroom. The ability to capture, store and find information is central – the quality of your journalists’ output relies heavily on their ability to locate and obtain relevant background information for the story that they are writing. In this environment, the sophistication of your archive and its ability to offer up its treasures effectively can be the difference you are striving for.

So, an airplane crashes and your ENG crew are first on the scene – the reporter needs relevant information fast. What is the safety record of this particular aircraft? When was the last time an aircraft crashed leaving this airport? What was the weather like at the time of the crash? This is the kind of information that turns a sketchy scene-of-the-crash report into a well-researched and insightful news story.

But within the constantly changing landscape of the newsroom it is not so easy to maintain your archive. Do you find the task of indexing material difficult and time-consuming in your newsroom? Do you struggle to find content that you know exists, but lack the time and resources to locate it within the timescale of news? If you fall into either or both of these traps you are in good company – in our experience at Dalet these are very common problems.

When is the right time to archive your news content?

Traditionally, archiving used to be done after broadcast. Stories and related videos were indexed and then archived after they had been aired. But new techniques are evolving that bring archives into the heart of the news production workflow. This means starting the archiving process during media ingest! Many of our customers have developed a small team of "media coordinators".
This team monitors everything that enters the workflow: they add time-coded tags to any relevant content at the earliest possible stage. This automatically creates a clip or a sequence of clips that can then be quickly identified and directed to a journalist or a group of journalists as they work on a news story. Straight away, this valuable metadata will travel alongside the content throughout the entire life cycle of the story, right up to the archive. This is a key concept, which we call metadata inheritance. Put simply, it means that you don't have to start over and enter metadata from scratch each time.

But there is much more to it. With a story-centric workflow, this approach enables users to aggregate a diversity of information sources into the same news story quickly and easily:

  • Related wires and video material
  • Different versions of the story that evolved over time – for example, the plane crash will certainly generate a lot of stories with new elements, but it is essentially the same story
  • Associated scripts, graphic objects, web links and metadata coming from dope sheets
  • Broadcast rights information
  • GPS information
  • Occasionally, metadata entered manually by journalists

All of this information is generated as part of the production workflow, so when it comes to the archive, the challenge centres on the capture of high-resolution content on a cost-effective storage medium (traditionally a tape library, though these days more and more storage is in a public or private cloud). The proxy video remains online. The metadata is there, and it is essential since it makes the content searchable. Within a typical newsroom environment there is little or no time available to re-index each story. If this process can be automated and integrated within the main workflow, it offers major time and cost savings compared with manual indexing!
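The metadata inheritance idea can be sketched in a few lines of Python: tags applied once at ingest travel with every clip and story derived from the source material, so nothing is re-entered downstream. The class and field names here are purely illustrative and are not Dalet's API:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """A media item whose metadata is inherited from its parent asset."""
    name: str
    tags: dict = field(default_factory=dict)
    parent: "Asset | None" = None

    def metadata(self):
        # Merge inherited tags with local ones; local entries win on clash.
        inherited = self.parent.metadata() if self.parent else {}
        return {**inherited, **self.tags}

# Media coordinators tag the raw ingest once...
ingest = Asset("plane_crash_ingest", {"location": "airport", "topic": "crash"})
# ...and every clip or story cut from it inherits those tags for free.
clip = Asset("clip_01", {"tc_in": "00:01:23:00"}, parent=ingest)
story = Asset("evening_bulletin", {"version": "2"}, parent=clip)

print(story.metadata()["topic"])  # crash -- entered once, at ingest
```

The key design point is that the tags live on the chain of assets, not on each copy: correct the source record and every derived story sees the fix.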
And what's next? Harness the power of the semantic web

Once you have put an infrastructure in place that is capable of organizing metadata, managing media and providing search tools (it is what we call a MAM at the end of the day), then you can think further ahead – the creative possibilities are considerable. Proposed by Sir Tim Berners-Lee back in 2001, the semantic web is a web of data that can be processed directly and indirectly by machines with very little human intervention. It requires clever technology underpinning the data mining, but from a user’s point of view it presents a way to identify and manage complex relational links between broadcast assets and information in a simple and readily understandable fashion.

Semantic web tools make it easy to explore and correlate multiple sources of information, to evaluate what they are finding, and to explore links and lines of development that may never otherwise appear. They enable broadcasters to make sense of what was previously unorganized metadata. Using this technology you can organize your media assets into topic groups, and gain a far better understanding of the personalities related to a story. And importantly, you can easily relate different stories to each other.

At Dalet, we are working on ways to integrate this technology within newsrooms and other production environments. The starting point for this process is simplifying and automating the essential metadata entry process – we are already doing this with many of our customers. If you believe that it’s time to turn up the burners on your newsroom archive systems, then we would love to talk to you and explain how Dalet can make that process easier and more productive. Contact your local Dalet sales office today and let’s kick off that conversation.
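As a flavour of how semantic-web tooling relates assets, here is a toy subject-predicate-object triple store in Python. Real systems would use RDF and a SPARQL query engine, but the principle of discovering links between stories is the same; the subjects and predicates below are invented for illustration:

```python
# A toy set of subject-predicate-object triples linking news assets.
triples = [
    ("story:crash_2014", "mentions", "aircraft:A320"),
    ("story:crash_2014", "locatedAt", "place:heathrow"),
    ("story:engine_recall", "mentions", "aircraft:A320"),
    ("story:engine_recall", "topic", "safety"),
]

def related_stories(entity):
    """Find stories linked to the same entity -- e.g. the same aircraft."""
    return sorted({s for s, p, o in triples
                   if o == entity and s.startswith("story:")})

print(related_stories("aircraft:A320"))
# ['story:crash_2014', 'story:engine_recall']
```

Because every fact is just another triple, new sources of metadata (rights, GPS, dope sheets) slot into the same graph and immediately become queryable alongside the editorial links.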