Recently I spoke at a symposium on media QC run by the ARD-ZDF Medien-Akademie and IRT in Munich, Germany. Andy Quested of the BBC, who spoke on behalf of the EBU, opened his presentation by asking how many of the 150 or so representatives of German-language broadcasters in the audience were actually using automated QC in their workflows:
Despite most of those in attendance having purchased and commissioned automated QC systems, the positive responses could be counted on one hand.
In a previous blog post I wrote about how automated QC systems were underutilized and suggested three simple steps that can be taken to reduce the number of QC errors in a typical workflow. Following up on that post, here is how the work of the UK’s Digital Production Partnership (DPP) and the EBU QC group reflects these suggestions.
Reducing the number of QC tests
When the EBU QC group started looking at automated QC tools, they counted a staggering 471 different QC tests. By rationalizing the differently named or similar tests and removing those deemed unnecessary, the list was whittled down and turned into the periodic table of QC – now containing just over 100 different tests. This is still a large number, so the DPP has reduced it to a list of about 40 critical tests for file delivery. The failure actions for these tests have also been identified, as either absolute requirements (must pass) or technical and editorial warnings.
QC test visualization
Each test in the EBU periodic table of QC has been categorized into one of four groups:
- Regulatory – this means making sure that the media conforms to regulations or legislation such as the CALM act in the US or EBU R128 in Europe. A failure here may not actually mean that the quality of the media is poor.
- Absolute – physical parameters that can be measured against a published standard or recommendation.
- Objective – this refers to parameters that can be measured, but for which there is no published standard to describe what is or isn’t acceptable. Often, pass/fails in this category will require human judgment.
- Subjective – this refers to artifacts in video and audio that require human eyes and ears to detect.
These last two categories in particular require QC events to be presented to operators in a way that allows effective evaluation.
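To make the classification above concrete, here is a minimal sketch of how the four categories and the DPP-style failure actions might be modeled in code. The class and test names are hypothetical illustrations, not part of any EBU or DPP specification:

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    REGULATORY = "regulatory"  # conformance to legislation, e.g. CALM Act, EBU R128
    ABSOLUTE = "absolute"      # measurable against a published standard
    OBJECTIVE = "objective"    # measurable, but no published pass/fail threshold
    SUBJECTIVE = "subjective"  # needs human eyes and ears to detect

class FailureAction(Enum):
    MUST_PASS = "must pass"    # absolute requirement for file delivery
    TECHNICAL_WARNING = "technical warning"
    EDITORIAL_WARNING = "editorial warning"

@dataclass
class QCTest:
    name: str
    category: Category
    failure_action: FailureAction

def needs_human_review(test: QCTest) -> bool:
    # Objective and subjective results must be presented to an operator
    # for effective evaluation, as noted above.
    return test.category in (Category.OBJECTIVE, Category.SUBJECTIVE)

loudness = QCTest("Loudness (EBU R128)", Category.REGULATORY, FailureAction.MUST_PASS)
blockiness = QCTest("Blockiness", Category.SUBJECTIVE, FailureAction.EDITORIAL_WARNING)

print(needs_human_review(loudness))    # False
print(needs_human_review(blockiness))  # True
```

A structure like this makes it easy to route automated results: must-pass failures block delivery, while objective and subjective events are queued for an operator.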
EBU focuses on how to QC the workflow
The work of the EBU group is ongoing. Having defined a common set of QC tests and categories, it is now focused on QC workflows and on developing KPIs (Key Performance Indicators) that will show exactly how efficient media workflows are with regard to QC.
This is a key area and one where the EBU is well positioned to see this initiative come to fruition. As the EBU has stated, “Broadcasters moving to file-based production facilities have to consider how to use automated Quality Control (QC) systems. Manual quality control is simply not adequate anymore and it does not scale.”
The EBU recognized QC as a key topic for the media industry in 2010, and in 2011 it started an EBU Strategic Programme on Quality Control, with the aim of collecting requirements and experiences and creating recommendations for broadcasters implementing file-based QC in their facilities.
I left Munich with the clear impression that real momentum is being generated by organizations such as the EBU and DPP in the field of media quality control. It is reassuring when you see that what you have been advising customers for years is supported by leading broadcast industry bodies – QC is key!
I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?