
Sep 24, 2014
Bruce's Shorts Goes Live at NAB

If you've enjoyed our weekly broadcast IT training video shorts emails and monthly in-depth workflow webinars, or if you want to find out what Bruce's Shorts is all about, come and join us on booth SU 8505 for Bruce's Shorts Live from the show floor.

Every day of the show, we will be hosting a series of 10-minute presentations on some of the industry's biggest topics, including Quality Control, Captions and Enterprise Workflows.

Rest assured, these are not sales pitches or product demos. These presentations will be lively, informative, jargon-free and non-vendor-specific workflow training sessions brought to you by the AmberFin Academy.

Win an iPad mini!

If you attend any of the training sessions on Monday, Tuesday or Wednesday, you will be entered into a draw to win an iPad mini! All you need to do is enter your details on a special card and come back at 5pm to see if you've won!

(Detailed terms and conditions available at: https://www.amberfin.com/NAB2013/shorts-competition-terms-and-conditions)

Schedule of presentations

10.30 Quality Control: Turning a cost into a business benefit

11.30 Unified QC with DigiMetrics Aurora and AmberFin workflow

12.30 Enterprise Workflow: The secrets of efficiency and profitability

13.30 Captions: How to keep workflows simple and comply with FCC Regulations

14.30* Quality Control: Turning a cost into a business benefit

15.30* Enterprise Workflow: The secrets of efficiency and profitability

16.30* Captions: How to keep workflows simple and comply with FCC Regulations

17.00* Live Draw. You must be present on the AmberFin Booth at 5pm today to collect your prize.

* Because the show closes at 2pm, the afternoon sessions and iPad draw will not take place on Thursday 11th.

If you are not going to NAB, don’t worry: you can still win an iPad mini, as we will be running an extra competition for Bruce’s Shorts subscribers after the show. But you must be a subscriber to win, so why not click on the box below and sign up now?


YOU MAY ALSO LIKE...
An IBC preview that won’t leave you dizzy
When we write these blog entries each week, we normally ensure we have a draft a few days in advance so there is plenty of time to review, edit and make sure the content is worth publishing. This entry was late, very late. This pre-IBC post has been hugely challenging to write for two reasons: drone-mounted Moccachino machines are not on the agenda – but Bruce’s post last week definitely has me avoiding marketing “spin” – and there are so many things I could talk about that it’s been a struggle to determine what to leave out.

Earlier this year, at the NAB Show, we announced the combination of our Workflow Engine, including the Business Process Model & Notation (BPMN) 2.0-compliant workflow designer, and our Dalet AmberFin media processing platform. Now generally available in the AmberFin v11 release, we’ll be demonstrating how customers are using this system to design, automate and monitor their media transcode and QC workflows in mission-critical multi-platform distribution operations.

Talking of multi-platform distribution, our Dalet Galaxy media asset management now has the capability to publish directly to social media outlets such as Facebook and Twitter, while the new Media Packages feature simplifies the management of complex assets, enabling users to see all of the elements associated with a specific asset, such as different episodes, promos etc., visually mapped out in a clear and simple way.

Making things simple is somewhat of a theme for Dalet at IBC this year. Making ingest really easy for Adobe Premiere users, the new Adobe Panel for Dalet Brio enables users to start, stop, monitor, quality check and ingest directly from the Adobe Premiere Pro interface, with new recordings brought directly into the edit bin. We’ll also be demonstrating the newly redesigned chat and messaging module in Dalet Galaxy, Dalet WebSpace and the Dalet On-the-Go mobile application. The modern, and familiar, chat interface has support for persistent chats, group chats, messaging offline users and much more.

Legislation and consolidation of workflows mean that captioning and subtitling are a common challenge for many facilities. We are directly addressing that challenge with a standards-based, cross-platform strategy for the handling of captioning workflows across Dalet Galaxy, Dalet Brio and Dalet AmberFin. With the ability to read and write standards-constrained TTML, caption and subtitle data is searchable and editable inside the Dalet Galaxy MAM, while Dalet Brio is able to capture caption- and subtitle-containing ancillary data packets to disk and play them back. Dalet AmberFin natively supports the extraction and insertion of caption and subtitle data to and from the .SCC and .STL formats respectively, while tight integration with other vendors’ tools extends that support further.

There are so many other exciting new features I could talk about, but it’s probably best to see them for yourself live in Amsterdam. Of course, if you’re not going to the show, you can always get the latest by subscribing to the blog, or get in touch with your local representative for more information. There, and I didn’t even mention buzzwords 4K and cloud… …yet!
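(A quick aside on the TTML point above: because standards-constrained TTML is plain XML, even a few lines of scripting can pull caption text and timings out of a document. The sketch below is purely illustrative – it is not Dalet or AmberFin code, and it assumes a simple TTML file with begin/end attributes on each paragraph element.)

```python
# Minimal sketch: list caption text and timing from a simple TTML file.
# Illustrative only; assumes begin/end attributes directly on each <p>.
import xml.etree.ElementTree as ET

TT_NS = "{http://www.w3.org/ns/ttml}"

def list_captions(path):
    root = ET.parse(path).getroot()
    for p in root.iter(TT_NS + "p"):
        # itertext() flattens any nested <span> elements into plain text
        text = " ".join("".join(p.itertext()).split())
        yield p.get("begin"), p.get("end"), text

for begin, end, text in list_captions("captions.ttml"):
    print(f"{begin} --> {end}  {text}")
```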
An Amsterdam Education! … No, Not That Type of Education
Maybe it’s a result of having two teachers as parents, but I am passionate about education and, particularly, education in our industry. Technology and innovation move forward so fast in our business that even as a seasoned industry professional it can sometimes be tricky to keep pace. That’s why I’m so excited to be doing something a little different with the Dalet Theater at IBC this year – here’s what we’ve got going on.

Dalet @ IBC: One of the primary reasons for visiting the IBC Show is to find out what’s new. Each morning, about an hour after the show opens, we will host a short presentation to explore all the key announcements that Dalet is making at IBC. Whatever your reasons for visiting IBC, this is a great opportunity to find out what’s new.

Bruce’s (Orange) Shorts: After a short break, Bruce Devlin (aka Mr. MXF) will be back on stage to preview a brand new series of Bruce’s Shorts, due out later this year. Every day at 13:00 and 16:00 Bruce will present two short seminars on new technologies and trends.

Partners with Dalet: Across the globe, Dalet works with a number of distributors and resellers who package Dalet solutions and applications with other tools to meet the needs of their geographies. We’ve invited some of our partners to talk about how they’ve used Dalet and other technologies to address the needs of their regions (12:00).

Product Focus: If you want to know a little bit more about Dalet products and give your feet a bit of a rest, at 14:00 each day we’ll be focusing in on part of the Dalet portfolio. Click here to see what’s on when!

Case Studies: There’s no better way to learn than from someone else’s success. We will feature a number of case studies at 15:00, followed by Q&A, based on the most cutting-edge deployments of the past year.

Dalet Keynote: The big one… each day of the show (Friday through Monday), at 17:00, we’ve partnered with industry giants, including Adobe, Quantum and others, to bring you Dalet Keynotes, which will focus on the biggest challenges facing our industry today. There will also be some light refreshments and an opportunity to network with speakers and peers after the presentation. We’re expecting standing-room-only for the Dalet Keynote sessions, so register your interest (Dalet+Adobe; Dalet+Quantum) and we’ll do our best to save you a seat.

It’s going to be an amazing lineup with something for everybody – be sure to check the full Dalet Theater schedule and stop by the stand during the show for the latest additions and updates. Of course, if you want to talk one-on-one with a Dalet solutions expert or have an in-depth demo tailored to your requirements, you can click here to book a meeting with us at the show. We'll be in hall 8, stand 8.B77. We can’t wait to see you there – but if you’re more of a planner and want to know what to expect elsewhere on the Dalet stand, visit our dedicated IBC page on the Dalet website. Who knows, you might even stumble across some intriguing bits of information or a clue (or two) for what we might be announcing at the show (hint, hint!). We’re looking forward to seeing you in Amsterdam! Until then…
Pictionary, Standards and MXF Interoperability
Four weeks ago, I posted in this blog about the IRT MXF plugfest, the new MXF profiles that were published in Germany this year by the ARD and ZDF, and how these new profiles would bring forth a new era in interoperability. This week, the first results of that plugfest and reaction from some of the end users and vendors were presented at a conference on file-based production also hosted by the IRT in Munich. As usual, the results were fascinating.

As with all statistics, they could be manipulated to back up any point you wanted to make, but for me there were a couple of highlights. First, as mentioned in my last post, this was the 9th such MXF plugfest, and therefore we have a good historic dataset. Comparing previous years, there is an obvious and steady increase in both the interoperability of exchanged files and also compliance with MXF specifications. For most of the codecs and variants tested by the 15 vendors who took part, over 90% of files are now exchanging successfully (up from 70-80% five or more years ago). In one case, the new ARD-ZDF MXF profile HDF03a, 100% of the files submitted interchanged successfully. Quite interestingly, the same files all failed a standards compliance test using the IRT MXF analyser. This highlights one of the difficulties the industry faces today with file interoperability, even with constrained specifications such as the AMWA Application Specifications and ARD-ZDF MXF profiles.

The IRT MXF analyser categorises test results as pass, fail, or with warning. It is notable that all files with MPEG-2 essence (e.g. XDCAM HD) either failed or had warnings, while AVC-Intra and DNx files each had a significant number that “passed.” However, when it came to interoperability, the differences between the different codecs were much less obvious. One theory would be that because MPEG-2 in MXF is the oldest and most widely used MXF variant, it has resulted in a near de facto standard that enables a reasonably high degree of interoperability – despite the fact that most of these files are not compliant with specifications.

I mentioned in my previous post that the new ARD-ZDF profiles have accommodated this deviation from specification in legacy files by specifying broader decode parameters than encode parameters. This was the focus of my presentation at the conference this week, illustrated through the use of children’s toys and the game of Pictionary. However, the additional decoder requirements specified are not without issue. For example, it’s certainly impractical, if not impossible, to test all the potential variations covered by the broader decoder specification, given that it would be difficult to find test sources that exercise all the possible combinations of deviation from the encoder specification. In another area, while the profile says that the decoder should be able to accommodate files with ancillary data tracks, there is no guidance as to what should be done with the ancillary data, should it be present. As a vendor, that’s particularly problematic when trying to build a standard product for all markets, where the requirements in such areas may vary by region.

Overall though, while there are improvements that can, and will, be made, it’s clear that for vendors and end users alike the new profiles are a big step forward, and media facilities in Germany are likely to rapidly start seeing the benefit in the next 6-12 months. Exciting times lie ahead.
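(To make the compliance-versus-interoperability distinction above concrete, here is a toy tally in Python. The figures and field names are invented for illustration – this is not the actual plugfest dataset.)

```python
# Toy sketch of the distinction the plugfest results highlight: a file can
# interchange successfully while still failing strict standards compliance.
from collections import defaultdict

results = [
    # (codec, analyser verdict: "pass"/"fail"/"warning", interchanged OK?)
    ("MPEG-2/XDCAM HD", "fail", True),
    ("MPEG-2/XDCAM HD", "warning", True),
    ("AVC-Intra", "pass", True),
    ("DNxHD", "pass", False),
]

stats = defaultdict(lambda: {"files": 0, "compliant": 0, "interchanged": 0})
for codec, verdict, ok in results:
    s = stats[codec]
    s["files"] += 1
    s["compliant"] += verdict == "pass"   # booleans add as 0/1
    s["interchanged"] += ok

for codec, s in stats.items():
    print(f'{codec}: {s["interchanged"]}/{s["files"]} interchanged, '
          f'{s["compliant"]}/{s["files"]} fully compliant')
```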
3 ways to fix QC Errors – Part 2 – What the DPP is doing about QC
Recently I spoke at a symposium on media QC run by the ARD-ZDF Medien-Akademie and IRT in Munich, Germany. Andy Quested of the BBC, who spoke on behalf of the EBU, opened his presentation by asking how many of the 150 or so representatives of German-language broadcasters in the audience were actually using automated QC in their workflows. Despite most of those in attendance having purchased and commissioned automated QC systems, it was possible to count those responding positively on one hand.

In a previous blog post I wrote about how automated QC systems were underutilized and suggested three simple steps that can be taken to reduce the number of QC errors in a typical workflow. In following up on that post, here is how the work of the UK’s Digital Production Partnership (DPP) and the EBU QC group reflects these suggestions.

Reducing the number of QC tests: When the EBU QC group started looking at automated QC tools, they counted a staggering 471 different QC tests. By rationalizing the differently named or similar tests and removing those deemed unnecessary, the list was whittled down and turned into the periodic table of QC – now containing just over 100 different tests. This is still a large number, so the DPP has reduced this to a list of about 40 critical tests for file delivery. The failure actions for these tests have also been identified as either absolute requirements (must pass) or technical and editorial warnings.

QC test visualization: Each test in the EBU periodic table of QC has been categorized into one of four groups:

Regulatory – making sure that the media conforms to regulations or legislation such as the CALM Act in the US or EBU R128 in Europe. A failure here may not actually mean that the quality of the media is poor.

Absolute – physical parameters that can be measured against a published standard or recommendation.

Objective – parameters that can be measured, but for which there is no published standard to describe what is or isn’t acceptable. Often, pass/fails in this category will require human judgment.

Subjective – artifacts in video and audio that require human eyes and ears to detect.

These last two categories in particular require the QC events to be presented to operators in a way that allows effective evaluation to be made.

EBU focuses on how to QC the workflow: The work of the EBU group is ongoing and, having now defined a common set of QC tests and categories, the current and future work is focused on QC workflows and developing KPIs (Key Performance Indicators) that will demonstrate exactly how efficient media workflows are with regard to QC. This is a key area and one where the EBU is well positioned to see this initiative come to fruition. As the EBU has stated, “Broadcasters moving to file-based production facilities have to consider how to use automated Quality Control (QC) systems. Manual quality control is simply not adequate anymore and it does not scale.” The EBU recognised QC as a key topic for the media industry in 2010, and in 2011 it started an EBU Strategic Programme on Quality Control, with the aim of collecting requirements and experiences and creating recommendations for broadcasters implementing file-based QC in their facilities.

I left Munich with the clear impression that real momentum is being generated by organizations such as the EBU and DPP in the field of media quality control.
It is reassuring when you see that what you have been advising customers for years is supported by leading broadcast industry bodies – QC is key! At AmberFin, QC has been a passion of ours for many years. To understand our approach to this critical component of file-based workflows, why don’t you download our free white paper on the issue? I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
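(For readers who like to see structure in code, here is a minimal sketch of the four EBU categories above as a data model. The category definitions come from the post; the example tests, and the idea of flagging which results need operator review, are illustrative assumptions.)

```python
# Sketch of the EBU periodic-table categories as a simple data model.
from enum import Enum

class QCCategory(Enum):
    REGULATORY = "conformance to legislation (e.g. CALM Act, EBU R128)"
    ABSOLUTE   = "measurable against a published standard"
    OBJECTIVE  = "measurable, but pass/fail needs human judgment"
    SUBJECTIVE = "needs human eyes and ears to detect"

# Hypothetical test names, mapped to their category
TESTS = {
    "loudness_r128": QCCategory.REGULATORY,
    "video_bit_depth": QCCategory.ABSOLUTE,
    "black_frame_ratio": QCCategory.OBJECTIVE,
    "compression_artifacts": QCCategory.SUBJECTIVE,
}

def needs_operator_review(test_name):
    """Objective and subjective results should be presented to an operator."""
    return TESTS[test_name] in (QCCategory.OBJECTIVE, QCCategory.SUBJECTIVE)

print(needs_operator_review("black_frame_ratio"))  # True
```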
When is a workflow not a workflow? Can airports learn from modern media workflows?
Passing through Frankfurt airport last week, I was reminded of the chaos at Amsterdam Schiphol airport when returning from IBC earlier this year. Like many airports, Frankfurt and Schiphol have replaced friendly-faced check-in clerks with automated check-in and bag drop. As visitors returning from the conference and exhibitions queued up to use the shiny new automated bag drop, what started as friendly chatter about the previous five days’ events turned to increasingly vocal demonstrations about the delays the new system was causing. The delays were largely caused by bags that slightly exceeded the weight or size limits, or were simply the wrong shape to fit the uniform dimensions of the drop-off – problems that a small amount of human judgment would have easily resolved. Eventually, a large team of KLM staff was dispatched to the scene to calm the mounting insurrection, help reduce the increasing delays and ensure people caught their flights.

Workflow automation does not always increase efficiency and throughput: It seems mad that a system billed as expediting the check-in process for customers and reducing costs for the airline actually had the opposite effect – but we are in danger of doing something very similar in the media industry. From the airline's perspective, the process of checking in a passenger and their baggage is actually very similar to the process of ingesting media. Before online check-in and automated bag drops, a check-in clerk would have verified a passenger's ID, issued their boarding pass, asked the appropriate security questions and weighed and checked their baggage.

Can we replace men with machines in media workflows? In a traditional ingest scenario we would have taken a tape, placed it in a VTR, visually verified the content and checked that it was successfully written to disk. Whether or not QC was formally a part of ingest, a human operator was likely to be interacting in some way with the media and able to apply judgment as to whether there was any issue with it. With automation in media systems as advanced as it is, it is possible to pass media through a workflow without a human ever viewing it end-to-end. Much like in an airport, if everything about the passengers and their baggage is within the defined constraints, the process will be quick and efficient – issues only arise when there is an exception: when the passenger’s bag is a kilo overweight, or the media file fails an automated QC.

Combining automation with a human touch: The challenge we have to face in the media industry, as file-based delivery increases and SDI disappears, is how we handle these exceptions in the workflow in a fast and effective way, combining automation with the human touch to ensure the quality of our output. In order to do this, we need to unify manual and automated QC through a single interface that enables users both to make judgments on automated measurements and to add commentary to QC reports. Taking this approach ensures that media “failed” by automated QC can quickly move on (or back) in the workflow, and where an error has been “over-ruled” by a human, the certificate of trust can follow the content. Once trusted, the media should pass through the rest of the workflow without issue before flying off into the sunset. At AmberFin, we have learned that whilst automation is good, there is still an important place for human intervention in media workflows.
I can’t help wondering how long it will take – and how many travelers’ journeys will be affected – before the airlines come to the same conclusion. If you would like to learn more about AmberFin’s unique approach to enterprise-class workflow automation, why not get in touch? I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
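(As a thought experiment, the "human touch" idea above can be sketched in a few lines: an automated failure is only final once an operator has ruled on it, and that ruling travels with the media. The class and field names below are invented for illustration, not an AmberFin API.)

```python
# Sketch: automated QC events plus human over-rules, travelling with the media.
from dataclasses import dataclass, field

@dataclass
class QCEvent:
    test: str
    verdict: str                     # "pass" or "fail" from the automated tool
    operator_override: str = None    # e.g. "accepted: marginal, not visible"

@dataclass
class QCReport:
    media_id: str
    events: list = field(default_factory=list)

    def trusted(self):
        """Media is trusted once every automated failure has a human decision."""
        return all(e.verdict == "pass" or e.operator_override
                   for e in self.events)

report = QCReport("episode_042")
report.events.append(QCEvent("audio_peak", "fail"))
print(report.trusted())     # False: a human still needs to rule on the failure
report.events[0].operator_override = "accepted: normalised upstream"
print(report.trusted())     # True: the "certificate of trust" follows the content
```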
Three steps to QC heaven – Practical hints to fix file-based QC errors
As previously written about on this blog, automated Quality Control (QC) within file-based production facilities has been regarded as a key issue for a number of years. Back in 2010, the EBU recognized QC as a key topic for the media industry and has subsequently stated that manual quality control processes are simply not adequate anymore and do not scale. So, you could be forgiven for thinking that this would have heralded a boom period for QC tool manufacturers. However, if you look at this market more carefully, that prediction does not appear accurate. Following impressive launches and demonstrations at NAB and IBC in 2006, the potential savings in op-ex and gains in efficiency that automated QC tools offered grabbed the attention of budgeting and planning teams in media facilities worldwide. But nearly eight years later, and despite some really significant advances in the functionality, accuracy and performance of these tools, sadly, many of the automated QC tools bought and installed lie dormant or, at best, under-utilized. The most frequently given reason for this is simply that the systems would generate so many errors across so many metrics that it was nearly impossible for a piece of media to successfully pass. At AmberFin, we hate waste and love efficiency, so here are three simple steps to fix QC errors and make the best use of your automated QC.

1. Turn off the QC tests. No, really! Perhaps not all of them, but work out which ones are actually going to identify real problems downstream in the workflow or presentation of the media, and turn off the remainder. Just last week we were talking to a customer who was having problems with every piece of media failing QC due to audio peak levels. Clearly, there could have been an issue here, but the previous step in the workflow was to normalize the audio to meet EBU R128 loudness specifications, which it did – so the peak level errors were not only spurious, but the test itself unnecessary.

2. Visualize it! If you take the event data generated by an automated QC and present it in a clear, interactive way, it becomes much quicker and easier for operators to make sound judgments and distinguish real errors from marginal issues or “false positives” / “false negatives”. This is why AmberFin created UQC and uses it to validate our own ingest and transcode tools in iCR. The timeline gives a clear view of any problems detected and, alongside video and audio playback, makes it considerably faster and more efficient to identify genuine problems.

3. QC the workflow. Toyota gained a reputation for building high-quality cars at a low price. Their QC process did not involve a single gigantic QC operation at the end of the production line. They implemented a production system where the processes themselves were checked – the theory being that if you start with the right input and have the right processes, then the output will also be right. We can implement the same idea in media workflows by identifying issues introduced in the workflow and fixing the workflow, rather than fixing individual items of media. This should, in turn, reduce the number of error events reported by automated QC tools and further increase efficiency.

Don't let your automated QC tool sit idle! If you have an automated QC tool sitting idle and unloved, why not try these three easy steps to get closer to those promised savings and gains.
If you are still trying to get your head around this important issue, then you can learn a great deal if you download AmberFin’s QC white paper – Unified Quality Control.
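(By way of illustration of step 1 above: a QC profile can be as simple as a table of switches, enabling only the tests that catch real downstream problems. The profile format below is invented for this post, not a real product configuration.)

```python
# Sketch of step 1 ("turn off the QC tests"): run only the tests that can
# catch a real downstream problem in this particular workflow.
QC_PROFILE = {
    "audio_peak_level": False,  # redundant: audio normalised to R128 upstream
    "loudness_r128":    True,   # contractual delivery requirement
    "black_frames":     True,
    "freeze_frames":    True,
    "gamut_legality":   False,  # a legaliser already runs earlier in the chain
}

def tests_to_run(profile):
    return [name for name, enabled in profile.items() if enabled]

print(tests_to_run(QC_PROFILE))
# ['loudness_r128', 'black_frames', 'freeze_frames']
```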
The sidecar is dead – long live the sidecar
The DPP (Digital Production Partnership) has “no intention of taking over Europe”, let alone the world; however, that has not stopped the world looking on with great interest and, in most cases, with great admiration. Models established for the UK media industry by the DPP will undoubtedly be adopted across the globe, and that makes the announcements made during the DPP event at IBC highly significant. Notable among the host of updates to the file delivery specification, which will be the preferred method of delivery to UK broadcasters from 1st October 2014, is, perhaps controversially, the deprecation of the XML sidecar.

The DPP Technical Standards for file delivery, and the AMWA AS-11 specification on which they are based, specify that the descriptive metadata shall be stored within the MXF media file. Previous versions of the DPP specification have also included the requirement for an XML sidecar carrying the same descriptive metadata, resulting in a duplication of the metadata. Removing the requirement for the XML sidecar greatly simplifies management and manipulation of the media, as the descriptive metadata is no longer stored in multiple locations. A single storage location for the metadata facilitates easier interchange and interoperability and reduces the risk of erroneous or incomplete metadata.

However, many file-based delivery operations have become dependent on XML sidecars to ‘register’ the receipt of media. This sidecar-driven registration of the media file is unlikely to go away for some time, but the inclusion of the DPP metadata within the media file itself means that the sidecar can become focused on transactional and operational metadata, e.g. QC (quality control) data, which has equal inherent value to descriptive metadata (in some cases having a direct relationship to revenue) but is of a much more transient nature. The perpetual nature of descriptive metadata means that its natural home is within the media file. Until such time as an infrastructure exists for the exchange of transactional metadata associated with the transfer of media files between facilities, the only practical home for this data is in a sidecar. For now at least, the sidecar lives on!
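(To illustrate, here is what a slimmed-down, purely transactional sidecar might look like, generated with standard Python tooling. The XML layout is invented for this post and is not a DPP-specified schema – the point is simply that the descriptive metadata no longer needs to live here.)

```python
# Sketch: with descriptive metadata embedded in the AS-11 MXF, the sidecar
# carries only transient transactional/QC data. Element names are invented.
import xml.etree.ElementTree as ET

sidecar = ET.Element("TransactionalSidecar", mediaFile="prog_123.mxf")
qc = ET.SubElement(sidecar, "QCResult", tool="ExampleQC", verdict="pass")
ET.SubElement(qc, "Operator").text = "jsmith"
ET.SubElement(sidecar, "Delivery", receivedAt="2014-09-24T10:30:00Z")

ET.ElementTree(sidecar).write("prog_123_sidecar.xml",
                              encoding="utf-8", xml_declaration=True)
```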
Five Things that you Need to Know about DPP
When organizations use the term ‘revolutionary’ to describe a concept, I find that it's normally a pretty good cue to turn off and move on to the next thing. Usually, puffed-up descriptions conceal a flaky or fundamentally compromised proposition. The first thing I can tell you about the Digital Production Partnership (DPP) is that it is not snake oil.

Efficient & cost-effective

DPP is a real-life platform that offers the potential for broadcasters and facilities of all types to revolutionize their production workflows. It enables organizations to adopt digital file-based workflows in ways that are both efficient and cost-effective. It enables organizations to adopt file-based technologies for intra- and inter-company media transfers. It finally consigns the ‘sneaker-net’ to the rubbish heap of history.

An industry-funded initiative
So, the next big question is who is behind DPP – is it the brainchild of some multi-national corporation, developed to encourage broadcasters to buy more kit? No, just the opposite. DPP is an initiative formed by the UK’s public service broadcasters to help producers and broadcasters maximise the benefits of digital production. The partnership is funded by the BBC, ITV and Channel 4, with representation from Channel 5, Sky, S4C and the independent sector on its working groups. DPP draws on industry experts from the worlds of technology and broadcast production to help fulfil its remit.

Building on the success of the Material Exchange Format (MXF)
Today, DPP is unique. It has taken all the hard work that went into creating the SMPTE MXF specification nearly 10 years ago and developed a set of Application Specifications for the UK industry that transform this technical standard into a real-world business platform. Looking at it from an international perspective, DPP is the first of these Application Specifications to receive national-scale adoption. I'm pretty certain that it won’t be the last.

DPP is already a winner
DPP has been successful in establishing a road map for digital production in the UK. It provides a framework that enables the UK industry to come together and share best practice in digital production, and helps producers and broadcasters maximize the potential of the digital revolution. Also, it leads the standardization of technical and metadata requirements within the UK, helping to ensure digital video content can be easily and cost-effectively distributed to audiences via multiple platforms.

Strong vendor support

DPP is supported by many of the leading broadcast technology vendors. At a recent DPP Vendor Day, I counted 13 manufacturers present in the room – all co-operating to develop a harmonized digital file-based working environment. At AmberFin, we’re proud to say that we are at the leading edge of this cross-industry co-operation. Already, we have introduced a family of new DPP-compliant media ingest, media transcode, playback and quality control products that will, for the first time, provide broadcasters and content owners with efficient, targeted and cost-effective production tools. At AmberFin, we whole-heartedly support the DPP initiative here in the UK. We believe it has the ability to transform the UK broadcast industry. Furthermore, we believe it provides a blueprint that could easily be adopted in many other international markets. If you're not up to speed on DPP, then I recommend having a good read of our white paper and then checking out the DPP website (URL is in the white paper).
Why do we all need Broadcast IT Training?
When I'm standing in front of 100 engineers all expecting words of wisdom from me, it gives me a few moments to reflect on the fact that there are many individuals in the audience who know a LOT more than me on the subjects that I am about to deliver. In fact, it's remarkable that most lectures I give are to people who already have a lot of knowledge! The most recent lecture series was delivered on behalf of SMPTE for the SMPTE regional seminars on the topic of file-based working. We covered a large range of topics, including:

* video basics
* file basics
* transfer basics
* database and identification basics
* how to glue workflows together
* how to optimise transcodes
* etc.

Everyone in the audience learned something, and EVERY INSTRUCTOR learned something as a result of the Q&A sessions. In many ways it was disappointing that there were only 100 engineers listening. It was obvious from the audience that the information covered was vital to the running of their media businesses. In a world where the business rules are constantly changing and we need to use technology to keep our businesses running, the most valuable resource in a media company is still the people. The people who understand VIDEO and AUDIO and METADATA and STORAGE and NETWORKS and DATABASES and SYSTEMS ADMIN and the BUSINESS are like gold dust and command both respect and decent salaries. It came as a surprise, therefore, that one of the SMPTE seminars had to be cancelled because only 9 people had registered. I learn a lot by sharing my knowledge with others, and I often feel that I need a bigger brain to hold all the facts inside. I hope you gain knowledge and maybe a little wealth from the knowledge shared in the AmberFin Academy. Great to have you on-board!
5 Reasons why we need more than ultra HD to save TV
If you were lucky (or unlucky) enough to get to CES in Las Vegas this year, then you will know that UHD (Ultra High Definition TV) was the talking point of the show. By and large, the staff on the booths were there to sell UHD TVs as pieces of furniture, and few of them knew the techno-commercial difficulties of putting great pictures onto those big, bright, curved(?) and really, really thin displays. In my upcoming webinar on the 29th January I will be looking into the future and predicting some of the topics that I think will need to be addressed over the next few years if TV as we know it is to survive.

1. Interoperability. The number of screens and display devices is increasing. The amount of content available for viewing is going up, but the number of viewers is not changing greatly. This means that we either have to extract more revenue from each user or reduce the cost of making that content. Having systems that don’t effectively inter-operate adds cost, wastes time and delivers no value to the consumer. Essence interoperability (video & audio) is gradually improving thanks to education campaigns (from AmberFin and others), as well as vendors with proprietary formats reverting to open standards because the cost of maintaining the proprietary formats is too great. Metadata interoperability is the next BIG THING. Tune in to the webinar to discover the truth about essence interoperability, and then imagine how much unnecessary cost exists in the broken metadata flows that exist between companies and between departments.

2. Interlace must die. UHD may be the next big thing, but just like HDTV it is going to have to show a lot of old content to be a success. Flick through the channels tonight and ask yourself, “How much of this content was shot & displayed progressively?” On a conventional TV channel the answer is probably “none”. Showing progressive content on a progressive screen via an interlaced TV value chain is nuts. It reduces quality and increases bitrate. Anyone looking at some of the poor pictures shown at CES will recognise the signs of demonstrations conceived by marketers who did not understand the effects of interlace on an end-to-end chain. Re-using old content involves up-scaling & deinterlacing existing content – 90% of which is interlaced. In the webinar, I’ll use AmberFin’s experience in making the world’s finest progressive pictures to explain why interlace is evil and what you can do about it. (There’s a toy illustration of the interlace problem at the end of this post.)

3. Automating infrastructure. Reducing costs means spending money on the things that are important and balancing expenditure between what is important today and what is important tomorrow. There is no point in investing money in MAMs and automation if your infrastructure won’t support it and give you the flexibility you need. You’ll end up redesigning your automation strategy forever. The folks behind xkcd.com explain this much more succinctly and cleverly than I could ever do. In the webinar, I’ll explain the difference between different virtualization techniques and why they’re important.

4. Trust, confidence & QC. More and more automation brings efficiency, cost savings and scale, but also means that a lot of the visibility of content is lost. Test and measurement give you the metrics to know about that content. Quality Control gives you decisions that can be used to change your Quality Assurance processes. These processes in turn allow your business to deliver media product that delivers the right technical quality for the creative quality your business is based on.
So here’s the crunch. The more you automate, the less you interact with the media, and the more you have to trust the metadata and pre-existing knowledge about the media. How do you know it’s right? How do you know that the trust you have in that media is founded? For example: a stranger walks up to you in the street and offers you a glass of water. Would you drink it? Probably not. If that person was your favourite TV star with a camera crew filming you – would you drink it now? Probably. Trust means a lot in life and in business. I’ll explore more of this in the webinar.

5. Separating the pipe from the content. If, like me, you’re seeing more grey hair appearing on the barber’s floor with each visit, then you may remember the good old days when the capture standard (PAL) was the same as the contribution standard (PAL) and the mixing desk standard (PAL) and the editing standard (PAL) and the playout standard (PAL) and the transmission standard (PAL). Today we could have a capture format (RED), a contribution standard (Aspera FASP), a mixing desk standard (HD-SDI), an editing standard (MXF DNxHD), a playout standard (XDCAM-HDSDI) and a transmission standard (DVB-T2) that are all different. The world is moving to IP. What does that mean? How does it behave? A quick primer on the basics will be included in the webinar.

Why not sign up below before it’s too late? Places are limited – I know it will be a good one. Register for our next webinar on Wednesday 29th January at 1pm GMT, 2pm CET, 8am EST, 5am PST OR 5pm GMT, 6pm CET, 12pm EST, 9am PST. ‘til next time. I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
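(As promised in point 2 above, here is a toy illustration of why interlace complicates the chain: each "frame" is really two fields sampled at different times, and a naive "bob" deinterlace just stretches each field back to full height. Real deinterlacers are far more sophisticated; this numpy sketch is only to make the mechanics visible.)

```python
# Toy sketch: split an interlaced frame into its two fields, then "bob"
# each field back to full frame height. Requires numpy.
import numpy as np

def split_fields(frame):
    """Split an interlaced frame (H x W) into top and bottom fields."""
    return frame[0::2], frame[1::2]

def bob(field):
    """Naive bob deinterlace: repeat each field line to restore full height."""
    return np.repeat(field, 2, axis=0)

frame = np.arange(8 * 4).reshape(8, 4)   # toy 8-line "frame"
top, bottom = split_fields(frame)
print(bob(top).shape)                    # (8, 4): two progressive frames
print(bob(bottom).shape)                 # recovered from one interlaced frame
```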
Thanksgiving - It's a marathon (and not just for Turkeys)
I love the Thanksgiving weekend. It’s the one time of the year when most of the USA decides to shut down and have a little leisure time. For me, it means that my inbox tends to drop by several hundred emails before the madness of the working week starts again.

I am sure that the native American turkey feels less good about the season. There are many reports of a shortage of fresh turkeys, and that despite this shortage, the average US consumer still prefers to spend $20 on a bulk-farmed bird instead of $300 on a free-range, hand-reared bird that has taken a year of natural living to reach its market weight. Interesting times for turkeys, but also for the media industry. Forcing 100x the “natural” number of turkeys through a farm may raise ethical concerns, but forcing 100x the “traditional” number of files through a media facility is now becoming the norm for many who have decided to take the plunge and go fully automated and file-based.

We know what happens to all the turkeys – there is an endless marathon that starts with the enormous roast bird on the table, followed by turkey soup, turkey sandwiches, turkey stew and baked Alaska with turkey (my son’s experimental cooking!). According to this excellent article on Yahoo, it would appear that the great American public is also in for a marathon this weekend. Many, many channels have decided that now the content is all digitised, it can be delivered in one great orgy of content where you can choose your channel, settle into your couch and then watch several complete series of your favourite comedy / drama / movie franchise / cooking show / documentary / … without ever having to move. Obviously there is the need to eat more turkey and visit the lavatory, but essentially it seems that the Thanksgiving weekend is a TV highlight of the year.

I am encouraged to see that many of the shows and many of the channels have used or are using AmberFin software to deliver these marathons. Our whole raison d’être is to allow a media company to quickly, efficiently and economically move large volumes of material from a cost centre (like an archive) to a profit centre (like a playout facility). Getting into the file domain and delivering resilient, reliable workflows is key to doing this profitably. If you’d like some free training on the fundamentals of enterprise workflow design (especially around transcoding), then please sign up for the next webinar in my Bruce’s Shorts series. I won’t pitch product at you, but I will tell you some of the theory and good practice around getting your business goals implemented in realistic timescales at low cost. Save me some turkey! ‘til next time. I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
4K, HDR, HFR, 3D, Internet - where does the future lie?
In my recent webinar, I outlined where I thought the future was going. I covered quite a lot of the technicalities and a little of the market dynamics. If you missed the webinar, then please sign up to request the recording.

It is interesting to me that one of the big drivers for 4K is the consumer electronics industry. Essentially, these hi-tech, covetable pieces of furniture are being used to drive the sensor size of the devices used to make films and TV shows. Compared to a decade ago, I feel the tail is starting to wag the dog quite violently. We're not doomed, though. Over the last couple of years, there has been an increasingly vocal group of expert individuals and companies that I respect who have been talking in detail about HFR (High Frame Rate), HDR (High Dynamic Range), 3D (not-quite-dead-yet), OTT (and its business models) and fractional frame rates (aaarrrggghh) in terms of the real problems that we're solving as an industry.

In an ideal world, our industry is an entertainment pipe that transfers great ideas from creative people to the consumer. It doesn't matter whether the genre is fiction, news, sports or other, but it does matter that the consumer sees value in the pipe. 4K will be wonderful if the compression scheme used gives enough bandwidth to see all the pixels. HFR will give better results for certain genres like sports and some documentaries, but may make other genres less immersive. HDR dramatically improves the signal-to-noise ratio of the transmission pipe and allows much greater viewing latitude for the furniture (sorry) screen makers. The camera folks at RED have put together a neat page that shows some of the issues. I don't think there is any one-size-fits-all technology that works for every genre all the time. Radio did not kill off the newspapers. Cinema did not kill off radio. TV killed neither radio nor cinema. The internet has, so far, not killed TV. I think we'll see increasing fragmentation on the distribution channel side and thus an increasing demand for "squeeze this HFR, HDR HD content into that 4K LFR channel and make it look good" pieces of software. This makes me happy, because that's what we set up AmberFin to do – make great video processing software that joins the economic uncertainty of distribution to the technical choices made in production. It would be nice, along the way, to prevent commercial drivers introducing unwanted and unnecessary technical degradation.

Fractional frame rates and film-cadence errors are my current bug-bear. We have just released our new adaptive cadence correction software in v9.7 of iCR. This performs an inverse telecine function to correct for inappropriate handling of cadence in a TV workflow (there’s a toy sketch of the idea at the end of this post). This is important because if you're going to put that content onto the web, or into 4K, or up-frame-rate it to 120fps at some distant time in the future, then the visibility of the plague of blended frames and mixed video-filmic degradations will be enhanced. I try not to wear my "sales hat" in these blog posts, but we do have a pre-NAB special offer on transcode nodes with this new high-quality cadence corrector, which has received rave reviews from our beta testers. Why not get in touch with your local sales rep or download the white paper to see why I think this is an important topic for today and for the future. 'til next time.
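(For the curious, the core idea behind cadence detection can be shown in a few lines: in 3:2 pulldown, a repeated field turns up in a fixed position in every group of five frames, so finding the repeating phase tells you how to reverse the telecine. The sketch below is a toy, not the adaptive algorithm shipped in iCR.)

```python
# Toy sketch of 3:2 cadence detection: comparing each frame's field with the
# previous frame's exposes a repeating 5-frame pattern whose phase tells an
# inverse-telecine stage which frames to drop or recombine.
def detect_32_cadence(field_matches, period=5):
    """field_matches[i] is True if frame i's field repeats frame i-1's.
    Returns the phase of the 3:2 cadence, or None if no stable pattern."""
    for phase in range(period):
        if all(match == ((i % period) == phase)
               for i, match in enumerate(field_matches)):
            return phase
    return None

# 3:2 pulldown: one repeated field in every group of five frames
print(detect_32_cadence([False, False, True, False, False,
                         False, False, True, False, False]))  # -> 2
```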