
Jul 30, 2019
France
Dalet Brio Provides a Clear Path to IP with SMPTE ST 2110
Flexible, high-density I/O platform enables controlled transition to IP; Passes JT-NM SMPTE ST 2110 testing

Controlled Transition to IP

The Dalet Brio I/O platform provides media organizations a clear path and controlled transition to IP with support for SMPTE ST 2110.

Dalet, a leading provider of solutions and services for broadcasters and content professionals, is providing media organizations a clear path and controlled transition to IP with support for SMPTE ST 2110 in the latest release of its Dalet Brio I/O platform. Supporting both SMPTE ST 2110 and SDI standard workflows, the high-density ingest and playout platform allows media facilities to invest in their future IP infrastructure without disrupting their current operation. The cornerstone of advanced, IP-ready media operations, Dalet Brio adapts to new production and distribution environments with advanced capabilities that manage ingest, transfers, and playout to and from a wide range of systems and devices. Its extensive IP support enables users to process a wide range of parallel streams including SMPTE ST 2110, ST 2022-2 and NDI for linear channels, and RTMP for digital platforms like Facebook Live, YouTube and Twitter.
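For readers unfamiliar with how ST 2110 streams are handled on a network, a receiver typically learns a stream's parameters from an SDP (Session Description Protocol) description. The fragment below is an illustrative sketch of what an SDP file for an uncompressed ST 2110-20 video stream can look like; the addresses, port, PTP clock identity and format values are invented for illustration and are not taken from Dalet documentation.

```
v=0
o=- 123456 11 IN IP4 192.168.1.10
s=Example ST 2110-20 video stream
t=0 0
m=video 20000 RTP/AVP 96
c=IN IP4 239.1.1.1/64
a=rtpmap:96 raw/90000
a=fmtp:96 sampling=YCbCr-4:2:2; width=1920; height=1080; depth=10; exactframerate=25; colorimetry=BT709; PM=2110GPM; SSN=ST2110-20:2017
a=mediaclk:direct=0
a=ts-refclk:ptp=IEEE1588-2008:00-11-22-33-44-55-66-77:0
```

Unlike SDI, where video, audio and ancillary data travel together on one cable, each ST 2110 essence is its own RTP stream described this way, which is what makes the flexible mixing of streams described above possible.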

“With the latest version of the Dalet Brio product, our customers will be able to easily mix SDI and SMPTE ST 2110 workflows, transitioning to full IP with confidence and, more importantly, at their own pace,” states Matthieu Fasani, Director of Product Marketing, Dalet. “Media professionals know that IP is the future, yet for most operations, it is not an overnight transformation. Unless you are re-architecting your entire media supply chain, a controlled transition to IP is the best strategy.”

Fasani adds, “Ingest and playout solutions are key to the media operation and therefore need careful consideration when upgrading. Dalet Brio meets the needs of new-generation IP workflows. Its performance and support for SMPTE ST 2110 workflows are backed by trusted interoperability tests led by the Joint Task Force on Networked Media (JT-NM) in April 2019, ensuring that you are implementing a solution that is going to be compatible with the industry standard. Dalet Brio is an investment that will take your media operation into the future.”

Bruce Devlin, Dalet Chief Media Scientist and SMPTE Standards Vice President, comments on the importance of IP workflows and SMPTE standards like ST 2110: “The migration to IP transport for professional media is a key enabler for new live workflows. IP transport, and ST 2110 in particular, can give more flexibility and better utilisation of studio infrastructure than SDI is able to provide. Regular interoperability testing and industry collaboration see an ever-increasing ecosystem of ST 2110 equipment that can be put together to create working systems. The IP future is being delivered now and ST 2110 equipment is at the heart of it.”
 

About Dalet Brio


Built on an IT-based input and output video platform, Dalet Brio is an integral part of fast-paced professional media workflows, whether as part of a Dalet Galaxy five enterprise-wide solution, integrated with third-party platforms, or as a standalone product. The Dalet Brio suite of applications – Ingest Scheduler, Multicam Manager, Media Logger and Media Navigator – comprises purpose-built tools that allow broadcasters to expand its capabilities to include multi-camera control, comprehensive logging, and studio production ingest and playout. Dalet customers who have put Dalet Brio at the core of their media foundation range from enterprise broadcasters Euronews, France TV, Fox Networks Group Europe and Mediacorp, to iconic sports teams like the San Jose Sharks, to leading post-production and digital distribution facility VDM.

For more information on Dalet Brio and other Dalet solutions, please visit https://www.dalet.com/platforms/brio.
 

About Dalet Digital Media Systems

Dalet solutions and services enable media organisations to create, manage and distribute content faster and more efficiently, fully maximising the value of assets. Based on an agile foundation, Dalet offers rich collaborative tools empowering end-to-end workflows for news, sports, program preparation, post-production, archives and enterprise content management, radio, education, governments and institutions.

Dalet platforms are scalable and modular. They offer targeted applications with key capabilities to address critical functions of small to large media operations - such as planning, workflow orchestration, ingest, cataloguing, editing, chat & notifications, transcoding, playout automation, multi-platform distribution and analytics.

Dalet solutions and services are used around the world at hundreds of content producers and distributors, including public broadcasters (BBC, CBC, France TV, RAI, RFI, Russia Today, RT Malaysia, SBS Australia, VOA), commercial networks and operators (Canal+, FOX, MBC Dubai, Mediacorp, Mediaset, Orange, Charter Spectrum, Warner Bros, Sirius XM Radio) and government organisations (UK Parliament, NATO, United Nations, Veterans Affairs, NASA).

Dalet is traded on the NYSE-EURONEXT stock exchange (Eurolist C): ISIN: FR0011026749, Bloomberg DLT:FP, Reuters: DALE.PA.

Dalet® is a registered trademark of Dalet Digital Media Systems. All other products and trademarks mentioned herein belong to their respective owners. 

YOU MAY ALSO LIKE
Welcome to the OTT video content streaming revolution!
This article was first published on the SMPTE website by Michael Goldman, Board Member at SMPTE. It is based on an interview with Lincoln Spiteri, VP of Engineering, Ooyala at Dalet.

In the opinion of Lincoln Spiteri, VP of Engineering at Dalet, a major manufacturer of media workflow technology solutions, the OTT video content streaming revolution is currently in a vibrant, but dichotomous place. On the one hand, many technological, logistical, and standardization aspects of securely delivering scaled content over the internet to consumer devices or businesses are now stable and burgeoning, so that more creators and distributors than ever can efficiently push their programming to the public. The current growing global dependence on streaming news, sports, entertainment content, meetings and personal interactions during the ongoing worldwide emergency, for example, illustrates how “established” OTT streaming has become.

Disruptive forces and the new possible

On the other hand, significant “disruptive forces,” as he puts it, have evolved what is possible, needed, expected, and desired out of the video-streaming paradigm. This means, in essence, that no sooner have established methodologies proliferated than new questions and answers start hurtling down the chute that can change some of those methodologies over time.

“The delivery side has settled overall,” Spiteri says. “We have the means to deliver high bandwidth, so the adaptive streaming side is there. Edge providers like Akamai and AWS CloudFront, among others, and also cloud services being offered by the likes of Amazon, Google, and Microsoft Azure are providing the ability to anyone who wishes to deliver content over the top via the internet to be able to do so. In terms of delivering generalized services, all the building blocks are now understood, and they are on the verge of becoming commoditized in many new ways.
It’s a global phenomenon where people are consuming video on an incredibly large scale.

“But if you think about the disruption that Netflix and others have brought about in terms of being able to monetize their platform on such a large scale, by providing incredibly high-quality video and, nowadays, producing content themselves, I think that also drove the adoption, because consumers ultimately tend to gravitate to quality. On one side, you have big content producers spending billions of dollars to create compelling, original content they can distribute [online]. But then, on the other hand, you have the ongoing phenomenon of YouTube, which probably offers the most video content on the Internet in terms of volume. And a lot of that is from small-time content makers—homegrown content. The need for those people to also be able to produce and distribute is an interesting development from a technological point of view.

“So there is something to be said about knowing what happens upstream in terms of acquiring or producing content [for distribution on the Web]. But the question arises, how do you do that efficiently to meet all sorts of different criteria at different distribution points, when targeting all sorts of different devices and platforms?”

Managing digital video assets more efficiently

Spiteri says the answer lies in figuring out how to manage digital video assets more efficiently all along the chain. “You have to have a good grip on your assets and have good metadata describing those assets,” he adds. “You need a way to prepare assets for consumption on a wide variety of devices and media platforms.
That allows you to go back into your archive, and basically monetize not just the new content you are making, but also your existing assets.”

Thus, among other developments, manufacturers like the company Spiteri works for, Dalet, and others are pushing into the marketplace various “media logistics platforms,” which he calls “the orchestration piece of the streaming puzzle,” designed, he says, “to make it easier and less costly to manage, package, and distribute content.” After all, he says, we live in a world where content not only has to be streamed, but in fact has to be captured, edited, packaged with robust production values intact, and distributed or re-distributed to a wide range of destinations, often in a matter of minutes. Some examples of such platforms come from companies like Dalet, SDVI Corp., and Ownzones, among others.

“For news and sports especially, time is of the essence,” he says. “That’s what I mean about ‘orchestration.’ We have a sports client in the UK, and they are required by virtue of their license to deliver clips from a game within, say, five minutes of the event happening. This could be a two-minute package that needs to be sent off to perhaps 100 licensees to use [as highlights]—the package will be placed on someone’s Web platform or their applications for mobile consumption. So these orchestration platforms are needed to drive those sorts of things—to capture, edit, and produce packages within minutes out of a live event, and then enable it to be distributed immediately in the right format for whomever is receiving that package. So the development of tools that can make streaming of such content fast, robust, reliable, and scalable is very important.”

In other words, tools that can “bring in a high level of automation” on the front end are now helping to democratize the streaming revolution, Spiteri suggests.
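The fan-out Spiteri describes, where one finished clip must reach many licensees in whatever format each requires, can be sketched in a few lines. The profile names, fields and the `fan_out` helper below are hypothetical illustrations of the idea, not an API from any Dalet or Ooyala product.

```python
# Illustrative sketch of the "orchestration" fan-out: one clip, many
# licensees, each mapped to its required delivery profile.
# Profile names and fields are invented for this example.
PROFILES = {
    "web": {"codec": "h264", "height": 1080},
    "mobile": {"codec": "h264", "height": 720},
    "social": {"codec": "h264", "height": 540},
}

def fan_out(clip, licensees):
    """Return one delivery job per licensee, matched to its profile.

    `licensees` is a list of (name, profile_key) pairs.
    """
    return [
        {"clip": clip, "licensee": name, **PROFILES[profile]}
        for name, profile in licensees
    ]
```

A real orchestration platform would attach transcode, review and delivery steps to each job; the point here is simply that the mapping from one source to many destination formats is data-driven, so adding a hundredth licensee is a configuration change, not a new workflow.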
“The idea is we can marry the media asset management side with metadata and with orchestration so that you can bring in that high level of automation, be able to prepare packages and, at the same time, make sure they are being reviewed for compliance purposes and then be ultimately delivered to whomever is putting the material online,” he explains.

Spiteri emphasizes that the content industry has invested heavily in recent years not only in managing and protecting data, but in ways of tracking their users’ viewing habits and interests. That’s why, he suggests, you will often see content streaming providers “behaving like digital agencies to a certain extent, readily experimenting with their user interfaces to see what is working and to understand what their customers are viewing. They have a sophisticated level of analytics for that sort of thing, gathering an amazing amount of data.”

Related to all this, Spiteri feels it is inevitable that artificial intelligence tools will “naturally play a bigger part” in how this data on users is gathered and taken advantage of, as well as how content is packaged and streamed around the world going forward. He points to new initiatives from several companies trying to weave AI tools into the video streaming tapestry. These include his company, Dalet, Amazon, Azure, and Graymeta. Further, some industry players are taking advantage of open-source AI learning frameworks such as TensorFlow, developed by Google, and Facebook’s PyTorch, among others. “The AI community is definitely growing at a fast rate, now that we can run things through the Cloud,” he says.
“So I think we will see some interesting applications of AI coming, as a result, with innovations combining analytics coming from users and helping companies learn about usage patterns and things like that.”

On the importance of cyber-security

He particularly feels that the security issue for content distributors has also largely stabilized in terms of delivering material over encrypted channels, thanks to established digital rights management (DRM) standards and services, including Google Widevine, Microsoft PlayReady, Apple FairPlay, and Adobe Content Server, among others. Additionally, he points to an increasing trend across the industry whereby companies are joining forces to develop new and more stringent cyber-security protocols, such as the DPP initiative and the Trusted Partner Network.

“All the major means of delivery allow you to deliver encrypted content—I think that is pretty much a done game,” Spiteri says. “Those things are very robust at this point. They are still evolving, but there is no magic sauce. They use the fabric of the Web itself—the core technologies are acting as part of a framework and infrastructure that allow us to deliver content at scale. And now, various CDNs [content delivery networks] allow you to essentially geo-fence your content [using geographically distributed servers to transport files] so that it can’t be touched by anyone outside the region it is intended for. So we have all sorts of mechanisms for providers to understand who their users are, what their level of access should be, and they are all very mature at this point with well-established operators and practices.”

Iterate and innovate

Still, he expects the industry “to continue to iterate and innovate” in terms of user interfaces and other ways of simplifying the experience, with greater use of voice control technologies and more powerful, AI-powered content recommendation engines on the way, among other things.
Spiteri says other interesting advancements coming down the chute include increased reliance for some providers on the notion of an API-first platform—what he calls “a headless OVP” for certain kinds of applications. “There are various companies, including one called Mux, that are built around the idea of having an easier way for the technology needed to deliver content to be more open with API-first delivery,” Spiteri relates. “I think we will see more of these in the near future. They are not necessarily there to build the best content management system, or to provide an end-to-end tool chain to let anyone who wants to build an OTT platform do so, but they focus very specifically on a particular piece and make it as good as it can be.”

He also emphasizes that the evolution of adaptive bit rate streaming standards such as MPEG-DASH, Apple’s HLS (HTTP Live Streaming), Microsoft Smooth Streaming, and others has “removed a lot of the fragmentation” in the video streaming world. By that, Spiteri means that “it’s now fairly easy to be able to package your content and scale it up or down based on the conditions the stream is being delivered in.” Thus, Spiteri largely feels that the standardization issue in the streaming world is not an impediment anymore on a mixed-platform landscape.

However, more generally, he also points out that what has really changed is the fact that, in this realm, “a significant part of the standards is carried by software now, not hardware. Devices can be upgraded over the air to fix certain issues or meet certain changes in a standard now. Therefore, the whole dynamic is changing. TVs, phones, tablets are so powerful these days that the standards have to become agile.
We can’t wait four or five years anymore for a new standard to be developed.”

And related to that, he points out that the OTT side of things will, for the foreseeable future, remain linked to one degree or another to the OTA side of things due to the “bigger role that OTA still plays when it comes to live events—it’s typically more reliable for sporting events and so on. So OTA still has room to grow, as well.” As a consequence, the ATSC 3.0 next-generation terrestrial TV broadcast standard’s growth and evolution in the US is having an impact in the streaming world as well, as discussed in Newswatch in 2019, because of the hybrid nature of the viewing landscape for the foreseeable future.

“Many companies are beginning to mix streaming content with live channels,” he says. “I’m sure several of the platforms are heading that way. In the UK, we have a service called TalkTalk, for example, which has created a pretty seamless mixture of over-the-air and over-the-top means of delivering content to the set-top box. Their software makes it pretty indistinguishable whether you are receiving a channel over IP or a broadcast coming from your cable.

“Ultimately, this is due to the set-top box technology, or the new Smart TV technology generally. With the processors they are putting into these things now, it gives them a lot of power and the ability to make the experience pretty seamless.”

At the end of the day, Spiteri expects “more disruption from the content delivery side” over time. “They want more 8K content, but it is questionable whether there will be much 8K content in the next year or two,” he says. “But we can probably expect a lot more 4K content, more high dynamic range content, and that sort of thing. But we will also see a market disruption in terms of new players coming to the fore. In other words, there will be more disruption because the technology is now able to deliver the content more efficiently for more people to give it a try.”
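The adaptive bit rate idea Spiteri credits with removing fragmentation can be sketched concretely: a player measures its throughput and picks the highest rendition that fits. The ladder and the headroom factor below are invented for illustration; real DASH/HLS players use considerably more sophisticated heuristics.

```python
# Hedged sketch of adaptive-bitrate (ABR) rendition selection.
# The ladder values are illustrative, not from any real service.
LADDER = [  # (height, bitrate_kbps), highest quality first
    (2160, 16000),
    (1080, 6000),
    (720, 3000),
    (480, 1200),
    (240, 400),
]

def pick_rendition(measured_kbps, headroom=0.8):
    """Choose the highest rendition whose bitrate fits within the
    measured throughput, keeping some headroom for variance."""
    budget = measured_kbps * headroom
    for height, kbps in LADDER:
        if kbps <= budget:
            return (height, kbps)
    return LADDER[-1]  # fall back to the lowest rung
```

Because every segment is available at every rung, the player can re-run this decision segment by segment, which is exactly the "scale it up or down based on the conditions" behaviour described above.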
What is HDR, WCG and Dolby Vision and why does it matter?
Alphabet soup starring HDR and WCG

"Hey Guys, let's re-invent the entire TV and Cinema chains from Camera to Screen!" said no high-ranking executive in any board meeting ever. The whole concept sounds like crazy talk when you say it out loud, but in reality that's what the High Dynamic Range (HDR) and Wide Color Gamut (WCG) revolution has done over recent years.

We've moved on from glowing goop

The cinema world, shooting on film, has always had a little more freedom than the TV world when it comes to controlling brightness, color and contrast between the camera and the screen. There were limitations in physics and chemistry, of course. You could make the projector brighter assuming you didn't melt the film, and you could make the film more sensitive provided you liked that grainy look on the screen. The TV world, however, had a fixed and inflexible transmission infrastructure that was stabilized in the 1950s. The relationship between the photons going into a camera and the photons coming out of most of today's TVs is still based on the response characteristics of the glowing goop you find inside the CRTs (Cathode Ray Tubes) of that early era.

So in comes HDR. "Hey guys, the eye can capture about 14 stops of brightness, so let's transmit that" is the fundamental idea behind HDR. In a basic MPEG system, the brightness of most pixels is represented by a number between 0 and 255. This gives you the ability to represent 8 stops (2^8 values), whereas we would like to represent 2^14 values in our HDR chain, i.e. the brightness of each pixel is represented by a number between 0 and 16383. Sounds simple really. But, what is Dolby Vision HDR?

Let's redesign the entire Cinema and Broadcast Value chain

The complexity comes with making sure that each and every device in the value chain from camera through switcher and ingest and transcode understands what these new numerical values actually mean.
In an ideal world we would replace all the old kit with brand new kit, but that's not really practical, so the HDR systems that were created have compatibility modes that allow these new bright, colorful pixels to travel across traditional SDI, H.264 and IP transmission paths with good integrity and appear at the final display as wondrous pictures.

Now, what is Dolby Vision HDR? Dolby Vision is one of the HDR systems that requires metadata to work. Its trick is identifying that in any typical scene you only use a portion of the total available dynamic range. A dark, shadowy scene in a cave will need more bits allocated in the small numerical pixel value range. A bright seaside scene on a sunny day will need more bits allocated in the large numerical pixel value range. This scene-by-scene adaptation is enabled with metadata that tells each device how to behave for that scene.

The Dalet AmberFin team is really proud that it's the first software-only transcoder and workflow engine to have full support for the Dolby Vision system. It can do this in a wide range of different codecs, in parallel with the usual array of high-quality video processing functions from scaling to standards conversion.

The Dolby Vision metadata itself might be carried in a sidecar XML file or embedded within the media file as a data track. Whichever mechanism is used, it's vitally important to retain the synchronization between the metadata and the images to get the best results, particularly when aligning metadata changes to hard cuts in the video. This becomes doubly important when frame rate converting, because blended frames combined with mis-timed metadata are highly visible, highly annoying and consume a lot of bitrate in the final encoding. A transcoder like the Dalet AmberFin platform gets all of those complex factors right first time, resulting in high efficiency, low bitrate and outstanding pictures.
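The synchronization problem described above has a simple core: scene-aligned metadata is keyed to frame indices, and a frame rate conversion rescales those indices. The sketch below illustrates that one step only; the function name and data layout are invented for this example and do not reflect how Dalet AmberFin or the Dolby Vision toolchain actually represents metadata.

```python
# Hedged sketch: keeping per-scene metadata aligned to cuts after a
# frame-rate conversion. Scene starts are frame indices; the conversion
# ratio rescales them so metadata changes still land on the cut.
from fractions import Fraction

def rescale_scene_starts(starts, src_fps, dst_fps):
    """Map scene-start frame indices from the source frame rate to the
    destination frame rate, rounding to the nearest output frame."""
    # Exact rational arithmetic avoids drift from float frame rates
    # such as 29.97 (30000/1001).
    ratio = Fraction(dst_fps).limit_denominator() / Fraction(src_fps).limit_denominator()
    return [round(f * ratio) for f in starts]
```

A real converter must also decide what to do when a blended frame straddles a cut; the point here is only that metadata timing has to be transformed with the same precision as the video itself, or the mis-timing artifacts the article describes appear.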
In today's era, consumers often lead the professionals

So, what is Dolby Vision HDR and why is it important? HDR is important because consumers of media get to see HDR on the content they make on their mobile devices. If the paid-for entertainment content they see on other platforms looks washed out and old-fashioned by comparison, then this will be a factor in what media gets consumed. If anyone has a spare crystal ball to help predict what this future might look like, then I would be very grateful to borrow it for a while!
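The bit-depth arithmetic in the article above (8 stops for conventional video versus 14 for the HDR chain, under the article's simplification of one stop per bit) can be checked in two lines:

```python
# The article's simplification: one stop of dynamic range per bit of
# pixel depth, so an n-bit pixel spans 2**n distinct code values.
def code_values(bits):
    """Number of distinct brightness values for a given bit depth."""
    return 2 ** bits

assert code_values(8) == 256      # classic 8-bit video: values 0..255
assert code_values(14) == 16384   # the HDR target: values 0..16383
```

Note that real HDR transfer functions (PQ, HLG) distribute those code values non-linearly across the brightness range, which is precisely the allocation trick Dolby Vision's metadata tunes scene by scene.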
Small steps for designers, giant leaps for users! How UX drives growth
With the advent of omnipresent technologies (think smartphones) in our daily lives, free and trial apps have transformed the way we make products and the way consumers expect to be served products that are intuitive and enjoyable. Just like consumers flocking from one social app to the other, a new generation of creative media professionals has adopted, from their personal digital lives, a "try and keep/throw" approach to media tools, i.e., "if I don't like it or can't get my work done, I'll look for another one." As software vendors, we must embrace this change to enrich our user community's experience in their daily work, with minimal distractions from the application, i.e., focus on the craft, not the tools.

We have also seen a shift towards product design driven by users' feedback, versus the historical "technical expert/engineer" approach, where the focus was on the technical capabilities of the solution, with less importance given to usability, complexity and intuitiveness for the people actually doing the work.

Back in the day

Back in the day, you had to be an engineer to operate sophisticated technical software. Today you can cut a movie on your iPhone. Not that you would want to, but the fact that the tools are there for the masses has created a baseline from a user perspective on what they want and expect from a product. Today, our media professionals need to work with software tools to tell their story. A UX (user experience)-centric approach in software design focuses on abstracting the complexity to surface and empower creatives. User experience is at the center of a successful product and a primary axis of our commitment to a user/market-driven approach to producing value for our client community.
This need for transformation in how software vendors interact with an organization, and its users, means putting in place transformative approaches and processes to dynamically engage with the user community much earlier in the product development stage, from ideation to release, and onwards.

Your accelerator to market success

The value in implementing a user-driven feedback loop model is a tenfold accelerator to market, with well-received releases that provide incremental value. Ideas are great, but unless tested against the intended audience, the product output generates frustration and dissatisfaction, further introducing new cycles of product iteration. Great designs have a positive effect on the quality, accuracy, and user satisfaction/adoption of a product. This is what the user experience design practice enables.

I want Ketchup, not a workout

It is essential to differentiate UX (user experience) from UI (user interface), however. While UX focuses on "how to do something and how something should work" in the most intuitive, precise, and efficient way, UI focuses on the presentation layer and visual appeal of the product. A simple example: a button. UX design is focused on the position, discovery, feedback, and interaction, with careful attention to both previous and following functions. UI design, on the other hand, will focus on how to make the button visible, accounting for the shape, color, and typography to make the user want to press it.

A great consumer-goods example that emphasizes this is ketchup bottles. When looking at the glass bottle, anyone can attest that it conveys elements of quality and aesthetics; however, the user experience is disappointing to everyone, including my grandmother, with users coming up with workarounds for the poor usability.
On the other hand, the plastic squeeze bottle, while looking cheap, is immediately intuitive in its use, and the fact that it stands on its cap further adds the value of being ready to be dispensed by a simple squeeze.

The wheel is not enough

Credit: Nima Torabi

The above example highlights the iterative design of a user problem statement: "I want a mode of transportation that gets me from point A to point B safely, comfortably and expeditiously." While the top approach addressed the components of the product, each iteration was rendered useless on its own and not adopted (i.e., failed) as it didn't address the user's need. What is vital here are tangible, usable deliverables, or MVPs (Minimum Viable Products).

"User experience encompasses all aspects of the end-user's interaction with the company, its services, and its products." - Don Norman, Nielsen Norman Group (credited with inventing the term UX)

Back to the ketchup bottle, we can see that the first bottle was usable and formed the basis for further iterations by allowing feedback, each providing a more intuitive design. Note that the middle image, where the bottle is now squeezable, would never have been designed without this user feedback.

It's all about the user

User research is at the center of our approach at Dalet, both for outbound activities (on-site interviews, regular visits to see the product in use, etc.) and inbound ones (conducting surveys, facilitating user community forums, etc.). This is the approach we took when we set out to re-design the user experience for OoyalaMAM, the main user interface for the Ooyala Flex Media Platform: usability and speed were our top goals – but that's a story for our next blog post.

The recently refreshed OoyalaMAM user interface

We want to enable our user community to converse and contribute while gaining knowledge and identifying trends that would need to be slated in the roadmap.
As part of this user research, we are putting in place forums for the community to contribute, collaborate, and drive innovative ideas at the forefront of our practice. We'd love to have you aboard!
Dalet Reveals Updated Product Lineup that Streamlines the Multi-platform and Digital Media Content Supply Chain at IBC2019
Dalet, a leading provider of solutions and services for broadcasters and content professionals, will demonstrate how its newly expanded media product lineup manages the content supply chain from production to distribution for a much wider range of media management needs and a significantly broader market at the 2019 IBC Show (Hall 8, Stand B77), held in Amsterdam from September 13th through September 17th.

The recent Ooyala Flex Media Platform business acquisition combined with new product introductions spanning broadcast graphics, AI-powered content and production workflows, and IP infrastructure adds even greater breadth and depth to the Dalet media management capabilities. The expanded Dalet product offering answers the needs of existing news, sports, programs, radio and archive customers requiring solutions to enable enterprise orchestration, intensive production and multi-platform distribution workflows. For new customers and markets, including corporate brands, telcos, sports teams and leagues, Dalet now offers agile, subscription-based solutions to manage multi-platform packaging and digital media distribution workflows.

“Today, the ability to launch new services quickly at reduced operational costs is essential for content owners and distributors to remain competitive, and both Dalet and Ooyala have been helping them address this challenge in our respective areas of expertise. This was one of the key drivers behind the Ooyala acquisition,” states Kevin Savina, Director of Product Strategy, Dalet. “The Ooyala Flex Media Platform, with its strong OTT content supply chain and multi-platform distribution capabilities, is a perfect complement to Dalet Galaxy five, which is the platform of choice for production-intensive and enterprise media operations.
We look forward to demonstrating that at IBC2019 alongside all the other innovations we will be showcasing on our other platforms.”

Powering the Digital Content Supply Chain & Multi-platform Distribution - Ooyala Flex Media Platform

Now under the Dalet umbrella, the Ooyala Flex Media Platform is nested within a media ecosystem that will enable a steady growth path, with a number of new features being unveiled at IBC2019. The platform now includes the ability to publish to multiple online video platforms, such as Kaltura, JW Player and Brightcove, offering increased support for OTT workflows and multi-platform distribution. Integration with Dalet AmberFin for orchestrated, high-quality media processing, and with Dalet Media Cortex for AI-augmented workflows, will also be demonstrated at the show, developing the first synergies between the Ooyala Flex Media Platform and the wider Dalet product range.

Leveraging AI to Augment Content Workflows - Dalet Media Cortex

The AI SaaS platform, Dalet Media Cortex, automatically enriches content and provides actionable insights, contextual recommendations, speech-to-text transcriptions and automated closed captioning, offering real value for both users and the wider organization. An add-on to Dalet Galaxy five and the Ooyala Flex Media Platform, the pay-as-you-go AI platform helps media organizations leverage cognitive services and machine learning across content operations and businesses. Providing the right insights, in the right toolset, with the right context, Dalet Media Cortex helps content producers, owners and publishers across news, sports, programs and radio operations make the most of their media assets and become more productive, automating mundane tasks so they can focus on the creative editorial process.
Advanced Broadcast Graphics with Brainstorm Real-Time Graphics Engine - Dalet CubeNG

Fully integrated across the Dalet Unified News Operations solution powered by Dalet Galaxy five, the state-of-the-art, full-featured Dalet CubeNG graphics platform is a major upgrade and leverages the industry-leading Brainstorm real-time graphics engine to deliver superior 2D and 3D branding and visuals. Suited for both on-air and file-based graphics creation, the Dalet CubeNG unified approach enables news broadcasters to easily create dynamic branding and up-level visual storytelling across traditional, digital and social channels.

Exceptional In-the-Field Multimedia Production Experience - Dalet Remote Editing

Media organizations in fast-paced production markets such as news, sports and reality TV have extensive in-the-field production needs. The Dalet Remote Editing highly scalable framework brings full-featured multimedia editing capabilities and speed to the editors working in the field or out of remote offices, as well as freelancers. Dalet Remote Editing securely connects journalists, producers, editors and other content creators to the central content hub, enabling remote users to edit, assemble, collaborate and quickly submit packages or download high-resolution media to finalize locally even in low-bandwidth situations, with no additional PAM or MAM required. The first release of the new Dalet Remote Editing framework supports Dalet OneCut natively, with subsequent updates supporting Dalet Xtend-compatible third-party editing applications.

A Clear Path to IP - Dalet Brio

Dalet is providing media organizations a clear path and controlled transition to IP with support for SMPTE ST 2110 in the latest release of its Dalet Brio I/O platform. Supporting both SMPTE ST 2110 and SDI standard workflows, the high-density ingest and playout platform allows media facilities to invest in their future IP infrastructure without disrupting their current operation.
The cornerstone of advanced, IP-ready media operations, Dalet Brio adapts to new production and distribution environments with advanced capabilities that manage ingest, transfers and playout to and from a wide range of systems and devices. Its extensive IP support enables users to process a wide range of parallel streams including SMPTE ST 2110, ST 2022-2 and NDI for linear channels, and RTMP for digital platforms like Facebook Live, YouTube and Twitter.

Better Together - Join us for a Very Special Dalet Pulse Event!

This IBC2019, the Dalet Pulse media innovation summit will expand its platform to include Ooyala. Celebrating the joining of two great media teams and technologies, the Dalet Pulse theme this year, Better Together, will give attendees a chance to learn about the extended product portfolio and how it helps leading media organizations develop agile content supply chains, deliver unique content experiences to multi-platform audiences, and increase revenues with Dalet solutions and partner technologies. It’s also a unique opportunity to meet the expanded team.

Thursday, 12 September
Pompstation, Amsterdam
Keynote: 17:30 - 19:00
Party: 19:00 - 22:00

Register now via https://www.dalet.com/events/dalet-pulse-ibc-2019

Book a Private Briefing to Learn More About Dalet

Take the opportunity to have a private demonstration or workflow consultation with a Dalet expert to learn how the latest products and solutions can help you better create, manage and distribute content. Book a meeting via https://www.dalet.com/events/ibc-show-2019

Press can contact Alex Molina at alex@zazilmediagroup.com to schedule a media briefing.

About Dalet Digital Media Systems

Dalet solutions and services enable media organisations to create, manage and distribute content faster and more efficiently, fully maximising the value of assets.
Based on an agile foundation, Dalet offers rich collaborative tools empowering end-to-end workflows for news, sports, program preparation, post-production, archives and enterprise content management, radio, education, governments and institutions. Dalet platforms are scalable and modular. They offer targeted applications with key capabilities to address critical functions of small to large media operations - such as planning, workflow orchestration, ingest, cataloguing, editing, chat & notifications, transcoding, play out automation, multi-platform distribution and analytics. In July 2019, Dalet announced the acquisition of the Ooyala Flex Media Platform business. An acceleration of the company’s mission, the move brings tremendous value to existing Dalet and Ooyala customers, opening vast opportunities for OTT & digital distribution. Dalet solutions and services are used around the world at hundreds of content producers and distributors, including public broadcasters (BBC, CBC, France TV, RAI, TV2 Denmark, RFI, Russia Today, RT Malaysia, SBS Australia, VOA), commercial networks and operators (Canal+, FOX, MBC Dubai, Mediacorp, Fox Sports Australia, Turner Asia, Mediaset, Orange, Charter Spectrum, Warner Bros, Sirius XM Radio), sporting organisations (National Rugby League, FIVB, LFP) and government organisations (UK Parliament, NATO, United Nations, Veterans Affairs, NASA). Dalet is traded on the NYSE-EURONEXT stock exchange (Eurolist C): ISIN: FR0011026749, Bloomberg DLT:FP, Reuters: DALE.PA. Dalet® is a registered trademark of Dalet Digital Media Systems. All other products and trademarks mentioned herein belong to their respective owners.
ProSiebenSat.1 PULS 4 GmbH Brings Its Media Operation into the Future with Dalet
Dalet, a leading provider of solutions and services for broadcasters and content professionals, has signed a deal with Austrian private broadcaster group ProSiebenSat.1 PULS 4 to fully revamp their media operation with the agile Dalet Galaxy five Media Asset Management (MAM), Workflow Orchestration and Editorial platform. ProSiebenSat.1 PULS 4 is home to hits such as Austria’s Next Top Model, iconic UEFA Europa League and international sports coverage such as the National Football League, and popular thematic news, talk and show programs. The Dalet Galaxy five installation will equip the well-known broadcaster with a state-of-the-art media production workflow that realigns production and delivery to optimize cross-functional team collaboration and multi-platform content output across its three channels PULS 4, ATV and ATV 2. ProSiebenSat.1 PULS 4 Broadcast System Engineers Christoph Stadlhofer and Dirk Raschig comment on the vision for the leap forward and partnering with Dalet to better manage the news, sports, archives and program preparation workflows, “We have a rapidly growing pool of content that needs to be centralized and enriched with metadata. In addition, we need tools that enable our staff to easily search, prepare and distribute that content. Facilitating these needs combined with managing a much higher output of program and news content to our digital platforms is what we expect to accomplish with the move to Dalet Galaxy five.” Tobias Stößel, Project Manager at ProSiebenSat.1 PULS 4, continues, “The Dalet Galaxy five core media asset management and orchestration capabilities will free our users from many manual processes and technical duties that currently weigh them down.
It will enable them to reach a higher level of collaboration, giving them more time to focus on the project at hand.” The full implementation of Dalet Galaxy five will allow ProSiebenSat.1 PULS 4 to build an end-to-end, cohesive digital content supply chain that unifies all operations and processes from the newsroom to program preparation to post-production to distribution and archives. Key components include: a centralized content catalogue (MAM) that houses enriched metadata for ProSiebenSat.1 PULS 4’s three channels; Dalet Webspace for browse, search and media management; Dalet Workflow Engine for the orchestration of operations and processes; the Dalet Brio I/O platform for centralized ingest management; and Dalet AmberFin scalable transcoding. The Dalet installation will facilitate seamless collaboration between all users including editors on Adobe® Premiere® Pro through Dalet Xtend and Avid® Pro Tools®. “Dalet Galaxy five will unify the infrastructure with an underlying component-based workflow that enables ProSiebenSat.1 PULS 4 to scale at will its content production and multi-platform delivery and thrive in an ever-changing media landscape,” comments Johann Zemmour, General Manager, EMEA and APAC, Dalet. “We look forward to working closely with ProSiebenSat.1 PULS 4 to revamp its infrastructure and deliver on the ambitious roadmap that will undoubtedly transform the way they produce and deliver their premium product to the delight of ProSiebenSat.1 PULS 4 audiences.” Philipp Beuchert, Head of Broadcast & Production Systems at ProSiebenSat.1 PULS 4 concludes on partnering with Dalet, “We are also looking into future technology and here, we think Dalet is one of the key players on the market. 
ProSiebenSat.1 PULS 4 is happy to improve the in-house technology together.” For more information on ProSiebenSat.1 PULS 4, visit http://www.prosiebensat1puls4.com.
A Three-Platform Approach: Dalet Galaxy, Dalet Brio and Dalet AmberFin
So far, 2014 has been the year of mergers and acquisitions within the broadcast industry. As previously reported on this blog, not all this M&A activity is driven by the same customer-focused aims. However, in the case of Dalet, our recent strategic acquisition of AmberFin has the customer clearly in mind. The merging of the two companies enables our new enlarged and enriched company to cover significantly more bases within file-based workflow environments. From IBC 2014, Dalet will offer three technology platforms: Dalet Galaxy, Dalet Brio and Dalet AmberFin, leveraging the knowledge and technologies of both companies to deliver a broader and deeper set of solutions. It’s worth looking under the hood and understanding why this is so important. For readers that are new to some parts of the Dalet product family, let me shed a little light on these platforms: Dalet Galaxy is the latest and most advanced version of the Dalet Media Asset Management (MAM) platform and the most recent evolution of Dalet Enterprise Edition. The landmark development initiative leverages more than 10 years of successful MAM development and customer input. Dalet Galaxy is the industry's first business-centric, MAM platform developed to manage media workflows, systems and assets throughout the multimedia production and distribution chain. Dalet Brio is an innovative and cost-effective platform for broadcast customers looking for non-proprietary solutions to digitize and playback their content. Constructed using Dalet Brio servers (IT-based ingest and playout servers for SD and HD content), it also provides a powerful set of user tools and applications to help deliver video workflows. Dalet AmberFin is a high-quality, scalable transcoding platform with fully integrated ingest, mastering, QC and review functionality, enabling facilities to make great pictures in a scalable, reliable and interoperable way. AmberFin software runs on cost-effective, commodity IT hardware that can adapt and grow 
as the needs of your business change.

Advanced Integration Capabilities to deliver new workflows

As a specialist in MAM-driven workflows, Dalet has been actively looking at delivering end-to-end workflows, and we all know that one of the biggest problems we encounter is making the various workflow components work together efficiently and intelligently. This is the reason we, at Dalet and AmberFin, have always been strong supporters of industry standards as a means to ease integration issues when building workflows. Each of the three Dalet platforms possesses powerful integration capabilities, based on standards and APIs, which enable every product built on these platforms to be integrated within overall workflows. Most importantly, we believe that the greatest added value we can bring to our customers comes from tight integration between these three platforms, empowering workflow optimization that was previously unimaginable. This vision goes well beyond what any industry standard or even proprietary API can achieve. Let’s take an example: in today’s modern workflows, media will be transcoded at a variety of touch points in the production and distribution process, potentially degrading the source quality over successive generations. At Dalet, we strive within the AmberFin platform to minimize quality degradation at each step of the process, but we recognize this is not enough. In fact, we still believe that “the best transcode is no transcode.” This can only be achieved by exploiting key metadata (technical, editorial and rights metadata) stored in the MAM platform in order to make smart decisions on when to transcode or not, and what type of transcode profile to apply. And this is just one of the ideas we have. At IBC this year, we will be showcasing some fantastic new features and facilities that are possible using the new extended and enriched Dalet portfolio of workflow solutions. Check out our exciting theatre line-up for the next few days here.
We’re still booking demos, so it’s not too late to book a meeting: http://www.dalet.com/events/ibc-amsterdam-2014. To learn more about Dalet’s strategic acquisition of AmberFin, download the following white paper: http://www.dalet.com/white-paper/dalet-and-amberfin.
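The “best transcode is no transcode” idea boils down to comparing technical metadata before any media processing is scheduled. Here is a minimal sketch of that decision; the field names and the function are illustrative assumptions, not Dalet’s actual schema or API:

```python
# Hypothetical sketch: decide whether a transcode is needed by comparing
# technical metadata held in the MAM, rather than blindly re-encoding.
# Field names (codec, profile, resolution, frame_rate) are illustrative only.

def needs_transcode(source: dict, target: dict) -> bool:
    """Return True only when the source essence cannot be delivered as-is."""
    # A rewrap (container change only) is not a transcode, so we compare
    # essence properties rather than wrapper properties.
    essence_keys = ("codec", "profile", "resolution", "frame_rate")
    return any(source.get(k) != target.get(k) for k in essence_keys)

source = {"codec": "AVC-Intra", "profile": "Class100",
          "resolution": "1080i", "frame_rate": "25"}
delivery = {"codec": "AVC-Intra", "profile": "Class100",
            "resolution": "1080i", "frame_rate": "25"}

if needs_transcode(source, delivery):
    print("transcode with matched profile")
else:
    print("no transcode: rewrap only")  # the best transcode is no transcode
```

In a real system the comparison would also consult rights and editorial metadata, but the shape of the decision is the same: one generation of quality is saved every time the answer is “rewrap only”.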
MXF AS02 and IMF: What's the Difference and Can They Work Together?
If you read my previous posts about IMF, you will already know what it is and how it works. But one of the questions I often get is "how is IMF different from AS02, and will it replace it? After all, don’t they both claim to provide a solution to versioning problems?" In a nutshell, the answer is yes, IMF and AS02 are different and no, IMF will not replace AS02; in fact the two complement and enhance each other. Let me explain: MXF AS02 (for broadcast versioning) and IMF (for movie versioning) grew up at the same time. And while both had very similar requirements in the early stages, we soon ended up in a situation where the level of sophistication required by the broadcasters’ versioning process never really reached critical industry mass. Efforts were continually made to merge the MXF AS02 work and the IMF work to prevent duplication of effort and to ensure that the widest range of interoperable applications could be served with the minimum number of specifications. When it came to merging the AS02 and IMF work, we looked at the question of what would be a good technical solution for all of the versioning that takes place in an increasingly complex value chain. It was clear that in the studio business there was a need for IMF, and that the technical solution should recognize the scale of the challenge. It came down to a very simple technical decision, and a simple case of math. AS02 does all of its versioning using binary MXF files, while IMF does all of its versioning using human-readable XML files. There are maybe 20 or 30 really good MXF binary programmers in the world today; XML is much more generic, and there must be hundreds of thousands of top quality XML programmers out there. Given the growing amount of localized versioning that we are now faced with, it makes sense to use a more generic technology like XML to represent the various content versions whilst maintaining the proven AS02 media wrapping to store the essence components.
In a nutshell this is the main difference between AS02 and IMF. Both standards have exactly the same pedigree and aim to solve exactly the same problems, but IMF benefits from a more sophisticated versioning model and therefore requires a greater degree of customization – and XML is a better means of achieving this. IMF is not going to replace AS02. Rather the goal is to get to a place where we have a standardized IMF package as a means of exchanging versioned packages within the workflow. IMF will actually enhance the AS02 bundles that represent componentized clips that are already ingested, transcoded and interchanged today.
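To see why XML-based versioning is so much more accessible than binary MXF, consider how little code it takes to read a composition playlist. The playlist below is a heavily simplified, hypothetical stand-in for an IMF CPL (the real schema, standardized in SMPTE ST 2067, is far richer); parsing it needs nothing beyond the Python standard library:

```python
# Toy illustration: a human-readable "composition playlist" describing a
# version as a sequence of references into track files. NOT the real IMF CPL
# schema - just a sketch of the XML-versioning idea.
import xml.etree.ElementTree as ET

cpl = """<CompositionPlaylist>
  <ContentTitle>Feature_FR_Subtitled</ContentTitle>
  <SegmentList>
    <Segment><TrackFileId>urn:uuid:aaaa</TrackFileId><Duration>1440</Duration></Segment>
    <Segment><TrackFileId>urn:uuid:bbbb</TrackFileId><Duration>96</Duration></Segment>
  </SegmentList>
</CompositionPlaylist>"""

root = ET.fromstring(cpl)
title = root.findtext("ContentTitle")
segments = [(s.findtext("TrackFileId"), int(s.findtext("Duration")))
            for s in root.iter("Segment")]
print(title, segments)  # a new version is just a new playlist over the same essence
```

A French subtitled version, an airline cut and a director’s cut can each be a small text file like this, all pointing into the same wrapped essence, which is exactly the division of labour between IMF versioning and AS02-style media wrapping described above.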
Life before and after DPP (Digital Production Partnership)
People who know me will be aware that file-based workflows are a passion of mine. Ten years ago I was co-author of the MXF (Material Exchange Format) specification and ever since I have been engaged in taking this neat SMPTE standard and using it to create a business platform for media enterprises of every size and scale. This is why I’m so excited by the Digital Production Partnership (DPP): it represents the first ratified national Application Specification of the MXF standard and is set to revolutionize the way that media facilities and broadcasters work. To explain what I mean, let’s compare life with a DPP ecosystem to life without.

Less pain to feel the gain
In a standardized DPP world, there would be a limited amount of pain and cost felt by everybody, but this would be shared equally amongst the organizations involved and it would be a limited cost, incurred only once. After this point, our industry has a fantastic common interchange format to help encourage partnerships and build businesses. In an unstandardized world, where different facilities have decided to use different tools and variants of MXF or other formats, the major cost becomes the lack of third-party interoperability. Each time content is exchanged between different facilities, a media transcode or rewrap into the receiving format is required. This means that vendors across all the facilities will ultimately need to support all the file formats and metadata. The engineering required to implement and test takes time and costs money on an on-going basis.

Interoperable metadata helps the content creator
In a world that has adopted DPP, media and metadata interoperability is not an issue since the format is built on a strong, detailed common interchange specification. In this homogeneous scenario the resources that would have been used in the interoperability engineering process can be used in more creative and productive ways, such as programme making. Programme making is a process where most broadcasters utilise external resources. In a world without DPP, whenever a broadcaster or production facility receives a new file from an external facility, such as a Post House, the question must be asked whether this file meets the requirements of their in-house standard. That evaluation process can lead to extra QC costs in addition to possible media ingest, transcoding, conformance and metadata re-keying costs that need to be taken into account.

Building a business platform
This heterogeneous environment is an issue not just for interaction with external facilities: often different departments within the same major broadcaster will adopt slightly different file standards and metadata, making interoperability a big issue for them. As a result, today only about 70 per cent of transactions within companies are file-based – the remainder employ tape. However, this is much higher than where external agencies are involved – here, only 10 – 15 per cent of transactions are file-based. The essence of the problem is the lack of a common interchange format to enable these transactions. DPP is the first open public interchange format that is specifically designed to address this issue. DPP is intended to transform today’s 20 per cent trickle into an 80 per cent flood in the shortest time. To find out more about DPP and how it can transform the way your operation works and also your effectiveness working with other organizations, read AmberFin’s White Paper on DPP.
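The interoperability cost argument above is, at heart, simple combinatorics: with N facilities each using their own house format, direct exchange needs a conversion path for every ordered pair, whereas a single common interchange format needs only one import and one export path per facility. A quick back-of-the-envelope sketch (the facility counts are made up for illustration):

```python
# Why a common interchange format wins: count the conversion paths that must
# be built and tested in each world, for n facilities with n house formats.

def converters_pairwise(n: int) -> int:
    """Every facility converts directly to every other: one per ordered pair."""
    return n * (n - 1)

def converters_common_format(n: int) -> int:
    """One import and one export path per facility, via the common format."""
    return 2 * n

for n in (5, 10, 20):
    print(f"{n:3d} facilities: {converters_pairwise(n):4d} pairwise "
          f"vs {converters_common_format(n):3d} via a common format")
```

At ten facilities that is 90 conversion paths against 20, and the gap widens quadratically, which is why the one-off shared cost of standardization is so much cheaper than per-pair engineering.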
What’s really going on in the industry?
My inbox is a confusing place before a trade show. I get sincere emails asking if I’m interested in a drone-mounted 3ME Production Switcher and familiar emails asking when is the last time I considered networking my toaster and water cooler to save BIG on my IT infrastructure. The reality is that prior to a great trade show like IBC, I want to see a glimpse into the future; I want to know what’s really on the radar in our industry, not what happened in the past, or some mumbo jumbo about unrealistic technological achievements. I am personally very lucky that I spend quality time with the folks who set the standards in SMPTE, because this is one place in the world where the future of the industry is hammered out in detail by tiny detail until a picture of the future presents itself like some due process Rorschach test. With the permission of SMPTE’s Standards Vice President Alan Lambshead, here’s a little glimpse of some of those details that you’ll get to see in the weeks, months and years to come.

UHDTV – Images

Ultra High Definition TV – it’s more than just 4k pixels. In fact, SMPTE has published a number of standards including ST 2036 (parameters) and ST 2084 (Perceptual Quantization High Dynamic Range) that define how the professional media community can create pictures that give consumers the WOW factor when they upgrade. But there’s a lot more to come. How do we map all those pixels onto SDI, 3G SDI, 12G SDI, IP links and into files? SMPTE is actively looking at all those areas as well as the ecosystem needed for High Dynamic Range Production.

Time Code

Oh Time Code. How we love you. Possibly the most familiar and widely used of all SMPTE’s standards, it needs some major updates to be able to cope with the proposals for higher frame rates and other UHDTV enhancements. Beyond Time Code, however, we have the prospect of synchronizing media with arbitrary sample rates over generic IP networks.
SMPTE is working on ways of achieving just that, and it means that proprietary mechanisms won’t be needed. That also means different vendors’ kit should simply work!

IMF

I’ve written and lectured extensively about IMF’s ability to help you manage and deploy multi-versioned content in an environment of standardized interoperability. As this toolset for a multi-platform ecosystem rolls out into the marketplace, the specifications are continually evolving with the developing needs of the market, as well as with the needs of individuals on the design team who influence the feature set.

UHDTV – Immersive Sound

I remember back in the 1980s at the BBC, when we proved that great sound improves the quality of pictures. These fundamental principles never change, and the desire to create immersive audio-scapes through the use of many channels, objects or advanced sound fields requires standards to ensure that all the stakeholders in the value chain can move the audio from capture to consumption whilst creating the immersive experience we all strive for. SMPTE is the place where that future is being recorded today.

TTML

The humble caption file. Internationally it is nearly always legal to broadcast black and silence, providing that it’s captioned. There’s really only one international format that can generate captions and subtitles without proprietary lock-in, and that’s TTML. SMPTE is active in the use of TTML in the professional space and its constraints for IMF. Whether your view on captioning is good or bad, TTML is the only open show in town and SMPTE’s helping to write the script.

ProRes

What? Apple disclosing ProRes? Yes, it’s true. As the world requires more interoperability and better visibility, the excellent folks at Apple have created a SMPTE Registered Disclosure Document describing the way that ProRes appears in files.
One file format may not seem like a big deal, but the fact that SMPTE is the place where companies that are serious about working together write down the technical rules of engagement is exactly what makes SMPTE the perfect place to plot trajectories for the future. To quote one of my intellectual heroes, Niels Bohr, “Prediction is difficult, especially if it’s about the future.” SMPTE won’t tell you the future, but by participating, you’re more likely to spot the trajectories that will hit and those that will miss. If any of these topics interest you, excite you or put you into an incandescent rage of “How Could They!”, then you are able to participate in 3 easy steps:

1. Join SMPTE
2. Add Standards membership from your My Account page on the SMPTE site
3. Register & turn up in Paris to the meetings on the 16th Sept 2015

Until then, you can always check out more visions of the future on our blog or find out all about IMF on the Dalet Academy Webinar Replay on YouTube. Now, where’s my drone-mounted Mochaccino maker? Until next time…
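The Time Code grumbles above are easier to appreciate with the arithmetic in front of you. Here is a minimal sketch of the classic 29.97 fps drop-frame labelling rule (frame numbers 00 and 01 are skipped at the start of every minute except each tenth minute); it follows the standard NTSC convention but is not tied to any SMPTE reference implementation:

```python
# Drop-frame timecode: at 30000/1001 fps, 30 fps labels drift ~3.6 s/hour,
# so two frame NUMBERS (not frames of video) are dropped per minute, except
# in minutes divisible by ten. This keeps labels within a frame of wall time.

def to_dropframe(frame: int) -> str:
    """Convert a frame count to a 29.97 drop-frame timecode string HH:MM:SS;FF."""
    fps, drop = 30, 2
    per_min = fps * 60 - drop          # 1798 labels in each "dropped" minute
    per_10min = per_min * 10 + drop    # 17982 frames per ten-minute block
    tens, rem = divmod(frame, per_10min)
    frame += drop * 9 * tens           # 18 numbers skipped per full 10 minutes
    if rem > drop:
        frame += drop * ((rem - drop) // per_min)
    ff = frame % fps
    ss = (frame // fps) % 60
    mm = (frame // (fps * 60)) % 60
    hh = frame // (fps * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(to_dropframe(1800))   # minute rollover lands on ;02, not ;00
```

One hour of real 29.97 fps video is 107,892 frames, and the function duly labels that frame 01:00:00;00, which is exactly the bookkeeping that makes fractional frame rates such a rich source of hidden costs.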
Why the forecast is [still] looking Cloudy
I wrote a blog post a while ago about the future looking cloudy. To quote a famous person, Mark Twain I think, "Predictions are difficult – especially when it concerns the future!" and predictions about the link between the technological benefits of cloud technology and new business models are especially hard to get right. So why do I think that the future still looks cloudy? In my opinion, it comes down to the fact that media is undergoing several simultaneous transformations:

- Moving files over IP instead of tape has broken the link to real time signals for (nearly all) scripted and offline content
- Moving content by IP instead of SDI is practical, time-accurate, reliable and can be cheap
- Viewer consumption via IP is increasing year-on-year
- We are seeing more formats at higher frame rates with strange pixel numbers every year
- SDI links the properties of the transport to the frame rate / pixel numbers, and requires work & standardisation for every ancillary data format carried
- IP keeps the properties of the transport and the frame rate / pixel numbers & ancillary payloads independent
- Traditionally, broadcasting & media have achieved reliability through over-provisioning of equipment – along with the associated cost of doing it
- Cloud is all about maximising the use of the physical infrastructure so that over-provisioning is shared between many end-users and saves money

Add all of these different elements together and I think that a cloudy future is more and more likely as time progresses. The death of over-provisioning is happening in many facilities, as the ability to run processing farms and data-links at near full capacity at all times becomes more operational practice and less theoretical. Innovative licensing tools that allow floating of licenses to where the processing is needed will increase the usage of hardware and therefore keep overall running costs down.
We still have a long way to go before every broadcast centre becomes a virtualised set of applications in a data centre. Many of the cost structures don't work yet. The cost of transporting content into and out of the cloud is still a lot more expensive than the cost of processing it. The cost of storing data in the cloud is not dropping as fast as processing costs. There are still question marks over security. Although papers have been published about secure clouds that allow processing of encrypted content without it ever being "in the clear" in the data centre, I still haven't seen that technology rolled out and deployed at a reasonable cost. Whether cloud means in-house data centres or public shared facilities or some sort of hybrid model, the key is to get the best value from the infrastructure and to share the costs so that everyone is better off. The types of media that we use in our industry vary in volume and value so that the perfect business model for one broadcaster may not work for another. We have an interesting journey ahead of us where business models will be changed massively by technology and in turn new technologies will be forced into existence when business conditions are right. The future is full of change and uncertainty. From where I'm sitting, it's still looking cloudy.
HPA: Mapping the Future, One Pixel at a Time
I love the HPA Tech Retreat. It is the most thought-provoking conference of the year, one where you're guaranteed to learn something new, meet interesting people and get a preview of the ideas that will shape the future of the industry. Here are the six most interesting things I learned this year.

Collaborating competitors can affect opinions

At this year’s HPA Tech Retreat, I had the honour of presenting a paper with John Pallett from Telestream. Despite the fact that our products compete in the marketplace, we felt it important to collaborate and educate the world on the subject of fractional frame rates. 30 minutes of deep math on drop frame timecode would have been a little dry, so we took some lessons from great comedy double acts and kept the audience laughing, while at the same time pointing out the hidden costs and pitfalls of fractional frame rates that most people miss. We also showed that there is a commercial inertia in the industry, which means the frame rate 29.97i will be with us for a very long time. In addition to formal presentations, HPA also features breakfast round tables, where each table discusses a single topic. I hosted two great round tables, with John as a guest host on one, where the groundswell of opinion seems to be that enforcing integer frame rates above 59.94fps is practical, and any resulting technical issues can be solved – as long as they are known.

I will never be smart enough to design a lens

Larry Thorpe of Canon gave an outstanding presentation of the design process for their latest zoom lens. The requirements at first seemed impossible: design a 4K lens with long zoom range that is light, physically compact, and free from aberrations to meet the high demands of 4K production. He showed pictures of lens groupings and then explained why they couldn't be used because of the size and weight constraints. He went on to show light ray plots and the long list of lens defects that they were battling against.
By the end of the process, most members of the audience were staring with awe at the finished lens, because the design process seemed to be magical. I think that I will stick to the relative simplicity of improving the world's file-based interoperability.

Solar flares affect your productions

We've all seen camera footage with stuck or lit pixels and, like most people, we probably assumed that they were a result of manufacturing defects or physical damage. Joel Ordesky of Court Five Productions presented a fascinating paper on the effects of gamma photons, which, when passing through a camera’s sensor, cause the sensor to permanently impair individual pixels. This is something that cannot be protected against unless you do all of your shooting underground in a lead-lined bunker. Joel presented some interesting correlations between sun spot activity and lit pixels appearing in his hire stock, and then showed how careful black balance procedures can reduce the visibility of the issue.

UHD is coming – honest

The HPA Tech Retreat saw a huge range of papers on Ultra High Definition (UHD) issues and their impacts. These ranged from sensors to color representation to display processing, compression, high frame rates and a slew of other issues. I think that everyone in the audience recognised the inevitability of UHD and that the initial offering will be UHDTV featuring resolution improvements. This is largely driven by the fact that UHD screens seem to be profitable for manufacturers; soon enough they will be the only options available at your local tech store (that’s just good business!). The displays are arriving before the rest of the ecosystem is ready (a bit like HDTV), but it also seems that most of the audience feels better colour and high dynamic range (HDR) is a more compelling offering than more pixels. For me, the best demonstration of this was the laser projector showing scenes in true BT2020 wide colour range.
First we saw the well-known HDTV Rec.709 colour range and everything looked normal. Next up was the same scene in BT.2020 – and it was stunning. Back to Rec.709, and the scene that had looked just fine only seconds before now appeared washed out and unsatisfactory. I think HDR and rich colours will be addictive: once you've seen well-shot, full-colour scenes, you won't want to go back to Rec.709. The future is looking very colourful.

Women are making more of an impact in the industry (Hooray!)

There were three all-women panels at this year's HPA, none of which were on the subject of women in the industry. This was a stark contrast to the view of women in the industry shown in a 1930s documentary of the SMPTE Conference, where men with cigars dominated the proceedings and women were reduced to participating in the chattering social scene. The contrast was beautifully and ironically highlighted by Barbara Lange (Executive Director of SMPTE) and Wendy Aylesworth (President of SMPTE, 2005-2015), who hosted their panel in bathrobes with martini glasses while explaining the achievements of the society over the years. If you haven't yet contributed to the SMPTE documentary film project or the SMPTE centennial fund, it's time to do so now. These funds will help support the next, diverse generation of stars.

IMF and DPP are a symbiotic pair

One of the most interesting panels was on the Interoperable Master Format (IMF) and the Digital Production Partnership (DPP) interchange format (and yes, this was in fact one of my panels!). One format's purpose is to distribute a bundle of files representing several versions of one title; the other is designed to create a finished, single file with ingest-ready metadata that can be moved to playout with virtually no changes. Both formats have a strong foothold in the life cycle of any title and are likely to form the strongest of symbiotic relationships as we move into the future.
One thing that I pointed out to the audience is that the DPP has done a huge amount of work educating UK production and post-production houses about the change management required for file-based delivery. They have written a wonderful FREE guide that you can download from their website.

All in all, the HPA Tech Retreat is a wonderful event with so much information flowing that it takes weeks to absorb it all. I must confess, though, that one of the highlights for me was being able to cycle up the mountain every morning before breakfast. It meant that I could go back for seconds of all the wonderful cake that was on offer. Happy days!

Until next time – don't forget about our UHD webinar, happening today. If you didn't sign up in time, drop us a line at academy@dalet.com and ask for a re-run. The more people that ask, the more likely we'll do it!
Practice your scales to make your enterprise workflow sing
An increasingly common approach to developing new media infrastructure is the “proof of concept”. This could sound a little negative, as if we needed to try something first in order to see if it really works. But I do not think that is the motivation behind it: to meet the multi-platform, multi-format requirements of a media business today, we need complex, largely automated workflows, and it makes sense to try them out first in one part of the organization.

This achieves more than one goal. First, it obviously proves the concept: it shows that you have all the equipment and processes available to do what you need. Second, it allows you to develop workflows on the concept system, so you can fine-tune them to work precisely the way that you want to work. Some vendors will try to push you towards a big-bang approach where the workflows are baked into the architecture, which makes it difficult to make changes when you find you want something slightly different. Third, and this is really important, it allows you to get a subset of users comfortable with the system, taking ownership of the workflows. It means you get the processes right, because they are being designed by the people who actually need them, and it means you get a group of super-users who can ease the transition to the main system.

Which all sounds good. But it does depend upon something that we all talk about but rarely really understand. The proof of concept stage is only worthwhile if the small system performs in exactly the same way as the final enterprise-wide implementation.

Scalability

The word “scalable” is often used quite loosely, but this is what it really means: you can start with something small and then, by adding capacity, make it cover the whole operation without changing any detail of how it works. For me, that means the enterprise system has to be built the same way as the proof of concept system.
If the first iteration consisted of a single workstation performing all the functionality – which in our case might be ingest, transcode, quality control and delivery – then the full system should be a stack of workstations that can each perform all the functionality. It also means that you don't need to blow the capital budget on a huge number of hardware boxes. That would not be efficient, because at any given time some of the boxes might be idle while others had a queue of processes backed up and delaying the output.

Flexible Licensing

It's better to ensure you have sufficient licenses for the software processes you require, with a smart licensing system that can switch jobs around. If server A is running a complex transcode on a two-hour movie, then its quality control license could be transferred to server B, which can get on with clearing this week's batch of trailers and commercials.

The AmberFin iCR platform is designed on this basis. You can buy one server and run all the processes on it sequentially, or you can buy a network to share the load under the management of an iCR Controller. This manages the queue of tasks, allocating licenses as required from the central pool. As well as making the best use of the hardware, it also collects statistics from each server and each job. Managers can see at a glance if jobs are being delayed, and whether this is an overall problem for the business. More than that, they can also see why jobs are delayed. Can it be solved by additional software licenses, or do you need more servers?

Scalable systems are definitely the way to go, but only if you understand how you need to scale them. If you want to find out more about enterprise-level file-based workflows, check out our new white paper.

I hope you found this blog post interesting and helpful. If so, why not sign up to receive notifications of new blog posts as they are published?
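P.S. For readers who like to see ideas in code: the floating-license pattern described above can be modelled in a few lines. This is a toy sketch of the general idea (a central pool of per-process licenses checked out as jobs start and returned when they finish), not the iCR Controller's actual scheduler:

```python
from collections import Counter

class LicensePool:
    """Toy model of a floating-license controller: a central pool of
    per-process licenses (e.g. transcode, QC) shared between servers."""

    def __init__(self, licenses):
        self.free = Counter(licenses)   # e.g. {"transcode": 2, "qc": 1}
        self.in_use = Counter()

    def checkout(self, kind):
        """Try to grab a license for a job; False means the job must queue."""
        if self.free[kind] > 0:
            self.free[kind] -= 1
            self.in_use[kind] += 1
            return True
        return False

    def release(self, kind):
        """Return a license to the pool when a job finishes."""
        assert self.in_use[kind] > 0, "releasing a license never checked out"
        self.in_use[kind] -= 1
        self.free[kind] += 1
```

With a single "qc" license in the pool, server A checks it out for the movie, server B's trailer batch queues, and as soon as A releases the license B's checkout succeeds – the license follows the work, not the hardware. Counting how often checkouts fail per license type is exactly the statistic a manager needs to decide between buying more licenses and buying more servers.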