The Flow of Data: What Internal Workflows Look Like for the Media and Entertainment Industries
It is very unlikely that creative organisations will revert to operating in exactly the same ways they did before the Covid-19 pandemic. In the past year, fully cloud-based operations and hybrid creative workflows have flourished, and the industry has proven itself capable of working remotely without compromising content creation pipelines or the quality of the end product.
Thanks to the rapid transformation of workflows that took place almost overnight in 2020 when the world was locked down, broadcast, visual effects, and production studios are now using solutions that would have previously been years down the road on their internal technology roadmaps.
Cloud computing, for example, is being used for tasks such as offline editing, pre-visualisation rendering, and data wrangling that traditionally took place on lower-powered, on-premises workstations. Others are using it for disaster recovery, including archive backup, and to connect different office locations.
Post-Pandemic Considerations
Deploying virtual workstations in the cloud has enabled many creative organisations to maintain project consistency remotely since the pandemic, and for many, the cloud has been seen as a way of reducing on-premises maintenance costs while implementing more resilient business models and workflows.
These rapid deployments can also, in the short term, reshape internal practices as new, more efficient workflows are formed. In evaluating the compute or storage resources required in the cloud, it is important to start with the end-users and what they need in order to deliver their work.
This includes having the network bandwidth users need to share and collaborate on data effectively. If overlooked, bandwidth can have a significant impact on business operations, making them prohibitively expensive or overwhelming to manage and potentially putting strain on other parts of the media pipeline. This is particularly true for studios working on 2D or 3D media files at 2K, 4K, and 8K resolutions.
When it comes to storing such large assets natively in the cloud, it can get very expensive to keep data in “cloud disk volumes,” the block storage technology that makes up the “primary tier” of most file systems. The same cannot be said for storing large volumes in object storage, such as Amazon S3 from AWS, Google Cloud Storage (GCS), and Azure Blob Storage from Microsoft Azure, where cost depends on a range of factors, including a studio’s data lifecycle needs and retrieval patterns.
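Lifecycle policies are the usual lever for managing those factors: assets can start in a standard object class and move to cheaper classes as a project ages. As a minimal sketch (assuming AWS and the boto3 SDK, with a hypothetical bucket name, prefix, and transition windows rather than recommended values), a rule like the following tiers finished-project assets down automatically:

```python
# Sketch of an S3 lifecycle rule that moves finished-project assets to
# cheaper storage classes as they age. Bucket name, prefix, and day
# thresholds are illustrative assumptions, not recommendations.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="studio-project-assets",            # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-down-finished-projects",
                "Filter": {"Prefix": "finished/"},   # only completed work
                "Status": "Enabled",
                "Transitions": [
                    # Still occasionally needed: move to Infrequent Access.
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    # Rarely touched: move to an archive tier.
                    {"Days": 180, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```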
The other side of this approach is putting large volumes of data into “very cheap” object storage, such as S3 Glacier or GCS Coldline. These tiers are intended primarily for long-term, lower-availability storage; if studios use them as the “primary area” for on-premises or at-home users, the savings are quickly eroded by the network delivery and retrieval charges incurred simply to access the data.
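To make the trade-off concrete, the back-of-the-envelope sketch below compares holding 100 TB on a primary block tier, in standard object storage, and in an archive tier with a monthly recall of part of the data. All per-gigabyte prices are illustrative placeholders, not current vendor list prices:

```python
# Back-of-the-envelope comparison for 100 TB of assets held for one month.
# All prices are illustrative placeholders, not real vendor pricing.
TB = 1000  # GB per TB (decimal, as cloud providers typically bill)
assets_gb = 100 * TB

block_per_gb_month = 0.10        # "cloud disk volume" / primary tier
standard_obj_per_gb_month = 0.02
archive_obj_per_gb_month = 0.004
retrieval_per_gb = 0.02          # archive retrieval fee
egress_per_gb = 0.08             # network delivery to on-premises/at-home users

# Monthly cost just to hold the data on each tier.
print("block storage:  ", assets_gb * block_per_gb_month)
print("standard object:", assets_gb * standard_obj_per_gb_month)
print("archive object: ", assets_gb * archive_obj_per_gb_month)

# If artists pull 10% of the archive back every month, the "cheap" tier
# picks up retrieval and egress charges on top of storage.
recalled_gb = 0.10 * assets_gb
archive_total = (assets_gb * archive_obj_per_gb_month
                 + recalled_gb * (retrieval_per_gb + egress_per_gb))
print("archive + monthly recall:", archive_total)
```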
Storage cost optimisation will be a big consideration for studios moving forward. Studios will need to find the right balance between small amounts of “expensive” storage for the tasks that require it and larger amounts of cost-effective storage, without users or applications facing slowdowns.
Alternatively, studios deploy specialist types of on-premises storage according to the job being performed, creating “islands” of storage technology. The only way of moving data from one stage of the pipeline to the next is then to manually duplicate files from one location onto another, demanding not only more time but also network and storage availability, and inevitably driving up the overall cost of the final deliverable.
Not a One-Size-Fits-All Approach
There is no one-size-fits-all approach to data storage and management for the media and entertainment industries. If you were to re-evaluate your position today, now that the industry is on the frontier of new working methodologies, a key first step would be to outline the upcoming projects for the year ahead. In doing so, you can fully understand the system performance requirements and identify a solution that takes advantage of the range of storage and compute options available in the marketplace.
For each project, it is beneficial to address the following questions to evaluate the current setup. Where are your users based? What content do they need to egress and work on, and what are their performance requirements in doing so? What media asset and project management tools are involved? How can the solution be integrated into existing storage and technology architectures?
In order to run more efficient and sustainable workflows that can be scaled up to meet project requirements, the resulting solution must be future-proof in terms of cost, reliability, and flexibility. It must build long-lasting efficiencies for storing and moving data on a global basis, guarantee high performance up to storage capacity limits, scale to cloud and hybrid infrastructures, and fit a wide variety of application integrations.
Perfecting the ‘Ultimate Workflow’
As you can imagine, such an evaluation is no easy feat. That’s why studios should consider software-defined storage solutions that enable analysis of where each part of the content workflow is performing well and where it could be improved. These are far from typical IT tools offering reactive monitoring, alerting, and dashboards aimed at technical staff; they are solutions that consider all aspects of the studio, from creatives to operational staff to the management team.
When combined with built-in data orchestration tools, storage solutions can make internal practices smoother and enable companies to operate more quickly and efficiently, spending less time worrying about individual elements of their infrastructure, such as a specific storage appliance. The end result is more time spent optimising their own workflows and business processes for better collaboration in the context of ‘hybrid workflows’.
If software-defined storage acts as an overlay on any studio’s current IT infrastructure, then a software-defined solution designed for media and entertainment gives studios the ability to choose which software, hardware, and cloud vendors to work with, when, and if needed. Many studios today are working with remote teams running virtual workstations that need connectivity to the studio. Operating within one integrated system ensures that workflows like this can be automated and that media asset and project management tools are available across the system, helping teams and workflows perform more efficiently.
Solutions with powerful APIs included as standard help studios streamline complex workflows and drive efficiencies, quickly and securely transporting data to and from globally distributed cloud, object, traditional NAS, and archive storage resources, and automatically moving data into the “right cost” resource according to value and usage as work teams and business needs demand.
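In practice, that kind of usage-based placement can be as simple as demoting assets that have not been touched for a while. The sketch below is a generic illustration of the idea rather than any particular vendor’s implementation; the paths, bucket name, and 60-day threshold are assumptions, and a production orchestrator would also leave a stub behind, preserve metadata, and verify each upload before deleting anything:

```python
# Generic sketch of usage-based data placement: files on the fast tier that
# have not been read for IDLE_DAYS are pushed to object storage and removed
# locally to reclaim expensive primary capacity.
import time
from pathlib import Path

import boto3

FAST_TIER = Path("/mnt/fast-nas/projects")   # hypothetical primary tier
BUCKET = "studio-cool-tier"                  # hypothetical object bucket
IDLE_DAYS = 60                               # illustrative threshold

s3 = boto3.client("s3")
cutoff = time.time() - IDLE_DAYS * 86400

for path in FAST_TIER.rglob("*"):
    if path.is_file() and path.stat().st_atime < cutoff:
        key = str(path.relative_to(FAST_TIER))
        s3.upload_file(str(path), BUCKET, key)   # move to the "right cost" tier
        path.unlink()                            # free the expensive tier
```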
Perfecting the Flow of Data
Software-defined data storage and management solutions enable media and entertainment companies to run more efficient and sustainable storage to underpin pipelines and workflows, whilst accelerating applications to deliver projects faster. With data orchestration and integration to job management tools, studios can free up manpower resources and reduce egress charges for greater cost predictability.
This intelligent and collaborative flow of data will inevitably give media and entertainment companies a huge competitive advantage, helping them remain agile whilst lowering infrastructure costs and maintaining high-quality creative output.
About the author: Jamie Bean, a senior solutions architect at pixitmedia, has over 20 years’ experience in the media and entertainment industry. Having worked with industry leaders such as Envy Post Production and Glassworks, Jamie joined pixitmedia to play a fundamental role in researching trends and technologies. Today, Jamie solves the problems of working with heavy media files and provides customers with insight into building high-performance media workflows for storing and moving data effectively, reliably, and securely.