
Common Media and Entertainment Scaling Challenges


The public cloud wasn’t built for media and entertainment companies and the massive amounts of data that they work with. From video and audio content to advertisements to logs and analytics, it is a firehose of data that needs a place to live.

There are clear and significant benefits to using the public cloud to store and process large data sets, including scalability, elasticity and flexibility. But public cloud infrastructure isn't optimized for the CPU- and bandwidth-intensive workloads that media and entertainment companies rely on, and those bottlenecks can slow down their entire operations. And the costs of regularly accessing data from the public cloud can add up quickly.

Let’s take a look at the major scaling challenges for media and entertainment companies and find out how Platina Systems can help address them.

Workflows, Workflows and Workflows

There are several common workflows across the media and entertainment industry, although they might vary according to the size of the production house, the distribution mechanism or the audience being served.

A typical workflow for broadcasters may look something like this:

  • Ingest your content and advertisements from third parties.
  • Prepare the content for playback.
  • Perform verifications and quality control checks on at least 50 to 100 advertisements per day.
  • Incorporate specialized effects and other production elements as desired.
  • Distribute content to a transmission server, content delivery network (CDN) or other delivery mechanism.
  • Copy content into archives.

You can automate a lot of these common workflows with very basic processes, but at smaller media companies in particular, you tend to find customized, homegrown solutions for each workflow. These solutions require multiple tools, and there’s a lack of integration and orchestration between them. These do-it-yourself toolchains become complex, brittle and a nightmare to support.

That approach just can't scale as the number of workflows grows, especially as new, more complex workflows such as metadata extraction and analysis emerge.

More advanced organizations take an approach similar to modern software development: a curated pipeline of steps in which every piece of content that flows through is validated, QA/QC-checked and reviewed at each step of the process. These pipelines become common, pluggable and reconfigurable workflows that set a clear order of operations to ensure repeatability, reduce the brittleness of the environment and increase reliability. Let's take a deeper look.
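
To make the idea concrete, here is a minimal sketch of such a pipeline in Python. The step names and checks are purely illustrative; a real pipeline would call out to dedicated QC and packaging tools rather than stub functions.

```python
# Minimal sketch of a pluggable content pipeline. Step names and checks
# are illustrative; a real system would integrate existing QC tools.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Asset:
    path: str
    notes: List[str] = field(default_factory=list)

Step = Callable[[Asset], Asset]

def validate(asset: Asset) -> Asset:
    asset.notes.append("validated: container and codec readable")
    return asset

def qc_check(asset: Asset) -> Asset:
    asset.notes.append("QC passed: loudness, black frames, captions")
    return asset

def package(asset: Asset) -> Asset:
    asset.notes.append("packaged for distribution")
    return asset

def run_pipeline(asset: Asset, steps: List[Step]) -> Asset:
    # Steps run in a fixed order, so every asset gets the same treatment.
    for step in steps:
        asset = step(asset)
    return asset

if __name__ == "__main__":
    result = run_pipeline(Asset("promo_spot.mxf"), [validate, qc_check, package])
    print("\n".join(result.notes))
```

Because each step has the same signature, steps can be reordered, swapped or added without rewriting the surrounding workflow, which is exactly what makes the pipeline approach scale.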

Ingest and Transformation

Data ingestion is the process of moving information from various sources onto one storage platform: bringing data into your organization for the first time, or transferring physical media to digital form. High-speed stream ingest of MP2- or MP4-formatted video can exceed 20 GB per hour!

Ingest creates a high-bitrate firehose of data that you need to write to disk. To speed up the process, you want to ingest data in a highly parallelized way. But if you're ingesting data into the public cloud, your internet connection becomes the limiting factor; the same process is typically much faster over on-premises networking.
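
As a rough illustration, here is how a parallel ingest step might look in Python. The paths are placeholders, and a real ingest job would also verify checksums and register each asset in a media asset manager.

```python
# Sketch of parallelized ingest: copy many source files to local storage
# concurrently. Paths are placeholders; a real ingest would also verify
# checksums and register assets in a media asset manager.
import shutil
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def ingest_one(src: Path, dest_dir: Path) -> Path:
    dest = dest_dir / src.name
    shutil.copy2(src, dest)   # copy, preserving timestamps
    return dest

def ingest_all(sources: list[Path], dest_dir: Path, workers: int = 8) -> list[Path]:
    dest_dir.mkdir(parents=True, exist_ok=True)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda s: ingest_one(s, dest_dir), sources))
```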

Transformation and transcoding are the processes of taking raw video feeds and converting them into the right formats for today's platforms. Content aggregators may have upwards of 40 to 50 different variations of incoming files that all need to be normalized and transformed to meet their platform specifications. There may be AVIs, various flavors of MPEG, MXF and IMF files and more. This content may be compressed or uncompressed. Some of it may not be in the desired bitrate or aspect ratio. This transformation typically requires significant manual work.
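
As a sketch of what one normalization step might look like, the snippet below shells out to ffmpeg (assumed to be installed). The codec, bitrate and resolution are illustrative examples, not a recommended house format.

```python
# Sketch of a normalization step: use ffmpeg (assumed installed) to
# transcode an incoming file to a single house format. The codec, bitrate
# and resolution below are illustrative, not a recommended spec.
import subprocess
from pathlib import Path

def normalize(src: Path, dest: Path) -> None:
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", str(src),
            "-c:v", "libx264", "-b:v", "8M",   # example video codec and bitrate
            "-vf", "scale=1920:1080",          # example target resolution
            "-c:a", "aac", "-b:a", "192k",     # example audio settings
            str(dest),
        ],
        check=True,
    )
```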

Transcoding is a massively CPU-intensive workload, but it is also generally a one-time operation. The public cloud may make sense for your occasional transcoding needs, but the private cloud may be more cost-effective and computationally efficient for regular high-volume transcoding. A private cloud has the benefit of on-premises networking and no egress fees.

On-Site Production

In visual effects and production environments, where storage demand flexes constantly, a lot of effort goes into using storage space, which is at a premium, as efficiently as possible. If you've got a major production starting, the tier-one storage has to be available. And as soon as that project is finished, the tier-one storage needs to be freed up for the next project.

This workflow may look like:

  • Keep any new content in fast, on-premises storage for a certain number of days.
  • Keep a copy of this content on less-performant storage for the same timespan.
  • Move that content to the public cloud once it is unlikely to be needed again.

The costs of data egress from cloud providers, the latency of networks and the constant movement of assets all add up, both in direct cloud bills and in staff time lost to waiting.
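
One simple way to reason about this workflow is as an age-based tiering policy. The sketch below is illustrative only; the thresholds and tier names are assumptions, not a prescription.

```python
# Sketch of an age-based tiering decision for production assets. The
# thresholds and tier names are assumptions for illustration only.
from datetime import datetime, timedelta
from typing import Optional

HOT_DAYS = 30    # keep on fast, on-premises tier-one storage
WARM_DAYS = 90   # keep a copy on less-performant storage

def tier_for(last_accessed: datetime, now: Optional[datetime] = None) -> str:
    now = now or datetime.utcnow()
    age = now - last_accessed
    if age <= timedelta(days=HOT_DAYS):
        return "hot"    # tier-one, on-premises
    if age <= timedelta(days=WARM_DAYS):
        return "warm"   # cheaper on-premises or private cloud tier
    return "cold"       # public cloud or deep archive
```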

Metadata Extraction and Analysis

As machine learning and artificial intelligence capabilities are deployed into video workflows, we now have the ability to scan and analyze video and audio content to perform facial detection, verify advertisements and even collect in-game statistics. These activities unlock new potential in video archives and in live formats, but they come at a price. These storage- and compute-heavy activities require repeated access to content streams, which ultimately adds up on the monthly egress bill.
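
As a rough illustration of this kind of workload, the sketch below samples frames from a video and runs a basic face detector using OpenCV (assumed to be installed as opencv-python). A production workflow would use a far more accurate model and store the results as searchable metadata.

```python
# Sketch of frame-level face detection over a video file using OpenCV.
# A real workflow would use a more accurate detector and persist results.
import cv2

def count_faces(video_path: str, sample_every: int = 30) -> int:
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    cap = cv2.VideoCapture(video_path)
    total, frame_idx = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % sample_every == 0:   # sample frames to save compute
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            total += len(cascade.detectMultiScale(gray, 1.1, 5))
        frame_idx += 1
    cap.release()
    return total
```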

Packaging and Security

Packaging is the process of taking transcoded content and preparing it for consumption by various players. You may want to apply security, such as digital rights management technology, to your content during this process. The output of these processes is a ready-to-consume video feed in formats such as HLS, HDS and MPEG-DASH, compatible with today's modern web players.
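
For illustration, a minimal packaging step might segment an already-transcoded file into HLS using ffmpeg (assumed to be installed). DRM, encryption and DASH output are omitted here for brevity.

```python
# Sketch of a packaging step: segment an already-transcoded file into HLS
# using ffmpeg (assumed installed). DRM/encryption is omitted; real
# packaging would add key handling and likely DASH output as well.
import subprocess
from pathlib import Path

def package_hls(src: Path, out_dir: Path) -> Path:
    out_dir.mkdir(parents=True, exist_ok=True)
    playlist = out_dir / "index.m3u8"
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", str(src),
            "-c", "copy",                 # no re-encode; just repackage
            "-f", "hls",
            "-hls_time", "6",             # example segment length (seconds)
            "-hls_playlist_type", "vod",
            str(playlist),
        ],
        check=True,
    )
    return playlist
```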

Depending on how you design it, you may end up paying an egress fee to your storage provider to get the content through your packaging and security pipeline. It’s yet another opportunity for public cloud providers to charge you. On the other hand, packaging is a bursty, generally one-time workload, so it may be well suited for the public cloud. That is, unless you’re trying to do fancy things such as server-side ad insertion or manifest manipulation.

In that case, this isn’t a one-time action, and it becomes a great use case for a hybrid cloud strategy. Bursty work goes to the cloud, but your baseline workload lives on happily in the cost-optimized private cloud.

Content Distribution

The cost of moving data from the public cloud to your CDNs can be prohibitively high.

First, there are the egress fees for moving from central cloud storage to CDN. Then, depending on how high your CDN’s cache hit ratio is, you may end up having to pay for the same data being accessed over and over again as it expires from cache. And the more you rely on multiple CDN points of presence — whether it be for improved performance or to comply with geographic or political content restrictions — the more these costs are multiplied.
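
A quick back-of-the-envelope calculation shows how the cache hit ratio drives the egress bill. Every number below is an assumption for illustration only, not a quoted price.

```python
# Back-of-the-envelope sketch of how cache misses multiply origin egress
# cost. Every number below is an assumption for illustration only.
def monthly_origin_egress_cost(
    delivered_tb: float,         # total TB served to viewers per month
    cache_hit_ratio: float,      # fraction of requests served from the CDN cache
    egress_per_gb: float = 0.09  # assumed cloud egress price, $/GB
) -> float:
    origin_tb = delivered_tb * (1.0 - cache_hit_ratio)
    return origin_tb * 1000 * egress_per_gb

# Example: 500 TB delivered per month at a 90% vs. 70% cache hit ratio
print(monthly_origin_egress_cost(500, 0.90))   # ~$4,500
print(monthly_origin_egress_cost(500, 0.70))   # ~$13,500
```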

Analytics and Troubleshooting

The above concerns and considerations apply not only to creating and distributing content. They also come into play when gathering analytics and conducting troubleshooting.

Analytics and troubleshooting generate a massive amount of data. If you ingest this data into the public cloud and regularly access it, that will lead to more cost and performance issues. And if you need to store this data across multiple regions because of cybersecurity and privacy regulations, again, that’s another multiplier.

Archives and Storage Management

As a final step in the process, extensive archives are created. These archives have typically been written to linear tape-open (LTO) storage because of the sheer volume of data. LTO is often seen as a cost-effective solution, but that economy comes at a cost in data durability. The industry chronically underestimates the operational overhead of managing LTO storage: the cost of the robot, the tape management, the data center and the people required to maintain it all properly. These costs can actually make it more expensive than cloud storage, with the added problem of the data sitting in cold storage.

As content archives become increasingly valuable for metadata extraction and analysis, the benefit of LTO is being eroded, as value can be extracted from nearby hot archives. The price-to-benefit ratio is greatly influenced by the underlying cost of said storage and the cost of accessing the data. 

The Platina Difference

By building, automating and managing private and hybrid cloud environments, Platina Systems enables organizations to have all the benefits of the public cloud on-premises — even at the petabyte-plus scale that media and entertainment companies require. Media and entertainment companies can gain full control over their workflows, processes and cost optimization at every step of the way.

With a private or hybrid cloud, you can control your infrastructure costs, eliminate or reduce data egress fees and run your most demanding workloads — editing, transcoding, packaging, distribution, analytics and more — in a much more optimized way. And you can always burst up to the public cloud when needed.


Learn how Platina Command Center can help your company scale.
