Snowballs, buckets and bill shock

Migrating a content catalogue to the cloud without suffering a meltdown

By Blue Lucy

If your content catalogue isn’t already in the cloud, you’re probably wishing it were right now. Despite the dramatic growth in the media industry’s use of cloud-based storage in recent years, and the escalating take-up of cloud operations in the wake of the Covid-19 pandemic, there is still some reluctance to place high-value material on third-party storage.

Here are three simple steps, distilled from recent projects, which should allay any fears and enable a controlled migration to ‘the cloud’.

 

1.  Let Snowballs do the heavy lifting…

One of the perceived barriers to migrating a content catalogue to the cloud is the sheer size of the data and, therefore, the time it would take to upload the material. We’ve recently supported the migration of significant media catalogues to AWS S3 storage, and the process is really straightforward thanks to the AWS Snowball service.

To move the media from your facility into S3, AWS ship a large (up to 100TB) storage device called a Snowball to your site. This is connected to the local network and an operator simply copies across the media files to be transferred. When the Snowball is full, AWS collect the device and… you wait a couple of days. As the Snowball thaws (not an AWS term!) the files begin to arrive in your nominated S3 bucket.
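If you want to keep an eye on the thaw, a few lines of Python are enough. The sketch below is a minimal example, assuming boto3 is installed, AWS credentials are already configured and the bucket name is a placeholder for your own; it simply lists the most recently arrived objects.

import boto3

# Placeholder bucket name - substitute your own nominated S3 bucket.
BUCKET = "my-content-catalogue"

s3 = boto3.client("s3")

# Page through the bucket and report the most recently written objects,
# a quick way to confirm that Snowball-transferred files are landing.
paginator = s3.get_paginator("list_objects_v2")
objects = []
for page in paginator.paginate(Bucket=BUCKET):
    objects.extend(page.get("Contents", []))

for obj in sorted(objects, key=lambda o: o["LastModified"], reverse=True)[:20]:
    print(obj["LastModified"], obj["Size"], obj["Key"])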

So far so easy. At this point your media operations management platform should pick up the workflow and bring the content under management. This should be equally simple and will likely include:

  • registering new files, carrying out basic integrity checks and creating a checksum if necessary;
  • inspecting media files for technical data such as resolution, frame rate or audio track layout, and storing this as technical metadata against the asset file;
  • running quality control checks via a suitable cloud-based service;
  • creating a browse proxy version for easy visualisation, and
  • archiving the file promptly to an appropriate storage class based on its short-term usage needs. There is no need to keep a file at Standard class if it will not be used for some time, and equally there is no value in sending media to Deep Archive that will soon be needed for distribution (see the sketch after this list).
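As a hedged illustration of that last point, the Python sketch below attaches a lifecycle configuration to a bucket so that material under a given prefix is transitioned to colder storage classes over time. The bucket name, prefix and day counts are assumptions for the example, not a recommendation; your own usage patterns should drive the rules.

import boto3

s3 = boto3.client("s3")

# Placeholder names and timings - substitute values that reflect your own
# catalogue structure and how soon the material will be needed again.
BUCKET = "my-content-catalogue"

s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-masters",
                "Filter": {"Prefix": "masters/"},
                "Status": "Enabled",
                "Transitions": [
                    # Move to Glacier after a month of inactivity...
                    {"Days": 30, "StorageClass": "GLACIER"},
                    # ...and to Deep Archive once distribution is unlikely.
                    {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    },
)

Lifecycle rules apply automatically as objects age; for a one-off move of a single asset, the storage class can also be changed with a copy request.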

 

2.  Hold your own keys to the bucket

Although cloud-based material is stored on infrastructure owned by another party, concerns about loss or unauthorised access (theft) should be negligible. AWS S3 and similar cloud-based storage is secure as long as it’s properly configured, account management is carefully controlled and a formal infosec policy is developed and adhered to.
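Proper configuration starts with shutting the obvious doors. As a minimal sketch, assuming boto3 and a placeholder bucket name, the call below switches on S3’s bucket-level Block Public Access settings so that no ACL or policy can accidentally expose the material.

import boto3

s3 = boto3.client("s3")

# Placeholder bucket name - substitute your own.
BUCKET = "my-content-catalogue"

# Block every form of public access on the bucket, so a misconfigured
# ACL or policy cannot accidentally expose high-value material.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)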

At Blue Lucy we recommend a “bring your own bucket” model. Our BLAM platform manages content and orchestrates and automates operational processes using a variety of services. BLAM manages access to these third-party services and tracks the utilisation costs, but the commercial relationship is directly between the content owner and the service provider.

This model provides operators with maximum control in terms of security and cost. Content owners may negotiate favourable rates with storage and other service providers without any unnecessary, margin-stacking intermediaries. There are also operational benefits to controlling your own bucket. One of the simplest and most cost-effective ways to ‘deliver’ content to direct-to-consumer providers is to provide managed access at the bucket level, which may not be possible if the bucket belongs to an intermediary service provider or if your content sits in a bucket alongside another company’s material.
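To make the bucket-level delivery idea concrete, here is a rough sketch, again in Python with boto3, of granting a distribution partner read-only access to one prefix of your bucket. The account ID, bucket name and prefix are invented for the example; a real policy should be drafted under your infosec policy and reviewed before it is applied.

import json
import boto3

s3 = boto3.client("s3")

# Placeholders - your bucket, the partner's AWS account and the
# delivery prefix they are allowed to read.
BUCKET = "my-content-catalogue"
PARTNER_ACCOUNT = "111122223333"
PREFIX = "delivery/partner-a/"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Let the partner download objects under the delivery prefix only.
            "Sid": "PartnerReadObjects",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{PARTNER_ACCOUNT}:root"},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/{PREFIX}*",
        },
        {
            # Let them list the bucket, but only within that prefix.
            "Sid": "PartnerListPrefix",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{PARTNER_ACCOUNT}:root"},
            "Action": "s3:ListBucket",
            "Resource": f"arn:aws:s3:::{BUCKET}",
            "Condition": {"StringLike": {"s3:prefix": f"{PREFIX}*"}},
        },
    ],
}

s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))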

 

3.  Track your Costs – as you go

Tracking costs, ideally in real time, is an increasingly important consideration for operators in the era of software-as-a-service. While the ability to use high-end software tools on a pay-per-use basis is transforming the media industry in terms of accessibility and autonomy, the accompanying flexible payment model is very different from the traditional model of amortising a large initial spend over a given period. Anecdotes of ‘bill shock’ from unexpected egress costs abound as content producers struggle to untangle the openly published but often complex cost models of cloud services.

Although operators cite aggregate storage and media processing as the most important cloud service costs to track, operations management platforms should provide the ability to configure this tracking, enabling:

  • reporting costs against single assets;
  • aggregation by cost centre, campaign or tracked service (as sketched below), and
  • importantly, forecasting costs based on required operations.
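As a rough sketch of the aggregation point, and assuming the operation’s AWS resources are tagged with a hypothetical CostCentre tag, the Python snippet below pulls a month’s spend from the Cost Explorer API grouped by that tag. The tag key and dates are placeholders; real-time tracking inside a management platform would poll this, or per-service billing data, far more frequently.

import boto3

# The Cost Explorer API aggregates billing data across AWS services.
ce = boto3.client("ce")

response = ce.get_cost_and_usage(
    # Placeholder billing period - one calendar month.
    TimePeriod={"Start": "2020-05-01", "End": "2020-06-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    # Hypothetical tag key - assumes resources are tagged by cost centre.
    GroupBy=[{"Type": "TAG", "Key": "CostCentre"}],
)

# One result set per period; print the spend attributed to each cost centre.
for group in response["ResultsByTime"][0]["Groups"]:
    cost = group["Metrics"]["UnblendedCost"]
    print(group["Keys"][0], cost["Amount"], cost["Unit"])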

 

The Covid-19 crisis means that 2020 is likely to be the year that sees the most rapid growth yet in the migration to cloud-based operations. I hope these simple steps will mean the process is painless, and that the new operating models that go with it are as cost-effective as they are flexible.

 

This article was first published in the May edition of TVBEurope.

