Post-pandemic, the growth of digital data has been unprecedented, particularly unstructured data, which includes documents such as text and office files, images, and audio and video files.
When we started Storage Made Easy we were dealing with terabytes of data, which quickly grew to low-end petabytes. Now we deal with customers in the mid-to-high petabyte range, with the very largest customers talking about how to move to exabyte-scale storage.
Hybrid storage architectures are a combination of private and public cloud data resources. The pandemic accelerated many companies' shift of at least some data to the latter, but most enterprise companies we deal with have a combination of the two. Flexera's 2020 'State of the Cloud' report shows that 93% of enterprises have a multi-cloud strategy and 87% have a hybrid cloud strategy in development.
In my opinion this is not likely to change any time soon. For many enterprise companies, despite the public cloud providing ease of access and, at least initially, reduced complexity, there is still a requirement for the enhanced privacy, control and security provided by a private cloud environment, in which internal IT teams are on hand to mitigate data security and compliance risks.
This hybrid combination has created a complex data environment in which companies have their data spread across multiple storage environments, physical and virtual locations, and applications that function as pseudo storage repositories and retain data. Throw in remote working, with data stored on laptops outside the corporate environment, and the problem quickly compounds.
What is required is a way to abstract the data away from the storage, providing not only a single namespace but a single place to securely authenticate, manage and protect dispersed data more efficiently, whilst also making it easily accessible and more productive for end users.
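To make the idea concrete, here is a minimal sketch of what a unified namespace looks like in code: paths in one virtual tree are routed to different storage backends by mount prefix. All class and mount names here are hypothetical illustrations, not the File Fabric's actual implementation; a real system would also sit behind authentication, auditing and permission checks.

```python
class InMemoryBackend:
    """Stand-in for a real store (an S3 bucket, an SMB share, etc.)."""
    def __init__(self):
        self._objects = {}

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        return self._objects[key]


class UnifiedNamespace:
    """Routes virtual paths like /s3-media/video.mp4 to a mounted backend."""
    def __init__(self):
        self._mounts = {}  # mount prefix -> backend

    def mount(self, prefix, backend):
        self._mounts[prefix] = backend

    def _resolve(self, path):
        # Find the mount whose prefix matches, and strip it to get the
        # backend-local key.
        for prefix, backend in self._mounts.items():
            if path.startswith(prefix + "/"):
                return backend, path[len(prefix) + 1:]
        raise FileNotFoundError(path)

    def write(self, path, data):
        backend, key = self._resolve(path)
        backend.put(key, data)

    def read(self, path):
        backend, key = self._resolve(path)
        return backend.get(key)


# Two very different stores appear to the user as one tree.
ns = UnifiedNamespace()
ns.mount("/s3-media", InMemoryBackend())
ns.mount("/nas-archive", InMemoryBackend())
ns.write("/s3-media/video.mp4", b"<video bytes>")
ns.write("/nas-archive/contract.pdf", b"<pdf bytes>")
```

The point of the pattern is that security, audit and compliance logic can be applied once, at the namespace layer, rather than per backend.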
What else is required? Well, any approach should ensure that no additional complexity is introduced, including no vendor lock-in, whilst ultimately providing a strong return on investment.
The required stack ends up looking something like this:
As digital data continues to grow and hybrid working becomes the norm, this approach is key to maximising data value and reducing data compliance and security risk, whilst promoting end-user productivity through unified data access and data discovery.
Data transfer speed is also important, particularly in a hybrid working / hybrid data topology where transfers happen over the internet, which often introduces its own latency. The speed at which data is transferred has not only performance implications but also employee productivity implications, especially when dealing with massive volumes and different formats of data. If John from Acme Inc. is working from home and wants to transfer his 10GB media asset stored on Amazon S3 to a partner's Dropbox folder, he does not want to have to download it to his desktop and then re-upload it across consumer broadband. And how does the company track this asset movement from a data governance and compliance perspective?
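The alternative is a server-side relay: the asset is streamed between providers in fixed-size chunks, never touching John's laptop, with each transfer recorded for audit. The sketch below uses generic file-like objects rather than real provider SDKs; the `relay` function, chunk size and audit-log shape are all illustrative assumptions, not a specific product's API.

```python
import io

CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB chunks keep memory usage flat


def relay(source, sink, audit_log, chunk_size=CHUNK_SIZE):
    """Copy source -> sink in chunks, recording the bytes moved.

    In a real deployment, `source` would wrap a provider download stream
    (e.g. an S3 object body) and `sink` a provider upload session, so the
    whole asset is never buffered in memory or on the user's desktop.
    """
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:
            break
        sink.write(chunk)
        total += len(chunk)
    # Governance hook: every cross-cloud movement leaves an audit record.
    audit_log.append({"bytes_transferred": total})
    return total


# Stand-in 20 MiB asset relayed between two in-memory "clouds".
source = io.BytesIO(b"x" * (20 * 1024 * 1024))
sink = io.BytesIO()
log = []
moved = relay(source, sink, log)
```

Because only one chunk is in flight at a time, the relay's memory footprint stays constant regardless of whether the asset is 10GB or 10TB.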
For those who have followed my posts or presentations, a mantra I often come back to is that data is the most essential business asset a company owns. Data is the lifeblood of how a company operates day-to-day and is essential to maintaining and increasing competitive advantage. The data asset can quickly turn sour if not managed correctly, particularly in today's world in which data can be hit by ransomware, fall foul of compliance regimes or be exposed in a high-profile public data breach. The chances of this occurring increase vastly in a multi-cloud hybrid data environment.
Providing a solution to this problem is the challenge we set ourselves at Storage Made Easy. We often talk about the 360-degree view of files: depending on your viewpoint, what you require and what the obstacles are can be very different indeed.
We modelled our own File Fabric architecture on satisfying this value stack, reflecting what we believe is required to address hybrid file and object architectures, with this changing 360-degree view very much in mind:
Unfortunately, even today the most common approaches I see employed for hybrid data architectures are still based on point solutions that resolve pressing issues as they bubble to the surface, and/or on highly complex and costly integrations. It's a broken process, and there has to be a better way. There will always be another new data store, faster or cheaper or offering a whizzy new feature, that will be additive to what already exists within a company. These should be quickly and easily brought into the company's unified namespace, wrapped with the same security and compliance blanket and end-user productivity enhancements that are implicit for existing hybrid data.