- Market Framework
- Maturity of Categories
- Considerations for Using Next-Gen File Service Solutions
- Vendors to Watch
- Near-Term Outlook
- Key Takeaways
- About Enrico Signoretti
There are at least four major areas of concern with traditional file services, ranging from the end-user experience to the flexibility of the backend infrastructure:
Evolving user needs. NFS and SMB were designed for high-speed, low-latency local area networks, and these protocols cannot be exposed efficiently over wide area networks or the internet. LAN access, however, is no longer the only way users reach their data: most now access it remotely from laptops or mobile devices, often while traveling. Access patterns are more complicated still, with users frequently switching devices while collaborating simultaneously on the same documents. Documents are also shared outside the team and may be copied to local devices, with all the risks that entails.
High TCO. File servers and NAS appliances, especially remote ones, are very expensive to manage, and providing the required SLAs is more difficult than in the past. The infrastructure must be properly protected against data loss, disasters, and a growing number of security risks. As a result, instead of the total $/GB spent on storage falling, costs are rising even as the amount of data under management grows – a trend that could quickly threaten the sustainability of the overall storage infrastructure.
Multi-cloud transition. Traditional file services, bound to a single system installed on premises, do not benefit from the advantages of the cloud, and even less from multi-cloud flexibility and economics in areas such as advanced data protection, disaster recovery, and scalability. Moreover, without a cloud backend, data distribution across a logically and geographically dispersed organization can be more challenging and expensive.
New regulations and security. New regulations such as the European General Data Protection Regulation (GDPR), and similar laws that will soon take effect in other countries, require a totally different approach to data storage, access, and retention. Most traditional storage systems lack the metadata, analytics, and automation tools needed to cope with these requirements. Additional software is a costly option and adds complexity to an already convoluted infrastructure, while most modern systems have these capabilities embedded and ready to use.
SaaS solutions, such as sync & share services from the public cloud, may partially address small-business needs, but they cannot offer adequate performance for certain workloads, nor adequate control over data for highly regulated environments. Moreover, most of these services do not offer data protection and archiving aligned with enterprise business needs, so additional external tools are required to make the system compliant with the stringent legislation and data retention policies usually found in large organizations or highly regulated fields.
New hybrid cloud infrastructure designs offer performance for local access, lower TCO, better data protection, and improved security compared to traditional infrastructure. The combination of small, stateless, local appliances that cache data locally with the capacity of the cloud, and the additional services that can be built on top of it, is the answer for building a modern file service platform for all kinds of users in the organization, regardless of its size or global distribution.
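The edge-caching pattern described above can be sketched in a few lines. This is a minimal, illustrative model only: the `CloudBackend` class is a hypothetical stand-in for an object store, and real appliances add encryption, eviction, and global file locking on top of the same basic read-through/write-through idea.

```python
import hashlib
from pathlib import Path


class CloudBackend:
    """Hypothetical stand-in for a cloud object store holding the authoritative copy."""

    def __init__(self):
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


class EdgeCache:
    """Small, stateless local appliance: hot files are served from a local
    cache at LAN speed, while the cloud backend remains the source of truth."""

    def __init__(self, backend: CloudBackend, cache_dir: str):
        self.backend = backend
        self.cache_dir = Path(cache_dir)
        self.cache_dir.mkdir(parents=True, exist_ok=True)

    def _local_path(self, key: str) -> Path:
        # Hash the key so any file path maps to a flat, safe cache filename.
        return self.cache_dir / hashlib.sha256(key.encode()).hexdigest()

    def read(self, key: str) -> bytes:
        path = self._local_path(key)
        if path.exists():
            return path.read_bytes()          # cache hit: local access
        data = self.backend.get(key)          # cache miss: fetch from cloud
        path.write_bytes(data)                # warm the cache for next time
        return data

    def write(self, key: str, data: bytes) -> None:
        # Write-through: the cloud copy is authoritative, the cache stays warm.
        self.backend.put(key, data)
        self._local_path(key).write_bytes(data)
```

Because the appliance holds no authoritative state, it can be lost or replaced without data loss, which is what keeps the remote-office footprint small and cheap to manage.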
In this report, we will analyze several aspects of next-gen file services including:
- Challenges of the traditional file services approach
- Modern user behaviors and access patterns
- The increasing impact of regulations like GDPR
- Data protection, auditing, and security in large scale file services
- Architecture design of modern file service infrastructure
- Hybrid vs. private cloud approaches for file services
Key findings (and considerations for adopting the solution):
- A new approach to file services is necessary to cope with both data growth as well as changing user needs. The TCO of traditional solutions is too high and they are not flexible enough to respond quickly to ever-changing business needs.
- The separation of the frontend access layer from the backend provides better services at a lower cost. If the backend storage layer is delivered on the cloud, all major pain points regarding scalability, data protection, and disaster recovery can be neutralized.
- Hybrid solutions offer the best compromise in terms of cost and flexibility, but some organizations could adopt a similar infrastructure design on-premises to retain better control of data and security when there are data sovereignty concerns or specific regulatory requirements.