The recent past has demonstrated the importance of having an IT infrastructure that is ready for anything. We’ve seen just how fast the world can change. That might mean adapting to a remote workforce, or shifting priorities and even entire operations to keep customers satisfied.
Within that fast-moving environment, the nature of storage has quickly evolved. Once the esoteric domain of a few IT people, your storage infrastructure now plays a highly visible role, acting as either an inhibitor that holds you back or an enabler that drives your business forward.
The critical role of data in decisions large and small demands a fresh evaluation of storage techniques. We need to consider the ever-increasing quantity and variety of data, as well as the speed with which it can be accessed, processed, analyzed and put to work. Businesses across all industries are rapidly discovering new use cases, creating new applications and modernizing the old to speak the language of the cloud.
Modernizing traditional applications calls for enhanced performance for established object workloads. Even more pressing are the needs of new applications that demand both a high-performance parallel file system and massive scale. Today, Hitachi Vantara has revealed significant enhancements to its object storage capabilities to address all of these requirements.
See the News: Hitachi Content Platform Enhances Unstructured Data Operations
Putting Unstructured Data to Work
Unstructured data used to be something a business had to retain for years to meet governance and compliance requirements. Now that data, along with vast streams of data from IoT sensors, remote devices, video cameras and more, needs to be consumed, analyzed and acted upon in real time. This isn’t data storage for storage’s sake anymore. These are immediate revenue-generating opportunities. While traditional file-based storage has moved from fast NAS to scale-out file systems, that approach has hit a performance wall for many cloud-native applications and next-gen analytics use cases.
Instead, organizations are increasingly pointing applications at parallel file systems for the extreme performance needed by high-performance computing, AI and machine learning (ML), and at object stores for performant, modern, cloud-based storage services. We’ve made significant enhancements to the Hitachi Content Platform (HCP) portfolio for object storage to support the demands of these new workloads.
Optimized Performance for Next-Generation Applications
Today’s new applications have very different performance demands from those of the past. Hotspotting, where high activity on the storage side slows the flow of data to and from applications, is more than an inconvenience for users. In many workflows, delays have a direct impact on the bottom line. Today’s Hitachi Content Platform has been specifically optimized to deliver high-performance application support without creating bottlenecks.
Optimized Scale-Out To Customize Data Services for Demanding Use Cases
While the public cloud is frequently offered as the answer to every need, the reality is that public cloud services are typically one-size-fits-all. When it comes to storage, a generic architecture may support an application in theory, but in practice it is a breeding ground for silos. The range of existing and new workloads your organization supports encompasses a variety of very specific needs.
We’ve enhanced Hitachi Content Platform so you can scale out discrete data services based on use case requirements. This improves efficiency, eliminates bottlenecks and ensures applications have the performance they need, when and where they need it. HCP’s scale-out life-cycle policy means you’re not forced to scale the front end and compute in lockstep with added storage. Instead, HCP lets you scale where needed to meet specific application requirements, and performance scales linearly as infrastructure is added. So, for example, if you have a use case such as analytics, video or backup that writes and deletes large datasets, HCP policies can be set to provide the correct mix of resources.
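As a rough illustration of the kind of policy such a use case might rely on, the sketch below applies a lifecycle rule through a generic S3-compatible API so short-lived analytics or backup datasets are expired automatically. The endpoint, bucket name, prefix and retention period are hypothetical placeholders, not documented HCP settings.

```python
# Illustrative sketch: set a lifecycle rule via an S3-compatible API so
# short-lived analytics or backup data is expired automatically.
# Endpoint, credentials, bucket and rule values are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://hcp.example.com",   # assumed S3-compatible endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

s3.put_bucket_lifecycle_configuration(
    Bucket="analytics-scratch",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-scratch-datasets",
                "Filter": {"Prefix": "scratch/"},
                "Status": "Enabled",
                "Expiration": {"Days": 7},   # delete large, short-lived datasets after a week
            }
        ]
    },
)
```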
Intelligent Management With Automated Real-Time Classification
Leveraging Hitachi Content Intelligence in conjunction with the intelligent, metadata-based tiering inherent in HCP makes it possible to classify data in real time as it comes in. You can identify sensitive information inside unstructured data based on criteria you define, then execute actions according to workflows you’ve designed to suit business objectives, governance requirements or other parameters.
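Hitachi Content Intelligence provides its own workflow tooling; purely as an illustrative stand-in, the sketch below shows the general pattern of checking an incoming object against user-defined criteria and recording the result as object tags through a generic S3-style API. The criteria, bucket and endpoint are hypothetical and this is not the Content Intelligence API.

```python
# Illustrative only: classify an incoming object against user-defined criteria
# and record the result as S3-style object tags. Generic pattern, not the
# Hitachi Content Intelligence API.
import re
import boto3

# User-defined criteria: classification label -> pattern to look for.
CRITERIA = {
    "pii": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # SSN-like pattern
    "payment-card": re.compile(r"\b(?:\d[ -]?){16}\b"),   # card-number-like pattern
}

s3 = boto3.client("s3", endpoint_url="https://hcp.example.com")  # assumed endpoint

def classify_object(bucket: str, key: str) -> list[str]:
    """Scan an object's content and tag it with any matching classifications."""
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8", "ignore")
    labels = [name for name, pattern in CRITERIA.items() if pattern.search(body)]
    if labels:
        s3.put_object_tagging(
            Bucket=bucket,
            Key=key,
            Tagging={"TagSet": [{"Key": "classification", "Value": ",".join(labels)}]},
        )
    return labels
```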
Handle the Unexpected With Smart Load Balancing
Your business depends on the uninterrupted flow of data. With smart load balancing and other features, we’re continuing the legendary resiliency the Hitachi Content Platform has always been known for. Object storage is now a primary target for new cloud-native applications, so reliability and availability are paramount. HCP’s new smart-load-balancing feature helps handle unexpected outages and activity surges, and ensures the solution can expand seamlessly when needs change.
Faster Time to Value
With our new automated installer, object storage resources can be set up and deployed correctly and quickly to bring your new workloads online.
Strongly Consistent Bucket Listings
Sophisticated new applications such as analytics and Hadoop bring complex, multistage workflows. Storage has to be consistent across every stage or the entire workflow will struggle. New, strongly consistent bucket listings remove the need for extra middleware, improving overall efficiency and reducing cost.
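To see why this matters for multistage pipelines, consider the common list-after-write pattern sketched below. With strongly consistent listings, a downstream stage can trust that everything the previous stage wrote appears immediately, with no separate manifest or tracking layer. The endpoint and bucket names are illustrative assumptions.

```python
# Illustrative list-after-write pattern: with strongly consistent bucket
# listings, stage 2 can trust the listing immediately after stage 1 finishes
# writing, with no extra manifest or middleware in between.
import boto3

s3 = boto3.client("s3", endpoint_url="https://hcp.example.com")  # assumed endpoint

# Stage 1: write intermediate results.
for i in range(10):
    s3.put_object(Bucket="pipeline-data", Key=f"stage1/part-{i:04d}", Body=b"...")

# Stage 2: list and process everything stage 1 produced.
paginator = s3.get_paginator("list_objects_v2")
keys = [
    obj["Key"]
    for page in paginator.paginate(Bucket="pipeline-data", Prefix="stage1/")
    for obj in page.get("Contents", [])
]
assert len(keys) == 10  # strong consistency: no missing or stale entries
```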
Parallel File Systems for Performance-Sensitive Applications
Performance-sensitive use cases such as high-performance computing, AI and ML, and analytics need parallel file systems to handle massive numbers of simultaneous reads and writes at scale. We’ve tightly integrated our object storage into our new Hitachi Content Software for File solution to deliver unrivaled performance for your most demanding high-performance computing applications at an optimized cost. In addition to its currently supported range of file-based protocols, Content Software for File can now accept writes directly via the Amazon S3 API to deliver the write speeds needed for ingest-intensive workloads.
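As an indicative sketch of what an ingest-heavy client could look like, the example below sends parallel multipart uploads straight to an S3 endpoint sitting in front of the parallel file system. The endpoint, bucket and tuning values are assumptions for illustration, not documented Content Software for File settings.

```python
# Indicative sketch of ingest-heavy S3 writes: parallel multipart uploads
# sent directly to an S3 endpoint in front of the parallel file system.
# Endpoint, bucket and tuning values are illustrative assumptions.
import boto3
from boto3.s3.transfer import TransferConfig

s3 = boto3.client("s3", endpoint_url="https://hcsf.example.com")  # assumed endpoint

config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=16,                      # parallel part uploads for ingest throughput
)

s3.upload_file(
    Filename="/data/instruments/run-0001.raw",
    Bucket="ingest",
    Key="instruments/run-0001.raw",
    Config=config,
)
```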
Ready for Deployment
In today’s fast-changing application landscape, your business can’t afford to wait for technology to catch up to your needs. Tomorrow may be too late. Hitachi Vantara understands, so we have a tested and documented reference architecture, and we are completing testing for NVIDIA GPUDirect Storage certification to accelerate deployment, improve predictability and shorten time to value for these new workloads. The result is optimized configurations that are ready to help you meet your business objectives.
Learn More About Our Object Storage Portfolio
Tanya Loughlin is Director of Product Marketing, Hitachi Content Platform Portfolio at Hitachi Vantara.
Tanya leads product marketing for the Hitachi Content Platform portfolio, including object storage, related cloud storage gateways and more. She has more than 20 years of experience across product marketing, voice-of-the-customer and business development disciplines.