forms (i.e. device/instrument-generated raw data, to derivative data at a project level, etc.).

Trend #2: Storage will be architected and consumed as Software-defined

We can expect to see new storage designs in 2020 that will further blur the line between storage and compute. With deeper integration of virtualization technologies on the storage array, apps can run directly on the same system and be managed with standard tools. This could suit data-centric applications that require very storage- and data-intensive operations.

Software-defined infrastructure (SDI) is also becoming a greater consideration in enterprise data centers to augment traditional SAN and HCI deployments. Long the realm of hyperscalers, SDI is now ready for adoption by traditional enterprises to redeploy certain workloads whose capacity and compute requirements differ from what traditional three-layer SANs can provide. The solution suits customers that need to consolidate multiple high-performance (e.g. database) or general workloads. As enterprises consider consolidation strategies, they will bump up against the limits of traditional SANs and the unpredictable performance, costs and lock-in of cloud services. This is where SDI becomes a very viable alternative to traditional SANs and HCI for certain workloads.

Trend #3: High-performance Object storage enters the mainstream

As Object moves from cheap-and-deep, cold storage or archive to a modern cloud-native storage platform, performance is on many people's minds. One of the reasons we see this solution rising in 2020 is demand from application developers. Analytics is also driving a lot of demand, and we expect to see companies in different verticals moving in this direction. In turn, the added performance of flash and NVMe is creating tremendous opportunity for Object-based platforms to support workloads that require speed and near-limitless scale.

Flash-based Object storage with automated tiering to disk offers a cost-effective solution, particularly when a customer is talking about hundreds of petabytes or exabyte scale. It allows you to move the data you need up to the flash tier to run your analytics and high-performance applications, and then move the data off to a cold or archive tier when you're done with it.

As the pace of technology innovation accelerates, so too will the possibilities in storage and data management. We are standing with our customers at the dawn of the "Data Decade."

Amit Luthra, Director & GM, Storage & CI, India Commercial
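As an illustration of the automated tiering described above, one common way to express the "demote when you're done" half of that workflow on an S3-compatible object store is a bucket lifecycle policy. The sketch below is a minimal, hypothetical example: the bucket name, prefix, day thresholds, and storage-class names are assumptions for illustration, not part of the article, and moving data back up to a flash/hot tier would be handled separately (e.g. by restoring or rewriting objects).

```python
# Hypothetical sketch: expressing "hot tier now, cold/archive tier later" as an
# S3-style bucket lifecycle policy. Bucket name, prefix, and day counts are
# illustrative only; available storage-class names vary by platform.
import boto3

s3 = boto3.client("s3")

lifecycle_rules = {
    "Rules": [
        {
            "ID": "demote-analytics-data-after-processing",
            "Filter": {"Prefix": "analytics/"},   # only objects under this prefix
            "Status": "Enabled",
            "Transitions": [
                # After 30 days, move objects to an infrequent-access (disk) tier.
                {"Days": 30, "StorageClass": "STANDARD_IA"},
                # After 180 days, move them to a cold/archive tier.
                {"Days": 180, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

# Apply the policy to a (hypothetical) bucket; the object store then performs
# the tiering automatically, so applications keep using the same object keys.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-analytics-bucket",
    LifecycleConfiguration=lifecycle_rules,
)
```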