Accelerating generative AI requires the right storage

BrandPost By Jay Limbasiya, Global AI, Analytics, & Data Management Business Development, Unstructured Data Solutions, Dell Technologies
Aug 09, 2023 | 6 mins
Artificial Intelligence

As IT leaders race toward generative AI, architecting the right storage infrastructure is the linchpin to fueling effective business insights for improved outcomes.


Formula 1 (F1) drivers are some of the most elite athletes in the world. In other sports, such as basketball or soccer, there may be hundreds or thousands of players at the topmost levels. In F1 racing, drivers must excel to earn one of only 20 F1 seats.

Further elevating this status, F1 reigns as the world’s most prominent racing event, spanning five continents during a year-long season. F1 boasts the fastest open-wheel racecars, capable of reaching speeds of 360 km/h (224 mph) and accelerating from 0 to 100 km/h (62 mph) in 2.6 seconds. Each racecar costs an estimated $15 million, with an estimated $135 million of materials to support it.

But all this work, investment and prominence mean nothing without one thing: fuel – and the right amount of it. Just ask the six drivers who were leading F1 races and ran out of fuel on the final lap, crushing their chances of victory.

What does this have to do with technology? It’s an apt analogy for another prominent, high-stakes pursuit: generative AI.

Generative AI “fuel” and the right “fuel tank”

Enterprises are in their own race, hastening to embrace generative AI (another CIO.com article covers this in more detail). The World Economic Forum estimates that 75% of companies will adopt AI by 2027. McKinsey projects that generative AI will add $2.6 to $4.4 trillion per year to the global economy. To put that in perspective, the UK’s annual gross domestic product (GDP) is about $3.1 trillion.

Like F1, all this investment and effort holds great promise. But it also creates one key dependency that will make or break generative AI: the fuel and the right amount of it. In generative AI, data is the fuel, storage is the fuel tank and compute is the engine. Organizations need massive amounts of data to build and train generative AI models. In turn, these models will also generate reams of data that elevate organizational insights and productivity. 

All this data means that organizations adopting generative AI face a potential last-mile bottleneck: storage. Before generative AI can be deployed, organizations must rethink, rearchitect and optimize their storage to effectively manage generative AI’s hefty data management requirements. By doing so, organizations won’t “run out of fuel” or slow processes down due to inadequate or improperly designed storage, especially during that final mile, after all the effort and investment has been made.

Unstructured data needs for generative AI

Generative AI architecture and storage solutions are a textbook case of “what got you here won’t get you there.” Novel approaches to storage are needed because generative AI’s requirements are vastly different. It’s all about the data—the data to fuel generative AI and the new data created by generative AI. As generative AI models continue to advance and tackle more complex tasks, the demand for data storage and processing power increases significantly. Traditional storage systems struggle to keep up with the massive influx of data, leading to bottlenecks in training and inference processes.

New storage solutions, like Dell PowerScale, cater to AI’s specific requirements and vast, diverse data sets by employing cutting-edge technologies like distributed storage, data compression and efficient data indexing. Advances in hardware boost the performance and scalability of generative AI systems.

In addition, managing the data created by generative AI models is becoming a crucial aspect of the AI lifecycle. That newly generated data, from AI interactions, simulations, or creative outputs, must be properly stored, organized and curated for various purposes like model improvement, analysis, and compliance with data governance standards.

To better understand the scale of this shift, the graphic below shows the relative magnitude of generative AI data management needs, which impact both compute and storage. For context, 1 PB is equivalent to 500 billion pages of standard typed text.

[Graphic: relative magnitude of generative AI data management needs. Source: Dell]
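As a rough back-of-envelope check on that figure (assuming, for illustration, that a standard typed page holds about 2 KB of text, which is not a figure from the article): 1 PB is 10^15 bytes, and 10^15 bytes divided by roughly 2,000 bytes per page comes to about 5 x 10^11, or around 500 billion pages.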

Enabling data access, scalability and protection for generative AI

It’s not just the size of the storage that is driving change; it’s also data movement, access, scalability and protection. As a quick fix, many organizations adopted cloud-first strategies to manage their data storage requirements. But more data means more data movement in the cloud, which drives escalating ingress and egress costs and added latency, making cloud-first an infeasible storage approach for generative AI.

Generative AI storage models must meet many challenging requirements simultaneously and in near real time. In other words, storage platforms must be aligned with the realities of unstructured data and the emerging needs of generative AI. Enterprises need new, cost-effective ways to store data at this scale and complexity while making it easy to find data quickly and to protect files and data as they move.

As organizations work to outpace the competition, AI-powered enterprises are taking a clear lead; those that pause and lag may not even be in the race at all. As with a world-class F1 driver, winning high-stakes events demands the preparation to ensure there is enough fuel (or data) when it’s needed at the most critical point: the final mile.

Learn more about unstructured data storage solutions for generative AI and other AI workloads at exabyte scale.

Dell Technologies and Intel work together to help organizations modernize infrastructure and leverage the power of data and AI. Modernizing infrastructure starts with creating a more agile and scalable data architecture with the flexibility to support near real-time analytics. Analytic workloads now rely on newer storage models that are more open, integrated and secure by design, helping organizations unlock and use the full potential of their data.

Powering business with data means making the data easier to manage, process and analyze as part of a data pipeline, so infrastructure can meet the data where it is. Intel can help customers build a modern data pipeline that can collect, extract, and store any type of data for advanced analytics or visualization. Learn more here.