As artificial intelligence (AI) becomes deeply embedded in enterprise operations, IT storage professionals are being pulled into uncharted territory. Where once their focus was confined to provisioning, performance tuning, data protection, and backups, today they are expected to orchestrate data services, ensure regulatory compliance, optimize cost models, and even help train AI models. The transition can be overwhelming, but for those ready to evolve, it presents a rare opportunity to move from storage stewards to data enablers.

Here are some key requirements and strategies for storage teams to adapt to the new world of AI:

Understanding Your Own Data Landscape

The management adage often attributed to Peter Drucker, “You can’t manage what you don’t measure,” applies more than ever. In the AI era, storage teams need granular metrics about their own environment: data about the data. This includes knowing where data resides, its access patterns, growth trends, duplication rates, and compliance status. It also means segmenting by department, project, or data type to make smarter storage management decisions and to improve data searchability and usability for internal customers. This often requires metadata enrichment or tagging, along with tools that can crack open files to supply additional context about the data. Regularly analyzing data usage and growth, and searching for the right data in context, improves the day-to-day management of data while supporting efficient AI data ingestion.
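As a concrete illustration, the sketch below walks one directory tree and aggregates the kind of “data about the data” described above: capacity by file type, capacity by last-access age, and same-size files as cheap duplicate candidates. It is a minimal Python sketch assuming a POSIX filesystem; the age buckets are arbitrary choices, and commercial tools do this against an index of billions of files rather than a live walk.

```python
#!/usr/bin/env python3
"""Minimal sketch: gather "data about the data" for one directory tree.

Assumptions (illustrative, not from the article): a POSIX filesystem and
stat()-level metadata only. Production tools index billions of files and
enrich metadata by inspecting file contents.
"""
import os
import sys
import time
from collections import Counter, defaultdict

AGE_BUCKETS = [(30, "<30d"), (90, "30-90d"), (365, "90d-1y"), (float("inf"), ">1y")]

def bucket_for(age_days):
    for limit, label in AGE_BUCKETS:
        if age_days <= limit:
            return label

def scan(root):
    size_by_ext = Counter()      # capacity profile by file type
    size_by_age = Counter()      # access-pattern profile (hot vs. cold)
    sizes = defaultdict(list)    # same-size files: cheap duplicate screen
    now = time.time()
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue         # skip unreadable files
            ext = os.path.splitext(name)[1].lower() or "<none>"
            age_days = (now - st.st_atime) / 86400
            size_by_ext[ext] += st.st_size
            size_by_age[bucket_for(age_days)] += st.st_size
            sizes[st.st_size].append(path)
    dup_candidates = {s: p for s, p in sizes.items() if len(p) > 1 and s > 0}
    return size_by_ext, size_by_age, dup_candidates

if __name__ == "__main__":
    by_ext, by_age, dups = scan(sys.argv[1])
    print("Capacity by type:", by_ext.most_common(5))
    print("Capacity by last access:", dict(by_age))
    print(sum(len(p) for p in dups.values()), "files share a size with another (possible duplicates)")
```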

Smarter Spending with Analysis

Between storage hardware, backups, disaster recovery, and hybrid cloud capacity, enterprises invest millions each year. IT organizations are now managing enormous volumes of unstructured (file and object) data: 20, 30, or even upwards of 50 petabytes (PB). But as unstructured data footprints expand, it’s clear that not all data can remain active or be treated equally. Treating everything as “hot” data drives up unnecessary costs, increases exposure to ransomware attacks, and clogs infrastructure needed for AI workloads. Given that roughly 80% of enterprise data is cold and rarely used, there is ample room to shrink both costs and ransomware risk.

To address this, IT leaders are embracing transparent, automated data tiering strategies that operate across storage vendors. Cold data can be shifted to lower-cost storage or cloud-based solutions without user disruption. Some organizations even layer in “cool” storage tiers to retain a consistent experience while cutting costs. At the same time, chargeback or showback models let departments see how much data they’re using, how old it is, and who the top consumers are.
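Here is a minimal sketch of the age-based tiering idea in Python: files untouched for a year become candidates for an archive tier, with a plain symlink standing in for the transparent stub that commercial tiering products leave behind. The one-year threshold, the paths, and the symlink approach are all illustrative assumptions, and the default is a dry run.

```python
#!/usr/bin/env python3
"""Minimal sketch of age-based cold-data tiering.

Assumptions (illustrative only): "cold" means not accessed in COLD_AFTER_DAYS,
and the archive tier is simply another mounted path. Commercial products leave
a transparent stub so users see no change; a symlink is a crude stand-in.
"""
import os
import shutil
import sys
import time

COLD_AFTER_DAYS = 365
SECONDS_PER_DAY = 86400

def tier_cold_files(source_root, archive_root, dry_run=True):
    cutoff = time.time() - COLD_AFTER_DAYS * SECONDS_PER_DAY
    cold_bytes = 0
    for dirpath, _dirs, files in os.walk(source_root):
        for name in files:
            src = os.path.join(dirpath, name)
            try:
                st = os.lstat(src)
            except OSError:
                continue
            if os.path.islink(src):
                continue                   # already tiered, or a user link
            if st.st_atime >= cutoff:
                continue                   # still warm; leave in place
            dst = os.path.join(archive_root, os.path.relpath(src, source_root))
            cold_bytes += st.st_size
            if dry_run:
                print(f"would tier: {src} -> {dst}")
                continue
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.move(src, dst)          # relocate to the cheap tier
            os.symlink(dst, src)           # leave a transparent pointer behind
    print(f"cold capacity identified: {cold_bytes / 1e9:.1f} GB")

if __name__ == "__main__":
    tier_cold_files(sys.argv[1], sys.argv[2], dry_run=True)
```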

Financial operations (FinOps) and cost modeling tools are becoming essential to data storage management. By forecasting storage needs, modeling growth, and comparing on-premises versus cloud strategies, IT leaders can determine the most cost-effective mix of infrastructure.
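To make the modeling concrete, here is a toy cost projection comparing an on-premises estate, an all-hot cloud deployment, and a tiered cloud deployment. Every rate and growth figure below is an invented placeholder, not vendor pricing; a real FinOps model would pull contract rates and measured usage.

```python
"""Toy storage cost model: on-premises vs. cloud, flat vs. tiered.

All rates, growth figures, and capacities are illustrative assumptions.
"""

def project_costs(start_tb, annual_growth, years,
                  onprem_per_tb_month=18.0,     # assumed all-in on-prem cost
                  cloud_hot_per_tb_month=23.0,  # assumed cloud hot-tier rate
                  cloud_cold_per_tb_month=4.0,  # assumed cloud archive rate
                  cold_fraction=0.8):           # the ~80% cold figure above
    capacity = start_tb
    for year in range(1, years + 1):
        onprem = capacity * onprem_per_tb_month * 12
        all_hot = capacity * cloud_hot_per_tb_month * 12
        tiered = (capacity * (1 - cold_fraction) * cloud_hot_per_tb_month
                  + capacity * cold_fraction * cloud_cold_per_tb_month) * 12
        print(f"Year {year}: {capacity:,.0f} TB | on-prem ${onprem:,.0f} | "
              f"cloud all-hot ${all_hot:,.0f} | cloud tiered ${tiered:,.0f}")
        capacity *= 1 + annual_growth   # compound the data growth

project_costs(start_tb=5000, annual_growth=0.30, years=5)
```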

Unstructured data management systems that index and analyze data across disparate storage platforms provide visibility into what can be archived, deleted, or moved. These insights help organizations prepare for AI while keeping data storage budgets under control.

Delivering AI-Ready Data with Context and Control

Once a cost optimization strategy is in place to handle the ongoing unstructured data deluge (especially since AI itself is producing more data), it’s time to focus on preparing data for AI.

AI systems can’t function without data, and not just any data: they need contextual, curated, and compliant data. This has made data classification a top priority for storage teams. To restrict what AI bots can access, protect sensitive information, and avoid redundant processing, IT organizations are prioritizing metadata tagging and automated data workflows.

Yet manually finding and tagging data locked away in enterprise storage silos is cumbersome and often untenable, for both storage professionals and the users who need data for AI. Automated tools that allow end users to tag and classify their own data are becoming essential.

For example, researchers need to distinguish between data tagged “internal,” “sensitive,” or “public” to comply with governance policies. Power users such as analysts, data scientists, and researchers also need easier ways to search across their data, by project code, project name, or any other keyword indicating the contents. Since unstructured data can easily span billions of files strewn across enterprise directories, efficiently classifying, searching, and curating unstructured data is integral to AI.
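One lightweight way to prototype user-driven tagging is with extended attributes, sketched below in Python. This assumes a Linux filesystem with user xattrs enabled; the “user.dm.” namespace and the tag names are invented for illustration, and at real scale tags live in a searchable index rather than in per-file attributes.

```python
"""Minimal sketch of user-driven tagging with Linux extended attributes.

Assumptions: Linux with user xattrs enabled; the namespace and tag names
are illustrative, not a standard schema. Enterprise tools keep tags in a
searchable index rather than walking the filesystem.
"""
import os

NAMESPACE = "user.dm."  # hypothetical prefix for data-management tags

def tag(path, key, value):
    os.setxattr(path, NAMESPACE + key, value.encode())

def get_tag(path, key):
    try:
        return os.getxattr(path, NAMESPACE + key).decode()
    except OSError:
        return None     # attribute absent or filesystem unsupported

def find_by_tag(root, key, value):
    """Yield files whose tag matches. Fine for a demo; far too slow at
    billions of files, which is why index-based products exist."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if get_tag(path, key) == value:
                yield path

# Example (hypothetical paths): an analyst tags a result set, and
# governance tooling later locates everything marked "sensitive".
# tag("/data/genomics/run42.parquet", "classification", "sensitive")
# tag("/data/genomics/run42.parquet", "project", "GEN-42")
# print(list(find_by_tag("/data/genomics", "classification", "sensitive")))
```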

Bridging Organizational Silos

To meet the evolving demands of AI, storage professionals must adopt a cross-disciplinary, collaborative mindset. Storage teams now serve as trusted advisors to business units, researchers and IT peers, helping define data needs, governance policies, and infrastructure priorities.

This requires a broader understanding of enterprise objectives and the ability to align technical decisions with business outcomes—whether that’s cost control, regulatory compliance, or supporting cutting-edge research.

Redefining Metrics for the Modern Enterprise

As AI workloads and cross-functional collaboration become standard, traditional storage SLAs may no longer be sufficient. Storage professionals should begin tracking new data management metrics, such as:

  1. Top data owners by individual or department
  2. Percent of non-compliant or orphaned data
  3. Data classification completeness
  4. Duplication reduction
  5. Chargeback effectiveness

These KPIs help demonstrate value, encourage better data hygiene, and align IT services with business needs.
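Two of these KPIs are straightforward to compute even without a dedicated platform. The sketch below estimates classification completeness (reusing the hypothetical xattr tag scheme from earlier) and the share of capacity that is orphaned, defined here as files whose owning UID no longer resolves to a user; both definitions are assumptions for illustration.

```python
"""Minimal sketch of two KPIs: classification completeness (share of files
carrying a classification tag) and orphaned data (capacity owned by UIDs
that no longer resolve to a user).

Assumptions: Linux, plus the illustrative xattr tag scheme sketched earlier.
Real reporting would run against a metadata index, not a live walk.
"""
import os
import pwd

def kpis(root, tag_name="user.dm.classification"):
    total = tagged = orphaned_bytes = total_bytes = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue
            total += 1
            total_bytes += st.st_size
            try:
                if tag_name in os.listxattr(path):
                    tagged += 1
            except OSError:
                pass                          # xattrs unsupported here
            try:
                pwd.getpwuid(st.st_uid)       # does the owner still exist?
            except KeyError:
                orphaned_bytes += st.st_size  # owner gone: orphaned data
    print(f"classification completeness: {100 * tagged / max(total, 1):.1f}%")
    print(f"orphaned data: {100 * orphaned_bytes / max(total_bytes, 1):.1f}% of capacity")

kpis("/data")  # hypothetical mount point
```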

Locking Down File Data Against Cyber Threats

AI’s reliance on data makes storage a prime target for ransomware attacks. Offloading cold data to immutable storage in the cloud is one effective mitigation strategy. Immutable storage ensures that once written, data cannot be altered or deleted; since cold data typically accounts for about 80% of the footprint, offloading it can reduce the active attack surface by up to 80%.
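As one concrete example of this pattern, Amazon S3 Object Lock provides WORM (write once, read many) immutability; any WORM-capable store would serve. The sketch below assumes boto3 with valid AWS credentials, and the bucket name, retention period, and file are illustrative. Note that Object Lock must be enabled when the bucket is created, and COMPLIANCE-mode retention cannot be shortened, even by the account root user.

```python
"""Minimal sketch: an immutable cold-data archive using S3 Object Lock.

Assumptions: boto3 with valid AWS credentials; bucket name, retention
period, and file paths are illustrative placeholders.
"""
import boto3

s3 = boto3.client("s3")
BUCKET = "example-cold-archive"  # hypothetical bucket name

# 1. Object Lock can only be turned on at bucket creation time.
#    (Regions other than us-east-1 also need a CreateBucketConfiguration.)
s3.create_bucket(Bucket=BUCKET, ObjectLockEnabledForBucket=True)

# 2. Default retention: every new object is immutable for 7 years.
s3.put_object_lock_configuration(
    Bucket=BUCKET,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Years": 7}},
    },
)

# 3. Cold files written here cannot be encrypted in place or deleted by
#    ransomware until the retention period expires.
with open("archive/projectx-2019.tar", "rb") as f:   # hypothetical file
    s3.put_object(Bucket=BUCKET, Key="projectx-2019.tar", Body=f)
```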

Building the Foundation for AI Infrastructure

Launching an AI initiative is an infrastructure-intensive undertaking. Training and deploying models often require high-performance compute: GPUs, TPUs, and advanced networking. Whether organizations choose to build their own environments or use cloud-based options, storage teams must be involved from the start to determine where AI should live (on-prem, cloud, or hybrid), how to manage data movement, and how to ensure security and performance at scale.

Elevating the Role: From Storage Technicians to Strategic Data Services Providers

Researchers predicted that by 2025, half of all employees would need to reskill due to technology shifts. That prediction has arrived. For IT storage professionals, the shift is more than technical: it’s about protecting and advancing their careers while delivering the data- and AI-centric foundation their employers need. That means evolving from infrastructure managers into trusted data services providers. With the right mindset and tools, storage professionals can lead the charge into a more intelligent, secure, and efficient future.