Topic: data

Applying agentic AI to legacy systems? Prepare for these 4 challenges

Agentic AI has enormous potential to add efficiency and speed to legacy system transformation. However, given the complexity of legacy platforms and their critical role in enabling business processes, fully leveraging AI agents to assist with legacy system migration and modernization can be a deeply challenging task. Fortunately, these issues are solvable. They do, however, … continue reading

Pure Storage introduces Enterprise Data Cloud to enable centralized management of data and storage across on-premises, public cloud, and hybrid environments

Pure Storage is introducing a simpler approach to data and storage management that it says will allow organizations to focus more on business outcomes than on infrastructure. Enterprise Data Cloud (EDC) allows IT teams to centrally manage a virtualized cloud of data regardless of how many different locations it’s stored in, spanning on-premises, public cloud, … continue reading

Enterprise AI: A new game plan for storage IT professionals

As artificial intelligence (AI) becomes deeply embedded in enterprise operations, IT storage professionals are being pulled into uncharted territory. Where once their focus was confined to provisioning, performance tuning, data protection and backups, today they are expected to orchestrate data services, ensure regulatory compliance, optimize cost models, and even help train AI models. The transition … continue reading

Opsera, Databricks expand partnership with DevOps for DataOps, Cloudera joins AI-RAN Alliance, and more: ITOps News Digest

AI-powered DevOps platform provider Opsera today announced the expansion of its Databricks partnership with the launch of its DevOps for DataOps capability, a Built on Databricks solution. Customers of Opsera and Databricks can use the solution to orchestrate end-to-end analytics and streamline delivery pipelines, the company wrote in its announcement. This is done through the … continue reading

Capital One Databolt tokenizes data to reduce risks associated with data breaches

Capital One is introducing a new solution that provides a way to protect sensitive data to reduce the risk of data being exposed during a breach. Capital One Databolt is a tokenization solution that replaces sensitive data with secure tokens. It preserves the underlying data format, which allows companies to still run their applications, manage … continue reading
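The announcement doesn't detail Databolt's internals, but as a rough, hypothetical Python sketch of the general idea behind format-preserving tokenization (not Capital One's API): each sensitive value is swapped for a random token with the same shape, and the real value lives only in a separate vault.

    import secrets
    import string

    # Hypothetical illustration of format-preserving tokenization; this is
    # NOT Capital One Databolt's API, just the pattern the teaser describes.
    _vault = {}  # token -> original value (in practice, a hardened secure store)

    def tokenize(value: str) -> str:
        """Replace each character with a random one of the same class,
        so the token keeps the original format (length, digits vs. letters)."""
        token_chars = []
        for ch in value:
            if ch.isdigit():
                token_chars.append(secrets.choice(string.digits))
            elif ch.isalpha():
                token_chars.append(secrets.choice(string.ascii_letters))
            else:
                token_chars.append(ch)  # keep separators like '-' as-is
        token = "".join(token_chars)
        _vault[token] = value
        return token

    def detokenize(token: str) -> str:
        """Look the original value back up; only the vault can reverse a token."""
        return _vault[token]

    card = "4111-1111-1111-1111"
    tok = tokenize(card)
    print(tok)                      # e.g. "8305-9921-0467-3358" -- same format
    print(detokenize(tok) == card)  # True

Because the token preserves the format of the original value, downstream applications that validate lengths or digit patterns can keep running against tokenized data.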

Fortanix adds File System Encryption in latest release

The security company Fortanix has announced File System Encryption, a new feature that is part of the Fortanix Data Security Manager. With File System Encryption, data can be encrypted at the file system level, which is useful in scenarios where different user groups need access to different parts of a database.  Organizations can set up … continue reading
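Fortanix hasn't published implementation details in this teaser, so the following is only a loose Python sketch of the underlying idea of encrypting different data under different user-group keys, using the third-party cryptography package's Fernet primitive rather than the Fortanix Data Security Manager API.

    # Hypothetical sketch of per-group encryption, not Fortanix's API.
    # Requires the third-party "cryptography" package (pip install cryptography).
    from cryptography.fernet import Fernet

    # One key per user group; in a real system these would come from a KMS.
    group_keys = {
        "finance": Fernet.generate_key(),
        "analytics": Fernet.generate_key(),
    }

    def encrypt_for_group(group: str, plaintext: bytes) -> bytes:
        return Fernet(group_keys[group]).encrypt(plaintext)

    def decrypt_as_group(group: str, ciphertext: bytes) -> bytes:
        # Raises cryptography.fernet.InvalidToken if the wrong group's key is used.
        return Fernet(group_keys[group]).decrypt(ciphertext)

    blob = encrypt_for_group("finance", b"salary table rows")
    print(decrypt_as_group("finance", blob))   # b'salary table rows'
    # decrypt_as_group("analytics", blob)      # would fail: wrong group's key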

DDS standard 1.2 adds robust security enhancements

The Data Distribution Service (DDS) standard is getting significant updates with its latest release, version 1.2. DDS is a data connectivity standard that enables developers to provide real-time data to IoT systems. The new version contains a more robust security model, including NSA-approved algorithms, pre-shared keys to better protect from DoS attacks, key revisions … continue reading

Cribl announces AI copilot to deliver insights from IT data

Cribl, a data management platform for IT teams, has announced a new AI copilot to further help customers manage growing amounts of data. Cribl Copilot can generate insights, dashboards, and notifications from a customer's data, as well as answer questions about that data. According to the company, the goal of this new AI … continue reading

Polaris Catalog – ITOps Times Open Source Project of the Week

The Polaris Catalog is an open source catalog for Apache Iceberg that Snowflake announced at its user conference, Snowflake Summit, earlier this week. While it is not fully available yet, the company says the catalog will be open sourced within the next 90 days. Apache Iceberg is a format for big data analytics, … continue reading

Protocol Buffers – ITOps Times Open Source Project of the Week

Protocol Buffers (also sometimes referred to as protobuf) is a cross-platform mechanism for serializing structured data. Its documentation compares it to formats like XML and JSON, describing it as smaller, faster, and simpler. “You define how you want your data to be structured once, then you can use special generated source code to easily write and read … continue reading
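As a quick sketch of that workflow in Python (assuming a hypothetical person.proto schema compiled with protoc --python_out=. person.proto to produce person_pb2.py), usage of the generated class looks roughly like this:

    # Assumes protoc has generated person_pb2.py from a schema such as:
    #   syntax = "proto3";
    #   message Person {
    #     string name = 1;
    #     int32 id = 2;
    #   }
    import person_pb2

    person = person_pb2.Person(name="Ada", id=42)

    data = person.SerializeToString()   # compact binary encoding
    restored = person_pb2.Person()
    restored.ParseFromString(data)      # read it back on any platform/language

    print(restored.name, restored.id)   # Ada 42

The same schema can be compiled to other languages, so a service written in Go or Java can parse the same bytes without any extra translation layer.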

Hitachi Vantara adds block storage in latest release of its data platform

Hitachi Vantara has just unveiled new features across its hybrid cloud data platform, Virtual Storage Platform One.  The main highlight of this release is the addition of block storage, which is a data storage technique where data is divided into blocks for easier and more efficient retrieval. According to Hitachi Vantara, using block storage can … continue reading
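As a toy conceptual sketch only (not Hitachi Vantara's implementation), block storage can be pictured as carving a byte stream into fixed-size blocks that are then addressed by number rather than by file path:

    # Toy illustration of the block storage idea; real arrays work at the
    # device level with fixed block sizes (e.g. 512 B or 4 KB), not in Python.
    BLOCK_SIZE = 4

    def split_into_blocks(data: bytes, block_size: int = BLOCK_SIZE) -> list[bytes]:
        """Carve a byte stream into fixed-size blocks, padding the last one."""
        blocks = []
        for offset in range(0, len(data), block_size):
            block = data[offset:offset + block_size]
            blocks.append(block.ljust(block_size, b"\x00"))
        return blocks

    blocks = split_into_blocks(b"hello block storage")
    print(len(blocks))      # 5 blocks of 4 bytes each
    print(blocks[2])        # read block #2 directly: b'ock '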

ITOps Times Open-Source Project of the Week: Vineyard

Vineyard is an in-memory data manager designed specifically for data-heavy analytics. It uses shared memory to share data efficiently across different systems without the need for serialization and deserialization, employing a zero-copy method that helps avoid I/O overhead. According to the project maintainers, Vineyard is ideal for complex cloud-native environments, … continue reading
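Vineyard's own API isn't shown in the teaser, so purely as a conceptual stand-in, the sketch below uses Python's standard multiprocessing.shared_memory module to illustrate what zero-copy sharing means: two handles view the same bytes without serializing or copying them.

    # Conceptual stand-in using the Python standard library, NOT Vineyard's API.
    from multiprocessing import shared_memory

    # Producer: put data into a named shared-memory segment.
    payload = b"columnar analytics data"
    shm = shared_memory.SharedMemory(create=True, size=len(payload), name="demo_seg")
    shm.buf[:len(payload)] = payload

    # Consumer: attach to the same segment by name and view it in place --
    # no serialization, and bytes() below only copies for printing.
    view = shared_memory.SharedMemory(name="demo_seg")
    print(bytes(view.buf[:len(payload)]))   # b'columnar analytics data'

    view.close()
    shm.close()
    shm.unlink()   # free the segment once everyone is done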
