
Mirantis launches its AI Factory Reference Architecture
The AI Factory Reference Architecture provides blueprints for building and managing AI factories.
Utilizing the company’s k0rdent AI platform, which provides a templated, declarative model for rapid provisioning, the AI Factory Reference Architecture will enable AI workloads to be deployed within days of hardware being installed.
According to Mirantis, cloud-native workloads are usually designed for scale-out and multi-core operations, while AI workloads typically necessitate turning multiple GPU-based servers into a single computer with aggregated memory.
Other challenges of AI workloads include the need for fine-tuning and configuration, multi-tenancy, data sovereignty, managing scale and sprawl, and skills availability.
The reference architecture attempts to address these challenges by providing reusable templates across application, platform, compute, storage and network, and security and compliance layers, which can be used to assemble infrastructure.
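To make the earlier point about aggregation concrete, here is a minimal sketch, not drawn from Mirantis’ architecture, of how a training framework treats several GPU servers as one machine; it assumes PyTorch with the NCCL backend and a launcher such as torchrun, which are assumptions, not part of the announcement.

```python
# Minimal sketch: several GPU servers acting as one aggregated computer.
# Assumes PyTorch + NCCL and a launcher like:
#   torchrun --nnodes=4 --nproc-per-node=8 this_script.py
import os
import torch
import torch.distributed as dist

def main():
    # Rank and world size are injected by the launcher on every node.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Each GPU contributes a tensor; all_reduce sums them across every server,
    # the kind of cross-node aggregation AI training workloads depend on.
    t = torch.ones(1, device="cuda") * dist.get_rank()
    dist.all_reduce(t, op=dist.ReduceOp.SUM)

    if dist.get_rank() == 0:
        print(f"sum of ranks across the whole cluster: {t.item()}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```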
Trustwise introduces Agentic AI Shields
Agentic AI Shields are runtime enforcement layers that are designed to secure agent behavior, tool usage, and policy alignment.
The company launched this solution with six different shields: MCP Shield, Prompt Shield, Compliance Shield, Brand Shield, Cost Shield, and Carbon Shield.
“We couldn’t just monitor agents, we had to control them. Trust can’t be an afterthought. It has to live inside the decision loop itself. The world had no runtime infrastructure to control agentic AI. So we built it,” the company wrote in a blog post.
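As a rough illustration of what a runtime enforcement layer around agent tool use can look like, here is a generic Python sketch; it is not Trustwise’s API, and the policy structure, thresholds, and helper names below are hypothetical.

```python
# Generic sketch of runtime enforcement around an agent's tool calls.
# Illustrative only; not Trustwise's API. Policy fields are hypothetical.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class ToolPolicy:
    allowed_tools: set[str]
    max_cost_usd: float = 1.00                                      # crude cost guard
    blocked_substrings: tuple[str, ...] = ("DROP TABLE", "rm -rf")  # crude prompt guard

def enforce(policy: ToolPolicy, tool_name: str, call: Callable[..., Any],
            *args: Any, estimated_cost_usd: float = 0.0, **kwargs: Any) -> Any:
    """Run a tool call only if it passes the policy checks; otherwise block it."""
    if tool_name not in policy.allowed_tools:
        raise PermissionError(f"tool '{tool_name}' is not permitted by policy")
    if estimated_cost_usd > policy.max_cost_usd:
        raise PermissionError(f"estimated cost ${estimated_cost_usd:.2f} exceeds the budget")
    for arg in args:
        if isinstance(arg, str) and any(s in arg for s in policy.blocked_substrings):
            raise PermissionError("argument matched a blocked pattern")
    return call(*args, **kwargs)

# The agent may search the web, but the call is screened before it runs.
policy = ToolPolicy(allowed_tools={"web_search"})
print(enforce(policy, "web_search", lambda q: f"results for {q!r}", "latest AI news"))
```

The point of the sketch is that the check sits inside the decision loop: the tool call only executes after the policy passes, rather than being logged and reviewed afterward.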
Cisco prepares for AI-ready data centers with new innovations, partnerships
Cisco switches have been fully integrated into NVIDIA Spectrum-X, NVIDIA’s networking platform for AI. According to Cisco, its switches are the first non-NVIDIA switches to work with NVIDIA NICs.
Cisco Secure AI Factory with NVIDIA offers a “full stack of infrastructure, software, and security designed to help enterprises deploy, operate, and scale AI with trust and efficiency,” according to Cisco.
The Secure AI Factory also leverages Cisco’s AI Defense technology, which provides model validation and enforces safety and security guardrails at run time.
The company also expanded its range of Cisco AI PODs, which are production-ready AI solutions for edge, RAG, training, and large-scale inferencing. The PODs incorporate NVIDIA’s accelerated computing, are built on Cisco UCS and Cisco Nexus solutions, and also use Cisco’s partner storage solutions.
To further ensure that Cisco UCS servers are AI-ready, Cisco announced the next generation of Fabric Interconnects, which provide increased bandwidth, reduced latency, and enhanced scalability.
Opsera, Databricks expand partnership with DevOps for DataOps
Customers of Opsera and Databricks can use the solution to orchestrate end-to-end analytics and streamline delivery pipelines, the company wrote in its announcement. This is done through the Databricks Data Intelligence Platform, which brings together customer data with AI models unique to the business. The platform is built on a lakehouse foundation of open data formats and open governance to ensure that all data is completely within the customers’ control, Opsera said.
The DevOps for DataOps capabilities include automated deployments, built-in security and compliance, DORA metrics for monitoring pipeline health, and the use of natural language communication for debugging and triggering pipelines. Further, the solution enables DataOps teams to deploy database objects without custom scripts, using no-code, drag-and-drop pipelines.
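For context on the DORA metrics mentioned above, here is a small generic sketch of how two of the four metrics, deployment frequency and change failure rate, can be computed from pipeline records; this is illustrative only, not Opsera’s implementation, and the record fields are assumptions.

```python
# Generic sketch of computing two DORA metrics from deployment records.
# Not Opsera's implementation; the record fields below are assumptions.
from datetime import datetime, timedelta

deployments = [
    {"finished_at": datetime(2025, 6, 2, 14, 0), "failed": False},
    {"finished_at": datetime(2025, 6, 3, 9, 30), "failed": True},
    {"finished_at": datetime(2025, 6, 5, 16, 45), "failed": False},
]

window = timedelta(days=7)
now = datetime(2025, 6, 6)
recent = [d for d in deployments if now - d["finished_at"] <= window]

# Deployment frequency: deployments per day over the window.
frequency = len(recent) / window.days

# Change failure rate: share of recent deployments that failed in production.
failure_rate = sum(d["failed"] for d in recent) / len(recent) if recent else 0.0

print(f"deployment frequency: {frequency:.2f}/day, change failure rate: {failure_rate:.0%}")
```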
“Opsera helps Databricks teams shift left by aligning security and delivery priorities in a single pane of glass,” said Steve Sobel, Global Leader, Startup and Venture Programs at Databricks. “Its commitment extends beyond eliminating today’s bottlenecks to anticipating the future of scalable software delivery and evolving data workflows, with a focus on user-friendly onboarding and customer experience.”
Cloudera joins AI-RAN Alliance to drive AI-native telecommunications
Hybrid cloud data platform provider Cloudera has joined the AI-RAN Alliance, a global consortium committed to integrating AI into telecommunications infrastructure. In its announcement, Cloudera wrote: “The complexity of deploying AI across distributed edge environments is not trivial and telecommunication providers will have to drive strategic enterprise-wide efforts to operationalize AI at scale across the radio access network (RAN) to unlock its full commercial potential.”
NVIDIA was a founding member of the AI-RAN Alliance, which is working to standardize how AI can be integrated into new and existing networks. The benefits include shared infrastructure for AI optimization, faster development of edge AI applications, and real-world proof points to help telecommunications providers deploy AI reliably and profitably, Cloudera said in the announcement.