Cloudflare One for AI was released as a collection of features that help teams build with the latest AI services while maintaining a zero-trust security posture.

“Cloudflare One’s goal is to allow you to safely use the tools you need, wherever they live, without compromising performance. These features will feel familiar to any existing use of the Zero Trust products in Cloudflare One. Still, we’re excited to walk through cases where you can use the tools available to allow your team to take advantage of the latest LLM features,” Sam Rhea, VP of product, wrote in a blog post.

Cloudflare One customers on any plan can now review AI usage: IT departments can deploy Cloudflare Gateway and passively observe which AI services users are selecting, as a way to start scoping out enterprise licensing plans.

The company stated that its early experiments using OpenAI’s ChatGPT to build applications with Cloudflare Workers were largely successful. However, in cases where the model’s information was outdated, Cloudflare implemented scoped inputs and connected plug-ins to provide its customers with better AI-guided experiences when using Cloudflare services.

For teams that need to share training data or grant plug-in access to an AI service, Cloudflare One’s security suite extends beyond human users and can provide Zero Trust access to sensitive data over APIs.

Users can create service tokens that external services must present to reach data made available through Cloudflare One. Admins can provide these tokens to the systems making API requests, and every request is logged.
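As a rough sketch of what that looks like from the calling service’s side (the endpoint URL and credential values below are placeholders, not anything Cloudflare published), the external system attaches the service token’s client ID and secret as request headers on each call to the Access-protected resource:

```python
import os
import requests

# Hypothetical endpoint for training data placed behind Cloudflare Access;
# replace with the hostname you have protected through Cloudflare One.
TRAINING_DATA_URL = "https://training-data.example.com/api/v1/datasets"

# A Cloudflare Access service token is a client ID/secret pair presented
# in these two headers; the values come from the Zero Trust dashboard.
headers = {
    "CF-Access-Client-Id": os.environ["CF_ACCESS_CLIENT_ID"],
    "CF-Access-Client-Secret": os.environ["CF_ACCESS_CLIENT_SECRET"],
}

response = requests.get(TRAINING_DATA_URL, headers=headers, timeout=10)
response.raise_for_status()
print(response.json())
```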

Admins can create policies that allow certain services to access training data, which will be verified through the service token. Policies can also be extended to verify the country, IP address, or mTLS certificate. Furthermore, human users may be required to go through authentication with an identity provider and complete an MFA prompt before they are granted access to sensitive training data or services.
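On the origin side, one way to confirm those checks actually ran is to validate the JSON Web Token that Cloudflare Access attaches to authorized requests in the Cf-Access-Jwt-Assertion header. The sketch below uses the PyJWT library and assumes placeholder values for the Zero Trust team domain and the application’s audience tag:

```python
import jwt  # PyJWT

# Placeholders: your Zero Trust team domain and the Access application's audience (AUD) tag.
TEAM_DOMAIN = "https://yourteam.cloudflareaccess.com"
POLICY_AUD = "your-application-aud-tag"

# Cloudflare publishes a team's public signing keys at this path.
jwks_client = jwt.PyJWKClient(f"{TEAM_DOMAIN}/cdn-cgi/access/certs")


def verify_access_jwt(token: str) -> dict:
    """Reject requests whose Access JWT is missing, expired, or issued for another app."""
    signing_key = jwks_client.get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=POLICY_AUD,
    )


# In a request handler, the token would come from the Cf-Access-Jwt-Assertion header:
# claims = verify_access_jwt(request.headers["Cf-Access-Jwt-Assertion"])
```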

Once teams are ready to connect AI services to their infrastructure, Cloudflare Tunnel will create an encrypted, outbound-only connection to Cloudflare’s network, where every request will be checked against the access rules configured for one or more services protected by Cloudflare One.
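In practice, the origin behind the tunnel only needs to listen locally; Cloudflare’s cloudflared connector dials out to Cloudflare’s network, so no inbound ports are opened. The following is a minimal illustration, with the port, tunnel name, and response payload as placeholder assumptions:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# The origin listens only on localhost; no inbound firewall rule is needed because
# cloudflared makes the outbound-only connection to Cloudflare's network, e.g.:
#   cloudflared tunnel --url http://localhost:8080     (quick tunnel for testing)
#   cloudflared tunnel run training-data-tunnel        (named tunnel; placeholder name)


class TrainingDataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # By the time a request reaches this handler, Cloudflare One has already
        # evaluated the configured access rules (service token, country, mTLS, MFA).
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"datasets": ["example-training-set"]}')


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), TrainingDataHandler).serve_forever()
```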

“Administrators can select an AI service, block Shadow IT alternatives, and carefully gate access to their training material, but humans are still involved in these AI experiments. Any one of us can accidentally cause a security incident by oversharing information in the process of using an AI service – even an approved service,” Rhea added. “We expect AI playgrounds to continue to evolve to feature more data management capabilities, but we don’t think you should have to wait for that to begin adopting these services as part of your workflow. Cloudflare’s Data Loss Prevention (DLP) service can provide a safeguard to stop oversharing before it becomes an incident for your security team.”