It’s no secret that businesses and organizations are embracing cloud services, containers, and other innovations to boost business productivity, scale more efficiently, and better manage infrastructure costs. However, transitioning legacy environments to the cloud introduces new challenges around securing dynamic and ephemeral cloud infrastructure. How can organizations unlock the full potential of the cloud without compromising security and compliance? 

Zero trust security can enable organizations to move beyond traditional perimeter-based security approaches to tackle the challenges of today’s dynamic multi-cloud infrastructures. Instead of defending a perimeter, zero trust security leverages trusted identities to address security concerns at all levels of infrastructure and streamline secure remote access. The result is an architecture designed to help secure all phases of the modern cloud journey.

Not so long ago, securing an enterprise environment was a fairly clear and easy-to-understand concept. Data centers generally consisted of static infrastructure with dedicated servers and IP addresses, all inside a defined network perimeter. Securing the data center was essentially a “castle-and-moat” approach: you could secure everything by controlling the entry and exit points. Private networks inside the “castle” assumed high trust and integrity.

However, as companies move to the cloud, the measures they once took to secure their private data centers are becoming less effective. Static IP addresses are giving way to ephemeral resources, and a constantly changing workforce needs uninterrupted access to shared resources at all times, from any location. As organizations grow, managing containers, access, and credentials at scale becomes progressively more brittle and complex. Securing infrastructure, data, and access across clouds and on-premises data centers has become increasingly difficult, requiring significant overhead and expertise.

Six Steps to Zero Trust Security

Zero trust security requires a different approach and a different trust model — one that is identity-based, centrally managed, widely encrypted, and always authenticated and authorized. The six steps outlined here can help organizations get started reaping the benefits of zero trust security:

Step 1. Replace perimeter-based security with identity-based security

The migration to dynamic cloud infrastructure is exponentially increasing the number of systems to manage, endpoints to monitor, networks to connect, and people who need access. The potential for a breach increases significantly with the complexity of trying to manage ephemeral IP addresses at scale. That’s why perimeter-based security should be replaced by identity-based access. Instead of granting access based on IP addresses, machine and human users must prove who or what they are and be authorized for what they’re allowed to do. This allows teams and infrastructure to scale unhindered without sacrificing security.

Many organizations are adopting centralized identity and secrets management solutions that secure and streamline machine authentication (authN) and authorization (authZ) workflows across heterogeneous infrastructure. Organizations should choose solutions that work cleanly with and complement the technology already in their stack, including containers, and that provide centralized secrets management across multi-cloud and on-premises environments that cloud-native capabilities cannot reach.
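
To make this concrete, here is a minimal sketch of identity-based machine authentication, using the open-source hvac client for HashiCorp Vault as one example of a centralized secrets manager. The VAULT_ADDR environment variable and the Kubernetes auth role name “web-app” are illustrative assumptions, not a prescribed setup:

```python
# Minimal sketch: a workload authenticates with its platform identity
# (its Kubernetes service account JWT) instead of relying on an IP address.
# Assumes a Vault-style secrets manager at VAULT_ADDR and a Kubernetes auth
# role named "web-app" -- hypothetical values for illustration only.
import os
import hvac  # open-source client for HashiCorp Vault

# Kubernetes mounts the service account token into every pod at this path.
JWT_PATH = "/var/run/secrets/kubernetes.io/serviceaccount/token"

def login_with_machine_identity() -> hvac.Client:
    client = hvac.Client(url=os.environ["VAULT_ADDR"])
    with open(JWT_PATH) as f:
        jwt = f.read()
    # Exchange the platform-issued identity token for a short-lived client token.
    client.auth.kubernetes.login(role="web-app", jwt=jwt)
    assert client.is_authenticated()
    return client
```

The workload never holds a long-lived credential in its configuration; its identity is whatever the platform attests it to be.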

Step 2. Centrally store and protect secrets

As businesses scale and move to the cloud, properly managing secrets such as passwords, tokens, certificates, and encryption keys becomes even more critical. Applications and data may now reside in multiple clouds and on-premises locations, potentially scattering secrets across the infrastructure. This disorganized secrets sprawl can lead to increased vulnerabilities.

This issue should be addressed by centralizing secrets and adding protections like role-based access control (RBAC). These access controls define rules that determine which specific secrets can be accessed, when, by whom, and how that access is audited. Audit logs can then be used to trace and identify potential weak points or to determine where and how a breach may have occurred. Having this information in a central location can decrease response and remediation time in the event of a breach.

Organizations should seek a solution that provides an automated, centralized credential management workflow for users to connect to and access all of their infrastructure resources, including Kubernetes clusters and pods. The solution should let organizations dynamically create, rotate, and revoke the credentials needed by applications and pods running in Kubernetes. Issuing unique credentials with defined leases and expiration times for access to specific systems reduces the impact of breaches caused by leaked credentials.
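
As a simple illustration, the sketch below stores and reads a secret in a central store using a token scoped by policy, again using the hvac client against a Vault-style KV engine. The server address, secret path, and token are hypothetical placeholders:

```python
# Minimal sketch: centralize secrets instead of scattering them across repos
# and configs. Assumes a Vault-style KV v2 engine mounted at "secret";
# the address, path, and token below are hypothetical.
import hvac

client = hvac.Client(
    url="https://secrets.example.internal:8200",  # hypothetical central store
    token="<policy-scoped token>",                # granted only the paths this app needs
)

# Write the secret once, centrally.
client.secrets.kv.v2.create_or_update_secret(
    path="billing-app/db",
    secret={"username": "billing", "password": "s3cr3t"},
)

# Read it back at runtime. A token whose policy does not cover this path is
# denied, and every request is recorded in the audit log for later tracing.
read = client.secrets.kv.v2.read_secret_version(path="billing-app/db")
print(read["data"]["data"]["username"])
```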

Step 3. Leverage dynamic secrets

As companies move their applications and infrastructure out of private data centers and into the cloud, they face new operational and security issues. Secrets previously stored in on-premises systems may become more vulnerable as they’re moved into public repositories and cloud instances. To achieve robust security, businesses must centrally store, tightly control, and monitor access to the tokens, passwords, certificates, API keys, and encryption keys that protect their systems and sensitive data.

Secrets that are weak or rarely updated can allow attackers to access protected data, often for long periods of time. By tightly coupling access to trusted identities, organizations can maintain a stronger security posture — rotating and revoking credentials and issuing dynamic secrets tied to those identities. For example, organizations that use Kubernetes should leverage a solution where each application or pod dynamically obtains unique credentials (including lease and expiration times) to access the systems it needs. Automatically rotated credentials and dynamic secrets reduce the attack surface and the potential blast radius by shrinking the window of opportunity to use stolen secrets.
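
The sketch below shows what dynamic secrets can look like in practice, using hvac against a Vault-style database secrets engine. The “readonly” role, addresses, and tokens are assumed to be pre-configured and are purely illustrative:

```python
# Minimal sketch: fetch dynamic, short-lived database credentials instead of
# sharing a long-lived password. Assumes a Vault-style database secrets engine
# with a role named "readonly" -- hypothetical names for illustration only.
import hvac

client = hvac.Client(url="https://secrets.example.internal:8200", token="<app token>")

creds = client.secrets.database.generate_credentials(name="readonly")
username = creds["data"]["username"]   # unique to this requester
password = creds["data"]["password"]
lease_id = creds["lease_id"]           # ties the credential to a revocable lease
ttl = creds["lease_duration"]          # seconds until automatic expiration

# ... connect to the database with the short-lived credentials ...

# Revoke early when the workload shuts down, shrinking the window for misuse.
client.sys.revoke_lease(lease_id=lease_id)
```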

Step 4. Encrypt everything and make it easy

Encrypting data is another line of defense that should be considered non-negotiable in modern cloud security. Encrypting data in transit and at rest helps ensure that if a system is compromised, the encrypted data remains safe. Using standard encryption or more advanced techniques like tokenization, format-preserving encryption, or data masking, companies can ensure that all data, especially sensitive data, is protected. Even if an attacker or privileged insider gains access to systems, applications, or networks, properly encrypted data can remain protected.

It’s important to make the encryption process easy for developers and security engineers — otherwise, it’s often done poorly or not at all. Encryption should be made available to teams “as a service” and used across containers, networks, and systems. Teams can then protect data such as Social Security numbers, credit card numbers, and other compliance-regulated data with one-way transformations (such as masking) and two-way transformations (such as tokenization and format-preserving encryption).
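
As a rough sketch of encryption as a service, the snippet below encrypts and decrypts a value through a central transit-style endpoint via hvac, so the application never handles raw keys. The key name “customer-data” and the server address are hypothetical:

```python
# Minimal sketch: offload encryption to a central service so applications
# never see the underlying keys. Assumes a Vault-style transit engine with a
# key named "customer-data" -- hypothetical values for illustration only.
import base64
import hvac

client = hvac.Client(url="https://secrets.example.internal:8200", token="<app token>")

plaintext = "123-45-6789"  # e.g., a Social Security number
encoded = base64.b64encode(plaintext.encode()).decode()

enc = client.secrets.transit.encrypt_data(name="customer-data", plaintext=encoded)
ciphertext = enc["data"]["ciphertext"]  # safe to store alongside the record

dec = client.secrets.transit.decrypt_data(name="customer-data", ciphertext=ciphertext)
recovered = base64.b64decode(dec["data"]["plaintext"]).decode()
assert recovered == plaintext
```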

Many organizations in highly regulated industries require secure key management solutions that solidify the root of trust for their cloud ecosystem and meet strict regulatory requirements for data encryption, such as GDPR and FINMA. When organizations need to bring their own keys to the cloud, they should seek a solution that supports full key lifecycle management.
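
Under the same hypothetical transit-style setup, key lifecycle management might be sketched like this: create a named key, then rotate it so new encryption operations use the latest key version while older versions can still decrypt existing data.

```python
# Minimal sketch of key lifecycle management (hypothetical key name and setup).
import hvac

client = hvac.Client(url="https://secrets.example.internal:8200", token="<admin token>")

client.secrets.transit.create_key(name="customer-data")
client.secrets.transit.rotate_key(name="customer-data")  # adds a new key version
```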

Step 5. Authenticate and authorize all network traffic

Applications talking to databases, users accessing hosts and services, and servers talking across clouds — traditionally these have all been protected by allowing or restricting access based on IP addresses. Because IP addresses are so ephemeral in today’s dynamic, cloud-based infrastructure, those access restrictions no longer work as effectively. To cope, organizations are moving from IP-based authorization to identity-based authorization. 

Organizations should adopt a solution designed to help them automate, discover, and secure services and service connections across any runtime platform. It should enable machine-to-machine access to services by enforcing authentication between applications, ensuring that only the right machines talk to each other based on identity-driven policies.

Service registries are one type of tool that can help manage dynamic IP addresses by creating identity abstractions and tracking services centrally. A service registry lets Kubernetes services use native Kubernetes service discovery to find and connect to registered external services, and lets those external services discover and connect to Kubernetes services in turn. Organizations may also consider a service mesh, which assigns a service identity to each service running on the Kubernetes cluster. Based on that identity, the mesh can authenticate services using mTLS, and service-access requests can be authorized or blocked using intentions, which let operators define service-to-service communication permissions by service name.
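
For a simplified picture of identity-based service authentication, the sketch below configures a mutual TLS listener with Python’s standard ssl module: callers must present a certificate signed by the mesh’s certificate authority, and authorization can key off the presented identity rather than the source IP. The certificate paths are hypothetical, and in a real service mesh a sidecar proxy usually does this on the service’s behalf:

```python
# Minimal sketch: authenticate service-to-service traffic with mutual TLS
# rather than trusting source IP addresses. Certificate file names are
# hypothetical; a service mesh sidecar typically handles this automatically.
import socket
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.verify_mode = ssl.CERT_REQUIRED                 # callers MUST present a certificate
context.load_cert_chain("server.crt", "server.key")     # this service's own identity
context.load_verify_locations("service-mesh-ca.crt")    # CA that issues workload identities

with socket.create_server(("0.0.0.0", 8443)) as listener:
    with context.wrap_socket(listener, server_side=True) as tls_listener:
        conn, addr = tls_listener.accept()              # handshake fails without a valid client cert
        peer = conn.getpeercert()["subject"]            # authorize on identity, not on address
        print("authenticated peer:", peer)
        conn.close()
```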

Step 6. Enable multi-factor authentication

Protecting user accounts is vital to an enterprise’s security strategy. Even the best security tools alone mean little if the organization doesn’t also protect itself against cracked passwords. Cybercriminals often use legitimate compromised credentials to gain a foothold in an organization, avoiding detection for a median time of 21 days, according to the M-Trends 2022 Report. Verizon’s 2021 Data Breach Investigations Report found that 61% of breaches involved credential data. The best way to deter these breaches is through multi-factor authentication (MFA): Microsoft Security reports that enabling MFA reduces account credential compromise by 99.9%.
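
Here is a minimal sketch of adding a time-based one-time password (TOTP) as a second factor, using the open-source pyotp library; enrollment and secret storage are simplified for illustration:

```python
# Minimal sketch: require a possession factor (TOTP code) on top of the
# knowledge factor (password). Enrollment and storage are simplified here.
import pyotp

# Generated once at enrollment, stored server-side and in the user's authenticator app.
user_totp_secret = pyotp.random_base32()
totp = pyotp.TOTP(user_totp_secret)

# Provisioning URI handed to the user's authenticator app (usually as a QR code).
print(totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleCorp"))

def login(password_ok: bool, submitted_code: str) -> bool:
    # A stolen password alone is not enough; the current TOTP code is also required.
    return password_ok and totp.verify(submitted_code)
```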

The Time is Now for Zero Trust Security

Whether a breach is caused by a hack or by human error, the question is no longer if it will occur but when, so it’s critical to have systems and processes in place ahead of time. To help safeguard today’s dynamic environments, organizations need an identity-based security approach, a secrets management platform, and a dynamic secrets strategy. Best practices also include robust encryption as a service, a network authN and authZ strategy such as identity brokering, a service mesh, multi-factor authN, and secure session management.

Organizations should look for a solution that delivers the fundamental, identity-based capabilities needed to protect today’s dynamic multi-cloud environments and enables a zero trust security approach that lets them:

  • Improve enterprise security posture with identity-based authorization and access controls at all levels of networking and infrastructure as well as fully integrated data encryption services.
  • Reduce the likelihood of a breach due to secrets sprawl by authenticating and authorizing every machine-to-machine and human-to-machine request for access.
  • Accelerate secure multi-cloud adoption with centralized secrets management that spans on-premises, hybrid, and multi-cloud environments, with dynamic service discovery for machine access.

Perhaps best of all, these solutions are flexible enough to be deployed using a phased approach, at any stage of your journey toward zero trust security.

As the migration toward cloud continues to accelerate, it’s clear that the challenges of today’s dynamic, ephemeral environments will continue to grow. To get out in front of the evolving threat landscape, the time to make your zero trust security vision a reality is now. 

To learn more about the transformative nature of cloud native applications and open source software, join us at KubeCon + CloudNativeCon Europe 2023, hosted by the Cloud Native Computing Foundation, from April 18-21.