Secure your container environment on GCP, GKE, or Anthos.
Overview
Containerization allows development teams to move fast, deploy software efficiently, and operate at an unprecedented scale. As enterprises create more containerized workloads, security must be integrated at each stage of the build-and-deploy life cycle. Learn how to secure any container environment you run on GCP—whether that's Google Kubernetes Engine or Anthos—in three critical areas.
Infrastructure security
Infrastructure security means that your container management platform provides the right security features. Kubernetes includes security features to protect your identities, secrets, and network, and Google Kubernetes Engine uses native GCP functionality—like Cloud IAM, Cloud Audit Logging, and Virtual Private Clouds—and GKE-specific features like application layer secrets encryption and workload identity to bring the best of Google security to your workloads.
Software supply chain
Securing the software supply chain means that container images are safe to deploy. This is how you make sure your container images are free of known vulnerabilities and that the images you build aren't modified before they're deployed.
Runtime security
Runtime security allows you to identify a container acting maliciously in production and take action to protect your workload.
Running containers allows you to adopt a fundamentally different security model
Simpler patch management and immutability
Containers are meant to be immutable, so you deploy a new image in order to make changes. You can simplify patch management by rebuilding your images regularly, so the patch is picked up the next time a container is deployed. Get the full picture of your environment with regular image security reviews.
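As a sketch of this rebuild-and-redeploy pattern (the project, image, and deployment names are placeholders):

```shell
# Rebuild regularly so patched base-image layers are picked up.
# --pull forces a fetch of the newest base image rather than a stale local copy.
docker build --pull -t gcr.io/my-project/my-app:v1.2.3 .
docker push gcr.io/my-project/my-app:v1.2.3

# Roll out the new image; running containers are replaced, not patched in place.
kubectl set image deployment/my-app my-app=gcr.io/my-project/my-app:v1.2.3
```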
Smaller attack surface
Containers can run on a much smaller host OS than a VM requires, because more of the dependencies are packaged directly into the application image. This minimal host OS reduces the potential attack surface for your workload.
Resource and workload isolation
Containers provide an easy way to isolate resources, such as storage volumes, to certain processes using cgroups and namespaces. With technologies like GKE Sandbox, you can logically isolate workloads in a sub-VM sandbox, separate from other applications.
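For example, GKE Sandbox is enabled per pod through a RuntimeClass; a minimal sketch (the pod, container, and image names are illustrative):

```shell
kubectl apply -f - <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: sandboxed-app
spec:
  runtimeClassName: gvisor   # run this pod inside a GKE Sandbox (gVisor)
  containers:
  - name: app
    image: gcr.io/my-project/my-app:v1
EOF
```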
Infrastructure security
Container infrastructure security is about ensuring that your developers have the tools they need to securely build containerized services. These capabilities are typically built into the container orchestrator, like Kubernetes. If you use Google Kubernetes Engine, this functionality is surfaced natively, in addition to other features of Google Cloud.
Identity and authorization
On Google Kubernetes Engine, use Cloud IAM to manage access to your projects and role-based access control (RBAC) to manage access to your clusters and namespaces.
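Cloud IAM grants project-level access; RBAC then scopes what a user can do inside the cluster. As a sketch, the following grants one user read-only access to pods in a single namespace (the user, role, and namespace names are placeholders):

```shell
# Create a namespaced role that can only read pods.
kubectl create role pod-reader \
  --verb=get,list,watch --resource=pods --namespace=dev

# Bind that role to a specific user in the same namespace.
kubectl create rolebinding alice-pod-reader \
  --role=pod-reader --user=alice@example.com --namespace=dev
```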
Audit logging
Kubernetes captures API audit logs; on Google Kubernetes Engine, Cloud Audit Logging records and stores them for you automatically.
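To review those logs, you can query Cloud Audit Logging from the command line; a rough sketch (the exact log filter may vary by cluster setup):

```shell
# Show recent Kubernetes API audit entries for GKE clusters in this project.
gcloud logging read \
  'resource.type="k8s_cluster" AND logName:"cloudaudit.googleapis.com"' \
  --limit=10 --format=json
```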
Networking
On Google Kubernetes Engine, create a network policy to manage pod-to-pod communications in your cluster. Use private clusters so your nodes have only internal IP addresses, and place Google Kubernetes Engine resources in a Shared VPC.
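As a minimal sketch of a network policy, the following allows only pods labeled app=frontend to reach pods labeled app=backend (the labels are placeholders):

```shell
kubectl apply -f - <<'EOF'
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: backend-allow-frontend
spec:
  podSelector:
    matchLabels:
      app: backend        # the policy applies to backend pods
  ingress:
  - from:
    - podSelector:
        matchLabels:
          app: frontend   # only frontend pods may connect to them
EOF
```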
Compliance
Google Kubernetes Engine features many compliance certifications, including ISO 27001, ISO 27017, ISO 27018, HIPAA, and PCI-DSS.
Minimal host OS
Google Kubernetes Engine uses Container-Optimized OS (COS) by default, an OS purpose-built and optimized for running containers. COS is maintained by Google in open source.
Automatically upgraded components
On GKE, masters are automatically patched to the latest Kubernetes version, and you can use node auto-upgrade to keep your security up to date by automatically applying the latest patches for your nodes.
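Node auto-upgrade can be enabled per node pool; a sketch, assuming an existing cluster (the cluster, pool, and zone names are placeholders):

```shell
# Enable auto-upgrade (and auto-repair) on an existing node pool.
gcloud container node-pools update default-pool \
  --cluster=my-cluster --zone=us-central1-a \
  --enable-autoupgrade --enable-autorepair
```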
Customer-managed encryption keys
Users in regulated industries may need to be in control of the keys used to encrypt data stored in GKE. With customer-managed encryption keys, you can pick a key from Cloud KMS to protect your GKE persistent disks.
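One way to apply a customer-managed key is through a StorageClass that the Compute Engine Persistent Disk CSI driver uses to provision encrypted disks; a sketch (the project, region, key ring, and key names are placeholders):

```shell
kubectl apply -f - <<'EOF'
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: encrypted-pd
provisioner: pd.csi.storage.gke.io
parameters:
  type: pd-standard
  # Disks provisioned from this class are protected by this Cloud KMS key.
  disk-encryption-kms-key: projects/my-project/locations/us-central1/keyRings/my-ring/cryptoKeys/my-key
EOF
```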
Application layer secrets encryption
By default, Kubernetes secrets are stored in plaintext in etcd. GKE encrypts these secrets on disk and monitors this data for insider access, but that alone might not be enough to protect those secrets from a malicious application in your environment. Application layer secrets encryption protects secrets with envelope encryption, using a key you manage in Cloud KMS.
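Application layer secrets encryption is configured at cluster creation (or update) by pointing GKE at a Cloud KMS key; a sketch with placeholder names:

```shell
# Create a cluster whose Kubernetes secrets are envelope-encrypted
# with a key you manage in Cloud KMS.
gcloud container clusters create my-cluster \
  --zone=us-central1-a \
  --database-encryption-key=projects/my-project/locations/us-central1/keyRings/my-ring/cryptoKeys/my-key
```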
Workload Identity
Your containerized application probably needs to connect to other services, like a database, to do its job. To do so, your application first needs to authenticate to them. Workload Identity lets a Kubernetes service account act as a Google service account, so your application can authenticate to Google Cloud services without storing service account keys, following the principle of least privilege for application authentication.
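Setting up Workload Identity involves enabling it on the cluster and then binding a Kubernetes service account to a Google service account; a sketch (the cluster, project, namespace, and account names are placeholders):

```shell
# Enable Workload Identity on an existing cluster.
gcloud container clusters update my-cluster --zone=us-central1-a \
  --workload-pool=my-project.svc.id.goog

# Allow the Kubernetes service account "app-sa" in the "default" namespace
# to act as the Google service account "app-gsa".
gcloud iam service-accounts add-iam-policy-binding \
  app-gsa@my-project.iam.gserviceaccount.com \
  --role=roles/iam.workloadIdentityUser \
  --member="serviceAccount:my-project.svc.id.goog[default/app-sa]"

# Point the Kubernetes service account at its Google counterpart.
kubectl annotate serviceaccount app-sa --namespace=default \
  iam.gke.io/gcp-service-account=app-gsa@my-project.iam.gserviceaccount.com
```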
Managed SSL certs
In GKE, HTTPS load balancers need to be associated with an SSL certificate. You can obtain, manage, and renew these certificates yourself, or have Google obtain, manage, and renew them for you automatically, saving you the burden of renewing (or forgetting to renew) them yourself.
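A Google-managed certificate is requested with a ManagedCertificate resource, which an Ingress then references via the networking.gke.io/managed-certificates annotation; a sketch (the certificate name and domain are placeholders):

```shell
kubectl apply -f - <<'EOF'
apiVersion: networking.gke.io/v1
kind: ManagedCertificate
metadata:
  name: my-cert
spec:
  domains:
  - example.com   # Google provisions and renews a certificate for this domain
EOF
```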