Pomerium upgrade from v0.20.0 to v0.22.0: grpc check ext_authz_error

What happened?

We are using the Helm chart (pomerium 34.0.1 · helm/pomerium) to install Pomerium in our EKS clusters.
After upgrading from chart 33.0.2 (Pomerium v0.20.0) to 34.0.1 (Pomerium v0.22.0), we started to see the following error:

{"level":"error","error":"hpke: error requesting hpke-public-key endpoint: Get \"https://authenticate.nebula-dkh3.nprd.euc1.sys.caas.oneweb.mercedes-benz.com/.well-known/pomerium/hpke-public-key\": context canceled","request-id":"dcd7d2e6-47de-40ce-9d3f-0d4563752c64","time":"2023-11-08T08:48:19Z","message":"grpc check ext_authz_error"}

What’s your config.yaml?

The chart is installed by ArgoCD with the following values:

helm:
      releaseName: "pomerium-{{ .Values.spec.cluster.env }}-{{ .Values.spec.cluster.dns_region }}"
      values: |
        authenticate:
          existingTLSSecret: pomerium-tls
          idp:
            clientID: "pomerium.{{ .Values.spec.misc.nebula_domain }}"
            clientSecret: "{{ .Values.spec.pomerium.dex_secret }}"
            provider: oidc
            url: {{ .Values.spec.pomerium.dex_url }}
          replicaCount: 1
          autoscaling:
            enabled: false
            minReplicas: 1
            maxReplicas: 5
            targetCPUUtilizationPercentage: 50
            targetMemoryUtilizationPercentage: 50
        authorize:
          existingTLSSecret: pomerium-tls
        config:
          generateTLS: false
          existingCASecret: pomerium-tls
          cookieSecret: "{{ .Values.spec.pomerium.cookie_secret }}"
          routes:
            - allowed_users: {{ .Values.spec.pomerium.users }}
              from: "https://apps.{{ .Values.spec.misc.nebula_domain }}"
              preserve_host_header: true
              to: http://forecastle.caas-base:80
            - allowed_users: {{ .Values.spec.pomerium.users }}
              from: "https://{{ .Values.spec.cluster.tenant }}-{{ .Values.spec.cluster.name }}-kubecfg.{{ .Values.spec.misc.nebula_domain }}"
              preserve_host_header: true
              to: http://nginx-serve-kubecfg.caas-base:80
            - allowed_users: {{ .Values.spec.pomerium.users }}
              from: "https://policy-reporter.{{ .Values.spec.misc.nebula_domain }}"
              preserve_host_header: true
              to: http://kyverno-policy-reporter-{{ .Values.spec.cluster.env }}-{{ .Values.spec.cluster.dns_region }}-ui.caas-security:8080
          rootDomain: "{{ .Values.spec.misc.nebula_domain }}"
        extraEnv:
          JWT_CLAIMS_HEADERS: email, groups, user
          POMERIUM_DEBUG: false
          LOG_LEVEL: warn
        ingress:
          className: alb
          pathType: ImplementationSpecific
          annotations:
            alb.ingress.kubernetes.io/scheme: internet-facing
            alb.ingress.kubernetes.io/target-type: ip
            alb.ingress.kubernetes.io/listen-ports: '[{"HTTP": 80}, {"HTTPS": 443}]'
            alb.ingress.kubernetes.io/ssl-redirect: '443'
            alb.ingress.kubernetes.io/certificate-arn: "{{ .Values.spec.aws.cert_arn_nebula }}"
            alb.ingress.kubernetes.io/ssl-policy: ELBSecurityPolicy-TLS-1-2-2017-01
            alb.ingress.kubernetes.io/backend-protocol: HTTPS
            alb.ingress.kubernetes.io/group.name: nebula
            nginx.ingress.kubernetes.io/proxy-buffer-size: 16k
          enabled: true
        metrics:
          enabled: true
        proxy:
          existingTLSSecret: pomerium-tls
          deployment:
            extraEnv:
              DEFAULT_UPSTREAM_TIMEOUT: 300s
        databroker:
          existingTLSSecret: pomerium-tls
        resources:
          limits:
            cpu: 1000m
            memory: 600Mi
          requests:
            cpu: 100m
            memory: 300Mi

What did you see in the logs?

{"level":"error","error":"hpke: error requesting hpke-public-key endpoint: Get \"https://authenticate.nebula-dkh3.nprd.euc1.sys.caas.oneweb.mercedes-benz.com/.well-known/pomerium/hpke-public-key\": context canceled","request-id":"dcd7d2e6-47de-40ce-9d3f-0d4563752c64","time":"2023-11-08T08:48:19Z","message":"grpc check ext_authz_error"}

Hello,

The Helm chart has been deprecated for quite some time. Please use the recommended all-in-one installation method (Installation | Pomerium), which has fewer moving parts.
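For anyone migrating, here is a minimal sketch of what the equivalent core config.yaml for the all-in-one binary might look like, assuming the chart values above map onto the standard Pomerium settings (every hostname, user, and secret below is a placeholder, not a real value):

# config.yaml — all-in-one Pomerium deployment (placeholders throughout)
authenticate_service_url: https://authenticate.example.com

idp_provider: oidc
idp_provider_url: https://dex.example.com
idp_client_id: pomerium.example.com
idp_client_secret: REPLACE_ME
cookie_secret: REPLACE_ME  # base64-encoded 32-byte random key

jwt_claims_headers: email,groups,user
default_upstream_timeout: 300s  # replaces the DEFAULT_UPSTREAM_TIMEOUT env var

routes:
  - from: https://apps.example.com
    to: http://forecastle.caas-base:80
    preserve_host_header: true
    allowed_users:
      - user@example.com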