Incorrect pod state shown #12762

Open
@hookenz

Description

Problem Description

When I run a rollout restart on a deployment, a new pod is created and all traffic is routed to it before the pod it replaces is terminated, so that uptime is preserved.

kubectl shows the correct status. However, Portainer shows both pods in the running state.

Expected Behavior

The terminating pod should be shown as "Terminating".

Actual Behavior

It doesn't; it still shows "running".

Steps to Reproduce

Use a deployment that takes a while to start up and to terminate.
In my case the image pull policy is set to "always" and the tag is "latest".
The replica count is 1.

kubectl rollout restart deployment/your-deployment
kubectl get all

Compare to the Portainer deployment/application view, which still shows the pods as running.
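For context: during a graceful shutdown a pod's `status.phase` remains "Running"; kubectl prints "Terminating" because it checks `metadata.deletionTimestamp` instead. A dashboard that reads only `status.phase` will keep showing "Running". A minimal sketch of that derivation (the `display_state` helper is hypothetical, not Portainer's actual code):

```python
def display_state(pod: dict) -> str:
    """Return the state string a dashboard should display for a pod."""
    # A set deletionTimestamp means the pod is being deleted,
    # regardless of what status.phase still reports.
    if pod.get("metadata", {}).get("deletionTimestamp"):
        return "Terminating"
    return pod.get("status", {}).get("phase", "Unknown")

# A pod mid-shutdown: phase is still "Running" but deletion has begun.
terminating_pod = {
    "metadata": {"deletionTimestamp": "2024-01-01T00:00:00Z"},
    "status": {"phase": "Running"},
}
running_pod = {"metadata": {}, "status": {"phase": "Running"}}

print(display_state(terminating_pod))  # Terminating
print(display_state(running_pod))     # Running
```

This matches the behaviour observed with kubectl in the steps above: the same pod object yields "Terminating" or "Running" depending on whether the deletion timestamp is consulted.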

Portainer logs or screenshots

I have a screencast of it but cannot seem to upload it. Here is a screenshot instead; notice the state of the pods.
I refreshed the Portainer screen after kubectl was already showing "Terminating". You can clearly see the pod state still shows "running".

[Screenshot: Portainer pod list still showing both pods as "running"]

Portainer version

2.31.3

Portainer Edition

Community Edition (CE)

Platform and Version

microk8s

OS and Architecture

Ubuntu 22.04

Browser

Firefox

What command did you use to deploy Portainer?

standard

Additional Information

No response
