
server-side apply conflict warning doesn't show which resource is causing the problem, making it very hard to diagnose #1683

Open
kghost opened this issue Dec 5, 2024 · 3 comments
Labels
kind/feature Categorizes issue or PR as related to a new feature. needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one.

Comments

@kghost

kghost commented Dec 5, 2024

Here I used `kubectl apply` to apply a directory with hundreds of resources:

kubectl apply -f manifests --server-side

Then I got lots of warnings like this:

Apply failed with 1 conflict: conflict with "kubectl-client-side-apply" using apps/v1: .spec.template.spec.containers[name="kube-rbac-proxy"].env[name="IP"].valueFrom.fieldRef
Please review the fields above--they currently have other managers. Here
are the ways you can resolve this warning:
* If you intend to manage all of these fields, please re-run the apply
  command with the `--force-conflicts` flag.
* If you do not intend to manage all of the fields, please edit your
  manifest to remove references to the fields that should keep their
  current managers.
* You may co-own fields by updating your manifest to match the existing
  value; in this case, you'll become the manager if the other manager(s)
  stop managing the field (remove it from their configuration).
See https://kubernetes.io/docs/reference/using-api/server-side-apply/#conflicts

As you can see, the output only shows the version, with no group and no resource name, which makes it very hard to locate the problem.
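As a workaround until the warning carries the resource identity, one way to narrow down the offender is to re-apply each manifest file individually with a server-side dry run; the file that fails points at the conflicting resource. This is a minimal sketch, assuming the manifests are a flat directory of `.yaml` files; `find_conflicting_manifests` is a hypothetical helper name, not a kubectl feature:

```shell
#!/bin/sh
# Hypothetical helper (not part of kubectl): apply each manifest file
# individually with a server-side dry run to narrow down which file
# contains the resource that triggers the conflict. A dry run does not
# persist anything, so this is safe to repeat.
find_conflicting_manifests() {
  dir=$1
  for f in "$dir"/*.yaml; do
    [ -e "$f" ] || continue
    if ! kubectl apply -f "$f" --server-side --dry-run=server >/dev/null 2>&1; then
      echo "apply failed (possible conflict) in: $f"
    fi
  done
}

# Usage against a real cluster, e.g.:
#   find_conflicting_manifests manifests
```

Note this only isolates the file, not the individual object when a file contains several resources, and any apply error (not only a conflict) will be reported.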

What would you like to be added:

We should output which resource is causing the problem, showing its group and name.

Why is this needed:

@kghost kghost added the kind/feature Categorizes issue or PR as related to a new feature. label Dec 5, 2024
@k8s-ci-robot k8s-ci-robot added the needs-triage Indicates an issue or PR lacks a `triage/foo` label and requires one. label Dec 5, 2024
@k8s-ci-robot
Contributor

This issue is currently awaiting triage.

SIG CLI takes a lead on issue triage for this repo, but any Kubernetes member can accept issues by applying the triage/accepted label.

The triage/accepted label can be added by org members by writing /triage accepted in a comment.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

@ardaguclu
Member

The first line of the error message indicates the field:

Apply failed with 1 conflict: conflict with "kubectl-client-side-apply" using apps/v1: .spec.template.spec.containers[name="kube-rbac-proxy"].env[name="IP"].valueFrom.fieldRef
Please review the fields above--they currently have other managers.

@kghost
Author

kghost commented Dec 5, 2024

The field alone is not very helpful; I have a dozen objects with the same fields. It would be better if we showed the object ref.
