
I am working on a Bitbucket pipeline for pushing an image to Google Container Registry. I have created a service account with the Storage Admin role (bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com).


gcloud auth activate-service-account --key-file key.json
gcloud config set project mgcp-xxxx
gcloud auth configure-docker --quiet
docker push eu.gcr.io/mgcp-xxxx/image-name

Although the login is successful, I get: Token exchange failed for project 'mgcp-xxxx'. Caller does not have permission 'storage.buckets.get'. To configure permissions, follow instructions at: https://cloud.google.com/container-registry/docs/access-control

Can anyone advise on what I am missing?

Thanks!

Maxim
Tania Petsouka

12 Answers


In the past I had another service account with same name and different permissions. After discovering that service account names are cached, I created a new service account with different name and it's pushing properly.
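Since the old name stays cached, the workaround amounts to creating a replacement account under a fresh name, re-granting the role, and issuing a new key. A sketch of those steps, assuming the question's project mgcp-xxxx; the account name bitbucket-authorization-v2 is made up for illustration:

```shell
# Create a service account under a name that has never been used in this project
gcloud iam service-accounts create bitbucket-authorization-v2 \
  --display-name "Bitbucket authorization (new)"

# Re-grant the Storage Admin role to the new account
gcloud projects add-iam-policy-binding mgcp-xxxx \
  --member serviceAccount:bitbucket-authorization-v2@mgcp-xxxx.iam.gserviceaccount.com \
  --role roles/storage.admin

# Issue a fresh JSON key for the pipeline to use
gcloud iam service-accounts keys create key.json \
  --iam-account bitbucket-authorization-v2@mgcp-xxxx.iam.gserviceaccount.com
```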

Tania Petsouka

For anyone reading all the way here: the other suggestions did not help me, but I found that the Cloud Service Build Account role was also required. With it, the storage.buckets.get error disappears.

This is my minimal two-role setup to push Docker images.

The Cloud Service Build Account role, however, adds many more permissions than simply storage.buckets.get. The exact permissions can be found here.
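If you prefer the CLI to the console, the role can be granted with something like the following. The project and service account are the ones from the question; the role ID roles/cloudbuild.builds.builder corresponds to the role (renamed "Cloud Build Service Account", per the comments below):

```shell
# Grant the Cloud Build Service Account role to the pipeline's service account
gcloud projects add-iam-policy-binding mgcp-xxxx \
  --member serviceAccount:bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com \
  --role roles/cloudbuild.builds.builder
```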

Note: I am well aware that the Cloud Service Build Account role also adds the storage.objects.get permission. However, adding roles/storage.objectViewer did not resolve my problem, regardless of the fact that it includes the storage.objects.get permission.

If the above does not work, you might have the wrong account active. This can be resolved with:

gcloud auth activate-service-account --key-file key.json

If that does not work, you might need to set the Docker credential helpers with:

gcloud auth configure-docker --project <project_name>

On one final note: there seemed to be some delay between setting a role and it taking effect via the gcloud tool. It was minimal, though — less than a minute.

Cheers

Shine
  • gcloud auth activate-service-account --key-file was the solution for me. Thanks – Jørgen Jul 08 '20 at 08:01
  • Thanks for this! Not sure if they renamed it, but it's now "Cloud Build Service Account" rather than "Cloud Service Build Account". If you need to set it from the command line or API, the name of the role is `roles/cloudbuild.builds.builder` – Matt Browne Apr 02 '21 at 17:20

You need to be logged in to your account and have gcloud set to the project you'd like. There is a good chance you're simply not logged in.

gcloud auth login

gcloud config set project <PROJECT_ID_HERE>

Patrick Collins

For anyone else coming across this: my issue was that I had not granted my service account the Storage Legacy Bucket Reader role; I'd only granted it Storage Object Viewer. Adding that legacy role fixed it.

It seems Docker is still using a legacy method to access GCR.
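Since legacy bucket roles are granted per bucket rather than per project, this is done with gsutil against the registry's storage bucket. A sketch, assuming the question's project; GCR keeps gcr.io images in artifacts.&lt;PROJECT&gt;.appspot.com (regional hosts like eu.gcr.io use a prefixed bucket such as eu.artifacts.&lt;PROJECT&gt;.appspot.com):

```shell
# Grant the legacy bucket reader role on the GCR backing bucket
gsutil iam ch \
  serviceAccount:bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com:roles/storage.legacyBucketReader \
  gs://artifacts.mgcp-xxxx.appspot.com
```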

Jonas D

These are the step-by-step commands that got me to push my first container to a private GCR repo:

export PROJECT=pacific-shelter-218
export KEY_NAME=key-name1
export KEY_DISPLAY_NAME='My Key Name'

# Create the service account and download a JSON key for it
sudo gcloud iam service-accounts create ${KEY_NAME} --display-name "${KEY_DISPLAY_NAME}"
sudo gcloud iam service-accounts list
sudo gcloud iam service-accounts keys create --iam-account ${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com key.json

# Grant it Storage Admin so it can push to the GCR bucket
sudo gcloud projects add-iam-policy-binding ${PROJECT} --member serviceAccount:${KEY_NAME}@${PROJECT}.iam.gserviceaccount.com --role roles/storage.admin

# Log Docker in with the JSON key and push
sudo docker login -u _json_key -p "$(cat key.json)" https://gcr.io
sudo docker push gcr.io/pacific-shelter-218/mypcontainer:v2
vitaly goji
    thank you! That finally worked for me. I tried creating the service account via the UI, and it would simply not work. I'll never know why. – jpbochi Dec 09 '19 at 11:46
  • I ran into similar problems with the UI. I managed to get it to work via the UI by adding the role: Editor ... but admit this is a rather recklessly insecure blunderbuss approach. – andrew pate Feb 10 '20 at 18:21
  • Using service account for users is a terrible idea. Users should use user accounts. – Francisco Delmar Kurpiel Apr 17 '20 at 10:18

Here in the future, I've discovered that I no longer have any Legacy role options. In this case I was forced to grant full Storage Admin. I'll open a ticket with Google about this; that's a bit extreme just to allow me to push an image. This might help someone else from the future.

MXWest

Adding these roles to the service account in Google Cloud IAM fixed it for me:

Editor, Storage Object Admin, Storage Object Viewer

s4wet

I think the discrepancy is that https://cloud.google.com/container-registry/docs/access-control says, in the #permissions_and_roles section, that you need the Storage Admin role in order to push images. However, the next section, which explains how to configure access, says to add Storage Object Admin to enable push access for the account you're configuring. Switching to Storage Admin should fix the issue.
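If project-wide Storage Admin feels too broad, one option is to grant the role only on the registry's backing bucket. A sketch, assuming the question's project and service account (the bucket name follows GCR's artifacts.&lt;PROJECT&gt;.appspot.com convention):

```shell
# Grant Storage Admin on just the GCR bucket, not the whole project
gsutil iam ch \
  serviceAccount:bitbucket-authorization@mgcp-xxxx.iam.gserviceaccount.com:roles/storage.admin \
  gs://artifacts.mgcp-xxxx.appspot.com
```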

dcow

GCR just uses GCS to store images; check the permissions on your artifacts bucket in GCS within the same project.
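To inspect that bucket, something like the following should work (project ID is the question's; the bucket name follows GCR's artifacts.&lt;PROJECT&gt;.appspot.com convention):

```shell
# List the project's buckets to find the registry bucket
gsutil ls -p mgcp-xxxx

# Dump the IAM policy of the GCR backing bucket
gsutil iam get gs://artifacts.mgcp-xxxx.appspot.com
```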

Dan

I tried several things, but it turned out I just had to run gcloud auth configure-docker.

Nebulastic

I had a hard time figuring this out.

Although the error message was the same, my issue was that I was using the project name, not the project ID, in the image URL.
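To illustrate the distinction: the registry path must be built from the project ID (mgcp-xxxx in the question), never the human-readable project name. A minimal sketch; in a real pipeline you could fill PROJECT_ID from gcloud config get-value project:

```shell
# Use the project ID, not the display name, in the registry path
PROJECT_ID="mgcp-xxxx"   # e.g. "$(gcloud config get-value project)"
IMAGE_URL="eu.gcr.io/${PROJECT_ID}/image-name"
echo "${IMAGE_URL}"
```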

Natalia C

I created a separate service account to handle GCR IO and added the Artifact Registry Administrator role (I need to push and pull images), and it started pushing images to GCR again.
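A sketch of that setup for an Artifact Registry-hosted repository; the service account name, repository name "images", and region europe-west1 are all made up for illustration:

```shell
# Grant Artifact Registry Administrator to the dedicated service account
gcloud projects add-iam-policy-binding mgcp-xxxx \
  --member serviceAccount:gcr-io@mgcp-xxxx.iam.gserviceaccount.com \
  --role roles/artifactregistry.admin

# Register the Docker credential helper for the regional registry host, then push
gcloud auth configure-docker europe-west1-docker.pkg.dev
docker push europe-west1-docker.pkg.dev/mgcp-xxxx/images/image-name
```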

DK_DEV