
I hit some issues today developing some new flows - the first I've done reading from & loading into EU-region BigQuery databases.

To isolate the issue, I took the following steps:

  1. Create a new BQ database in the EU region
  2. Create a table by uploading a CSV
  3. Write a flow which reads from this table and outputs into a new table in the same database, without any transforms

And still the job fails with the following message:

status: {
    "errorResult": {
        "message": "Cannot read and write in different locations: source: EU, destination: US",
        "reason": "invalid"
    },
    "errors": [{
        "message": "Cannot read and write in different locations: source: EU, destination: US",
        "reason": "invalid"
    }],
    "state": "DONE"
}
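The error means a single BigQuery job tried to read from an EU location and write to a US one, which BigQuery never allows; something on the write side (often a staging bucket) must be in the US. A minimal sketch of checking both locations with the google-cloud-bigquery and google-cloud-storage client libraries — all project, dataset, and bucket names here are hypothetical placeholders:

```python
def same_region(source_location: str, dest_location: str) -> bool:
    """BigQuery jobs can only read and write within one location;
    multi-region codes like 'EU' and 'US' compare case-insensitively."""
    return source_location.strip().upper() == dest_location.strip().upper()

def report_locations(project: str, dataset_id: str, staging_bucket: str) -> None:
    # Requires the google-cloud-bigquery and google-cloud-storage
    # packages plus GCP credentials; names above are placeholders.
    from google.cloud import bigquery, storage
    ds_loc = bigquery.Client(project=project).get_dataset(
        f"{project}.{dataset_id}").location       # e.g. 'EU'
    bkt_loc = storage.Client(project=project).get_bucket(
        staging_bucket).location                  # e.g. 'US'
    print(f"dataset: {ds_loc}, staging bucket: {bkt_loc}, "
          f"co-located: {same_region(ds_loc, bkt_loc)}")
```

If co-located comes back False, the job will fail with the error above no matter what the flow itself does.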

This is the test flow:

[Screenshot: test flow]

And this is the resulting DataFlow:

[Screenshot: resulting Dataflow job]

Adam Hopkinson

2 Answers


I had the same issue with my EU data sources. Even though my sources in BQ were in the EU, the Dataprep default staging buckets were in the US.

I recreated the same bucket structure using the EU location, as I was not able to modify the location of the staging buckets Dataprep had already auto-created.
This link: https://cloud.google.com/dataprep/docs/html/User-Profile-Page_57344911 helped me figure out where to change the temp, job run, and upload paths afterwards.
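A sketch of that workaround with the google-cloud-storage client library. The bucket and project names are hypothetical, and the folder layout is only an assumption mirroring the temp / job run / upload paths the Dataprep User Profile page lets you repoint; a new bucket is needed because an existing bucket's location cannot be changed:

```python
def staging_paths(bucket: str) -> dict:
    # Assumed layout matching the three path settings on Dataprep's
    # User Profile page (temp, job run, upload) - adjust to taste.
    return {
        "temp": f"gs://{bucket}/temp",
        "jobrun": f"gs://{bucket}/jobrun",
        "upload": f"gs://{bucket}/uploads",
    }

def create_eu_staging_bucket(project: str, bucket: str) -> None:
    # Requires the google-cloud-storage package and GCP credentials.
    # A bucket's location is fixed at creation time, hence a fresh bucket.
    from google.cloud import storage
    storage.Client(project=project).create_bucket(bucket, location="EU")
```

After creating the bucket, paste the three gs:// paths into the corresponding fields on the User Profile page so Dataprep stages everything in the EU.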

Doti