I am new to Python and Airflow. I want to create a Dataproc cluster with an HTTP request inside a PythonOperator task. See the code below:
```python
import requests

def create_cluster():
    # The URL path takes a *region* (e.g. us-central1); the zone belongs in zoneUri below.
    API_ENDPOINT = ("https://dataproc.googleapis.com/v1beta2/projects/"
                    "trim-**********/regions/us-central1/clusters")
    data = {
        "projectId": "trim-**********",
        "clusterName": "cluster-1",
        "config": {
            "configBucket": "",
            "gceClusterConfig": {
                "subnetworkUri": "default",
                "zoneUri": "us-central1-b"
            },
            "masterConfig": {
                "numInstances": 1,
                "machineTypeUri": "n1-standard-1",
                "diskConfig": {
                    "bootDiskSizeGb": 500,
                    "numLocalSsds": 0
                }
            },
            "workerConfig": {
                "numInstances": 2,
                "machineTypeUri": "n1-standard-1",
                "diskConfig": {
                    "bootDiskSizeGb": 100,
                    "numLocalSsds": 0
                }
            }
        }
    }
    # json= sends the body as JSON; data= would form-encode the dict instead.
    r = requests.post(url=API_ENDPOINT, json=data)
    print("Response: %s" % r.text)
```
But I am getting this error:

> Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.

What would be the solution for this error? Thanks in advance.
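From the error message, I understand the request needs an OAuth 2 access token in an `Authorization` header. Below is a minimal sketch of what I think the authenticated request should look like; `build_cluster_request`, the placeholder token, and the project/region/zone values are all made up for illustration (in practice the token would come from something like the google-auth library or `gcloud auth print-access-token`):

```python
import json

# Hypothetical helper: assemble the pieces of an authenticated Dataproc call.
# "token" is just a placeholder string standing in for a real OAuth 2 access token.
def build_cluster_request(token, project_id, region, zone):
    url = ("https://dataproc.googleapis.com/v1beta2/projects/%s/regions/%s/clusters"
           % (project_id, region))
    headers = {
        "Authorization": "Bearer %s" % token,  # the credential the error says is missing
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "projectId": project_id,
        "clusterName": "cluster-1",
        # Region goes in the URL; the zone goes in zoneUri.
        "config": {"gceClusterConfig": {"zoneUri": zone}},
    })
    return url, headers, body

url, headers, body = build_cluster_request(
    "ya29.placeholder", "my-project", "us-central1", "us-central1-b")
print(headers["Authorization"])  # Bearer ya29.placeholder
```

Is adding a header like this the right direction, or is there a more idiomatic way to do it from Airflow?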