Cloud Workflows
- Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials
- Option 1
- http.post + googleapis.bigquery.v2.jobs.query
      - call: http.post
        args:
          url: ${"https://bigquery.googleapis.com/bigquery/v2/projects/" + project + "/queries"}
          headers:
            Content-Type: "application/json"
          auth:
            type: OAuth2
            scope: ["https://www.googleapis.com/auth/drive", "https://www.googleapis.com/auth/cloud-platform", "https://www.googleapis.com/auth/bigquery"]
          body:
            query: select * from sheets.sheets_data
            timeoutMs: 200000
            useLegacySql: false
        result: response
- Google Sheets > Share > Add the service account that runs the query > Viewer > Done
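The same jobs.query call can be sketched in Python; the dataset/table name comes from the snippet above, while the helper name and the commented AuthorizedSession calls are illustrative assumptions (they require google-auth and a credentialed environment):

```python
from typing import List


def build_query_body(query: str, timeout_ms: int = 200000) -> dict:
    """Build the jobs.query request body used by the Workflows step."""
    return {"query": query, "timeoutMs": timeout_ms, "useLegacySql": False}


# Scopes must include Drive so BigQuery can read the Sheets-backed table.
SCOPES: List[str] = [
    "https://www.googleapis.com/auth/drive",
    "https://www.googleapis.com/auth/cloud-platform",
    "https://www.googleapis.com/auth/bigquery",
]

body = build_query_body("select * from sheets.sheets_data")

# To actually send it:
# import google.auth
# from google.auth.transport.requests import AuthorizedSession
# credentials, project = google.auth.default(scopes=SCOPES)
# session = AuthorizedSession(credentials)
# response = session.post(
#     f"https://bigquery.googleapis.com/bigquery/v2/projects/{project}/queries",
#     json=body,
# )
```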
- Option 2
- Scheduled query + googleapis.bigquerydatatransfer.v1
      - call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.startManualRuns
        args:
          parent: ${scheduled_query_name}
          body:
            requestedRunTime: ${time.format(sys.now())}
        result: response
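A rough Python equivalent of this manual-run trigger, assuming the google-cloud-bigquery-datatransfer client; the project, location, and config IDs below are placeholders:

```python
def transfer_config_name(project: str, location: str, config_id: str) -> str:
    """Build the resource name passed as `parent` to startManualRuns."""
    return f"projects/{project}/locations/{location}/transferConfigs/{config_id}"


parent = transfer_config_name("my-project", "us", "1234abcd")

# Requires google-cloud-bigquery-datatransfer and credentials:
# import time
# from google.cloud import bigquery_datatransfer_v1
# from google.protobuf.timestamp_pb2 import Timestamp
# client = bigquery_datatransfer_v1.DataTransferServiceClient()
# now = Timestamp(seconds=int(time.time()))
# response = client.start_manual_transfer_runs(
#     request={"parent": parent, "requested_run_time": now}
# )
```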
BigQuery
- string_field_0 column names after creating an external table from a Google Sheets file
- Add a numeric column; when every column is text, BigQuery cannot distinguish the header row from data and falls back to string_field_N names
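If adding a numeric column is not an option, a hypothetical alternative is declaring the schema explicitly when creating the external table. A sketch of the REST tables.insert body; the table ID, sheet URL, and column names are placeholders:

```python
def build_external_table(table_id: str, sheet_url: str, columns: dict) -> dict:
    """REST body for a Sheets-backed external table with an explicit schema."""
    return {
        "tableReference": {"tableId": table_id},
        "externalDataConfiguration": {
            "sourceFormat": "GOOGLE_SHEETS",
            "sourceUris": [sheet_url],
            "schema": {
                "fields": [{"name": n, "type": t} for n, t in columns.items()]
            },
            # Skip the header row so it is not ingested as data.
            "googleSheetsOptions": {"skipLeadingRows": 1},
        },
    }


table = build_external_table(
    "sheets_data",
    "https://docs.google.com/spreadsheets/d/your-sheet-id",
    {"name": "STRING", "amount": "INT64"},
)
```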
- Permission denied while getting Drive credentials
- Google Sheets > Share > Add the service account that runs the query > Viewer > Done
Cloud Functions
- Your client does not have permission to get URL
- Cloud Functions Developer
- The role grant can take a few minutes to propagate
- failed to export: failed to write image to the following tags
- Use "gcloud beta functions deploy" with "--docker-registry=artifact-registry"
- Your client does not have permission to get URL /yourUrl from this server
      auth:
        type: OIDC
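To reproduce what auth: type: OIDC does outside Workflows, a caller running on GCP can mint an identity token from the metadata server; a sketch with a placeholder function URL (sending the requests requires the requests package and a GCP runtime):

```python
METADATA_IDENTITY = (
    "http://metadata.google.internal/computeMetadata/v1/"
    "instance/service-accounts/default/identity"
)


def identity_token_url(audience: str) -> str:
    """Metadata-server URL that returns an OIDC token for the given audience."""
    return f"{METADATA_IDENTITY}?audience={audience}"


# import requests
# url = "https://REGION-PROJECT.cloudfunctions.net/my-function"  # placeholder
# token = requests.get(
#     identity_token_url(url), headers={"Metadata-Flavor": "Google"}
# ).text
# requests.get(url, headers={"Authorization": f"Bearer {token}"})
```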
Cloud Run
- DefaultCredentialsError: Neither metadata server or valid service account credentials are found
- Use a service account
- Cloud Run Invoker
- The role grant can take a few minutes to propagate
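A small sketch of the service-account fix: DefaultCredentialsError usually means no Application Default Credentials in the environment, so point GOOGLE_APPLICATION_CREDENTIALS at a key file or load one explicitly. The helper name and "sa-key.json" path are placeholders:

```python
import os


def adc_configured() -> bool:
    """True if GOOGLE_APPLICATION_CREDENTIALS points at a key file."""
    return bool(os.environ.get("GOOGLE_APPLICATION_CREDENTIALS"))


# Explicit load (requires google-auth):
# from google.oauth2 import service_account
# creds = service_account.Credentials.from_service_account_file("sa-key.json")
```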
Data Fusion
- The client is not authorized to make this request
- Add the related role to the data fusion service account
- E.g., Add Cloud SQL Client to service-[project-number]@gcp-sa-datafusion.iam.gserviceaccount.com
- MongoSocketException, UnknownHostException
- List all shard hosts in the connection string
- MongoSocketReadException, Prematurely reached end of stream
- Add ssl=true
- mongodb-plugins, Authentication failed
- Add authSource=admin
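The three MongoDB fixes above can be combined into one connection string; a minimal sketch with placeholder hosts and credentials:

```python
from typing import List


def build_mongo_uri(user: str, password: str, hosts: List[str], db: str) -> str:
    """mongodb:// URI listing all shard hosts, with ssl=true and authSource=admin."""
    host_part = ",".join(hosts)
    return f"mongodb://{user}:{password}@{host_part}/{db}?ssl=true&authSource=admin"


uri = build_mongo_uri(
    "app_user",
    "secret",
    ["shard-00-00.example.net:27017", "shard-00-01.example.net:27017"],
    "mydb",
)
```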
Data Studio
- "interval 1 day" causes an Invalid formula error
- Use interval 24 hour
Cloud Storage
- Error getting access token from metadata server at
      val hadoopConf = spark.sparkContext.hadoopConfiguration
      hadoopConf.set("google.cloud.auth.service.account.enable", "true")
      hadoopConf.set("google.cloud.auth.service.account.json.keyfile", "yourKey.json")