Monday, May 16, 2022

Help 4 GCP

Cloud Workflows

  • Access Denied: BigQuery: Permission denied while getting Drive credentials
    • Option 1
      • http.post + googleapis.bigquery.v2.jobs.query
      • Workflow step (YAML):

        call: http.post
        args:
          url: ${"https://bigquery.googleapis.com/bigquery/v2/projects/" + project + "/queries"}
          headers:
            Content-type: "application/json"
          auth:
            type: OAuth2
            scope: ["https://www.googleapis.com/auth/drive", "https://www.googleapis.com/auth/cloud-platform", "https://www.googleapis.com/auth/bigquery"]
          body:
            query: select * from sheets.sheets_data
            timeoutMs: 200000
            useLegacySql: false
        result: response
      • Google Sheets > Share > Add the service account that runs the query > Viewer > Done
    • Option 2
      • Scheduled query + googleapis.bigquerydatatransfer.v1
      • Workflow step (YAML):

        call: googleapis.bigquerydatatransfer.v1.projects.locations.transferConfigs.startManualRuns
        args:
          parent: ${scheduled_query_name}
          body:
            requestedRunTime: ${time.format(sys.now())}
        result: response

BigQuery

  • Permission denied while getting Drive credentials 
    • Google Sheets > Share > Add the service account that runs the query > Viewer > Done


Cloud Functions

  • failed to export: failed to write image to the following tags
    • Use “gcloud beta functions deploy” with “--docker-registry=artifact-registry”
  • Your client does not have permission to get URL /yourUrl from this server
    • auth:
      • type: OIDC
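    • In a Workflows step that calls the function, the OIDC auth block looks roughly like this (sketch; the step name and function URL are placeholders):

      call_function:
        call: http.get
        args:
          url: https://REGION-PROJECT.cloudfunctions.net/yourUrl
          auth:
            type: OIDC
        result: fn_response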


Data Fusion

  • The client is not authorized to make this request
    • Add the related role to the data fusion service account
    • E.g., Add Cloud SQL Client to service-[project-number]@gcp-sa-datafusion.iam.gserviceaccount.com
  • MongoSocketException, UnknownHostException
    • Use all shard hosts
  • MongoSocketReadException, Prematurely reached end of stream
    • Add ssl=true
  • mongodb-plugins, Authentication failed
    • Add authSource=admin
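  • Combined, the three MongoDB fixes above give a connection string of roughly this shape (hosts, credentials, and database name are placeholders):

      mongodb://user:pass@shard-00.example.net:27017,shard-01.example.net:27017,shard-02.example.net:27017/mydb?ssl=true&authSource=admin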


Data Studio

  • interval 1 day, Invalid formula
    • Use interval 24 hour
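    • E.g., a calculated field of roughly this shape (sketch; assumes your data source supports Looker Studio's DATETIME_SUB):

      DATETIME_SUB(CURRENT_DATETIME(), INTERVAL 24 HOUR)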


Cloud Storage

  • Error getting access token from metadata server at
    • // Authenticate the GCS connector with a service account key file instead of the metadata server
    • val hadoopConf = spark.sparkContext.hadoopConfiguration
    • hadoopConf.set("google.cloud.auth.service.account.enable", "true")
    • hadoopConf.set("google.cloud.auth.service.account.json.keyfile", "yourKey.json")