Access Denied: Table X:Y.Z: The user 123-compute@developer.gserviceaccount.com does not have permission to query table X:Y.Z (BigQuery)
- Authenticate with a service account that has permission to query the table, instead of relying on the default Compute Engine service account. In Python:
- from google.cloud import bigquery
- from google.oauth2 import service_account
- credentials = service_account.Credentials.from_service_account_file('path/to/file.json')
- project_id = 'my-bq'
- client = bigquery.Client(credentials=credentials, project=project_id)
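- As a quick sanity check (the table reference is an assumption), verify that the new credentials can run a query:
- client.query('SELECT COUNT(*) FROM `my-bq.dataset.table`').result()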
Cannot connect to the instance using SSH since the disk is full (GCE)
- Check whether your operating system supports automatic resizing. If so, you can edit the VM's root disk in the Cloud Console and increase its size; after you restart the instance, the operating system can automatically resize the partition to recognize the additional space.
- Use the interactive serial console to log in to your VM and clean up its disk, or copy the data to other storage first if you will need it later.
- If you know which data you want to delete, you can configure a startup script that removes the files (e.g. rm /tmp/*) and reboot your VM to run the script.
- You can detach the persistent disk and attach it to another machine as an additional disk. On the temporary machine, mount it and clean up your data, or copy the data to other storage if you will need it later. Finally, recreate the original instance with the same boot disk. You can follow the same steps described in this video to add your disk to another Linux VM, but attach your existing boot disk instead of creating a new disk.
- Check whether your operating system supports automatic resizing. If it does, create a snapshot of your persistent disk, create a new, larger persistent disk from the snapshot, and finally recreate the original instance with this larger boot disk (a gcloud sketch follows below).
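- A minimal gcloud sketch of the snapshot approach (disk name, zone, and size are assumptions):
- gcloud compute disks snapshot my-boot-disk --zone=us-central1-a --snapshot-names=my-boot-snap
- gcloud compute disks create my-boot-disk-larger --source-snapshot=my-boot-snap --size=200GB --zone=us-central1-a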
No scalar data was found (TensorBoard)
- Use the gcloud command to train the model.
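- E.g. a local training run (the module and output paths are assumptions), after which TensorBoard can read the summaries:
- gcloud ml-engine local train --module-name=trainer.task --package-path=trainer/ -- --job-dir=output/
- tensorboard --logdir=output/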
prediction_lib.PredictionError: Failed to load model: Cloud ML only supports TF 1.0 or above and models saved in SavedModel format. (Error code: 0) (ML Engine)
- Check the model path, i.e. the value of the "--model-dir" flag.
- Note:
- Do not use the model location printed in the log output.
- E.g.
- INFO:tensorflow:SavedModel written to: b"output/export/census/temp-b'1531882849'/saved_model.pb"
- Instead, use "output/export/census/1531882849" (the final export directory, without the temp- prefix).
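- You can verify the path locally (file names are assumptions):
- gcloud ml-engine local predict --model-dir=output/export/census/1531882849 --json-instances=test.json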
"error": "Prediction failed: unknown error." (ml engine)
- This is because the model doesn't support the specified instance format.
- E.g.
- The model expects JSON instances for prediction.
- However, a CSV instance was submitted for prediction.
- If the error still occurs, try specifying a model version that supports the instance format you are sending.
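- E.g. (model and version names are assumptions): put one JSON object per line in instances.json and pass it with --json-instances:
- gcloud ml-engine predict --model=census --version=v1 --json-instances=instances.json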
ERROR: (gcloud.ml-engine.jobs.submit.training) Could not copy [/tmp/.../output/trainer-0.0.0.tar.gz] to [.../trainer-0.0.0.tar.gz]. Please retry: HTTPError 404: Not Found (ML Engine)
- Check the bucket name; an HTTP 404 here typically means the target bucket does not exist.
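- E.g. (bucket name is an assumption), verify the bucket exists and create it if it does not:
- gsutil ls gs://my-bucket
- gsutil mb gs://my-bucket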
The schema of the pandas DataFrame created by read_gbq is different from the BigQuery table (BigQuery)
- Use the google.cloud.bigquery client instead.
- E.g. client.query('SELECT * FROM `projectId.dataset.table` LIMIT 1').result().schema
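- A minimal sketch (the table reference is an assumption); to_dataframe() builds the DataFrame from the table's own schema:
- from google.cloud import bigquery
- client = bigquery.Client()
- rows = client.query('SELECT * FROM `projectId.dataset.table` LIMIT 1').result()
- print(rows.schema)
- df = rows.to_dataframe()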
java.net.UnknownHostException: metadata (general)
- Set one of the configurations below.
- google.cloud.auth.service.account.json.keyfile
- fs.gs.auth.service.account.json.keyfile
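- E.g. (the key path is an assumption), on the job's Hadoop Configuration:
- conf.set("google.cloud.auth.service.account.json.keyfile", "/path/to/key.json")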
java.io.IOException: Error accessing: bucket: null (Hadoop)
- Set the "mapred.bq.gcs.bucket" configuration.
java.lang.NullPointerException: Required parameter projectId must be specified (Hadoop)
- Set the "mapred.bq.project.id" configuration.
org.apache.beam.sdk.util.UserCodeException: java.lang.RuntimeException: Failed to create load job with id prefix ${ID prefix}, reached max retries: 3, last failed load job (BigQuery)
- Make sure you use the right data type for the related column when creating the TableRow.
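- E.g. (schema assumed): for an INTEGER column, call new TableRow().set("age", 35) with an integer value, not the string "35".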
Error detected while parsing row starting at position: 556531513. Error: Bad character (ASCII 0) encountered (BigQuery)
- Find the character causing the problem.
- less +556531513P test.csv
- There will be a character displayed as ^@ (Ctrl-@, the NUL character).
- Avoid writing it when producing the CSV file, or remove it from the existing CSV file, e.g.:
- gsutil cp gs://bucket/test.csv - | tr -d '\000' | gsutil cp - gs://bucket/test2.csv