Sorry, Your Request Couldn't Be Authenticated. Please Sign in to Your Google Account and Try Again

This page describes troubleshooting methods for common errors you may encounter while using Cloud Storage.

See the Google Cloud Status Dashboard for information about regional or global incidents affecting Google Cloud services such as Cloud Storage.

Logging raw requests

When using tools such as gsutil or the Cloud Storage client libraries, much of the request and response information is handled by the tool. However, it is sometimes useful to see details to assist in troubleshooting. Use the following instructions to return request and response headers for your tool:

Console

Viewing request and response information depends on the browser you're using to access the Google Cloud Console. For the Google Chrome browser:

  1. Click Chrome's main menu button.

  2. Select More Tools.

  3. Click Developer Tools.

  4. In the pane that appears, click the Network tab.

gsutil

Use the global -D flag in your request. For example:

gsutil -D ls gs://my-bucket/my-object

Client libraries

C++

  • Set the environment variable CLOUD_STORAGE_ENABLE_TRACING=http to get the full HTTP traffic.

  • Set the environment variable CLOUD_STORAGE_ENABLE_CLOG=yes to get logging of each RPC.

C#

Add a logger via ApplicationContext.RegisterLogger, and set logging options on the HttpClient message handler. For more information, see the FAQ entry.

Go

Set the environment variable GODEBUG=http2debug=1. For more information, see the Go package net/http.

If you want to log the request body as well, use a custom HTTP client.

Java

  1. Create a file named "logging.properties" with the following contents:

    # Properties file which configures the operation of the JDK logging facility.
    # The system will look for this config file to be specified as a system property:
    # -Djava.util.logging.config.file=${project_loc:googleplus-simple-cmdline-sample}/logging.properties

    # Set up the console handler (uncomment "level" to show more fine-grained messages)
    handlers = java.util.logging.ConsoleHandler
    java.util.logging.ConsoleHandler.level = CONFIG

    # Set up logging of HTTP requests and responses (uncomment "level" to show)
    com.google.api.client.http.level = CONFIG
  2. Use logging.properties with Maven

    mvn -Djava.util.logging.config.file=path/to/logging.properties insert_command

For more information, see Pluggable HTTP Transport.

Node.js

Set the environment variable NODE_DEBUG=https before calling the Node script.

PHP

Provide your own HTTP handler to the client using httpHandler and set up middleware to log the request and response.

Python

Use the logging module. For example:

import logging
import http.client

logging.basicConfig(level=logging.DEBUG)
http.client.HTTPConnection.debuglevel = 5

Ruby

At the top of your .rb file, after require "google/cloud/storage", add the following:

Google::Apis.logger.level = Logger::DEBUG

Error codes

The following are common HTTP status codes you may see.

301: Moved Permanently

Issue: I'm setting up a static website, and accessing a directory path returns an empty object and a 301 HTTP response code.

Solution: If your browser downloads a zero byte object and you get a 301 HTTP response code when accessing a directory, such as http://www.example.com/dir/, your bucket most likely contains an empty object of that name. To check that this is the case and fix the issue:

  1. In the Google Cloud Console, go to the Cloud Storage Browser page.

    Go to Browser

  2. Click the Activate Cloud Shell button at the top of the Google Cloud Console.
  3. Run gsutil ls -R gs://www.example.com/dir/. If the output includes http://www.example.com/dir/, you have an empty object at that location.
  4. Remove the empty object with the command: gsutil rm gs://www.example.com/dir/

You can now access http://www.example.com/dir/ and have it return that directory's index.html file instead of the empty object.

400: Bad Request

Issue: While performing a resumable upload, I received this error and the message Failed to parse Content-Range header.

Solution: The value you used in your Content-Range header is invalid. For example, Content-Range: */* is invalid and instead should be specified as Content-Range: bytes */*. If you receive this error, your current resumable upload is no longer active, and you must start a new resumable upload.
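A quick local sanity check of the header shape can catch this before uploading. The regex below is only an illustrative sketch of the `bytes start-end/total` grammar described above, not an official validator:

```python
import re

# Accepts the Content-Range forms used by resumable uploads, e.g.
# "bytes 0-262143/1000000", "bytes */1000000", or "bytes */*".
CONTENT_RANGE = re.compile(r"^bytes (\d+-\d+|\*)/(\d+|\*)$")

def is_valid_content_range(value: str) -> bool:
    """Return True if the header value is well-formed."""
    return CONTENT_RANGE.match(value) is not None

print(is_valid_content_range("bytes */*"))  # True
print(is_valid_content_range("*/*"))        # False: missing the "bytes" unit
```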

401: Unauthorized

Issue: Requests to a public bucket directly, or via Cloud CDN, are failing with an HTTP 401: Unauthorized and an Authentication Required response.

Solution: Check that your client, or any intermediate proxy, is not adding an Authorization header to requests to Cloud Storage. Any request with an Authorization header, even if empty, is validated as if it were an authentication attempt.

403: Account Disabled

Issue: I tried to create a bucket but got a 403 Account Disabled error.

Solution: This error indicates that you have not yet turned on billing for the associated project. For steps to enable billing, see Enable billing for a project.

If billing is turned on and you continue to receive this error message, you can reach out to support with your project ID and a description of your problem.

403: Access Denied

Issue: I tried to list the objects in my bucket but got a 403 Access Denied error and/or a message similar to Anonymous caller does not have storage.objects.list access.

Solution: Check that your credentials are correct. For example, if you are using gsutil, check that the credentials stored in your .boto file are accurate. Also, confirm that gsutil is using the .boto file you expect by using the command gsutil version -l and checking the config path(s) entry.

Assuming you are using the right credentials, are your requests being routed through a proxy, using HTTP (instead of HTTPS)? If so, check whether your proxy is configured to remove the Authorization header from such requests. If it is, make sure you are using HTTPS instead of HTTP for your requests.

403: Forbidden

Issue: I am downloading my public content from storage.cloud.google.com, and I receive a 403: Forbidden error when I use the browser to navigate to the public object:

https://storage.cloud.google.com/BUCKET_NAME/OBJECT_NAME        

Solution: Using storage.cloud.google.com to download objects is known as authenticated browser downloads; it always uses cookie-based authentication, even when objects are made publicly accessible to allUsers. If you have configured Data Access logs in Cloud Audit Logs to track access to objects, one of the restrictions of that feature is that authenticated browser downloads cannot be used to access the affected objects; attempting to do so results in a 403 response.

To avoid this issue, do one of the following:

  • Use direct API calls, which support unauthenticated downloads, instead of using authenticated browser downloads.
  • Disable the Cloud Storage Data Access logs that are tracking access to the affected objects. Be aware that Data Access logs are set at or above the project level and can be enabled simultaneously at multiple levels.
  • Set Data Access log exemptions to exclude specific users from Data Access log tracking, which allows those users to perform authenticated browser downloads.
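For the first option, a direct API call uses the storage.googleapis.com host rather than storage.cloud.google.com. A minimal URL-building helper (hypothetical, shown only to illustrate the URL shape) might look like:

```python
from urllib.parse import quote

def direct_api_url(bucket: str, obj: str) -> str:
    """Build a direct download URL for a public object, avoiding the
    cookie-authenticated storage.cloud.google.com form."""
    # quote() percent-encodes special characters such as spaces, but
    # leaves "/" intact so "nested" object names keep their path shape.
    return f"https://storage.googleapis.com/{quote(bucket)}/{quote(obj)}"

print(direct_api_url("my-bucket", "images/cat photo.png"))
# https://storage.googleapis.com/my-bucket/images/cat%20photo.png
```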

409: Conflict

Issue: I tried to create a bucket but received the following error:

409 Conflict. Sorry, that name is not available. Please try a different one.

Solution: The bucket name you tried to use (e.g., gs://cats or gs://dogs) is already taken. Cloud Storage has a global namespace, so you may not name a bucket with the same name as an existing bucket. Choose a name that is not being used.
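Because the namespace is global, a common convention (hypothetical here, not a Cloud Storage requirement) is to derive bucket names from a readable stem plus a random suffix, making collisions unlikely:

```python
import uuid

def unique_bucket_name(stem: str) -> str:
    """Append a short random hex suffix so the name is unlikely to collide
    with an existing bucket anywhere in the global namespace."""
    return f"{stem}-{uuid.uuid4().hex[:8]}"

name = unique_bucket_name("cats")
print(name)  # e.g. "cats-3f9a1c2e"
```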

429: Too Many Requests

Issue: My requests are being rejected with a 429 Too Many Requests error.

Solution: You are hitting a limit to the number of requests Cloud Storage allows for a given resource. See the Cloud Storage quotas for a discussion of limits in Cloud Storage. If your workload consists of thousands of requests per second to a bucket, see Request rate and access distribution guidelines for a discussion of best practices, including ramping up your workload gradually and avoiding sequential filenames.
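Retrying after a 429 is typically done with truncated exponential backoff plus jitter. The delay schedule can be sketched as follows (the parameters are assumptions for illustration, not gsutil's exact algorithm):

```python
import random

def backoff_delays(max_retries: int = 5, base: float = 1.0, cap: float = 32.0):
    """Yield full-jitter truncated-exponential-backoff delays in seconds:
    each delay is uniformly random in [0, min(cap, base * 2**attempt)]."""
    for attempt in range(max_retries):
        yield random.uniform(0, min(cap, base * 2 ** attempt))

delays = list(backoff_delays())
print(delays)  # five delays, each bounded by the growing (capped) window
```

In a real retry loop you would sleep for each delay before reissuing the rejected request, giving Cloud Storage time to redistribute load.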

Diagnosing Google Cloud Console errors

Issue: When using the Google Cloud Console to perform an operation, I get a generic error message. For example, I see an error message when trying to delete a bucket, but I don't see details for why the operation failed.

Solution: Use the Google Cloud Console's notifications to see detailed information about the failed operation:

  1. Click the Notifications button in the Google Cloud Console header.

    Notifications

    A dropdown displays the most recent operations performed by the Google Cloud Console.

  2. Click the item you want to find out more about.

    A page opens up and displays detailed information about the operation.

  3. Click on each row to expand the detailed error information.

    Below is an example of error information for a failed bucket deletion operation, which explains that a bucket retention policy prevented the deletion of the bucket.

    Bucket deletion error details

gsutil errors

The following are common gsutil errors you may encounter.

gsutil stat

Issue: I tried to use the gsutil stat command to display object status for a subdirectory and got an error.

Solution: Cloud Storage uses a flat namespace to store objects in buckets. While you can use slashes ("/") in object names to make it appear as if objects are in a hierarchical structure, the gsutil stat command treats a trailing slash as part of the object name.

For example, if you run the command gsutil -q stat gs://my-bucket/my-object/, gsutil looks up information about the object my-object/ (with a trailing slash), as opposed to operating on objects nested under my-bucket/my-object/. Unless you actually have an object with that name, the operation fails.

For subdirectory listing, use the gsutil ls command instead.
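The flat namespace can be modeled as a plain mapping from full object names to data. This sketch (an analogy, not the Cloud Storage API) shows why an exact-name stat misses "nested" objects while a prefix-based ls finds them:

```python
# A bucket is a flat mapping of full object names to data; "directories"
# exist only as a naming convention based on "/" in the keys.
bucket = {
    "my-object/a.txt": b"...",
    "my-object/b.txt": b"...",
}

def stat(bucket: dict, name: str):
    """gsutil stat-like lookup: exact object-name match only."""
    return bucket.get(name)

def ls(bucket: dict, prefix: str):
    """gsutil ls-like listing: every object whose name starts with the prefix."""
    return sorted(k for k in bucket if k.startswith(prefix))

assert stat(bucket, "my-object/") is None  # no object literally named "my-object/"
assert ls(bucket, "my-object/") == ["my-object/a.txt", "my-object/b.txt"]
```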

gcloud auth

Issue: I tried to authenticate gsutil using the gcloud auth command, but I still cannot access my buckets or objects.

Solution: Your system may have both the stand-alone and Google Cloud CLI versions of gsutil installed on it. Run the command gsutil version -l and check the value for using cloud sdk. If False, your system is using the stand-alone version of gsutil when you run commands. You can either remove this version of gsutil from your system, or you can authenticate using the gsutil config command.

Static website errors

The following are common issues that you may encounter when setting up a bucket to host a static website.

HTTPS serving

Issue: I want to serve my content over HTTPS without using a load balancer.

Solution: You can serve static content through HTTPS using direct URIs such as https://storage.googleapis.com/my-bucket/my-object. For other options to serve your content through a custom domain over SSL, you can:

  • Use a third-party Content Delivery Network with Cloud Storage.
  • Serve your static website content from Firebase Hosting instead of Cloud Storage.

Domain verification

Issue: I can't verify my domain.

Solution: Normally, the verification process in Search Console directs you to upload a file to your domain, but you may not have a way to do this without first having an associated bucket, which you can only create after you have performed domain verification.

In this case, verify ownership using the Domain name provider verification method. See Ownership verification for steps to accomplish this. This verification can be done before the bucket is created.

Inaccessible page

Issue: I get an Access denied error message for a web page served by my website.

Solution: Check that the object is shared publicly. If it is not, see Making data public for instructions on how to do this.

If you previously uploaded and shared an object, but then upload a new version of it, then you must reshare the object publicly. This is because the public permission is replaced with the new upload.

Permission update failed

Issue: I become an error when I attempt to make my data public.

Solution: Make sure that you have the setIamPolicy permission for your object or bucket. This permission is granted, for example, in the Storage Admin role. If you have the setIamPolicy permission and you still get an error, your bucket might be subject to public access prevention, which does not allow access to allUsers or allAuthenticatedUsers. Public access prevention might be set on the bucket directly, or it might be enforced through an organization policy that is set at a higher level.

Content download

Issue: I am prompted to download my page's content, instead of being able to view it in my browser.

Solution: If you specify a MainPageSuffix as an object that does not have a web content type, then instead of serving the page, site visitors are prompted to download the content. To resolve this issue, update the content-type metadata entry to a suitable value, such as text/html. See Editing object metadata for instructions on how to do this.
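One way to pick a suitable value before uploading is to guess from the filename with Python's standard `mimetypes` module. This is a local helper sketch, not how Cloud Storage itself assigns content types:

```python
import mimetypes

def content_type_for(name: str) -> str:
    """Guess a Content-Type for an object from its filename; fall back to a
    generic binary type, which browsers treat as a download prompt."""
    guessed, _encoding = mimetypes.guess_type(name)
    return guessed or "application/octet-stream"

print(content_type_for("index.html"))  # text/html
print(content_type_for("README"))      # application/octet-stream (no extension)
```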

Latency

The following are common latency issues you might encounter. In addition, the Google Cloud Status Dashboard provides information about regional or global incidents affecting Google Cloud services such as Cloud Storage.

Upload or download latency

Issue: I'm seeing increased latency when uploading or downloading.

Solution: Use the gsutil perfdiag command to run performance diagnostics from the affected environment. Consider the following common causes of upload and download latency:

  • CPU or memory constraints: The affected environment's operating system should have tooling to measure local resource consumption such as CPU usage and memory usage.

  • Disk IO constraints: As part of the gsutil perfdiag command, use the rthru_file and wthru_file tests to gauge the performance impact caused by local disk IO.

  • Geographical distance: Performance can be impacted by the physical separation of your Cloud Storage bucket and affected environment, particularly in cross-continental cases. Testing with a bucket located in the same region as your affected environment can identify the extent to which geographic separation is contributing to your latency.

    • If applicable, the affected environment's DNS resolver should use the EDNS(0) protocol so that requests from the environment are routed through an appropriate Google Front End.

gsutil or client library latency

Issue: I'm seeing increased latency when accessing Cloud Storage with gsutil or one of the client libraries.

Solution: Both gsutil and the client libraries automatically retry requests when it's useful to do so, and this behavior can effectively increase latency as seen from the end user. Use the Cloud Monitoring metric storage.googleapis.com/api/request_count to see if Cloud Storage is consistently serving a retryable response code, such as 429 or 5xx.
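A small predicate makes the retryable set concrete, limited here to the codes this page mentions (429 and any 5xx); individual tools may treat additional codes as retryable:

```python
def is_retryable(status: int) -> bool:
    """Response codes that trigger automatic retries per this page:
    429 (Too Many Requests) and any 5xx server error."""
    return status == 429 or 500 <= status <= 599

print(is_retryable(429))  # True
print(is_retryable(503))  # True
print(is_retryable(404))  # False: client errors other than 429 are not retried
```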

Proxy servers

Issue: I'm connecting through a proxy server. What do I need to do?

Solution: To access Cloud Storage through a proxy server, you must allow access to these domains:

  • accounts.google.com for creating OAuth2 authentication tokens via gsutil config
  • oauth2.googleapis.com for performing OAuth2 token exchanges
  • *.googleapis.com for storage requests
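A hostname check against this allowlist, including the wildcard entry, can be sketched with Python's `fnmatch` module. This is illustrative only; real proxies apply their own matching rules:

```python
from fnmatch import fnmatch

# The domains Cloud Storage access requires, per the list above.
ALLOWED = ["accounts.google.com", "oauth2.googleapis.com", "*.googleapis.com"]

def host_allowed(host: str) -> bool:
    """Return True if the hostname matches any allowlist entry,
    treating "*" as a glob-style wildcard."""
    return any(fnmatch(host, pattern) for pattern in ALLOWED)

print(host_allowed("storage.googleapis.com"))  # True, via the wildcard entry
print(host_allowed("example.com"))             # False
```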

If your proxy server or security policy doesn't support whitelisting by domain and instead requires whitelisting by IP network block, we strongly recommend that you configure your proxy server for all Google IP address ranges. You can find the address ranges by querying WHOIS information at ARIN. As a best practice, you should periodically review your proxy settings to ensure they match Google's IP addresses.

We do not recommend configuring your proxy with individual IP addresses you obtain from one-time lookups of oauth2.googleapis.com and storage.googleapis.com. Because Google services are exposed via DNS names that map to a large number of IP addresses that can change over time, configuring your proxy based on a one-time lookup may lead to failures to connect to Cloud Storage.

If your requests are being routed through a proxy server, you may need to check with your network administrator to ensure that the Authorization header containing your credentials is not stripped out by the proxy. Without the Authorization header, your requests are rejected and you receive a MissingSecurityHeader error.

What's next

  • Learn about your support options.
  • Find answers to additional questions in the Cloud Storage FAQ.
  • Explore how Error Reporting can help you identify and understand your Cloud Storage errors.


Source: https://cloud.google.com/storage/docs/troubleshooting
