Authenticate Gcloud

Authentication is the most frustrating part of Google Cloud at first. It seems really complicated, with various systems and terminology that we do not understand at first.

The Google Storage authentication quest

Let's list a few facts we have understood.

Enabling the APIs and discovering the API Manager

The tutorial suggests that you:

Enable both the Google App Engine Admin API and Google Cloud Storage API in your Cloud Platform project:

So:

  • APIs are not enabled by default

  • there are 2 concepts:

    • Google Cloud Storage: for storing static files

    • Google App Engine: to run apps, like a virtual machine

When clicking on the Enable all APIs button, I noticed a panel about APIs.

API panel

That goes to a page dedicated to all APIs. Not only the 2 mentioned, but more APIs, grouped in those 2 families: storage and compute.

Google cloud APIS

This page is also accessible via a side menu.

Gcloud side menu

Once on the API Manager page, you have a Credentials menu item.

Credentials gcloud

The last item of this menu offers to help you choose. Good idea!

You can access this wizard directly at this link.

What type of authentication?

There are 3 types:

  • API keys: a simple API key. Some were already generated by the browser or the server (per app, by Firebase) in my case.

  • OAuth 2.0 client IDs: a way for your app users to create an account on your site with their Google login.

  • Service account keys: a variant of API keys?

What is Application Default Credentials?

For the case of a server that wants to have access to the app, I ended up with the concept of Application Default Credentials, documented on this page.

I am interested in the Node.js version.

It uses a library which is global to all Google APIs:

Google's officially supported Node.js client library for accessing Google APIs. Support for authorization and authentication with OAuth 2.0, API Keys and JWT (Service Tokens) is included

So let's install it :

npm install googleapis --save

Another place to look at is this page, which is Google's temple of identification: all the Google ways to authenticate.

Among which this mysterious Application Default Credentials.

After reading a bit, it seems there is another, more specific Node.js library, only for Google authentication.

Let's use it, as its doc should be clearer (since it is more specific):

This is Google's officially supported node.js client library for using OAuth 2.0 authorization and authentication with Google APIs.

So again :

npm install google-auth-library --save

google-cloud-node: the library to use

I then read a discussion on how to use Google Storage instead of Firebase (which does not have a Node.js SDK for this yet).

So let's sum up: the response of Aurelien is quite interesting.

Yes, I found the source!

Google cloud

google-cloud is a client library for accessing Google Cloud Platform services that significantly reduces the boilerplate code you have to write.

Yes, I do not like writing boilerplate code! After hours of searching, I am quite happy to see this promise of simplicity.

Google cloud

And google-cloud is the recommended way; good to know...

The Google APIs Node.js Client is a client library for using the broad set of Google APIs. google-cloud is built specifically for the Google Cloud Platform and is the recommended way to integrate Google Cloud APIs into your Node.js applications. If your application requires both Google Cloud Platform and other Google APIs, the 2 libraries may be used by your application.

google-cloud-node does not take my JSON key

Here is a much more detailed description of how to authenticate with the google-cloud SDK (not to be confused with gcloud, the command-line utility).

Then I set my authentication in my node.js file.

var gcloud = require('gcloud')({
  // the project ID is the unique ID of the project,
  // derived from the project name;
  // it looks like myproject-1d298
  projectId: 'myProject-Name',
  keyFilename: './bamarchi-42fde040ae9d.json'
});

The keyFilename is the path to the service account key you have created in the Google Cloud console at

Google cloud

I relaunched my API using those settings, but was stopped by another error.

firebase there was an error uploading to gcloud Error: Could not load the default credentials. Browse to

I then investigated the code that triggered this error, and found that it checks whether the code is executed from Google Compute Engine (GCE) or not.

Since that is not my case (I am running locally), it triggered this error regardless of whether it found the famous service key or not.

By reading again the page linked in the error:

Application Default Credentials need to be set in an env variable!

I figured that to set the Application Default Credentials, we need to modify an environment variable. What is an environment variable? They are UNIX variables that you can set on the command line by typing export.

export nameOftheVariable="valueOfTheVariable"

To get a list of the variables already defined on your machine, just type:

env
You will be surprised to find:

  • PATH: the list of directories where executables are looked up

  • PWD: the current directory

  • ...

and lots of basic settings that matter everywhere :)
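These same variables are visible from Node.js through process.env, which is just a plain object. A minimal sketch (the variable name below is a made-up demo, not anything Google-specific):

```javascript
// `process.env` maps environment variable names to string values.
console.log(typeof process.env.PATH); // "string" on any normal system

// Setting a property only affects the current process and its children:
process.env.MY_DEMO_VARIABLE = 'valueOfTheVariable';
console.log(process.env.MY_DEMO_VARIABLE); // prints "valueOfTheVariable"
```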

So let's set the Application Default Credentials as mentioned in the doc:

export GOOGLE_APPLICATION_CREDENTIALS="/path/to/the/myproject-key.json"
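A quick way to check, from Node.js itself, that the variable actually reached the process (this is just a sanity-check sketch of mine, not part of the Google libraries):

```javascript
// Read the variable the Google libraries look for; it is undefined if the
// `export` was not done in the shell that launched this process.
var keyPath = process.env.GOOGLE_APPLICATION_CREDENTIALS;

if (!keyPath) {
  console.log('GOOGLE_APPLICATION_CREDENTIALS is not set');
} else {
  console.log('credentials file: ' + keyPath);
}
```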

Credentials fixed / What's next ?

The last export command seems to have solved the missing-credentials error. I also understood that only the environment variable is necessary; at least, it alone is enough to authenticate. But my Google Cloud quest is not finished!

Here is the next error message. What is not found?

  • The API route used by the gcloud bucket.upload function?

  • Or the file I am giving it to upload?

  • Or the bucket?

By the way, what file format is it expecting? Let's look at the code.

firebase there was an error uploading to gcloud { ApiError: Not Found
at Object.parseHttpRespBody (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/@google-cloud/common/src/util.js:191:30)
at Object.handleResp (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/@google-cloud/common/src/util.js:131:18)
at /Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/@google-cloud/common/src/util.js:465:12
at Request.onResponse [as _callback] (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/retry-request/index.js:120:7)
at Request.self.callback (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/request/request.js:186:22)
at emitTwo (events.js:106:13)
at Request.emit (events.js:191:7)
at Request.<anonymous> (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/request/request.js:1081:10)
at emitOne (events.js:96:13)
at Request.emit (events.js:188:7)
code: 404,
errors: [ { domain: 'global', reason: 'notFound', message: 'Not Found' } ],
response: undefined,
message: 'Not Found' } +2s

Learn more about uploads in google storage

Let's read, in the file itself, where that Bucket.upload function is. It's here.

The comments are quite informative and give a link to online documentation. This page is the documentation of the Gcloud storage upload module.

There are 3 types of upload:

  • simple: only a file

  • multipart: a file + its metadata

  • resumable: an upload that can resume if a network disconnection occurs (for large files)

The official doc was quite abstract and confusing. But reading the @example blocks in the code was more instructive:

  • Simple upload, with a link to a local file

//-
// The easiest way to upload a file.
//-
bucket.upload('/local/path/image.png', function(err, file, apiResponse) {
  // Your bucket now contains:
  // - "image.png" (with the contents of `/local/path/image.png`)
  // `file` is an instance of a File object that refers to your new file.
});
  • With some options

The resumable option is interesting, as it allows the upload to resume if the network connection is lost.

Also, the metadata object seems to be a free-form object where you choose which metadata to add; great for simple image annotation...

//-
// It's not always that easy. You will likely want to specify the filename
// used when your new file lands in your bucket.
//
// You may also want to set metadata or customize other options.
//-
var options = {
  destination: 'new-image.png',
  resumable: true,
  validation: 'crc32c',
  metadata: {
    event: 'Fall trip to the zoo'
  }
};
bucket.upload('local-image.png', options, function(err, file) {
  // Your bucket now contains:
  // - "new-image.png" (with the contents of `local-image.png`)
  // `file` is an instance of a File object that refers to your new file.
});
  • Gzip on the fly!

//-
// You can also have a file gzip'd on the fly.
//-
bucket.upload('index.html', { gzip: true }, function(err, file) {
  // Your bucket now contains:
  // - "index.html" (automatically compressed with gzip)
  // Downloading the file with `` will automatically decode the
  // file.
});
  • Overwrite an existing file

//-
// You may also re-use a File object, {module:storage/file}, that references
// the file you wish to create or overwrite.
//-
var options = {
  destination: bucket.file('existing-file.png'),
  resumable: false
};
bucket.upload('local-img.png', options, function(err, newFile) {
  // Your bucket now contains:
  // - "existing-file.png" (with the contents of `local-img.png`)
  // Note:
  // The `newFile` parameter is equal to `file`.
});
  • Encrypt the files

//-
// To use
// <a href="">
// Customer-supplied Encryption Keys</a>, provide the `encryptionKey` option.
//-
var crypto = require('crypto');
var encryptionKey = crypto.randomBytes(32);
bucket.upload('img.png', {
  encryptionKey: encryptionKey
}, function(err, newFile) {
  // `img.png` was uploaded with your custom encryption key.
  // `newFile` is already configured to use the encryption key when making
  // operations on the remote object.
  // However, to use your encryption key later, you must create a `File`
  // instance with the `key` supplied:
  var file = bucket.file('img.png', {
    encryptionKey: encryptionKey
  });
  // Or with `file#setEncryptionKey`:
  var file = bucket.file('img.png');
  file.setEncryptionKey(encryptionKey);
});
//-
// If the callback is omitted, we'll return a Promise.
//-
bucket.upload('local-image.png').then(function(data) {
  var file = data[0];
});

File not found? Use of the fs module

But back to our concern: why this 404 Not Found error? I suspected it was the file that was not found.

So I had the idea to use the fs module to check first if the file exists, to prevent a stupid file-not-found error.

I realized 2 things:

  • relative file paths are resolved against the directory the node process was started from, not the file where the code is

  • we can use __dirname to get the absolute directory of the current file, together with path.join

    As this Stack Overflow response suggests.

So we wrap our bucket.upload call in an fs.exists check, to be sure the file exists before trying to upload it to Google Storage.

var filePath = path.join(__dirname, '..', '..', '..', 'files', 'upload', filemeta.newName);
fs.exists(filePath, function(exists) {
  if (exists) {
    debug('going to upload the file ' + filePath);
    bucket.upload(filePath, function(err, file) {
      // ...
    });
  }
});

The bucket ID!

By looking at the Cloud Storage console, I realized that some requests were logged: 6 requests, so the API was hit. A first step.

Google cloud

But my there was an error uploading to gcloud { ApiError: Not Found remains.

And looking in more detail, these are 6 client errors (4XX errors).

Google cloud error

By searching for this error message on Google, I found the error messages page and their descriptions (for the Google Cloud Storage API specifically). Not very informative, but the key phrase is:

The following is an example of an error response you receive if you try to retrieve an object that does not exist.

So I searched for an object. I suspect an object in Cloud Storage, so either a file object or a bucket object... But as I was uploading a new file and not updating one, I suppose it could be the bucket?

I went to the Google Storage configuration page. There is one page to manage all APIs, and one page per service/module. When you do not use a module yet, you get a card linking to its documentation...

And there I found that the ID of my bucket was different from the ID of the project. I had put the projectId instead.

That was it!

First upload

The quest is over / First upload to Google Storage

After 1-2 days on it, I:

  • understood some facts about Node.js: the fs and path modules

  • made my way through the Google Cloud online console

  • chose a Node.js library to communicate with it

  • fixed the authentication problem, temporarily

Because the remaining problem is:

  • how to authenticate in the project itself, and not with the not-very-handy environment variable

But now I am no longer stuck and can get back to work. The quest is over!