Authenticate Gcloud




Authentication is the most frustrating part of Google Cloud at first. It seems really complicated, with various systems and a terminology that we do not understand at first.

The Google Storage authentication quest

Let's list a few facts we have understood.

Enabling the APIs and discovering the API Manager

The tutorial suggests that you:

Enable both the Google App Engine Admin API and Google Cloud Storage API in your Cloud Platform project:

So:

  • APIs are not enabled by default

  • there are 2 concepts:

    • Google Cloud Storage: for storing static files

    • Google App Engine: for running apps, like a virtual machine

I noticed a panel about APIs.

It goes to a page dedicated to all APIs: not only the 2 mentioned, but more APIs that belong to those 2 groups of APIs, storage and compute.

This page is also accessible via a side menu.

Once on the API Manager page, you have a Credentials menu item.

The last item of this menu offers to help you choose. Good idea!

What type of authentication?

There are 3 types:

  • API keys: a simple API key. Some were already generated by the browser or the server (perhaps by Firebase) in my case.

  • OAuth 2.0 client IDs: a way for your app users to create an account on your site with their Google login.

  • Service account keys: a variant of API keys?

What are Application Default Credentials?

Google's officially supported Node.js client library for accessing Google APIs. Support for authorization and authentication with OAuth 2.0, API Keys and JWT (Service Tokens) is included

So let's install it:

npm install googleapis --save
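For the record, a rough sketch of what it exposes (my own example, assuming the callback-style googleapis releases of that period; the storage client shown is only indicative):

var google = require('googleapis');   // entry point to the whole set of Google APIs
var storage = google.storage('v1');   // low-level Cloud Storage JSON API client
// every call on `storage` then expects an authenticated client in its `auth` option

So googleapis is the broad, low-level surface; the higher-level google-cloud library comes later in this quest.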

Another place to look at is this page.

It is the identity temple of Google: all the Google methods to authenticate.

Among which is this mysterious Application Default Credentials.

After reading a bit, it seems there is another, more specific Node.js library dedicated only to Google authentication.

Let's use it, as its doc should be clearer (since it is more specific).

This is Google's officially supported node.js client library for using OAuth 2.0 authorization and authentication with Google APIs.

So again:

npm install google-auth-library --save
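A small sketch of fetching the Application Default Credentials with it (my own example, assuming the callback-style API of that period; the scope is just an illustration):

var GoogleAuth = require('google-auth-library');

var authFactory = new GoogleAuth();
authFactory.getApplicationDefault(function(err, authClient) {
  if (err) {
    console.error('could not load the default credentials', err);
    return;
  }
  // some credential types need explicit scopes before they can be used
  if (authClient.createScopedRequired && authClient.createScopedRequired()) {
    authClient = authClient.createScoped(['https://www.googleapis.com/auth/devstorage.read_write']);
  }
  // authClient can now be passed as the `auth` option of googleapis calls
});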

The google-cloud-node library: the library to use

So let's sum up:

  • https://github.com/google/google-api-nodejs-client : manage all Google APIs from a Node.js app, independently of gcloud

  • https://github.com/google/google-auth-library-nodejs : manage only the authentication, in the context of gcloud or not

  • https://github.com/GoogleCloudPlatform/google-cloud-node : the gcloud-specific library

While reading a Stack Overflow discussion on how to use Google Storage instead of Firebase (which does not have a Node.js SDK yet), the response from Aurelien is quite interesting.

Yes I found the source!

Google cloud

google-cloud is a client library for accessing Google Cloud Platform services that significantly reduces the boilerplate code you have to write.

Yes, I do not like writing boilerplate code! After hours of searching, I am quite happy to see this promise of simplicity.

And google-cloud is the recommended way, good to know...

The Google APIs Node.js Client is a client library for using the broad set of Google APIs. google-cloud is built specifically for the Google Cloud Platform and is the recommended way to integrate Google Cloud APIs into your Node.js applications. If your application requires both Google Cloud Platform and other Google APIs, the 2 libraries may be used by your application.

google-cloud-node does not take my JSON key

Then I set up my authentication in my Node.js file.

var gcloud = require('gcloud')({
  // the project Id is the unique id of the project
  // derived from the project name
  // it looks like myproject-1d298
  projectId: 'myProject-Name',
  keyFileName: './bamarchi-42fde040ae9d.json',
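  // note (my addition): the option documented by gcloud / google-cloud is `keyFilename`
  // (lowercase "n"); an unrecognized key is silently ignored, which may explain the
  // fallback to default credentials described below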
});

And I relaunched my API using those settings, but was stopped by another error.

firebase there was an error uploading to gcloud Error: Could not load the default credentials. Browse to https://developers.google.com/accounts/docs/application-default-credent

I then investigated the code that triggered this error, and found that it checks whether the code is being executed from Google Compute Engine (GCE) or not.

So if that is not the case (I am running locally), it triggers this error regardless of whether it found the famous service key or not.

Application Default Credentials need to be set in an env variable!

I figured out that to set the Application Default Credentials, we need to set an environment variable. What is an environment variable? They are UNIX variables that you can set on the command line by typing export.

export nameOftheVariable="valueOfTheVariable"

To get a list of the variables already defined on your machine, just type:

export

You will be surprised to find:

  • PATH: the list of directories where executables are located

  • PWD: the current directory

  • ...

and a lot of basic settings that are very important everywhere :)

So let's set the Application Default Credentials as mentioned in the doc:

export GOOGLE_APPLICATION_CREDENTIALS="/path/to/the/myproject-key.json"
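A minimal sketch, assuming Node.js, to check from the app that the variable is actually visible to the process (the file name and messages are my own):

// check-credentials.js: quick sanity check before talking to google storage
var fs = require('fs');

var keyPath = process.env.GOOGLE_APPLICATION_CREDENTIALS;

if (!keyPath) {
  console.error('GOOGLE_APPLICATION_CREDENTIALS is not set: the client will fall back to default credentials and fail locally');
} else if (!fs.existsSync(keyPath)) {
  console.error('GOOGLE_APPLICATION_CREDENTIALS points to a missing file: ' + keyPath);
} else {
  console.log('service account key found: ' + keyPath);
}

Remember the variable only exists in the shell where you exported it, so launch node from that same shell (or put the export line in your shell profile).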

Credentials fixed / What's next?

The last export command seems to have solved the missing-credentials error. I also understood that only the environment variable is necessary, or at least it is the only thing that successfully authenticates. But my Google Cloud quest is not finished!

Here is the next error message. What is not found?

  • The API route used by the gcloud bucket.upload function?

  • Or the file I am giving it to upload?

  • Or the bucket?

By the way, what file format is it expecting? Let's look at the code.

firebase there was an error uploading to gcloud  { ApiError: Not Found
  at Object.parseHttpRespBody (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/@google-cloud/common/src/util.js:191:30)
  at Object.handleResp (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/@google-cloud/common/src/util.js:131:18)
  at /Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/@google-cloud/common/src/util.js:465:12
  at Request.onResponse [as _callback] (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/retry-request/index.js:120:7)
  at Request.self.callback (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/request/request.js:186:22)
  at emitTwo (events.js:106:13)
  at Request.emit (events.js:191:7)
  at Request.<anonymous> (/Users/slucas/SEB/www/bam/bamv2/node_modules/google-cloud/node_modules/request/request.js:1081:10)
  at emitOne (events.js:96:13)
  at Request.emit (events.js:188:7)
code: 404,
errors: [ { domain: 'global', reason: 'notFound', message: 'Not Found' } ],
response: undefined,
message: 'Not Found' } +2s

Learn more about uploads in google storage

There are 3 types of upload:

  • simple: only a file

  • multi-part: a file + its metadata

  • resumable: an upload that can resume if a network disconnection occurs (for large files)

The official doc was quite abstract and confusing. But reading the @example blocks in the code was more instructive.

  • Simple upload with a path to a local file

@example
* //-
* // The easiest way to upload a file.
* //-
* bucket.upload('/local/path/image.png', function(err, file, apiResponse) {
*   // Your bucket now contains:
*   // - "image.png" (with the contents of `/local/path/image.png')
*
*   // `file` is an instance of a File object that refers to your new file.
* });
*
  • With some options

The resumable option is interesting as it allows the upload to resume if the network connection is lost.

Also, the metadata object seems to be a free-form object where you choose which metadata to add; great for simple image annotation...

* //-
* // It's not always that easy. You will likely want to specify the filename
* // used when your new file lands in your bucket.
* //
* // You may also want to set metadata or customize other options.
* //-
* var options = {
*   destination: 'new-image.png',
*   resumable: true,
*   validation: 'crc32c',
*   metadata: {
*     event: 'Fall trip to the zoo'
*   }
* };
*
* bucket.upload('local-image.png', options, function(err, file) {
*   // Your bucket now contains:
*   // - "new-image.png" (with the contents of `local-image.png')
*
*   // `file` is an instance of a File object that refers to your new file.
* });
*
  • Gzip on the fly!

* //-
* // You can also have a file gzip'd on the fly.
* //-
* bucket.upload('index.html', { gzip: true }, function(err, file) {
*   // Your bucket now contains:
*   // - "index.html" (automatically compressed with gzip)
*
*   // Downloading the file with `file.download` will automatically decode the
*   // file.
* });
*
  • Overwrite an existing file

* //-
* // You may also re-use a File object, {module:storage/file}, that references
* // the file you wish to create or overwrite.
* //-
* var options = {
*   destination: bucket.file('existing-file.png'),
*   resumable: false
* };
*
* bucket.upload('local-img.png', options, function(err, newFile) {
*   // Your bucket now contains:
*   // - "existing-file.png" (with the contents of `local-img.png')
*
*   // Note:
*   // The `newFile` parameter is equal to `file`.
* });
*
  • Encrypt the files

* //-
* // To use
* // <a href="https://cloud.google.com/storage/docs/encryption#customer-supplied">
* // Customer-supplied Encryption Keys</a>, provide the `encryptionKey` option.
* //-
* var crypto = require('crypto');
* var encryptionKey = crypto.randomBytes(32);
*
* bucket.upload('img.png', {
*   encryptionKey: encryptionKey
* }, function(err, newFile) {
*   // `img.png` was uploaded with your custom encryption key.
*
*   // `newFile` is already configured to use the encryption key when making
*   // operations on the remote object.
*
*   // However, to use your encryption key later, you must create a `File`
*   // instance with the `key` supplied:
*   var file = bucket.file('img.png', {
*     encryptionKey: encryptionKey
*   });
*
*   // Or with `file#setEncryptionKey`:
*   var file = bucket.file('img.png');
*   file.setEncryptionKey(encryptionKey);
* });
*
* //-
* // If the callback is omitted, we'll return a Promise.
* //-
* bucket.upload('local-image.png').then(function(data) {
*   var file = data[0];
* });
*/

Did the file get found? Use of the fs module

But back to our concerns: why this 404 Not Found error? I suspected that it was the file that was not found.

So I had the idea to use the fs module to first check if the file exists, to prevent a stupid file-not-found error.

I realized 2 things:

  • relative paths given to fs are resolved against the current working directory of the node process, not against the file that makes the call

  • We can use __dirname to get the absolute directory of the current file, and path.join to build the path

So we wrap our bucket.upload call in fs.exists to be sure the file exists before trying to upload it to Google Storage.

var filePath = path.join(__dirname, '..', '..', '..', 'files', 'upload', filemeta.newName);
fs.exists(filePath, function(exists) {
  if (exists) {
    debug('going to upload the file ' + filePath);
    bucket.upload(filePath, function(err, file) {
    ...
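As a side note, fs.exists is deprecated in recent Node.js versions; here is a sketch of the same guard with fs.access, keeping the same (assumed) filemeta, debug and bucket variables from the snippet above:

var fs = require('fs');
var path = require('path');

var filePath = path.join(__dirname, '..', '..', '..', 'files', 'upload', filemeta.newName);
// fs.access with no mode simply checks that the file exists and is visible to the process
fs.access(filePath, function(err) {
  if (err) {
    debug('file not found, skipping upload: ' + filePath);
    return;
  }
  bucket.upload(filePath, function(uploadErr, file) {
    // `file` refers to the new object in the bucket
  });
});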

The bucket Id!

By looking at the Cloud Storage console, I realized that some requests were logged: 6 requests, so the API was hit. A first step.

But my there was an error uploading to gcloud { ApiError: Not Found error remained.

Looking in more detail, they are 6 client errors (4XX errors).

The following is an example of an error response you receive if you try to retrieve an object that does not exist.

So I searched for an object. I suspect an object in Cloud Storage is either a file object or a bucket object... But as I was uploading a new file and not updating one, I supposed it could be the bucket?

I went to the Google Storage configuration page. There is one page to manage all APIs, and one page per service / module. When you do not use a module yet, you get a card linking to the documentation...

And I found that the id of my bucket is different from the id of the project. I had put the projectId instead.

That was it!
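A minimal sketch of the fixed setup, assuming a bucket named my-app-uploads (the name shown in the Storage browser; the names here are placeholders taken from or modelled on the snippets above):

var gcloud = require('gcloud')({
  projectId: 'myProject-Name',
  keyFilename: './bamarchi-42fde040ae9d.json',
});

var gcs = gcloud.storage();
// the argument is the bucket name from the Storage browser, NOT the projectId
var bucket = gcs.bucket('my-app-uploads');

bucket.upload(filePath, function(err, file) {
  // `file` is the freshly uploaded object in the bucket
});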

The quest is over / First upload to google storage

After 1-2 days on it, I have:

  • understood some facts about node.js: the fs and path modules

  • made my way through the Google Cloud online console

  • chosen a library in node.js to communicate with it

  • temporarily fixed the authentication problem

The remaining problem is:

  • how to authenticate in the project itself, and not with the not-very-handy environment variable

But now I am no longer stuck and can get back to work. The quest is over!

Links

  • Accessing the API guide: https://cloud.google.com/appengine/docs/admin-api/accessing-the-api

  • You can access the credentials wizard directly from the credentials page: https://console.cloud.google.com/apis/credentials

  • For the case of a server that wants to have access to the app, I ended up with the concept of Application Default Credentials, documented in this page (also the page linked in the error above): https://developers.google.com/identity/protocols/application-default-credentials

  • Google identity, the home of all the Google authentication methods: https://developers.google.com/identity/

  • The Node.js version uses a library which is global to all Google APIs: https://github.com/google/google-api-nodejs-client

  • A much more detailed description of how to authenticate with the google-cloud SDK (not to be mistaken with gcloud, the command-line utility): https://googlecloudplatform.github.io/google-cloud-node/#/docs/google-cloud/0.45.0/guides/authentication

  • The key file path option points to the service account key you have created in the Google Cloud console at https://console.cloud.google.com/apis/credentials

  • The Bucket.upload function can be read in the library source itself; its comments are quite informative and link to the online documentation: https://googlecloudplatform.github.io/google-cloud-node/#/

  • By searching the error message in Google, I found the error message pages and their descriptions (for the Cloud Storage JSON API); not very informative: https://cloud.google.com/storage/docs/json_api/v1/status-codes

(Screenshots: API panel, Google cloud APIs, Gcloud side menu, Credentials gcloud, Gcloud storage upload module, Google cloud error, First upload)