Monday, July 15, 2024

Using a private repo on Artifact Registry in Google Cloud Functions

Late last year, we announced that Artifact Registry was going GA, allowing GCP customers to manage their packages on the same platform where they deploy them. In this blog post, we want to show you how to do exactly that with a private dependency.

Private dependencies allow your packages to be shared with only a select group of viewers. If your codebase is already private, a private dependency can help modularize functionality using the same methodologies you use in your open source projects. Furthermore, you can experimentally develop your private dependency without breaking your overall codebase by pinning the version of the dependency on a working release. It can also provide necessary and durable abstraction if multiple projects depend on the same functionality. It does so by allowing multiple teams access to up-to-date and tested code rather than relying on copying and pasting fragmented code snippets.

With Artifact Registry, you can now wire together your serverless processes with your private dependencies without ever leaving Google Cloud Platform. This blogpost will discuss one example of how you can host your private dependency and later deploy to a serverless host like Google Cloud Functions using Cloud Build to automate the deployment.

Before getting started, take a look at the sample code here to copy and follow along.

Creating a package

Let’s walk through deploying a Google Cloud Function with a simple private dependency written in Node. Our example dependency will return its input as unicode escape sequences. Its index.js file will look like this:
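The original snippet isn’t reproduced here, so below is a minimal sketch of what that entry point could look like; the function name toUnicode is illustrative, not part of the original post:

```javascript
// index.js — minimal sketch of the dependency's entry point.
// toUnicode (an illustrative name) maps every character of the
// input string to its \uXXXX escape sequence.
const toUnicode = (input) =>
  input
    .split('')
    .map((c) => '\\u' + c.charCodeAt(0).toString(16).padStart(4, '0'))
    .join('');

exports.toUnicode = toUnicode;
```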

First, you’ll want to prepare your package to upload to Artifact Registry. For Node, this package should have a package.json file, which should dictate the entry point and information about the package. You can create a simple one like the one below by running the command npm init -y. 

For the name of the package you should specify your scope. A scope allows you to group packages, which is helpful if you want to publish a private package; alternatively, publishing without a scope would make the repository public by default. In this blogpost, we’re going to use the scope @example, but you should name it after your private dependency’s group (i.e., your company, team or project).

Another important detail in this file is that the devDependencies property contains the dependency for authenticating to the google artifact registry. To authenticate, you will use the command in the scripts section later in this tutorial.
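Putting those details together, the package.json might look like the following sketch. The package name, description, and version numbers are illustrative; google-artifactregistry-auth is the npm credential helper for Artifact Registry mentioned above:

```json
{
  "name": "@example/blog-package",
  "version": "1.0.0",
  "description": "Returns its input as unicode escape sequences",
  "main": "index.js",
  "scripts": {
    "artifactregistry-login": "npx google-artifactregistry-auth"
  },
  "devDependencies": {
    "google-artifactregistry-auth": "^2.0.0"
  }
}
```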

Uploading the package to Artifact Registry & setting up authentication

Follow the instructions on these guides for creating an npm package repository on Artifact Registry, without configuring npm or pushing the repository. Next, you’ll want to configure the .npmrc file. To do so, simply add an empty file titled .npmrc, which should live at the base of your repository. To configure this file to deploy to the registry you just created, run the following command, and add the output to the .npmrc file. (Note: you may need to install the Google Cloud SDK before running the command.)

gcloud alpha artifacts print-settings npm --scope=@example --repository=blog-repo --location=us-central1

Copy that output into your .npmrc file. Ultimately, it should look like this, substituting <projectId> for your Google Cloud Platform project ID.
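A sketch of the resulting file is below; the exact output of the print-settings command for your repository is authoritative:

```
@example:registry=https://us-central1-npm.pkg.dev/<projectId>/blog-repo/
//us-central1-npm.pkg.dev/<projectId>/blog-repo/:_authToken=""
//us-central1-npm.pkg.dev/<projectId>/blog-repo/:always-auth=true
```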

Then, you can push the package to the artifact repository. To do so, run this command (ensuring that you’ve copied the scripts portion from the package.json file above):

npm run artifactregistry-login <path to your .npmrc file>

This command refreshes your access token before you publish to the repository.

Then, simply publish by running:

npm publish

You can confirm you’ve deployed your library by searching for the repo in Artifact Registry in your Google Cloud Platform dashboard. 

Setting up your Google Cloud Function

Once you have set up a repository in Artifact Registry, you can start to build your applications on top of it. Take a simple serverless example, like a Google Cloud Function:
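The function body isn’t reproduced here, so the following is a minimal sketch; the package name @example/blog-package and the toUnicode() helper are illustrative, not confirmed by the original post:

```javascript
// index.js — sketch of the Cloud Function, assuming the private
// dependency is published as @example/blog-package and exposes a
// toUnicode() helper (both names illustrative).
const { toUnicode } = require('@example/blog-package');

exports.helloUnicode = (req, res) => {
  res.send(toUnicode('Hello World'));
};
```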

This simple Cloud Function uses our private dependency to print “Hello World” in unicode at the specified URL. But how will the Cloud Function successfully pull the private dependency? By using the .npmrc file you created in your original repository.

To see it in action, follow instructions for creating a simple Google Cloud Function. You can follow the tutorial exactly, ensuring that the following three key elements are in your function:

When you create the index.js file (as done in the tutorial), it should live at the base of the repository, and should use the private dependency you’ve set up in artifact registry (like the example above),

Its package.json should list:

Your dependency, with the version as listed in Artifact Registry
A script to authenticate with Artifact Registry (just as for your dependency)

And, most importantly, you should copy your .npmrc file to the base of this Google Cloud Function so it can authenticate with your token.

Then, you can deploy the function using the following command:

gcloud functions deploy mygcf --runtime nodejs12 --trigger-http --allow-unauthenticated

To see it in action, simply follow the http trigger link (from the tutorial) and check out your input in unicode.

Automate and protect your Cloud Function

The command above will deploy the function, but it does so by exposing your token in your .npmrc file, and by forcing you to manually re-authenticate each time you redeploy the function. To automate the redeployment of the function in a safe manner, you can add a cloudbuild.yaml file to the root of your Cloud Function package.

First, let’s start by creating a helper function to modify the .npmrc file. You should save the following file to the root of your Cloud Function package, and name it npmrc-parser.js:

Next, let’s create the Cloud Build file. To do so, copy the following file into the root of your directory, and title it cloudbuild.yaml:
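The original file isn’t reproduced here; the sketch below follows the four steps described in this section. The builder images, file paths, and the --set-build-env-vars flag usage are assumptions you may need to adjust:

```yaml
# cloudbuild.yaml — a sketch; image tags and flags may need adjusting.
steps:
  # 1. Refresh the Artifact Registry credentials in .npmrc.
  - name: 'node:12'
    entrypoint: 'npm'
    args: ['run', 'artifactregistry-login', '--', '${_PATHTONPMRC}']
  # 2. Install all dependencies, including the private package.
  - name: 'node:12'
    entrypoint: 'npm'
    args: ['install']
  # 3. Strip the token out of .npmrc and stash it for the deploy step.
  - name: 'node:12'
    entrypoint: 'bash'
    args:
      - '-c'
      - 'node npmrc-parser.js "${_PATHTOTOKEN}" "${_PATHTONPMRC}" > /workspace/token.txt'
  # 4. Redeploy the function, handing the token back in as TOKEN.
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: 'bash'
    args:
      - '-c'
      - >-
        gcloud functions deploy ${_FUNCTIONNAME}
        --runtime nodejs12 --trigger-http --allow-unauthenticated
        --set-build-env-vars=TOKEN="$$(cat /workspace/token.txt)"
```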

The first two steps of the build file authenticate your private dependency and install all of the project’s dependencies. The third step calls the custom helper we created above to prepare your .npmrc file. This helper takes two arguments: pathToAuthToken and pathToNpmrc. The pathToAuthToken is the left-hand side of the _authToken assignment in your .npmrc file. It should look something like this, replacing projectId with your own project:
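Based on the us-central1 repository created earlier, a sketch of that value:

```
//us-central1-npm.pkg.dev/<projectId>/blog-repo/
```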


The pathToNpmrc would be wherever you’ve saved your .npmrc file. In this case, the value would look like so:
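Assuming the .npmrc sits at the root of the function directory, for example:

```
./.npmrc
```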


This build step removes the token value from the file, saves it to a variable, and replaces it in the .npmrc file with a reference to the environment variable TOKEN. As a result, the Cloud Function never stores the actual token in its source code, and the .npmrc file saved locally looks like this:
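Concretely, the sanitized file would resemble the following sketch (registry paths illustrative):

```
@example:registry=https://us-central1-npm.pkg.dev/<projectId>/blog-repo/
//us-central1-npm.pkg.dev/<projectId>/blog-repo/:_authToken="${TOKEN}"
//us-central1-npm.pkg.dev/<projectId>/blog-repo/:always-auth=true
```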




The last step in the build file redeploys the function, replacing the environment variable in the .npmrc file with the token value we just created. To run the build steps, you can set up a trigger, or run the following command manually, replacing the variables as we’ve described above:

gcloud builds submit --config=cloudbuild.yaml --substitutions=_PATHTOTOKEN="<PATHTOTOKEN>",_PATHTONPMRC="<PATHTONPMRC>",_FUNCTIONNAME="<CLOUDFUNCTIONNAME>"

Before running, make sure you’ve granted the appropriate permissions to your Cloud Build service account.

That’s all there is to it! Once set up this way, your Google Cloud Function can pull in your private dependency from Artifact Registry without hosting on any external package managers, and without any manual deployment steps.

Automate publishing your private dependency 

To speed up the deployment of your local package to Artifact Registry, you can also add a Cloud Build file to your Artifact Registry package that will trigger a publishing event when changes are saved to your package. You can follow the setup steps here, but here is a snippet of a sample cloudbuild.yaml file that would live in your private dependency:
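The original snippet isn’t reproduced here; a minimal sketch, assuming the dependency’s package.json carries the artifactregistry-login script shown earlier and its .npmrc sits at the repository root:

```yaml
# cloudbuild.yaml for the dependency — sketch: re-authenticate, then publish.
steps:
  # Refresh the Artifact Registry credentials in .npmrc.
  - name: 'node:12'
    entrypoint: 'npm'
    args: ['run', 'artifactregistry-login', '--', './.npmrc']
  # Publish the package to the Artifact Registry npm repository.
  - name: 'node:12'
    entrypoint: 'npm'
    args: ['publish']
```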

What about other languages and runtimes?

Even though this blog post focuses on Node.js, Cloud Functions and Artifact Registry support other runtimes as well, like Python and Java. With Python, for example, the steps for deploying a module aren’t much more complicated than with Node. Once you’ve readied your private dependency and published it to Artifact Registry, you can create a Google Cloud Function like the one above, but in Python. Next, you will want to fetch and package these dependencies locally.
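As a sketch, vendoring the private dependency into the function’s source might look like the command below; the repository URL and package name are illustrative, and Python repositories on Artifact Registry expose a pip-compatible /simple/ index:

```
pip install \
    --extra-index-url https://us-central1-python.pkg.dev/<projectId>/blog-pypi-repo/simple/ \
    --target ./vendor \
    my-private-package
```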



