Securely use Snowflake from VS Code in the browser
Mar 29, 2024
•
Stijn De Haes
Secure Snowflake SSO in browser-based VS Code using custom OAuth, CLI/API auth flow, and a dbt adapter for seamless cloud IDE integration.
At Conveyor we help you build, deploy, and scale your data products. One way we do that is by offering IDEs that run in the cloud, and a primary activity among our users is running dbt inside those IDEs. However, we've encountered challenges when it comes to integrating Snowflake SSO with dbt in our Conveyor IDEs.
Our IDEs are based on VS Code in the browser; more specifically, we use the code-server project. The approach we used can be generalized to any setup that runs VS Code in the browser, so I wanted to share the blueprint of our solution in this article.

The issue with using VS Code, dbt, and Snowflake SSO
As mentioned before, integrating VS Code in the browser, dbt, and Snowflake SSO is not easy at the moment.
Using SSO for Snowflake works great, except when you want to log in from VS Code in the browser. For example, when using externalbrowser authentication with dbt, it works like this:
- A user runs dbt run, which starts the Snowflake authentication flow
- A web browser is opened for the user to authenticate
- The user authenticates through SSO
- Snowflake posts an authentication token to localhost:**** (a random port) that dbt needs to catch
- But nothing happens, since our VS Code in the browser is not running on localhost (it runs on Kubernetes, on a domain like http://app.conveyordata.com)

The problem is that our VS Code instance runs in the cloud, not on the user's laptop. dbt is therefore not running on localhost, so the credentials can never be passed to it. There is also no way to change this callback to anything other than localhost with a random port.
The port can be chosen, however, so we could write software that runs on the user's machine instead of inside VS Code in the browser. But this defeats the purpose of using cloud IDEs: users would have to configure their local machine again.
An alternative is to use one of the many security integrations available in Snowflake that are supported by dbt. We have chosen the custom OAuth integration of Snowflake.
This implementation is a three-step process:
- We are going to configure the custom OAuth security integration in Snowflake
- Then we are going to configure dbt
- Lastly, I will give you the blueprint of how to create a CLI/API combination that passes credentials from Snowflake SSO to dbt
Using the Snowflake custom OAuth security integration
To fix our problem we can use the custom OAuth security integration available in Snowflake. You can enable it as follows:
CREATE SECURITY INTEGRATION VS_CODE_BROWSER
TYPE = OAUTH
ENABLED = TRUE
OAUTH_CLIENT = CUSTOM
OAUTH_CLIENT_TYPE = 'PUBLIC'
OAUTH_REDIRECT_URI = 'https://OUR_DOMAIN/snowflake-callback'
OAUTH_ISSUE_REFRESH_TOKENS = TRUE
OAUTH_REFRESH_TOKEN_VALIDITY = 86400;
There are a couple of things to unpack here:
- TYPE = OAUTH and OAUTH_CLIENT = CUSTOM mean we are enabling an OAuth-compliant security integration of type custom, which lets you build your own OAuth integration with Snowflake
- OAUTH_CLIENT_TYPE = 'PUBLIC' specifies that the OAuth client id and secret we get can be treated as publicly visible. This limits the OAuth flows we can use to the ones that stay safe when the client id and secret are shared publicly. For example, the authorization code flow (the one we will use) is still safe, while the client credentials flow is not
- OAUTH_REDIRECT_URI = 'https://OUR_DOMAIN/snowflake-callback', in the authorization code flow that we'll use, this is the URL that receives the authorization code, which can be traded in for an access token. This is a bit of a problem since our security integration can only have one callback, but our VS Code IDEs are not all hosted on the same URL
Integrating with dbt
Now that we have set up a security integration, we need to integrate it with dbt. You can do this using the following profiles.yml:
default:
  target: conveyor
  outputs:
    conveyor:
      #Basic snowflake setup
      type: snowflake
      account: YOUR_SNOWFLAKE_ACCOUNT
      database: YOUR_DATABASE
      warehouse: YOUR_WAREHOUSE
      schema: YOUR_SCHEMA
      threads: 1
      #OAUTH Setup
      authenticator: oauth
      oauth_client_id: OAUTH_CLIENT_ID
      oauth_client_secret: OAUTH_CLIENT_SECRET
      username: "{{ env_var('SNOWFLAKE_USERNAME') }}"
      token: "{{ env_var('SNOWFLAKE_TOKEN') }}"
The OAUTH_CLIENT_ID and OAUTH_CLIENT_SECRET need to be replaced with the values we can fetch from our security integration:
SELECT SYSTEM$SHOW_OAUTH_CLIENT_SECRETS('VS_CODE_BROWSER');
Fetching the Snowflake username and token
To make dbt work we need to populate the environment variables SNOWFLAKE_USERNAME and SNOWFLAKE_TOKEN; these are the credentials dbt will use to authenticate to Snowflake.
This is the tricky part! To make this work we need to create both a CLI and an API. In the next sections, I am going to show how you can implement them.

1. Start authentication flow with CLI
The flow starts with a CLI, which you can implement in any language. It needs basic support for making HTTP calls, opening a browser, and receiving HTTP calls. We use Go internally, but any language should work.
To start the flow, the user of the VS Code IDE calls your CLI to kick off the authentication process, passing the parameters snowflake_client_id, snowflake_client_secret, and snowflake_account.
The CLI then needs to know on which HOST the VS Code in the browser is currently running, and will start an OAuth flow with a custom state parameter (see the sketch after this list). In this state parameter we store a base64-encoded JSON message containing:
- A random string, as protection against CSRF 
- Our current HOST where the VS Code environment is running 
- And a port. This will later be used by the API to redirect to our VS Code environment 
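Here is a minimal sketch in Go of how such a state value could be built. The struct fields and the encodeState helper are purely illustrative, not a fixed format:

package cli

import (
	"crypto/rand"
	"encoding/base64"
	"encoding/json"
)

// oauthState is the payload we smuggle through the OAuth state parameter.
type oauthState struct {
	Nonce string `json:"nonce"` // random string, CSRF protection
	Host  string `json:"host"`  // host where the VS Code environment runs
	Port  int    `json:"port"`  // port the CLI listens on for the callback
}

// encodeState serializes the state as base64-encoded JSON so it survives
// the round trip through Snowflake untouched.
func encodeState(host string, port int) (string, error) {
	nonce := make([]byte, 16)
	if _, err := rand.Read(nonce); err != nil {
		return "", err
	}
	raw, err := json.Marshal(oauthState{
		Nonce: base64.RawURLEncoding.EncodeToString(nonce),
		Host:  host,
		Port:  port,
	})
	if err != nil {
		return "", err
	}
	return base64.RawURLEncoding.EncodeToString(raw), nil
}

The exact encoding does not matter much, as long as the API hosted on the callback URL can decode it again later.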
Then we can start the authentication flow against Snowflake's authorization endpoint, by making the user open a browser with the authorization URL. We need to fill in the correct query parameters (an example follows after this list):
- client_id, should be the OAUTH_CLIENT_ID of the security integration
- response_type, should be code
- redirect_uri, should be the redirect URI we specified in the security integration
- state, as said before, a base64-encoded JSON message containing a random string, our current HOST, and a port
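Putting these parameters together, the authorization URL could be built roughly like this. The /oauth/authorize path is Snowflake's documented authorization endpoint; accountURL and the state argument (from the sketch above) are placeholders:

package cli

import (
	"fmt"
	"net/url"
)

// buildAuthorizeURL assembles the Snowflake authorization URL.
// accountURL looks like https://<account>.snowflakecomputing.com and
// redirectURI must match the OAUTH_REDIRECT_URI of the security integration.
func buildAuthorizeURL(accountURL, clientID, redirectURI, state string) string {
	query := url.Values{}
	query.Set("client_id", clientID)
	query.Set("response_type", "code")
	query.Set("redirect_uri", redirectURI)
	query.Set("state", state)
	return fmt.Sprintf("%s/oauth/authorize?%s", accountURL, query.Encode())
}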
Now we just need the user to open a browser to this URL, and the CLI to start an HTTP server that receives the callback on the port we stored in the state.
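A minimal sketch of that callback server; the /callback path and the channel-based handoff are our own choices for this example:

package cli

import (
	"fmt"
	"net/http"
)

// callbackResult carries the values the API forwards back to the CLI.
type callbackResult struct {
	Code  string
	State string
}

// waitForCallback starts an HTTP server on the port stored in the state and
// blocks until the forwarded authorization code arrives.
func waitForCallback(port int) callbackResult {
	results := make(chan callbackResult, 1)
	mux := http.NewServeMux()
	mux.HandleFunc("/callback", func(w http.ResponseWriter, r *http.Request) {
		results <- callbackResult{
			Code:  r.URL.Query().Get("code"),
			State: r.URL.Query().Get("state"),
		}
		fmt.Fprintln(w, "Authentication finished, you can close this tab.")
	})
	server := &http.Server{Addr: fmt.Sprintf(":%d", port), Handler: mux}
	go server.ListenAndServe() // error handling omitted in this sketch
	result := <-results
	server.Close()
	return result
}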
2. User authentication and capturing the callback
After the user opens the URL in the browser they will be presented with a login screen. After logging in, they will be redirected to our API.
Remember that we can have only one callback URL? This is the moment where we solve that problem: we host an API on this callback URL. After the user logs in, they are redirected to the redirect URL with a GET request containing the following parameters:
- code, the short-lived authorization code that we can trade in for a real access and refresh token
- state, the same base64-encoded JSON message we passed from the CLI
This request arrives at our API, where we can read the state and extract the HOST of the VS Code environment and the port where our CLI is listening for a response. This means we know where we can forward the request!
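A sketch of what that API handler could look like. How the HOST and port translate into a URL that actually reaches the CLI depends on your setup; here we simply assume the CLI's port is reachable under the IDE's host:

package api

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
)

// forwardedState mirrors the JSON the CLI put into the state parameter.
type forwardedState struct {
	Nonce string `json:"nonce"`
	Host  string `json:"host"`
	Port  int    `json:"port"`
}

// snowflakeCallback receives the authorization code from Snowflake and
// forwards it to the CLI running next to the user's VS Code environment.
func snowflakeCallback(w http.ResponseWriter, r *http.Request) {
	code := r.URL.Query().Get("code")
	rawState := r.URL.Query().Get("state")

	decoded, err := base64.RawURLEncoding.DecodeString(rawState)
	if err != nil {
		http.Error(w, "invalid state", http.StatusBadRequest)
		return
	}
	var state forwardedState
	if err := json.Unmarshal(decoded, &state); err != nil {
		http.Error(w, "invalid state", http.StatusBadRequest)
		return
	}

	// Forward code and state to the CLI; we assume its port is exposed
	// on the same host as the VS Code environment.
	query := url.Values{}
	query.Set("code", code)
	query.Set("state", rawState)
	target := fmt.Sprintf("https://%s:%d/callback?%s", state.Host, state.Port, query.Encode())
	http.Redirect(w, r, target, http.StatusFound)
}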
In the CLI we can then verify the state to make sure it is the same state we started our authentication flow with (this protects against cross-site request forgery, you can read more here), and use the authorization code to trade it in for an access token.
3. Setting the credentials in the environment
Now that we have the authorization code, we can trade it in for a real access token. To do that we need to use the token endpoint. I am not going to go into all the details, since it is a typical OAuth token endpoint call and pretty standard.
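Still, for completeness, here is a sketch of that call in Go. It assumes Snowflake's documented /oauth/token-request endpoint and uses the client id and secret of the integration we created earlier:

package cli

import (
	"encoding/json"
	"net/http"
	"net/url"
	"strings"
)

// tokenResponse holds the fields of the token response we care about.
type tokenResponse struct {
	AccessToken  string `json:"access_token"`
	RefreshToken string `json:"refresh_token"`
	Username     string `json:"username"`
}

// exchangeCode trades the authorization code for tokens.
// accountURL looks like https://<account>.snowflakecomputing.com.
func exchangeCode(accountURL, clientID, clientSecret, redirectURI, code string) (tokenResponse, error) {
	form := url.Values{}
	form.Set("grant_type", "authorization_code")
	form.Set("code", code)
	form.Set("redirect_uri", redirectURI)

	req, err := http.NewRequest(http.MethodPost, accountURL+"/oauth/token-request",
		strings.NewReader(form.Encode()))
	if err != nil {
		return tokenResponse{}, err
	}
	req.Header.Set("Content-Type", "application/x-www-form-urlencoded")
	// Snowflake authenticates the OAuth client via HTTP basic auth.
	req.SetBasicAuth(clientID, clientSecret)

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return tokenResponse{}, err
	}
	defer resp.Body.Close()

	var tokens tokenResponse
	err = json.NewDecoder(resp.Body).Decode(&tokens)
	return tokens, err
}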
After you have done this call, you will receive a body with a couple of parameters. We are interested in the following:
- refresh_token, this is the token we need for dbt; our CLI needs to set the environment variable SNOWFLAKE_TOKEN to this value
- username, this is the username of the user that just authenticated; our CLI needs to set the environment variable SNOWFLAKE_USERNAME to this value
After this is done we can use dbt run like normal!
Improving the UX
There is one extra step we can take to improve the UX for our users. At Conveyor we created the dbt-conveyor-snowflake adapter, a thin wrapper around dbt-snowflake that automatically calls our CLI from the previous steps to log a user in. This way users no longer have to do anything manually to authenticate themselves. You enable this adapter by installing it and using the following profiles.yml:
default:
  target: conveyor
  outputs:
    conveyor:
      #Basic snowflake setup
      type: conveyorsnowflake
      account: YOUR_SNOWFLAKE_ACCOUNT
      database: YOUR_DATABASE
      warehouse: YOUR_WAREHOUSE
      schema: YOUR_SCHEMA
      threads: 1
      #OAUTH Setup
      authenticator: oauth
      oauth_client_id: OAUTH_CLIENT_ID
      oauth_client_secret: OAUTH_CLIENT_SECRET
The main differences are:
- We switched the type of adapter from snowflake to conveyorsnowflake
- The keys token and username in the conveyor output have been removed, since they are filled in by our adapter
Recap
In this article we learned how we can use Snowflake SSO with dbt inside of VS Code IDE running in the browser. We learned:
- How to set up the security integration in Snowflake 
- How to configure our dbt profiles.yml
- How to create a CLI and API combination that allows the IDE to fetch the authentication token from Snowflake 
- How to make the UX easier for our users by using a custom Snowflake adapter