Tutorial: Execute Snowflake statements via REST API
This tutorial provides step-by-step instructions for executing Snowflake statements as a service user within a Snoweaver project.
This approach is useful in several scenarios:
Calling Snowflake procedures/functions from outside Snoweaver projects
Executing Snowflake operations within Snoweaver functions
Interacting with Snowflake beyond standard Snoweaver capabilities
For details on the REST API endpoints used in this tutorial, please refer to the following page:
Prerequisites
Before continuing with this tutorial, ensure you have completed the following prerequisites:
Create resources for connecting to Snowflake
Launch a command-line terminal (Linux, macOS) or Git Bash (Windows) on your client machine and execute the following commands to generate a key pair.
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out rsa_key.p8 -nocrypt
openssl rsa -in rsa_key.p8 -pubout -out rsa_key.pub
echo -e '\nPrivate Key:\n'
cat rsa_key.p8
echo -e '\nPublic Key:\n'
sed '1d;$d' rsa_key.pub | tr -d '\n'
echo -e '\n'
Make a note of both the private and public key values for future reference.
Open a Snowflake worksheet and run the following statements with a role that has the required privileges. You may use the AccountAdmin role if working in a sandbox environment:
-- Create a service user for the LEARN_DEV project
CREATE USER IF NOT EXISTS S_SW_LEARN_DEV;

-- Set the RSA public key for the service user
-- Replace '<your public key>' with the actual public key generated earlier
ALTER USER S_SW_LEARN_DEV SET RSA_PUBLIC_KEY='<your public key>';

-- Create a role for the service user if it doesn't exist
CREATE ROLE IF NOT EXISTS S_SW_LEARN_DEV;

-- Grant the LEARN_DEV_DEVELOPER role to the service user
GRANT ROLE LEARN_DEV_DEVELOPER TO USER S_SW_LEARN_DEV;

-- Display information about the newly created service user
DESC USER S_SW_LEARN_DEV;
The last statement provides details about the new service user. Please make a note of the RSA_PUBLIC_KEY_FP value for future reference.
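If you prefer to isolate the fingerprint rather than scan the full DESC USER output, a query along these lines should work; it relies on RESULT_SCAN reading the result of the immediately preceding statement:

DESC USER S_SW_LEARN_DEV;
SELECT "value"
FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()))
WHERE "property" = 'RSA_PUBLIC_KEY_FP';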
Execute the following statements to configure Snoweaver resources for connecting to your Snowflake account.
Note
Ensure that the executing role has both the CREATE INTEGRATION privilege and the application role APP_ADMIN. You may use the AccountAdmin role if working in a sandbox environment:
USE SCHEMA SNOWEAVER.ADMIN;

-- create a secret to store the username and the private key.
CALL SAVE_SECRET('snflk_local_sw_learn_dev','
type: basic
username: S_SW_LEARN_DEV
password: |-
    -----BEGIN PRIVATE KEY-----
    <Your multi-line private key, formatted with indentations.>
    -----END PRIVATE KEY-----
');

-- create a secret to store the public key fingerprint
CALL SAVE_SECRET('snflk_local_sw_learn_dev_fp','
type: string
generic_string: <your rsa_public_key_fp>
');

-- create a network access rule for the Snowflake connection
CALL SAVE_NETWORK_RULE('snflk_local_access_rule',
    '<your org>-<your account>.snowflakecomputing.com');

-- create an external access integration for the Snowflake connection
CREATE OR REPLACE EXTERNAL ACCESS INTEGRATION SNFLK_LOCAL_SW_LEARN_DEV_ACESS_INTEGRATION
    ALLOWED_NETWORK_RULES = (SNOWEAVER.ADMIN.SNFLK_LOCAL_ACCESS_RULE)
    ALLOWED_AUTHENTICATION_SECRETS = (SNOWEAVER.ADMIN.SNFLK_LOCAL_SW_LEARN_DEV, SNOWEAVER.ADMIN.SNFLK_LOCAL_SW_LEARN_DEV_FP)
    ENABLED = TRUE;

-- register the new external access integration
GRANT USAGE ON INTEGRATION SNFLK_LOCAL_SW_LEARN_DEV_ACESS_INTEGRATION TO APPLICATION SNOWEAVER;
CALL REGISTER_EXTERNAL_INTEGRATION('snflk_local_sw_learn_dev_acess_integration');

-- add the new secrets and integration to the project.
CALL SAVE_PROJECT('LEARN_DEV','
secrets:
  - snflk_local_sw_learn_dev
  - snflk_local_sw_learn_dev_fp
external_integrations:
  - snflk_local_sw_learn_dev_acess_integration
');
Note
Please refer to this document to locate your organization and account names in Snowflake.
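Optionally, before moving on, you can confirm that the integration was created and enabled. A quick check such as the following should work with the names used above:

SHOW EXTERNAL ACCESS INTEGRATIONS LIKE 'SNFLK_LOCAL_SW_LEARN_DEV_ACESS_INTEGRATION';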
Import and review the job
Open the LEARN_DEV project in Snoweaver with the LEARN_DEV_DEVELOPER role.
Open the Home page.
Create a new project variable as follows:
Name: snflk_local_account
Type: Text
Value: <your org>-<your account>
Use the Importing / Exporting Resources function to import the following resources.
Format: On Screen
Resource Type: Job
New record: checked
Name: snflk_local_exec_stmt
YAML Resource Data:
endpoint: https://{{_proj.snflk_local_account}}.snowflakecomputing.com/api/v2/statements/
external_integration: snflk_local_sw_learn_dev_acess_integration
headers: |-
  {
    "Authorization": "Bearer {{ generate_snflk_jwt(_proj.snflk_local_account,_secrets.local,_secrets.local_fp) }}",
    "X-Snowflake-Authorization-Token-Type":"KEYPAIR_JWT",
    "Accept":"application/json"
  }
instance_type: function
payload: |-
  {
    "statement":{{_vars.statement|tojson}},
    "warehouse": "{{_vars.warehouse}}",
    "role": "{{_vars.role}}"
  }
payload_format: json
posthook: |-
  {% if 200<=_status<=299 %}
  {{ _response.data|tojson }}
  {% else %}
  {{ raise_exception(_response.message) }}
  {% endif %}
posthook_replace_output: true
request_type: post
response_format: json
secrets:
  - alias: local
    secret: snflk_local_sw_learn_dev
  - alias: local_fp
    secret: snflk_local_sw_learn_dev_fp
type: web_request
variables:
  - name: statement
    type: text
    value: select current_role();
  - name: warehouse
    type: text
    value: LEARN_SW_WH
  - name: role
    type: text
    value: LEARN_DEV_DEVELOPER
Open snflk_local_exec_stmt on the Jobs page.
To streamline the design, this job accepts only a single statement for submission. However, if you need to submit multiple statements, you can enclose them within a BEGIN … END block. We will cover this process later in the tutorial.
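As a quick illustration of that pattern, a multi-statement submission wrapped in an anonymous block could look like the following; the table name DEMO is only a placeholder:

BEGIN
    CREATE OR REPLACE TABLE DEMO (ID INT);
    INSERT INTO DEMO VALUES (1);
    RETURN 'done';
END;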
The following variables are defined for the job.
statement: the statement that will be executed.
warehouse: the warehouse used for executing the statement.
role: the role assigned to perform the execution.
We use the system function generate_snflk_jwt to generate a JWT for authenticating with Snowflake.
The payload includes parameters for the statement, warehouse, and role, all of which are derived from the job variables.
Test the job and build an instance
Click Make a Test Call to test the job without the post-hook.
Review the response from Snowflake.
Review the configuration of the post-hook. The post-hook checks the status of the response. If the call is successful, it replaces the output with the value of the data property. Otherwise, it raises an exception with the returned error message.
Click Make a Test Call with Posthook to test the job.
Click Build to build a job instance.
Open a Snowflake worksheet and execute the following statements to call the job instance:
USE ROLE LEARN_DEV_DEVELOPER;
USE SCHEMA SNOWEAVER.LEARN_DEV;

WITH T AS (
    SELECT 'select current_role();' STMT, 'LEARN_SW_WH' WH, 'LEARN_DEV_DEVELOPER' ROLE
    UNION
    SELECT 'show warehouses;', 'LEARN_SW_WH', 'LEARN_DEV_DEVELOPER'
    UNION
    SELECT 'show databases;', 'LEARN_SW_WH', 'LEARN_DEV_DEVELOPER'
)
SELECT STMT, SNFLK_LOCAL_EXEC_STMT(STMT, WH, ROLE) FROM T;
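If you would rather work with the result as rows instead of a raw JSON string, a query along these lines should work, assuming the instance returns the data array serialized as JSON text:

SELECT F.VALUE[0]::STRING AS CURRENT_ROLE_VALUE
FROM TABLE(FLATTEN(INPUT => PARSE_JSON(
    SNFLK_LOCAL_EXEC_STMT('select current_role();', 'LEARN_SW_WH', 'LEARN_DEV_DEVELOPER')))) F;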
Use a pre-hook to prepare statements with Jinja
Return to Snoweaver.
Click Save as to create a new job named snflk_local_exec_jinja_output.
Remove the statement job variable.
Enable Pre-hook and insert the following code into the pre-hook section.
DECLARE
    results ARRAY default ARRAY_CONSTRUCT();
BEGIN
    USE SCHEMA LEARN_SW.DEV;
    {%- for i in range(10) %}
    CREATE OR REPLACE TABLE TEST{{ i }}(ID INT);
    CREATE OR REPLACE VIEW V_TEST{{ i }} AS SELECT * FROM TEST{{ i }};
    DROP VIEW V_TEST{{ i }};
    DROP TABLE TEST{{ i }};
    results := ARRAY_APPEND(:results, 'Round {{ i }} successfully completed.');
    {%- endfor %}
    RETURN results;
END;
This template generates a script that creates 10 tables and their corresponding views, then drops all of these objects. Test the pre-hook template to see the rendered output.
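For reference, the rendered pre-hook is a single Snowflake Scripting block. Abridged to the first loop iteration, it looks roughly like this:

DECLARE
    results ARRAY default ARRAY_CONSTRUCT();
BEGIN
    USE SCHEMA LEARN_SW.DEV;
    CREATE OR REPLACE TABLE TEST0(ID INT);
    CREATE OR REPLACE VIEW V_TEST0 AS SELECT * FROM TEST0;
    DROP VIEW V_TEST0;
    DROP TABLE TEST0;
    results := ARRAY_APPEND(:results, 'Round 0 successfully completed.');
    -- iterations 1 through 9 repeat the same pattern
    RETURN results;
END;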
Update the payload template with the following configuration:
{ "statement":{{_prehook|tojson}}, "warehouse": "{{_vars.warehouse}}", "role": "{{_vars.role}}" }
Click Make a Test Call with Posthook to test the job and verify the results.
Save the job.