
Which execution times are achieved with the variants in Python? The following variants exist in Python (values determined from a "POC" project):

(1) Stored procedures in Snowflake are invoked from Python.
(2) Stored procedures are re-implemented or reproduced in Python, and also executed in Python. Instead of a detour via stored procedures, the SQL statements (= SELECT) are executed directly in Snowflake from Python.

Several passes and test invocations were performed here:

With (1): Python calling the JavaScript stored procedure.
With (2): Python re-implementing the stored procedure: select from an (arbitrary) Snowflake table, including "TCL" (= begin, rollback, commit).

Further Snowflake-specific SQL statements are needed for the environment. These are "mandatory" and shown here as examples; they are not treated in greater detail, as this is beyond the scope of this article:

cur.execute("use warehouse warehousename")
cur.execute("alter warehouse warehousename resume")
cur.execute("alter session set timestamp_type_mapping = 'TIMESTAMP_NTZ'")
cur.execute("alter session set timezone = 'Europe/Berlin'")
cur.execute("alter session set TIME_OUTPUT_FORMAT = 'HH24:MI:SS.FF'")
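To compare the execution times of the variants, a small timing helper around the cursor is enough. This is a minimal sketch; the cursor is any open Snowflake connector cursor, and the procedure and table names in the comments are illustrative, not from the original project:

```python
import time

def timed_execute(cur, sql, label):
    """Execute one SQL statement on the given cursor and report elapsed wall-clock time."""
    start = time.perf_counter()
    result = cur.execute(sql)
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed:.3f}s")
    return result, elapsed

# Variant (1) -- invoke the JavaScript stored procedure (name is illustrative):
#   timed_execute(cur, "CALL my_proc1()", "variant 1")
# Variant (2) / direct route -- run the SELECT straight from Python:
#   timed_execute(cur, "SELECT * FROM some_table", "direct select")
```

Wrapping each statement the same way keeps the measurements comparable across the variants.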

This means that a Python connector for Snowflake is available to create DB connections from one's own client. ETL loading routes can be realized (manually) with the help of the Python connector (= DB connection): sources (csv, xml, tables, xlsx etc.) can be imported, and further SQL scripts can also be invoked. Scheduling can take place, for example, via a cron job. This article describes this Python connector using sample snippets.

In Python, new packages are installed with the help of "pip". The following module can be imported in Python code; this makes the Python connector available, and implementation can begin. A first snippet of Python is provided next:

cur_dict = conn.cursor()  # cursor with parameter
cur_dict.execute("SELECT current_version() as c1, current_client() as c2, current_date() as c3")
# One variant of this is access by column name

The method of invoking the above-mentioned stored procedure "my_proc1()" in Python is described here.
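The snippet above can be sketched end to end. The connection parameters below are placeholders, and the query is run through a small function so it works against any DB-API cursor:

```python
# Opening a connection (sketch; user/password/account are placeholders):
#   pip install snowflake-connector-python
#   import snowflake.connector
#   conn = snowflake.connector.connect(user="USER", password="***", account="xy12345")
#   cur = conn.cursor()

def fetch_context(cur):
    """Run the sample query on a DB-API cursor and return the first row."""
    cur.execute(
        "SELECT current_version() as c1, current_client() as c2, current_date() as c3"
    )
    return cur.fetchone()

# Column-name access is the variant mentioned above: request a DictCursor via
# conn.cursor(snowflake.connector.DictCursor), then read e.g. row["C1"]
# (Snowflake upper-cases unquoted column aliases).
```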
Snowflake's own implementation offers drivers for Python, Spark, ODBC and JDBC.



Snowflake is a native cloud DB which runs on AWS, and now also on Azure. The Internet connection from the client to the cloud and the data within the DB are encrypted. During execution, Snowflake can automatically scale up as required, and scale down again at the end. Because (storage) volume and execution time are paid for, the shorter times can reduce costs. Detailed online documentation is available at the following URL: Incidentally, one does not have to be an AWS customer to be able to use Snowflake.

As a cloud-DB service, Snowflake itself offers no proprietary ETL tools, but leaves this to the manufacturers of ETL, reporting or self-service BI tools. These usually provide native drivers and connections to allow use of their tools with Snowflake. If no separate ETL tool is to be used in the company, there are several possibilities for loading data and realizing the ETL routes. One possibility is implementation of the logic in SQL, and orchestration via Python.
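The "logic in SQL, orchestration via Python" approach can be sketched as a loop that feeds SQL script files to an open cursor. This is a simplified illustration: the function names are hypothetical, and splitting on ";" is naive (it would break on semicolons inside string literals):

```python
def run_sql_scripts(cur, script_paths):
    """Minimal orchestration: execute each SQL script file, statement by statement."""
    executed = []
    for path in script_paths:
        with open(path) as f:
            # Naive statement split; sufficient for simple DDL/DML scripts.
            for stmt in f.read().split(";"):
                stmt = stmt.strip()
                if stmt:
                    cur.execute(stmt)
                    executed.append(stmt)
    return executed

# Scheduled e.g. via cron:
#   run_sql_scripts(cur, ["stage_load.sql", "transform.sql"])
```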
