Merge branch 'snowflake' into bloomberg-snowflake
dprophet authored Dec 5, 2023
2 parents c4ff5fb + 4536040 commit 4b233b0
Showing 114 changed files with 164 additions and 3 deletions.
Empty file modified .github/bin/build-matrix-from-impacted.py
100755 → 100644
Empty file.
Empty file modified .github/bin/build-pt-matrix-from-impacted-connectors.py
100755 → 100644
Empty file.
Empty file modified .github/bin/download-maven-dependencies.sh
100755 → 100644
Empty file.
Empty file modified .github/bin/fake-ptl
100755 → 100644
Empty file.
Empty file modified .github/bin/free-disk-space.sh
100755 → 100644
Empty file.
Empty file modified .github/bin/git-fetch-base-ref.sh
100755 → 100644
Empty file.
Empty file modified .github/bin/prepare-check-commits-matrix.py
100755 → 100644
Empty file.
Empty file modified .github/bin/redshift/delete-aws-redshift.sh
100755 → 100644
Empty file.
Empty file modified .github/bin/redshift/setup-aws-redshift.sh
100755 → 100644
Empty file.
Empty file modified .github/bin/retry
100755 → 100644
Empty file.
Empty file modified .github/bin/s3/delete-s3-bucket.sh
100755 → 100644
Empty file.
Empty file modified .github/bin/s3/setup-empty-s3-bucket.sh
100755 → 100644
Empty file.
1 change: 1 addition & 0 deletions .mvn/jvm.config
@@ -9,3 +9,4 @@
--add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED
--add-opens=jdk.compiler/com.sun.tools.javac.code=ALL-UNNAMED
--add-opens=jdk.compiler/com.sun.tools.javac.comp=ALL-UNNAMED
--add-opens=java.base/java.nio=ALL-UNNAMED
Empty file modified core/docker/bin/health-check
100755 → 100644
Empty file.
Empty file modified core/docker/bin/run-trino
100755 → 100644
Empty file.
Empty file modified core/docker/build.sh
100755 → 100644
Empty file.
Empty file modified core/trino-main/bin/check_webui.sh
100755 → 100644
Empty file.
Empty file modified core/trino-main/src/test/resources/cert/generate.sh
100755 → 100644
Empty file.
6 changes: 6 additions & 0 deletions core/trino-server/src/main/provisio/trino.xml
@@ -289,6 +289,12 @@
<unpack />
</artifact>
</artifactSet>

<artifactSet to="plugin/snowflake">
<artifact id="${project.groupId}:trino-snowflake:zip:${project.version}">
<unpack />
</artifact>
</artifactSet>

<artifactSet to="plugin/sqlserver">
<artifact id="${project.groupId}:trino-sqlserver:zip:${project.version}">
Empty file modified docs/build
100755 → 100644
Empty file.
47 changes: 47 additions & 0 deletions docs/src/main/sphinx/connector.rst
@@ -0,0 +1,47 @@
**********
Connectors
**********

This chapter describes the connectors available in Trino to access data
from different data sources.

.. toctree::
    :maxdepth: 1

    Accumulo <connector/accumulo>
    Atop <connector/atop>
    BigQuery <connector/bigquery>
    Black Hole <connector/blackhole>
    Cassandra <connector/cassandra>
    ClickHouse <connector/clickhouse>
    Delta Lake <connector/delta-lake>
    Druid <connector/druid>
    Elasticsearch <connector/elasticsearch>
    Google Sheets <connector/googlesheets>
    Hive <connector/hive>
    Hudi <connector/hudi>
    Iceberg <connector/iceberg>
    Ignite <connector/ignite>
    JMX <connector/jmx>
    Kafka <connector/kafka>
    Kinesis <connector/kinesis>
    Kudu <connector/kudu>
    Local File <connector/localfile>
    MariaDB <connector/mariadb>
    Memory <connector/memory>
    MongoDB <connector/mongodb>
    MySQL <connector/mysql>
    Oracle <connector/oracle>
    Phoenix <connector/phoenix>
    Pinot <connector/pinot>
    PostgreSQL <connector/postgresql>
    Prometheus <connector/prometheus>
    Redis <connector/redis>
    Redshift <connector/redshift>
    SingleStore <connector/memsql>
    Snowflake <connector/snowflake>
    SQL Server <connector/sqlserver>
    System <connector/system>
    Thrift <connector/thrift>
    TPCDS <connector/tpcds>
    TPCH <connector/tpch>
107 changes: 107 additions & 0 deletions docs/src/main/sphinx/connector/snowflake.rst
@@ -0,0 +1,107 @@
===================
Snowflake connector
===================

.. raw:: html

    <img src="../_static/img/snowflake.png" class="connector-logo">

The Snowflake connector allows querying and creating tables in an
external `Snowflake <https://www.snowflake.com/>`_ account. This can be used to join data between
different systems like Snowflake and Hive, or between two different
Snowflake accounts.

Configuration
-------------

To configure the Snowflake connector, create a catalog properties file
in ``etc/catalog`` named, for example, ``example.properties``, to
mount the Snowflake connector as the ``example`` catalog.
Create the file with the following contents, replacing the
connection properties as appropriate for your setup:

.. code-block:: none

    connector.name=snowflake
    connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
    connection-user=root
    connection-password=secret
    snowflake.account=account
    snowflake.database=database
    snowflake.role=role
    snowflake.warehouse=warehouse

Arrow serialization support
^^^^^^^^^^^^^^^^^^^^^^^^^^^

This is an experimental feature that adds support for using Apache Arrow
as the serialization format when reading from Snowflake. Note the
following caveats:

* Apache Arrow serialization is disabled by default. To enable it, add
  ``--add-opens=java.base/java.nio=ALL-UNNAMED`` to the Trino
  :ref:`jvm-config`.


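As a sketch, the JVM config with Arrow serialization enabled could look like
the following. Only the ``--add-opens`` line is required for this feature; the
surrounding options are illustrative defaults, not part of this change:

.. code-block:: none

    -server
    -Xmx16G
    -XX:+UseG1GC
    --add-opens=java.base/java.nio=ALL-UNNAMED
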
Multiple Snowflake databases or accounts
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The Snowflake connector can only access a single database within
a Snowflake account. Thus, if you have multiple Snowflake databases,
or want to connect to multiple Snowflake accounts, you must configure
multiple instances of the Snowflake connector.
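For example, two catalog files such as ``etc/catalog/sales.properties`` and
``etc/catalog/analytics.properties`` (file names, accounts, and database names
here are illustrative) each configure an independent instance of the connector:

.. code-block:: none

    # etc/catalog/sales.properties
    connector.name=snowflake
    connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
    snowflake.database=sales_db

    # etc/catalog/analytics.properties
    connector.name=snowflake
    connection-url=jdbc:snowflake://<account>.snowflakecomputing.com
    snowflake.database=analytics_db

Each file is mounted as a catalog named after the file, so a single query can
reference tables from both catalogs.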

.. _snowflake-type-mapping:

Type mapping
------------

Trino supports the following Snowflake data types:

================================== ===============================
Snowflake Type Trino Type
================================== ===============================
``boolean`` ``boolean``
``tinyint`` ``bigint``
``smallint`` ``bigint``
``byteint`` ``bigint``
``int`` ``bigint``
``integer`` ``bigint``
``bigint`` ``bigint``
``float`` ``real``
``real`` ``real``
``double`` ``double``
``decimal`` ``decimal(P,S)``
``varchar(n)`` ``varchar(n)``
``char(n)`` ``varchar(n)``
``binary(n)`` ``varbinary``
``varbinary`` ``varbinary``
``date`` ``date``
``time`` ``time``
``timestampntz`` ``timestamp``
``timestamptz``                    ``timestamp with time zone``
``timestampltz``                   ``timestamp with time zone``
================================== ===============================

See the `Snowflake data types
<https://docs.snowflake.com/en/sql-reference/intro-summary-data-types.html>`_
documentation for a complete list.
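To illustrate the mapping, a hypothetical Snowflake table with ``NUMBER(10,2)``
and ``TIMESTAMP_NTZ`` columns surfaces in Trino with the mapped types (catalog,
schema, table, and column names are assumptions for the example):

.. code-block:: sql

    DESCRIBE example.public.orders;
    -- price       decimal(10,2)
    -- created_at  timestamp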

.. _snowflake-sql-support:

SQL support
-----------

The connector provides read and write access to data and metadata in
a Snowflake database. In addition to the :ref:`globally available
<sql-globally-available>` and :ref:`read operation <sql-read-operations>`
statements, the connector supports the following features:

* :doc:`/sql/insert`
* :doc:`/sql/delete`
* :doc:`/sql/truncate`
* :doc:`/sql/create-table`
* :doc:`/sql/create-table-as`
* :doc:`/sql/drop-table`
* :doc:`/sql/alter-table`
* :doc:`/sql/create-schema`
* :doc:`/sql/drop-schema`
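
For instance, the write statements above allow a session like the following
against a catalog named ``example`` (schema, table, and column names are
illustrative):

.. code-block:: sql

    CREATE SCHEMA example.sandbox;

    CREATE TABLE example.sandbox.orders (
        id BIGINT,
        total DECIMAL(10, 2)
    );

    INSERT INTO example.sandbox.orders VALUES (1, 19.99);

    DELETE FROM example.sandbox.orders WHERE id = 1;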
Empty file modified mvnw
100755 → 100644
Empty file.
Empty file modified plugin/trino-bigquery/src/test/resources/proxy/generate.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-hive-hadoop2/bin/run_hive_abfs_access_key_tests.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-hive-hadoop2/bin/run_hive_abfs_oauth_tests.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-hive-hadoop2/bin/run_hive_adl_tests.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-hive-hadoop2/bin/run_hive_s3_tests.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-hive-hadoop2/bin/run_hive_tests.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-hive-hadoop2/bin/run_hive_wasb_tests.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-hive-hadoop2/bin/start_hive.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-hive-hadoop2/conf/files/hadoop-put.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-oracle/src/test/resources/restart.sh
100755 → 100644
Empty file.
Empty file modified plugin/trino-pinot/pom.xml
100755 → 100644
Empty file.
@@ -35,4 +35,4 @@ protected SqlExecutor onRemoteDatabase()
{
return server::execute;
}
}
}
@@ -408,4 +408,4 @@ private static boolean isGap(ZoneId zone, LocalDate date)
{
return zone.getRules().getValidOffsets(date.atStartOfDay()).isEmpty();
}
}
}
@@ -74,4 +74,4 @@ public void close()
{
execute("DROP SCHEMA IF EXISTS tpch");
}
}
}
Empty file modified testing/bin/ptl
100755 → 100644
Empty file.
Empty file modified testing/trino-product-tests-launcher/bin/run-launcher
100755 → 100644
Empty file.
