
Conversation

radhikaathalye-db

@radhikaathalye-db radhikaathalye-db commented Sep 16, 2025

Changes

What does this PR do?

Adds a new method to the DashboardManager class that uploads a DuckDB extract to a Unity Catalog (UC) volume, along with tests covering the new behavior.

Relevant implementation details

Caveats/things to watch out for when reviewing:

Linked issues

Resolves #..

Functionality

  • added relevant user documentation
  • added new CLI command
  • modified existing command: databricks labs lakebridge ...
  • ... +add your own

Tests

  • manually tested
  • added unit tests
  • added integration tests


github-actions bot commented Sep 16, 2025

❌ 20/27 passed, 7 failed, 1m7s total

❌ test_upload_duckdb_to_uc_volume_success: TypeError: DashboardManager.__init__() missing 1 required positional argument: 'dashboard_deployer' (106ms)
TypeError: DashboardManager.__init__() missing 1 required positional argument: 'dashboard_deployer'
[gw0] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_upload_duckdb_to_uc_volume_failure: TypeError: DashboardManager.__init__() missing 1 required positional argument: 'dashboard_deployer' (1ms)
TypeError: DashboardManager.__init__() missing 1 required positional argument: 'dashboard_deployer'
[gw0] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_upload_duckdb_to_uc_volume_file_not_found: TypeError: DashboardManager.__init__() missing 1 required positional argument: 'dashboard_deployer' (113ms)
TypeError: DashboardManager.__init__() missing 1 required positional argument: 'dashboard_deployer'
[gw1] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_upload_duckdb_to_uc_volume_invalid_volume_path: TypeError: DashboardManager.__init__() missing 1 required positional argument: 'dashboard_deployer' (1ms)
TypeError: DashboardManager.__init__() missing 1 required positional argument: 'dashboard_deployer'
[gw1] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
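All four failures above are the same `TypeError`: the tests construct `DashboardManager` without the `dashboard_deployer` argument its constructor now requires. A minimal sketch of one common fix, passing a mock for the new dependency (the class and argument names come from the error messages; the constructor body here is a hypothetical stand-in, not the real implementation):

```python
from unittest.mock import MagicMock

class DashboardManager:
    """Hypothetical stand-in mirroring the constructor shape implied by the failures."""
    def __init__(self, ws, dashboard_deployer):
        self._ws = ws
        self._dashboard_deployer = dashboard_deployer

# In the tests, a mock can satisfy the new required positional argument:
dashboard_deployer = MagicMock()
manager = DashboardManager(ws=MagicMock(), dashboard_deployer=dashboard_deployer)
```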
❌ test_transpiles_all_dbt_project_files: ValueError: No such transpiler: Morpheus (688ms)
ValueError: No such transpiler: Morpheus
[gw4] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
20:06 ERROR [databricks.labs.lakebridge.install] Error while fetching maven metadata: com.databricks.labs:databricks-morph-plugin
Traceback (most recent call last):
  File "/home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/install.py", line 310, in get_current_maven_artifact_version
    with request.urlopen(url) as server:
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 216, in urlopen
    return opener.open(url, data, timeout)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 525, in open
    response = meth(req, response)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 634, in http_response
    response = self.parent.error(
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 563, in error
    return self._call_chain(*args)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 496, in _call_chain
    result = func(*args)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 643, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
20:06 WARNING [databricks.labs.lakebridge.install] Could not determine the latest version of Databricks morpheus transpiler
20:06 ERROR [databricks.labs.lakebridge.install] Failed to install transpiler: Databricks {self._product_name} transpiler
20:06 ERROR [databricks.labs.lakebridge.install] Error while fetching maven metadata: com.databricks.labs:databricks-morph-plugin
Traceback (most recent call last):
  File "/home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/install.py", line 310, in get_current_maven_artifact_version
    with request.urlopen(url) as server:
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 216, in urlopen
    return opener.open(url, data, timeout)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 525, in open
    response = meth(req, response)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 634, in http_response
    response = self.parent.error(
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 563, in error
    return self._call_chain(*args)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 496, in _call_chain
    result = func(*args)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 643, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
20:06 WARNING [databricks.labs.lakebridge.install] Could not determine the latest version of Databricks morpheus transpiler
20:06 ERROR [databricks.labs.lakebridge.install] Failed to install transpiler: Databricks {self._product_name} transpiler
[gw4] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_sql_file: ValueError: No such transpiler: Morpheus (649ms)
ValueError: No such transpiler: Morpheus
[gw4] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
20:06 ERROR [databricks.labs.lakebridge.install] Error while fetching maven metadata: com.databricks.labs:databricks-morph-plugin
Traceback (most recent call last):
  File "/home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/install.py", line 310, in get_current_maven_artifact_version
    with request.urlopen(url) as server:
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 216, in urlopen
    return opener.open(url, data, timeout)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 525, in open
    response = meth(req, response)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 634, in http_response
    response = self.parent.error(
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 563, in error
    return self._call_chain(*args)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 496, in _call_chain
    result = func(*args)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 643, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
20:06 WARNING [databricks.labs.lakebridge.install] Could not determine the latest version of Databricks morpheus transpiler
20:06 ERROR [databricks.labs.lakebridge.install] Failed to install transpiler: Databricks {self._product_name} transpiler
20:06 ERROR [databricks.labs.lakebridge.install] Error while fetching maven metadata: com.databricks.labs:databricks-morph-plugin
Traceback (most recent call last):
  File "/home/runner/work/lakebridge/lakebridge/src/databricks/labs/lakebridge/install.py", line 310, in get_current_maven_artifact_version
    with request.urlopen(url) as server:
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 216, in urlopen
    return opener.open(url, data, timeout)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 525, in open
    response = meth(req, response)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 634, in http_response
    response = self.parent.error(
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 563, in error
    return self._call_chain(*args)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 496, in _call_chain
    result = func(*args)
  File "/opt/hostedtoolcache/Python/3.10.18/x64/lib/python3.10/urllib/request.py", line 643, in http_error_default
    raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden
20:06 WARNING [databricks.labs.lakebridge.install] Could not determine the latest version of Databricks morpheus transpiler
20:06 ERROR [databricks.labs.lakebridge.install] Failed to install transpiler: Databricks {self._product_name} transpiler
[gw4] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
❌ test_transpile_sql_file: AssertionError (9.31s)
AssertionError
[gw2] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python
20:07 INFO [databricks.labs.lakebridge.install] Installing Databricks bladebridge transpiler (v0.1.16)
20:07 DEBUG [databricks.labs.lakebridge.install] Created virtual environment with context: namespace(env_dir=PosixPath('/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv'), env_name='.venv', prompt='(bladebridge) ', executable='/home/runner/work/lakebridge/lakebridge/.venv/bin/python', python_dir='/home/runner/work/lakebridge/lakebridge/.venv/bin', python_exe='python', inc_path='/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/include', bin_path='/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin', bin_name='bin', env_exe='/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python', env_exec_cmd='/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python')
20:07 INFO [databricks.labs.lakebridge.install] Successfully installed bladebridge transpiler (v0.1.16)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Detected virtual environment to use at: /tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Using PATH for launching LSP server: /tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin:/home/runner/work/lakebridge/lakebridge/.venv/bin:/opt/hostedtoolcache/Python/3.10.18/x64/bin:/opt/hostedtoolcache/Python/3.10.18/x64:/snap/bin:/home/runner/.local/bin:/opt/pipx_bin:/home/runner/.cargo/bin:/home/runner/.config/composer/vendor/bin:/usr/local/.ghcup/bin:/home/runner/.dotnet/tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Starting LSP engine: /tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python ['-m', 'databricks.labs.bladebridge.server', '--log_level=NOTSET'] (cwd=/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] LSP init params: InitializeParams(capabilities=ClientCapabilities(workspace=None, text_document=None, notebook_document=None, window=None, general=None, experimental=None), process_id=3785, client_info=ClientInfo(name='lakebridge', version='0.10.8+2020250929200710'), locale=None, root_path=None, root_uri='file:///home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration', initialization_options={'remorph': {'source-dialect': 'teradata'}, 'options': None, 'custom': {}}, trace=None, work_done_token=None, workspace_folders=None)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Registered capability: document/transpileToDatabricks
20:07 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Starting to process input directory: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiling files from folder: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration -> /tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/output
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Processing next 2 files: [PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql')]
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Started processing file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished transpiling file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (result: TranspileResult(transpiled_code='select cole(hello) world from table;\n', success_count=1, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-H', '66964887063aa087674f2204f5917cf668dae6ef']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=2, character=0)))]))
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 1)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Started processing file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished transpiling file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (result: TranspileResult(transpiled_code="CREATE TABLE REF_TABLE\n    ,NO FALLBACK\n(\n    col1    BYTEINT NOT NULL,\n    col2    SMALLINT NOT NULL,\n    col3    INTEGER NOT NULL,\n    col4    BIGINT NOT NULL,\n    col5    DECIMAL(10,2) NOT NULL,\n    col6    DECIMAL(18,4) NOT NULL,\n    col7    TIMESTAMP(1) NOT NULL,\n    col8    TIME,\n    col9    TIMESTAMP(5) WITH TIME ZONE NOT NULL,\n    col10   CHAR(01) NOT NULL,\n    col11   CHAR(04) NOT NULL,\n    col12   CHAR(4),\n    col13   DECIMAL(10,0) NOT NULL,\n    col14   DECIMAL(18,6) NOT NULL,\n    col15   DECIMAL(18,1) NOT NULL DEFAULT 0.0,\n    col16   DATE FORMAT 'YY/MM/DD',\n    col17   VARCHAR(30) NOT CASESPECIFIC,\n    col18   FLOAT NOT NULL\n    )\n    UNIQUE PRIMARY INDEX (col1, col3);\n", success_count=1, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/create_ddl.sql', '-s', 'TERADATA', '-H', 'bfa05ae4a572fef81bb5faa14c54a59e68dae6f0']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=24, character=0)))]))
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (errors: 1)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiler results: TranspileStatus(file_list=[PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql')], no_of_transpiled_queries=2, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-H', '66964887063aa087674f2204f5917cf668dae6ef']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=2, character=0))), TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/create_ddl.sql', '-s', 'TERADATA', '-H', 'bfa05ae4a572fef81bb5faa14c54a59e68dae6f0']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=24, character=0)))])
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 0
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiler Status: {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
20:07 INFO [databricks.labs.lakebridge.install] Installing Databricks bladebridge transpiler (v0.1.16)
20:07 DEBUG [databricks.labs.lakebridge.install] Created virtual environment with context: namespace(env_dir=PosixPath('/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv'), env_name='.venv', prompt='(bladebridge) ', executable='/home/runner/work/lakebridge/lakebridge/.venv/bin/python', python_dir='/home/runner/work/lakebridge/lakebridge/.venv/bin', python_exe='python', inc_path='/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/include', bin_path='/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin', bin_name='bin', env_exe='/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python', env_exec_cmd='/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python')
20:07 INFO [databricks.labs.lakebridge.install] Successfully installed bladebridge transpiler (v0.1.16)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Detected virtual environment to use at: /tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Using PATH for launching LSP server: /tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin:/home/runner/work/lakebridge/lakebridge/.venv/bin:/opt/hostedtoolcache/Python/3.10.18/x64/bin:/opt/hostedtoolcache/Python/3.10.18/x64:/snap/bin:/home/runner/.local/bin:/opt/pipx_bin:/home/runner/.cargo/bin:/home/runner/.config/composer/vendor/bin:/usr/local/.ghcup/bin:/home/runner/.dotnet/tools:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Starting LSP engine: /tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/bin/python ['-m', 'databricks.labs.bladebridge.server', '--log_level=NOTSET'] (cwd=/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] LSP init params: InitializeParams(capabilities=ClientCapabilities(workspace=None, text_document=None, notebook_document=None, window=None, general=None, experimental=None), process_id=3785, client_info=ClientInfo(name='lakebridge', version='0.10.8+2020250929200710'), locale=None, root_path=None, root_uri='file:///home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration', initialization_options={'remorph': {'source-dialect': 'teradata'}, 'options': None, 'custom': {}}, trace=None, work_done_token=None, workspace_folders=None)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.lsp.lsp_engine] Registered capability: document/transpileToDatabricks
20:07 INFO [databricks.labs.lakebridge.helpers.db_sql] Using SQL backend with warehouse_id: DATABRICKS_WAREHOUSE_ID
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] SQL Backend used for query validation: StatementExecutionBackend
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Starting to process input directory: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiling files from folder: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration -> /tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/output
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Processing next 2 files: [PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql')]
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Started processing file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished transpiling file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (result: TranspileResult(transpiled_code='select cole(hello) world from table;\n', success_count=1, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-H', '66964887063aa087674f2204f5917cf668dae6ef']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=2, character=0)))]))
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql (errors: 1)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Started processing file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Finished transpiling file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (result: TranspileResult(transpiled_code="CREATE TABLE REF_TABLE\n    ,NO FALLBACK\n(\n    col1    BYTEINT NOT NULL,\n    col2    SMALLINT NOT NULL,\n    col3    INTEGER NOT NULL,\n    col4    BIGINT NOT NULL,\n    col5    DECIMAL(10,2) NOT NULL,\n    col6    DECIMAL(18,4) NOT NULL,\n    col7    TIMESTAMP(1) NOT NULL,\n    col8    TIME,\n    col9    TIMESTAMP(5) WITH TIME ZONE NOT NULL,\n    col10   CHAR(01) NOT NULL,\n    col11   CHAR(04) NOT NULL,\n    col12   CHAR(4),\n    col13   DECIMAL(10,0) NOT NULL,\n    col14   DECIMAL(18,6) NOT NULL,\n    col15   DECIMAL(18,1) NOT NULL DEFAULT 0.0,\n    col16   DATE FORMAT 'YY/MM/DD',\n    col17   VARCHAR(30) NOT CASESPECIFIC,\n    col18   FLOAT NOT NULL\n    )\n    UNIQUE PRIMARY INDEX (col1, col3);\n", success_count=1, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/create_ddl.sql', '-s', 'TERADATA', '-H', 'bfa05ae4a572fef81bb5faa14c54a59e68dae6f0']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=24, character=0)))]))
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] Processed file: /home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql (errors: 1)
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiler results: TranspileStatus(file_list=[PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql')], no_of_transpiled_queries=2, error_list=[TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/dummy_function.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/dummy_function.sql', '-s', 'TERADATA', '-H', '66964887063aa087674f2204f5917cf668dae6ef']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=2, character=0))), TranspileError(code='FAILURE', kind=<ErrorKind.PARSING: 'PARSING'>, severity=<ErrorSeverity.ERROR: 'ERROR'>, path=PosixPath('/home/runner/work/lakebridge/lakebridge/tests/resources/functional/teradata/integration/create_ddl.sql'), message="Command '['/tmp/pytest-of-runner/pytest-0/popen-gw2/test_transpile_sql_file0/labs/remorph-transpilers/bladebridge/lib/.venv/lib/python3.10/site-packages/databricks/labs/bladebridge/Converter/bin/Linux/dbxconv', 'SQL', '-u', 'base_teradata2databricks_sql.json', '-n', 'transpiled', '-i', 'originals/create_ddl.sql', '-s', 'TERADATA', '-H', 'bfa05ae4a572fef81bb5faa14c54a59e68dae6f0']' returned non-zero exit status 255.", range=CodeRange(start=CodePosition(line=0, character=0), end=CodePosition(line=24, character=0)))])
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] SQL validation errors: 0
20:07 DEBUG [databricks.labs.lakebridge.transpiler.execute] Transpiler Status: {'total_files_processed': 2, 'total_queries_processed': 2, 'analysis_error_count': 0, 'parsing_error_count': 2, 'validation_error_count': 0, 'generation_error_count': 0, 'error_log_file': None}
20:07 INFO [databricks.labs.lakebridge.transpiler.execute] Done transpiling.
[gw2] linux -- Python 3.10.18 /home/runner/work/lakebridge/lakebridge/.venv/bin/python

Running from acceptance #2412

@gueniai gueniai added the feat/profiler Issues related to profilers label Sep 16, 2025

# Validate inputs
if not os.path.exists(local_file_path):
    print(f"Error: Local file not found: {local_file_path}")
Contributor

We'll want to use logger.error() statements here as opposed to printing to the console. Here is a good reference: https://github.com/databrickslabs/lakebridge/blob/main/src/databricks/labs/lakebridge/assessments/configure_assessment.py#L43
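A minimal sketch of that change, using the standard `logging` module in place of `print()` (the function name and signature are taken from the snippet under review; the body is illustrative, not the PR's actual implementation):

```python
import logging
import os

logger = logging.getLogger(__name__)

def upload_duckdb_to_uc_volume(local_file_path: str, volume_path: str) -> bool:
    """Validate inputs, reporting failures through the module logger."""
    if not os.path.exists(local_file_path):
        logger.error("Local file not found: %s", local_file_path)
        return False
    # ... proceed with the upload ...
    return True
```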

url = f"{workspace_url}/api/2.0/fs/files{volume_path}"

with open(local_file_path, 'rb') as f:
    response = requests.put(url, headers=headers, data=f)
Contributor

@radhikaathalye-db could you pull the upstream changes from the branch feature/add_local_dashboards. We'll want to use the Python SDK (WorkspaceClient) to upload files to the UC volume vs. using the REST API. Thanks!
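A rough sketch of the SDK-based approach, assuming the `databricks-sdk` package and configured workspace auth (the `is_valid_uc_volume_path` helper is hypothetical; the PR's actual validation may differ):

```python
def is_valid_uc_volume_path(volume_path: str) -> bool:
    """UC volume files live under /Volumes/<catalog>/<schema>/<volume>/...
    (hypothetical helper for illustration)."""
    parts = volume_path.strip("/").split("/")
    return len(parts) >= 4 and parts[0] == "Volumes" and all(parts[1:])

def upload_to_uc_volume(local_file_path: str, volume_path: str) -> None:
    """Upload via the SDK Files API instead of a hand-rolled REST call."""
    # Deferred import so the module loads even without the SDK installed.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()
    with open(local_file_path, "rb") as f:
        w.files.upload(volume_path, f, overwrite=True)
```

Using `WorkspaceClient` also removes the need to build the `/api/2.0/fs/files` URL and auth headers by hand.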


Labels

feat/profiler Issues related to profilers
stacked PR Should be reviewed, but not merged
