add e2e tests for dataplex metadata update #1140

Open · wants to merge 1 commit into base: develop
2 changes: 1 addition & 1 deletion .github/workflows/e2e.yml
@@ -40,7 +40,7 @@ jobs:
)
strategy:
matrix:
-      tests: [bigquery, common, gcs, pubsub, spanner, gcsdelete, gcsmove, bigqueryexecute]
+      tests: [bigquery, common, gcs, pubsub, spanner, gcsdelete, gcsmove, bigqueryexecute, dataplex]
fail-fast: false
steps:
# Pinned 1.0.0 version
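For context, with this one-line change the job strategy in the e2e workflow would read roughly as follows (a sketch; the surrounding job configuration is omitted):

```yaml
strategy:
  matrix:
    # Each entry fans out into its own job run; the new dataplex entry
    # executes the Dataplex feature files added in this PR.
    tests: [bigquery, common, gcs, pubsub, spanner, gcsdelete, gcsmove,
            bigqueryexecute, dataplex]
  # Keep the other plugin suites running even if one suite fails.
  fail-fast: false
```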
38 changes: 38 additions & 0 deletions src/e2e-test/features/dataplex/sink/BigQueryToDataplex.feature
@@ -0,0 +1,38 @@
@Dataplex_Sink
Feature: Dataplex sink - Verification of BQ source to Dataplex sink with update metadata enabled

@Dataplex_SINK_TEST @GCS_SINK_TEST @BQ_SOURCE_TEST @Dataplex_Sink_Required
Scenario: Validate successful records transfer from BigQuery to Dataplex with update metadata enabled
Given Open Datafusion Project to configure pipeline
When Source is BigQuery
When Sink is Dataplex
Then Connect source as "BigQuery" and sink as "Dataplex" to establish connection
Then Open BigQuery source properties
Then Enter BigQuery property reference name
Then Enter BigQuery property projectId "projectId"
Then Enter BigQuery property datasetProjectId "projectId"
Then Enter BigQuery property dataset "dataset"
Then Enter BigQuery source property table name
Then Override Service account details if set in environment variables
Then Enter BiqQuery property encryption key name "cmekBQ" if cmek is enabled
Then Validate output schema with expectedSchema "bqSourceSchema"
Then Validate "BigQuery" plugin properties
Then Close the BigQuery properties
Then Open Dataplex sink properties
Then Override Service account details if set in environment variables
Then Enter the Dataplex mandatory properties
Then Enter the Dataplex sink mandatory properties
Then Enable Metadata Update
Then Validate "Dataplex" plugin properties
Then Close the Dataplex properties
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
@@ -0,0 +1,56 @@
@Dataplex_Sink
Feature: Dataplex sink - Verification of BQ source to Dataplex sink with update metadata enabled

@Dataplex_SINK_TEST @GCS_SINK_TEST @BQ_SOURCE_TEST
Scenario: Validate successful records transfer from BigQuery to Dataplex with update metadata enabled
Given Open Datafusion Project to configure pipeline
When Source is BigQuery
When Sink is Dataplex
Then Connect source as "BigQuery" and sink as "Dataplex" to establish connection
Then Open BigQuery source properties
Then Enter BigQuery property reference name
Then Enter BigQuery property projectId "projectId"
Then Enter BigQuery property datasetProjectId "projectId"
Then Enter BigQuery property dataset "dataset"
Then Enter BigQuery source property table name
Then Override Service account details if set in environment variables
Then Enter BiqQuery property encryption key name "cmekBQ" if cmek is enabled
Then Validate output schema with expectedSchema "bqSourceSchema"
Then Validate "BigQuery" plugin properties
Then Close the BigQuery properties
Then Open Dataplex sink properties
Then Override Service account details if set in environment variables
Then Enter the Dataplex mandatory properties
Then Enter the Dataplex sink mandatory properties
Then Enable Metadata Update
Then Validate "Dataplex" plugin properties
Then Close the Dataplex properties
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Given Open Datafusion Project to configure pipeline
When Source is Dataplex
When Sink is Dataplex
Then Connect source as "Dataplex" and sink as "Dataplex" to establish connection
Then Open Dataplex source properties
Then Enter the Dataplex mandatory properties
Then Enter the Dataplex source mandatory properties
Then Override Service account details if set in environment variables
Then Validate output schema with expectedSchema "dataplexSourceSchema"
Then Validate "Dataplex" plugin properties
Then Close the Dataplex properties
Then Open Dataplex sink properties
Then Override Service account details if set in environment variables
Then Enter the Dataplex mandatory properties
Then Enter the Dataplex sink mandatory properties
Then Enable Metadata Update
Then Remove "ts" column from output schema
Then Validate "Dataplex" plugin properties
Then Close the Dataplex properties
Then Save and Deploy Pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
@@ -0,0 +1,38 @@
/*
* Copyright © 2022 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.dataplex.runners.sinkrunner;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;


/**
* Test Runner to execute Dataplex sink cases.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
  features = {"src/e2e-test/features"},
  glue = {"io.cdap.plugin.dataplex.stepsdesign", "io.cdap.plugin.gcs.stepsdesign",
          "io.cdap.plugin.bigquery.stepsdesign", "stepsdesign", "io.cdap.plugin.common.stepsdesign"},
  tags = {"@Dataplex_Sink and not @ignore"},
  monochrome = true,
  plugin = {"pretty", "html:target/cucumber-html-report/dataplex-sink",
            "json:target/cucumber-reports/cucumber-dataplex-sink.json",
            "junit:target/cucumber-reports/cucumber-dataplex-sink.xml"}
)
public class TestRunner {
}
@@ -0,0 +1,37 @@
/*
* Copyright © 2022 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.dataplex.runners.sinkrunner;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
* Test Runner to execute only the required Dataplex sink plugin test cases.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
  features = {"src/e2e-test/features"},
  glue = {"io.cdap.plugin.dataplex.stepsdesign", "io.cdap.plugin.gcs.stepsdesign",
          "io.cdap.plugin.bigquery.stepsdesign", "stepsdesign", "io.cdap.plugin.common.stepsdesign"},
  tags = {"@Dataplex_Sink_Required and not @ignore"},
  monochrome = true,
  plugin = {"pretty", "html:target/cucumber-html-report/dataplex-sink-required",
            "json:target/cucumber-reports/cucumber-dataplex-sink-required.json",
            "junit:target/cucumber-reports/cucumber-dataplex-sink-required.xml"}
)
public class TestRunnerRequired {
}
@@ -0,0 +1,20 @@
/*
* Copyright © 2022 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/

/**
* Package contains the runner for the Dataplex sink features.
*/
package io.cdap.plugin.dataplex.runners.sinkrunner;
@@ -0,0 +1,38 @@
/*
* Copyright © 2022 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.dataplex.stepsdesign;

import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cdap.plugin.utils.DataplexHelper;
import io.cdap.plugin.utils.E2EHelper;
import io.cucumber.java.en.Then;

import java.io.IOException;

/**
* Dataplex related common stepDesigns.
*/
public class DataplexBase implements E2EHelper {

  @Then("Enter the Dataplex mandatory properties")
  public void enterTheDataplexMandatoryProperties() throws IOException {
    DataplexHelper.enterReferenceName();
    DataplexHelper.enterProjectId();
    DataplexHelper.enterDataplexProperty(
      "location", PluginPropertyUtils.pluginProp("dataplexDefaultLocation"));
    DataplexHelper.enterDataplexProperty("lake", PluginPropertyUtils.pluginProp("dataplexDefaultLake"));
    DataplexHelper.enterDataplexProperty("zone", PluginPropertyUtils.pluginProp("dataplexDefaultZone"));
  }
}
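The mandatory properties above are resolved by key from the framework's plugin-properties resource via `PluginPropertyUtils.pluginProp`. A minimal stand-in for that lookup, assuming a simple key/value properties file (the keys below are real, the values are illustrative assumptions):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

public class PluginProps {
    // Hypothetical stand-in for PluginPropertyUtils.pluginProp(...): the real
    // utility reads the framework's plugin-properties resource; an inline
    // string with assumed sample values stands in for that file here.
    static final Properties PROPS = load(
        "dataplexDefaultLocation=us-central1\n"
      + "dataplexDefaultLake=e2e-lake\n"
      + "dataplexDefaultZone=e2e-zone\n");

    static Properties load(String text) {
        Properties props = new Properties();
        try {
            props.load(new StringReader(text));
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
        return props;
    }

    // Fails fast on a missing key so a misconfigured test run is obvious.
    static String pluginProp(String key) {
        String value = PROPS.getProperty(key);
        if (value == null) {
            throw new IllegalArgumentException("Missing plugin property: " + key);
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(pluginProp("dataplexDefaultLake"));
    }
}
```

Keeping location, lake, and zone in one properties file means all Dataplex scenarios target the same pre-provisioned resources without hardcoding them in step definitions.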
@@ -0,0 +1,67 @@
/*
* Copyright © 2022 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.dataplex.stepsdesign;

import io.cdap.e2e.pages.actions.CdfPluginPropertiesActions;
import io.cdap.e2e.pages.actions.CdfStudioActions;
import io.cdap.e2e.utils.PluginPropertyUtils;
import io.cdap.plugin.common.stepsdesign.TestSetupHooks;
import io.cdap.plugin.utils.DataplexHelper;
import io.cdap.plugin.utils.E2EHelper;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

import java.io.IOException;


/**
* Dataplex Sink related stepDesigns.
*/
public class DataplexSink implements E2EHelper {

  @When("Sink is Dataplex")
  public void sinkIsDataplex() {
    CdfStudioActions.clickSink();
    selectSinkPlugin("Dataplex");
  }

  @Then("Open Dataplex sink properties")
  public void openDataplexSinkProperties() {
    openSinkPluginProperties("Dataplex");
  }

  @Then("Close the Dataplex properties")
  public void closeTheDataplexProperties() {
    CdfPluginPropertiesActions.clickCloseButton();
  }

  @Then("Enter the Dataplex sink mandatory properties")
  public void enterTheDataplexSinkMandatoryProperties() throws IOException {
    DataplexHelper.enterDataplexProperty("asset", PluginPropertyUtils.pluginProp("dataplexDefaultAsset"));
    DataplexHelper.setAssetType("STORAGE_BUCKET");
    DataplexHelper.enterDataplexProperty("table", TestSetupHooks.gcsTargetBucketName);
  }

  @Then("Enable Metadata Update")
  public void enableMetadataUpdate() {
    DataplexHelper.toggleMetadataUpdate();
  }

  @Then("Remove {string} column from output schema")
  public void removeColumnFromOutputSchema(String fieldName) throws InterruptedException {
    DataplexHelper.deleteSchemaField(fieldName);
  }
}
@@ -0,0 +1,46 @@
/*
* Copyright © 2022 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.dataplex.stepsdesign;

import io.cdap.plugin.common.stepsdesign.TestSetupHooks;
import io.cdap.plugin.utils.DataplexHelper;
import io.cdap.plugin.utils.E2EHelper;
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

import java.io.IOException;

/**
* Dataplex Source related stepDesigns.
*/
public class DataplexSource implements E2EHelper {

  @When("Source is Dataplex")
  public void sourceIsDataplex() {
    selectSourcePlugin("Dataplex");
  }

  @Then("Open Dataplex source properties")
  public void openDataplexSourceProperties() {
    openSourcePluginProperties("Dataplex");
  }

  @Then("Enter the Dataplex source mandatory properties")
  public void enterTheDataplexSourceMandatoryProperties() throws IOException {
    DataplexHelper.enterDataplexProperty(
      "entity", TestSetupHooks.gcsTargetBucketName.replaceAll("[^a-zA-Z0-9_]", "_"));
  }
}
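The `replaceAll` call above derives the Dataplex entity name from the GCS target bucket by replacing every character outside letters, digits, and underscores (hyphens in particular, which are common in bucket names) with an underscore. Isolated, the transformation looks like this (`e2e-test-bucket-123` is an illustrative bucket name, not one from the suite):

```java
public class EntityName {
    // Mirrors the sanitization in DataplexSource: only [a-zA-Z0-9_] survives,
    // everything else in the bucket name becomes an underscore.
    static String toEntityName(String bucketName) {
        return bucketName.replaceAll("[^a-zA-Z0-9_]", "_");
    }

    public static void main(String[] args) {
        // A bucket like "e2e-test-bucket-123" maps to "e2e_test_bucket_123".
        System.out.println(toEntityName("e2e-test-bucket-123"));
    }
}
```

This keeps the second scenario self-contained: the sink writes to the bucket created in test setup, and the follow-up Dataplex source reads back the entity that Dataplex discovery derives from that same bucket name.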
@@ -0,0 +1,20 @@
/*
* Copyright © 2022 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/

/**
* This package contains stepDesigns for Dataplex Plugin.
*/
package io.cdap.plugin.dataplex.stepsdesign;