Commit e2e-XmlReader-ITN
AnkitCLI committed May 15, 2024
1 parent 61b1d14 commit 4497389
Showing 10 changed files with 345 additions and 1 deletion.
@@ -0,0 +1,76 @@
# Copyright © 2024 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@XmlReader_Source
Feature: File Sink - Verify XML Reader plugin error scenarios

@XMLREADER_DELETE_TEST @FILE_SINK_TEST
Scenario: Verify Pipeline fails when an invalid pattern is entered
Given Open Datafusion Project to configure pipeline
When Select plugin: "XML Reader" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "XMLReader" and "File" to establish connection
Then Navigate to the properties page of plugin: "XMLReader"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter input plugin property: "path" with value: "xmlTestFile"
Then Enter input plugin property: "nodePath" with value: "node"
Then Select dropdown plugin property: "reprocessingRequired" with option value: "No"
Then Enter input plugin property: "pattern" with value: "invalidPattern"
Then Validate "XMLReader" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Failed"
Then Close the pipeline logs

@XMLREADER_TEST @FILE_SINK_TEST
Scenario: Verify no data is transferred when an invalid node path is entered
Given Open Datafusion Project to configure pipeline
When Select plugin: "XML Reader" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "XMLReader" and "File" to establish connection
Then Navigate to the properties page of plugin: "XMLReader"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter input plugin property: "path" with value: "xmlTestFile"
Then Enter input plugin property: "nodePath" with value: "invalidNode"
Then Select dropdown plugin property: "reprocessingRequired" with option value: "No"
Then Validate "XMLReader" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate OUT record count is equal to IN record count
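The invalid-pattern scenario above hinges on how the source matches file names. A minimal sketch, assuming the plugin treats `pattern` as a Java regex applied to candidate file names (the values `^testxmlfile.xml` and `abcd` come from the plugin-properties file changed in this commit):

```java
import java.util.regex.Pattern;

public class PatternCheck {
    // Returns true when the regex matches somewhere in the file name.
    static boolean matches(String fileName, String regex) {
        return Pattern.compile(regex).matcher(fileName).find();
    }

    public static void main(String[] args) {
        // filePattern from the test properties selects the uploaded file...
        System.out.println(matches("testxmlfile.xml", "^testxmlfile.xml")); // true
        // ...while invalidPattern selects nothing, so the run has no input and fails.
        System.out.println(matches("testxmlfile.xml", "abcd"));             // false
    }
}
```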
113 changes: 113 additions & 0 deletions core-plugins/src/e2e-test/features/xmlReader/XmlReaderToFile.feature
@@ -0,0 +1,113 @@
# Copyright © 2024 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@XmlReader_Source
Feature: File Sink - Verify successful data transfer from XmlReader plugin to File

@XmlReader_Source_Required @XMLREADER_TEST @FILE_SINK_TEST
Scenario: Verify data is transferred from XmlReader to File sink
Given Open Datafusion Project to configure pipeline
When Select plugin: "XML Reader" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "XMLReader" and "File" to establish connection
Then Navigate to the properties page of plugin: "XMLReader"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter input plugin property: "path" with value: "xmlTestFile"
Then Enter input plugin property: "nodePath" with value: "node"
Then Select dropdown plugin property: "reprocessingRequired" with option value: "No"
Then Validate "XMLReader" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate OUT record count is equal to IN record count
Then Validate output file generated by file sink plugin "fileSinkTargetBucket" is equal to expected output file "outputForXMLTest"

@XmlReader_Source_Required @XMLREADER_DELETE_TEST @FILE_SINK_TEST
Scenario: Verify data is transferred from XmlReader to File sink using pattern and delete action
Given Open Datafusion Project to configure pipeline
When Select plugin: "XML Reader" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "XMLReader" and "File" to establish connection
Then Navigate to the properties page of plugin: "XMLReader"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter input plugin property: "path" with value: "xmlTestFile"
Then Enter input plugin property: "nodePath" with value: "node"
Then Select dropdown plugin property: "reprocessingRequired" with option value: "No"
Then Enter input plugin property: "pattern" with value: "filePattern"
Then Select dropdown plugin property: "ActionAfterProcessingFile" with option value: "Delete"
Then Validate "XMLReader" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate OUT record count is equal to IN record count
Then Validate output file generated by file sink plugin "fileSinkTargetBucket" is equal to expected output file "outputForXMLTest"

@XMLREADER_TEST @FILE_SINK_TEST
Scenario: Verify data is transferred from XmlReader to File sink using move action
Given Open Datafusion Project to configure pipeline
When Select plugin: "XML Reader" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "XMLReader" and "File" to establish connection
Then Navigate to the properties page of plugin: "XMLReader"
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
Then Enter input plugin property: "path" with value: "xmlTestFile"
Then Enter input plugin property: "nodePath" with value: "node"
Then Select dropdown plugin property: "reprocessingRequired" with option value: "No"
Then Select dropdown plugin property: "ActionAfterProcessingFile" with option value: "Move"
Then Enter input plugin property: "targetFolder" with value: "folder"
Then Validate "XMLReader" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate OUT record count is equal to IN record count
Then Validate output file generated by file sink plugin "fileSinkTargetBucket" is equal to expected output file "outputForXMLTest"
@@ -285,7 +285,7 @@ public static void createBucketWithFileCSVDataTypeTest1() throws IOException, UR
BeforeActions.scenario.write("CSV Datatype test bucket name - " + fileSourceBucket1);
}

@After(order = 1, value = "@CSV_DATATYPE_TEST1 or @EXCEL_TEST")
@After(order = 1, value = "@CSV_DATATYPE_TEST1 or @EXCEL_TEST or @XMLREADER_TEST or @XMLREADER_DELETE_TEST")
public static void deleteSourceBucketWithFileCSVDataTypeTest1() {
deleteGCSBucket(fileSourceBucket1);
fileSourceBucket1 = StringUtils.EMPTY;
@@ -455,6 +455,12 @@ private static String createGCSBucketWithFile(String filePath) throws IOExceptio
return bucketName;
}

private static String createGCSBucketWithXmlFile(String filePath) throws IOException, URISyntaxException {
String bucketName = StorageClient.createBucket("e2e-test-xml").getName();
StorageClient.uploadObject(bucketName, filePath, filePath);
return bucketName;
}

private static void deleteGCSBucket(String bucketName) {
try {
for (Blob blob : StorageClient.listObjects(bucketName).iterateAll()) {
@@ -494,4 +500,20 @@ public static void createBucketWithExcelFile() throws IOException, URISyntaxExce
PluginPropertyUtils.pluginProp("excelFile"));
BeforeActions.scenario.write("excel test bucket name - " + fileSourceBucket1);
}

@Before(order = 1, value = "@XMLREADER_TEST")
public static void createBucketWithXmlFile() throws IOException, URISyntaxException {
fileSourceBucket1 = createGCSBucketWithXmlFile(PluginPropertyUtils.pluginProp("xmlFile"));
PluginPropertyUtils.addPluginProp("xmlTestFile", "gs://" + fileSourceBucket1 + "/" +
PluginPropertyUtils.pluginProp("xmlFile"));
BeforeActions.scenario.write("xml test bucket name - " + fileSourceBucket1);
}

@Before(order = 1, value = "@XMLREADER_DELETE_TEST")
public static void createBucketWithXmlFileForTestPattern() throws IOException, URISyntaxException {
fileSourceBucket1 = createGCSBucketWithXmlFile(PluginPropertyUtils.pluginProp("xmlFile"));
PluginPropertyUtils.addPluginProp("xmlTestFile", "gs://" + fileSourceBucket1 + "/testdata/xmldata/"
+ "*");
BeforeActions.scenario.write("xml test bucket name - " + fileSourceBucket1);
}
}
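The two `@Before` hooks above publish `xmlTestFile` in different shapes: a single-object URI for `@XMLREADER_TEST`, and a wildcard prefix for `@XMLREADER_DELETE_TEST` so those scenarios must resolve files through the `pattern` property. A minimal sketch of the path composition (hypothetical helper class and bucket name, not plugin code):

```java
public class XmlTestPaths {
    // Single-object path, as built by createBucketWithXmlFile.
    static String singleObjectPath(String bucket, String objectPath) {
        return "gs://" + bucket + "/" + objectPath;
    }

    // Wildcard prefix, as built by createBucketWithXmlFileForTestPattern;
    // file selection is then left to the XMLReader "pattern" property.
    static String wildcardPath(String bucket) {
        return "gs://" + bucket + "/testdata/xmldata/*";
    }

    public static void main(String[] args) {
        System.out.println(singleObjectPath("e2e-test-xml", "testdata/xmldata/testxmlfile.xml"));
        System.out.println(wildcardPath("e2e-test-xml"));
    }
}
```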
@@ -0,0 +1,37 @@
/*
* Copyright © 2024 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/

package io.cdap.plugin.xmlreader.runners;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
* Test Runner to execute XmlReader Source plugin testcases.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
features = {"src/e2e-test/features"},
glue = {"stepsdesign", "io.cdap.plugin.common.stepsdesign"},
tags = {"@XmlReader_Source"},
monochrome = true,
plugin = {"pretty", "html:target/cucumber-html-report/XmlReader-source",
"json:target/cucumber-reports/cucumber-xmlreader-source.json",
"junit:target/cucumber-reports/cucumber-xmlreader-source.xml"}
)
public class TestRunner {
}
@@ -0,0 +1,36 @@
/*
* Copyright © 2024 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
package io.cdap.plugin.xmlreader.runners;

import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;
import org.junit.runner.RunWith;

/**
* Test Runner to execute only required XmlReader Source plugin testcases.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
features = {"src/e2e-test/features"},
glue = {"stepsdesign", "io.cdap.plugin.common.stepsdesign"},
tags = {"@XmlReader_Source_Required"},
monochrome = true,
plugin = {"pretty", "html:target/cucumber-html-report/xmlreader-source",
"json:target/cucumber-reports/cucumber-xmlreader-source.json",
"junit:target/cucumber-reports/cucumber-xmlreader-source.xml"}
)
public class TestRunnerRequired {
}
@@ -0,0 +1,19 @@
/*
* Copyright © 2024 Cask Data, Inc.
*
* Licensed under the Apache License, Version 2.0 (the "License"); you may not
* use this file except in compliance with the License. You may obtain a copy of
* the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
* License for the specific language governing permissions and limitations under
* the License.
*/
/**
* Package contains the runner for the XmlReader source plugin.
*/
package io.cdap.plugin.xmlreader.runners;
@@ -49,3 +49,4 @@ errorCodeColumnName=codeField
errorEmitterNodeName=stageField
recipe=directives
onError=on-error
ActionAfterProcessingFile=actionAfterProcess
Expand Up @@ -291,6 +291,13 @@ quotedValueDelimitedTestFiles=dummy
fileSinkTargetBucket=file-plugin-output
recursiveTest=dummy
testOnCdap=true
xmlFile=testdata/xmldata/testxmlfile.xml
node=/students/student
outputForXMLTest=e2e-tests/file/expected_outputs/OUTPUT_FOR_XMLREADER_TEST.csv
filePattern=^testxmlfile.xml
folder=gs://e2e-test-xml/
invalidPattern=abcd
invalidNode=dummy

## EXCEL-PLUGIN-PROPERTIES-START ##
excelTestFile=dummy
@@ -0,0 +1,15 @@
3,gs://e2e-test-xml/testdata/xmldata/testxmlfile.xml,<student>
<id>1</id>
<name>John Doe</name>
<address>123 Main Street, Cityville</address>
</student>
8,gs://e2e-test-xml/testdata/xmldata/testxmlfile.xml,<student>
<id>2</id>
<name>Jane Smith</name>
<address>456 Elm Street, Townsville</address>
</student>
13,gs://e2e-test-xml/testdata/xmldata/testxmlfile.xml,<student>
<id>3</id>
<name>Alice Johnson</name>
<address>789 Oak Street, Villageton</address>
</student>
@@ -0,0 +1,18 @@
<?xml version="1.0" encoding="UTF-8"?>
<students>
<student>
<id>1</id>
<name>John Doe</name>
<address>123 Main Street, Cityville</address>
</student>
<student>
<id>2</id>
<name>Jane Smith</name>
<address>456 Elm Street, Townsville</address>
</student>
<student>
<id>3</id>
<name>Alice Johnson</name>
<address>789 Oak Street, Villageton</address>
</student>
</students>
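The expected output file pairs each `/students/student` node with its byte offset and source URI. How the `nodePath` property selects those nodes can be sketched with standard XPath (a simplified illustration of node selection, not the plugin's parser or record format):

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class NodePathDemo {
    // Counts the nodes selected by an XPath expression such as "/students/student".
    static int countMatches(String xml, String nodePath) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));
        NodeList nodes = (NodeList) XPathFactory.newInstance().newXPath()
                .compile(nodePath).evaluate(doc, XPathConstants.NODESET);
        return nodes.getLength();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<students><student><id>1</id></student>"
                + "<student><id>2</id></student>"
                + "<student><id>3</id></student></students>";
        // The "node" property ("/students/student") selects three records.
        System.out.println(countMatches(xml, "/students/student")); // 3
        // The "invalidNode" property ("dummy") selects nothing, so no
        // records are transferred, matching the error-scenario feature.
        System.out.println(countMatches(xml, "dummy"));             // 0
    }
}
```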
