
Commit

Selected input scenarios for basic joins
and advanced scenarios for joiner
rahuldash171 committed Aug 24, 2023
1 parent 92e9cbe commit 88a18ef
Showing 7 changed files with 307 additions and 2 deletions.
261 changes: 261 additions & 0 deletions core-plugins/src/e2e-test/features/joiner/JoinerWithFile.feature
@@ -250,3 +250,264 @@ Feature: Joiner - Verify File source to File sink data transfer using Joiner ana
Then Close the pipeline logs
Then Validate OUT record count of joiner is equal to IN record count of sink
Then Validate output file generated by file sink plugin "fileSinkTargetBucket" is equal to expected output file "joinerTest4OutputFile"

@JOINER_TEST1 @JOINER_TEST2 @FILE_SINK_TEST
Scenario: To verify data is getting transferred from File to File successfully using Joiner plugin with advanced inner join type
Given Open Datafusion Project to configure pipeline
When Select plugin: "File" from the plugins list as: "Source"
When Select plugin: "File" from the plugins list as: "Source"
And Expand Plugin group in the LHS plugins list: "Analytics"
When Select plugin: "Joiner" from the plugins list as: "Analytics"
Then Connect plugins: "File" and "Joiner" to establish connection
Then Connect plugins: "File2" and "Joiner" to establish connection
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "Joiner" and "File3" to establish connection
Then Click plugin property: "alignPlugins" button
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "joinerInputTest1"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Click plugin property: "skipHeader"
Then Click plugin property: "enableQuotedValues"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "joinerCsvFileFirstSchema"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File2"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "joinerInputTest2"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Click plugin property: "skipHeader"
Then Click plugin property: "enableQuotedValues"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "joinerCsvDataTypeFileSchema"
Then Validate "File2" plugin properties
Then Close the Plugin Properties page
When Navigate to the properties page of plugin: "Joiner"
Then Select joiner type "Inner"
Then Select radio button plugin property: "conditionType" with value: "advanced"
Then Enter textarea plugin property: "conditionExpression" with value: "joinConditionSQLExpression"
Then Click on the Get Schema button
Then Validate "Joiner" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File3"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm-ss"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File3" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate OUT record count of joiner is equal to IN record count of sink
Then Validate output file generated by file sink plugin "fileSinkTargetBucket" is equal to expected output file "joinerTest1OutputFile"

@JOINER_TEST1 @JOINER_TEST2 @FILE_SINK_TEST
Scenario: To verify data is getting transferred from File to File successfully using Joiner plugin with advanced outer join type
Given Open Datafusion Project to configure pipeline
When Select plugin: "File" from the plugins list as: "Source"
When Select plugin: "File" from the plugins list as: "Source"
And Expand Plugin group in the LHS plugins list: "Analytics"
When Select plugin: "Joiner" from the plugins list as: "Analytics"
Then Connect plugins: "File" and "Joiner" to establish connection
Then Connect plugins: "File2" and "Joiner" to establish connection
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "Joiner" and "File3" to establish connection
Then Click plugin property: "alignPlugins" button
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "joinerInputTest1"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Click plugin property: "skipHeader"
Then Click plugin property: "enableQuotedValues"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "joinerCsvFileFirstSchema"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File2"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "joinerInputTest2"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Click plugin property: "skipHeader"
Then Click plugin property: "enableQuotedValues"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "joinerCsvDataTypeFileSchema"
Then Validate "File2" plugin properties
Then Close the Plugin Properties page
When Navigate to the properties page of plugin: "Joiner"
Then Select joiner type "Outer"
Then Select radio button plugin property: "conditionType" with value: "advanced"
Then Enter textarea plugin property: "conditionExpression" with value: "joinConditionSQLExpression"
Then Select dropdown plugin property: "inMemoryInputs" with option value: "File2"
Then Press ESC key to close the joiner fields dropdown
Then Click on the Get Schema button
Then Validate "Joiner" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File3"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm-ss"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File3" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate OUT record count of joiner is equal to IN record count of sink
Then Validate output file generated by file sink plugin "fileSinkTargetBucket" is equal to expected output file "joinerTest1OutputFile"

@JOINER_TEST1 @JOINER_TEST2 @FILE_SINK_TEST
Scenario: To verify data is getting transferred from File to File successfully using Joiner plugin with outer join type and selected inputs
Given Open Datafusion Project to configure pipeline
When Select plugin: "File" from the plugins list as: "Source"
When Select plugin: "File" from the plugins list as: "Source"
And Expand Plugin group in the LHS plugins list: "Analytics"
When Select plugin: "Joiner" from the plugins list as: "Analytics"
Then Connect plugins: "File" and "Joiner" to establish connection
Then Connect plugins: "File2" and "Joiner" to establish connection
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "Joiner" and "File3" to establish connection
Then Click plugin property: "alignPlugins" button
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "joinerInputTest1"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Click plugin property: "skipHeader"
Then Click plugin property: "enableQuotedValues"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "joinerCsvFileFirstSchema"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File2"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "joinerInputTest2"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Click plugin property: "skipHeader"
Then Click plugin property: "enableQuotedValues"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "joinerCsvDataTypeFileSchema"
Then Validate "File2" plugin properties
Then Close the Plugin Properties page
When Navigate to the properties page of plugin: "Joiner"
Then Select joiner type "Outer"
Then Select radio button plugin property: "conditionType" with value: "basic"
Then Expand fields
Then Uncheck plugin "File" field "lastname" alias checkbox
Then Uncheck plugin "File2" field "item" alias checkbox
Then Click on the required input checkbox for first schema "File"
Then Click on the Get Schema button
Then Validate "Joiner" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File3"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm-ss"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File3" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate OUT record count of joiner is equal to IN record count of sink
Then Validate output file generated by file sink plugin "fileSinkTargetBucket" is equal to expected output file "joinerTest7OutputFile"

@JOINER_TEST1 @JOINER_TEST2 @FILE_SINK_TEST
Scenario: To verify data is getting transferred from File to File successfully using Joiner plugin with inner join type and selected inputs
Given Open Datafusion Project to configure pipeline
When Select plugin: "File" from the plugins list as: "Source"
When Select plugin: "File" from the plugins list as: "Source"
And Expand Plugin group in the LHS plugins list: "Analytics"
When Select plugin: "Joiner" from the plugins list as: "Analytics"
Then Connect plugins: "File" and "Joiner" to establish connection
Then Connect plugins: "File2" and "Joiner" to establish connection
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "File" from the plugins list as: "Sink"
Then Connect plugins: "Joiner" and "File3" to establish connection
Then Click plugin property: "alignPlugins" button
Then Navigate to the properties page of plugin: "File"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "joinerInputTest1"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Click plugin property: "skipHeader"
Then Click plugin property: "enableQuotedValues"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "joinerCsvFileFirstSchema"
Then Validate "File" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File2"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "joinerInputTest2"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Click plugin property: "skipHeader"
Then Click plugin property: "enableQuotedValues"
Then Click on the Get Schema button
Then Verify the Output Schema matches the Expected Schema: "joinerCsvDataTypeFileSchema"
Then Validate "File2" plugin properties
Then Close the Plugin Properties page
When Navigate to the properties page of plugin: "Joiner"
Then Select joiner type "Inner"
Then Select radio button plugin property: "conditionType" with value: "basic"
Then Expand fields
Then Uncheck plugin "File" field "lastname" alias checkbox
Then Uncheck plugin "File" field "state" alias checkbox
Then Uncheck plugin "File2" field "item" alias checkbox
Then Uncheck plugin "File2" field "price" alias checkbox
Then Click on the Get Schema button
Then Validate "Joiner" plugin properties
Then Close the Plugin Properties page
Then Navigate to the properties page of plugin: "File3"
Then Enter input plugin property: "referenceName" with value: "FileReferenceName"
Then Enter input plugin property: "path" with value: "fileSinkTargetBucket"
Then Replace input plugin property: "pathSuffix" with value: "yyyy-MM-dd-HH-mm-ss"
Then Select dropdown plugin property: "format" with option value: "csv"
Then Validate "File3" plugin properties
Then Close the Plugin Properties page
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate OUT record count of joiner is equal to IN record count of sink
Then Validate output file generated by file sink plugin "fileSinkTargetBucket" is equal to expected output file "joinerTest6OutputFile"
@@ -40,6 +40,14 @@ public static void uncheckPluginFieldAliasCheckBox(String plugin, String field)
ElementHelper.selectCheckbox(JoinerLocators.fieldAliasCheckBox(plugin, field));
}

/* Checks the required-input checkbox for the first input schema on the Joiner properties page. */
public static void selectRequiredInputCheckboxFirstFile(String plugin) {
ElementHelper.selectCheckbox(JoinerLocators.requiredInputCheckboxFirstFile(plugin));
}

/* Checks the required-input checkbox for the second input schema on the Joiner properties page. */
public static void selectRequiredInputCheckboxSecondFile(String plugin) {
ElementHelper.selectCheckbox(JoinerLocators.requiredInputCheckboxSecondFile(plugin));
}

public static void selectJoinerType(String targetJoinerType) {
ElementHelper.selectDropdownOption(JoinerLocators.joinerTypeSelectDropdown,
CdfPluginPropertiesLocators.locateDropdownListItem(targetJoinerType));
@@ -28,7 +28,17 @@ public class JoinerLocators {

public static WebElement fieldAliasCheckBox(String pluginName, String field) {
String xpath = "//*[@data-cy='" + pluginName + "-stage-expansion-panel']" + "//*[@data-cy='" + field +
"-field-selector-name']/..//*[@type='checkbox']";
"-field-selector-name']/..//*[@data-cy='"+ field +"-field-selector-checkbox']";
return SeleniumDriver.getDriver().findElement(By.xpath(xpath));
}

/* Required-input checkbox for the first input schema; the checkbox value attribute is "0-<pluginName>". */
public static WebElement requiredInputCheckboxFirstFile(String pluginName) {
String xpath = "//*[@type='checkbox'][@value='0-" + pluginName + "']";
return SeleniumDriver.getDriver().findElement(By.xpath(xpath));
}

/* Required-input checkbox for the second input schema; the checkbox value attribute is "1-<pluginName>". */
public static WebElement requiredInputCheckboxSecondFile(String pluginName) {
String xpath = "//*[@type='checkbox'][@value='1-" + pluginName + "']";
return SeleniumDriver.getDriver().findElement(By.xpath(xpath));
}

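Note: the two locators added above differ only in the index that prefixes the checkbox's value attribute ("0-" for the first input schema, "1-" for the second). Below is a minimal sketch of a single, index-parameterized locator; it is not part of this commit, the class name JoinerLocatorsSketch, the method requiredInputCheckbox and its schemaIndex parameter are hypothetical, and the io.cdap.e2e.utils import path is assumed from the framework utilities already used in this file.

import io.cdap.e2e.utils.SeleniumDriver;
import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;

public class JoinerLocatorsSketch {
  // Builds the same XPath as requiredInputCheckboxFirstFile/SecondFile, with the
  // 0-based schema position supplied as a parameter instead of hard-coded.
  public static WebElement requiredInputCheckbox(String pluginName, int schemaIndex) {
    String xpath = "//*[@type='checkbox'][@value='" + schemaIndex + "-" + pluginName + "']";
    return SeleniumDriver.getDriver().findElement(By.xpath(xpath));
  }
}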
@@ -36,6 +36,16 @@ public void uncheckPluginFieldAliasCheckBox(String plugin, String field) {
JoinerActions.uncheckPluginFieldAliasCheckBox(plugin, field);
}

@Then("Click on the required input checkbox for first schema {string}")
public void clickRequiredInputCheckboxFirstFile(String plugin){
JoinerActions.selectRequiredInputCheckboxFirstFile(plugin);
}

@Then("Click on the required input checkbox for second schema {string}")
public void clickRequiredInputCheckboxSecondFile(String plugin){
JoinerActions.selectRequiredInputCheckboxSecondFile(plugin);
}

@Then("Enter numPartitions {string}")
public void openJoinerProperties(String partitions) {
JoinerActions.enterNumPartitions(PluginPropertyUtils.pluginProp(partitions));
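A possible follow-up, sketched here only and not defined by this commit: a single Cucumber step that takes the schema position and dispatches to the two actions added above. The step wording, the class name JoinerRequiredInputStepsSketch, and the method name are hypothetical; the JoinerActions import is omitted because its package depends on the project layout.

import io.cucumber.java.en.Then;

public class JoinerRequiredInputStepsSketch {
  // Hypothetical combined step: schemaIndex 0 targets the first input schema, 1 the second.
  @Then("Click on the required input checkbox for schema {int} {string}")
  public void clickRequiredInputCheckbox(int schemaIndex, String plugin) {
    if (schemaIndex == 0) {
      JoinerActions.selectRequiredInputCheckboxFirstFile(plugin);
    } else {
      JoinerActions.selectRequiredInputCheckboxSecondFile(plugin);
    }
  }
}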
@@ -232,7 +232,7 @@ joinerInvalidPartitions=&*^*
joinerKeys=File.purchase_id = File2.customer_id
joinerInputMemory=File
joinerNullKeys=false
joinConditionSQLExpression=File.customer_name = customers.name
joinConditionSQLExpression=File.id = File2.customerid
joinerOutputSchema={ "type": "record", "name": "text", "fields": [ \
{ "name": "purchase_id", "type": "int" }, { "name": "customer_name", "type": "string" }, \
{ "name": "item", "type": "string" }, { "name": "customer_id", "type": "int" }, { "name": "name", "type": "string" } ] }
@@ -241,6 +241,8 @@ joinerTest1OutputFile=e2e-tests/expected_outputs/CSV_JOINER_TEST1_Output.csv
joinerTest2OutputFile=e2e-tests/expected_outputs/CSV_JOINER_TEST2_Output.csv
joinerTest3OutputFile=e2e-tests/expected_outputs/CSV_JOINER_TEST3_Output.csv
joinerTest4OutputFile=e2e-tests/expected_outputs/CSV_JOINER_TEST4_Output.csv
joinerTest6OutputFile=e2e-tests/expected_outputs/CSV_JOINER_TEST6_Output.csv
joinerTest7OutputFile=e2e-tests/expected_outputs/CSV_JOINER_TEST7_Output.csv
joinerMacroOutputFile=e2e-tests/expected_outputs/CSV_JOINER_TEST5_Output.csv
## JOINER-PLUGIN-PROPERTIES-END

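The quoted values in the feature steps (for example "joinConditionSQLExpression" or "joinerInputTest1") are keys into this properties file, which the framework resolves through PluginPropertyUtils, the same helper the numPartitions step above calls. A minimal lookup sketch follows; the io.cdap.e2e.utils import path and the standalone main wrapper are assumptions for illustration.

import io.cdap.e2e.utils.PluginPropertyUtils;

public class JoinConditionLookupSketch {
  public static void main(String[] args) {
    // Resolves the key used by the advanced-join scenarios; after this commit the
    // configured value is "File.id = File2.customerid".
    String condition = PluginPropertyUtils.pluginProp("joinConditionSQLExpression");
    System.out.println(condition);
  }
}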
@@ -0,0 +1,6 @@
1,Douglas,1, Vista Montana,San Jose,95134,408-777-3214,1
1,Douglas,1, Vista Montana,San Jose,95134,408-777-3214,1
2,David,3, Baypointe Parkway,Houston,78970,804-777-2341,2
2,David,3, Baypointe Parkway,Houston,78970,804-777-2341,2
5,Frank,1609 Far St.,San Diego,29770,201-506-8756,5
3,Hugh,5, Cool Way,Manhattan,67263,708-234-2168,3