WIP: Dont review #5012

Open. Wants to merge 57 commits into master.

Commits (57)
ae601d2
jdk-17 compile time
akshat0395 Jan 17, 2024
d18ad6f
fix indentation
akshat0395 Jan 22, 2024
fd1e1fb
sw java 17 for hive build
akshat0395 Jan 22, 2024
d696fa7
source jdk 17 shell
akshat0395 Jan 24, 2024
24df35c
Disable spotbugs
akshat0395 Jan 24, 2024
5a45f77
remove auto remove imports
akshat0395 Jan 24, 2024
3dfb66c
Add-opens for hadoop client opts
akshat0395 Mar 27, 2024
769b282
Add opens for hive build test
akshat0395 Mar 28, 2024
c9c2a61
Add jvm args
akshat0395 Apr 2, 2024
eed2009
Add jvm args for lang
akshat0395 Apr 3, 2024
2c04778
Add jvm args for nio
akshat0395 Apr 3, 2024
ddedbec
Add jvm args for util.concurrent.atomic
akshat0395 Apr 3, 2024
0e5671b
Add jvm args for util.regex
akshat0395 Apr 4, 2024
a2cf132
Jdk17 Timestamp behaviour changes
Apr 5, 2024
147f19a
Upgrade mockito for jdk 17
akshat0395 Apr 6, 2024
cebe682
Hadoop shaded classpath issue
Apr 23, 2024
0b90fdf
antlr code too large problem. Move some of the alter statements to Al…
Apr 23, 2024
ecb5983
Javadoc issues fix
tanishq-chugh Apr 24, 2024
7f6fb60
javadoc classpath fix
tanishq-chugh Apr 24, 2024
79f96d6
javadoc classpath tested fix
tanishq-chugh Apr 29, 2024
815012c
Fixed pom formatting
tanishq-chugh Apr 29, 2024
bdf4494
solve beeline error due to --add-opens
May 2, 2024
f89dde1
datanucleus upgrade
May 3, 2024
9511186
Add nashorn Javascript engine dependency
tanishq-chugh May 21, 2024
ee2ab9a
Final fields modification using reflection workaround
tanishq-chugh May 21, 2024
a49a554
Reflection workaround for TestJdbcWithMiniHS2
tanishq-chugh May 22, 2024
118697b
Add jvm args for io java.sql & util.concurrent
tanishq-chugh May 22, 2024
909549d
datanucleus rdms patch
May 20, 2024
50df2c2
DateTime issues fix
tanishq-chugh May 30, 2024
9875fdb
Enable spotbugs & suppress warnings
tanishq-chugh May 30, 2024
d0cdeef
Upgrade kudu to 1.17.0 for jdk17 compatibility
tanishq-chugh May 31, 2024
c91a2de
Add Logging dependencies for upgraded kudu in qtests
tanishq-chugh Jun 2, 2024
5a027ba
Datetime qtest fixes
tanishq-chugh Jul 1, 2024
035e241
Revert heap size back to 2048
tanishq-chugh Jul 1, 2024
8d0c1b8
Gzip compression & Math radians qtest fix
tanishq-chugh Jul 2, 2024
0804e65
trySetMock filtering issue
tanishq-chugh Jul 2, 2024
2850d0b
Merge pull request #3 from akshat0395/gzip-radians-17
tanishq-chugh Jul 2, 2024
53c773c
Merge pull request #4 from akshat0395/mockingIssue-17
tanishq-chugh Jul 2, 2024
3648a3a
New Spotbugs error fix (#5)
tanishq-chugh Jul 2, 2024
042a5c9
Java version parsing & Netty Reflection fix (#6)
tanishq-chugh Jul 3, 2024
6308e1b
SocketTimeoutException ErrorMsg case fix (#7)
tanishq-chugh Jul 3, 2024
0ac07f6
Change derby script clob datatypes to max length varchar (#8)
tanishq-chugh Jul 9, 2024
3283b08
Add opens for surfire plugin across project
akshat0395 Jul 10, 2024
62c0adf
Add opens for surfire plugin across project
akshat0395 Jul 10, 2024
7c16afc
Add opens for ql module
akshat0395 Jul 11, 2024
f54fa1a
Merge pull request #10 from akshat0395/ql-add-opens
akshat0395 Jul 11, 2024
6d85533
add java lang add opens for ql
akshat0395 Jul 11, 2024
ac86017
Merge pull request #11 from akshat0395/ql-add-opens
akshat0395 Jul 11, 2024
a407fb6
Add opens hive,iceberg,llap configs for Tez container issues (#12)
tanishq-chugh Jul 11, 2024
2e202f3
Add opens for ql module
akshat0395 Jul 12, 2024
713488e
Merge pull request #13 from akshat0395/ql-add-opens
akshat0395 Jul 12, 2024
bca530d
Add open configs for MR (#14)
tanishq-chugh Jul 12, 2024
9cf001b
Increase Tez AM heap size to address OOM errors (#15)
tanishq-chugh Jul 15, 2024
35dd594
Increase Tez container size to address vertex resource OOM errors (#16)
tanishq-chugh Jul 15, 2024
e548040
Regex io add open configs for Tez & MR (#17)
tanishq-chugh Jul 15, 2024
3b0ec9f
fix NoClassDefFoundError: org/apache/hadoop/shaded/com/ctc/wstx/io/In…
Jul 16, 2024
2059db0
Merge pull request #18 from kokila-19/jdk17
kokila-19 Jul 16, 2024
6 changes: 3 additions & 3 deletions .github/workflows/build.yml
@@ -29,14 +29,14 @@ env:

jobs:
macos-jdk8:
name: 'macOS (JDK 8)'
name: 'macOS (JDK 17)'
runs-on: macos-latest
steps:
- uses: actions/checkout@v2
- name: 'Set up JDK 8'
- name: 'Set up JDK 17'
uses: actions/setup-java@v1
with:
java-version: 8
java-version: 17
- name: 'Build project'
run: |
mvn clean install -DskipTests -Pitests
10 changes: 7 additions & 3 deletions Jenkinsfile
@@ -84,11 +84,12 @@ def buildHive(args) {
set -x
. /etc/profile.d/confs.sh
export USER="`whoami`"
export MAVEN_OPTS="-Xmx2g"
export MAVEN_OPTS="-Xmx4G"
export -n HIVE_CONF_DIR
sw java 17 && . /etc/profile.d/java.sh
cp $SETTINGS .git/settings.xml
OPTS=" -s $PWD/.git/settings.xml -B -Dtest.groups= "
OPTS+=" -Pitests,qsplits,dist,errorProne,iceberg"
OPTS+=" -Pitests,qsplits,dist,iceberg"
OPTS+=" -Dorg.slf4j.simpleLogger.log.org.apache.maven.plugin.surefire.SurefirePlugin=INFO"
OPTS+=" -Dmaven.repo.local=$PWD/.git/m2"
git config extra.mavenOpts "$OPTS"
@@ -254,7 +255,7 @@ if [ $n != 0 ]; then
exit 1
fi
'''
buildHive("-Pspotbugs -pl " + spotbugsProjects.join(",") + " -am test-compile com.github.spotbugs:spotbugs-maven-plugin:4.0.0:check")
buildHive("-Pspotbugs -pl " + spotbugsProjects.join(",") + " -am test-compile com.github.spotbugs:spotbugs-maven-plugin:4.5.0.0:check")
}
stage('Compile') {
buildHive("install -Dtest=noMatches")
@@ -283,12 +284,14 @@ fi
stage('init-metastore') {
withEnv(["dbType=$dbType"]) {
sh '''#!/bin/bash -e
sw java 17 && . /etc/profile.d/java.sh
set -x
echo 127.0.0.1 dev_$dbType | sudo tee -a /etc/hosts
. /etc/profile.d/confs.sh
sw hive-dev $PWD
export DOCKER_NETWORK=host
export DBNAME=metastore
export HADOOP_CLIENT_OPTS="--add-opens java.base/java.net=ALL-UNNAMED"
reinit_metastore $dbType
time docker rm -f dev_$dbType || true
'''
@@ -393,6 +396,7 @@ tar -xzf packaging/target/apache-hive-*-nightly-*-src.tar.gz
}
stage('Generate javadoc') {
sh """#!/bin/bash -e
sw java 17 && . /etc/profile.d/java.sh
mvn install javadoc:javadoc javadoc:aggregate -DskipTests -pl '!itests/hive-jmh,!itests/util'
"""
}
2 changes: 1 addition & 1 deletion bin/ext/beeline.sh
@@ -29,7 +29,7 @@ beeline () {
hadoopClasspath="${HADOOP_CLASSPATH}:"
fi
export HADOOP_CLASSPATH="${hadoopClasspath}${HIVE_CONF_DIR}:${beelineJarPath}:${superCsvJarPath}:${jlineJarPath}"
export HADOOP_CLIENT_OPTS="$HADOOP_CLIENT_OPTS -Dlog4j.configurationFile=beeline-log4j2.properties "
export HADOOP_CLIENT_OPTS="$HADOOP_CLIENT_OPTS -Dlog4j.configurationFile=beeline-log4j2.properties --add-opens=java.base/java.net=ALL-UNNAMED "

if [ "$EXECUTE_WITH_JAVA" != "true" ] ; then
# if CLIUSER is not empty, then pass it as user id / password during beeline redirect
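The `--add-opens=java.base/java.net=ALL-UNNAMED` flag added to `HADOOP_CLIENT_OPTS` above is what lets Beeline keep doing deep reflection into `java.net` on JDK 17, where such access is denied by default. A minimal probe, assuming nothing beyond the JDK (class and method names are illustrative), that reports whether that package is open to the unnamed module:

```java
import java.lang.reflect.Field;
import java.net.URL;

public class AddOpensProbe {
    /**
     * Returns true when java.base/java.net has been opened to the unnamed
     * module, e.g. via --add-opens=java.base/java.net=ALL-UNNAMED.
     */
    public static boolean javaNetIsOpen() {
        try {
            Field f = URL.class.getDeclaredField("protocol");
            f.setAccessible(true); // throws InaccessibleObjectException on JDK 17 when the package is closed
            return true;
        } catch (RuntimeException e) {
            // java.lang.reflect.InaccessibleObjectException extends RuntimeException
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("java.base/java.net open: " + javaNetIsOpen());
    }
}
```

Run with and without the flag to see the two outcomes; the probe itself never throws.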
31 changes: 31 additions & 0 deletions common/pom.xml
@@ -34,6 +34,11 @@
<artifactId>hive-classification</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
<version>${avro.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-shims</artifactId>
@@ -69,6 +74,12 @@
<dependency>
<groupId>org.apache.orc</groupId>
<artifactId>orc-core</artifactId>
<exclusions>
<exclusion>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client-api</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>jline</groupId>
@@ -373,6 +384,18 @@
</testResource>
</testResources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<compilerArgs>
<arg>--add-opens</arg>
<arg>org.apache.hadoop/org.apache.hadoop.fs=ALL-UNNAMED</arg>
<arg>--add-opens</arg>
<arg>java.base/java.net=ALL-UNNAMED</arg>
</compilerArgs>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
@@ -431,6 +454,14 @@
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>17</source>
<target>17</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
32 changes: 17 additions & 15 deletions common/src/java/org/apache/hadoop/hive/ant/GenHiveTemplate.java
@@ -73,21 +73,23 @@ private Document generateTemplate() throws Exception {
doc.appendChild(doc.createProcessingInstruction(
"xml-stylesheet", "type=\"text/xsl\" href=\"configuration.xsl\""));

doc.appendChild(doc.createComment("\n" +
" Licensed to the Apache Software Foundation (ASF) under one or more\n" +
" contributor license agreements. See the NOTICE file distributed with\n" +
" this work for additional information regarding copyright ownership.\n" +
" The ASF licenses this file to You under the Apache License, Version 2.0\n" +
" (the \"License\"); you may not use this file except in compliance with\n" +
" the License. You may obtain a copy of the License at\n" +
"\n" +
" http://www.apache.org/licenses/LICENSE-2.0\n" +
"\n" +
" Unless required by applicable law or agreed to in writing, software\n" +
" distributed under the License is distributed on an \"AS IS\" BASIS,\n" +
" WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n" +
" See the License for the specific language governing permissions and\n" +
" limitations under the License.\n"));
doc.appendChild(doc.createComment("""

Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""));

Element root = doc.createElement("configuration");
doc.appendChild(root);
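The hunk above replaces a long chain of `"..." + "\n" +` concatenations with a Java 15+ text block. A small standalone sketch (strings are illustrative) showing that a text block yields exactly the same string as the concatenated form, with indentation common to all lines stripped:

```java
public class TextBlockDemo {
    static final String CONCATENATED = "first line\n" + "second line\n";

    // Indentation shared by every line and the closing delimiter is
    // treated as incidental and stripped from the result.
    static final String TEXT_BLOCK = """
            first line
            second line
            """;

    public static void main(String[] args) {
        System.out.println(CONCATENATED.equals(TEXT_BLOCK)); // prints true
    }
}
```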
24 changes: 12 additions & 12 deletions common/src/java/org/apache/hadoop/hive/common/CompressionUtils.java
@@ -169,30 +169,30 @@ public static List<File> unTar(final String inputFileName, final String outputDi
// no sub-directories
continue;
}
LOG.debug(String.format("Attempting to write output directory %s.",
outputFile.getAbsolutePath()));
LOG.debug("Attempting to write output directory %s.".formatted(
outputFile.getAbsolutePath()));
if (!outputFile.exists()) {
LOG.debug(String.format("Attempting to create output directory %s.",
outputFile.getAbsolutePath()));
LOG.debug("Attempting to create output directory %s.".formatted(
outputFile.getAbsolutePath()));
if (!outputFile.mkdirs()) {
throw new IllegalStateException(String.format("Couldn't create directory %s.",
outputFile.getAbsolutePath()));
throw new IllegalStateException("Couldn't create directory %s.".formatted(
outputFile.getAbsolutePath()));
}
}
} else {
final OutputStream outputFileStream;
if (flatten) {
File flatOutputFile = new File(outputDir, outputFile.getName());
LOG.debug(String.format("Creating flat output file %s.", flatOutputFile.getAbsolutePath()));
LOG.debug("Creating flat output file %s.".formatted(flatOutputFile.getAbsolutePath()));
outputFileStream = new FileOutputStream(flatOutputFile);
} else if (!outputFile.getParentFile().exists()) {
LOG.debug(String.format("Attempting to create output directory %s.",
outputFile.getParentFile().getAbsoluteFile()));
LOG.debug("Attempting to create output directory %s.".formatted(
outputFile.getParentFile().getAbsoluteFile()));
if (!outputFile.getParentFile().getAbsoluteFile().mkdirs()) {
throw new IllegalStateException(String.format("Couldn't create directory %s.",
outputFile.getParentFile().getAbsolutePath()));
throw new IllegalStateException("Couldn't create directory %s.".formatted(
outputFile.getParentFile().getAbsolutePath()));
}
LOG.debug(String.format("Creating output file %s.", outputFile.getAbsolutePath()));
LOG.debug("Creating output file %s.".formatted(outputFile.getAbsolutePath()));
outputFileStream = new FileOutputStream(outputFile);
} else {
outputFileStream = new FileOutputStream(outputFile);
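The CompressionUtils hunk swaps `String.format(fmt, args)` for the Java 15+ instance method `fmt.formatted(args)`; the two are specified to produce identical output. A quick sketch with an illustrative path:

```java
public class FormattedDemo {
    static String viaStatic(String path) {
        return String.format("Couldn't create directory %s.", path);
    }

    static String viaInstance(String path) {
        // Java 15+: an instance method on the format string itself
        return "Couldn't create directory %s.".formatted(path);
    }

    public static void main(String[] args) {
        System.out.println(viaStatic("/tmp/output").equals(viaInstance("/tmp/output"))); // prints true
    }
}
```

The change is purely stylistic; behavior and locale handling are unchanged.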
@@ -361,6 +361,7 @@ public void setInterned(Properties p) {
public CopyOnFirstWriteProperties() {
}

@SuppressFBWarnings(value = "EI_EXPOSE_REP", justification = "intended_to_do")
public Properties getInterned() {
return interned;
}
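The `@SuppressFBWarnings` annotations added in this PR acknowledge SpotBugs' `EI_EXPOSE_REP` findings rather than fix them; the alternative is a defensive copy. A sketch under the assumption that sharing the internal reference is intentional. The local `@interface` is a stand-in for the real annotation from `spotbugs-annotations`, declared here only so the snippet compiles on its own; `PropertiesHolder` is an illustrative name, not a Hive class:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.util.Properties;

// Stand-in for edu.umd.cs.findbugs.annotations.SuppressFBWarnings,
// so this sketch has no external dependency.
@Retention(RetentionPolicy.CLASS)
@interface SuppressFBWarnings {
    String value();
    String justification() default "";
}

public class PropertiesHolder {
    private final Properties interned;

    public PropertiesHolder(Properties p) {
        this.interned = p;
    }

    // EI_EXPOSE_REP: returning a mutable internal field; suppressed, as in the PR.
    @SuppressFBWarnings(value = "EI_EXPOSE_REP", justification = "callers share the live view")
    public Properties getInterned() {
        return interned;
    }

    // The defensive-copy alternative SpotBugs would accept without suppression.
    public Properties getInternedCopy() {
        Properties copy = new Properties();
        copy.putAll(interned);
        return copy;
    }
}
```

Suppression is the right call when callers genuinely need the live object, as with the interned `Properties` here.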
12 changes: 8 additions & 4 deletions common/src/java/org/apache/hadoop/hive/common/FileUtils.java
@@ -1007,8 +1007,10 @@ public static boolean rename(FileSystem fs, Path sourcePath,
// If destPath directory exists, rename call will move the sourcePath
// into destPath without failing. So check it before renaming.
if (fs.exists(destPath)) {
throw new IOException("Cannot rename the source path. The destination "
+ "path already exists.");
throw new IOException("""
Cannot rename the source path. The destination \
path already exists.\
""");
}
return fs.rename(sourcePath, destPath);
}
@@ -1094,8 +1096,10 @@ public static void checkDeletePermission(Path path, Configuration conf, String u
if (childStatus.getOwner().equals(user)) {
return;
}
String msg = String.format("Permission Denied: User %s can't delete %s because sticky bit is"
+ " set on the parent dir and user does not own this file or its parent", user, path);
String msg = ("""
Permission Denied: User %s can't delete %s because sticky bit is\
set on the parent dir and user does not own this file or its parent\
""").formatted(user, path);
throw new IOException(msg);

}
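The text blocks introduced in FileUtils end each line with `\`, the text-block escape that suppresses the newline, so a multi-line source literal still yields a single-line exception message. A minimal sketch of that behavior:

```java
public class TextBlockJoinDemo {
    // "\" at end of line suppresses that line's newline; the final "\"
    // also suppresses the newline before the closing delimiter.
    static final String MSG = """
            Cannot rename the source path. The destination \
            path already exists.\
            """;

    public static void main(String[] args) {
        System.out.println(MSG); // a single line, no embedded newlines
    }
}
```

Without the trailing backslashes, the message would contain literal line breaks, changing log and error output.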
@@ -100,8 +100,10 @@ private static MemoryPoolMXBean getTenuredGenPool() {
if (isUsageThresholdSupported) {
return pool;
} else {
LOG.error("{} vendor does not support isCollectionUsageThresholdSupported() and isUsageThresholdSupported()" +
" for tenured memory pool '{}'.", vendor, pool.getName());
LOG.error("""
{} vendor does not support isCollectionUsageThresholdSupported() and isUsageThresholdSupported()\
for tenured memory pool '{}'.\
""", vendor, pool.getName());
}
}
}
8 changes: 4 additions & 4 deletions common/src/java/org/apache/hadoop/hive/common/JavaUtils.java
@@ -70,8 +70,8 @@ public static boolean closeClassLoadersTo(ClassLoader current, ClassLoader stop)
try {
closeClassLoader(current);
} catch (IOException e) {
String detailedMessage = current instanceof URLClassLoader ?
Arrays.toString(((URLClassLoader) current).getURLs()) :
String detailedMessage = current instanceof URLClassLoader urlcl ?
Arrays.toString(urlcl.getURLs()) :
"";
LOG.info("Failed to close class loader " + current + " " + detailedMessage, e);
}
@@ -90,8 +90,8 @@ private static boolean isValidHierarchy(ClassLoader current, ClassLoader stop) {
}

public static void closeClassLoader(ClassLoader loader) throws IOException {
if (loader instanceof Closeable) {
((Closeable) loader).close();
if (loader instanceof Closeable closeable) {
closeable.close();
} else {
LOG.warn("Ignoring attempt to close class loader ({}) -- not instance of UDFClassLoader.",
loader == null ? "mull" : loader.getClass().getSimpleName());
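The JavaUtils and LogUtils hunks adopt Java 16+ pattern matching for `instanceof`, which tests the type and binds a variable in one step instead of requiring a separate cast. A small sketch with illustrative types:

```java
public class PatternMatchDemo {
    static String describe(Object o) {
        // The binding (s, i) is declared by the instanceof test itself and
        // is in scope only where the test is known to have succeeded.
        if (o instanceof String s) {
            return "string of length " + s.length();
        } else if (o instanceof Integer i) {
            return "int plus one: " + (i + 1);
        }
        return "other";
    }

    public static void main(String[] args) {
        System.out.println(describe("hive")); // prints string of length 4
    }
}
```

Besides being shorter, the pattern form removes the classic risk of the cast drifting out of sync with the tested type.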
16 changes: 7 additions & 9 deletions common/src/java/org/apache/hadoop/hive/common/LogUtils.java
@@ -245,16 +245,14 @@ public static void unregisterLoggingContext() {
public static String getLogFilePath() {
String logFilePath = null;
org.apache.logging.log4j.Logger rootLogger = LogManager.getRootLogger();
if (rootLogger instanceof org.apache.logging.log4j.core.Logger) {
org.apache.logging.log4j.core.Logger coreLogger =
(org.apache.logging.log4j.core.Logger)rootLogger;
if (rootLogger instanceof org.apache.logging.log4j.core.Logger coreLogger) {
for (Appender appender : coreLogger.getAppenders().values()) {
if (appender instanceof FileAppender) {
logFilePath = ((FileAppender) appender).getFileName();
} else if (appender instanceof RollingFileAppender) {
logFilePath = ((RollingFileAppender) appender).getFileName();
} else if (appender instanceof RollingRandomAccessFileAppender) {
logFilePath = ((RollingRandomAccessFileAppender) appender).getFileName();
if (appender instanceof FileAppender fileAppender) {
logFilePath = fileAppender.getFileName();
} else if (appender instanceof RollingFileAppender fileAppender) {
logFilePath = fileAppender.getFileName();
} else if (appender instanceof RollingRandomAccessFileAppender fileAppender) {
logFilePath = fileAppender.getFileName();
}
}
}
@@ -22,12 +22,14 @@
import java.io.PrintStream;

import org.apache.hive.common.util.StreamPrinter;
import org.apache.hive.common.util.SuppressFBWarnings;

public class ShellCmdExecutor {
private String cmd;
private PrintStream out;
private PrintStream err;

@SuppressFBWarnings(value = "EI_EXPOSE_REP2", justification = "intended_to_do")
public ShellCmdExecutor(String cmd, PrintStream out, PrintStream err) {
this.cmd = cmd;
this.out = out;
@@ -28,6 +28,7 @@
import org.apache.commons.lang3.tuple.Pair;
import org.apache.hadoop.hive.common.type.Date;
import org.apache.hadoop.hive.common.type.Timestamp;
import org.apache.hive.common.util.SuppressFBWarnings;

import java.io.Serializable;
import java.time.DateTimeException;
@@ -520,6 +521,7 @@ public Token(TokenType tokenType, String string) {
this(tokenType, null, null, string, string.length(), false);
}

@SuppressFBWarnings(value = "EI_EXPOSE_REP2", justification = "intended_to_do")
public Token(TokenType tokenType, TemporalField temporalField, TemporalUnit temporalUnit,
String string, int length, boolean fillMode) {
this.type = tokenType;
@@ -847,17 +849,21 @@ private void verifyForParse() {
!(temporalFields.contains(ChronoField.MONTH_OF_YEAR) &&
temporalFields.contains(ChronoField.DAY_OF_MONTH) ||
temporalFields.contains(ChronoField.DAY_OF_YEAR))) {
throw new IllegalArgumentException("Missing day of year or (month of year + day of month)"
+ " tokens.");
throw new IllegalArgumentException("""
Missing day of year or (month of year + day of month)\
tokens.\
""");
}
if (containsIsoFields &&
!(temporalFields.contains(IsoFields.WEEK_OF_WEEK_BASED_YEAR) &&
temporalFields.contains(ChronoField.DAY_OF_WEEK))) {
throw new IllegalArgumentException("Missing week of year (iw) or day of week (id) tokens.");
}
if (roundYearCount > 0 && yearCount > 0) {
throw new IllegalArgumentException("Invalid duplication of format element: Both year and"
+ "round year are provided");
throw new IllegalArgumentException("""
Invalid duplication of format element: Both year and\
round year are provided\
""");
}
for (TemporalField tokenType : temporalFields) {
if (Collections.frequency(temporalFields, tokenType) > 1) {
@@ -1281,8 +1287,10 @@ private int parseNumericTemporal(String substring, Token token) {
return 0;
}
if ("0".equals(substring)) {
throw new IllegalArgumentException("Value of hour of day (hh/hh12) in input is 0. "
+ "The value should be between 1 and 12.");
throw new IllegalArgumentException("""
Value of hour of day (hh/hh12) in input is 0. \
The value should be between 1 and 12.\
""");
}
}
if (token.temporalField == ChronoField.YEAR
@@ -18,6 +18,8 @@

package org.apache.hadoop.hive.common.io;

import org.apache.hive.common.util.SuppressFBWarnings;

import java.io.FileNotFoundException;
import java.io.OutputStream;
import java.io.UnsupportedEncodingException;
@@ -52,6 +54,7 @@ public void flush() {
super.flush();
}

@SuppressFBWarnings(value = "EI_EXPOSE_REP", justification = "intended_to_do")
public List<String> getOutput() {
return output;
}