
[SPARK-52587][SHELL] spark-shell 2.13 support -i -I parameter #51294


Closed · wants to merge 1 commit

Conversation


@cxzl25 cxzl25 commented Jun 26, 2025

What changes were proposed in this pull request?

Construct ShellConfig properly in the Scala 2.13 build of spark-shell, so that the REPL's built-in handling of the -i and -I options is reused.
Why are the changes needed?

In spark-shell built with Scala 2.12, the -i and -I parameters were supported, but the Scala 2.13 build does not support them.

    def loadInitFiles(): Unit = settings match {
      case settings: GenericRunnerSettings =>
        for (f <- settings.loadfiles.value) {
          loadCommand(f)
          addReplay(s":load $f")
        }
        for (f <- settings.pastefiles.value) {
          pasteCommand(f)
          addReplay(s":paste $f")
        }
      case _ =>
    }

If we construct ShellConfig correctly, we can reuse Scala's existing logic for processing -i and -I.

scala.tools.nsc.interpreter.shell.ILoop#interpretPreamble

    for (f <- filesToLoad) {
      loadCommand(f)
      addReplay(s":load $f")
    }
    for (f <- filesToPaste) {
      pasteCommand(f)
      addReplay(s":paste $f")
    }
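The idea can be sketched as follows. This is a rough illustration rather than the actual patch: it assumes Scala 2.13's ShellConfig.apply(Settings), which pattern-matches on GenericRunnerSettings and wires loadfiles/pastefiles into filesToLoad/filesToPaste.

    import scala.tools.nsc.GenericRunnerSettings
    import scala.tools.nsc.interpreter.shell.ShellConfig

    // Parse the REPL arguments with GenericRunnerSettings, the settings
    // type that actually carries the -i (loadfiles) and -I (pastefiles)
    // options.
    val settings = new GenericRunnerSettings(msg => Console.err.println(msg))
    settings.processArguments(List("-I", "init.scala"), processAll = true)

    // Building ShellConfig from these settings, rather than from a plain
    // Settings instance, lets ILoop#interpretPreamble see the files and
    // replay them via :load / :paste, as in the snippet above.
    val config = ShellConfig(settings)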

Does this PR introduce any user-facing change?

Yes. The -i and -I options of spark-shell now work with the Scala 2.13 build.

How was this patch tested?

Tested locally.

Was this patch authored or co-authored using generative AI tooling?

No

@HyukjinKwon
Member

What does the parameter do?

Member

@dongjoon-hyun dongjoon-hyun left a comment

+1, LGTM. Thank you, @cxzl25 .

I verified manually.

$ cat hello.scala
print("Hello World")
System.exit(0)
$ bin/spark-shell -I hello.scala
WARNING: Using incubator modules: jdk.incubator.vector
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 4.1.0-SNAPSHOT
      /_/

Using Scala version 2.13.16 (OpenJDK 64-Bit Server VM, Java 21.0.7)
Type in expressions to have them evaluated.
Type :help for more information.
25/06/26 18:02:28 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://localhost:4040
Spark context available as 'sc' (master = local[*], app id = local-1750986148574).
Spark session available as 'spark'.
Hello World%

@dongjoon-hyun
Member

Merged to master for Apache Spark 4.1.0 for now.

I agree that this is a bug fix because this is a documented feature. I support backporting to branch-4.0 in general. If you don't mind, let's take some time to test this patch in the master branch first, @cxzl25.

$ bin/spark-shell --help | head
WARNING: Using incubator modules: jdk.incubator.vector
Usage: ./bin/spark-shell [options]

Scala REPL options, Spark Classic only:
  -I <file>                   preload <file>, enforcing line-by-line interpretation
...

@cxzl25
Contributor Author

cxzl25 commented Jun 27, 2025

> What does the parameter do?

@HyukjinKwon

    $ scala -help
    Other startup options:

     -i <file>    preload <file> before starting the REPL
     -I <file>    preload <file>, enforcing line-by-line interpretation

In spark-shell with Scala 2.12, both parameters are supported.


> I verified manually

Thank you for helping with the verification, @dongjoon-hyun.

> let's have some time to test this patch in master branch first

Sure
