Conversation


@dbtsai dbtsai commented Aug 27, 2018

What changes were proposed in this pull request?

Use some reflection tricks to merge the Scala 2.11 and 2.12 REPL codebases.

How was this patch tested?

Existing tests.


dbtsai commented Aug 27, 2018

Ping @srowen


SparkQA commented Aug 27, 2018

Test build #95302 has finished for PR 22246 at commit 075ca4a.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


srowen commented Aug 27, 2018

If it works, great! You can then remove the hacks in the POM to support 2.11 vs 2.12 code? I wonder if 2.12 will compile the other two classes found in the 2.11 tree.


dbtsai commented Aug 27, 2018

@viirya The reflection trick we use for scala.reflect.internal.util.ScalaClassLoader doesn't work when the REPL is invoked from a test. Do you have any idea why? Thanks.

sbt.ForkMain$ForkError: java.lang.NoSuchMethodException: scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(scala.Function0)
	at java.lang.Class.getDeclaredMethod(Class.java:2130)
	at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:156)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:178)
	at org.apache.spark.repl.Main$.doMain(Main.scala:78)
	at org.apache.spark.repl.ReplSuite.runInterpreter(ReplSuite.scala:56)
	at org.apache.spark.repl.ReplSuite$$anonfun$11.apply(ReplSuite.scala:141)
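One plausible mechanism for a NoSuchMethodException like the one above (an assumption on my part, later echoed by the classloader diagnosis in this thread): getDeclaredMethod matches parameter types by Class-object identity, not by name, so a parameter class resolved through a different classloader fails the lookup. A minimal Java sketch with hypothetical stand-in types:

```java
// A hedged sketch (hypothetical types, not Spark's real ones) of why
// getDeclaredMethod can throw NoSuchMethodException across classloaders:
// parameter types are matched by Class-object identity, not by name.
import java.lang.reflect.Method;

public class LookupDemo {
    interface Callback { void run(); }

    static class Target {
        public void accept(Callback cb) { cb.run(); }
    }

    public static void main(String[] args) throws Exception {
        // This lookup succeeds because Callback.class here is the exact same
        // Class object the method was declared with. If scala.Function0 were
        // loaded by a different loader than ScalaClassLoader$, the analogous
        // lookup would fail with NoSuchMethodException even though a method
        // of that name and shape exists.
        Method m = Target.class.getDeclaredMethod("accept", Callback.class);
        System.out.println(m.getName());
    }
}
```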

@srowen The other two files in 2.11 cannot be compiled with 2.12. I removed the POM hack as well. Thanks for pointing that out.


srowen commented Aug 27, 2018

Oh, but if there's one tree for both versions, and it has files that 2.11 needs but 2.12 can't compile, won't it fail? Not sure about the NoSuchMethodException error up there; I haven't seen it.


dbtsai commented Aug 27, 2018

SparkILoopInterpreter.scala and SparkExprTyper.scala are only on the 2.11 branch; they cannot be compiled with 2.12. I think it's okay to keep them in the 2.11 branch since they are not needed in 2.12. We can remove those two files when support for 2.11 ends.

[error] /Users/d_tsai/dev/clean/apache-spark/repl/src/main/scala/org/apache/spark/repl/SparkILoopInterpreter.scala:80: overriding lazy value importableSymbolsWithRenames in class ImportHandler of type List[(this.intp.global.Symbol, this.intp.global.Name)];
[error]  lazy value importableSymbolsWithRenames needs `override' modifier
[error]       lazy val importableSymbolsWithRenames: List[(Symbol, Name)] = {
[error]                ^
[warn] two warnings found
[error] one error found
[error] Compile failed at Aug 27, 2018 11:51:19 AM [0.613s]


srowen commented Aug 27, 2018

I understand, you still need the extra source files in a separate tree just for 2.11.

@dbtsai dbtsai changed the title [SPARK-25235] [SHELL] Merge the REPL code in Scala 2.11 and 2.12 branches [WIP] [SPARK-25235] [SHELL] Merge the REPL code in Scala 2.11 and 2.12 branches Aug 27, 2018

dbtsai commented Aug 27, 2018

Yes, but we can remove the code in 2.12, and this also fixes the printing-order issue in 2.12.

I don't know why we get a reflection error in tests; it works on the command line. I will do some debugging tonight.

if (isScala2_11) {
val loader = Utils.classForName("scala.reflect.internal.util.ScalaClassLoader$")
.getDeclaredField("MODULE$")
.get(null)
@dbtsai (Member Author) commented on the diff:
@viirya Out of curiosity: since it's a static method, in theory the first argument to invoke should be null. Why do you pass the loader in? Also, this doesn't work in tests, and we get a NoSuchMethodException.

@viirya viirya commented Aug 28, 2018
Although it is a static method in Scala, it is compiled to a non-static method on the Java class. The class has a static member, MODULE$, of its own type, and all Scala object methods are accessed through that static member.
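This can be illustrated with a plain-Java stand-in (a sketch: ScalaStyleObject$ and savingContextLoader are hand-written stand-ins mimicking what scalac emits for a Scala `object`, not the real ScalaClassLoader$ class):

```java
import java.lang.reflect.Method;

// Hand-written stand-in mimicking the bytecode scalac emits for a Scala
// `object` (hypothetical; the real class in this PR is ScalaClassLoader$).
class ScalaStyleObject$ {
    // scalac emits a static MODULE$ field holding the singleton instance.
    public static final ScalaStyleObject$ MODULE$ = new ScalaStyleObject$();
    private ScalaStyleObject$() {}
    // Methods on a Scala object become *instance* methods at the JVM level.
    public String savingContextLoader() { return "ran body"; }
}

public class ModuleReflectionDemo {
    public static void main(String[] args) throws Exception {
        // Step 1: fetch the singleton through the static MODULE$ field;
        // get(null) works because the *field* is static.
        Object module = ScalaStyleObject$.class.getDeclaredField("MODULE$").get(null);
        // Step 2: invoke with the singleton as the receiver, not null,
        // because the *method* is non-static at the JVM level.
        Method m = ScalaStyleObject$.class.getDeclaredMethod("savingContextLoader");
        System.out.println(m.invoke(module));  // prints "ran body"
    }
}
```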


SparkQA commented Aug 27, 2018

Test build #95304 has finished for PR 22246 at commit 6203f83.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


viirya commented Aug 28, 2018

> @viirya The reflection trick we use in scala.reflect.internal.util.ScalaClassLoader doesn't work when the REPL is called from test. Do you have any idea about it? Thanks.

Yeah, it seems to be due to the classloader. After changing to the Spark classloader, the tests passed locally. Let's see if the Jenkins tests pass too.

@dbtsai dbtsai changed the title [WIP] [SPARK-25235] [SHELL] Merge the REPL code in Scala 2.11 and 2.12 branches [SPARK-25235] [SHELL] Merge the REPL code in Scala 2.11 and 2.12 branches Aug 28, 2018

SparkQA commented Aug 28, 2018

Test build #95324 has finished for PR 22246 at commit e0d424d.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.


viirya commented Aug 28, 2018

retest this please.


SparkQA commented Aug 28, 2018

Test build #95333 has finished for PR 22246 at commit e0d424d.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


viirya commented Aug 28, 2018

retest this please.


SparkQA commented Aug 28, 2018

Test build #95348 has finished for PR 22246 at commit e0d424d.

  • This patch fails PySpark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


kiszk commented Aug 28, 2018

retest this please

settings.classpath append addedClasspath
}
intp = Utils.classForNameFromSpark("org.apache.spark.repl.SparkILoopInterpreter")
.getDeclaredConstructor(Seq(classOf[Settings], classOf[JPrintWriter]): _*)
@dbtsai (Member Author) commented on the diff:
To summarize the classloader issue here: classOf[Settings] uses the classloader of the SparkILoop object, while Utils.classForName uses the classloader of the current running thread, which can lead to a mismatch.

I think using Utils.classForNameFromSpark is dangerous since there is no guarantee that it will use the classloader of this object, which is the one used by classOf[Settings].

I will use Class.forName to ensure consistency.
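The consistency point can be sketched in plain Java (a hedged illustration; java.lang.String stands in for the real interpreter classes): the one-argument Class.forName resolves through the defining loader of the calling class, the same loader that classOf[X] / X.class references use, whereas the thread context loader may differ under forked test runners.

```java
public class LoaderConsistencyDemo {
    public static void main(String[] args) throws Exception {
        // X.class (Scala's classOf[X]) resolves via the defining loader of
        // the enclosing class; pinning Class.forName to that same loader
        // keeps both lookups consistent.
        ClassLoader defining = LoaderConsistencyDemo.class.getClassLoader();
        ClassLoader context = Thread.currentThread().getContextClassLoader();
        // Under forked test runners these two loaders can differ, which is
        // how a class resolved via the context loader can fail to match the
        // Class object obtained from an X.class reference.
        System.out.println("loaders identical: " + (defining == context));
        // java.lang.String stands in for the real interpreter class here.
        Class<?> pinned = Class.forName("java.lang.String", false, defining);
        System.out.println(pinned == String.class);
    }
}
```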

A reviewer (Member) replied:
Good point!


SparkQA commented Aug 28, 2018

Test build #95360 has finished for PR 22246 at commit e0d424d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


dbtsai commented Aug 28, 2018

@srowen @kiszk It works now. Can you take a pass on this PR? Thanks.


SparkQA commented Aug 29, 2018

Test build #95383 has finished for PR 22246 at commit b147dba.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


SparkQA commented Aug 29, 2018

Test build #95381 has finished for PR 22246 at commit af4d984.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


viirya commented Aug 29, 2018

LGTM


SparkQA commented Aug 29, 2018

Test build #95393 has finished for PR 22246 at commit bab5947.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


dbtsai commented Aug 29, 2018

Thanks all for reviewing. Merged into master.

@asfgit asfgit closed this in ff8dcc1 Aug 29, 2018