Remove simpleWritableConverter from SparkContext
zsxwing committed Nov 15, 2014
commit 185c12f1bd372d6a0d164e373c958b826ec0d22f
12 changes: 3 additions & 9 deletions core/src/main/scala/org/apache/spark/SparkContext.scala
```diff
@@ -52,6 +52,7 @@ import org.apache.spark.storage._
 import org.apache.spark.ui.SparkUI
 import org.apache.spark.ui.jobs.JobProgressListener
 import org.apache.spark.util._
+import org.apache.spark.WritableConverter.simpleWritableConverter
 
 /**
  * Main entry point for Spark functionality. A SparkContext represents the connection to a Spark
@@ -1503,13 +1504,6 @@ object SparkContext extends Logging {
       arr.map(x => anyToWritable(x)).toArray)
   }
 
-  // Helper objects for converting common types to Writable
-  private def simpleWritableConverter[T, W <: Writable: ClassTag](convert: W => T)
-      : WritableConverter[T] = {
-    val wClass = classTag[W].runtimeClass.asInstanceOf[Class[W]]
-    new WritableConverter[T](_ => wClass, x => convert(x.asInstanceOf[W]))
-  }
-
   @deprecated("An API for backforward compatibility", "1.2.0")
   def intWritableConverter(): WritableConverter[Int] =
     simpleWritableConverter[Int, IntWritable](_.get)
```

**Contributor** (on the deprecated converters): Is the simpleWritableConverter in SparkContext still necessary?
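As the new import at the top of the file shows, simpleWritableConverter now lives in object WritableConverter, and SparkContext imports it so the deprecated converters keep compiling. For readers following along, here is a rough sketch of what the helper builds when unrolled for Int. It is illustrative only: the object name ConverterSketch is invented, and the snippet assumes it is compiled inside the org.apache.spark package, since WritableConverter is private[spark].

```scala
package org.apache.spark // required: WritableConverter is private[spark]

import org.apache.hadoop.io.{IntWritable, Writable}

object ConverterSketch {
  // Roughly what simpleWritableConverter[Int, IntWritable](_.get) produces:
  // a converter that records which Writable class to read and a function
  // that unwraps each IntWritable into a plain Int.
  val intConverter: WritableConverter[Int] = new WritableConverter[Int](
    _ => classOf[IntWritable],
    (w: Writable) => w.asInstanceOf[IntWritable].get
  )
}
```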
```diff
@@ -1769,10 +1763,10 @@ private[spark] class WritableConverter[T](
     val convert: Writable => T)
   extends Serializable
 
-object WritableConverter {
+private[spark] object WritableConverter {
```
**Member Author:** Should this be private[spark] or public? Class WritableConverter is now private[spark], so I set object WritableConverter to private[spark] too.

**Contributor:** You should add some tests or examples that live outside the Spark package but try to use it. My guess is that it needs to be public.

**Contributor:** Actually, that would be great to add for the whole patch: write some test programs in a different package (e.g. org.apache.sparktest) and make sure all the conversions can happen.

**Member Author:**

> My guess is that it needs to be public.

If it's private[spark], people are still able to use it implicitly, but they cannot import these methods or call them directly. In other words, my question is whether we should enable people to import them or call them directly.
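To make the distinction concrete, here is a sketch of the two access paths from user code. The package and names (AccessSketch, demo, the input path) are invented for illustration, and whether the implicit path still resolves with a private[spark] object is exactly the claim under discussion here.

```scala
package org.apache.sparktest // user code, outside org.apache.spark

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

object AccessSketch {
  def demo(sc: SparkContext): Unit = {
    // Implicit path: the compiler supplies the WritableConverter arguments,
    // so the user never names them. Per the comment above, this is expected
    // to keep working even with a private[spark] object.
    val rdd: RDD[(Int, String)] = sc.sequenceFile[Int, String]("hdfs://some/path")

    // Direct path: naming a converter explicitly. With a private[spark]
    // object WritableConverter, a line like the following would not compile
    // from this package:
    // val c = org.apache.spark.WritableConverter.intWritableConverter()
  }
}
```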

**Member Author:**

> write some test programs in a different package (e.g. org.apache.sparktest) and make sure all the conversions can happen.

Moved ImplicitSuite to org.apache.sparktest.
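For the flavor of such a test, here is a hedged sketch of the approach rather than the actual ImplicitSuite contents: a file in org.apache.sparktest that only needs to compile to show the implicits resolve from user code (ImplicitSketch and its members are invented names).

```scala
package org.apache.sparktest

import org.apache.spark.SparkContext

// A compile-time check: if this file compiles outside org.apache.spark,
// the implicit WritableConverter machinery is reachable from user code.
class ImplicitSketch {
  def mockSparkContext: SparkContext = null

  def testSequenceFileConversions(): Unit = {
    // Forces the compiler to resolve the implicit converters at this call site.
    mockSparkContext.sequenceFile[Int, String]("/tmp/input")
  }
}
```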

**Contributor:** Hmm, I find this pretty weird, so IMO we should make them public. I wouldn't be surprised if this gets removed in some version of Scala otherwise.


```diff
   // Helper objects for converting common types to Writable
-  private def simpleWritableConverter[T, W <: Writable: ClassTag](convert: W => T)
+  private[spark] def simpleWritableConverter[T, W <: Writable: ClassTag](convert: W => T)
       : WritableConverter[T] = {
     val wClass = classTag[W].runtimeClass.asInstanceOf[Class[W]]
     new WritableConverter[T](_ => wClass, x => convert(x.asInstanceOf[W]))
```
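With the helper in object WritableConverter, each concrete converter stays a one-liner beside it. As a sketch (reconstructed from the pattern above, not part of this hunk), a Double converter would look like this, assuming Hadoop's DoubleWritable:

```scala
// Inside object WritableConverter, next to simpleWritableConverter
// (needs import org.apache.hadoop.io.DoubleWritable):
implicit def doubleWritableConverter(): WritableConverter[Double] =
  simpleWritableConverter[Double, DoubleWritable](_.get)
```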
```diff
@@ -29,7 +29,7 @@ class SparkContextSuite extends FunSuite {
     bytesWritable.set(inputArray, 0, 10)
     bytesWritable.set(inputArray, 0, 5)
 
-    val converter = SparkContext.bytesWritableConverter()
+    val converter = WritableConverter.bytesWritableConverter()
     val byteArray = converter.convert(bytesWritable)
     assert(byteArray.length === 5)
```
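A note on why the test sets the value twice: Hadoop's BytesWritable reuses its backing buffer, so after shrinking the valid length to 5 the buffer still holds the old trailing bytes, and getBytes returns the whole buffer. A correct converter must copy only getLength bytes, which is what the assert checks. A standalone sketch of that behavior (BytesWritableSketch is an invented name; the buffer semantics are Hadoop's):

```scala
import org.apache.hadoop.io.BytesWritable

object BytesWritableSketch extends App {
  val inputArray = (1 to 10).map(_.toByte).toArray
  val bw = new BytesWritable()
  bw.set(inputArray, 0, 10) // buffer grows to hold at least 10 bytes
  bw.set(inputArray, 0, 5)  // valid length shrinks to 5; the buffer does not

  assert(bw.getLength == 5)      // the logical size
  assert(bw.getBytes.length > 5) // the reused backing buffer is still larger
  // A converter that copied all of getBytes would keep stale trailing bytes,
  // which is exactly what assert(byteArray.length === 5) guards against.
}
```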