@@ -43,7 +43,7 @@ class LongType private() extends IntegralType {
*/
override def defaultSize: Int = 8

override def simpleString: String = "bigint"
override def simpleString: String = "long"
Member:
I don't think so. bigint is the SQL type for an 8-byte integer, right?
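[The claim is easy to check against the type's own accessors. A minimal sketch, assuming the pre-PR sources where simpleString still returns "bigint"; note that Spark already exposes "long" separately as typeName, which is what printSchema displays:

import org.apache.spark.sql.types.LongType

// simpleString is the SQL-facing name, used in schema strings and in
// analyzer error messages; typeName is derived from the class name.
println(LongType.simpleString) // "bigint"
println(LongType.typeName)     // "long"
]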

Author:
When you try to read a CSV and map it to a case class with a Long field, you get a message like this one:
EXCEPTION: org.apache.spark.sql.AnalysisException: Cannot up cast linked_docs.MR_NUMBER_OF_DOCS_UPLOADED from string to bigint as it may truncate
The type path of the target object is:

  • field (class: "scala.Long", name: "MR_NUMBER_OF_DOCS_UPLOADED")

Getting a message that talks about bigint while you are trying to cast a String to a Long looks confusing to me. I thought this was a typo.
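[A minimal sketch reproducing the failure described above, assuming a local SparkSession and a hypothetical docs.csv containing the column from the error message; the file name, object name, and case class are illustrative, not from the PR:

import org.apache.spark.sql.SparkSession

case class LinkedDocs(MR_NUMBER_OF_DOCS_UPLOADED: Long)

object UpCastRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("upcast-repro")
      .getOrCreate()
    import spark.implicits._

    // CSV columns are read as strings unless a schema is supplied, so the
    // encoder must up cast string -> LongType; analysis rejects that cast
    // with "Cannot up cast ... from string to bigint as it may truncate".
    val ds = spark.read.option("header", "true").csv("docs.csv").as[LinkedDocs]
    ds.show()
  }
}

The message says "bigint" rather than "long" because the analyzer renders the target type via LongType's simpleString, which is what this PR proposes to rename.]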

Member:
You're renaming the whole type, when the issue is at best that some intermediate stage of the plan does cast to bigint first. At the least, this is not the fix, and I'm not sure it's really a problem, even if the error is from an implementation detail.
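[In other words, "bigint" is LongType's SQL-facing name across Spark SQL (DDL, casts, schema strings), not a one-off in this error message, so renaming simpleString would change all of those surfaces at once. A small sketch of the two names side by side, assuming an existing SparkSession named spark:

val df = spark.sql("SELECT CAST('42' AS BIGINT) AS n")
df.printSchema()                 // prints "n: long ..." via typeName
println(df.schema.simpleString)  // prints "struct<n:bigint>" via simpleString
]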


private[spark] override def asNullable: LongType = this
}