Changes from 2 commits

@@ -20,6 +20,7 @@ package org.apache.spark.sql.hive
 import java.util.Properties

 import scala.collection.JavaConverters._
+import scala.util.{Failure, Success, Try}
Member:
Shall we remove this, @StefanXiepj?

Author:
Thanks for your review, I have removed it.


 import org.apache.hadoop.conf.Configuration
 import org.apache.hadoop.fs.{Path, PathFilter}
@@ -47,6 +48,7 @@ import org.apache.spark.sql.internal.SQLConf
 import org.apache.spark.unsafe.types.UTF8String
 import org.apache.spark.util.{SerializableConfiguration, Utils}

+
Member:
nit: please remove the unnecessary change.

Author:
done

 /**
  * A trait for subclasses that handle table scans.
  */
@@ -486,13 +488,20 @@ private[hive] object HadoopTableReader extends HiveInspectors with Logging {
       var i = 0
       val length = fieldRefs.length
       while (i < length) {
-        val fieldValue = soi.getStructFieldData(raw, fieldRefs(i))
-        if (fieldValue == null) {
-          mutableRow.setNullAt(fieldOrdinals(i))
-        } else {
-          unwrappers(i)(fieldValue, mutableRow, fieldOrdinals(i))
+        Try {
Member:
I don't feel strongly about it, but I prefer simple try-catch myself. I think it avoids having to handle the 'return value' here, when there isn't one within the while loop?

Member:
+1

Author:
Got it & done.

+          val fieldValue = soi.getStructFieldData(raw, fieldRefs(i))
+          if (fieldValue == null) {
+            mutableRow.setNullAt(fieldOrdinals(i))
+          } else {
+            unwrappers(i)(fieldValue, mutableRow, fieldOrdinals(i))
+          }
+          i += 1
+        } match {
+          case Failure(ex) =>
+            logError(s"Exception thrown in field <${fieldRefs(i).getFieldName}>")
+            throw ex
+          case Success(value) => value
         }
-        i += 1
       }

       mutableRow: InternalRow
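
For reference, the reviewers above prefer a plain try-catch over Try/match here, since the loop body produces no meaningful return value to pattern-match on. Below is a minimal sketch of that shape, reusing only names that already appear in the diff (soi, raw, fieldRefs, fieldOrdinals, unwrappers, mutableRow, logError); it illustrates the suggestion and is not necessarily the exact code that was merged.

      var i = 0
      val length = fieldRefs.length
      while (i < length) {
        try {
          val fieldValue = soi.getStructFieldData(raw, fieldRefs(i))
          if (fieldValue == null) {
            mutableRow.setNullAt(fieldOrdinals(i))
          } else {
            unwrappers(i)(fieldValue, mutableRow, fieldOrdinals(i))
          }
        } catch {
          case ex: Throwable =>
            // Log which Hive field failed to convert, then rethrow unchanged.
            logError(s"Exception thrown in field <${fieldRefs(i).getFieldName}>")
            throw ex
        }
        i += 1
      }

With try-catch, the increment stays at the end of the loop body and there is no Success/Failure result to handle, which is the point raised in the review thread.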