[SPARK-14387][SQL] Exceptions thrown when querying ORC tables in Hive #12293
What changes were proposed in this pull request?
Physical files stored in Hive as ORC carry internal column names such as _col1, _col2, etc., and the mapping to the user-visible column names lives in the Hive metastore. Querying ORC tables stored in Hive through Spark's beeline client worked in earlier branches, but is broken on master. When reading ORC files, Spark should map the Hive metastore schema onto the physical file schema to preserve backward compatibility. This PR addresses that issue.
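As a rough illustration of the idea (not the actual patch), a sketch of such a mapping in Scala might look like the following; `physicalSchema` (read from the ORC file footer) and `metastoreSchema` (recorded in the Hive metastore) are hypothetical parameter names, and columns are matched positionally:

```scala
import org.apache.spark.sql.types.StructType

// Sketch: ORC files written by Hive name their columns _col0, _col1, ...,
// so the user-visible names must be recovered from the metastore schema.
def mapPhysicalToMetastore(
    physicalSchema: StructType,
    metastoreSchema: StructType): StructType = {
  require(physicalSchema.length == metastoreSchema.length,
    "physical and metastore schemas must have the same number of columns")
  StructType(physicalSchema.zip(metastoreSchema).map {
    // Keep the type read from the file, but take the column name
    // from the metastore, matching columns by position.
    case (physical, metastore) => physical.copy(name = metastore.name)
  })
}
```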
How was this patch tested?
Manual execution of TPC-DS queries at 200 GB scale.
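For reference, a minimal reproduction of the failure mode might look like the sketch below. It assumes a Hive-enabled SparkSession and that the table's ORC files carry Hive's internal column names; the table name `orc_test` and the data are hypothetical:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("orc-schema-repro")
  .enableHiveSupport()
  .getOrCreate()

// Hive's ORC writer records internal names (_col0, _col1, ...) in the
// file footer; the real names exist only in the metastore.
spark.sql("CREATE TABLE orc_test (id INT, name STRING) STORED AS ORC")
spark.sql("INSERT INTO TABLE orc_test VALUES (1, 'a'), (2, 'b')")

// Before this fix, a query like this could fail because Spark looked up
// `id` and `name` in a physical schema that only had _col0/_col1.
spark.sql("SELECT id, name FROM orc_test").show()
```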