[SPARK-23510][SQL] Support Hive 2.2 and Hive 2.3 metastore #20668
Conversation
```scala
private[client] class Shim_v2_3 extends Shim_v2_2 {

  val environmentContext = new EnvironmentContext()
  environmentContext.putToProperties("DO_NOT_UPDATE_STATS", "true")
```
Otherwise it will throw a `NumberFormatException`:
```
[info] Cause: java.lang.NumberFormatException: null
[info] at java.lang.Long.parseLong(Long.java:552)
[info] at java.lang.Long.parseLong(Long.java:631)
[info] at org.apache.hadoop.hive.metastore.MetaStoreUtils.isFastStatsSame(MetaStoreUtils.java:315)
[info] at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterPartitions(HiveAlterHandler.java:605)
[info] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_partitions_with_environment_context(HiveMetaStore.java:3837)
```
For details, see:
https://github.com/apache/hive/blob/5468207e430b8b3fad6d65f2fcd80d1042cf8327/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java#L1191-L1192
https://issues.apache.org/jira/browse/HIVE-15653
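The failure mode can be illustrated with a minimal, self-contained sketch (the `EnvironmentContext` class and `fastStatsCheck` helper below are simplified stand-ins for the real Hive metastore API, for illustration only): without the `DO_NOT_UPDATE_STATS` hint, Hive's fast-stats comparison parses stats values as longs, and a partition with no stats parameters ends up calling `Long.parseLong(null)`.

```scala
import scala.collection.mutable

// Stand-in for org.apache.hadoop.hive.metastore.api.EnvironmentContext
// (hypothetical simplification, not the real Thrift-generated class).
class EnvironmentContext {
  val properties = mutable.Map.empty[String, String]
  def putToProperties(k: String, v: String): Unit = properties(k) = v
}

// Rough model of the metastore's fast-stats comparison: it is skipped
// entirely when the context carries DO_NOT_UPDATE_STATS=true.
def fastStatsCheck(params: Map[String, String], ctx: EnvironmentContext): Boolean = {
  if (ctx.properties.get("DO_NOT_UPDATE_STATS").contains("true")) {
    true // metastore skips the stats comparison
  } else {
    // "totalSize" may be absent, which surfaces as null in the Java API,
    // and Long.parseLong(null) throws NumberFormatException: null
    java.lang.Long.parseLong(params.getOrElse("totalSize", null)) >= 0
  }
}

val ctx = new EnvironmentContext
ctx.putToProperties("DO_NOT_UPDATE_STATS", "true")
assert(fastStatsCheck(Map.empty, ctx)) // no stats present, yet no exception
```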
```scala
ConfVars.METASTORE_AGGREGATE_STATS_CACHE_MAX_READER_WAIT -> TimeUnit.MILLISECONDS,
ConfVars.HIVES_AUTO_PROGRESS_TIMEOUT -> TimeUnit.SECONDS,
ConfVars.HIVE_LOG_INCREMENTAL_PLAN_PROGRESS_INTERVAL -> TimeUnit.MILLISECONDS,
ConfVars.HIVE_STATS_JDBC_TIMEOUT -> TimeUnit.SECONDS,
```
Remove `HIVE_STATS_JDBC_TIMEOUT` for Hive 2.0.0 and later; the variable was removed from Hive in 2.0.0.
For details, see: https://issues.apache.org/jira/browse/HIVE-12164
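One way to handle this is to build the time-unit map per metastore version, including `hive.stats.jdbc.timeout` only below 2.0. This is a hedged sketch with a hypothetical helper name, not Spark's actual shim code:

```scala
import java.util.concurrent.TimeUnit

// Hypothetical helper: which time-valued conf vars exist for a given
// Hive metastore version. hive.stats.jdbc.timeout was removed in Hive
// 2.0.0 (HIVE-12164), so it must not be mapped for versions >= 2.0.
def timeVarsFor(version: String): Map[String, TimeUnit] = {
  val common = Map(
    "hive.metastore.client.socket.timeout" -> TimeUnit.SECONDS
  )
  if (version.split("\\.").head.toInt < 2) {
    common + ("hive.stats.jdbc.timeout" -> TimeUnit.SECONDS)
  } else {
    common
  }
}

assert(timeVarsFor("1.2").contains("hive.stats.jdbc.timeout"))
assert(!timeVarsFor("2.3").contains("hive.stats.jdbc.timeout"))
```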
But we also need to support all the previous versions.
Test build #87645 has finished for PR 20668 at commit

Test build #87646 has finished for PR 20668 at commit
```scala
private[client] class Shim_v2_2 extends Shim_v2_1 {

}
```
Please remove the empty `{}`.
```diff
 // hive.metastore.schema.verification from false to true since 2.0
 // For details, see the JIRA HIVE-6113 and HIVE-12463
-if (version == "2.0" || version == "2.1") {
+if (version.split("\\.").head.toInt > 1) {
```
`if (version == "2.0" || version == "2.1" || version == "2.2" || version == "2.3") {`

Also need to update
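The two candidate checks agree for every Hive version the shim supports; a quick sketch comparing them (assuming only well-formed version strings such as "1.2" or "2.3"):

```scala
// Enumerated form: matches exactly the listed 2.x versions.
def enumerated(version: String): Boolean =
  version == "2.0" || version == "2.1" || version == "2.2" || version == "2.3"

// Numeric form: matches any major version greater than 1, so it also
// covers future versions without further edits.
def numeric(version: String): Boolean =
  version.split("\\.").head.toInt > 1

// Identical on every supported version string...
for (v <- Seq("0.13", "1.2", "2.0", "2.1", "2.2", "2.3")) {
  assert(enumerated(v) == numeric(v))
}
// ...and they diverge only beyond 2.3.
assert(numeric("3.0") && !enumerated("3.0"))
```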
```scala
  classOf[JList[Partition]],
  classOf[EnvironmentContext])

override def alterPartitions(hive: Hive, tableName: String, newParts: JList[Partition]): Unit = {
```
If we do not add `alterPartitionsMethod`, which test case will fail?
alterPartitions:
[info] - 2.3: alterPartitions *** FAILED *** (50 milliseconds)
[info] java.lang.reflect.InvocationTargetException:
[info] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[info] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[info] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[info] at java.lang.reflect.Method.invoke(Method.java:498)
[info] at org.apache.spark.sql.hive.client.Shim_v2_1.alterPartitions(HiveShim.scala:1144)
[info] at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply$mcV$sp(HiveClientImpl.scala:616)
[info] at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
[info] at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$alterPartitions$1.apply(HiveClientImpl.scala:607)
[info] at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:275)
[info] at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:213)
[info] at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:212)
[info] at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:258)
[info] at org.apache.spark.sql.hive.client.HiveClientImpl.alterPartitions(HiveClientImpl.scala:607)
[info] at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:432)
[info] at org.apache.spark.sql.hive.client.VersionsSuite$$anonfun$6$$anonfun$apply$55.apply(VersionsSuite.scala:424)
[info] at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
[info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:103)
[info] at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:183)
[info] at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
[info] at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:196)
[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
[info] at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:196)
[info] at org.scalatest.FunSuite.runTest(FunSuite.scala:1560)
[info] at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
[info] at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:229)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:396)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:384)
[info] at scala.collection.immutable.List.foreach(List.scala:381)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
[info] at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:379)
[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
[info] at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:229)
[info] at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
[info] at org.scalatest.Suite$class.run(Suite.scala:1147)
[info] at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
[info] at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
[info] at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:233)
[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
[info] at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:233)
[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:52)
[info] at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:213)
[info] at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
[info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:52)
[info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:480)
[info] at sbt.ForkMain$Run$2.call(ForkMain.java:296)
[info] at sbt.ForkMain$Run$2.call(ForkMain.java:286)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info] at java.lang.Thread.run(Thread.java:748)
[info] Cause: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to alter partition. java.lang.NumberFormatException: null
[info] at org.apache.hadoop.hive.ql.metadata.Hive.alterPartitions(Hive.java:736)
[info] ... (remaining frames identical to the stack trace above)
[info] Cause: org.apache.hadoop.hive.metastore.api.MetaException: java.lang.NumberFormatException: null
[info] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newMetaException(HiveMetaStore.java:6143)
[info] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_partitions_with_environment_context(HiveMetaStore.java:3876)
[info] at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148)
[info] at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
[info] at com.sun.proxy.$Proxy101.alter_partitions_with_environment_context(Unknown Source)
[info] at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.alter_partitions(HiveMetaStoreClient.java:1527)
[info] at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:173)
[info] at com.sun.proxy.$Proxy102.alter_partitions(Unknown Source)
[info] at org.apache.hadoop.hive.ql.metadata.Hive.alterPartitions(Hive.java:734)
[info] ... (remaining frames identical to the stack trace above)
[info] Cause: java.lang.NumberFormatException: null
[info] at java.lang.Long.parseLong(Long.java:552)
[info] at java.lang.Long.parseLong(Long.java:631)
[info] at org.apache.hadoop.hive.metastore.MetaStoreUtils.isFastStatsSame(MetaStoreUtils.java:315)
[info] at org.apache.hadoop.hive.metastore.HiveAlterHandler.alterPartitions(HiveAlterHandler.java:605)
[info] at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.alter_partitions_with_environment_context(HiveMetaStore.java:3837)
[info] ... (remaining frames identical to the stack trace above)
Test build #87652 has finished for PR 20668 at commit
Based on my understanding, the test failure is caused by a bug in the test case. When doing an alter partition, is it possible that we could alter a partition without
Otherwise, the following change to `sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala` is needed:

```diff
--- sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala
+++ sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalogSuite.scala
@@ -955,8 +955,10 @@
     val oldPart1 = catalog.getPartition(TableIdentifier("tbl2", Some("db2")), part1.spec)
     val oldPart2 = catalog.getPartition(TableIdentifier("tbl2", Some("db2")), part2.spec)
     catalog.alterPartitions(TableIdentifier("tbl2", Some("db2")), Seq(
-      oldPart1.copy(storage = storageFormat.copy(locationUri = Some(newLocation))),
-      oldPart2.copy(storage = storageFormat.copy(locationUri = Some(newLocation)))))
+      oldPart1.copy(parameters = oldPart1.parameters,
+        storage = storageFormat.copy(locationUri = Some(newLocation))),
+      oldPart2.copy(parameters = oldPart2.parameters,
+        storage = storageFormat.copy(locationUri = Some(newLocation)))))
     val newPart1 = catalog.getPartition(TableIdentifier("tbl2", Some("db2")), part1.spec)
     val newPart2 = catalog.getPartition(TableIdentifier("tbl2", Some("db2")), part2.spec)
     assert(newPart1.storage.locationUri == Some(newLocation))
@@ -965,7 +967,9 @@
     assert(oldPart2.storage.locationUri != Some(newLocation))
     // Alter partitions without explicitly specifying database
     catalog.setCurrentDatabase("db2")
-    catalog.alterPartitions(TableIdentifier("tbl2"), Seq(oldPart1, oldPart2))
+    catalog.alterPartitions(TableIdentifier("tbl2"),
+      Seq(oldPart1.copy(parameters = newPart1.parameters),
+        oldPart2.copy(parameters = newPart2.parameters)))
     val newerPart1 = catalog.getPartition(TableIdentifier("tbl2"), part1.spec)
     val newerPart2 = catalog.getPartition(TableIdentifier("tbl2"), part2.spec)
     assert(oldPart1.storage.locationUri == newerPart1.storage.locationUri)
```
We hit the test failure?

Yes, if we do not add

Why does my PR #20671 not fail?
## What changes were proposed in this pull request?

This is based on apache#20668 for supporting Hive 2.2 and Hive 2.3 metastore. When we merge the PR, we should give the major credit to wangyum.

## How was this patch tested?

Added the test cases.

Author: Yuming Wang <[email protected]>
Author: gatorsmile <[email protected]>

Closes apache#20671 from gatorsmile/pr-20668.
## What changes were proposed in this pull request?

Support Hive 2.2 and Hive 2.3 metastore.

## How was this patch tested?

Existing tests.