
Conversation

@caneGuy
Contributor

@caneGuy caneGuy commented Jul 17, 2018

What changes were proposed in this pull request?

When we operate as below:

0: jdbc:hive2://xxx/> create function funnel_analysis as 'com.xxx.hive.extend.udf.UapFunnelAnalysis';

0: jdbc:hive2://xxx/> select funnel_analysis(1,",",1,'');
Error: org.apache.spark.sql.AnalysisException: Undefined function: 'funnel_analysis'. This function is neither a registered temporary function nor a permanent function registered in the database 'xxx'.; line 1 pos 7 (state=,code=0)

0: jdbc:hive2://xxx/> describe function funnel_analysis;
+-----------------------------------------------------------+--+
|                       function_desc                        |
+-----------------------------------------------------------+--+
| Function: xxx.funnel_analysis                              |
| Class: com.xxx.hive.extend.udf.UapFunnelAnalysis           |
| Usage: N/A.                                                |
+-----------------------------------------------------------+--+

We can see that describe function returns the right information, but when we actually use the function we get an undefined-function exception, which is really misleading. The real cause is the following:

No handler for Hive UDF 'com.xxx.xxx.hive.extend.udf.UapFunnelAnalysis': java.lang.IllegalStateException: Should not be called directly;
	at org.apache.hadoop.hive.ql.udf.generic.GenericUDTF.initialize(GenericUDTF.java:72)
	at org.apache.spark.sql.hive.HiveGenericUDTF.outputInspector$lzycompute(hiveUDFs.scala:204)
	at org.apache.spark.sql.hive.HiveGenericUDTF.outputInspector(hiveUDFs.scala:204)
	at org.apache.spark.sql.hive.HiveGenericUDTF.elementSchema$lzycompute(hiveUDFs.scala:212)
	at org.apache.spark.sql.hive.HiveGenericUDTF.elementSchema(hiveUDFs.scala:212)

This patch prints the actual failure cause for quick debugging.
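For readers skimming the long thread below, here is a minimal, self-contained Scala sketch of the idea. Only the `NoSuchFunctionException(db, func, cause)` shape comes from this PR (it is echoed in the SparkQA reports below); `FunctionLookup`, `Demo`, and the exact message text are illustrative stand-ins, not Spark's actual code.

```scala
import scala.util.control.NonFatal

// Carry the real lookup failure as the exception's cause instead of dropping it.
class NoSuchFunctionException(db: String, func: String, cause: Option[Throwable] = None)
  extends Exception(
    s"Undefined function: '$func'. This function is neither a registered temporary " +
      s"function nor a permanent function registered in the database '$db'.",
    cause.orNull)

object FunctionLookup {
  // Stand-in for the Hive UDF initialization that actually failed in the report above.
  private def initializeHiveUdf(className: String): Unit =
    throw new IllegalStateException("Should not be called directly")

  def lookup(db: String, func: String, className: String): Unit =
    try initializeHiveUdf(className)
    catch {
      case NonFatal(e) =>
        // Preserve the underlying failure so callers can see the root cause.
        throw new NoSuchFunctionException(db, func, Some(e))
    }
}

object Demo extends App {
  try FunctionLookup.lookup("xxx", "funnel_analysis", "com.xxx.hive.extend.udf.UapFunnelAnalysis")
  catch {
    // Prints the "Undefined function" message followed by the IllegalStateException cause.
    case e: NoSuchFunctionException => e.printStackTrace()
  }
}
```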

How was this patch tested?

UT

@caneGuy
Contributor Author

caneGuy commented Jul 17, 2018

@maropu Could you help review this? Thanks.
Since I used git merge on #21552, I have reopened it here.

@SparkQA

SparkQA commented Jul 17, 2018

Test build #93167 has finished for PR 21790 at commit 690035a.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class NoSuchFunctionException(db: String, func: String, rootCause: Option[String] = None)

@caneGuy
Contributor Author

caneGuy commented Aug 28, 2018

retest this please

@maropu
Member

maropu commented Aug 28, 2018

sure, I'll check later.

@SparkQA

SparkQA commented Aug 28, 2018

Test build #95330 has finished for PR 21790 at commit 690035a.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class NoSuchFunctionException(db: String, func: String, rootCause: Option[String] = None)

@maropu
Member

maropu commented Aug 30, 2018

retest this please

@maropu
Member

maropu commented Aug 30, 2018

@caneGuy Can you fix the title?

@maropu
Member

maropu commented Aug 30, 2018

Also, you need to add tests.

@SparkQA

SparkQA commented Aug 30, 2018

Test build #95465 has finished for PR 21790 at commit 690035a.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class NoSuchFunctionException(db: String, func: String, rootCause: Option[String] = None)

@caneGuy
Contributor Author

caneGuy commented Oct 22, 2018

I will fix the test and add a unit test as soon as possible, @maropu.
I was too busy last month; sorry for the late reply.
Thanks for your comments and your precious time again!

@caneGuy caneGuy changed the title [SPARK-24544][SQL] Print actual failure cause when look up function f… [SPARK-24544][SQL] Print actual failure cause when look up function failed Dec 19, 2018
@caneGuy
Contributor Author

caneGuy commented Dec 19, 2018

Sorry to bother you, @maropu. I tried to add some test cases, but I found that SessionCatalogSuite already tests the lookup logic.
Any suggestions for adding more test cases?

@SparkQA

SparkQA commented Dec 19, 2018

Test build #100310 has finished for PR 21790 at commit e1866d1.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
  • class NoSuchFunctionException(db: String, func: String, cause: Option[Throwable] = None)

@maropu
Member

maropu commented Dec 27, 2018

I think we need tests for the changes this PR added.

@caneGuy
Contributor Author

caneGuy commented Dec 27, 2018

@maropu Thanks for your time. I added a test case in SessionCatalogSuite.

s"a permanent function registered in the database '$db'.")
s"Undefined function: '$func'. This function is neither a registered temporary function nor " +
s"a permanent function registered in the database '$db'." +
s"${cause.map(th => s"Exception thrown during look up:" +
Member

This is too complex for string interpolation. But why not just set the exception's cause?

Contributor Author

@srowen Thanks. IIUC, sometimes the cause may be undefined; in that case we do not need to print "Exception thrown during look up".

Member

I mean, why not set the cause on AnalysisException? The cause may be null if the Option is None; that's fine.
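For concreteness, a tiny hedged sketch of what "set the cause" means at the JVM level (plain Scala, no Spark classes; the exception messages are illustrative). `None` simply maps to a `null` cause, which the JVM treats as "no cause":

```scala
object CauseDemo extends App {
  // Mapping an Option[Throwable] onto the JVM exception-cause slot.
  def toJvmCause(cause: Option[Throwable]): Throwable = cause.orNull

  val root = new IllegalStateException("Should not be called directly")
  val withCause    = new RuntimeException("Undefined function: 'funnel_analysis'.", toJvmCause(Some(root)))
  val withoutCause = new RuntimeException("Undefined function: 'funnel_analysis'.", toJvmCause(None))

  println(withCause.getCause)    // java.lang.IllegalStateException: Should not be called directly
  println(withoutCause.getCause) // null, i.e. no cause attached
}
```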

Contributor Author

Updated, @srowen. Thanks!

@SparkQA

SparkQA commented Dec 27, 2018

Test build #100475 has finished for PR 21790 at commit ec208e0.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Dec 29, 2018

Test build #100524 has finished for PR 21790 at commit 95ba6e7.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Dec 29, 2018

Test build #100522 has finished for PR 21790 at commit 26aeb58.

  • This patch fails due to an unknown error code, -9.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Dec 29, 2018

Test build #100529 has finished for PR 21790 at commit 1bff63c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

// built-in function.
// Hive is case insensitive.
val functionName = funcName.unquotedString.toLowerCase(Locale.ROOT)
logWarning(s"Encounter a failure during looking up function:" +
Member

Nit: Encounter -> Encountered, and the first string doesn't need interpolation

Contributor Author

Thanks @srowen Updated!

@SparkQA

SparkQA commented Dec 30, 2018

Test build #100558 has finished for PR 21790 at commit 820faf2.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.


test("SPARK-24544: test print actual failure cause when look up function failed") {
withBasicCatalog { catalog =>
intercept[NoSuchFunctionException] {
Member

Does this test anything about the cause of the exception? Maybe get the cause and assert on its message?
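A hedged sketch of the test shape being suggested here (ScalaTest style). `CauseException` and `FakeCatalog` are stand-ins for NoSuchFunctionException and the SessionCatalogSuite fixture, and the ScalaTest 3.x `AnyFunSuite` import is an assumption; the point is only to capture the exception and assert on its cause rather than just its type:

```scala
import org.scalatest.funsuite.AnyFunSuite

// Stand-in for NoSuchFunctionException carrying an optional cause.
class CauseException(msg: String, val cause: Option[Throwable]) extends Exception(msg, cause.orNull)

// Stand-in for the SessionCatalog fixture used in SessionCatalogSuite.
object FakeCatalog {
  def lookupFunction(name: String): Nothing =
    throw new CauseException(
      s"Undefined function: '$name'.",
      Some(new IllegalStateException("Should not be called directly")))
}

class LookupCauseSuite extends AnyFunSuite {
  test("SPARK-24544-style check: the failure cause is preserved") {
    val e = intercept[CauseException] {
      FakeCatalog.lookupFunction("funnel_analysis")
    }
    // Not just "an exception was thrown": assert on the cause and its message.
    assert(e.cause.exists(_.getMessage.contains("Should not be called directly")))
  }
}
```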

@SparkQA

SparkQA commented Jan 1, 2019

Test build #100600 has finished for PR 21790 at commit 6d3bd19.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jan 1, 2019

Test build #100603 has finished for PR 21790 at commit 97493c9.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Jan 1, 2019

Test build #100604 has finished for PR 21790 at commit d1d4adf.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@srowen
Member

srowen commented Jan 1, 2019

Merged to master

@srowen srowen closed this in 2bf4d97 Jan 1, 2019
@caneGuy caneGuy deleted the zhoukang/print-warning1 branch January 14, 2019 13:49
jackylee-ch pushed a commit to jackylee-ch/spark that referenced this pull request Feb 18, 2019
…ailed


Closes apache#21790 from caneGuy/zhoukang/print-warning1.

Authored-by: zhoukang <[email protected]>
Signed-off-by: Sean Owen <[email protected]>
@dongjoon-hyun
Member

Hi, All.
This seems to cause a regression: it now shows a misleading warning at the first invocation of every Hive function. The Hive fallback lookup should not be warned about; it is a normal step in function lookup. Since this bug has not been exposed in a release, I'll make a follow-up on the master branch.

@srowen
Member

srowen commented Jun 3, 2019

OK, if it's pretty common after all, this can be reverted. Thank you.

cloud-fan added a commit that referenced this pull request Sep 19, 2022
…n look up function failed`

### What changes were proposed in this pull request?

This reverts #21790 because it's no longer needed. It kept the original error from Hive when Spark loads builtin functions from Hive, which no longer happens as Spark has implemented all builtin functions natively.

### Why are the changes needed?

code cleanup

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

existing tests

Closes #37896 from cloud-fan/error.

Lead-authored-by: Wenchen Fan <[email protected]>
Co-authored-by: Wenchen Fan <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
LuciferYang pushed a commit to LuciferYang/spark that referenced this pull request Sep 20, 2022
…n look up function failed`
