[SPARK-6913][SQL] Fixed "java.sql.SQLException: No suitable driver found" #5782
Conversation
…river added with SparkContext.addJar
|
Jenkins, test this please. |
this doesn't need to be a case class, does it? Looks like you can just slightly change the pattern matching down below.
|
Test build #31370 has finished for PR 5782 at commit
|
|
Thanks for the comments, fixed. |
|
Jenkins, test this please. |
|
Test build #31394 has finished for PR 5782 at commit
|
|
This is very clever. Thanks! I ran the Docker-based MySQL and Postgres tests manually. Merging to master. |
|
@marmbrus @SlavikBaranov @rxin this breaks the build with JDK 6: `[error] /home/tgraves/tgravescs_spark/sql/core/src/main/scala/org/apache/spark/sql/jdbc/jdbc.scala:198: value getParentLogger is not a member of java.sql.Driver`. `java.sql.Driver.getParentLogger` doesn't exist in JDK 6, only JDK 7. |
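For illustration, a minimal sketch of the incompatibility and one possible reflective workaround; this is not necessarily the approach taken in #5847:

```scala
import java.sql.Driver
import java.util.logging.Logger

// The direct call compiles only on JDK 7+, because getParentLogger was added to
// java.sql.Driver in JDK 7; under JDK 6 it fails with the error quoted above:
//   def parentLogger(d: Driver): Logger = d.getParentLogger

// A JDK 6-compatible alternative is to look the method up reflectively, so the code
// also compiles against JDK 6's java.sql.Driver. Note it still throws
// NoSuchMethodException at runtime for pre-JDBC-4.1 drivers.
def parentLogger(d: Driver): Logger =
  d.getClass.getMethod("getParentLogger").invoke(d).asInstanceOf[Logger]
```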
|
@tgravescs, thanks for the heads up. We are taking a look and will revert the patch if there isn't an easy fix. /cc @yhuai |
|
#5847 tries to fix it. |
…und" Fixed `java.sql.SQLException: No suitable driver found` when loading DataFrame into Spark SQL if the driver is supplied with `--jars` argument. The problem is in `java.sql.DriverManager` class that can't access drivers loaded by Spark ClassLoader. Wrappers that forward requests are created for these drivers. Also, it's not necessary any more to include JDBC drivers in `--driver-class-path` in local mode, specifying in `--jars` argument is sufficient. Author: Vyacheslav Baranov <[email protected]> Closes apache#5782 from SlavikBaranov/SPARK-6913 and squashes the following commits: 510c43f [Vyacheslav Baranov] [SPARK-6913] Fixed review comments b2a727c [Vyacheslav Baranov] [SPARK-6913] Fixed thread race on driver registration c8294ae [Vyacheslav Baranov] [SPARK-6913] Fixed "No suitable driver found" when using using JDBC driver added with SparkContext.addJar
|
Hi, |
|
Later on I found that adding the "spark.executor.extraClassPath" property with a value pointing to the jar works. But according to the Spark documentation below, the best way is to edit compute-classpath.sh. I don't know why it doesn't work for me. |
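For reference, a minimal sketch of that workaround, setting the property programmatically before the context starts; the jar path is a hypothetical example and must exist on every executor node:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// spark.executor.extraClassPath is more commonly set in spark-defaults.conf or on the
// spark-submit command line, but setting it on the SparkConf also works because the
// executors are launched after the driver reads this configuration.
val conf = new SparkConf()
  .setAppName("jdbc-extra-classpath-example")
  .set("spark.executor.extraClassPath", "/opt/jars/mysql-connector-java-5.1.38.jar")

val sc = new SparkContext(conf)
```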
…und" Fixed `java.sql.SQLException: No suitable driver found` when loading DataFrame into Spark SQL if the driver is supplied with `--jars` argument. The problem is in `java.sql.DriverManager` class that can't access drivers loaded by Spark ClassLoader. Wrappers that forward requests are created for these drivers. Also, it's not necessary any more to include JDBC drivers in `--driver-class-path` in local mode, specifying in `--jars` argument is sufficient. Author: Vyacheslav Baranov <[email protected]> Closes apache#5782 from SlavikBaranov/SPARK-6913 and squashes the following commits: 510c43f [Vyacheslav Baranov] [SPARK-6913] Fixed review comments b2a727c [Vyacheslav Baranov] [SPARK-6913] Fixed thread race on driver registration c8294ae [Vyacheslav Baranov] [SPARK-6913] Fixed "No suitable driver found" when using using JDBC driver added with SparkContext.addJar
…und" Fixed `java.sql.SQLException: No suitable driver found` when loading DataFrame into Spark SQL if the driver is supplied with `--jars` argument. The problem is in `java.sql.DriverManager` class that can't access drivers loaded by Spark ClassLoader. Wrappers that forward requests are created for these drivers. Also, it's not necessary any more to include JDBC drivers in `--driver-class-path` in local mode, specifying in `--jars` argument is sufficient. Author: Vyacheslav Baranov <[email protected]> Closes apache#5782 from SlavikBaranov/SPARK-6913 and squashes the following commits: 510c43f [Vyacheslav Baranov] [SPARK-6913] Fixed review comments b2a727c [Vyacheslav Baranov] [SPARK-6913] Fixed thread race on driver registration c8294ae [Vyacheslav Baranov] [SPARK-6913] Fixed "No suitable driver found" when using using JDBC driver added with SparkContext.addJar
|
How do I apply this patch? I am a newbie in Spark and facing the same problem. Can anyone tell me how to work through this? I mean, how do I resolve the "No suitable driver found" JDBC problem? |
|
Hello, I'm a newcomer and I don't know where to start with a problem like this; I hope I can get some effective help. I'm using Spark SQL to save processed data to MySQL, and when I package, upload, and run the job I get "java.sql.SQLException: No suitable driver". The Spark version is 2.3.2. I hope there's an effective solution. Thanks.
In the operator that needs the JDBC driver, register the driver first:
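A minimal sketch of that suggestion, assuming the JDBC driver jar has been shipped to the executors (for example with `--jars`); the helper name, driver class, URL, credentials, and table are placeholders:

```scala
import java.sql.DriverManager
import org.apache.spark.sql.DataFrame

// Hypothetical helper: writes a DataFrame to MySQL, loading the driver class on the
// executors first so it registers itself with java.sql.DriverManager.
def writeToMySql(df: DataFrame): Unit = {
  df.rdd.foreachPartition { rows =>
    // Force-load the driver class inside the executor JVM before getConnection is called.
    Class.forName("com.mysql.jdbc.Driver")
    val conn = DriverManager.getConnection(
      "jdbc:mysql://dbhost:3306/mydb", "user", "password")
    try {
      val stmt = conn.prepareStatement("INSERT INTO results VALUES (?)")
      rows.foreach { row =>
        stmt.setString(1, row.getString(0)) // placeholder: adapt to the real schema
        stmt.executeUpdate()
      }
      stmt.close()
    } finally {
      conn.close()
    }
  }
}
```

For plain DataFrame writes, passing the driver class explicitly via the JDBC data source's `driver` option (for example `df.write.format("jdbc").option("driver", "com.mysql.jdbc.Driver")`) is another common way to avoid the "No suitable driver" error.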
Fixed `java.sql.SQLException: No suitable driver found` when loading a DataFrame into Spark SQL if the driver is supplied with the `--jars` argument.

The problem is in the `java.sql.DriverManager` class, which can't access drivers loaded by the Spark ClassLoader. Wrappers that forward requests are created for these drivers.

Also, it is no longer necessary to include JDBC drivers in `--driver-class-path` in local mode; specifying them in the `--jars` argument is sufficient.
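As an illustration of the wrapper approach described above, here is a minimal sketch (not the exact code merged in this PR; `DriverWrapperSketch` and `registerSketch` are hypothetical names): a proxy `java.sql.Driver` that is visible to `DriverManager` and forwards every call to the driver loaded by the Spark ClassLoader.

```scala
import java.sql.{Connection, Driver, DriverManager, DriverPropertyInfo}
import java.util.Properties
import java.util.logging.Logger

// `wrapped` is the driver instance loaded through the Spark ClassLoader
// (for example from a jar supplied with --jars).
class DriverWrapperSketch(val wrapped: Driver) extends Driver {
  override def acceptsURL(url: String): Boolean = wrapped.acceptsURL(url)
  override def connect(url: String, info: Properties): Connection = wrapped.connect(url, info)
  override def getPropertyInfo(url: String, info: Properties): Array[DriverPropertyInfo] =
    wrapped.getPropertyInfo(url, info)
  override def getMajorVersion(): Int = wrapped.getMajorVersion
  override def getMinorVersion(): Int = wrapped.getMinorVersion
  override def jdbcCompliant(): Boolean = wrapped.jdbcCompliant()
  // getParentLogger only exists from JDK 7 on; see the build-break discussion above.
  override def getParentLogger(): Logger = wrapped.getParentLogger
}

// Registering the wrapper makes the underlying driver reachable through DriverManager,
// even though DriverManager itself cannot see classes loaded by Spark's ClassLoader.
def registerSketch(driverClassName: String): Unit = {
  val wrapped = Class
    .forName(driverClassName, true, Thread.currentThread().getContextClassLoader)
    .newInstance()
    .asInstanceOf[Driver]
  DriverManager.registerDriver(new DriverWrapperSketch(wrapped))
}
```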