
Conversation

@SlavikBaranov
Contributor

Fixed `java.sql.SQLException: No suitable driver found` when loading a DataFrame into Spark SQL if the driver is supplied with the `--jars` argument.

The problem is in the `java.sql.DriverManager` class, which can't access drivers loaded by Spark's ClassLoader.

Wrappers that forward requests are created for these drivers.

Also, it's no longer necessary to include JDBC drivers in `--driver-class-path` in local mode; specifying them in the `--jars` argument is sufficient.
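The wrapper idea in this description can be sketched in plain JDBC terms. This is a minimal illustration, not the actual patch: `DriverWrapper` and `FakeDriver` are stand-in names, and `FakeDriver` here plays the role of a driver that only Spark's ClassLoader can see. `DriverManager` only hands out drivers whose classes are visible to the caller's ClassLoader, so a wrapper loaded by the system ClassLoader is registered and forwards every call to the real driver.

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Properties;
import java.util.logging.Logger;

// Stand-in for a real driver loaded by Spark's ClassLoader (illustrative only).
class FakeDriver implements Driver {
    public Connection connect(String url, Properties info) { return null; }
    public boolean acceptsURL(String url) { return url.startsWith("jdbc:fake:"); }
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) {
        return new DriverPropertyInfo[0];
    }
    public int getMajorVersion() { return 1; }
    public int getMinorVersion() { return 0; }
    public boolean jdbcCompliant() { return false; }
    public Logger getParentLogger() { return null; }
}

// A wrapper visible to DriverManager that forwards every call to the
// wrapped driver, which may live in a different ClassLoader.
class DriverWrapper implements Driver {
    private final Driver wrapped;

    DriverWrapper(Driver wrapped) { this.wrapped = wrapped; }

    public Connection connect(String url, Properties info) throws SQLException {
        return wrapped.connect(url, info);
    }
    public boolean acceptsURL(String url) throws SQLException {
        return wrapped.acceptsURL(url);
    }
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) throws SQLException {
        return wrapped.getPropertyInfo(url, info);
    }
    public int getMajorVersion() { return wrapped.getMajorVersion(); }
    public int getMinorVersion() { return wrapped.getMinorVersion(); }
    public boolean jdbcCompliant() { return wrapped.jdbcCompliant(); }
    public Logger getParentLogger() throws SQLFeatureNotSupportedException {
        return wrapped.getParentLogger();
    }
}

public class Main {
    public static void main(String[] args) throws Exception {
        // Register the wrapper; DriverManager can now resolve URLs that
        // only the wrapped driver understands.
        DriverManager.registerDriver(new DriverWrapper(new FakeDriver()));
        Driver d = DriverManager.getDriver("jdbc:fake:test");
        System.out.println(d.acceptsURL("jdbc:fake:test"));
    }
}
```

The key point is that the wrapper class itself is loaded by a ClassLoader that `DriverManager` trusts, while every actual JDBC operation is delegated.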

@SlavikBaranov SlavikBaranov changed the title [SPARK-6913] Fixed "java.sql.SQLException: No suitable driver found" [SPARK-6913][SQL] Fixed "java.sql.SQLException: No suitable driver found" Apr 29, 2015
@rxin
Contributor

rxin commented Apr 30, 2015

Jenkins, test this please.

Contributor

This doesn't need to be a case class, does it? It looks like you can just slightly change the pattern matching down below.

@SparkQA

SparkQA commented Apr 30, 2015

Test build #31370 has finished for PR 5782 at commit b2a727c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch does not change any dependencies.

@SlavikBaranov
Contributor Author

Thanks for comments, fixed.

@rxin
Contributor

rxin commented Apr 30, 2015

Jenkins, test this please.

@SparkQA

SparkQA commented Apr 30, 2015

Test build #31394 has finished for PR 5782 at commit 510c43f.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.
  • This patch adds the following new dependencies:
    • jaxb-api-2.2.7.jar
    • jaxb-core-2.2.7.jar
    • jaxb-impl-2.2.7.jar
    • pmml-agent-1.1.15.jar
    • pmml-model-1.1.15.jar
    • pmml-schema-1.1.15.jar
  • This patch removes the following dependencies:
    • activation-1.1.jar
    • jaxb-api-2.2.2.jar
    • jaxb-impl-2.2.3-1.jar

@marmbrus
Contributor

marmbrus commented May 1, 2015

This is very clever. Thanks! I ran the docker based mysql and postgres tests manually. Merging to master.

@asfgit asfgit closed this in e991255 May 1, 2015
@tgravescs
Contributor

@marmbrus @SlavikBaranov @rxin this breaks the build with JDK 6:

[error] /home/tgraves/tgravescs_spark/sql/core/src/main/scala/org/apache/spark/sql/jdbc/jdbc.scala:198: value getParentLogger is not a member of java.sql.Driver
[error] override def getParentLogger: java.util.logging.Logger = wrapped.getParentLogger
[error] ^

java.sql.Driver.getParentLogger doesn't exist in JDK 6, only JDK 7.
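One common way to keep such code compiling on JDK 6 is to avoid any compile-time reference to the method and invoke it reflectively. This is a hedged sketch only, with illustrative names (`parentLogger`, `Demo`); whether apache#5847 takes this exact route is not confirmed in this thread.

```java
import java.lang.reflect.Method;

public class Main {
    // Invoke getParentLogger() reflectively, so the source never names the
    // method on java.sql.Driver itself (JDK 6 does not declare it).
    static Object parentLogger(Object driver) throws Exception {
        Method m = driver.getClass().getMethod("getParentLogger");
        return m.invoke(driver);
    }

    // Illustrative stand-in for a JDK 7-era driver that has the method.
    public static class Demo {
        public String getParentLogger() { return "demo-logger"; }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parentLogger(new Demo())); // prints "demo-logger"
    }
}
```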


@marmbrus
Copy link
Contributor

marmbrus commented May 1, 2015

@tgravescs, thanks for the heads up. We are taking a look and will revert the patch if there isn't an easy fix.

/cc @yhuai

@yhuai
Contributor

yhuai commented May 1, 2015

#5847 tries to fix it.

@SlavikBaranov SlavikBaranov deleted the SPARK-6913 branch May 5, 2015 08:39
jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request May 28, 2015
…und"

Fixed `java.sql.SQLException: No suitable driver found` when loading DataFrame into Spark SQL if the driver is supplied with `--jars` argument.

The problem is in `java.sql.DriverManager` class that can't access drivers loaded by Spark ClassLoader.

Wrappers that forward requests are created for these drivers.

Also, it's not necessary any more to include JDBC drivers in `--driver-class-path` in local mode, specifying in `--jars` argument is sufficient.

Author: Vyacheslav Baranov <[email protected]>

Closes apache#5782 from SlavikBaranov/SPARK-6913 and squashes the following commits:

510c43f [Vyacheslav Baranov] [SPARK-6913] Fixed review comments
b2a727c [Vyacheslav Baranov] [SPARK-6913] Fixed thread race on driver registration
c8294ae [Vyacheslav Baranov] [SPARK-6913] Fixed "No suitable driver found" when using JDBC driver added with SparkContext.addJar
@wangxj1127

Hi,
What is the workaround for this issue if I am using Spark 1.3.1? I have modified compute-classpath.sh to append the mysql-connector jar, but it still doesn't work in yarn-cluster or yarn-client mode. Only local mode works for me. Thanks.

@wangxj1127

Later on I found that adding the "spark.executor.extraClassPath" property with a value pointing to the jar works. But according to the Spark documentation below, the best way is to edit compute-classpath.sh. I don't know why that doesn't work for me.
https://spark.apache.org/docs/latest/sql-programming-guide.html#jdbc-to-other-databases
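For readers hitting this on Spark 1.3.x, the workaround described above can be sketched at submit time roughly as follows. The jar paths and version are hypothetical; adjust them to your environment. With `--jars`, the connector jar is shipped to each YARN container's working directory, which is why the executor-side entry can be just the bare jar name.

```shell
# Hypothetical paths/version -- adjust to your environment.
spark-submit \
  --master yarn-client \
  --jars /path/to/mysql-connector-java-5.1.34.jar \
  --conf spark.driver.extraClassPath=/path/to/mysql-connector-java-5.1.34.jar \
  --conf spark.executor.extraClassPath=mysql-connector-java-5.1.34.jar \
  my-app.jar
```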

jeanlyn pushed a commit to jeanlyn/spark that referenced this pull request Jun 12, 2015
nemccarthy pushed a commit to nemccarthy/spark that referenced this pull request Jun 19, 2015
@Shawon91sust

How does this patch work? I am a newbie in Spark and facing the same problem. Can anyone tell me how to work through this? I mean, how do I resolve the "suitable JDBC driver not found" problem?

@yubao2

yubao2 commented Nov 21, 2018

Hi, I'm a newbie and don't know where to start with this kind of problem; I'm hoping for some help. I'm using Spark SQL to save processed data into MySQL, and when I package the job, upload it, and run it I get "java.sql.SQLException: No suitable driver". The Spark version is 2.3.2. I'd appreciate an effective solution. Thanks.

@wanshicheng

Hi, I'm a newbie and don't know where to start with this kind of problem; I'm hoping for some help. I'm using Spark SQL to save processed data into MySQL, and when I package the job, upload it, and run it I get "java.sql.SQLException: No suitable driver". The Spark version is 2.3.2. I'd appreciate an effective solution. Thanks.

In the operator that needs the JDBC driver, register the driver first:
DriverRegistry.register(className)
It uses Spark's ClassLoader to load the driver by its fully qualified class name.
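The registration step the reply describes can be illustrated with plain JDBC. This is a minimal, self-contained sketch, not Spark's internal `DriverRegistry`; `StubDriver` and `register` are illustrative stand-ins (a real call would pass a name like "com.mysql.jdbc.Driver"). The idea: load the driver class by its fully qualified name through the desired ClassLoader, then hand an instance to `DriverManager`.

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.util.Properties;
import java.util.logging.Logger;

// Stub driver so the sketch is runnable without a real database jar.
class StubDriver implements Driver {
    public StubDriver() {}
    public Connection connect(String url, Properties info) { return null; }
    public boolean acceptsURL(String url) { return url.startsWith("jdbc:stub:"); }
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) {
        return new DriverPropertyInfo[0];
    }
    public int getMajorVersion() { return 1; }
    public int getMinorVersion() { return 0; }
    public boolean jdbcCompliant() { return false; }
    public Logger getParentLogger() { return null; }
}

public class Main {
    // Load the driver class by its fully qualified name through the given
    // ClassLoader, then register the instance with DriverManager.
    static void register(String className, ClassLoader loader) throws Exception {
        Driver d = (Driver) Class.forName(className, true, loader).newInstance();
        DriverManager.registerDriver(d);
    }

    public static void main(String[] args) throws Exception {
        register("StubDriver", Thread.currentThread().getContextClassLoader());
        System.out.println(DriverManager.getDriver("jdbc:stub:db").acceptsURL("jdbc:stub:db"));
    }
}
```

The explicit ClassLoader argument is what matters here: passing the right loader is how a class added via `--jars` or `SparkContext.addJar` becomes reachable.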
