11 changes: 8 additions & 3 deletions yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
@@ -537,9 +537,14 @@ private[spark] class Client(
sys.env.get(envKey).foreach { path =>
val dir = new File(path)
if (dir.isDirectory()) {
dir.listFiles().foreach { file =>
if (file.isFile && !hadoopConfFiles.contains(file.getName())) {
hadoopConfFiles(file.getName()) = file
val files = dir.listFiles()
if (files == null) {
logWarning("Failed to list files under directory " + dir)
Member:
I don't think that necessarily means it failed; it can return null if the directory is empty. I think a simple != null check is all you want here.

Member (author):
According to the Java API docs, it returns an empty array if the directory is empty. It only returns null if the path is not a directory or an I/O error occurs (e.g. a permission issue). Without logging a warning, we might be silently ignoring a misconfiguration.

Ref: https://docs.oracle.com/javase/7/docs/api/java/io/File.html#listFiles()

Member:

Yeah, fair point: it has already been checked with isDirectory. OK, this looks good to me.

Member:
Nit: this could be s"Failed to list files under directory $dir", but I don't think it's worth changing.

} else {
files.foreach { file =>
if (file.isFile && !hadoopConfFiles.contains(file.getName())) {
hadoopConfFiles(file.getName()) = file
}
}
}
}
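The semantics discussed in the thread can be checked directly. The sketch below is a standalone illustration, not the patched Spark code: `safeListFiles` is a hypothetical helper mirroring the pattern in the diff, with `println` standing in for Spark's `logWarning`. It shows that `listFiles()` returns an empty array (not null) for an empty directory, and null for a path that is not a directory, which is why the warning is worth logging.

```scala
import java.io.File
import java.nio.file.Files

object ListFilesNullCheck {
  // Hypothetical helper mirroring the diff: collect regular files under a
  // directory, warning when listFiles() returns null. Per the java.io.File
  // Javadoc, null means "not a directory, or an I/O error occurred".
  def safeListFiles(dir: File): Seq[File] = {
    val files = dir.listFiles()
    if (files == null) {
      // The actual patch calls logWarning(...); println is a stand-in here.
      println(s"Failed to list files under directory $dir")
      Seq.empty
    } else {
      files.toSeq.filter(_.isFile)
    }
  }

  def main(args: Array[String]): Unit = {
    // Empty directory: listFiles() yields an empty array, not null.
    val emptyDir = Files.createTempDirectory("hadoopConf").toFile
    assert(emptyDir.listFiles() != null)
    assert(safeListFiles(emptyDir).isEmpty)

    // A regular file is not a directory: listFiles() yields null.
    val notADir = File.createTempFile("core-site", ".xml")
    assert(notADir.listFiles() == null)
    assert(safeListFiles(notADir).isEmpty)

    println("ok")
  }
}
```

This matches the conclusion of the thread: once `isDirectory()` has passed, a null result indicates a genuine I/O problem rather than an empty directory, so logging is appropriate.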