
Conversation

@jonalter
Contributor

No description provided.

Member

It seems like a number of cases like this might be bad uses of println? I don't immediately see why this isn't a normal log
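
For context, a minimal sketch of the kind of println-to-logging conversion being discussed. The class and messages are hypothetical, and it assumes the 1.x-era `org.apache.spark.Logging` trait:

```scala
import org.apache.spark.Logging

// Hypothetical class illustrating the conversion: diagnostic output moves from
// println (always written to stdout) to the logging API (honors log levels).
class StaleFileCleaner extends Logging {
  def removeStale(paths: Seq[String]): Unit = {
    // Before: println(s"Removing ${paths.size} stale files")
    logInfo(s"Removing ${paths.size} stale files")
    paths.foreach { p =>
      // Per-item detail belongs at debug level rather than on stdout.
      logDebug(s"Removed $p")
    }
  }
}
```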

Contributor Author

Any guidelines on what should be a println or a log?
I'll be glad to go through and check them.

@tdas
Contributor

tdas commented Jun 29, 2015

Is there any way to exclude the example files from this rule? The examples seem to require the ugly //scalastyle:off, but println is totally okay in the examples.

@jonalter
Contributor Author

Checking out options for excluding examples

jonalter added 3 commits June 29, 2015 16:22
- Commented code does not get validated
- Some unnecessary comments removed
Conflicts:
	core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
	sql/core/src/main/scala/org/apache/spark/sql/DataFrame.scala
	sql/hive/src/test/resources/regression-test-SPARK-8489/Main.scala
@jonalter
Contributor Author

@tdas scalastyle currently does not provide a way to exclude a folder (or files) from this rule. Wrapping the full example in //scalastyle:off may make it less intrusive than wrapping the specific printlns. Thoughts?
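
For reference, a sketch of what whole-file suppression could look like; the example object is invented, but the `// scalastyle:off println` / `// scalastyle:on println` markers are the mechanism being discussed:

```scala
// scalastyle:off println
package org.apache.spark.examples

import org.apache.spark.{SparkConf, SparkContext}

/** Hypothetical example: console output is the whole point here, so the rule
  * is switched off for the entire file instead of around each call. */
object SimpleWordCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SimpleWordCount"))
    val counts = sc.textFile(args(0))
      .flatMap(_.split("\\s+"))
      .map((_, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach { case (word, n) => println(s"$word: $n") }
    sc.stop()
  }
}
// scalastyle:on println
```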

@tdas
Contributor

tdas commented Jun 30, 2015

I think that is a better idea. Much less intrusive. Thank you for finding this out :)


@jonalter
Contributor Author

jonalter commented Jul 1, 2015

@srowen - I have addressed the point you made regarding using log rather than println in some cases. Please let me know what you think. Thank you.

Member

This is a big change. I still think a number of these printlns are debug leftovers and can be removed, or in some cases turned into logging. I don't know of a good way to review these other than to just sift through this a few times. I think anything in a main() method, or in close support of a CLI utility, can stay; examples too are probably OK with println. Obviously there are some methods whose job it is to print to the console directly. For everything else, I'm not sure where you would generally allow println.

So, this is one I think can be removed?
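
To illustrate the main()/CLI category mentioned above, a hypothetical sketch (the tool, its options, and the helper are all invented):

```scala
// A CLI entry point of the kind where println-style output stays: the text is
// user-facing help, not diagnostics, so the style rule is suppressed locally.
object SubmitTool {
  private def printUsageAndExit(exitCode: Int): Unit = {
    // scalastyle:off println
    System.err.println(
      """Usage: submit-tool [options] <app jar>
        |  --class <main class>   application entry point
        |  --master <url>         cluster manager to connect to""".stripMargin)
    // scalastyle:on println
    System.exit(exitCode)
  }

  def main(args: Array[String]): Unit = {
    if (args.isEmpty) printUsageAndExit(1)
    // ... actual submission logic elided ...
  }
}
```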

@srowen
Member

srowen commented Jul 1, 2015

Heroic effort here. It certainly is big, but it is fixing up a lot of little issues of this form. I flagged a few more questions about it, but it is looking pretty good.

Is anyone else concerned about the cost of future merge conflicts on this one vs the benefit?

@srowen
Member

srowen commented Jul 7, 2015

Pinging people like @pwendell @JoshRosen @rxin @vanzin for thoughts about the scope of the change vs value -- anybody object?

@rxin
Contributor

rxin commented Jul 7, 2015

+1 on the change.

Historically, I don't think small style changes like this make it much harder to backport fixes. And most likely we don't backport printlns anyway.

The import one is slightly larger -- but even that one I'm in favor of merging towards the end of a release cycle.

@SparkQA

SparkQA commented Jul 8, 2015

Test build #1005 timed out for PR 7093 at commit 0b1dcb4 after a configured wait of 175m.

@SparkQA

SparkQA commented Jul 8, 2015

Test build #1010 timed out for PR 7093 at commit 10724b6 after a configured wait of 175m.

@srowen
Member

srowen commented Jul 8, 2015

I can't figure out why this keeps timing out as it still doesn't look related to this change. Will try again later.

@jonalter
Contributor Author

jonalter commented Jul 8, 2015

I'm thinking the issue may actually be a failure and not just a timeout. It looks like there is a serialization issue around ThreadingSuite.scala:126, which is right where I made a change. Checking it out.

Contributor Author

@srowen I confirmed the test is having trouble with logInfo here and works fine with println. Should I just revert this?

Member

Oh darn, maybe the logging ends up dragging in some reference to the outer class. Maybe just zap this whole clause, or make the error message a field in this ThreadingSuiteState and use it below.
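
A simplified, hypothetical reconstruction of why the swap failed (not the actual ThreadingSuite code):

```scala
import org.apache.spark.{Logging, SparkContext}

// The foreach body below is shipped to executors, so everything it references
// must be serializable.
class ClosureCaptureExample extends Logging {
  def run(sc: SparkContext): Unit = {
    sc.parallelize(1 to 100, 4).foreach { i =>
      // println resolves to Predef.println and captures nothing extra.
      println(s"unexpected value: $i")

      // logInfo is a method on this class (via the Logging trait), so calling
      // it here captures `this`; since the enclosing class is not serializable,
      // the job fails with "Task not serializable".
      // logInfo(s"unexpected value: $i")
    }
  }
}
```

Hence the suggestion above: either drop the clause or keep the message in ThreadingSuiteState instead.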

jonalter added 2 commits July 8, 2015 10:06
logInfo was causing the test to fail with `org.apache.spark.SparkException: Task not serializable`
@jonalter
Contributor Author

jonalter commented Jul 9, 2015

Oops, sorry about that.
Looks like we may not even need to save the message, since the fail state and count are already being saved by ThreadingSuiteState.

@SparkQA

SparkQA commented Jul 10, 2015

Test build #1037 has finished for PR 7093 at commit ccd44cc.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • implicit class DebugQuery(query: DataFrame) extends Logging
    • class HiveContext(sc: SparkContext) extends SQLContext(sc) with Logging

@asfgit asfgit closed this in e14b545 Jul 10, 2015
@jonalter jonalter deleted the SPARK-7977 branch July 10, 2015 15:43