[SPARK-32088][PYTHON][FOLLOWUP] Replace collect() by show() in the example for timestamp_seconds
#28959
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
[SPARK-32088][PYTHON][FOLLOWUP] Replace collect() by show() in the example for timestamp_seconds
#28959
Conversation
jiangxb1987 left a comment:
LGTM pending jenkins test
Test build #124688 has finished for PR 28959 at commit
It seems the build failure is not related to the changes:
Hi, @MaxGekk and @jiangxb1987. Could you give me more hints?
So far, I'm observing the following. I'll keep monitoring, but it would be great if you could give me more evidence so that I can help you, @MaxGekk and @jiangxb1987.
retest this please
Sure, let's hold off until the test results come in.
Test build #124718 has finished for PR 28959 at commit
retest this please
def timestamp_seconds(col):
    """
    >>> from pyspark.sql.functions import timestamp_seconds
    >>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
Do we set the timezone in PySpark doctests, like the Scala-side tests?
I'm wondering if this is the common approach we do for all timestamp related pyspark doc tests.
Yeah, we can do that too. Let's set it when we face another similar issue. It's not difficult to do it.
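One way to make such timestamp doctests machine-independent is to pin the timezone once in a setup step. A minimal, purely illustrative sketch in plain Python (a PySpark doctest suite would set the spark.sql.session.timeZone config instead; the TZ/tzset approach below is Unix-only and just demonstrates the idea):

```python
import os
import time
from datetime import datetime

# Pin the process timezone so local-time conversions are deterministic,
# the way a doctest setup step could pin it once for the whole suite.
# Note: time.tzset() is only available on Unix.
os.environ["TZ"] = "America/Los_Angeles"
time.tzset()

# A naive local-time conversion now yields the same result on any machine,
# instead of depending on where the test happens to run.
print(datetime.fromtimestamp(1230219000).strftime("%H:%M"))  # 07:30
```

Without the pinned timezone, the printed time would vary with the machine's locale, which is exactly the kind of flakiness the discussion above is about.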
Test build #124729 has finished for PR 28959 at commit
retest this please |
Test build #124735 has finished for PR 28959 at commit
Merged to master. |
FYI, this is from unidoc when you generate the documentation. It sort of compiles Scala files back to Java files and extracts the documentation from there. It's somewhat buggy from time to time.
What changes were proposed in this pull request?
Modify the example for timestamp_seconds and replace collect() by show().

Why are the changes needed?
The SQL config spark.sql.session.timeZone doesn't influence collect() in the example. The code below demonstrates that: the expected time is 07:30, but we get 15:30.
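The demonstration code itself is not preserved in this capture, but the arithmetic behind the 07:30 vs. 15:30 mismatch can be sketched in plain Python using the stdlib: show() renders the timestamp in the session timezone (America/Los_Angeles here), whereas collect() returns datetime objects converted via the client machine's local timezone (UTC on a typical CI machine). The epoch value 1230219000 below is an assumption chosen to reproduce the times mentioned; it may differ from the literal in the actual example:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# Seconds since the Unix epoch; chosen so the discrepancy described
# above is reproduced (this may not be the PR's exact literal).
epoch_seconds = 1230219000

# What show() would print: the instant rendered in the session timezone.
as_session = datetime.fromtimestamp(epoch_seconds, tz=ZoneInfo("America/Los_Angeles"))
print(as_session.strftime("%Y-%m-%d %H:%M"))  # 2008-12-25 07:30

# What collect() yielded on the test machine: the same instant rendered
# in the client's local timezone (UTC here), hence 15:30.
as_local = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
print(as_local.strftime("%Y-%m-%d %H:%M"))  # 2008-12-25 15:30
```

Both calls describe the same instant; only the rendering timezone differs, which is why switching the doctest from collect() to show() makes its output follow spark.sql.session.timeZone.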
Does this PR introduce any user-facing change?
No
How was this patch tested?
By running the modified example via: