Commit ac3a055

GuoPhilipse authored and dongjoon-hyun committed
[SPARK-32088][PYTHON] Pin the timezone in timestamp_seconds doctest
### What changes were proposed in this pull request?

Pin the session timezone to America/Los_Angeles in the `timestamp_seconds` doctest.

### Why are the changes needed?

The `timestamp_seconds` doctest in `functions.py` relied on the default session timezone to produce its expected result. For example:

```python
>>> time_df = spark.createDataFrame([(1230219000,)], ['unix_time'])
>>> time_df.select(timestamp_seconds(time_df.unix_time).alias('ts')).collect()
[Row(ts=datetime.datetime(2008, 12, 25, 7, 30))]
```

With a non-American timezone, the doctest produces a different result. For example, with the current timezone set to `Asia/Shanghai`, the result is:

```
[Row(ts=datetime.datetime(2008, 12, 25, 23, 30))]
```

By pinning the timezone to one specific area, the doctest yields the same expected result no matter where it runs.

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

Unit test

Closes apache#28932 from GuoPhilipse/SPARK-32088-fix-timezone-issue.

Lead-authored-by: GuoPhilipse <[email protected]>
Co-authored-by: GuoPhilipse <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
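The timezone dependence described above can be sketched with plain Python, independent of Spark. This uses the same epoch value as the doctest (`1230219000`) and assumes the standard-library `zoneinfo` module (Python 3.9+) is available:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; may need the tzdata package on some platforms

# Same epoch value used in the timestamp_seconds doctest.
unix_time = 1230219000

# One instant in time renders differently depending on the chosen timezone:
la = datetime.fromtimestamp(unix_time, tz=ZoneInfo("America/Los_Angeles"))
sh = datetime.fromtimestamp(unix_time, tz=ZoneInfo("Asia/Shanghai"))

print(la)  # 2008-12-25 07:30:00-08:00 -- the doctest's expected value
print(sh)  # 2008-12-25 23:30:00+08:00 -- what a Shanghai-based run would see
```

This is why a doctest that compares wall-clock output against a fixed string must pin the timezone: the underlying instant is identical, but its local rendering is not.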
1 parent 8795133 commit ac3a055

File tree

2 files changed: +4 additions, -2 deletions


python/pyspark/sql/functions.py

Lines changed: 2 additions & 0 deletions
```diff
@@ -1431,9 +1431,11 @@ def to_utc_timestamp(timestamp, tz):
 def timestamp_seconds(col):
     """
     >>> from pyspark.sql.functions import timestamp_seconds
+    >>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
     >>> time_df = spark.createDataFrame([(1230219000,)], ['unix_time'])
     >>> time_df.select(timestamp_seconds(time_df.unix_time).alias('ts')).collect()
     [Row(ts=datetime.datetime(2008, 12, 25, 7, 30))]
+    >>> spark.conf.unset("spark.sql.session.timeZone")
     """
 
     sc = SparkContext._active_spark_context
```
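The diff's pin-then-unset pattern can be illustrated without a Spark session using a hypothetical standalone analogue built on the POSIX `TZ` environment variable (note that `time.tzset` is POSIX-only):

```python
import os
import time
from datetime import datetime

# Hypothetical analogue of the diff's pattern: temporarily force a known
# timezone so the rendered local time is stable across machines, then
# restore the previous setting (mirroring spark.conf.unset).
old_tz = os.environ.get("TZ")
os.environ["TZ"] = "America/Los_Angeles"
time.tzset()  # POSIX only; re-reads TZ for localtime conversions

pinned = datetime.fromtimestamp(1230219000)
print(pinned)  # 2008-12-25 07:30:00, regardless of the host's own timezone

# Restore the original timezone setting.
if old_tz is None:
    del os.environ["TZ"]
else:
    os.environ["TZ"] = old_tz
time.tzset()
```

Restoring the setting afterwards matters for the same reason the doctest calls `spark.conf.unset`: later tests in the same session should not inherit the pinned timezone.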

sql/core/src/main/scala/org/apache/spark/sql/functions.scala

Lines changed: 2 additions & 2 deletions
```diff
@@ -3360,8 +3360,8 @@ object functions {
 
   /**
    * Creates timestamp from the number of seconds since UTC epoch.
-   * @group = datetime_funcs
-   * @since = 3.1.0
+   * @group datetime_funcs
+   * @since 3.1.0
    */
   def timestamp_seconds(e: Column): Column = withExpr {
     SecondsToTimestamp(e.expr)
```