pyspark.sql.functions.try_divide

pyspark.sql.functions.try_divide(left, right)

Returns dividend/divisor. It always performs floating point division, and returns NULL if divisor is 0.

New in version 3.5.0.

Parameters
left : Column or column name
    dividend

right : Column or column name
    divisor

Examples

Example 1: Integer divided by Integer.

>>> import pyspark.sql.functions as sf
>>> spark.createDataFrame(
...     [(6000, 15), (1990, 2), (1234, 0)], ["a", "b"]
... ).select("*", sf.try_divide("a", "b")).show()
+----+---+----------------+
|   a|  b|try_divide(a, b)|
+----+---+----------------+
|6000| 15|           400.0|
|1990|  2|           995.0|
|1234|  0|            NULL|
+----+---+----------------+
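
The arguments need not be column names; Column expressions and literals work as well, and the result is DOUBLE even for integer inputs. A minimal sketch (not from the original docstring), reusing the data above with a literal divisor:

>>> import pyspark.sql.functions as sf
>>> spark.createDataFrame(
...     [(6000, 15), (1990, 2), (1234, 0)], ["a", "b"]
... ).select(sf.try_divide(sf.col("a"), sf.lit(2))).show()
+----------------+
|try_divide(a, 2)|
+----------------+
|          3000.0|
|           995.0|
|           617.0|
+----------------+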

Example 2: Interval divided by Integer.

>>> import pyspark.sql.functions as sf
>>> df = spark.range(4).select(sf.make_interval(sf.lit(1)).alias("itvl"), "id")
>>> df.select("*", sf.try_divide("itvl", "id")).show()
+-------+---+--------------------+
|   itvl| id|try_divide(itvl, id)|
+-------+---+--------------------+
|1 years|  0|                NULL|
|1 years|  1|             1 years|
|1 years|  2|            6 months|
|1 years|  3|            4 months|
+-------+---+--------------------+
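
The same function is also available in SQL; a minimal sketch (the unaliased expression text becomes the column header):

>>> spark.sql("SELECT try_divide(6, 4)").show()
+----------------+
|try_divide(6, 4)|
+----------------+
|             1.5|
+----------------+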

Example 3: Division that would raise an exception under ANSI mode instead returns NULL.

>>> import pyspark.sql.functions as sf
>>> origin = spark.conf.get("spark.sql.ansi.enabled")
>>> spark.conf.set("spark.sql.ansi.enabled", "true")
>>> try:
...     spark.range(1).select(sf.try_divide("id", sf.lit(0))).show()
... finally:
...     spark.conf.set("spark.sql.ansi.enabled", origin)
+-----------------+
|try_divide(id, 0)|
+-----------------+
|             NULL|
+-----------------+
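
For contrast, the plain / operator fails with a DIVIDE_BY_ZERO error under ANSI mode, where try_divide returns NULL. A minimal sketch (not from the original docstring; the exact exception class varies by version, so only a fixed message is printed):

>>> import pyspark.sql.functions as sf
>>> origin = spark.conf.get("spark.sql.ansi.enabled")
>>> spark.conf.set("spark.sql.ansi.enabled", "true")
>>> try:
...     spark.range(1).select(sf.col("id") / sf.lit(0)).show()
... except Exception:
...     print("plain division raised an error")
... finally:
...     spark.conf.set("spark.sql.ansi.enabled", origin)
plain division raised an error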