pyspark.sql.functions.try_reflect

pyspark.sql.functions.try_reflect(*cols)

This is a special version of reflect that performs the same operation, but returns a NULL value instead of raising an error if the invoked method throws an exception.

New in version 4.0.0.

Parameters
cols : Column or str

the first element should be a literal string for the class name, the second element should be a literal string for the method name, and the remaining elements are the input arguments to the Java method.

Examples

Example 1: Reflecting a method call with arguments

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([("a5cf6c42-0c85-418f-af6c-3e4e5b1328f2",)], ["a"])
>>> df.select(
...     sf.try_reflect(sf.lit("java.util.UUID"), sf.lit("fromString"), df.a)
... ).show()
+------------------------------------------+
|try_reflect(java.util.UUID, fromString, a)|
+------------------------------------------+
|                      a5cf6c42-0c85-418...|
+------------------------------------------+

Example 2: Exception in the reflection call, resulting in null

>>> from pyspark.sql import functions as sf
>>> df = spark.range(1)
>>> df.select(
...     sf.try_reflect(sf.lit("scala.Predef"), sf.lit("require"), sf.lit(False))
... ).show()
+-----------------------------------------+
|try_reflect(scala.Predef, require, false)|
+-----------------------------------------+
|                                     NULL|
+-----------------------------------------+
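
Example 3: Passing multiple arguments to the reflected method. This is a sketch, not from the original page; it assumes the reflection call can resolve java.lang.Math.max for two integer-typed columns, in which case the result is returned as a string.

>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([(1, 2)], ["a", "b"])
>>> df.select(
...     sf.try_reflect(sf.lit("java.lang.Math"), sf.lit("max"), df.a, df.b)
... ).show()
+--------------------------------------+
|try_reflect(java.lang.Math, max, a, b)|
+--------------------------------------+
|                                     2|
+--------------------------------------+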