Function - Description
aggregate(expr, start, merge, finish) - Applies a binary operator to an initial state and all elements in the array, reducing them to a single state. The final state is converted into the final result by applying a finish function.
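For instance, a small spark-sql sketch that sums an array and then scales the result (the lambda syntax is standard Spark SQL):

```sql
-- Fold the elements into a single state, starting from 0.
SELECT aggregate(array(1, 2, 3), 0, (acc, x) -> acc + x);                    -- 6
-- Same fold, then apply a finish function to the final state.
SELECT aggregate(array(1, 2, 3), 0, (acc, x) -> acc + x, acc -> acc * 10);   -- 60
```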
array_sort(expr, func) - Sorts the input array. If func is omitted, the array is sorted in ascending order. The elements of the input array must be orderable. For double/float types, NaN is greater than any non-NaN element. Null elements are placed at the end of the returned array. Since 3.0.0 this function also sorts and returns the array based on the given comparator function. The comparator takes two arguments representing two elements of the array, and returns a negative integer, 0, or a positive integer as the first element is less than, equal to, or greater than the second element. If the comparator function returns null, the function fails and raises an error.
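For example, the default ordering versus a custom comparator; the CASE expression below simply encodes the negative/zero/positive contract described above (here, sorting strings by length):

```sql
SELECT array_sort(array(5, 6, 1));   -- [1,5,6]
-- Custom comparator: order elements by string length.
SELECT array_sort(
  array('bb', 'a', 'ccc'),
  (l, r) -> CASE WHEN length(l) < length(r) THEN -1
                 WHEN length(l) > length(r) THEN 1
                 ELSE 0 END
);                                   -- ["a","bb","ccc"]
```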
cardinality(expr) - Returns the size of an array or a map. The function returns -1 for null input only if spark.sql.ansi.enabled is false and spark.sql.legacy.sizeOfNull is true; otherwise it returns null for null input. With the default settings, the function returns null for null input.
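A quick illustration with non-null input, where the configuration flags above do not come into play:

```sql
SELECT cardinality(array('b', 'd', 'c', 'a'));  -- 4
SELECT cardinality(map('a', 1, 'b', 2));        -- 2
```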
concat(col1, col2, ..., colN) - Returns the concatenation of col1, col2, ..., colN.
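The same function concatenates arrays as well as strings; for example:

```sql
SELECT concat('Spark', 'SQL');               -- SparkSQL
SELECT concat(array(1, 2, 3), array(4, 5));  -- [1,2,3,4,5]
```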
element_at(array, index) - Returns the element of the array at the given (1-based) index. If the index is 0, Spark throws an error. If the index is negative, the function accesses elements from the last to the first. The function returns NULL if the index exceeds the length of the array and `spark.sql.ansi.enabled` is set to false. If `spark.sql.ansi.enabled` is set to true, it throws ArrayIndexOutOfBoundsException for invalid indices.
element_at(map, key) - Returns the value for the given key. The function returns NULL if the key is not contained in the map.
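A sketch covering both the array and map forms; the results assume `spark.sql.ansi.enabled` is false, so out-of-range lookups yield NULL rather than an error:

```sql
SELECT element_at(array(1, 2, 3), 2);       -- 2
SELECT element_at(array(1, 2, 3), -1);      -- 3 (negative index counts from the end)
SELECT element_at(map(1, 'a', 2, 'b'), 2);  -- b
SELECT element_at(map(1, 'a', 2, 'b'), 3);  -- NULL (key not present)
```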
exists(expr, pred) - Tests whether a predicate holds for one or more elements in the array.
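For example:

```sql
SELECT exists(array(1, 2, 3), x -> x % 2 = 0);  -- true (2 is even)
SELECT exists(array(1, 3, 5), x -> x % 2 = 0);  -- false
```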
filter(expr, func) - Filters the input array using the given predicate.
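A minimal sketch, keeping only the odd elements:

```sql
SELECT filter(array(1, 2, 3), x -> x % 2 = 1);  -- [1,3]
```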
forall(expr, pred) - Tests whether a predicate holds for all elements in the array.
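For instance:

```sql
SELECT forall(array(2, 4, 8), x -> x % 2 = 0);  -- true
SELECT forall(array(1, 2, 3), x -> x % 2 = 0);  -- false
```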
map_filter(expr, func) - Filters entries in a map using the function.
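The predicate receives each key and value; for example, keeping entries whose key exceeds its value:

```sql
SELECT map_filter(map(1, 0, 2, 2, 3, -1), (k, v) -> k > v);  -- {1:0, 3:-1}
```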
map_zip_with(map1, map2, function) - Merges two given maps into a single map by applying the function to the pair of values with the same key. For keys present in only one map, NULL is passed as the value for the missing key. If an input map contains duplicated keys, only the first entry of the duplicated key is passed into the lambda function.
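A sketch showing the NULL passed for a key missing from one side; coalesce is used only to make the gap visible in the output:

```sql
SELECT map_zip_with(
  map(1, 'a', 2, 'b'),
  map(1, 'x', 3, 'y'),
  (k, v1, v2) -> concat(coalesce(v1, '_'), coalesce(v2, '_'))
);  -- {1:"ax", 2:"b_", 3:"_y"}
```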
reduce(expr, start, merge, finish) - Applies a binary operator to an initial state and all elements in the array, reducing them to a single state. The final state is converted into the final result by applying a finish function.
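The description matches aggregate above, so a minimal sketch suffices:

```sql
SELECT reduce(array(1, 2, 3), 0, (acc, x) -> acc + x);  -- 6
```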
reverse(array) - Returns a reversed string, or an array with its elements in reverse order.
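Both cases for illustration:

```sql
SELECT reverse('Spark SQL');        -- LQS krapS
SELECT reverse(array(2, 1, 4, 3));  -- [3,4,1,2]
```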
size(expr) - Returns the size of an array or a map. The function returns -1 for null input only if spark.sql.ansi.enabled is false and spark.sql.legacy.sizeOfNull is true; otherwise it returns null for null input. With the default settings, the function returns null for null input.
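size shares cardinality's description above; a quick check:

```sql
SELECT size(array('b', 'd', 'c', 'a'));  -- 4
SELECT size(map('a', 1, 'b', 2));        -- 2
```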
transform(expr, func) - Transforms elements in an array using the function.
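For example; the second call uses the two-argument lambda form, where the second argument is the element's index:

```sql
SELECT transform(array(1, 2, 3), x -> x + 1);       -- [2,3,4]
SELECT transform(array(1, 2, 3), (x, i) -> x + i);  -- [1,3,5]
```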
transform_keys(expr, func) - Transforms keys in a map using the function.
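The lambda sees each key-value pair and returns the new key; for example:

```sql
SELECT transform_keys(map(1, 'a', 2, 'b'), (k, v) -> k + 10);  -- {11:"a", 12:"b"}
```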
transform_values(expr, func) - Transforms values in a map using the function.
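Likewise, the lambda sees each key-value pair and returns the new value:

```sql
SELECT transform_values(map(1, 'a', 2, 'b'), (k, v) -> upper(v));  -- {1:"A", 2:"B"}
```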
try_element_at(array, index) - Returns the element of the array at the given (1-based) index. If the index is 0, Spark throws an error. If the index is negative, the function accesses elements from the last to the first. The function always returns NULL if the index exceeds the length of the array.
try_element_at(map, key) - Returns the value for the given key. The function always returns NULL if the key is not contained in the map.
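Both forms in one sketch; unlike element_at under ANSI mode, out-of-range lookups never raise:

```sql
SELECT try_element_at(array(1, 2, 3), 5);       -- NULL (out of range)
SELECT try_element_at(array(1, 2, 3), -1);      -- 3
SELECT try_element_at(map(1, 'a', 2, 'b'), 3);  -- NULL (key not present)
```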
zip_with(left, right, func) - Merges the two given arrays, element-wise, into a single array using the function. If one array is shorter, nulls are appended at the end to match the length of the longer array before applying the function.
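For example, including the null padding on the shorter array (coalesce makes the padded slot visible):

```sql
SELECT zip_with(array(1, 2), array(3, 4), (x, y) -> x + y);  -- [4,6]
-- The left array is shorter, so its missing element arrives as NULL:
SELECT zip_with(array('a'), array('x', 'y'),
                (x, y) -> concat(coalesce(x, '_'), y));      -- ["ax","_y"]
```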