pyspark.sql.functions.posexplode
The posexplode function in PySpark, part of the pyspark.sql.functions module, transforms a column of arrays or maps into multiple rows, one for each element in the array or map. It is similar to the explode function, but it also returns the position of each element in a separate column.
From the official documentation: "Returns a new row for each element with position in the given array or map. Uses the default column name pos for position, and col for elements in the array and key and value for elements in the map, unless specified otherwise."
Here is an example of how to use the posexplode function in PySpark:
from pyspark.sql import SparkSession
from pyspark.sql.functions import posexplode
# Start a SparkSession
spark = SparkSession.builder.appName("PosExplode example @ Freshers.in").getOrCreate()
# Create a DataFrame
data = [([101, 202, 330],), ([43, 51],), ([66],)]
df = spark.createDataFrame(data, ["values"])
df.show()
+---------------+
| values|
+---------------+
|[101, 202, 330]|
| [43, 51]|
| [66]|
+---------------+
Applying the posexplode function
# Use the posexplode function to transform the values column
df = df.select("values", posexplode("values").alias("position", "value"))
df.show()
+---------------+--------+-----+
| values|position|value|
+---------------+--------+-----+
|[101, 202, 330]| 0| 101|
|[101, 202, 330]| 1| 202|
|[101, 202, 330]| 2| 330|
| [43, 51]| 0| 43|
| [43, 51]| 1| 51|
| [66]| 0| 66|
+---------------+--------+-----+
As you can see, the posexplode function has expanded the values column into two new columns: position and value. The position column contains the zero-based index of each element within its array, and the value column contains the element itself. Each row of the resulting DataFrame represents a single array element, with its position and value.
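The same function works on map columns, producing one row per key-value pair. Here is a minimal sketch, reusing the SparkSession from above; the properties column name and the sample data are illustrative. Because no alias is given, the default column names pos, key, and value from the documentation quote apply:
# Create a DataFrame with a map column
map_data = [({"a": 1, "b": 2},), ({"c": 3},)]
map_df = spark.createDataFrame(map_data, ["properties"])
# Explode the map: one row per entry, with its position, key, and value
map_df.select(posexplode("properties")).show()
The output will resemble the following (the order of entries within a map is not guaranteed):
+---+---+-----+
|pos|key|value|
+---+---+-----+
|  0|  a|    1|
|  1|  b|    2|
|  0|  c|    3|
+---+---+-----+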
In conclusion, the posexplode function in PySpark is a useful tool for transforming arrays and maps in Spark DataFrames. Whether you need to extract the position and value of each element or perform more complex transformations, the pyspark.sql.functions module provides the tools you need to get the job done.