pyspark.sql.functions.countDistinct
In PySpark, the countDistinct function is used to calculate the number of unique elements…
Tag: PySpark
How to add a new column in PySpark using withColumn
withColumn Syntax: DataFrame.withColumn(colName, col). withColumn is commonly used to add a column to an existing DataFrame. withColumn returns a new…
How to use filter or where condition in PySpark
filter / where The filter condition filters rows based on one or more conditions. where() is an alias for filter(). In…
Explain Complex datatype PySpark (ArrayType,MapType,StructType)
There are three complex datatypes in PySpark: (1) ArrayType, (2) MapType, and (3) StructType. ArrayType ArrayType represents values comprising a sequence…
How to create tables from Spark Dataframe and join the tables (createOrReplaceTempView)
createOrReplaceTempView There are many scenarios in which you can do the transformation using SQL instead of direct Spark DataFrame operations…
How to transform a JSON Column to multiple columns based on Key in PySpark
JSON Column to multiple columns Consider a situation where the incoming raw data has a JSON column, and you need…
How to parse a column containing a JSON string using PySpark (from_json)
from_json If you have a JSON object in a column and need to do any transformation, you can use from_json. from_json…
How to get the common elements from two arrays in two columns in PySpark (array_intersect)
array_intersect When you want to get the common elements from two arrays in two columns in PySpark, you can use…
How to find the difference between two arrays in PySpark (array_except)
array_except In PySpark, array_except returns an array of the elements present in one column but not in another column…
How to convert Array elements to Rows in PySpark? PySpark – Explode Example code.
Function : pyspark.sql.functions.explode To convert an array column to rows in PySpark, we use the “explode” function. Explode returns…
How to find array contains a given value or values using PySpark ( PySpark search in array)
array_contains You can find specific values in an array using the Spark SQL function array_contains. array_contains(array, value) returns true if…