2022-01-30 13:50:10
pyspark dropna in one column
You can use Column.isNull / Column.isNotNull to filter on a single column:

df.where(col("dt_mvmt").isNull())
df.where(col("dt_mvmt").isNotNull())

If you simply want to drop rows where that column is NULL, use na.drop with the subset argument:

df.na.drop(subset=["dt_mvmt"])