Question: I would like to convert the following method from Scala Spark to PySpark:
import java.time.LocalDate
import java.time.format.DateTimeFormatter

// Returns curr_date when it is a non-empty, valid yyyy-MM-dd string,
// otherwise today's date formatted as yyyy-MM-dd.
def getCurrDateFromConfElseCurrentDt(curr_date: String): String = {
  var dt = ""
  if (curr_date.length > 0 && isValidDateFrmt_yyyyMMdd(curr_date)) {
    dt = curr_date
  } else {
    dt = LocalDate.now().format(DateTimeFormatter.ISO_LOCAL_DATE)
  }
  dt
}
// Returns true only when dt matches the yyyy-MM-dd pattern;
// the match result is returned directly, and any regex error yields false.
def isValidDateFrmt_yyyyMMdd(dt: String): Boolean = {
  val pattern = "[0-9]{4}-[0-9]{2}-[0-9]{2}$"
  try {
    dt.matches(pattern)
  } catch {
    case _: Exception => false
  }
}
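
One possible PySpark conversion, as a sketch: the Scala method only manipulates plain strings and java.time values, so no Spark API is needed at all; Python's built-in re and datetime modules cover it. The version below keeps the original Scala method names so it can replace them one-for-one; that naming choice (and returning the regex match result directly from the validator) is my own assumption, not something stated in the question.

import re
from datetime import date

def isValidDateFrmt_yyyyMMdd(dt: str) -> bool:
    # True only when dt matches the yyyy-MM-dd pattern (assumed behavior;
    # the original Scala validator discarded the match result).
    return bool(re.match(r"^[0-9]{4}-[0-9]{2}-[0-9]{2}$", dt))

def getCurrDateFromConfElseCurrentDt(curr_date: str) -> str:
    # Use curr_date when it is a non-empty, valid yyyy-MM-dd string,
    # otherwise fall back to today's date in ISO format (yyyy-MM-dd).
    if len(curr_date) > 0 and isValidDateFrmt_yyyyMMdd(curr_date):
        return curr_date
    return date.today().isoformat()

For example, getCurrDateFromConfElseCurrentDt("2024-01-15") returns "2024-01-15", while an empty or badly formatted value falls back to today's date.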