Question: I get the error below while trying to create a Delta Lake table from my VM into my S3 bucket using Scala. I have every dependency installed, and the versions are compatible as well. Can you let me know why I am facing this issue?
java.lang.NoClassDefFoundError: org/apache/spark/sql/execution/datasources/FileFormatWriter$EmptyNull
  at org.apache.spark.sql.delta.DeltaLog.startTransaction(DeltaLog.scala)
  at org.apache.spark.sql.delta.DeltaLog.withNewTransaction(DeltaLog.scala)
  at org.apache.spark.sql.delta.commands.WriteIntoDelta.run(WriteIntoDelta.scala)
  at org.apache.spark.sql.delta.sources.DeltaDataSource.createRelation(DeltaDataSource.scala)
  at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala)
  at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$.$anonfun$applyOrElse$(QueryExecution.scala)
  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$(SQLExecution.scala)
  at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala)
  at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$(SQLExecution.scala)
  at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala)
  at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$.applyOrElse(QueryExecution.scala)
  at org.apache.spark.sql.execution.QueryExecution$$anonfun$eagerlyExecuteCommands$.applyOrElse(QueryExecution.scala)
  at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$(TreeNode.scala)
  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala)
  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala)
  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala)
  at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala)
  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala)
  at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala)
  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala)
  at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala)
  at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala)
  at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala)
  at org.apache.spark.sql.execution.QueryExecution.assertCommandExecuted(QueryExecution.scala)
  at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala)
  at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala)
  at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala)
  at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala)
  ... elided
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.execution.datasources.FileFormatWriter$EmptyNull
  at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java)
  at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java)
  at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java)
  ... more
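A `NoClassDefFoundError` naming a Spark-internal class (here `FileFormatWriter`) deep inside Delta's write path almost always means the `delta-core` jar on the classpath was compiled against a different Spark release than the one actually running, so an internal class it references does not exist at runtime; "every dependency installed" is not enough if the pair is mismatched. As a hedged sketch (the exact version pair below is an assumption about your setup; check the Delta Lake/Spark compatibility matrix for the Spark release on your VM), pinning matching artifacts in `build.sbt` usually resolves it:

```scala
// build.sbt -- versions shown follow the published Delta/Spark
// compatibility matrix; adjust to the Spark release you actually run.
scalaVersion := "2.12.17"

libraryDependencies ++= Seq(
  // Spark and delta-core MUST come from matching release lines:
  // e.g. delta-core 2.4.x is built against Spark 3.4.x.
  "org.apache.spark" %% "spark-sql"  % "3.4.1" % "provided",
  "io.delta"         %% "delta-core" % "2.4.0",
  // Needed for the s3a:// filesystem when writing to an S3 bucket.
  "org.apache.hadoop" % "hadoop-aws" % "3.3.4"
)
```

When building the `SparkSession`, also confirm the Delta extension and catalog are registered (`spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension` and `spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog`); a mismatch there produces different errors, but it is worth verifying while you are checking versions.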