spark class conflict with uber jar

when running my spark job, the following exception was thrown:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.lang.NullPointerException

it turns out this is caused by class conflicts.
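a quick way to confirm a conflict is to print which jar a suspect class is actually loaded from. the sketch below uses its own class as a stand-in; in practice you would pass `Class.forName(...)` the class you suspect is duplicated between the uber jar and spark:

```java
// Sketch: print the location (jar or directory) a class was loaded from,
// to see which copy of a duplicated class actually wins on the classpath.
public class WhichJar {
    public static void main(String[] args) {
        // Stand-in: in a real debugging session, replace this with
        // Class.forName("the.suspect.ClassName") for the duplicated class.
        Class<?> suspect = WhichJar.class;
        System.out.println(
            suspect.getProtectionDomain().getCodeSource().getLocation());
    }
}
```

running this inside the spark driver/executor (e.g. from a small test job) shows whether the class comes from your uber jar or from spark's own jars directory.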

some of the libraries bundled in my uber jar are also shipped with spark itself, and the two versions clash.

hence i asked spark to prefer my own libraries over the ones bundled with spark, by passing:

--conf spark.driver.userClassPathFirst=true --conf spark.executor.userClassPathFirst=true
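in context, the full spark-submit invocation looks roughly like this (the main class and jar name are placeholders, not from my actual job):

```shell
# Sketch: spark-submit with user classpath taking precedence on both
# driver and executors. MyJob and my-uber-jar.jar are hypothetical names.
spark-submit \
  --class com.example.MyJob \
  --master local[*] \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-uber-jar.jar
```

note that spark documents both properties as experimental, so behaviour can vary between versions.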

this works as intended, pointing spark to the right libraries for that conflict. however, it seems there are more libraries duplicated between the uber jar and spark, so the conflicts kept surfacing.

hence, instead of using the options above, turning to the maven shade plugin sorted out the issue.
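for reference, a minimal shade-plugin setup with class relocation looks something like this. relocation rewrites the conflicting packages into a private namespace inside the uber jar, so they can never collide with spark's copies. the relocated package below (guava's `com.google.common`, a commonly conflicting dependency) is just an illustration, not necessarily the library involved here:

```xml
<!-- sketch: maven-shade-plugin with relocation; the relocated package
     is a hypothetical example of a commonly conflicting dependency -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.4.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- example: move guava into a shaded namespace so it
                 cannot clash with the copy spark ships -->
            <pattern>com.google.common</pattern>
            <shadedPattern>shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

with this in place, the uber jar's bytecode references are rewritten to the shaded package names at build time, and the `userClassPathFirst` flags are no longer needed.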
