java.lang.NoClassDefFoundError scala/Product$class

Environment: Windows 7 + IDEA + Scala + Spark

Running the program locally fails with the error below:

```
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Product$class
	at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:682)
	at org.apache.spark.SparkConf$.<init>(SparkConf.scala:539)
	at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
	at org.apache.spark.SparkConf.set(SparkConf.scala:72)
	at org.apache.spark.SparkConf.setAppName(SparkConf.scala:87)
	at com.bim.WordCount$.main(WordCount.scala:9)
	at com.bim.WordCount.main(WordCount.scala)
Caused by: java.lang.ClassNotFoundException: scala.Product$class
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 7 more
```

The cause is a Scala/Spark version mismatch: each Spark release is compiled against a specific Scala version, and Scala 2.12 removed the `Trait$class` implementation classes, so a Spark build targeting Scala 2.11 throws exactly this `scala/Product$class` error when run on Scala 2.12+. A quick way to check the correct pairing: search for "spark" on https://mvnrepository.com/, open Spark Project Core, and each Spark version is listed together with the Scala version it was built for.
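For example, the Maven dependencies might look like the sketch below. The versions here are illustrative (Spark 2.4.8 built for Scala 2.11); substitute the Spark/Scala pair that mvnrepository lists for your release. The key point is that the `_2.11` suffix in the artifact id must match the Scala major version of `scala-library`:

```xml
<!-- Illustrative versions only: pick the pair shown on mvnrepository
     for your Spark release. The _2.11 artifact suffix must match the
     scala-library major version. -->
<properties>
    <scala.version>2.11.12</scala.version>
    <spark.version>2.4.8</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

Also make sure the Scala SDK configured in IDEA matches this major version; a 2.12 SDK with `_2.11` Spark artifacts reproduces the error above.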

With matching versions in place, add spark-core and spark-sql (the latter can be omitted if you don't need it) and run the program again.
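For reference, a minimal WordCount like the one in the stack trace could be sketched as follows. Only the package and object names come from the trace; the input path and `local[*]` master are placeholders for illustration:

```scala
package com.bim

import org.apache.spark.{SparkConf, SparkContext}

// Minimal WordCount sketch. "input.txt" and local[*] are
// hypothetical; adjust to your own input and cluster setup.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local[*]")
    val sc = new SparkContext(conf)

    sc.textFile("input.txt")          // read input lines
      .flatMap(_.split("\\s+"))       // split into words
      .map((_, 1))                    // pair each word with a count of 1
      .reduceByKey(_ + _)             // sum counts per word
      .collect()
      .foreach(println)

    sc.stop()
  }
}
```

If the versions are aligned, this runs locally without the `NoClassDefFoundError`; if they are not, `new SparkConf()` alone is enough to trigger it, which is why the trace points at line 9.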