
I installed Hadoop, Mahout, and Spark. I am able to see the Hadoop and Spark MasterWebUI, and I can also run the following command:

[hadoop@muildevcel01 mahout]$ bin/mahout

However, when I try running the spark-shell, I run into the problem stated below:

[hadoop@muildevcel01 mahout]$ bin/mahout spark-shell

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/repl/SparkILoop
        at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.repl.SparkILoop
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 1 more

Question

Any suggestions on how I could resolve this problem?

Stereo
Dimag Kharab

1 Answer


This error commonly occurs when the SPARK_HOME environment variable is not set, so the Mahout launcher cannot locate Spark's jars (including the one that provides org.apache.spark.repl.SparkILoop).

In the shell, type export SPARK_HOME=/path/to/your/spark and then run bin/mahout spark-shell again.
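A minimal sketch of what that looks like in practice. The paths below are placeholders, not your actual install locations, and setting MAHOUT_HOME is an extra step that Mahout's shell scripts often expect as well:

```shell
# Hypothetical install locations -- replace with your actual paths.
export SPARK_HOME=/opt/spark
export MAHOUT_HOME=/opt/mahout

# Verify the variables are set before launching.
echo "SPARK_HOME=$SPARK_HOME"
echo "MAHOUT_HOME=$MAHOUT_HOME"

# With SPARK_HOME set, the launcher can find Spark's repl jar:
# $MAHOUT_HOME/bin/mahout spark-shell
```

To make this persist across sessions, you can add the export lines to ~/.bashrc (or the profile of whichever user runs Mahout, e.g. the hadoop user).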

rawkintrevo