Eclipse (set up with Scala environment): object apache is not a member of package org

[Screenshot: Eclipse editor flagging an error on the org.apache.spark import]



As shown in the screenshot, Eclipse reports an error when I import the Spark packages. Please help. When I hover over the import, it says "object apache is not a member of package org".
I searched for this error and found that it means the Spark jars have not been added to the project. So I also added "spark-assembly-1.4.1-hadoop2.2.0.jar", but the error is still the same. Below is what I actually want to run:



import org.apache.spark.{SparkConf, SparkContext}

object ABC {

  def main(args: Array[String]) {
    // Scala main method
    println("Spark Configuration")

    val conf = new SparkConf()
    conf.setAppName("My First Spark Scala Application")
    conf.setMaster("spark://ip-10-237-224-94:7077")

    println("Creating Spark Context")
  }
}
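
For reference, once the import resolves, code like this typically goes on to construct the SparkContext from the configuration. The sketch below is a hypothetical continuation rather than part of the question (the object name ABCWithContext and the local[*] master are illustrative); it only compiles once spark-core is on the classpath, which is exactly what the error above is about.

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical continuation, not the asker's code: actually create and stop the context.
object ABCWithContext {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("My First Spark Scala Application")
      .setMaster("local[*]")          // local master for a quick test; a spark:// cluster URL works too

    println("Creating Spark Context")
    val sc = new SparkContext(conf)   // this line needs spark-core on the classpath to compile and run
    println("Spark context created for app: " + sc.appName)

    sc.stop()                         // release the context's resources when done
  }
}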









eclipse scala apache-spark






asked Apr 19 '16 at 9:55









Shalini Baranwal

  • Have you added the spark-core jar to your classpath?

    – sud29
    Apr 19 '16 at 9:59











  • Actually, I don't know exactly which jars to add or where they are located in the Spark installation so that I can copy them. Could you guide me?

    – Shalini Baranwal
    Apr 19 '16 at 12:42











  • You simply need to follow the instructions in spark.apache.org/docs/1.2.0/quick-start.html

    – sud29
    Apr 19 '16 at 12:47











  • After adding spark-core, do Maven -> Update Project, and you are good to go.

    – Deepesh Rehi
    May 8 '18 at 10:26

3 Answers

Adding the spark-core jar to your classpath should resolve your issue. Also, if you are using a build tool like Maven or Gradle (and if not, you should, because spark-core has many dependencies of its own and you would keep hitting this problem for other jars), use the Eclipse task provided by the tool to set up your project's classpath properly.
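
For illustration, this is roughly what the dependency looks like when declared in sbt (an alternative to the Maven/Gradle setups mentioned above); the project name and version numbers below are assumptions chosen to match the spark-assembly-1.4.1 jar from the question, so adjust them to your installation. A plugin such as sbteclipse (or Maven's m2e for a pom-based project) can then generate the Eclipse classpath from the build definition.

// build.sbt -- minimal sketch, not a drop-in file for this project
name := "my-first-spark-app"   // hypothetical project name

scalaVersion := "2.10.6"       // Spark 1.4.x artifacts are published for Scala 2.10

// spark-core provides SparkConf and SparkContext and pulls in their transitive dependencies;
// %% appends the Scala binary version, so this resolves to spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"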






answered Apr 19 '16 at 10:03 by sud29 (edited Apr 19 '16 at 10:11)

  • Is this the jar you are referring to -> "/root/.m2/repository/org/apache/spark/spark-core_2.10/1.4.1/spark-core_2.10-1.4.1.jar"? Please help

    – Shalini Baranwal
    Apr 19 '16 at 12:53











  • Yes, that's the jar.

    – sud29
    Apr 19 '16 at 12:56



















If you are doing this in the context of Scala within a Jupyter Notebook, you'll get this error. You have to install the Apache Toree kernel:



https://github.com/apache/incubator-toree



and create your notebooks with that kernel.



You also have to start the Jupyter Notebook with:



pyspark





answered Jun 9 '17 at 18:11 by Clem Wang

I was also getting the same error; in my case it was a compatibility issue: Spark 2.2.1 is not compatible with Scala 2.12 (it is built against Scala 2.11.8), while my IDE was set up with Scala 2.12.3. I resolved the error by:

1) Importing the jar files from Spark's own installation folder. The Spark installation directory (e.g. the Spark folder on the C drive) contains a "jars" folder with all the basic jar files. In Eclipse, right-click the project -> Properties -> Java Build Path; under the Libraries tab choose "Add External JARs...", import all the jar files from the "jars" folder, and click Apply.

2) Then go to Properties -> Scala Compiler -> Scala Installation -> Latest 2.11 bundle (dynamic).*
*Before selecting this option, check the compatibility of your Spark and Scala versions.
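
Since this fix hinges on the Spark and Scala versions agreeing, a quick sanity check is to print the versions the program actually sees at run time. This is a small hypothetical snippet (the object name VersionCheck is made up for illustration) and only requires spark-core on the classpath:

// Hypothetical diagnostic, not part of the answer: print the versions on the classpath.
object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Version of the scala-library jar the program runs against, e.g. "2.11.8"
    println("Scala library version: " + scala.util.Properties.versionNumberString)
    // Version string baked into the Spark jars, e.g. "2.2.1"
    println("Spark version on classpath: " + org.apache.spark.SPARK_VERSION)
  }
}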






answered Nov 27 '18 at 13:45 by Radhika Sharma (edited Nov 27 '18 at 16:34 by Matthew Fontana)





























