How to suppress “Code Generated” and “Cleaned Accumulator” messages in Spark


I wrote a Spark job. The job ran fine without any problems. However, when I look at my error log file, I see many messages of this type:



[error] 18/11/25 17:28:14 INFO CodeGenerator: Code generated in 16.947005 ms


and



[error] 18/11/25 17:28:15 INFO ContextCleaner: Cleaned accumulator 239819


and



[error] 18/11/25 17:28:06 INFO BlockManagerInfo: Removed broadcast_13354_piece0 on 192.168.2.101:43753 in memory (size: 20.5 KB, free: 6.2 GB)


Is there any way to suppress these messages? They are just bloating up my log file.



I am also not sure why Spark reports these as errors, when they look like debug-level messages.
      scala apache-spark
      edited Nov 26 '18 at 3:39







      Knows Not Much

















      asked Nov 26 '18 at 3:33









Knows Not Much
          1 Answer
When you create the SparkContext object, use the following call with it to set the log level to whatever your requirement is:



          sparkContext.setLogLevel("WARN")


The above line sets Spark's log level to WARN, so you will no longer get any INFO- or DEBUG-level log messages.
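For context, here is a minimal sketch of where the call goes in a job (the object name, app name, and `local[*]` master are illustrative, not from the question):

```scala
import org.apache.spark.sql.SparkSession

object QuietJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("quiet-job")
      .master("local[*]")
      .getOrCreate()

    // Silence INFO/DEBUG chatter (CodeGenerator, ContextCleaner,
    // BlockManagerInfo, etc.) for the rest of the job.
    // Valid levels include TRACE, DEBUG, INFO, WARN, ERROR, OFF.
    spark.sparkContext.setLogLevel("WARN")

    // ... your job logic here ...

    spark.stop()
  }
}
```

Note that `setLogLevel` only takes effect once the context exists, so startup messages logged before this line will still appear; to suppress those as well, the usual approach is a custom `log4j.properties` (e.g. `log4j.rootCategory=WARN, console`) on the driver and executor classpaths. Also, the `[error]` prefix on those lines typically comes from the build tool capturing the process's stderr stream, not from Spark itself marking them as errors.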
                answered Nov 26 '18 at 4:33









anuj saxena