How to suppress "Code Generated" and "Cleaned Accumulator" messages in Spark
I wrote a Spark job. The job ran fine without any problems. However, when I look at my error log file, I see many messages like

[error] 18/11/25 17:28:14 INFO CodeGenerator: Code generated in 16.947005 ms

and

[error] 18/11/25 17:28:15 INFO ContextCleaner: Cleaned accumulator 239819

and

[error] 18/11/25 17:28:06 INFO BlockManagerInfo: Removed broadcast_13354_piece0 on 192.168.2.101:43753 in memory (size: 20.5 KB, free: 6.2 GB)

Is there any way to suppress these messages? They are just bloating my log file. I'm also not sure why Spark reports these as errors when they look like debug-level messages.
scala apache-spark
edited Nov 26 '18 at 3:39
Knows Not Much
asked Nov 26 '18 at 3:33
1 Answer
When you create the SparkContext object, use the following call alongside it to set the log level to whatever your requirements are:

sparkContext.setLogLevel("WARN")

This sets Spark's log level to WARN, so you will no longer get any INFO or DEBUG level logs.
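If you also want to silence the INFO lines emitted before the SparkContext exists, or control verbosity per logger, you can configure Spark's log4j settings instead. Below is a minimal sketch of a conf/log4j.properties file, adapted from the log4j.properties.template that Spark ships with; the three per-logger lines target the exact classes from the question, but verify the logger names against your Spark version:

```properties
# Default everything to WARN instead of INFO
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Alternatively, silence only the chatty loggers from the question
log4j.logger.org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator=WARN
log4j.logger.org.apache.spark.ContextCleaner=WARN
log4j.logger.org.apache.spark.storage.BlockManagerInfo=WARN
```

The file-based approach has the advantage of applying to driver and executor JVMs from startup, whereas setLogLevel only takes effect after the context is created.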
answered Nov 26 '18 at 4:33
anuj saxena