org.glassfish.jersey.servlet.ServletContainer.service() hanging for too long











I was analysing some slow API requests with Dynatrace. One request that took 623 ms caught my attention. Looking at its PurePath, the first thing that happens is that org.glassfish.jersey.servlet.ServletContainer.service() spends 590 ms before calling the next method, as can be seen below:

[PurePath screenshot]



The scenario of the test is like this:



Apache JMeter with 1 thread performing the same request every time. For the first 100 seconds or so, the average response time is around 5 ms; then the response time suddenly increases and stays high until the end of the test. After the increase, all the requests look like the one in the image, with a huge hang time (apparently inside Jersey).



I'm using the following Jetty and Jersey versions:



    <dependency>
        <groupId>org.eclipse.jetty</groupId>
        <artifactId>jetty-server</artifactId>
        <version>9.2.3.v20140905</version>
    </dependency>
    <dependency>
        <groupId>org.eclipse.jetty</groupId>
        <artifactId>jetty-servlet</artifactId>
        <version>9.2.3.v20140905</version>
    </dependency>
    <dependency>
        <groupId>org.eclipse.jetty</groupId>
        <artifactId>jetty-util</artifactId>
        <version>9.2.3.v20140905</version>
    </dependency>

    <dependency>
        <groupId>org.glassfish.jersey.core</groupId>
        <artifactId>jersey-server</artifactId>
        <version>2.22.2</version>
    </dependency>
    <dependency>
        <groupId>org.glassfish.jersey.containers</groupId>
        <artifactId>jersey-container-servlet-core</artifactId>
        <version>2.22.2</version>
    </dependency>
    <dependency>
        <groupId>org.glassfish.jersey.containers</groupId>
        <artifactId>jersey-container-jetty-http</artifactId>
        <version>2.22.2</version>
    </dependency>
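
For context, the stack is wired up roughly like this in the embedded server (a simplified sketch, not my exact bootstrap code; com.example.api is a placeholder package):

    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.servlet.ServletContextHandler;
    import org.eclipse.jetty.servlet.ServletHolder;
    import org.glassfish.jersey.servlet.ServletContainer;

    public class EmbeddedServer {
        public static void main(String[] args) throws Exception {
            Server server = new Server(8080);

            // Jersey's ServletContainer is deployed as a plain servlet inside Jetty,
            // which is why ServletContainer.service() is the entry point Dynatrace instruments.
            ServletContextHandler context = new ServletContextHandler(ServletContextHandler.NO_SESSIONS);
            context.setContextPath("/");

            ServletHolder jersey = context.addServlet(ServletContainer.class, "/api/*");
            jersey.setInitOrder(0);
            jersey.setInitParameter("jersey.config.server.provider.packages", "com.example.api");

            server.setHandler(context);
            server.start();
            server.join();
        }
    }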


Is this a documented problem? Has anyone experienced it?



Edit 1:
I was able to reproduce this with Postman.
JMeter is sending Connection: close in the request headers. JMeter's Connect time is low while the response time is higher than normal, so it doesn't seem to be a problem of opening a new connection on every request.
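
To double-check the connection behaviour on the server side, I can log the client's remote port and the per-request latency with a plain servlet filter (a rough sketch, assuming the servlet-based deployment above; the class name is made up):

    import java.io.IOException;
    import javax.servlet.*;

    public class ConnectionLoggingFilter implements Filter {
        @Override public void init(FilterConfig cfg) {}
        @Override public void destroy() {}

        @Override
        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            long start = System.nanoTime();
            try {
                chain.doFilter(req, res);
            } finally {
                long ms = (System.nanoTime() - start) / 1_000_000;
                // A different remote port on every line means each request arrived on a new TCP connection.
                System.out.printf("remote=%s:%d took=%dms%n",
                        req.getRemoteAddr(), req.getRemotePort(), ms);
            }
        }
    }

    // Registered on the same ServletContextHandler as the Jersey servlet:
    // context.addFilter(ConnectionLoggingFilter.class, "/*", EnumSet.of(DispatcherType.REQUEST));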










java jersey jetty jersey-2.0 dynatrace






edited Nov 21 at 19:17
asked Nov 21 at 18:20









luizfzs













  • As with all things jmeter related, are you opening new HTTP/1.1 persistent connections for each request? (a super common mistake). Are you sending data? (POST/PUT/etc) if so, are you reading all of that data on the server side? What does the Jetty Server Dump tell you?
    – Joakim Erdfelt
    Nov 21 at 19:10












  • @JoakimErdfelt added info. I'm not able to see Jetty Server log at the moment. I'll add info if I can access it.
    – luizfzs
    Nov 21 at 19:25
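
Regarding the Jetty Server Dump mentioned in the first comment, a sketch of how it can be enabled on the embedded Server instance (assuming a reference to it, here called server, is at hand):

    // Dump Jetty's component tree (connectors, thread pool, handlers, connections, ...)
    server.setDumpAfterStart(true);   // print the dump once the server is started
    server.setDumpBeforeStop(true);   // and again just before shutdown

    // Or on demand while the load test is running:
    System.err.println(server.dump());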

















