Sunday, November 8, 2020

Realtime app issues under high load - SocketException: Too many open files

java.net.SocketException: Too many open files?

ulimit

If you know the process IDs (PIDs) for the specific user, you can get the limits for each process with:
cat /proc/<PID>/limits   (the default hard limit for Max open files seems to be 4096)

You can get the number of open file descriptors for each PID with:

ls -1 /proc/<PID>/fd | wc -l


Or, for everything opened by a user: /usr/sbin/lsof -u <userName> | wc -l
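
If you would rather watch the count from inside the JVM itself, HotSpot/OpenJDK on a Unix-like OS (an assumption about the runtime here) exposes it through com.sun.management.UnixOperatingSystemMXBean. A minimal sketch:

import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import com.sun.management.UnixOperatingSystemMXBean;

public class FdMonitor {
    public static void main(String[] args) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        // The cast only works on HotSpot/OpenJDK on a Unix-like OS, so guard it
        if (os instanceof UnixOperatingSystemMXBean) {
            UnixOperatingSystemMXBean unixOs = (UnixOperatingSystemMXBean) os;
            System.out.println("Open FDs: " + unixOs.getOpenFileDescriptorCount()
                    + " / max: " + unixOs.getMaxFileDescriptorCount());
        }
    }
}

Logging this periodically alongside the lsof numbers makes it easy to spot a leak trending towards the limit.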


how-to-fix-javanetsocketexception-too-many-open-files-java-tomcat


urlconnection-leads-too-many-open-files - stackOverflow reference

Make sure to disconnect the connection object in a finally block:

finally {
    if (con != null) con.disconnect();
}
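
Disconnecting alone is often not enough, though - with HttpURLConnection the response stream should also be fully drained and closed, or the underlying socket can stay open. A minimal sketch of the full pattern (the URL is just a placeholder):

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class UrlFetch {
    public static void main(String[] args) throws Exception {
        HttpURLConnection con = null;
        try {
            con = (HttpURLConnection) new URL("https://example.com/").openConnection();
            // Drain the body completely so the socket can be released cleanly
            try (InputStream in = con.getInputStream()) {
                byte[] buf = new byte[8192];
                while (in.read(buf) != -1) { /* discard */ }
            }
        } finally {
            if (con != null) con.disconnect();
        }
    }
}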


URLConnection or HTTPClient (stackOverflow reference)

If you stay with URLConnection, every call site needs the con.disconnect() / stream-close discipline shown above; the alternative is a client that manages its connections for you.
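
As a sketch of that route (using the built-in Java 11+ java.net.http.HttpClient as my example - the referenced answers may equally mean Apache HttpClient), the client pools and reuses connections internally, so there is no per-request socket to hand-close:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class HttpClientFetch {
    public static void main(String[] args) throws Exception {
        // One shared, long-lived client; it pools connections internally
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/")).build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}

The key point is to create the client once and share it, rather than building a new one per request.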

https://techblog.bozho.net/caveats-of-httpurlconnection/ 
Just for quick reference, copied from the post above: "This is still unclear, but gives us a hint that there's something more. After reading a couple of stackoverflow and java.net answers (1, 2, 3, 4) and also the android documentation of the same class, which is actually different from the Oracle implementation, it turns out that .disconnect() actually closes (or may close, in the case of android) the underlying socket."
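
The practical upshot (my reading of the post, not a quote from it): if the goal under load is to reuse keep-alive connections rather than tear them down, drain and close the response stream and skip disconnect(); the JVM can then return the socket to its internal keep-alive pool. A sketch of that variant, with the same placeholder URL:

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class KeepAliveFetch {
    static void fetchOnce() throws Exception {
        HttpURLConnection con = (HttpURLConnection) new URL("https://example.com/").openConnection();
        try (InputStream in = con.getInputStream()) {
            byte[] buf = new byte[8192];
            while (in.read(buf) != -1) { /* drain so the socket can be pooled */ }
        }
        // No disconnect(): the connection may go back to the keep-alive pool
        // instead of being closed outright
    }

    public static void main(String[] args) throws Exception {
        fetchOnce();
        fetchOnce(); // the second call can reuse the pooled connection
    }
}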



1 comment:

  1. With the top command on the process, or the heap memory graph captured in the ELK UI, if you always see a sawtooth pattern for the heap memory, it is not really contributed by the application as such (yes, the constant polls by the Spring Boot Kafka consumer add to it, still). It is the tooling that captures the metadata, as in Java VisualVM, that creates a bulk of objects; GC later cleans them all up, producing the sawtooth pattern. I do not see this issue while monitoring the heap usage via the Dynatrace offering.
