I realize that this question is pretty generic, and I'm looking for very general input on how to reconcile performance differences between Linux (RedHat 5) and Windows (Server 2008). I have a Java application, a collection of multiple services that process NetFlow packets off the wire, and on the exact same hardware it seems to run at about half the capacity on Linux compared to Windows. The application pulls packets off a socket, performs some very simple processing on them, and writes out a condensed output file to be processed downstream by a different C++ application.
Areas in the program that are relevant (rough sketch of the receive path below):

- Pulling packets off a socket
- Database access (MySQL 5.1.45)
- A ton of file I/O
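To make the hot path concrete, here is a minimal sketch of what the receive/process/write loop looks like. The class name, port, and output format below are illustrative assumptions, not the real service code, but the pattern is the same: a blocking receive on a datagram socket, light parsing, and a buffered append to an output file.

    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    // Illustrative sketch only -- the real services are structured differently.
    public class FlowReceiverSketch {
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket(2055); // typical NetFlow port (assumption)
            byte[] buf = new byte[2048];
            DatagramPacket packet = new DatagramPacket(buf, buf.length);
            BufferedWriter out = new BufferedWriter(new FileWriter("condensed.out", true));
            while (true) {
                socket.receive(packet);            // blocks until a flow packet arrives
                int len = packet.getLength();
                // "very simple processing" stands in here; the real parsing is more involved
                out.write(packet.getAddress().getHostAddress() + "," + len);
                out.newLine();                     // buffered; flushed periodically in the real code
            }
        }
    }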
I have very little Linux experience as far as optimization goes, and it would be great if someone could point out a few things to watch out for when tuning this for Linux that are different from Windows.
Also, I'm not entirely sure what other information I can give. I know I haven't provided much useful detail, and I would appreciate questions that point me in the right direction.
Are there any utilities that provide Java performance benchmarking across Linux and Windows?
Feel free to shoot me down if the question doesn't make sense, or if I'm providing too little information to even ask a useful question.
Thanks!
Edit: All the JVM settings are the same (we use the exact same wrapper config files in both cases for the services involved).
Edit: I'd like to table this issue for the time being. I haven't been able to spend any more significant time exploring this beyond running a basic profiler and comparing the results. The results on Linux and Windows look almost identical, apart from a much larger lag on the Linux side in pulling packets off the NIC. I still don't understand why this is the case, or whether it's an anomaly related to other network conditions. I will post more when I get the time to look at this again.
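For context, one concrete thing I want to rule out when I pick this up again is whether the JVM is getting a much smaller UDP receive buffer on the Linux box than on Windows. This is just a sketch, assuming a plain DatagramSocket (the real services may configure the socket differently):

    import java.net.DatagramSocket;

    // Illustrative check: compare the receive buffer the OS actually grants on each
    // platform. setReceiveBufferSize() is only a hint, and on Linux the kernel caps
    // the granted size (net.core.rmem_max), so it can be far smaller than requested.
    public class RcvBufCheck {
        public static void main(String[] args) throws Exception {
            DatagramSocket socket = new DatagramSocket();
            System.out.println("default SO_RCVBUF: " + socket.getReceiveBufferSize());
            socket.setReceiveBufferSize(4 * 1024 * 1024); // request 4 MB -- arbitrary test value
            System.out.println("granted SO_RCVBUF: " + socket.getReceiveBufferSize());
            socket.close();
        }
    }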
Thanks for the answers all!