I have created some Java code that measures delay, packet size and bandwidth.
What is the equation needed to calculate the latency?
This is what I am currently using, but I am unsure whether it is correct:
//latency = packetsize / delay + bandwidth
System.out.println("latency is " + (len*2) / (duration + transferRateMb));
EDIT: the length is multiplied by 2 to give the correct value in bytes.
duration is the time taken for the ping to complete.
The transfer rate is found by:
double transferRateMb = (len * 524288.0) / (duration / 1000000000.0); // amount of data in megabytes transferred in 1 second
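For context, the timing looks roughly like this (a simplified, self-contained sketch, not my exact code: the real ping is stubbed out with a sleep and the names are just placeholders):

public class TransferRateSketch {

    // Stand-in for the real ping/echo round trip (hypothetical).
    static void ping(byte[] payload) throws InterruptedException {
        Thread.sleep(5); // pretend network delay of roughly 5 ms
    }

    public static void main(String[] args) throws InterruptedException {
        int len = 1024; // message length in chars (len * 2 bytes on the wire)

        long start = System.nanoTime();
        ping(new byte[len * 2]);                   // the timed round trip
        long duration = System.nanoTime() - start; // duration in nanoseconds

        // same conversion as in my code above
        double transferRateMb = (len * 524288.0) / (duration / 1000000000.0);

        System.out.println("duration (ns): " + duration);
        System.out.println("transferRateMb: " + transferRateMb);
    }
}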
I have read various guides but do not understand them; this is just a simple requirement for a project.
Also, this will be run 20 times and the average taken, roughly like the loop sketched below.
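The averaging itself is just a simple loop (sketch; measureOnce() is a placeholder for one latency measurement):

public class AverageLatencySketch {

    // Placeholder for a single latency measurement (hypothetical).
    static double measureOnce() {
        return Math.random() * 10; // fake sample in milliseconds
    }

    public static void main(String[] args) {
        int runs = 20;
        double total = 0;
        for (int i = 0; i < runs; i++) {
            total += measureOnce();
        }
        System.out.println("average over " + runs + " runs: " + (total / runs));
    }
}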
Any ideas?