I have the following response time data from a performance test I am running:

- Min: 8 sec
- Max: 284 sec
- Average: 28 sec
- Standard deviation: 27 sec

What does the standard deviation say about the distribution of the response times? When people describe a standard deviation as low or high, what does that actually mean? Is it judged relative to the average, min, or max?

I know what standard deviation is and how it's computed. I'm just not sure how to tell if it is high or low.
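For reference, here is the comparison I have tried so far: a minimal Python sketch that divides the standard deviation by the mean (the coefficient of variation). I am assuming that ratio is what "relative to the average" would mean; the variable names are mine, and only the summary figures above are used since I don't have the raw samples here.

```python
# Summary statistics from my test run (all in seconds)
min_rt = 8
max_rt = 284
mean_rt = 28
std_rt = 27

# Coefficient of variation: standard deviation expressed as a
# fraction of the mean. Assumption: this is the "high vs. low"
# yardstick, rather than comparing against min or max.
cv = std_rt / mean_rt
print(f"Coefficient of variation: {cv:.2f}")  # ~0.96, i.e. std dev is almost as large as the mean
```

Is interpreting a CV near 1 as "high spread" the right way to read these numbers, or is there a better baseline?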