
I have a table, db.Men, that has three columns

  • nvarchar "Name"
  • int "Age"
  • nvarchar "Status"

The Status column has three values only: "Happy", "Normal", and "Bad".

I need to compute the average and standard deviation of the ages of men whose status is "Happy" or "Normal":

using System.Linq;

var ctx     = new DataClasses1DataContext();

double? avg = (from n in ctx.men
               where n.status == "Happy"
               select n.age).Average();

int? sum    = (from n in ctx.men
               where n.status == "Happy"
               select n.age).Sum();

I have computed the average and the sum. How can I compute the standard deviation with this status condition?

  • the formula for standard deviation is `stdDev = Math.Sqrt((sum) / (ctx.Count()-1));` in your case (I think... :P) – Sean T Dec 13 '18 at 16:49
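For reference, the sample standard deviation of ages $a_1, \dots, a_n$ with mean $\bar a$ is

$$ s = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} (a_i - \bar a)^2 } $$

i.e. the sum of squared differences from the mean goes under the square root, not the plain sum of the ages.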

2 Answers


Stick the Happy ages (or one list for each Status) into a list, and then calculate the standard deviation. This question has a good process for determining standard deviation:

Standard deviation of generic list?

so something like:

var happyAges = ctx.Men.Where(i => i.status == "Happy").Select(i => i.age);
var happySd = CalculateStdDev(happyAges);

Also, you could make that standard deviation method static, or turn it into an extension method so you can do it all in one call; a rough sketch follows.
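For illustration only, here is one possible shape of such an extension method (the class name StatsExtensions, the method name StdDev, and the sample n − 1 divisor are my choices, not part of the linked answer):

using System;
using System.Collections.Generic;
using System.Linq;

public static class StatsExtensions
{
    // Sample standard deviation (divides by n - 1); returns 0 when there are fewer than two values.
    public static double StdDev(this IEnumerable<double> values)
    {
        var list = values.ToList();
        if (list.Count < 2)
            return 0;

        double avg = list.Average();
        double sumOfSquares = list.Sum(v => (v - avg) * (v - avg));
        return Math.Sqrt(sumOfSquares / (list.Count - 1));
    }
}

Since age is a nullable int in the generated classes, you would likely filter out nulls and cast before calling it, e.g. var happySd = ctx.Men.Where(m => m.status == "Happy" && m.age != null).Select(m => (double)m.age.Value).ToList().StdDev(); — the ToList() forces the calculation to run in memory, since the SQL provider cannot translate StdDev.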


I did this and it worked for me:

// Pull the "Happy" ages into memory so the rest of the math runs client-side.
int?[] array = (from a in ctx.men where a.status == "Happy" select a.age).ToArray();
double? AVG = array.Average();
double? SumOfSquaresOfDifferences = array.Select(val => (val - AVG) * (val - AVG)).Sum();
// Population standard deviation (divides by N); use array.Length - 1 for the sample version.
double? SD = Math.Sqrt(Convert.ToDouble(SumOfSquaresOfDifferences) / array.Length);
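If the same statistic is needed for "Normal", or for both statuses combined as the question mentions, only the filter changes. A sketch along the same lines for the combined case (the statuses, ages, avg, sumOfSquares, and sd names are just illustrative):

var statuses = new[] { "Happy", "Normal" };

int?[] ages = (from a in ctx.men
               where statuses.Contains(a.status)
               select a.age).ToArray();

double? avg = ages.Average();
double? sumOfSquares = ages.Select(v => (v - avg) * (v - avg)).Sum();
// Population standard deviation; use (ages.Length - 1) for the sample version.
double? sd = Math.Sqrt(Convert.ToDouble(sumOfSquares) / ages.Length);

The Contains call translates to a SQL IN clause, so the filtering still happens in the database before the ages are pulled into the array.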