
I'm trying to detect three actions: when a user begins walking, jogging, or running. I then want to know when they stop. I've been successful in detecting when someone is walking, jogging, or running with the following code:

- (void)update:(CMAccelerometerData *)accelData {

    [(id)self setAcceleration:accelData.acceleration];

    NSTimeInterval secondsSinceLastUpdate = -([self.lastUpdateTime timeIntervalSinceNow]);

    // Check the largest threshold first; otherwise the walking branch
    // swallows every reading above 0.1 and the others never fire.
    // Use fabs() for doubles; labs() is for long integers.
    if (fabs(_acceleration.x) > 4.0) {
        NSLog(@"sprinting: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) > 2.0) {
        NSLog(@"jogging: %f", _acceleration.x);
    }
    else if (fabs(_acceleration.x) >= 0.1) {
        NSLog(@"walking: %f", _acceleration.x);
    }
}

The problem I run into is two-fold:

1) `update` is called multiple times for every motion, probably because it samples so frequently that when the user begins walking (i.e. `_acceleration.x >= 0.1`) it is still `>= 0.1` the next time `update` is called.

Example Log:

    2014-02-22 12:14:20.728 myApp[5039:60b] walking: 1.029846
    2014-02-22 12:14:20.748 myApp[5039:60b] walking: 1.071777
    2014-02-22 12:14:20.768 myApp[5039:60b] walking: 1.067749

2) I'm having difficulty figuring out how to detect when the user has stopped. Does anybody have advice on how to implement "stop detection"?

Apollo
  • Check out my answer below. That class answers all your questions fully, plus you can learn a lot from it. – Segev Mar 12 '14 at 17:27

5 Answers


According to your logs, accelerometerUpdateInterval is about 0.02 s. Updates can be made less frequent by changing that property of CMMotionManager.
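
For example, to sample ten times per second instead of fifty (a minimal sketch; `self.motionManager` is assumed to be the CMMotionManager instance you already use):

self.motionManager.accelerometerUpdateInterval = 0.1; // seconds between updates
[self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                         withHandler:^(CMAccelerometerData *accelerometerData, NSError *error) {
    [self update:accelerometerData]; // your existing method
}];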

Checking only x-acceleration isn't very accurate. I can put a device on a table in such a way (say, on its left edge) that the x-acceleration will equal 1, or just tilt it a bit. This would put the program in walking mode (x > 0.1) instead of idle.
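
One way around device orientation is to look at the magnitude of the whole acceleration vector instead of a single axis (a minimal sketch):

// At rest the magnitude stays close to 1.0 (gravity only), whatever the orientation.
CMAcceleration a = accelData.acceleration;
double magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z);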

Here's a link to the ADVANCED PEDOMETER FOR SMARTPHONE-BASED ACTIVITY TRACKING publication. They track changes in the direction of the acceleration vector, where d is the cosine of the angle between two consecutive acceleration vector readings:

    d(i) = cos(angle) = (a(i) · a(i-1)) / (|a(i)| * |a(i-1)|)

Obviously, without any motion the angle between two consecutive vectors is close to zero and cos(0) = 1; during other activities d < 1. To filter out noise, they use a weighted moving average of the last 10 values of d:

    WMA10(i) = (10*d(i) + 9*d(i-1) + ... + 1*d(i-9)) / (10 + 9 + ... + 1)

After implementing this, your values will look like this (red - walking, blue - running):

[chart of WMA(d) over time]

Now you can set a threshold for each activity to separate them. Note that the average step frequency is 2-4 Hz, so you should expect the value to be over the threshold at least a few times a second before identifying the action.

Other helpful publications:

UPDATE

_acceleration.x, _acceleration.y, _acceleration.z are the coordinates of a single acceleration vector. You use all of these coordinates in the formula for d. In order to calculate d you also need to store the acceleration vector from the previous update (the one with index i-1 in the formula).

WMA just takes the last 10 d values into account, with different weights. The most recent d values have more weight and therefore more impact on the resulting value. You need to store the 9 previous d values in order to calculate the current WMA, and you compare the WMA value to the corresponding threshold.
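
Putting both formulas together, a minimal Objective-C sketch of this idea could look like the following (the ivar names, buffer handling, and threshold numbers are mine, for illustration, not code from the publication; pick your own thresholds from a live chart):

// Assumed ivars:
//   CMAcceleration _previous; // acceleration from the previous update
//   double _d[10];            // history of d values, newest first
- (void)processAcceleration:(CMAcceleration)a
{
    // d = cosine of the angle between consecutive acceleration vectors
    double dot   = a.x * _previous.x + a.y * _previous.y + a.z * _previous.z;
    double norms = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
                 * sqrt(_previous.x * _previous.x + _previous.y * _previous.y + _previous.z * _previous.z);
    double d = (norms > 0.0) ? (dot / norms) : 1.0;
    _previous = a;

    // Shift the history; the newest d lives at index 0.
    memmove(&_d[1], &_d[0], 9 * sizeof(double));
    _d[0] = d;

    // WMA10: the newest value has weight 10, the oldest has weight 1.
    double wma = 0.0;
    for (int k = 0; k < 10; k++) {
        wma += (10 - k) * _d[k];
    }
    wma /= 55.0; // 10 + 9 + ... + 1

    // Made-up placeholder thresholds: WMA stays close to 1 while the device
    // is still (this is the "stop detection" part) and drops with activity.
    // Ignore the first ~10 samples while the buffer fills up.
    if (wma > 0.999) {
        NSLog(@"idle");
    } else if (wma > 0.99) {
        NSLog(@"walking");
    } else {
        NSLog(@"running");
    }
}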

vokilam
  • so the vectors you refer to would be _acceleration.x, _acceleration.y, and _acceleration.z? – Apollo Mar 11 '14 at 21:56
  • could you also explain a little more about how you interpret the weighted average WMA. Do you take the difference between consecutive WMA values? – Apollo Mar 11 '14 at 22:00
  • @Auser Updated answer. Hope you read the publication. Also, it is better to draw a live chart of WMA values on screen, using `CorePlot` for instance. – vokilam Mar 12 '14 at 07:15
  • Why is it better to draw a live chart? I'm not trying to present the user with a graph, but trying to detect in the background when the user begins walking or running... – Apollo Mar 13 '14 at 15:33
  • @Auser For debugging and choosing thresholds – vokilam Mar 13 '14 at 16:51
  • Hey @vokilam! How often (times per second) should I calculate the WMA? – Arsen Jun 13 '16 at 15:41
  • @Arsen every time a new step is detected – vokilam Jun 14 '16 at 10:32
  • @vokilam thank you. What about the neural network that they describe in the paper? Do you know how to implement it? – Arsen Jun 14 '16 at 21:11

If you are using iOS 7 and an iPhone 5S, I suggest you look into CMMotionActivityManager, which is available on the iPhone 5S because of the M7 chip. It is also available on a couple of other devices:

M7 chip: https://en.wikipedia.org/wiki/Apple_M7

Here is a code snippet I put together to test when I was learning about it.

#import <CoreMotion/CoreMotion.h>

@property (nonatomic, strong) CMMotionActivityManager *motionActivityManager;

- (void)inSomeMethod
{
    // Activity data is only available on devices with a motion coprocessor (M7 or later).
    if (![CMMotionActivityManager isActivityAvailable]) {
        return;
    }

    self.motionActivityManager = [[CMMotionActivityManager alloc] init];

    // Register for Core Motion activity updates.
    [self.motionActivityManager startActivityUpdatesToQueue:[NSOperationQueue mainQueue]
                                                withHandler:^(CMMotionActivity *activity)
    {
        NSLog(@"Got a Core Motion update");
        NSLog(@"Current activity date is %f", activity.timestamp);
        NSLog(@"Current activity confidence, on a scale of 0 to 2 (2 being best): %ld", (long)activity.confidence);
        NSLog(@"Current activity type is unknown: %i", activity.unknown);
        NSLog(@"Current activity type is stationary: %i", activity.stationary);
        NSLog(@"Current activity type is walking: %i", activity.walking);
        NSLog(@"Current activity type is running: %i", activity.running);
        NSLog(@"Current activity type is automotive: %i", activity.automotive);
    }];
}

I tested it and it seems to be pretty accurate. The only drawback is that it will not give you a confirmation as soon as you start an action (walking for example). Some black box algorithm waits to ensure that you are really walking or running. But then you know you have a confirmed action.
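
If the live updates arrive too late for your needs, you can also ask the same class for recent history once the coprocessor has made up its mind (a minimal sketch; the 10-minute window is arbitrary):

// Query the activities recorded over the last 10 minutes.
NSDate *now = [NSDate date];
NSDate *tenMinutesAgo = [now dateByAddingTimeInterval:-600];
[self.motionActivityManager queryActivityStartingFromDate:tenMinutesAgo
                                                   toDate:now
                                                  toQueue:[NSOperationQueue mainQueue]
                                              withHandler:^(NSArray *activities, NSError *error) {
    for (CMMotionActivity *activity in activities) {
        NSLog(@"%@", activity);
    }
}];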

This beats messing around with the accelerometer. Apple took care of that detail!

Khaled Barazi
  • do you know which other devices support this? Is it only the iPhone5s? – Apollo Feb 23 '14 at 02:25
  • iPad air and iPad mini (2nd gen). I included the link to the M7 wikipedia page in my answer above. – Khaled Barazi Feb 23 '14 at 02:54
  • Tried this method out. The notifications are relatively accurate, however they happen too late for my requirements. – Apollo Mar 11 '14 at 18:21
  • I don't know why but if you're driving a car...and stop for like a few minutes...it would still return automotive or automotive + stationary. I've actually opened a [question](https://stackoverflow.com/questions/45594747/coremotionactivitymanager-returning-either-automotive-or-automotive-stationary) here – Honey Aug 10 '17 at 14:38

You can use this simple library to detect whether the user is walking, running, in a vehicle, or not moving. It works on all iOS devices and does not need the M7 chip.

https://github.com/SocialObjects-Software/SOMotionDetector

In the repo you can find a demo project.

arturdev
  • the library you suggested does not use Core Motion, but CLLocation, which requires GPS connectivity. – Apollo Mar 11 '14 at 18:21
  • Actually it does. It's calculating the motion type using two factors: speed and acceleration. And for getting the device acceleration it uses the CMMotionManager class, which is in the CoreMotion framework. – arturdev Mar 11 '14 at 21:18
  • Not working on iPhone 4; not able to detect any motion with this library. – kalpesh Jun 29 '16 at 05:26

I'm following this paper (PDF via ResearchGate) in my indoor-navigation project to determine user dynamics (static, slow walking, fast walking) from accelerometer data alone, in order to assist location determination.

Here is the algorithm proposed in the paper:

[figure: the algorithm proposed in the paper]

And here is my implementation in Swift 2.0:

import CoreMotion

// State used below; the two threshold values are placeholders you must tune.
var accelerometerDataInEuclidianNorm = 0.0
var accelerometerDataInASecond = [Double]()
var accelerometerDataCount = 0.0
var totalAcceleration = 0.0
var pedestrianStatus = ""
let roundingPrecision = 3
let staticThreshold = 0.013       // placeholder value
let slowWalkingThreshold = 0.05   // placeholder value

let motionManager = CMMotionManager()
motionManager.accelerometerUpdateInterval = 0.1
motionManager.startAccelerometerUpdatesToQueue(NSOperationQueue.mainQueue()) { (accelerometerData: CMAccelerometerData?, error: NSError?) -> Void in
    if error != nil {
        print(error)
    } else {
        self.estimatePedestrianStatus((accelerometerData?.acceleration)!)
    }
}

After all of the classic Swifty iOS code to initiate CoreMotion, here is the method crunching the numbers and determining the state:

func estimatePedestrianStatus(acceleration: CMAcceleration) {
    // Obtain the Euclidean norm of the accelerometer data
    accelerometerDataInEuclidianNorm = sqrt(
        (acceleration.x.roundTo(roundingPrecision) * acceleration.x.roundTo(roundingPrecision)) +
        (acceleration.y.roundTo(roundingPrecision) * acceleration.y.roundTo(roundingPrecision)) +
        (acceleration.z.roundTo(roundingPrecision) * acceleration.z.roundTo(roundingPrecision)))

    // Significant figure setting
    accelerometerDataInEuclidianNorm = accelerometerDataInEuclidianNorm.roundTo(roundingPrecision)

    // record 10 values
    // meaning values in a second
    // accUpdateInterval(0.1s) * 10 = 1s
    while accelerometerDataCount < 1 {
        accelerometerDataCount += 0.1

        accelerometerDataInASecond.append(accelerometerDataInEuclidianNorm)
        totalAcceleration += accelerometerDataInEuclidianNorm

        break   // required since we want to obtain data every acc cycle
    }

    // when acc values recorded
    // interpret them
    if accelerometerDataCount >= 1 {
        accelerometerDataCount = 0  // reset for the next round

        // Calculating the variance of the Euclidian Norm of the accelerometer data
        let accelerationMean = (totalAcceleration / 10).roundTo(roundingPrecision)
        var total: Double = 0.0

        for data in accelerometerDataInASecond {
            total += ((data-accelerationMean) * (data-accelerationMean)).roundTo(roundingPrecision)
        }

        total = total.roundTo(roundingPrecision)

        let result = (total / 10).roundTo(roundingPrecision)
        print("Result: \(result)")

        if (result < staticThreshold) {
            pedestrianStatus = "Static"
        } else if ((staticThreshold < result) && (result <= slowWalkingThreshold)) {
            pedestrianStatus = "Slow Walking"
        } else if (slowWalkingThreshold < result) {
            pedestrianStatus = "Fast Walking"
        }

        print("Pedestrian Status: \(pedestrianStatus)\n---\n\n")

        // reset for the next round
        accelerometerDataInASecond = []
        totalAcceleration = 0.0
    }
}

I've also used the following extension to simplify the significant-figure setting:

extension Double {
    func roundTo(precision: Int) -> Double {
        let divisor = pow(10.0, Double(precision))
        return round(self * divisor) / divisor
    }
}

With raw values from Core Motion, the algorithm went haywire.

Hope this helps someone.

EDIT (4/3/16)

I forgot to provide my roundingPrecision value. I defined it as 3. Plain mathematics suggests that three significant figures are decent enough here; you can use more if you like.

One more thing to mention: at the moment, this algorithm requires the iPhone to be held in your hand while walking, as in the picture below. Sorry, it was the only one I could find.

[picture: iPhone held in the hand while walking]

My GitHub Repo hosting Pedestrian Status

Can
  • @Sürmeli can you please tell me about "roundingPrecision"? Where do you get that? – Akhtar Mar 04 '16 at 07:06
  • @Sürmeli thanks for the reply. Do you have this function in Objective-C? Basically, I pass the accelerometer values through an extended Kalman filter; should I use these processed values or the raw ones? Any help is appreciated. – Akhtar Mar 25 '16 at 12:59
  • @Akhtar the code is written in Swift. When deciding the step count, use the processed values (Kalman filtered, etc.), not the raw ones; the processed values are there for this reason in the first place. – Can Mar 25 '16 at 20:49
  • Thanks @CanSürmeli – dev_m May 16 '16 at 07:49
  • @CanSürmeli There is no code like the above in the project... actually it is an empty app created in Swift... – dev_m May 16 '16 at 07:53
  • @kishan94 Oops, I'm really sorry. My bad. Silly me. I created that repo when I was busy, and frankly the commits somehow got mixed up. I was creating different versions of the project, so the wrong one got uploaded. I have a heavy workload at the moment, but I'll try to fix it as soon as possible and report back here. – Can May 16 '16 at 07:58
  • @CanSürmeli Would be super helpful to see how you've used it for the Pedestrian Status app project. Thanks in advance. – Thomas May 19 '16 at 02:02
  • @kishan94 just uploaded it to GitHub; for good this time :] Sorry it took this long. I also added the GitHub repo link to my answer above. – Can May 23 '16 at 13:50

You can use Apple's machine learning framework CoreML to find out the user's activity. First you need to collect labeled data and train a classifier; then you can use the resulting model in your app to classify user activity. You may follow this series if you are interested in CoreML activity classification:

https://medium.com/@tyler.hutcherson/activity-classification-with-create-ml-coreml3-and-skafos-part-1-8f130b5701f6

DareDevil