
I'm working with the Veins framework on top of the OMNeT++ simulator, and I'm facing a weird situation where a few nodes lose all received packets.

To put everybody in context: I'm simulating 100 nodes (4 flows of 25 nodes), all apparently under coverage, each sending 10 packets per second. Depending on the moment the nodes enter the network (i.e., are created by SUMO), some of them (usually just 1, but it can be 2, 3, 4...) enter a mode where all their packets are marked as lost (SNIRLostPackets) because each packet arrives while another packet is already being received (according to the Decider, the NIC is already synced to another frame).

That isn't supposed to happen in 802.11 unless there are hidden nodes and the senders don't see each other at the moment of transmitting their respective frames (both see the channel as idle), right?

So this behavior is not expected at all, and it ruins the final packet-loss statistics. I tuned the transmission power and the interference range, but nothing changed.

It happens too often to ignore it and I would like to know if anybody has experienced this behavior and how it was solved.

Thank you

Jack

1 Answer


OK, apparently the issue arises in a special case: a packet starts being received correctly, but by the end of the reception the node has switched to a TX state.

The packet is then marked as "received while sending", but the node has already registered this frame as the one it is synced to. As a result, it keeps discarding every subsequent incoming frame indefinitely.

It seems to be a bug, and a possible workaround is to add these lines:

if (!frame->getWasTransmitting()) {
    curSyncFrame = 0;
}

in the `processSignalEnd` function (in the `Decider80211p` file), inside the `frame->getWasTransmitting() || phy11p->getRadioState() == Radio::TX` branch.

I'm not quite sure whether this case should be able to happen at all, since a node should not send a packet while receiving one.

Hope it helps.

Baum mit Augen
Jack
    Did you perhaps set `allowTxDuringRx=true` (the hack that turns a node into a dumb jammer)? Otherwise, this shouldn't be possible: as soon as a node starts receiving a frame, it marks the channel as busy. This will keep it from trying to transmit. If you didn't configure a jammer and this is indeed a bug, do you have a test case to reproduce your report? – Christoph Sommer Jun 16 '16 at 10:01