In basic language:
This code controls robots that are crawling/spidering your site by allowing only bing|Google|msn|MSR|Twitter|Yandex to do so and sending all others to a dead end.
Lines 1-3 describe the conditions for the action in line 4:
1 = if HTTP_USER_AGENT is knocking on the door [OR]
2 = if HTTP_USER_AGENT is a robot, crawler or spider
3 = and if HTTP_USER_AGENT is not one of the listed ones [case INsensitive]
4a = [RewriteRule] = you give them another address to go to (instead of letting them search your site)
4b = [^/?.*$] = for everything they want to look for
4c = you send them to their own navel (localhost) [R = redirecting them, and L = stopping the execution of the rule set]
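Put together, the rule set being described looks roughly like this. This is a sketch only: the exact user-agent patterns and the redirect target are assumptions based on the description above, not the original file.

```apache
# 1) Condition: the request carries a User-Agent header at all  [OR]
RewriteCond %{HTTP_USER_AGENT} .* [OR]
# 2) Condition: the User-Agent calls itself a robot, crawler or spider
RewriteCond %{HTTP_USER_AGENT} (robot|crawler|spider) [NC]
# 3) Condition: AND it is NOT one of the allowed bots (NC = case-insensitive)
RewriteCond %{HTTP_USER_AGENT} !(bing|Google|msn|MSR|Twitter|Yandex) [NC]
# 4) Rule: whatever path they ask for (^/?.*$), redirect (R) them to
#    localhost (their own machine) and stop processing the rule set (L)
RewriteRule ^/?.*$ http://127.0.0.1 [R,L]
```

Note that consecutive RewriteCond lines are ANDed by default; the [OR] flag on the first condition makes the logic (1 OR 2) AND 3, matching the walkthrough above.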