
I was playing around with the `_` underscore in the Python interpreter and wanted to see whether it behaves the same way in code. I have used the underscore in code as a 'don't care' variable, like this:

_, a = someFunction()

And in the interpreter to get the last stored value, like this:

>>> 2 + 2
4
>>> a = _
>>> a
4

Now I tried to execute the following example code:

for i in range(5):
    2 + 1
    a = _

print (a)

I ran this both in the interpreter and as a Python script, using `python underscore.py`.
Keeping in mind that `_` holds the last evaluated value (it is not being used as a 'don't care' variable here), the expected outcome would be 2 + 1 = 3, making 3 the last stored value, which is then saved into the variable `a` by `a = _`.

The outcome of the interpreter was the following:

>>> for i in range(5):
...     2 + 1
...     a = _
...
3
3
3
3
3
>>> print(a)
3

This outcome is what I expected. However, the same code saved in a Python script and run with `python underscore.py` resulted in a NameError:

C:\Users\..\Python files>python underscore.py
Traceback (most recent call last):
  File "underscore.py", line 3, in <module>
    a = _
NameError: name '_' is not defined

Reading the error, it sounds logical that the `_` variable is not defined, and it probably has something to do with how Python runs a script. Still, I was wondering: what is the difference between these two cases that makes one give a somewhat logical answer (once you've been using the interpreter like this for a while) and the other a NameError?

So don't get me wrong, I do know what the `_` symbol does in Python. What I'm asking is why the exact same code behaves differently in the interpreter than when run as a Python program from the terminal.

funie200
  • https://hackernoon.com/understanding-the-underscore-of-python-309d1a029edc – mohammedgqudah May 03 '19 at 12:00
  • I can re-open if you think the dupe is wrong, but it seems like you do already understand the difference, so I'm not sure what you're asking – Chris_Rands May 03 '19 at 12:03
  • @Chris_Rands I just want to know why the behavior of the `_` is different between Python code and the interpreter. The answer you posted is what the `_` in code means, and yes, I know what it does – funie200 May 03 '19 at 12:04
  • 1
  • Ok, I re-opened, but what are you asking then? About the internals of how the Python interpreter is implemented? Or about the *rationale* for the different behaviour? Look up `sys.displayhook` – Chris_Rands May 03 '19 at 12:08
  • @Chris_Rands Thanks for that reference, `sys.displayhook` explains exactly what I wanted to know. – funie200 May 03 '19 at 12:14
  • 1
  • The special assignment to `_` only happens when you execute an *expression* in interactive mode. That's not what's happening here - you're executing a *statement* (a `for` loop), which has no value to assign. – jasonharper May 03 '19 at 12:14

1 Answer


For anyone who finds it interesting: with the help of @Chris_Rands I found out that the Python interpreter stores the last evaluated value using `sys.displayhook`, more on that here.

`sys.displayhook` is called on the result of evaluating an expression entered in an interactive Python session. That means this behavior only exists in the interpreter, not in a Python script run from the terminal.
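
As a rough illustration (a sketch of my own, not something you would write in real code), you can call the default `sys.displayhook` yourself from a script. The default hook prints the value's repr and stores it in `builtins._`, which is the same name the interactive prompt reads back, so the following reproduces the interpreter transcript from the question:

import sys

# In interactive mode the REPL calls sys.displayhook on the result of every
# expression statement; a plain script never does, so `_` is never created there.
# Calling the default hook manually reproduces the interactive behaviour:
for i in range(5):
    sys.displayhook(2 + 1)  # prints 3 and stores it in builtins._
    a = _                   # `_` now resolves via builtins, so no NameError

print(a)  # 3

In a real script you would of course just write a = 2 + 1; the point is only to show that `_` comes from the display hook, not from the language itself.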

funie200