Question:
#!/usr/bin/python
i = 1
def f():
    print "i=%s" % i
    i = i + 1
f()
Answer:

This program fails due to a scoping error:

Traceback (most recent call last):
  File "function_scope.py", line 8, in ?
    f()
  File "function_scope.py", line 5, in f
    print "i=%s" % i
UnboundLocalError: local variable 'i' referenced before assignment

Python has an unusual scoping system, where functions are pre-scanned for assignments before they are executed. Any variable which gets assigned is assumed to be local unless explicitly declared global.
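For example, a minimal sketch of the same program with an explicit global declaration (the file layout here will not match the line numbers in the traceback above):

#!/usr/bin/python
i = 1
def f():
    global i            # tell the pre-scan that i refers to the module-level name
    print "i=%s" % i
    i = i + 1
f()                     # prints "i=1"; i is now 2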
This has two undesirable effects:
- It confuses users, since the same sort of code would work in almost any other language. It is not intuitive that you must declare parent variables as global in order to assign a value to them.
- It forces the use of kludges in order to access parent variables which are not global. Basically, you must put the parent data into a list or a dictionary in order to trick the Python interpreter into thinking you are not assigning a value to it (see the sketch after this list).
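Here is a minimal sketch of that container kludge (the name state is illustrative, not from the original). Mutating a dictionary entry is not an assignment to a bare name, so the pre-scan creates no local variable and no global declaration is needed:

#!/usr/bin/python
state = {"i": 1}        # wrap the value in a mutable container
def f():
    print "i=%s" % state["i"]
    state["i"] = state["i"] + 1   # rebinds a dict entry, not the name state
f()                               # prints "i=1"; state["i"] is now 2

Python 2 has no nonlocal keyword (it was added in Python 3), so for variables in an enclosing function scope, rather than module scope, this trick was the standard workaround.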