python - A function with itself as a default argument


Default arguments are evaluated at definition time, so this does not work:

def f(x, func=f):
    if x < 1:
        return x
    else:
        return x + func(x - 1)

I did find a way around it, though. Start with a dummy function:

def a(x):
    return x

def f(x, func=a):
    if x < 1:
        return x
    else:
        return x + func(x - 1)

and then issue:

f.__defaults__ = (f,) 

This is obviously awkward. Is there a better way to do this, or is it just bad Python? If it's bad, can you explain why? Does it break things?

In any case, it works:

In [99]: f(10)
Out[99]: 55

In [100]: f(10, f)
Out[100]: 55

In [101]: f(10, lambda x: 2*x)
Out[101]: 28
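A sketch of how the awkwardness shows up in practice (my own illustration, assuming the definitions and the patch above):

import inspect

# The patched default is baked into the function object, so introspection
# now shows the function as its own default:
print(inspect.signature(f))   # (x, func=<function f at 0x...>)

# And re-running the def statement rebuilds __defaults__ from the dummy a,
# silently undoing the patch until it is reapplied.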

To elaborate on Andrea Corbellini's suggestion, you can do this:

def f(x, func=None):
    if func is None:
        func = f

    if x < 1:
        return x
    else:
        return x + func(x - 1)
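One subtlety worth noting, as a small sketch of my own (the name g below is just a hypothetical second handle on the function): with this idiom, f is looked up by name at call time, so the recursion follows whatever the global name f refers to when the call happens, whereas the __defaults__ patch pins a reference to the original function object:

g = f                     # keep a handle to the original function

def f(x, func=None):      # rebind the global name f to something else
    return -1

g(10)                     # inside g, "func = f" now finds the new f,
                          # so this returns 10 + (-1) == 9, not 55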

This is a pretty standard idiom in Python: implement the actual default inside the function, and declare the default parameter as None (or a private sentinel object, in case None is a valid input), because of the well-known problems with mutable default values such as list objects.
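For illustration, the classic mutable-default pitfall looks like this (a minimal sketch; the names are made up):

def append_to(item, bucket=[]):   # bucket is created once, at definition time
    bucket.append(item)
    return bucket

print(append_to(1))   # [1]
print(append_to(2))   # [1, 2] -- the same list object, shared across calls

And the sentinel variant, for when None is a legitimate input:

_MISSING = object()                # private sentinel object

def h(x, y=_MISSING):
    if y is _MISSING:              # an explicit None is now distinguishable
        y = x                      # compute the real default here
    return (x, y)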

