Python couchdb script consuming too much memory (getting KILLED)


I'm processing a CouchDB view that returns about 10 million rows. Is there a way to iterate over the rows and "clean up" memory after I've processed each row?

For example, in Django you can use queryset.iterator() to prevent queryset caching, which helps with memory consumption.

This is what I'm doing:

    couch = couchdb.Server(url)
    couch.resource.credentials = (username, password)
    db = couch[database_name]
    result = db.view('xxx/xxx', None, stale='update_after', reduce=False)
    total = 0
    for row in result.rows:
        total += row['value'].num

My actual code is more complicated, so please don't suggest using reduce in CouchDB.
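One workaround, since fetching the whole view materialises every row at once, is to page through the view so only one batch of rows is in memory at a time; couchdb-python ships a Database.iterview(name, batch, **options) helper that does this paging for you, if it is available in your version. Below is a minimal sketch of the paging idea itself, using a hypothetical fetch_page function as a stand-in for the CouchDB request (fetch_page, iter_rows_paged, and the fake row data are all illustrative, not part of any real API):

```python
def iter_rows_paged(fetch_page, batch_size=1000):
    """Yield view rows one page at a time, so at most
    `batch_size` rows are resident in memory at once."""
    skip = 0
    while True:
        # fetch_page stands in for one CouchDB view request
        # with ?limit=batch_size&skip=skip
        page = fetch_page(skip=skip, limit=batch_size)
        if not page:
            return
        for row in page:
            yield row
        skip += len(page)

# Fake in-memory "view" so the sketch runs without a server
rows = [{"value": {"num": n}} for n in range(10)]

def fake_fetch(skip, limit):
    return rows[skip:skip + limit]

total = sum(row["value"]["num"]
            for row in iter_rows_paged(fake_fetch, batch_size=3))
print(total)  # 45
```

In production you would replace skip-based paging with startkey/startkey_docid paging, since skip gets slower as it grows; the generator structure stays the same either way.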

