Python CouchDB script consuming too much memory (getting killed)
I'm processing a CouchDB view that returns about 10 million rows. Is there a way to iterate over the rows so that memory is "cleaned up" after each row has been processed?
For example, in Django you can use queryset.iterator() to prevent queryset caching, which helps with memory consumption.
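Roughly like this (Entry and process are placeholders, not real names from my project):

    # iterator() streams rows from the database cursor instead of
    # caching the whole queryset in memory.
    for entry in Entry.objects.all().iterator():
        process(entry)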
This is what I'm doing:
    import couchdb

    couch = couchdb.Server(url)
    couch.resource.credentials = (username, password)
    db = couch[database_name]
    # Fetches the whole view result; result.rows holds every row in memory.
    result = db.view('xxx/xxx', None, stale='update_after', reduce='false')
    total = 0
    for row in result.rows:
        total += row['value']['num']
My actual code is more complicated than this, so please don't suggest using reduce in CouchDB.
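For reference, couchdb-python also has Database.iterview, which pages through a view in fixed-size batches so only one batch of rows is in memory at a time. A minimal sketch of how that could replace the loop above (the batch size of 1000 is arbitrary; url, username, password, and database_name are the same placeholders as before):

    import couchdb

    couch = couchdb.Server(url)
    couch.resource.credentials = (username, password)
    db = couch[database_name]

    total = 0
    # iterview pages through the view using startkey-based batches,
    # so at most `batch` rows are held in memory at any one time.
    for row in db.iterview('xxx/xxx', batch=1000, stale='update_after', reduce='false'):
        total += row['value']['num']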