Python SQLite error: "attempt to write a readonly database"
I'm using chunks to write to a local SQLite database. After the write has progressed past a certain point, I get the error "attempt to write a readonly database". If I change the chunk size, the point at which the error occurs changes as well. The database is created and grows as the process runs; with one chunk size it's 15 MB before the error, with another it gets to 35 MB. Any ideas on how to fix this, or why it's occurring?
import pandas as pd
from sqlalchemy import create_engine
import datetime as dt

disk_engine = create_engine('sqlite:///c:\\databases\\test.db')
start = dt.datetime.now()
chunksize = 100000
j = 0
index_start = 1

for df in pd.read_csv('c:\\my_file.txt', sep='\t', error_bad_lines=False,
                      chunksize=chunksize, iterator=True, encoding='ISO-8859-1'):
    df = df.rename(columns={c: c.replace(' ', '') for c in df.columns})  # remove spaces from column names
    df.index += index_start

    columns = ['column_1']
    for c in df.columns:
        if c not in columns:
            df = df.drop(c, axis=1)

    j += 1
    print '{} seconds: completed {} rows'.format((dt.datetime.now() - start).seconds, j * chunksize)

    df.to_sql('data', disk_engine, if_exists='append')
    index_start = df.index[-1] + 1
Output:

0 seconds: completed 100000 rows
1 seconds: completed 200000 rows
3 seconds: completed 300000 rows
4 seconds: completed 400000 rows
6 seconds: completed 500000 rows
7 seconds: completed 600000 rows
8 seconds: completed 700000 rows
10 seconds: completed 800000 rows
11 seconds: completed 900000 rows
13 seconds: completed 1000000 rows
14 seconds: completed 1100000 rows
16 seconds: completed 1200000 rows
17 seconds: completed 1300000 rows
19 seconds: completed 1400000 rows
OperationalError: (sqlite3.OperationalError) attempt to write a readonly database
[SQL: u'INSERT INTO data ("index") VALUES (?)']
[parameters: ((1300001L,), (1300002L,), (1300003L,), (1300004L,), (1300005L,), (1300006L,), (1300007L,), (1300008L,) ... displaying 10 of 100000 total bound parameter sets ... (1399999L,), (1400000L,))]
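For reference, the chunked-append pattern itself can be reproduced in isolation. This is a minimal Python 3 sketch using small hypothetical stand-in files (sample.csv and test.db are not the question's actual paths); if this runs cleanly on the target machine, the error is more likely environmental (permissions or locks on the database file or its directory) than a problem with the pattern.

```python
import os
import sqlite3

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical stand-ins for c:\my_file.txt and c:\databases\test.db.
csv_path = 'sample.csv'
db_path = 'test.db'

# Build a small tab-separated file to read back in chunks.
pd.DataFrame({'column_1': range(10), 'other': range(10)}).to_csv(csv_path, sep='\t', index=False)

if os.path.exists(db_path):
    os.remove(db_path)  # start from an empty database so the appends below are the only rows

engine = create_engine('sqlite:///' + db_path)
chunksize = 4
index_start = 1

for df in pd.read_csv(csv_path, sep='\t', chunksize=chunksize):
    df.index += index_start          # keep a continuous index across chunks
    df = df[['column_1']]            # keep only the wanted column
    df.to_sql('data', engine, if_exists='append')
    index_start = df.index[-1] + 1

# Verify that every row from every chunk arrived.
with sqlite3.connect(db_path) as conn:
    count = conn.execute('SELECT COUNT(*) FROM data').fetchone()[0]
print(count)  # prints 10
```

Note that SQLite needs write access to the database file *and* to its directory (it creates a journal file alongside the database), so it is worth checking both when this error appears mid-run.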