Julian Bui
2008-10-29 08:59:54 UTC
Hi everyone,
First off, I'm a database and SQLite newbie. I'm inserting a very large
number of records and indexing on one of the double attributes. I am
seeing my insert times slowly degrade as the database grows, until the
rate becomes unacceptable - less than one write per millisecond (other
databases have scaled well). I'm using an Intel Core 2 Duo with 2 GB of
RAM and an ordinary HDD.
I am trying to figure out why some of the other databases (Firebird,
MySQL, Berkeley DB) have provided constant insert speeds whereas SQLite
has not. I don't mean to say anything negative about SQLite, because
there's a good chance I'm not doing something right. Aside from not
having the db perfectly tuned, I think I probably have bigger problems,
such as not having a suitable index or database structure.
I don't know much about how SQLite does its indexing. My index,
(CREATE INDEX IF NOT EXISTS IDX_DDD ON MY_TABLE(ddd)), does not really
tell the database how to make use of my data. Since records are
inserted so that ddd is already sorted (record i's ddd value is less
than record i+1's), the database ought to be able to exploit this to
index them quickly. I figure a clustered index would help, but I
haven't seen any way to create one in SQLite.
It would really help me if someone could explain some basic lower-level
details about how SQLite does its indexing, or provide some other
insight into why insert times might slow down as the database grows.
Thanks in advance,
Julian
<code>
// ps is a PreparedStatement
ps = conn.prepareStatement("CREATE TABLE IF NOT EXISTS MY_TABLE(aaa SMALLINT " +
    "NOT NULL, bbb BIGINT NOT NULL, ccc SMALLINT, ddd DOUBLE, eee CHAR(8))");
ps.execute();
....
ps = conn.prepareStatement("CREATE INDEX IF NOT EXISTS IDX_DDD ON MY_TABLE(ddd)");
ps.execute();
...
dataInsertPs = conn.prepareStatement("INSERT INTO MY_TABLE(aaa, bbb, ccc, " +
    "ddd, eee) VALUES (?, ?, ?, ?, ?)");
// every dataInsertPs gets added to a batch and committed every 1000 records
</code>
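For completeness, here is roughly what the batching loop mentioned in that
last comment looks like (the Record type and the records collection are just
placeholders standing in for my real data source; conn and dataInsertPs are
the objects from the snippet above):
<code>
conn.setAutoCommit(false);                 // group inserts into explicit transactions
int pending = 0;
for (Record r : records) {                 // Record/records are placeholders for my data source
    dataInsertPs.setShort(1, r.aaa);
    dataInsertPs.setLong(2, r.bbb);
    dataInsertPs.setShort(3, r.ccc);
    dataInsertPs.setDouble(4, r.ddd);
    dataInsertPs.setString(5, r.eee);
    dataInsertPs.addBatch();
    if (++pending == 1000) {               // flush and commit every 1000 records
        dataInsertPs.executeBatch();
        conn.commit();
        pending = 0;
    }
}
if (pending > 0) {                         // flush the final partial batch
    dataInsertPs.executeBatch();
    conn.commit();
}
</code>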