I'm pretty new to databases and have been stuck on a problem for a while. I think I need some guidance on how to start with this kind of problem.
I have a web server set up to receive data and write it to an Oracle database.
The server needs to be able to receive multiple rows' worth of data per user (about 30 - 100 rows per 10 seconds, with occasional chunks of 1000 rows when there is a backlog).
From testing the application, it looks like there is a bottleneck when trying to scale the number of users. I suspect I'm taking the wrong approach to how I'm writing the queries / structuring the table.
Testing with 100 - 150 users gives me a linearly increasing response time (up to 800 seconds after 20 minutes!).
My impression is that each request's block of data has to queue, and Oracle processes each write one by one, right? Is that due to the table being locked?
The data is being written to 1 table, and each 'row' is being entered with 1 query. So if I have 30 rows of data, it needs to run 30 inserts.
What are the best things to try when trying to improve performance?
- Will having multiple tables help?
- Is there a way to write multiple lines at once, and would that help?
Probably a long shot, but I'm hoping someone can help / has come across a similar problem.
Thanks.
To improve efficiency/security, I suggest you use bind variables (if you don't use them already):
insert into mytable (col1, col2) values (:toto, :tata);
instead of
insert into mytable (col1, col2) values ('value1', 'value2');
This works for selects too:
select col1 from mytable where id = :myid;
More info in the doc: http://docs.oracle.com/cd/b10501_01/appdev.920/a96584/oci05bnd.htm
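If your inserts are built as dynamic SQL in PL/SQL, here is a minimal sketch of the bound version (the variable names and the varchar2 column types are just placeholders, not anything from your setup):

declare
  v_col1 varchar2(100) := 'value1';
  v_col2 varchar2(100) := 'value2';
begin
  -- :1 and :2 are bind placeholders, so Oracle can reuse the parsed
  -- statement instead of hard-parsing a new one for every literal value
  execute immediate
    'insert into mytable (col1, col2) values (:1, :2)'
    using v_col1, v_col2;
  commit;
end;
/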
Secondly, if you plan on inserting multiple lines at once, it is more efficient to do a bulk insert: the fewer round trips back and forth between server and client, the better.
For example, if you have an array:
forall i in 1..myarray.count insert into mytable (col1) values (myarray(i));
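A self-contained sketch of that FORALL pattern (assuming mytable only needs its col1 column filled here; the type name, loop bound and dummy values are just illustrative):

declare
  type t_col1_tab is table of mytable.col1%type index by pls_integer;
  myarray t_col1_tab;
begin
  -- fill the collection with the incoming rows (30, 100, 1000...)
  for i in 1..30 loop
    myarray(i) := 'row ' || i;
  end loop;

  -- one context switch: the whole collection is handed to the SQL engine at once
  forall i in 1..myarray.count
    insert into mytable (col1) values (myarray(i));

  commit;
end;
/

Most Oracle client drivers expose the same idea as array/batch binding, so the web server can send each chunk of 30 - 1000 rows in one round trip instead of one insert per row.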
Also, I don't think you need multiple tables (hard to say without more info); generally, Oracle can handle this kind of volume without splitting it across tables.
Hope this helps.