2.2 Using transactions
When you use an ANSI-compliant RDBMS such as Oracle or DB2, the database interface must perform a COMMIT after each statement execution. This generates unnecessary database operations and can slow down big loops. To avoid this implicit COMMIT, control the transaction yourself by placing BEGIN WORK / COMMIT WORK around code that executes many SQL statements.
For example, the following loop will generate 2000 basic SQL operations ( 1000 INSERTs plus 1000 COMMITs ):
PREPARE s FROM "INSERT INTO tab VALUES ( ?, ? )"
FOR n=1 TO 1000
   EXECUTE s USING n, c -- Generates implicit COMMIT
END FOR
You can improve performance if you put a transaction block around the loop:
PREPARE s FROM "INSERT INTO tab VALUES ( ?, ? )"
BEGIN WORK
FOR n=1 TO 1000
   EXECUTE s USING n, c -- In transaction -> no implicit COMMIT
END FOR
COMMIT WORK
With this code, only 1001 basic SQL operations will be executed ( 1000 INSERTs plus 1 COMMIT ).
However, you must take care when generating large transactions, because all modifications are registered in the transaction logs. When the number of operations is very large, this can exhaust database server resources (the Informix "transaction too long" error, for example). If the SQL operation does not require a single transaction for database consistency reasons, you can split it into several transactions, as in the following example:
PREPARE s FROM "INSERT INTO tab VALUES ( ?, ? )"
BEGIN WORK
FOR n=1 TO 1000
   IF n MOD 10 == 0 THEN
      COMMIT WORK
      BEGIN WORK
   END IF
   EXECUTE s USING n, c -- In transaction -> no implicit COMMIT
END FOR
COMMIT WORK
2.3 Avoiding long transactions
Some BDL applications do not care about long transactions because they use an Informix database without transaction logging (transactions are not stored in log files for potential rollbacks). However, if a failure occurs, no rollback can be made, and only some of the rows of a query might be updated. This could result in data inconsistency!
With many providers (Genero DB, SQL Server, IBM DB2, Oracle…), using transactions is mandatory. Every database modification is stored in a log file.
BDL applications must prevent long transactions when connected to a database that uses logging. If a table holds hundreds of thousands of rows, a "DELETE FROM table" might cause problems: once the transaction log is full, no other insert, update or delete can be made on the database, and activity can be stopped until the log is backed up or truncated!
On Oracle, for example, the same "DELETE FROM table" might produce a "snapshot too old" error if the rollback segments are too small.
Solution:
You must review the program logic in order to avoid long transactions:
- Keep transactions as short as possible.
- Access the least amount of data possible while in a transaction. If possible, avoid big SELECT statements inside a transaction.
- Split a long transaction into many short transactions. Use a loop to handle each block (see the last example in 2.2 Using transactions, and the sketch after this list).
- To delete all rows from a table, use the "TRUNCATE TABLE" instruction instead of "DELETE FROM" (not supported by all vendors).
As a last resort, increase the size of the transaction log to avoid it filling up.
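The following is a minimal sketch of the batched-delete approach. It assumes a table named tab with an INTEGER primary key column named id (both names are placeholders), and the chunk size of 100 is arbitrary:
-- Purge all rows from tab using short transactions.
-- WITH HOLD keeps the cursor open across the intermediate COMMITs.
FUNCTION purge_tab()
  DEFINE k, n INTEGER
  DECLARE del_curs CURSOR WITH HOLD FOR SELECT id FROM tab
  PREPARE del_stmt FROM "DELETE FROM tab WHERE id = ?"
  LET n = 0
  BEGIN WORK
  FOREACH del_curs INTO k
    EXECUTE del_stmt USING k
    LET n = n + 1
    IF n MOD 100 == 0 THEN
      COMMIT WORK
      BEGIN WORK
    END IF
  END FOREACH
  COMMIT WORK
  FREE del_stmt
END FUNCTION
When the whole table must be emptied and the target database supports it, a single EXECUTE IMMEDIATE "TRUNCATE TABLE tab" avoids logging each deleted row.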
2.4 Declaring prepared statements
Line 2 of the following example shows a cursor declared with a prepared statement:
01  PREPARE s FROM "SELECT * FROM table WHERE " || condition
02  DECLARE c CURSOR FOR s
While this has no performance impact with Informix database drivers, it can become a bottleneck when using non-Informix databases:
Statement preparation consumes a lot of memory and processor resources. Declaring a cursor from a prepared statement is a native Informix feature that requires only one real statement preparation. Non-Informix databases do not support this feature, so the statement is prepared twice (once for the PREPARE and once for the DECLARE). When used in a big loop, such code can cause performance problems.
To optimize such code, you can use the FROM clause in the DECLARE statement:
DECLARE c CURSOR FROM "SELECT * FROM table WHERE " || condition
With this solution, only one statement preparation is done by the database server.
Remark: This performance problem does not appear with DECLARE statements using static SQL.
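For comparison, a static SQL cursor declaration might look like the following minimal sketch (the tab table, the tabkey column and the minkey variable are placeholders, not part of the original example):
DEFINE minkey INTEGER
-- Static SQL: the driver prepares this statement only once,
-- no separate PREPARE instruction is needed.
DECLARE c2 CURSOR FOR SELECT * FROM tab WHERE tabkey >= minkey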
2.5 Saving SQL resources
To write efficient SQL in a Genero program, you should use Dynamic SQL as described in 2.1 of this performance section. However, when using Dynamic SQL, you allocate an SQL statement handle on the client and server side, consuming resources. Depending on the database type, this can be a few bytes or a significant amount of memory. For example, on a 32-bit Linux platform, a prepared statement costs about 5 Kbytes with an Informix CSDK 2.80 client, while it costs about 20 Kbytes with an Oracle 10g client. That can be a lot of memory if you have programs declaring a dozen or more cursors, multiplied by hundreds of user processes. When executing several Static SQL statements, the same statement handle is reused, and thus less memory is needed.
Genero allows you to use either Static or Dynamic SQL, so it is up to you to choose between saving memory and gaining performance. However, in some cases the same code will be used by different kinds of programs, needing either low resource usage or good performance. In many OLTP applications you can actually distinguish two types of programs:
- Programs where memory usage is not a problem but which need good performance (typically, batch programs executed as a single instance during the night).
- Programs where performance is less important but where memory usage must be limited (typically, interactive programs executed as multiple instances, one per application user).
To reuse the same code for interactive programs and batch programs, you can do the following:
- Define a local module variable as an indicator for the prepared statement.
- Write a function returning the type of program (for example, 'interactive' or 'batch' mode).
- Then, in a reusable function using SQL statements, prepare and free the statement according to the indicator, as shown in the next example.
01  DEFINE up_prepared INTEGER
02
03  FUNCTION getUserPermissions( username )
04    DEFINE username VARCHAR(20)
05    DEFINE cre, upd, del CHAR(1)
06
07    IF NOT up_prepared THEN
08      PREPARE up_stmt FROM "SELECT can_create, can_update, can_delete"
09              || " FROM user_perms WHERE name = ?"
10      LET up_prepared = TRUE
11    END IF
12
13    EXECUTE up_stmt USING username INTO cre, upd, del
14
15    IF isInteractive() THEN
16      FREE up_stmt
17      LET up_prepared = FALSE
18    END IF
19
20    RETURN cre, upd, del
21
22  END FUNCTION
The first time this function is called, the up_prepared value will be FALSE, so the statement will be prepared in line 08 and the indicator set to TRUE. On subsequent calls, the statement is re-prepared only if up_prepared has been reset to FALSE. The statement is executed in line 13 and values are fetched into the variables returned in line 20. If the program is interactive, lines 15 to 18 free the statement and set the up_prepared module variable back to FALSE, forcing statement preparation in the next call of this function.
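The isInteractive() helper is not shown in the original example. A minimal sketch could decide the mode from an environment variable; the PROGRAM_MODE name is just an assumption, and a command-line argument or a configuration table would work as well:
-- Hypothetical helper: returns TRUE for interactive programs and FALSE
-- for batch programs, based on the PROGRAM_MODE environment variable.
FUNCTION isInteractive()
  DEFINE mode STRING
  LET mode = fgl_getenv("PROGRAM_MODE")
  IF mode IS NOT NULL AND mode = "batch" THEN
    RETURN FALSE
  ELSE
    RETURN TRUE
  END IF
END FUNCTION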