
I am updating two different SQL tables in the same loop using parameterized queries in Delphi XE8. The whole thing is wrapped in a transaction, so that if anything in the loop fails, neither table gets updated.

My original question was posted here; this is the current simplified code after help from this site:

begin
  Query1:=TSQLQuery.Create(nil);
  try
    Query1.SQLConnection:=Connection;
    Query1.SQL.Text:='UPDATE TABLE A SET QUANTITY = :Quantity '+
                     'WHERE PRODUCT_NAME = :Product_name';

    Query2:=TSQLQuery.Create(nil);
    try
      Query2.SQLConnection:=Connection;
      Query2.SQL.Text:='UPDATE TABLE B SET QUANTITY = :Quantity '+
                       'WHERE PRODUCT_NAME = :Product_name';

      Transaction:=Connection.BeginTransaction;
      try
        for I := 1 to whatever do
        begin
          { fill params here and execute the commands }
          Query1.ExecSQL;
          Query2.ExecSQL;
        end;
        Connection.CommitFreeAndNil(Transaction);
      except
        Connection.RollbackFreeAndNil(Transaction);
        raise;
      end;
      .... etc.    
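The all-or-nothing behaviour that transaction is meant to give can be sketched in Python with sqlite3 (not Delphi, but the semantics are the same; table names and values here are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_a (product_name TEXT PRIMARY KEY, quantity INTEGER)")
conn.execute("CREATE TABLE table_b (product_name TEXT PRIMARY KEY, quantity INTEGER)")
conn.execute("INSERT INTO table_a VALUES ('widget', 1)")
conn.execute("INSERT INTO table_b VALUES ('widget', 1)")
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on exception
        conn.execute("UPDATE table_a SET quantity = ? WHERE product_name = ?",
                     (99, "widget"))
        raise RuntimeError("simulated failure before table_b is updated")
except RuntimeError:
    pass

# Because the transaction rolled back, table_a kept its old value too.
row = conn.execute("SELECT quantity FROM table_a").fetchone()
print(row[0])  # -> 1
```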

I've just realised that I might have a problem, in that part of the update of table B (ie query 2) involves first retrieving a record from table B, and then updating it based on one of the values returned.

So, if I flesh out the loop above:

for I:= 1 to whatever do
begin
  //Retrieve relevant values from file being read 
  Product_name:=Product_names[I];
  Quantity:=Values[I];

  //Execute query 1, no problems here 
  Query1.Params.ParamByName('Product_name').AsString:=Product_name;
  Query1.Params.ParamByName('Quantity').AsString:=Quantity;
  Query1.ExecSQL;

  //Interim get from Table B 
  //I am using datasets here that are already open in my actual code, 
  //but it could also be a SQL_query3 component; I am simply showing 
  //the logic here of what's going on
  SQL_dataset1.Close;
  SQL_dataset1.CommandType:=ctQuery;
  SQL_dataset1.CommandText:=
      'SELECT QUANTITY FROM TABLE B WHERE PRODUCT_NAME = '+
      QuotedStr(Product_name);
  SQL_dataset1.Open;

  Old_quantity:=SQL_dataset1.FieldByName('Quantity').AsString;

  //add as numbers, not string concatenation
  New_quantity:=IntToStr(StrToInt(Old_quantity)+StrToInt(Quantity));

  //Execute query 2
  Query2.Params.ParamByName('Product_name').AsString:=Trim_str(Product_name);
  Query2.Params.ParamByName('Quantity').AsString:=New_quantity;
  Query2.ExecSQL;
  ... etc.
end;

So the entire loop could theoretically update the same product's quantity, and the updated quantity is based on the previous quantity.

Is this even possible, or will I have to settle for one update at a time? Sorry if this is a stupid question.

Also, while table A can certainly be updated using the above code as it doesn't have the same problem as table B, I don't want it to update at all if there are any issues with table B updates.

Part 2: simplified example

Table A is noise in what I'm actually trying to understand from this question, sorry, so let me rephrase with only table B. What's happening is that a file is being read sequentially, and based on each row's information, table B's running total has to be updated.

So let's say table B has the following records:

Product name   Quantity
Red widget     3
Blue widget    5

We sequentially read in a file of widget purchases, which is in random order and contains a number of red and blue purchases mixed up.

For example:

  1. Red widget +6
  2. Red widget +2
  3. Blue widget +1
  4. Red widget +2 ... and so on.

Looking at the code sample below....

Query2.SQL.Text:= 'UPDATE TABLE B SET QUANTITY = :Quantity '+
                  'WHERE PRODUCT_NAME = :Product_name';

for I := 1 to file length do
begin
  //get the current quantity for that widget
  //add the quantity purchased in that row in the file to the 
  //quantity just retrieved 

  Query2.Params.ParamByName('Product_name').AsString:=Trim_str(Product_name);
  Query2.Params.ParamByName('Quantity').AsString:=Trim_str(Quantity);
  Query2.ExecSQL;
end;

....my question is: will looping through a parameterized query like this update the running total as we go along? It's probably a stupid question, but just trying to wrap my head around the difference between that and this ...

for I := 1 to file length do
begin
  //get the current quantity for that widget
  //add the quantity purchased in that row in the file to the  
  //quantity just retrieved 

  Query2.SQL.Text:= 'UPDATE TABLE B SET QUANTITY = '+Quantity+
                    ' WHERE PRODUCT_NAME = ''Red widget''';
  Query2.ExecSQL;
end; 

...where it certainly is updating as you go along. I just want to be sure I understand these parameterized queries properly. From what I've read there is certainly some optimization if you use parameters, but where I'm not clear is whether that 'optimization' means fewer trips to the database. Fewer calls to the database would mean the running totals going out of sync; that's the worry in my head!
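For concreteness, the behaviour in question can be sketched in Python with sqlite3 (the table and the "file" contents are invented): each parameterized UPDATE is executed at ExecSQL time, so the next SELECT in the loop already sees the new value, even before anything is committed.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_b (product_name TEXT PRIMARY KEY, quantity INTEGER)")
conn.executemany("INSERT INTO table_b VALUES (?, ?)",
                 [("Red widget", 3), ("Blue widget", 5)])

# Rows read sequentially from the purchases file, in random order
purchases = [("Red widget", 6), ("Red widget", 2),
             ("Blue widget", 1), ("Red widget", 2)]

for name, bought in purchases:  # transaction stays open throughout
    # interim get: read the current quantity for this widget
    old = conn.execute("SELECT quantity FROM table_b WHERE product_name = ?",
                       (name,)).fetchone()[0]
    # write back the running total with a parameterized UPDATE
    conn.execute("UPDATE table_b SET quantity = ? WHERE product_name = ?",
                 (old + bought, name))

conn.commit()
totals = dict(conn.execute("SELECT product_name, quantity FROM table_b"))
print(totals)  # Red: 3+6+2+2 = 13, Blue: 5+1 = 6
```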

Obviously the tables and files are more complex than this, and we have ID's set up as keys etc. My bad example is just for the purposes of the logic of the question. Once I understand this, I can apply my limited knowledge to improving the query!

Alex
    So what do you want, the dataset you're iterating over to reflect the result of the UPDATE commands? If so, you'll need to re-run the SELECT query (as you wrote in example 2). Unless you have some strange logic you need to do client-side, however, trying to do all this in one large SQL statement would be far more efficient. Databases don't like running lots of UPDATEs in a loop (or cursor). – Matt Allwood Jul 13 '15 at 11:02
  • Is it correct that crucial `{ fill params here and execute the commands }` stands outside of the `for I := 1 to whatever to` loop? IOW you are executing EXACTLY the same queries with exactly the same parameters time and again. Shouldn't you CHANGE the parameters within the loop ? – Arioch 'The Jul 13 '15 at 11:09
  • @Matt Allwood I thought I had that covered by parameterizing the updates. The problem is that the second update is dependent on the results of updates as we go along the loop, thus I'm not sure the parameterizing will work in this case? – Alex Jul 13 '15 at 11:10
  • @Arioch 'The Sorry, populating the params is within the loop, it's just commentary that's outside the loop. Have edited. – Alex Jul 13 '15 at 11:13
  • To clarify your simplified example, have you basically got table A with a list of quantity changes and table B with a list of quantities that need updating? If so, this can be done as a set operation in a single DB call rather than a loop. If that is the case then we can spell out how to do it. Alternatively, can't you write query 2 to take into account the current quantity stored there without needing the SELECT? – Matt Allwood Jul 13 '15 at 13:01
  • @Matt Allwood I have edited my example to hopefully make it clearer. Table A simply gets updated with the changes read in from a file. Table B is meant to keep track of the quantities as the file gets read in. So, for example, if the quantity in the table is 10 widgets at the start of the read, and row 1 = +3, and row 2 = +2, then the update has to reflect 15 when committing the transaction. My question is, is this possible using the code login above, using parameterized queries as shown? I could keep track of the sum separately, and update at the end, but that's not what I'm asking. – Alex Jul 13 '15 at 13:46
  • @Alex Looping as above should be fine, though you may want to do an UPDATE Quantity = Quantity + :Change rather than pulling down the last known value, doing the sum locally and then sending that to the database. If you have to do it like this then doing the SUM locally and updating each record once will be faster than calling UPDATE once per TABLE A row. DB calls are expensive, so try to minimize them. – Matt Allwood Jul 13 '15 at 14:46

1 Answer


As a guess at what you're attempting to do, I think the following SQL approach is better than looping through updates. I am basing this on table A containing a list of quantity changes and table B containing the current quantity (i.e. if table B has 3 foo and 2 bar, and table A has +2 foo, -1 foo, +1 bar, then the result after the operation will be table B having 4 foo and 3 bar).

UPDATE TableB
SET Quantity = TableB.Quantity +
   (SELECT COALESCE(SUM(TableA.Change), 0)
    FROM TableA
    WHERE TableA.ID = TableB.ID)

This works in Fiddle for SQL Server, YMMV (http://sqlfiddle.com/#!6/017df/7/0)
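A quick check of the intended operation (apply the summed changes to each stored quantity) can also be run in SQLite; the names are invented, and COALESCE keeps products with no changes at their current quantity instead of setting them to NULL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE TableA (ID INTEGER, Change INTEGER);                  -- quantity changes
CREATE TABLE TableB (ID INTEGER PRIMARY KEY, Name TEXT, Quantity INTEGER);
INSERT INTO TableB VALUES (1, 'foo', 3), (2, 'bar', 2);
INSERT INTO TableA VALUES (1, 2), (1, -1), (2, 1);
""")

# Single set-based statement instead of a client-side loop
conn.execute("""
UPDATE TableB
SET Quantity = Quantity +
   (SELECT COALESCE(SUM(TableA.Change), 0)
    FROM TableA
    WHERE TableA.ID = TableB.ID)
""")

result = {name: qty for _, name, qty in
          conn.execute("SELECT ID, Name, Quantity FROM TableB")}
print(result)  # -> {'foo': 4, 'bar': 3}
```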

As an aside, you may want to join on a ProductID primary key rather than the product name. If you want to understand why, look into DB normalisation.

To cover your actual question: I can't see any reason it wouldn't work (it would just be a LOT slower than the single SQL statement above). The transaction effectively 'freezes' any records you touch with your UPDATE statements. Within the transaction you'll be able to SELECT and UPDATE any manipulated records as though there were no transaction; others may or may not be able to see the changes depending on other DB settings (and they definitely can't update or delete those records). So as long as you don't run your other queries on a separate SQLConnection, you'll be fine.

Amendment following question edit:

Yes, that should work fine, though I highly recommend doing

UPDATE TableB 
SET Quantity = Quantity + :QuantityChange 
WHERE PRODUCT_NAME = :Product_name

then you don't need to run the other SELECT query at all (unless you need the updated total client-side, for instance to write out a log). The big bonus with parameters is protection from SQL injection attacks, but they can also help with query optimisation at the DB end. In terms of trips to the DB, you get one of those on every ExecSQL; the optimisation just means the DB spends less time thinking about each one.
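That increment form can be sketched in Python with sqlite3 (invented data): one parameterized UPDATE per file row, no interim SELECT, and the running total still comes out right because the database does the addition.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_b (product_name TEXT PRIMARY KEY, quantity INTEGER)")
conn.executemany("INSERT INTO table_b VALUES (?, ?)",
                 [("Red widget", 3), ("Blue widget", 5)])

purchases = [("Red widget", 6), ("Red widget", 2),
             ("Blue widget", 1), ("Red widget", 2)]

for name, change in purchases:
    # The server adds the change to the stored value, so no SELECT is needed.
    conn.execute("UPDATE table_b SET quantity = quantity + ? WHERE product_name = ?",
                 (change, name))

conn.commit()
totals = dict(conn.execute("SELECT product_name, quantity FROM table_b"))
print(totals)  # -> {'Red widget': 13, 'Blue widget': 6}
```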

Matt Allwood
  • Yes, normalization is quite missing here. And your point with executing the statement without fetching anything to the client is definitely the way to go. – TLama Jul 13 '15 at 16:43
  • Thanks @Matt Allwood for all your help. I have added a 'part 2' to the original question in an attempt to strip out some of the noise and try and clarify what I'm actually asking. If I understand you correctly, you answer this in the second half of your answer above? – Alex Jul 14 '15 at 08:30
  • @Matt Allwood Took me a while, but I finally see what you mean! Don't need the get using your syntax, which is great. Thanks very much. While I'm thinking of it, I presume I can use the same logic in a SQL server merge query? What about SQLite, we are currently using an 'INSERT OR REPLACE'....? – Alex Jul 14 '15 at 14:55
  • @Alex I'm not so familiar with MERGE, as I've never used them, nor SQLite. The easiest way to find out is to try it out with a noddy example. I'd be surprised if it didn't work, though as it's a fairly fundamental concept. The sqlfiddle website I linked to earlier is quite nice for trying things out quickly without needing to install a full DB server. – Matt Allwood Jul 15 '15 at 08:47
  • @Matt Allwood I've got the MERGE working well in SQL server, the relevant bit is pretty much as per your example: WHEN MATCHED THEN UPDATE SET Quantity = Quantity + :QuantityChange. SQLite not there yet, have posted a separate question [here](http://stackoverflow.com/questions/31429300/upsert-in-sqlite-with-running-total-if-record-is-found). Thanks again for the help. – Alex Jul 15 '15 at 11:40