
I have some aggregated data:

name   asd  colA   ColB    ColC      X      Y
-----------------------------------------------
...
George  PJ  Sp      P       B       14.8    56
George  PJ  Sp      P       C       11.4    43
...
-----------------------------------------------

And a local result table:

declare @result table(
    Name nvarchar(255),
    SP_E_A int null,
    SP_E_B int null,
    SP_E_C int null,
    SP_E_D int null,
    SP_E_viso decimal(5,1) null,
    ...
    --LOTS MORE COLUMNS FOLLOW
    ...
)

It has lots of columns.

What I'm trying to do is generate an UPDATE statement dynamically and execute it.
Example:

UPDATE @table SET SP_P_B = 56

So I wrote this

-- open a cursor over *that result set* and, for each row, fetch @colA, @colB, @colC and @Y, then:
declare @sql varchar(255) = 'UPDATE @table SET ' + @colA + '_' + @colB + '_' + @colC + ' = ' + CONVERT(varchar, @Y);
exec (@sql)

The error I'm getting:

Must declare the table variable "@table".

It looks like EXEC runs the dynamic SQL in a separate scope, where the @result table variable doesn't exist.
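
For reference, the same error reproduces with a stripped-down example (the names here are just placeholders):

declare @t table (id int);   -- table variable declared in the outer batch
exec ('select * from @t');   -- the dynamic SQL runs in its own scope and fails with:
                             -- Must declare the table variable "@t".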

How can I work around it?

  • Just change to use a temp table. – Squirrel Mar 24 '16 at 13:22
  • @Squirrel should I use local, global, or tempdb? – evictednoise Mar 24 '16 at 13:26
  • Passing table variables in/out of dynamic SQL statements can be tricky. The issue here is that the dynamic SQL executes within a different context. This [question](http://stackoverflow.com/questions/4258798/pass-a-table-variable-to-sp-executesql) shows how you can work around that. – David Rushton Mar 24 '16 at 13:29
  • @destination-data I'm not sure passing a table as a parameter will work for me performance-wise (I expect it to degrade a lot if I pass around a huge table so many times; am I mistaken?) – evictednoise Mar 24 '16 at 13:36
  • local temp table `#temp` will do the job – Squirrel Mar 24 '16 at 13:37
  • @Squirrel `Invalid object name '#table'.` – evictednoise Mar 24 '16 at 13:41
  • Certainly it requires more effort upfront than switching to a temp table (which is the approach I'd probably take). Performance-wise the best way to tell is by trying both. Comparing the execution times and query plans will show you the way to go. On performance, as @SeanLange pointed out, removing the cursor would probably provide the biggest boost. – David Rushton Mar 24 '16 at 13:42
  • Did you create the `#table`? Change your `declare @table` line to `create table #table`. – Squirrel Mar 24 '16 at 13:46
  • Can you post your query? – Squirrel Mar 24 '16 at 14:20
  • Lol just noticed `create table ##table` and `insert into ##result` don't match. It works – evictednoise Mar 24 '16 at 14:21
  • You don't need to use a global temp table. A local temp table will work also. – Squirrel Mar 24 '16 at 14:39
  • You're right. Thanks a lot @Squirrel – evictednoise Mar 24 '16 at 14:43
  • Be careful if you are using a global temp table. They have some serious challenges when it comes to concurrency. If there is even the slightest chance more than 1 person will ever be executing this code at the same time you need to rethink this. And for heaven's sake using a cursor for updates is a recipe for horrible performance and maintenance. – Sean Lange Mar 24 '16 at 14:48
  • @SeanLange Thanks for the tips, although I'm not using this to `update [AdventureWorks2012].[Sales]`. It's a specific case with several ways to work around it, and I chose generating statements instead of a kilometer-long `CASE WHEN(...` because that would be an *even worse* maintenance nightmare if the business logic changes or I make a mistake. – evictednoise Mar 25 '16 at 09:12

1 Answer


You can try using a global temporary table, as the following code shows:

CREATE TABLE ##globalSession ( column1 int null, column2 varchar(45) null);

INSERT INTO ##globalSession (column1, column2) Values (2, 'initial data');

declare @sql VARCHAR(255) = 'UPDATE ##globalSession set column1 = 1, column2 = ''test'' ; ';

exec (@sql)

SELECT * FROM ##globalSession

DROP TABLE ##globalSession;
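
As the comments point out, a local temp table works as well and avoids the concurrency pitfalls of a global ##table: a temp table created in the outer batch is still visible to dynamic SQL run with EXEC, because it executes in a child scope of the same session. A sketch of the same example using a local temp table (names are again just placeholders):

CREATE TABLE #localResult ( column1 int null, column2 varchar(45) null);

INSERT INTO #localResult (column1, column2) VALUES (2, 'initial data');

-- the dynamic SQL runs in a nested scope but the same session, so #localResult is visible to it
declare @sql VARCHAR(255) = 'UPDATE #localResult set column1 = 1, column2 = ''test'' ; ';

exec (@sql)

SELECT * FROM #localResult

DROP TABLE #localResult;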