I have a sproc that generates an 80,000-row temp table, which is passed as a table-valued parameter (TVP) to 32 other sprocs (each sproc takes the TVP as an input parameter).
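
For context, here is a minimal sketch of the pattern (the table type and procedure names are hypothetical):

    -- User-defined table type; TVPs are passed as variables of this type
    CREATE TYPE dbo.BatchRow AS TABLE (Id INT PRIMARY KEY, Payload NVARCHAR(100));
    GO
    CREATE PROCEDURE dbo.ConsumeBatch
        @Batch dbo.BatchRow READONLY   -- TVP parameters must be declared READONLY
    AS
    BEGIN
        SELECT COUNT(*) AS RowsReceived FROM @Batch;   -- stand-in for the real work
    END
    GO
    -- The generating sproc fills one table variable and hands it to each consumer:
    DECLARE @Batch dbo.BatchRow;
    INSERT INTO @Batch (Id, Payload)
    SELECT TOP (80000)
           ROW_NUMBER() OVER (ORDER BY (SELECT NULL)), N'sample payload'
    FROM sys.all_columns AS a CROSS JOIN sys.all_columns AS b;
    EXEC dbo.ConsumeBatch @Batch = @Batch;   -- repeated for each of the 32 sprocs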

  1. Should I be concerned about ballooning memory use that I can't manage?
  2. What is a good way to monitor (PerfMon?) how this memory is being used?

Thanks.

Snowy

1 Answer


1) According to this question:

Performance of bcp/BULK INSERT vs. Table-Valued Parameters

TVPs will underperform bulk copy on datasets that large. On the other hand... figure out the maximum data size of your 80,000 rows and decide whether you're OK with an object of that size floating around in RAM. (Personally I wouldn't have a problem with it... we could store our entire DB in RAM three times over.)
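
As a rough back-of-the-envelope sketch (using the hypothetical dbo.BatchRow type above, one INT plus one NVARCHAR(100) column), the worst case for 80,000 rows is small by modern standards:

    -- INT = 4 bytes; NVARCHAR(100) = up to 200 bytes; allow ~10 bytes row overhead
    SELECT 80000 * (4 + 200 + 10) / 1024.0 / 1024.0 AS ApproxWorstCaseMB;   -- ~16 MB

Swap in the actual column sizes from your table type; even generous per-row estimates tend to land in the tens of megabytes at 80,000 rows.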

2) Here is a good thread on ServerFault for monitoring SQL Server's memory usage:

https://serverfault.com/questions/115957/viewing-sqls-cache-ram-usage
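
If you would rather query from inside SQL Server than set up PerfMon counters, the standard memory DMVs cover the same ground. A sketch (pages_kb is the SQL Server 2012+ column name; older versions split it into single_pages_kb and multi_pages_kb):

    -- Largest memory clerks first (buffer pool, plan cache, etc.)
    SELECT TOP (10) type, SUM(pages_kb) / 1024 AS size_mb
    FROM sys.dm_os_memory_clerks
    GROUP BY type
    ORDER BY size_mb DESC;

    -- The same counters PerfMon exposes, read through a DMV
    SELECT counter_name, cntr_value
    FROM sys.dm_os_performance_counters
    WHERE object_name LIKE '%Memory Manager%'
      AND counter_name IN ('Total Server Memory (KB)', 'Target Server Memory (KB)');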

Matthew