
EDIT: Could anyone please help me understand why the time variable is behaving the way it is? The output looks like this:

c:\directory><
for /D /r %%F IN ("*") DO ( 
   for %%G  IN ("%%F\*.sdf") DO xcopy "%%G" c:\Path2\%newest% /y /i 
for /f "delims=_" %%J IN ('forfiles /p "%%F" /m *.extension /c "cmd /c echo @path"')  DO start "Program"  /D "c:\program files\Path2" /Wait jmpt -r  %%J
set ENDTIME=(whatever the current time is)
echo STARTTIME:
echo ENDTIME:
set /a DURATION=-
endlocal
rem this ends the runtime segment
>

So I've got a batch file that executes multiple files using a for loop. I'm trying to add a script to measure the time that it takes for each file to execute (not the entire batch file). In other words, I'd like to measure how long each iteration of the loop takes to finish. A coworker gave me a working batch time-measurement script, but when I try to implement it in my batch file, it sets the start and end time at the very beginning of the run and never updates them.

Here's his batch file

@ECHO OFF
set PATH=%PATH%;C:\Path

set STARTTIME=%TIME%

rem here begins the command you want to measure
@ECHO %PATH%
rem here ends the command you want to measure
rem (I, the stackoverflow asker, just used @ECHO %PATH% to test if his script worked)

set ENDTIME=%TIME%

rem output as time
echo STARTTIME: %STARTTIME%
echo ENDTIME: %ENDTIME%

rem convert STARTTIME and ENDTIME to centiseconds
set /A STARTTIME=(1%STARTTIME:~0,2%-100)*360000 + (1%STARTTIME:~3,2%-100)*6000 + (1%STARTTIME:~6,2%-100)*100 + (1%STARTTIME:~9,2%-100)
set /A ENDTIME=(1%ENDTIME:~0,2%-100)*360000 + (1%ENDTIME:~3,2%-100)*6000 + (1%ENDTIME:~6,2%-100)*100 + (1%ENDTIME:~9,2%-100)

rem calculating the duration is easy
set /A DURATION=%ENDTIME%-%STARTTIME%

rem we might have measured the time in between days
if %ENDTIME% LSS %STARTTIME% set /A DURATION=%STARTTIME%-%ENDTIME%

rem now break the centiseconds down to hours, minutes, seconds and the remaining centiseconds
set /A DURATIONH=%DURATION% / 360000
set /A DURATIONM=(%DURATION% - %DURATIONH%*360000) / 6000
set /A DURATIONS=(%DURATION% - %DURATIONH%*360000 - %DURATIONM%*6000) / 100
set /A DURATIONHS=(%DURATION% - %DURATIONH%*360000 - %DURATIONM%*6000 - %DURATIONS%*100)

rem some formatting
if %DURATIONH% LSS 10 set DURATIONH=0%DURATIONH%
if %DURATIONM% LSS 10 set DURATIONM=0%DURATIONM%
if %DURATIONS% LSS 10 set DURATIONS=0%DURATIONS%
if %DURATIONHS% LSS 10 set DURATIONHS=0%DURATIONHS%

rem outputting
echo STARTTIME: %STARTTIME% centiseconds
echo ENDTIME: %ENDTIME% centiseconds
echo DURATION: %DURATION% in centiseconds
echo %DURATIONH%:%DURATIONM%:%DURATIONS%,%DURATIONHS%

endlocal

Now my batch file looks like this (with his implemented)

RMDIR c:\directory /s /q
mkdir c:\directory
xcopy "\\Path1\*" c:\directory /s /i
cd c:\Path2
for /f "tokens=*" %%a in ('dir /b /od') do set newest=%%a
cd c:\directory
sqlcmd -S Server -i "\\Path3\query1.sql"

for /D /r %%F IN ("*") DO ( 
   for %%G  IN ("%%F\*.sdf") DO xcopy "%%G" c:\Path2\%newest% /y /i 

@ECHO OFF

set STARTTIME=%TIME%

   for /f "delims=_" %%J IN ('forfiles /p "%%F" /m *.extension /c "cmd /c echo @path"')  DO start "Program"  /D "c:\program files\Path2" /Wait jmpt -r  %%J




set ENDTIME=%TIME%

rem output as time
echo STARTTIME: %STARTTIME%
echo ENDTIME: %ENDTIME%

rem convert STARTTIME and ENDTIME to centiseconds
set /A STARTTIME=(1%STARTTIME:~0,2%-100)*360000 + (1%STARTTIME:~3,2%-100)*6000 (1%STARTTIME:~6,2%-100)*100 + (1%STARTTIME:~9,2%-100)
set /A ENDTIME=(1%ENDTIME:~0,2%-100)*360000 + (1%ENDTIME:~3,2%-100)*6000 + (1%ENDTIME:~6,2%-100)*100 + (1%ENDTIME:~9,2%-100)

rem calculating the duration is easy
set /A DURATION=%ENDTIME%-%STARTTIME%

rem we might have measured the time in between days
if %ENDTIME% LSS %STARTTIME% set set /A DURATION=%STARTTIME%-%ENDTIME%

rem now break the centiseconds down to hours, minutes, seconds and the remaining centiseconds
set /A DURATIONH=%DURATION% / 360000
set /A DURATIONM=(%DURATION% - %DURATIONH%*360000) / 6000
set /A DURATIONS=(%DURATION% - %DURATIONH%*360000 - %DURATIONM%*6000) / 100
set /A DURATIONHS=(%DURATION% - %DURATIONH%*360000 - %DURATIONM%*6000 - %DURATIONS%*100)

rem some formatting
if %DURATIONH% LSS 10 set DURATIONH=0%DURATIONH%
if %DURATIONM% LSS 10 set DURATIONM=0%DURATIONM%
if %DURATIONS% LSS 10 set DURATIONS=0%DURATIONS%
if %DURATIONHS% LSS 10 set DURATIONHS=0%DURATIONHS%

rem outputting
echo STARTTIME: %STARTTIME% centiseconds
echo ENDTIME: %ENDTIME% centiseconds
echo DURATION: %DURATION% in centiseconds
echo %DURATIONH%:%DURATIONM%:%DURATIONS%,%DURATIONHS%


endlocal
)

sqlcmd -S Server -i "\\path3\query2.sql"

pause

When I run my batch file with those additions, I get a start and end time (set before the file actually starts running) that are equal to each other, and both are equal to the start and end times from the previous and the upcoming iteration.

Any ideas what the deal is?

David
  • The deal is probably delayed expansion. Change `@echo off` to `@echo on` and look at the output to see what's actually being run. – indiv Dec 12 '14 at 19:19
  • When I turned `@ECHO ON`, the code "worked", in the sense that it didn't give an endless series of `STARTTIME: ENDTIME: Unbalanced Parenthesis`, but I don't know why setting `@ECHO ON` would affect the performance of the script. Also, I realized that it had been giving a start and end time before that, but it's setting all start and end times equal to the time the script started. So each loop returns a start and end time of whatever time it was when I began the script, and it returns this before the loop that I'm trying to measure actually executes. – David Dec 15 '14 at 14:04
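As the comment above suggests, `cmd` expands `%VAR%` once, when it parses an entire parenthesized block, not on each iteration; that is why every loop pass reports the same `%TIME%`. A minimal sketch of the symptom and the usual `EnableDelayedExpansion` fix (the loop here is illustrative, not the asker's actual loop):

    @ECHO OFF
    setlocal EnableDelayedExpansion

    for /L %%I in (1,1,3) do (
        rem %TIME% was expanded once, when the whole block was parsed
        echo parse-time value:   %TIME%
        rem !TIME! is expanded each time this line actually runs
        echo delayed expansion:  !TIME!
        rem crude ~1 second delay so the two values visibly drift apart
        ping -n 2 127.0.0.1 >nul
    )
    endlocal

The `ping -n 2 127.0.0.1 >nul` line is just a portable way to pause for about a second on older Windows versions; `timeout /t 1` works on Vista and later.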

2 Answers


In your code,

set /A STARTTIME=(1%STARTTIME:~0,2%-100)*360000 + (1%STARTTIME:~3,2%-100)*6000 (1%STARTTIME:~6,2%-100)*100 + (1%STARTTIME:~9,2%-100)

You are missing a `+` after `*6000`:

set /A STARTTIME=(1%STARTTIME:~0,2%-100)*360000 + (1%STARTTIME:~3,2%-100)*6000 + (1%STARTTIME:~6,2%-100)*100 + (1%STARTTIME:~9,2%-100)

Also, in your posted code:

if %ENDTIME% LSS %STARTTIME% set set /A DURATION=%STARTTIME%-%ENDTIME%

set is repeated - evidently, that code has never been tested across midnight.

In both, the correct calculation when ENDTIME is less than STARTTIME is to add 3600*2400 (3600 sec/hr * 24 hr/dy * 100 for centiseconds) to the resultant (negative) duration.
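Combining that midnight correction with a per-iteration measurement that sidesteps the block-expansion problem, here is a sketch: moving the measured work into a `call`ed subroutine means `%TIME%` is re-read on every call, with no delayed expansion needed. The `:timedRun` label and the `ping` placeholder for the work being measured are illustrative:

    @ECHO OFF
    setlocal

    for /L %%I in (1,1,3) do call :timedRun %%I
    goto :eof

    :timedRun
    rem inside a CALLed subroutine, %TIME% is re-expanded on every call
    set "STARTTIME=%TIME%"
    rem --- placeholder for the command you want to measure ---
    ping -n 2 127.0.0.1 >nul
    rem -------------------------------------------------------
    set "ENDTIME=%TIME%"

    rem convert both to centiseconds (the 1xx-100 trick strips leading zeros)
    set /A START_CS=(1%STARTTIME:~0,2%-100)*360000 + (1%STARTTIME:~3,2%-100)*6000 + (1%STARTTIME:~6,2%-100)*100 + (1%STARTTIME:~9,2%-100)
    set /A END_CS=(1%ENDTIME:~0,2%-100)*360000 + (1%ENDTIME:~3,2%-100)*6000 + (1%ENDTIME:~6,2%-100)*100 + (1%ENDTIME:~9,2%-100)
    set /A DURATION=END_CS-START_CS
    rem crossed midnight: add one full day, 3600*2400 = 8640000 centiseconds
    if %DURATION% LSS 0 set /A DURATION+=8640000
    echo iteration %1 took %DURATION% centiseconds
    goto :eof

Note that, like the code in the question, this assumes the `%TIME%` format `HH:MM:SS.cc`; a locale with a different time format would need different substring offsets.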

Magoo
  • Interestingly enough, those typos are isolated to my post here. They don't show up in the actual code but I just copied and pasted.. anyways, another interesting issue is that the start time and end time are being declared before the process inside the `rem here begins the command you want to measure` and `rem here ends the command you want to measure` is executed. I would think that the way I've set up the code that the start time would be declared and then the loop would go and then the endtime would declare. – David Dec 15 '14 at 13:47
  • I think this answer can be improved to just show how to do it and focus less on the code by the poster. – Mike Q Apr 30 '20 at 14:41
  • See this link as well https://stackoverflow.com/questions/9922498/calculate-time-difference-in-windows-batch-file – Mike Q Apr 30 '20 at 14:43

How about using the right tool for the job?

Using WPT For Batch Script Profiling

That should give you more than enough insight into your batch scripts, and you can turn on additional profiling of the executed commands if you want to find out why it is slow.

Alois Kraus