
So below should contain the relevant piece of code, and I do think this is some sort of permissions issue. I can edit my file with Notepad and run my executable jar. However, if I am viewing the EXACT SAME file but edited via my PowerShell script using the same credentials, then the program exits immediately with the error "java.lang.NumberFormatException: null". This error still refers to the same exact file if you follow the trace, so I'm at a loss. Any ideas based on this info? The files really are the same, because I've even copied the new file content generated by the PowerShell script into the OLD file, and that old file WORKS! I've checked the properties and the security profile as well. They are identical minus size (one is double the size). Still, I checked in Notepad++ and didn't see anything.

$properties = Get-Content $filename

foreach ($item in $properties) {
    if ($item -like '*a=*') {
        $oldValue = $item.Substring(5)
        Write-Host "changing $oldValue to $newa"
        $item = $item.Replace($oldValue, $newa)
    } elseif ($item -like '*sql_column=*') {
        $oldValue = $item.Substring(11)
        Write-Host "changing $oldValue to $sqlColumn"
        $item = $item.Replace($oldValue, $sqlColumn)
    } elseif ($item -like '*sql=*') {
        $oldValue = $item.Substring(4)
        Write-Host "changing $oldValue to $sqlQuery"
        $item = $item.Replace($oldValue, $sqlQuery)
    }
    $out = $out + $item
}
Clear-Content "C:\app\vapp.properties"
$out >> "C:\app\vapp.properties"
#mv "C:\app\vapp.properties" "C:\app\vapp.properties"
#|out-file -FilePath "C:\app\vapp.properties"
Write-Host "changed file to Transaction Type: $TransType"

2 Answers


They are identical minus size (one is double the size). Still, I checked in Notepad++ and didn't see anything.

The double size is the likely giveaway here. Windows PowerShell 5.1 (and earlier) encodes text files as UTF-16 LE with a BOM by default for the method you're using. Each character in the ASCII range takes up 2 bytes in UTF-16 rather than 1 byte in more common encodings like UTF-8, hence the doubled size on disk. By using the >> operator to append the output to the cleared copy of your file, you're effectively changing whatever encoding it previously used to UTF-16 LE. Your app likely doesn't know how to deal with that encoding.
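
If you want to confirm that, peek at the first few bytes of each file. A minimal check, using the path from your question (adjust as needed):

# FF FE    -> UTF-16 LE BOM (what >> writes here in Windows PowerShell 5.1)
# EF BB BF -> UTF-8 BOM; anything else is most likely BOM-less ASCII/UTF-8
$bytes = [System.IO.File]::ReadAllBytes('C:\app\vapp.properties')
($bytes[0..3] | ForEach-Object { '{0:X2}' -f $_ }) -join ' '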

As @mklement0 mentions in his comment, the above explanation is a rather simplistic view of the intricacies surrounding file encoding in PowerShell. For way more detail, see his excellent answer from another question.

Incidentally, Notepad++ shows the file's encoding in the Encoding menu and on the right-hand side of the status bar.

Instead of using >>, try using Out-File with the -Encoding utf8 or -Encoding ascii parameter. You'll still have the BOM with utf8, but most apps these days can deal with that pretty well. If you don't want the BOM, check @mklement0's answer for a function that should help.

# You can skip the Clear-Content call before this as well
$out | Out-File "C:\app\vapp.properties" -Encoding utf8 -Force
Ryan Bolger
  • FYI this solution only works for PowerShell Core (6+). Doesn't work for Windows PowerShell (5.1 and earlier). – Bender the Greatest Dec 03 '19 at 21:56
  • This worked for me, and surprisingly it was in PowerShell version 2. I'm not complaining! – MasonTheStoneWorker Dec 03 '19 at 22:42
  • @BendertheGreatest: The solution works, assuming that the fact that the output file will have a UTF-8 _BOM_ is not a problem (which _on Windows_ is usually fine). – mklement0 Dec 03 '19 at 23:52
  • Maybe I'm just one of the lucky ones; a UTF-8 BOM causes issues with a few programs I use. I generally use the function I placed in my answer now, but I do remember CCTray is sensitive to it, as well as an XML and JSON parser I use to validate those file formats. I know I've had issues with Git seeing files with a BOM as binary, not text, but I can't remember if I tried it with a UTF-8 BOM or not. – Bender the Greatest Dec 04 '19 at 15:19

You don't say what your PowerShell version is, but if it's 5.1 or earlier, the redirection operators > and >> will write the file as UTF-16 LE with a Byte-Order Mark (BOM) when it's created. Out-File does the same by default.

In PowerShell 5.1, I use this function (it should also work back to at least version 4):

Function Write-FileWithoutBom {
    Param(
        [Parameter(Mandatory = $true)]
        [string]$Content,
        [Parameter(Mandatory = $true)]
        [string]$Path
    )
    # Passing $false to the UTF8Encoding constructor means "do not emit a BOM"
    $UTF8EncodingNoBom = New-Object System.Text.UTF8Encoding $False
    # Note: .NET resolves relative paths against its own working directory,
    # so prefer an absolute $Path here
    [System.IO.File]::WriteAllText( $Path, $Content, $UTF8EncodingNoBom )
}
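
Usage would look something like this (a sketch using the file and the $out variable from your question; -join collapses $out into a single string if it happens to be an array of lines, which is what WriteAllText expects):

# Hypothetical usage; $out is assumed to hold the rebuilt properties content
Write-FileWithoutBom -Path 'C:\app\vapp.properties' -Content ($out -join [Environment]::NewLine)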

In PowerShell 6 and newer, Out-File behaves a bit differently in that the UTF8 encoding does not produce a BOM (you would have to use UTF8BOM to get the 5.1 and earlier behavior):

'text' | Out-File -Encoding UTF8 C:\path\to\file.txt

The byte-order mark confuses many programs and makes some think they're looking at binary files or an otherwise invalid format. That's almost certainly what's behind your specific error, too: Java's Properties.load(InputStream) reads the file as raw ISO-8859-1 bytes and doesn't strip a BOM or decode UTF-16, so the keys it parses no longer match what your code looks up, getProperty() returns null, and feeding that null to a number parser is exactly what produces "java.lang.NumberFormatException: null".


I found an old question of mine that boils down to the same answer I just gave, with some more detail in @mklement0's answer there.

Bender the Greatest
  • Just to let you know: your answer was correct as well, but since I used his solution (and he answered first) I gave him the check. Thank you for your answer! – MasonTheStoneWorker Dec 03 '19 at 22:41