I am testing how big a collection can be in .NET. Technically, any collection object can grow to the size of the physical memory.
Then I tested the following code on a server with 16 GB of memory, running Windows Server 2003 and Visual Studio…
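A test of this kind (a sketch with illustrative names, not the asker's actual code) typically grows a list of buffers until allocation fails:

```csharp
using System;
using System.Collections.Generic;

class CollectionGrowthTest
{
    static void Main()
    {
        // Keep adding 64 MB buffers to a List until allocation fails, then
        // report how far we got. In a 32-bit process this fails around the
        // 2 GB address-space limit; in a 64-bit process it can run until
        // physical memory and swap are exhausted.
        var chunks = new List<byte[]>();
        try
        {
            while (true)
            {
                chunks.Add(new byte[64 * 1024 * 1024]);
            }
        }
        catch (OutOfMemoryException)
        {
            Console.WriteLine("Allocated roughly {0} MB before failing.", chunks.Count * 64L);
        }
    }
}
```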
I'm using a BinarySerializer with a pretty big (although not very deep) graph of items. I have 8 GB of RAM backed by 12 GB of swap, and I'm getting an OutOfMemoryException when serializing, which is expected (it's possible the graph could go near or…
I have a 64-bit PC with 128 GB of RAM and I'm using C# and .NET 4.5.
I have the following code:
double[,] m1 = new double[65535, 65535];
long l1 = m1.LongLength;
double[,] m2 = new double[65536, 65536]; // Array dimensions exceeded supported range
long…
After several OutOfMemoryExceptions, I enabled "gcAllowVeryLargeObjects" and it works perfectly fine. I am now wondering why it is not a default option in C# (on a 64-bit platform).
Is it for pure compatibility reasons? Or am I missing a major…
In .NET 4.5, gcAllowVeryLargeObjects was introduced to allow arrays greater than 2 GB in size on 64-bit systems. However, arrays were (and still are) limited to ~4.2 billion elements in total and ~2.1 billion in any dimension. Why?
Is there no interest in it…
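A common consequence of the per-dimension cap is that code needing more elements chunks one logical array over several physical ones. A minimal sketch of the idea (class and field names are illustrative):

```csharp
using System;

// Presents a logical byte array of arbitrary length, backed by 1 GiB
// chunks so no single dimension approaches the ~2.1 billion element cap.
class BigByteArray
{
    const int ChunkSize = 1 << 30; // 1 GiB per chunk, well under the cap
    readonly byte[][] chunks;

    public BigByteArray(long length)
    {
        long chunkCount = (length + ChunkSize - 1) / ChunkSize;
        chunks = new byte[chunkCount][];
        for (long i = 0; i < chunkCount; i++)
        {
            long remaining = length - i * ChunkSize;
            chunks[i] = new byte[Math.Min(remaining, ChunkSize)];
        }
    }

    public byte this[long index]
    {
        get { return chunks[index / ChunkSize][(int)(index % ChunkSize)]; }
        set { chunks[index / ChunkSize][(int)(index % ChunkSize)] = value; }
    }
}
```

The trade-off is an extra division and indirection per access, in exchange for not needing gcAllowVeryLargeObjects at all.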
I'm using the Array.Copy method (Array, Array, Int64). I have the following static method:
public static T[,] Copy<T>(T[,] a)
    where T : new()
{
    long n1 = a.GetLongLength(0);
    long n2 = a.GetLongLength(1);
    T[,] b = new T[n1, n2];
…
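A plausible shape for the rest of such a helper (a sketch assuming a rectangular deep copy is wanted, not the asker's actual code) finishes with the Int64 overload of Array.Copy:

```csharp
using System;

static class ArrayUtil
{
    // Deep-copies a rectangular 2-D array. Using GetLongLength, LongLength
    // and the Int64 overload of Array.Copy keeps it correct for arrays with
    // more than int.MaxValue total elements (which additionally requires
    // gcAllowVeryLargeObjects on 64-bit).
    public static T[,] Copy<T>(T[,] a)
    {
        long n1 = a.GetLongLength(0);
        long n2 = a.GetLongLength(1);
        T[,] b = new T[n1, n2];
        Array.Copy(a, 0L, b, 0L, a.LongLength);
        return b;
    }
}
```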
I want to resize an image on my website, but when I use Bitmap to load an image of 14032*19864 pixels (PNG), an OutOfMemoryException is thrown. My compiler configuration is Any CPU.
I was not sure whether the running environment was actually x64.
The code…
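One quick way to check whether an "Any CPU" build actually resolved to a 64-bit process (a small diagnostic sketch, independent of the asker's code):

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // Under "Any CPU" with "Prefer 32-bit" checked, the process runs
        // 32-bit even on a 64-bit OS, capping its address space and making
        // very large Bitmap allocations fail with OutOfMemoryException.
        Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
        Console.WriteLine("64-bit OS:      {0}", Environment.Is64BitOperatingSystem);
        Console.WriteLine("IntPtr size:    {0} bytes", IntPtr.Size);
    }
}
```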
I'm unable to understand how to set the gcAllowVeryLargeObjects runtime param for a worker role. I set this param in app.config, but it doesn't work. As I understand it, I need to somehow configure it in the config file of the worker host.
Update: Final solution based on…
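For reference, the flag itself is a standard runtime configuration element; the fragment below shows its usual shape. The part that varies (and that the question is really about) is *which* config file it must land in — it has to be the configuration of the executable that actually hosts the code, not necessarily the role project's own app.config:

```xml
<!-- Typical placement of the flag; for a worker role, apply it to the
     host executable's .exe.config rather than the role's app.config. -->
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
```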
The question is regarding the allocation of arrays in .NET. I have a sample program below in which the maximum array I can get is of a certain length. If I increase the length by 1, it gives an OutOfMemoryException. But if I keep the length and remove the comments, I am…
I'm a little bit out of ideas. With the following code I try to instantiate a byte array larger than 2 GB:
var b = Array.CreateInstance(typeof(byte), uint.MaxValue);
Every time, it causes a System.ArgumentOutOfRangeException with the message…
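For context (documented in the Array class remarks, worth verifying against your runtime): even with gcAllowVeryLargeObjects enabled, any single dimension is capped at 2,147,483,591 (0x7FFFFFC7) elements for byte arrays, so uint.MaxValue (4,294,967,295) is rejected regardless of configuration. Staying under the per-dimension cap can still yield an object over 2 GB; a sketch:

```csharp
using System;

class LargeArrayDemo
{
    static void Main()
    {
        // ~4 GB of doubles in one dimension: 500 million elements is below
        // the per-dimension cap, but the object exceeds 2 GB, so this still
        // requires gcAllowVeryLargeObjects on a 64-bit process.
        var big = Array.CreateInstance(typeof(double), 500000000L);
        Console.WriteLine(big.LongLength);
    }
}
```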
Given
// r is a System.Data.IDataRecord
var blob = new byte[(r.GetBytes(0, 0, null, 0, int.MaxValue))];
r.GetBytes(0, 0, blob, 0, blob.Length);
and r.GetBytes(...) returns Int64
Since Array.zeroCreate and Array.init take an Int32, how do I create an…
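One common approach when the blob's length may not fit a preallocated array (a sketch in C# for illustration, not a definitive answer) is to stream it in fixed-size chunks instead of sizing one buffer up front:

```csharp
using System;
using System.Data;
using System.IO;

static class BlobReader
{
    // Reads a blob column chunk by chunk into a MemoryStream, so no single
    // length value passed to GetBytes needs to cover the whole blob.
    public static byte[] ReadBlob(IDataRecord r, int ordinal)
    {
        var buffer = new byte[81920];
        using (var ms = new MemoryStream())
        {
            long offset = 0;
            long read;
            while ((read = r.GetBytes(ordinal, offset, buffer, 0, buffer.Length)) > 0)
            {
                ms.Write(buffer, 0, (int)read);
                offset += read;
            }
            return ms.ToArray();
        }
    }
}
```

Note this still materializes the result as a single byte array at the end; for blobs beyond the array element limit you would keep the data in the stream (or chunks) instead.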
I wanted to show a colleague that you can allocate more than 2 GB of RAM, so I made a little test application.
let mega = 1 <<< 20
let konst x y = x
let allocate1MB _ = Array.init mega (konst 0uy)
let memoryHog = Array.Parallel.init 8192…
In my C# app I use gcAllowVeryLargeObjects because I am doing image processing with large datasets, resulting in extensive RAM usage. Now I want to write some UnitTests and I am running into the same situations I had without…
Due to the ~2.15 billion element limitation of the .NET Framework (even taking into account 64-bit Windows, .NET 4.5+, and gcAllowVeryLargeObjects), I needed to create my own BigStringBuilder to manipulate extremely large strings.
Unfortunately,…
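The chunking idea behind such a class can be sketched as follows (illustrative, not the author's actual BigStringBuilder): hold a list of bounded StringBuilders so that no single backing array approaches the element limit.

```csharp
using System;
using System.Collections.Generic;
using System.Text;

class BigStringBuilder
{
    // Keep each chunk well under the per-array cap; assumes individual
    // appends are themselves smaller than ChunkLimit.
    const int ChunkLimit = 1 << 28;
    readonly List<StringBuilder> chunks =
        new List<StringBuilder> { new StringBuilder() };

    public void Append(string s)
    {
        var last = chunks[chunks.Count - 1];
        if (last.Length + (long)s.Length > ChunkLimit)
        {
            last = new StringBuilder();
            chunks.Add(last);
        }
        last.Append(s);
    }

    public long Length
    {
        get
        {
            long total = 0;
            foreach (var c in chunks) total += c.Length;
            return total;
        }
    }
}
```

The cost is that operations spanning chunk boundaries (substring, search) need their own logic, which is typically where such a class grows.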