C#: 'System.OutOfMemoryException' thrown when there is still plenty of memory free

Disclaimer: this page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute the original authors (not the translator). Original question: http://stackoverflow.com/questions/1153702/

Date: 2020-08-06 09:30:09  Source: igfitidea

'System.OutOfMemoryException' was thrown when there is still plenty of memory free

Tags: c#, memory-management, out-of-memory

Asked by m3ntat

This is my code:


int size = 100000000;
double sizeInMegabytes = (size * 8.0) / 1024.0 / 1024.0; //762 mb
double[] randomNumbers = new double[size];

Exception: Exception of type 'System.OutOfMemoryException' was thrown.


I have 4 GB of memory on this machine, and 2.5 GB is free when I start this running, so there is clearly enough space on the PC to handle the 762 MB of 100,000,000 random numbers. I need to store as many random numbers as possible given the available memory. When I go to production there will be 12 GB on the box and I want to make use of it.


Does the CLR constrain me to a default maximum amount of memory to start with? And how do I request more?


Update


I thought breaking this into smaller chunks and incrementally adding to my memory requirements would help if the issue is due to memory fragmentation, but it doesn't: I can't get past a total ArrayList size of 256 MB regardless of how I tweak blockSize.


private static IRandomGenerator rnd = new MersenneTwister();
private static IDistribution dist = new DiscreteNormalDistribution(1048576);
private static List<double> ndRandomNumbers = new List<double>();

private static void AddNDRandomNumbers(int numberOfRandomNumbers)
{
    for (int i = 0; i < numberOfRandomNumbers; i++)
    {
        ndRandomNumbers.Add(dist.ICDF(rnd.nextUniform()));
    }
}

From my main method:


int blockSize = 1000000;

while (true)
{
    try
    {
        AddNDRandomNumbers(blockSize);
    }
    catch (System.OutOfMemoryException)
    {
        break;
    }
}
double arrayTotalSizeInMegabytes = (ndRandomNumbers.Count * 8.0) / 1024.0 / 1024.0;

Accepted answer by Fredrik Mörk

You may want to read "'Out Of Memory' Does Not Refer to Physical Memory" by Eric Lippert.


In short, and very simplified, "Out of memory" does not really mean that the amount of available memory is too small. The most common reason is that within the current address space, there is no contiguous portion of memory that is large enough to serve the wanted allocation. If you have 100 blocks, each 4 MB large, that is not going to help you when you need one 5 MB block.


Key Points:


  • the data storage that we call “process memory” is in my opinion best visualized as a massive file on disk.
  • RAM can be seen as merely a performance optimization
  • Total amount of virtual memory your program consumes is really not hugely relevant to its performance
  • "running out of RAM" seldom results in an “out of memory” error. Instead of an error, it results in bad performance because the full cost of the fact that storage is actually on disk suddenly becomes relevant.

Answered by leppie

Increase the Windows process address-space limit to 3 GB (via boot.ini or the Vista boot manager).


Answered by Shay Erlichmen

You don't have a contiguous block of memory large enough to allocate 762 MB; your memory is fragmented, and the allocator cannot find a big enough hole to satisfy the allocation.


  1. You can try to work with /3GB (as others have suggested).
  2. Or switch to a 64-bit OS.
  3. Or modify the algorithm so it does not need one big chunk of memory; maybe allocate a few (relatively) smaller chunks instead.

Answered by redcalx

32-bit Windows has a 2 GB process memory limit. The /3GB boot option others have mentioned raises this to 3 GB, with just 1 GB remaining for OS kernel use. Realistically, if you want to use more than 2 GB without hassle then a 64-bit OS is required. This also overcomes the problem whereby, although you may have 4 GB of physical RAM, the address space required for the video card can make a sizeable chunk of that memory unusable - usually around 500 MB.


Answered by Francis B.

Well, I had a similar problem with a large data set, and trying to force the application to use that much data is not really the right option. The best tip I can give you is to process your data in small chunks if at all possible. Dealing with this much data, the problem will come back sooner or later. Plus, you cannot know the configuration of each machine that will run your application, so there is always a risk that the exception will happen on another PC.


Answered by jalf

I'd advise against the /3GB Windows boot option. Apart from everything else (it's overkill to do this for one badly behaved application, and it probably won't solve your problem anyway), it can cause a lot of instability.


Many Windows drivers are not tested with this option, so quite a few of them assume that user-mode pointers always point to the lower 2GB of the address space. Which means they may break horribly with /3GB.


However, Windows does normally limit a 32-bit process to a 2GB address space. But that doesn't mean you should expect to be able to allocate 2GB!


The address space is already littered with all sorts of allocated data. There's the stack, and all the assemblies that are loaded, static variables and so on. There's no guarantee that there will be 800MB of contiguous unallocated memory anywhere.


Allocating two 400 MB chunks would probably fare better, or four 200 MB chunks. Smaller allocations are much easier to find room for in a fragmented address space.

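The chunked approach above can be sketched as follows. The sizes and the halve-on-failure policy are illustrative assumptions, not from the answer; in practice chunkSize would start around 50 million doubles (~400 MB) rather than 4,000:

```csharp
using System;
using System.Collections.Generic;

long target = 10_000;                   // total doubles wanted (tiny, for illustration)
int chunkSize = 4_000;                  // doubles per chunk request
var chunks = new List<double[]>();
long allocated = 0;

while (allocated < target && chunkSize > 0)
{
    try
    {
        int size = (int)Math.Min(chunkSize, target - allocated);
        chunks.Add(new double[size]);   // several small blocks, not one big one
        allocated += size;
    }
    catch (OutOfMemoryException)
    {
        chunkSize /= 2;                 // a smaller hole is easier to find
    }
}

Console.WriteLine(allocated);           // 10000
```

Element i of the logical sequence then lives at chunks[i / chunkSize][i % chunkSize] as long as all full chunks share the same size.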

In any case, if you're going to deploy this to a 12 GB machine, you'll want to run it as a 64-bit application, which should solve all of these problems.


Answered by Dejan Stanić

If you need such large structures, perhaps you could utilize Memory Mapped Files. This article could prove helpful: http://www.codeproject.com/KB/recipes/MemoryMappedGenericArray.aspx


LP, Dejan

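A minimal sketch of the memory-mapped idea, independent of the linked CodeProject article (file name and sizes are illustrative): the doubles live in a file-backed mapping, so the OS pages them in on demand instead of the CLR needing one huge contiguous heap block:

```csharp
using System;
using System.IO;
using System.IO.MemoryMappedFiles;

string path = Path.Combine(Path.GetTempPath(), "mmf-doubles.bin");
long count = 1_000;                     // would be 100_000_000 in production
long bytes = count * sizeof(double);
double first;

using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Create, null, bytes))
using (var view = mmf.CreateViewAccessor(0, bytes))
{
    var rng = new Random(12345);
    for (long i = 0; i < count; i++)
        view.Write(i * sizeof(double), rng.NextDouble());  // backed by the file, not the GC heap

    first = view.ReadDouble(0);         // array-style random access by byte offset
}

Console.WriteLine(first >= 0.0 && first < 1.0);            // True
File.Delete(path);
```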

Answered by Judah Gabriel Himango

Rather than allocating a massive array, could you try using an iterator? Iterators are lazily executed, meaning values are generated only as they're requested in a foreach statement; you shouldn't run out of memory this way:


// (assumes a randomGenerator instance defined elsewhere in the class)
private static IEnumerable<double> MakeRandomNumbers(int numberOfRandomNumbers)
{
    for (int i = 0; i < numberOfRandomNumbers; i++)
    {
        yield return randomGenerator.GetAnotherRandomNumber();
    }
}


...

// Hooray, we won't run out of memory!
foreach(var number in MakeRandomNumbers(int.MaxValue))
{
    Console.WriteLine(number);
}

The above will generate as many random numbers as you wish, but only generate them as they're asked for via a foreach statement. You won't run out of memory that way.


Alternatively, if you must have them all in one place, store them in a file rather than in memory.

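A small sketch of that file-based alternative (path and counts are illustrative): stream the doubles to disk with BinaryWriter as they are generated, then read them back one at a time, so memory use stays constant regardless of how many numbers there are:

```csharp
using System;
using System.IO;

string path = Path.Combine(Path.GetTempPath(), "streamed-doubles.bin");
var rng = new Random(1);

using (var writer = new BinaryWriter(File.Create(path)))
    for (int i = 0; i < 1_000; i++)     // could be hundreds of millions
        writer.Write(rng.NextDouble()); // 8 bytes each, constant memory use

long n = 0;
double sum = 0.0;
using (var reader = new BinaryReader(File.OpenRead(path)))
    while (reader.BaseStream.Position < reader.BaseStream.Length)
    {
        sum += reader.ReadDouble();     // one value in memory at a time
        n++;
    }

Console.WriteLine(n);                   // 1000
File.Delete(path);
```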

Answered by chris

Changing from 32-bit to 64-bit worked for me - worth a try if you are on a 64-bit PC and the code doesn't need porting.


Answered by Trisped

As you probably figured out, the issue is that you are trying to allocate one large contiguous block of memory, which does not work due to memory fragmentation. If I needed to do what you are doing I would do the following:


int sizeA = 10000,
    sizeB = 10000;
double sizeInMegabytes = (sizeA * sizeB * 8.0) / 1024.0 / 1024.0; //762 mb
double[][] randomNumbers = new double[sizeA][];
for (int i = 0; i < randomNumbers.Length; i++)
{
    randomNumbers[i] = new double[sizeB];
}

Then, to get a particular index you would use randomNumbers[i / sizeB][i % sizeB].

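A quick check of that indexing scheme, with small illustrative sizes: logical index i lands at randomNumbers[i / sizeB][i % sizeB]:

```csharp
using System;

int sizeA = 200, sizeB = 100;           // tiny sizes for illustration
double[][] randomNumbers = new double[sizeA][];
for (int i = 0; i < sizeA; i++)
    randomNumbers[i] = new double[sizeB];

int logical = 12345;                    // maps to row 123, column 45
randomNumbers[logical / sizeB][logical % sizeB] = 7.5;
Console.WriteLine(randomNumbers[123][45]);  // 7.5
```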

Another option, if you always access the values in order, might be to use the overloaded constructor to specify the seed. You would get a semi-random seed (like DateTime.Now.Ticks), store it in a variable, and then whenever you start going through the list you would create a new Random instance using the original seed:


private static int randSeed = (int)DateTime.Now.Ticks;  //Must stay the same unless you want to get different random numbers.
private static Random GetNewRandomIterator()
{
    return new Random(randSeed);
}
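A short sketch of why the seed trick works (the seed value here is fixed for illustration; Trisped derives it from DateTime.Now.Ticks): two Random instances constructed with the same seed replay the identical sequence, so values can be regenerated on demand rather than stored:

```csharp
using System;

int randSeed = 12345;                   // fixed for the sketch
var firstPass = new Random(randSeed);
var secondPass = new Random(randSeed);

bool identical = true;
for (int i = 0; i < 1_000; i++)
    identical &= firstPass.NextDouble() == secondPass.NextDouble();

Console.WriteLine(identical);           // True: same seed, same sequence
```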


It is important to note that while the blog linked in Fredrik Mörk's answer indicates that the issue is usually due to a lack of address space, it does not list a number of other issues, like the 2 GB CLR object size limitation (mentioned in a comment from ShuggyCoUk on the same blog), glosses over memory fragmentation, and fails to mention the impact of page file size (and how it can be addressed with the CreateFileMapping function).


The 2 GB limitation means that randomNumbers must be less than 2 GB. Since arrays are classes and have some overhead themselves, this means an array of double will need to be smaller than 2^31 bytes. I am not sure how much smaller than 2^31 the Length would have to be, but "Overhead of a .NET array?" indicates 12-16 bytes.


Memory fragmentation is very similar to HDD fragmentation. You might have 2 GB of address space, but as you create and destroy objects, gaps appear between the allocations. If these gaps are too small for your large object, and additional space cannot be requested, then you will get the System.OutOfMemoryException. For example, if you create 2 million 1024-byte objects, you are using 1.9 GB. If you then delete every object whose index is not a multiple of 3, you will be using 0.6 GB of memory, but it will be spread out across the address space with 2048-byte open blocks in between. If you then need to create a 0.2 GB object, you would not be able to, because no block is large enough to fit it and additional space cannot be obtained (assuming a 32-bit environment). Possible solutions to this issue are things like using smaller objects, reducing the amount of data you store in memory, or using a memory management algorithm to limit/prevent memory fragmentation. Unless you are developing a large program that uses a large amount of memory, this will not be an issue. The issue can also arise on 64-bit systems, since Windows is limited mostly by the page file size and the amount of RAM on the system.


Since most programs request working memory from the OS and do not request a file mapping, they will be limited by the system's RAM and page file size. As noted in Néstor Sánchez's comment on the blog, with managed code like C# you are stuck with the RAM/page file limitation and the address space of the operating system.




That was way longer than expected. Hopefully it helps someone. I posted it because I ran into the System.OutOfMemoryException running an x64 program on a system with 24 GB of RAM, even though my array was only holding 2 GB of stuff.
