

On 05/18/2012 03:38 PM, Arnaud Versini wrote:
> I checked on Linux and Windows: the memory after a massive allocation
> and a massive deallocation is really freed, so the raison d'être of our
> internal allocator is currently gone (this still needs to be checked on
> Mac OS X). About performance I don't know.

Indeed, on Fedora 16 at least,

$ cat test.cc
#include <iostream>
#include <string>
int main() {
    char * p[100000];
    for (int i = 0; i != 100000; ++i) p[i] = new char[10000];
    std::cout << "...";
    std::cin.ignore();
    for (int i = 0; i != 100000; ++i) delete[] p[i];
    std::cout << "...";
    std::cin.ignore();
}
$ g++ test.cc
$ ./a.out

and "pmap ... | grep total" at the two checkpoints shows that the allocator does hand the memory back to the OS after deallocation (from 990752K down to 12736K here).

Stephan
