The Developer's Library for D

Collections managing heap similar to STL vector

Moderators: kris

Posted: 04/20/08 23:57:10

I tried a stupid test the other day (knee deep in my matrix and vector library) and wondered what would happen if I needed 100 million vertices. So I devised a small test with D's arrays, allocating 100 million of them on the heap (since the stack obviously wouldn't work :) ) to see what would happen. It faulted around 10 million - reportedly due to contiguous heap block allocation problems on Windows and on 32-bit machines.

OK, so my next plan was to test ArraySeq and see how it goes - first I created an ArraySeq, then looped through size_t up to 100 million and appended my structure to it. It faulted around 32 million - I thought it was the same issue as with regular arrays in D. OK, next plan: break it down into 10 loops, each 10 million large. It faulted again, and with 100 loops - each 1 million large - it faulted again around 32 million.

So I thought to myself, OK, this is some OS + 32-bit related issue, whatever. As this is a useless test anyway - I'd better have my own memory manager for stuff like that - I began googling around for memory manager idioms related to 3D and scene graph problems. Later this evening it dawned on me that I should try the same test with gcc 4.3.0 and STL vector to see what would happen.

Behold (the structure, in its rawest form, is similar to the D one):

#include <iostream>
using std::cout;
using std::endl;

#include <iomanip>
using std::setw;

#include <vector>
using std::vector;

struct Vector3D {
	float x, y, z;
};

int main() {
	// One up-front allocation of 100 million value-initialized elements.
	vector<Vector3D> vectors(100000000);

	cout << "Size of $vectors field is " << vectors.size() << endl;

	// Spot-check an element deep inside the block.
	cout << setw(5) << "<<" << vectors[5000000].x << ","
	     << vectors[5000000].y << "," << vectors[5000000].z << ">>" << endl;

	cout << "Size of $vectors field is " << vectors.size() << endl;

	return 0;
}

And guess what, it works. It allocates 100 million of them (on the heap, I hope) and it works. We should get this kind of functionality in Tango for sure - work it as a MappedBuffer in the background or something, or peek into STL. It should work, since it's the OS that allocates in both cases, if I'm not mistaken. Randomize code on each vector, which uses Tango's Random in the background, is already showing spectacular results; I'd like to test drive it on huge arrays :) Anyway, now that I've finished the math lib, I'm deleting it and starting from scratch - now I know what it should look like.

</rant>


Posted: 04/21/08 03:35:40

Yeah, the collections package needs to support either custom allocators or array-based sub-allocation.