Clone/DeepClone support for nodes, including entities, vertex elements, materials, etc.

I'm trying to sort out some memory issues in a pipeline that processes many thousands of files, with a lot of rearranging of nodes and geometry between those files.

Because Node, Entity and VertexElement all hold references (including internal and private fields/properties) back to the scene graph in some way, I am having to create copies manually, and this is only partially implemented.

Would it be correct to say that I should be creating new Node, Mesh and Material instances if I wanted to copy node A from scene A into scene B, without removing it from scene A?
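For context, this is roughly what my manual per-type copying looks like so far. CopyMesh is my own helper, not a library method, and I'm assuming Mesh exposes Name, ControlPoints, Polygons and CreatePolygon roughly the way Aspose.3D does; the point is that everything is copied into fresh instances so nothing references the source scene:

Mesh CopyMesh(Mesh source)
{
  var copy = new Mesh(source.Name);

  // Copy control points by value so the new mesh owns its own buffers
  // and keeps no reference back to the source scene.
  foreach (var controlPoint in source.ControlPoints)
    copy.ControlPoints.Add(controlPoint);

  // Polygons are index lists into the control points; re-create them
  // against the copied points rather than sharing the originals.
  foreach (int[] indices in source.Polygons)
    copy.CreatePolygon(indices);

  // VertexElements (normals, UVs, etc.) would need the same treatment;
  // that is exactly the part my codebase doesn't have yet.
  return copy;
}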

I haven't successfully isolated the exact memory leak, but I'm wondering if it makes sense to add methods for deep-cloning a part of the scene graph, which would trickle down to all entities, materials, textures, etc.
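Something like the sketch below is what I mean by "trickles down". DeepCloneNode, CloneShared and CopyByType are hypothetical helpers, and the dictionary is an identity map so that a mesh or material shared by several nodes is cloned once and stays shared in the copy:

using System.Collections.Generic;

Node DeepCloneNode(Node source, Dictionary<object, object> seen)
{
  var copy = new Node(source.Name);
  copy.Transform.TransformMatrix = source.Transform.TransformMatrix;

  // Entities and materials can be shared between nodes, so go through
  // the identity map instead of duplicating once per reference.
  foreach (Entity entity in source.Entities)
    copy.Entities.Add((Entity)CloneShared(entity, seen));
  if (source.Material != null)
    copy.Material = (Material)CloneShared(source.Material, seen);

  // Recurse so the clone covers the whole subtree.
  foreach (Node child in source.ChildNodes)
    copy.AddChildNode(DeepCloneNode(child, seen));

  return copy;
}

object CloneShared(object value, Dictionary<object, object> seen)
{
  if (!seen.TryGetValue(value, out var clone))
    seen[value] = clone = CopyByType(value); // per-type copy, e.g. CopyMesh above
  return clone;
}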

Consider the following pseudocode:

Scene bigScene = new Scene("hugeFile.fbx");
Scene smallObjectsScene = new Scene();

// made-up query method, but the intent is: copy every mesh-carrying
// node whose bounding box is smaller than 4 units
foreach (Node node in bigScene.FindEntities<Mesh>().Where(n => n.GetBoundingBox().Size.Length < 4))
{
  var copiedNode = node.Clone();
  smallObjectsScene.RootNode.AddChildNode(copiedNode);
}

bigScene = null;

In the above scenario, I'm having to write the cloning methods myself, and they are only implemented for some types, such as Node and Material. My codebase doesn't have entity/geometry/mesh/VertexElement cloning yet, and I'm wondering if that is what is causing me to retain scenes when I expect them to be GC'd, i.e. bigScene in the code above.
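To test that retention theory directly, I've been using a WeakReference probe, which is plain .NET and nothing scene-specific. CopySmallObjects below is a hypothetical stand-in for the copy loop above, and NoInlining keeps the JIT from extending the local's lifetime past the call:

using System;
using System.Runtime.CompilerServices;

[MethodImpl(MethodImplOptions.NoInlining)]
static WeakReference LoadAndCopy(Scene target)
{
  var bigScene = new Scene("hugeFile.fbx");
  CopySmallObjects(bigScene, target);  // hypothetical: the copy loop above
  return new WeakReference(bigScene);  // weak, so it doesn't root the scene
}

var smallObjectsScene = new Scene();
WeakReference probe = LoadAndCopy(smallObjectsScene);

// Force a full collection; if anything copied still references the big
// scene, the probe stays alive and confirms the retention.
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();
Console.WriteLine(probe.IsAlive
  ? "bigScene is still rooted - a copy must still reference it"
  : "bigScene was collected - the copies are self-contained");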

What is the best procedure for copying a node and all of its data in a way that allows the source scene to fall out of scope and be GC'd?

@bortos

We have logged an investigation ticket, THREEDNET-897, in our issue tracking system for this requirement. We will look into the details and keep you posted on the status of the fix. We appreciate your patience.

We are sorry for the inconvenience.