Batch loading families creates large files
Hi everyone
I'm new to this forum, and to programming in general (I'm an architect, not a programmer).
I've built a Python script (run through pyRevit) that builds two lists of family files, targets and sources, and loads every source file into every target file (skipping the case where source and target are the same file) using doc.LoadFamily(string, IFamilyLoadOptions).
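For reference, here is a minimal sketch of what the loading loop looks like. It isn't my exact code: the paths, the options class name, and the variable names are placeholders, and it assumes pyRevit's `__revit__` variable for access to the Revit Application.

```python
# -*- coding: utf-8 -*-
"""Simplified sketch of the loading loop (pyRevit / IronPython).
Paths and class names are illustrative placeholders, not the real script."""
import clr
clr.AddReference('RevitAPI')
from Autodesk.Revit.DB import IFamilyLoadOptions, FamilySource, Transaction

app = __revit__.Application  # pyRevit exposes the UIApplication as __revit__


class OverwriteLoadOptions(IFamilyLoadOptions):
    """Overwrite existing families and their parameter values when found."""

    def OnFamilyFound(self, familyInUse, overwriteParameterValues):
        overwriteParameterValues.Value = True  # out parameter arrives as a reference box
        return True

    def OnSharedFamilyFound(self, sharedFamily, familyInUse, source, overwriteParameterValues):
        source.Value = FamilySource.Family
        overwriteParameterValues.Value = True
        return True


target_files = [r'C:\temp\A.rfa', r'C:\temp\B.rfa']  # placeholder paths
source_files = [r'C:\temp\A.rfa', r'C:\temp\B.rfa']

for target_path in target_files:
    target_doc = app.OpenDocumentFile(target_path)
    for source_path in source_files:
        if source_path == target_path:
            continue  # skip loading a file into itself
        t = Transaction(target_doc, 'Load family')
        t.Start()
        # IronPython returns the out Family argument as part of a tuple
        loaded, family = target_doc.LoadFamily(source_path, OverwriteLoadOptions())
        t.Commit()
    target_doc.Close(True)  # save changes and close the target file
```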
It works fine, but when I test it with the same files in both lists, the file sizes grow with each iteration, even though all of the files contain the same number of families in the end.
For example: 4 files in each list, where every file is just a plain generic model family with no other families loaded (I copy/pasted the file in the folder and renamed the copies). The result should be 4 files, each containing the other 3 (which is what happens). I would expect all of the files to end up the same size, but instead each file is larger than the previous one (some kind of exponential growth).
Any thoughts?
I've attached screenshots of the file sizes before and after running the script, as well as screenshots of the script itself and the relevant part of it.
Thanks everyone!