I have a command that sets the name of the Level associated with each element into a Shared Parameter.
I understand that setting parameter values increases the file size. The problem is that when I fill this parameter manually, the file size doesn't grow as much as when the parameter is set through the API.
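In essence, the command does something like this (a simplified sketch, not the actual code; the collector and the level lookup are just illustrative, and the shared parameter is looked up by its GUID):

// Simplified sketch: copy each element's Level name into a shared parameter
// identified by its GUID (transactions omitted here; see the code further down).
FilteredElementCollector collector = new FilteredElementCollector(doc)
    .WhereElementIsNotElementType();

foreach (Element element in collector)
{
    // Resolve the Level associated with the element, if any.
    Level level = doc.GetElement(element.LevelId) as Level;
    if (level == null) continue;

    Parameter parameter = element.get_Parameter(parameterGuid);
    if (parameter != null && !parameter.IsReadOnly && parameter.AsString() != level.Name)
    {
        parameter.Set(level.Name);
    }
}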
As an example, I have a file with about 15000 elements.
File size before setting parameter value: 147520 KB
File size after setting parameter manually: 188672 KB
File size after setting parameter programmatically: 247880 KB
Besides that, when I opened the last file (with parameters filled by the API) with Audit checked, the file size decreased by about 50000 KB.
Does anyone know why this is happening?
Are you committing and/or starting a new transaction for each element?
Maybe share the part of the code that edits the element?
@TripleM-Dev.net yes, I am committing for each element, and to group all the transactions I use a TransactionGroup.
Do you know if it's more advisable to use only one transaction for them all?
var parameter = element.get_Parameter(parameterGuid);
if (parameter != null)
{
    if (!parameter.IsReadOnly && parameter.AsString() != value)
    {
        using (Transaction trans = new Transaction(doc, "Preenche Parâmetro"))
        {
            trans.Start();
            parameter.Set(value);
            trans.Commit();
        }
    }
}
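For reference, the single-transaction variant I'm considering would look roughly like this (a sketch; `elements` stands for the collection being processed and `value` is resolved per element):

// Sketch: one transaction wrapping the whole loop instead of one per element.
using (Transaction trans = new Transaction(doc, "Preenche Parâmetro"))
{
    trans.Start();
    foreach (Element element in elements)
    {
        Parameter parameter = element.get_Parameter(parameterGuid);
        if (parameter != null && !parameter.IsReadOnly && parameter.AsString() != value)
        {
            parameter.Set(value);
        }
    }
    trans.Commit();
}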
Hi,
I don't know for sure if it's more advisable to use a single transaction; maybe someone with more inside knowledge could answer that.
But I only use a TransactionGroup if I need to refresh the view/result of a transaction in between.
Do you need the TransactionGroup, or could you do it in a single transaction? Then see if it also increases the file as much.
I would actually think it would generate the same file increase (large or small), as the TransactionGroup also results in a single undo (and not multiple undos), roughly like the sketch below.
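Something like this (just a sketch; `elements` is a placeholder for whatever you're iterating):

// Sketch: per-element transactions grouped into a single undo item.
using (TransactionGroup group = new TransactionGroup(doc, "Preenche Parâmetros"))
{
    group.Start();
    foreach (Element element in elements)
    {
        using (Transaction trans = new Transaction(doc, "Preenche Parâmetro"))
        {
            trans.Start();
            // ...set the parameter here...
            trans.Commit();
        }
    }
    // Assimilate merges the committed transactions into one undo step.
    group.Assimilate();
}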
Another question: what's the size (length) of the new and the old values?
- Michel
I can use a single transaction. I just didn't know if it was better.
The strings I am setting on the parameter are about 20 characters long.
@TripleM-Dev.net and @Sean_Page, I did some tests using a single transaction and the results are the same.
The file grows a lot, and after Audit the file decreases to about the same size as the file with parameters filled manually.
Actually, the results were not the same. The file using a single transaction was 30000 KB bigger than the one using multiple transactions.
I have experienced this type of ballooning, but only in the family environment. The way around it is to "Save As" and/or use the Compress option during synchronization. That is what solves it for families anyway, and it has to do with all the "memory" of changed things that hangs around in the files. I can't find the reference document that led me to this solution originally, but it may be worth a try.
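If you want to trigger the Compress from the API during synchronization, something along these lines should work in a workshared model (a sketch, untested here):

// Sketch: compact the central model while synchronizing (workshared files only).
TransactWithCentralOptions transactOptions = new TransactWithCentralOptions();
SynchronizeWithCentralOptions syncOptions = new SynchronizeWithCentralOptions();
syncOptions.Compact = true; // same as ticking "Compress" in the SWC dialog
syncOptions.Comment = "Compact after bulk parameter update";
doc.SynchronizeWithCentral(transactOptions, syncOptions);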
One other question I would have: do you have any idlers, updaters, or document-changed events doing something as well while you are changing all these parameters? It may be something else going on around it, and not the parameters directly.
Maybe the file size growth is simply to be expected; it's quite possible, depending on the elements.
For walls (without inserts) I would think the increase would be strange, but for user families it could happen.
Something you could try (solving the increase another way) is to only update the elements that need it: compare the current value against the new value and only set it if they are different.
A colleague used Dynamo to set some values for IFC export; it took relatively long and sometimes failed due to the checkout state of workshared elements. In my add-in I only update the elements that need it, with a checkout-state validation first (see the sketch below), and it runs a lot faster.
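Roughly the kind of checkout validation I mean (a sketch; `elementIds` stands for the set you're processing):

// Sketch: skip elements owned by another user before editing (workshared models only).
ICollection<ElementId> editable = new List<ElementId>();
foreach (ElementId id in elementIds)
{
    CheckoutStatus status = WorksharingUtils.GetCheckoutStatus(doc, id);
    if (status != CheckoutStatus.OwnedByOtherUser)
    {
        editable.Add(id);
    }
}
// Optionally check the remaining elements out explicitly before starting the transaction.
WorksharingUtils.CheckoutElements(doc, editable);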
Is it a workshared project, and do the elements' values change constantly?
Also, like @Sean_Page mentions, changing one element could trigger an update of another, from Revit itself or from another add-in running.
- Michel
This file, specifically, is not workshared, but we use this code in workshared files. We check out elements before trying to modify them, so we don't have problems with the checkout status.
About the verification: the code only sets the parameter when its current value is different from the value it wants to set.
In an ideal world, this command would be executed frequently; that way, the number of elements needing a change would be small for each execution. However, sometimes this command is executed only at the end of the project.
But anyway, the file will get bigger because of the command. How can I be sure that this extra information (which Audit removes) is not causing corruption in my files?
No... I don't use any idlers, updaters, or document-changed events...
What is bothering me is the difference in file size between the file where I used the command and the file where I filled the parameter manually...
I may suggest using a ParameterValueProvider (PVP) to only collect the elements that match the parameter value to start with. That will automatically eliminate ALL but the essential items.
This is an example for a Sheet Number search:
// Filter sheets whose SHEET_NUMBER parameter equals _sheetNumber.
ParameterValueProvider pvp = new ParameterValueProvider(new ElementId(BuiltInParameter.SHEET_NUMBER));
FilterStringRuleEvaluator fsr = new FilterStringEquals();
FilterRule fRule = new FilterStringRule(pvp, fsr, _sheetNumber, true);
ElementParameterFilter filter = new ElementParameterFilter(fRule);

using (FilteredElementCollector fec = new FilteredElementCollector(doc).OfCategory(BuiltInCategory.OST_Sheets).WherePasses(filter))
{
    foreach (ViewSheet vs in fec.ToElements())
    {
        // Process each matching sheet here.
    }
}
Thank you @Sean_Page, but I don't need to filter elements by their parameter values. I read all elements of certain categories, analyse whether I need to set a geometry-based value on the element, and then set this value to the parameter.
Hi. I faced the same issue: I added many new parameters to a project programmatically, then saved my project using both Save() and SaveAs() with the Compact setting. Despite everything I tried, the resulting file ended up being 2.5 times the size it was supposed to be.
The solution was easy: I used SaveAs(), but instead of overwriting the existing file, I chose a new file name, as suggested here.
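In code, that's roughly the following (a sketch; the path is just a placeholder):

// Sketch: save to a new file name with Compact enabled instead of overwriting in place.
SaveAsOptions saveAsOptions = new SaveAsOptions();
saveAsOptions.Compact = true;
saveAsOptions.OverwriteExistingFile = false;
doc.SaveAs(@"C:\Temp\MyProject_compacted.rvt", saveAsOptions);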
I believe the API team should investigate this issue; if it can be solved easily by users, it can also be solved easily by the developer team.