Hi all...
I have a client using Vault Collaboration 2011 with a huge DB, almost 800GB, and he has started having performance issues.
I'm looking for something I can do to make his problem a little less stressful.
If anyone knows something about this, I will be very grateful.
Thanks a lot.
Solved by Leokornowski.
Is there a maintenance plan? (there is a lot of stuff on that page so scroll down to where it talks about creating a SQL Maintenance Plan).
Also, have you tried defragmenting the SQL indexes?
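As a starting point, here is a minimal sketch of how to check index fragmentation in SQL Server. The database name "Vault" and the table name in the comment are assumptions; substitute the actual names on your ADMS server:

```sql
-- Sketch: list indexes in the Vault database with heavy fragmentation.
-- (Assumes the database is named "Vault"; adjust to the real name.)
USE Vault;
SELECT OBJECT_NAME(ips.object_id)         AS table_name,
       i.name                             AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Indexes above roughly 30% fragmentation are usually rebuild candidates:
-- ALTER INDEX ALL ON dbo.SomeTable REBUILD;  -- hypothetical table name
```

A SQL Maintenance Plan can schedule this kind of rebuild automatically, which is generally preferable to running it by hand.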
-Dave
Is that definitely the database and not the filestore that's 800GB?
Very unusual to have a Vault database that size.
To put it into perspective, at the place I'm at right now the Vault is 8 years old and we have ~150 people checking in and out daily, 200,000 items which are pure DB objects, utilising ECOs extensively too... yet the DB is only 25GB in size but the filestore is around 450GB. I keep it tidy using the maintenance plans Dave mentioned above, but even so if the DB was even half of 800GB I'd suspect something is not quite right.
Can you verify this is definitely the SQL database size and not the combined sum of all Vault related content?
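One quick way to check, as a sketch (run it in SQL Server Management Studio on the server hosting the Vault databases):

```sql
-- List the on-disk size of every database file on the instance.
-- sys.master_files stores size in 8 KB pages, so convert to MB.
SELECT DB_NAME(database_id) AS database_name,
       name                 AS logical_file,
       size * 8 / 1024      AS size_mb
FROM sys.master_files
ORDER BY size_mb DESC;
```

If the Vault database itself is small and the 800GB is sitting in the filestore folder on disk, that is normal and points to a different cleanup strategy (purging old file versions) rather than SQL maintenance.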
I will go to the client, and when I'm back I'll post here what happened.
Thanks a lot for all the tips.
It was the filestore... the DB is under 20GB...
All fixed... thanks for the help...