Ok, this is frustrating beyond belief. We have an ODBC connection linked to a SQL Server database. When a change is made to the table, the data connection in Studio doesn't reflect that change.
We then delete the data connection and recreate it in Studio, which refreshes the ODBC connection, and the correct data now appears from the database table. Great.
We have an existing Data Source that joins an SDF to the ODBC connection. That data source still shows the old data from the original data connection, even though we have confirmed that AIMS now recognizes our update after we recreated the ODBC connection.
We have tried recreating the data source to use the updated ODBC connection and the existing SDF, but even after that the secondary class table still does not update, so the columns within that table still reflect the old SQL tables.
That means any layers using that data source still show the old table information, which causes those layers to not work correctly.
I have restarted the service, and it still shows the cached version of the data for the data source.
How do we get this to refresh?? Something so simple, yet extremely frustrating to deal with.
Is the "existing data source" the same one that you deleted and rebuilt in Studio?
If not, is it using different credentials that could result in a different view of your SQL Server database?
On a general note, cache invalidation does not require anything as nuclear as a service restart. Simply writing resource content to a Feature Source triggers the flushing of any cached information about that feature source.
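For what it's worth, that "write the content back" trick can be scripted against the mapagent HTTP interface: read the feature source's XML with GETRESOURCECONTENT, then save it straight back with SETRESOURCE. A minimal sketch, assuming a local mapagent URL, default credentials, and a placeholder resource id (all of these are hypothetical, substitute your own):

```python
import urllib.parse

# Hypothetical values -- replace with your server, credentials, and resource id.
MAPAGENT = "http://localhost/mapguide/mapagent/mapagent.fcgi"
RESOURCE_ID = "Library://MyProject/Data/JoinedSource.FeatureSource"

def mapagent_url(operation, resource_id,
                 username="Administrator", password="admin"):
    """Build a mapagent request URL for a resource-service operation."""
    params = {
        "OPERATION": operation,
        "VERSION": "1.0.0",
        "RESOURCEID": resource_id,
        "USERNAME": username,
        "PASSWORD": password,
    }
    return MAPAGENT + "?" + urllib.parse.urlencode(params)

def flush_feature_source(resource_id):
    """Fetch the resource content so it can be written back unchanged.

    Re-saving the same XML is enough to make the server drop any cached
    schema/connection info for that feature source.
    """
    import urllib.request
    content = urllib.request.urlopen(
        mapagent_url("GETRESOURCECONTENT", resource_id)).read()
    # Writing it back uses OPERATION=SETRESOURCE, a multipart/form-data POST
    # carrying the content as a file part; hand-rolling that body is omitted
    # here -- saving the resource from Studio or Maestro does the same write.
    return content

# The URL builder is the reusable part:
url = mapagent_url("GETRESOURCECONTENT", RESOURCE_ID)
```

Saving the feature source from Studio or Maestro performs the same write, so a one-line "re-save" there is usually the quickest way to flush the cache.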
Note: except in cases where you can't connect to SQL Server using the OSGeo FDO provider for SQL Server (e.g. MS SQL 2005), try to avoid ODBC.