> I have a similar site running out of Oracle that
> is performing similarly and I do not run the Oracle
> services when working with MySQL.
Are you running SP1? It included quite a few fixes for FDO performance.
To test whether it's the database connections, use Haris Kurtagic's
awesome FDO2FDO application to transfer the data into SDF, or try using
his King.Oracle provider to see if there's a speed difference.
To help debug the performance aspect, let's first confirm your machine isn't running out of memory. If you run Task Manager and view the Processes tab, how much memory is the mgserver process consuming? On the Performance tab, how much available memory do you have? You may also want to show additional columns in the Processes tab for Peak Memory Usage and Virtual Memory Size. If your system is hitting the swap file because there is not enough memory, then the problem is not likely the database connection.

Are you running Outlook and several other applications at the same time you have Studio running? How much memory is Studio consuming? The point here is that between mgserver, IIS, a database, and Studio you have some potential memory problems. Are you running PHP or .NET?
Once we eliminate the box as an issue: how many of your layers have properties? Do they have lots of properties? You may want to watch your memory consumption as you turn layers on to see if it is still memory related.
The first time you retrieve the map the server will have to query and load the layer information. The first request will be slow. Subsequent requests should be faster once the information is cached.
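To illustrate the pattern, here is a minimal sketch of schema caching in general (the names and structure are my own illustration, not MapGuide's actual code): the first request pays for the expensive lookup, and every later request is served from the cache.

```python
# Hypothetical sketch of why the first map request is slow and later ones
# are fast: expensive schema lookups are cached after the first call.
# (Illustrative only -- not MapGuide's actual implementation.)

schema_cache = {}
lookups = []  # track how many real (slow) lookups actually happen

def load_schema(layer):
    """Stand-in for an expensive database query for a layer's schema."""
    lookups.append(layer)
    return {"layer": layer, "columns": ["id", "geom"]}

def get_schema(layer):
    if layer not in schema_cache:          # first request: slow path
        schema_cache[layer] = load_schema(layer)
    return schema_cache[layer]             # later requests: cache hit

get_schema("parcels")   # triggers a real lookup
get_schema("parcels")   # served from cache, no second lookup
print(len(lookups))     # 1
```

This is why timing only the very first map load will make any server look slow.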
Have you created spatial indexes on all your tables?
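For example, in MySQL a spatial index can be created like this (the table and column names are hypothetical, so adjust them to your schema; note that in MySQL 5.0/5.1 spatial indexes require MyISAM tables and a NOT NULL geometry column):

```sql
-- Hypothetical table/column names; adjust to your own schema.
ALTER TABLE parcels MODIFY geom GEOMETRY NOT NULL;
CREATE SPATIAL INDEX idx_parcels_geom ON parcels (geom);

-- Verify the index is in place:
SHOW INDEX FROM parcels;
```

Without a spatial index, every map request forces a full table scan of the geometry column, which alone can explain multi-minute load times on 119,000 records.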
This might just be me, but I am suspicious of your hardware:
Reasonable processor, but a slow laptop hard drive?
1 GB is not that much RAM when you get the OS, MapGuide, Apache, and MySQL running.
Win XP Pro is not recommended for MapGuide. (I know it works, I use it for MG development, but it is not fast.)
Do you have a desktop/server environment that you can test this in?
wrote in message news:firstname.lastname@example.org...
I have developed a small site on my local machine that is serving up
anywhere from 10 to 45 layers of information depending on the zoom level.
This information is stored in MySQL (on the localhost) and consists of 51
tables that contain roughly 119,000 records.
When the page initially loads I'm displaying 10 layers that cover about 1,000 sq. ft. within a 4 sq. mile area. The page takes forever to load, I mean 4 or
Panning to a new location is similarly slow.
This is running on a Dell Inspiron 9100 P4 laptop at 3 GHz with 1 GB of RAM.
The OS is Win XP Pro SP2.
I have a similar site running out of Oracle that is performing similarly and
I do not run the Oracle services when working with MySQL.
I know that this can't be normal and need to know if anything can be done to
help, aside from dumping more memory into the machine.
I have some prospective clients that are excited about seeing this stuff but
I can't show it like this.
The memory details suggest that virtual memory is not a problem and the system isn't resorting to the swap file to complete the task. However, if you ballpark the resources used by the major processes:
OS: 250 MB minimum for XP, probably closer to 350 MB.
Studio: 75-150 MB or more, depending on open editors and objects in the tree.
Server: anywhere from 45 MB at startup to 150+ MB depending on what is cached. I have seen a server run and maintain over 500 MB of memory because things are cached and not freed up.
MySQL: No experience running it on my own machine; I'm guessing anywhere from 25 MB to 100 MB.
PHP FCGI agents: 4 @ 15 MB+ each.
As you start to add this up, at peak times there may not be enough memory, and the system will hit the swap file. On a laptop, hard drives are generally slower, so this would really slow things down.
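Adding up the ballpark figures above (the midpoints below are my own rough assumptions, not measurements) shows how little headroom a 1 GB machine leaves before swapping:

```python
# Rough arithmetic on the estimates above; midpoints are my own assumptions.
budget_mb = {
    "Windows XP":      350,     # "probably closer to 350 MB"
    "Studio":          110,     # 75-150 MB range, roughly the midpoint
    "MapGuide server": 150,     # can grow well past this as things cache
    "MySQL":            60,     # guessed 25-100 MB
    "PHP FCGI (4x)":   4 * 15,  # 4 agents at ~15 MB each
}

total = sum(budget_mb.values())
print(f"Estimated peak usage: {total} MB of 1024 MB")   # 730 MB
print(f"Headroom before swapping: {1024 - total} MB")   # 294 MB
```

And that 294 MB of headroom disappears quickly once Outlook, a browser, and a growing server cache join in, which is exactly when the swap file kicks in.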
There is also the possibility it is MySQL:
Partway down that thread they talk about MySQL being slow on XP, even with an Intel processor. What build of MySQL are you using?
I have a desktop with a similar configuration, with the same site package and MySQL schema; I'll attempt to gather metrics against that installation.
A problem I've experienced in gathering data is that in many instances, while testing, I will receive a "Failed to open" message about 3 minutes into the process. The occurrence of this failure is not consistent: sometimes I reboot the machine and get the error on the first attempt; sometimes I get results twice in a row and fail on the third attempt.
Sorry to be such a difficult nut to crack,
Thank you all for your support!
There were some big performance improvements in SP1, so testing with 178.04 is insufficient. SP1 also introduces caching, so the first access to any data in a map might be somewhat slow if there is a lot of it, but subsequent accesses will be quick because connection and schema information is cached.