We're currently developing a program that allows the administrator to delete certain user records in the MySQL database. However, each DELETE also leaves holes in the table that cause overhead, and this ultimately damages the table to the point where it no longer accepts new entries.
We can remove the overhead by optimizing the table manually, but do you think running an OPTIMIZE from PHP after each DELETE would make the problem go away for good? That is, would the table stay clean and keep accepting data without problems, and spare us the task of manually optimizing and repairing?
We're just trying to get some perspective on this problem, so if you have any alternative suggestions, please let us know.
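For reference, the manual cleanup described above boils down to something like this (the table name `users` is just a placeholder for your actual table):

```sql
-- Reclaim the space left by deleted rows and defragment the data file
OPTIMIZE TABLE users;

-- Inspect remaining overhead: the Data_free column shows unreclaimed bytes
SHOW TABLE STATUS LIKE 'users';
```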
I think the best approach is a cron job that issues the ANALYZE/OPTIMIZE queries for you. You can set it up to perform the maintenance twice a day (or whatever suits your needs). The nice part is that it's completely transparent to your PHP development, and in my opinion it's the cleanest way to handle this.
In my experience, tables under heavy update/delete load (over 1,000 DELETE or UPDATE queries a day) should be optimized/repaired at least twice a week.
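As a rough sketch, the cron job could just call the `mysqlcheck` client that ships with MySQL. The schedule, database name (`mydb`), and credentials file path below are placeholders you'd adapt to your setup:

```shell
# crontab entry: optimize all tables in "mydb" twice a day, at 03:00 and 15:00.
# --optimize reclaims space from deleted rows (like OPTIMIZE TABLE per table).
# Credentials live in a separate file so they don't appear in the crontab.
0 3,15 * * * mysqlcheck --defaults-extra-file=/etc/mysql/maint.cnf --optimize mydb
```

Note that `mysqlcheck` treats its action flags (`--check`, `--repair`, `--analyze`, `--optimize`) as mutually exclusive, with the last one winning, so schedule separate runs if you also want an ANALYZE pass.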