For me it hasn't worked out too well. Correct format, correct headers, and after about a week still nothing indexed. I think Google is having serious capacity problems with their crawler, since I'm not the only person seeing this.
Keep in mind how many new sites have sprung up on the internet in just the last year or two.
The number of submissions to all the search sites has gone up exponentially, so obviously the pending queue will be much larger.
And of course, the larger the queue, the longer it takes to get to your request.
As for someone's comment asking why a hosting admin should spend time keeping the sitemap file updated: yes, it makes Google's part 'easier', but the more efficient we make things for them, the faster their crawlers can operate, which in turn helps decrease the wait time for pending requests.
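Keeping the sitemap current doesn't have to be manual work, either. Here's a minimal sketch of the idea: walk the document root, collect the HTML pages, and write out sitemap XML with real lastmod dates taken from the files' modification times. The paths, base URL, and file-extension filter are all assumptions for illustration; adjust them for your own host, and note I'm using the current sitemaps.org namespace.

```python
import os
from datetime import datetime, timezone
from xml.sax.saxutils import escape

# Hypothetical document root and site URL -- change these for your server.
SITE_ROOT = "/var/www/html"
BASE_URL = "https://example.com"

def build_sitemap(root=SITE_ROOT, base_url=BASE_URL):
    """Walk the document root and return sitemap XML with lastmod dates."""
    entries = []
    for dirpath, _dirs, files in os.walk(root):
        for name in sorted(files):
            # Only index HTML pages in this sketch; extend as needed.
            if not name.endswith((".html", ".htm")):
                continue
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root).replace(os.sep, "/")
            # Use the file's actual modification time as <lastmod>.
            mtime = datetime.fromtimestamp(os.path.getmtime(path), timezone.utc)
            entries.append(
                "  <url>\n"
                f"    <loc>{escape(base_url + '/' + rel)}</loc>\n"
                f"    <lastmod>{mtime.strftime('%Y-%m-%d')}</lastmod>\n"
                "  </url>"
            )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>\n"
    )
```

Drop something like this in a nightly cron job that rewrites sitemap.xml, and the crawler always sees fresh lastmod values without anyone touching the file by hand.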