One thing that doesn’t get mentioned very often is how to tell Googlebot that you posted a new page you would like crawled and, hopefully, indexed.
I know a lot of people have difficulties getting crawled and indexed. People post something new and expect Google to find and index it almost immediately, which doesn’t always happen. This can depend on a lot of factors, such as how new your website is, how many backlinks you have, and even what kind of script your site is built on, just to name a few.
People talk about using pinging services and sites to help speed up the process. The expectation is that you go to a service like Pingomatic, add your website link and details, hit the ping button, and all the bots and crawlers come rushing. Maybe you have better luck with these than I do. Every time I have tried a ping service the results were so poor that I doubt they do anything at all.
Other people use them to ping a blog after leaving a comment, hoping that Google will come running and find the link back (backlink) to their website. Since I never had any luck pinging my own posts, I have doubts about this as well. To me it seems like more work than it’s worth.
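For the curious, most of these ping services speak the same simple XML-RPC protocol, a method called weblogUpdates.ping. Here is a rough sketch in Python of what a ping amounts to under the hood; the blog name and URL are placeholders, and the actual network call to Pingomatic’s endpoint is left commented out:

```python
# Sketch of the XML-RPC "weblogUpdates.ping" call that ping services use.
# The blog name and URL below are placeholders for illustration only.
import xmlrpc.client

def build_ping_payload(blog_name: str, blog_url: str) -> str:
    """Build the XML-RPC request body a ping service expects."""
    return xmlrpc.client.dumps(
        (blog_name, blog_url), methodname="weblogUpdates.ping"
    )

payload = build_ping_payload("My Example Blog", "http://www.example.com/")
print(payload)

# To actually send it (requires network access):
# proxy = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
# response = proxy.weblogUpdates.ping("My Example Blog", "http://www.example.com/")
```

So the whole “ping” is just one tiny XML request telling the service “this site updated” — which is also why it’s hard to tell whether anything useful happens on the other end.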
When I post something new on a WordPress-based website, Google always crawls and indexes it within 30 minutes, and usually more like 5 to 10. Part of that could be attributed to the built-in update services (ping) feature that WordPress comes with. Other scripts I use that don’t include a pinging feature can take several days, and occasionally up to a week.
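For what it’s worth, that WordPress ping list is editable under Settings → Writing → Update Services. A stock install ships with Ping-O-Matic’s endpoint in that box (your version may differ):

```
http://rpc.pingomatic.com/
```

You can add other ping endpoints there, one per line, and WordPress will ping each of them whenever you publish a new post.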
Food for thought:
This reminds me: remember when Domino’s Pizza had a guaranteed delivery time of 30 minutes or less? Maybe Google could do something like: Google: We’ll crawl and index your site in 30 minutes or less… The guarantee? I don’t know about that part yet!
If it seems like Googlebot is taking a long time to find and index your new posts, you can simply submit the new URLs in Google Webmaster Tools using Fetch as Googlebot. Obviously you will need an account with a verified site set up to do this.
How to Submit URLs to Google with Fetch as Googlebot
1. Log in to Google Webmaster Tools
2. Go to Diagnostics, then “Fetch as Googlebot”
3. Enter the address (URL) of the new post or page you want crawled, and click “Fetch”
4. Near the top of the page it will say something like “Your request was completed successfully.” Your link should also appear in the URL status area along with the time. If it doesn’t show up right away, be patient; it sometimes takes a few minutes, and you may need to come back or refresh the page.
5. Now you can click on the “Submit to index” link.
6. A small window will appear where you can choose “URL” or “URL and all linked pages”. Make your selection and click “Submit”.
At the time of this post, you are allowed 500 URLs per week and 10 URLs with all linked pages per month.
7. Now you can see it says: “URL submitted to index”. You will also notice I have 499 fetches remaining.
There is no guarantee that your post will be added to Google, but at least you have told it where to find the page and asked it to consider it. Supposedly, after a Fetch as Googlebot submission, the URL will be crawled within 24 hours.
It makes more sense to me to tell Google directly where to find your new post or page and have it crawled within 24 hours than to rely on a third-party pinging service that you’re not even sure works.
Obviously if Google crawls and indexes your site, posts, and pages frequently then you shouldn’t have to do this.
Originally (or a while ago anyway)
You could submit 50 URLs per week and 10 URLs with all linked pages per month.
Currently (at the time of this post anyway)
You can submit 500 URLs per week and 10 URLs with all linked pages per month.
So you can submit more URLs now, but you will still want to choose your “URL and all linked pages” submissions wisely, since you are only allowed 10 of those per month.