Submitting Updated URLs with the New Fetch as Googlebot Feature
Google’s Webmaster Tools now provide users with a new way to submit new or updated URLs for Google indexing. The Fetch as Googlebot feature lets users identify individual URLs and submit them to Google’s crawl schedule for index consideration. Unlike the other methods of URL submission, Googlebot will usually crawl the submitted pages within one day of submission. Although the response time is faster, Google still applies the same selection process when deciding whether a URL belongs in its index. Fetch as Googlebot helps Google discover a web page faster, but, as with its natural discovery process, there is no guarantee that the page will be included in the index.
In order for Google to crawl and index a page, it first needs to know that the page exists. The Fetch as Googlebot feature may be a new and convenient method of submitting a URL to Google, but it is not the only option. Discovery can take place in a couple of different ways. Traditionally, discovery occurs through links, which is why it is extremely important to make sure that all of your web pages are linked internally. Pages don’t always have external links pointing to them, so internal links are necessary for bots to travel from page to page.
Google also uses RSS feeds, XML Sitemaps, and public URL requests to discover URLs that are not yet indexed. An XML Sitemap is a complete list of the URLs on your website that you want Google to crawl; a minimal example is sketched below. The public URL request is an “Add URL” form that is available to anyone who wants to request that a URL be added to the index. Recently the Add URL form was renamed the Crawl URL Request Form, and to use it you must be signed in to a Google account. Similar to the…
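As a rough illustration (not from the original article), the sketch below uses Python’s standard library to generate a bare-bones XML Sitemap in the sitemaps.org format; the example.com URLs are placeholders for your own pages.

```python
# Minimal sketch: build a basic sitemap.xml with Python's standard library.
# The URLs listed here are placeholders -- substitute your own site's pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls, path="sitemap.xml"):
    # The <urlset> root and its namespace are required by the sitemaps.org protocol.
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        # Each page gets a <url> entry with a <loc> element holding its address.
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about.html",
])
```

Once a file like this is uploaded to your site, you can point Google to it through Webmaster Tools so the crawler knows where to find the full list of URLs.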
Source: http://feedproxy.google.com/~r/internetbeacon/tZlB/~3/0mmKupW_8A8/
affiliate marketing, search engine optimization, internet marketing, search engine marketing