Prosperative Public Forum

working of google spiders


working of google spiders
« on: January 06, 2018, 12:53:50 AM »
Hi everyone,
Can anyone help me understand how spiders work? Are crawlers and spiders the same thing?

Re: working of google spiders
« Reply #1 on: January 06, 2018, 10:17:16 AM »
Hi

Yes, spiders, crawlers and bots are basically the same thing. In short, they are programs that fetch the content of web pages and try to extract data about the content and about outgoing links. This data is then added to the index of the bot/spider/crawler's owner and used, e.g., for search results. Where do the funny names come from? I dunno... it's basically just a piece of software...
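To make that concrete, here's a toy sketch of the fetch-and-extract step using only Python's standard library. Real crawlers like Googlebot are far more sophisticated (robots.txt handling, rate limiting, JavaScript rendering, deduplication), and the page and URLs here are made-up placeholders:

```python
# Toy illustration of what a spider/crawler does with one page:
# parse the HTML, collect outgoing links, and record them in an "index".
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects outgoing links (<a href="...">) from a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's URL.
                    self.links.append(urljoin(self.base_url, value))

def index_page(url, html, index):
    """Record a page's outgoing links; a real crawler would queue them to visit next."""
    parser = LinkExtractor(url)
    parser.feed(html)
    index[url] = parser.links
    return parser.links

index = {}
html = '<html><body><a href="/about">About</a> <a href="https://example.org/">Ext</a></body></html>'
links = index_page("https://example.com/", html, index)
print(links)  # ['https://example.com/about', 'https://example.org/']
```

A real spider would loop: pop the next URL from a queue, fetch it over HTTP, run this extraction, and push any unseen links back onto the queue.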

Mike

Re: working of google spiders
« Reply #2 on: January 08, 2018, 12:31:46 AM »
Are the main crawlers of Google and other search engines the same or different?

Offline amirah

  • *
  • 6
    • View Profile
  • Skype: amirah
Re: working of google spiders
« Reply #3 on: June 13, 2018, 08:52:03 PM »
Google has its own crawling bot that is sent out to crawl billions of websites daily.
Spider-friendly sites are the ones that have relevant, quality links on them. Googlebot only follows links; don't expect the bot to enter login details. If your page cannot be reached by a link, the bot will not see it, let alone crawl it. There's no fixed time for the spider to crawl your website, but it does not do it in real time. Understand that the details of the Google algorithm and how it works are private information, available only to Google's team.


Re: working of google spiders
« Reply #4 on: June 14, 2018, 04:42:28 AM »
Quote
Are the main crawlers of Google and other search engines the same or different?

Some crawlers (or bots) are designed to steal content. They are not all used exclusively by search engines.

As far as search engines go, they all have a fairly similar need: to locate your site's pages, understand the content, and then add each page to the search index (unless the site is blacklisted or something like that).

If you're concerned about getting the best indexing, you can add a sitemap file to your site. Most search engine crawlers/bots will look for it and use the information in it to determine what's on your site. There are various utilities that can produce such sitemaps, or you can create them by hand.

If you have a Google webmaster account, you can even submit a sitemap for your own domains directly within it.