Crawlers (or bots) are used to collect information available on the web. By working through a site's navigation menus and following its internal and external links, the bots begin to understand the context of each page. Of course, the words, images, and other data on those pages also help search engines like Google understand what a page is about.
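
To make the idea concrete, here is a minimal sketch of a crawler in Python, using only the standard library. It follows links breadth-first from a seed page and labels each discovered link as internal or external; the seed URL, page limit, and timeout are illustrative assumptions, not how any particular search engine actually crawls.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=10):
    """Breadth-first crawl from seed_url, following internal links it finds."""
    seen = {seed_url}
    queue = deque([seed_url])
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip pages that fail to load
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            # Internal links share the seed's domain; everything else is external.
            is_internal = urlparse(absolute).netloc == urlparse(seed_url).netloc
            print(("internal" if is_internal else "external") + ": " + absolute)
            if is_internal and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)


if __name__ == "__main__":
    crawl("https://example.com")  # example seed; a real crawler would also honor robots.txt
```

A real crawler adds much more on top of this, such as respecting robots.txt, rate limiting, and storing page content for indexing, but the core loop of fetching a page and queueing the links it contains is the same.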