Crawling, in the context of SEO, is the process by which search engines discover your web pages and backlinks.
Crawling, also known as spidering, means scanning a website's sections, content, keywords, headings, hyperlinks, and images with thousands of small automated bots.
Any data that can be found on the website is crawled.
Crawlers detect the hyperlinks on a page, including those that point to other websites, then fetch and parse those pages for new links, repeating the process. Bots re-crawl the whole internet regularly to keep the search engine's data up to date.
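To make that link-following loop concrete, here is a minimal sketch of a breadth-first crawler in Python using only the standard library. The seed URL, the `max_pages` limit, and all names here are illustrative assumptions; a production crawler would also respect robots.txt, rate-limit its requests, and distribute the work across many machines.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, max_pages=50):
    """Breadth-first crawl: fetch a page, extract its links,
    queue the unseen ones, and repeat until max_pages is reached."""
    seen = {seed_url}
    queue = deque([seed_url])
    fetched = 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load
        fetched += 1
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
        print(f"crawled: {url} ({len(parser.links)} links found)")


if __name__ == "__main__":
    crawl("https://example.com")  # hypothetical seed URL
```

The breadth-first queue mirrors how crawlers work outward from known pages: every newly discovered link becomes a future fetch, which is why a single external backlink can be enough for a search engine to find an otherwise unlisted page.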