A crawler, also called a web crawler, spider or bot, is a program or script that automatically travels the pages of the World Wide Web, feeding information back to a database as it goes. Search engines use web crawlers to keep their databases, or indexes, up to date with what's on the web.
You can also use Google's Search Console to submit a list of your pages in the form of an XML sitemap, which encourages search engines to send their crawlers to visit. You can then monitor the results in Search Console's crawl and index reports.
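A sitemap is just a plain XML file listing the URLs you want crawled, following the sitemaps.org protocol. Here's a minimal sketch (the example.com addresses and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the crawler to find -->
  <url>
    <loc>https://www.example.com/</loc>
    <!-- optional: when the page last changed -->
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

Once the file is live on your site (commonly at /sitemap.xml), you can submit its URL via the Sitemaps section of Search Console.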
Here's an example of the crawl report and index report for a small business website in Google's Search Console.
Depending on the authority and size of your site, search engines will assign it a crawl budget. If your site has very low authority but thousands of pages, the crawler will likely only visit the top-level pages before the budget runs out and it leaves. It's important to make your website as attractive a place as possible for the crawler to stick around: a clear structure, good internal linking, clean code and great content.