A robots.txt file is a plain-text file placed in the root directory of a website that instructs search engine bots on how they should behave when crawling the site.
In the robots.txt file you can choose to block specific bots from crawling specific parts of your website. For example, you could stop Googlebot from accessing your blog while allowing Bingbot and every other search bot full access.
That example would be madness in practice, but it illustrates the fine-grained control that robots.txt gives an SEO over the search bots.
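As a sketch, the Googlebot-blocking scenario above might look like this in a robots.txt file (the `/blog/` path is just an assumed example):

```
# Block Googlebot from the blog section only.
User-agent: Googlebot
Disallow: /blog/

# All other bots (Bingbot included) may crawl everything.
User-agent: *
Disallow:
```

Note that an empty `Disallow:` line means "nothing is disallowed", so the catch-all group grants unrestricted access.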
For more information, and to start creating your own robots.txt file, check out www.robotstxt.org.