Cloaking is the act of determining whether an incoming visitor to a page is a search engine crawler or a user and serving different content to each. When a search engine is detected, a server-side script shows the crawler a different version of the page to the one that's visible to human users in their web browser.
Cloaking is often used as a black hat technique to rank a page that is irrelevant to the search term. For example, the crawler is served a quality page that is likely to rank for a high-volume search term, while the content served to the user does not satisfy the intent of their query.
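The server-side detection described above usually keys off the User-Agent header of the incoming request. A minimal sketch in Python (the WSGI callable, page bodies, and crawler token list are all illustrative assumptions, not anyone's production code; real detection would also verify the crawler's IP range, since the header is trivially spoofed):

```python
# Naive token list -- a real implementation would be far more thorough
# and would verify requests via reverse DNS / published IP ranges.
CRAWLER_TOKENS = ("googlebot", "bingbot")

def is_crawler(user_agent: str) -> bool:
    """Crude check: does the User-Agent contain a known crawler token?"""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def app(environ, start_response):
    """WSGI app serving one page to crawlers and another to everyone else."""
    ua = environ.get("HTTP_USER_AGENT", "")
    if is_crawler(ua):
        body = b"<h1>Keyword-rich page shown to the crawler</h1>"
    else:
        body = b"<h1>Different page shown to human visitors</h1>"
    start_response("200 OK", [("Content-Type", "text/html")])
    return [body]
```

The same branching could equally live in an Apache rewrite rule or a PHP template; the point is simply that the response varies with the requesting user agent.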
SEOs used to defend the use of cloaking by arguing that the pages search engines choose to rank are not necessarily user-friendly. But improvements in Google's and Bing's ability to rank pages by their usability have more or less put this argument to bed. Google's Matt Cutts has stated that any website found cloaking with black hat intentions will be penalised:
"Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index."
Sometimes, for a variety of reasons, it might make sense to show the search engine a different page from the one users see. For example, Google might crawl your site from the US and index the US version of a page, despite the majority of your users being served the UK version.
Search engines also struggle with certain types of content, such as Flash and Ajax, and some SEOs create easy-to-crawl HTML versions of these pages specifically for the crawlers. This makes sense from a user perspective, although it might not be advisable, so err on the side of caution with this one.
If you suspect a page is using cloaking, you can view the page from Google's cache by selecting the option from the SERPs (more on this here). If the page in the cache is different from the one you're on, then cloaking is being used.
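Another rough way to check is to fetch the page yourself with two different User-Agent headers and compare the responses. A sketch using Python's standard library (the URL and user-agent strings are illustrative, and the whitespace-normalising comparison is deliberately crude; dynamic pages can differ between any two fetches for reasons other than cloaking):

```python
import urllib.request

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
CRAWLER_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL, presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(browser_html: str, crawler_html: str) -> bool:
    """Crude comparison: collapse whitespace, then check for any difference."""
    norm = lambda s: " ".join(s.split())
    return norm(browser_html) != norm(crawler_html)

# Usage (hypothetical URL):
# url = "https://example.com/page"
# print(looks_cloaked(fetch(url, BROWSER_UA), fetch(url, CRAWLER_UA)))
```

A positive result here is only a hint worth investigating, not proof of cloaking.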