An A/B test is an experiment comparing two versions of a webpage to see which version performs better.
In a simple A/B test, page A acts as the control and remains unchanged, while page B is an alternative version that introduces a new element (or elements) with a view to improving performance.
A typical A/B test would send 50% of the traffic to the control page (A) and 50% of the traffic to the test page (B), although depending on the design of the experiment this split can change.
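As a rough sketch of how a 50/50 split might be implemented, the snippet below buckets each visitor deterministically by hashing a user identifier, so the same visitor always sees the same version on repeat visits. The function name `assign_variant` and the user ID format are illustrative, not from any particular testing tool.

```python
import hashlib

def assign_variant(user_id: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into version A or B.

    Hashing the user ID (rather than picking at random on every
    visit) keeps the experience consistent for returning visitors.
    `split` is the fraction of traffic sent to the control page A.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) / 16 ** 32  # uniform value in [0, 1)
    return "A" if bucket < split else "B"

# Illustrative check: over many users the split is roughly 50/50.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"user-{i}")] += 1
```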
A common A/B test in the digital marketing world is to check which version of a page converts more users into customers. A simple example would be to change the colour of a call-to-action button to see which one gets more clicks. For example, the A page (also known as the control page) might have a green button, whereas the B page (also known as the variant or test page) might have a red button.
When users click through to this page, 50% would see version A with the green button and 50% would see version B with the red button. After a period of time, if significantly more users click the red button than the green one, it might be time to think about changing to a red button permanently.
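"Significantly more" can be checked with a standard two-proportion z-test comparing the click rates of the two pages. Below is a minimal stdlib-only sketch; the function name and the example click/view counts are made up for illustration, and in practice a testing tool or a statistics library would do this for you.

```python
import math

def two_proportion_z(clicks_a: int, views_a: int,
                     clicks_b: int, views_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test.

    Compares the click rate of page B against page A and returns
    (z statistic, p value). A small p value (commonly < 0.05)
    suggests the difference is unlikely to be down to chance.
    """
    rate_a = clicks_a / views_a
    rate_b = clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (rate_b - rate_a) / se
    # p value from the standard normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: 100 clicks from 1,000 views of the green
# button (A) versus 200 clicks from 1,000 views of the red one (B).
z, p = two_proportion_z(100, 1000, 200, 1000)
```

With a difference this large the p value comes out far below 0.05, so switching to the red button would look justified; with closer numbers the test may not reach significance at all.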
While you could set up an A/B test by creating two of your own pages and analysing the results using your usual analytics programme, it's much simpler to use software ready-made for the task.
Google is pretty good at determining if an A/B test is going on, so don't be tempted to add a noindex directive to the variant pages – this could well lead to all of the pages being deindexed! Google also recommends not running A/B tests for longer than necessary, though exactly how long counts as "too long" is uncertain.