Market Research at Trade Shows, the Quantitative Way

Try an A/B testing approach to your event marketing strategy.

How many of us go to trade shows with the intent to “learn more about our customer base,” only to walk away with a few vague memories? To get the best ROI from our event experience, we need a quantitative way to learn from conversations.

A/B testing opens up a new approach to strategic event marketing. Traditionally an online marketing tool, an A/B test is like a science experiment without the test tubes and chemicals.

The letters A and B simply stand for two alternate versions of the same piece of content.

Here’s an example of how you could apply the A/B test approach at a trade show:

  • Research question: Can we close more sales by adding an incentive to the initial pitch?

  • Hypothesis: Adding an incentive draws people to your booth with a sense of immediacy, leading to more conversions.

  • Version 1: Ask, “Would you be interested in trying our product X?” (Control).

  • Version 2: Ask, “Would you be interested in trying our product X and entering to win an iPad?” (Experimental).

  • Data: Keep track of who converted from each version. If you use a lead retrieval system like the Bloodhound app, you can tag each lead or jot a quick note after each conversation; the tags and notes are stored alongside your leads on your phone for later reference.

  • Results & Conclusions: Which tactic was more effective? (A quick way to tally the tags is sketched after this list.) If the incentive worked, at the next trade show you could test another hypothesis, such as whether people respond more favorably to an iPad or to a different type of reward.
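
If you export those tagged leads at the end of the show, tallying the two versions takes only a few lines. Here is a minimal sketch in Python; the lead records and the "control"/"incentive" tag names are invented for illustration, so substitute whatever fields your lead retrieval export actually provides.

    # Tally conversions per pitch version from exported lead tags.
    # The records and tag names below are hypothetical placeholders.
    from collections import Counter

    leads = [
        {"name": "Lead 1", "version": "control",   "converted": True},
        {"name": "Lead 2", "version": "incentive", "converted": False},
        {"name": "Lead 3", "version": "incentive", "converted": True},
        # ... one record per booth conversation
    ]

    pitched = Counter(lead["version"] for lead in leads)
    converted = Counter(lead["version"] for lead in leads if lead["converted"])

    for version in pitched:
        rate = converted[version] / pitched[version]
        print(f"{version}: {converted[version]}/{pitched[version]} converted ({rate:.0%})")

Even this simple tally tells you, by the numbers rather than by gut feel, which pitch earned its keep.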

It’s best to make your research question as specific as possible. Here are other questions you could ask:

  • What is the most effective order to present features X, Y, Z?

  • Are prospects more interested when I stress benefit X or benefit Y?

  • Does handing out a pamphlet lead to more responses to my follow-up email?

A good hypothesis is a statement that you can test and call valid or invalid after reviewing the data. You’ll also need to alternate between the two versions throughout the day, so that time-of-day effects, such as attendees being less worn out in the morning, don’t skew the results toward whichever version you happened to run at the peak hour.

Conversions don’t have to be your only metric. Perhaps you’re interested in which version gets you more follow-up responses, or which requires the fewest follow-up calls.

If you’re curious, here’s how a traditional web A/B test would go:

  • Form a research question (“What color is best for button Z?”) and a hypothesis (“Making button Z green will lead more viewers to click it than if it is red”).

  • Create multiple versions of your web page based on the hypothesis. Perhaps Version 1 has the red button, and Version 2 has the green button.

  • Randomly show different viewers different versions.

  • Use web analytics to track how viewers interact with each version.

  • Decide which version worked best to achieve your goals (a simple significance check is sketched after this list).
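
Once the analytics numbers are in, you can check whether the gap between versions is more than noise. The sketch below, in Python with invented visitor and click counts, runs a standard two-proportion z-test; most analytics platforms perform an equivalent check for you, so treat this as a peek at the arithmetic rather than a required step.

    # Two-proportion z-test on invented click counts for the red-button (A)
    # and green-button (B) versions of the page.
    from math import sqrt, erf

    def z_test(clicks_a, visitors_a, clicks_b, visitors_b):
        p_a = clicks_a / visitors_a
        p_b = clicks_b / visitors_b
        pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
        se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
        z = (p_b - p_a) / se
        # Two-sided p-value from the normal approximation.
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
        return p_a, p_b, z, p_value

    p_a, p_b, z, p = z_test(clicks_a=120, visitors_a=2400, clicks_b=156, visitors_b=2400)
    print(f"Red button:   {p_a:.1%} click-through")
    print(f"Green button: {p_b:.1%} click-through")
    print(f"z = {z:.2f}, p = {p:.3f}  (p < 0.05 suggests the difference is probably real)")

A p-value below 0.05 is the conventional bar for declaring a winner with some confidence, instead of reacting to one lucky afternoon of clicks.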

Last month, Google released a new Experiments API for its analytics platform, which you can test on your company site.

Try the A/B approach to marketing at your next event! You’ll be surprised at how much you can learn, and you’ll be able to quantify your trade show ROI if one of your goals is to conduct audience research. Based on the findings, you can decide which shows to attend in the future.

Has the A/B test approach worked for you? What questions would you want answered at your next event?