How Live Chat Tools Impact Conversions and Why I Launched a Bad Variant

Do those live chat tools actually help your business? Will they get you more customers by allowing your visitors to chat directly with your team?

As with most tests, you can come up with theories that sound great on both sides.

Pro Live Chat Theory: A live chat tool helps visitors get their questions answered faster and see the value of your product, and signups will increase when people see how willing you are to help them.

Anti Live Chat Theory: It’s one more element on your site that will distract people from your primary CTAs, so conversions will drop when you add it.

These aren’t the only theories either; we could come up with dozens on both sides.

But which is it? Do signups go up or down when you put a live chat tool on the marketing site of your SaaS app?

It just so happens I ran this exact test while I was at Kissmetrics.

How We Set Up the Live Chat Tool Test

Before we ran the test, we already had Olark running on our pricing page. The Sales team requested it and we launched without running it through an A/B test. Anecdotally, it seemed helpful. An occasional high-quality lead would come through and it would help our SDR team disqualify poor leads faster.

Around September 2014, the Sales team started pushing to have Olark across our entire marketing site. Since I had taken ownership of signups, our marketing site, and our A/B tests, I pushed back: we weren’t just going to launch it, it needed to go through an A/B test first. I was pro-Olark at this point but wanted to make sure we weren’t accidentally cannibalizing our funnel.

We got it slotted for an A/B test in Oct 2014 and decided to test it on 3 core pages of our marketing site: our Features, Customers, and Pricing pages.

Our control didn’t have Olark running at all. This means that we stripped it from our pricing page for the control. Only the variant would have Olark on any pages.

Here’s what our Olark popup looked like during business hours:

Kissmetrics Olark Popup Business Hours

And here it is after-hours:

Kissmetrics Olark Popup After Hours

Looking at the popups now, I wish I had done a once-over on the copy. It’s pretty bland and generic, and sharper copy might have gotten us better results. At the time, I decided to test whatever Sales wanted since this test was coming from them.

Setting up the A/B test was pretty simple. We used an internal tool to split visitors into variants randomly (this is how we ran most of our A/B tests at Kissmetrics). Half our visitors randomly got Olark, the other half never saw it. Then we tagged each group with Kissmetrics properties and used our own Kissmetrics A/B Test Report to see how conversions changed in our funnel.
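The internal tool we used isn’t public, but the usual way to get a stable 50/50 split like this is to hash a persistent visitor ID so each visitor always lands in the same variant across page views. A minimal sketch, with the experiment name and bucketing scheme as my own assumptions:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "olark-site-wide") -> str:
    """Deterministically bucket a visitor into 'control' or 'variant'.

    Hashing a stable visitor ID (e.g. a first-party cookie value) means
    the same visitor gets the same experience on every page view, which
    a purely random coin flip per request would not guarantee.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "variant" if bucket < 50 else "control"  # 50/50 split
```

The assigned group would then be recorded as a visitor property (in our case, a Kissmetrics property) so the A/B Test Report can compare conversion rates per group.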

So how did the data play out anyway?

Not great.

Our Live Chat A/B Test Results

Here’s what Olark did to our signups:

Live Chat Tool Impact on Signup Conversions

A decrease of 8.59% at 81.38% statistical significance. I can’t say we have a confirmed loser at this point; I prefer 99% statistical significance for those kinds of claims. But the data is not trending toward a winner.

How about activations? Did it improve signup quality and get more people to install Kissmetrics? That step of the funnel looked even worse:

Live Chat Tool Impact on Activations

A 22.14% decrease in activations at 97.32% statistical significance. Most marketers would declare this a confirmed loser since it clears the 95% statistical significance threshold. Even if you hold out for 99%, the results are not looking good at this point.
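The post doesn’t publish the raw visitor counts, but a significance figure like 97.32% typically comes from a two-proportion z-test on the two groups’ conversion counts. A sketch with hypothetical numbers (the counts below are made up for illustration):

```python
from math import erf, sqrt

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided two-proportion z-test, returned as a confidence percentage.

    A result of ~97.3 means the observed difference clears a 95%
    significance threshold but not a 99% one.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    return 100 * erf(z / sqrt(2))  # two-sided confidence level

# Hypothetical counts: control activates 400 of 5,000 visitors,
# variant activates 330 of 5,000.
print(f"{ab_significance(400, 5000, 330, 5000):.1f}% significance")
```

This is also why the 81.38% figure on signups isn’t a “confirmed loser”: at that confidence level, roughly one test in five with no real difference would show a gap at least that large by chance.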

What about customers? Maybe it increased the total number of new customers somehow? I can’t share that data but the test was inconclusive that far down the funnel.

The Decision – Derailed by Internal Politics

So here’s what we know:

  • Olark might decrease signups by a small amount.
  • Olark is probably decreasing Kissmetrics installs.
  • The impact on customer counts is unknown.

Seems like a pretty straightforward decision, right? We’re looking at possible hits on signups and activations, then a complete roll of the dice on customers. These aren’t the kind of odds I like to play: downside at the top of the funnel with a slim chance of success at the bottom. We should have taken it down, right?

Unfortunately, that’s not what happened.

Olark is still live on the Kissmetrics site 9 months after we did the test. If you go to the pricing page, it’s still there:

Kissmetrics Live Chat Tool on Pricing Page

Why wouldn’t we kill a bad test? Why would we let a bad, risky variant live on?

Internal politics.

Here’s the thing: just because you have data doesn’t mean that decisions get made rationally.

I took these test results to one of our Sales directors at the time and said that I was going to take Olark off the site completely. That caused a bit of a firestorm. Alarms got passed up the Sales chain and I found myself in a meeting with the entire Sales leadership.

I wanted Olark gone. Sales was 100% against me.

Live chat is considered a best practice (or at least it was at one point). It’s a safe choice for any SaaS leadership team. I have no idea HOW it became a best practice considering the data I found, but that’s not the point. There are plenty of best practices that sound great but actually make things worse.

Here’s what the head of Sales told me: “Salesforce uses live chat so it should work for us too.”

But following tactics from industry leaders is the fastest path to mediocrity for a few reasons:

  • They might be testing it themselves to see if it works; you don’t know whether it’s still mid-test or a win they’ve decided to keep.
  • They might not have tested it at all; they could be following best practices themselves with no idea whether it actually helps.
  • They may have gotten bad data but decided to keep it because of internal politics.
  • Even if it does work for them, there’s no guarantee it’ll work for you. I’ve found most tactics to be highly situational: in a few cases a tactic helps immensely, but most of the time it’s a waste of effort with no impact.

It’s also difficult to understand how a live chat tool could decrease conversions. Maybe it’s a distraction, maybe not. But when you’re an SDR and good opportunities come in through chat that help you meet your qualified lead quotas, it’s not easy to separate that anecdotal experience from data on the entire system.

But none of this mattered. Sales was completely adamant about keeping it.

The ambiguity on customer counts didn’t help either. As long as it was an unknown, arguments could still be made in favor of Olark.

Why didn’t I let the test run longer and get enough data on how it impacted new customer counts? With how close the data was, we would have needed to run the test for several months before getting anywhere close to an answer. Since I had several other tests in my pipeline, I faced serious opportunity costs if I let the test run. Running one test for 3 months means not running 3-4 other tests that have a chance at being major wins.

So I faced a choice. I could have removed Olark if I was stubborn enough; my team had access to the marketing site, Sales didn’t. But standing my ground would start an internal battle between Marketing and Sales. It’d get escalated to our CEO and I’d spend the next couple of weeks arguing in meetings instead of trying to find other wins for the company. Regardless of the final decision, the whole ordeal would fray relationships between the teams. I’d also burn a lot of social capital if I pushed my decision through. With the decrease in trust, there would be all sorts of long-term costs that would prevent us from executing effectively on future projects.

I pushed back and luckily got agreement not to launch it on the Features or Customers pages. But Sales wouldn’t budge on the Pricing page. I chose to let it drop, and it lives to this day.

That’s how I launched a variant that decreased conversions.

Should You Use a Live Chat Tool on Your Site?

Could a live chat tool increase the conversions on your site? Possibly. Just because it didn’t work for me doesn’t mean it won’t work for you.

Are there other places that I would place a live chat tool? Maybe a support site or within a product? Certainly. There are plenty of cases where acquisition matters less than helping people as quickly as possible.

Would I use a live chat tool at an early stage startup to collect every possible bit of feedback I could? Regardless of what it did to signups? Most definitely. Any qualitative feedback at this stage is immensely valuable as you iterate to product/market fit. Sacrificing a few signups is well worth being able to chat with prospects.

If I was trying to increase conversions to signups, activations, and customers, would I launch a live chat tool on a SaaS marketing site without A/B testing it first? Absolutely not. Since this test didn’t go well, I wouldn’t launch a live chat tool without conclusive data proving that it helped conversions.

Olark and the rest of the live chat companies have great products. There’s definitely ways for them to add a ton of value. Getting lots of qualitative feedback at an early stage startup is probably the strongest use case that I see. But if your goal is to increase signups, activations, and customers, I’d be very careful with assuming that a live chat tool will help you.


11 Comments
  • Karl Pawlewicz July 27, 2015, 2:17 pm

    Thank you Lars! A very honest and data-driven look at live chat. Really hope your readers find this helpful when considering live chat, or evaluating whether an existing chat platform is working or not. – Karl from Olark

    • Lars Lofgren July 28, 2015, 8:48 pm

      Thanks Karl! I’m still a big fan of your app, lots of ways to get value out of it.

  • Sunir Shah July 27, 2015, 3:15 pm

Hi, I’m the CMO at Olark. I love split tests. Thank you! That’s a very interesting bit of analysis. I’ll have to consider how it stacks up against other split tests, like the case study that showed a 31% improvement. That doesn’t mean it would work out in your context, so I’m curious why not. 🙂

    One question I have. Did you look at the chats themselves? When in the habit of split testing, it’s easy to view live chat as just another page element, but of course live chat isn’t a page element. Customers perceive it primarily as the human interaction because that’s its essential purpose. So you would also be including the live chat team and its processes as well in the split test, if that makes sense. For instance, the sales team may have moved the chatters out into a slower but more lucrative sales process.

One further thing to consider: we did notice that the KISSmetrics chat pops up immediately on the pricing page. Our own internal statistics have shown 25 seconds to be the optimal wait so the popup doesn’t interrupt the customer before they’re engaged.

    — Sunir Shah, CMO, Olark live chat

    • Lars Lofgren July 28, 2015, 9:22 pm

      Yup, we spent a lot of time seeing which types of chats came in. That’s why we wanted to expand it across the rest of the site. Our SDR team thought it was great. They’d get a couple of high quality prospects every month and they were able to disqualify weak prospects faster. They loved it.

Our A/B test looked at data all the way down the customer funnel, so it included our SDR followup process in addition to the page element on the site. But the impact on customers was small enough that we weren’t able to detect a lift or drop with any degree of certainty.

      Delaying the live chat pop-up might help. It’s rare for a variable that small to move the funnel in a major way but it does happen occasionally.

As for the VWO case study, I’m HIGHLY skeptical of any proclaimed lift unless people are willing to share the data and the statistical significance reached on their test. For example, a 31% improvement at 90% statistical significance really doesn’t mean much; it’s way too easy to hit false positives. Also, 30%+ wins are exceedingly rare. In my experience, only a few tests have improved signups to that degree. Usually, you’d need to test different value props, offers, or complete funnel rebuilds to get that kind of win. I highly doubt Ez Texting got a 31% win off a live chat widget. I could be completely wrong in this prediction; I just want to see the data first.

  • Sunir Shah August 5, 2015, 5:25 pm

    Hey Lars, thanks for the follow up. Very interesting. As a fellow data nerd, I feel and quantify your pain of lack of data to 3 significant digits. 🙂

    — Sunir Shah, CMO, Olark live chat

  • Alex Chaidaroglou September 15, 2015, 3:56 pm

    This is the first article I read on your blog Lars and I must say it was totally amazing! Props also to the Olark team for their replies and chiming in.

    It’s a great demonstration of how things might get screwed due to politics or blindly following popular tactics and competitors. Very down to earth and realistic approach.

    Looking forward to your next articles.

    • Lars Lofgren September 15, 2015, 4:01 pm

      Thanks Alex! I’ve got more essays on the way. 🙂

  • Gary Tramer September 15, 2015, 8:14 pm

    I wanted to chime in here because at LeadChat, we see this all the time. When you install live chat on a poorly optimized website (say, one with a long enquiry form that’s hard to find), it works wonders.

    But the reason is, the site is terrible. So, live chat can easily give a 4-8x conversion uplift.

    However, on sites that have an extremely well-optimized conversion flow and strong CTAs, you’ll kill conversions with live chat if you don’t know what you’re doing.

    We did a trial last year for a company called Sanebox, which has an extremely well thought-out (and tested) homepage. Our chat implementation (and team) couldn’t lift conversions noticeably.

    Since then, we’ve evolved significantly along with the data we capture.

    Here are some tips re live chat and what you found above:

    1. Sunir is right: the proactive popup shouldn’t be instant. But it shouldn’t be a flat 25 seconds either.

    At LeadChat, we apply the following rule. If the standard conversion rate on a page is >5%, don’t install live chat (at least initially). If it’s between 2-5%, do a thorough analysis of the average time to complete the form (your CTA), and hit people in the sweet spot between average form completion time and average time to leave that page. In some cases, that can be over 30-40 seconds.

    Even better, don’t show the chat button at all (i.e., keep it hidden from view) and only pop it up after that time (so they can’t click a compressed chat button either).

    Sure, you’ll leave some leads on the table, but you’ll pick up the cream without affecting the base.

    2. You used a pre-chat survey, i.e., you asked people for their name, email, and phone number before they could chat.

    Technically, that’s more information from the visitor than clicking the main CTA requires, so the setup doomed it from the start.

    Best practice is not to have this, and have agents trained in capturing lead information.

    3. Another way to catch the stragglers is click-to-chat. Next to your main CTA, add some small underlined text that says, “Questions? Click to chat with an expert,” and have the link pop up a hidden chat box. Not only do you measure the clicks, you get more qualified chats.

    Not all chat platforms do this, but the better ones do.

    4. Never have after hours on chat.

    It’s a CTA killer, and it’s plain rude. It’s like walking into a retail store and being met by a 16-year-old, pimply-faced store attendant who’s playing on his iPhone, looks at you, yawns, and goes back to his phone.

    Better to have no chat than the “Leave Offline” setting (or use a 24/7 live chat service like… ehhem… cough cough… LeadChat).

    5. Personalization

    Use smart triggers to personalize the experience based on a bunch of factors: referral source, country, previous visits to the site, etc.

    So you could open chat with

    “Hey, welcome back! We’re also based in SF, wicked 🙂

    I can see you were reading about live chat conversions on Lars Lofgren’s blog, how can I help today?”

    This would check that the referral source matches this blog and that country=USA, city=SF, and visitor=repeat.

    This has a big impact on conversions as well.


    As you know, at LeadChat we do hundreds of thousands of chats for companies as small as dentists and as large as major healthcare providers, SaaS, banks, etc and what I’m sharing above comes from very conclusive analysis.

    You should get me in touch with the current guys at Kissmetrics, we’d kill their test 🙂

    • Lars Lofgren September 15, 2015, 9:20 pm

      Wow, this is incredibly helpful Gary.

      And I bet you’re right. We had spent a ton of time optimizing some of the pages on the Kissmetrics website. So you’re probably spot-on with needing to be more careful with live chats on optimized sites.

      The rest of your ideas are also really interesting, you’ve sold me on giving live chats a second chance. I’ll pass this thread to my contacts at Kissmetrics. 🙂

    • Alex Chaidaroglou October 1, 2015, 3:37 pm

      Wow another killer comment, thanks Gary!

      I guess that’s one way to sell with blog commenting while adding value 🙂

  • Rodrigo May 17, 2016, 5:26 pm

    I think what Gary says is very useful. We had a very optimized landing page and the live chat drastically lowered our conversions (smartsupp live chat). I don’t know, maybe visitors think they can chat anytime so they forget, or they get more distracted, who knows, but at least in our case it affected us. Greetings

