Some Data About Test Pilot Users With No Experiments Enabled

A few weeks ago we implemented a notification that reminds users if they don’t enable an experiment within 24 hours of installing Test Pilot.[1] We also triggered this notice for existing users with no experiments installed.

Here’s what that looks like:

I spent some time poking at the funnel and feedback today and here’s what I’m seeing…

The notification bar has been triggered 92,552 times from the Test Pilot add-on. 10,119 sessions show a click event on the button that goes to the Test Pilot home page from the notification (the “What experiments?” button).

src: https://pageshot.net/6o2jX6aEc6JSVdaX/analytics.google.com
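(Aside: the add-on-side numbers above are ordinary GA event hits. As a rough illustration of the kind of call involved, and only that, here’s a sketch that sends an event over GA’s Measurement Protocol. The tracking ID, category, action, and label are placeholder assumptions; the real wiring is documented in the ga.md doc linked in [2].)

```typescript
// Hypothetical sketch: recording the notification click as a GA event hit
// via the Measurement Protocol. The tracking ID, client ID handling, and
// event category/action/label are placeholders, not Test Pilot's actual values.
async function recordNotificationClick(clientId: string): Promise<void> {
  const params = new URLSearchParams({
    v: "1",                        // Measurement Protocol version
    tid: "UA-XXXXXXXX-1",          // placeholder GA property for the add-on
    cid: clientId,                 // anonymous client identifier
    t: "event",                    // hit type
    ec: "notification",            // event category (assumed)
    ea: "click",                   // event action (assumed)
    el: "what-experiments-button"  // event label (assumed)
  });
  await fetch("https://www.google-analytics.com/collect", {
    method: "POST",
    body: params.toString()
  });
}
```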

In addition to Unified Telemetry, Test Pilot uses two different GA instances to record event metrics – one for the add-on and one for the app.[2] Splitting things out like this sometimes causes reporting oddness. For example, in this case the Test Pilot add-on constructs a URL to the Test Pilot app with UTM parameters that let us see the specific referrer URL in the app’s metrics dashboards. One would think that for every click event recorded by the add-on, there would be one Test Pilot page view with a matching referrer (so 10k clicks on one end and 10k matching referrers on the other). It turns out we’re actually seeing just shy of 7k URLs with a matching referrer.

src: https://pageshot.net/HAn6ESIFjXgZvL5m/analytics.google.com
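To make the referrer matching concrete, here’s a minimal sketch of how the add-on-to-app URL could be tagged. The specific utm_* names and values are assumptions for illustration; the point is only that the add-on decorates the outgoing URL so the app-side GA property can attribute the page view back to the notification.

```typescript
// Hypothetical sketch: the add-on opens the Test Pilot app with UTM
// parameters so the app-side GA property can attribute the visit.
// The specific utm_* values here are assumptions, not the real ones.
function buildTestPilotUrl(): string {
  const url = new URL("https://testpilot.firefox.com/");
  url.searchParams.set("utm_source", "testpilot-addon");
  url.searchParams.set("utm_medium", "firefox-browser");
  url.searchParams.set("utm_campaign", "no-experiments-notification");
  return url.toString();
}
```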

So, I’m not sure exactly what’s going on here, though there are a few possible explanations. For example, a user might be blocking GA at the page level (i.e. the Test Pilot app) but not inside the Test Pilot add-on. Anyway, it’s not an order of magnitude off, so that’s a start at least. Still, I’d like to see if there are other places where there are similar differences between add-on events and app referrers in order to drill down on what’s causing the discrepancy.

Diving in further: if a user makes it to the Test Pilot page from the notification bar, we can see whether they wind up installing an experiment. It turns out that about 27% of people do (so that’s not nothing!).


src: https://pageshot.net/xgK2U29qRScCbOFZ/analytics.google.com

There’s a lot going on here, but the important numbers are sessions (6,932) and Enable Experiment (Goal 1 Completion) (1,914). These 2,000ish souls are actually enabling an experiment because of our efforts here!

(editorial aside: the count above gives 6,933 users from the no-experiments-installed referrer while the table lists the count at 6,932. Counting, it turns out, is hard)

Okay, so 27% of people who get to the Test Pilot site through this funnel enable an experiment, but if you factor in all of the sessions where users see the originating notification, that conversion rate drops to a slightly less awe-inspiring 2% (1,914 out of 92,552 total notification events). So that kind of sucks, right? Well, here’s a thought: that notification bar is pretty small; maybe people don’t see it or something?
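For the record, here’s the arithmetic behind those two rates, using the figures quoted above:

```typescript
// Sanity check on the two conversion rates quoted above.
const notificationEvents = 92_552; // times the notification bar was triggered
const matchedSessions = 6_932;     // app sessions with the matching referrer
const enabled = 1_914;             // Enable Experiment goal completions

console.log((enabled / matchedSessions * 100).toFixed(1));   // ~27.6% of visitors who arrive
console.log((enabled / notificationEvents * 100).toFixed(1)); // ~2.1% of all notification events
```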

It’s certainly possible, but we just did a test on No More 404s that suggests otherwise. We recently converted No More 404s from a notification bar UI to something much more visible and cute (it’s really cute; you should install that experiment and troll around for 404 pages). We hypothesized that users would click through at a far greater rate because the new UI is so much more noticeable. It turns out this isn’t true: users converted to the Internet Archive’s cache at the same rate. The change we saw was that users started dismissing the more up-front UI at a far greater rate.

Last week we overhauled the UI on No More 404s. The CTR (blue) is the same, but all of a sudden users started clicking the dismiss button like crazy.
src: https://pageshot.net/VkLqEdKPpeG8Z4Sb/sql.telemetry.mozilla.org

The data from No More 404s references a different cohort of users (experiment enablers) than our non-experimenters, but it seems to suggest that people do see even relatively subtle notification UI. So we’ll need to do more digging to find out why the CTR on this notification is so low. Fortunately, we’re planning some pretty extensive user research this winter to help us better understand our users’ mental models of Test Pilot, so that should help us get some insight.

Now (adjusts black turtleneck) there is one more thing.

We put a survey link on Test Pilot that shows up for non-experimenters. We don’t have a ton of results yet, but we’re seeing a few interesting things already. These people aren’t leavers; they’ve just never had an experiment installed.

src: https://pageshot.net/vxdE0fO4uY7AOnje/app.surveygizmo.com
Other experiments don’t even show up.

It’s also worth noting that when we ask people why they aren’t experimenting, the answers are mostly of the “I don’t know what Test Pilot is” variety. I’m not totally surprised by this, but it’s a good reminder of the gap between developer/designer intent and user understanding. As I mentioned above, we’re working to do some in-depth qualitative user research to better understand (among other things) the technical literacy of the Test Pilot user base.

[1] https://mozilla.invisionapp.com/share/Q77S1H6ZC#/screens/203225298_No_Experiments_Enabled_After_N

[2] If you’re interested in seeing how we’re sending events to GA check this out: https://github.com/mozilla/testpilot/blob/master/docs/metrics/ga.md