- Amanda Weiner
- Apr 10
- 5 min read

The thing about owning a website is that your work is never truly done. At Wix, we never leave our website alone.
My job is dedicated entirely to testing different elements of our site. Whether we’re dealing with a landing page, a product page or this very blog you’re reading, my role as CRO manager is to question everything.
Over the years, we’ve run multiple experiments across our site (with varying success). So, I thought I’d take a moment to reflect on a few of our learnings, in case they’re useful for your own testing.
How we approach CRO testing
A quick note before we dive in: when it comes to picking what to test, we prioritize the changes we expect to have the biggest impact. The bigger the potential impact, the more important the test.
That said, there are many other factors that influence how feasible an experiment is. For example, we need to know how quickly a change can be made and who else needs to be involved (after all, I’m not the only one working on our site, and even a minor change could interfere with someone else’s strategy).
The other thing we consider: which defensive versus offensive tests do we want to run? A defensive test is one we conduct to keep our site on-brand, well-ranking and functional. Maybe we notice that our design is outdated; in that case, we make changes so that our site doesn’t underperform in the near future. A defensive test is considered successful if it doesn’t hurt our KPIs during the testing period.
Meanwhile, an offensive test is one that’s meant to drive more conversions (and, in turn, sales). This only works if we can nail intent: to see the improvements we’re after, we need to get into the heads of our visitors and understand what they’re looking for when they land on our site. An offensive test is considered successful if we see a significant lift in conversions during the testing period.
|  | Defensive CRO test | Offensive CRO test |
| --- | --- | --- |
| Purpose | A test for optimizing the branding, SEO performance and/or functionality of a page | A test for improving the conversion rate of a page |
| Frequently tested variables | Content, design, navigation, backend code | Content, design, navigation |
| Expected results | No negative impact on performance or conversions | A statistically significant increase in conversions |
| Example | Remove a form from a page because it’s slowing down the site | Add a new call-to-action to the hero fold |
Once we’ve done our research and identified the who, what and why, then we’re ready to jump in with a plan.
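To make that plan concrete, here’s a minimal sketch of how a split test might assign visitors to variants. The deterministic hash keeps each visitor in the same bucket across sessions; the FNV-1a hash and the 50/50 split are illustrative choices, not Wix’s actual tooling.

```ts
// Hypothetical sketch: deterministic A/B assignment. Hashing a stable
// visitor ID means the same visitor always lands in the same variant.

function hashString(input: string): number {
  // FNV-1a: a simple, stable string hash (any stable hash would do).
  let hash = 2166136261;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  return hash >>> 0; // coerce to an unsigned 32-bit integer
}

function assignVariant(visitorId: string): "control" | "treatment" {
  // 50/50 split based on the hash value.
  return hashString(visitorId) % 2 === 0 ? "control" : "treatment";
}

console.log(assignVariant("visitor-123")); // same answer on every visit
```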
Experiment one: adding Google One Tap
One of our most successful offensive tests in terms of numbers involved Google One Tap.
If you’re not familiar, Google One Tap (“GOT” for short) allows users who are signed into their Google accounts to log in or sign up to Wix with one click, directly on the page. On one hand, this experiment is super basic and obvious; it cuts down signup time and makes the process seamless, leading to more conversions.
On the other hand, it’s more complicated than you’d think, given the sheer number of pages on our site. The test started with a few product pages and our homepage, where a significant number of first-time visitors sign up for Wix. Very early on, we saw positive results, namely an uptick in signups.
Once we realized its impact, we began rolling out GOT across Wix’s many, many pages. Time and time again, we saw the positive impact of this element, no matter the intent of the page. The full rollout involved many hands and careful monitoring of the results.
But overall, I love this test because the concept behind it is so simple and purely CRO: if you remove an obstacle to signing up, more people are likely to convert.
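If you want to try something similar on your own site, here’s a minimal sketch of a One Tap setup using Google’s Identity Services JavaScript API. It assumes the https://accounts.google.com/gsi/client script is already loaded on the page; the client ID and the callback are placeholders, and this is not Wix’s actual integration.

```ts
// Minimal One Tap sketch. Assumes the Google Identity Services script
// (https://accounts.google.com/gsi/client) is loaded, which exposes a
// global `google` object; we declare just the parts we use here.
declare const google: {
  accounts: {
    id: {
      initialize(config: {
        client_id: string;
        callback: (response: { credential: string }) => void;
      }): void;
      prompt(): void;
    };
  };
};

function handleCredential(response: { credential: string }): void {
  // The credential is a signed JWT identifying the Google user.
  // A real integration would send it to the backend to create a session.
  console.log("One Tap credential received");
}

google.accounts.id.initialize({
  client_id: "YOUR_CLIENT_ID.apps.googleusercontent.com", // placeholder
  callback: handleCredential,
});

// Render the One Tap prompt in the corner of the page.
google.accounts.id.prompt();
```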

Experiment two: adding “Get Started” to the top of mobile pages
Another test that followed a similar principle to our GOT test involved adding a “Get Started” call-to-action (CTA) to the top of our mobile pages.
We used to have only one CTA, hidden inside the hamburger menu of our mobile site, since that’s where all of our buttons lived. This also mirrored the experience on desktop, where the “Get Started” CTA appears in the same area as the rest of our menu items.
But, of course, on mobile, opening the hamburger menu is an extra step that you don’t have to take on desktop. Buried in that menu, the button simply isn’t visible until you tap.
Therefore, we took the “Get Started” button and put it in the part of the mobile menu that’s visible right away. In other words, you don’t have to click into anything to see it. We saw an immediate uplift in signups from our mobile pages.
The logic here is similar: make the signup process more accessible and reduce friction to improve overall conversions.
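If you’d like to experiment with the same idea, here’s a hypothetical sketch of one way to surface a menu CTA on small viewports. The selectors and the breakpoint are illustrative, not Wix’s actual markup.

```ts
// Hypothetical sketch: copy the "Get Started" CTA out of the hamburger
// menu and into the always-visible header bar on small screens.
// `.site-header` and `.menu .cta` are made-up selectors.

function surfaceMobileCta(): void {
  const isMobile = window.matchMedia("(max-width: 767px)").matches;
  if (!isMobile) return;

  const header = document.querySelector<HTMLElement>(".site-header");
  const menuCta = document.querySelector<HTMLAnchorElement>(".menu .cta");
  if (!header || !menuCta) return;

  // Clone the existing CTA so it shows without opening the menu.
  const visibleCta = menuCta.cloneNode(true) as HTMLAnchorElement;
  visibleCta.classList.add("header-cta"); // style it via CSS
  header.appendChild(visibleCta);
}

surfaceMobileCta();
```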

Our “flops”
For every successful test, there are a dozen failed experiments. I would be remiss not to mention these failures, because even these have taught us a lot.
Tests where we totally redid the look and feel of a page, or of an element on a page, often showed flat results. There was no clear winner between version A and version B. Or, in some cases, the original version slightly outperformed the new one.
Those flat results taught us that sometimes the small, simple elements make all the difference. In one instance, we spent months rethinking the design of one of our pages to make it more exciting and modern, only to find that it had no measurable positive impact.
Safe to say, we were disappointed. Crushed, even. But it made us question the impact of design on behavior. While design is still essential, many design tests from that point on became less focused on CRO and more focused on simply highlighting our own brand language (with no expectations of positive CRO results).
This clarity, I would say, was still a win. It helped us recenter and realign our expectations.
Try, try and try again
If you take anything away from this article, I hope it’s this: never stop trying things. Even if you “fail,” you will learn something.
Base your tests on data and plan them wisely, but don’t let perfectionism, or the hope of only ever running “winning” tests, get in the way of trying.
Another important tip: always make sure you have enough data to make a real decision. Don’t let a handful of visitors behaving one way or another sway you. You need to see the same behavior consistently, across a large enough sample, to establish a pattern and to know whether your change actually made an impact.
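As a rough illustration of what “enough data” means, here’s a minimal sketch of a two-proportion z-test, one common way to check whether a lift in conversion rate is statistically significant. The traffic and conversion numbers are made up.

```ts
// Hypothetical sketch: two-proportion z-test comparing conversion rates.
// |z| > 1.96 corresponds to significance at roughly the 95% level.

function zTestTwoProportions(
  convA: number, visitsA: number, // control: conversions, visits
  convB: number, visitsB: number, // treatment: conversions, visits
): number {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  const pooled = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se;
}

// Made-up numbers: a 4.8% vs. 5.4% conversion rate over 10,000 visits each.
const z = zTestTwoProportions(480, 10_000, 540, 10_000);
console.log(`z = ${z.toFixed(2)}`); // ≈ 1.93: close, but not yet conclusive
```

Even at these (made-up) volumes, a 0.6-point lift falls just short of significance, which is exactly why a handful of visitors can’t be trusted to tell you anything.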
With that, I’ll leave you to it. Get testing, and enjoy the insights you gain along the way.