Launching Ovuline: 4 weeks, 4 cheap lessons and 1 expensive learning

by Vasile Tofan

Ovuline, a web-based fertility monitoring application, went live on January 10, following the alpha phase sponsored by the HBS MVP Fund. Four weeks, 2,000+ signups and 70,000+ pageviews later, I’m attempting to summarize some of our learnings.

A lot of this is very tactical: cheap tips and practical notes – I’ll leave the grand strategizing for the next post. In the spirit of lean thinking – maximizing learning per dollar spent – I will thus start with the dollar side of the equation. If you want to take a shortcut, go directly to the conclusion, then decide for yourself if you want to read the entire post.

1. A viable product – cutting the right corners

As a consumer health product dealing with a very sensitive issue, we could not allow ourselves drastic shortcuts. Many ‘smoke screens’ would have been damaging to users who are already under significant stress because of their fertility concerns. For example, when considering testing a Polycystic Ovary Syndrome (PCOS – a condition affecting up to 10% of women) module in our application, we could in principle have created a smoke option saying ‘Monitor for PCOS condition’. But we did not feel comfortable with the ethical (and medical!) implications of some of these ‘smoke tests’. As a team we explicitly decided not to blatantly ‘burn’ through our users to test one hypothesis or another, and to treat every experiment with the utmost respect. Being cheap on ethics is never a good choice.

This still left us with plenty of shortcuts to take in the non-crucial aspects: from our Prezi-based intro video to the very raw design of some pages to the numerous ‘coming soon’ inner tabs – we had enough reasons to be a little embarrassed by our first release. Luckily, our users did not seem to mind too much.

2. Reducing traffic cost – my 2 cents

With 50% of couples taking more than six months to conceive, there is a lot of search for conception-related topics online. In fact, there are as many as 4 million Google searches each month for ‘how to get pregnant’ alone. Google’s Keyword tool was extremely helpful for estimating the specific inventory for all relevant terms. As somewhat unsophisticated AdWords users, we initially got fooled by the Google-recommended cost-per-click (CPC) bids. ‘How to get pregnant’, for instance, had a suggested bid of $1.15 CPC. A dozen downward iterations later, we had reduced our bid to … $0.01, the absolute minimum! Even at this level we managed to get a steady stream of 300-500 clicks per day. Notes to ourselves:
  • Always start with a ridiculously low bid and adjust upwards until you reach the desired stream of clicks per day
  • Consider expanding search criteria to cover English-speaking developing markets which command much lower CPCs (Philippines, India, Pakistan) – even if these are not your target, the data can still be valuable for testing certain hypotheses.
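The bidding heuristic above – start at the floor, step upward only until the click volume is sufficient – can be sketched in a few lines of Python. The `next_bid` helper and all the numbers here are illustrative assumptions, not our actual AdWords setup:

```python
def next_bid(current_bid, clicks_yesterday, target_clicks,
             step=0.01, max_bid=1.15):
    """Raise the CPC bid one step at a time until daily click volume
    reaches the target; never exceed the suggested (ceiling) bid."""
    if clicks_yesterday >= target_clicks:
        return current_bid            # volume is sufficient; hold the bid
    return min(current_bid + step, max_bid)

bid = 0.01                            # start at the absolute minimum
for clicks in [40, 120, 350]:         # hypothetical clicks observed each day
    bid = next_bid(bid, clicks, target_clicks=300)
# bid settles at $0.03 once the 300-clicks/day target is met
```

The point of starting low is that the suggested bid is a ceiling, not a starting point: each upward step costs you a day of data, but each unnecessary cent on the bid costs you on every click.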

3. Increasing conversions – before you move to A/B testing

Our conversion rate has evolved from the initial ~5% to as high as 13%, implying a user acquisition cost (CAC) of about $0.10 at our CPC of $0.01. We managed to gradually increase conversions by applying simple tweaks even before jumping to home page A/B testing:
  • Link your AdWords and Analytics accounts to optimize keywords for Conversions as opposed to Click-Through Rate (CTR). Google optimizes for CTR by default, which is not always helpful
  • Monitor your Analytics for obvious conversion outliers – by country, browser type, keyword etc. For example, early on we noticed a pathetically low conversion rate from mobile users (~2% vs the site average ~10%), so we promptly excluded mobile browsing from our AdWords target at this stage
  • Experiment with non-obvious traffic sources. I was very skeptical of Facebook ads at first – there is no way they can result in a lower CAC than AdWords, I reasoned. To our surprise, though, Facebook traffic showed signup conversions of 15%-20% (despite the abysmally low CTR), which, at a $0.015 average CPC, gave us a CAC of $0.09 (and almost 300 Likes along the way!). Yahoo and Bing (which I personally never use) are next in line.
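The arithmetic behind these CAC figures is one division: cost per click over signup conversion rate. A minimal sketch using the figures quoted in this section (the ~10% site-average and ~16.7% Facebook rates are back-of-envelope assumptions chosen to match the quoted CACs):

```python
def cac(cpc, conversion_rate):
    """Cost to acquire one signed-up user: spend per click
    divided by the fraction of clicks that sign up."""
    return cpc / conversion_rate

adwords_cac  = cac(0.01, 0.10)    # ~10% site-average conversion -> $0.10
facebook_cac = cac(0.015, 0.167)  # mid-range Facebook conversion -> ~$0.09
```

The same formula explains why the Facebook channel won despite its higher CPC: a conversion rate lift can outweigh a click-price premium.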

4. User feedback – cheap advice

Determined to seek early feedback, we made sure our Uservoice button was prominently displayed on the side. The only problem: not much specific feedback was coming in (we did receive a lot of praise, which, while flattering, did not tell us what we could do better). We felt qualitative interviews or focus groups would be too time-consuming and expensive at this stage, so we decided to integrate a simple, unobtrusively displayed poll into the site. For instance, the ‘Which feature do you value most?’ question drew more than 200 votes in a matter of days (again unexpectedly for us, it was the Calendar feature – 56% of votes – when we expected the Dashboard to win – only 17%). Interestingly, the blog and forum, which took a lot of our time to maintain and moderate, drew only 5% of votes. Needless to say, the blog will not be our priority over the next couple of months.

The one expensive learning – an unlikely conclusion

The most important learning, however, is somewhat at odds with the tips above. The abundance of data and the tyranny of Real-Time Google Analytics made our team slaves to minutiae, often at the expense of the big picture. We would frenetically monitor the results on a daily, even hourly, basis, not unlike obsessive day traders watching live stock prices. This over-focus on TODAY and NOW – the obsession with lowering the CAC by a couple of cents and raising the conversion rate by a percentage point – was extremely time consuming. More importantly, it was detrimental to our ability to distill the more important learnings. Who are our users? Do they love our product? The ones that come back frequently – why do they? The ones that don’t – why don’t they?

In short, I feel we overly focused on the dollar aspect, as opposed to focusing on the learning per dollar. Luckily, we are only four weeks in.