Okay, it’s time.
You know about the launch craziness, and the ensuing retreat.
You also know about the spectacular numbers: $294,865 in revenues generated in just 25 days in January.
Now it’s time for us to pull back the curtain, and share the nitty-gritty on exactly how we did it all.
The good, and the bad; what went right, and what went wrong.
So buckle up – this is going to be legen… wait for it… dary! 😉
Our Launch Process from 30,000 Feet
On the face of it, our launch process was pretty straightforward:
- Our affiliates received a carefully structured package with emails that they could customize to promote the webinar.
- Anyone clicking on the links in those emails would land on a webinar registration page.
- After they registered for the webinar, they received several “warm-up” emails helping them to get to know us, and reminding them to attend the webinar.
- They would hopefully attend the webinar, enjoy the content, and in many cases, buy.
- After purchasing, they would have the option of customizing their offer with a few up-sells.
- If they missed the webinar, they would be invited to attend a live replay.
- After the fact, there would be a few follow-up emails reminding them about the offer, in case they wanted to sign up.
Take all of that, multiply it by 30 affiliates, add a bit of extra bonus stuff, and that’s the story of our launch. 😉
What Went Well: Content, Follow-up, and Planning
Of the many things that we tried, a few were unequivocal successes.
Topping the list was the quality of the content that we had put together; launching a product is always easier when it’s a great product that you’re launching. 😉
It also helped that this wasn’t our first spin on the merry-go-round; we knew what had worked in the past – for us, for our audience, and for our offers:
- We knew that email is a good way for us to engage and connect,
- We knew that webinars work very well for us, both in terms of relationships, and sales,
- And we knew what we appreciated seeing in other people’s campaigns, and that gave us some ideas, too!
So we built a strong launch plan based on the stuff that we knew would work for us: a live webinar for each and every affiliate, with warm-up and follow-up emails before and after, respectively.
We planned it all out well in advance, which was critical (relationships were a big part of getting so many great partners onboard, but approaching them months in advance made a big difference, too!).
Of course, just as we had some unequivocal successes, we also had some big glaring failures…
What Went Badly: Technology Issues and Overload
These were things that just went badly, period.
Thankfully, there weren’t *too* many items on this list, but there were definitely a few: general overload, human error, and technology glitches.
Yes, there was the overload of student intake, but there was more to it than that:
We had just packed the schedule waaaay too tightly; the launch ran for 25 days, but if you don’t count weekends or other quiet days, it was 30 webinars spread over just 13 days, which works out to an average of over 4 hours of live presenting per day (the worst was the day that I delivered four webinars, three of which were back to back).
That was just way, way too much, but somehow when we were planning it, it seemed like it wouldn’t be that big of a deal.
Maybe this was just the Dopeler effect in action?
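For the curious, the schedule math above works out like this (the two-hour webinar length is an assumption for illustration; the post only gives the totals):

```python
# Rough schedule math for the launch calendar.
# WEBINARS and ACTIVE_DAYS come from the post; HOURS_PER_WEBINAR is an
# assumed typical webinar length, used only for illustration.
WEBINARS = 30
ACTIVE_DAYS = 13          # launch days, excluding weekends and quiet days
HOURS_PER_WEBINAR = 2.0   # assumption: each webinar ran about two hours

avg_hours_per_day = WEBINARS * HOURS_PER_WEBINAR / ACTIVE_DAYS
print(round(avg_hours_per_day, 1))  # ≈ 4.6 hours of live presenting per day
```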
Yeah, human error was definitely an issue – not because of the quality or attention of my team, but just because of the sheer complexity of the launch structure; for each affiliate partner, there were half a dozen email templates, each with several customized links, plus close to a dozen emails of our own, webinars and replays to set up, links to program, and more… multiply this by thirty partners, and you arrive at well over 30,000 opportunities for someone to make a mistake.
Yes, that’s right: 30,000. “Holy crap” is right!
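If you’re wondering where a number like that comes from, here’s a hypothetical back-of-envelope version of the count; every per-item figure below is an illustrative assumption, not our actual checklist:

```python
# Hypothetical estimate of "opportunities for error" in the launch setup.
# The affiliate count is from the post; all per-item counts are assumptions.
AFFILIATES = 30

# Assumed items that had to be configured correctly for EACH affiliate:
items_per_affiliate = {
    "affiliate email templates": 6,      # half a dozen templates...
    "custom links per template": 6 * 4,  # ...each with several customized links
    "our own emails": 12,                # close to a dozen emails of our own
    "webinar + replay setups": 2,
    "program links and misc": 8,
}

# Treat every field in every item (subject line, body copy, tracking link,
# send time, list segment, etc.) as a separate chance to get something wrong:
FIELDS_PER_ITEM = 20  # assumed

opportunities = AFFILIATES * sum(items_per_affiliate.values()) * FIELDS_PER_ITEM
print(opportunities)  # ≈ 31,200 – comfortably over 30,000
```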
The actual error rate was very low, and a testament to how great of a team I’ve got (especially Amanda, who dealt with most of this) – but we really need a system that we can implement with fewer opportunities to get something wrong.
Now, the third item is a bit long, and it kind of morphed into a review of Office AutoPilot:
Technology Glitches (a.k.a. Office AutoPilot review)
We had a lot of technology glitches. A LOT.
And almost all of them could be traced back to a single system that we were using: Office AutoPilot.
Now, I don’t want to be unfair, because OAP did a lot of things right:
- It allowed us to do sophisticated “if-this-then-that” email marketing that we just couldn’t do with other platforms.
- It allowed us to keep track of everything in one place, rather than have to coordinate between a handful of different systems for email marketing, transaction processing, affiliate management, and even content access management for our students.
- Their support team was very helpful in teasing out and solving a lot of the issues.
In spite of all this, though, there were *a lot* of things that went wrong; lots and lots of glitches, and several features that just didn’t work as advertised (topping the list was their built-in split-testing technology: at the end of the launch we discovered that, for a variety of reasons, just about all of the split-test data was completely useless – and it’s not like we haven’t done split-testing before).
Now, when I realized just how badly things had worked out (I practically blew a fuse when I found out that all of our split-test data was corrupt, especially since all the problems would have been avoided if we had just used our tried and tested Visual Website Optimizer), I reached out to the CEO of OAP through a friend, who put me in touch with their Head of Client Experience, and we got on the phone.
She heard me out, and was very empathetic, with promises to investigate, etc., but ultimately:
- She denied knowledge that the feature-set in question was buggy, but in our dealings with their support team, it became pretty clear that they knew about it (this is an extract from an email that was sent to her elaborating the problem: “Throughout the launch, it seemed that data was accurate some of the time, but other times it was wildly off. Last week we were having problems with our payment gateway, and I got on the phone with one of your helpful techs and we set up a new form and landing page together while I was on screenshare. After having created it, the page data immediately displayed a 8200% conversion rate. I asked if we could stop for a sec and address why a brand new form and page was displaying such an inaccurate number. I was told that this was an issue he had seen before and that they were working on fixing it for the new version of OAP set to be released soon.“).
- She suggested that what we were trying to do was too complex, and that we should have had more lead time to set it up, and reached out to their concierge services. That being said, we started the setup a full two months before the launch, and had a person working on it pretty much full-time. We also reached out to their concierge services, willing to pay for some things to be set up, and were told that they were busy working on a systems update.
- I got the impression that any issues would be corrected as part of the update they’ve been working on, which was supposed to be happening “imminently”, but further exploration through the grapevine of my colleagues in the marketing world revealed that this systems update has been “imminent” for a very, very long time (i.e. over a year).
- She said that she would investigate further, and get back to me – but as I write this (three and a half weeks after our phone conversation), I haven’t heard anything back. Update: After giving the OAP team a heads-up about this review a few days before publishing it, they did get back to me, and start putting some energy into untangling why everything had gone so badly. It seems that at least part of the issue was a configuration problem on their end, and correcting it should have fixed the issue (a month too late), but by that time we had made page modifications (since the data was useless anyway), and so even the semi-recovered data was corrupted. Oh well.
Not impressive or confidence-inspiring, to say the least.
Now, to be fair, it is completely possible, and probably even likely, that *some* of the problems that we had were the result of the above-mentioned human error – but not all of them. Not even close.
And also, to be completely fair, the feeling that I get from my interactions with the people at OAP is that they’re hard-working and well-meaning people that are just seriously overloaded and under-resourced; a feeling to which I can absolutely relate.
But when it happened to us, we were up-front about it; we communicated what was really going on, and took steps to fix it quickly. Other than a phone call with their head of Client Experience (which was on short notice, and I appreciated), I haven’t seen any of that.
Now, the fact is that we’re still using OAP, and we aren’t planning on switching; partially because we really do like the feature-set (at least the part of it that works the way they say it does), partially because we’ve already invested so heavily in moving everything into their system, and partially because InfusionSoft, the only major competitor, is reputed to be even worse.
And I even recommend OAP to businesses that have grown over a certain size and complexity, like we have. But unfortunately, rather than being able to say “OAP is great, go for it”, the best I can say is that “OAP is probably your best bet, but plan for a long and painful migration, test everything carefully, and above all else, buyer beware.”
So that’s our Office AutoPilot review – if you have a story of your own, let us know about it!
There was one last thing (separate from OAP) that felt like a small-scale disaster, but it really deserves a post of its own, so look for that early next week. 😉
So some things went well, and others went badly, but on top of all that, there were also a bunch of things that started off with a whimper, but ended with a bang.
Those were the things that we improved during the launch…
What We Iterated: Webinars, Replays, and Up-Sells
Part of the reason why we opted for such an intense launch calendar (as opposed to a rolling launch over the course of months, for example), was that it would allow us to rapidly test a lot of things, iterate, and improve.
(Too bad about the split-test data…) :-S
This was key, because there were a number of things that ended up being very important to the launch, that didn’t work well at all right out of the gate.
Now, the list of things that we improved between the start and end of the launch is huge, but there are three big things that definitely deserve mention: our webinar, the replays, and the up-sells.
At the core of our launch was our webinar, titled “The Brutally Honest Truth about what it’s REALLY Going to Take to Build a Thriving Audience and Business Online in 2013”.
To say that this webinar was a cornerstone of the launch would be an understatement; between the live presentations and the replays, it was attended by over 3,000 people, and it was responsible for the *vast* majority of our sales.
So I guess it should come as no surprise that we worked very hard to improve it throughout the launch!
And I mean *very* hard; even though the webinar was carefully built, structured, and rehearsed before the launch, it went through pretty significant changes with almost every delivery for the first half of the launch, and more changes with almost every presentation thereafter.
Now, our webinar did pretty well right from the beginning, but there was definitely room for improvement:
- We got complaints about taking too long to get to the point,
- We saw confusion around a number of things presented on the webinar, and
- While conversions were decent, they weren’t great, or particularly consistent.
By the end of the launch, though, it was down to a science; response was consistently excellent, and conversions were off the charts.
The lesson here, though, is that no matter how experienced you are, creating a successful webinar takes time, and lots of it!
The second thing that went through a lot of iterations was the way we delivered replays for people who didn’t make it onto the live call.
In the past, with Write Like Freddy, we would just post a recording for people to see whenever they wanted – but this wasn’t working very well for us.
Here’s the pattern that we noticed:
On the one hand, the people who showed up live got a TREMENDOUS amount of value out of the sessions – they learned a ridiculous amount of stuff that they could apply to their business right away, and I would often get emails weeks or months later telling me about the amazing successes they’d had doing what I taught them.
On the other hand, lots of people would *say* that they want the recording, but the numbers show that the recording hardly ever gets watched; people save the link as something they’ll look at “later”, and never really get to it.
Which is why we decided not to offer recordings, period.
Instead, we would offer a live replay, broadcast at a specific point in time (which we did using Stealth Seminars).
At first, we just scheduled the replay for two days after the live event, and attendance on the replays was terrible. Clearly, that wasn’t working.
We noticed, though, that when we offered replays on the weekend, attendance was much, much better (this wasn’t the result of a planned test; we just had webinars scheduled for Thursdays, so two days later was a weekend).
So that’s what we started doing, and continued doing throughout the launch.
Now, this is better, but still not ideal; there are still a lot of people for whom this is really inconvenient, whether it’s because of busy schedules, choppy internet connections, or something else.
So we’re still trying to figure out what to do instead:
- Should we have half a dozen replays, at various times throughout the weekend?
- Should we just put a recording online for a window of two days?
On the one hand, this would seem to be more convenient, but on the other hand, the more time we make it available for, the fewer people end up watching it (according to our statistics).
So this has gotten better over the course of the launch, but there’s still a lot of room for improvement.
We’re working on it, and open to suggestions if you have them!
The third thing that changed (and improved) a lot from the start of the launch to the end of it was the up-sells.
Before I talk about the changes, let me clarify a couple of things:
First, what is an up-sell? An up-sell is an offer that is an add-on to whatever you’re selling. It isn’t necessary, but it does add value to some of the people who buy the main product. The up-sells that we offered were around extra support in the training program; stuff that absolutely isn’t mandatory, but some people really want.
Second, an up-sell is *not* the same as a one-time offer (OTO). One-time offers are meant to capitalize on loss aversion during a sales process; it is made clear that this is the only time that you can get this special offer, and if you don’t sign up now, you’ll miss out. Our up-sells were NOT one-time offers; if a student realizes further down the line that they want to take advantage of the extra support services, they absolutely can.
Now, over the course of the launch, we experimented with five different up-sell offers that we thought would interest our students, and initially, NONE of them were very well received. In fact, of the five, only one of them converted *at all* right out of the gate, and not particularly well.
We ended up going through four major iterations of the ways that we presented the up-sells, and which we offered.
By the end of the launch, we had dumped three of the five up-sells, and substantially improved conversions on the remaining two. Our feeling is that one of them is pretty much where it should be, whereas the other (our out-of-this-world homework review service) is still pretty heavily under-utilized. We know that the offer is solid and valuable, which means there’s something important that we need to fix about the presentation.
We’re working on it – you can’t fix everything over a 25-day launch… 😉
How Did We Do? (Our Goals for the Launch)
So how did we do?
On the face of it, the results sound very good; almost $300,000 in revenue generated, 450+ new students, etc.
But is that actually good? Just okay?
It depends on our goals; so let’s explore our three big goals for the launch, and see how we did as measured against each of them.
Big Goal #1: Critical Mass of Students
Our first big goal for the launch was to bring a big group of students into the Audience Business Masterclass.
This wasn’t just a case of “the more sales, the better” – since the program involves group work through the alumni network (and other venues), having a certain critical mass was essential for the program to function properly. So before the launch, we had three possible scenarios in mind:
- Disaster: Bring in fewer than 200 students
- Reasonable: Bring in 200-400 students
- Spectacular: Bring in over 400 students
We ended up bringing in over 450 new students, which is well within our “spectacular” scenario – so on that front, the launch was definitely a success.
(Actually, it turns out that our expectations were off, and we were a little over capacity with the on-boarding of the students that we took on – lesson learned!)
Big Goal #2: Financial Runway
The second big goal was around building a financial runway for our business, and that meant a minimum number in terms of revenue generated.
Here’s what it boils down to: like any business, Mirasee has regular expenses; stuff like:
- Compensation for the great people who work here
- Basic technologies like hosting, email list management, business automation, webinars, etc.
- Prizes for people who participate in our contests
- Investment in future projects geared to changing the world 😉
All of these expenses come together to make up our burn rate – the money that we have to spend every month just to “keep the lights on”.
Now, because tuition to the Audience Business Masterclass is broken into six monthly installments, we would either generate enough sales to be covered for six months, or for none at all, and the magic number for us was around $100,000 net; if we generated $100,000 in sales (after paying affiliate commissions), we’d be set for the next six months.
Done and done. Check.
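Here’s that runway math as a quick sketch; the gross revenue, the roughly $110,000 in affiliate commissions, the $100,000 net target, and the six installments are from the post, while treating the target as exactly six months of burn is a simplification:

```python
# Rough runway math for Big Goal #2.
# Figures are from the post; assuming the $100k target maps cleanly onto
# six months of expenses is a simplification for this sketch.
gross_revenue = 294_865          # total launch revenue
affiliate_commissions = 110_000  # roughly, paid out to affiliates
net = gross_revenue - affiliate_commissions

MONTHS_COVERED = 6               # tuition arrives in six monthly installments
runway_target = 100_000          # net needed to cover six months of expenses
implied_burn_rate = runway_target / MONTHS_COVERED

print(f"net: ${net:,}")
print(f"runway target met: {net >= runway_target}")
print(f"implied monthly burn: ${implied_burn_rate:,.0f}")  # ~$16,667/month
```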
Big Goal #3: Solid and Optimized Process
Our third big goal for the launch was to use all of the rapid iterations of the webinars to optimize our funnel.
We wanted to get to the point where we would have a process that we could deploy quickly and easily on an ongoing basis, that would bring lots of new students into the program. That would include:
- Email templates that we could give affiliates that would consistently get lots of opens and clicks
- Webinar registration pages that consistently convert the majority of visitors to registrants
- Pre-webinar emails that would build engagement and create consistently high live attendance
- A webinar presentation that would consistently leave *everyone* highly satisfied, and convert a large portion of attendees into students
- An up-sell offer funnel that would consistently get everyone who needs an add-on to sign up for it, without “catching” anyone for whom the offer isn’t appropriate
- A webinar replay setup that would consistently get many of the people who missed the webinar to watch and learn from it (and, by extension, convert to students)
Did you notice the key word that kept repeating? The word is consistent!
Now, on this front, we did well, but could have done better – mostly because of all the lost testing data from all the Office AutoPilot glitches.
So all in all, we’re looking at about two and a half goals met out of three. Not bad!
The Results + 7 Lessons Learned
All in all, the results of the launch were fantastic;
- We enrolled over 450 students into the Audience Business Masterclass
- We generated a grand total of $294,865 in revenues
- Of that, over $110,000 would be paid to our wonderful affiliates (thank you!)
- We met two and a half of our big goals, and are well on our way to meeting the third
And on top of all that, we learned some very important lessons that will serve us well in the days ahead, and that we hope will be valuable to you, too:
- Launch with a great product. I can’t emphasize this strongly enough; without a great product, you aren’t likely to have much success at all, and any success that you do have will be short-lived.
- Play to your strengths. We knew that personalized webinars and email marketing were good strategies for us, and we used them heavily. On the other hand, the massive, tons-of-affiliates-pointing-to-3-videos launches *haven’t* worked well for us, and so we didn’t even bother trying that angle.
- Extra effort pays off. There were a lot of places where we could have cut corners, but at the end of the day, the extra effort really does pay off. You might be exhausted by the end of it, but it’s still worth it.
- Budget for more work than you expect. Again, I can’t emphasize this enough. The further out we plan, the more we tend to under-estimate how long things will take.
- Automate everything you can. Increased complexity inevitably leads to more mistakes; that’s just part of being human. The solution isn’t to try to “concentrate harder” to get it right, but rather to build systems that eliminate the opportunities for mistakes to be made.
- Test everything. Especially if you’re working with a technology that you haven’t had extensive experience with. Never take any feature-set or functionality for granted; until you’ve seen it work yourself, with your own eyes, don’t trust it.
- Expect to iterate. No matter how good you are, or how hard you work, you still aren’t going to get it all right on the first try, so expect to iterate and improve as you go.
Phew, great results, and great lessons learned – what could be better? 😉
Okay, over to you – was this helpful? Did you learn something that will be useful to you in your own business? Is there a lesson for us hidden in this post, that we may have missed?