Growth Hacking Native Mobile Apps Using Decision Management

by Rob Salerno

Latest Trends in Growth Hacking Native Mobile Apps

As a digital marketing consultant, I often get asked about Native Mobile App A/B growth hacking strategies to increase engagement, conversion and brand awareness.

The general premise of these discussions is that companies large and small want to increase revenue through mobile by improving their marketing automation and decision management (IBM already has this figured out).

In other words, you can move the needle through the mobile channel by anticipating the next best action with your customer. Ideally, every interaction with your customer is fruitful, so that you can perfectly time things like push notifications, offers, email reminders, customized messages, or product updates in sync with your customers’ needs.

So how do you solve this engagement dilemma? Mobile A/B Testing of course.

To get started, here are some quick recommendations to consider:

1. Choosing the right platform. Deciding on which Mobile A/B Testing SaaS provider is right for your needs depends greatly on your company or industry. Most CROs know the list: Apptimize, Leanplum, Taplytics (iOS only), Tune, Splitforce (now owned by Localytics), Swrve, and perhaps Optimizely.

Personally, I’m provider agnostic. I see value in all of the offerings.

In a healthcare market like Nashville (where I’m based) or Pittsburgh (where I went to school), for example, many of the apps are tied into wellness programs and thus rely on integration with products like the Apple Watch, so be sure to choose a provider that supports iOS in that case. To be certain, though, analyze your customer base and determine whether the operating system matters. If it does, test against the largest population of users.

Another factor can be the talent resources of the development team. Android apps are typically built in Java, while iOS apps are built in Objective-C or Swift. Cross-platform frameworks exist for both, but I would weigh the skills of your development team carefully before choosing a vendor.

2. Mobile Web, Native or Hybrid. OK, let’s face it: Hybrid is probably not a good option for A/B testing on mobile. Industry averages say that 25% of your app download audience will never return. I’d take that a step further and bet that over 90% of that 25% never even opened the app once. It makes you wonder why (if not by accident) someone bothered to hit the download button in the App Store at all.

So my recommendation is to A/B test on Native only, since Hybrid screens appear sluggish during page and state transitions. This impacts both the Control and Challenger recipes. With slow page loads you will see high bounce rates, and the user won’t ever see your test.

Mobile Web isn’t bad if you have no other option and there is server redundancy behind a CDN like AWS CloudFront or Akamai. However, standalone mobile web experiences are becoming rarer by the day, as most desktop experiences are now responsive to mobile devices and tablets; not to mention, you have limited control of the user inside the browser and possibly a slow connection (e.g., a bad WiFi connection). So you face the same problem as Hybrid. All of which can make Mobile A/B Testing less appetizing.

3. User Segmentation. This is second nature for most CROs, but if it’s your first time running a Native Mobile app A/B test, try to leverage your Web Analytics to segment new users from returning users. For new users, the typical goal is account creation and a lift in MAUs (Monthly Active Users).

Tests for new users should focus on making it as painless as possible to create an account and to use at least one feature on a recurring basis. If only one of those two things is true, engagement will drop and the test result won’t matter.

If you succeed at both, however, you will see a lift in MRR (Monthly Recurring Revenue) from higher engagement and the user’s higher propensity to take action (e.g., get an insurance quote, buy a product or service, refer friends).

It’s crucial to capture the hearts and minds of the user during the first 7 days after the app download, because after a week or so, if the app doesn’t cut it, user engagement basically drops to zero.
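One lightweight way to wire up this kind of segmented test is deterministic bucketing on a stable user ID. The sketch below is illustrative only: the segment fields, thresholds, and 50/50 split are my assumptions, not any particular vendor’s API.

```python
import hashlib

def variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user into Control or Challenger.

    Hashing (experiment + user_id) keeps assignment stable across
    sessions without storing any state on the device.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform-ish value in [0, 1]
    return "challenger" if bucket < split else "control"

def eligible_new_user(sessions: int, days_since_install: int) -> bool:
    """Illustrative segment filter: target users inside the critical
    first week who have not yet built a usage habit."""
    return days_since_install <= 7 and sessions <= 1

# Example: only new users enter the (hypothetical) onboarding test.
if eligible_new_user(sessions=1, days_since_install=2):
    arm = variant("user-42", "onboarding-v2")
```

The hash-based assignment also means the same user always lands in the same recipe, which keeps your Control and Challenger populations clean.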

4. Next Best Action Decision Management. As mentioned above, a lot of CROs preach Mobile A/B Testing best practices built around Next Best Action decision making. The idea here is to leverage predictive analytics and Big Data to determine when and what to communicate to your end consumer.

With so much Marketing Automation (Marketo, HubSpot, Adobe Campaign, Salesforce, etc.) already happening via email and even robotic outbound calling, it’s best to create test conditions that will yield the most fruitful results.

Depending on the user’s place in the conversion funnel, start by testing the high-value, high-likelihood-to-convert population first and work your way down.

So leverage Decision Management techniques to make that happen.
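A minimal sketch of that prioritization, assuming you already have per-segment value and conversion-propensity estimates from a predictive model (the segment names and numbers below are made up for illustration):

```python
# Hypothetical segment data: (name, avg order value, predicted conversion rate)
segments = [
    ("browsed-only",     40.0, 0.02),
    ("added-to-cart",    60.0, 0.15),
    ("requested-quote", 120.0, 0.30),
]

def expected_value(avg_value: float, p_convert: float) -> float:
    """Next-best-action score: value weighted by likelihood to convert."""
    return avg_value * p_convert

# Test the highest expected-value population first, then work down the funnel.
priority = sorted(segments, key=lambda s: expected_value(s[1], s[2]), reverse=True)
```

Here the quote-requesters top the list, matching the advice above: start where value and likelihood to convert are both high.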

A/B testing users when they are truly engaged and ready to take action provides a better ROI than tests designed to get them to engage in the first place.

Consider the business goals first, though, before making that determination, because getting new users to engage may in fact be the bigger problem to address, despite the smaller ROI.

5. Increase User Engagement before Increasing User Acquisition. I hear this a lot: “Get me more installs. Drive up MAUs. With more MAUs, I’ll see more growth.”

Let’s address this right off the bat: more app downloads doesn’t equal more growth.

MAU is a monthly metric that can be deceiving. Driving users to download the app to save money on a product or service is a good example. I know I’ve downloaded the Papa Murphy’s app for this very reason (my kids love their pizza). The problem is that it may be, and most likely is, a one-time event: I downloaded their app and probably never used it again.

Mobile advertisers will spend tons of cash driving users to download their apps with a promotion (say, on Facebook, the largest mobile advertising source) and then lose them just as fast, in a matter of days.

It also hurts Mobile Advertisers when “*Facebook forced third-party measurement companies to strip key device-level attribution data away from the other meta-data they provide to advertisers, essentially making it impossible for advertisers to retarget individuals who have taken certain actions.”

So what then should be the focus?

Look at DAU (Daily Active Users) and see if there’s a trend in engagement. If the majority of your users came from an acquisition campaign such as the Papa Murphy’s promo but then disappeared, the problem isn’t getting them to download the app but getting them to use it.

If you A/B test improvements to the app itself and see what moves the DAU number upward, you will see a rise in growth. DAU factors in time more effectively because it looks at engagement by day rather than by month, which smooths out outlier events like a promo.
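As a rough illustration, DAU can be computed straight from an event log of (user, date) pairs; the sample data below is invented to show how a one-day promo spike inflates the monthly number while the daily trend tells the real story:

```python
from collections import defaultdict
from datetime import date

# Hypothetical event log: (user_id, activity date)
events = [
    ("u1", date(2024, 5, 1)), ("u2", date(2024, 5, 1)),  # promo-day spike
    ("u3", date(2024, 5, 1)), ("u4", date(2024, 5, 1)),
    ("u1", date(2024, 5, 2)),
    ("u1", date(2024, 5, 3)),
]

def dau(events):
    """Unique active users per day."""
    by_day = defaultdict(set)
    for user, day in events:
        by_day[day].add(user)
    return {day: len(users) for day, users in sorted(by_day.items())}

def mau(events):
    """Unique active users across the whole period."""
    return len({user for user, _ in events})

daily = dau(events)    # May 1: 4 users, then only 1 per day after the promo
monthly = mau(events)  # 4 — the promo one-timers are counted all month long
```

MAU says four active users; DAU shows that three of them vanished the day after the promotion, which is exactly the drop-off the monthly metric hides.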

While these tips are just a few considerations to get started, there are many more optimization practices available.

* Source: VentureBeat


