Read the #ASO #ASA insights from the 56 min video in the Twitter thread below
#1
Apple will tell you "do not use search ads data for ASO" because the timeframe of the trend they are reporting is short term. You cannot use such a short timeframe for ASO because it takes 2-3 weeks to get your keywords fully indexed.
#2
Apple will still index you for 2 to 3 weeks after an update. If you have good rankings already, you can do keyword metadata optimization every 30 days and have a solid view of how an optimization performed.
#3
Paid marketing can support organic success but organic success is not a pay-to-play model. If you do not set the foundation with ASO then you won't get volume for ASA.
#4
The Apple Search Ads algorithm does crawl descriptions. So even though your description does not improve ASO, it can improve ASA.
#5
For new apps or during major updates, ASA can help achieve 2x-3x indexation speed for keywords (it provides additional CTR data).
#6
With a new app in a high-volume category you will need click data first: the goal is to build relevancy. Focus on a core set of keywords, find related phrases (they index faster), target them and build as large a footprint as you can.
Then double down on ASA for top keywords.
#7
For an established app, don't be afraid to go after top keywords. Look at the themes where you're already ranking well (top 10-20) and start building out a broader footprint.
#8
If you have a perfect keyword metadata optimization but you're struggling to get above the top 20, it's not a CTR problem you have, so think about conversion optimization.
#9
Play Store: what are the core keywords you're targeting, and what are the phrases related to these keywords that have a similar audience? Look at placements (title, short description, description). You don't want to feed Google only keywords; you want to feed it context too.
#10
Volume and velocity are not going to have a huge impact on your search rankings, but they may have an impact on indexation time because Apple and Google will crawl your app faster.
#11
If you're A/B testing on the Google Play Store, the test runs on traffic across all sources. Once you find a winner and choose to deploy it, look at every channel through your MMP to see how conversion has been impacted and get some channel granularity.
#12
The reason user-facing metadata tends to perform best is that users searching for these keywords see the actual keywords and convert better, leading to better rankings.
#13
The format of your Google Play description can be more important than where you place keywords: inside a fixed-place paragraph vs. somewhere that is easier to crawl. You want to balance user experience and how easy the description is for Google to crawl.
#14
Even for very large apps, still change your selection of keywords every 30 days, before Apple/Google apply their linear degradation to the value they see in older keywords. If you stop making incremental changes while competitors keep making them, Apple/Google will see you as less relevant.
#15
Usually, if you convert very well for a keyword and then remove it, you will continue to convert well for it. You can/should also back that up with some ASA campaigns.
In yesterday's #growthgems newsletter I shared insights from the recent "Understanding, Optimizing and Predicting LTV in Mobile Gaming" GameCamp webinar.
Read the 15 #LTV gems from this 100 min video in the Twitter thread below
1/15
Chart your creatives on the X axis and their D3/D7/D28 ROAS on the Y axis to quickly spot outlier creatives (both good and bad) so you can act on it (by reallocating spend, for example).
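A minimal sketch of that kind of chart and outlier flagging, assuming pandas/matplotlib; the column names and numbers (creative_id, roas_d3/d7/d28, the 40% cutoff) are illustrative, not from the video:

```python
import pandas as pd

# Illustrative data: one row per creative with spend and ROAS at several horizons.
creatives = pd.DataFrame({
    "creative_id": ["video_a", "video_b", "static_c", "playable_d"],
    "spend":       [12000, 8000, 5000, 3000],
    "roas_d3":     [0.18, 0.09, 0.22, 0.04],
    "roas_d7":     [0.35, 0.17, 0.41, 0.08],
    "roas_d28":    [0.80, 0.42, 0.95, 0.15],
})

# Bar chart: creatives on the X axis, the three ROAS horizons as series.
creatives.plot.bar(x="creative_id", y=["roas_d3", "roas_d7", "roas_d28"])

# Flag creatives whose D7 ROAS sits far from the portfolio median
# (+/-40% is an arbitrary cutoff; tune it to your own spread).
median_d7 = creatives["roas_d7"].median()
creatives["outlier"] = (creatives["roas_d7"] - median_d7).abs() / median_d7 > 0.4

# Good outliers are candidates for more spend, bad outliers for pausing.
print(creatives.sort_values("roas_d7", ascending=False))
```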
2/15
A benchmark comparing ROAS (e.g. D7/D28) for each cohort week (X axis) against success thresholds allows you to evaluate your UA strategies.
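A short sketch of that weekly benchmark, again with pandas; the week labels, ROAS values and thresholds are assumptions for illustration:

```python
import pandas as pd

weekly = pd.DataFrame({
    "cohort_week": ["W01", "W02", "W03", "W04"],
    "roas_d7":     [0.26, 0.31, 0.28, 0.35],
    "roas_d28":    [0.61, 0.74, 0.66, 0.82],
})

# Success thresholds you believe lead to payback (illustrative values).
D7_TARGET, D28_TARGET = 0.30, 0.70

weekly["d7_on_track"] = weekly["roas_d7"] >= D7_TARGET
weekly["d28_on_track"] = weekly["roas_d28"] >= D28_TARGET

# Weeks below threshold point at UA strategies/creatives worth digging into.
print(weekly)
```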
Read the #UA #creatives insights in the Twitter thread below
#1
HopSkipDrive uses the same brand feeling and emotional value across channels but what might differ is the goal of each campaign. Example: Twitter for retargeting/re-engagement and Facebook for conversion.
#2
Try to take a portfolio-based approach to creative testing and understand that you're adding something new to the mix. Some channels do not have a "clean" way to A/B test (e.g. TikTok), and even when there is one, it might not tell you what will happen "live".
15 mobile #growthgems from "Mobile Marketing for Card Games" with Josh Chandley (Wildcard Games) @eniac, Jonathan Lau (Weee!) and @jokim1
Watch this 52:14 discussion directly... or start by reading the main #mobilegrowth #UA #creatives insights in the Twitter thread below
#1:
A way to test creatives at a lower cost is to start in India with a small budget to filter them out first, then move to Canada before deploying the best-performing ones in the US.
#2:
For games with strong organic demand, it makes sense to pay attention to ASO, and also to potentially try preloads on devices as well as low-cost/low-quality channels. Examples: Gin, Rummy, Solitaire, Tetris.
#1
Tag the IDs within each ad creative: not one ID per creative but an ID for each "element" (light/dark background, team ID, player offer, CTA, etc.). Add granularity of tagging for each concept so you can really understand what drives results. [DraftKings]
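A rough sketch of what element-level tagging can unlock, using a made-up naming convention (not DraftKings' actual scheme): encode each element in the ad name, then aggregate performance per element rather than per creative.

```python
import pandas as pd

# Illustrative ads whose names encode one "key-value" tag per creative element.
ads = pd.DataFrame({
    "ad_name": [
        "bg-dark_team-nyk_offer-freebet_cta-signup",
        "bg-light_team-lal_offer-freebet_cta-signup",
        "bg-dark_team-lal_offer-deposit_cta-download",
    ],
    "installs": [420, 310, 150],
    "spend":    [900, 700, 500],
})

# Split each tag out of the ad name into its own column (bg, team, offer, cta).
tags = ads["ad_name"].str.split("_", expand=True)
for col in tags.columns:
    parts = tags[col].str.split("-", n=1, expand=True)
    ads[parts[0].iloc[0]] = parts[1]

# Cost per install by background element: the kind of cut a single per-creative ID can't give you.
by_bg = ads.groupby("bg").agg(spend=("spend", "sum"), installs=("installs", "sum"))
by_bg["cpi"] = by_bg["spend"] / by_bg["installs"]
print(by_bg)
```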
#2
Yousician did a live op with paid content unlocked (e.g. Jimi Hendrix songs), and people playing these songs for free for a limited time ended up converting (subscribing) at a 40-50% higher rate. [Yousician]
10 mobile #growthgems from "How We Scaled from 80k to 6m Users" by Jordan Gladstone (Director of Marketing at Dave - Banking App) at @apppromotion APS WFH USA
Twitter thread below
#2
Then start filming other people to be able to show the app more. It helps you figure out which messaging works, as well as test different thumbnails (e.g. empty gas), showing the app or not (showing the app works better), actors, locations, etc.
#3
Something that has worked well for Dave is responding to Facebook comments on their ads "as Dave" (they now have a style guide with what Dave likes). It makes the ads look more organic and creates more engagement, leading to lower CPMs.
#1
There are very different kinds of subscriptions, and the retention tactics/levers are entirely different for each subcategory:
- Content - music, books, video, comics, news, etc.
- Motivation - health & fitness, dating, education
- Utilities - video editors, Dropbox, etc.
#2
Subscriptions align users' and developers' interests. People will keep paying if you keep providing them value.
Example of Evernote: as you use it more, it becomes even more valuable, and that's when the subscription model makes the most sense.