Compare separate Instagram placement vs. Placement Optimization
Objective: run an ad study to check how Placement Optimization performs compared to an Instagram-only or Facebook-only campaign.
Step 1: Create a campaign, then clone it twice so that you have 3 similar campaigns whose only difference is placements: one with Facebook (FB) placements only, one with Instagram (IG) only, and one with both FB and IG. Keep the FB placements consistent across ad sets: for example, if the Right-Hand-Side placement (RHS) is used in the FB-only ad set, it should also be included in the combined ad set. Use an objective that Instagram already supports, such as Mobile App Installs (MAI), Conversions, or Video Views. Target an audience large enough to be split for testing. Ideally, create one set of creatives and use them in each ad set in the same way. The IG actor ID in the creative is ignored by FB placement ads, so a single creative can carry everything needed for both IG and FB.
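The three ad sets above differ only in their placement targeting. A minimal sketch of the placement portion of each targeting spec, using field names from the Marketing API targeting spec (verify them against the current API version; the geo targeting is a placeholder):

```python
# Shared audience targeting (placeholder); only placements differ per ad set.
base = {"geo_locations": {"countries": ["US"]}}

fb_only = {**base,
           "publisher_platforms": ["facebook"],
           "facebook_positions": ["feed", "right_hand_column"]}  # RHS included

ig_only = {**base,
           "publisher_platforms": ["instagram"]}

# The combined (Placement Optimization) ad set repeats the same FB
# positions, including RHS, and adds Instagram.
fb_and_ig = {**base,
             "publisher_platforms": ["facebook", "instagram"],
             "facebook_positions": ["feed", "right_hand_column"]}
```

Everything outside the placement fields (audience, budget schedule, creative) should stay identical across the three ad sets.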
Step 2: Assign 50% of the budget to the Placement Optimization (PO) ad set and split the remaining 50% between the other two ad sets. How to split it depends on how you would allocate budget without PO. For example, if without PO you would split the budget 4:1 between FB and IG due to their audience sizes, you would now give 40% of the whole budget to the FB ad set and 10% to the IG ad set. In the slides we sent earlier, the budget is split evenly between FB and IG, but that split is not required. Due to the split test requirement, each percentage should be at least 10.
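The split above can be computed mechanically. A small helper, with the 4:1 FB-to-IG ratio from the example as the default (adjust it to your own non-PO allocation):

```python
def budget_split(total, fb_to_ig_ratio=(4, 1), po_share=0.5, min_share=0.10):
    """Give po_share of the budget to the PO ad set and split the rest
    between FB-only and IG-only in the given ratio. min_share enforces
    the split-test minimum of 10% per cell."""
    fb, ig = fb_to_ig_ratio
    rest = total * (1 - po_share)
    fb_budget = rest * fb / (fb + ig)
    ig_budget = rest * ig / (fb + ig)
    for share in (po_share, fb_budget / total, ig_budget / total):
        assert share >= min_share, "each cell needs at least 10% of the budget"
    return {"PO": total * po_share, "FB": fb_budget, "IG": ig_budget}

budget_split(1000)  # → {'PO': 500.0, 'FB': 400.0, 'IG': 100.0}
```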
Step 3: Create an ad study with 3 cells. Each cell should have the same treatment percentage as its budget allocation: in the example above, 50 for the PO ad set, 40 for the FB ad set, and 10 for the IG ad set.
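The three cells mirror the budget allocation one-to-one. A sketch of the cells definition (field names based on the Marketing API ad-study cell spec, so verify them before use; the ad set IDs are placeholders):

```python
# One cell per ad set; treatment_percentage matches the budget share.
cells = [
    {"name": "PO", "treatment_percentage": 50, "adsets": ["<PO_ADSET_ID>"]},
    {"name": "FB", "treatment_percentage": 40, "adsets": ["<FB_ADSET_ID>"]},
    {"name": "IG", "treatment_percentage": 10, "adsets": ["<IG_ADSET_ID>"]},
]

# The treatment percentages must cover the whole audience.
assert sum(c["treatment_percentage"] for c in cells) == 100
```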
Compare two bidding types
Comparing different bidding types should be done with ad studies, because that is the only way to run campaigns with identical settings and audiences but different bids without them affecting each other's performance. This example compares bidding optimized for link clicks to bidding optimized for offsite conversions, but the same setup works for any bidding comparison.
Step 1: Create the campaign with bidding optimized towards link clicks.
Step 2: Go into the campaign and clone it using the initial state.
Step 3: Change the campaign's bidding to optimize towards offsite conversions. If you are not using an Automatic bid amount, remember to adjust the bid value to match the new optimization goal. The example picture above set the bid to $10 per link click and $50 per purchase, meaning the bid values are equal if approximately every 5th link click results in a purchase (a 20% click-to-conversion rate).
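The relationship between the two bids is simple arithmetic: the per-conversion bid equals the per-click bid divided by the expected click-to-conversion rate. A one-line helper reproducing the example figures:

```python
def equivalent_conversion_bid(click_bid, click_to_conversion_rate):
    """Bid per conversion that implies the same value as click_bid per
    link click, given the expected click-to-conversion rate."""
    return click_bid / click_to_conversion_rate

# $10 per link click at a 20% click-to-conversion rate,
# i.e. roughly every 5th click converts.
equivalent_conversion_bid(10, 0.20)  # → 50.0, i.e. $50 per purchase
```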
As a result, you should now have two identical campaigns that differ only in bid type.
Step 4: Go to Library → Ad Studies and create a new ad study.
Step 5: Define the required settings for your ad study:
- Name, start and end times. It is recommended to run the test for 2-3 weeks to get enough data for reliable results.
- 2 ad study groups with 50% of the audience each. You can give names to the segments.
- Click on Add Items to select one of the campaigns to each study group.
Step 6: Click on Save and your test will start! Keep monitoring the results during the test. You can always get an overview of your test by coming back to the Library.
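Steps 4–6 boil down to two study cells holding 50% of the audience each, with one campaign per cell. A sketch of that definition (field names based on the Marketing API ad-study spec, so verify them before use; the campaign IDs are placeholders):

```python
# Two cells, one per bidding type; each gets half the audience.
cells = [
    {"name": "link_clicks", "treatment_percentage": 50,
     "campaigns": ["<LINK_CLICK_CAMPAIGN_ID>"]},
    {"name": "conversions", "treatment_percentage": 50,
     "campaigns": ["<CONVERSION_CAMPAIGN_ID>"]},
]

# An even split over the full audience keeps the comparison fair.
assert sum(c["treatment_percentage"] for c in cells) == 100
```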
Predictive Budget Allocation vs. manual budget allocation
Using ad studies to split the target audience is required for meaningful budget optimization testing; otherwise results may be skewed by variables you cannot control. With ad studies and the setup described below, you can run a reliable test that gives you clear, actionable results on which budget optimization method works best for you.
Step 1: Create the campaign you want to test with.
Step 2: Go into the campaign and clone it using the initial state.
Step 3: Keeping other settings the same, change the name of the campaign and enable Predictive Budget Allocation. Set it to optimize towards your ideal goal, but make sure each ad set is expected to generate enough conversions every day for the algorithm to work properly. An FAQ on budget allocation can be found here.
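Before starting the test, it is worth sanity-checking each ad set's expected daily conversion volume. A small helper for that check; the threshold of 25 conversions per day is an assumed placeholder, not an official figure, so substitute the value recommended for your objective:

```python
def enough_daily_conversions(expected_daily, threshold=25):
    """Flag which ad sets are expected to generate enough conversions per
    day for budget allocation to work. threshold is an assumption here;
    use the figure recommended for your optimization goal."""
    return {name: n >= threshold for name, n in expected_daily.items()}

enough_daily_conversions({"adset_a": 40, "adset_b": 12})
# → {'adset_a': True, 'adset_b': False}
```

An ad set flagged False is a candidate for merging with another, or for a broader optimization goal, before the test starts.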
As a result, you should now have two identical campaigns, one of which uses Predictive Budget Allocation and one that relies on your manual optimization actions.
Step 4: Go to Library → Ad Studies and create a new ad study.
Step 5: Define the required settings for your ad study:
- Name, start and end times. It is recommended to run the test for a minimum of 2-3 weeks to get enough data for reliable results.
- 2 ad study groups with 50% of the audience each. You can give names to the segments.
- Click on Add Items to select one of the campaigns to each study group.
Step 6: Click on Save and your test will start! Keep monitoring the results during the test. You can always get an overview of your test by coming back to the Library.