Atomic Media

Archive for the ‘seo news’ Category

Meta AI adds Google Search results

Monday, April 22nd, 2024

Meta AI answers may now include Google Search results, making it the first AI assistant to include search results from both Google and Microsoft.

That said, I’ve yet to find a query that triggered Google Search results. Meta AI launched with Microsoft Bing integrated in September.

Why we care. Gartner predicted search traffic will fall due to the rise of AI chatbots, virtual agents and AI answer engines providing direct answers, as opposed to having to click on a website to find an answer. However, the models (like Meta’s Llama 3) don’t have access to real-time data – so it makes sense for Meta to incorporate information from search engines.

How Meta AI works. You can now enter searches in the search bar of the Facebook, Instagram, WhatsApp and Messenger apps. When Meta AI includes search results, they appear beneath the answer as a tappable Sources link. Tapping that link brings you to the web but keeps you within the Meta app.

Meta AI will also appear in the main Facebook feed.

What it looks like. Here’s an image Meta shared of the search experience:

[Image: Meta AI search experience]

The deal. There's not a "ton of money" flowing either way, Zuckerberg told tech rag The Verge. He was also asked about Google outsourcing search to another AI assistant despite having Gemini. Meta also announced the launch of a Meta AI website, which makes Meta AI available on computers for the first time.

Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Airbnb dominates in search sentiment report

Monday, April 22nd, 2024

Airbnb is the leading advertiser in search sentiment, closely trailed by ViacomCBS and Netflix, according to a new analysis of 100 top advertisers.

Why we care. Looking at search sentiment can help marketers better understand how consumers feel about them in real time and what contributes to brand loyalty and sustainability. Airbnb made a significant shift from search marketing to brand marketing. Now that it has taken the lead in this search sentiment report, other advertisers may be curious to go down the same path to improve their own sentiment ranking.

The big picture. The findings illustrate what it takes to have strong customer sentiment for an extended period. 

Share of time when monthly sentiment matches long-term sentiment:

  • Brands with positive sentiment: 73%
  • Brands with negative sentiment: 61%

Volume and sentiment. There is no direct correlation between search volume and sentiment. However, high search volume can act as an amplifier of existing sentiment, which advertisers should consider in their strategies.

[Image: correlation between search volume and sentiment]

Top 10 advertisers by search activity. Airbnb, ViacomCBS and Netflix were the top three ranked brands. The others were:

Top advertisers by industry. These were the top advertisers across 10 categories analyzed in the report: 

About the data. The analysis was conducted by Captify (registration required), which collects data from over 1 billion searches on 3 million websites across 2 billion devices daily.

Google Chrome IP masking could radically impact search advertising

Monday, April 22nd, 2024

Google is developing a two-hop proxy to enhance privacy for Chrome users, which has three big implications for advertisers:

Why we care. The two-hop proxy is only implemented on Chrome, meaning Google will have a monopoly on this data. No other search engines will have any data for advertisers to use for location targeting. This could effectively eliminate competition in the search ads space.

What is IP-based geolocation. According to Google’s documentation:

The details. User IP addresses will be batched and masked by region, and Google will allocate an IP address to each batch. Here’s what that looks like: 

[Image: Google geofeed categorisation]

Any user assigned an IP address for a region will have been verified to be in that region. The Google geofeed will be plugged into the proxy and have city-level accuracy.
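
As a rough mental model only (not Google's actual implementation), region-level IP masking can be pictured as every verified user in a region sharing one egress IP. The regions, mapping and IPs below are made up, using documentation-only address ranges:

```python
# Toy sketch of region-level IP masking: all users verified to be in a
# region are mapped to that region's single shared egress IP, so sites
# see city-level location but no individual address.
# The region names and IPs are illustrative placeholders.
REGION_EGRESS_IP = {
    "london": "203.0.113.10",
    "manchester": "203.0.113.20",
}

def masked_ip(user_ip: str, verified_region: str) -> str:
    """Return the shared regional IP a website would see instead of user_ip."""
    return REGION_EGRESS_IP[verified_region]

# Two users in the same verified region become indistinguishable by IP:
assert masked_ip("198.51.100.7", "london") == masked_ip("198.51.100.99", "london")
print(masked_ip("198.51.100.7", "london"))  # 203.0.113.10
```

The point of the sketch is simply that location targeting survives at the region/city level while per-user IP tracking does not.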

Will consumer data be truly private? While a consumer’s data will be shielded from advertisers, it won’t be safeguarded from Google, posing a threat to data privacy.

More Google privacy and self-preferencing concerns. In February, the CMA (Competition and Markets Authority) said:

Dig deeper. Google ‘cannot proceed with third-party cookie deprecation’

Meanwhile, the Information Commissioner’s Office, a UK privacy regulator, also shared significant concerns about Privacy Sandbox, in a WSJ article (subscription required), published last week. Once released, nothing will stop Google and other companies from using data to track users from different sites, the ICO said.

Structure, consume, learn and retire: Google’s pattern of learning

Monday, April 22nd, 2024

Over the years, Google has seemingly established a pattern in how it interacts with the web. The search engine provides structured data formats and tools that allow us to supply information to Google. Think: meta tags, schema markup, the disavow tool and more. 

Google then consumes and learns from this structured data deployed across the web. Once sufficient learnings are extracted, Google then retires or de-emphasizes these structured data formats, making them less impactful or obsolete.

This cyclical process of giving structured data capabilities, consuming the information, learning from it and then removing or diminishing those capabilities seems to be a core part of Google’s strategy.

It allows the search engine to temporarily empower SEOs and brands as a means to an end – extracting data to improve its algorithms and continually improve its understanding of the web.

This article explores this “give and take” pattern through several examples.

Google’s pattern of ‘give and take’

The pattern can be divided into four stages:

  • Structure: Google introduces a structured format or tool for supplying it information.
  • Consume: Google ingests the data webmasters deploy across the web.
  • Learn: Google extracts learnings to improve its algorithms.
  • Retire: Google de-emphasizes or removes the format once it has learned enough.

The race is for the search engine to learn from webmasters' interactions with its suggested structure before webmasters learn to manipulate it. Google usually wins this race.

It doesn’t mean no one can leverage new structural items before Google discards them. It simply means that Google usually discards such items before illegitimate manipulations become widespread.

Give and take examples

1. Metadata

In the past, meta keywords and meta descriptions played crucial roles within Google’s ranking algorithms. The initial support for meta keywords within search engines actually predates Google’s founding in 1998.

Deploying meta keywords was a way for a webpage to tell a search engine the terms for which the page should be findable. However, such a direct and useful bit of code was quickly abused. 
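
For illustration, the tag looked like this (the keyword values are hypothetical):

```html
<!-- Historical example: the meta keywords tag, no longer used by Google -->
<meta name="keywords" content="cheap flights, flights, airline tickets, travel deals">
```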

Many webmasters injected thousands of keywords per page to capture more search traffic than their content deserved. This quickly led to the rise of low-quality, ad-stuffed websites that converted the acquired traffic into advertising income.

In 2009, Google confirmed what many had suspected for years. Google stated:

At least for Google’s web search results currently (September 2009), the answer is no. Google doesn’t use the keywords meta tag in our web search ranking.

Google does not use the keywords meta tag in web ranking

Another example is the meta description, a snippet of code that Google supported since its early days. Meta descriptions were used as the snippet text under a link in Google search results.

As Google improved, it started ignoring meta descriptions in certain situations. This is because users might discover a webpage through various Google keywords.

If a webpage discusses multiple topics and a user searches for a term related to topic 3, showing a snippet with a description of topics 1 or 2 would not be helpful.

Therefore, Google began rewriting search snippets based on user search intent, sometimes ignoring a page’s static meta description.

In recent times, Google has shortened search snippets and even confirmed that it mostly examines a page's primary content when generating descriptive snippets.

2. Schema and structured data

Google introduced support for schema (a form of structured data) in 2009. 

Initially, it pushed the “microformats” style of schema, where individual elements had to be marked up within the HTML to feed structured or contextual information to Google.

[Image: Early structured data in Google SERPs]

Conceptually, this isn't far removed from the thinking behind HTML meta tags. Surprisingly, a new coding syntax was adopted rather than simply extending meta tags.

For example, the idea of schema markup was initially (and largely remains) to supply additional contextual information concerning data or code that is already deployed – which is similar to the definition of metadata:

Both schema and metadata attempt to achieve the same goal: information that describes other existing information, helping the user leverage it. However, the detail and structural hierarchy of schema ultimately made it far more scalable and effective.

Today, Google still uses schema for contextual awareness and detail concerning various web entities (e.g., webpages, organizations, reviews, videos, products – the list goes on). 

That said, Google initially allowed schema to alter the visuals of a page’s search listings with a great degree of control. You could easily add star ratings to your pages for Google’s search results, making them stand out (visually) against competing web results.

As usual, some began abusing these powers to outperform less SEO-aware competitors. 

In February 2014, Google started talking about penalties for rich snippet spam: people misusing schema to make their search results look better than others, even though the information behind them was wrong. For example, a site without reviews purporting a 5-star aggregate review rating (clearly false).

Fast-forward to 2024, and while still situationally useful, schema is not as powerful as it once was. Delivery is easier, thanks to Google’s JSON-LD preference. However, schema no longer has the absolute power to control the visuals of a search listing. 
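
For reference, a minimal JSON-LD block of the kind Google now prefers might look like this. The schema.org types are real; the product and rating values are hypothetical:

```html
<!-- JSON-LD structured data example (hypothetical values) -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "112"
  }
}
</script>
```

Unlike the older microformats approach, this sits in one block rather than being woven through the page's HTML.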

3. Rel=Prev / Next

Rel=”prev” and rel=”next” were two HTML attributes Google suggested in 2011. The idea was to help Google develop more contextual awareness of how certain types of paginated addresses were interrelated:

[Image: Rel="prev" and rel="next" per Google]
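
The markup itself was simple: link elements in the head of each paginated page pointing at its neighbors. The URLs below are hypothetical:

```html
<!-- Pagination hints as Google suggested in 2011 (no longer supported).
     This would sit in the <head> of page 2 of a paginated series. -->
<link rel="prev" href="https://example.com/category?page=1">
<link rel="next" href="https://example.com/category?page=3">
```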

Eight years later, Google announced they no longer supported it. They also said they hadn’t supported this kind of coding for a while, suggesting support ended around 2016, just five years after the suggestions were first made.

Many were understandably annoyed because the tags were fiddly to implement, often requiring actual web developers to re-code aspects of website themes.

Increasingly, it seemed as if Google would suggest complex code changes in one moment only to ditch them the next. In reality, it is likely that Google had simply learned all it needed from the rel=prev / next experiment.

4. Disavow tool

In October 2012, the web buzzed with news of Google’s new Disavow links tool.

[Image: Google's link disavow tool]

In April 2012, Google released the Penguin update, which threw the web into turmoil. The update heavily targeted spammy off-site activity (link building), and many websites saw manual action notices appear within Search Console (then named Webmaster Tools).

Using the Disavow tool, site owners could upload lists of linking pages or domains they would like excluded from Google's ranking algorithms. If these uploaded links largely agreed with Google's own internal assessment of the backlink profile, the active manual penalty might then be lifted. 
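
Google's documented disavow file format is a plain text file: one entry per line, a `domain:` prefix to disavow a whole domain, and `#` for comments. The domains and URL below are hypothetical:

```text
# Example disavow file (hypothetical entries)
# Disavow everything from these domains:
domain:spammy-directory.example
domain:paid-links.example
# Disavow a single linking page:
https://blog.example/low-quality-post.html
```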

This would restore a "fair" amount of Google traffic to the site, though with part of the backlink profile now disavowed, post-penalty traffic was usually lower than pre-penalty traffic.

As such, the SEO community had a relatively low opinion of the tool; usually, a complete backlink removal or disavow project was necessary. Still, having less traffic after the penalty was better than having no traffic at all.

Disavow projects haven’t been necessary for years. Google now says that anyone still offering this service is using outdated practices.

In recent years, Google’s John Mueller has been extremely critical of those selling “disavow” or “toxic links” work. It seems as if Google no longer wants us to use this tool; certainly, they do not advise us on its usage (and haven’t in many years).

Dig deeper. Toxic links and disavows: A comprehensive SEO guide

Unraveling Google’s give-and-take relationship with the web

Google provides tools or code snippets for SEOs to manipulate its search results in minor ways. Once Google gains insights from these deployments, such features are frequently phased out. Google grants us a limited amount of temporary control to facilitate its long-term learning and adaptation.

Does this make these small, temporary releases from Google useless? There are two ways of looking at this:

In truth, there is no right or wrong answer. It depends on your ability to adapt to web changes efficiently.

If you’re comfortable with quick changes, implement what you can and react fast. If your organization lacks the expertise or resources for quick changes, it’s not worth following trends blindly.

I think this ebb and flow of give and take doesn’t necessarily make Google evil or bad. Any business will leverage its unique assets to drive further learning and commercial activity.

In this instance, we are one of Google’s assets. Whether you wish for this relationship (between yourself and Google) to continue is up to you.

You could choose not to cooperate with Google’s temporary power, long-term learning trade deals. However, this may leave you at a competitive disadvantage.

Make 2024 the year you take home the highest honor in search

Saturday, April 20th, 2024

Winning an industry award can greatly impact how customers, clients, and colleagues regard your brand. Showcase your achievements and celebrate your professional excellence by entering The Search Engine Land Awards — the highest honor in search marketing!

For nearly 10 years, The Search Engine Land Awards have honored some of the best in the search industry — including leading in-house teams at Wiley Education Services, T-Mobile, Penn Foster, Sprint, and HomeToGo – and exceptional agencies representing Samsung, Lands’ End, Stanley Steemer, and beyond.

This year, it’s your turn. The 2024 entry period is now open!

Here’s what you need to know:

  • The Search Engine Land Awards celebrate individuals, agencies, and internal teams within the search marketing community who have demonstrated excellence in executing organic and paid search marketing campaigns.
  • This year’s program features 19 unique categories, from Best Use of AI Technology in Search Marketing to Agency of the Year.
  • Applying is easier than ever – send us an executive summary that showcases, in 750 words or less, the award-worthy work you and your team performed this past year.
  • Completing your application empowers you to reflect on an impressive year of work, featuring its successes and lessons learned – an invaluable exercise for you and your team.
  • Winning a Search Engine Land Award is a unique, rewarding, and cost-effective way to put your organization a step ahead of its competitors, gain well-earned publicity, boost company morale, and more.
  • Submit your application by May 24 to unlock Super Early Bird pricing – just $395 per entry ($300 off final rates!).

Don’t miss your opportunity to participate in the only awards program recognized by Search Engine Land, the industry publication of record. Begin your application today! 

5 alternatives to the missing Page Timings report in GA4

Friday, April 19th, 2024

Page load speed, among other Core Web Vitals, is a known Google organic ranking factor. While we have the PageSpeed Insights tool, it unfortunately only works on one page at a time. 

The Page Timings report in Universal Analytics surfaced specific pages on your site that were slowest, allowing you to prioritize which pages to evaluate and optimize.

The tool was particularly helpful if you have a large site with thousands of pages to analyze. Armed with the list of problem URLs, you could then prioritize pages for review using the PageSpeed Insights tool.

But Google didn’t include the Page Timings report in GA4, so where can you find similar information now? 

Below are several free and paid tools that can help you pinpoint your problem pages and prioritize their optimization.

1. Google Search Console

Google Search Console (GSC) provides a Core Web Vitals report and even separates the data by mobile versus desktop.

However, while GSC provides some examples of URLs affected, it doesn’t provide a full list. Instead, it groups pages together and shows examples from the group. The data is also not easily downloadable to a CSV for monitoring.

If your goal is regular monitoring, you’ll need to log in to GSC and review the data within the tool. The GSC API does not support exporting Core Web Vitals report data, so you can’t pull GSC data into Looker Studio or other visualization tools.

2. Screaming Frog

A long-time favorite of SEO professionals, Screaming Frog has many helpful SEO applications, but most important for this article is that it provides page load times. 

It can further be connected to the PageSpeed Insights tool using a key from the PageSpeed Insights API to import Core Web Vitals data directly into the PageSpeed report:

[Image: Screaming Frog - page speed report]

The only real drawback to Screaming Frog is that because it’s a desktop-based application, the computer you host it on has to be turned on and connected to the web when the report runs. This makes the tool less optimal for dashboarding and highly regular data monitoring.

One workaround is to have a desktop computer that is always turned on. I did this in my agency for many years with a dedicated, old desktop computer running Screaming Frog. 

Because the tool allows for scheduling, the scheduled report can run at the appointed time as long as the computer is on and connected to the internet. Additionally, you can connect Screaming Frog to a Google Drive account and export the report tabs to Google Sheets:

[Image: Screaming Frog - Google Sheets export settings]

If you want to use the upload for dashboarding, choose the "Overwrite files in output" option, which lets you keep updating the same Google Sheet.

Once the data is in a Google Sheet, you can import it into other platforms, such as Looker Studio, to create dashboards and visualizations or create thresholds to send email alerts using Apps Script.
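
The thresholding idea can be sketched in a few lines of Python. The column names below ("Address", "Response Time") are assumptions based on a typical Screaming Frog crawl export; verify them against your own file's headers:

```python
import csv
from io import StringIO

# Minimal sketch: flag slow pages from an exported crawl CSV.
# Column names are assumptions; check your export's actual headers.
THRESHOLD_SECONDS = 2.0

def slow_pages(csv_text: str, threshold: float = THRESHOLD_SECONDS):
    """Return URLs whose response time exceeds the threshold."""
    rows = csv.DictReader(StringIO(csv_text))
    return [r["Address"] for r in rows
            if float(r["Response Time"]) > threshold]

sample = """Address,Response Time
https://example.com/,0.8
https://example.com/big-page,3.4
"""
print(slow_pages(sample))  # ['https://example.com/big-page']
```

The same filter could feed an email alert or a dashboard data source instead of a print statement.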

3. Ahrefs

Ahrefs has long been an SEO favorite for tracking backlinks, but the tool also has a robust site audit tool that tracks page load speed as it indexes a website.

Like Screaming Frog, Ahrefs can connect PageSpeed Insights directly to the site audit to surface the specific Core Web Vitals optimizations that should be made:

[Image: Ahrefs - page speed report]

While you can export reports to Google Sheets, it’s a manual process. Site audits can be scheduled for regular intervals.

Unfortunately, the Ahrefs API doesn’t appear to have a way to automatically export the results, leaving it a bit of a manual process and less than ideal for dashboarding and near real-time reports.

4. Semrush

Another popular SEO tool is Semrush, and it also has a site audit feature that reviews page load speed and lists the pages with the longest load times:

[Image: Semrush - page speed report]

With Semrush, unlike Ahrefs and Screaming Frog, you aren’t required to enter a personal PageSpeed Insights API key to connect Core Web Vitals optimization information directly to the audit.

Again, however, the data export for this report is manual. Semrush does have an API that will report on page load speed issues, but it is only available for Business plans and higher, which start at $499/month.

5. Add page speed into GA4 using custom dimensions

Another option to restore page load speed in Google Analytics is to create a custom dimension. You can use that custom dimension to create an Explorations report, import data into Looker Studio or export data using the GA4 API or various tools that incorporate the API.

Measure School has an excellent tutorial on how to track page load speed using Google Tag Manager and custom dimensions in GA4. 

Multiple free and paid tools can export your list of slow pages (via the custom dimension) to Google Sheets, including the free GA4 Reports Builder extension for Google Sheets.

Unlike its predecessor in Universal Analytics, this extension does not have scheduling capability. I personally use Supermetrics, which is a paid tool but provides me access to multiple APIs, including GA4, and allows me to schedule reports.

Connecting with the PageSpeed Insights API

Once you have your list of the site’s slowest pages, though, you’re not completely finished! Screaming Frog, Ahrefs and Semrush pull Core Web Vitals optimizations into their platforms using the PageSpeed Insights API. 

If you’re not using one of those tools, you’ll either need to interrogate each URL in the PageSpeed Insights tool manually, one by one, or use the PageSpeed Insights API to make those queries for you.

If you’re not a web developer or skilled with coding, there are fortunately tools that you can use to tap into APIs, including the PageSpeed Insights API, to get the specific core web vitals details you need for optimization.
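
For those comfortable with a little code, a direct query is also simple. The sketch below uses the documented PageSpeed Insights v5 endpoint; the API key and page URL are placeholders, and the response parsing is demonstrated on a trimmed sample of the documented response shape rather than a live call:

```python
import urllib.parse

# Sketch of querying the PageSpeed Insights v5 API for a single URL.
# Endpoint and "lighthouseResult" shape follow Google's documented API;
# the key and page URL are placeholders.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, api_key: str, strategy: str = "mobile") -> str:
    """Build the GET request URL for one page."""
    params = urllib.parse.urlencode(
        {"url": page_url, "key": api_key, "strategy": strategy}
    )
    return f"{PSI_ENDPOINT}?{params}"

def performance_score(response: dict) -> float:
    """Pull the 0-1 Lighthouse performance score out of a PSI response."""
    return response["lighthouseResult"]["categories"]["performance"]["score"]

# A live fetch needs a network call and an API key, e.g. via
# urllib.request.urlopen(psi_request_url(...)). Parsing shown on a
# trimmed sample of the response shape:
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.62}}}}
print(performance_score(sample))  # 0.62
```

Looping `psi_request_url` over your list of slow URLs gives you the same bulk querying the paid tools perform behind the scenes.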

My personal favorite is Zapier, which has a webhook zap that gives even non-developers a simplified way to connect a list of slow URLs to the PageSpeed Insights API and pull in whichever data points are most important:

[Image: Connecting with the PageSpeed Insights API in Zapier]

Optimizing images can often be a quick way to improve page load speed. In the zap example above, I only pull in image details for each URL for a site with over 10,000 pages. This allows me a fast way to find:

The benefit of this approach is that it truly can provide near-real-time reporting and dashboarding, whereas the other solutions still have drawbacks that make them less than ideal for dashboard reports.

However you continue to measure page load speed for organic search optimization, each solution requires some setup and work. So, if you haven’t already started on a solution, get started now so you can mine quick wins for SEO and improve your problem pages.

Dominate search results with your SEO dream team by Edna Chavira

Friday, April 19th, 2024

Ever wondered how to structure an SEO team for unparalleled success? Join us on Tuesday, April 23rd for this webinar where our panel will guide you through the proven strategies to build a dynamic and scalable SEO program.

You’ll discover how a well-structured team can overcome and outperform unpredictable algorithm updates and dive into the art of determining the ideal SEO team structure (and where SEO should sit) that aligns with your business goals and ensures optimal collaboration between departments.

RSVP today for Beyond the Search Bar: Crafting an Impactful SEO Team Structure and Defining its Place in Your Organization and uncover the secrets to building your dream SEO team.

How to set and manage PPC expectations for teams and stakeholders

Friday, April 19th, 2024

Have you ever been in a situation where not everyone was on the same page?

It happens often in the workplace and is usually caused by mismatched expectations among stakeholders.

Here are some ways to set and manage expectations for PPC clients and agency teams.

Outline expectations during the sales process

Setting expectations at the beginning of a client engagement or project is crucial for success.

For advertising agencies, the time to set expectations is before the advertiser even becomes a client – during the sales process.

Giving the client an idea of how your team operates will help you both decide if the relationship is a good fit.

For example, does your agency have an account or media lead who oversees the client relationship, or do individual practice leads handle the relationship? Or is it a hybrid? Who is the main point of contact?

Be clear about how your team operates generally while you’re still negotiating.

Dig deeper: How to build and maintain client trust in your agency

Agree on parameters in the statement of work

It’s critical to lay out the engagement parameters in the statement of work.

The clearer you can be about the parameters of the relationship and how it will operate, the better you can manage expectations once you’re actually doing the work.

Agree on what work will be performed

What services will you be providing to the client? Here are some common agency services:

This is only a partial list!

Agencies can offer a wide variety of advertising and marketing services. 

Some agencies provide strategy and execution of the services listed above and some only provide consulting, with the client responsible for implementation. Spell out what work you intend to perform.

If you’re not clear in the statement of work about what work you’re performing, clients will ask you to do work that you’re not staffed for.

Make it obvious what’s in scope and what isn’t. Be detailed.

It’s impossible to list every possible scenario in a statement of work – and that’s why it’s crucial to be clear about the services the agency will handle.

Tell the client what work is in scope and be clear that anything else is out of scope.

For example, how many search engines will you manage for paid or organic search? How many social platforms will you advertise on? Which ones? Are analytics services included? If not, who handles that troubleshooting? What about CRM?

For B2B advertisers, closing the loop between the initial website lead and down-funnel CRM actions is an important piece of the puzzle. Are you prepared to provide these services, or will the client be responsible for this work?

The same thing goes for landing page optimization and development.

Not being able to create optimized landing pages can be a performance blocker that can ultimately doom your relationship with the client. Be clear about who owns this responsibility.

By outlining who is responsible for CRO and landing page optimization, you can help stave off disappointment down the road.

Meeting and reporting cadences

Another aspect of client service to deal with during the sales process is deliverables and cadences.

How often will you meet with the client? Will the meetings be held online, or in person? Who from the agency will attend?

Meetings can become a giant time suck, yet they’re also necessary. Be thoughtful about how to make them efficient for both the agency and the client.

Reporting is another deliverable to address in the statement of work.

What types of reporting will be provided and on what cadence? Will you use Looker dashboards, PowerPoint reports, QBRs, or all three? How will you handle ad-hoc reporting?

Dig deeper: 3 steps for effective PPC reporting and analysis

Response times and client communications

You’ll also want to agree on client communications.

How will day-to-day communication be handled? Will you use email, instant messaging (IM) platforms like Slack or Teams, project management boards like Asana or Trello, or a combination of all of these? 

What response times should be expected?

One pitfall of using IM for client communications is that everyone starts to expect instant replies. That’s neither feasible nor productive for anyone.

Agree with your clients that regular communications will be responded to within 24 hours.

For urgent messages, perhaps a 6-hour response time is reasonable. Agree to this ahead of time – that way, no one is disappointed.

Think too about how easy it will be to search for relevant communications later.

I find it much more difficult to find messages and topics in Slack than email, although Slack is easier to organize into channels. Each has pros and cons! Think this through before you engage with the client.

Account staffing

Every statement of work should include a staffing plan. You don’t need to name names, but list the roles and percentage of time each role will be allocated to the engagement.

For example, staffing on a large paid search account might look like this:

Being clear about roles and percentage allocation helps clients understand who their key contacts are and how much time they will spend working on the account.

Dig deeper: Client onboarding and offboarding: The PPC agency’s guide

Dealing with unexpected issues

Unforeseen challenges can arise on an account. Perhaps the client’s conversion tracking breaks, or they need help spinning up a landing page when that’s normally something they would handle themselves. 

Outline in the SOW how you’ll handle issues that would normally be out of scope. 

Will you charge an hourly rate? Will a change order or new SOW be required? 

Good agencies will often pitch in and help without compensation. That’s part of being a good business partner.

Still, it’s important to set expectations on out-of-scope work to ensure the engagement remains profitable.

Managing expectations during the engagement

Once the contract is signed, the work begins!

Now is the time to manage expectations.

It’s important on kickoff calls or meetings to establish your rules of engagement.

Reinforce how you will communicate, meeting cadences, turnaround times and other key service-level agreements (SLAs). Getting agreement from the client and buy-in on both sides is critical.

An effective way to get everyone’s buy-in is to whiteboard the rules during the kickoff, either virtually or in person. Then, take time to discuss the rules and hear all perspectives.

Be willing to add items you may not have thought of initially, or to adjust to meet everyone’s needs. Just make sure you can still deliver in the time frame you agree to.

Once you’ve aligned on the rules, distribute them to all stakeholders.

One agency I worked at printed and laminated the rules of engagement for each client. They shared a copy with everyone working on the account, both internally and client-side.

While this may sound quaint in 2024, it’s effective – a physical reminder stakeholders can keep at their desk and easily review at any time.

The rules of engagement could also be in an online document that’s pinned to a Slack or Teams channel.

It’s important to reiterate that getting everyone’s buy-in is key here. 

One of Dale Carnegie’s principles  in “How to Win Friends and Influence People” is to “Let the other person feel that the idea is his or hers.”

It’s important to remember this principle when establishing the rules of engagement with clients. If clients have a hand in developing the rules, they’ll be more likely to follow them. 

Dig deeper: How to retain clients in PPC

How to deal with issues during the engagement

Inevitably, issues will crop up during the engagement that require a review of the SOW.

The client might ask for more meetings than you’ve contracted for.

Or they start to expect faster turn times on the work you’re delivering.

It’s tricky because, on the one hand, you want to do everything you can to keep your client happy.

On the other hand, your agency needs to be profitable.

Think carefully about whether you should accommodate the client’s request or push back.

There are pros and cons to each approach.

If you’ve established ground rules and SLAs in the contract process, it’s not wrong to gently remind the client of what you agreed to.

In this case, I’ll usually say something like, “We understand how important this launch is for your business. Our contract stipulates a 5-day lead time for new campaign launches. Given the urgent timing of this campaign, we can aim to deliver it in 2 days. We’ll have to reprioritize some of your other work to accommodate this and we’re happy to do so to help you meet your goals.”

A statement like that does several things.

Making exceptions for clients is part of being a good partner. But if the exceptions start to become a regular thing, you’ll want to give a more forceful reminder of the rules of engagement and you may want push back. 

Renegotiating the contract is another option.

For example, you could add staff to the account that would enable faster turn times – at an additional cost.

Or you could charge the client the hourly rates you provided for in the contract. 

If you’ve set expectations clearly in the beginning, you have a good chance of avoiding a big mismatch between your reality and the client’s.

Clear expectations make for profitable relationships for everyone!

Dig deeper: 6 tips to build PPC client relationships

Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

7 ways to elevate your responsive search ads

Friday, April 19th, 2024

7 ways to elevate your responsive search ads

While Google’s responsive search ads (RSAs) have come a long way from the simple text ads of the past, there are still plenty of opportunities to optimize and take your PPC performance to new heights.

Let’s dive into seven proven strategies that can help you elevate your Google responsive search ads game.

1. Less is sometimes more

We’re starting with a concept rather than a feature here, but just because you can add 15 headlines and four descriptions doesn’t mean you should also populate all of those slots. A few reasons why:

2. Keyword insertion 

This is a really useful way of making your ads more relevant and improving your quality score. When applied, Google will populate the RSA with whatever keyword your search term triggered the ad for. 

You can also add copy before and after the keyword insertion to tailor the message. For example:

If the keyword triggered pushes the character limit over, Google will use the fallback copy you included, “Nike running shoes” in the above example. 

It should not be used in every circumstance, as it can get messy when overused. And you are in danger of the ad not reading right. 

This is especially true in consolidated ad groups with more keywords. Less control with more keywords eligible to appear in the keyword insertion.

Extra word of caution: Never use this for competitor keyword campaigns. 

3. Countdown insertion 

This is a really cool feature and an absolute essential for your sale or event ads. Including a countdown timer is such an effective way to add urgency to your RSAs. 

Countdown insertion - Google RSAs

You can customize how many days in advance you’d like the timer to begin. As with the keyword insertion feature, you can include copy before and after insertion. For example:

Even when there is less than a day before the end of the countdown, Google tells you to the second how long you have left. Very eye-catching for users. 

Get the daily newsletter search marketers rely on.

See terms.

4. Ad pinning 

Not happy with giving Google the control to put your copy in any order that suits them? Well, then pinning is the answer. 

This allows you to tell Google what title and description you want to remain constant in your ad and in which position. You can also pin multiple headlines and descriptions in the same position, which Google will alternate.

Your ad strength will suffer using pins, as Google doesn’t like not being in control. But advertisers sometimes just need the ability to decide what order the copy goes in. 

For example, putting your product in the first two titles (long name) and including the price as the final title is not ideal if Google decides to mix that order, even if the algorithm thinks it has a better CTR.

I’d recommend testing two RSAs at once for ad groups with high volume. Use the same copy, but one with pins and one without. Test yourself against the algorithm. 

Remember, Google often prioritizes RSA variations based on CTR, but if the objective of the ads is conversion-based, then you should judge the results toward conversion rate, CPA or ROAS instead. 

5. Ad experiments

As Google will often prioritize the best CTR performance in ads, if you are testing two RSAs in the same ad group, then they will quickly favor one and show it more often. 

Experiments are what you want if you want a fairer, less biased testing framework. 

The experiment feature can effectively test different keywords, landing pages, bid strategies, etc. Testing ads is another feather in its cap. 

Google RSAs - Experiments

You can set up experiments to show each ad evenly at 50/50 over a selected time period (currently maxed out at 64 days). During setup, Google will duplicate your campaign into a test version. You make whatever changes to the RSA you need to, then just schedule a start date. 

Once the test begins, you can access a testing dashboard within the experiment tab that compares control and test campaigns.

When setting up the experiment, you will tell Google your two performance priorities, so the dashboard will focus its reporting on those metrics.

6. Campaign-level headline and description assets

This is the newest feature in the list and, as of this writing, is still in beta. It’s a great addition designed for use during a specific period (e.g., a sale or an event). 

At the campaign level, you can schedule up to three headlines and two descriptions to appear in all of the campaign’s RSAs rather than updating all of your ads individually.

If messaging is a priority, you can also pin these extra assets and schedule a start and end date.

RSAs - Campaign-level headline and description assets

They are ideal for large Search accounts with a high volume of RSAs that require frequent copy changes to highlight promotional periods. 

What could’ve taken hours to regularly update, schedule and revert back to the original copy now takes only a few minutes. Preparing for Black Friday might not seem as daunting this year.

7. Ad variations

This is probably the most underutilized feature for RSA ads. Experiments are the most common A/B testing framework. Still, if you want to test specific titles or descriptions against other variants (as opposed to RSA vs. RSA), this is the ideal solution. 

This is very cool for creating tests at a forensic level. If you have a legacy USP scattered across multiple RSAs (e.g., “Free Shipping Available”) but are thinking of replacing it with a similar but refreshed take (e.g., “Free shipping when you spend over $50”), then you simply select the “Find & Replace” variation type, input the original copy and replace with the new. 

Just like setting up experiments, you select the start and end date, how much the experiment split is (which for me is always 50%) and then create. There are other types of ad variations that can update whole headlines and descriptions, as well as URLs, but I would use find and replace more commonly.

What about AI content?

I couldn’t go through the whole article without mentioning the hottest industry topic: AI. So, I thought it deserved its own bonus section.

Now, the truth is that the practical application of AI for RSAs lies outside of the advertising platform (Performance Max asset group copy is a different story with the new generative AI feature). 

One of the first PPC use cases of ChatGPT, when it came on the scene, was generating additional and alternative creative for RSAs based on expanding the existing copy. 

The danger is being too reliant on AI for content. Yes, ChatGPT, Gemini and the rest are great tools for carrying the creative burden. However, using AI to create the majority of the copy can lead to generic output or even a separation away from the brand identity if your prompt engineering isn’t up to scratch. 

To strive toward compelling ad copy, simplicity is often the best method, so try not to overcomplicate the process. I’ve found the best use is identifying the best-performing titles and descriptions (hopefully, most of the ad asset report data isn’t pending, so you can see these insights) and using AI to expand and enhance the poor performers with alternative variations. 

Lean on AI and use it to generate fresh ideas, but don’t rely on it. Remember, you will know the brand and the USPs better than AI. Humans still have a use, after all.

Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

Tracking in 2024: Where we are and how to prep

Thursday, April 18th, 2024

Tracking in 2024- Where we are and how to prep

Remember how painful iOS 14 was for performance marketers? An even more seismic change is looming, and way too many marketers are still unprepared.

Yes, I’m talking about the impending death of third-party cookies. If you’re reading this article, this is probably the 1,000th time you’ve heard it mentioned.

And that might be part of the problem: “the death of cookies” could now be one of those familiar phrases you skim over without understanding the depth of its ramifications.

If you’re a little unclear on what kind of havoc “the death of cookies” is going to wreak in your campaigns, stay with me for a few minutes as I tackle:

What’s changing from a technical perspective

First, let’s be clear: we’re talking about the death of third-party cookies, not first-party cookies. You own first-party cookies and the data they collect, which won’t be impacted by the Chrome update.

Third-party cookies, which pass data from your website to external parties (like ad platforms) to your site to paint a picture of the user and user behavior, are what’s disappearing.

If this sounds familiar, it’s because that’s exactly what happened with iOS 14.

In that update, Apple’s “App Tracking Transparency” introduction prevented companies from tracking user behavior across third-party apps. Advertising platforms (particularly Facebook) suddenly couldn’t help advertisers understand what users were doing after engaging with their ads.

Cookies, whether first-party or third-party, are snippets of code saved by the browser or app to the user’s device. They contain user and session identifiers, ad click IDs, timestamps and functions (e.g., whether you’re logged into an app). 

In short, they are (or were until recently) the most common way to identify and track users, and they’re about to disappear from Chrome (which is following Firefox and Safari in doing so).

If you’re using pixels, UTM parameters outside of a first-party environment, GTAG (ask your analytics team), or other tracking based on – and stored in – browsers, you’re in for a world of transition.

What’s changing from a marketing perspective

It might be easier to list what isn’t changing, but here’s a quick list of the biggest hits:

Dig deeper: 7 paid media reporting tips when tracking is messy

Given all of that, you can hopefully now realize that the time to start planning was about a year ago – and if you’re behind the curve, you’d better keep reading.

Get the daily newsletter search marketers rely on.

See terms.

4 real preparation steps to take ASAP

I break this down for my clients into four buckets:

1. Focus on CRM cleanliness

At the very least, you should be able to reference your CRM data to understand your users’ point of entry and identify your most valuable users.

Make sure you have a plan to assess your data cleanliness, your reports, and your dashboards and you can get things in good enough shape to trust what your first-party data is telling you. Work to

2. Tune up your data collection

First-party data will become even more important as data from third-party sources erodes.

Make sure your ad campaigns, organic campaigns, owned properties, etc., are fully maximized to collect first-party data and have a plan to use it in your campaigns (email, SMS, retargeting, lookalikes, etc.). 

3. Implement platform solutions

Get extremely comfortable with Google’s Enhanced Conversions, Meta and LinkedIn’s conversions APIs, and whatever monikers you’ll see other platforms use. They help ensure that ad algorithms can track valuable actions both online and offline, which is essential for future-proofing your tracking efforts.

(Bonus points if you combine platform solutions with first-party data to teach the platform algorithms to find your best users via offline conversion tracking.)

4. Go server-side

Analytics and data stored in servers you control (as opposed to browsers that can change their rules at any time) are one big hedge against cookie erosion.

Implement initiatives like server-side GTM and start researching CDP (customer data platform) options like Segment and Tealium to take at least partial ownership of your data and analytics.

Winning strategies for a data-driven, privacy-first future

If you need a little good news after reading all of that, I have a couple of tidbits for you. 

Dig deeper: 3 ways search marketers can prepare for the big cookie crumble

Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing

« Older Entries |