Archive for the ‘seo news’ Category
Thursday, December 21st, 2023
Google posted an indexing issue with Google Search, saying it is investigating reports of indexing delays for content. Google said this issue is “affecting a small number of sites” and they are “working on identifying the root cause.”
What Google wrote. Google posted, “Google is investigating reports about delayed indexing in Google Search that’s affecting a small number of sites. We’re working on identifying the root cause. Next update will be within 12 hours.”
Google also posted several follow-up responses on X.
Started early this morning. The reports of these indexing issues started to come in around 1:30 am ET on Thursday, December 21. I initially posted about it on the Search Engine Roundtable several hours before Google confirmed the issue. Some of the complaints came in as early as 1:30 am ET, and they have continued to come in even as I write this story.
Who is impacted. Initial thoughts were that only sites in India and specific regions were impacted but that is not exactly correct. Others thought it had to do with sites that publish both normal HTML pages, as well as offer an AMP alternative solution.
The sites I write at, here and the Search Engine Roundtable do not appear to be impacted. The Wall Street Journal, NY Times, Washington Post and other large publications also do not seem to be impacted.
Google said it is only “affecting a small number of sites.” Google will update us when it figures out the “root cause” of the issue. But I do not expect Google to share which sites were impacted and which were not.
What now. If you are impacted, there is not much you can do right now but wait it out. Google is investigating the issue on its end and will hopefully resolve it sooner rather than later. Google did say it will provide an update within the next 12 hours, but hopefully we will see one sooner.
Why we care. If you are noticing issues with indexing of your content and thus traffic issues to your site from Google Search, then this may be why. Again, this specific issue is new, as of this morning. If you have been having indexing and ranking issues over the past several days or months, that is likely more of an issue with one of the Google Search algorithm updates and Google thinking your site is not worth ranking and indexing at this point.
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
Thursday, December 21st, 2023
The prevalence of mass-produced, AI-generated content is making it harder for Google to detect spam.
AI-generated content has also made judging what is quality content difficult for Google.
However, indications are that Google is improving its ability to identify low-quality AI content algorithmically.
Spammy AI content all over the web
You don’t need to be in SEO to know generative AI content has been finding its way into Google search results over the last 12 months.
During that time, Google’s attitude toward AI-created content evolved. The official position moved from “it’s spam and breaks our guidelines” to “our focus is on the quality of content, rather than how content is produced.”
I’m certain Google’s focus-on-quality statement made it into many internal SEO decks pitching an AI-generated content strategy. Undoubtedly, Google’s stance provided just enough breathing room to squeak out management approval at many organizations.
The result: Lots of AI-created, low-quality content flooding the web. And some of it initially made it into the company’s search results.
Invisible junk
The “visible web” is the sliver of the web that search engines choose to index and show in search results.
We know from Google antitrust trial testimony by Google's Pandu Nayak (covered in "How Google Search and ranking works, according to Google's Pandu Nayak") that Google maintains an index of "only" ~400 billion documents, while finding trillions of documents during crawling.
That means Google indexes only about 4% of the documents it encounters when crawling the web (400 billion out of roughly 10 trillion).
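As a quick back-of-envelope check of those figures (the ~10 trillion crawled documents is the assumption implied by the parenthetical, not a number Google has confirmed):

```python
# Sanity-check the index-to-crawl ratio using the figures cited above:
# ~400 billion documents indexed, ~10 trillion encountered while crawling.
indexed = 400e9
crawled = 10e12   # assumed, based on "trillions of documents"

fraction_indexed = indexed / crawled
print(f"{fraction_indexed:.0%} of crawled documents indexed")   # 4%
print(f"{1 - fraction_indexed:.0%} never make the index")       # 96%
```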
Google claims to protect searchers from spam in 99% of query clicks. If that’s even remotely accurate, it’s already eliminating most of the content not worth seeing.
Content is king – and the algorithm is the Emperor’s new clothes
Google claims it’s good at determining the quality of content. But many SEOs and experienced website managers disagree. Most have examples demonstrating inferior content outranking superior content.
Any reputable company investing in content is likely to rank in the top few percent of “good” content on the web. Its competitors are likely to be there, too. Google has already eliminated a ton of lesser candidates for inclusion.
From Google’s point of view, it’s done a fantastic job. 96% of documents didn’t make the index. Some issues are obvious to humans but difficult for a machine to spot.
I’ve seen examples that lead to the conclusion that Google is proficient at understanding which pages are “good” and which are “bad” from a technical perspective, but relatively ineffective at discerning good content from great content.
Google admitted as much in DOJ antitrust exhibits. A 2016 presentation says: “We do not understand documents. We fake it.”
A slide from a Search all-hands presentation prepared by Eric Lehman
Google relies on user interactions on SERPs to judge content quality
Google has relied on user interactions with SERPs to understand how “good” the contents of a document are. Google explains later in the presentation: “Each searcher benefits from the responses of past users… and contributes responses that benefit future users.”
A slide from a Search All Hands presentation prepared by Lehman
The interaction data Google uses to judge quality has always been a hotly debated topic. I believe Google uses interactions almost entirely from their SERPs, not from websites, to make decisions about content quality. Doing so rules out site-measured metrics like bounce rate.
If you’ve been listening closely to the people who know, Google has been fairly transparent that it uses click data to rank content.
Google engineer Paul Haahr presented “How Google Works: A Google Ranking Engineer’s Story,” at SMX West in 2016. Haahr spoke about Google’s SERPs and how the search engine “looks for changes in click patterns.” He added that this user data is “harder to understand than you might expect.”
Haahr’s comment is further reinforced in the “Ranking for Research” presentation slide, which is part of the DOJ exhibits:
A slide from “Ranking for Research” DOJ exhibit
Google’s ability to interpret user data and turn it into something actionable relies on understanding the cause-and-effect relationship between changing variables and their associated outcomes.
The SERPs are the only place Google can use to understand which variables are present. Interactions on websites introduce a vast number of variables beyond Google’s view.
Even if Google could identify and quantify interactions with websites (which would arguably be more difficult than assessing the quality of content), there would be a knock-on effect: the number of variable combinations grows exponentially, and each combination requires a minimum traffic threshold to be met before meaningful conclusions can be drawn.
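To make that knock-on effect concrete, here is a toy calculation. The variables, their cardinalities and the per-segment threshold are all invented for illustration; they are not Google figures.

```python
# Toy illustration: each extra on-site variable multiplies the number of
# segments, and every segment needs a minimum amount of traffic before
# any conclusion drawn from it is statistically meaningful.
from math import prod

min_visits_per_segment = 1000   # assumed threshold, not a Google number

# Cardinality of each hypothetical variable a search engine would
# have to account for when interpreting on-site interactions.
variables = {
    "device": 3,          # desktop / mobile / tablet
    "country": 50,
    "page_template": 20,
    "traffic_source": 5,
}

segments = prod(variables.values())                # 3 * 50 * 20 * 5 = 15,000
required_visits = segments * min_visits_per_segment
print(f"{segments:,} segments -> {required_visits:,} visits needed")
```

Adding just one more four-value variable would quadruple both numbers, which is the core of the argument: on-SERP interactions keep the variable set small enough to reason about.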
Google acknowledges in its documents that “growing UX complexity makes feedback progressively hard to convert into accurate value judgments” when referring to the SERPs.
Brands and the cesspool
Google says the “dialogue” between SERPs and users is the “source of magic” in how it manages to “fake” the understanding of documents.
A slide from “Logging & Ranking” DOJ exhibit
Outside of what we’ve seen in the DOJ exhibits, clues to how Google uses user interaction in rankings are included in its patents.
One that is particularly interesting to me is the “Site quality score,” which (to grossly oversimplify) looks at relationships such as:
- When searchers include brand/navigational terms in their query or when websites include them in their anchors. For instance, a search query or link anchor for “seo news searchengineland” rather than “seo news.”
- When users appear to be selecting a specific result within the SERP.
These signals may indicate a site is an exceptionally relevant response to the query. This method of judging quality aligns with Google’s Eric Schmidt saying, “brands are the solution.”
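To make the (grossly oversimplified) idea concrete, here is a toy sketch of a branded-query signal. The function, counts and threshold are hypothetical illustrations, not anything taken from the patent itself.

```python
# Toy sketch of the "site quality score" idea described above: compare
# branded/navigational queries for a site against the generic queries
# that merely match it. All names and numbers here are illustrative.
def navigational_ratio(branded_queries: int, total_queries: int) -> float:
    """Fraction of queries that explicitly name the site or brand."""
    if total_queries == 0:
        return 0.0
    return branded_queries / total_queries

# e.g. "seo news searchengineland" counts as branded; "seo news" does not
ratio = navigational_ratio(branded_queries=320, total_queries=4000)
print(f"navigational ratio: {ratio:.2%}")   # 8.00%
```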
This makes sense in light of studies that show users have a strong bias toward brands.
For instance, when asked to perform a research task such as shopping for a party dress or searching for a cruise holiday, 82% of participants selected a brand they were already familiar with, regardless of where it ranked on the SERP, according to a Red C survey.
Brands and the recall they cause are expensive to create. It makes sense that Google would rely on them in ranking search results.
What does Google consider AI spam?
Google published guidance on AI-created content this year, which refers to its spam policies defining content that is “intended to manipulate search results.”
Google spam policies
Spam is “Text generated through automated processes without regard for quality or user experience,” according to Google’s definition. I interpret this as anyone using AI systems to produce content without a human QA process.
Arguably, there could be cases where a generative AI system is trained on proprietary or private data and configured for more deterministic output to reduce hallucinations and errors. You could argue this is QA before the fact. It’s likely to be a rarely used tactic.
Everything else I’ll call “spam.”
Generating this kind of spam used to be reserved for those with the technical ability to scrape data, build databases for mad-libbing or use PHP to generate text with Markov chains.
ChatGPT has made spam accessible to the masses with a few prompts, an easy API and OpenAI’s ill-enforced publication policy, which states:
“The role of AI in formulating the content is clearly disclosed in a way that no reader could possibly miss, and that a typical reader would find sufficiently easy to understand.”
OpenAI’s Publication Policy
The volume of AI-generated content being published on the web is enormous. A Google Search for “regenerate response -chatgpt -results” displays tens of thousands of pages with AI content generated “manually” (i.e., without using an API).
In many cases, QA has been so poor that “authors” left the “Regenerate response” button text from older versions of ChatGPT in their copy-and-paste.
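This kind of copy-paste QA failure is easy to scan for programmatically. A minimal sketch, with an illustrative (not exhaustive) phrase list:

```python
# Scan page text for telltale ChatGPT interface strings that a sloppy
# copy-and-paste leaves behind. The phrase list below is illustrative.
import re

AI_ARTIFACTS = [
    r"regenerate response",
    r"as an ai language model",
    r"as of my knowledge cutoff",
]
pattern = re.compile("|".join(AI_ARTIFACTS), re.IGNORECASE)

def find_artifacts(page_text: str) -> list[str]:
    """Return any leftover AI-interface phrases found in the text."""
    return pattern.findall(page_text)

sample = "Our top picks for 2023... Regenerate response"
print(find_artifacts(sample))   # ['Regenerate response']
```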
Patterns of AI content spam
When GPT-3 hit, I wanted to see how Google would react to unedited AI-generated content, so I set up my first test website.
This is what I did:
- Bought a brand new domain and set up a basic WordPress install.
- Scraped the top 10,000 games that were selling on Steam.
- Fed these games into the AlsoAsked API to get the questions people were asking about them.
- Used GPT-3 to generate answers to these questions.
- Generated FAQPage schema for each question and answer.
- Scraped the URL for a YouTube video about the game to embed on the page.
- Used the WordPress API to create a page for each game.
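The steps above can be sketched roughly as follows. The helper functions stand in for the real Steam scraping, AlsoAsked, GPT-3 and WordPress REST API calls; their names and signatures are illustrative, not real client libraries. The FAQPage structure follows schema.org.

```python
# Condensed sketch of the test-site pipeline: Q&A pairs in, a WordPress
# page payload with FAQPage structured data out. Names are hypothetical.
import json

def build_faq_schema(qa_pairs: list[tuple[str, str]]) -> str:
    """Emit FAQPage structured data (schema.org) for a page's Q&As."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    })

def build_page(game: str, qa_pairs: list[tuple[str, str]]) -> dict:
    """Assemble the payload a WordPress REST client could post."""
    return {
        "title": f"{game} - FAQ",
        "content": "\n".join(f"<h2>{q}</h2><p>{a}</p>" for q, a in qa_pairs),
        "schema": build_faq_schema(qa_pairs),
    }

page = build_page("Example Game", [("Is Example Game multiplayer?",
                                    "Yes, it supports online co-op.")])
print(page["title"])   # Example Game - FAQ
```

In the real test, the answers came from GPT-3 and the questions from AlsoAsked; the point of the sketch is how little glue code a 10,000-page site actually requires.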
There were no ads or other monetization features on the site.
The whole process took a few hours, and I had a new 10,000-page website with some Q&A content about popular video games.
Both Bing and Google ate up the content and, over a period of three months, indexed most pages. At its peak, Google delivered over 100 clicks per day, and Bing even more.
Google Search Console Performance data from this site presented by Lily Ray at PubCon
Results of the test:
- After about 4 months, Google decided not to rank some content, resulting in a 25% hit in traffic.
- A month later, Google stopped sending traffic.
- Bing kept sending traffic for the entire period.
The most interesting thing? Google did not appear to have taken manual action. There was no message in Google Search Console, and the two-step reduction in traffic made me skeptical that there had been any manual intervention.
I’ve seen this pattern repeatedly with pure AI content:
- Google indexes the site.
- Traffic is delivered quickly with steady gains week on week.
- Traffic then peaks, which is followed by a rapid decline.
Another example is the case of Causal.app. In this “SEO heist,” a competitor’s sitemap was scraped and 1,800+ articles were generated with AI. Traffic followed the same pattern, climbing for several months before stalling, then dipping around 25%, followed by a crash that eliminated nearly all traffic.
SISTRIX visibility data for Causal.app
There is some discussion in the SEO community about whether this drop was a manual intervention because of all the press coverage it got. I believe the algorithm was at work.
A similar and perhaps more interesting case study involved LinkedIn’s “collaborative” AI articles. These AI-generated articles invited users to “collaborate” with fact-checking, corrections and additions, and rewarded “top contributors” with a LinkedIn badge for their efforts.
As with the other cases, traffic rose and then dropped. However, LinkedIn maintained some traffic.
SISTRIX visibility for LinkedIn /advice/ pages
This data indicates that traffic fluctuations result from an algorithm rather than a manual action.
Once edited by a human, some LinkedIn collaborative articles apparently met the definition of useful content. Others were not, in Google’s estimation.
Maybe Google’s got it right in this instance.
If it’s spam, why does it rank at all?
From everything I have seen, ranking is a multi-stage process for Google. Time, expense, and limits on data access prevent the implementation of more complex systems.
While the assessment of documents never stops, I believe there is a lag before Google’s systems detect low-quality content. That’s why you see the pattern repeat: content passes an initial “sniff test,” only to be identified later.
Let’s take a look at some of the evidence for this claim. Earlier in this article, we skimmed over Google’s “Site Quality” patent and how they leverage user interaction data to generate this score for ranking.
When a site is brand new, users haven’t interacted with its content on the SERP, so Google can’t assess the quality of the content.
Well, another patent for Predicting Site Quality covers this situation.
Again, to grossly oversimplify, a quality score for new sites is predicted by first obtaining a relative frequency measure for each of a variety of phrases found on the new site.
These measures are then mapped using a previously generated phrase model built from quality scores established from previously scored sites.
Predicting Site Quality patent
If Google were still using this (which I believe it is, at least in some small way), it would mean that many new websites are ranked on a “first guess” basis, with a quality metric included in the algorithm. Later, the ranking is refined based on user interaction data.
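To grossly oversimplify the patent's mechanism in code: take the relative frequency of phrases on a new site and look each one up in a model built from sites whose quality scores are already known. The phrase model values below are invented for illustration; the patent does not publish any such table.

```python
# Toy sketch of the Predicting Site Quality idea: a new site's quality
# is predicted from the relative frequency of phrases it shares with
# previously scored sites. The model values here are invented.
from collections import Counter

# hypothetical model: phrase -> average quality score of sites using it
phrase_model = {"buy now": 0.2, "in this article": 0.5, "peer reviewed": 0.9}

def predicted_quality(site_phrases: list[str]) -> float:
    """Weight each known phrase's score by its relative frequency on the site."""
    counts = Counter(p for p in site_phrases if p in phrase_model)
    total = sum(counts.values())
    if total == 0:
        return 0.0   # no known phrases: nothing to predict from
    return sum(phrase_model[p] * n / total for p, n in counts.items())

score = predicted_quality(
    ["buy now", "buy now", "in this article", "peer reviewed"])
print(round(score, 3))   # 0.45
```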
I have observed, and many colleagues agree, that Google sometimes elevates sites in ranking for what appears to be a “test period.”
Our theory at the time was that Google was measuring whether user interaction matched its predictions. If not, traffic fell as quickly as it rose. If the site performed well, it continued to enjoy a healthy position on the SERP.
Many of Google’s patents have references to “implicit user feedback,” including this very candid statement:
“A ranking sub-system can include a rank modifier engine that uses implicit user feedback to cause re-ranking of search results in order to improve the final ranking presented to a user.”
AJ Kohn wrote about this kind of data in detail back in 2015.
It is worth noting that this is an old patent and one of many. Since this patent was published, Google has developed many new solutions, such as:
- RankBrain, which has specifically been cited to handle “new” queries for Google.
- SpamBrain, one of Google’s main tools for combatting webspam.
Google: Mind the gap
I don’t think anyone outside of those with first-hand engineering knowledge at Google knows exactly how much user/SERP interaction data would be applied to individual sites rather than the overall SERP.
Still, we know that modern systems such as RankBrain are at least partly trained on user click data.
One thing also piqued my interest in AJ Kohn’s analysis of the DOJ testimony on these new systems. He writes:
“There are a number of references to moving a set of documents from the ‘green ring’ to the ‘blue ring.’ These all refer to a document that I have not yet been able to locate. However, based on the testimony it seems to visualize the way Google culls results from a large set to a smaller set where they can then apply further ranking factors.”
This supports my sniff-test theory. If a website passes, it gets moved to a different “ring” for more computationally or time-intensive processing to improve accuracy.
I believe this to be the current situation:
- Google’s current ranking systems can’t keep pace with AI-generated content creation and publication.
- As gen-AI systems produce grammatically correct and mostly “sensible” content, they pass Google’s “sniff tests” and will rank until further analysis is complete.
Herein lies the problem: the speed at which this content is being created with generative AI means there is an unending queue of sites waiting for Google’s initial evaluation.
An HCU hop to UGC to beat the GPT?
I believe Google knows this is one major challenge they face. If I can indulge in some wild speculation, it’s possible that recent Google updates, such as the helpful content update (HCU), have been applied to compensate for this weakness.
It’s no secret the HCU and “hidden gems” systems benefited user-generated content (UGC) sites such as Reddit.
Reddit was already one of the most visited websites. Recent Google changes more than doubled its search visibility, at the expense of other websites.
My conspiracy theory is that UGC sites, with a few notable exceptions, are some of the least likely places to find mass-produced AI, as much content is moderated.
While they may not be “perfect” search results, the overall satisfaction of trawling through some raw UGC may be higher than Google consistently ranking whatever ChatGPT last vomited onto the web.
The focus on UGC may be a temporary fix to boost quality; Google can’t tackle AI spam fast enough.
What does Google’s long-term plan look like for AI spam?
Much of the testimony about Google in the DOJ trial came from Eric Lehman, a former 17-year employee who worked there as a software engineer on search quality and ranking.
One recurring theme was Lehman’s claim that Google’s machine learning systems, such as BERT and MUM, have become so powerful that Google is likely to rely on them more than on user data in the future.
With slices of user interaction data, search engines have an excellent proxy for which they can make decisions. The limitation is collecting enough data fast enough to keep up with changes, which is why some systems employ other methods.
Suppose Google can build their models using breakthroughs such as BERT to massively improve the accuracy of their first content parsing. In that case, they may be able to close the gap and drastically reduce the time it takes to identify and de-rank spam.
This problem exists and is exploitable. The pressure on Google to address its shortcomings increases as more people search for low-effort, high-results opportunities.
Ironically, when a system becomes effective in combatting a specific type of spam at scale, the system can make itself almost redundant as the opportunity and motivation to take part is diminished.
Fingers crossed.
Thursday, December 21st, 2023
One of the joys of living in a place where the winters tend to be long and dark is the time it allows for reading. Make a fire in the fireplace, pour yourself a drink and open a good book.
I often do the bulk of my reading for the year between October and March; the rest of the year is outside time (which isn’t to say you can’t read outside).
We live in a time when we’re surrounded by marketing. Everything and everyone seems to be vying for our attention.
If you work in marketing, the idea of reading a book about something you do all day and that surrounds you every waking moment might sound unappealing. Yet, there are still new things to be said and new things to learn about marketing.
As you head into 2024, here are six books to put on your reading list.
The first thing you’ll likely notice is they aren’t all actually about marketing. But marketing is an essential part of every business and every leader must be a marketer to be successful.
1. Impossible to Ignore: Creating Memorable Content to Influence Decisions
The central focus of Dr. Carmen Simon’s book is the creation of memorable presentations, which is an area where many people have just enough knowledge of PowerPoint and Google Slides to be dangerous.
The problem with many of the day-to-day presentations we see in sales and business, in general, is they try to function as both a presentation and a leave-behind. That leaves them packed with information and light on strong visuals and stories, and those are the exact elements that stick in our memories and promote recall.
As evidence that the techniques in the book work, I like to refer to how Simon uses them in the book itself.
Years after first reading “Impossible to Ignore,” I remember her anecdote about standing in line at a store when she was a child in Soviet-era Romania. Food was in short supply, so the workers had to limit the number of people in line. They decided to send home everyone behind the girl who stood out in a bright red coat: a young Simon. The combination of a strong visual and a powerful story burned that into my mind.
2. Running with Purpose: How Brooks Outpaced Goliath Competitors to Lead the Pack
Why would a memoir by the CEO of an athletic shoe company make the list? Because marketing, at its essence, is about identifying and creating markets for whatever you’re selling.
When Jim Weber took over as CEO of Brooks, the company was trying to be everything to everyone who wore sneakers. That’s a lot of people in a market with many big brand names.
Weber and team decided to drop a large portion of the market by leaving the “athleisure” business, which consists of the low(ish)-cost sneakers people wear around the house or when they’re doing chores. They decided instead to focus on serious runners.
This one also has a great marketing play involving luxurious portable toilets Brooks brought to major races. To gain entry, runners had to be wearing Brooks footwear.
There’s a lesson on market disruption, too. Remember the craze over five-finger running shoes? Yeah, that was fun.
3. Unreasonable Hospitality: The Remarkable Power of Giving People More Than They Expect
Will Guidara has a unique resumé. Among his roles: restaurant owner, creative agency leader, conference host and the author of four cookbooks.
His specialty is hospitality. One of his guiding beliefs is that hospitality need not be limited to what we think of as the hospitality industry (i.e., restaurants, spas, hotels). Instead, businesses across industries can create experiences that delight customers and drive more business.
As Guidara rose to prominence in the restaurant business in New York City, his business became legendary for providing experiences like sledding in Central Park for a family that had never before experienced snow.
The moments of brilliance and generosity in the book could serve as a lesson for corporations across the business spectrum. Americans have relatively dim views of large corporations and financial institutions in general. They feel much better about small businesses, which are more nimble and structured in a way that makes personal touches possible.
Many marketers will tell you their brand is more than a logo or color palette, it evokes emotions and, most importantly, trust. In “Unreasonable Hospitality,” you get a view of what this truly looks like in practice.
4. The Power of Moments: Why Certain Experiences Have Extraordinary Impact
We can’t remember every detail of every experience. If you’ve ever watched a courtroom drama, you’ve seen this play out.
“So what you’re saying is, you’re not sure if the suspect had a beard or not when you saw him on that misty, moonless night?”
We remember the peaks of our experiences most of all. Sometimes, we remember the valleys of our experiences. Everything else gets labeled as “just not important enough to remember” by our memory.
In “The Power of Moments,” Chip Heath and Dan Heath help readers understand how our minds process and classify experiences. Once you understand how this all works subconsciously, it’s much easier to be deliberate in creating moments that matter for our audiences.
As a blueprint, the book looks at events that weren’t necessarily designed to be memorable, such as a “Signing Day” ceremony for graduating high school seniors where they announced which college they were attending. It then deconstructs the events to see what exactly made them memorable.
5. Humanizing B2B: The new truth in marketing that will transform your brand and your sales
Download a whitepaper. Get calls from sales reps. Receive email after email.
For years, the B2B marketing playbook was pretty boring – even a bit annoying. It’s improved to some extent but still has a long way to go. You probably know the feeling if you have friends who work in B2C marketing.
“Oh, you’re doing a Super Bowl ad? That must be exhausting for you…”
What if it didn’t have to be this way? (Spoiler alert: it doesn’t.)
Instead of being the boring part of marketing, Paul Cash and James Trezona say, B2B should appeal to the emotions of people trying to transform organizations and create change.
They draw heavily on research from The B2B Institute at LinkedIn to make the case that B2B buyers rely on emotions just as much as their B2C counterparts.
That makes a great deal of sense, when you think about it. Because they aren’t actually counterparts. They are the same people, and they don’t take off their B2B hat and put on a B2C hat when they finish their workday.
6. Obviously Awesome: How to Nail Product Positioning So Customers Get It, Buy It, Love It
Part of what I enjoy about April Dunford’s story is that, like me, she never set out to be a marketer. As someone without a formal marketing education, she asked a lot of questions. The answers left her unsatisfied.
“Trust me, it works.”
“Because we’ve always done it that way.”
The result is “Obviously Awesome,” a book that re-thinks product marketing from an outsider’s perspective.
The most difficult part for people trying to turn their product into a story that resonates with customers is where to start. Do you craft a story that starts with your features? Or do you focus first on the customers’ needs? What about differentiation?
You’ll have to read the book to find out.
Thursday, December 21st, 2023

Data storytelling helps marketers present complex information in a relatable and compelling format – which plays a heavy hand in engaging consumers, influencing decisions and creating brand loyalty.
Join experts from Marigold for a 20-minute data storytelling masterclass and learn how to increase your conversions using data storytelling.
Register and attend “Data Storytelling Masterclass,” presented by Marigold.
Click here to view more Search Engine Land webinars.
Wednesday, December 20th, 2023
Google is reportedly planning a major reshuffle of its 30,000-person ad sales unit.
Sean Downey, who is in charge of ad sales to big customers in the Americas, announced plans to restructure the ad sales teams during a department-wide meeting last week, according to The Information.
Downey did not comment on whether the reorganization would include layoffs during the meeting.
Why we care. This news could be perceived as another sign that Google Ads is leaning toward full automation, which may disadvantage some advertisers, particularly those with smaller budgets, as they lack the financial resources to monitor and experiment with AI asset and budget variations.
Revenue. In October, Google revealed an 11% year-on-year increase in overall revenue, reaching $76.7 billion in Q3. Notably, ad revenue surged from $54.5 billion to $59.65 billion, the highest total in that category in nine quarters. Given the profitable year the company has enjoyed, potential layoffs may come as a surprise.
So why now? The news comes as Google continues to invest in AI and machine learning to facilitate increased ad purchasing, diminishing human involvement. In line with this, Search Engine Land reported earlier today that Google aims to improve support in Google Ads by leveraging AI further.
What Google is saying. A Google spokesperson did not immediately respond to our request for a comment.
First Google mass layoffs. Earlier this year, in January, Google’s CEO Sundar Pichai announced the company would be letting go of 12,000 employees and contractors – approximately 5% of their total workforce – in the company’s first-ever round of mass layoffs. In an email to staff, he said:
- “I have some difficult news to share. We’ve decided to reduce our workforce by approximately 12,000 roles. We’ve already sent a separate email to employees in the US who are affected. In other countries, this process will take longer due to local laws and practices.”
It’s important to note that Google has not announced layoffs. Currently, the company has reportedly only confirmed a restructure of the ad sales unit.
Deep dive. Read our Automation Layering guide for more information on “how PPC pros retain control when automation takes over.”
Wednesday, December 20th, 2023
Campaign management software helps you automate the manual tasks of planning, launching and measuring the impact of your marketing campaigns.
Modern marketing campaigns often have many moving parts. Campaigns may involve multiple:
- Internal departments (e.g., creative, media, brand, legal).
- External partners (e.g., agencies, media, co-marketing).
Campaign management software helps keep stakeholders organized. Once the campaign is live, these tools help:
- Automate data collection on performance.
- Generate insights.
Understanding the role of campaign management software
The biggest benefit of campaign management tools?
Organization.
All team members working on a marketing campaign must be organized before, during and after launch to be efficient and informed.
The right campaign management tools will improve collaboration and help teams move faster at each campaign stage, from preparing assets before launch to summarizing results after the campaign.
However, no true end-to-end, plug-and-play campaign management tool really exists.
Instead, look for best-of-breed solutions that integrate well with your existing tools and skill sets to build an ecosystem that meets your campaign management needs.
Key features and capabilities of marketing campaign management software
Marketing teams will benefit from campaign management solutions that help them across the campaign lifecycle of planning, tracking and analyzing.
The more tightly the tools are integrated, either out of the box or through the efforts of developers or marketing operations professionals, the better the experience will be for the marketers who use them.
Adopting several point solutions that cannot share information won’t create the efficiency you’re looking to gain from campaign management tools.
Campaign planning and scheduling
Aligning all the people and assets poses a significant challenge for teams, especially in distributed workforces.
Project management tools from several vendors can be used to track progress, assign work and send notifications. These tools offer automation, integrations and personalization capabilities that allow your team members to use them in a way that fits their work style.
The vendors with tools that will help your team with its campaign planning and scheduling include:
- Asana
- Monday.com
- SmartSheet
- ClickUp
- Trello
- Wrike
Audience segmentation and targeting tools
Using data and segmentation helps you avoid the inefficiency inherent in spray-and-pray marketing tactics. But you need detailed audience data, and tools that help segment that data into lists.
Many well-known cloud-based marketing tools can build lists based on certain criteria. Many also include the functionality to engage with the list members via email or create campaigns running on other channels (like paid ads) and connect to data sources to import data from those channels.
The vendors that make tools to help with audience segmentation and targeting include:
- HubSpot
- Salesforce
- Adobe
- ActiveCampaign
- Keap
- Klaviyo
Content creation and personalization
Many segmentation and targeting vendors also offer personalized marketing outreach. Customer data platform (CDP) vendors also help marketers craft personalized messages.
The combination of data and personalization functionality allows you to deliver the right message to the right person at the right time, which is a powerful combination for influencing prospects.
Multi-channel campaign execution
“Marketing cloud” vendors also offer visibility into your campaigns across channels, though it will likely require some work from the tool’s administrator or a marketing operations pro to get this working.
Leadership teams love high-level, holistic views, so the fewer data sources for your team to track down the better. You don’t want your team to spend valuable time collecting data from disparate sources to put in a slide deck for campaign updates and reviews.
Tracking and analytics
You might find that your existing campaign management software can’t work with all the data generated by your campaigns.
If that’s the case, you can pull together reporting tools from vendors like SAS, Tableau, SmartSheet and others to help manage your data and monitor campaign results.
Benefits of campaign management software
It’s difficult to coordinate, optimize and report on your campaigns without campaign management software.
While the exact tools and functionality you need will depend on many factors (i.e., your existing martech stack, your budget and your team), the right combination of tools will increase your efficiency and contribute to better outcomes.
Among the benefits of using campaign management software:
Improved efficiency and productivity
Some tools will help your team automate the often-mundane tasks that are simply part of marketing campaigns (e.g., deadlines, reporting). Other tools will help improve collaboration between the various roles that come together to create a marketing campaign.
Enhanced collaboration
Planning a marketing campaign often involves an assembly line that takes ideas from concept to reality and then introduces them to the market across various channels. That requires several job functions and, depending on your organization, possibly several departments.
Campaign management tools that establish deadlines and responsibilities keep the assembly line moving.
Data-driven decision making
Campaign management tools that collect and analyze data are important for quickly identifying how a campaign or portion of a campaign is performing. Having this data readily available allows for optimizations, such as diverting budget from an under-performing channel to an over-performing channel or making adjustments to creative units.
Software that helps analyze data also plays a key role in explaining the outcomes of a campaign to leadership teams that just want to know what they got for their investment in a campaign.
Considerations for choosing the right campaign management software
If you’re like most marketers, you’re trying to navigate lofty business goals, resource shortages and a variety of tools in a martech stack you inherited from someone else.
Licensing costs will play a critical role in investing in any technology. Here are four other criteria you can use to help make sound decisions around campaign management tools.
Scalability and flexibility
Tools that are too rigid and can’t grow with the organization make for poor investments.
Marketing is constantly changing, from the channels to the regulations that govern data collection and use, to new leadership with new ideas. You need to focus on tools designed to withstand these changes.
A constant cycle of ripping and replacing tools will hinder progress toward your goals.
User-friendliness and training
Many marketing teams are filled with people who can do a little bit of everything.
Deploying campaign management software with a steep learning curve not only delays progress but also creates bottlenecks when only certain people are capable of using it correctly.
Everyone needs training when a new tool is deployed, so make sure the vendor supports it. But the easier tools are to use, the better.
Customer support and service
Even with training, your team is likely to hit some snags along the way. Good customer service from your campaign management software vendor can help your team keep moving, share best practices and introduce new features and functionality.
Look for the areas where your team has the most need. Start there.
If your campaign is getting stuck at the same point in the assembly line, then that’s your first area to address.
While software is certainly one way to solve a problem, don’t neglect other opportunities to increase efficiency.
If it’s a problem with people or processes, software alone is unlikely to fix it.
Wednesday, December 20th, 2023
Google has quietly started testing placing headlines within the ad copy description text in live ads.
Advertisers were not given prior notice about the ad copy variation experiment, and the uncertainty about the potential expansion of this test to more accounts has led to frustration within the community.
Why we care. Changing the rules without informing advertisers can make it harder for them to do their jobs and know what needs to be prioritized. The impact is even more significant for advertisers with smaller budgets, as assessing the changes, especially with responsive search ads, becomes challenging, adding to their workload.
What Google is saying. Google Ads liaison officer Ginny Marvin addressed concerns about ad variations following multiple reports on the topic during a PPC Chat Q&A. She said:
- “This is a small test and I don’t have anything further to share on this at this time.”

Just a small test? Despite Google’s comments, not everyone is convinced that the ad variation experiment is a “small test”. Google Ads expert Anthony Higman told Search Engine Land:
- “While I understand that Google rolls out tests to the SERPs and paid ads, this test seemed to be more far-reaching in that everyone on my team and other people in the PPC community were seeing this in live ads. So this seems like it is one of the larger tests taking place.”
- “While I understand testing of paid ads, I think we are all just a little over the massive amount of tests and changes that have taken place this year and last.”
- “This test also seems different to me in that they are altering known elements of a search ad by making ad headlines show as descriptions or almost like “call out” assets in front of ad copy descriptions. This is troublesome because these changes alter the dynamics of ad copy that are well known by all Google advertisers.”
Calls for more transparency. Higman, who first flagged the ad variation test on X, went on to explain how a lack of transparency from Google can impact advertisers:
- “I think that since this test and other recent tests are changing ad copy dynamics that they need to be mentioned since it can alter planned out and tested ad copy in accounts.”
- “As others have mentioned, this also can change rules for certain more restrictive ad verticals like legal and medical where ad copy variations need to be approved before rolling out live.”
A move towards full automation? Commenting on the ad variation “small test,” as well as other experiments he’s witnessed within Google Ads recently, Higman claimed that Google appears to be heading towards full automation, which could be problematic:
- “My point with all of these tests and also with the advancement of auto applied assets, recommendations, GBP connected ads, photos and also new asset format variations is that it seems as if everything is a new A/B test with every advertiser using Google ads.”
- “While this may be beneficial for advertisers with larger budgets, there is no statistical significance that can be gleaned on smaller spend accounts. Also we can no longer see what these changes are doing to our ad data because we don’t know what asset variations are doing to our CTR’s.”
- “So all of these new tests plus the dwindling ad data and lowered visibility of search query data is just further forcing us towards full automation which will not be a good fit for all advertisers using Google ads.”
Deep dive. Read our guide on How to write compelling ad copy in a Smart Bidding landscape for more information.
Wednesday, December 20th, 2023
Google Ads has confirmed that support is not being phased out.
With the introduction of a paid support service in August, concerns arose among advertisers about the potential withdrawal of the free feature. The perceived decline in customer experience further fuelled the belief that the free support might no longer be a priority.
However, Google Ads has now revealed that big changes are on the horizon with AI expected to play a major role moving forward.
Why we care. Advertisers, particularly those with smaller budgets, rely on support from their Google Ads rep to address campaign issues. Losing this service would make it more challenging to resolve problems affecting campaign performance and, consequently, ad revenue.
What Google is saying. During a live PPC Chat Q&A, Google Ads liaison officer Ginny Marvin addressed concerns about support being phased out. She said:
- “I’m very aware of questions and concerns about this topic.”
- “Support isn’t being phased out but changes are being made. There have long been challenges on this front, as everyone is likely aware. I’ve talked about this before, but Support was one of the areas I wanted to understand better when I joined.”
- “With the scope of inquiries, it’s not an easy solve. That said, I know there are real frustrations about the current state, including chat.”
- “I do think Support is an area where LLMs/Google AI will be able to make big strides in improving experiences. That’s not happening yet, but work is underway. Stay tuned.”
Support issues. SMX Next speaker and PPC expert Julie F Bacchini explained that advertisers suspected support was being phased out for several reasons, telling Search Engine Land:
- “A lot of people have made comments like this lately, so this seems like an important question to [address]. People think it is being phased out for a few reasons…”
- “It has gotten worse and harder to get answers lately.”
- “Everything takes longer to get resolved.”
- “The pilot program where you can pay for an actual call.”
AI replacing human support? Commenting on Marvin’s explanation as to what support will look like moving forward with AI playing a more significant role, Bacchini added:
- “I’m not surprised Google Ads is trying to bring AI into support. I think, like many companies, Google would love to find ways to have AI take over functions.”
- “For some low level tasks, it might be fine – but I can’t see AI ever totally replacing human support.”
Paid support pilot. Marvin later confirmed that Google Ads’ paid support pilot is still ongoing; however, there is no update on this front. In August, the platform’s enhanced customer service feature, which offers one-on-one support tailored to specific customer needs, was rolled out to small businesses as part of a new paid pilot for the first time. Historically, this level of one-on-one support has only been offered to Google Ads’ biggest clients.
Deep dive. Visit the Google Ads Help Center for more information on the support services it offers.
Tuesday, December 19th, 2023
SEO can be complicated – in many cases, overcomplicated. It’s easy to get lost in the SEO rabbit hole, spending significant time with minimal results.
This article will help you cut through the noise and focus on the four key pillars of SEO that will help you improve visibility in 2024 and beyond.
The four pillars of SEO
The four key areas of SEO that site owners need to consider are:
- Technical SEO: How well your content can be crawled and indexed.
- On-site SEO: The optimization of your content and HTML.
- Content: Having the most relevant and best answers to a prospect’s question.
- Off-site SEO: Building authority to boost trust and rankings.
By procedurally working through these four SEO pillars, you can improve your visibility, traffic and engagement from organic search.
1. Technical SEO
Technical SEO can seem a little daunting, but you need it to make sure search engines can read your content and explore your site.
Much of this will be taken care of by the content management system you use, and Google’s Search Console can help you understand the technical makeup of your site.
The main areas to consider here are:
- Crawl: Can a search engine explore your site?
- Indexing: Is it clear which pages the search engine should index?
- Mobile: Does your site provide a solid mobile experience?
- Speed: Do your webpages load fast on mobile, desktop and beyond?
- Technology: Is your website search engine-friendly?
- Hierarchy: Is the content organized to help categorization?
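The crawl question above is something you can spot-check yourself. Below is a minimal Python sketch using the standard library’s `urllib.robotparser` to test whether Googlebot is allowed to crawl given URLs; the robots.txt rules and URLs are invented for illustration, so substitute your own site’s values.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt body -- replace with your own site's rules.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Check whether Googlebot may crawl a couple of example URLs.
for url in ("https://example.com/services", "https://example.com/admin/login"):
    status = "crawlable" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", status)
```

In practice you would point `RobotFileParser` at your live robots.txt with `set_url()` and `read()`, but parsing a local copy like this lets you test rule changes before deploying them.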
If you are a small business using WordPress (or a similar CMS) for your website, technical SEO should be something you can check off your list pretty quickly. If you have a large, bespoke website with millions of pages, technical SEO becomes much more important (and troublesome).
In 2024 and beyond, much of what is “technical SEO” is actually part of your website and CMS. The key is to collaborate with a developer who grasps SEO principles and builds you a website that is SEO-friendly and properly configured. Doing this should get you most of the way toward effective SEO.
Note: If you are a small or micro business, don’t obsess over this too much or feel you have to perfect everything. We still see lots of sites that are essentially doing everything wrong and rank well, so just do your best!
2. On-site SEO
Once your technical SEO is in a good place, you need to optimize the content on your site.
Structural optimization
The first job here is to ensure your site is structured in a way that helps Google understand the relevance of every page. Think of your website as a filing cabinet. The website is the cabinet, the sections are drawers, and the pages are folders within those drawers.
You should be able to draw this structure on the back of a napkin and understand how everything relates.
- Home
- Services
- Locations
- Team
  - Department
    - Team Member A
    - Team Member B
- Case Studies
  - Case Study A
  - Case Study B
- About Us
- Contact
You get the picture – and so, hopefully, does Google. By structuring your site like this, you provide context for each page before Google has even examined it, setting the scene for the page itself to be optimized.
Page-level optimization
With a sensible structure in place, you can now optimize the individual pages.
The main areas to focus on here are:
- Keyword research: Understand the language of your target audience.
- Descriptive URLs: Ensure each URL is simple and descriptive.
- Page titles: Use keywords naturally within the page title.
- Meta descriptions: Craft meta descriptions like they were ad copy to improve click-through rates.
- Content optimization: Sensibly use keywords and variations in your page copy.
- User experience (UX): Ensure your site is a joy to use and navigate.
- Strong calls to action: Make it easy for your users to know what to do next.
- Structured data markup: Help Google understand your content.
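On the structured data point, a common starting place is a JSON-LD snippet describing your business. The sketch below uses Python to generate the script tag you would paste into a page’s head; the business details are hypothetical, and Google’s structured data documentation covers which schema.org types it actually supports.

```python
import json

# Hypothetical business details -- replace with your own.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "url": "https://www.example.com/",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 High Street",
        "addressLocality": "London",
        "addressCountry": "GB",
    },
}

# Emit the <script> tag to place in the page's <head>.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(local_business, indent=2)
    + "\n</script>"
)
print(snippet)
```

Generating the markup from a dictionary like this keeps it valid JSON, which is the most common thing validators flag in hand-written JSON-LD.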
If you have taken the time to structure your site correctly, then the on-page optimization is fairly simple to layer over. If it helps, put your list of pages in a spreadsheet and detail the keyword you want to optimize each page for.
Plenty of tools can assess how well a page is optimized for a given term, which can aid you with the nuts and bolts of the optimization.
Don’t think of this as a one-time job, either. Once the site is indexed, you can gather much more information from Google Search Console on what keywords each page ranks for and refine your optimization at a page-by-page level.
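The page-to-keyword spreadsheet idea can also be automated with a simple check. This Python sketch flags pages whose titles don’t yet mention every word of their target keyword; the paths, keywords and titles are made up for illustration.

```python
# Hypothetical page-to-keyword map, as you might keep in a spreadsheet.
keyword_map = {
    "/services/boiler-repair": "boiler repair london",
    "/services/bathroom-fitting": "bathroom fitting",
}

# Hypothetical current page titles, e.g. scraped or exported from your CMS.
page_titles = {
    "/services/boiler-repair": "Boiler Repair in London | Example Plumbing Co.",
    "/services/bathroom-fitting": "Our Services",
}

def title_covers_keyword(title: str, keyword: str) -> bool:
    """True if every word of the target keyword appears in the title."""
    title_lower = title.lower()
    return all(word in title_lower for word in keyword.split())

# Collect pages whose titles need attention.
flagged = [path for path, kw in keyword_map.items()
           if not title_covers_keyword(page_titles[path], kw)]
print(flagged)
```

A check this crude won’t judge how natural a title reads, but it is a quick way to find pages where the target keyword was never worked in at all.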
3. Content
Content is king. That’s the saying, right?
It’s true in a way. Your website is just a wrapper for your content.
Your content tells prospects what you do, where you do it, who you have done it for and why someone should use your business.
And if you’re smart, your content should also go beyond these obvious brochure-type elements and help your prospective customers achieve their goals.
For service businesses, we can loosely break your content down into four categories:
- Business Information. Who you are and why people should care.
- Service content. What you do and where you do it.
- Credibility content. Why a prospect should engage with your business.
- Marketing content. Content that helps position you as an expert and puts your business in front of prospects earlier in the buying cycle.
SEO is important for each type of content in different ways. SEO is often forgotten about when it comes to credibility content like case studies and reviews, but in an E-E-A-T world, this is a lost opportunity.
For example, I recently renovated a Victorian-era house in the UK. The house is 140 years old, falling apart, and known as The Money Pit!
Finding good people to help with this project was difficult, and it was those with good testimonials and case studies that we ended up:
- Finding via localized search results.
- Using due to the clear examples of relevant experience and expertise.
E-E-A-T may seem like another painful SEO acronym to work around, but in reality, E-E-A-T just represents what we, as consumers, want.
Adjusting your thinking to demonstrate your E-E-A-T in your content will only help you rank more highly, get more visitors and convert those clicks into customers!
Further still, ensure you optimize your marketing content, including case studies, portfolio entries and testimonials – not just the obvious service pages.
For larger businesses, a solid content marketing and SEO strategy is also the most scalable way to promote your business to a wide audience.
This generally has the best ROI, as there is no cost per click – so you are scaling your marketing without directly scaling your costs.
Layer in some remarketing and other ads that build on this first organic touch, and you can be onto a winning mix of tactics.
Ensure your SEO tactics align with your overall SEO strategy. We still see too many paint-by-numbers approaches to SEO, where local businesses are paying agencies or using generative AI tools to pump out blog posts that will never rank.
Create content that helps your customers either find you or choose you and focus on optimizing that.
Dig deeper: What is helpful content, according to Google
4. Off-site authority building
Eventually, all SEO rivers run to this one spot: authority building.
Historically, building your authority was all about link building, a practice that had become much abused and maligned by 2024.
Authority is still crucial to developing strong organic rankings and is part of the E-E-A-T approach. However, this can be the hardest part of SEO to get right.
The best way I have ever seen to describe the right link-building mindset was penned by the late, great Eric Ward: “Connect what should be connected.”
This philosophy is beautiful in its simplicity and corrects the “more, more, more” mentality of historic link building. We only want links from relevant sources.
Often, this means that to scale our link building efforts beyond the obvious tactics, we need to create something that deserves links. You have links where it makes sense for you to have links. Simple.
Wikipedia has millions of links, yet I am pretty sure they have never done any link building. This is because they have reams of useful content that gets linked to. These are real, natural links that enrich the linking page, provide further context and serve as the real connective tissue of this hyperlinked world we live in.
This kind of natural link should be the backbone of your link-building efforts. This may mean you have to revisit the content on your site and create something of value first, but if you can nail that, you are halfway home.
Any safe, scalable link building strategy should be built on this mindset.
Summary
SEO becomes more manageable when broken down into four core pillars.
- Technical SEO ensures search engines can properly crawl and index your site.
- Optimizing on-page elements provides helpful cues to search engines regarding relevance and hierarchy.
- Investing time and resources into useful content that answers your customers’ questions and establishes your expertise lays the groundwork for rankings.
- Taking a strategic yet authentic approach to external authority building cements your site’s place as a trusted resource on relevant topics.
Make sure to establish clear SEO goals and track your performance KPIs to continually improve on these four pillars.
Tuesday, December 19th, 2023
With a shared commitment to fostering a secure and trustworthy environment for publishers and advertisers, BidsCube and Pixalate have partnered to fortify the digital advertising industry and make it more transparent.
Pixalate’s fraud protection, privacy and compliance analytics solutions integrate into BidsCube’s programmatic ecosystem, providing clients with comprehensive tools powered by AI and machine learning. It will enhance the company’s quality approach, reinforcing its dedication to delivering safe and compliant programmatic advertising products.
“As programmatic advertising continues to evolve, addressing ad fraud and ensuring transparency has become paramount for success,” said Dmytro Chebakov, CEO of BidsCube. “Our collaboration with Pixalate reflects our commitment to delivering trustworthy and secure programmatic advertising solutions.”
Why we care. The partnership between BidsCube and Pixalate is a substantial step toward a more secure ecosystem that benefits advertisers and publishers. The latest report by the Association of National Advertisers confirms ongoing concern about transparency issues in the programmatic market, and this deal marks a significant milestone in the fight against ad fraud and for transparency in programmatic advertising.
“Our partnership with BidsCube provides their customers with comprehensive fraud detection and prevention solutions,” said Jalal Nasir, CEO of Pixalate. “We are encouraged by their proactive approach in creating a programmatic advertising ecosystem built on transparency, efficiency and quality.”
How it works. For BidsCube customers, the new features will now be available within all platforms out of the box. Users can navigate the built-in Pixalate panel to customize all the necessary settings. For partners using managed services, a traffic monitoring option with the Pixalate solutions will be available on demand. The company aims to provide every party involved with a more secure and reliable environment.
Why it is important for advertisers and publishers. This partnership will impact more than 250 of BidsCube’s global programmatic partners that use the company’s products, providing reliable publishers with more revenue and advertisers with high-quality advertising inventory.