Entities, topics, keywords: Clarifying core semantic SEO concepts
Written on September 7, 2023 at 6:11 am, by admin
In this article I’ve curated some of the most commonly misunderstood elements of entity and semantic SEO, aiming to clear the fog around them.
Here’s our roadmap:
- What are the differences between entities, topics and keywords?
- How do entities affect keyword research?
- What is a topical map?
- How do you do on-page optimizations in the entity paradigm?
- What role does schema markup play in entity SEO?
- What are semantic networks and how do you build them?
What are the differences between entities, topics and keywords?
One of the most common areas of confusion I’ve seen when discussing entities is what differentiates keywords, topics and entities from each other.
Because entities, keywords and topics are intertwined elements in the vast landscape of SEO, it can often be difficult to truly tease apart each definition.
Entities
These are the foundational concepts or things in content. At a basic level, entities can be singular nouns, like “chocolate cake” or “iPhone.”
But they can also represent more complex named concepts, such as events like “The Olympic Games” or places like “Mount Everest.”
In SEO’s semantic framework, entities are unique, identifiable concepts consistent across various texts or contexts. They aren’t tied to specific phrases but represent broader ideas.
Within a keyword phrase like “delicious chocolate cake recipe,” “chocolate cake” is the entity.
On a larger scale, for a website about tech reviews, entities such as “smartphones”, “laptops”, and “gadgets” guide its overarching themes, signaling to search engines the primary subject matter.
Keywords
These are the specific phrases or terms users type into a search engine. They’re the bridge between the user’s intent and the content they’re trying to find.
Keywords can encapsulate one or more entities, reflecting what users are actively seeking.
For instance, while “iPhone” is an entity, a keyword that encompasses it might be “iPhone 12 Pro Max review.”
Topics
Topics are thematic areas or categories that encapsulate one or more entities. Think of a topic as an umbrella under which multiple entities can reside.
For example, under the topic “Smart Home Technology,” entities could include “Google Nest Hub,” “smart thermostats,” and “IoT security.”
Here’s a summary of their key distinctions:
Scope
- Topics are broad and can encompass multiple entities and even various keywords.
- Entities are more specific and focused.
- Keywords are the specific searchable terms related to both.
Hierarchical relationship
- Entities usually fall under topics.
- Keywords can align with either entities or topics or sometimes both.
SEO
- Topics guide your broader content strategy.
- Entities help focus and refine that strategy.
- Keywords serve as the target for actual search queries.
Intersection
- Topics can be used to form content clusters.
- Entities and keywords serve to refine and specify the content within those clusters.
User intent
- Topics guide the user through their informational journey.
- Entities provide specific answers.
- Keywords can be crafted to meet specific user queries.
Semantic networks (in the context of SEO)
- Topics often serve as nodes in semantic networks that connect related entities.
- Keywords serve as the pathways that lead users to those nodes.
The relationship between topics and keywords manifests in how keywords help flesh out the various aspects of a given topic.
In a well-structured content strategy, the keywords you target should naturally fall under the umbrella of your chosen topics.
This ensures that your content is relevant and comprehensive and satisfies a range of user intents related to your topic.
In essence, topics serve as the overarching themes that guide the scope and direction of your content, providing a high-level focus.
Entities further sharpen this focus, giving search engines like Google a nuanced lens to understand the core essence of your content, both on micro and macro levels.
Keywords, meanwhile, refine and drill down into specific facets of your overarching topics, making your content discoverable to users with particular queries related to those topics.
How do entities affect keyword research?
Traditionally, SEO strategies were rooted in keyword research, focusing primarily on keyword difficulty and search volume.
Specialists would aim for low-hanging fruit – terms a site could realistically rank for – before moving on to more competitive keywords.
While effective in the past, this approach has become less optimal due to evolving search algorithms.
Today, a keyword-centric methodology risks creating disjointed topics across a website, undermining the development of topical authority.
Imagine if topics occupied a physical space where some topics are proximal while others are distant.
In this space, “bowling shoes” would be closer to “bowling” than “fun nights out with the family” would be – though the latter might not be far off.
This is how advanced language models perceive language: they build graph-like representations of concepts to discern topical relevance.
If your site has a scattered topic structure, jumping from one loosely related topic to another (as with ad hoc keyword targeting), Google might find it challenging to decipher your website’s core intent.
This could lead to a decline in ranking or just an inability to be competitive for the keywords that really matter to your business.
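To make the “physical space” analogy concrete, here’s a toy sketch in Python. Hop distance in a small topic graph stands in for semantic distance; the topic names and edges are invented for illustration, not taken from any real model:

```python
from collections import deque

# Toy undirected topic graph; edges are hand-picked for illustration only.
TOPIC_GRAPH = {
    "bowling": {"bowling shoes", "league night"},
    "bowling shoes": {"bowling"},
    "league night": {"bowling", "fun nights out with the family"},
    "fun nights out with the family": {"league night"},
}

def hop_distance(graph, start, goal):
    """Breadth-first search: number of edges between two topics."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return None  # unreachable

print(hop_distance(TOPIC_GRAPH, "bowling", "bowling shoes"))                   # close
print(hop_distance(TOPIC_GRAPH, "bowling", "fun nights out with the family"))  # farther
```

A site whose pages all sit a hop or two apart in this kind of graph reads as coherent; one whose pages are scattered across distant regions does not.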
While targeting low-hanging fruit in terms of keywords is still viable, it doesn’t necessarily establish a website as an authority in a specific niche.
To optimize content in this era, we need to consider two key aspects:
Density of the subject matter
Your goal should be to cover content that sits close together in that conceptual graph – and to do it better than any competitor. Assess the competition.
Determine which sub-niche within your realm you can outshine others in. Can you truly be the go-to expert on the subjects you target?

Logical subject expectations
Creating content that naturally aligns with your site’s objectives is crucial for SEO success.
I often see SEO professionals broadening their content scope excessively or employing AI to cover every conceivable angle of their subject matter, which misses the mark entirely.
For instance, imagine a site focused on “Kettlebell Workouts for Beginners.”
Adding an article about the history of kettlebell design, while interesting, may not directly benefit the site’s primary audience, who are looking for actionable workout tips.
Entity SEO is not a free pass to cover every topic under the sun. It’s imperative to prioritize the content that resonates with your site’s main objectives and aligns with Google’s understanding of your expertise.
In other words, before you delve into the nuances of kettlebell design, ensure you’ve already covered the basics your audience is actively searching for.
Only expand to broader topics when you’ve observed the search engine sees you as an expert in your sub-niche (this is usually observable by top rankings and quick rankings of new articles).

Remember: Establishing oneself as a topical authority is an ever-evolving challenge. Success hinges on thorough research, pinpointing subtopics ripe for deeper exploration – areas others might have overlooked.
Equally critical is identifying content clusters within competitor domains, not just to replicate but to outdo.
However, stay attuned to their backlink profiles. Gauging your ability to match or even outpace these external factors is crucial for a realistic strategy.
What is a topical map?
As we discussed above, SEO was long synonymous with keyword research, which was stage one of any SEO effort.
Marketers would scour the web, hunting for keywords promising high search volumes and low competition.
The strategy was simple: find a high-potential keyword, create content around it, rinse and repeat.
This tactic was more about sniping possibilities than building a holistic digital presence.
The aim was to grab those low-hanging fruits – isolated keywords that could easily be targeted to drive traffic.
The idea was you’d build a base of ranking content and then slowly advance to more difficult terms over time. This approach was ultimately keyword-centric.
The cluster-centric approach
In my opinion, an evolved SEO strategy should focus on strategic decisions around clusters of content, not exclusively on individual keywords.
Enter topical maps – a vital tool in modern SEO that helps you better visualize and plan your content strategy.
A topical map is essentially a visual representation of your primary subject and how it connects to various sub-topics and themes.
Think of it as a spider diagram or a mind map. At the center of this map, you have your main subject – the overarching topic your site focuses on.
Branching out from this central node are related sub-topics, which serve as the pillars supporting your main subject.
Each sub-topic can further divide into specific themes or even narrower facets, providing a layered, organized structure to your content.
Imagine your central topic is “Basketball.”
From there, branches like “Basketball Techniques,” “Basketball Equipment,” and “NBA Teams” emerge.
Dive deeper and from “Basketball Techniques,” you have “Free Throws,” “Dribbling” and more.
Interspersed are your keywords, suggesting content angles and user intent.
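A topical map like this can be sketched as a simple nested structure. The sub-topics and keyword phrases below are hypothetical examples that extend the basketball illustration above:

```python
# Hypothetical topical map: topic -> sub-topics -> themes -> keyword phrases.
topical_map = {
    "Basketball": {
        "Basketball Techniques": {
            "Free Throws": ["how to shoot free throws", "free throw routine"],
            "Dribbling": ["crossover dribble tutorial", "dribbling drills for beginners"],
        },
        "Basketball Equipment": {
            "Shoes": ["best basketball shoes"],
            "Balls": ["indoor vs outdoor basketball"],
        },
        "NBA Teams": {},  # branch planned but not yet populated
    }
}

def count_keywords(node):
    """Recursively count the keyword leaves under a node of the map."""
    if isinstance(node, list):
        return len(node)
    return sum(count_keywords(child) for child in node.values())

print(count_keywords(topical_map))
```

Even a lightweight structure like this makes gaps visible at a glance: empty branches are clusters you haven’t earned yet.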
Instead of focusing on individual keywords, you’re now looking at clusters – groups of related content pieces.
These clusters allow you to paint a comprehensive picture of a subject, making your site a one-stop hub for audiences.
When assessing competition and volume, the focus shifts from individual keywords to these clusters, looking at the aggregate potential rather than singular opportunities.
Here, your focus becomes analyzing the landscape for what clusters of content your competition has and looking for pockets of where you can outperform their clusters instead of their keywords.
The benefits of thinking in clusters
This shift from keywords to clusters offers multiple advantages:
Depth and breadth
- By covering related topics under one umbrella, you provide both in-depth insights and a broad overview, catering to various user intents.
Authority
- A cluster approach signals to search engines that you’re not just skimming the surface. You’re diving deep, establishing your authority in a niche. This can earn you higher rankings, often for terms where your external factors (such as backlinks) are weaker than the competition’s.
Flexibility
- Clusters allow for easier content expansion. If a new trend emerges within a cluster, you can seamlessly integrate it without disrupting your site’s structure.
Keyword clustering tools can be game-changers when building topical maps. They help you identify when a keyword deserves its standalone topic page or if it fits better within a broader topic.
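As a rough illustration of what clustering tools do under the hood, here’s a minimal greedy clusterer that groups keyword phrases by token overlap. The threshold and sample keywords are arbitrary choices for the sketch; real tools typically cluster on SERP overlap or embeddings rather than raw tokens:

```python
def jaccard(a, b):
    """Token-overlap similarity between two keyword phrases."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb)

def cluster_keywords(keywords, threshold=0.3):
    """Greedy clustering: attach each keyword to the first cluster whose
    seed phrase is similar enough, otherwise start a new cluster."""
    clusters = []  # list of (seed phrase, [member phrases])
    for kw in keywords:
        for seed, members in clusters:
            if jaccard(kw, seed) >= threshold:
                members.append(kw)
                break
        else:
            clusters.append((kw, [kw]))
    return clusters

keywords = [
    "basketball dribbling drills",
    "dribbling drills for beginners",
    "best basketball shoes",
    "basketball shoes for wide feet",
]
for seed, members in cluster_keywords(keywords):
    print(seed, "->", members)
```

Keywords that land in the same cluster are candidates for one page; clusters that stay separate usually deserve their own standalone pages.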
In essence, the older approach was cherry-picking good keywords and using their success to propel your SEO forward.
The new approach is to select clusters of dense content that establish authority. When done correctly, you can trigger a topical authority boost and circumvent the need for superior external SEO factors.
How do you do on-page optimizations in the entity paradigm?
Now that we’ve seen how entities can be factored into your high-level strategy and content roadmap via topical maps, let’s go through questions people ask about the actual implementation of entities on your site.
On-page SEO and entities: What’s changed?
If you’ve worked in SEO for any length of time, you may have noticed a change in your Search Console query reports: your pages often rank for keywords not directly mentioned within their content.
Entities and the knowledge graph have transformed Google’s grasp on language. Previously, search results predominantly showcased pages that explicitly mentioned a keyword.
But now, with the deeper understanding provided by entities, Google draws from a wider pool, considering pages that address topics in a related sphere. This leads Google to favor more comprehensive content pieces.
This shift introduced what I term “authority pages.” These are content-rich pages saturated with entity relationships, designed to answer many potential user queries. Such pages aggregate the essence of what previously required several pages.
As you develop these authority pages, AI tools can be instrumental.
The fundamental procedure involves a few key steps:
- First, utilize tools or methods like TF-IDF (Term Frequency-Inverse Document Frequency), RAKE (Rapid Automatic Keyword Extraction), or keyword extraction techniques to pull out crucial keyword phrases or entities from articles that already rank highly in your target area. This step is crucial because it helps you establish the minimum subject matter and terms that Google seems to view as essential for ranking in that particular topic space.
- Once you have identified this baseline, the next objective is to go beyond it. Your aim should be to not just match the top-ranking articles in terms of entities and keywords but to exceed them by introducing new entity relationships or facets of the topic that haven’t been extensively covered. The idea is to offer a more comprehensive, insightful piece that adds value beyond what’s currently available, thereby enhancing your chances of being viewed as an authority on the subject by search engines.
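As a minimal sketch of that first step, here’s a from-scratch TF-IDF scorer that pulls the most distinctive terms out of one document relative to a small corpus. The sample “competitor pages” are invented placeholders; in practice you’d feed in the text of the pages that actually rank for your target queries:

```python
import math
from collections import Counter

def top_terms(docs, doc_index, k=3):
    """Score terms in one document by TF-IDF against a small corpus
    and return the k highest-scoring terms."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    df = Counter()                      # document frequency of each term
    for tokens in tokenized:
        df.update(set(tokens))
    tf = Counter(tokenized[doc_index])  # term frequency in the target doc
    total = len(tokenized[doc_index])
    scores = {
        term: (count / total) * math.log(n / df[term])
        for term, count in tf.items()
    }
    return [t for t, _ in sorted(scores.items(), key=lambda x: -x[1])[:k]]

# Invented placeholder snippets standing in for top-ranking pages.
competitor_pages = [
    "kettlebell swing form kettlebell swing tips",
    "kettlebell workout plan for beginners",
    "beginner stretching routine",
]
print(top_terms(competitor_pages, 0))
```

Terms that score highly across several top-ranking pages but are missing from your draft are the baseline vocabulary worth covering.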
I explore these concepts and tools in greater depth in How ChatGPT can help you optimize your content for entities, where I break down the nitty-gritty of how to maximize your topical authority in the SEO landscape.
What role does schema markup play in entity SEO?
The concept of entities is crucial in the world of SEO, serving as the focal points around which content is structured.
Leveraging schema to highlight these entities can provide search engines with an even clearer understanding of your page’s subject matter and context.
Schema is a tool often underutilized in the SEO community.
While many stick to generic schema setups, custom options like “Mentions schema” can vastly improve how Google understands your content.
Mentions schema allows you to specify what or who your page mentions, and can even link to authoritative sources like Wikipedia for greater clarity.
Here’s how to implement it.
Step 1: Identify your core entity
Before you begin implementing schema, identify the core entity or entities around which your content revolves.
For example, if you’re writing a comprehensive guide about “Mediterranean Diet,” your core entity is the Mediterranean Diet.
Step 2: Use mentions schema
Utilize mentions schema to specify additional entities related to your core entity.
If you’re discussing the Mediterranean Diet, you might mention entities like “Olive Oil,” “Fish,” and “Exercise.”
{
  "@context": "http://schema.org/",
  "@type": "Article",
  "mentions": [
    { "@type": "Thing", "name": "Olive Oil" },
    { "@type": "Thing", "name": "Fish" },
    { "@type": "Thing", "name": "Exercise" }
  ]
}
Step 3: Use ‘SameAs’ for authoritative sources
When you mention other entities, use the “SameAs” attribute to link to their authoritative sources, such as Wikipedia pages or scientific studies.
{
  "mentions": {
    "@type": "Thing",
    "name": "Olive Oil",
    "sameAs": "https://en.wikipedia.org/wiki/Olive_oil"
  }
}
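If you generate this markup programmatically, a small helper can keep the mention names and their sameAs links in sync. This is an illustrative sketch, not an official API; it simply emits the same JSON-LD shape shown above, using the article’s own entity examples:

```python
import json

def mentions_jsonld(entities):
    """Build an Article JSON-LD string with mentions + sameAs links.

    `entities` is a list of (name, authoritative_url) pairs.
    """
    return json.dumps({
        "@context": "http://schema.org/",
        "@type": "Article",
        "mentions": [
            {"@type": "Thing", "name": name, "sameAs": url}
            for name, url in entities
        ],
    }, indent=2)

markup = mentions_jsonld([
    ("Olive Oil", "https://en.wikipedia.org/wiki/Olive_oil"),
    ("Fish", "https://en.wikipedia.org/wiki/Fish"),
])
print(markup)
```

The resulting string can be dropped into a `<script type="application/ld+json">` tag in the page head.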
Step 4: Visualize using tools
Tools like Schema Zone can help you visualize your schema structure.
Plug in your URL to see if your schema correctly highlights your core and related entities.
Step 5: Test and monitor
Use Google’s Schema Testing Tool to make sure your schema is correctly implemented.
After that, monitor your site’s performance to see how the enhanced schema affects your search rankings.
By consciously implementing schema to outline the entities within your content, you’re making your content more understandable to search engines and paving the way for better SEO performance.
It’s an essential step to make your content not just readable but also “understandable” by search engines.
What are semantic networks and how do you build them?
The importance of a well-structured SEO strategy cannot be overstated, especially as search engine algorithms continue to evolve.
While topical maps provide an initial framework defining which clusters of content you should focus on, they only scratch the surface.
Enter semantic content networks – a refined, holistic approach that ties together the multiple facets of your content into a unified whole.
What is a semantic content network?
A semantic content network is a sophisticated way to structure and interconnect your website’s content beyond isolated pages and lone keywords.
At its core, it’s an organizational model that serves as a roadmap for interconnecting your content.
Rather than mapping standalone pages or individual keywords, you’ll be plotting out clusters of interrelated content, usually connected through internal links and hierarchical relationships.
How does it build on traditional models?
If you’re familiar with the “hub and spoke” model, you’ll find similarities here.
In this enhanced model, central themes or “hubs” serve as the anchor points, with related sub-topics or “spokes” branching out.
What sets it apart is its keen focus on logic and accessibility, ensuring that both users and search engines can easily understand the architecture and flow of your content.
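The hub-and-spoke pattern can be sketched in a few lines. The site plan below is hypothetical (a Mediterranean Diet hub with invented spoke URLs); the point is the linking rule, not the URLs:

```python
# Hypothetical hub-and-spoke plan: each hub URL maps to its spoke URLs.
site_plan = {
    "/mediterranean-diet": [
        "/mediterranean-diet/olive-oil",
        "/mediterranean-diet/fish",
        "/mediterranean-diet/meal-plan",
    ],
}

def internal_links(plan):
    """The hub links down to every spoke; every spoke links back to its hub."""
    links = []
    for hub, spokes in plan.items():
        for spoke in spokes:
            links.append((hub, spoke))   # hub -> spoke
            links.append((spoke, hub))   # spoke -> hub
    return links

for src, dst in internal_links(site_plan):
    print(f"{src} -> {dst}")
```

Generating the link plan from a single data structure keeps the network consistent as the cluster grows: add a spoke to the plan and both directions of linking follow automatically.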
Why do semantic content networks matter?
In today’s SEO landscape, search engines like Google are shifting from a narrow focus on keyword counts to a broader understanding of content, context, and semantic relationships.
A semantic content network helps you adapt to this advanced landscape, ensuring your content is not just easily discoverable, but also resonates with this new level of search engine understanding.
By adopting this approach, you’re creating more than just a search-optimized website.
You’re constructing a logically interconnected web of content that enhances the user experience and aligns perfectly with modern search engines’ intelligent, semantic capabilities.
This is the future of SEO, and it’s a future where clarity, coherence, and connectivity reign supreme.
Prioritizing entities and semantic relationships in SEO
In SEO, we’ve transitioned from a narrow keyword-focused approach to an interconnected, holistic view encompassing entities, topics, and the broader semantic relationship between them.
Entities, being the foundational concepts within content, help search engines understand the context and essence of a page beyond mere keyword repetitions.
As we pivot to this enriched understanding, the importance of topical maps and semantic content networks becomes clear.
They’re not mere tools; they’re strategies to better align with the evolving nature of search engines and, in turn, enhance user experience.
A coherent, logical content web is not just the future of SEO – it’s the present.
Recognizing and harnessing this shift can pave the way for more meaningful, impactful online experiences.
The post Entities, topics, keywords: Clarifying core semantic SEO concepts appeared first on Search Engine Land.
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
4 tips to prepare for Black Friday and Cyber Monday by Digital Drew SEM
Written on September 7, 2023 at 6:11 am, by admin

Did you know? In 2022, Black Friday online sales totaled over $9.12 billion, and Cyber Monday sales totaled over $11.3 billion.
Black Friday and Cyber Monday are two of the biggest shopping days of the year in the United States. These events are a great opportunity for retailers to boost sales and generate excitement for the holiday season, and for shoppers to buy at discounted rates.
For online businesses, it is especially important to prepare for these days using digital marketing services. Here’s how to approach it.
How to prepare for Black Friday and Cyber Monday
Businesses can use a range of digital marketing services to get ready for these two major shopping days. Here are some specific ones:
1. Start planning your email marketing campaigns
Email marketing is a great way to reach your existing customers and promote your Black Friday and Cyber Monday deals. Here are some tips for starting your campaigns early:
- Start planning early. The best time to start planning your Black Friday and Cyber Monday email marketing campaigns is in September. This will give you enough time to create engaging and effective emails and to test and optimize your campaigns.
- Personalize your emails. People are more likely to open and click on emails that are personalized to them. Use your email marketing software to segment your list by customer interests and demographics and send targeted emails that are relevant to each group.
- Offer exclusive deals. One of the best ways to get people to open your emails is to offer exclusive deals they can’t find anywhere else. This could be a discount on a specific product, free shipping, or a gift with purchase.
- Track your results. It’s important to track the results of your email marketing campaigns to see what’s working and what’s not. This will help you improve your campaigns over time and get better results.
2. Optimize your website for search engines with SEO
By optimizing your website for Black Friday and Cyber Monday keywords, you can increase your chances of being found by people who are looking for deals.
- Optimize your product pages for Black Friday and Cyber Monday keywords. When people are looking for deals, they are likely to search for keywords related to Black Friday and Cyber Monday. Make sure your product pages are optimized for these keywords so that they rank higher in search results.
- Create holiday-specific content. In addition to optimizing your product pages, you can also create holiday-specific content, such as blog posts, infographics, and social media posts. This content should be informative and engaging and help people learn more about your products and discounts.
- Build backlinks. Backlinks are links from other websites to your website. They are a signal to Google that your website is authoritative and trustworthy. In the weeks leading up to Black Friday and Cyber Monday, focus on building backlinks from relevant websites.
- Track your SEO performance. It’s important to track your SEO performance to see what’s working and what’s not. This will help you make necessary adjustments to your strategy.
3. Use search engine marketing to reach new customers
By running SEM campaigns during Black Friday and Cyber Monday, you can reach a large audience looking for deals.
Here are some tips for running SEM campaigns for Black Friday and Cyber Monday:
- Set a budget. Before you start running any SEM campaigns, setting a budget is important. This will help you avoid overspending and ensure that you’re getting a good return on your investment.
- Choose the right keywords. When choosing keywords for your SEM campaigns, it’s important to select those relevant to your products and that people are likely to search for. You can use a keyword research tool to help you find the right keywords.
- Create effective ad copy. Your ad copy should be clear, concise, and persuasive. It should also be relevant to the keywords you’re targeting.
- Track your results. It’s important to track the results of your SEM campaigns to see what’s working and what’s not. This will help you improve your campaigns over time and get better results.
4. Run paid social ads
Paid social ads are a great way to reach a large audience looking for deals during Black Friday and Cyber Monday. You can boost your sales and reach your marketing goals by running paid social ads on the right platforms and targeting your ads to the right people.
Here are some tips for running paid social ads for Black Friday and Cyber Monday:
- Choose the right platform. Not all social media platforms are created equal. When choosing a platform for your paid social ads, choose one your target audience is using. For example, if you’re targeting millennials, you might want to focus on Instagram and TikTok.
- Target your ads. When you’re targeting your paid social ads, it’s important to be as specific as possible. You can target your ads by demographics, interests, and even past purchase behavior. This will help you ensure that your ads are seen by the people who are most likely to be interested in your offer.
- Create engaging content. Your paid social ads should be engaging and relevant to your target audience. They should also be visually appealing and easy to understand. Use high-quality images and videos and write clear and concise ad copy.
- Offer exclusive deals. One of the best ways to get people to click on your paid social ads is to offer exclusive deals. This could be a discount on a specific product, free shipping, or a gift with purchase.
- Track your results. It’s important to track the results of your paid social ads so that you can see what’s working and what’s not. This will help you improve your campaigns over time and get better results.
By following these tips, you can run effective paid social ads to help you reach your Black Friday and Cyber Monday goals.
It’s time to act now
As the holiday season approaches, the competition will only intensify. By implementing the strategies outlined here, you’re setting yourself up for a successful Black Friday and Cyber Monday. Remember, the time to act is now. Start refining your email marketing campaigns, optimizing your SEO, fine-tuning your SEM strategies and crafting eye-catching paid social ads. With these pillars in place, you’ll be well on your way to surviving and thriving during the holiday sales period. Your success story begins with the steps you take today.
For some great additional ad examples check out this blog: https://digitaldrewsem.com/black-friday-facebook-ads-examples/
The post 4 tips to prepare for Black Friday and Cyber Monday appeared first on Search Engine Land.
This SEO strategy led to 497 page-one rankings in 7 weeks by Cynthia Ramsaran
Written on September 7, 2023 at 6:11 am, by admin
Increasing first-page rankings and organic traffic has always involved guesswork. Experienced marketers know that deciding what content to produce and how to publish it can lead to wasted time, energy, and money. That can stop anyone dead in their tracks. But what would it look like if guesswork was no longer a factor in SEO and everything that goes into it?
Join Ryan Brock, chief solution officer at DemandJump, for a follow-up to last year’s Pillar-Based Marketing case study, covering how DemandJump eliminated the guesswork to win 497 first-page rankings in mere weeks on a highly competitive topic.
Register and attend “497 Page One Rankings in 7 Weeks: How Pillar-Based Marketing is Changing SEO,” presented by DemandJump.
Click here to view more Search Engine Land webinars.
The post This SEO strategy led to 497 page-one rankings in 7 weeks appeared first on Search Engine Land.
Google adds URL Contains targeting functionality to Performance Max
Written on September 6, 2023 at 3:09 am, by admin
Google has launched the URL Contains targeting functionality on Performance Max.
The feature, which is also available in Dynamic Search Ads (DSA), enables you to manually specify which URLs your PMax ads can direct users to.
Here’s a preview of how to access the new tool on the platform, as shared by Google Ads expert Thomas Eccel on LinkedIn:

Explaining why this new feature could be super useful for advertisers, Eccel told Search Engine Land:
- “Since a PMax campaign with Final URL expansion turned on can basically redirect users to every URL possible (for example, to your blogs, about us page etc.), non-monetizable pages should be excluded.”
- “Now, with this new URL rule, I can tell PMax to only redirect users to pages that include them. For example ‘/shop’. Or if I run the PMax campaign for just one product category, let’s say Nike shoes, I can just include ‘/shoes’, for instance.”
Why we care. This feature provides marketers with more control over how their ads are served as you manually get to choose which tokens to target in order to reach a more specific and relevant audience for your brand.
How it works. You can use this functionality to target pages with URLs that contain a certain piece of text – known as a “token.” Within a URL, a token is a piece of text surrounded by a delimiter like “/” or “-”, among others.
This feature cannot be used with URLs like “electronicsexample.com/servicesmenu/” because the word “menu” runs directly into the targeted keyword “services.” However, it works for URLs like “electronicsexample.com/services-menu/” because the keyword “services” is separated from the word “menu” by the “-”. Other URL separators include:
- “:”
- “/”
- “?”
- “+”
- “&”
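The token-matching rule described above can be approximated in a few lines. This is a sketch of the matching logic as described in the article, not Google’s actual implementation; the separator set follows the list above, plus “.” for the domain:

```python
import re

# Separator characters named above, plus "." so the domain splits cleanly.
SEPARATORS = r"[/\-:?+&.]"

def tokens(url):
    """Split a URL into tokens on the separator characters."""
    return [t for t in re.split(SEPARATORS, url) if t]

def url_contains(url, token):
    """True only when the token appears as a whole token, not a substring."""
    return token in tokens(url)

print(url_contains("electronicsexample.com/services-menu/", "services"))  # matches
print(url_contains("electronicsexample.com/servicesmenu/", "services"))   # does not
```

This makes the distinction in the example concrete: “servicesmenu” is one token, so it never matches “services”, while “services-menu” splits into two tokens and does.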
What has Google said? A Google spokesperson told Search Engine Land:
- “This is a new feature rolling out for Performance Max.”
- “We are bringing the same URL Contains targeting functionality from Dynamic Search ads (DSA) into Performance Max to better support DSA use cases in PMax as part of the voluntary upgrade we announced in July.”
- “This feature works exactly as it does in DSA today.”
Deep dive. Visit the Google Ads Help Center for more information on the URL Contains in DSA.
The post Google adds URL Contains targeting functionality to Performance Max appeared first on Search Engine Land.
Meta may remove ads on Facebook and Instagram for subscribers in Europe
Written on September 6, 2023 at 3:09 am, by admin
Meta is reportedly considering paid, ad-free versions of Facebook and Instagram for users in Europe.
The cost of the subscription and potential rollout date has not yet been confirmed by the company.
However, Meta will continue to offer free versions of its apps which will still serve ads regardless, according to the New York Times.
Why we care. Depending on user adoption, this could significantly affect brand reach and campaign performance. While an ad-free subscription service is currently under consideration for Europe, it may extend to the US in the future. Advertisers should closely watch this development as they may need to consider reallocating their ad spend to other platforms accordingly.
Why now? A Meta subscription service has been suggested in response to the European Union’s Digital Services Act, which comes into effect on 1 January 2024.
Under the new regulations, more onus is being put on large platforms that have more than 45 million regional users, such as Google and Meta, to:
- Create a safer digital space.
- Establish a level playing field for businesses.
What has Meta said? The company has not commented on launching paid-for versions of Facebook and Instagram. However, Meta CEO Mark Zuckerberg hinted in 2018 when he appeared before the US Senate that such a product could be on the horizon. When asked if he would consider charging users for access to his apps to avoid ads, he said:
- “There will always be a version of Facebook that is free.“
Then Meta COO, Sheryl Sandberg, added:
- “We have different forms of opt-out. We don’t have an opt-out at the highest level. That would be a paid product.”
Deep dive. Read Meta’s statement on the DSA, issued by Nick Clegg, President, Global Affairs, for more information.
The post Meta may remove ads on Facebook and Instagram for subscribers in Europe appeared first on Search Engine Land.
Microsoft Advertising Editor expands audiences to all markets
Written on September 6, 2023 at 3:09 am, by admin
Microsoft Advertising Editor has rolled out in-market audiences across all of its markets in EMEA, APAC and LATAM.
The platform has also launched new audience types.
Alongside this expansion, the platform now supports bulk associations for In-market audiences, as well as:
- Remarketing.
- Dynamic remarketing lists.
- Similar audiences.
- Customer match.
- Custom audiences.
- Custom combination lists.
Why we care. Advertisers have historically only been able to use Microsoft Advertising online to manage audiences. However, they can now do this using the Microsoft Advertising Editor tool, which means they now have the flexibility to monitor and tweak their audience campaigns offline.
How it works. Marketers have been advised that audience creation and management should still be done via Microsoft Advertising online. However, Microsoft Advertising Editor can now be used to update associations in bulk: navigate to the ‘Audience’ tab and select an option.
Alternatively, to update audience targeting associations, you can import a file or import from Google Ads.
Read more. Read Microsoft’s Audience options guide for more information on audiences, or Microsoft’s In-Market guide for a full list of markets and available audiences.
The post Microsoft Advertising Editor expands audiences to all markets appeared first on Search Engine Land.
How SEOs can deal with unwanted adult-intent traffic
Written on September 6, 2023 at 3:09 am, by admin
SEO for adult sites is a fiercely competitive space – yet pervasive and unwanted adult-intent traffic remains a big challenge for enterprises, ecommerce sites and marketplaces.
Here is why this is a problem and what can be done about it.
When non-adult sites rank for adult searches
It’s important to understand that “adult-intent traffic” and “adult content” differ.
Any amount of mature content can cause Google to label a website as “adult” and limit its exposure for most queries.
It’s a good practice to label any adult content as such using the <meta name="rating" content="adult"> tag, which signals to Google that the content should be filtered out by SafeSearch.
Whenever practical, mature content should be separated from the main site by moving it to a subdomain.
Adult-intent traffic, on the other hand, describes the intent behind the search query, regardless of the content of the page it lands on.
How SafeSearch influences Google’s results
If SafeSearch is on, most explicit and adult content will be filtered out from the results, which effectively means a ban on sexually exploitative or sexually suggestive content and nudity.
Websites that Google explicitly labels as being pornographic only show up for certain queries. Google prevents adult-themed content from triggering rich snippets or appearing in Discover.
Ironically, this means that safe websites and platforms that monitor and remove explicit content (for example, mainstream news sites or educational platforms) are more likely to appear for adult-oriented search queries in Google when SafeSearch is on.
General information
For many queries with adult intent, Google might return results that offer more general information about the topic or non-explicit references.
For instance, a search for an adult film star might return a Wikipedia page or a news article about them rather than their explicit content.
Vague queries
Many search queries can be interpreted in multiple ways, both innocent and adult. With SafeSearch on, Google is likely to favor a non-explicit interpretation.
For example, searching for “breast” might prioritize results about breast cancer, chicken breast recipes, or anatomy over more adult-themed results.
While we don’t know what percentage of all Google searches is adult in intent, we know that many authoritative, established sites and global marketplaces capture much of this traffic, even if no matching adult content is found on the site.
It is not uncommon for adult-intent searches to make up 20-40% of all SEO visits. This number can be even higher for some geos.
Isn’t all traffic good traffic? Unpacking the adult-intent dilemma
For publisher sites that can monetize pageviews through programmatic advertising, a click is a click, and the intent of the traffic might not be the key determining factor for CPM.
For ad arbitrage sites, capturing adult intent visits may even be desirable.
However, this can be problematic for online businesses, platforms, or marketplaces that are conversion-oriented and non-adult.
Analytical noise
When organic search visits are going up, it’s tempting to deem SEO strategy a success. But what if a big portion of these visits are non-converting adult clicks?
An uptick in visits could be because a key competitor or another large website has scaled their adult-traffic blocking efforts.
Not having the right level of insight or ability to isolate valuable visitor segments from noise can lead to:
- Analytical mistakes.
- Misplaced investment of time and resources.
- Failure to tie SEO performance to business outcomes.
It’s expensive
What is the ROI of adult-intent traffic for a non-adult site?
If non-converting adult queries make up a lion’s share of all visits, it may be time to examine the costs associated with serving this traffic and start scaling back.
Quantifying adult-intent queries: Navigating your traffic data
Adult-intent traffic is easy to spot but difficult to quantify.
Sadly, no magic tool will provide all the SEO keyword data and determine what portion is adult in intent.
The bigger the site, the higher the risk.
Established sites that do not restrict indexing of search results pages or marketplaces that leverage user-generated content (UGC) run the risk of amassing an enormous amount of long-tail traffic through low-quality URLs that rank for the most obscure adult terms.
Google Search Console
GSC is a great place to start looking. While it does not provide complete keyword data, it offers enough insights to gauge the magnitude of the problem by examining a relatively small sample of top keywords.
Google Analytics
GA (and most other web analytics tools) can help get more granular by analyzing URLs of top organic landing pages for adult terms or phrases that could be interpreted as adult in meaning.
This is especially relevant for marketplaces, sites that index internal search results, or leverage UGC for SEO.
As a bonus, GA makes it easier to understand the business impact of adult traffic by cross-referencing it with available engagement and conversion data.
Ahrefs
Ahrefs is a fantastic tool that can analyze massive lists of keywords and their ranking fluctuations.
With a bit of regex magic or AI help, it’s possible to determine which keywords have adult intent and estimate the overall share of traffic they represent.
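As a rough sketch of that regex approach, a keyword export (from Ahrefs, GSC, or similar) can be classified against a pattern list and the adult share of estimated traffic computed. The patterns and row layout below are placeholder assumptions; real blacklists are far larger and usually maintained by paid search teams:

```python
import re

# Tiny placeholder pattern list; a production blacklist would hold
# hundreds or thousands of terms and phrases.
ADULT_PATTERNS = re.compile(r"\b(nsfw|xxx|nude|porn)\b", re.IGNORECASE)

def adult_traffic_share(keyword_rows):
    """keyword_rows: iterable of (keyword, estimated_visits) tuples."""
    total = adult = 0
    for keyword, visits in keyword_rows:
        total += visits
        if ADULT_PATTERNS.search(keyword):
            adult += visits
    return adult / total if total else 0.0

rows = [("chicken breast recipes", 900), ("nsfw gallery", 300), ("laptop review", 800)]
print(f"{adult_traffic_share(rows):.0%}")  # 15%
```

The same function can be pointed at a competitor's keyword export to compare adult-intent shares side by side.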
The best part? Competitive intelligence.
Ahrefs makes it easy to analyze competitor standing with respect to adult traffic and glimpse additional insights behind their SEO reach and performance.
It’s well worth segmenting traffic data for further detail. Do some geographies, days of week, times of day, or device types stand out more than others?
Understanding behavioral and usage patterns can make isolating and addressing unwanted traffic easier.
How to minimize and manage unwanted adult-intent traffic
While no solution will be perfect, here are several ways to reduce adult-intent traffic.
Examine URL slugs and on-page keywords
Frequently, a partial keyword or phrase match in URLs or on-page keywords might be enough to rank a perfectly innocent page for one or more related adult queries.
Sometimes, updating URLs and on-page elements may be enough to drop unwanted rankings.
Remember that changing a URL will likely temporarily impact overall rankings and URL authority for the affected page.
Make use of blacklists
Paid search teams often use blacklists for adult, hateful or harmful keywords. These lists can also be useful for SEO.
Use them to restrict the crawling or indexing of URLs based on related keywords.
One of the most popular methods for this is robots.txt. It offers a simple, effective way to disallow problematic URLs at scale – though note that robots.txt supports wildcard patterns (“*” and “$”), not full regular expressions.
One downside to this approach is that robots.txt is public – it’s quite literally out there for the entire world to see. Another is that it does not allow for nuance.
Not every adult-intent search is equally problematic. In many cases, it may be enough to noindex a page to allow for crawl and discovery of other linked content.
On the other hand, it might be desirable to apply 404 or even 410 response codes to URLs that consistently rank for extreme or very illicit phrases. Websites with dynamic URL generation are especially susceptible to this.
Frequently, URLs that drive adult-intent traffic will only rank for one or a few closely related adult-intent terms, which makes disallowing, noindexing, or serving a 404 viable options.
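The blacklist-to-robots.txt step can be sketched as a small generator. The path prefix and terms below are placeholder assumptions, and only the “*” wildcard syntax honored by Google is used:

```python
def robots_rules(blacklist, path_prefix="/search/"):
    # Block any URL under the given prefix whose path contains a listed
    # term, using robots.txt "*" wildcards (not regex).
    lines = ["User-agent: *"]
    for term in blacklist:
        lines.append(f"Disallow: {path_prefix}*{term}")
    return "\n".join(lines)

print(robots_rules(["nsfw", "xxx"]))
# User-agent: *
# Disallow: /search/*nsfw
# Disallow: /search/*xxx
```

Generating the rules from the shared blacklist keeps SEO and paid search teams working from one source of truth.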
In other cases, a blanket rule is not the best solution. Consider doing an experiment with conditional rules that:
- Target users, instead of bots.
- Restrict access only to certain audiences, geographies, or device types that drive the unwanted traffic.
Consider engagement-based indexing
An adult-intent user is unlikely to convert on a non-adult site.
These low-quality visits will likely have exceptionally high bounce rates, low pageviews, and no conversions.
A scalable approach for an enterprise site might include custom indexing logic that issues noindex directives based on user engagement and conversion signals.
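A minimal sketch of such engagement-based logic follows; the thresholds and field names are assumptions for illustration, and any real implementation would tune them against business data:

```python
def should_noindex(page_stats, min_visits=100, bounce_threshold=0.95):
    """Decide whether a page earns a noindex directive.

    page_stats: dict with 'visits', 'bounces', 'conversions'
    (assumed field names; adapt to your analytics export).
    """
    if page_stats["visits"] < min_visits:
        return False  # not enough data to judge
    bounce_rate = page_stats["bounces"] / page_stats["visits"]
    return bounce_rate >= bounce_threshold and page_stats["conversions"] == 0

print(should_noindex({"visits": 500, "bounces": 490, "conversions": 0}))  # True
print(should_noindex({"visits": 500, "bounces": 200, "conversions": 3}))  # False
```

A job like this can run on a schedule and feed a noindex list to the templating layer, so directives track engagement rather than a static URL list.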
Manage adult-intent traffic to protect your SEO efforts
While adult-intent SEO traffic might increase the volume of visitors, the quality and relevance of this audience for non-adult sites are questionable.
Businesses must recognize the nuance between traffic numbers and genuine user engagement.
By effectively recognizing, segmenting, and acting against unwanted adult-intent traffic, enterprises can fine-tune their SEO strategies and ensure their content reaches the right audiences.
After all, in the age of data-driven decision-making, it’s not just about attracting eyes – it’s about attracting the right ones.
The post How SEOs can deal with unwanted adult-intent traffic appeared first on Search Engine Land.
4 top affiliate search partners to consider: A breakdown
Written on September 5, 2023 at 12:06 am, by admin
Q4 is looming, and many brands are strategizing how to optimize their budgets to get more traffic to their products when buyer intent soars.
One humble suggestion from this longtime affiliate marketer: TM+ partners (also known as Trademark+ partners or affiliate search partners), who work with brands to bid on brand keywords to drive traffic to coupon or offer sites.
Despite what you may have heard, TM+ is a no-risk model – the affiliate site pays for the click and only gets paid by the brand when the ads lead to conversions.
To keep the guidelines clearly delineated, brands should give their affiliate partners a list of keywords approved by the search teams, with any other keywords off-limits to the TM+ partner to ensure both parties aren’t bidding against each other.
I work with a ton of affiliate search partners, but if you’re just getting your feet wet in this initiative, you should start with a short list of targets who can help you get traction.
This article covers four major partners – why I recommend them, their unique strengths, and when they might not make sense for your brand.
1. PromoCodesForYou
Brands that give PromoCodesForYou (PCFY) bidding rights to a keyword set have more of a chance to get picked up for organic inclusion in the portfolio of sites in Dotdash Meredith, PCFY’s parent company.
The portfolio is broad and includes publications like:
- People
- Travel + Leisure
- Allrecipes
- Food & Wine
- EatingWell
- InStyle
- Shape
- Better Homes & Gardens
Brands looking for added exposure to complement direct-response coupon campaigns have a potential gold mine with a PCFY partnership.
The flip side is that PCFY won’t work with just any brand (specifically, small, lesser-known ones aren’t going to be a fit here).
Their sweet spot is well-known brands, and their vetting process includes search volume (pulled from Google Ads) with aggressive volume minimums.
That said, if search volume is low but the brand offers a relatively high average order value, PCFY may be willing to run a test to see if the partnership has some sparks.
2. Offers.com
A relatively close second, Offers.com is a Ziff Davis company with the potential of added organic exposure on its portfolio of sites, which include:
- Mashable
- AskMen
- PCMag
- Everyday Health
- BlackFriday.com
- BestBlackFriday.com
- TheBlackFriday.com
(The latter three are obviously great and relevant platforms for the holidays, getting over 60 million global sessions during the holiday season).
Offers.com isn’t as particular as PCFY about partnerships, so it could be an ideal starting point for small brands looking to build on their momentum.
3. Slickdeals
This is the deal site that’s most frequently leveraged for clients who have a need for inventory liquidation.
With 69 million unique monthly visitors and a big consumer footprint during the holidays, Slickdeals is great for pushing heavily discounted products.
Working with the site as a TM+ partner means you’ll likely get comped on individual deal submissions, which you usually have to pay for.
Slickdeals is an ideal partner for brands with seasonal catalogs, which usually means there are inventory remnants to clear out for the next season’s line.
It’s not solely for low-priced or mid-priced products. I’ve seen a discounted watch that retailed for $6,000 sell out in a matter of minutes, and some of our high-end clients have seasonal products that have found a great audience on Slickdeals.
But if you’re a brand that’s careful about buying in reasonable quantities and have a limited and/or stable collection of products, you can skip this one and research other partners.
4. CouponCause
CouponCause is different. Its main selling point is a give-back component that encourages consumers to donate their savings to any of a group of partner nonprofits.
It’s a great partner for calendar events like Giving Tuesday, and it’s a natural fit for socially conscious brands who have built a name around doing good.
CouponCause is much more flexible with brand partners, so SMBs and startup brands looking to get awareness or increase sales velocity often start here.
Getting started
If you don’t have relationships with any of these partners, you can go through an affiliate agency (using a recommended vetting process) or dig up a list of contacts at the partner and do the outreach yourself.
Remember that these are partnerships, so you’ll need to create a compelling case for why the platform will benefit from bidding on your brand and product terms.
It bears repeating that when you do get traction, it’s a good idea to make sure your search team is in the loop since they may have reservations or caveats for the partner to address (as well as keywords they want to protect).
Managed well, TM+ partnerships are an added source of low-risk revenue and exposure that can help you maximize purchase intent in the coming months.
Even if you’re not sure you can act in time to catch the Q4 wave, start doing the legwork now – good deals are always in season.
The post 4 top affiliate search partners to consider: A breakdown appeared first on Search Engine Land.
Increasing SERP visibility with structured data and schema testing
Written on September 1, 2023 at 3:02 pm, by admin
A steady decline in organic click-through rate has posed a real challenge for SEOs over the last few years.
With organic ranking becoming increasingly competitive, the best way SEOs can continue to excel is to shift their focus to SERP features and rich results, according to Tanner Zoromski, SEO manager at Merkle, a customer experience management company.
Technical SEO is an effective way to increase this type of visibility, using structured data and schema markup. He explained:
- “Essentially we want to make search engines’ lives easier by using these techniques because as a result, they will reward us with our content ranking higher in the SERP and we’ll have more interaction from searchers.”
Below is a summary of how structured data and schema testing can increase SERP visibility, as presented by Zoromski at SMX Advanced.
Search is changing
Search is always evolving. As new features are deployed, structured data can provide sites with the edge needed to effectively and efficiently communicate with search engines, Zoromski explained:
- “I want to highlight here that Google is continuing to test and implement search changes, relying more on information scraped from sites. And we are seeing more results resembling product listing pages – which lends clues to the direction Google is looking to take its product.”
- “This means we need to question how to stay at the forefront of this. How do we communicate most effectively with search engines? And how do we make sure that despite these different changes, we’re still present and showing up first?”
Understanding the breadth of schema
The SEO expert went on to explain the importance of knowing what schemes are available and how to correctly deploy them for maximum impact:
- “There are more than 32 different types of schema that can be implemented to interact with over 16 different SERP features.”
- “Whether we’re publishing articles or blogs or we’re reviewing movies, it’s important to understand what schema we can leverage to help search engines to index our sites.”
Winning with product schema
SEOs can implement product schema, images and descriptions across all pages to win feature snippets – otherwise known as position zero, as Zoromski explained:
- “Implement this type of schema to prevent traffic from going to competitors.”
- “Make sure your brand is putting its best foot forward and answering the questions that consumers want to know about their products.”
- “Additionally, make sure that the narrative stays on-site and on-topic for what your brand needs, so you are winning at position zero. Schema is the way to do that on this one.”
Results to shout about
Zoromski shared a case study that he had personally worked on, in which his team was able to double organic sessions for a recipe website by correcting microdata and replacing it with structured data:
- “My team was presented with a challenge – to review and address some improperly nested microdata that was causing errors in recipe video and article markup on a blog.”
- “When we looked at the SERP for this keyword, the top organic listings all had structured data – there wasn’t an organic listing in the top search results, which hammers home the need to implement structured data and be visible.”
- “Our approach was to recommend the client remove the microdata and replace it with JSON-LD markup instead. We provided JSON-LD templates for the structured data types Article, Video, Recipe, as well as Recipe with nested Video.”
- “As a result, the recipe and article structured data returned a 101% increase in organic sessions.”
- “This is a pretty clear demonstration that having schema on those pages really elevated their visibility and their engagement with searchers.”
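For reference, the “Recipe with nested Video” structured data type the case study mentions looks roughly like this when built programmatically and serialized for a `<script type="application/ld+json">` block. All values below are placeholders:

```python
import json

# Placeholder Recipe markup with a nested VideoObject, following the
# schema.org vocabulary used by Google's rich results.
markup = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Example Chocolate Cake",
    "author": {"@type": "Person", "name": "Example Author"},
    "recipeIngredient": ["2 cups flour", "1 cup sugar"],
    "video": {
        "@type": "VideoObject",
        "name": "How to make the cake",
        "description": "Step-by-step video.",
        "thumbnailUrl": "https://example.com/thumb.jpg",
        "uploadDate": "2023-01-01",
    },
}

print(json.dumps(markup, indent=2))
```

Emitting JSON-LD from a template like this avoids the nesting mistakes that improperly interleaved microdata attributes tend to cause.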
Tracking performance
From a testing standpoint, Zoromski advised building and monitoring your own SEO tests to track performance, taking multiple factors into consideration, such as:
- Establishing the metrics and dimensions being impacted.
- Building customized reports to monitor KPI performance.
- Defining testing timelines to ensure results within a specific period.
- Controlling for internal and external variables such as dev releases and site promotions.
- Allowing room for assumptions that could impact results.
- Relaying findings across teams.
AI and the future of schema
Following recent advancements in AI, Zoromski added that schema will most certainly play a role in this area moving forward – and it’s a situation SEOs need to monitor closely. He concluded:
- “With all the buzz around ChatGPT, Bing and Bard, we are seeing ramifications and effects across digital – and SEO is absolutely going to be affected.”
- “Search behavior is going to change as AI provides more natural language answers as opposed to a list of links.”
- “As SEOs, we need to make sure that in the future, our sites are equipped and able to effectively interact with these different changes. So keep on the lookout and make sure that you’re at the forefront of this.”
The post Increasing SERP visibility with structured data and schema testing appeared first on Search Engine Land.
Google introduces new Limited Ads Serving policy
Written on August 31, 2023 at 12:00 pm, by admin
Google Ads is introducing a new policy to combat scams and help prevent misleading ads.
The platform will now have a “get-to-know-you” period for advertisers it doesn’t know well. During this time, Google Ads may limit how many impressions unfamiliar advertisers receive.
The Limited Ads Serving policy will apply when an advertiser targets specific brands in their campaign but the relationship between the ad and brand is unclear, Google said.
This gradual rollout aims to curb bad actors while giving legitimate advertisers time to clarify their branding strategies on the platform before they’re rewarded with full reach.
Why we care. Implementing stricter ad policies could build user trust, giving people more confidence to click on and buy from brands advertising on Google. The actual impact will likely be small for advertisers, but this could help some brands by reducing the reach of low-quality advertisers targeting them.
What’s next? Google Ads will notify advertisers impacted by the new policy. Those advertisers will get guidance on meeting the requirements to reach what Google calls “qualified status.”
Google Ads plans to slowly phase in enforcement before gradually expanding the policy’s reach.
Measuring trust. Google Ads shared how it will gauge an advertiser’s trustworthiness based on its track record:
- User feedback: Google Ads will closely monitor user feedback and consider negative and positive reviews.
- Advertising history: Google Ads will analyze whether advertisers have a good track record of adhering to its advertising policies.
- Advertiser Identity Verification: The platform confirmed that completing this step is an “important” factor in establishing trust between users and advertisers.
Help for advertisers. Google Ads stated that it will provide advertisers with advice on how to create clear ads – for example, pinning their domain to the title of the ad, especially if they are not a widely known brand.
What is Google saying? Advertisers without a record of good behavior could have their impressions limited under this policy until they build their track record, a Google Ads spokesperson told Search Engine Land:
- “While we want to allow users the opportunity to interact with relevant and helpful ads, this policy will reduce the chance that they’ll see a misleading or confusing ad from an advertiser with an unproven track record.”
- “It’s important to us that we keep our platform open to new advertisers and give them the opportunity to deliver a helpful experience. This policy won’t block or remove any ad from our platform, and any limitations on an ad will only apply in certain scenarios like when a user could be confused by an advertiser’s brand identity. We’ll be rolling out this policy gradually and making adjustments to ensure it’s working effectively.”
The post Google introduces new Limited Ads Serving policy appeared first on Search Engine Land.
