A Declining Internet
Written on May 16, 2025 at 5:57 am, by admin
For as broad and difficult a problem as running a search engine is, and for how many competing interests are involved, Google ran a pretty clean show when Matt Cutts was there. Some of what they did before the algorithms could catch up was of course fearmongering (e.g. if you sell links you might be promoting fake brain cancer cures), but Google generally did a pretty good job balancing organic and paid search.
Early on, search ads were clearly labeled, and then less so. Ad density was light, and then less so.
On the stock chart it appears as a somewhat regular set of compounded growth elements, but it is a series of decisions: what to measure, what to optimize, what to subsidize, and what to sacrifice.
Savvy publishers could ride whatever signals were over-counted (keyword repetition, links early on, focused link anchor text, keyword domains, etc.) and catch new tech waves (like blogging or select social media channels) to keep growing as the web evolved. In some cases what was once a signal of quality would later become an anomaly … the thing that boosted your rank for years eventually started to suppress your rank as new signals were created and signals composed of ratios of other signals got folded into ranking and re-ranking.
Over time, as organic growth became harder, the money guys started to override the talent, like in 2019 when a Google “code yellow” had the ads team push the organic search and Chrome teams to intentionally degrade the user experience to drive increased search query volume:
“I think it is good for us to aspire to query growth and to aspire to more users. But I think we are getting too involved with ads for the good of the product and company.” - Googler Ben Gomes
A healthy and sustainable ecosystem relies upon the players at the center operating a clean show.
If they decide not to, and eat the entire pie, things fall apart.
One set of short-term optimizations is another set of long-term failures.
The specificity of an eHow article gives it a good IR score, AdSense pays for a thousand similar articles to be created, and then the “optimized” ecosystem takes on a shallow sameness, which requires creating new ranking signals.
Q1 2025 was the first quarter in the history of the company in which the Google partner network represented less than 10% of Google ad revenues.
Google’s fortunes have never been more misaligned with web publishers than they are today. This statement becomes more true each day that passes.
That ecosystem of partners is hundreds of thousands of publishers representing millions of employees. Each with their own costs and personal optimization decisions.
Publishers create feature works, which are expensive, and then cross-subsidize that expensive work with cheaper & more profitable pieces. They see search traffic flowing to some type of page which is seemingly outperforming today and assume that is a strategy which will carry them into the future. But hitting the numbers today can mean missing them next year, as the ranking signal mix squeezes the profits out of those “optimizations,” and what drove higher traffic today becomes part of a negative sitewide classifier that lowers rankings across the board in the future.
Last August Googler Ryan Moulton published a graph of newspaper employment from 2010 until now, showing about a 70% decline. That 70% decline also doesn’t factor in that many mastheads have been rolled up by private equity players which lever them up on debt and use all the remaining blood to make interest payments - sometimes to themselves - while sticking taxpayers with the losses from underfunded pension plans.
The quality of the internet that we’ve enjoyed for the last 20 years was an overhang from when print journalism still made money. The market for professionally written text is now just really small, if it exists at all.
Ryan was asked “what do you believe is the real cause for the decline in search quality, then? Or do you think there hasn’t been a decline?”
His now deleted response stated “It’s complicated. I think it’s both higher expectations and a declining internet. People expect a lot more from their search results than they used to, while the market for actually writing content has basically disappeared.”
The above is the already baked cake we are starting from.
The cake where blogs were replaced with social feeds, newspapers got rolled up by private equity players, and larger broad “authority” branded sites partnered with money guys to paste on affiliate sections while indy affiliate sites were buried … the algorithmic artifacts of Google first promoting the funding of eHow, then responding to the success of entities like Demand Media with Vince, Panda, Penguin, and the Helpful Content Update.
The next layer of the icky blurry line is AI.
“We have 3 options: (1) Search doesn’t erode, (2) we lose Search traffic to Gemini, (3) we lose Search traffic to ChatGPT. (1) is preferred but the worst case is (3) so we should support (2)” - Google’s Nick Fox
So long as Google survives, everything else is non-essential.
StackOverflow questions over time, source SEDE; sadface, lunch has been eaten pic.twitter.com/tXZShoIBfG— Marc Gravell (@marcgravell) May 15, 2025
AI Overview distribution is up 116% over the past couple of months.
Yep, I have seen this too. AIOs surged with the March 2025 core update -> AI Overviews Have Doubled (25M AIOs Analyzed)
“The total number of AI Overviews grew by 116% between March 12th (pre-update) and May 6th, according to our database.”
And: “Reddit now appears in 5.5% of… pic.twitter.com/W0FxWo3qlQ— Glenn Gabe (@glenngabe) May 13, 2025
Google features Reddit *a lot* in their search results. Other smaller forums, not so much. A company consisting of many forums recently saw a negative impact from algorithm updates earlier this year.
VerticalScope, behind 1,200+ online communities, just confirmed Google updates have negatively impacted their business.
Some highlights from their Q1 ‘25 earnings update:
- Revenue decreased 8% to $13.6M
- Increased consulting costs for “AI initiatives and SEO optimizations”… pic.twitter.com/1OYsxEKm9e— Glen Allsopp (@ViperChill) May 14, 2025
Going back to that whole bit about how not fully disclosing economic incentives risks promoting fake brain cancer cures … well, how are AI search results constructed? How well do they cite their sources? And are the sources they cite also using AI to generate content?
“its gotten much worse in that “AI” is now, on many “search engines”, replacing the first listings which obfuscates entirely where its alleged “answer” came from, and given that AI often “hallucinates”, basically making things up to a degree that the output is either flawed or false, without attribution as to how it arrived at that statement, you’ve essentially destroyed what was “search.” … unlike paid search which at least in theory can be differentiated (assuming the search company is honest about what they’re promoting for money) that is not possible when an alleged “AI” presents the claimed answers because both the direct references and who paid for promotion, if anyone is almost-always missing. This is, from my point of view anyway, extremely bad because if, for example, I want to learn about “total return swaps” who the source of the information might be is rather important — there are people who are absolutely experts (e.g. Janet Tavakoli) and then there are those who are not. What did the “AI” response use and how accurate is its summary? I have no way to know yet the claimed “answer” is presented to me.” - Karl Denninger
The eating of the ecosystem is so thorough Google now has money to invest in Saudi Arabian AI funds.
Periodically ViperChill highlights big media conglomerates which dominate the Google organic search results.
One of the strongest horizontal publishing plays online has been IAC. They’ve grown brands like Expedia, Match.com, Ticketmaster, Lending Tree, Vimeo, and HSN. They always show up in the big publishers dominating Google charts. In 2012 they bought About.com from the New York Times and broke About.com into vertical sites like The Spruce, Very Well, The Balance, TripSavvy, and Lifewire. They have some old sites like Investopedia from their 2013 ValueClick deal. And then they bought out the magazine publisher Meredith, which publishes titles like People, Better Homes and Gardens, Parents, and Travel + Leisure. What does their performance look like? Not particularly good!
DDM reported just 1% year-over-year growth in digital advertising revenue for the quarter. It posted $393.1 million in overall revenue, also up 1% YOY. DDM saw a 3% YOY decline in core user sessions, which caused a dip in programmatic ad revenue. Part of that downturn in user engagement was related to weakening referral traffic from search platforms. For example, DDM is starting to see Google Search’s AI Overviews eat into its traffic.
Google’s early growth was organic through superior technology, then clever marketing via their toolbar, and later a set of forced bundlings on Android combined with payola for default search placements in third party web browsers. A few years ago the UK government did a study which claimed that even if Microsoft gave Apple a 100% revshare on Bing, they still couldn’t compete with the Google bid for default search placement in Apple Safari.
Microsoft offered over a 100% ad revshare to set Bing as the default search engine and went so far as discussing selling Bing to Apple in 2018 - but Apple stuck with Google’s deal.
In search, if you are not on Google you don’t exist.
As Google grew out various verticals they also created ranking signals which were in some cases parasitical, and in other cases purely anticompetitive. To this day Google is facing billions of dollars in new suits across Europe over their shopping search strategy.
The Obama administration was an extension of Google, so the FTC gave Google a pass in spite of discovering some clearly anticompetitive behavior with real consumer harm. The Wall Street Journal published a series of articles after obtaining half the pages of the FTC’s research into Google’s conduct:
“Although Google originally sought to demote all comparison shopping websites, after Google raters provided negative feedback to such a widespread demotion, Google implemented the current iteration of its so-called ‘diversity’ algorithm.”
What good is a rating panel if you get to keep re-asking the questions again in a slightly different way until you get the answer you want? And then place a lower quality clone front and center simply because it is associated with the home team?
“Google took unusual steps to “automatically boost the ranking of its own vertical properties above that of competitors,” the report said. “For example, where Google’s algorithms deemed a comparison shopping website relevant to a user’s query, Google automatically returned Google Product Search – above any rival comparison shopping websites. Similarly, when Google’s algorithms deemed local websites, such as Yelp or CitySearch, relevant to a user’s query, Google automatically returned Google Local at the top of the [search page].””
The forced ranking of house properties is even worse when one recalls they were borrowing third party content without permission to populate those verticals.
Now with AI there is a blurry line of borrowing where many things are simply probabilistic. And, technically, Google could claim they sourced content from a third party which stole the original work or was a syndicator of it.
As Google kept eating the pie they repeatedly overrode user privacy to boost their ad income, while using privacy as an excuse to kneecap competing ad networks.
Remember the old FTC settlement over Google’s violation of Safari browser cookies? That is the same Google which planned on deprecating third party cookies in Chrome and was even testing hiding user IP addresses so that other ad networks would be screwed. Better yet, online businesses might need to pay Google a subscription fee of some sort to efficiently filter through the fraud conducted in their web browser.
HTTPS everywhere was about blocking data leakage to other ad networks.
AMP was all about stopping header bidding. It gave preferential SERP placement in exchange for using a Google-only ad stack.
Even as Google was dumping tech costs on publishers, they were taking a huge rake of the ad revenue from the ad serving layer: “Google’s own documents show that Google has siphoned off thirty-five cents of each advertising dollar that flows through Google’s ad tech tools.”
After acquiring DoubleClick to further monopolize the online ad market, Google merged user data for their own ad targeting, while hashing the data to block publishers from matching profiles:
“In 2016, as part of Project Narnia, Google changed that policy, combining all user data into a single user identification that proved invaluable to Google’s efforts to build and maintain its monopoly across the ad tech industry. … After the DoubleClick acquisition, Google “hashed” (i.e., masked) the user identifiers that publishers previously were able to share with other ad technology providers to improve internet user identification and tracking, impeding their ability to identify the best matches between advertisers and publisher inventory in the same way that Google Ads can. Of course, any purported concern about user privacy was purely pretextual; Google was more than happy to exploit its users’ privacy when it furthered its own economic interests.”
This chat was deep in the spoliation evidence of the cases (it wasn’t purged) to demonstrate the substantive discussions Google senior execs (Sissie) have over chat. My read is she is weighing in on the same issue from a different forum (privacy law in EU). 2/2 pic.twitter.com/11EBO7xqqh— Jason Kint (@jason_kint) May 9, 2025
In terms of cost, I really don’t think the O&O impact has been understood too, especially on YouTube. - Googler David Mitby
Did we tee up the real $ price tag of privacy? - Googler Sissie Hsiao
Google continues to spend billions settling privacy-related cases. Settling those suits out of court is better than having full discovery be used to generate a daisy chain of additional lawsuits.
As the Google lawsuits pile up, evidence of how they stacked the deck becomes more clear.
Courtesy of SEO Book.com
Google's Hyung-Jin Kim Shares Google Search Ranking Signals
Written on May 16, 2025 at 5:57 am, by admin
On February 18, 2025 Google’s Hyung-Jin Kim was interviewed about Google’s ranking signals. Below are notes from that interview.
“Hand Crafting” of Signals
Almost every signal, aside from RankBrain and DeepRank (which are LLM-based), is hand-crafted and thus able to be analyzed and adjusted by engineers.
- To develop and use these signals, engineers look at the data and then take a sigmoid or other function and figure out the threshold to use. So the “hand crafting” means that Google takes all those sigmoids and figures out the thresholds (a rough illustrative sketch follows after this list).
- In the extreme, hand-crafting means that Google looks at the relevant data and picks the mid-point manually.
- For the majority of signals, Google takes the relevant data (e.g., webpage content and structure, user clicks, and label data from human raters) and then performs a regression.
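To make the “sigmoid plus threshold” idea concrete, here is a minimal sketch in Python. It uses a simple grid search as a stand-in for the regression described above, and the data, function names, and threshold are all invented for illustration; nothing here is Google’s code.

```python
# Illustrative sketch only: fit a sigmoid "curve" to a raw signal against
# human-rater labels, then read off a threshold, roughly as described in
# the interview notes. All names and data here are invented.
import numpy as np

def sigmoid(x, midpoint, steepness):
    """Logistic curve mapping a raw signal value to a 0..1 score."""
    return 1.0 / (1.0 + np.exp(-steepness * (x - midpoint)))

# Hypothetical training data: a raw signal (say, a click-based ratio) and
# rater labels (1 = relevant, 0 = not relevant).
raw_signal = np.array([0.02, 0.05, 0.08, 0.12, 0.15, 0.20, 0.25, 0.40, 0.60, 0.85])
rater_label = np.array([0,    0,    0,    0,    1,    0,    1,    1,    1,    1])

# "Hand crafting": sweep candidate midpoints and steepnesses, keep the best
# fit, and treat the fitted midpoint as a threshold an engineer can inspect
# or manually override.
best = None
for midpoint in np.linspace(0.05, 0.5, 46):
    for steepness in (5, 10, 20, 40):
        error = np.mean((sigmoid(raw_signal, midpoint, steepness) - rater_label) ** 2)
        if best is None or error < best[0]:
            best = (error, midpoint, steepness)

error, midpoint, steepness = best
print(f"chosen threshold ~ {midpoint:.2f}, steepness = {steepness}, fit error = {error:.3f}")
```

Because the fitted curve and its threshold are explicit numbers rather than opaque learned weights, an engineer can see exactly what to adjust when something breaks, which is the stated appeal of hand-crafted signals.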
Navboost. This was HJ’s second signal project at Google. HJ has many patents related to Navboost and he spent many years developing it.
ABC signals. These are the three fundamental signals. All three were developed by engineers. They are raw, …
- Anchors (A) - a source page pointing to a target page (links). …
- Body (B) - terms in the document …
- Clicks (C) - historically, how long a user stayed at a particular linked page before bouncing back to the SERP. …
ABC signals are the key components of topicality (or a base score), which is Google’s determination of how relevant the document is to a query (a toy combination sketch follows after this list).
- T* (Topicality) effectively combines (at least) these three ranking signals in a relatively hand-crafted way. … Google uses to judge the relevance of the document based on the query term.
- It took a significant effort to move from topicality (which is at its core a standard “old style” information retrieval (”IR”) metric) … signal. It was in a constant state of development from its origin until about 5 years ago. Now there is less change.
- Ranking development (especially topicality) involves solving many complex mathematical problems.
- For topicality, there might be a team of … engineers working continuously on these hard problems within a given project.
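As a toy illustration of what a “relatively hand-crafted” blend of the A/B/C signals could look like, consider the sketch below. The weights, the floor rule, and the function name are invented; the real T* formula is not public.

```python
# Toy illustration of a hand-crafted topicality ("T*") combination of the
# A/B/C signals described above. Weights and structure are invented.
def topicality(anchor_score: float, body_score: float, click_score: float) -> float:
    """Blend the three base signals (each assumed normalized to 0..1)."""
    # Hand-tuned weights an engineer can inspect and adjust directly,
    # rather than opaque learned parameters.
    weights = {"anchors": 0.3, "body": 0.4, "clicks": 0.3}
    blended = (weights["anchors"] * anchor_score
               + weights["body"] * body_score
               + weights["clicks"] * click_score)
    # A hand-picked floor keeps a page with strong body relevance from being
    # wiped out by missing click history (e.g., a brand-new page).
    return max(blended, 0.5 * body_score)

print(topicality(anchor_score=0.2, body_score=0.9, click_score=0.0))
```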
The reason why the vast majority of signals are hand-crafted is that if anything breaks Google knows what to fix. Google wants their signals to be fully transparent so they can trouble-shoot them and improve upon them.
- Microsoft builds very complex systems using ML techniques to optimize functions. So it’s hard to fix things - e.g., to know where to go and how to fix the function. And deep learning has made that even worse.
- This is a big advantage of Google over Bing and others. Google faced many challenges and was able to respond.
- Google can modify how a signal responds to edge cases, for example in response to various media/public attention challenges …
- Finding the correct edges for these adjustments is difficult, but would be easy to reverse engineer and copy from looking at the data.
Ranking Signals “Curves”
Google engineers plot ranking signal curves.
The curve fitting is happening at every single level of signals.
If Google is forced to give information on clicks, URLs, and the query, it would be easy for competitors to figure out the high-level buckets that compose the final IR score. High-level buckets are:
- ABC — topicality
- Topicality is connected to a given query
- Navboost
- Quality
- Generally static across multiple queries and not connected to a specific query.
- However, in some cases Quality signal incorporates information from the query in addition to the static signal. For example, a site may have high quality but general information so a query interpreted as seeking very narrow/technical information may be used to direct to a quality site that is more technical.
Q* (page quality (i.e., the notion of trustworthiness)) is incredibly important. If competitors see the logs, then they have a notion of “authority” for a given site.
Quality score is hugely important even today. Page quality is something people complain about the most.
- HJ started the page quality team ~ 17 years ago.
- That was around the time when the issue with content farms appeared.
- Content farms paid students 50 cents per article and they wrote 1000s of articles on each topic. Google had a huge problem with that. That’s why Google started the team to figure out the authoritative source.
- Nowadays, people still complain about the quality and AI makes it worse.
Q* is about … This was and continues to be a lot of work but could be easily reverse engineered because Q is largely static and largely related to the site rather than the query.
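A hypothetical sketch of what “largely static but occasionally query-adjusted” quality could look like follows. The site scores, the query classifier, and the adjustment rule are all invented; this only illustrates the shape of the mechanism described above.

```python
# Hypothetical sketch of a mostly static, site-level quality prior ("Q*")
# that receives a small query-dependent adjustment. Scores, the intent
# classifier, and the adjustment rule are all invented.
SITE_QUALITY = {          # static, query-independent priors
    "example-medical-journal.org": 0.92,
    "example-general-blog.com": 0.55,
}

def looks_technical(query: str) -> bool:
    """Crude stand-in for a query-intent classifier."""
    return any(term in query.lower() for term in ("dosage", "protocol", "mechanism"))

def quality(site: str, query: str) -> float:
    q = SITE_QUALITY.get(site, 0.4)
    # Query-dependent tweak: a narrow/technical query slightly boosts sites
    # whose static prior already marks them as high quality.
    if looks_technical(query) and q > 0.8:
        q = min(1.0, q + 0.05)
    return q

print(quality("example-medical-journal.org", "metformin mechanism of action"))
```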
Other Signals
- eDeepRank. eDeepRank is an LLM system that uses BERT, transformers. Essentially, eDeepRank tries to take LLM-based signals and decompose them into components to make them more transparent. HJ doesn’t have much knowledge on the details of eDeepRank.
- PageRank. This is a single signal relating to distance from a known good source, and it is used as an input to the Quality score (see the sketch after this list).
- … (popularity) signal that uses Chrome data.
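One simple way to picture “distance from a known good source” is a breadth-first search over the link graph from a trusted seed set, with the score decaying per hop. The sketch below only illustrates that idea and is not Google’s implementation; the graph, seeds, and decay factor are made up.

```python
# Illustration of "distance from a known good source": breadth-first search
# over a link graph from a trusted seed set, with the score decaying per hop.
from collections import deque

LINK_GRAPH = {
    "seed.example": ["a.example", "b.example"],
    "a.example": ["c.example"],
    "b.example": ["c.example", "d.example"],
    "c.example": [],
    "d.example": ["e.example"],
    "e.example": [],
}
SEEDS = {"seed.example"}

def distance_scores(graph, seeds, decay=0.7):
    """Score = decay ** (link distance from the nearest trusted seed)."""
    dist = {s: 0 for s in seeds}
    queue = deque(seeds)
    while queue:
        node = queue.popleft()
        for target in graph.get(node, []):
            if target not in dist:
                dist[target] = dist[node] + 1
                queue.append(target)
    return {page: decay ** d for page, d in dist.items()}

print(distance_scores(LINK_GRAPH, SEEDS))
```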
Search Index
- HJ’s definition is that the search index is composed of the actual content that is crawled - titles and bodies and nothing else, i.e., the inverted index (a minimal sketch follows after this list).
- There are also other separate specialized inverted indexes for other things, such as feeds from Twitter, Macy’s etc. They are stored separately from the index for the organic results. When HJ says index, he means only for the 10 blue links, but as noted below, some signals are stored for convenience within the search index.
- Query-based signals are not stored, but computed at the time of query.
- Q* - largely static but in certain instances affected by the query and has to be computed online (see above)
- Query-based signals are often stored in separate tables off to the side of the index and looked up separately, but for convenience Google stores some signals in the search index.
- This way of storing the signals allowed Google to …
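A minimal sketch of an inverted index with a few static signals stored alongside each document “for convenience,” as described above, might look like this. The documents, signal names, and values are invented.

```python
# Minimal sketch of an inverted index (term -> posting list) with a few
# static, query-independent signals stored next to each document. All data
# here is invented for illustration.
from collections import defaultdict

documents = {
    1: {"title": "total return swaps explained", "body": "a total return swap is ...",
        "static_signals": {"quality": 0.9, "pagerank_distance": 2}},
    2: {"title": "cheap content farm article", "body": "total return swaps best tips ...",
        "static_signals": {"quality": 0.3, "pagerank_distance": 5}},
}

inverted_index = defaultdict(set)
for doc_id, doc in documents.items():
    for term in (doc["title"] + " " + doc["body"]).lower().split():
        inverted_index[term].add(doc_id)

def lookup(query: str):
    """Intersect posting lists, then return docs with their stored static
    signals. Query-dependent signals would be computed here, at query time."""
    terms = query.lower().split()
    candidates = set.intersection(*(inverted_index.get(t, set()) for t in terms)) if terms else set()
    return [(doc_id, documents[doc_id]["static_signals"]) for doc_id in candidates]

print(lookup("total return swap"))
```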
User-Side Data
By User Side Data, Google’s search engineers mean user interaction data, not the content/data that was created by users. E.g., links between pages that are created by people are not User Side data.
Search Features
- There are different search features - 10 blue links as well as other verticals (knowledge panels, etc). They all have their own ranking.
- Tangram (fka Tetris). HJ started the project to create Tangram to apply the basic principle of search to all of the features.
- Tangram/Tetris is another algorithm that was difficult to figure out how to do well but would be easy to reverse engineer if Google were required to disclose its click/query data. By observing the log data, it is easy to reverse engineer and to determine when to show the features and when to not.
- Knowledge Graph. Separate team (not HJ’s) was involved in its development.
- Knowledge Graph is used beyond being shown on the SERP panel.
- Example — “porky pig” feature. If people query about the relation of a famous person, Knowledge Graph tells traditional search the name of the relation and the famous person, to improve search results - Barack Obama’s wife’s height query example.
- Self-help suicide box example. Incredibly important to figure it out right, and tons of work went into it, figuring out the curves, threshold, etc. With the log data, this could be easily figured out and reverse engineered, without having to do any of the work that Google did.
Reverse Engineering of Signals
There was a leak of Google documents which named certain components of Google’s ranking system, but the documents don’t go into specifics of the curves and thresholds.
The documents alone do not give you enough details to figure it out, but the data likely does.
Courtesy of SEO Book.com
Google Antitrust Leaked Documents
Written on May 16, 2025 at 5:57 am, by admin
User interaction signals
Create relevancy signals out of user reads, clicks, scrolls, and mouse hovers.
Not how search works
Search does not work by delivering results which match a query that ends at the user. This view of search is incomplete.
How search works
The flow of the engagement metrics from the end user / searcher back to the search engine helps the search engine refine the result set.
Fake document understanding
Google looks at the actions of searchers much more than they look at raw documents. If documents elicit a positive reaction from searchers that is proof the document is good. If a document elicits negative reactions then they presume the document is bad.
Google learns from searchers
The result set is designed not just to serve the user, but to create an interaction set where Google can learn from the user & incorporate logged user data into influencing the rankings for future searches.
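A toy sketch of that feedback loop: fold logged interactions back into a per-(query, document) boost, where clicks with long dwell times count as positive votes and quick bounces count against the page. The log format, dwell threshold, and smoothing below are invented and are not Google’s actual systems.

```python
# Toy sketch of folding logged user interactions back into ranking: clicks
# with long dwell count as positive votes for a (query, document) pair,
# quick bounces count against it. Field names are invented.
from collections import defaultdict

click_log = [
    # (query, document, dwell_seconds)
    ("total return swap", "expert-site.example/trs", 180),
    ("total return swap", "expert-site.example/trs", 240),
    ("total return swap", "thin-affiliate.example/trs", 4),
    ("total return swap", "thin-affiliate.example/trs", 6),
]

def interaction_boost(log, good_dwell=30):
    votes = defaultdict(lambda: [0, 0])  # (query, doc) -> [good_clicks, bad_clicks]
    for query, doc, dwell in log:
        if dwell >= good_dwell:
            votes[(query, doc)][0] += 1
        else:
            votes[(query, doc)][1] += 1
    # Smoothed ratio so a single click doesn't dominate the boost.
    return {key: (good + 1) / (good + bad + 2) for key, (good, bad) in votes.items()}

for key, boost in interaction_boost(click_log).items():
    print(key, round(boost, 2))
```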
Dialog is the source of the magic
Each user interaction gives Google data to refine their ranking algorithms and make search smarter.
Happy users provide informed user interactions
Informed user interactions are part of a virtuous cycle which allows Google to better train their models & understand language patterns, then use that understanding to deliver a more relevant search result set.
Prior user behavior is used as a baseline.
Google is not pushing search personalization anywhere near as hard as they once did (at least not outside of localization), but in the above Google states that prior user selections are one of their strongest ranking signals.
Once again, rather than understanding documents directly, they can consider the users who chose the documents. Users can be mapped based on actions beyond standard demographics, so that the interactions of more similar users are given more weight in the result set choices.
Google revenue growth is consistent
Core Google ad revenue grows much more consistently than any other large media business, growing at 20% to 22% year after year in 8 of 9 years, with the one outlier year growing 30%.
Apple is paid by Google to not compete in search.
Apple got around a 50% revshare from the mid 2000s on through the iPhone deal renewal.
Manipulating ad auctions
Google artificially inflates the ad rank of the runner up in some ad auctions to bleed the auction winner dry. Ad pricing is not based on any sort of honest auction mechanism, but rather involves Google looking across at your bids and your reactions to price increases to keep ratcheting up the ad prices they charge you.
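The mechanics are easiest to see in a simplified second-price auction, where the winner pays just above the runner-up’s bid: inflating the runner-up’s effective bid raises the winner’s price without changing who wins. The sketch below is a toy model with invented numbers, not Google’s actual auction.

```python
# Toy second-price auction showing why inflating the runner-up's effective
# bid raises what the winner pays without changing who wins. Numbers and
# the "inflation" multiplier are invented for illustration.
def second_price_auction(bids, runner_up_multiplier=1.0):
    """bids: {advertiser: bid}. Winner pays the (possibly inflated)
    runner-up bid plus a minimum increment."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    runner_up_bid = ranked[1][1] * runner_up_multiplier
    price = min(runner_up_bid + 0.01, bids[winner])  # never above the winner's own bid
    return winner, round(price, 2)

bids = {"advertiser_a": 5.00, "advertiser_b": 2.00}
print(second_price_auction(bids))                            # honest auction
print(second_price_auction(bids, runner_up_multiplier=2.0))  # squeezed winner
```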
Organics below the fold
Google not only pushes down the organic result set with 3 or 4 ads above the regular results, but then they can include other selections scraped from across the web in an information-lite format to try to focus attention back upward. Then, after users get past a single organic search result, it is time to redirect user attention once again using a “People also ask” box.
Google can further segment user demand via ecommerce website styled filters, though some of the filters offered may be for other websites, in addition to things like size, weight, color, price, and location.
Courtesy of SEO Book.com
Your exclusive look at the SMX Next agenda is here
Written on October 7, 2024 at 3:15 pm, by admin

Next month, thousands of seasoned search marketers will gather online to learn next-level SEO, PPC, and generative AI tactics, get in-depth answers to specific questions, and connect with like-minded community members and subject experts.
Are you ready to join them?
Your free SMX Next pass is just a few clicks away, and we can’t wait to host you online, November 13-14. The agenda is now live and ready for you to explore!
It’s all hand-crafted by the Search Engine Land programming committee, including Danny Goodwin, Barry Schwartz, Anu Adegbola, Brad Geddes, Eric Enge, and Greg Finn. Here’s a look at everything you get:
- Two keynote conversations about 2025 SEO and PPC trends, plus live Q&A.
- Actionable sessions on GEO, GenAI, N-E-E-A-T-T, and other critical search topics.
- Coffee Talk meetups with like-minded marketers and subject matter experts.
- Live Q&A with speakers including Amy Hebdon, Fred Vallaeys, Melissa Mackey, and more.
- Instant on-demand access for 180 days so you can watch and rewatch at your own pace.
- A personalized certificate of attendance to showcase your knowledge of the latest in search.
For nearly 20 years, more than 200,000 search marketers from around the world have attended SMX to learn game-changing tactics and make career-defining connections.
Don’t miss your final opportunity in 2024 to join them online for the only training event programmed by Search Engine Land, the industry publication you trust to stay competitive. Grab your free pass now!
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
Visual content and SEO: How to use images and videos in 2025
Written on October 7, 2024 at 3:15 pm, by admin

Many businesses are finding their digital marketing efforts falling flat despite producing content regularly. The culprit?
An outdated approach that neglects the growing importance of visual content in SEO.
With tech giants like Google and Apple prioritizing AI-powered visual search, companies that don’t adapt risk losing visibility and relevance.
Many enterprises lack the centralized strategies and governance needed to effectively manage visual assets across departments.
This article outlines a seven-step process to futureproof your visual content and SEO for 2025.
By implementing these strategies, you can leverage the latest trends, optimize for AI-powered search and significantly boost your online presence and engagement.
How are major giants pivoting features to embrace visual search?
Google is now integrating video and image content, primarily from YouTube, websites and third-party sites, into the Top Insights section of product and AI-generated search results pages.
This change provides users a richer, more engaging experience by offering a diverse range of information beyond text-based results and reviews.
It also allows brands to leverage image and video content to boost visibility and engagement.
Similarly, Apple has released Visual Intelligence with Vision 3, offering new features such as image segmentation and object detection.
These new capabilities allow developers to build more sophisticated and powerful applications that utilize visual information.
Why are visual content and SEO challenging for enterprises and SMEs?
The biggest challenges in visual content and SEO include a lack of centralization, inconsistent policies, governance and knowledge across departments.
Search is multimodal, meaning content creation should focus on customer intent, considering images, videos, PDFs and all other touchpoints and channels.
It is evolving beyond text to include diverse visual content. This shift requires a customer-centric approach that prioritizes intent and experience. Many companies struggle with implementing consistent best practices for visual assets across departments.
With the rise of AI-powered search, it becomes even more critical to centralize all visual assets, ensure they are optimized and consistently distribute them across all channels.
Dig deeper: Visual optimization must-haves for AI-powered search
Top trends in visual content and SEO

Featured images and interactive short-form videos
- These elements are critical for enhancing user experience, as consumers are seeking app-like interactions.
- Platforms such as TikTok and Instagram Reels have popularized short, engaging videos, making them essential for reaching audiences.
- Overall, video content helps with engagement, improves conversions and saturates SERPs.
Personalization
- Tailoring experiences based on audience, journey, demographics, location and intent is vital for brands to succeed.
Mobile dominance
- Since most images and videos are consumed on mobile devices, it is crucial to ensure that your UI, UX and assets are optimized for mobile.
In-video interaction
- Brands are exploring interactive video formats that allow viewers to choose their own path or engage in features like a 360-degree view and zooming.
- Incorporating polls and quizzes can create a more immersive and engaging experience.
7-step process to futureproof your visual content and SEO in 2025

Well-chosen featured images or videos significantly boost a website’s click-through rate (CTR) and encourage user engagement.
It is important to follow best practices such as image relevance to content, high quality, appropriate file size and format and mobile optimization.
1. Curate
Compile a list of all channels, vendors, departments and touchpoints where visual content is created and consumed.
2. Centralize
Establish policies to organize your content. Ensure all images and videos reside in a digital asset management (DAM) system and are accessible via a content delivery network (CDN).
All channels should access images directly from the DAM, avoiding multiple copies of the same image or video sitting in file folders.
3. Optimize
Use high-quality, relevant images with appropriate file formats, image tags, sitemaps and structured data to enhance discovery and visibility.
Leverage Google NLP to check for content marked as inappropriate and prioritize images relevant to the search query.
Ensure visual content doesn’t affect site speed by using next-gen image formats and implementing lazy loading.
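As a minimal sketch of two of the mechanical pieces mentioned above, the snippet below converts an image to a next-gen format (WebP) with Pillow and emits a lazy-loading <img> tag with explicit dimensions. It assumes Pillow is installed, and the file paths, sizes, and helper names are placeholders.

```python
# Minimal sketch of two optimizations mentioned above: converting an image
# to a next-gen format (WebP) with Pillow, and emitting an <img> tag that
# lazy-loads with explicit dimensions. Paths and sizes are placeholders.
from pathlib import Path
from PIL import Image

def to_webp(source: Path, quality: int = 80) -> Path:
    """Convert a JPEG/PNG to WebP next to the original file."""
    target = source.with_suffix(".webp")
    with Image.open(source) as img:
        img.save(target, "WEBP", quality=quality)
    return target

def lazy_img_tag(src: str, alt: str, width: int, height: int) -> str:
    """Explicit dimensions prevent layout shift; loading="lazy" defers
    offscreen images so visuals don't drag down page speed."""
    return (f'<img src="{src}" alt="{alt}" width="{width}" height="{height}" '
            f'loading="lazy">')

# Example usage (assumes hero.jpg exists alongside this script):
# webp_path = to_webp(Path("hero.jpg"))
# print(lazy_img_tag(str(webp_path), alt="Hotel lobby at dusk", width=1200, height=800))
```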
4. Distribute
Ensure content is consumed from one central location. Use cloud infrastructure and a CDN to host and distribute your assets efficiently.
5. Application, experience and infrastructure
Leverage entity search to gain a competitive edge by implementing a clear visual hierarchy and enhancing content scannability.
Well-structured, topical pages with relevant images and videos perform better. Develop snackable videos for your unique selling proposition (USP) and customer reviews.
Create content suitable for visual snippets, such as how-to guides and recipes. The goal is to optimize for Google’s multisearch feature, which combines image, video and text searches.
Infrastructure is one of the biggest gaps most businesses face.
Most DAM systems are designed only to store images and lack the capability to optimize them easily.
Having a DAM that provides real-time scoring of your asset quality and connects seamlessly with your websites and other channels is essential.
Dig deeper: Future-proofing digital experience in AI-first semantic search
6. Governance and checklist
Establish robust governance and checklists around quality, consistency and usage across all departments.
Continuously test which images are performing well in SERPs and conversions to refine your checklist.
7. Metrics and KPIs
Develop metrics to track SERP and rich snippet saturation, presence in AI overviews, overall click-through rates (CTR), clicks from visual search, engagement rates and page bounce rates.
As Google and other search engines incorporate conversational AI, short videos, images, and social media posts into search results – shifting away from traditional website listings – these strategies will help you effectively use visual content in 2025.
Success stories
Using the seven-step process mentioned above, our clients were able to drive phenomenal success for their images in search.
A popular hotel in Georgetown saw a 104% increase in the number of times images appeared in search results versus the previous period.
A Massachusetts Resort and Spa saw a staggering lift in its visual search performance:
- + 871% increase in the number of times images appeared in search results versus the previous period.
- + 101% increase in overall image impressions versus the previous period.
Dominate visual search with well-optimized images and videos
Video and images are powerful tools for enhancing SEO and boosting online visibility. As LLMs become increasingly skilled at understanding and generating both text and visuals, you must prepare for more integrated visual-textual content creation and optimization strategies.
By prioritizing these areas, you can stay ahead of the curve in visual content and SEO for 2025. Embracing the latest technologies and features released by major tech companies will enable you to enhance your online presence and improve searchability.
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
Structured data and SEO: What you need to know in 2025
Written on October 7, 2024 at 3:15 pm, by admin

Many search engines rely on structured data to enhance user experiences – and this trend will likely intensify in 2025.
For this reason, structured data is no longer a “nice-to-have” but an essential part of any SEO strategy.
Here’s what you need to know about structured data, including why it matters, important trends, key schema types, advanced techniques and more.
What is structured data?
Structured data is a standardized format for organizing and labeling page content that helps search engines understand it more effectively.
Google uses structured data to create enhanced listings, rich results and various features in search engine results pages (SERPs).
Being included in these features can boost your website’s visibility and organic reach, especially in entity-based searches.
Vocabulary
The most commonly used vocabulary for structured data is Schema.org, an open-source framework that provides an extensive library of types and properties.
Schema.org includes hundreds of predefined types, such as Product, Event or Person, and properties like name, price and description.
Format
The preferred format for implementing structured data is JSON-LD (JavaScript Object Notation for Linked Data), which is endorsed by Google and other search engines.
JSON-LD encapsulates structured data within a <script> tag, keeping it separate from the core HTML.
This approach makes it more flexible, easier to implement and less intrusive. JSON-LD is particularly useful for dynamic content on larger websites.
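As a small sketch of what that looks like in practice, the snippet below builds a basic Product object with a nested Offer as a Python dict and wraps it in the script tag described above. The product values are placeholders.

```python
# Sketch of generating a JSON-LD block and wrapping it in the <script> tag
# described above. The product values are placeholders.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Trail Shoe",
    "description": "Lightweight trail running shoe.",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

def jsonld_script(data: dict) -> str:
    """Wrap structured data in the script tag search engines look for."""
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")

print(jsonld_script(product))
```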
Validation
Correct implementation of structured data is essential to be eligible for rich results.
To verify that structured data is properly implemented and can be processed by search engines, use a dedicated validation tool. These tools check for errors or omissions in the schema, ensuring the markup is valid and effective.
Dig deeper: What is technical SEO?
Why structured data matters more than ever
Structured data enables search engines to interpret website content more deeply, enhancing how pages are indexed and presented in search results.
It allows brands to reach audiences in less competitive areas of search, such as voice and image search, allowing sites to drive traffic and engagement outside of traditional SEO.
Zero-click search and brand authority
More and more SERP features, like knowledge panels and featured snippets, depend on structured data to provide answers directly in search results. This means users can get information without clicking through the publisher’s site.
This rise of so-called zero-click searches has made structured data even more indispensable in SEO.
While these features offer limited opportunity to drive visits, they can boost organic impressions, enhance brand recognition and maintain user interaction with the brand.
Being regularly shown in rich results reinforces top-of-mind awareness (TOMA) – and in the E-E-A-T world, a trusted and authoritative brand is crucial for success in SEO.
Key schema types to use in 2025
While new schema types emerge regularly and should be tested where relevant, several “evergreen” types have proven their effectiveness over time.
Ecommerce
Product schema (often used together with Offer and Review schema) is essential for ecommerce, providing details about products, such as price, availability and reviews.
This schema powers rich snippets like product carousels and review stars in SERPs, significantly improving click-through rates.
Merchant listings (a combination of Product and Offer schema) can be a great way for new ecommerce sites to gain early visibility and traffic through Google Shopping.
AggregateOffer schema is ideal for marketplace websites, as it enables multiple vendors’ offers to be represented for the same product, allowing users to compare prices and options.
Informational
FAQ schema allows websites to present common questions and answers directly in the search results.
It is highly effective for improving user engagement by providing concise answers for conversational queries and can appear in rich results and voice search.
Q&A schema is designed for pages where users can submit questions and multiple answers are provided and is often found in forums or community-based platforms.
It powers a Q&A carousel that features both questions and answers directly in the SERP, increasing visibility and CTR for long-tail queries and conversational searches.
Article and WebPage schema can boost visibility in Google News, Discover and top stories carousels and get exposure in voice search.
Events
Event schema is used to mark up details about virtual or physical events, such as concerts, conferences, webinars or local meetups.
It provides specifics like the date, location, start and end times, ticket availability, and performers. This information can be included in Google’s event listings, enhancing visibility in local or event-related searches.
Newly supported properties for BroadcastEvent and ScreeningEvent enhance how live events and screenings are presented in search.
Dig deeper: How to deploy advanced schema at scale
How to use structured data in 2025
While the applications below are not entirely new, their importance is set to increase through 2025 as user behaviors continue to shift.
Entity-based search
Entity-based search is where search engines prioritize entities – people, places, things and concepts – over individual keywords.
Instead of focusing on isolated words and relationships between them, search engines now better understand connections between entities and how they fit into a broader context.
Structured data like Person, Organization or Place schema can clearly define relevant entities, enhancing their visibility in Knowledge Graphs and entity-based results.
Likewise, SameAs schema can be used to help search engines understand that an entity mentioned on one page is the same as an entity mentioned elsewhere.
By marking up an entity with SameAs and linking it to trusted external sources, such as Wikidata, Wikipedia or authoritative social media profiles, it’s possible to reinforce the association and improve its recognition and reach in Knowledge Graphs and rich search results.
Dig deeper: How to use entities in schema to improve Google’s understanding of your content
Speakable
Speakable schema (in beta at Google) is an important tool for optimizing content for voice search results.
It helps search engines identify which sections of a webpage are best suited for audio playback, including Google Assistant-enabled devices using TTS.
The goal is to provide concise, clear answers to users’ questions in spoken format. This is especially useful for news websites and publishers, as they can mark up critical content to be featured in voice responses.
Multimodal search
Multimodal search allows users to query search engines using various forms of input, such as text, images and voice, sometimes combined in a single query.
This is largely driven by AI models designed to process multiple data types simultaneously.
Structured data like VideoObject and ImageObject ensures that multimedia content is properly understood, indexed and ranked.
Schema nesting
Schema nesting allows for the representation of more complex relationships within structured data.
By nesting one schema type within another – such as a Product schema within an Offer schema, further nested within a LocalBusiness schema – it’s possible to communicate layered information about product availability, pricing and location.
This enables search engines to understand not just individual data points but how they are connected, leading to more contextually rich search results, like specific local availability and offers tied to individual businesses.
Another example might be a Recipe schema nested within a HowTo schema, further nested within a Person schema.
This communicates that a specific person (author or chef) created the recipe, which in turn contains step-by-step instructions on how to prepare the dish.
- Person schema would include the chef’s name, bio and social profiles.
- HowTo schema would describe the cooking process, including the steps and required materials.
- Recipe schema provides details like the list of ingredients, preparation time and nutritional information.
This setup would enable search engines to display rich snippets that include the recipe’s creator, ingredients and cooking instructions in a clear and organized manner.
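As a hedged sketch of that nesting in markup form: the example above describes nesting Person, HowTo and Recipe, and one common, valid Schema.org way to express those relationships is a Recipe that nests its author (Person) and its step-by-step instructions (HowToStep), as below. The recipe values are placeholders.

```python
# Sketch of nested structured data: a Recipe that nests its author (Person)
# and its step-by-step instructions (HowToStep). Values are placeholders.
import json

recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Weeknight Tomato Soup",
    "author": {                       # nested Person: who created the recipe
        "@type": "Person",
        "name": "Example Chef",
        "sameAs": "https://www.example.com/chef-profile",
    },
    "recipeIngredient": ["4 tomatoes", "1 onion", "2 cups vegetable stock"],
    "totalTime": "PT30M",
    "recipeInstructions": [           # nested steps: how to prepare the dish
        {"@type": "HowToStep", "text": "Dice the onion and tomatoes."},
        {"@type": "HowToStep", "text": "Simmer everything in stock for 20 minutes."},
        {"@type": "HowToStep", "text": "Blend until smooth and season."},
    ],
}

print(json.dumps(recipe, indent=2))
```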
While the primary benefits of schema nesting include improved contextual understanding and enhanced rich results, nesting also adds flexibility for meeting different search intents, allowing search engines to prioritize information based on the query.
Maximize your search visibility with structured data
Structured data is already a critical driver of SEO success for many websites, both large and small.
With hundreds of schema types, dozens of Google SERP features and increasing applications in AI and voice search, its importance will only continue to grow in 2025.
As Google and other search platforms continue to change, carefully and creatively leveraged structured data can provide a significant competitive edge, allowing you to gain visibility in existing features and prepare for future opportunities.
Dig deeper: How schema markup establishes trust and boosts information gain
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
The art of AI-enhanced content: 8 ways to keep human creativity front and center
Written on October 7, 2024 at 3:15 pm, by admin

As artificial intelligence (AI) becomes increasingly integrated into content creation, it’s easy to feel like the human element of writing might be overshadowed.
While AI offers powerful tools that enhance productivity, streamline workflows and even generate content, it’s essential to retain your personal touch to create engaging and authentic material.
This article explores how you can balance AI with your creativity, ensuring your unique voice shines through.
1. Understand AI’s role in content creation
AI tools can help generate ideas, draft content, edit and optimize for search, making content creation faster and more efficient.
However, AI lacks the nuanced understanding of human emotions, context and cultural insights that come naturally to humans.
Recognizing AI’s limitations is the first step in using it as a supportive tool rather than a replacement for human creativity.
Use AI for repetitive tasks, not original thought
AI excels at handling repetitive tasks, but it falls short when it comes to the subtleties of original thought, humor and storytelling.
By delegating routine tasks to AI, you can free up valuable time to focus on the creative elements that rely on human intuition and personal experience.
- Automate research and data analysis: Use AI to gather information, analyze trends or compile statistics, allowing you to spend more time crafting a compelling narrative.
- Draft assistance: Let AI provide you with a starting point or outline for your content, but take the lead in shaping it into something uniquely yours.
Dig deeper: AI content creation: A beginner’s guide
2. Use AI as a creative partner, not your replacement
Think of AI as a co-creator. It’s there to assist, suggest and optimize – not to replace your unique voice.
AI can provide a structural backbone or help refine your content, but your ideas, insights and personal experiences are what will set your work apart.
Refine AI outputs with human insight
AI-generated content often needs a human touch to make it relatable and engaging.
Review and edit AI suggestions to align them with your voice, adding personal anecdotes, humor and insights that reflect your expertise and personality.
- Inject personal experiences: Add stories, examples or perspectives that only you can provide. This personal touch creates a connection with your audience that AI cannot replicate.
- Adjust for tone and style: AI can mimic various tones but often produces content that lacks warmth or emotional depth. Edit to include the subtleties of language that resonate with your readers.
Dig deeper: How to make your AI-generated content sound more human
3. Customize AI to match your voice
Many AI tools offer customization features that allow you to adjust settings to match your preferred tone, style and language. This ensures that AI outputs align more closely with your established brand voice.
Define your voice
To ensure consistency, take time to define your writing style. Are you formal, conversational, witty or authoritative?
Establish guidelines that reflect your voice and regularly tweak AI settings to match these preferences.
- Set clear parameters: Provide the AI with detailed prompts that include your desired tone, target audience and any specific stylistic nuances. The more context you provide, the better the AI can tailor its outputs to your needs.
- Iterate and refine: Use the feedback loop to your advantage. If the AI-generated content feels off, refine the prompts and give feedback to nudge the tool closer to your style.
Dig deeper: 3 ways to add a human touch to AI-generated content
4. Provide clear prompts and feedback to AI
AI relies heavily on the quality of the instructions it receives.
Vague prompts can lead to generic content, while detailed, contextual prompts can produce outputs that better reflect your intentions.
Give context-rich prompts
Instead of generic commands, provide AI tools with specific guidelines.
Don’t just ask AI to “Write an article introduction.”
Try “Write an engaging introduction for a blog post about sustainable fashion, targeted at eco-conscious millennials, in a friendly and approachable tone.”
- Focus on audience needs: When directing AI, always keep your audience in mind. Prompt AI to address their pain points, preferences and language style to create content that speaks directly to them.
- Refine outputs with detailed feedback: If AI’s first attempt isn’t quite right, refine it by tweaking prompts or adding more detailed instructions. This iterative process helps align AI-generated content with your vision.
Dig deeper: Advanced AI prompt engineering strategies for SEO
5. Enhance emotional resonance with AI content
AI-generated content can often feel sterile or detached.
While AI is good at generating logical and coherent text, it struggles to evoke emotions. Enhancing content with emotional resonance is where human creativity comes into play.
Add emotion, empathy and authenticity
After the AI generates content, go through it with a fine-tooth comb to add emotional elements that AI might miss. Infuse the text with empathy, humor, passion or urgency to make it more engaging.
- Use relatable language: Replace formal or robotic language with conversational and relatable phrases. Address your readers as if you’re speaking directly to them, making your content feel more personal.
- Connect through storytelling: Share personal stories, case studies or real-world examples that illustrate key points. These elements help bridge the gap between AI’s capabilities and human touch.
6. Personalize content for your audience
AI can analyze data to personalize content suggestions based on user preferences, but it’s your understanding of your audience that will make your content truly resonate. Use AI’s insights as a guide, but rely on your knowledge of your readers to fine-tune the final product.
Blend data-driven insights with human understanding
AI can highlight what topics are trending or what keywords to focus on, but only you can interpret this data in a way that speaks directly to your audience’s needs and values.
- Craft tailored messages: Use AI to segment your audience and suggest tailored content, but ensure the final message reflects your understanding of their motivations and desires.
- Maintain authenticity: Even when personalizing at scale, keep your brand’s authenticity intact. Avoid overly automated responses that can feel impersonal or generic.
Dig deeper: How to build and retain brand trust in the age of AI
7. Keep your creative process transparent
Transparency about your content creation process can actually enhance trust with your audience.
Letting readers know that AI aids in content creation while emphasizing the human touch reinforces the value of your insights and creativity.
Share behind-the-scenes insights
Consider sharing how you blend AI and human creativity in your workflow. This transparency can build a connection with your audience, showing them that while you use advanced tools, your unique voice remains at the core.
- Educate your audience: If AI tools are a part of your content process, explain how they help improve your work. This can position you as a forward-thinking creator who values both innovation and authenticity.
- Highlight human contributions: When presenting AI-assisted content, highlight the areas where your personal input was most significant, whether in crafting the narrative, adding context or injecting personality.
Dig deeper: AI-generated content: To label or not to label?
8. Embrace AI without losing yourself
The ultimate goal of integrating AI into content creation is to enhance your capabilities, not replace your essence. Remember that AI is a tool – an incredibly advanced and helpful one – but your creativity, judgment and personal touch are irreplaceable.
Stay true to your voice
Regularly revisit your content to ensure it aligns with your personal or brand voice. AI should amplify what makes your writing special, not dilute it.
By staying true to your style, you ensure that your audience connects with the real you, even in an AI-enhanced world.
- Continual learning and adaptation: Stay curious about how AI can improve your process, but remain committed to your core principles as a creator. Adapt your use of AI as it evolves, but always keep your voice at the forefront.
Crafting authentic content in the AI era
Balancing AI with human creativity requires a thoughtful approach that leverages technology’s strengths while preserving your authentic voice.
You can achieve the best of both worlds by treating AI as a creative partner, offering clear guidance and incorporating personal insights.
AI can enhance efficiency, improve accuracy, and spark new ideas, but it’s your creativity, empathy and unique touch that will truly resonate with your audience.
Use AI strategically, ensuring your authentic voice remains the highlight of your content. After all, while AI is here to stay, it’s human intelligence that drives meaningful connections.
Dig deeper: Future of AI in content marketing: Key trends and 7 predictions
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
SEO grew up, a lot of SEOs didn’t
Written on October 5, 2024 at 9:14 am, by admin

When we were young, we all wanted to sit at the adult’s table. But we couldn’t, often, because of our behavior.
Growing up, we often thought we were cool, but we weren’t. Look back at your past. There are probably photos or things you did that you find questionable today.
This is exactly how we SEOs should be looking at our work. SEO has grown up a lot; we SEOs haven’t.
Our mindset is a problem when tackling challenges like needing a strong brand or satisfying users the most.
What we do:
- Faking first-hand experience and looking for things we can leave out “because Google cannot measure that.”
- Buying (terrible) links (no one clicks), a practice that will always be sketchy.
This is the wrong approach.
What we should be doing:
- Proving our experience, expertise, authoritativeness and trustworthiness.
- Creating the best content possible that is worth linking to or mentioning.
To finally grow up, we have to look into the mirror. We have to change.
If you are pressed for time, here’s a quick rundown of what I suggest you should start doing today:
- Embrace change as an opportunity to grow by making yourself familiar with it and turn obstacles on their head.
- Do what is rational, logical and repeatable, study the Search Quality Rater Guidelines (SQRG) to understand what Google wants and invest in preparation, not prediction.
- Stop cheating or manipulating and start building something extraordinary. Start taking responsibility for your actions and stop blaming others for your failures. Save your energy to fight for positive outcomes, instead of wasting your energy on a negative candy rush.
- Communicate, understand and execute SEO the right way: As a growth engine, leading to limitless growth, not marginal improvements. Favor building over optimizing and fixing. Find your real competitive advantage and don’t buy into snake oil, cookie-cutter strategies or get-rich-quick schemes.
Disclaimer: This is not a purely tactical guide. My goal is to make you think. Motivate you to ask yourself tough questions and provide you with the necessary mindset to fill out the shoes of adolescence. If you get one good thought from this I’m more than happy.
Let’s get into the trenches: This is how we grow up!
Embrace change as an opportunity to grow
When we were children, we didn’t want to accept that change was inevitable. As we grow up, we realize it’s much easier to accept the world for what it is than to strive for what it should be.
“You don’t get to dictate the course of events. And the paradoxical reward for accepting reality’s constraints is that they no longer feel so constraining.”
– Oliver Burkeman, “Four Thousand Weeks”
The same should apply to our SEO mindset.
Change can be old tactics no longer working (like putting white text on a white background; I only mention this because some website owners still do this today) or new things appearing on the horizon, like generative AI.
Many of our SEO playbooks are outdated, and forces are pulling on us to change. Here are two examples:
We often don’t have a crawl budget problem, but an indexing problem
Indexing is becoming harder.
From what we know, the size of Google’s index seems to be more or less static (about 400 billion documents). Due to AI:
- We are producing more (good and bad) content.
- The bar for content quality has a new, higher baseline.
More content, a higher quality baseline and the same index size = it’s harder to get indexed.

Google is allergic to technical issues and low-quality content.
Google’s fierce changes in ecommerce searches
Google is under pressure to become a shopping engine.
- Around 50% of product searches start on Amazon, not Google.
- 61-74% of European shoppers search for products on Amazon, not Google.
As a result, Google renovated the SERPs for commercial queries.
In 2023, Google started to push a new feature: product grids.

Categorized as “Merchant Listings,” these grids appear in commercial searches at Position 1 more and more often.
Based on my research for German SERPs, depending on the industry, product grids appear in position 1 between 15-45% of the time.
Thanks to research by Kevin Indig for the U.S., we know that around March 2024, product grids started appearing in Position 1 more often than in Position 3.
If your domain doesn’t play a role in product grids but your competitors do, you are in for a tough time.
The solution: Reframing problems as opportunities
Problems are what make life worth living. A problem can be reframed as a challenge or an opportunity. The cards are shuffled again, doors are opening up, and we are all new to these things.
Here’s a practical reframing example from a talk by Carrie Rose I heard this year:
- You are an ecommerce shop and lack authority in your niche.
- The ranking domains are strong digital publishers, duh.
- Instead of burying your head in the sand, you can turn the obstacle on its head.
- Don’t try to beat them, join them.
- How? By giving them content they don’t have that is worth linking to.
- Example: Internal searches showed an extreme increase in searches for white Vans.
- They figured out faster than everyone else that it was due to the Netflix series Squid Game and pitched an article to news outlets.
- Result: High-quality backlinks, traffic and additional revenue.
Main takeaways:
- Embrace change, make yourself familiar with new developments, tools and technology (start with an hour a week).
- Strive to deliver “best in the world content,” not just “fine content.”
- Reframe problems into opportunities.
Double down on things that never change
On the one hand, we don’t want to change our behavior. On the other hand, our brain craves novelty, constantly looking for a dopamine hit.
I know it’s very easy to be sucked in. New trends like AI chatbots, AI phones, AI toothbrushes. All good. But you have to avoid the shiny toy syndrome and embrace what will always matter.
If you focus on the things that never change, you can predict the future.
There lies great power in what you can’t measure
It will always pay off to do what is logical and rational.
Google of 2004 is not Google of 2024. Just because Google cannot measure something right now doesn’t mean it cannot in the future. One example is having author bios and pages.
Due to the leak, we have our answers now; but skipping author bios and pages just because Google supposedly couldn’t “measure” them never made sense.
Things that cannot be measured are often underestimated and easily ignored. Fun fact: A metric is often more useful if it is harder to measure.
Prefer the repeatability in the present over the luck of the past
Luck is something positive that is not predictable. Repeatable means puzzle pieces falling into place like they used to. They are not the same.
In SEO, we love to look at what others did, and you should too. But focus on what is repeatable, not on what was lucky.
You cannot follow in the footsteps of Amazon or HubSpot, as their operating conditions fundamentally changed. However, there are things worth learning from them.
- You can learn from HubSpot that it’s worth investing strategically in new channels/formats.
- You can learn from Amazon that it’s worth investing in what customers will always value: Good prices and fast delivery.
Do what is repeatable; don’t try to emulate what was lucky.
Some more things that will never change
To draw on the Amazon example, Google will always need users to be satisfied with their search product. At least as long as this is their biggest revenue source.
One of your goals, in this case, should always be to have “best in the world content,” not good content.
In the SQRG, Google tells us exactly what they want. They have been doing so for years:
- Have author pages when it makes sense for your target audience.
- Don’t just copy the content of others, as high content quality is signaled by effort and originality.
- Auto-generated content (think AI or any programmatic plays) has more obstacles in its way to being rated Highest in terms of page quality.
Make no mistake: The next update is coming. The next disruption of the SERPs you operate in is coming as well.
It’s much better to invest in preparation over prediction. What we can’t see coming hits us the hardest.
Main takeaways:
- Do what is rational and logical, even if you or Google can’t measure it (yet).
- Copy what is repeatable today, not what was lucky yesterday.
- Study the SQRG to see what Google wants.
- Invest in preparation, not prediction.
Stop manipulating, start taking responsibility
Do you remember that when you were a kid, you often tried to manipulate and cheat in games?
In Germany, there is a game called “Mensch ärger dich nicht,” which roughly translates to “man, don’t be angry.”
I took the chance to roll the dice again because someone “distracted me,” for example. Golf has something similar, the “mulligan”: you get a pass and can take another shot.
At times, SEOs are like children who don’t want to accept the rules, cheat when no one is looking and love to blame others for their failures.
Examples of SEO manipulation practices
SEO is a breeding ground for manipulative tactics, and everyone knows the good old tales. Here are a few classics and recent ones:
- Terminated agencies editing the robots.txt disallow rules so the new agency’s work gets tanked.
- Content stealing, like the SEO heist
- Purely AI-generated content is neither a competitive advantage nor a strategy. Even if it works short-term, it is not something you can rely on in the future. We advocate investing in a long-term channel, yet we are so greedy and in such a hurry that we fall for get-rich-quick schemes, like Gollum chasing his precious.
- Link buying, which sounds like exchanging drugs in a back alley.
- In some countries, this is not just against Google’s official guidelines but also legally problematic (in Germany, for example, under competition law, the “Wettbewerbsrecht”).
- Affiliate link spam
- Someone created thousands of domains containing nothing but affiliate links to all kinds of ecommerce sites in Germany to earn a quick affiliate buck. This is a sheer waste of resources that turns the internet into a garbage dump. Please stop.
- Here’s an example domain that got spammed with these affiliate domains:

If you want to be taken seriously in SEO, don’t take shortcuts. Like Sonia Simone said, they take too long.
Why Google can’t tell us the truth
It’s easy to point fingers at Google. To call them liars and whatnot. Anyone with a clear mind has to understand that they cannot tell us exactly what is going on behind the curtain. If they did, at least a small group of SEO goblins would do everything in their power to ruin the game for everyone else.
SEO is the perfect example of the tragedy of the commons.
“Each man is locked into a system that compels him to increase his herd without limit – in a world that is limited. Ruin is the destination toward which all men rush, each pursuing his own best interest in a society that believes in the freedom of the commons.”
– Garrett Hardin, “The Tragedy of the Commons”

Some of us just can’t behave properly and break the system for everyone else.
Yet here we are, proud of engaging in our questionable get-rich-quick schemes, which no one would publicly endorse as an employee of a real brand or big company.
Incentives beat intentions, unfortunately
Recently, a guide on how to manipulate Reddit got a lot of traction. I understand the author’s intention: to make things better. But I don’t agree with the approach and methods used to do it.
This is not meant ad hominem, as this is not an individual problem.
Firstly, Google is blamed for creating incentives to spam Reddit. However, the article itself created the same incentives, handing over the actual manual on a silver platter.
Publicly sharing the techniques to spam Reddit promotes even more Reddit spam, no matter how you put it. And fighting fire with fire was never wise.
Also, people generally like Reddit and find the answers helpful. It’s no coincidence that they have these numbers:
- There were 32 billion search queries in 2023 containing “reddit”.
- There were 3,102 billion search queries worldwide in 2023.
- = Reddit appears in 1% of all search queries.
I’m not saying there is no spam. But we don’t know the denominator here. If you seek a needle in a haystack, you will find one.
Secondly, Google is blamed 100% for the consequences of the Reddit spam. I don’t agree.
If you looked at the chain of responsibility through the eyes of great philosophers like Kant, Aristotle or Sartre, you would conclude that users taking advantage of the spam techniques are to blame first, then the platform (Reddit), then Google (the middleman).

FYI: Others cheating doesn’t give you permission to do the same. Enabling and incentivizing these tactics is not a free ticket to cheat, either.
Look in the mirror: if spammers didn’t spam, there would be no problem, so the root cause is our sometimes unbearable human nature.
The blame is (also) on us, not (just) the others
It’s more comfortable to blame others than to check on ourselves.
“Google got worse” is one of the trendiest topics of 2023 and 2024.
“Before Content Marketing was a thing, idiots did not publish content. You wouldn’t write encyclopedia articles without knowing anything. Now, we created a perverse incentive for any idiot to write about anything. We sit in a mountain of garbage.”
– Peep Laja, CEO, Wynter
What if Google didn’t get worse, but the ratio of good to bad content shifted?
If you fill a glass with more water (bad content) than wine (good content), the relative amount of wine in the glass decreases, even if the quality of the wine itself is good. It becomes harder to serve good content.
Google is responsible for their search results, but we are responsible for the mountains of garbage we produce.
A German study supposedly showed that Google got worse. But its argument, based only on a small query subset in a specific niche, is insufficient to make such a claim. It’s not even what the study said, but what people want to believe.
According to Statista, users are once again slightly more satisfied with Google search. And yes, according to the 286-pager on Google being a monopoly, Google experimented with degrading search quality to test the impact on revenue. But that test only lasted three months.
No one can say whether there would have been a negative impact on revenue in the long term, which is what actually matters.
Let’s assume for a moment that this was true: Google got worse and our domains were demoted in favor of some big digital publishers. How would the confirmation of this bias actually help me?
It doesn’t.
Yes, I can and should be vocal about it. But a lot of time and energy goes into being negative. Negativity is like a candy rush: it distracts us, so we need to avoid it.
Author Ryan Holiday hit the nail on the head with this quote:
“In our own lives, we aren’t content to deal with things as they happen. We have to dive endlessly into what everything “means,” whether something is “fair” or not, what’s “behind” this or that and what everyone else is doing. Then we wonder why we don’t have the energy to actually deal with our problems.”
We need the energy we waste on negativity for working toward positive outcomes, like crafting the best content out there or being the most helpful resource for our target audience.
Loopholes are risky short-term arbitrage opportunities, not long-term safe bets
A loophole is not a real competitive advantage but a short-term arbitrage technique that carries a lot of risk. If it comes to light, the consequences can be grave.
I loved this recent post from Alex Birkett on LinkedIn:
“Shortcuts in SEO often bring a sugar high, but they also come with a crash. […] If you treat it like a get-rich-quick scheme, you’ll need to ‘fix the plumbing’ later.”
Some things, like reputation, are not worth risking, no matter how much there is to gain. Think of Sports Illustrated, for example.
Building a good reputation takes years. Setting it on fire can happen in seconds.
Main takeaways:
- Stop cheating or manipulating and start building something extraordinary.
- Start taking responsibility for your actions and stop blaming others for your failures.
- Save your energy to fight for positive outcomes, instead of wasting your energy on a negative candy rush.
Communicate and understand SEO as a growth engine, not as routine maintenance or polishing the edges
The grand finale: SEO has gotten a lot bigger.
Keeping SEO small and limited might be a way to avoid change. Could this be why many SEOs were reluctant to admit that Google uses user signals in their ranking algorithms?
Less change = SEO is smaller = more comfortable + less risky.
As outlined at the beginning, change is an opportunity. We walk into the fire of discomfort only to step out of it stronger, wiser and better.
SEO in 2024 is nothing like it was in 2004 or 2014. The fundamental principles are the same, but we are driving a much different vehicle now with much more horsepower under the hood.
SEO is the wrong word for what we are actually doing
Digital publishers often get two-thirds or more of their traffic through SEO. A lot of companies rely heavily on organic traffic.
Some, like Hardbacon, had to file for bankruptcy as a result of the helpful content update (HCU) and other updates. Others are or were on the cusp of it, like HouseFresh, Retro Dodo and Healthy Framework.
SEO stands for search engine optimization. But 70% or more of traffic share vs. other channels doesn’t sound like optimizing to me.
Optimization sounds like squeezing the last 5-10% out of what you already have. Limited and marginal.
The problem is that we all have different understandings of SEO, so we are not talking with each other but past each other.
To some, SEO means fixing mistakes/bugs. To others, myself included, SEO means (almost) limitless growth.
Fixing and optimizing is not enough:

Fixing = keeping the bare minimum in place and unlocking the existing potential.
Optimizing = using the full existing potential.
Building = unlocking new growth = increasing the potential.
To visualize the idea further, see the following graphic:

- Fixing is like repairing a broken window in your one-room apartment.
- Placing nicer furniture in that room is optimizing: you make it more appealing.
- The missing piece, then, is building new rooms if you actually want to live in a 10-room mansion.
Here are some things you need to communicate, understand and execute SEO as a growth engine:
- A business mindset, the right metrics and language a C-level member will understand.
- An updated SEO mindset, as ranking is not the end but a means to an end (= creating a high-value touchpoint with your target audience).
- The ability to play well with others, to make them embrace SEO, not despise it because you let them starve all the time by being greedy and only taking, never giving.
- Finding your (real) competitive advantage.
Finding your (real) competitive advantage
The last bullet point is especially important. It’s something that is missing quite often, from my experience.
Why should I buy from you? Why is your content the best in the world? The answer has to be the opposite of the “jumping the line” techniques criticized earlier.
The obvious question is how you get or find your real competitive advantage. It’s part of a good strategy. A strategy is always unique to a company.
In “Good Strategy, Bad Strategy,” Richard Rumelt says the kernel of a strategy involves three pieces:
- Diagnosis.
- Guiding policy.
- Coherent actions.
To find a competitive advantage, you need to ask the right questions, like:
- What are my unique abilities/assets?
- Which of these abilities/assets matter to my target audience?
- Which of these remaining abilities/assets differentiate me from my competitors?
A SWOT analysis can be a helpful tool here. An alternative way to get started is to list all the abilities and assets that make your brand what it is, then answer the last two questions.
Examples of abilities or assets could be that you:
- Have a higher editorial output (quantity/quality).
- Employ SMEs in several fields, while others only cover one.
- Are uniquely fast in execution in relation to your company size.
- Have proprietary data that you can use for original research and data journalism.
- Are the manufacturer of a product (= having technical knowledge others don’t have).
Main takeaways:
- SEO is not just “a channel”; for many companies, it’s the growth engine.
- Communicate it the right way: Limitless growth > marginal improvements.
- Favor building over optimizing and fixing.
- Come up with a unique competitive advantage that you can leverage in your SEO strategy.
We have work to do
I hope a lot of what I said is something you have heard at least once already. But as Christian Morgenstern said (translation by me):
“Sometimes you see something 100 or 1,000 times until you really see it for the first time.”
- We don’t want to change. Change is inevitable, though.
- We don’t always learn from the things that never change. But they let us predict the future.
- We like to skip the line and to go faster than is actually possible. Too fast often means fragile. You don’t want your SEO to be fragile, but to be unbreakable.
- We want it as easy as possible. Some things, however, are not easy. Like author James Clear said, “The cheat code is the work you’re avoiding.”
We have work to do.
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
The SEO impact of interstitials, before and after
Written on October 5, 2024 at 9:14 am, by admin

Recently, an SEO client of mine lost 82% of desktop organic traffic to their homepage in one month.
Then, there was another 97% decline in desktop organic traffic in the second month.
Why?
The client launched a mobile and desktop pop-up for all United States visitors. Against my advice, the client hoped the conversions would outweigh the decline in organic traffic.

Here is a breakdown of the intrusive interstitial and its impact on organic traffic.
The impact of an interstitial on mobile and desktop, before and after
The homepage pop-up was related to a promotion and not triggered based on a time delay or scroll.
Here is an example of a mobile pop-up:

In the example above, the pop-up also negatively impacts the responsive design.
Here is the example on desktop:

In addition to the drop in organic traffic, the homepage lost 97% of its keyword rankings on desktop.

And saw a 96% decline in mobile rankings for the homepage.

Most of the impact was on branded terms, which indirectly affected the reduction in branded PPC spend.
The pop-up also hurt the site’s page experience and page speed: load time increased by 10 seconds.
Before the launch of the pop-up, the website averaged a 3-second load time.

And a 13-second load time after the launch of the pop-up.

With the pop-up, you can see the negative impact on page experience in the large number of layout shifts.
The homepage saw a 97% decline in organic traffic cost value overall.
There is no interstitial penalty or manual action
The intrusive interstitial affects rankings, not indexing.
However, other areas of impact may occur, such as soft 404 errors, server speed and the site’s overall page experience.
Remember, Google crawls from the U.S., so if those overlays are only present in European countries, Google will not detect them.
The negative impact of interstitials is a demotion in search
When Google rolled out the mobile interstitial penalty on Jan. 10, 2017, the initial impact was shockingly quiet. A few days later, though, SEO professionals started seeing pages lose 10 or more positions in rankings.
When the Google page experience update rolled out to desktop in February 2022, intrusive interstitials were included as a signal.
It’s important to remember that interstitials do not trigger a manual action (at least not yet), but they still negatively impact your page experience.
Google’s John Mueller stated in an SEO Office Hours on Jan. 22, 2021:
- “This is kind of like something where we would slightly demote the website in search. Sometimes, the tricky part is these slight demotions. It’s not the case that we’ll remove the site from search or move it to page 100. But if it’s relevant content, we may still show it on the first page of the search results, just not as highly as possible.”
4 examples of bad interstitials
1. Checking to see if you’re a spammy bot is bad
Gary Illyes from Google posted on LinkedIn, mentioning:
- “The ‘checking if the site connection is secure’ interstitials you see on some sites, some of the time, are the least search-friendly thing you can do.”
Example:

In short, there are other ways to keep spammy bots from hitting your site.
If you detect a bot, Mueller recommended serving a 5xx status code to manage the robot detection interstitial.
If you run a hosting service and have a “you might be a bot” interstitial, make sure it uses a non-200 HTTP result code (perhaps 503) and that it doesn’t have a noindex on it. Serving a noindex+200 will result in pages being dropped from search, if search engines see it. pic.twitter.com/LFGQcq2dzf
— John (@JohnMu) January 17, 2022
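To make that advice concrete, here is a minimal sketch, assuming an Express-based Node server written in TypeScript, of serving a bot-check interstitial with a 503 status and no noindex. The detection heuristic, routes and messages are placeholders for illustration, not the client’s setup; the only part grounded in the guidance above is the non-200 status code.

```typescript
import express, { NextFunction, Request, Response } from "express";

const app = express();

// Placeholder heuristic -- real bot detection is far more involved and usually
// lives in a CDN or security layer, not a user-agent regex.
function looksLikeSuspiciousBot(req: Request): boolean {
  const ua = req.get("user-agent") ?? "";
  return ua === "" || /curl|python-requests/i.test(ua);
}

app.use((req: Request, res: Response, next: NextFunction) => {
  if (looksLikeSuspiciousBot(req)) {
    // Serve the "checking your connection" interstitial with a non-200 status
    // (503 here) and no noindex, so search engines treat it as a temporary
    // error instead of dropping the real page from the index.
    res
      .status(503)
      .set("Retry-After", "300")
      .send("<html><body><p>Checking that you are not a bot…</p></body></html>");
    return;
  }
  next();
});

app.get("/", (_req: Request, res: Response) => {
  res.send("<html><body><h1>Normal page content</h1></body></html>");
});

app.listen(3000);
```

The detail that matters, per the tweet above, is the non-200 response and the absence of noindex on the challenge page.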
2. Redirect to interstitial page
If you show users page A in the SERPs but redirect them to page B after the click, and page B includes a pop-up or interstitial, search engines will not detect any content beyond the redirected page.
3. Displaying only the interstitial
Here is an example of a pop-up display that covers the body copy:

4. Store locator pop-ups
Store locators are really hit or miss on pop-ups.
Here is an example of a good alternative to a location pop-up:
Instead of a pop-up, Xero opts for a banner.
Once you click the “Change region” button, you’re brought to this option to select a region.

4 examples of good interstitials
1. Displaying a cookies policy interstitial
Legal interstitials are OK, as long as search engines can index the content without having to do anything special, such as clicking something to load the underlying content.
This means GDPR and cookie policies are fine.
Here is an example of a cookies policy built with a pop-up window:

Here is another example of a cookies policy built with a banner at the bottom (my preferred method):

Mueller mentioned in Google Webmaster Office Hours that Google is trying to recognize these legal banners as separate from advertising banners.
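As an illustration of the banner-at-the-bottom approach, here is a minimal TypeScript sketch of a cookie notice injected as a fixed bar: the page content stays in the server-rendered HTML, so search engines don’t have to click anything to reach it. The element IDs, copy and storage key are hypothetical.

```typescript
// Inject a non-blocking cookie notice as a fixed bar at the bottom of the page.
// The main content stays in the server-rendered HTML, so crawlers can read it
// without any interaction.
const CONSENT_KEY = "cookie-consent"; // hypothetical localStorage key

function showCookieBanner(): void {
  if (localStorage.getItem(CONSENT_KEY) === "accepted") return;

  const banner = document.createElement("div");
  banner.id = "cookie-banner"; // hypothetical element id
  banner.style.cssText =
    "position:fixed;bottom:0;left:0;right:0;padding:12px;background:#222;color:#fff;";
  banner.textContent = "We use cookies to improve your experience. ";

  const accept = document.createElement("button");
  accept.textContent = "OK";
  accept.addEventListener("click", () => {
    localStorage.setItem(CONSENT_KEY, "accepted");
    banner.remove();
  });

  banner.appendChild(accept);
  document.body.appendChild(banner);
}

document.addEventListener("DOMContentLoaded", showCookieBanner);
```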
2. Requesting age verification
Age verification counts as a legal pop-up, so there are no issues with these.
Here is an example of an age pop-up done well:

3. Leveraging banners at the top and bottom
Here is an example of Samsung’s promotional ad banner:

4. Exit intent pop-ups that are time or scroll-based
If you’re going to launch a pop-up, I recommend using an exit intent pop-up based on time on the page (20 seconds or more) or scroll depth.
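If you do go this route, the sketch below (TypeScript, browser-side) shows a trigger that waits for 20 seconds on the page or a scroll-depth threshold before showing the pop-up. The element ID and the 60% scroll figure are my own assumptions for illustration.

```typescript
// Show a pop-up only after 20 seconds on the page or ~60% scroll depth,
// whichever happens first, instead of firing it on page load.
const MIN_TIME_ON_PAGE_MS = 20_000; // "20 seconds or more", per the recommendation above
const SCROLL_DEPTH_THRESHOLD = 0.6; // illustrative value, not from the article

let popupShown = false;

function showPopup(): void {
  if (popupShown) return;
  popupShown = true;
  const popup = document.getElementById("promo-popup"); // hypothetical element
  if (popup) popup.style.display = "block";
}

// Trigger 1: time on page
window.setTimeout(showPopup, MIN_TIME_ON_PAGE_MS);

// Trigger 2: scroll depth
window.addEventListener("scroll", () => {
  const scrolled = window.scrollY + window.innerHeight;
  const total = document.documentElement.scrollHeight;
  if (total > 0 && scrolled / total >= SCROLL_DEPTH_THRESHOLD) {
    showPopup();
  }
});
```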
Stay tuned for updates on the recovery after removing the interstitial pop-up
I’ll be updating this article as I get more data on the impact of removing the interstitial and recovery.
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing
Decoding Googlebot crawl stats data in Google Search Console
Written on October 4, 2024 at 6:14 am, by admin
Tucked away in the Settings section of Google Search Console is a report that few SEO professionals discuss but that I like to monitor.
These reports are known as Crawl Stats.
Here, you’ll find an interesting set of metrics on Googlebot crawl activity. These metrics are especially useful for websites with thousands or millions of pages.
Googlebot ‘Crawl stats’
Long ago, Google Search Console had easy-to-find metrics on Googlebot crawl activity. Then, they seemingly disappeared.
In reality, it was tucked away in the Settings section.
How to access the crawl stats reports:
- Click on Settings at the bottom of the left navigation.
- Go to the Crawl stats section.
- Click Open report.
About the crawl stats data
As Googlebot crawls your site, Google tracks various aspects of its activity and reports on them in the Crawl Stats report.
This is where you’ll find high-level statistics about Google’s crawling history on your website.
Google says this data is for advanced users
The Googlebot Crawl Stats data is not for technical SEO rookies.
Google specifically says this data is aimed at “advanced users” with thousands of pages on their site, which may be why it’s located in such an unusual location, unseen by many in the SEO community.
One reason Google may perceive this as an advanced report is that so many things can influence these metrics, including network issues and cloud delivery services such as Akamai.
Who will find crawl stats most useful?
I find the Crawl Stats reports less an “advanced” set of reports and more something that’s useful to enterprise SEOs who don’t have crawler monitoring tools such as Lumar or Botify.
When doing SEO on an enterprise website with thousands or millions of pages, crawler optimization is a vital task, and crawler activity metrics provide important insight for defining action items.
Smaller sites likely do not need to worry too much about crawler activity because there is probably enough crawl budget allocated to crawl the site at an appropriate pace.
On the other hand, enterprise sites tend to have far more pages that need to be crawled, discovered, and/or refreshed than Google crawls each day.
For this reason, they must optimize for crawler activity, and the Crawl Stats report is a tool that helps guide next steps.
What to look for in this data
After years of reviewing this data across many sites, I have one primary rule:
- Do not spend a lot of time here unless you see fluctuations and correlations.
Often these reports are interesting but not actionable.
Example fluctuations that I tend to investigate (a quick flagging sketch follows this list):
- HTML requests decreased (or spiked) at the same time Bytes of JavaScript downloaded increased (or spiked).
- Average response time increased (or spiked) at the same time the number of HTML requests decreased (or fell suddenly).
- Total download size increased (or spiked), but the number of HTML requests did not change.
- The percent of requests for discovery (to discover new URLs) increases and the percent of requests for refresh goes down; however, you did not launch new URLs on the site.
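To make those checks repeatable month to month, here is a rough TypeScript sketch that compares two snapshots of the top-chart metrics and flags the kinds of correlated moves listed above. The field names, the ±20% threshold and the sample numbers are my own assumptions, not anything Google defines.

```typescript
// Compare two monthly Crawl Stats snapshots and flag suspicious combinations.
type CrawlSnapshot = {
  htmlRequests: number;
  jsBytesDownloaded: number;
  avgResponseTimeMs: number;
  totalDownloadBytes: number;
};

const CHANGE_THRESHOLD = 0.2; // flag moves of more than ±20% (arbitrary cutoff)

function pctChange(before: number, after: number): number {
  return before === 0 ? 0 : (after - before) / before;
}

function flagFluctuations(prev: CrawlSnapshot, curr: CrawlSnapshot): string[] {
  const flags: string[] = [];
  const html = pctChange(prev.htmlRequests, curr.htmlRequests);
  const js = pctChange(prev.jsBytesDownloaded, curr.jsBytesDownloaded);
  const resp = pctChange(prev.avgResponseTimeMs, curr.avgResponseTimeMs);
  const size = pctChange(prev.totalDownloadBytes, curr.totalDownloadBytes);

  if (html < -CHANGE_THRESHOLD && js > CHANGE_THRESHOLD) {
    flags.push("HTML requests fell while JavaScript bytes downloaded rose.");
  }
  if (resp > CHANGE_THRESHOLD && html < -CHANGE_THRESHOLD) {
    flags.push("Average response time rose while HTML requests fell.");
  }
  if (size > CHANGE_THRESHOLD && Math.abs(html) < 0.05) {
    flags.push("Total download size rose while HTML requests stayed flat.");
  }
  return flags;
}

// Example usage with made-up numbers:
console.log(
  flagFluctuations(
    { htmlRequests: 120_000, jsBytesDownloaded: 2.1e9, avgResponseTimeMs: 350, totalDownloadBytes: 9.5e9 },
    { htmlRequests: 80_000, jsBytesDownloaded: 3.0e9, avgResponseTimeMs: 520, totalDownloadBytes: 9.6e9 }
  )
);
```

In practice, you would feed this the values you log from the report each month and only dig in when something gets flagged.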
When to look at crawl stats
Crawl stats are good to peruse (and log) at least once a month.
They are especially good to monitor after major releases, such as a platform migration or major redesign. This will help you quickly understand how Google is responding to your newly launched changes.
Remember: If you have a bot monitoring tool such as Lumar or Botify, then this data isn’t as useful as you’ll find in the bot monitoring data provided by these tools.
Caveats about the crawl stats data
Many things can influence Google’s crawl stats metrics beyond a normal dev release.
This means the SEO team, product manager(s) and developer(s) must think outside the box when evaluating the fluctuations.
You must consider what could have caused an increase or decrease in Googlebot crawl activity, not only in your release but also within the network and tech stack.
Changes to something such as Akamai could potentially impact this data.
Log the crawl stats data in a spreadsheet
This is data I like to archive because Google reports such a small window of time.
A great example of this is a challenge I’m facing now with a client. What is reported in GSC right now makes it look like things are improving:

However, because I have the metrics from six months ago, I can say that these numbers are still 40% higher than they were back then.
While they’re trending down, they’re still worse than they were in the past. The client’s challenge is that development has no idea why this is happening (unfortunately, solving that problem is beyond the scope of this article).
You may be tempted to just grab a screenshot. However, screenshots make it very hard to compare over time.
Notice there is no left axis in the chart. You really cannot tell what the lines reflect. (Note: Numbers do appear on the left/right axis when you are only viewing two metrics in the chart)
Instead, drop this data into a spreadsheet. Then, you have actual data that can be charted over time, calculated and used to compare with other metrics, such as visits.
Having the historical data in one place is often useful when discussing major changes with development to show how much better the metrics were 4-6+ months ago.
Remember, development likes hard, specific data, so charts with actual numbers on the axes will be far more useful to you than GSC’s charts with missing or inconsistent axis numbers.
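Here is a minimal sketch of that logging habit, assuming you copy the top-chart totals into a simple CSV once a month; the file name and column layout are my own convention, not a Google Search Console export format.

```typescript
import { readFileSync } from "node:fs";

// Each row is one monthly snapshot, e.g.:
// month,total_crawl_requests,total_download_bytes,avg_response_time_ms
type Row = {
  month: string;
  totalCrawlRequests: number;
  totalDownloadBytes: number;
  avgResponseTimeMs: number;
};

function loadLog(path: string): Row[] {
  const lines = readFileSync(path, "utf8").trim().split("\n").slice(1); // drop header
  return lines.map((line) => {
    const [month, requests, bytes, responseMs] = line.split(",");
    return {
      month,
      totalCrawlRequests: Number(requests),
      totalDownloadBytes: Number(bytes),
      avgResponseTimeMs: Number(responseMs),
    };
  });
}

const log = loadLog("crawl-stats-log.csv"); // hypothetical file, maintained by hand each month
const latest = log[log.length - 1];
const sixMonthsAgo = log[log.length - 7];

if (latest && sixMonthsAgo) {
  const delta =
    ((latest.avgResponseTimeMs - sixMonthsAgo.avgResponseTimeMs) /
      sixMonthsAgo.avgResponseTimeMs) * 100;
  console.log(
    `${latest.month}: average response time is ${delta.toFixed(0)}% ` +
      `${delta >= 0 ? "higher" : "lower"} than in ${sixMonthsAgo.month}.`
  );
}
```

With six or more months logged, this reproduces the kind of comparison described above: a metric can be trending down inside the GSC window and still be well above where it was half a year ago.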
Remember, the reports boxes are paginated
Though the most important metrics you’ll need are likely visible in the default view, many of the report sections are paginated – and they’re easy to miss!

Which metrics to monitor and why
Let’s get into the primary metrics to look at (very quickly) each month, along with a few tips for turning the data into action items:
Total crawl requests
- View this report in Google Search Console (located in the top chart).
- Google definition: “The total number of crawl requests issued for URLs on your site, whether successful or not.”
- If this metric goes up or down, compare it with average response time and total download size (bytes).
- An obvious reason for this metric to go up is changing a lot of code or launching a lot of new pages. However, that is by no means the only cause.

Total download size (byte)
- View this report in Google Search Console (located in the top chart).
- Google definition: “Total number of bytes downloaded from your site during crawling, for the specified time period.”
- If this metric goes up or down, compare it with average response time.
- An obvious cause for this metric to increase is adding a lot of code across thousands of pages or launching a lot of new pages. However, that is by no means the only cause.

Average response time (ms)
- View this report in Google Search Console (located in the top chart).
- Google definition: “Average response time for all resources fetched from your site during the specified time period.”
- If this metric goes up or down, compare it with total crawl requests and total download size (bytes).

Crawl requests breakdown by response
- View this report in Google Search Console (located below the top chart).
- Google definition: “This table shows the responses that Google received when crawling your site, grouped by response type, as a percentage of all crawl responses…”
- Common responses:
- OK (200).
- Moved permanently (301).
- Server error (5xx).
- Other client error (4xx).
- Not found (404).
- Not modified (304).
- Page timeout.
- Robots.txt not available.
- Redirect error.

Crawl requests breakdown by file type
- View this report in Google Search Console.
- Google definition: “The file type returned by the request. Percentage value for each type is the percentage of responses of that type, not the percentage of bytes retrieved of that type.”
- Common responses:
- JSON.
- HTML.
- JavaScript.
- Image.
- PDF.
- CSS.
- Syndication.
- Other XML.
- Video.
- Other file type.
- Unknown (failed requests).

Crawl requests breakdown by crawl purpose
- View this report in Google Search Console.
- Two purposes:
- Refresh.
- Discovery.
- This is an interesting metric for presentations; however, it only has a few useful use cases. For example:
- If the percent of Googlebot activity that is for Discovery suddenly increases, but you’re not adding URLs to the site, then you have an action item to figure out what is being crawled that shouldn’t be crawled.
- If the percent of Googlebot activity that is for Refresh decreases significantly, but you didn’t remove pages from the site, then you have an action item to figure out why fewer existing pages are being crawled.

Crawl requests breakdown by Googlebot type
- View this report in Google Search Console.
- Google definition: “The type of user agent used to make the crawl request. Google has a number of user agents that crawl for different reasons and have different behaviors.”
- It’s an interesting metric, but not very useful. It just shows Google is still using their desktop crawler. Honestly, I usually ignore these metrics.

You can click into each metric for more data
When you present an SEO concern to product managers and developers, they often want to see example URLs. You can click on any of the metrics listed in this report and get example URLs.
An interesting category to look at is “other file type,” because it’s not obvious what falls into it (often it’s font files).
The screenshot below shows the examples report for “other file type.” Every file listed is a font file (blurred out for confidentiality reasons).

In this report of examples, each row reflects one crawl request. This means if a page is crawled multiple times it could be listed more than once in the “examples.”
As with all Google Search Console reports, this is a data sample and not every request from Googlebot.
Do you share these metrics with developers and product managers?
These metrics will typically generate one of two thoughts:
- “There’s nothing to look at here.”
- “What could have caused that?”
In my experience, the answers to “what caused that” tend to require the input of product managers and/or developers.
When presenting the data and your questions about potential causes for issues, remember to clearly explain that these metrics are not user activity and solely represent Googlebot’s activity and experience on the website.
I find product managers and developers often get a bit confused when discussing this data, especially if it doesn’t match up with other metrics they have seen or facts they know about the site.
By the way, this often happens for most Google Search Console data conversations.
If there are no Crawl Stats fluctuations or correlations to be concerned about, don’t bring them up to development or product management. It just becomes noise and keeps them from focusing on more critical metrics.
What’s next?
Check out your crawl stats to make sure there are no spikes or correlations that are concerning.
Then, determine how often you want to look at these and set up systems that prompt you to check these and other Google Search Console metrics in a systematic, analytical way each month.
While you check out your Googlebot crawl stats, I’ll write Part 4 in this series, which will cover how to know which URLs to focus on for technical SEO improvements and, in particular, Core Web Vitals metrics.
Dig deeper
This is the third article in a series recapping my SMX Advanced presentation on how to turn SEO metrics into action items. Below are links to the first two articles:
- Evaluating technical SEO data: Key segmentation approaches
- Key Google Search Console metrics to monitor every month
Courtesy of Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing