Podcast appearances and mentions of Gary Illyes

  • 41 podcasts
  • 169 episodes
  • 36m average episode duration
  • 1 new episode per month
  • Latest episode: May 1, 2025

POPULARITY

(Popularity chart: 2017–2024)


Best podcasts about Gary Illyes

Latest podcast episodes about Gary Illyes

Search Off the Record
Launching Search Central Live Deep Dive

May 1, 2025 · 35:51 · Transcription available


In this episode of Search Off the Record, Google's Martin Splitt, Cherry, and Gary Illyes discuss the expansion of Search Central Live with the introduction of Search Central Live Deep Dive. They share the history of Search Central events, the inspiration behind creating a more advanced, multi-day experience, and how community feedback helped shape the new format. Cherry explains how the Deep Dive events will offer in-depth sessions, practical workshops, and greater opportunities for networking within the Search community across the APAC region.

The team also offers a behind-the-scenes look at the complexities of planning international events, from budgeting and travel logistics to ensuring high-quality attendee experiences. They highlight plans for new locations, including Budapest and South Africa, and outline how interested participants can apply. If you want to hear about the future of Google's Search community events and how to get involved, this episode is a must-listen.

Resources:
Episode transcript → https://goo.gle/sotr090-transcript
Listen to more Search Off the Record → https://goo.gle/sotr-yt
Subscribe to Google Search Channel → https://goo.gle/SearchCentral

SEO im Ohr - die SEO-News von SEO Südwest
Google places special emphasis on original content this year; problems for AI images? SEO im Ohr - Episode 343

Feb 14, 2025 · 13:12


Original content will be especially important to Google this year. There are also signs that at least some AI-generated images have recently been receiving less search traffic.

Original content has always been important for success in Google Search, and this year its importance may grow further, at least judging by a statement from Gary Illyes. There are observations that websites using AI-generated images have recently received less traffic from Google Image Search, raising the question of whether producing images with AI is problematic from an SEO perspective. Google recently showed fewer reviews for some Business Profiles due to a bug, which has reportedly since been fixed. At least some websites in Germany that had received a manual action from Google for site reputation abuse have returned to the search results. And according to a recent study, some websites already receive more traffic from ChatGPT than from Google.

Webcology on WebmasterRadio.fm
The AI and CTRs and Elon Musk's Massive DataCoup Edition

Feb 6, 2025 · 75:40


This edition moves back and forth between coverage of several unique areas of search, AI, and web work. We start with the DataCoup, our "deal-with-it" name for the changes being made by Elon Musk and his crew of youthful hackers to the very nerve centers of the US government, and the suspicion that an off-the-books version of Grok is being trained on US government data. We note that DeepSeek is likely helping the Chinese state harvest American data almost as quickly as Elon Musk is hoovering it up. We report on the use of updates to old press releases to inflate US immigration activity in the news, Google's rollback of DEI efforts, Whoopi's phoney weight-loss ads, YAIhoo being copiloted by Copilot, and the ChatGPT growth study from SEMrush, along with the SEER Interactive study showing AI Overviews destroying organic and PPC CTRs. We mention changes to Google's Quality Rater Guidelines to account for heavy-handed use of AI, and Gary Illyes' call for websites to focus on originality in content in 2025. We also move through the weekly parade of SEO tips and updates. A fun, fast-paced, newsy sort of episode.

Support this podcast at → https://redcircle.com/webcology/donations
Advertising inquiries: https://redcircle.com/brands
Privacy & opt-out: https://redcircle.com/privacy

Search Off the Record
Crawling smarter, not harder

Aug 8, 2024 · 40:11 · Transcription available


In this episode of SOTR, John Mueller, Lizzi Sassman, and Gary Illyes talk about misconceptions around crawl frequency and site quality, what's challenging about crawling the web nowadays, and how search engines could crawl more efficiently.

Resources:
Episode transcript → https://goo.gle/sotr079-transcript
Gary's post on LinkedIn → https://goo.gle/3YAT55q
Crawling episode with Dave Smart → https://goo.gle/3WShUsf
If-Modified-Since → https://goo.gle/3ywXvja
About the IETF → https://goo.gle/3SGVVlo
Robots Exclusion Protocol → https://goo.gle/4dgmBSg
Proposal for new kind of chunked transfer → https://goo.gle/3AgMF1c
Listen to more Search Off the Record → https://goo.gle/sotr-yt
Subscribe to Google Search Channel → https://goo.gle/SearchCentral

Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team. #SOTRpodcast

SEO Podcast by #SEOSLY
Confessions of a Black Hat SEO Expert: Charles Floate Tells All

Jun 10, 2024 · 58:49


In this episode, I have a very special guest: Charles Floate, an SEO expert with 16 years of experience in the industry. We dive deep into the world of black hat SEO, discussing everything from the most interesting tactics that worked well in the past to the current state of affiliate SEO after recent Google updates. Charles shares fascinating insights on topics like Google's Site Reputation Abuse policy, ranking non-English websites, and using AI-generated content for SEO. We also talk about his favorite AI tools and the importance of understanding Google's quality signals. One of the highlights is our discussion of parasite SEO and how Charles effectively leverages it for his clients. He even reveals some tips for SEO beginners and the areas where people tend to focus too much or too little. Throughout the interview, Charles openly shares his experiences, including the Google penalties he has faced and how he recovered from them. It's a truly eye-opening conversation that you won't want to miss! So grab a pen and paper, and get ready to learn from one of the best in the business. Don't forget to like, share, and subscribe to SEOSLY for more valuable SEO content. Let's dive in!

Watch the video version: https://youtu.be/i1I87PmP4NY

00:00 Introduction to the podcast
00:31 Introduction to Charles Floate
01:13 What does Charles Floate do now in SEO?
02:07 Olga introduces the topics for the podcast
02:27 What is black hat SEO?
03:14 What were the most interesting black hat SEO tactics that worked well in the past?
05:42 What is going on with Google's Site Reputation Abuse policy?
10:38 Olga on ranking Polish and non-English websites in Google
10:52 What other non-English languages or countries are you targeting?
12:09 Olga asks Charles about the number of websites he has
12:43 What kinds of SEO tests does Charles do?
14:18 Insights on the Google Helpful Content update
15:51 Insights on AI-generated content and SEO
19:37 Favorite AI tools for SEO
23:15 Insights on Google quality signals
25:27 Are there any issues with Google indexing?
28:54 How do you do parasite SEO?
31:49 Insights on LinkedIn as a parasite SEO tool
32:30 What's the state of affiliate SEO after all these Google updates?
35:43 Have you seen any Google Helpful Content update recoveries?
36:54 Have you ever built and ranked a website using only white hat SEO tactics?
37:58 Olga asks Charles about all the Google penalties he has ever received
41:35 Did reconsideration requests help you recover?
42:42 Charles talks about his meeting with Gary Illyes from Google
43:55 What tools would you recommend for black hat SEO?
46:41 Tips for SEO beginners
49:29 Parasite SEO and affiliate marketing
52:59 What is the area of SEO people focus on too much/too little?
58:56 Where to find Charles

Follow SEO consultant Olga Zarr or hire Olga to help you with SEO:
Follow Olga Zarr on X/Twitter
Follow Olga Zarr on LinkedIn
The best SEO newsletter
The best SEO podcast
SEO consultant Olga Zarr

#TWIMshow - This Week in Marketing
EP208 - Google Confirms: Fewer Links Needed for Effective SEO Rankings!

Apr 30, 2024 · 9:46


Episode 208 contains the Digital Marketing News and Updates from the week of Apr 15-19, 2024.

1. Google Confirms: Fewer Links Needed for Effective SEO Rankings! - At the recent SERP Conference, Google's Gary Illyes reiterated a significant shift in SEO strategy: the diminishing importance of links in ranking web pages. During his presentation on April 19, 2024, Illyes highlighted that Google's algorithm now requires "very few links to rank pages," signaling a continued move away from heavy reliance on link quantity for search engine rankings. This evolution reflects Google's ongoing work to refine its search algorithms to focus more on content quality and user experience than on traditional signals like the number of links. Illyes' statement underscores a broader trend: links, although still valuable, are not the central metric they once were in SEO. This shift encourages SEO professionals and business owners to diversify their strategies, focusing more on quality content and holistic site optimization. For businesses, developing robust, relevant content and ensuring a user-friendly site can be just as crucial as link-building efforts. The change is intended to make search results more useful and relevant to users, aligning with Google's core objective of enhancing the search experience.

2. Preventing Deindexing After Hosting Transitions! - If you've recently migrated your site and noticed a drop in search visibility, this is for you. The issue begins post-migration, when a site might disappear from Google's search results. This sudden disappearance often leads to panic, but the root causes can typically be diagnosed and resolved through a few strategic steps. John Mueller of Google suggests that the primary check should be whether the new site setup is unintentionally blocking Google's crawlers, a common oversight during migrations. The first step in diagnosing this issue is to use Google Search Console, which can help identify whether pages are not found (404 errors) or whether a robots.txt file is blocking the crawlers. Such blockages can occur due to settings that discourage search engines from indexing the site, which might be enabled during the migration without the site owner's knowledge. Mueller points out that during a WordPress site migration, settings intended to hide the site from search engines during development are often not reverted. To check this, navigate to the 'Reading' settings under 'Settings' in the WordPress admin panel. If the 'Discourage search engines from indexing this site' option is ticked, simply unticking it will resolve the issue. If the problem isn't related to this setting, it might be due to an SEO or migration plugin that inadvertently set up a robots.txt block. Alternatively, it could be a DNS setting issue or an error on the part of the web hosting service.

3. Google's Stance on the Impact of Owning Multiple Websites - Google's John Mueller clarified concerns surrounding the impact of managing multiple websites on SEO rankings. He emphasized that owning several websites does not in itself directly harm rankings. However, the real issue lies in the distribution of effort and quality. Mueller pointed out that managing multiple sites often dilutes the ability to maintain high-quality content across all of them. If the sites cover the same topic, it might appear to be an attempt to manipulate rankings, which is not an ideal strategy for SEO success. For business owners, the takeaway is clear: focus on developing one authoritative website rather than spreading resources thin over multiple lesser sites. This approach not only enhances your site's quality and relevance but also aligns better with Google's algorithms, which favor comprehensive and user-focused content. This perspective is supported by Bill Hartzer of Hartzer Consulting, who notes that while it's common to think creating additional websites on the same topic might double your success, this strategy rarely pans out. Consolidating sites into one primary, authoritative presence is usually more effective for maintaining strong SEO performance. For those contemplating multiple websites, consider the strategic goal: are you aiming for quality, or merely trying to capture more traffic? Opt for a single, robust site that truly serves your audience's needs and stands out in Google's search rankings.

4. 404 Errors and Your Website's SEO Ranking - During the Google SEO office hours, Gary Illyes from Google addressed the common concern about the correlation between 404 errors and SEO rankings. Illyes confirmed that encountering 404 errors ('Page Not Found' responses) is quite normal and does not inherently lead to ranking drops. He specifically addressed a scenario involving 'fake' 404 errors, URLs mistakenly or maliciously generated by bots, emphasizing that these errors are unlikely to negatively influence a site's ranking. For business owners monitoring their website's performance, it's crucial to understand that while 404 errors are not problematic by themselves, their origin and frequency should be analyzed. If genuine users encounter 404 errors, this could point to broken links or misspellings, which should be redirected correctly to improve user experience and site functionality. Moreover, it's advisable to periodically check for security vulnerabilities, especially if 404 errors suggest bots searching for exploits. Illyes suggests using analytics tools to identify whether real users are encountering these errors and taking corrective actions, such as redirection or offering relevant content on error pages to retain visitor engagement. His insights clarify that while 404 errors are normal, the context and response to these errors can influence the overall quality and security of a website.

5. Does Changing Your Web Hosting Impact SEO? Google Weighs In! - As business owners often ponder the implications of switching web hosting services, Google's Gary Illyes sheds light on this topic. Illyes clarifies that changing your web hosting platform should not negatively affect your SEO rankings, provided the migration is executed correctly. Key aspects of a smooth transition include minimal downtime and maintaining website resolvability. This involves updating name server information and making sure the domain's IP address matches the new hosting location. Even if a website experiences downtime, as long as it's brief and the site's structure remains unchanged, the impact on SEO should be negligible. For those considering a host switch, it's crucial to choose a service that matches or exceeds the quality of your current host to avoid any potential drops in site performance. Illyes' insights confirm that with proper planning and execution, changing web hosts can be a seamless process that maintains your current SEO standing.

6. How Does a 503 Status Affect Your Site's SEO? Google Explains - During the Google Search Central SEO office hours in April 2024, a significant clarification was made regarding the SEO implications of the 503 'Service Unavailable' status cod...
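The robots.txt check described in item 2 can be automated. A minimal sketch using Python's standard urllib.robotparser; the robots.txt rules shown are hypothetical, standing in for a misconfigured file left over from a staging or migration setup:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt left behind after a migration:
# it blocks every crawler from the entire site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot falls under "User-agent: *" here, so the homepage is blocked.
blocked = not parser.can_fetch("Googlebot", "https://example.com/")
print("Googlebot blocked:", blocked)  # → Googlebot blocked: True
```

In practice you would fetch the live file with `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`; parsing a string as above just keeps the sketch self-contained.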

Search Off the Record
Gemini for Social Posts: Game Edition

Apr 25, 2024 · 32:24


In this episode of Search Off the Record, Gary Illyes, Lizzi Sassman, and John Mueller put Large Language Models (LLMs) to the test and compete to create the best social post using generative AI. Listen along as they weigh the pros and cons, assess how accurate the posts are, stumble on some owl facts, and see how Google's own GenAI tool, Gemini, can be used as a creativity tool.

Resources:
Episode transcript → https://goo.gle/sotr072-transcript
SEO Starter Guide → https://goo.gle/42alrmy
rel=prev/next isn't a signal → https://goo.gle/4aIWz8F
Managing Multi-Regional Sites → https://goo.gle/3J3u5e9
Bubble chart analysis → https://goo.gle/49q9VW8

Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team. #SOTRpodcast

#TWIMshow - This Week in Marketing
Ep207 - How Index Selection and Canonicalization Are Impacted During Google's Core Algorithm Updates

Apr 15, 2024 · 6:21


Episode 207 contains the Digital Marketing News and Updates from the week of Apr 08-12, 2024.

1. How Index Selection and Canonicalization Are Impacted During Google's Core Algorithm Updates - In a recent LinkedIn conversation, David Minchala raised a common misconception in the SEO community regarding the impact of Google's core algorithm updates on indexing services such as canonicalization. Minchala suggested that during core algorithm updates, or possibly any major update, services such as canonicalization might slow down. Canonicalization involves selecting the most representative URL for content that exists at multiple URLs and merging all signals from known duplicate URLs. The underlying concern was whether these crucial indexing services suffer in performance during extensive updates. Responding to this, Gary Illyes clarified that the assumption was incorrect. He explained that indexing processes like canonicalization and index selection are entirely independent of core updates. Illyes used a culinary analogy: core updates are like adjusting the ingredients in a recipe, which can significantly alter the dish's outcome, whereas canonicalization and index selection are like activities in the salt mines or MSG factories, fundamental but separate from the immediate cooking process. This separation is important for SEO practitioners and business owners to understand: the stability and performance of indexing services remain unaffected by the changes introduced in core updates. Core updates primarily tweak how Google's algorithms assess and rank web content based on relevance and quality; they do not directly interfere with the technical processes by which content is indexed or canonical URLs are determined. In summary, while Google's core updates can significantly impact how websites are ranked, they do not directly influence the fundamental mechanisms by which websites are indexed. This distinction matters when navigating SEO: direct your efforts toward enhancing content quality and user experience rather than worrying about the operational aspects of Google's algorithm updates.

2. Google's John Mueller Demystifies 404 and 410 Codes! - Understanding the intricacies of SEO can be daunting, especially when it involves technical details like HTTP status codes. John Mueller, Google Search Advocate, clarified common misconceptions regarding the 404 and 410 HTTP status codes during a discussion on the r/SEO Reddit forum. These codes indicate that a page on your website is either missing (404 Not Found) or permanently removed (410 Gone). Mueller emphasized that there is minimal difference between the two from an SEO perspective; both are treated similarly by Google's indexing process. Importantly, having these status codes on your website does not result in penalties from Google, so they should not be a major concern for site owners. The discussion began with a website owner who had used AI-generated content for a videogame guide site. When issues arose from the AI content, the owner removed the pages and sought advice on recovery. This led to a broader conversation about whether 404 or 410 status codes could affect the site's SEO. Mueller reassured the community that the practical impact of these codes on search engine visibility is negligible. His advice serves as a crucial reminder for business owners: focus on creating high-quality, engaging content rather than getting bogged down in the technical nuances of HTTP status codes. By ensuring your site maintains valuable content, you're more likely to see sustained SEO success.

3. Rising Ad Costs Due to Meta Platform Glitches! - Are you monitoring your Facebook advertising costs and performance? If not, you might want to start. A recent report highlighted significant system glitches within Meta's advertising platform that have been pushing up ad prices since early April 2024. These issues have been causing increased costs for advertisers, with some marketers experiencing a tripling in CPMs (cost per thousand impressions), a key advertising expense metric. According to insights from Bloomberg and additional details from Search Engine Land, these technical issues have not only escalated costs but also led to mixed results and decreased sales, affecting the overall effectiveness of advertising campaigns. Meta has acknowledged some problems with ad delivery but suggests they are not widespread; the company has reportedly fixed a few technical issues and is investigating further to ensure optimal outcomes for businesses using the platform. It's important to note that not every advertiser has been affected by these glitches, but the potential impact on your ad spend could be significant. The situation mirrors a similar occurrence last year, when a glitch led to many advertisers being overcharged, so it's a crucial time to keep a vigilant eye on your account's performance and ad charges. Given that this issue arose at the close of the first quarter (January to March), any additional ad spend could inadvertently inflate Meta's revenue figures for the period, despite the possibility of subsequent refunds. While it might seem speculative, the timing of these glitches is noteworthy. As a proactive measure, I recommend regularly checking your Facebook ad performance and noting any unusual fluctuations in costs. Staying informed and vigilant can help mitigate unexpected financial impacts and ensure your advertising budget is spent effectively. Should you notice inconsistencies, consider seeking expert analysis or contacting Meta support directly for clarification and potential rectification.
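Mueller's point in item 2, that Google treats 404 and 410 almost identically, can be expressed as a small decision rule. A purely illustrative sketch (the function and outcome labels are my own; Google's real pipeline weighs many more signals):

```python
def indexing_outcome(status: int) -> str:
    """Map an HTTP status to the practical indexing outcome discussed
    in the episode summaries. Illustrative only, not Google's pipeline."""
    if status in (404, 410):
        # Per Mueller: 404 vs 410 makes minimal difference in practice;
        # both mean "drop this URL from the index", with no site-wide penalty.
        return "drop from index"
    if 300 <= status < 400:
        return "follow redirect and index the target"
    if status == 503:
        # Temporary unavailability: crawlers retry later rather than
        # dropping the page immediately.
        return "retry later, keep indexed for now"
    if 200 <= status < 300:
        return "eligible for indexing"
    return "treat as error"

print(indexing_outcome(404))  # → drop from index
print(indexing_outcome(410))  # → drop from index
print(indexing_outcome(503))  # → retry later, keep indexed for now
```

The key observation is simply that 404 and 410 land in the same branch, which is why agonizing over the choice between them rarely pays off.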

Search Buzz Video Roundup
Search News Buzz Video Recap: Google Core Update Volatility, Helpful Content Update Gone, Dangerous Google Search Results & Google Ads Confusion

Apr 12, 2024


This week, we covered the Google March 2024 core update, which is still rolling out 38 days later amid fresh volatility. Just a reminder that the Google helpful content update no longer exists. Gary Illyes from...

#TWIMshow - This Week in Marketing
Ep206 - Recovering from Google's March 2024 Core Update

Apr 8, 2024 · 19:39


Episode 206 contains the Digital Marketing News and Updates from the week of Apr 1-5, 2024.

1. Recovering from Google's March 2024 Core Update - In the aftermath of Google's March 2024 core update, many website owners, particularly small businesses, have felt the impact of significant traffic fluctuations. Google's John Mueller provided some clarity and advice on how to address these changes, especially for those who've experienced a downturn in website performance. The March 2024 update, one of Google's regular adjustments to its core algorithm, has been especially notable for its complexity and the breadth of its impact. Core updates are comprehensive, affecting various parts of the search algorithm, including how sites are ranked and indexed based on content quality, user experience, and many other factors. One key piece of advice from Mueller's discussion: don't rush into making changes while an update is still rolling out. The full effects and intentions of the update might not be immediately clear, and premature adjustments could inadvertently harm your site's performance further. Mueller emphasized, however, that if there are clear areas for improvement on your site, especially those unrelated to the core update's specific focus, it's always a good time to address them. For businesses that have noticed a decline in rankings or traffic, Mueller's guidance focuses on long-term website health and user satisfaction. He suggests that optimizing your website for users, rather than for search engines, is a critical strategy for recovery and future resilience; this approach is often termed "User Experience SEO" because it prioritizes how content and site design affect the user's interaction and satisfaction. A specific area highlighted for attention was the use of paid links or aggressive link-building strategies, which can negatively impact your site's ranking. If your site has engaged in these practices, addressing them can be a step toward recovery. Mueller also advises diversifying your traffic sources to reduce dependency on search engine rankings. Focusing on building value for users can help attract direct visits and recommendations, aligning with Google's goal of rewarding sites that best serve their audience's needs. In summary, the March 2024 core update underscores the importance of maintaining a high-quality, user-focused website. Small business owners looking to recover from or thrive despite these updates should focus on:
- Patiently assessing the full impact of the update before making significant changes.
- Continuously improving site quality and user experience.
- Addressing specific known issues, such as paid links, that could harm your site's reputation with Google.
- Building a more robust online presence that isn't solely reliant on search engine traffic.

2. Google's Approach to Canonical Pages - Google's Gary Illyes shed light on the intricate process by which Google selects canonical webpages. Publishers and SEOs traditionally view canonicalization as a way to earmark the 'original' or most 'authoritative' version of a webpage for ranking purposes. Google's approach diverges significantly, focusing instead on deduplication: identifying and consolidating duplicate pages. Illyes clarifies that Google's primary aim in identifying a canonical page is to choose the version that best represents a set of duplicate pages, based on collected signals. Google first determines whether a page is a duplicate and then selects the most suitable version for indexing. This selection is critical because, typically, only canonical pages are displayed in search results. Google uses a variety of signals to make this decision, ranging from straightforward indicators like site-owner annotations (e.g., rel=canonical tags) to more complex ones such as the page's overall significance on the internet. The process directly impacts how content is presented in search results, especially for pages with multiple variants (e.g., product pages with different sizes or colors). Illyes also touched on duplicate clustering, where Google groups similar content together, and the notion of "centerpiece" content, the main content of a page, which helps in the deduplication process.

3. The Role of Content Quality in Google's Indexing Process - Google's Gary Illyes revealed key insights into how Google's indexing process evaluates the quality of webpages. Illyes's discussion, aimed at demystifying the indexing stage, highlights the significant role content quality plays in determining a webpage's crawl frequency and its ranking in search results. The indexing process involves a comprehensive analysis of a page's textual content, including tags, attributes, images, and videos. During this stage, Google assesses various signals to gauge the page's quality. A critical takeaway for small business owners is the concept of 'index selection': the step that decides whether a page makes it into Google's index, a decision heavily influenced by the page's quality and the signals collected during the initial assessment. Illyes emphasizes that even a technically sound webpage might not be indexed if it doesn't meet Google's quality threshold. This underscores the importance of creating high-quality content that is relevant, informative, and engaging to the target audience. Google has expressed an intention to conserve crawling resources by focusing on pages that warrant crawling, further stressing the need for content to meet high-quality standards. Illyes also delved into Google's approach to duplicate content, explaining 'duplicate clustering': grouping similar pages and selecting a single canonical version to represent the content in search results. The selection is based on the quality signals associated with each duplicate page, highlighting the importance of ensuring each piece of content is optimized to stand out. Ensuring your content meets Google's quality standards can significantly enhance your website's visibility in search results. Actionable steps to align your content with these standards:
- Content creation: Develop content that thoroughly addresses your audience's needs and pain points, tailored to current search trends and demands.
- Content structure: Make your content user-friendly. Structure it in an easily navigable format, and consider implementing schema markup to help Google understand its context.
- Content refresh: Regularly update your content to keep it relevant and valuable, improving your chances of being indexed and increasing your crawl frequency.

4. Navigating Google's INP - Google has launched a tutorial aimed at helping website owners identify and rectify issues related to Interaction to Next Paint (INP), which has recently taken over from First Input Delay (FID) as a Core Web Vital. This shift signifies a new approach to assessing user experience on websites, placing renewed emphasis on speed and responsiveness ...
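Items 2 and 3 describe duplicate clustering and canonical selection at a high level. A heavily simplified sketch of that two-step shape, assuming a toy per-URL "signal score" (none of this is Google's actual algorithm; a plain content hash only catches exact duplicates, whereas Google also detects near-duplicates):

```python
import hashlib
from collections import defaultdict

def fingerprint(content: str) -> str:
    """Hash normalized page text so exact duplicates cluster together."""
    normalized = " ".join(content.split()).lower()
    return hashlib.sha256(normalized.encode()).hexdigest()

def pick_canonicals(pages: dict[str, str],
                    signal_score: dict[str, float]) -> dict[str, str]:
    """Group duplicate pages, then pick one canonical URL per cluster.
    signal_score stands in for the many signals Illyes mentions
    (rel=canonical annotations, overall significance, and so on)."""
    clusters: defaultdict[str, list[str]] = defaultdict(list)
    for url, content in pages.items():
        clusters[fingerprint(content)].append(url)
    return {
        fp: max(urls, key=lambda u: signal_score.get(u, 0.0))
        for fp, urls in clusters.items()
    }

pages = {
    "https://example.com/widget":        "Blue widget, size M.",
    "https://example.com/widget?ref=ad": "Blue  widget, size m.",
    "https://example.com/gadget":        "Red gadget, size L.",
}
scores = {"https://example.com/widget": 0.9,
          "https://example.com/widget?ref=ad": 0.2}
canonicals = pick_canonicals(pages, scores)
print(sorted(canonicals.values()))
# → ['https://example.com/gadget', 'https://example.com/widget']
```

The tracking-parameter URL clusters with the clean one and loses the canonical vote, matching the episode's point that only canonical pages typically surface in search results.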

#TWIMshow - This Week in Marketing
Ep205 - Google's Guidance On Helpful Content Update

Apr 1, 2024 · 13:39


Episode 205 contains the Digital Marketing News and Updates from the week of Mar 25-29, 2024.

1. Google's Guidance On Helpful Content Update - Google's Search Liaison recently addressed concerns about the Helpful Content Update (HCU), providing clear guidance for content creators facing ranking changes. Announced on March 25, 2024, this advice aims to dispel misconceptions and offer recovery strategies. Unlike earlier perceptions that the HCU affected entire domains, Google now evaluates content on a page-by-page basis, focusing on the helpfulness of each page independently. This means that having some unhelpful content doesn't penalize an entire site, allowing helpful pages to be recognized individually.

The clarification that the HCU is not a single signal but part of Google's complex ranking system highlights the need for creators to focus on producing genuinely helpful content. The emphasis is on the quality and relevance of content, rather than on trying to manipulate rankings through a single algorithmic factor.

For those noticing ranking drops, Google suggests a thoughtful review of the content in question, urging creators to focus on its relevance and usefulness to users. Such decreases in visibility might indicate Google's preference for more pertinent content options, not necessarily a penalty against the site.

Google has also introduced a new FAQ page to help publishers understand how to align their content with HCU principles, especially for content that has lost traffic following the update. By adhering to these guidelines and prioritizing user experience, website owners and SEO experts can better navigate the post-HCU search landscape.

2. Rethinking Backlink Importance in SEO - John Mueller of Google indicated in a Reddit discussion that solely focusing on increasing backlink counts might not be the most effective strategy for marketers anymore. This change is part of Google's consistent effort over the past six months to update its stance on the role of links in SEO, suggesting a broader shift in how links are valued.

The discussion was initiated by questions about the variation in backlink counts reported by different SEO tools, such as Ahrefs and Google Search Console. Mueller pointed out that because these tools crawl and index the web in different ways, discrepancies in link counts are natural. He stressed that concentrating on the sheer number of backlinks might divert marketers from improving their website's quality or relevance.

Mueller further encouraged marketers to focus on other website aspects that could more significantly affect SEO success. He mentioned that search engines now use more advanced methods, like sitemaps, for content discovery and evaluation, reducing dependency on backlinks for ranking websites.

This conversation is in line with recent adjustments to Google's guidelines and statements from Google representatives such as Gary Illyes, indicating a reduced emphasis on backlinks as a ranking factor. The removal of the word "important" from Google's description of links as a ranking signal confirms this trend.

For business owners and digital marketers, this shift highlights the need for a more holistic SEO approach that extends beyond backlink building. Concentrating on content quality, user experience, and technical SEO elements, while keeping abreast of Google's algorithm updates, is crucial for maintaining a competitive edge in the digital arena.

3. Core Web Vitals: A Ranking Factor That Requires Balanced Attention - On March 29, 2024, episode 71 of Google's "Search Off The Record" podcast shed light on the complex role of Core Web Vitals (CWV) in SEO, making it clear that excelling in CWV scores alone does not ensure higher search visibility. The episode, featuring insights from Google engineers including Rick Viscomi, a web performance lead, and Lizzi Sassman, a senior technical writer, emphasized the importance of focusing on real-world user experiences over merely chasing high CWV scores.

The conversation revealed a common misunderstanding among developers and marketers who prioritize CWV scores without considering actual user satisfaction. An eye-opening example was Google's own page experience documentation, which scored only 45 out of 100, illustrating that a perfect CWV score is not a prerequisite for success.

John Mueller explained that while CWV metrics do influence Google's ranking algorithms, slight improvements in these metrics may not significantly impact search rankings. He advised against focusing too much on CWV at the expense of other SEO priorities, as CWV is only one aspect of a comprehensive ranking system.

The podcast also advocated for a holistic approach to web performance, suggesting that improvements in CWV should be part of a broader strategy to enhance the overall user experience. This includes focusing on content quality and website design, aiming for a balanced effort that enhances user satisfaction and engagement.

This discussion serves as a reminder that Core Web Vitals, while important, should not detract from other essential aspects of SEO like content relevance and user engagement. A well-rounded approach that prioritizes a superior overall user experience is crucial for long-term success in search engine rankings.

4. Quality Over Quantity: Google's Stance on Content and Indexing - Also from episode 71 of the "Search Off The Record" podcast, featuring Google's Gary Illyes, Lizzi Sassman, and guest Dave Smart, a key SEO principle was highlighted: the significant impact of content quality on Google's crawling and indexing frequency. The discussion emphasized a shift toward prioritizing content quality over quantity, a stance Google has consistently maintained. This contradicts the belief that merely increasing content production will lead to better rankings or more rapid indexing.

John Mueller from Google further clarified this point on LinkedIn, addressing a misconception among website owners that boosting content volume could signal a site's quality to Google and thus improve crawl rates. Mueller likened this to the futile effort of making children prefer kale over ice cream by serving more kale, humorously illustrating that quantity cannot substitute for quality.

Google and Bing spokespeople have consistently stated that the frequency of content publication is not a ranking signal; producing more content does not automatically enhance a site's search visibility or ranking. Bing's Fabrice Canel supported this by advising a "less is more" strategy, suggesting that focusing on high-quality content is a more efficient use of crawl budgets and can result in higher quality traffic from search engines. Moreover, publishing repetitive content could negatively impact a site's crawl allocation, underlining the need for original and valuable content.

For business owners in the digital marketing field, this underscores a critical message: investing in content quality, rather than quantity, is more likely to improve SEO performance. As user experience and content value become increasingly important, tailoring digital strategies to meet these expectations is key for sustained success in search engine rankings.

5. Google's Plea to SEOs: Focus on Your Visitors, Not Just on Rankings - On March 2...

Search Off the Record
What is a web crawler, really?

Search Off the Record

Play Episode Listen Later Mar 14, 2024 30:05


In this episode of Search Off the Record, Gary Illyes and Lizzi Sassman take a deep dive into crawling the web: what is a web crawler, and how does it really work? Listen along as the Search team is joined by an expert web developer in the SEO community, Dave Smart, for an in-depth and technical discussion of all things crawling, and to maybe dispel some myths along the way. Resources: Episode transcript → https://goo.gle/sotr070-transcript Managing your crawl budget → https://goo.gle/3IzRZxl Dave Smart on LinkedIn → https://goo.gle/3wPSuRA Tame the Bots → https://goo.gle/4cfCQ1P Search Central Help Forum → https://goo.gle/sc-forum Indexing API docs → https://goo.gle/3v8yVU0 Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team. #SOTRpodcast Speaker: Gary Illyes, Lizzi Sassman. Products Mentioned: Search Off The Record
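The crawling mechanics the episode digs into, where new URLs are discovered by following links from already-known pages while a frontier queue tracks what still needs fetching, can be sketched in a few lines. This is a toy, offline illustration (the in-memory `site` dict stands in for real HTTP fetches, and all URLs are invented), not a description of how Googlebot is actually implemented:

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags as a page is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(site, start, max_pages=10):
    """Breadth-first crawl over an in-memory site: {url: html}."""
    seen, queue, order = {start}, deque([start]), []
    while queue and len(order) < max_pages:
        url = queue.popleft()
        html = site.get(url)
        if html is None:
            continue  # broken link: nothing to parse
        order.append(url)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            if link not in seen:  # URL discovery: follow links from known pages
                seen.add(link)
                queue.append(link)
    return order

site = {
    "/": '<a href="/a">A</a><a href="/b">B</a>',
    "/a": '<a href="/">home</a><a href="/c">C</a>',
    "/b": "<p>no links</p>",
    "/c": "<p>leaf</p>",
}
print(crawl(site, "/"))  # ['/', '/a', '/b', '/c']
```

A production crawler would additionally respect robots.txt, throttle requests per host, and handle redirects and JavaScript rendering; those are exactly the complications the episode discusses.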

#TWIMshow - This Week in Marketing
Ep201 - ‘How Google Search Crawls Pages'

#TWIMshow - This Week in Marketing

Play Episode Listen Later Mar 5, 2024 10:44


Episode 201 contains the Digital Marketing News and Updates from the week of Feb 26 - Mar 1, 2024.

1. 'How Google Search Crawls Pages' - In a comprehensive video, Google engineer Gary Illyes sheds light on how Google's search engine discovers and fetches web pages through a process known as crawling.

Crawling is the first step in making a webpage searchable. Google uses automated programs, known as crawlers, to find new or updated pages. The cornerstone of this process is URL discovery, where Google identifies new pages by following links from known pages. This highlights the importance of a well-structured website with effective internal linking, ensuring that Google can discover and index new content efficiently.

A key tool for improving your website's discoverability is the sitemap: an XML file that lists your site's URLs along with additional metadata. While not mandatory, sitemaps are highly recommended because they significantly help Google and other search engines find your content. For business owners, this means working with your website provider or developer to ensure your site automatically generates sitemap files, saving you time and reducing the risk of errors.

Googlebot, Google's main crawler, uses algorithms to decide which sites to crawl, how often, and how many pages to fetch. This process is delicately balanced to avoid overloading your website, with the crawl rate adjusted based on your site's response times, content quality, and server health. It's crucial for businesses to maintain a responsive, high-quality website to facilitate efficient crawling.

Moreover, Googlebot only indexes publicly accessible URLs, so businesses should ensure their most important content is not hidden behind login pages. The crawling process concludes with downloading and rendering the pages, allowing Google to see and index dynamic content loaded via JavaScript.

2. Is Google Happy with 301+410 Responses? - In a recent Reddit discussion, a user expressed concern that their site's "crawl budget" was being impacted by a combination of 301 redirects and 410 error responses. The setup redirected non-secure, outdated URLs to their secure counterparts, which then served a 410 error indicating the page is permanently removed. The user wondered whether this approach was hindering Googlebot's efficiency and contributing to crawl budget issues.

Google's John Mueller provided clarity, stating that using a 301 redirect (guiding users from the HTTP to the HTTPS version of a site) followed by a 410 error is acceptable. Mueller emphasized that crawl budget concerns primarily affect very large sites, as detailed in Google's documentation. If a smaller site experiences crawl issues, they likely stem from Google's assessment of the site's value rather than technical problems, which suggests evaluating the content to make it more appealing to Googlebot.

Mueller's insights reveal a critical aspect of SEO: the creation of valuable content. He criticizes common SEO strategies that replicate existing content, which fails to add value or originality. This approach, likened to producing more "Zeros" rather than unique "Ones," implies that merely duplicating what's already available does not improve a site's worth in Google's eyes.

For business owners, this discussion underlines the importance of focusing on original, high-quality content over technical SEO manipulations. While ensuring your site is technically sound is necessary, the real competitive edge lies in offering something unique and valuable to your audience. This not only helps you stand out in search results but also aligns with Google's preference for indexing content that provides new information or perspectives.

In summary, while understanding the technicalities of SEO, such as crawl budgets and redirects, is important, the emphasis should be on content quality. Businesses should strive to create original content that answers unmet needs or provides fresh insights. This approach not only helps with better indexing by Google but also engages your audience more effectively, driving organic traffic and contributing to your site's long-term success.

3. UTM Parameters & SEO - Google's John Mueller emphasized that disallowing URLs with UTM parameters does not significantly enhance a website's search performance. Instead, he advocates maintaining clean and consistent internal URLs to ensure good site hygiene and efficient tracking.

Mueller's advice is straightforward: improve the site's structure to minimize the need for Google to crawl irrelevant URLs. This involves refining internal linking strategies, employing rel-canonical tags judiciously, and keeping URLs consistent across feeds. The goal is to streamline site management and make it easier to track user interactions and traffic sources without compromising SEO performance.

A notable point Mueller makes concerns external links with UTM parameters. He advises against blocking these through robots.txt, suggesting that rel-canonical tags will effectively manage them over time, aligning external links with the site's canonical URL structure. This approach not only simplifies the cleanup of random parameter URLs but also reinforces the importance of direct management at the source. For instance, if a site generates random parameter URLs internally or through feed submissions, the priority should be to address those issues directly rather than relying on robots.txt to block them.

In summary, Mueller's guidance underscores the importance of website hygiene and the strategic use of SEO tools like rel-canonical tags to manage URL parameters effectively. His stance is clear: maintaining a clean website is crucial, but blocking external URLs with random parameters is not recommended. This advice aligns with Mueller's consistent approach to SEO best practices, emphasizing foundational site improvements and efficient management of URL parameters for better search visibility and tracking.

4. Transition Required for Google Business Profile Websites - Google has announced that starting in March 2024, websites created through Google Business Profiles (GBP) will be deactivated, with an automatic redirect to the business's Google Business Profile in place until June 10, 2024. This move requires immediate attention from GBP website owners to ensure continuity in their online operations.

For businesses unsure whether their website is hosted through Google Business Profiles, a simple Google search for the business name and a look at the edit function of the Google Business Profile will reveal whether the site is a GBP creation, indicated by the message "You have a website created with Google." For those without a GBP website, the option to link an external site will be available.

In response to this change, Google has recommended several alternative website builders for affected businesses. Among the suggested platforms are Wix, Squarespace, GoDaddy, Google Sites, Shopify (specifically for e-commerce), Durable, Weebly, Strikingly, and WordPress. Each offers unique features, with WordPress notable for its free website builder incorporating generative AI capabilities. However, users should be aware ...
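Item 1 above describes sitemaps as XML files that list a site's URLs along with optional metadata. A minimal sketch of generating one with Python's standard library follows; the URLs and dates are invented for illustration, and real sites would typically have their CMS emit this file automatically:

```python
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """Return a minimal sitemap XML string for a list of page dicts."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
        if "lastmod" in page:  # optional metadata per the sitemaps.org protocol
            ET.SubElement(url, "lastmod").text = page["lastmod"]
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    {"loc": "https://example.com/", "lastmod": "2024-03-01"},
    {"loc": "https://example.com/about"},
])
print(sitemap_xml)
```

The point of the `lastmod` branch is that extra metadata is optional: a sitemap with nothing but `<loc>` entries is still valid and still helps crawlers discover URLs.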

#TWIMshow - This Week in Marketing
Ep199 - Google Updates Image Removal Process from Search Index

#TWIMshow - This Week in Marketing

Play Episode Listen Later Feb 19, 2024 19:40


Episode 199 contains the Digital Marketing News and Updates from the week of Feb 12-16, 2024.

1. Google Updates Image Removal Process from Search Index - Google has updated its guidelines on removing images from its search index. As of February 16, 2024, the updated guidance includes new details for both emergency and non-emergency situations, ensuring that you can swiftly address image removal needs. Properly managing these images can protect your brand's reputation and ensure that only relevant and positive images are associated with your business in search results.

The guidance covers various removal methods, including scenarios where immediate action is required and where there is no direct access to the Content Delivery Network (CDN) or Content Management System (CMS) hosting the images. For urgent removals, Google recommends the Removals Tool, which temporarily removes images from search results. However, it's crucial to also block or remove those images from your site, to prevent them from reappearing after the removal request expires.

One common issue addressed is the inability to access images hosted on a CDN or through a CMS that doesn't support indexing blocks. In such cases, deleting the images from your site may be necessary. Furthermore, Google has provided more detailed instructions on using robots.txt with wildcards for finer control over which images are indexed. This update is essential for tailoring your site's visibility and ensuring that unwanted or irrelevant images do not detract from your online presence.

Additionally, the update includes a caution about the "noimageindex" robots tag. While this tag can prevent images on a specific page from being indexed, those images might still be indexed if they appear on other pages. To fully block an image, use the "noindex" X-Robots-Tag HTTP response header.

2. Google Updates Canonical Tag Documentation - Canonical tags play a foundational role in SEO by helping prevent duplicate content issues. They signal to search engines which version of a page is preferred when similar content appears under multiple URLs. Proper use of these tags ensures that the right page gets indexed and ranked, improving website visibility and user experience.

The essence of the update, based on Google's adherence to RFC 6596, is the emphasis on explicit use of rel="canonical" annotations. Google specifies that the canonical tag is intended to identify the preferred version of a webpage among duplicates. This clarification does not change how Google processes these annotations but aims to make their intended use clearer.

A noteworthy addition to the documentation is guidance against using rel="canonical" for non-duplicative purposes, such as indicating alternate versions of a page (e.g., in another language or for a different media type). Instead, Google recommends rel="alternate" for such cases. This adjustment highlights the importance of using canonical and alternate tags accurately to avoid confusion and ensure the correct page version is presented to search engine users.

3. A New Video Series for Learning Google Search - Google has launched a video series titled "How Search Works" on its Search Central YouTube channel, aimed at demystifying the complexities of Google Search. Spearheaded by Google engineer Gary Illyes, the initiative promises a behind-the-scenes look into the operational intricacies of the world's leading search engine. The series is designed for a broad audience, including business owners, marketers, and the general public, with the ultimate goal of boosting website visibility in Google's search results.

The debut episode lays the groundwork for the series, with subsequent installments set to dive into practical strategies for improving your website's search engine ranking. Illyes highlights the series' technical focus and its intent to equip viewers with the knowledge to enhance their site's online presence. Central to the discussion are the fundamental processes of Google Search: crawling, indexing, and serving. These stages cover how Google discovers URLs, understands and stores webpage content, and finally ranks and presents search results.

From the first episode, Illyes stresses two pivotal insights. First, Google staunchly denies accepting payment for more frequent crawling or better positions in search results; Illyes firmly states, "If anyone tells you otherwise, they're wrong," dismissing any notion of pay-for-play in search rankings. Second, the quality of a website's content is the cornerstone for securing a favorable spot in search results. What "quality" means by Google's standards will be explored in future episodes, offering viewers a roadmap to better visibility.

4. Google's Guidance for Understanding Ranking Decline - In the dynamic landscape of search engine optimization (SEO), even the most authoritative websites can experience fluctuations in Google search rankings. This was the case for Wesley Copeland, owner of a gaming news website, who noticed a significant downturn in traffic and reached out to Google's Search Liaison, Danny Sullivan, for insights. On February 14, 2024, Sullivan responded with a practical blueprint for diagnosing and potentially reversing ranking declines, shedding light on the intricate work of maintaining visibility in Google's search results.

Sullivan's advice centers on leveraging Google Search Console to dissect and understand the factors behind a site's performance dip. He outlines a five-step process for identifying where and why declines occur. It starts with comparing the site's metrics over recent months against a previous period, focusing on the Queries report sorted by click change. This analytical approach helps pinpoint significant decreases in clicks, giving a clearer picture of the site's current standing in search rankings.

"If you're still ranking in the top results, there's probably nothing fundamental you have to correct," Sullivan reassures, indicating that fluctuations can often be attributed to Google's algorithmic changes rather than a decline in content quality or SEO effort. He emphasizes that Google's algorithms are designed to evolve, constantly refining how content is ranked and presented to users based on relevance and utility.

For business owners and SEO professionals, this conversation underscores the importance of regular performance reviews using tools like Google Search Console. High search rankings are not static achievements but ongoing efforts that must align with Google's ever-changing criteria. Sullivan's parting message offers both assurance and a dose of reality: while fundamental issues may not be present, the variability in how content is displayed can impact site visibility over time.

The dialogue between Copeland and Sullivan se...
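Item 2's distinction between rel="canonical" (the preferred version among duplicates) and rel="alternate" (a genuinely different version, such as another language) can be checked mechanically when auditing pages. Below is a hedged sketch using Python's built-in html.parser; the class name and sample URLs are invented for illustration, and a real audit would fetch live pages instead of a hard-coded string:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the first rel="canonical" target and all rel="alternate" targets."""
    def __init__(self):
        super().__init__()
        self.canonical = None
        self.alternates = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        attr = dict(attrs)
        rel = (attr.get("rel") or "").lower()
        if rel == "canonical" and self.canonical is None:
            self.canonical = attr.get("href")
        elif rel == "alternate":
            self.alternates.append(attr.get("href"))

finder = CanonicalFinder()
finder.feed(
    '<head>'
    '<link rel="canonical" href="https://example.com/page">'
    '<link rel="alternate" hreflang="de" href="https://example.com/de/page">'
    '</head>'
)
print(finder.canonical)   # https://example.com/page
print(finder.alternates)  # ['https://example.com/de/page']
```

If a page listed its German translation under rel="canonical" instead of rel="alternate", a check like this would surface exactly the misuse Google's updated documentation warns against.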

#TWIMshow - This Week in Marketing
Ep196 - Google's SEO Starter Guide Update: Streamlining for Clarity and Efficiency

#TWIMshow - This Week in Marketing

Play Episode Listen Later Jan 29, 2024 15:27


Episode 196 contains the Digital Marketing News and Updates from the week of Jan 22-26, 2024.

1. Google's SEO Starter Guide Update: Streamlining for Clarity and Efficiency - Google is reworking its popular SEO Starter Guide, initially released in 2010 and updated in 2017, to significantly streamline it and make it more accessible and relevant for today's website owners.

The current SEO Starter Guide, at about 8,500 words, will be reduced to less than half its size in the upcoming revision. Lizzi Sassman from Google mentioned that the new guide would be a little over 3,000 words. The reduction is achieved by cutting repetitive and redundant information, aiming to eliminate duplication and streamline the content.

Key aspects of the updated guide include:

- Focus on Modern Users: The guide will be more concise, focusing on general concepts rather than detailed technical instructions. This change reflects the rise of user-friendly content management systems like WordPress and Wix, which have simplified many aspects of SEO.
- Elimination of Redundancy: The new guide aims to avoid duplicating information available in more extensive resources on Google's Search Central site. It will serve as a one-stop introductory resource for SEO basics.
- Potential Impact on the Guide's Ranking: Gary Illyes from Google speculated that the guide's ranking in Google search results might drop due to the reduced word count. However, the emphasis is on making the guide more user-friendly rather than maintaining its length for ranking purposes.
- Feedback-Driven Revision: John Mueller of Google highlighted that reader feedback has been instrumental in reshaping the guide. The goal is to make it more suitable for modern websites and accessible to those new to SEO.

The updated guide will be a valuable resource for understanding the fundamentals of SEO in a more digestible format, providing clear, impactful advice without overwhelming readers with excessive detail or outdated practices.

2. Google's Stance: No Guaranteed Traffic in SEO - A statement by Google's John Mueller on January 17, 2024 has drawn attention from business owners and SEO professionals. The essence of it is that no one, not even an expert, can guarantee increased traffic to a website as a result of specific changes. It came in response to an inquiry about whether removing certain parameters from a website would lead to more traffic; Mueller's unequivocal response was, "Nobody can guarantee you traffic, sorry."

This highlights a crucial aspect of digital marketing: the unpredictable, non-guaranteed nature of SEO (Search Engine Optimization). SEO involves optimizing your website to rank higher in search engine results, ideally increasing visibility and traffic. However, the algorithms search engines use are complex and constantly evolving, which makes it hard to predict exactly how changes to a website will affect its traffic.

Many SEO professionals use estimates and formulas to predict the ROI (Return on Investment) of specific changes to a website. They might estimate that improved rankings could lead to more clicks and potentially more revenue. But these are just estimates and should not be treated as guaranteed outcomes. The digital marketing landscape is dynamic, and what works today might not work tomorrow.

3. Google's Stance on HTML Structure for SEO Rankings - It's crucial to understand the factors that influence your website's visibility on search engines like Google, and one common area of focus is the structure of a website's HTML code. On January 26, 2024, an insightful update was shared on this matter, particularly relevant for those managing their own websites or working with digital marketing professionals.

Gary Illyes from Google clarified on the latest episode of the 'Search Off The Record' podcast that the HTML structure of web pages does not significantly impact search rankings. This addresses a common misconception among website owners and SEO specialists, who often prioritize meticulous HTML structuring in the hope of boosting their search rankings.

Illyes emphasized the value of diversity in website designs and structures, suggesting that if every website had the same HTML structure, the internet would become monotonous. He acknowledged that while basic elements like headings, title tags, and well-organized paragraphs are beneficial, obsessing over the fine details of HTML structuring is largely unnecessary for SEO purposes.

In 2018, John Mueller of Google similarly remarked that while a clear content structure is helpful for users, it does not directly influence ranking. This reinforces the idea that user experience should be the primary focus rather than the complexity of HTML structure. Furthermore, Google has stated that overusing elements like H1 tags, or constantly rearranging them, has little to no effect on a site's ranking. This is especially useful for small business owners who might be allocating resources to fine-tune HTML structure on the assumption that it significantly impacts SEO.

In summary, while maintaining a basic, user-friendly HTML structure is important, overemphasizing its complexity does not yield significant SEO benefits. Business owners are advised to focus on creating valuable content and a pleasant user experience rather than getting caught up in the intricacies of HTML coding for SEO purposes.

4. Googlebot Does Not Read Content Within HTML Comments - On January 25, 2024, Google's John Mueller addressed HTML comments: parts of the website code that are not visible to users but can contain notes or additional information for developers, which some believed could influence Google's understanding of a site's content. Mueller clarified that Googlebot, the search engine's crawling software, does not read or use the content of these HTML comments for indexing or ranking purposes. The clarification came in response to a Reddit query about whether including content in HTML comments could enhance text recognition from images on a website.

Mueller's response underlines an essential principle of web content creation: put content directly on web pages, not in hidden elements like HTML comments. For business owners, this insight is particularly valuable. It emphasizes the need to create high-quality, visible content that users and search engines can easily access and understand. This approach not only ensures better engagement with potential customers but also aligns with Google's guidelines for optimal website performance in search results.

Business owners should stay away from gimmicks and instead prioritize content that adds real value for their website visitors, as this is what Google's algorithms are designed to recognize and reward in search rankings. This update reiterates the ongoing need for transparency and user-centric strategies in digital marketing practices.
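Item 4's point, that text hidden in HTML comments contributes nothing to indexing, is easy to see mechanically: any standard HTML parser routes comments away from the visible-text stream. A small sketch with Python's html.parser (the class name is invented here, and this merely illustrates the parsing behavior, not Googlebot's internals):

```python
from html.parser import HTMLParser

class VisibleText(HTMLParser):
    """Accumulates element text; comments arrive via handle_comment and are dropped."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

    def handle_comment(self, data):
        pass  # deliberately ignored, mirroring how indexers skip comment content

extractor = VisibleText()
extractor.feed("<p>Visible copy</p><!-- keywords stuffed into a comment -->")
print(" ".join(extractor.chunks))  # Visible copy
```

Whatever was written inside the comment never reaches `handle_data`, which is why stuffing keywords there cannot help a page rank.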

Search Off the Record
Rewriting the SEO Starter Guide

Search Off the Record

Play Episode Listen Later Jan 25, 2024 25:24


The SEO Starter Guide is getting a refresh! Have you ever wondered how the Search team plans and writes such a document? In this episode of Search Off the Record, John Mueller, Gary Illyes, and Lizzi Sassman discuss which topics are still relevant, the right level of detail for a starter guide, and whether or not including anti-patterns in documentation is a good idea. Resources: Episode transcript → https://goo.gle/sotr068-transcript SEO Starter Guide → https://goo.gle/42alrmy SEO Starter Guide: What's all the drama? → https://goo.gle/4b6ZLMq Let's pick a domain name → https://goo.gle/3tezc6t Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team. #SOTRpodcast Speaker: John Mueller; Gary Illyes; Lizzi Sassman. Products Mentioned: Search Console

#TWIMshow - This Week in Marketing
Ep193 - Google's Shift Away from Third-Party Cookies has Started

#TWIMshow - This Week in Marketing

Play Episode Listen Later Jan 8, 2024 14:57


Episode 193 contains the Digital Marketing News and Updates from the week of Jan 01-05, 2024.1. Google's Shift Away from Third-Party Cookies has Started - On January 4, 2024, Google embarked on a significant change in the digital marketing landscape by initiating the first stage of its strategy to remove third-party cookies. We have mentioned in our previous episode that it will begin in January 2024. However, I did not expect it to start so soon. Third-party cookies have long been a staple in digital marketing, allowing businesses to track users' browsing habits across different websites and deliver personalized advertising. Google's decision to phase out these cookies is part of its Privacy Sandbox initiative, which aims to balance user privacy with the needs of online businesses. Initially, this change will impact 1% of Chrome users globally, equivalent to about 30 million people.The impact of Tracking Protection's rollout is yet to be fully realized, but it's clear that advertisers need to be ready for a cookie-less future. This shift will cause issues for sites that depend on third-party cookies. Third-party cookies have traditionally provided vital insights for targeted advertising. With their removal, Google will categorize users into anonymized topic listings, allowing brands to target subsets of users but without the granular data previously available. This change is expected to lead to less effective advertising campaigns and, consequently, reduced revenue for web publishers. It also signifies increased costs for businesses in targeting ads.For small business owners, this shift represents a significant challenge and opportunity. The removal of third-party cookies means that the traditional methods of targeted advertising, which rely on detailed user data, will become less effective. Instead, Google plans to categorize users into anonymized topic listings. 
While this still allows for targeted advertising, it will be less specific than before, potentially leading to less effective campaigns and reduced revenue for web publishers. Additionally, the cost of ad targeting is likely to increase for many businesses.Google's move also reflects a broader trend in the digital world towards prioritizing user privacy. This trend is not only driven by tech giants like Google and Apple but also by regulatory changes in regions like the European Union. As a result, digital marketers are entering a phase of trial and error, learning to utilize new tools and strategies to maximize their advertising effectiveness in a privacy-focused online ecosystem.Business owners must adapt their digital marketing strategies to align with these new privacy standards. This adaptation involves exploring new tools and methods for reaching audiences in a way that respects their privacy while still achieving marketing goals. 2. The Community's Verdict on Buying DR & DA Services: More Harm Than Good - As a small business owner, you might be tempted to quickly boost your website's Domain Rating (DR) and Domain Authority (DA) by purchasing services from platforms like Fiverr. DR and DA are metrics developed by SEO companies to estimate a website's likelihood of ranking well in search engine results. While these metrics can be useful indicators of a site's health and link profile, they are not direct factors used by search engines like Google for ranking websites.The allure of these services is understandable: they promise quick, significant improvements in your website's perceived authority. However, it's crucial to understand the potential risks and downsides.Quality of Links: Many services offering to boost DR and DA do so by creating a large number of backlinks to your site. However, these links are often from low-quality or irrelevant sites. 
Search engines have evolved to recognize and penalize such artificial link-building tactics, which can harm your site's long-term SEO health.
- Short-Term Gains, Long-Term Risks: While you might see a temporary increase in your DR or DA, these gains can be short-lived. Search engines continuously update their algorithms to provide the most relevant search results, and they may penalize sites that engage in manipulative link-building practices.
- Misalignment with SEO Best Practices: The best SEO strategy focuses on creating high-quality content and earning backlinks naturally from reputable websites within your industry. Purchasing DR/DA-boosting services goes against these organic practices and can put you at odds with Google's guidelines.
- Cost vs. Benefit: While the cost of these services might seem low compared to other marketing efforts, the potential damage to your site's reputation and ranking can be much more costly in the long run.

The Reddit SEO community strongly advises against purchasing services to artificially boost Domain Rating (DR) and Domain Authority (DA) from platforms like Fiverr. Here are some key takeaways from their responses:

- Toxic Backlinks: Many users pointed out that the backlinks provided by these services are often "toxic" and flagged by search engine algorithms. One user stated, "because the backlinks are usually toxic (spammed and flagged by algos), flagged or have no actual power no matter what their DA says."
- Irrelevance of DR and DA: Several comments emphasized that DR and DA are third-party metrics, not Google metrics. They are often seen as "vanity metrics" with no real value in SEO. A user mentioned, "Because DR and DA are irrelevant third party metrics when it comes to SEO."
- No Impact on Rankings: Users highlighted that an increase in DR and DA does not translate to an increase in search engine rankings.
One comment read, "Because it doesn't work and an increase in DR and DA (not even Google metrics) doesn't mean there will be an increase in rankings."
- Potential Harm to Site: There's a consensus that these services can harm your site's SEO health. One user warned, "Worst case scenario: you severely damage your backlink profile in Google."
- Low-Quality Links and PBNs: Several comments noted that these services often use low-quality links and private blog networks (PBNs), which are frowned upon by Google. A user shared, "They sell the same shit to everyone who orders these. Lots of low quality backlinks using the same PBN."
- Misleading Metrics: Users agreed that DR and DA are easy to manipulate and do not reflect a site's actual authority or relevance. A comment stated, "DR/DA are useless when it comes to link quality."
- Short-Term Gains, Long-Term Losses: The community believes that any short-term gains from these services are outweighed by long-term losses in rankings and credibility.

In conclusion, the Reddit SEO community strongly advises against buying DR and DA boosting services. These services offer short-term, superficial gains at the expense of long-term SEO health and credibility. For small business owners, the focus should be on building organic backlinks, creating quality content, and following best SEO practices for sustainable growth and success.

3. Revamp Your Website Without Losing SEO: Essential Tips from John Mueller - Google's John Mueller highlights the critical importance of careful planning in website revamps to avoid SEO pitfalls. This topic is particularly relevant for small business owners who are considering updating their website's user interface (UI) and user experience (UX).

UI and UX are fundamental aspects of a website, influencing how visitors interact with it and their overall satisfaction. Changes in these areas, along with adding new pages, can significantly impact a site's SEO performance.
This is because these changes can affect everything from how Google crawls and understands a page to the on-page ranking factors. Here is what he mentioned:

“One complexity is that a relaunch can mean so many different things, from just shifting a website to a new server and leaving everything else the same, to changing domain names and all of the content as well. First, you need to be absolutely clear what's changing on the website. Ideally map out all changes in a document, and annotate which ones might have SEO implications. If you're changing URLs, we have some great guidance on handling site migrations in our documentation. If you're changing the content or the UI, of course that will affect SEO too. If you're unsure about the effects, I'd strongly recommend getting help from someone more experienced - it's easy to mess up a bigger revamp or migration in terms of SEO, if it's done without proper preparation. Even with everything done properly, I get nervous when I see them being done. Fixing a broken migration will take much more time and effort than preparing well. In any case, good luck!”

We also recommend that you discuss your needs with a reputable, competent SEO professional. Here are some key steps to consider:

- Crawl the Website: Use tools like Screaming Frog to crawl your site before and after changes. This helps identify issues like missing pages, broken links, or misconfigured meta elements.
- Create a Backup: Always have multiple backups of your website. This is a safety net against various potential issues that could arise during the update process.
- Stage the Website: Use a staging environment to test new changes. This is a duplicate version of your site where you can identify and fix technical bugs or errors before they go live.

For small business owners, understanding and implementing these steps is crucial.
A well-planned website revamp can enhance user experience and SEO performance, while a poorly executed one can lead to significant setbacks in site visibility and user engagement. Therefore, it's not just about making the site look better; it's about ensuring that these improvements align with SEO best practices to maintain or enhance your site's ranking on search engines.

4. Importance of a Website's Homepage in Google's Eyes - Historically, the homepage was deemed the most important page due to the prevalence of directory and reciprocal links pointing to it. However, the focus shifted to inner pages as link-building strategies evolved, targeting content-rich pages for better ranking in search results. Despite this trend, recent statements from Google's Gary Illyes and John Mueller suggest a resurgence in the importance of the homepage.

Illyes, in episode 66 of the Search Off the Record podcast, emphasized that from Google's perspective, the homepage is the most important page of a site. Here is what he said:

“... I can't speak for other search engines, obviously, but from Google's perspective, the homepage is the most important page on the site, ...”

This statement is significant for business owners, as it implies that Google pays special attention to the homepage for indexing and understanding a website's structure. Mueller echoed this sentiment, noting that Google uses the homepage as a starting point for crawling and for gauging the importance of other pages. He explained that pages linked directly from the homepage are often considered more important, influencing their weight in search results.

This renewed focus on the homepage does not negate the value of inner pages but highlights the homepage's role as a gateway to the rest of the site. For business owners, this means ensuring that your homepage is not only well-designed and user-friendly but also strategically links to key inner pages.
This approach can enhance the visibility and ranking of your website in Google's search results.

In summary, while the importance of individual pages varies, the homepage holds a special place in Google's indexing and ranking process. As a small business owner, prioritizing your homepage's content and structure can significantly impact your website's overall performance in search results. This understanding is essential in today's competitive digital landscape, where a well-optimized homepage can be a game-changer for your online presence.

5. Brand Over Keywords: John Mueller's Advice on Domain Names for SEO - In the realm of SEO, the choice of a domain name is a critical decision for any business owner, and the idea of using keyword-rich domain names keeps coming up. In a now-deleted Reddit post, John Mueller, Google's Search Advocate, addressed a common query: do keyword domain names offer an SEO advantage? His response was clear and straightforward. He advised against relying on keyword-specific domain names for long-term SEO strategy, stating that such a domain name does not provide a significant SEO advantage on Google. This advice is particularly relevant for businesses planning for long-term growth and online presence.

The rationale behind Mueller's advice is twofold. First, keyword-focused domain names can limit brand recognition. They may pigeonhole a business into a narrow niche, making it challenging to expand or diversify offerings in the future. Second, these domain names can be restrictive when targeting other keywords, thereby limiting the website's ability to adapt to new products, services, or market trends.

Mueller's insights reflect a shift in SEO trends, emphasizing the importance of building a strong, recognizable brand across various channels. This approach aligns with omnichannel marketing, where consistency in branding across different platforms is key to connecting with customers.
In today's digital landscape, brand recognition and quality content are increasingly significant in achieving SEO success.

In conclusion, Mueller's advice for 2024 and beyond is to focus less on keyword-centric domain names and more on developing a comprehensive, brand-focused online presence. This strategy not only aligns with current SEO trends but also ensures versatility and adaptability for businesses in the ever-evolving digital market. For small business owners, this means investing in a domain name that reflects your brand's identity and values, rather than just targeting specific keywords. This approach will help in building a lasting and flexible online presence, crucial for long-term success in the digital world.

6. Facebook's New Link History Feature: Implications for Targeted Advertising - On January 3, 2024, Facebook introduced a significant update affecting digital marketing: the global archiving of all users' link history on both Android and iOS devices. This is particularly relevant for small business owners who utilize Facebook for targeted advertising.

The new feature, which is enabled by default, allows Meta to use the collected data from users' link history for targeted advertising. This means that Facebook will keep track of the websites visited by users within its mobile browser over the last 30 days. However, it's important to note that links visited in Messenger chats are not included in this history.

The feature could provide a valuable source of data for reaching high-value consumers, especially as the industry moves towards a cookieless future and faces stricter privacy laws. On the other hand, the reliance on such data could be temporary, and there are concerns about user privacy and potential backlash.

Users have the option to opt out of this feature.
To disable it, select any link within the Facebook app to launch Facebook's mobile browser, tap the three dots in the bottom right corner, select browser settings, and then toggle the switch next to "Allow Link History."

A Facebook spokesperson stated, "You can choose to turn link history on or off at any time. When you turn link history off, we will immediately clear your link history, and you will no longer be able to see any links that you've visited. Additionally, we won't save your link history or use it to improve your ads across Meta technologies."

7. LinkedIn Ad Prices Surge Amid Advertiser Boycott of Platform X - As of January 2, 2024, LinkedIn has experienced a significant surge in ad prices, a development crucial for small business owners to understand. This change is attributed to increased demand following an advertiser boycott of another major platform, referred to as "X."

The cost of ads on LinkedIn, a Microsoft-owned platform, has risen by as much as 30% in some cases. This increase is a direct result of advertisers shifting their focus from Platform X to LinkedIn. Leesha Anderson, Vice-President of Digital Marketing and Social Media at Outcast Ad Agency, noted that most of their clients have moved away from Platform X and are now focusing on LinkedIn.

Despite the steep increase in prices, marketers are reporting substantial returns on their investments in LinkedIn ads. Advertisers are seeing up to 20% ROI, meaning every $100 spent brings back $120 in revenue. This is particularly noteworthy given the higher cost of LinkedIn campaigns compared to those on Meta's platforms, where the cost per 1,000 impressions can be as low as $10 to $15.

LinkedIn's ad prices are determined by an auction system based on market demand: the greater the demand, the higher the ad price.
This system has led to the current surge in prices due to the influx of advertisers from Platform X.

The platform's annual ad revenue soared by 101% year-on-year in 2023, reaching almost $4 billion. Insider Intelligence forecasts that this growth will continue, with an additional 141% increase expected in 2024.

Penry Price, LinkedIn's Vice-President of Marketing Solutions, attributed the increased investment in LinkedIn ads to the platform's unique targeting capabilities. LinkedIn claims its members have twice the buying power of the average web audience, with four out of five members driving business decisions.

For small business owners, this information is vital. While LinkedIn presents a solid alternative for advertising, especially in light of the boycott of Platform X, the increased costs must be considered. The platform's robust targeting capabilities and high ROI potential make it an attractive option, but businesses must carefully assess their budgets and advertising strategies to ensure the best use of their resources.
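The ROI and CPM figures quoted above are easy to sanity-check yourself. Here is a minimal sketch of that arithmetic; the spend, revenue, and impression numbers are illustrative, not figures from the episode:

```python
def roi(revenue: float, cost: float) -> float:
    """Return on investment as a fraction: profit divided by cost."""
    return (revenue - cost) / cost


def cpm_cost(impressions: int, cpm: float) -> float:
    """Total cost of buying `impressions` impressions at a given CPM
    (cost per 1,000 impressions)."""
    return impressions / 1000 * cpm


# $100 of ad spend that brings back $120 in revenue is a 20% ROI.
print(f"ROI: {roi(120, 100):.0%}")

# 50,000 impressions at a $12 CPM (mid-range of the $10-$15 quoted for Meta).
print(f"Cost: ${cpm_cost(50_000, 12):,.2f}")
```

Running comparisons like this against your own campaign numbers is a quick way to decide whether LinkedIn's higher prices are justified by the returns.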

#TWIMshow - This Week in Marketing
Ep192 - Site Structure Strategy


Play Episode Listen Later Dec 25, 2023 13:30


Episode 192 contains the Digital Marketing News and Updates from the week of Dec 18-22, 2023.

1. Site Structure Strategy - Understanding the basics of SEO (Search Engine Optimization) can significantly enhance your online presence. Gary Illyes from Google has recently shed light on the importance of using a hierarchical site structure for SEO, a strategy crucial for making your website more accessible and understandable to both users and search engines.

Illyes explains two types of site structures: hierarchical and flat. A flat site structure links every page directly from the home page, making each page just one click away. This approach was popular when sites relied heavily on web directories and reciprocal linking. However, as Google reduced the influence of PageRank as a ranking factor, the flat structure became less relevant.

In contrast, a hierarchical site structure organizes content from general to specific. The home page covers the most general topic, with links to categories, subcategories, and individual pages that delve into more specific topics. This structure not only makes it easier for users to navigate your site but also helps search engines understand and categorize your content effectively.

A hierarchical structure offers several advantages:
- Improved User Experience: It makes it easier for visitors to find what they're looking for, enhancing their overall experience on your site.
- Better SEO: By clearly categorizing your content, search engines can more easily index and rank your pages.
- Flexibility: It allows you to create distinct sections on your site, like a news section, which can be crawled and indexed differently by search engines.

The choice between a hierarchical and a flat structure depends on your site's size and complexity. For larger sites with diverse content, a hierarchical structure is more beneficial. It allows for better organization and easier management of different content sections.
He explained, "hierarchical structure will allow you to do funky stuff on just one section and will also allow search engines to potentially treat different sections differently. Especially when it comes to crawling. For example, having news section for newsy content and archives for old content would allow search engines to crawl news faster than the other directory. If you put everything in one directory that's not really possible."

For small business owners, adopting a hierarchical site structure, as suggested by Gary Illyes from Google, can significantly improve your website's SEO performance. It's not just about organizing content; it's about making your site more accessible and relevant to both your audience and search engines. By implementing this structure, you can enhance user experience, improve search rankings, and ultimately drive more traffic to your site.

2. Decoding the Dec 21 Spam Attack: Key Lessons to Elevate Your SEO Strategy! - On December 21, 2023, Google's search results were overwhelmed by a massive spam attack. This event highlights the vulnerability of search engines to spam tactics and the potential impact on businesses relying on online visibility.

The attack involved numerous domains ranking for hundreds of thousands of keywords, indicating a large-scale operation. The spam was first noticed when almost all top search results for specific queries, like "Craigslist used auto parts," turned out to be spam, except for a few legitimate listings.

The spam sites exploited three main opportunities within Google's ranking system:
- Local Search Algorithm: This algorithm is more permissive, allowing local businesses to rank without many links. Spammers used this to their advantage, targeting local search queries.
- Longtail Keywords: These are low-volume, specific phrases. Due to their low competition, it's easier for spammers to rank in these areas.
- New Domain Advantage: Google gives new sites a short period of 'benefit of the doubt' to rank in search results. Many spam domains were newly registered, exploiting this window.

The effectiveness of this technique lies in the different algorithms Google uses for local and non-local searches. Local search algorithms are more lenient, allowing these spam sites to rank with minimal effort. The December 21, 2023, spam attack on Google's search results offers valuable insights for business owners looking to enhance their SEO strategies. This incident, where numerous domains ranked for an unusually high number of keywords, sheds light on the vulnerabilities and opportunities within Google's ranking system.

Key learnings from the spam attack:
- Exploiting Low-Competition Areas: The spam attack targeted low-competition keywords, particularly in local search and longtail queries. For legitimate businesses, this highlights the potential of focusing on niche, specific keywords where competition is lower, increasing the chances of ranking higher.
- Understanding Google's Algorithms: The spammers took advantage of the local search algorithm's leniency and the initial ranking boost given to new domains. This underscores the importance of understanding how different SEO factors work, including the impact of new content and the specific requirements of local SEO.
- The Power of Longtail Keywords: The attack successfully utilized longtail keywords, which are specific and often less targeted by major competitors. For businesses, incorporating longtail keywords into their SEO strategy can capture niche markets and attract highly targeted traffic.

Applying these insights to your SEO strategy:
- Focus on Local SEO: If you're a local business, optimize for local search queries. Ensure your business is listed accurately on Google My Business, and use local keywords in your website's content.
- Leverage Longtail Keywords: Conduct thorough keyword research to identify longtail keywords relevant to your business. These keywords can drive targeted traffic and are generally easier to rank for.
- Monitor New Trends and Updates: Stay informed about the latest SEO trends and Google algorithm updates. Understanding these changes can help you adapt your strategies effectively.
- Diversify Your Online Presence: Don't rely solely on organic search rankings. Utilize social media, email marketing, and other channels to build a robust online presence.

3. Is Your Company Blog Google News Worthy? - Google's John Mueller addressed a crucial question: can company blogs be eligible for Google News? This is particularly relevant for small business owners seeking to expand their reach and visibility online.

Mueller clarified that while he works on Search, which is somewhat separate from Google News, there's nothing in Google News content policies specifically excluding company blogs. This opens up an opportunity for business blogs to be featured, provided they meet certain criteria.

To be considered for Google News, your blog content must adhere to specific guidelines. These include:
- Clear Dates and Bylines: Each article should have a visible publication date and author byline.
- Author, Publication, and Publisher Information: Details about the authors, the publication, and the company or network behind the content are essential.
- Contact Information: Providing contact details adds credibility and transparency to your content.

While Google can automatically discover news content, being proactive can increase your chances. You can submit your blog URL for consideration through Google's Publisher Center. This step is crucial for small business owners looking to leverage their company blog for greater visibility.

FYI: Google News does feature content from company blogs. For instance, the GridinSoft company blog and Adobe's company webpage have been shown in Google News.
This demonstrates that while dedicated news sites are more common, company blogs that publish news are also considered.

For small business owners, this information is a game-changer. It means that your company blog has the potential to be featured in Google News, provided it meets Google's content policies. This can lead to increased exposure, traffic, and potentially a steady stream of advertising income. It's an opportunity to elevate your content strategy and expand your digital footprint in a meaningful way.

4. Perfect SEO Isn't a Reality for Your Business - In his last SEO office hours of December 2023, Google's John Mueller stated, "no SEO is perfect." This insight is particularly relevant for business owners who may feel overwhelmed by the constantly evolving landscape of SEO.

SEO is an ever-changing field, influenced by the continuous evolution of the internet, search engines, and user behavior. This fluidity means that what works in SEO today might not be as effective tomorrow. Technical elements like structured data and quality considerations are always in flux, making the idea of achieving 'perfect' SEO unattainable.

Despite the impossibility of perfect SEO, Mueller emphasizes the importance of engaging in SEO practices. The goal isn't to achieve perfection but to adapt and evolve with the changes. SEO remains a crucial element in enhancing online visibility, driving traffic, and improving user engagement.

Key takeaways for business owners:
- Adaptability is Key: Stay informed about the latest SEO trends and algorithm updates. Being adaptable in your SEO strategy is more valuable than striving for perfection.
- Focus on Quality and Relevance: Instead of chasing perfection, concentrate on creating high-quality, relevant content that resonates with your audience and adheres to SEO best practices.
- Continuous Learning and Improvement: SEO is a journey, not a destination.
Regularly review and update your SEO strategies to align with current best practices and user preferences.
- Don't Be Discouraged: The complexity of SEO can be daunting, but don't let the pursuit of perfection discourage you. Even small, consistent efforts in SEO can yield significant benefits over time.

For small business owners, understanding that 'no SEO is perfect' can be liberating. It shifts the focus from chasing an unattainable goal to developing a flexible, quality-focused approach that grows with your business and the digital landscape. Embracing this mindset allows you to navigate the complexities of SEO with more confidence and less stress, ultimately leading to a more robust and effective online presence.

5. Does a Double Slash in URLs Affect Your SEO? - Google's Gary Illyes addressed a common query: does a double forward slash in a URL affect a website's SEO? Double forward slashes in URLs often result from coding issues in the CMS (Content Management System) or the .htaccess file. This can lead to duplicate webpages that differ only in their URL structure. Resolving this issue isn't as simple as rewriting the URL to remove the extra slash; the root cause must be identified and corrected.

Illyes clarified that from a technical SEO perspective, having double slashes in a URL is not problematic. According to RFC 3986, section 3, the forward slash is a standard separator in URLs and can appear multiple times, even consecutively. However, from a usability standpoint, double slashes are not ideal: they can confuse users and some web crawlers.

A website's usability matters because it affects user satisfaction and, indirectly, the site's popularity and visibility. If a site is difficult to navigate or understand, it may deter users and reduce the likelihood of being recommended or linked to by other sites. Similarly, anything that confuses web crawlers can directly impact SEO.
It's essential for a site to be easily crawlable and understandable. To avoid potential issues:
- Regularly check your website for double slashes and other URL anomalies.
- Consult with an .htaccess expert or a developer to identify and fix the source of the problem.
- Use tools like Screaming Frog to pinpoint where the double forward slash issue starts, providing clues to the underlying technical issue.

For small business owners, understanding and addressing these seemingly minor details can make a significant difference in SEO performance. While Google may be able to navigate through such issues, relying on this is not a best practice. Proactively managing your site's technical health ensures a better user experience and optimizes your site for search engines.

6. DBAs Now Accepted for Advertiser Verification! - Google Ads has made a change to its Advertiser Verification Program and now accepts DBAs (Doing Business As) or trade names for verification. This development is particularly important for small business owners who often operate under trade names or DBAs.

Previously, the program required advertisers to use their legal business names for verification. This posed a challenge for many businesses that operate under a DBA or a trade name different from their legal name. With this update, Google Ads acknowledges the common practice of using DBAs and adapts its verification process accordingly.

Implications for business owners:
- Broader Accessibility: This change makes the verification process more accessible to a wider range of businesses, especially small and medium-sized enterprises that commonly use DBAs.
- Brand Consistency: Businesses can now maintain brand consistency across their advertising and legal documentation. This is crucial for brand recognition and trust among consumers.
- Simplified Verification Process: The inclusion of DBAs simplifies the verification process for many businesses, reducing the administrative burden and potential confusion.

To be verified under a DBA or trade name, the legal document submitted for verification must include both the legal name and the DBA/trade name. This ensures that Google can accurately associate the trade name with the legal entity behind it.

7. Reservation Campaigns in YouTube Ads - YouTube/Google Ads has simplified the process of setting up reservation video campaigns, a type of advertising that offers fixed-rate impressions, ideal for brand awareness and product promotions.

Reservation campaigns are a form of advertising where ad placements are purchased in advance at a fixed rate, typically on a cost-per-thousand-impressions (CPM) basis. Unlike auction-based ads, where placements are bid on in real time, reservation campaigns guarantee ad placement, making them ideal for high-impact advertising and ensuring visibility for crucial campaigns.

Key features of the new system:
- Self-Service Options: Advertisers can now easily set up reservation video campaigns through Google Ads, streamlining the process of buying high-visibility ad placements like YouTube Select lineups and the Masthead.
- Enhanced Targeting Options: The update includes advanced targeting capabilities, such as YouTube Select topic and interest-based targeting, along with demographic targeting, allowing advertisers to reach their desired audience more precisely.
- Access to Premier Content: Advertisers gain access to prominent placements like the YouTube Masthead and premier content via YouTube Select, ensuring a broader audience reach.
- Diverse Ad Formats: The system offers various ad formats, including non-skippable in-stream ads and bumper ads, catering to different campaign needs and audience preferences.
Benefits for business owners:
- Greater Control and Visibility: With fixed-rate impressions and guaranteed placements, reservation campaigns offer more control over ad impressions and higher visibility for your brand.
- Targeted Reach: The expanded targeting options enable businesses to tailor their campaigns more effectively, reaching the right audience with relevant content.
- Efficiency and Flexibility: The streamlined process saves time and effort, allowing businesses to focus more on the creative aspects and strategy of their campaigns.

For small business owners, Google's update to reservation video campaigns on YouTube simplifies the process of creating impactful brand-awareness and product-promotion campaigns, leveraging YouTube's vast audience. Familiarizing yourself with this new system and aligning your campaigns with Google's policies will be key to maximizing your brand's exposure on one of the world's most popular video platforms.
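The double-slash issue from item 5 above is easy to audit programmatically. Here is a minimal sketch that collapses repeated slashes in the path component of a URL; whether you then 301-redirect the duplicate to the clean version is your own policy decision, not something Google or Illyes prescribes:

```python
import re
from urllib.parse import urlsplit, urlunsplit


def collapse_slashes(url: str) -> str:
    """Collapse consecutive slashes in the path component of a URL.

    Per RFC 3986 a path like /blog//post is valid, but it creates a
    duplicate of /blog/post. The scheme's own '//' (as in https://) is
    untouched because only the path component is rewritten.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    clean_path = re.sub(r"/{2,}", "/", path)
    return urlunsplit((scheme, netloc, clean_path, query, fragment))


print(collapse_slashes("https://example.com/blog//2024///post?id=1"))
# → https://example.com/blog/2024/post?id=1
```

Running a crawler export (e.g., a Screaming Frog URL list) through a check like this quickly reveals where the duplicate-slash URLs originate.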

#TWIMshow - This Week in Marketing
Ep188: Overlooked Details That Make or Break Your SEO


Play Episode Listen Later Nov 27, 2023 21:00


Episode 188 contains the Digital Marketing News and Updates from the week of Nov 20-24, 2023.

1. Overlooked Details That Make or Break Your SEO - Search engines look at many factors when determining how to rank web pages in search results. While flashy new SEO trends come and go, focusing on foundational website quality and technical basics tends to pay off more in the long run. Google's John Mueller, Martin Splitt, and Gary Illyes recently delved into the concept of site quality in a podcast, offering valuable insights for business owners and digital marketers. Their discussion demystifies site quality, emphasizing its simplicity and practicality.

- Site Quality is Not Complex: The Google experts encourage reading the site quality documentation, asserting that understanding and achieving site quality is not as complicated as it may seem. Gary Illyes remarks, "It's not rocket science," suggesting that the basics of site quality are accessible to everyone.
- No Specific Tools for Site Quality: Unlike technical issues, there are no direct tools to measure site quality. Traffic metrics may indicate changes, but they don't pinpoint specific quality issues. This means business owners need to assess their content's effectiveness and relevance themselves.
- Reframing the Approach: Illyes advises reframing the problem by focusing on whether a page delivers what it promises to users. This user-centric approach is key to improving site quality. It's about creating content that helps users achieve their goals.
- Quality in Terms of Value Addition: Adding value is crucial for site quality. In competitive search queries, it's not enough to be relevant; your content must offer something unique and valuable that stands out from what's already available. Mueller explains that simply replicating what's in the search results doesn't add value. Instead, aim for content that exceeds the existing baseline.
Breaking into Competitive SERPs: Illyes suggests an indirect approach to compete in tough SERPs. Choose realistic battles and focus on areas where you can genuinely offer something different and better. In summary, Google's experts highlight the importance of user-focused content, uniqueness, and value addition in achieving site quality. For business owners, this means focusing on creating content that genuinely helps users and offers something beyond what's already out there.2. Master the SEO Basics: Google's Advice for Effective Website Optimization - In the ever-evolving world of Search Engine Optimization (SEO), it's easy to get caught up in the latest trends and advanced tactics. However, Google's Search Relations team, featuring Martin Splitt, Gary Illyes, and John Mueller, emphasized the importance of mastering basic technical SEO issues first. This advice is particularly relevant for business owners who might not be deeply versed in the intricacies of SEO.Technical SEO involves optimizing the architecture and infrastructure of a website to enhance its crawling and indexing by search engines. This is crucial because, no matter how innovative your SEO strategies are, if search engines like Google can't properly crawl or render your site, your efforts won't yield the desired results. Illyes highlights the importance of ensuring that your content is accessible and useful, as these are key factors that Google considers.Another significant point discussed is the common misconception that high traffic automatically means high-quality pages. Mueller advises looking beyond just traffic metrics and focusing on user engagement and satisfaction. These are more accurate indicators of a page's usefulness and quality. It's important to focus on relevant queries and track lower-level pages to better understand a site's performance.The key to creating high-quality content is to focus on what helps people achieve their goals when they visit your page. 
This could mean providing comprehensive answers to common questions, solving problems, or sharing engaging stories. Illyes suggests that quality might be simpler than most think – it's about writing content that genuinely helps your audience.For business owners, the takeaway is clear: before diving into complex SEO strategies, ensure that your website's technical foundation is solid. Also, prioritize creating content that is not just high in volume but high in value to your audience. By focusing on these areas, you can significantly improve your website's SEO performance.3. Rethinking SEO Success: Beyond Traffic Metrics - In episode 66 of Google's "Search Off the Record" podcast, Google's John Mueller and Martin Splitt discussed a crucial aspect of SEO: the real value of traffic metrics. The conversation highlighted a common misconception in the SEO community—equating high traffic with success. While many SEO professionals boast about traffic increases, Mueller and Splitt emphasized the importance of focusing on more meaningful goals, like conversions and business impact.The podcast shed light on the tendency of SEOs to prioritize traffic statistics over Return on Investment (ROI) or the actual impact on earnings. Mueller speculated that this might be due to the delayed effects of SEO efforts on tangible business results. He pointed out that while traffic data is useful, it can be misleading if not analyzed in the context of its relevance and contribution to business goals.The discussion also touched on the different types of traffic and their varying values. Not all traffic contributes equally to sales or brand building; some may be irrelevant or non-converting. Therefore, understanding the nature of the traffic and its actual impact on sales or business growth is crucial.Mueller and Splitt's conversation serves as a reminder for SEO professionals to align their strategies with broader business objectives, rather than just chasing traffic numbers. 
It calls for a more nuanced approach to SEO, where the success is measured not just by the quantity of traffic, but by its quality and contribution to the business's bottom line.4. Google Clarifies the SEO Value of 404 Pages - Google recently shed light on the SEO implications of 404 error pages, offering valuable insights for business owners and digital marketers. A 404 error occurs when a page on a website cannot be found. Contrary to common belief, these pages can have a positive impact on a site's SEO if managed correctly.Google's John Mueller explained that 404 pages are a normal part of the web. They signal to search engines that a page no longer exists, which is crucial for maintaining a clean and up-to-date site structure. Importantly, 404 errors do not directly harm a site's overall ranking in search results.For business owners, this means that occasional 404 errors are not a cause for alarm. However, it's important to monitor these errors and ensure they are appropriate. For instance, if a product is no longer available, a 404 page is suitable. But if the page has moved, a 301 redirect to the new location is better for both users and search engines.Understanding the role of 404 pages in SEO is vital for maintaining a healthy website. It's about balancing user experience and search engine signals. Regularly checking for 404 errors and addressing them appropriately can contribute to a more effective online presence.This insight from Google highlights the importance of website maintenance and understanding the nuances of SEO. It's a reminder that not all errors are detrimental and that proper management of these pages can support a site's SEO strategy.5. AI in Content Creation: A Tool, Not a Threat - Google Search Relations team, including Martin Splitt, Gary Illyes, and John Mueller, discussed the role of AI in content creation. They view AI as a valuable aid to human creativity, not a replacement. 
The team emphasized that AI is excellent for certain tasks but not a catch-all solution. They humorously noted how technology, like Google Plus, can quickly become outdated, highlighting the rapid evolution of tech.The Google team believes AI can be particularly useful for overcoming writer's block or meeting tight deadlines. AI tools can suggest frameworks, phrases, and variations to speed up the writing process. However, they stressed that AI should be used responsibly and as a complement to human creativity, not as a substitute. This perspective encourages a balanced approach to AI in content creation, viewing it as a tool to enhance human efforts rather than overshadow them.Key Takeaways: AI as a Creative Aid: AI is seen as a tool to enhance human creativity, especially useful in overcoming writer's block or accelerating the writing process. Balanced Perspective: The Google team advocates for a responsible use of AI, emphasizing its role as a supplement to human creativity rather than a replacement. Rapid Technological Evolution: The discussion also touches on the fast-paced nature of technology, using Google Plus as an example of how quickly tech can become outdated. 6. Google's Guidance On SEO Tools - Google's John Mueller addressed a query regarding the use of SEO tools for content writing, specifically in the context of a Vietnamese travel agency blog. The question revolved around whether to include Vietnamese accents in keywords, as suggested by an SEO tool, considering the primary audience comprised American and Australian tourists unlikely to use these accents in searches.Mueller's response emphasized the importance of writing in the language of the audience, particularly for headers and body text. He advised not to depend entirely on SEO tools for writing guidance but to conduct independent research. Mueller suggested examining the Search Engine Results Pages (SERPs) with and without accents (e.g., "quảng binh" vs. 
"quang binh") to understand better what ranks higher and is more relevant to the target audience.The key takeaway from Mueller's advice is the significance of not relying solely on SEO tools. These tools are based on the current knowledge and trends in SEO, which can be limited and sometimes outdated. They were developed based on what SEOs believed to be effective at the time, such as keyword densities and reciprocal linking strategies, which eventually became less effective.Mueller's guidance underscores the dynamic nature of SEO and the need for writers and marketers to use their judgment and stay updated with current best practices. While SEO tools can provide valuable insights, they should not dictate content creation. Instead, a balance between tool-guided insights and personal research and understanding of the audience should drive content strategy.This advice is particularly relevant for small business owners and digital marketing enthusiasts who aim to create content that resonates with their audience while also performing well in search engines. Understanding the limitations of SEO tools and the importance of audience-centric content can lead to more effective and engaging digital marketing strategies.7. Follower Count: Not a Google Search Ranking Factor - There's a common belief among some digital marketers and business owners that a higher number of followers on social media platforms like Twitter or Instagram could positively influence their Google search rankings. This assumption stems from the idea that social signals, such as likes and followers, might be interpreted by Google as indicators of a site's popularity or credibility.Google has explicitly stated that follower counts on social media are not a factor in determining search rankings. This clarification is significant because it helps refocus SEO strategies on more impactful practices. 
Google's search algorithms are complex and take into account numerous factors, but social media follower counts are not among them.For small business owners and digital marketing enthusiasts, this information is vital. It means that while having a robust social media presence can be beneficial for brand awareness and customer engagement, it does not directly contribute to how well your website ranks in Google searches. Therefore, efforts should be more strategically directed towards proven SEO practices like content quality, website optimization, and building authoritative backlinks.Key Takeaways: Social media is valuable for engagement and brand presence, not for SEO in terms of follower counts. Focus on creating high-quality, relevant content and optimizing your website for a better user experience. Building a strong backlink profile from reputable sources can significantly impact your Google search rankings. Understanding what does and does not impact your website's ranking in search results is key to effective SEO. This clarification from Google serves as a reminder to focus on the core aspects of SEO that genuinely make a difference, rather than misconceptions like the impact of social media follower counts.8. Google to Remove Crawl Rate Tool from Search Console in 2024 - Google has announced that it will be deprecating the Crawl Rate Limiter legacy tool within Google Search Console on January 8, 2024. This decision comes as Google believes the tool has become less useful due to advancements in its crawling logic and the availability of other tools for publishers.The Crawl Rate Limiter allowed website owners to communicate to Google how often to crawl their site. It was particularly useful for sites experiencing server load issues due to frequent crawling by Googlebot. However, Google has improved its crawling algorithms to automatically adjust based on a site's server response. 
For instance, if a site consistently returns HTTP 500 status codes or if the response time significantly increases, Googlebot will automatically slow down its crawling.Gary Illyes from Google explained that the tool's usefulness has diminished over time. He noted that the tool's effect on crawling speed was slow and it was rarely used. With its removal, Google will set a new minimum crawling speed, which will be comparable to the old crawl rate limits, especially for sites with low search interest.For website owners experiencing issues with Googlebot crawling, Google recommends referring to a specific help document and using a report form to communicate any concerns.As a business owner, it's crucial to stay informed about changes in Google's tools and services, as they can impact your website's visibility and performance. The removal of the Crawl Rate Limiter tool signifies Google's confidence in its automated systems to manage site crawling efficiently. However, it also means that you should be more vigilant about monitoring your site's performance and be ready to use alternative methods to communicate any crawling-related issues to Google.9. Google Removes Key Robots.txt FAQs - Google recently removed its Robots.txt FAQ help document from its search developer documentation. This change has raised questions among webmasters and SEO professionals about the implications for website crawling and indexing.Robots.txt is a file used by websites to communicate with web crawlers about which parts of the site should or should not be processed or scanned. Google's FAQ page on this topic was a valuable resource for understanding how to use this file effectively. Its removal means that some specific guidance and clarifications are no longer directly available from Google.Key Takeaways from the Removed FAQs: A website doesn't necessarily need a Robots.txt file; without it, Googlebot will generally crawl and index the site normally. 
The Robots.txt file is recommended for controlling crawler traffic to prevent server issues, not for hiding private content. For controlling how individual pages appear in search results, use the Robots meta tag or X-Robots-Tag HTTP header. Changes in the Robots.txt file can take up to a day to be reflected in Google's cache and subsequently affect search results. Blocking Google from crawling a page using Robots.txt doesn't guarantee removal from search results. For explicit blocking, use the 'noindex' tag. As a business owner, it's important to understand that while the specific FAQs are no longer available, the fundamental principles of using Robots.txt remain unchanged. It's crucial to ensure that your website's Robots.txt file is correctly configured to guide search engines effectively. Remember, incorrect or unsupported rules in this file are typically ignored by crawlers, so accuracy is key.The removal of the Robots.txt FAQs by Google underscores the dynamic nature of SEO and the importance of staying informed about best practices. Business owners should consult with SEO professionals or refer to updated resources to ensure their website's Robots.txt file aligns with their digital marketing goals.10. Google Ads to Update Location Asset Requirements: What You Need to Know - Google Ads is set to update its location asset requirements in December. A location asset in Google Ads is a feature that allows advertisers to include specific location details, like addresses and phone numbers, in their ads. This is particularly useful for businesses with physical locations, as it helps potential customers find them easily.The upcoming change aims to clarify which types of location assets are not allowed, helping advertisers better understand the restrictions. The update will specifically address locations that are closed, not recognized by Google, or do not match the business running the ad. 
Additionally, assets with products or services that do not match the specified location will be disallowed.This update is significant for business owners and digital marketers. Using location assets effectively in ads can significantly boost a business's visibility and conversion potential. However, not adhering to these updated requirements could result in leaving out vital details, potentially harming your return on investment.11. Microsoft Advertising's Last-Minute Shopper Insights - As the holiday shopping season reaches its peak, Microsoft Advertising's Festive Season Marketing Playbook offers valuable insights for advertisers to capitalize on consumer spending. Here's a concise summary: Timing of Revenue Peaks: Despite some advertisers not yet seeing a peak in revenue, historical trends show significant spikes around Black Friday and Cyber Monday. This year, a 3-4% increase in holiday spending in the US is anticipated, potentially reaching up to $966.6 billion. The UK and Germany are also expected to see similar high spending, highlighting the global impact of the season. Shift in Consumer Behavior: A notable trend this year is the increased emphasis on deal-seeking. Over two-thirds of US shoppers are spending more time looking for coupons and deals, especially around the Cyber5 period (Thanksgiving, Black Friday, Small Business Saturday, Sunday, and Cyber Monday). Advertisers need to adapt to this trend and align their strategies accordingly. The Central Role of Search in Purchasing Decisions: Search remains a crucial component in guiding both online and in-store purchases. It's a pivotal tool for discovering new retailers, conducting pre-purchase research, and comparing prices. For example, Gen X consumers heavily rely on search to find the best prices. In the EMEA region, deal-seekers spend 33% more time searching than average shoppers, offering a significant opportunity for targeted advertising. 
Post-Cyber5 Opportunities: Search volumes remain high even after the Cyber5 period, presenting a continued opportunity for advertisers. Many holiday clicks and conversions happen during Cyber5 with lower cost per acquisition (CPA), so maintaining active advertising campaigns during this period can yield substantial benefits. Planning for Returns: The post-holiday return period is another critical aspect for businesses. Search volumes for returns peak shortly after Christmas and continue into the new year. Preparing for this influx and adjusting marketing strategies can help mitigate potential losses and maintain customer satisfaction. Strategic Holiday Planning Checklist: Microsoft suggests launching campaigns early, using remarketing and dynamic search ads, emphasizing value messages, leveraging AI for personalized offerings, and utilizing store support for profitable online growth.
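The robots.txt behavior summarized in item 9 can be sanity-checked locally. Python's standard-library urllib.robotparser applies the usual user-agent group matching; the rules below are a made-up example, not taken from any real site:

```python
# Sketch: how a crawler interprets robots.txt groups. The rules are
# hypothetical; a crawler with its own group ignores the "*" group.
from urllib.robotparser import RobotFileParser

SAMPLE_ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /staging/
"""

parser = RobotFileParser()
parser.parse(SAMPLE_ROBOTS_TXT.splitlines())

# Googlebot has a dedicated group, so only /staging/ applies to it;
# every other crawler falls back to the "*" group.
print(parser.can_fetch("Googlebot", "/private/page.html"))    # True
print(parser.can_fetch("Googlebot", "/staging/build.html"))   # False
print(parser.can_fetch("SomeOtherBot", "/private/page.html")) # False
```

Note that, as the episode stresses, a Disallow rule only blocks crawling; keeping a URL out of the index requires a noindex robots meta tag or an X-Robots-Tag header instead.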

Search with Candour
Helpful content update fallout and do links still matter in 2023?


Oct 2, 2023 · 36:22


The duo of Jack Chambers-Ward & Mark Williams-Cook reunite once again to bring you the latest SEO news, updates and hot takes, including: the September 2023 helpful content update, SISTRIX's analysis of the HCU fallout, and "Links are no longer a top 3 ranking factor" - Gary Illyes from Google. For full show notes please go to search.withcandour.co.uk.

#TWIMshow - This Week in Marketing
Ep 180: Is Google Lying About Using Clicks in Rankings? An Ex-Googler's Testimony


Oct 2, 2023 · 19:12


Episode 180 contains the important Digital Marketing News and Updates from the week of Sep 25-29, 2023.

1. Is Google Lying About Using Clicks in Rankings? An Ex-Googler's Testimony - The debate over whether Google uses clicks as a direct ranking factor has taken a new turn. Eric Lehman, who spent 17 years at Google working on search quality and ranking, recently testified in the ongoing U.S. vs. Google antitrust trial. Lehman stated that Google's machine learning systems, BERT and MUM, are becoming more critical than user data for search rankings. He believes that Google will increasingly rely on machine learning to evaluate text rather than user data.

Lehman's testimony has sparked controversy, especially among SEO experts, who have long questioned Google's transparency about its ranking factors. Google's Gary Illyes, at a recent AMA, confirmed that Google uses historical search data for its machine-learning algorithm, RankBrain. However, he clarified that clicks are not necessarily a direct ranking factor. Instead, they are used for evaluating experiments and for personalization.

The Department of Justice (DOJ) also attempted to challenge Lehman's statements by questioning Google's advantage in using BERT over its competitors. Lehman clarified that Google's edge comes from inventing BERT, not from its user data. The DOJ's attempt to impeach Lehman's testimony seemed to backfire, adding more credibility to his statements.

Lehman also touched on the sensitive topic of using clicks in search rankings. He mentioned that Google avoids confirming the use of user data in rankings to prevent people from thinking that SEO could be used to manipulate search results. This adds another layer of complexity to the ongoing debate.

In summary, while clicks may not serve as a direct ranking factor, they are part of a more complex system that includes machine learning algorithms and personalization. Lehman's testimony has reignited the discussion about Google's ranking factors, raising questions about the company's transparency and the future of SEO.

2.

#TWIMshow - This Week in Marketing
Ep 179: Google's Top Factor in


Sep 25, 2023 · 15:12


Episode 179 contains the important Digital Marketing News and Updates from the week of Sep 18-22, 2023.

1. YouTube's New AI-Powered Tools! - YouTube is stepping up its game with a slew of innovative creator tools, including groundbreaking generative AI features. Unveiled at the "Made On" showcase event, the platform introduced the "Dream Screen" generative AI tool, allowing creators to integrate AI-generated video and image backgrounds into their YouTube Shorts. This tool aligns YouTube with the latest creative trends, matching efforts by platforms like Snapchat, TikTok, and Instagram.

Additionally, YouTube is launching "YouTube Create," a free video editing app reminiscent of TikTok's CapCut. This app offers a suite of editing tools, including audio cleanup, auto captions in multiple languages, filters, transitions, and direct publishing options. Initially tested in India and Singapore, it's now expanding to more regions and is currently available for Android users.

To further fuel creativity, YouTube is introducing an AI ideas generator. By inputting a topic, creators receive content suggestions and even a downloadable content outline. While this tool offers inspiration, it's essential to use it as a guide rather than a strict blueprint to maintain originality. Other notable features include an automatic dubbing tool for content translation and an assistive search in creator music. These tools aim to simplify the content creation process, allowing creators to focus on their unique vision.

2. Google's New Report To Spot Checkout Issues on

SEO im Ohr - die SEO-News von SEO Südwest
Confusion over links and clicks as a Google ranking factor: what's really going on? SEO im Ohr - Folge 269


Sep 24, 2023 · 15:56


At Pubcon, Google's Gary Illyes said that links have not been among the three most important ranking factors for some time. That raises the question: which ranking factors matter most to Google? There is also confusion over whether clicks are a ranking factor for Google after all, something Google has always denied. Testimony by a former Googler in a US court now suggests that Google does indeed use clicks for Search; in what way, however, remains unclear. In connection with the ongoing Helpful Content Update, Google noted that low-quality third-party content on subdomains or in subdirectories can hurt the overall assessment of a website, although this is not yet part of the current update. Bing Webmaster Tools now also shows data from the AI chat, but the data cannot be filtered or read out separately. According to Google's John Müller, most worries about duplicate content come from SEOs. With duplicate content, it is important to differentiate.

#TWIMshow - This Week in Marketing
Ep178 - Google Announces September 2023 Helpful Content System Update


Sep 18, 2023 · 21:31


Episode 178 contains the notable Digital Marketing News and Updates from the week of Sep 11-15, 2023.1. TikTok Shop Launches in the US - With inspiring hashtags like #TikTokMadeMeBuyIt, TikTok now aims to revolutionize online shopping culture. TikTok Shop is now officially available in the US, allowing US businesses to sell their products directly on the TikTok platform. TikTok Shop is a social commerce platform that allows users to discover and purchase products directly from their favorite creators and brands. TikTok Shop adds shoppable videos and LIVE streams directly into the “For You” feeds for its 150 million American users. TikTok Shop extends beyond in-feed videos and LIVE streams. Users can discover new products via the search bar inside the TikTok app, filtering results to Shop. Businesses and brands get a dedicated “Shop Tab” to display products and promotions. This incorporates a product showcase where users can read reviews and purchase directly from your brand's profile. In addition, sellers can take advantage of “Fulfilled by TikTok” – a new logistics solution where TikTok manages storage, picking, packing, and shipping. The platform even includes an affiliate program, letting popular influencers and creators earn commissions by promoting TikTok Shop products. To get started, go to your TikTok app, visit your profile and, using the menu, navigate to Creator Tools. There, you will see the options to sign up for TikTok Shop as a seller or creator to earn brand commissions. Creators must have at least 5,000 followers and be 18 years old to be eligible for the TikTok Shop Affiliate program. TikTok Shop is integrated with well-known ecommerce platforms like Shopify, WooCommerce, BigCommerce, Magento, and Salesforce Commerce Cloud. This should make it easier for existing ecommece sellers to start selling on TikTok without creating a new store from scratch. 
In addition, TikTok has partnered with several multi-channel platforms like Channel Advisor and Feedonomics to support omni-channel businesses. Sellers can utilize apps from Zendesk, Printful, Yotpo, EasyShip, and more to add more functionality and features to TikTok Shops.2. Amazon Launches New AI Tool to Help Sellers Create Listing Content - Creating compelling product titles, bullet points, and descriptions has traditionally been a cumbersome task for sellers. Amazon has taken a significant step forward in simplifying the lives of its sellers by employing generative artificial intelligence (AI) to generate listing content. The process is remarkably straightforward. Sellers only need to supply a brief description or a few keywords about the product. Amazon's AI generates high-quality, detailed content for the seller's review. If satisfied, sellers can directly upload this content to their product listings. Robert Tekiela, vice president of Amazon Selection and Catalog Systems, expressed excitement about the developments in the announcement post. “With our new generative AI models, we can infer, improve, and enrich product knowledge at an unprecedented scale and with dramatic improvement in quality, performance, and efficiency. Our models learn to infer product information through the diverse sources of information, latent knowledge, and logical reasoning that they learn. For example, they can infer a table is round if specifications list a diameter or infer the collar style of a shirt from its image.”3. Suspended Google Advertisers Needs To Complete Verification Before Appeal - Google wrote, "In October 2023, selected advertisers whose accounts were suspended due to a violation of our Google Ads policies must complete Advertiser verification first to be able to appeal their account suspension."Google said that those advertisers on monthly invoicing are not required to complete Advertiser verification. 
If they do get suspended, Google said they may directly appeal their account suspension.Google will be rolling out this process on October 10, 2023, with full enforcement ramping up over approximately four weeks.Advertisers may be required to provide the following information for advertiser verification for the account suspension appeals process: D-U-N-S number if the advertiser is an organization US Social Security Number or phone number if the advertiser is an individual 4. Struggling To Get Results From Your YouTube Ads? Google Ads Creative Guidance Can Help! - Google Ads Creative Guidance is a new tool that helps advertisers improve the performance of their YouTube ads. It provides feedback and recommendations on key creative attributes, such as brand logo, video duration, voice-over, and aspect ratio.To access Creative Guidance, advertisers simply need to go to their Google Ads account and click on the "Videos" tab. Then, they can select the video ad they want to analyze and click on the "Creative Guidance" button.Creative Guidance will then provide feedback on the following creative attributes: Brand logo: Is the brand logo displayed prominently in the first 5 seconds of the ad? Video duration: Is the video ad the recommended length for its marketing objective? Voice-over: Does the video ad use a high-quality, human voice-over? Aspect ratio: Does the video ad group include all three video orientations: horizontal 16:9, vertical 9:16, and square 1:1? Creative Guidance will also provide recommendations for how to improve the performance of the video ad. For example, if the brand logo is not displayed prominently in the first 5 seconds of the ad, Creative Guidance will recommend adding a logo overlay to the video.Creative Guidance can be a valuable tool for advertisers who want to improve the performance of their YouTube ads. 
By following the feedback and recommendations from Creative Guidance, advertisers can create video ads that are more likely to capture the attention of viewers and drive results. While this tool seems promising, advertisers must remember that the AI can only provide recommendations based on historical data and existing best practices. Therefore, its suggestions may not align with every brand's style and strategy.To access Creative guidance in Google Ads, follow these instructions: In your Google Ads account,  Click the Campaigns icon Campaigns Icon.  Click the Assets drop down in the section menu.  Click “Videos”.  Click the “Analytics” tab next to “Videos”.  Select your video ad in the drop down menu.  In the “Ideas to try” section below the retention curves, you'll find the creative attributes you're missing with recommendations on how to take action.  The steps listed above are part of a new Google Ads user experience that is set to launch for all advertisers in 2024. A spokesperson for Google Ads said in a statement: "We'll let you know if your video is missing a best practice. If we have a recommendation or tool to implement a suggestion such as adding a voiceover, we'll direct you to it." "Voiceover has a big impact on YouTube. Thanks to the power of AI, quality voice overs in 15 languages are accessible directly in Google Ads (both in the asset library and built into the video creation tool) and coming soon to Ads Creative Studio. "Similarly, if your campaign would benefit from videos in different durations, we'll guide you to Trim video. Or, if you're missing a horizontal, square or vertical video, you can easily create one using a variety of high-quality templates." "These features empower marketers to take charge of their creative. AI can help turbocharge performance by tuning creative elements across all the different viewing experiences and content that YouTube viewers love." 5. 
Google: Don't Stuff Low-Quality Content at the Bottom of Your E-Commerce Category Pages - During the September 2023 Google SEO office hours, Gary Illyes advised against stuffing low-quality, auto-generated content at the bottom of e-commerce category pages. Instead, Gary said, "add content that people will actually find useful, don't add content because search might require it or so you think… please don't do those auto-generated low-quality repeated, blurbs of text over and over again on all your category pages. It just looks silly, even for the average person." 6. Google Says Meta Description Length Doesn't Matter for SEO - Your SEO professional may disagree with this, but Google's John Mueller has said that the length of meta descriptions does not matter to the Google Search ranking algorithm. However, meta descriptions are still important for SEO, as they can influence click-through rates (CTRs). His exact words were, "I'm sorry to tell you, those numbers are all made up. Whoever told them to you is leading you astray, probably not just in this regard (and I hope you're not telling them to clients)," after Kushal wrote to John stating, "Hey @JohnMu meta description length matters. One of client say 200-300 character not good, make it 155-160 character is ideal as per google algo. Being SEO What I suggest him. I know the hidden truth. Pls suggest." 7. Google: No Such Thing as "Back to the Same" for Search Rankings After a Site Revamp - In a recent tweet, Google Search's John Mueller said that there is no such thing as "back to the same" for search rankings after a website revamp. This means that when you make changes to your website, your search rankings may go up, down, or stay the same. There is no specific timeline or guarantee for how long it will take for your rankings to stabilize after a revamp.
It can depend on a number of factors, such as the size and scope of the changes you made, the quality of your content, and the competitiveness of your keywords. If you are considering revamping your website, it is important to be aware of the potential impact on your search rankings. You should also make sure that you have a plan in place for monitoring your traffic and rankings after the revamp. So before you embark on your website revamp journey, work with a reputable SEO professional to figure out an SEO strategy that will either maintain your rankings or even improve them. 8. Google: Fixing INP Issues Won't Improve Search Rankings - Google's John Mueller reiterated what Google has been saying for some time about Core Web Vitals and the specific metrics within the page experience system. In short, if you fix INP issues, John Mueller said, don't expect it to visibly change search ranking visibility. INP, or Interaction to Next Paint, measures how responsive a page is to user input by selecting one of the single longest interactions that occur when a user visits a page. For pages with fewer than 50 interactions in total, INP is the interaction with the worst latency. For pages with many interactions, INP is most often the 98th percentile of interaction latency. It is one of the three Core Web Vitals, a set of metrics that Google uses to measure the user experience of a web page. In July, Google began sending out scary INP Core Web Vitals warnings to users. That email notice came a few weeks after Google created the INP report in Google Search Console. As a reminder, INP (Interaction to Next Paint) is replacing FID (First Input Delay) in March 2024. More on INP can be found here, and how to optimize for it can be found here. P.S.: While INP is important for user experience, it is not a direct ranking factor. This means that fixing INP issues is not guaranteed to improve your search rankings.
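Given that definition, the way an INP-style value is chosen from a page's interaction latencies can be sketched in a few lines. This is only an illustrative approximation of the aggregation rule described above, not Google's or the web-vitals library's actual implementation:

```javascript
// Illustrative sketch: pick an INP-style value from interaction latencies.
// Fewer than 50 interactions → the single worst latency; otherwise roughly
// the 98th percentile of all interaction latencies.
function inpEstimate(latenciesMs) {
  if (latenciesMs.length === 0) return null; // no interactions, no INP
  const sorted = [...latenciesMs].sort((a, b) => a - b);
  if (sorted.length < 50) {
    return sorted[sorted.length - 1]; // single worst interaction
  }
  // 98th percentile: index at 98% of the sorted list, clamped to the end
  const idx = Math.min(sorted.length - 1, Math.floor(sorted.length * 0.98));
  return sorted[idx];
}

console.log(inpEstimate([40, 120, 85])); // few interactions → worst: 120
```

In real pages the values come from the browser's Event Timing API; field tools such as the web-vitals library report the finalized metric for you.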
However, it is still important to fix INP issues, as they can improve the user experience of your website. 9. Google Announces September 2023 Helpful Content System Update - Google rolled out a new update to its Helpful Content System on September 14, 2023. The update, the first since December 2022, is designed to promote helpful content and demote unhelpful content. It will take about two weeks to complete, and Google will update its ranking release history page when the rollout is done. The three main things you should be aware of are:
- Loosening the guidance on machine-generated content: Google's previous guidance on machine-generated content emphasized that the Helpful Content system prioritizes content created by humans. That part of the guidance has been removed, signaling a change in Google's attitude toward AI content to align it better with other seemingly contradictory guidance on AI content. Here is what the new guidance reads: “Google Search's helpful content system generates a signal used by our automated ranking systems to better ensure people see original, helpful content created for people in search results.”
- Hosting third-party content on subdomains (or on the main domain): There is a longstanding trend of hosting third-party content on the main part of a website or on a subdomain. An example of this is news media websites hosting third-party credit card affiliate content on a subdomain. The idea behind these strategies may be that some of the main site's ranking power would help the subdomain content rank better. Google's September 2023 Helpful Content update has made a change that may negatively affect websites that host third-party content anywhere on their website. A new section added to the Helpful Content update guidance advises: “If you host third-party content on your main site or in your subdomains, understand that such content may be included in site-wide signals we generate, such as the helpfulness of content.
For this reason, if that content is largely independent of the main site's purpose or produced without close supervision or the involvement of the primary site, we recommend that it should be blocked from being indexed by Google.” Gary Illyes from Google also posted on LinkedIn: “We've heard (and also noticed) that some sites “rent out” their subdomains or sometimes even subdirectories to third-parties, typically without any oversight over the content that's hosted on those new, generally low quality micro-sites that have nothing to do with the parent site. In fact the micro-sites are rarely ever linked from the parent sites, which don't actually want to endorse these often questionable sites. The only reason the owners of these shady (?) micro-sites rent the sub-spaces is to manipulate search results.”
- New warnings on attempts to fake page updates and fake freshness: Google added the line “Are you changing the date of pages to make them seem fresh when the content has not substantially changed?” under the “avoid creating search engine-first content” section. I see it all the time: sites make a couple of changes to content from years ago, update the date, and re-publish it. This is a common SEO “strategy” that is on Google's radar.
So what is the Helpful Content update? Google's helpful content update specifically targets “content that seems to have been primarily created for ranking well in search engines rather than to help or inform people.” This algorithm update aims to help searchers find “high-quality content.” Google wants to reward better and more useful content that was written for humans and to help users. Searchers get frustrated when they land on unhelpful web pages that rank well in search because they were written for the purpose of ranking in search engines.
This is the type of content you might call “search engine-first content” or “SEO content.” Google's helpful content algorithm aims to downgrade those types of websites while promoting more helpful websites, designed for humans, above search engines. Google said this is an “ongoing effort to reduce low-quality content and make it easier to find content that feels authentic and useful in search.” Google has provided a list of questions you can ask yourself about your content. Read through those questions here, and in an unbiased manner, ask yourself if your content is in sync with this update. You should also read this. Please note if this update has hit you, it can take several months to recover if you do everything right and make changes to your content over time. Here is what Google wrote on this: “If you've noticed a change in traffic you suspect may be related to this system (such as after a publicly-posted ranking update to the system), then you should self-assess your content and fix or remove any that seems unhelpful. Our help page on how to create helpful, reliable people-first content has questions that you can use to self-assess your content to be successful with the helpful content system.”

#TWIMshow - This Week in Marketing
Ep173 - Google Warns: Content Pruning Can Hurt Your SEO


Play Episode Listen Later Aug 14, 2023 19:22


Episode 173 contains the notable Digital Marketing News and Updates from the week of August 7-11, 2023. 1. Amazon is Testing AI to Write Product Descriptions for Sellers - Amazon is reportedly testing a generative AI tool for sellers that writes copy for product listings. This marks a significant move in integrating large language models (LLMs) into ecommerce. The tool is anticipated to revolutionize how merchants create and optimize product descriptions. According to a report by TechCrunch, Amazon is testing the tool with a small group of sellers. The tool is able to generate titles and descriptions for product listings, as well as bullet points and images. The tool is still under development, but it has the potential to save sellers a significant amount of time and effort. The use of LLMs in ecommerce is still in its early stages, but it has the potential to disrupt the industry. LLMs can be used to generate product descriptions, write customer reviews, and even create marketing campaigns. As LLMs become more sophisticated, they are likely to play an increasingly important role in ecommerce. 2. Meta to Replace Inactive Group Admins: What You Need to Know - Meta has begun notifying some group admins that they need to be more active in moderating their groups, or Meta will assign another group member to the job instead. This change is likely in response to the ongoing moderator conflict at Reddit, which has seen some large and popular communities become unmoderated or even taken over by bad actors. The notification that Meta is sending to group admins says that they need to "take action to maintain activity" within their groups within one week. This includes things like posting new content, responding to comments, and enforcing the group's rules. If admins do not take action, Meta will assign another group member to the role. This change is likely to have a significant impact on many Facebook groups.
Many groups rely on their admins to keep them active and moderated, and if those admins are replaced, it could lead to the decline or even collapse of those groups. It is also possible that this change could lead to an increase in spam and abuse in Facebook groups, as there will be fewer admins to monitor them. 3. YouTube Cracks Down on Spammy Links in Shorts - YouTube is rolling out a new linking policy that will make it more difficult for creators to post spammy links in their Shorts videos and channel descriptions. The new policy will take effect on August 31, 2023. Under the new policy, links in Shorts videos and channel descriptions will be automatically converted to unclickable text. Creators will still be able to include links in their videos and descriptions, but they will need to be manually approved by YouTube before they can be clicked on. YouTube is making this change in an effort to curb the spread of spam on its platform. In recent months, YouTube has seen an increase in the number of spammy links being posted in Shorts videos and channel descriptions. These links often lead to malicious websites or scams. The new linking policy is expected to make it more difficult for spammers to spread their content on YouTube. It will also make it easier for users to identify and avoid spammy links. 4. YouTube Expands Link Limit for Channel Profiles: Add Up to 14 Links Now! - YouTube is rolling out an update that will allow creators to add up to 14 links to their channel profiles, up from the previous limit of 5. This new feature is designed to give creators more ways to promote their content and drive traffic to their websites, social media profiles, and other online destinations. To add links to your channel profile, simply go to the "About" tab and click on the "Links" section. You can then enter up to 14 links, each with a title and a description.
The links will be displayed in a grid format on your channel profile, and viewers can click on them to visit the linked destinations. This new feature is a welcome addition for YouTube creators, as it gives them more flexibility and control over how they promote their content. It can also help creators drive traffic to their websites and social media profiles, which can help them grow their audience and reach more viewers. 5. YouTube Makes it Easier to Find Longer Content from Shorts - YouTube is testing a new feature that will allow creators to link their Shorts clips to longer video uploads. This will make it easier for viewers to find and watch longer content that is related to the Shorts they have enjoyed. The new feature is currently being tested with a small group of creators. If it is successful, it could be rolled out to all creators in the coming months. The ability to link Shorts clips to longer video uploads is a significant development for YouTube. It could help creators grow their audiences and drive more traffic to their channels. It could also help make YouTube a more discoverable platform for users. 6. YouTube's New Report: See How Your Videos Are Performing by Format - YouTube has announced a new report that will allow creators to see how their videos are performing by format. The report, which is available in YouTube Studio, breaks down views, watch time, and other metrics by video format, including Shorts, longform videos, and live streams. This new report is a valuable tool for creators who want to understand how their viewers are engaging with their content. It can help creators identify which formats are performing well and which ones could use improvement. This information can then be used to make strategic decisions about future content creation. For example, if a creator sees that their Shorts are getting a lot of views but their longform videos are not, they may want to focus on creating more Shorts in the future.
Or, if a creator sees that their live streams are getting a lot of engagement but their Shorts are not, they may want to experiment with different ways to promote their live streams. The new report is just one of the many ways that YouTube is helping creators succeed. By providing creators with insights into their performance, YouTube is empowering them to make informed decisions about their content creation. 7. 3 Ways to Keep Your AI Chatbot Content Out of Google Search - AI chatbots are becoming increasingly common on websites across the globe, providing instant customer service and engagement. But AI-powered agents can also produce inaccurate content that website owners may not want mixed in with their human-edited and generated content. Google's Search Advocate, John Mueller, recently advised website owners to consider ways to block artificial intelligence (AI) chatbot content from indexing. This is because AI chatbot content can often be repetitive and low-quality, which can negatively impact a website's search engine ranking. There are three main ways to block Google from indexing AI chatbot content:
- Use a roboted iframe. Because the iframe is a separate document that the site owner controls independently, it can be disallowed in robots.txt (or given a noindex directive), which keeps it out of the index even though Google can crawl iframes.
- Use a roboted JavaScript file / resource. If the chatbot content is rendered entirely by a script, disallowing that script in robots.txt prevents Google from fetching it during rendering, so the chatbot content never makes it into the indexed page.
- Use the data-nosnippet attribute. This attribute can be added to HTML elements to prevent them from being included in the snippet that is displayed in search results. This can be useful for keeping AI chatbot content out of snippets when it is not relevant to the user's search intent.
It is important to note that these methods are not foolproof. Google may still be able to index AI chatbot content if it is not properly blocked.
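The three approaches above can be sketched in markup. This is a minimal illustration under assumed file names and robots.txt rules, not a prescribed setup:

```html
<!-- Sketch of the three blocking approaches; all paths are hypothetical. -->

<!-- 1. Roboted iframe: serve the chatbot from a URL that robots.txt
     disallows (e.g. "Disallow: /chat-widget/"). -->
<iframe src="/chat-widget/frame.html" title="Support chat"></iframe>

<!-- 2. Roboted JavaScript resource: render the chatbot entirely from a
     script that robots.txt blocks (e.g. "Disallow: /js/chatbot.js"),
     so its output never reaches Google's renderer. -->
<script src="/js/chatbot.js" defer></script>

<!-- 3. data-nosnippet: keep the chatbot markup out of search snippets. -->
<div data-nosnippet>
  <p>Hi! I'm the support bot. How can I help?</p>
</div>
```

Note that data-nosnippet only affects snippets, while the two robots.txt-based approaches keep the content from being crawled or rendered in the first place.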
However, by using these methods, website owners can significantly reduce the chances of their AI chatbot content being indexed. 8. Google Ads Updates: Get More Customers with New Features - Google Ads is rolling out a number of new features and making major updates to its platform in the coming months. These changes are designed to help advertisers improve their campaigns and get better results. The changes to Google Ads were unveiled via their announcement page, where they wrote: “This holiday season is just around the corner, which means one thing: shoppers and retailers alike are working on their checklists. In fact, today's shoppers are being more intentional with their holiday purchases, with 74% planning their shopping ahead of time.” “AI-powered tools are transforming businesses' ability to move faster, better understand the intent of their customers, and engage them in new ways across the path to purchase.” “Today, we're excited to share new tools to generate insights and new features to help you do more with Google AI.” Here are the things you should know about:
- Local inventory ads and omnichannel shoppers: According to Google's data, 61% of holiday shoppers use five or more channels while shopping over two days. To reach these omnichannel consumers, Google is expanding ad formats and adding new optimization tools. For example, businesses can now use “Pickup Later” annotations for local inventory ads if they don't have a feed set up.
- Insights on early holiday shopping: To capitalize on this, advertisers can now access more granular performance reporting on products, brands, and labels over time in the Ads interface. The Performance tab in Google Merchant Center has also been upgraded with competitive benchmarks on bestselling items and pricing visibility.
- Targeting: Google is enhancing the promotions, shipping details, and visuals on product listings to help items stand out.
Advertisers can now target deals to specific locations and categories. Same-day delivery and return information will also be more prominent.
- AI tools in Product Studio: Google is launching new tools to help retailers improve product images and create 3D visual assets. Product Studio will be available within Merchant Center Next, Google's upgraded interface for managing Shopping campaigns. It will also be available in the Google & YouTube app on Shopify, so Shopify merchants can access Product Studio directly through their Shopify account. Early adopters can test out Product Studio ahead of its full release.
- AI strategy in Performance Max: Google urges more advertisers to adopt its AI-powered Performance Max campaigns this holiday season. The platform can now optimize for high-lifetime-value new customers, helping businesses focus spending on valuable first-time buyers. Early adopters who have switched from standard Shopping campaigns to Performance Max have reportedly seen a 25% increase in conversion value on average, at similar return-on-ad-spend levels.
9. Google Updates Performance Max Best Practices: How to Get Better Results - Google has updated its Performance Max best practices guide, with new recommendations for businesses of all sizes. The guide covers everything from setting goals and budgets to creating effective creative. One of the key changes in the new guide is the emphasis on using Performance Max to achieve specific business goals. The updated best practices guide underscores how Performance Max campaigns can help retailers optimize ad spending across Google's Search, Display, YouTube, and other inventory. In short, Performance Max utilizes AI to present customers with the most relevant combination of ad creatives across devices and marketing channels. The guide also provides new recommendations for leveraging Google's Performance Planner for optimizing budgets and bids.
Performance Planner gives retailers suggested budget and bid adjustments to help campaigns achieve better performance for the same spend. Additionally, the guide offers tips on demand forecasts to understand predicted trends relevant to your business. The guide encourages marketers to experiment to measure the uplift in conversion value from switching to Performance Max. Here is what Google wrote: “If you are running Standard Shopping campaigns, you can run an experiment in order to measure the uplift in conversion value from switching to Performance Max. … If you're satisfied with the results of your A/B experiment, you can then continue running your new Performance Max campaign to replace your Standard Shopping campaign.” The guide provides tips on strategies like lower ROAS targets to increase visibility for high-priority products leading into significant retail moments: “For example, you may want a campaign for holiday merchandise, a campaign for high-margin products, and a campaign for everything else. Setting a lower ROAS target can also help maximize visibility for these products…” To help Performance Max campaigns learn faster, Google advises consolidating campaign structures where possible: “When setting up a new Performance Max campaign, you should consolidate your campaign structure where you can. Google AI works best when it can optimize performance across channels using a unified budget.” With the latest Performance Max capabilities, retailers can optimize for acquiring high-value customers: “New Customer Acquisition with High Value optimization is also now available in beta to help you optimize for new customers with high predicted lifetime value.” The guide concludes with a thorough review of reporting and insights, focusing on retail-centric reports. The updated Performance Max best practices guide is a valuable resource for businesses of all sizes.
By following the recommendations in the guide, businesses can improve the performance of their campaigns and achieve their desired results. 10. Google Downgrades HowTo and FAQ Rich Results: What You Need to Know - Google will be showing fewer rich results in its search results, specifically showing fewer FAQ rich results across search result snippets and limiting How-To rich results to desktop devices. Google said the FAQ (FAQPage structured data) rich results will only be shown for “well-known, authoritative government and health websites.” For all other sites, this rich result will no longer be shown regularly, Google added. Which sites Google decides to show them for is automated and algorithmic. Google said there is no reason to remove structured data from your site: “Structured data that's not being used does not cause problems for Search, but also has no visible effects in Google Search.” Google said the How-To (HowTo structured data) rich results will only be shown for desktop users, and not for users on mobile devices. Google added that “with mobile indexing, Google indexes the mobile version of a website as the basis for indexing: to have How-To rich results shown on desktop, the mobile version of your website must include the appropriate markup.” Google wrote, “For both of these items, you may also notice this change in the Search Console reporting for your website. In particular, this will be visible in the metrics shown for FAQ and How-To search appearances in the performance report, and in the number of impressions reported in the appropriate enhancement reports. This change does not affect the number of items reported in the enhancement reports. The search appearances, and the reports, will remain in Search Console for the time being.” 11. Semantic HTML: Can It Help With SEO Success in 2023? - Semantic HTML is the practice of using HTML elements to convey the meaning of the content on a web page.
This is in contrast to using HTML elements only for presentational purposes. There are many benefits to using semantic HTML for SEO. First, it helps search engines understand the content of your page better, which can help your page rank higher in search results. Second, it can help users find the information they are looking for on your page more easily. Third, it can make your page more accessible to people with disabilities. Google's John Mueller, in a recent SEO office hours session, answered a question about whether the semantic HTML <article> element has an impact on Google. John answered the question directly, but there is a fair bit of nuance that was left out of his answer that needs to be addressed. The questioner asked: “Does the use of an <article> HTML tag have an impact on Google? Is it better to put the content of a product listing page in an <article> tag?” John responded: “The <article> HTML element does not have any particular effect in Google Search. This is similar to lots of other kinds of HTML tags. There's so much more to using HTML than just Google Search though! Sometimes there are accessibility or semantic reasons to use a specific kind of markup, so don't only focus on SEO.” John Mueller correctly said that there are “semantic reasons” for some HTML elements. Semantic HTML tells developers (or search engines) what the purpose of that code is. For example, the <footer> element tells developers or search engines that whatever is wrapped within that element is the footer section of the webpage. The semantic HTML element describes the purpose of that section of content. The <main> element makes it easier for search engines to identify where the main content is, making it easier to find what Google's Martin Splitt calls the Centerpiece Annotation. Of course, you need to have good content in the <main> block, and if you do, then you definitely make it easier for the Google crawler. 12.
Google Warns: Content Pruning Can Hurt Your SEO - Gizmodo published an article “exposing” CNET for deleting thousands of pages, as they put it, to “game Google Search.” This, even though content pruning is a fairly common advanced SEO practice. CNET decided which pages to “redirect, repurpose or remove (deprecate)” by looking at metrics such as:
- Pageviews
- Backlink profiles
- Amount of time passed since the last update
From an August 2023 leaked internal memo, we learned that someone at CNET thinks that removing old content “sends a signal to Google that says CNET is fresh, relevant and worthy of being placed higher than our competitors in search results.” It's evident that CNET needs better advice on how SEO works. Deleting content does not signal those three things. Publishing relevant, trustworthy, helpful, quality content for your audience on a technically sound website is what makes you worthy of greater organic search visibility. Here is the bad news for CNET: Google doesn't want to reward sites that are primarily driven by SEO traffic. The helpful content system is meant to reward websites that are primarily creating content for users, not search engines. Also, there is no “penalty” for having old content on your website. Google will not send a manual action notice to CNET, or any site, because you have an article that was published in 2015, or 2007, or 2003, or whatever year. To bolster this point, Google's Danny Sullivan, via his @SearchLiaison account on X, posted: “Are you deleting content from your site because you somehow believe Google doesn't like ‘old' content? That's not a thing! Our guidance doesn't encourage this. Older content can still be helpful, too.” When someone asked Sullivan what to do if the old content has broken links, is no longer relevant, or can't be made more helpful, Sullivan's response was: “The page itself isn't likely to rank well.
Removing it might mean, if you have a massive site, that we're better able to crawl other content on the site. But it doesn't mean we go ‘oh, now the whole site is so much better' because of what happens with an individual page.” The “deleting old content is good for SEO” belief dates back to 2011 (12+ years ago), after Google launched the Panda update. After the launch, a Google employee wrote: “In addition, it's important for webmasters to know that low quality content on part of a site can impact a site's ranking as a whole. For this reason, if you believe you've been impacted by this change you should evaluate all the content on your site and do your best to improve the overall quality of the pages on your domain. Removing low quality pages or moving them to a different domain could help your rankings for the higher quality content.” IMO, maybe that idea made sense back in 2011, but these days one needs to be informed and careful. Google's Danny Sullivan clarified that there is more need for nuance in this particular discussion and tried to make it clear that Google has never advised people to delete content simply because it's old. Other prominent Googlers, including John Mueller and Gary Illyes, have also advised improving content, instead of removing it, whenever possible. Deleting old content can be good for SEO performance. To be clear: deleting old content alone, just because it's old, probably won't help you much. However, deleting, improving, and consolidating content should be part of your SEO strategy because it helps improve your overall content quality – or, as Mueller once put it, “building out your reputation of knowledge on that topic.”
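If you do prune or consolidate, the mechanical side is usually a 301 redirect to the closest surviving page, or a 410 when no substitute exists. A minimal Apache .htaccess sketch, with hypothetical paths:

```apache
# Sketch only; the paths below are made up for illustration.
# 301-redirect a pruned article to its consolidated replacement so that
# visitors and link equity are preserved.
Redirect 301 /2011/old-review-article/ /guides/current-review-guide/

# Pages removed with no close substitute should return 410 Gone rather
# than being redirected somewhere irrelevant like the homepage.
Redirect gone /2009/defunct-product-page/
```

The point is to map each removed URL deliberately, page by page, rather than deleting content in bulk and hoping for a freshness boost.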

#TWIMshow - This Week in Marketing
Ep171 - Google's John Mueller: Programmatic SEO is Spam


Play Episode Listen Later Jul 31, 2023 17:10


Episode 171 contains the notable Digital Marketing News and Updates from the week of July 24-28, 2023. 1. Google and Microsoft Ad Revenue Up: What Does This Mean for Marketers? - Google and Microsoft both reported strong earnings for Q2 2023, with ad revenue growth outpacing overall revenue growth. Google's ad revenue was up 3.3% year-over-year (revenue increased by $1.85 billion from the previous year, while YouTube ad revenue was $7.67 billion, up from $7.34 billion in 2022), while Microsoft's ad revenue was up 8%. Google's ad revenue growth was driven by growth in Search and YouTube. Microsoft's ad revenue growth was driven by growth in Bing and LinkedIn. LinkedIn's overall revenue increased by $197 million, driven by stronger interest in its Talent Solutions recruitment offerings. But LinkedIn saw a decline in overall ad spend in the most recent period, though Microsoft also reduced its overall marketing spend to offset this. On another front, LinkedIn is also still reporting that it has 202M+ users in the United States. Based on LinkedIn's reports from earlier this year, the platform has seen:
- A 22% increase in views of updates in the main feed year-over-year
- A 25% increase in public conversations in the app YoY
- Newsletter creation increasing 10X in 2022
What you really need to know, however, is who in your target audience is active in the app, and for that, you can use LinkedIn's Audience Insights and/or your company page analytics to glean more info on your specific active audience. Taking into account that LinkedIn is seeing ‘record levels' of engagement, these types of data points can help inform your strategy and reach the people that are engaging with your business. 2. Meta's Ad Revenue Up 12% In Q2 - Meta's ad revenue increased by 12% in the second quarter of 2023 compared to the year-earlier period, surpassing expectations. Advertising revenue for Q2 2023 was $31.5 billion, compared to $28.2 billion in the same period in 2022.
This success helped drive total Meta revenue to grow 11% year-on-year from $28.8 billion in Q2 2022 to $32 billion in Q2 2023.Meta's Q2 2023 performance also showed impressive growth in user engagement. Facebook's daily active user count increased by 5% to reach 2.06 billion. The number of monthly active users was 3.03 billion, up 3% year-on-year. The “family of apps” category, which includes Instagram and WhatsApp, saw a 7% year-over-year rise in daily active users, totalling 3.07 billion. The number of monthly active users in this category also increased by 6% to 3.88 billion. This is the first time that Meta has achieved double-digit growth since the fourth quarter of 2021, with much of its strong performance driven by growth in ad revenue. This sends a clear message that digital advertising is bouncing back after it took a setback due to the economic downturn. Now, forecasters are predicting increased ad spend for later on in the year, which will create more opportunities for marketers.3. Google: DSA And GDA Advertisers Should Upgrade To Performance Max - Google announced two voluntary upgrades to Performance Max for Dynamic Search (DSA) and Google Display (GDA) advertisers. The AI-driven platform optimizes performance across channels by allowing users to customize their inputs, making it more responsive to individual business needs. Advertisers have seen an average increase of 15% – 20% in conversions after upgrading to a unified campaign strategy.One of the recent enhancements in Performance Max prevents your campaign from serving ads on traffic related to certain brands. This would offer more control over search results. Advertisers can now specify landing page URLs or exclude certain URLs at the campaign level, offering greater customization via brand settings.Performance Max customizes your entire Search ad to match consumer intent better using automatically created assets. 
These automatic assets will soon appear in the asset reporting table, giving users more control. Performance Max also simplifies the implementation of audience strategies through goals directly integrated into the campaign, like new customer acquisition. Google plans to add re-engagement goals in Performance Max later this year to help retain existing customers. Instead of manually managing complex user lists and exclusions, Google will automatically distinguish between new and existing customers.Although voluntary, the self-upgrade process is highly recommended to start with Performance Max for DSA and GDA campaigns. Advertisers who choose to upgrade can use best practices offered by Google to ensure new Performance Max campaigns are set up for success.4. TikTok Text Posts: A New Way to Express Yourself - TikTok has announced a new feature called "Text Posts" that allows users to share text-only content on the platform. Text Posts can be up to 1000 characters long and can include images, stickers, and hashtags.The introduction of Text Posts is a significant development for TikTok, as it allows users to share more creative and expressive content on the platform. Text Posts could also be valuable for businesses as they present a new avenue for content creation and customer engagement on TikTok.5. Google Merchant Center Adds Customer Support: What You Need to Know - Google Merchant Center is a free service that allows businesses to manage their product listings on Google Shopping. Google has launched a new customer support feature on Merchant Center. The new tool enables retailers to input their customer support information and returns policy. 
Merchant Center is then able to share these details with shoppers without them ever having to leave the program. Google explained the importance of providing customer service information in a statement issued on Merchant Center: "Customer service is important for your business and your customers because it allows you to help customers solve any issues with your product or service. It also helps you build trust with your customers. So it's important that your customers know how they can reach you for support." Here's how Google said retailers can add their customer support information:
- Log into Merchant Center. Once on the Home page, navigate to the Add customer support info card.
- Select Add info.
- Input the following customer support information: customer service telephone number, customer service email address, and customer service web page URL (i.e. a link to a customer service form).
- Select the "Live chat support available" toggle if your business supports this.
- Select the "Chat bot support available" toggle if your business supports this.
- After providing the relevant information, select your preferred contact method.
- Once these steps are complete, scroll down to the bottom of the page and click 'Save'.
6. Google Cracks Down on Automated and AI-Generated Reviews - On August 28, 2023, the "Product Ratings policies" will be updated. The updated Product Ratings policy addresses the issue of automated and AI-generated content in reviews. The new policy states that reviews primarily generated by an automated program or AI application should be flagged as spam using the attribute. The rules clearly state not to submit reviews that stem from conflicts of interest or contain inauthentic remarks. This includes reviews that are paid for, employee-written, or composed by individuals with a vested interest in the product. This is to ensure that users see genuine and helpful reviews when they are making purchasing decisions.
To ensure compliance, Google combines automated and human evaluation methods. Machine-learning algorithms will support this effort while specially trained experts deal with more complex cases requiring context. Actions against violations can range from disapproving violating content or reviews, issuing warnings, or suspending accounts for repeated or severe offenses.You can read the updated product ratings policy here.Businesses can now use a new tool called the "Review Spam Report" to submit spam reviews to Google for review.7. Google Expands Site Names Support for Subdomains: Improve Your Visibility in Search - Google has expanded support for site names on subdomains through the alternateName property, allowing websites to display their preferred site name in search results. This is a significant improvement for websites with multiple subdomains, as it can help them to improve their visibility in search results.Previously, Google would only display the main domain name in search results, even if a website had a preferred site name for a subdomain. This meant that websites with multiple subdomains could be difficult to find in search results, as users would have to know the exact subdomain to visit the website.With the expanded support for site names on subdomains, Google will now display the preferred site name for a subdomain in search results if it is configured correctly. This will make it easier for users to find websites with multiple subdomains, and it could lead to an increase in traffic for these websites.As a reminder, the best way to indicate a preferred site name to Google is to make use of WebSite structured data, as explained in their site name documentation.Find out more in the blog post and in their documentation.8. Are .AI Domains Good for SEO? - There are two kinds of domain names. There are gTLD and ccTLD. A gTLD is a Generic Top Level Domain. These kinds of domains are not associated with any country and can be used worldwide. 
Typical gTLDs are .com, .net, .org, .biz, .xyz and so on. A ccTLD is a TLD (top-level domain) that is associated with a specific country. Google uses ccTLDs to localize the websites that use them to the countries those TLDs are associated with. The .in TLD, for example, helps Google determine which country that domain name is relevant to. This aligns with how people generally expect the Internet to work. .AI domains are country-code top-level domains (ccTLDs) associated with the island of Anguilla. However, they are also becoming increasingly popular with businesses and organizations around the world. In the July 2023 Google SEO Office Hours session, Google's Gary Illyes answered the question of whether there was a downside to using the .AI domain since it's associated with the Caribbean island of Anguilla: "As of early June 2023, we treat .ai as a gTLD in Google Search, so yeah, you can use it for your global presence." Gary's answer calls attention to the importance of verifying whether a domain extension chosen for a website is treated as a ccTLD or a gTLD, because that could make a difference in the website's ability to rank worldwide. There are a few reasons why .AI domains may be a good choice for SEO. First, they are relatively short and easy to remember, which can make them more likely to be typed into search engines. Second, they are associated with the concept of artificial intelligence, which is a growing field of interest. However, it is important to note that .AI domains do not have any inherent SEO benefits. Whether or not they help your website rank in search results will depend on a number of other factors, such as the quality of your content and the number of backlinks you have. Google publishes the list of ccTLDs that it treats as generic top-level domains. The list shows that ccTLDs like .eu and .asia are treated like gTLDs. Other international domains that are treated like gTLDs are .ad, .co, .fm, .tv and of course .ai. 9.
Google's John Mueller: Programmatic SEO is Spam - Google's John Mueller has called out "programmatic SEO" as a form of spam. This type of SEO involves creating large numbers of low-quality landing pages that are designed to rank for specific keywords.Mueller said that programmatic SEO is often used to target "hyper-specific" keywords, such as "songs about dogs" or "maps of schools." These keywords are often not very competitive, so it is easy to rank for them with low-quality content.Mueller warned that programmatic SEO can lead to penalties from Google. He said that Google is "trying to get rid of this stuff" and that it is "not something that we want to see in the search results."
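The site name markup mentioned in item 7 above (WebSite structured data with an alternateName property) can be sketched in miniature. This is an illustration only: the helper function, domain, and names below are hypothetical, not from the episode or Google's documentation.

```python
import json

def site_name_jsonld(url, name, alternate_name):
    """Build WebSite structured data of the kind Google reads for site names.

    The alternateName property supplies a preferred short or alternate
    site name; the markup belongs on the home page of the (sub)domain
    whose displayed name you want to influence.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "WebSite",
        "name": name,
        "alternateName": alternate_name,
        "url": url,
    }
    return json.dumps(data, indent=2)

# The result would be embedded in the page head inside a
# <script type="application/ld+json"> tag.
print(site_name_jsonld("https://docs.example.com/", "Example Docs", "ExDocs"))
```

See Google's site name documentation (linked in the episode notes) for the authoritative field list; this sketch only shows the shape of the markup.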

#TWIMshow - This Week in Marketing
Ep170-Should You Match Google's Rewritten Titles?

#TWIMshow - This Week in Marketing

Play Episode Listen Later Jul 24, 2023 18:15


Episode 170 contains the notable Digital Marketing News and Updates from the week of July 17-21, 2023. 1. Sell Directly on TikTok with WooCommerce - WooCommerce and TikTok have partnered to allow WooCommerce merchants in the United States to sell directly on TikTok. This new program, currently in beta, gives store owners access to an audience of over 150 million, 61% of which engage in ecommerce behavior. To participate in the program, merchants must have a WooCommerce store in the United States and be approved by TikTok. Once approved, merchants can create a TikTok Shop and start selling their products. TikTok Shops offer a number of features that can help merchants sell more products, including the ability to create product catalogs and tags, run product ads, and track sales and performance. The WooCommerce and TikTok partnership is a great opportunity for merchants to reach a new audience and grow their businesses. 2. TikTok Launches Ads Transparency Library: See Who's Advertising What - TikTok has launched a new Ads Transparency Library (Commercial Content Library), which provides users with more information about the ads they see on the platform. The library includes information such as the advertiser, the target audience, and the creative used in the ad. The Ads Transparency Library is a welcome addition to TikTok, as it gives users more control over the ads they see. It also helps to increase transparency and accountability for advertisers. Having access to this data can give marketers a better understanding of campaign performance and the TikTok algorithm. This key information will help reveal what creatives work, what ideas don't work, and more. Having this data at hand will enable marketers to make more informed decisions, potentially maximizing reach and ROI. Access to TikTok's Commercial Content Library is available to everyone globally. However, only data from Europe is available.
TikTok said that its team is already working on ways to include advertising data from more countries, such as the U.S., in the future. But a date for this release is yet to be confirmed. Read TikTok's official blog post to find out more about its ads transparency library. 3. Meta: How to Integrate Your Brand with Threads - Social media app Threads, Meta's new Twitter alternative, has seen a nearly 70% decline in the number of daily active users since its July 7 peak, according to market intelligence firm Sensor Tower, spoiling its explosive launch just two weeks ago and paling in comparison to Twitter. Now Meta is providing guidance to brands on how to integrate with Threads, its new text-based conversation app. The guidance includes tips on how to create "epic entrances," engage followers, and run challenges or contests. Meta suggests that brands make a grand entrance on Threads by combining images, memes, and open-ended questions to announce their arrival. They should also engage followers by creating interactive content, such as polls or quizzes. Additionally, brands can run challenges or contests to encourage users to create and share content on Threads. The guidance also emphasizes the importance of using puns and talking about Threads in order to promote the app. Finally, Meta suggests that brands explore Threads' existing tools, such as stickers and GIFs, to create engaging content. 4. 30 New Ecommerce Metrics in GA4: Get More Insights into Your Shopping Performance - Google Analytics 4 (GA4) has just announced an expansion of its ecommerce measurement capabilities, adding 30 new dimensions and metrics. These new metrics provide more granular data on items, promotions, and shopping behavior, making it easier for marketers to track and analyze their ecommerce performance. Some of the new metrics include: Item Name: The name of the product that was purchased. Brand: The brand of the product that was purchased.
Category: The category of the product that was purchased. Promotion Name: The name of the promotion that was used to purchase the product. Checkout Step: The step in the checkout process where the purchase was made. Gross item revenue: the total revenue from items only, excluding tax and shipping. Gross purchase revenue: the total revenue from purchases made on your website or app. Refund amount: the total amount from refunds given on your website or app. These new metrics can be used to answer questions such as: What are the most popular products? What brands are performing well? What promotions are driving sales? Which checkout steps are causing the most abandonment? The addition of these new metrics is a significant boost for ecommerce marketers, providing them with more data to track and analyze their performance. These changes to GA4 make it easier to see meaningful ecommerce data. Marketers will no longer have to build custom reports to access key revenue metrics. 5. Google Updates Misrepresentation Policy: What You Need to Know - Google has updated its Misrepresentation policy with detailed information on how marketers can build trust. The document advises what steps and precautions brands should take to make sure their products and offers are eligible to be served in Search. The updated policy includes new requirements for brands to provide clear and transparent information about their products and offers. This includes information about the product's availability, pricing, and shipping. Brands are also required to provide accurate and up-to-date information about their reviews and testimonials. For each issue specified, Google provided specific instructions that brands should follow:
Business identity:
- Ensure that the official business name is provided and that there is consistency between the registered business name and domain name.
- Make sure the brand's website features an 'About Us' page, as this establishes authenticity and helps customers understand the brand's unique journey.
- Link out to the brand's social media profiles from the website so that customers can follow those accounts should they wish.
Transparency:
- Make sure website content and messaging is completely clear, and include details regarding shipping, returns, and privacy policies.
- Ensure honesty and transparency about the brand's business model and how the company operates.
Online reputation:
- Display honest reviews and testimonials about the brand's products and services to help customers understand how to use them.
- Feature any badges or seals of approval from official third-party sources.
- Clearly display how customers can get in touch.
- Be sure to tell customers if the brand publishes a blog.
- Make sure customers know if the brand was mentioned in a third-party article.
Professional design:
- Make sure that the brand's website has an SSL certificate to reassure customers that their sensitive data is stored securely.
- The brand's website should be easy to navigate and shouldn't contain any unnecessary redirects or links to broken pages.
- Try to avoid placeholders where possible, as this gives Google and the customer the impression that the website is still under construction and not yet ready for SERPs.
Google explained that there are several steps brands can take to help it understand their business faster and more accurately:
- Create and verify a Google Business Profile.
- Share up-to-date information in the Merchant Center under the Business information settings.
- Link relevant third-party platforms to Merchant Center.
- Follow Google's SEO guidelines to ensure a strong customer experience is provided.
- Opt into Google Customer Reviews or other third-party review services to improve eligibility for seller ratings.
- Match product data in the product feed with your website to make sure that customers see the same information in both places.
Google is also taking steps to crack down on misrepresentation in the Merchant Center. Merchants who violate the policy may have their products removed from Google Search and Shopping results. Read Google's "Building Trust with your Customers" guide for more information on its Misrepresentation policy. 6. Avoid Spam Risks with Your Domain Name - Google advises against choosing cheap top-level domains (TLDs), such as .xyz or .club, due to the increased risk of spam. These domains are often used by spammers because they are inexpensive and easy to register. Google's Search Relations team (John Mueller, Gary Illyes, and Martin Splitt), in the Jul 20, 2023 podcast episode, recommended that website owners choose TLDs that are well-known and reputable. They also suggest considering branding and marketing factors, not just SEO, when choosing a TLD. The team debunked the misconception that having a TLD matching your keywords provides an inherent SEO advantage. When Splitt asked if owning a domain like fantastic.coffee could offer any SEO benefits for a coffee shop, Illyes responded with a definitive "No." For more on website domain best practices, check out the full episode of Google's podcast. P.S.: I covered this in Episode 163. Now Google is talking about the same issue. 7. Your Domain Name Matters: Don't Forget the Branding - Another update from the latest episode of Google's Search Off The Record podcast. Google's John Mueller has advised that domain name selection should prioritize long-term branding over keyword-centric SEO strategies. Keywords in domain names do not impact Google search rankings, but they can influence user behavior.
Therefore, it is more important to choose a domain name that is memorable and easy to type, even if it does not contain any keywords. Here are some additional tips from Mueller for choosing a domain name: choose a name that is easy to pronounce and remember; avoid hyphens or numbers; keep it short and concise; and make sure it is available in all relevant top-level domains (TLDs). 8. 301 vs. 404: Which is Better for SEO? - When a web page is deleted or moved, you can either redirect its URL to a new page with a 301 status code or let the old URL return a 404. A 301 redirect tells search engines that the page has been permanently moved to a new location, while a 404 error code tells search engines that the page cannot be found. So, which is better: 301 or 404? Google's Gary Illyes answered the question of which status code is "less harmful" in July's SEO Office Hours. Per Illyes, it actually depends on the case, though 301 redirects are generally considered better for SEO than 404 error codes. If the page is missing because two sites were merged, a publisher can 301 redirect old or outdated pages to new pages that are similar in topic. However, there are some cases where a 404 error code may be preferable. For example, if a page has been deleted because it was spam or malicious, a 404 error code will prevent search engines from indexing the page. 9. Good Page Experience is Not Enough for SEO - Google's John Mueller recently clarified that having a good page experience is not a silver bullet for SEO. In other words, having a website that loads quickly, is mobile-friendly, and has no errors will not necessarily improve your search rankings if your content is not high-quality or relevant to the user's search intent. Mueller's comments are a reminder that SEO is a complex process that involves a variety of factors.
While page experience is an important factor, it is not the only one. If you want to improve your search rankings, you need to focus on creating high-quality content that is relevant to your target audience. 10. Google Doesn't Favor AI-Generated Content - Google's Search Liaison, Danny Sullivan, has clarified that Google does not give any special ranking boost to AI-generated content. In fact, he says that there is "lots of AI content on the web that doesn't rank well and hence isn't well received" by Google Search. Sullivan's comments come after a recent article in Vox Media claimed that AI content is "currently well-received by search engines." However, Sullivan says that this is not the case, and that Google's search algorithms are designed to rank content based on its helpfulness and quality, not on how it was produced. This means that AI-generated content can still rank well in Google Search, but only if it is actually helpful and informative. If it is not, it is likely to be ignored by Google's algorithms. 11. Should You Match Google's Rewritten Titles? - Google often rewrites the titles of pages in the search results, most of the time removing the site name from the title. To some site owners, this seems to indicate that Google sees the site name as redundant and that perhaps they should just drop the site name from the title tag altogether. Google rewrites titles for a variety of reasons, such as to make them more concise, to make them more relevant to the user's search intent, or to avoid duplicate titles. There is no consensus on whether or not it is a good idea to match the titles of your pages to the titles that Google rewrites. Some people believe that it is important to match the titles in order to improve your click-through rate (CTR).
Others believe that it is not important to match the titles, and that you should focus on creating high-quality content that will attract users to your site. Google's John Mueller offered his recommendation on this topic by writing: “I would not assume that a rewritten version is better (for SEO or for users), and I'd recommend keeping your site name in there — because it makes it easier to confirm a site name that we show above the title. Also, it's a well-known pattern, so I wouldn't change it just for Google.” The World Wide Web Consortium (W3C) says that the purpose of the title element is to define what the webpage (referred to as a document) is about, and Google largely follows those standards. Google's official title element recommendations (on Google Search Central) echo what the W3C recommends in a little more detail. Google advises that title elements should be descriptive and concise, not vague. Lastly, Google recommends concisely branding the title: using the site name is fine, but repeating a marketing slogan across the entire site is not necessarily concise. Why does Google rewrite titles? Years ago, many SEO sites recommended stuffing keywords into the title tag rather than describing what the page is about, and eventually Google caught on to the keyword stuffing. Obviously, if a keyword is relevant to what the document is about, you can put it in there if you want. Another reason Google rewrites titles is that the description of the entire page is not always appropriate. For example, Google often ranks a webpage for what is essentially a subtopic of the main topic of the webpage. This happens when Google ranks a webpage for a phrase that appears in the middle of the document. So should you match Google's title rewrite? In my opinion it is not a good idea, because Google might be ranking the page for a subtopic.
If you want a reality check about the title element, give ChatGPT a try by inputting the text of the document and asking it to summarize it in ten words. Then again be careful because ChatGPT can spit out incorrect information.
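The 301-vs-404 guidance in item 8 above boils down to a routing rule: redirect permanently when a topically equivalent page exists, and return a 404 (or 410 "Gone") when the content is removed for good. A minimal sketch, with made-up paths purely for illustration:

```python
# Map of retired URLs to their closest equivalent on the merged site.
# All paths here are hypothetical examples.
REDIRECTS = {
    "/old-blog/seo-tips": "/blog/seo-tips",
    "/old-blog/title-tags": "/blog/title-elements",
}

# Pages deleted deliberately (e.g. spam) that should never resolve again.
GONE = {"/old-blog/spammy-page"}

def respond(path):
    """Return (status_code, redirect_target_or_None) for a removed page."""
    if path in REDIRECTS:
        # Permanent move: passes the old URL's signals to the new page.
        return 301, REDIRECTS[path]
    if path in GONE:
        # Explicitly gone; a plain 404 also works here.
        return 410, None
    # Not found, and no topically equivalent page to point to.
    return 404, None

print(respond("/old-blog/seo-tips"))    # (301, '/blog/seo-tips')
print(respond("/old-blog/spammy-page")) # (410, None)
```

In a real site this mapping would live in server config (e.g. redirect rules) rather than application code; the point is the decision, not the mechanism.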

SEO Podcast Unknown Secrets of Internet Marketing
Episode 576: Algorithmic Brand Optimization: Interview with Google's Gary Illyes

SEO Podcast Unknown Secrets of Internet Marketing

Play Episode Listen Later Jul 18, 2023 76:01


This is an old interview with Google Webmaster Trends Analyst Gary Illyes: Understanding Algorithmic Preferences: Incorporating Branding, E-A-T Concepts, and Optimization for Search Engines and Social Media Platforms. We deemed it immensely valuable to revisit this information from the archives, particularly in light of all the algorithmic changes that have been happening in the SEO industry over the last 36 months. Additionally, we wish to inform everyone of our intention to pivot the direction of the podcast, aiming to offer enhanced clarity regarding the evolving SEO landscape. In light of this, we are planning to revamp the format for future episodes.
---
The Unknown Secrets of Internet Marketing podcast is a weekly podcast hosted by internet marketing experts Matt Bertram and Chris Burres. The show provides insights and advice on digital marketing, SEO, and online business. Topics covered include keyword research, content optimization, link building, local SEO, and more. The show also features interviews with industry leaders and experts who share their experiences and tips. Additionally, Matt and Chris share their own experiences and strategies, as well as their own successes and failures, to help listeners learn from their experiences and apply the same principles to their own businesses. The show is designed to help entrepreneurs and business owners become successful online and get the most out of their digital marketing efforts.
Please leave us a review if you enjoyed this podcast: https://g.page/r/CccGEk37CLosEB0/review
Find more great episodes here: bestseopodcast.com/
Follow us on:
Facebook: @bestseopodcast
Instagram: @thebestseopodcast
TikTok: @bestseopodcast
LinkedIn: @bestseopodcast
Powered by: ewrdigital.com
Hosts: Matt Bertram & Chris Burres
Disclaimer: For educational and entertainment purposes only.

Edge of the Web - An SEO Podcast for Today's Digital Marketer
604 | News from the EDGE | Week of 06.26.2023

Edge of the Web - An SEO Podcast for Today's Digital Marketer

Play Episode Listen Later Jun 28, 2023 36:54


Jacob's off this week, and the two annoying ones go at it…..at the News from the EDGE, of course! We take on AI warnings from Gary Illyes, and news from Google about things new and gone… And to tie this all up in a bow, we cover the fact that a study shows that developers who are TRAINING AI are actually cheating and using AI to train AI. Yeah - that's happening. Nothing could go wrong there. Giving the news from the upcoming Skynet, for the end of June 2023.
News from the EDGE:
[00:05:29] There's something new in Google Search Console
[00:09:55] EDGE of the Web Title Sponsor: Site Strategics
[00:10:26] There's something gone from GA4
[00:13:46] And YouTube says - “No More Confusion” from Fan Accounts
[00:16:02] EDGE of the Web Sponsor: Inlinks
AI Blitz:
[00:17:05] Google's Gary Illyes: Don't Use AI & LLMs To Diagnose SEO Issues
[00:20:00] AI Topics From Google Search Central Live Tokyo
[00:22:46] Study shows developers training AI are using AI
AI Tools:
[00:25:22] 5 Useful ChatGPT Plugins
[00:27:20] Automatically Generate FAQ content
[00:29:08] Turn Your Data into Meaningful Narrative
Barry Blast from Search Engine Roundtable:
[00:30:37] Google Search Ranking Volatility Exploding Today - Where Is The Confirmed Algorithm Update?
[00:32:21] Google: Launching A New Domain Before Migrating Content To It Reduces Some Risk
Thanks to our sponsors!
Site Strategics https://edgeofthewebradio.com/site
Inlinks https://edgeofthewebradio.com/inlinks
Follow Us:
Twitter: @ErinSparks
Twitter: @MordyOberstein
Twitter: @TheMann00
Twitter: @EDGEWebRadio
#StandwithUkraine edgeofthewebradio.com/ukraine

#TWIMshow - This Week in Marketing
Ep166 - Microsoft Launches AI-Powered Advertising Tool That Can Help You Increase Your Conversion Rates

#TWIMshow - This Week in Marketing

Play Episode Listen Later Jun 26, 2023 16:15


Episode 166 contains the notable Digital Marketing News and Updates from the week of June 19-23, 2023. The show notes for this episode were generated using generative AI, but as always, I curated the articles for the show. 1. YouTube's New Thumbnail A/B Testing Tool - YouTube has launched a new thumbnail A/B testing tool that allows creators to test different variations of their thumbnail images to see which one performs best. The tool is currently in live testing with a small group of creators, but it is expected to be rolled out to more creators in the coming months. The thumbnail A/B testing tool is a valuable addition to YouTube's creator toolkit. By testing different thumbnail images, creators can ensure that they are using the most effective thumbnail to attract viewers and maximize their video views. 2. TikTok Expands TikTok Shops to All Businesses in the US, UK, Canada, and France - TikTok is expanding access to its TikTok Shops eCommerce program to all businesses in the US, UK, Canada, and France. The program allows businesses to sell products directly through TikTok, and it has been growing rapidly in popularity. In the first quarter of 2023, TikTok Shops generated over $1 billion in GMV. To join the TikTok Shops program, businesses need to have a TikTok account and a Shopify or Ecwid store. Once they are approved, they can start adding products to their TikTok Shop and linking their product catalog to TikTok. Businesses can then tag products in their videos and create shopping posts that allow users to purchase products directly from TikTok. The expansion of TikTok Shops to all businesses in these four countries is a major step for the platform. It will allow more businesses to reach a wider audience and sell more products through TikTok. This is likely to further boost the growth of TikTok Shops in the coming months and years. 3. Instagram Now Lets You Download Public Reels!
- Instagram is now rolling out the ability for users to download publicly posted Reels content to their camera roll. This means that users can now save Reels to their devices for offline viewing, sharing, or editing. To download a Reel, simply tap on the share icon and then select the "Download" option.There are a few limitations to this feature. First, only Reels from public accounts can be downloaded. Second, creators can opt out of enabling downloads of their content in their Account Settings. Finally, some users have reported audio issues with some Reels content, which could be linked to Meta's music licensing agreements.Overall, this is a welcome addition to Instagram that will give users more flexibility with how they interact with Reels content. It will also make it easier for users to share Reels with others, even if they don't have Instagram.4. LinkedIn's New AI Image Detector Catches 99% of Fake Profiles - LinkedIn has developed a new AI image detector that has a 99% success rate in catching fake profile photos. The detector uses a variety of techniques to identify fake images, including analyzing the image's pixels, comparing it to other images in its database, and looking for inconsistencies in the image's metadata.The advent of AI-generated profile images has made it easier to create fake LinkedIn profiles, which has exacerbated an already huge problem. In the first half of 2022, LinkedIn detected and removed 21 million fake accounts.Anecdotal evidence suggests that LinkedIn's AI image detector is working well. An affiliate marketer who deployed fake LinkedIn profiles said that their success rate dropped significantly after LinkedIn implemented the new detector.5. Meta Expands Reels Ads To Instagram And Tests AI Features - Meta is expanding Reels ads to Instagram and testing AI features to boost engagement and broaden advertiser reach. Previously, Reels ads were only available on Facebook. 
Now advertisers can run ads between Reels on Instagram as well. Meta is also introducing app promotion ads to Reels to boost app downloads, and testing AI to optimize the music in single-image Reels ads, aiming for better viewer engagement.

6. Meta's New Speech-to-Text Translation Tool Can Translate Text into Audio in 6 Languages - Meta has announced Voicebox, an AI system that can translate text to audio in a variety of styles and voices. It is based on a new method called Flow Matching, which lets it synthesize speech with fewer learning and processing requirements than comparable systems. Voicebox handles six languages - English, French, German, Spanish, Italian, and Portuguese - and can also remove noise, edit content, and transfer audio style. The system is still under development, but Meta plans to make it available to developers in the future.

7. GA4 Update: Now You Can Control Conversion Credit - Google Analytics 4 now lets you select which channels are eligible to receive conversion credit for web conversions shared with Google Ads. You can give credit to organic and paid channels even if the last non-direct click was not from a Google ad. This helps you get a more accurate picture of your campaigns' impact: if you're running a paid campaign and organic traffic is also converting, crediting both channels helps you allocate budget more effectively and improve ROI.

8. Google Sues Rank and Rent Marketer: What You Need to Know - Google is suing an online marketer for violating its terms of service and engaging in activities that mislead users and violate federal and state law.
The marketer, a member of a public Facebook group called "Rank and Rent – GMB Strategy & Domination," is accused of creating fake businesses and fake websites and associating them with Voice over Internet Protocol (VoIP) phone numbers whose area codes match the fake businesses' supposed locations. Google alleges the marketer has been associated with over 350 fake Business Profile listings since mid-2021.

9. Google Launches New INP Report in Search Console to Help You Prepare for the FID to INP Transition - Google has launched a new Search Console report to help site owners prepare for the replacement of First Input Delay (FID) with Interaction to Next Paint (INP) as a Core Web Vital. The report shows how well your site performs on the INP metric. INP measures how quickly a page responds to user interactions, from the user's input until the next frame is painted, making it a fuller measure of responsiveness than FID, which captures only the delay before the browser can begin processing the first interaction. For each page, the report shows the average INP, the percentage of users who experienced an INP of 100 milliseconds or less, and the percentage who experienced an INP of 300 milliseconds or less. Google has not announced a specific date for the FID-to-INP transition, but it is expected in March 2024; use the report now to prepare your site and ensure your users have a good experience.

10. Google Postpones Data-Driven Attribution Switch: What You Need to Know - Google has postponed the switch to data-driven attribution in Google Ads from June to mid-July 2023.
Until mid-July, first click, linear, time decay, and position-based attribution models will still be available for new conversion actions. Once the switch is made, these models will be removed from all Google Ads reporting, and advertisers who want to keep using them will have to switch manually. Google postponed the change to give advertisers more time to prepare: data-driven attribution is more complex than the other models and requires more data to produce accurate results.

11. Google Ads Now Lets You Track Store Sales and Optimize Your Bids - Store sales reporting and bidding are now available across Performance Max campaigns in Google Ads, so advertisers can measure total sales wherever customers prefer to shop and optimize their bids for in-store revenue. To use the feature, upload and match your transaction data to Google; once done, you'll see how your ads translate into offline purchases. Store sales reporting and bidding let you measure the true value of your ads in terms of in-store sales, optimize bids for in-store revenue, and gain insight into how your ads drive offline purchases. If you're using Performance Max campaigns, take advantage of this feature to track your ads' impact on offline sales.

12. Google's John Mueller Warns: Custom Elements in Head Can Hurt SEO - Google's John Mueller has advised against using custom elements in the head of a web page. Custom elements are HTML tags that are not part of the standard HTML specification.
They can add new functionality to web pages, but they can also disrupt how Google renders and indexes those pages. Mueller's warning came after a user on Twitter asked whether it was technically valid to have a custom element in the document head. Mueller responded that using custom elements in the head likely breaks page rendering in Google Search: Google's crawlers may not recognize them, which could cause the page to be indexed incorrectly or not at all. As an alternative, Mueller recommends JSON-LD, a lightweight markup language for adding structured data to web pages. That data helps Google better understand a page's content, which can improve its ranking in search results.

13. Your Homepage Is Not Indexed? Fix It with These 3 Steps! - Google's John Mueller explained in a recent Twitter thread why a homepage might not be indexed by Google: it may not be linked from any other pages on the site; it may be blocked from crawling by robots.txt or a meta robots tag; it may contain errors that prevent Googlebot from crawling it successfully; or it may not have enough content to be indexed. Mueller also said that crawl budget - the time and resources Googlebot has available to crawl websites - is not usually a factor in why homepages are not indexed, though on a site with a very large number of pages it could become a limiting one. To fix an unindexed homepage, try the following: add links to the homepage from other pages on the site; remove any robots.txt rules or meta robots tags blocking it from being crawled; fix any errors preventing Googlebot from crawling it; and add more content to the homepage.
If you are still having trouble getting your homepage indexed, use Google Search Console to troubleshoot the issue.

14. Launch a New Domain Before Migrating Content to Reduce SEO Risk - Google's John Mueller says it can be beneficial to launch a new domain before migrating your site to it, reducing some of the risks of migration. For example, launching the new domain and building links to it first helps establish its authority before you start migrating content, and if anything goes wrong during the migration, your old domain stays up and running while you troubleshoot. Launching early can reduce the risk of losing traffic during the migration, build up the new domain's authority in advance, and give you a chance to test the new domain and make sure it is working properly.

15. Google's Gary Illyes: Don't Use AI for SEO - Google's Gary Illyes recently warned SEOs against using large language models (LLMs) and artificial intelligence (AI) to diagnose SEO issues. These tools are not yet sophisticated enough to provide accurate insight into the complex factors that affect search rankings; instead, Illyes recommends traditional SEO methods, such as manual analysis of site data and Search Console reports.

16. Google Ads Is Phasing Out DSAs: Are You Ready for PMax? - Google Ads is asking advertisers to "upgrade" from dynamic search ads (DSAs) to Performance Max (PMax) campaigns, as Google gradually phases out DSAs in favor of PMax, a more sophisticated campaign type that can reach a wider audience and generate more conversions. When you upgrade a DSA campaign to PMax, your existing assets, settings, and budget are used to create a new PMax campaign.
You can then continue to optimize the PMax campaign as you would any other Google Ads campaign. Some PPC experts believe this is the beginning of the end for DSAs and that Google will eventually require everyone to switch to PMax. If you're currently running DSA campaigns, it's a good idea to start planning your transition now.

17. Microsoft Ads Renames Platforms and Integrates AI - Microsoft is renaming several of its advertising platforms and integrating AI into its ad platforms to help advertisers automate campaign creation and improve campaign management efficiency. The name changes: Microsoft Advertising becomes Microsoft Ads, Xandr becomes Microsoft Audience Network, and PromoteIQ becomes Microsoft Retail Media. Microsoft is also adding AI-powered predictive targeting to its ad platform, letting advertisers target potential customers more effectively.

18. Microsoft Launches AI-Powered Advertising Tool That Can Help You Increase Your Conversion Rates - Microsoft has announced Predictive Targeting, an AI-powered advertising tool that uses machine learning to help advertisers reach new, receptive audiences and drive higher conversion rates. Now available to all advertisers on the Microsoft Audience Network, it uses data points such as website traffic, search history, and social media activity to build a profile of each user and predict which ads each user is most likely to click. Predictive Targeting is available through Microsoft's advertising platform and can target ads across search, display, and video. For more information on the feature, read Microsoft's complete guide.
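One of the homepage checks from item 13 - whether robots.txt is blocking Googlebot from crawling the page - can be scripted with Python's standard library. This is a minimal sketch; the rules shown are illustrative placeholders, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

def googlebot_can_fetch(robots_txt: str, path: str = "/") -> bool:
    """Return True if the given robots.txt rules allow Googlebot to crawl the path."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", path)

# A blanket disallow blocks the homepage from crawling...
print(googlebot_can_fetch("User-agent: *\nDisallow: /"))  # False
# ...while an empty Disallow leaves it crawlable.
print(googlebot_can_fetch("User-agent: *\nDisallow:"))    # True
```

Checking for a blocking meta robots tag is then just a matter of looking for a `noindex` directive in the homepage's head.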

Search Buzz Video Roundup
Search News Buzz Video Recap: Google Heated Search Results, Using AI For SEO, INP Report, Bing Chat & Google Ads Updates


Jun 23, 2023


This week the heated and incredibly volatile Google search results continued throughout the whole week, and then exploded this morning. Google's Gary Illyes said we should avoid using LLMs and AI for diagnosing SEO issues. Gary also...

#TWIMshow - This Week in Marketing
Ep164 - Apple Amps Up Privacy: A Glimpse at iOS 17 and macOS Sonoma


Jun 12, 2023 · 27:02


Episode 164 contains the notable Digital Marketing News and Updates from the week of June 5 - 9, 2023. The show notes for this episode were generated using generative AI; as always, I curated the articles for the show.

1. Google's Structured Data Validator vs Schema.org - During the June 2023 Google SEO office hours, Google's Martin Splitt answered a question about structured data validation and why Google's validator can show different results than Schema.org's. Both Google and Schema.org offer tools for validating structured data. Google's tool validates the markup and also reports whether it qualifies for rich results - the enhanced search listings that stand out on the results page. The Schema.org Schema Markup Validator only checks whether the structured data is valid against the official standards. Per Splitt: "Schema.org is an open and vendor-independent entity that defines the data types and attributes for structured data. Google, as a vendor however, might have specific requirements for some attributes and types in order to use the structured data in product features, such as our rich results in Google Search. So while just leaving out some attributes or using some type of values for an attribute is fine with Schema.org, vendors such as Google and others might have more specific requirements in order to use the structured data you provide to actually enhance features and products." In short, Google's validator checks whether the structured data Google requires for potentially showing a page in enhanced search results is valid, while the Schema.org validator only checks conformance to the standard and has nothing to do with how Google uses structured data. You can watch the June SEO office hours here.
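Splitt's distinction can be illustrated with a small sketch: markup that passes Schema.org validation can still omit attributes a vendor expects for rich results. The expected-field list below is a made-up example for demonstration, not Google's actual rich-result requirements.

```python
import json

# Hypothetical vendor requirements for illustration only -- not Google's
# authoritative list of rich-result fields.
VENDOR_EXPECTED = {"Product": {"name", "image", "offers"}}

def missing_vendor_fields(markup: dict) -> set:
    """Return vendor-expected attributes absent from one JSON-LD object."""
    expected = VENDOR_EXPECTED.get(markup.get("@type"), set())
    return expected - markup.keys()

# Valid Schema.org markup that a stricter vendor validator could still flag.
product = json.loads("""{
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget"
}""")
print(sorted(missing_vendor_fields(product)))  # ['image', 'offers']
```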
2. Google's Latest Search Console Update Makes It Easier to Fix Video Indexing Issues - Google has updated Search Console's video indexing reports to give more precise problem descriptions and actionable fixes, helping boost your videos' visibility in Google Search. Previously, users saw a generic "Google could not identify the prominent video on the page" error. Now Google breaks the problem down: Video outside the viewport - if the video isn't fully visible when the page loads, reposition it so the entire video lies within the renderable area of the page. Video too small - increase its size: the height should exceed 140px, and the width should be greater than 140px and make up at least one-third of the page's width. Video too tall - if the video is taller than 1080px, decrease its height to less than 1080px. You may still see some old error messages over the next three months while Google phases them out in favor of the new, more detailed notifications. Following these guidelines helps maximize your video's prominence on Google Search and enhance user engagement. Happy optimizing!
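The thresholds in the "too small" and "too tall" errors are concrete enough to check programmatically. Here is a rough sketch using only the numbers quoted above; the function name and page-width parameter are my own, not part of Google's tooling.

```python
def video_size_issues(width_px: int, height_px: int, page_width_px: int) -> list:
    """Flag violations of the video size guidelines quoted above."""
    issues = []
    if height_px <= 140:
        issues.append("too small: height must exceed 140px")
    if width_px <= 140 or width_px < page_width_px / 3:
        issues.append("too small: width must exceed 140px and be at least 1/3 of page width")
    if height_px > 1080:
        issues.append("too tall: height must be less than 1080px")
    return issues

print(video_size_issues(640, 360, page_width_px=1280))  # [] -- within guidelines
print(video_size_issues(120, 120, page_width_px=1280))  # both "too small" flags
```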
3. Navigating the World of Domains: A Google Insider's Advice - Let's delve into domain names and how they affect your business's digital reach, guided by insights from Google Search Advocate John Mueller. Mueller recently clarified the differences between generic top-level domains (gTLDs) and country-code top-level domains (ccTLDs), following Google's decision to reclassify .ai domains as gTLDs, breaking their previous association with Anguilla. In essence, gTLDs (such as .com, .store, .net) are not tied to a specific geographic location, while ccTLDs (like .nl for the Netherlands, .fr for France, .de for Germany) are country-specific. Mueller's guidance: if your business primarily targets customers in one country, a ccTLD might be the way to go; if you're aiming for a global customer base, a gTLD could be the better option. He also highlighted user perception - will users click a link they believe is meant for another country's audience? - and cautioned against TLDs that may appear spammy, as they can harm your site's credibility. The choice of a domain name is not just a technical decision but a business one with significant impact on your online presence.

4. Google's Verdict on the Impact of Security Headers on Search Rankings - In your quest for a secure website, you may have come across HTTP headers: bits of metadata about a webpage offered to browsers and web crawlers. The best-known pieces of response metadata are statuses like the infamous 404 error and the 301 redirect. A subset of headers, known as security headers, plays a critical role in fortifying your site against malicious attacks.
For instance, the HSTS (HTTP Strict Transport Security) header mandates that a page be accessed only via HTTPS, and ensures the browser remembers this preference for future visits. A 301 redirect can guide browsers from HTTP to HTTPS, but it leaves your site exposed to potential man-in-the-middle attacks on the first insecure request; an HSTS header makes the browser request the HTTPS version directly, effectively bolstering site security. Google's John Mueller was recently asked whether integrating security headers like HSTS could influence ranking. His response was clear: the HSTS header does not impact Google Search. Its purpose is to guide users to the HTTPS version of a site; deciding which version of a page to crawl and index is handled by canonicalization, which doesn't rely on headers like HSTS. So while security headers won't boost your search ranking, their importance in maintaining a secure browsing experience cannot be overstated. A secure website is a trusted website, and trust forms the foundation of any successful online presence.

5. Debunking 'Index Bloat': Google's Take on Effective Web Page Indexing - In a recent episode of Google's Search Off The Record podcast, the Search Relations team tackled web page indexing and the much-discussed "index bloat" theory in the SEO community. The theory refers to search engines indexing pages that aren't beneficial in search results - filtered product pages, printer-friendly versions, internal search results, and more - and its advocates argue such pages can confuse search engines and hurt rankings. They link the issue to crawl budget, the number of URLs a search bot will crawl during each visit.
The theory proposes that index bloat wastes crawl budget, with search bots spending time and resources gathering unneeded data. However, Google's John Mueller challenged this, stating there is no known concept of index bloat at Google: the company doesn't set an arbitrary limit on the number of indexed pages per site. His advice to webmasters is not to worry about excluding pages from Google's index and instead to focus on creating and publishing useful content. Supporters of the theory point to accidental page duplication, incorrect robots.txt files, and poor or thin content as causes, but Google's position is that these are ordinary SEO issues requiring attention, not symptoms of a non-existent "index bloat." Some have suggested using Google Search Console to detect index bloat by comparing the actual number of indexed pages to what's expected; per Google, that comparison is simply routine website management and monitoring, not evidence of a problem. The emphasis should be on ensuring the pages submitted for indexing are valuable and relevant, enhancing the overall user experience.

6. Controlling Googlebot: Decoding Google's Search Relations Podcast Insights - In the latest Search Off The Record episode, Google's John Mueller and Gary Illyes delved into two questions: blocking Googlebot from crawling certain parts of a webpage, and preventing Googlebot from accessing a website completely. Asked how to stop Googlebot from crawling specific sections of a page, such as the "also bought" areas on product pages, Mueller emphasized that there is no direct method.
"It's impossible to block crawling of a specific section on an HTML page," he clarified. Mueller did propose two imperfect workarounds: use the data-nosnippet HTML attribute to stop text from appearing in a search snippet, or serve the section in an iframe or via JavaScript with the source blocked by robots.txt. He cautioned against the latter approach, as it can lead to crawling and indexing issues that are difficult to diagnose and solve. Mueller also reassured listeners that the same content appearing across multiple pages is not a cause for concern: "There's no need to block Googlebot from seeing that kind of duplication." On preventing Googlebot from accessing an entire site, Illyes provided a straightforward solution: add a disallow rule for the Googlebot user agent in your robots.txt file, and Googlebot will respect it and avoid your site. For a complete network-level block, he suggested firewall rules that deny Google's IP ranges. To sum up, you can't stop Googlebot from accessing specific HTML page sections, but methods like data-nosnippet offer some control, and a simple disallow rule in robots.txt (or firewall rules for a more stringent blockade) will keep Googlebot off your site altogether.

7. Sweeping Changes to Google Ads Trademark Policy: What You Need to Know - Google Ads is making significant changes to its Trademark Policy that could affect how your advertisements run. Starting July 24, Google will only entertain trademark complaints filed against specific advertisers and their ads.
This is a shift from the current policy, under which complaints can lead to industry-wide restrictions on using trademarked content. The change responds to advertiser feedback that the previous system over-flagged and applied overly broad blocks; the new policy aims to make resolutions quicker and more straightforward, with greater clarity and transparency for advertisers. As a Google spokesperson explained: "We are updating our Trademark Policy to focus solely on complaints against specific advertisers in order to simplify and speed up resolution times, as opposed to industry-wide blocks that were prone to over-flagging. We believe this update best protects our partners with legitimate complaints while still giving consumers the ability to discover information about new products or services." Note that trademark restrictions implemented before July 24 under the current policy will continue to apply, though Google plans to phase them out for most advertisers gradually over the next 12-18 months. You can learn more on the Google Ads Trademarks policy page.

8. Double Menus, Double Fun: SEO Unaffected by Multiple Navigations - In a recent SEO office hours video, Google's Gary Illyes made it clear that having multiple navigation menus on your website doesn't affect SEO performance, positively or negatively. The question asked whether two navigation menus - a main one featuring important site categories and a secondary one focused on brand-related extensions - could harm SEO. Illyes' response was reassuring: it's highly unlikely that multiple navigation menus would have any impact.
Whether you have one, two, or even more navigation menus on a page, Google's algorithms are sophisticated enough to recognize these elements and process them accordingly. So rest easy and design your website to best serve your audience; whether your navigation sits at the top, left, or bottom of the page, Google has it figured out.

9. Google's Eye on XML Sitemap Changes: Resource Efficiency in Action - Google's Gary Illyes recently reaffirmed that Google checks XML sitemaps for updates before launching reprocessing, to avoid wasting computational resources on unchanged files. Asked whether Google compares current and previous versions of XML sitemaps, Illyes' response was a resounding yes: sitemaps that have remained the same since the last crawl are not reprocessed. Any modification - whether in a URL element or a lastmod value - triggers a new round of parsing and generally initiates reprocessing. Illyes pointed out that this doesn't guarantee the altered URLs will be crawled; they must still pass the usual quality evaluations like any other URL. Importantly, if a URL is deleted from the sitemap because it no longer exists, that doesn't mean it will instantly be removed from the index or prioritized for crawling to expedite its deletion. Keep this in mind when making changes to your sitemap.

10. Boost Your Search Rankings: Google's Advice on Consolidating Pages - In a recent SEO office hours video, Google's Gary Illyes made a valuable point about web page consolidation.
He discussed "host groups," the term for when Google displays two results from the same domain in search results, one listed below the other. When your site forms a host group, it means you have multiple pages capable of ranking well for the same query; in such cases, Illyes recommends considering consolidating those pages where feasible. This aligns with Google's host groups documentation, which suggests setting one of the pages as the canonical if you'd prefer users land on it over the other. The rationale: when two of your pages vie for the same ranking, consolidating them can boost the ranking of the remaining page. Having two listings can increase your click-through rate, but consolidation creates a more streamlined user experience and may enhance your page's ranking. Keep in mind this is an approach to consider, not a rule for every situation; always weigh your unique context and audience needs when making SEO decisions.

11. Unlocking Video Thumbnails in Google Search: Key Insights Revealed - Recent changes to Google's approach to video thumbnails in search results mean thumbnails are displayed only when the video constitutes the main content of a page. This doesn't require the video to be the first element on the page; rather, as Google's Gary Illyes explains, it should be immediately noticeable, "in their face right away."
This user-centric approach enhances the experience by eliminating the need for visitors to hunt for the video on the page. Illyes encourages web developers and SEO experts to take the user's perspective: the video should be prominently displayed, akin to popular video platforms like Vimeo and YouTube. The aim of these changes is to reduce confusion and streamline the user experience by ensuring videos are easy to find and view; take inspiration from the major video sites to better understand what Google's algorithms are seeking.

12. Enhanced Conversion Tracking with Microsoft Advertising's New Cross-Device Attribution Model - Microsoft Advertising is introducing a Cross-Device attribution model, revealed in its June product update roundup, promising more accurate insight into customer conversion journeys that span multiple devices and sessions. With this model, if a customer clicks an ad on their laptop and later completes the purchase on their phone, Microsoft Advertising will attribute the conversion to the original ad click on the laptop, ensuring your marketing efforts are accurately credited regardless of where the conversion ultimately occurs. As a result, marketers may notice a slight uptick in reported conversions; keep an eye on your reports to understand the update's full impact on your performance data.

13. New Verification Mandates for Microsoft Ads: Everything You Need to Know - Starting August 1st, Microsoft Advertising will implement a new policy to enhance transparency and security.
Only ads from verified advertisers will be displayed on the platform, so if you haven't yet met the Microsoft Ads verification requirements, complete them before August 1st to keep your ads running smoothly. The Advertiser Identity Verification program, launched in June 2022, is rolling out these key dates: as of July 1st, all new advertisers must be verified before their ads can go live; if you haven't received a verification email from Microsoft by July 15th, reach out to Microsoft support; and from August 1st, only verified advertisers' ads will be shown. Once verified, all ads will display the name and location of the advertiser, the business or individual funding the ad, and additional information explaining why a user is seeing the ad, including targeting parameters. Microsoft Advertising is also launching a new Ad Library, which lets anyone view ads shown on Bing that gained impressions in the European Union; users can search it by advertiser name or by words in the ad creative, with the advertiser's details displayed alongside. Stay ahead of the game and get your account verified to enjoy uninterrupted ad delivery.

14. Unleashing New Opportunities: LinkedIn Introduces Direct Messaging for Company Pages - In a bid to foster more professional connections, LinkedIn is expanding its messaging tools: Company Pages can now send and receive direct messages (DMs), a major development given that one-to-one messaging was previously available only to individual LinkedIn members. The new feature, termed Pages Messaging, paves the way for members to contact brands directly.
Conversations can cover a broad range of topics from products and services to business opportunities. To handle these two-way conversations, organizations will be equipped with a dedicated inbox, enabling them to manage and prioritize incoming inquiries that are most relevant to their business. As a result of this feature, companies might see a significant increase in messages inquiring about opportunities. However, LinkedIn's 'focused inbox' system, which segregates DMs based on priority and topic settings, can help manage the influx. In addition, companies have the option to disable the Message feature if they wish. LinkedIn has been quietly testing this feature with a select group of users in the past month. Considering that over 63 million companies actively post on their LinkedIn Company Pages, this new feature could potentially revolutionize direct interactions and unearth fresh opportunities. Furthermore, LinkedIn is exploring the integration of an AI assistant to aid in lead nurturing. This could be a significant asset, allowing users to research the person they are communicating with without the need to manually browse through their profile or posts. While it might not be a 'game-changer', the new Company Page messaging feature, which is being rolled out from today, is certainly a noteworthy addition to consider in your LinkedIn marketing strategy.

15. Apple Amps Up Privacy: A Glimpse at iOS 17 and macOS Sonoma - In a continued commitment to user privacy, Apple has introduced fresh security enhancements in iOS 17 and macOS Sonoma, aimed at curbing intrusive web tracking. The new Link Tracking Protection feature is at the heart of this upgrade. Activated by default in Mail, Messages, and Safari (while in Private Browsing mode), Link Tracking Protection zeroes in on tracking parameters in link URLs, which are often used to monitor user activity across different websites.
The feature scrubs these identifiers, thereby thwarting advertisers' and analytics firms' attempts to bypass Safari's intelligent tracking prevention functionalities. Typically, these tracking parameters are attached to the end of a webpage's URL, bypassing the need for third-party cookies. When a user clicks the modified URL, the tracking identifier is read, enabling the backend to create a user profile for personalized ad targeting. Apple's new feature disrupts this process by identifying and removing these tracking components from the URL, ensuring the user's web page navigation remains as intended. This operation is quietly executed during browser navigation in Safari's Private Browsing mode and when links are clicked within the Mail and Messages apps. To strike a balance, Apple has also unveiled an alternate method for advertisers to gauge campaign effectiveness while preserving user privacy. Private Click Measurement, now accessible in Safari Private Browsing mode, enables the tracking of ad conversion metrics without disclosing individual user activity. In conclusion, Apple's latest efforts reflect a renewed commitment to user privacy, promising to make online experiences safer and more secure across their operating systems.
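To make the Link Tracking Protection idea concrete, here is a minimal sketch of stripping tracking identifiers from a URL's query string. The parameter list is an illustrative assumption on my part; Apple has not published the exact identifiers it removes.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of common click identifiers; Apple's actual list is not public.
TRACKING_PARAMS = {"gclid", "fbclid", "msclkid", "mc_eid"}

def strip_tracking(url: str) -> str:
    """Remove known tracking identifiers from a URL, keeping other parameters."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS and not k.startswith("utm_")]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/p?color=red&gclid=abc123&utm_source=x"))
# → https://example.com/p?color=red
```

Functional parameters like `color=red` survive; only the identifiers used for cross-site profiling are dropped, which mirrors how the user's navigation "remains as intended."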

#TWIMshow - This Week in Marketing
Ep159 – Google: “Coming Back Soon” Pages For Site Migrations


May 8, 2023 · 21:07


Episode 159 contains the notable Digital Marketing News and Updates from the week of May 1-5, 2023.

1. Google Updates Guidance For Cross-Domain Canonicals – Google has updated its help documentation for canonicalization. It removed a line about syndicated content and also updated a line about noindex to prevent selection of a canonical page within a single site. Per Google, "The canonical link element is not recommended for those who wish to avoid duplication by syndication partners, because the pages are often very different. The most effective solution is for partners to block indexing of your content." But first, let me explain what canonicalization actually means. A canonical tag (aka "rel canonical") is a way of telling search engines that a specific URL represents the master copy of a page. Using the canonical tag prevents problems caused by identical or "duplicate" content appearing on multiple URLs. Practically speaking, the canonical tag tells search engines which version of a URL you want to appear in search results. A cross-domain canonical is when the duplicate page appears on an entirely different website (domain). Previously, Google recommended using the canonical tag for syndicated content, even though it was hit or miss.

2. Video Description In Structured Data Is Now Optional – The description of the video is used to help Google understand the video, but it is no longer a requirement for video structured data. Google updated the "description" property from "required" to "recommended" in the structured data documentation. But Google still recommends it, so please fill out the description if you can.

3. Google: Clarification On E-E-A-T – Danny Sullivan, Google's Search Liaison, said on Twitter, "Our existing documentation does explain E-E-A-T and YMYL in the context of how they should be considered by content creators seeking to produce helpful, people-first content.
It's all on this page." Remember, Google has been saying for years now that there is no E-A-T (or now E-E-A-T) algorithm, but Google does have factors and signals it looks for that align with the values around determining content that demonstrates E-E-A-T. The page Danny linked to says, "Google's automated systems are designed to use many different factors to rank great content. After identifying relevant content, our systems aim to prioritize those that seem most helpful. To do this, they identify a mix of factors that can help determine which content demonstrates aspects of experience, expertise, authoritativeness, and trustworthiness, or what we call E-E-A-T. While E-E-A-T itself isn't a specific ranking factor, using a mix of factors that can identify content with good E-E-A-T is useful." Google's example says its "systems give even more weight to content that aligns with strong E-E-A-T for topics that could significantly impact the health, financial stability, or safety of people, or the welfare or well-being of society. We call these 'Your Money or Your Life' topics, or YMYL for short." So again, E-E-A-T is important, but optimizing for it directly is not really possible. You can, however, write content that demonstrates those qualities, and Google should understand and reward that type of content. There is not one specific thing to do; there are many signals that Google uses to determine whether a piece of content demonstrates experience, expertise, authoritativeness, and trustworthiness, aka E-E-A-T.

4. Google: How To Accelerate Website Indexing – During the May 2023 Google SEO office-hours Q&A session, a concerned website owner inquired about the slow indexing of their 16,000-page site. Google Search Relations team member Gary Illyes explained that indexing speed depends on several factors, the most crucial being the quality of the site and its popularity on the internet.
Illyes suggested that website owners ensure their content is of the highest quality possible to increase the indexing speed. High-quality content is essential, as Google's algorithms prioritize it when determining indexing priority. However, the definition of high-quality content extends beyond the mere text of individual articles. According to Google's John Mueller, the quality of content encompasses the overall website, including its layout, design, integration of images, and page speed. These elements contribute to a positive user experience and are key factors that Google considers when assessing quality. Illyes also recommended that once you have ensured your content is the highest quality you can possibly make, try running some social media promos to get people talking about your site. This is what he had to say: "How fast a site is indexed depends on a bunch of things, but the most important one is the quality of the site, followed by its popularity on the internet. Once you ensure that your content is the highest quality you can possibly make, try to run some social media promos perhaps so you get people to start talking about your site. That will likely help." Reading between the lines: backlinks (links from external websites to your site) contribute to faster indexing and improved search engine rankings, and technical SEO (optimizing a website's structure, code, and page speed to meet search engines' crawling and indexing requirements) is crucial.

5. Google: Focus On Content Quality Over Posting Frequency – Google's Search Advocate John Mueller recently engaged in a candid discussion on the r/BigSEO subreddit and addressed the question of blogging frequency: "The problem with trying to keep a frequency up is that it's easy to end up with mediocre, fluffy content, which search engine quality algorithms might pick up on, and then assume the whole site is like that.
Taking risks like this in the beginning when it's easy for you to start over is probably fine, doing that in the long run will be painful when it catches up to you. (All of this applies even more if you're taking short-cuts with gen-ai content)." So folks, prioritize creating unique and compelling content over maintaining a consistent posting schedule.

6. Google: Avoid Outdated Link Building Strategies – Google's John Mueller made it very clear that you should avoid outdated link building strategies such as SENuke or PBNs. The concept of SENuke and similar link-building tools is that you blast loads of (usually rubbish) content all over the internet to get backlinks. A private blog network (PBN) is a group of websites that only exists to provide backlinks to other websites. The purpose of a PBN is to manipulate Google to improve a site's Google search rankings. PBNs clearly violate Google's Webmaster Quality Guidelines and can result in harsh penalties. Google's link spam guidelines state: "Any links that are intended to manipulate rankings in Google Search results may be considered link spam. This includes any behavior that manipulates links to your site or outgoing links from your site." So, PBNs clearly fall within this guidance.

7. Google: Disavowing Links Based On Third-Party Metrics Is A Terrible Idea – When John was asked if one should disavow links based on those pages being under a certain score from a third-party tool, John said no. He wrote on Twitter, "That seems like a terrible idea." "Also, none of those metrics are things Google cares about, as any SEO tool will tell you… hopefully," he added.

8.
Google: Links From Spammy Sites Are Not Trustworthy – In the May 2023 Google SEO office hours, someone asked, "If a domain gets penalized, does it affect the links that are outbound from it?" To that question, Duy Nguyen from the Google Search Quality team said, "I assume by penalize you mean that the domain was demoted by our spam algorithms or manual actions. In general, yes, we don't trust links from sites we know is spam." This is common knowledge; however, it is good to have someone from the spam team say it so clearly.

9. Google: "Coming Back Soon" Pages For Site Migrations – Google's John Mueller wrote on Twitter, "I'd try very hard to avoid having a "coming back soon" page during a revamp. I don't think you'd gain a lot by bouncing it back & forth." "If you need it for
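Item 1 above describes the canonical link element and the alternative Google now recommends for syndication partners, blocking indexing. As a rough sketch (the URLs and the helper name are my own, not Google's), the two tags the episode refers to can be generated like this:

```python
def head_tags(canonical_url=None, noindex=False):
    """Build the <head> snippets from item 1: a rel=canonical link pointing at
    the master copy of a page, and/or a noindex robots meta tag that a
    syndication partner could use to keep its copy out of Google's index."""
    tags = []
    if canonical_url:
        tags.append(f'<link rel="canonical" href="{canonical_url}">')
    if noindex:
        tags.append('<meta name="robots" content="noindex">')
    return "\n".join(tags)

# Original publisher: declare the master copy of the article.
print(head_tags(canonical_url="https://example.com/original-article/"))
# Syndication partner: block indexing of the duplicate instead.
print(head_tags(noindex=True))
```

Per the updated guidance, the noindex route is the one Google now calls "the most effective solution" for syndication partners, since the syndicated pages are often too different for a cross-domain canonical to work reliably.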

#TWIMshow - This Week in Marketing
[Ep157] - Google Adds Page Experience To ‘Helpful Content' Guidance


Apr 24, 2023 · 21:14


Episode 157 contains the notable Digital Marketing News and Updates from the week of Apr 17-21, 2023.

1. Twitter Requires All Advertisers To Pay For Verification First - Twitter has informed all advertisers that they'll have to sign up to either Twitter Blue or Verification for Organizations in order to keep running ads in the app. In effect, this now means that brands will have to pay Twitter $8 per month for a blue tick, or $1,000 per month for its Verification for Organizations offering – though brands that are already spending 'in excess of $1,000 per month' will soon be given gold checkmarks automatically. The cheapest option would be to buy a Twitter Blue subscription for your brand, which will cost your business an extra $96 per year, and if you're planning to run Twitter ads, that's unlikely to have a huge impact on your annual budget. You'll also get a verified tick for your brand account, which could help to give your brand more legitimacy in the app, even though the checkmark doesn't seem to communicate the same level of authority or trust that it once did. Given that the blue checkmark can be bought by anyone, with no checking process involved – there's no actual verification in Musk's Twitter Blue process – someone else could also register your brand name and get a blue tick for it. Hmm. The question is: should you pay for verification?

2. Instagram Allows You To Add Up To 5 Links In Your Profile Bio - Instagram has finally launched one of its most requested feature updates, giving users the ability to add up to five links in their IG bio, expanding on its capacity to drive traffic. In the announcement, Instagram wrote: "Starting today, the update will make it easier for creators and other users to highlight their passions, bring awareness to causes they care about, promote brands they love, showcase their personal business, and more." This is bad news for products such as Linktree and other linking tools.
Instagram's opposition to external links has long been the key driver of usage for third-party link aggregator tools, but now, people will be able to replicate that capacity within the app itself, which will no doubt see many abandon their paid subscriptions to third-party apps. But then again, some of these tools enable branding options that could still act as an enticement, along with more link display options. It's also become such a standard behavior now that users don't find it jarring, so maybe some businesses will stick with third-party link tools, even with this new capacity available. To add multiple links to your IG profile, head to 'Edit profile' > 'Links' > 'Add external link'. From there, you can drag and drop to order your links as you'd like them to appear in the app.

3. Google: Just Because A Site Is Good Now, Doesn't Mean It Will Be #1 Forever - Sayan Dutta told Google's John Mueller: "Recently I am noticing that websites are being removed from Google News. My 3 years old site suddenly showing Not Live on Publisher Center. I saw that with a few of my sites." Mueller replied that just because a site appears to be doing super well in terms of Google ranking and SEO today doesn't mean it won't one day degrade in value. He added, "just because something's in Google News now doesn't mean it'll be there forever." Sometimes sites just lose their luster, the topic may become less relevant, or the content quality does not improve as competitors' content quality improves. Sometimes sites change ownership and the new owners do not put in the work needed. Sometimes sites just can't keep up with the speed of innovation. There you go folks, SEO is not evergreen or perennial.

4. Google Adds New Return Policy Structured Data Support For Merchant Listing - A structured data type communicates to search engines that the data is about a specific data type.
Structured data types have "properties" that provide information about the data type. A new returns section has been added to the structured data type definitions within Google's product structured data document. This is for merchant listings, not yet product snippets, and these new property types apply to merchant listing experiences. This addition came on the same day that Google began showing shipping and return information in its search results. The new MerchantReturnPolicy type has two required properties (applicableCountry and returnPolicyCategory). Required properties are not optional and must be present in the structured data in order to ensure eligibility for the rich results specific to MerchantReturnPolicy. Google's new section on returns policy shopping experience eligibility explains that there is an alternate way to become eligible without having to configure the associated structured data: Google recommends configuring the shipping settings return policies in the Google Merchant Center Help (details on how to configure it here).

5. Google Introduces New Crawler & Explains The Use Cases For Its Different Crawler Types - Google has added a new crawler to its list of Google crawlers and user agents, this one named GoogleOther. It is described as a "generic crawler that may be used by various product teams for fetching publicly accessible content from sites." For example, it may be used for one-off crawls for internal research and development, Google explained. The GoogleOther crawler always obeys robots.txt rules for its user agent token and the global user agent (*), and uses the same IP ranges as Googlebot. The user agent token is "GoogleOther" and the full user agent string is "GoogleOther." Here is what Gary Illyes from Google wrote on LinkedIn: "We added a new crawler, GoogleOther, to our list of crawlers that ultimately will take some strain off of Googlebot. This is a no-op change for you, but it's interesting nonetheless I reckon.
As we optimize how and what Googlebot crawls, one thing we wanted to ensure is that Googlebot's crawl jobs are only used internally for building the index that's used by Search. For this we added a new crawler, GoogleOther, that will replace some of Googlebot's other jobs like R&D crawls to free up some crawl capacity for Googlebot. The new crawler uses the same infrastructure as Googlebot and so it has the same limitations and features as Googlebot: hostload limitations, robotstxt (though different user agent token), http protocol version, fetch size, you name it. It's basically Googlebot under a different name." At the same time, Google updated the Googlebot page and listed the different crawler types and their uses that may show up in your server logs. Using this information, you can verify whether a web crawler accessing your server is indeed a Google crawler instead of a spammer or other troublemaker. Google's crawlers fall into three categories:

- Googlebot – The main crawler for Google's search products. Google says this crawler always respects robots.txt rules. Next time you find crawl-**.googlebot.com or geo-crawl-**.googlebot.com in your server logs, know that the Google Search crawler visited your site.
- Special-case crawlers – Crawlers that perform specific functions (such as AdsBot), which may or may not respect robots.txt rules. The reverse DNS mask for these visits shows up as rate-limited-proxy-***-***-***-***.google.com.
- User-triggered fetchers – Tools and product functions where the end-user triggers a fetch. For example, Google Site Verifier acts on the request of a user, or some Google Search Console tools will send Google to fetch a page based on an action a user takes. The reverse DNS mask for these visits shows up as ***-***-***-***.gae.googleusercontent.com.

P.S.: Listen/watch the show to hear my perspective on why it is important for any website owner to review server logs and keep the troublemakers away.

6.
Google Removed Older Search Ranking Algorithm Updates From Its Ranking Systems Page - Google has updated its documented Google ranking systems page and completely removed the page experience system, the mobile-friendly system, the page speed system, and the secure site system rankings. You can spot the difference if you compare the live page to the archived page. These removals make me wonder if any of these algorithm updates mattered at all to the overall Google ranking system.

7. Google To Remove Page Experience Report, Mobile Usability Report & Mobile-Friendly Tests From Search Console - In the coming months, Google will deprecate the page experience report within Google Search Console, the mobile usability report, and the mobile-friendly testing tool. The Core Web Vitals and HTTPS reports will remain in Google Search Console, Danny Sullivan of Google announced. The original page experience report launched in Search Console in April 2021 and was designed for just mobile pages. Google added a desktop version with the launch of the desktop version of the algorithm in January 2022. Now that it is 2023, Google is going to remove that page experience report completely; it "will transform into a new page that links to our general guidance about page experience," Danny Sullivan wrote. In December 2023, Google will also drop Google Search Console's mobile usability report (originally launched in 2016), the mobile-friendly test tool (launched in 2016), and the mobile-friendly test API. This is not because mobile friendliness and usability are unimportant; Google said, "it remains critical for users, who are using mobile devices more than ever, and as such, it remains a part of our page experience guidance. But in the nearly ten years since we initially launched this report, many other robust resources for evaluating mobile usability have emerged, including Lighthouse from Chrome."

8.
Google Adds Page Experience To 'Helpful Content' Guidance - Google added a new section on providing a great page experience to its guidance around how to create helpful content, Google explained. Google also revised its help page about page experience to add more details about helpful content. Here is what Google added to the helpful content guidance: "Provide a great page experience: Google's core ranking systems look to reward content that provides a good page experience. Site owners seeking to be successful with our systems should not focus on only one or two aspects of page experience. Instead, check if you're providing an overall great page experience across many aspects. For more advice, see our page, Understanding page experience in Google Search results." There is a FAQ section at the bottom of the "page experience" documentation that you need to read through if you are maintaining or leading your SEO efforts. Here are some items from the FAQ section:

Without the Page Experience report, how do I know if my site provides a great page experience? The page experience report was intended as a general guidepost of some metrics that aligned with good page experience, not as a comprehensive assessment of all the different aspects. Those seeking to provide a good page experience should take a holistic approach, including following some of our self-assessment questions covered on our Understanding page experience in Google Search results page.

Is there a single "page experience signal" that Google Search uses for ranking? There is no single signal. Our core ranking systems look at a variety of signals that align with overall page experience.

Page experience signals had been listed as Core Web Vitals, mobile-friendly, HTTPS and no intrusive interstitials. Are these signals still used in search rankings?
While not all of these may be directly used to inform ranking, we do find that all of these aspects of page experience align with success in search ranking, and are worth attention.

Are Core Web Vitals still important? We highly recommend site owners achieve good Core Web Vitals for success with Search and to ensure a great user experience generally. However, great page experience involves more than Core Web Vitals. Good stats within the Core Web Vitals report in Search Console or third-party Core Web Vitals reports don't guarantee good rankings.
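Item 5 of this episode notes that you can use reverse DNS to verify whether a crawler hitting your server is really from Google rather than a spammer spoofing the Googlebot user-agent string. A minimal sketch of that check (the helper names are my own; the forward-confirmation step guards against spoofed PTR records):

```python
import socket

# Hostname suffixes Google documents for its crawlers and fetchers.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com", ".googleusercontent.com")

def looks_like_google_host(host):
    """True if a reverse-DNS hostname matches one of Google's crawler domains."""
    return host.endswith(GOOGLE_SUFFIXES)

def is_google_crawler(ip):
    """Reverse-DNS lookup on the IP, check the hostname suffix, then confirm
    the hostname resolves back to the same IP (forward-confirmed reverse DNS)."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)
        return looks_like_google_host(host) and ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

Running `is_google_crawler()` over IPs pulled from your server logs separates genuine Googlebot, AdsBot, or user-triggered fetcher visits from troublemakers that merely claim to be Google in their user-agent header.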

#TWIMshow - This Week in Marketing
[Ep156] - Are Backlinks Still Relevant In 2023?


Apr 17, 2023 · 27:05


Episode 156 contains the notable Digital Marketing News and Updates from the week of Apr 10-14, 2023.

1. Microsoft Ads Launches PLA Extensions - PLA Extensions are the first fully integrated retail media solution from Microsoft, allowing brands to serve product ads both onsite and offsite with a single budget while automatically optimizing for the best balance of performance and reach. What are onsite and offsite ads? Advertising your products on a retailer's website (known as "onsite") is a strategy with a proven return on investment (ROI), but advertising only onsite can limit your reach. Onsite advertising has genuine advantages, such as increasing awareness of your products with a retailer's loyal shoppers, as they're already shopping (with high intent to make a purchase) on the retailer's website. However, reaching only shoppers who are already searching for your products on a retailer's site limits your reach. PLA Extension allows you to place product ads both onsite and offsite with a single budget, while automatically optimizing for the best balance of performance and reach. While previously limited by ad supply constraints onsite, you can now reach significantly more in-market shoppers offsite. With PLA Extension, you'll have unified reporting and attribution without any heavy lifting. This feature unlocks omnichannel attribution in an automated and scalable way, enabling you to measure the impact of your onsite and offsite ads and see a more complete view of the shopper's journey. When you opt to use PLA Extension, if an onsite campaign is underspending, the feature will automatically extend your product listing ad to offsite product placements powered by the Microsoft Search Network and the Microsoft Audience Network, optimizing performance and reach across onsite and offsite product ad placements. For example, suppose John is searching for a new pair of boots on his favorite retailer's website.
He sees your ad for a particular brand of boots but becomes distracted before adding them to his cart and leaves the website without making a purchase. Later that evening, he's browsing online and sees your ad again on Bing. He clicks through the ad on Bing and is taken to your product page on the retailer's website, and this time, makes a purchase. Meanwhile, Sandra, another shopper in need of new boots who isn't familiar with your brand, has also seen your ad on DuckDuckGo and clicks through to make a purchase as well. Thanks to the PLA Extension feature, you've reached in-market shoppers both on and off the retailer's site with a single PLA campaign. PLA Extension drives awareness of your brand and points prospective customers back to your product landing page, growing your audience and increasing the likelihood of sales. The introduction of this feature in the Microsoft PromoteIQ solution is just one of the ways Microsoft is solidifying its vision to build the most complete omnichannel retail media stack.

2. GA4 Gives You The Ability To Change How You Count Your Conversions - Google Analytics 4 (GA4) now allows users to modify the counting method for conversions, introducing a "once per session" option, which is similar to how Universal Analytics (UA) operated. There are two different counting methods that you can select for a conversion event:

- The "Once per event" (Recommended) setting means that Google Analytics 4 properties count an event as a conversion every time it occurs. This option is recommended because it reflects the behavior of users on your site or app, and allows you to distinguish between sessions where multiple conversions occurred and sessions where only one conversion occurred. For example: a user completes 5 conversions in one session; this setting counts 5 conversions.
- The "Once per session" (Legacy) setting means that Google Analytics 4 properties count an event as a conversion only once it occurs within a particular session.
Once per session is how Universal Analytics properties count goals. Select this option if it's important for your GA4 conversion count to closely match your UA conversion count; otherwise, select Once per event. For example: a user completes 5 conversions in one session; this setting counts 1 conversion. A session is a group of user interactions with your website or app that take place within a given time frame. In Analytics, a session initiates when a user either opens your app in the foreground or views a page or screen and no session is currently active (e.g., their previous session has timed out). By default, a session ends (times out) after 30 minutes of user inactivity. However, you can adjust the session timeout period. There is no limit to how long a session can last. An engaged session is a session that lasts longer than 10 seconds, has a conversion event, or has at least 2 pageviews or screenviews. If you don't make a choice, Google Analytics will automatically use the default counting method. The default depends on how conversion events were created:

i. "Once per session" is the default counting method for all conversions that were created from Universal Analytics goals, either in an automatically created Google Analytics 4 property or using the goals migration tool in the Setup Assistant after April 2023.
ii. "Once per event" is the default counting method for all other conversions.

You can quickly tell which counting method each of your conversion events uses by going to the Conversion events table in Admin > Conversions. Any conversion with an icon next to it has the Once per session counting method. If there's no icon in the Conversion name column, then that conversion event uses Once per event. To change the conversion counting method, you need to have at least an "Editor" role on the property.

3.
In The U.S., Search Ads Accounted For 40.2% Of All Digital Ad Revenue In 2022 - According to the 2022 IAB Internet Advertising Revenue Report, search ads accounted for $84.4 billion of the $209.7 billion in U.S. digital advertising revenues. In 2021, search ads accounted for $78.3 billion of the $189 billion in spending. The IAB report pointed out that the growth of search revenues wasn't as strong as other formats – it lost 1.2 percentage points in total revenue share. That's because display revenues were up 12%, digital video was up 19.3%, and digital audio was up 20.9%, year on year. With 40.2% of all digital ad revenue in 2022, paid search is still the leading format. After rebounding in 2021, social media saw its smallest level of growth in 10 years, according to the IAB. In 2022, revenue from social platforms was $59.7 billion, up from $57.7 billion in 2021. Apple's App Tracking Transparency (ATT) feature was cited as one key reason for stalled growth. You can view the entire report here (note: the report is free, but you must log in or create an account to download it).

4. Google Released 2022 Web Spam Report - Every year Google releases its web spam report showing how much better the search company got at fighting search spam. Google has given its machine-learning-based spam identification system its own name: SpamBrain. SpamBrain is Google's AI-based spam-prevention system that it launched in 2018 but never spoke about externally as SpamBrain until 2021. Here are some high-level numbers that Google shared in its 2022 report:

- SpamBrain detected 5 times more spam sites in 2022 compared to 2021.
- SpamBrain detected 200 times more spam sites in 2022 compared to when it first launched in 2018.
- SpamBrain was incorporated in the December 2022 link spam update.
- There was a 10 times improvement in hacked site detection.
- SpamBrain can detect spam during crawling, so a page doesn't need to be indexed to be found to be spammy.

5.
Google Updates Policy For Video Thumbnails In Search Results - On April 13, 2023, Google published in the Search Central Blog that they have made a change so that video thumbnails only appear next to Google search results when the video is the main content of a page. This will make it easier for users to understand what to expect when they visit a page. Previously, they showed video thumbnails in two different ways. For pages where the video was the main content of the page, the video thumbnail appeared at the beginning of a listing; this remains unchanged. The other format was for when a video was present on a page but was not the main element of the page. In that case, the thumbnail appeared after a listing; this second format is going away. According to Google, during their experiment phase this change had minimal impact on overall engagement for publishers. This change will impact search appearance reported metrics for videos in the performance report in Search Console. There will be annotations in the video indexing report and the video enhancements report. In my own experiments, I noticed that instead of video thumbnails, Google is replacing them with an image. To learn more about video indexing best practices, check out the video best practices guide.

6. Google Warns Against Cloaking HTTP Status Codes - Cloaking is a simple method to hide a website's natural appearance from search engines. This technique gives search engines a version of the webpage (or different content) that is different from what website visitors actually see when they visit the website. Black hat SEO practitioners use cloaking to try to improve a website's ranking in search engines like Google. During Google's April 2023 SEO office hours, a website owner asked Gary Illyes, Analyst at Google, whether giving Googlebot a different HTTP status code from the one served to human visitors would be acceptable.
Specifically, the site owner wanted to serve an HTTP status code of 410 (Gone) to Googlebot while giving users a 200 (OK) status code. A 410 informs search engines that a page has been permanently deleted and should be removed from their index; a 200 means the request succeeded and the resource was served. Giving different HTTP status codes to search engines and users is also considered "cloaking."

In response, Illyes strongly advised against cloaking status codes, stating it's risky. He explained that multiple serving conditions could lead to potential issues, such as the site getting de-indexed from Google. Instead, Illyes recommends using a "noindex" robots meta tag to remove specific pages from Google Search. This approach is more straightforward and safer than setting up potentially problematic serving conditions. Cloaking is a direct violation of Google's webmaster guidelines.

7. Google: No SEO Reason To Delay Release Of Thousands Of Pages - Google's John Mueller was asked whether it makes sense to publish 8,000 new pages at once or slowly publish them over time - that is, whether publishing them all at once would cause some sort of SEO issue with Google Search. The answer is no: go ahead and publish the 8,000 pages at once. As John Mueller said on Twitter, "If it's great content that the internet has missed and which users have been waiting for, why would you artificially delay it?" Basically, if you created thousands of great pages, just release them. If they are not great pages, then why release any of them? The number of pages doesn't necessarily matter; what matters is whether they are good pages. You know, the quality-over-quantity thing...

8. Google: How To Investigate A Sudden Ranking Drop - Gary Illyes from Google said during the latest Google SEO office hours that it would be "really uncommon that you would completely lose rankings for just one keyword."
The question asked was, "Is there a way that my site was deleted from SERP for one certain keyword? We were 1st and now we are absent completely. Page is in the index." Gary said that if you do see this happen, it "usually" means you were "out-ranked by someone else in search results." Your competitors might have improved their content or implemented better SEO strategies, leading them to outrank your website for the targeted keyword. But what happens if you really did disappear for that one keyword phrase? Gary's recommendation is to check the following:

Check that this is not a regional thing where you rank well in one region but not in another. "I would check if that's the case globally," he said. "Ask some remote friends to search for that keyword and report back. If they do see your site then it's just a glitch in the matrix." You can accomplish the same on your own by using a virtual private network (VPN) to change your location and simulate searches from different countries, to see if your website's ranking is consistent across locations. Remember that search results can be influenced by a user's location and search history; if a website's disappearance for a specific keyword is limited to a particular region or to individual users, it could be due to geo-targeting or personalization factors. If your site is consistently absent from test searches, you may have a more significant problem.

Check whether you "did anything that might have caused it." For example, did you change the internal link structure or page layout, or acquire more links? Did you use the disavow tool recently? Each of these may have some effect on ranking, so going through the checklist is probably going to help.

At times it is a technical issue: crawling or indexing problems can cause your website to disappear from search results.
This could be due to problems with your robots.txt file, sitemap, or site structure, or issues like broken links, duplicate content, or slow page load times.

9. Are Backlinks Still Relevant In 2023? - Google's John Mueller said that when it comes to links SEOs make to manipulate search rankings and gain ranking position in Google, those links are mostly ignored by Google Search. This is a line Googlers have been repeating since Penguin 4.0 came out in 2016. John said on Twitter, "Of course -- most of those links do nothing; we've spent many years ignoring that kind of thing." This was in response to the question, "Most of the Seo practitioners make backlinks for just manipulating search results and gain positions. If private tools like Semrush and Ahrefs can detect those IPs which are building backlinks, doesn't google track them?" So there you have it again: Google says it is excellent at ignoring links designed to manipulate Google Search rankings. But as a reminder, Google does not demote but rather devalues those links.

P.S: Demote would mean lowering the rankings of a website for doing something bad. Devalue means Google likely ignores the link spam and does not downgrade the rank of the website.
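Item 6's advice about not cloaking status codes can be sketched in code. This is a hypothetical handler of my own (the function name and page layout are not from Google's guidance): it serves the same 200 status to every user agent, Googlebot included, and relies on a robots noindex meta tag when a page should drop out of the index.

```python
# Hypothetical sketch: serve identical HTTP status codes to all clients and use
# a robots "noindex" meta tag to remove a page from Google Search, instead of
# cloaking a 410 to Googlebot (which Gary Illyes warned against).

NOINDEX_META = '<meta name="robots" content="noindex">'

def render_page(user_agent: str, deindex: bool = False) -> tuple:
    """Return (status_code, html). The user agent never changes the status."""
    head = NOINDEX_META if deindex else ""  # same 200 for everyone either way
    html = f"<html><head>{head}</head><body>Hello</body></html>"
    return 200, html
```

The key property is that `render_page("Googlebot")` and `render_page("Mozilla/5.0")` return the same status code, so there is no serving condition for Google to flag as cloaking.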

#TWIMshow - This Week in Marketing
[Ep152] - Google Launches Broad Core Algorithm Update

#TWIMshow - This Week in Marketing

Play Episode Listen Later Mar 20, 2023 24:49


Get up to speed with the Digital Marketing News and Updates from the week of Mar 13-17, 2023.

1. Meta Launches Paid Verification Program - Meta has announced that it's making its new Meta Verified program available to users in the US, which means that American users can now purchase a blue checkmark on Facebook or Instagram for $US11.99 per month on the web, or $US14.99 in-app, accounting for respective App Store charges. Per the Meta Verified guidelines, you need to sign up on each platform separately to get a checkmark on each, and Meta Verified requires users to provide photo ID to prove their identity. So you'd be paying at least $US23.98 per month to get a blue tick in both apps, which equates to $US287.76 per annum to buy the perception of credibility. After verification is complete, signing up to the program gives you:

A verification tick on Facebook or IG
Proactive account protection from impersonation
Dedicated account support from Meta's team
Exclusive stickers for Facebook and Instagram Stories and Facebook Reels
100 Stars a month to allocate to other creators on Facebook

Elon Musk is saying "You're welcome, Zuck!" After all, he is the one who started this trend. P.S: My analysis is in the show.

2. Twitter Launches 'Unskippable' Video Marketing Education Course - Twitter has launched 'Unskippable', a new eight-part educational series on video marketing and how to create video promos that stand out in the Twitter feed. The series aims to provide a practical overview of all the key elements of thumb-stopping video clips. Each video is around 2 minutes long, making it easily consumable without a major time investment.
The tips and advice in the series come via Twitter's own creative team, which "helps advertisers produce thousands of top-performing ads on the platform every year". The series is part of Twitter's 'Flight School' education platform, which is available for free and provides insights into key Twitter advertising best practices. Now may be a good time to test out Twitter ads, since around 70% of Twitter's top advertisers have reportedly stopped or reduced their Twitter spending as a result of Elon Musk's changes at the app.

3. LinkedIn Adds AI-Generated Profile Summaries and Job Listings - LinkedIn is adding a new GPT-powered tool that provides personalized writing suggestions for creating your LinkedIn profile. To get started, tap the 'Start' button and select what you want it to create, and the system will come up with your LinkedIn profile summary based on your info and samples from millions of user entries. The system uses OpenAI's GPT models to generate these summaries, which could make it much easier to put together a good representation of your skills and experience without having to come up with a creative way to stand out. LinkedIn is also testing a new AI-powered job description tool to make writing job descriptions faster and easier. Here is what LinkedIn wrote in the announcement: "When you're ready to post a job, simply provide some basic information, including the job title and company name. Our tool will then generate a suggested job description for you to review and edit, saving you time and effort while still giving you the flexibility to customize it to your needs. By streamlining this part of the hiring process, you can focus your energy on more strategic aspects of your job."

4. Microsoft Testing Ads For Doctors & Clinics - Microsoft has announced another vertical ad option for medical professionals, named Doctor and Clinic Ads.
This is an open beta available in the United States, Australia, India, Germany, France, Canada, and the United Kingdom. Doctor and Clinic Ads are intent-triggered based on searches for conditions, symptoms, specialists, and more. These rich placements provide real-time information to consumers and inspire action, all with no keywords required. Bing says they are dynamically generated based on the data you specify in your feed file, such as specialties, locations, and service type (in-person/video). The more details you provide in the feed file, the more information Bing can include in your ads, and the better it can match your ads to the user's intent. Other things to know about Doctor & Clinic Ads:

The auction for Doctor and Clinic Ads is independent from text ads. You can participate in the Doctor and Clinic Ads auction with the campaign associated with your feed file and also participate in the text ad auction with your regular campaigns.
The Doctor and Clinic Ads auction is cost-per-click (CPC) based.
Feed automation is supported through scheduling, to easily keep up with any changes you make.
It's recommended that you start with $100-$500 per day.
You can bring your own data in the form of audience lists.

With Microsoft Audience Network, healthcare providers can display their ads on a network of trusted websites, apps, and social media platforms. They can also target their ads to specific audiences, such as people interested in health and wellness, or people searching for healthcare services. It's good to see Microsoft/Bing expanding its ad offerings. I think Google invests in hiring more offshore reps to try to dupe advertisers into spending more and more, instead of delivering results for advertisers.

5. Google: You Should Ignore Spammy Referral Traffic - What should you do if you get a lot of referral traffic from a spammy domain? That's exactly what Tom asked Google's John Mueller.
Google's John Mueller said that when it comes to spammy referral traffic, you can ignore it and not worry about it regarding SEO. P.S: If you care to know how I feel on this topic, listen to or watch the show recording.

6. Google: Writing Content In Less Common Languages Is Not Automatically Low Quality - What do you do if your site has pages in a less common language, for example Cebuano, which is spoken by approximately 22M people? Google's John Mueller said lesser-known languages published on the web are not considered low-quality content just because they are lesser known: "The page can have words in any language or script, our systems will try to index it appropriately, and try to show it to users who search for those words. It doesn't matter if there's no ISO 639-1 country code for it. If this is good content for a niche audience, I would absolutely *not* remove it from indexing. Good content is good content. Your site won't be 'penalized' for content in an obscure language. (But also, thin content is thin content, regardless of which language it's in.)"

7. Google: Stop Using The Disavow Tool - The link disavow tool has been covered in the past (ep-137: Stop Wasting Your Time By Disavowing Random Links Flagged By Disavow Tools). This week, Farhad asked John Mueller this question: "How would an ordinary webmaster or SEO marketing exec know whether or not to spend time disavowing spam links to their domain?" John Mueller, responding about disavow-link services, said, "Some people do things that they can bill, regardless of whether it's needed or makes sense. To be honest, anyone who does not know, should *not* use it. That's why the tool is not a part of the search console UI. That's why our messaging has been consistently to not use it unless you know there's an actual issue. To paraphrase: When in doubt, leave disavow out."
So stop using the link disavow tool unless you have a manual action. Recently Gary Illyes from Google also reiterated this advice at PubCon, saying not to use the tool; he repeated that for the most part it hurts more than it helps.

8. Google: Nesting Structured Data Is Always Better - Google's Lizzi Sassman answered a question in a Google SEO office hours session about whether it's okay to combine different structured data types. Combining multiple structured data types is called nesting. She said, "Nesting your structured data can help us understand what the main focus of the page is. For example, if you put recipe and review at the same level, it's not as clear as telling us that the page is a recipe with a nested review. This means that the primary purpose of the page would be a recipe and that the review is a smaller component of that." Her answer illuminated an important point about how Google interprets structured data and whether it's better to combine structured data types or keep them separate.

9. Google Launches Broad Core Algorithm Update - Google Search has announced the rollout of the first broad core update of 2023, named the March 2023 broad core update. It began on March 15, 2023, at about 10:30 am ET and can take about two weeks to roll out. This is a global update impacting all regions and all languages. The goal of this update is to reward great web pages, so some pages will be bumped off the list. Read below for a deeper explanation of what a core update is and what it means for you. Several times a year, Google makes significant, broad changes to its search algorithms and systems, referred to as core updates. Core updates are designed to ensure that, overall, Google delivers on its mission to present helpful and reliable results for searchers by improving how its systems assess content.
These changes may cause some pages that were previously under-rewarded to do better in search results. One way to think about how a core update operates is to imagine you made a list of the top 100 movies in 2021. A few years later, in 2024, you refresh the list. It's going to naturally change: some new and wonderful movies that never existed before will now be candidates for inclusion, and you might reassess some films and realize they deserved a higher place on the list than they had before. The list will change, and films previously higher on the list that move down aren't bad; there are simply more deserving films coming before them.

If your pages experience a change after a core update, focus on ensuring they still offer the best content, and consider an audit of the drops you may have experienced. What pages were most impacted, and for what types of searches? Look closely at these to understand how they may perform against Google's self-assessment questions. For example, other pages may be doing a better job of helping the searcher because they have first-hand knowledge of the topic. You might also have others you trust (who are unaffiliated with your site) provide an honest assessment. You can also review Google's advice on how to recover from a core update ranking drop.

P.S: Like I always say, unless SEO and search marketing is your full-time job, work with a reputable agency who can guide you through this process.
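Item 8's nesting advice can be illustrated with a small sketch. The recipe name, rating, and author below are made-up example values; the shape follows the general JSON-LD pattern of a Recipe with the Review nested inside it rather than sitting beside it at the top level.

```python
import json

# Hypothetical JSON-LD sketch: a Recipe page with a nested Review, telling
# search engines the recipe is the page's main focus and the review is a
# smaller component of it.
recipe_jsonld = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Apple Pie",      # made-up example value
    "review": {               # nested under the Recipe, not a sibling
        "@type": "Review",
        "reviewRating": {"@type": "Rating", "ratingValue": "5"},
        "author": {"@type": "Person", "name": "Jane Doe"},
    },
}

# The <script> tag you would embed in the page's <head>.
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(recipe_jsonld)
print(snippet)
```

If the Review were placed next to the Recipe in a top-level list instead of nested under it, the markup would no longer state which type is the page's primary purpose.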

Search Off the Record
Search and Open Source

Search Off the Record

Play Episode Listen Later Mar 8, 2023 28:04


On this episode of Search Off the Record, guest Edu Pereda, an engineer at Google, joins Martin Splitt and Gary Illyes to chat about Search open source projects. Open source is source code made available for the public to modify and distribute. Learn about the Search open source library projects, the requirements kept in mind when creating open source libraries, why not every library can be open source, and more in this podcast episode!   Resources: Episode transcript → https://goo.gle/sotr057-transcript  Find us on Twitter → https://goo.gle/3JgUSEB Edu Pereda on Twitter → https://goo.gle/3ZntryE    Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team. #SOTRpodcast #Opensource   

#TWIMshow - This Week in Marketing
[Ep150] - Should You Rewrite Your Content With ChatGPT?

#TWIMshow - This Week in Marketing

Play Episode Listen Later Mar 6, 2023 28:33


Get up to speed with the Digital Marketing News and Updates from the week of Feb 27-Mar 3, 2023.

1. PSA: US TikTok Ban Moves a Step Closer - More bad news for TikTok: the US House Foreign Affairs Committee voted to give President Joe Biden the power to ban the Chinese-owned app if he deems such a move necessary, amid ongoing security discussions around its potential connection to the Chinese Communist Party (CCP). TikTok responded to the vote by tweeting that "A U.S ban on TikTok is a ban on the export of American culture and values to the billion-plus people who use our service worldwide…" Today's announcement doesn't give Biden the full green light to ban the app, as the US Senate is still required to sign off before a ban could be implemented. But it's another step towards that next stage, which increasingly feels like it will lead to a TikTok ban or, at the least, a significant change in direction for the app. Remember that TikTok, along with 58 other Chinese-created apps, was banned completely in India by the Ministry of Electronics and Information Technology on 29 June 2020. So if you are relying on traffic from TikTok, it is high time you diversify your traffic sources.

2. Google Shares How Its Keyword-Matching System For Search Ads Works - During Google Search Ads Week 2023, Google released a comprehensive 28-page guide providing a unique behind-the-scenes glimpse into its keyword-matching system for search ads. Advertisers can optimise their campaigns for better results by understanding Google Ads' keyword-matching process. The guide provides a comprehensive breakdown of the system, including how the company uses machine learning and natural language understanding technologies to determine keyword eligibility, and how the responsive search ads creative system selects the best-performing creative for users. It is essential to note that grouping keywords is critical to campaign optimisation.
By eliminating the need to add the same keyword in multiple match types, advertisers avoid segmenting and reducing the data available to Smart Bidding for optimisation, which can result in fewer conversions and higher costs. The guide is an invaluable resource for anyone seeking to enhance their Google Ads campaigns; incorporating its insights and best practices can boost the chances of success and drive more conversions. This is why I always tell my listeners to work with a reputable agency that keeps learning and growing and is in the know. After all, you cannot make moves or leverage opportunities if you are not in the know.

3. Google Ads Is Changing Location Targeting Settings In March 2023 - Starting March 2023, "Search Interest" targeting will no longer be available in Google Ads. Campaigns that use "Search Interest" targeting will be migrated to "Presence or Interest" targeting. These changes will be consistent across Search, Display, Performance Max, and Shopping campaigns. The Presence option lets you show your ads to people who are likely to be located, or regularly located, in the locations you've targeted. The Search Interest option lets you show your ads to anyone searching on Google for your targeted location; if a person doesn't specify a location in their search, the system uses the location where the user is likely to be. This option is only available for Search campaigns. So after this change is in effect, consider a person who lives in Northern VA but often travels to Maryland for shopping or work: while home in VA, the person searches for "plumber near me," and Google may now show some Maryland plumbers who are not licensed in VA. Am I the only one who thinks that the real winner of this change is Google?!

4. Google Ads Introduces AI-Powered Search Ads - During Google's Search Ads Week, a new customer acquisition goal for Search campaigns was launched globally.
This goal utilizes Smart Bidding and first-party data to optimize campaigns and attract new customers during peak periods. According to Google, by combining the new customer acquisition goal with bidding strategies like Maximize conversion value with a target ROAS, advertisers can prioritize and target high-value customers. The new customer acquisition goal has two modes to help you reach your campaign goals:

Value New Customer: Bid higher for new customers than for existing customers
New Customers Only: Bid for new customers only

5. Microsoft Bing's Fabrice Canel: SEO Will Never Be "Dead" - Fabrice Canel, the Principal Product Manager for Microsoft Bing, gave a keynote presentation at the Pubcon convention in Austin, Texas. His presentation offered valuable information on optimizing websites for the new Bing search experience, and shared the benefits of using Bing Webmaster Tools to monitor traffic data and make adjustments to improve visibility in search results. First, Canel suggested staying with the same SEO playbooks for optimizing content for Bing's AI experience, because it's still the early days for AI search. Throughout his keynote, Canel stressed the importance of SEO professionals in guiding Bing's search crawlers to high-quality content. Canel also emphasized the importance of setting the lastmod tag to the date a page was last modified, not when the sitemap was generated. Lastmod was covered in previous episodes in detail. ICYMI: lastmod is an XML tag used in sitemaps indicating when a particular webpage or URL last received significant changes. It helps search engines like Bing understand when a page was last updated, and helps searchers identify and access the most up-to-date content available. When a lastmod tag is present, Bing will display the updated date in search results, signaling to searchers that the webpage may have new or updated information they haven't seen yet.
According to Canel, 18% of sitemaps have lastmod values that are not correctly set, typically set to the date and time the sitemap was generated. Thirdly, Canel recommended websites adopt IndexNow to instantly inform search engines of recent modifications to their content. FYI: IndexNow was covered in episode #90 (Jan 10-15, 2022). According to Canel, 20 million websites have already adopted IndexNow, and he expects more top websites, search engines, and content management systems to follow suit. Canel adds that crawling a webpage just to see whether its content has changed wastes resources and energy and creates CO2. He also suggests having sitemaps provide search engines with all relevant URLs and their corresponding modification dates. Most importantly, he wants website owners to focus on writing quality content and to use semantic markup to convey information about their pages. Lastly, we learned Bing Webmaster Tools will soon include traffic data from Bing's AI chat.

6. Google On 'lastmod' Tag In XML Sitemaps - I covered "lastmod" in episode #146, and it is back again. Google's John Mueller said on Twitter that if you are "providing something new for search engines that you'd like reflected in search," then update the date; if not, then don't. John added, "The issue is more that some CMS's / servers set the lastmod to the current date/time for all pages. This makes that data useless. Good CMS's setting it thoughtfully, even if not always perfect, is much more useful." The current Google documentation says, "Google uses the lastmod value if it's consistently and verifiably (for example by comparing to the last modification of the page) accurate." And a recent study at Bing (also covered in episode #146) revealed that, among websites with at least one URL indexed by Bing:

58% of hosts have at least one XML sitemap (sitemap known by Bing)
84% of these sitemaps have a lastmod attribute set
79% have lastmod values correctly set
18% have lastmod values not correctly set
3% have lastmod values for only some of the URLs
42% of hosts don't have an XML sitemap that Bing knows about

P.S: Don't be the business that skips the basics and the easy-to-do stuff while looking to do advanced stuff. #DoTheBasics first.

7. Google: Don't Combine Site Moves With Other Big Changes - Sometimes businesses change their top-level domain at the same time as updating their website. During a recent Search Off the Record podcast with Gary Illyes and Senior Technical Writer Lizzi Sassman, Google Search Advocate John Mueller asked, "What happens if I do a domain change, and move from a '.ch', which is a Swiss top level domain, to '.com'? Is that a problem? Like if I combine a domain change with other stuff?" In response, Illyes shared that these changes should be done in smaller pieces over months; making too many changes at once could result in lower rankings and lost traffic. For example, if a website is moving from "example.ch" and "example.fr" to "example.com," Illyes recommended moving "example.fr" first and waiting before moving "example.ch." Mueller and Sassman questioned Illyes on why he's so concerned about spreading out site moves. Illyes admitted that many site moves he's been involved with have resulted in lost traffic. He also mentioned that misconfigurations, such as incorrect redirects, are common mistakes that can cause traffic loss. However, traffic shouldn't be lost during a domain change if everything is done correctly. If all you're doing is redirecting URLs from one site to another, there's a low risk of adverse effects. On the other hand, if you do lose rankings and traffic, there's no specific timeframe for a full recovery.

8. Google's Gary Illyes: Google Does Not Care Who Authors Or Links To The Content - Gary Illyes from Google gave a keynote and a Q&A session at PubCon, and while the keynote was pretty vanilla stuff, the Q&A reconfirmed a lot of what has been said in the past about authorship, links, and disavowing links.
In short, Google does not give much weight to who writes your content. If you get a Walt Mossberg to write a piece of content for your site, the fact that Walt wrote it doesn't by itself make it rank well; if the content is written well, it will rank well. Gary also said that links are not as important as SEOs think they are, and that disavowing links is just a waste of time. P.S: All these topics have been covered in past shows.

9. Google: PageRank Sculpting Is A Myth - Every website is assigned a value by Google's PageRank algorithm. This value, also called PageRank, has long been an important factor in link building and link exchange. PageRank sculpting is a technique that attempts to steer a page's PageRank toward chosen subpages. Assuming the home page receives the highest PageRank because it is the most important page in the site's hierarchy, PageRank decreases as you go further down into the structure. Before 2009, it was common practice to control PageRank through sculpting so that only certain pages would benefit. For example, utility pages such as the imprint or contact page were linked internally with the "nofollow" attribute, so that the link power (as measured by PageRank) increased for the remaining internal links. Unfortunately, some SEO experts still believe they can control how Google passes link equity throughout a site by using the nofollow link attribute. Google's John Mueller said on Twitter that it is an SEO myth that you can use the nofollow attribute on links to sculpt PageRank. Remember, back in 2019 he tweeted that internal PageRank sculpting is a waste of time. Another #SEOMythBusted. I'll file this under #AvoidBadSEOAdvice.

10. Check Domain Reputation Before You Buy A Domain - Google's John Mueller was asked about a domain name that was purchased several months ago but still does not rank well in Google Search.
John explained that the domain has a "long and complicated history," adding, "It's going to be hard to convince search engines that it's something very different & unrelated to what was done in the past decades." In short, he is saying that not only was this domain abusing search engines for a long, long time, but also that the new content on this old domain is not different or unrelated enough from the previous topic for the search engine to consider it a brand new site and wipe the slate clean. The issue here is a "domain legacy penalty": a penalty associated with a domain from when it was registered by someone else in the past. The penalty remains after the domain is registered by someone else years later, which makes sense, or else bad actors would keep transferring domain ownership to bypass penalties. The way to check the past history of a domain name is to visit Archive.org, which downloads and archives websites throughout the Internet.

A similar issue happened a few years ago to ZDNet. One of their domains was hyphenated (CXO-Talk.com), so they purchased the non-hyphenated variant (CXOTalk.com) from a third-party domain auction, unaware that the domain had been used by spammers. Soon after ZDNet migrated all their content from CXO-Talk.com to CXOTalk.com, their website was banned from Google. ZDNet wrote an article about what happened to them, with the following advice:

Before purchasing any domain at auction, be sure to check its history using backlink tools
If the domain has a bad history, use Google Webmaster Tools to do a clean-up before putting the domain into service
Google's system of problem remediation lacks transparency and responsiveness. They can and should do better. I still don't really know what caused the problem or how to fix it.

11. Should You Rewrite Your Content With ChatGPT?
- Google's John Mueller went back and forth on Twitter with some SEO practitioners on the topic of using ChatGPT to (re)write existing content. Basically, Ujesh was wondering if he could rewrite his own content with the help of tools like #ChatGPT without losing its helpfulness and relevancy. He was curious whether AI involvement would reduce the quality of the article, or whether the quality revamp would boost it. To that question John asked, "Why do you need to rewrite your own content? Is it bad?" IMO, this is a fair question. Paulo replied, "let's say that English is not my main language. Then, I write something in my mother tongue, translate it in my own limited vocabulary, and ask AI to enhance the vocabulary. The content is not bad, but limited by my knowledge of a language, not the topic I'm trying to cover." And John responded, "Why do you want to just publish something for the sake of publishing something, rather than publishing something you know to be useful & good? (This is not unique to LLM/AI NLG, it's the same with unknown-quality human-written content.) What do you want your site known for?" John is saying that if your content is bad, why are you writing it in the first place? If you know your content is bad, then it is not helpful; will ChatGPT make it helpful? How do you know the ChatGPT version is helpful and high quality if the content you originally wrote is not? Maybe instead of using ChatGPT to improve the quality of your content, you should focus on topics you can write quality content about.
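The lastmod advice in items 5 and 6 above can be sketched in code. This is a hypothetical generator of my own (the URLs and dates are made up): it writes each URL's own last-modified date into the sitemap, rather than stamping every entry with the time the sitemap was generated, which is exactly the CMS mistake Canel and Mueller describe.

```python
from datetime import date

# Hypothetical sketch: build a sitemap whose <lastmod> reflects each page's
# actual last significant change, not the sitemap's generation time.
pages = [
    ("https://example.com/", date(2023, 2, 27)),           # made-up dates
    ("https://example.com/blog/post", date(2022, 11, 3)),
]

def build_sitemap(entries):
    """Render (url, last_modified_date) pairs as sitemap XML."""
    urls = []
    for loc, last_modified in entries:
        urls.append(
            "  <url>\n"
            f"    <loc>{loc}</loc>\n"
            f"    <lastmod>{last_modified.isoformat()}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(urls)
        + "\n</urlset>"
    )

print(build_sitemap(pages))
```

The point of the sketch is the data flow: each entry carries its own date, so regenerating the sitemap tomorrow leaves every lastmod unchanged unless a page actually changed.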

#TWIMshow - This Week in Marketing
[Ep145] - Google: When Fake URLs Are Generated By Your Competitor


Play Episode Listen Later Jan 30, 2023 16:13


Get up to speed with the Digital Marketing News and Updates from the week of Jan 23 - 27, 2023.

1. Key Takeaways From Microsoft FY23 Q2 Earnings - Microsoft Corp. announced its financial results for the quarter ended December 31, 2022, which showed revenue up 2% to $52.7 billion. LinkedIn revenue increased 10%, driven by Talent Solutions, and the platform has once again seen "record levels" of in-app engagement in the most recent quarter, reporting 18% growth in total user sessions. Per LinkedIn: "We once again saw record engagement among our more than 900 million members. Three members are signing up every second. Over eighty percent of these members are from outside the United States." Hey LinkedIn, how many of these accounts are fake or bot accounts? LinkedIn has warned, however, that this number will likely decline in 2023 due to a broader slowdown in hiring, particularly in the tech sector, where many of LinkedIn's job postings stem from. Total ad revenue increased 10%. Microsoft also announced its plans to increase its ad revenue from $10 billion annually to $20 billion. If achieved, that would make Microsoft the sixth-largest digital ad seller worldwide. Remember, Microsoft has a partnership with Netflix and allows advertisers to purchase ads through its demand-side platform Xandr. Microsoft will take a reseller fee, and experts predict that the partnership will be a huge revenue driver, easily clearing $10 billion in ad sales or more. You can read the full earnings statement from Microsoft here.

2. LinkedIn Trying To Boost Newsletter Discovery By Showcasing Which Newsletters A Profile User Has Subscribed To - LinkedIn is looking to make it easier to find relevant newsletters in your niche by adding a new option that will enable members to view what newsletters another member is subscribed to in the app.
Per LinkedIn: "We've heard from members that newsletters on LinkedIn are a great way to gather new insights and ideas on professional topics that they care about. We've also heard that members are looking for better ways to discover even more newsletters that would be relevant to them. To aid in this discovery, we are making newsletter subscriptions visible to others, including on profiles. Starting February 11th, 2023, you'll be able to see which newsletters members find value in, the same way you can see your shared interests, pages and groups." IMO, this is a double-edged sword unless LinkedIn gives me a way to control which subscriptions are revealed publicly vs. kept private. On the other hand, creators and influencers can charge $$ to subscribe to a newsletter and promote it for a fee. Though I would not do this myself, to each his own.

3. Twitter Launches "Search Keyword Ads" - Twitter has introduced a new ad unit called "Search Keywords Ads", which allows advertisers to pay for their tweets to appear at the top of search results for specific keywords. Search Keywords Ads are similar to promoted tweets, but with the added benefit of appearing in search results. This will allow advertisers to reach a wider audience: users searching for specific keywords will now be exposed to sponsored tweets, and targeting users actively searching for specific keywords provides a more accurate signal of user intent. Advertisers can find Search Keywords Ads as a new campaign objective within the Twitter Ads interface. My question: How long before Instagram copies this?

4. Google Optimize Discontinued. Now What? - If you have not heard of Google Optimize before today, I do not blame you. It is a nifty analytics and testing tool from Google (formerly called Google Website Optimizer).
It allows you to run experiments aimed at helping online marketers and webmasters increase visitor conversion rates and overall visitor satisfaction. And now Google has decided to discontinue this service on September 30, 2023. "Google Optimize and Optimize 360 will no longer be available after September 30, 2023. Your experiments and personalizations can continue to run until that date," Google wrote. Google added, "We launched Google Optimize over 5 years ago to enable businesses of all sizes to easily test and improve your user experiences. We remain committed to enabling businesses of all sizes to improve your user experiences and are investing in A/B testing in Google Analytics 4." I am unhappy about this announcement because Optimize worked seamlessly with GA, and its departure will leave a significant gap in the market for affordable and beginner-friendly A/B testing options. However, I'm hopeful that Google will integrate some of these features into GA4.

5. Google Ads Now Supports Account-Level Negative Keywords - Creating a list of negative keywords allows you to block your ads from showing for specific terms that are irrelevant to your brand, making it easier for your ads to reach your desired audience and resulting in more successful conversions. To prevent unwanted impressions or clicks from certain search terms across multiple campaigns, advertisers can now create a negative keyword list at the account level and then apply it to relevant campaigns. This will save you the effort of adding the same negative keywords to individual campaigns and make it easier to manage future changes to negative keywords across campaigns. But be forewarned: a limit of 1,000 negative keywords applies per account. So do not go crazy! You can read the full announcement from Google here.

6. US Justice Department Sues Google Again, Wants To Dismantle Its Ad Division - The U.S. Department of Justice (DOJ) officially filed an antitrust lawsuit against Google on January 24.
The DOJ alleges Google has a monopoly on the current digital advertising ecosystem. Eight states so far have joined forces with the DOJ on the lawsuit: Virginia, California, Colorado, Connecticut, New Jersey, New York, Rhode Island, and Tennessee. Remember that this lawsuit is separate from the first DOJ lawsuit against Google, filed back in 2020. In the 153-page document, the DOJ argues that Google has created an advertising environment that unfairly favors its Alphabet-owned products. Here is what the DOJ wrote: "Google, a single company with pervasive conflicts of interest, now controls: (1) the technology used by nearly every major website publisher to offer advertising space for sale; (2) the leading tools used by advertisers to buy that advertising space; and (3) the largest ad exchange that matches publishers with advertisers each time that ad space is sold. Google abuses its monopoly power to disadvantage website publishers and advertisers who dare to use competing ad tech products in a search for higher quality, or lower cost, matches. Google uses its dominion over digital advertising technology to funnel more transactions to its own ad tech products where it extracts inflated fees to line its own pockets at the expense of the advertisers and publishers it purportedly serves." If Google is found guilty in this lawsuit, it could have an impact on the broader advertising sector. You can read the full lawsuit document here.

7. Two Important Elements For Google Discover Follow Feed - Google updated its Google Discover feed guidelines and wrote that the "most important content for the Follow feature is your feed title element and your per item link elements." In addition to that, make sure that your feed (aka RSS feed) is up to date, like you would for your sitemap. Remember that the Google Discover Follow feed feature offers relevant content to Chrome Android users and represents an important source of traffic that is matched to user interests.
It is one way to capture a steady stream of traffic apart from Google News and Google Search.

8. Google: Don't Use Relative Paths In Your rel-canonical - A canonical URL lets you tell search engines that certain similar URLs are actually the same, because sometimes you have products or content that can be found on multiple URLs - or even multiple websites. Using canonical URLs (HTML link tags with the attribute rel=canonical), you can have these on your site without harming your rankings. Gary Illyes from the Google Search Relations team posted another PSA on LinkedIn; this one says "don't use relative paths in your rel-canonical." Gary wants you to use the full, absolute URL when it comes to rel-canonical. This is not new advice; Google said this in 2018 and in 2013. This is such a common mistake that Google's John Mueller wrote, "One of the problems is that it's relative to where the content was found. www or non-www? http or https? staging subdomain? staging domain? random other domain that's hosted by the same company? If you want to pick something specific, it's good to be specific." If the terms canonical, relative path, or absolute path have made you dizzy, then perhaps you should seek the advice or help of an expert.

9. Google: When Fake URLs Are Generated By Your Competitor - Mike Blazer asked John, "Bulk generate non-existing URLs on a competitor's site that lead to 5XX server errors when opened. Googlebot sees that a substantial number of pages on that domain return 5XX, the server is unable to handle requests. Google reduces the page #crawl frequency for that domain." John Mueller from Google said that bulk-generating fake URLs for a competitor's site should not lead to negative SEO and ranking issues for that site. "This is not something I'd worry about," he added. P.S: The audio version of the show contains my analysis and opinion on this topic.
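Gary's warning in item 8 is easy to demonstrate: a relative canonical path resolves against whichever URL variant the crawler happened to fetch, so each variant points somewhere different. A minimal sketch with hypothetical URLs:

```python
from urllib.parse import urljoin

# A relative rel-canonical like "/widgets" resolves against the URL the
# page was fetched from, so every protocol/host variant of the same page
# ends up "canonicalizing" to a different target.
relative_canonical = "/widgets"
variants = [
    "https://www.example.com/widgets?ref=1",
    "https://example.com/widgets?ref=1",
    "http://staging.example.com/widgets?ref=1",
]
for fetched_from in variants:
    print(urljoin(fetched_from, relative_canonical))
# -> three different canonical targets, one per variant

# An absolute URL removes the ambiguity no matter which variant was fetched:
absolute_canonical = "https://www.example.com/widgets"
```

That is exactly the www/non-www, http/https, staging-subdomain ambiguity Mueller describes, and why the full absolute URL is the safe choice.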

#TWIMshow - This Week in Marketing
[Ep139] - Google Adds ‘Experience' To Search Quality Raters Guidelines. EAT Becomes EEAT.


Play Episode Listen Later Dec 19, 2022 25:15


1. Reddit Launches Reddit for Business Website - Reddit has launched a Reddit for Business website, which provides a heap of insights, tips, and notes for prospective Reddit ad buyers, which could help them with their paid Reddit marketing. The mini-site includes usage stats, tips, case studies and educational resources to help guide you on your Reddit ads journey. In the announcement, Reddit wrote that: "90% of users trust Reddit to learn about new products and brands. For context, that's higher than Google, Amazon, Twitter, Instagram, and Snapchat. The communities' trust turns into action. Redditors tend to be more informed consumers, more valuable buyers, and they go hard for their favorite brands. They're also more likely to recommend the products they love - which never hurts." If you're thinking about Reddit ads, it's definitely worth a look, with the case studies, in particular, acting as a guide book of sorts for effective, creative Reddit marketing. Listen to the podcast to hear how I feel about Reddit Ads.

2. Instagram 'Group Profiles' - This week, Instagram updated its group chat function with two new features: 'Group Profiles' and 'Collaborative Collections'. Group Profiles are effectively shared profiles where you can post content for the entertainment of the group (visible only to group members, not shared with your followers), while Collaborative Collections will enable you to save posts to a private collection within your group chat. To me, this is just the beginning for IG Groups, since Meta (parent company of Facebook) has realized that a lot of IG users do not use Facebook, and it therefore needs a way to keep them on the platform longer.

3. GA4 Now Has A Built-In Landing Page Report - A landing page is the first page someone lands on when they visit your website. Google Analytics 4 (GA4) is rolling out out-of-the-box functionality that allows users to generate landing page reports.
Before this update, which is being rolled out to users gradually, landing page reports had to be manually constructed. This was a confusing process which required multiple clicks. Use the report to evaluate the effectiveness of your landing pages. The default, automated report includes metrics for Views, New Users, Average Engagement Time per Session, Conversions, and Total Revenue. However, these fields can be modified and customized to business needs.

4. Google Launches Search Status Dashboard - Google has launched a status dashboard for Google Search at status.search.google.com. The Google Search status dashboard will show you if there is an outage or issue with Google Search. Specifically, Google will confirm if there is an issue with crawling, indexing, or serving in Google Search. Google wrote, "as we head into 2023, we want to introduce another tool for the public to understand the most current status of systems which impact Search—crawling, indexing, and serving… while system disruptions are extremely rare, we want to be transparent when they do happen."

5. Google Ads Introduces Simulation Tool For Data-Driven Attribution - Before making a purchase or completing another valuable action on your website, people may click or interact with several of your ads. Typically, all credit for the conversion is given to the last ad customers interacted with. But was it really that ad that made them decide to choose your business? Data-driven attribution gives credit for conversions based on how people engage with your various ads and decide to become your customer. It uses data from your account to determine which keywords, ads, and campaigns have the greatest impact on your business goals. Google has already made it clear that data-driven attribution is its preferred model for advertisers.
To convince advertisers of the benefits of using data-driven attribution, Google is rolling out a simulation tool for eligible advertisers that will allow them to see how automated bidding would have reacted to data-driven attribution over the last seven days. P.S: Google is also working to bring data-driven attribution to more advertisers and more ad types. Historically, data-driven attribution has supported Search, Shopping, Display and YouTube ads. Google is expanding its support to app conversions and will begin supporting Discovery formats (including those in Performance Max) next year. Google claims that "Advertisers who switch to data-driven attribution from another attribution model typically see a 6% average increase in conversions." You can read the announcement from Google here.

6. Google Ads Introduces Four New Features To The Insights Page - Google has introduced four new features to the Insights page in Google Ads: search terms insights, asset insights, audience insights, and change history insights. Search terms insights let you see what your customers are searching for without having to examine every individual search term, Google explained. By grouping these terms into broader, intent-based categories, you can easily identify which themes are most popular with your customers along with metrics like conversion performance, search volume and search volume growth. Asset insights help you learn about the creative assets that "resonate with your potential customers," Google wrote. This is an easy way to identify highly-engaged audiences, and can be used to inform the assets and landing pages you create for these groups, or even shift your entire marketing and product strategy.
Using our previous sporting goods retailer example, you might see that bicycle-related image and video creative resonates with "Green Living Enthusiasts." Based on this insight, you can pair these assets with a landing page that prominently features your sustainability efforts, or go one step further and use the insight to build out a completely new product line, Google added. Audience insights were recently added at the account level in Google Ads. This insights page helps you understand the characteristics of the people who engaged with your business. These insights give you a better idea of what your customers care about, making it easier to tune your creative for their unique interests and traits. Change history insights help you identify how changes you've made in your account may impact performance. When there's a significant shift in your campaign's key metrics, this insight can help you find out which changes you made that may have caused this shift in performance, and then figure out how to proceed.

7. Google Rolls Out December 2022 Link Spam Update - Google has rolled out a December 2022 link spam update. In the announcement, Google wrote, "Our launch today, which we refer to as the December 2022 link spam update, will take about two weeks to fully roll out. Ranking may change as spammy links are neutralized and any credit passed by these unnatural links are lost. This launch will affect all languages." This update leverages Google's SpamBrain, the AI-based spam prevention system that the search company launched in 2018.
SpamBrain is credited by Google with catching about six times more spam sites in 2021 than it did in 2020, reducing hacked spam by 70% and reducing gibberish spam on hosted platforms by 75%. In the announcement, Google wrote that SpamBrain can not only "detect spam directly, it can now detect both sites buying links, and sites used for the purpose of passing outgoing links." To avoid getting into trouble, follow the advice from July 2021: "Site owners should make sure that they are following the best practices on links, both incoming and outgoing. Focusing on producing high quality content and improving user experience always wins out compared to manipulating links. Promote awareness of your site using appropriately tagged links, and monetize it with properly tagged affiliate links."

8. Google Recommends You Stop Chasing Expired & Repurposed Domains - In the SEO world, expired domains are a big thing. So much so that there are forums and secondary marketplaces to buy and sell expired domains with good Domain Authority (DA). Google's John Mueller said that expired domains and repurposed domains are "SEO-flotsam, index-cruft," and that search engines have had a lot of practice with dealing with that appropriately. John went on to add that you can "build a new site if you want," but he said, "don't assume there's old & tasty SEO-juice to lick up just by placing a site on an old domain." Back in Jan 2021, Aaseesh from the Google Webmaster Trends Analysts team said in a Google Webmaster Help thread that "Google understands when domains change ownership so it won't necessarily rank for the queries it used to rank pre change of ownership. So if the sole purpose of buying a domain is to get search traffic from the old domain, I would suggest against doing so since there's no benefit." There you go, another SEO myth busted.

9. Google: Diverse Content Can Also Rank If You Just Write For Your Readers - Danny Sullivan, the Google Search Liaison, said on Mastodon that your content does not need to be on one topic, one niche topic, for it to rank well. You can write about diverse topics on your site. The main thing, Danny said, is that you should write content however you want for your people and not for search engines.

10. Google On Why Your Page Pops On/Off The Search Index - When @seowolf asked John Mueller what could be the reason for his page to be shown in search results at a low position and then not show up at all, John responded: "That can happen, and probably isn't particularly uncommon. Things can pop into the index for a bit, and then pop out again. When things are on the edge and the system has to make a yes/no decision, sometimes it goes one way for a while, and then the other way again. The web is noisy, I think that's fine." Last year, John talked about the "edge of indexing" - your pages are on the threshold of having enough quality to be in the Google index and not enough quality to be in the Google index. If you're teetering on the edge of indexing, there's always fluctuation. It basically means you need to convince Google that it's worthwhile to index more. What John is sharing aligns with what Gary Illyes of Google shared about being on the edge of quality: if your pages disappear from Google Search and then reappear when you manually submit them to Google, that means you're on that edge. In fact, John Mueller of Google also said that if you need to submit pages to Google manually, it is likely a sign of a quality issue with those pages.

11. Google Adds 'Experience' To Search Quality Raters Guidelines. EAT Becomes EEAT. - In addition to AI-based ranking, Google also leverages third-party Search Quality Raters (spread out around the world and highly trained using the Search Quality Raters Guidelines) to evaluate and rate websites.
So the Search Quality Raters Guidelines are considered the Google Search Bible. If you want to rank higher and better, then you have to read through the guidelines and make sure you are following them. Now Google has updated the search quality raters guidelines and added a new E to E-A-T. Google is adding Experience on top of Expertise, Authoritativeness, and Trustworthiness. Google said that trust is the most important of all four elements here. What is Google looking for with experience? Google said when you write the content, does that "content also demonstrate that it was produced with some degree of experience, such as with actual use of a product, having actually visited a place or communicating what a person experienced?" Google explained that there are "some situations where really what you value most is content produced by someone who has first-hand, life experience on the topic at hand." Google shared this example: "if you're looking for information on how to correctly fill out your tax returns, that's probably a situation where you want to see content produced by an expert in the field of accounting. But if you're looking for reviews of a tax preparation software, you might be looking for a different kind of information—maybe it's a forum discussion from people who have experience with different services." Google said in the updated guidelines that Experience, Expertise and Authoritativeness are important concepts that can support your assessment of trust, with trust being the most important member of E-E-A-T: "trust is the most important member of the E-E-A-T family because untrustworthy pages have low E-E-A-T no matter how Experienced, Expert, or Authoritative they may seem." But how does "experience" differ from "expertise"? Google said, "pages that share first-hand life experience on clear 'Your Money or Your Life' (YMYL - pages or topics that could potentially impact a person's future happiness, health, financial stability, or safety) topics may be considered to have high E-E-A-T as long as the content is trustworthy, safe, and consistent with well-established expert consensus. In contrast, some types of YMYL information and advice must come from experts." The latest (Dec 15, 2022) version of the search quality raters guidelines is 176 pages long and can be found here. If reading this update is making you nauseated, then reach out to us for a complimentary consultation.

#TWIMshow - This Week in Marketing
[Ep137] - Google On NoIndex Pages + Crawl Budget


Play Episode Listen Later Dec 5, 2022 16:08


1. LinkedIn App Now Allows You To Schedule Posts - According to LinkedIn: "We're starting to roll out post scheduling on desktop and Android so that our creators can easily plan the content they want to share next, with iOS coming soon. This means you can schedule text posts, videos, and images up to three months in advance."

2. Pinterest Stops Creator Rewards Program - Pinterest has announced that as of Nov 30th, it is ending its Creator Rewards program, which offered cash bonuses when creators completed goals such as hitting certain engagement metrics.

3. New Ad Targeting Options In Twitter Ads - Twitter announced some new ad targeting options, which look fairly similar to its existing ad goals, but with some important differences. The first update is within its 'Conversions' objective, with advertisers now able to focus their promotions onto users that are more likely to take specific actions in response. Per Twitter: "Website Conversions Optimization (WCO) is a major rebuild of our conversion goal that will improve the way advertisers reach customers who are most likely to convert on a lower-funnel website action (e.g. add-to-cart, purchase). Our user-level algorithms will then target with greater relevance, reaching people most likely to meet your specific goal - at 25% lower cost-per-conversion on average, per initial testing." Twitter has also launched its 'Dynamic Product Ads', which enable advertisers "to showcase the most relevant product to the right customer at the right time". Finally, Twitter is also launching its updated Collection Ads format, which enables advertisers to share a primary hero image along with smaller thumbnail images below it. "The primary image remains static while people can browse through the thumbnails via horizontal scroll. When tapped, each image can drive consumers to a different landing page." You can read more about Twitter's latest ad updates here.

4. TikTok Announces Fall Semester Curriculum Of Its Creative Agency Partnerships (CAP) University - TikTok has announced the Fall Semester curriculum of its Creative Agency Partnerships (CAP) University program, which aims to "teach agency creatives how to show up on the platform". CAP University aims to provide in-depth training and insight for marketing and ad partners, to help them maximize their use of the platform for their clients' promotions. The initiative was first launched back in April with an initial course run, but now TikTok has updated its lesson plan for the next phase. The most significant new addition is 'Content to Cart', which explores the potential of eCommerce in the app via its evolving set of product and shopping showcase tools. You can learn more about CAP University's Fall Semester curriculum here.

5. Two New Metrics In GA4 Reports - Google has added two new metrics to GA4 properties: views per session and average session duration. Views per session tracks the number of app screens or web pages people look at during a single visit, while average session duration measures the time users spend on the website.

6. Per Google, HTTP/3 Doesn't Impact SEO - HTTP/3 is a new standard in development that will affect how web browsers and servers communicate, with significant upgrades for user experience, including performance, reliability, and security. Google Search Advocate John Mueller has now debunked theories that HTTP/3 can directly impact a website's SEO. According to Mueller: "Google doesn't use HTTP/3 as a factor in ranking at the moment. As far as I know, we don't use it in crawling either. In terms of performance, I suspect the gains users see from using HTTP/3 would not be enough to significantly affect the core web vitals, which are the metrics that we use in the page experience ranking factor. While making a faster server is always a good idea, I doubt you'd see a direct connection with SEO only from using HTTP/3.
Similar to how you'd be hard pressed to finding a direct connection to using a faster kind of RAM in your servers."

7. Google On "SEO Compliance Score" - Google's Search Liaison, Danny Sullivan, said (on Mastodon) there is no such thing as an "SEO compliance score" from Google. This was in response to user @bertran making a claim that Google has an SEO compliance score. Just goes to show how many made-up things exist in the SEO world. Furthermore, Danny shared that the "Search Essentials is covering things we've long said: 1) avoid technical errors that prevent indexing content. 2) don't spam. 3) consider some best practices about producing content meant for humans."

8. Google Has Algorithms To Detect & Demote AI-Altered Plagiarized Content - There are "gurus" peddling AI tools that can (re)generate content that will supposedly rank you on top. However, Duy Nguyen from Google's search quality team said that Google has "algorithms to go after" those who post AI-plagiarized content, and those algorithms can "demote site scraping content from other sites." Duy Nguyen said, "Scraping content, even with some modification, is against our spam policy. We have many algorithms to go after such behaviors and demote site scraping content from other sites."

9. Google's POV On Long & Short Content - During a recent Google SEO office hours, the topic of long and short content came up. According to Gary Illyes, Google is not more or less likely to crawl or index shorter or more niche content than other types of content. The fact that the content is shorter and perhaps easier to crawl does not guarantee that Google will index it more quickly or more favorably than lengthy stuff. "Niche content can also be indexed. It's not in any way penalized, but generally content, that's popular on the internet, for example, many people linked to it, gets crawled and indexed easier," Gary added. Then another user asked if splitting a long article into multiple interlinked pages results in thin content. But first, what is thin content? Some people think that thin content means a webpage with not much content on it. But in reality, thin content is essentially when a site's text, info, or visual elements are not relevant to the visitor's intent or do not provide them with what they are looking for. Thin pages are characterized by a lack of originality, a tiny difference from other pages, and/or a lack of unique added value. For example, a product page that a retailer copies from the manufacturer's site with nothing additional added to it. Doorway pages are also considered a form of thin content, because oftentimes these web pages are designed to rank for specific keywords - for example, a page created to rank for a keyword phrase plus different city names, where all the pages are virtually the same except for the names of the cities. When a user asked, "Would it be considered thin content if an article covering a lengthy topic was broken down into smaller articles and interlinked?", Lizzi Sassman from Google answered: "Well, it's hard to know without looking at that content. But word count alone is not indicative of thin content. These are two perfectly legitimate approaches: it can be good to have a thorough article that deeply explores a topic, and it can be equally just as good to break it up into easier to understand topics. It really depends on the topic and the content on that page, and you know your audience best. So I would focus on what's most helpful to your users and that you're providing sufficient value on each page for whatever the topic might be." However, pagination (splitting one lengthy topic across multiple interlinked pages, which requires site visitors to click to the next page to keep reading the content) is fine. Google Search Central has a page about pagination best practices.

10. Google: Stop Wasting Your Time By Disavowing Random Links Flagged By Disavow Tools - In SEO, to disavow means to notify Google to ignore harmful or low-quality links that point to your site and are beyond your control. Third-party tools use proprietary algorithms to score backlinks according to how spammy or toxic the tool company feels they are. According to Google Search Advocate John Mueller, "disavowing random links that look weird or that some tool has flagged, is not a good use of your time. It changes nothing. Use the disavow tool for situations where you actually paid for links and can't get them removed afterwards." Obviously this raises the question: how do you know if the links were paid for, especially when you had hired an agency/consultant for a backlink project? My thoughts are in the show recording.

11. What Google Thinks Of Backlinks & Rankings - So what are backlinks? A backlink in SEO is a link from another website to yours, so that when clicked, the user is redirected to your website. During a recent Google SEO office hours, Duy Nguyen from Google's search quality team said that "backlinks as a signal has a lot less significant impact compared to when Google Search first started out many years ago. We have robust ranking signals, hundreds of them, to make sure that we are able to rank the most relevant and useful results for all queries." According to Duy, it is a waste of time and effort to generate backlinks. Here is what he said: "link building campaigns, which are essentially link spam according to our spam policy. We have many algorithms capable of detecting unnatural links at scale and nullifying them.
This means that spammers or SEOs spending money on links truly have no way of knowing if the money they spent on link building is actually worth it or not, since it's really likely that they're just wasting money building all these spammy links and they were already nullified by our systems as soon as we see them."

12. Google On NoIndex Pages + Crawl Budget - Crawl budget is the number of pages Googlebot crawls and indexes on a website within a given timeframe. The vast majority of sites don't need to worry about crawl budget unless you run a very big site, have added a large batch of pages, or have lots of redirects. On 2 December 2022, Lizzi Sassman from Google updated the crawl budget management help document to dispel two myths around crawl budget:

"1. Any URL that is crawled affects crawl budget, and Google has to crawl the page in order to find the noindex rule. However, noindex is there to help you keep things out of the index. If you want to ensure that those pages don't end up in Google's index, continue using noindex and don't worry about crawl budget. It's also important to note that if you remove URLs from Google's index with noindex or otherwise, Googlebot can focus on other URLs on your site, which means noindex can indirectly free up some crawl budget for your site in the long run.

2. Pages that serve 4xx HTTP status codes (except 429) don't waste crawl budget. Google attempted to crawl the page, but received a status code and no other content."

Coincidentally, during the latest SEO office hour, someone asked whether a large number of noindex pages adversely impacts a website's crawling or indexing, whether there is a ratio of indexed to noindexed pages to be mindful of, and whether to worry about noindex pages linked from spammy sites. Here are the official responses from Google to each question:

"Noindex is a very powerful tool that search engines support to help you, the site owner, keep content out of their indexes. For this reason, it doesn't carry any unintended effects when it comes to crawling and indexing. For example, having many pages with noindex will not influence how Google crawls and indexes your site."

"No, there is no magic ratio to watch out for. Also, for a site that's not gigantic, with less than a million pages, perhaps, you really don't need to worry about the crawl budget of your website. It's fine to remove unnecessary internal links, but for small to medium-sized sites, that's more of a site hygiene topic than an SEO one."

"Noindex is there to help you keep things out of the index, and it doesn't come with unintended negative effects, as we said previously. If you want to ensure that those pages, or their URLs more specifically, don't end up in Google's index, continue using noindex and don't worry about crawl budget."
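The crawl budget myth above turns on one mechanical point: a crawler can only discover a noindex rule by fetching the page and parsing its markup. A minimal sketch of that discovery step is below; the `RobotsMetaParser` class and `has_noindex` helper are illustrative names for this sketch, not Google's actual implementation.

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots" content="..."> tags."""

    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        # Attribute values may be missing entirely, so default to "".
        if (attrs.get("name") or "").lower() == "robots":
            for token in (attrs.get("content") or "").split(","):
                self.directives.add(token.strip().lower())


def has_noindex(html: str) -> bool:
    """Return True if the fetched HTML carries a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives


page = '<html><head><meta name="robots" content="noindex, nofollow"></head><body>Hi</body></html>'
print(has_noindex(page))  # True
```

A real crawler would also honor the X-Robots-Tag HTTP response header, which can carry the same directives without any HTML at all.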

#TWIMshow - This Week in Marketing
[Ep136] - Google Publishes A Guide To Current & Retired Ranking Systems

#TWIMshow - This Week in Marketing

Play Episode Listen Later Nov 28, 2022 10:40


1. Google Introduces Policy Circumvention - Google has added a new spam policy to its search spam policies: "policy circumvention." In short, if you take any action to bypass Google Search's other spam or content policies, such as creating new sites or using other sites or methods to distribute that content, maybe on third-party sites or other avenues, then Google will restrict or remove the content from showing up in search. Here is what Google wrote: "If you engage in actions intended to bypass our spam or content policies for Google Search, undermine restrictions placed on content, a site, or an account, or otherwise continue to distribute content that has been removed or made ineligible from surfacing, we may take appropriate action which could include restricting or removing eligibility for some of our search features (for example, Top Stories, Discover). Circumvention includes but is not limited to creating or using multiple sites or other methods intended to distribute content or engage in a behavior that was previously prohibited"

2. Google's Advice On When You Should Move Your Blog To A Subdomain - John Mueller of Google recently shared his advice on when you should put a blog on a subdomain. John said he would move a blog to a subdomain rather than keep it on the main www site when the content can live on its own: "my way of thinking with regards to subdomains is that it depends on what you're trying to do. Is it content that's meant to be tightly connected to the main site? Then put it on the main site. If you want the content to stand on its own, then a subdomain is a good match." He also shared that there are technical considerations to think about outside of SEO: "There's also the technical side-effect of subdomains sometimes making things a bit more complicated: verification in search console, tracking in analytics, DNS, hosting, security, CSPs, etc." Lastly, John added, "To be clear, I think it will affect rankings of the new content, but ultimately it depends on what you want to achieve with it. Sometimes you want something separated out, sometimes you want to see something as a part of the main site. These are different situations, and the results will differ."

3. Google: 60% Of The Internet Is Duplicate & Google Prefers https - Gary Illyes from Google shared during Google Search Central Live in Singapore that 60% of the content on the internet is duplicate. To find duplicates, Google compares checksums generated from the main content; if the checksums match, the content is duplicate. Lastly, Gary mentioned that Google will always pick an https URL over http. Ensure that you have https on your website, and focus on producing something far more unique and useful than most of what is already out on the internet.

4. Google Re-confirms That E-A-T Applies To Every Single Search Query - During the recent SMX Next event, Hyung-Jin Kim, the Vice President of Google Search (who has been working on search quality for the past 20 years and leads core ranking at Google Search), reconfirmed that E-A-T is used in every single query and is applied to everything Google Search does. "E-A-T is a core part of our metrics," he added, explaining that it is there to "ensure the content that people consume is not going to be harmful and it is going to be useful to the user." Here is the transcript of what he said: "E-A-T is a core part of our metrics and it stands for expertise, authoritativeness and trustworthiness. This has not always been there in Google, and it is something we have developed about 10 to 12 to 13 years ago. And it is really there to make sure that, along the lines of what we talked about earlier, the content that people consume is not going to be harmful and it is going to be useful to the user. These are principles we live by every single day. And E-A-T, that template, of how we rate an individual site based on expertise, authoritativeness and trustworthiness, we do it to every single query and every single result. So it is actually pretty pervasive throughout everything we do. I will say that YMYL queries, the your money or your life queries, such as when I am looking for a mortgage or when I am looking for the local ER, those we have a particular eye on and pay a bit more attention to, because those are some of the most important decisions people will make in their lives. So I will say that E-A-T has a bit more of an impact there, but again, E-A-T applies to everything, every single query that we have."

5. Google Publishes A Guide To Current & Retired Ranking Systems - A new guide to Google's ranking systems explains which systems Google uses to rank search results and which ones are no longer in use. The guide also introduces new terminology, distinguishing between ranking "systems" and ranking "updates." RankBrain is one example of a system that is always operating in the background, while an update describes a one-time adjustment to a ranking system. For instance, the helpful content system is always active in the background when Google returns search results, but it is subject to updates that enhance its performance; spam updates and core updates are other examples of one-time adjustments. Here is the list, in alphabetical order, of Google's ranking systems that are currently operational.
BERT: Short for Bidirectional Encoder Representations from Transformers, BERT allows Google to understand how combinations of words can express different meanings and intent.
Crisis information systems: Google has systems in place to provide specific sets of information during times of crisis, such as SOS alerts when searching for natural disasters.
Deduplication systems: Google's search systems aim to avoid serving duplicate or near-duplicate webpages.
Exact match domain system: A system that ensures Google doesn't give too much credit to websites with domain names that exactly match a query.
Freshness systems: A system designed to show fresher content for queries where it would be expected.
Helpful content system: A system designed to better ensure people see original, helpful content, rather than content made primarily to gain search engine traffic.
Link analysis systems and PageRank: Systems that determine what pages are about and which might be most helpful in response to a query based on how pages link to each other.
Local news systems: A system that surfaces local news sources when relevant to the query.
MUM: Short for Multitask Unified Model, MUM is an AI system capable of understanding and generating language. It improves featured snippet callouts and is not used for general ranking.
Neural matching: A system that helps Google understand representations of concepts in queries and pages and match them to one another.
Original content systems: A system to help ensure Google shows original content prominently in search results, including original reporting, ahead of those who merely cite it.
Removal-based demotion systems: Systems that demote websites subject to a high volume of content removal requests.
Page experience system: A system that assesses various criteria to determine if a webpage provides a good user experience.
Passage ranking system: An AI system Google uses to identify individual sections or "passages" of a web page to better understand how relevant a page is to a search.
Product reviews system: A system that rewards high-quality product reviews written by expert authors with insightful analysis and original research.
RankBrain: An AI system that helps Google understand how words are related to concepts. It allows Google to return results that don't contain the exact words used in a query.
Reliable information systems: Google has multiple systems to show reliable information, such as elevating authoritative pages, demoting low-quality content, and rewarding quality journalism.
Site diversity system: A system that prevents Google from showing more than two webpage listings from the same site in the top results.
Spam detection systems: A system that deals with content and behaviors that violate Google's spam policies.
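Gary's point in item 3, that Google finds duplicates by comparing checksums generated from the main content, can be sketched in a few lines. This is an illustration only: the whitespace/case normalization and the SHA-256 hash used here are assumptions for the sketch, and Google's real pipeline for extracting main content and computing checksums is not public.

```python
import hashlib
import re


def content_checksum(main_content: str) -> str:
    """Checksum over normalized main content: collapse whitespace and lowercase,
    so trivial formatting differences don't mask duplication."""
    normalized = re.sub(r"\s+", " ", main_content).strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


# Two pages whose main content differs only in whitespace and casing
page_a = "Widgets 101: How to choose a widget.\n\nPick the blue one."
page_b = "widgets 101:  How to choose a widget. Pick the blue one."
print(content_checksum(page_a) == content_checksum(page_b))  # True
```

When checksums match, a deduplication system still has to pick one canonical URL to serve; per the same episode, Google prefers the https version when both exist.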

Search Off the Record
How do removals work in Google Search, really?

Search Off the Record

Play Episode Listen Later Aug 4, 2022 28:57


In this episode of Search Off the Record, Lizzi Sassman, John Mueller, and Gary Illyes have an insightful conversation around how removals work in Google Search. Specifically, they chat about how Google Search operates when removals are in the mix. How do crawling, indexing, and blocking work in the backend? What really happens at each of these stages when it comes to removals? Listen to this podcast to get a behind-the-scenes look at Google Search.   Resources:  Check out our episode about blocking → https://goo.gle/3P1UWYq  Control what you share with Google → https://goo.gle/3oUyoym  In-depth guide to how Google Search works → https://goo.gle/3bx2CV2  Episode transcript → https://goo.gle/sotr043-Transcript  Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.  

Search Off the Record
How Barry Schwartz Created RustyBrick

Search Off the Record

Play Episode Listen Later Jul 14, 2022 30:28


In this episode of Search Off the Record, Lizzi Sassman and Gary Illyes sit down with Barry Schwartz - a search engine connoisseur. Specifically, they explore how Barry runs his online career, balancing RustyBrick and Search Engine Roundtable. They even talk about website development and the origin story behind RustyBrick - a web construction firm specializing in website development, web page design, SEO, and more! Check out and subscribe to Barry Schwartz's YouTube channel → https://goo.gle/3z3FCWw    Episode transcript → https://goo.gle/sotr042-Transcript    Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.

Search Off the Record
How can you block content from Google Search?

Search Off the Record

Play Episode Listen Later Jun 30, 2022 31:18


In this episode of Search Off the Record, we take a look at what it takes to block content from Google Search. Lizzi Sassman, Gary Illyes, and John Mueller chat about the role of robots.txt files and robots meta tags. What roles do they play in crawling and indexing? What other practices can you employ to remove unwanted content from Google Search? Listen to this podcast to discover some useful Search tips and tricks when it comes to barring content.  Episode transcript →  Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.

Search Off the Record
Transcript for How can you block content from Google Search?

Search Off the Record

Play Episode Listen Later Jun 30, 2022


In this episode of Search Off the Record, we take a look at what it takes to block content from Google Search. Lizzi Sassman, Gary Illyes, and John Mueller chat about the role of robots.txt files and robots meta tags. What roles do they play in crawling and indexing? What other practices can you employ to remove unwanted content from Google Search? Listen to this podcast to discover some useful Search tips and tricks when it comes to barring content.  Episode transcript →  Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.

Search Off the Record
Let's talk about UX and SEO

Search Off the Record

Play Episode Listen Later Jun 21, 2022 31:08


In this episode of Search Off the Record, Martin Splitt, Gary Illyes, and Lizzi Sassman sit down to talk about user experience and SEO. What is UX? Besides Core Web Vitals, what other things could we try to automatically infer about the user experience of a site? From interstitials to site structure, our team of Search experts sit down and chat about the relationship between UX and SEO.  Episode transcript → https://goo.gle/sotr040-Transcript    Let's talk Core Web Vitals video → https://goo.gle/3xyX0k9  Let's talk Core Web Vitals podcast → https://goo.gle/3xWcQGJ  Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.  

Search Off the Record
A spotlight on SEO and data with Michelle Robbins

Search Off the Record

Play Episode Listen Later Jun 2, 2022 32:32


In this episode of Search Off the Record, we talk with Michelle Robbins - a software engineer by trade and member of the Search Marketing community. Lizzi Sassman and Gary Illyes sit down with Michelle to discuss how she got interested in SEO and even talk about data's relation to SEO. How can data improve your SEO? Moreover, they explore common issues with data: bias, misrepresentation, and correlation vs. causation. Listen to this episode to see where the worlds of data and SEO overlap. Episode transcript → https://goo.gle/sotr039-Transcript  Michelle Robbins on Twitter → https://goo.gle/3Nd8jVE Michelle Robbins on LinkedIn → https://goo.gle/3PWnpAM  Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.

Search Off the Record
Let's talk about Web 3.0

Search Off the Record

Play Episode Listen Later May 19, 2022 33:55


With Web 3.0 on the horizon, it's hard not to talk about the next evolution of the internet. In this episode of Search Off the Record, Martin Splitt, Gary Illyes, and Alan Kent offer up their thoughts about the World Wide Web's next iteration, and fondly recollect their experiences with older versions of the web. They ponder what people think Web 3.0 is and what it might become. Is it decentralized? Is it crypto? Both, or none of that? Episode transcript → https://goo.gle/sotr038-Transcript  Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.

Search Off the Record
Let's talk sitemaps

Search Off the Record

Play Episode Listen Later May 5, 2022 28:21


A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. In this episode of Search Off the Record, we sit down with Gary Illyes, John Mueller, and Lizzi Sassman to discuss all things sitemaps. How do they help with website SEO? What is the future of sitemaps? Listen to this episode to find out! Episode transcript →  https://goo.gle/sotr037-Transcript   Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.