Podcasts about Googlebot

Web crawler used by Google

  • 70 PODCASTS
  • 180 EPISODES
  • 27m AVG DURATION
  • 1 EPISODE EVERY OTHER WEEK
  • Sep 18, 2024 LATEST

POPULARITY

(Chart: popularity trend, 2017–2024)


Best podcasts about Googlebot

Latest podcast episodes about Googlebot

SERP's Up SEO Podcast
SERP's Up | Making bot log SEO data easy (easier?)

Sep 18, 2024 · 38:41


Ever wondered why Googlebot loves your blog's cat photos but totally dismisses your high-value product pages? Wix's Mordy Oberstein and Crystal Carter are joined by Roxana Stingu, Head of Search and SEO at Alamy, to dig into crawl patterns, bot behaviors, and crawl budgets. Learn how to make your pages better for bots, update old content to be relevant again, and minimize server costs by managing bot activity. Discover how specialized bot logs can offer insights into everything from quality issues to market trends and detect patterns of malicious bots you may want to block.

(Best Shatner impression) "The robots are interacting with our site, but how?" Don't miss out as we navigate the world of search engine bots on the SERP's Up Podcast!

Key Segments:
[00:01:27] What's On This Episode of SERP's Up?
[00:02:18] Focus Topic of the Week: Bot Logs
[00:04:06] Focus Topic Guest: Roxana Stingu
[00:24:27] Tool Time: Built-in Bot Log Reports
[00:30:19] Snappy News
[00:34:06] Follow of the Week: Anne Berlin

Hosts, Guests, & Featured People: Mordy Oberstein, Crystal Carter, Roxana Stingu, Anne Berlin, Rich Sanger

Resources: Wix SEO Learning Hub | Searchlight SEO Newsletter | SEO Resource Center | It's New: Daily SEO News Series | Wix Bot Log Reports | Alamy

News: Google Search Ranking Volatility Still Heated A Week After Core Update | Report: Half of Google AI Overviews Links Overlap With Top Search Results
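The bot-log analysis the episode describes can be sketched in a few lines: the snippet below counts Googlebot requests per path from combined-format access-log lines. The log format, function name, and pattern here are assumptions for illustration, not from the show, and matching on the user-agent string alone is only a first pass, since any client can claim to be Googlebot (verifying genuine Googlebot traffic takes a reverse-DNS check).

```python
import re

# Combined log format is an assumption about what the server writes.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def crawl_hits(lines):
    """Count requests per path whose user-agent claims to be Googlebot."""
    counts = {}
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m and "Googlebot" in m.group("agent"):
            path = m.group("path")
            counts[path] = counts.get(path, 0) + 1
    return counts
```

Summing per-path hit counts like this is what makes crawl patterns visible: if the cat photos get dozens of hits while the product pages get none, the log has already told you where the crawl budget is going.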

SEO Is Not That Hard
SEO A to Z - part 27 - "200 (OK) to #"

Aug 23, 2024 · 12:21 · Transcription available


Unlock the secrets of HTTP response codes and elevate your SEO strategy in today's episode of "SEO Is Not That Hard" with your host, Edd Dawson. Ever wondered how to preserve your hard-earned SEO value when moving or deleting web pages? Discover why the 200 OK code is the linchpin of successful HTTP requests, ensuring that Googlebot and web browsers recognize and serve your content seamlessly. Learn the nuances of the 301 redirect, a critical tool for forwarding all ranking signals and backlinks to a new URL, safeguarding your site's SEO value during permanent changes. But that's not all: Edd also demystifies the 302 redirect, explaining when and how to use it for temporary page moves without compromising your SEO efforts. By the end of this episode, you'll have a crystal-clear understanding of when to implement these redirects to maintain your page rankings and keep your SEO strategy on point. Plus, get real-world insights from the broadband industry on effectively managing redirects and retaining SEO value. Join us for this essential guide to mastering HTTP response codes and take control of your web pages' SEO performance like a pro!

SEO Is Not That Hard is hosted by Edd Dawson and brought to you by KeywordsPeopleUse.com. Get your free copy of the 101 Quick SEO Tips at https://seotips.edddawson.com/101-quick-seo-tips. For a personal no-obligation demo of how KeywordsPeopleUse could help boost your SEO, book an appointment now. See Edd's personal site at edddawson.com. Ask a question and get on the show: click to record a question. Find Edd on Twitter @channel5; find KeywordsPeopleUse on Twitter @kwds_ppl_use. "Werq" by Kevin MacLeod (incompetech.com), licensed under Creative Commons: By Attribution 4.0 License, http://creativecommons.org/licenses/by/4.0/
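As a rough illustration of the episode's advice, here is a toy dispatcher that picks a response code for a requested path. The routing tables and function name are hypothetical, invented for this sketch: a permanently moved page answers 301 with the new URL, a temporary move answers 302, existing content answers 200, and anything unknown falls through to 404.

```python
# Toy routing tables; the paths and names are made up for illustration.
CONTENT = {"/home", "/blog/cats"}
PERMANENT_MOVES = {"/old-blog/cats": "/blog/cats"}  # 301 targets
TEMPORARY_MOVES = {"/sale": "/blog/cats"}           # 302 targets

def status_for(path):
    """Choose an HTTP status (and optional redirect target) for a path."""
    if path in PERMANENT_MOVES:
        # 301 Moved Permanently: forwards ranking signals and backlinks
        # to the new URL, preserving SEO value across a permanent change.
        return 301, PERMANENT_MOVES[path]
    if path in TEMPORARY_MOVES:
        # 302 Found: a temporary move, so search engines keep ranking
        # signals attached to the old URL.
        return 302, TEMPORARY_MOVES[path]
    if path in CONTENT:
        # 200 OK: the request succeeded and the content is served.
        return 200, None
    # 404 Not Found for anything unknown.
    return 404, None
```

The ordering matters: redirect rules are checked before the content table, so a page that still exists but has been temporarily parked elsewhere is redirected rather than served.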

SEO Is Not That Hard
SEO A to Z - part 26 - "Web Crawler to Zero Volume Keywords"

Aug 21, 2024 · 13:45 · Transcription available


Can you imagine turning your SEO game around with just a few strategic tweaks? Join me, Edd Dawson, on this episode of "SEO Is Not That Hard" as we uncover the journey from understanding web crawlers to mastering zero-volume keywords. We'll demystify how Googlebot indexes new content, the critical role of web hosting in your site's performance, and the nuanced craft of web scraping. Get ready to dive deep into the technical underpinnings that keep your online presence robust and resilient.

We'll break down the essentials of website monitoring and why tools like Uptime Robot and Little Warden are vital for quick issue detection and resolution. Learn how downtime can tank your rankings and how to avoid common pitfalls like accidental changes to your robots.txt file. We'll also explore the impact of web spam on search results and advocate for white-hat SEO practices to keep your site in Google's good graces. Whether you're a veteran SEO professional or new to the game, this episode is packed with insights to elevate your strategy and keep your website performing at its best.

Confessions of an SEO
Googlebot Worthy - Season 4, Episode 31

Aug 13, 2024 · 10:32


This week's show is a gentle rant. It started innocently enough, talking about an article on a published test of JavaScript and Googlebot.

Last week's episode: Mystery of the Negative SEO Canonical

You can text Confessions: 512-222-3132

Check out my HCS guide. Ep 22: What's Canonical Got To Do With It. You will want to read my write-up on what I observed in various sites that were decimated by the Sept HCU: http://fixthedamncanonical.com. It's set to NAME YOUR OWN PRICE on Gumroad, so if you can't afford even $10, you can get it for $0. To get your copy of Decoding Google's Helpful Content System: https://bit.ly/helpfulcontentsystem

Confessions of an SEO is on Semrush's Top 10 SEO Podcasts. Suggest podcast topics on the Confessions Hotline: 512-222-3132

Some tools I use and recommend:
Cora Software (https://bit.ly/confessionscora): on-page analysis tool I use every day.
GSC Tool (https://bit.ly/gsctool): a Chrome extension that helps with repetitive tasks in Search Console and has a connector to the Google Indexing API so it doesn't go through third parties.
Rank Week: https://bit.ly/conc-s-rankweek

For those who read these things, please click below; it's a short link to the Confessions of An SEO Knowledge Panel. Thank you for your click! Confessions of An SEO

SEO 101 on WebmasterRadio.fm
SEO 101 Episode 473 - Google Updates, Search Volatility, Console Features, and Crawling Insights

Aug 12, 2024 · 18:04


In this episode of the SEO 101 Podcast, we explore significant updates from Google, including recent search ranking volatility and the launch of new recommendations in Search Console. We also discuss Googlebot's link-crawling behavior, shedding light on how it impacts SEO strategies and website performance. Tune in for insights like this every week!

Advertising Inquiries: https://redcircle.com/brands
Privacy & Opt-Out: https://redcircle.com/privacy

SEO Is Not That Hard
SEO A to Z - part 20 - "Query Deserves Freshness to Robots.txt"

Aug 7, 2024 · 14:33 · Transcription available


Ever wondered why some websites always seem to have the freshest content for trending topics? Discover the power of "Query Deserves Freshness" (QDF), a pivotal ranking factor for news-related searches. We'll debunk the myth that every query demands new content and show you when freshness really matters. Plus, learn the ins and outs of the "rank and rent" strategy, a lucrative method for generating leads in local service niches. With practical advice on rank tracking and a deep dive into what influences your webpage's position in search results, this episode is packed with actionable insights to elevate your SEO game.

But that's not all! We uncover best practices for using 301 and 302 redirects without tanking your rankings, and highlight how to fix those pesky redirect chains that slow down your site and confuse Googlebot. Tools like Screaming Frog are your new best friends. We'll also discuss the importance of related searches in keyword research, the critical role of relevancy in link building, and how to expedite the indexing process via Google Search Console. To top it off, we'll emphasize why responsive web design is crucial for both user experience and SEO, and how third-party reviews can skyrocket your local business rankings. Tune in for an episode jam-packed with strategies that will revolutionize your approach to SEO!
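A redirect chain is just one redirect pointing at another. This small sketch (a hypothetical helper, not from the episode) resolves a chain from a redirect map and counts the hops; anything above one hop is a candidate for flattening, and a loop is reported outright.

```python
def resolve_chain(url, redirects, max_hops=10):
    """Follow a {source: target} redirect map to the final URL.

    Returns (final_url, hop_count). Chains longer than one hop waste
    crawl budget; the fix is to point every old URL straight at the
    final destination so hop_count comes out as 1.
    """
    hops = [url]
    while url in redirects:
        url = redirects[url]
        if url in hops:
            raise ValueError("redirect loop: " + " -> ".join(hops + [url]))
        if len(hops) > max_hops:
            raise ValueError("redirect chain too long")
        hops.append(url)
    return url, len(hops) - 1
```

A crawler like Screaming Frog does essentially this across a whole site export; running the resolver over every source URL tells you which redirects to rewrite to the final target.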

SEO Is Not That Hard
SEO A to Z - part 9 - "Google Patent Applications to Guest Posting"

Jul 12, 2024 · 15:04 · Transcription available


Ever wondered how to truly crack Google's SEO code? This episode of "SEO Is Not That Hard" promises to arm you with the knowledge and strategies that can elevate your website's visibility. Join me, Edd Dawson, as we delve into the often-overlooked world of Google's patent applications and their hidden secrets about search ranking algorithms. Discover how to use Google's specialized search engine for patents to your advantage. We'll also demystify the role of Google quality raters and their comprehensive guidelines, a must-know for anyone serious about SEO. Plus, learn actionable tips to boost your local SEO through positive Google reviews and navigate the controversial concept of the Google sandbox with insights from my personal experiences.

But that's not all; the second half of the episode is a treasure trove of essential Google tools and guidelines. Get acquainted with Google Search Console for monitoring your site's performance and Google Search Essentials for best practices. From leveraging Google Suggest for effective keyword research to mastering Google Tag Manager and Google Trends, you'll find practical advice to optimize your site. We also dissect the impacts of Google updates on your rankings, the role of Googlebot in crawling your pages, and introduce the intriguing concept of GPT in AI. Finally, we explore the grey areas of SEO tactics that sit between black-hat and white-hat practices. This episode is packed with invaluable insights to help you navigate the complex world of SEO and take your website to the next level.

SEO Is Not That Hard
Bonus Episode - All 101 Quick SEO Tips Compilation Megapod.

Jun 11, 2024 · 95:46 · Transcription available


Ready to supercharge your SEO game? This episode compiles 101 quick SEO tips from our past sessions, giving you an extended masterclass on improving your website's visibility and ranking. Learn why website speed isn't just about user experience but also a critical factor for Google rankings. We'll also dive into the importance of linking new pages to existing ones, securing your site with HTTPS, and the risks of buying backlinks. Plus, discover how obtaining legitimate, relevant backlinks can significantly boost your search engine favorability.

Ever wondered how to build a successful website that consistently attracts traffic? We've got you covered with strategies centered around answering real user queries and staying updated with the latest SEO practices. We'll explore how structuring your content strategy around user questions can create evergreen content. For new websites, targeting long-tail keywords and exercising patience can pave the way to gradual ranking success. We also share tips on monetizing your site, leveraging platforms like Shopify and WordPress, and using visual content and 301 redirects to retain SEO value.

Struggling with common SEO challenges? Our episode is packed with actionable insights, from using descriptive anchor text and leveraging Google Search Console, to staying updated on trending topics and navigating Google updates. Learn how to use Google Autocomplete for content ideas and avoid pitfalls like blocking Googlebot with your robots.txt file. We also emphasize the importance of consistent URL structures and optimizing local business profiles through Google Business Profile. Whether you're aiming to enhance your SEO or boost user engagement, this episode is a goldmine of practical tips and strategies to help you succeed.
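One of the pitfalls above, accidentally blocking Googlebot in robots.txt, is easy to check with Python's standard-library robots.txt parser. The rules below are a made-up example: Googlebot gets its own group that only blocks /private/, while every other agent is blocked site-wide.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, fed in as lines rather than fetched.
rules = RobotFileParser()
rules.parse([
    "User-agent: Googlebot",
    "Disallow: /private/",
    "",
    "User-agent: *",
    "Disallow: /",
])

print(rules.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
print(rules.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

Running your live robots.txt through a check like this (or through Search Console's robots.txt report) before deploying catches the classic mistake of shipping a staging-site `Disallow: /` to production.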

SEO im Ohr - die SEO-News von SEO Südwest
Are Backlinks Indispensable After All? SEO im Ohr - Episode 307

Jun 8, 2024 · 12:21


A recent study shows a clear correlation between top-10 rankings on Google and the number of backlinks. So are backlinks indispensable for good rankings after all? In a study that examined the top-10 Google rankings for 200 keywords, only websites with a certain number of backlinks achieved good positions. Google, by contrast, has repeatedly downplayed the importance of backlinks for rankings. Who is right?

For about a month, Google's search results pages have been in flux. It looks as though Google is rolling out a drawn-out, continuous update. What is going on?

Starting July 5, Google will crawl websites only with Googlebot Smartphone. Websites that block this Googlebot will no longer be indexed.

Many are asking what value the Google leak offers SEOs. There is not much that is new, but some of the findings are important.

By Christian Kunz

SEO Podcast by #SEOSLY
JavaScript SEO in 2024: Q&A with Martin Splitt from Google (Issues, Mistakes, Tips, Facts, & Myths)

May 7, 2024 · 56:22


Ever wondered how JavaScript impacts SEO? Here is the full written guide on JavaScript SEO on my website. In this interview, I ask Martin Splitt from Google the most pressing JavaScript SEO questions. Martin sheds light on Googlebot's behavior, the importance of server-side rendering, and how to identify and fix JavaScript SEO issues. Watch now to level up your JavaScript SEO knowledge! Watch the video on YouTube: https://youtu.be/0iHDk7uqByI

In this comprehensive Q&A session, we cover a wide range of topics, including:
✅ The path Googlebot follows when visiting a page
✅ How Google decides whether to index a specific page
✅ The new robots.txt reports in Google Search Console
✅ Dealing with crawl budget issues for small and large websites
✅ The time difference between Googlebot crawling and rendering a page
✅ How Googlebot handles dynamic content and JavaScript rendering
✅ The impact of "noindex" and "index" tags in source code vs. rendered HTML
✅ Blocking users based on geolocation and its SEO implications
✅ Understanding and identifying crawl budget issues
✅ Googlebot's behavior with button links, JavaScript links, and redirects
✅ Effective communication between SEOs and developers
✅ Evaluating JavaScript-based sites with SEO tools
✅ Best practices for e-commerce websites, pagination, and infinite scroll
✅ Common JavaScript SEO mistakes and how to avoid them

This interview is packed with valuable insights and actionable tips for optimizing your JavaScript-based website for search engines.

Follow SEO Consultant Olga Zarr or hire Olga to help you with SEO: Olga Zarr on X/Twitter | Olga Zarr on LinkedIn | The best SEO newsletter | The best SEO podcast | SEO consultant Olga Zarr

Confessions of an SEO
Fetch Worthy Content - Season 4, Episode 14

Apr 9, 2024 · 15:22


This episode I'm taking just a teensy break from discussing Google's Helpful Content System. Here's the conference where I'll be spilling all the beans on the Helpful Content data and answering as many questions as I can: the Rank Elevation SEO Summit, happening June 10-11, 2024 at the Delta Grand Okanagan Resort in Kelowna, British Columbia, Canada. https://www.entityelevation.com/seo-summit-2024/

This week's topic comes from an episode of Search Off the Record, where Google engineers and their Google friends chat about general topics. I learned more about what is and what isn't a Googlebot, as well as some insight into what Google is looking for in a site or a new site. Search Off The Record: https://www.youtube.com/watch?v=xVg9LcrSwyQ

Forensic SEO - HouseFresh Video Series. If you don't want to miss any future videos or discussions on what might be going on in Google, make sure to get on the notification list for Search Driven Insights: https://bit.ly/searchdriveninsights

This week's sponsor of Confessions of an SEO: Magic Mind, https://magicmind.com/confessions, coupon code Confessions20.

For the latest from the Indexation Research, check out its new home on Substack. Monthly $18; the annual membership has perks and gets you two months free plus a few more fun things! Substack: https://crawlornocrawl.substack.com/

Latest news from CONC: New Desktop Googlebot

YouTube channels: https://www.youtube.com/@forensicseo | https://bit.ly/searchdriveninsights

Here's a test - please click on this link: American Way Media, INC - SEO Consultant

Confessions of an SEO is on Semrush's Top 10 SEO Podcasts: https://www.semrush.com/blog/seo-podcasts/

Suggest podcast topics on the Confessions Hotline: 512-222-3132

Some tools I use and recommend:
Cora Software (https://bit.ly/confessionscora): on-page analysis tool I use every day.
GSC Tool (https://bit.ly/gsctool): a Chrome extension that helps with repetitive tasks in Search Console and has a connector to the Google Indexing API so it doesn't go through third parties.
Ziptie Monitoring Software (https://bit.ly/checkoutziptie): software that monitors whether Google is serving your content or not.
Rank Week: https://bit.ly/confessionsrankweek

For those who read these things, please click below; it's a short link to the Confessions of An SEO Knowledge Panel. Thank you for your click! Confessions of An SEO

Search Buzz Video Roundup
Search News Buzz Video Recap: Ongoing Google March Core Update, Googlebot To Crawl Less, Pay For Google Search AI & More

Apr 5, 2024


This week, I posted the big Google Webmaster Report for April; it is an excellent place to catch up quickly on the news. Google's March 2024 core update had some volatility this week, and it is still rolling out. There are still no....

#TWIMshow - This Week in Marketing
Ep201 - 'How Google Search Crawls Pages'

Mar 5, 2024 · 10:44


Episode 201 contains the Digital Marketing News and Updates from the week of Feb 26 - Mar 1, 2024.

1. 'How Google Search Crawls Pages' - In a comprehensive video, Google engineer Gary Illyes sheds light on how Google's search engine discovers and fetches web pages through a process known as crawling. Crawling is the first step in making a webpage searchable. Google uses automated programs, known as crawlers, to find new or updated pages. The cornerstone of this process is URL discovery, where Google identifies new pages by following links from known pages. This method highlights the importance of having a well-structured website with effective internal linking, ensuring that Google can discover and index new content efficiently.

A key tool in enhancing your website's discoverability is the use of sitemaps. These are XML files that list your site's URLs along with additional metadata. While not mandatory, sitemaps are highly recommended as they significantly aid Google and other search engines in finding your content. For business owners, this means working with your website provider or developer to ensure your site automatically generates sitemap files, saving you time and reducing the risk of errors.

Googlebot, Google's main crawler, uses algorithms to decide which sites to crawl, how often, and how many pages to fetch. This process is delicately balanced to avoid overloading your website, with the speed of crawling adjusted based on your site's response times, content quality, and server health. It's crucial for businesses to maintain a responsive and high-quality website to facilitate efficient crawling.

Moreover, Googlebot only indexes publicly accessible URLs, emphasizing the need for businesses to ensure their most important content is not hidden behind login pages. The crawling process concludes with downloading and rendering the pages, allowing Google to see and index dynamic content loaded via JavaScript.

2. Is Google Happy With 301+410 Responses? - In a recent discussion on Reddit, a user expressed concerns about their site's "crawl budget" being impacted by a combination of 301 redirects and 410 error responses. This situation involved redirecting non-secure, outdated URLs to their secure counterparts, only to serve a 410 error indicating the page is permanently removed. The user wondered if this approach was hindering Googlebot's efficiency and contributing to crawl budget issues.

Google's John Mueller provided clarity, stating that using a mix of 301 redirects (which guide users from HTTP to HTTPS versions of a site) followed by 410 errors is acceptable. Mueller emphasized that crawl budget concerns primarily affect very large sites, as detailed in Google's documentation. If a smaller site experiences crawl issues, it likely stems from Google's assessment of the site's value rather than technical problems. This suggests the need for content evaluation to enhance its appeal to Googlebot.

Mueller's insights reveal a critical aspect of SEO: the creation of valuable content. He criticizes common SEO strategies that replicate existing content, which fail to add value or originality. This approach, likened to producing more "zeros" rather than unique "ones," implies that merely duplicating what's already available does not improve a site's worth in Google's eyes.

For business owners, this discussion underlines the importance of focusing on original, high-quality content over technical SEO manipulations. While ensuring your site is technically sound is necessary, the real competitive edge lies in offering something unique and valuable to your audience. This not only aids in standing out in search results but also aligns with Google's preference for indexing content that provides new information or perspectives.

In summary, while understanding the technicalities of SEO, such as crawl budgets and redirects, is important, the emphasis should be on content quality. Businesses should strive to create original content that answers unmet needs or provides fresh insights. This approach not only helps with better indexing by Google but also engages your audience more effectively, driving organic traffic and contributing to your site's long-term success.

3. UTM Parameters & SEO - Google's John Mueller emphasized that disallowing URLs with UTM parameters does not significantly enhance a website's search performance. Instead, he advocates for maintaining clean and consistent internal URLs to ensure optimal site hygiene and efficiency in tracking.

Mueller's advice is straightforward: focus on improving the site's structure to minimize the need for Google to crawl irrelevant URLs. This involves refining internal linking strategies, employing rel-canonical tags judiciously, and ensuring consistency in URLs across feeds. The goal is to streamline site management and make it easier to track user interactions and traffic sources without compromising on SEO performance.

A notable point Mueller makes is regarding the handling of external links with UTM parameters. He advises against blocking these through robots.txt, suggesting that rel-canonical tags will effectively manage these over time, aligning external links with the site's canonical URL structure. This approach not only simplifies the cleanup of random parameter URLs but also reinforces the importance of direct management at the source. For instance, if a site generates random parameter URLs internally or through feed submissions, the priority should be to address these issues directly rather than relying on robots.txt to block them.

In summary, Mueller's guidance underscores the importance of website hygiene and the strategic use of SEO tools like rel-canonical tags to manage URL parameters effectively. His stance is clear: maintaining a clean website is crucial, but blocking external URLs with random parameters is not recommended. This advice aligns with Mueller's consistent approach to SEO best practices, emphasizing the need for site owners to focus on foundational site improvements and efficient management of URL parameters for better search visibility and tracking.

4. Transition Required for Google Business Profile Websites - Google has announced that starting in March 2024, websites created through Google Business Profiles (GBP) will be deactivated, with an automatic redirect to the businesses' Google Business Profile in place until June 10, 2024. This move requires immediate attention from GBP website owners to ensure continuity in their online operations.

For businesses unsure if their website is hosted through Google Business Profiles, a simple search on Google for their business name and accessing the edit function of their Google Business Profile will reveal if their website is a GBP creation. It's indicated by a message stating, "You have a website created with Google." For those without a GBP website, the option to link an external site will be available.

In response to this change, Google has recommended several alternative website builders for affected businesses. Among the suggested platforms are Wix, Squarespace, GoDaddy, Google Sites, Shopify (specifically for e-commerce), Durable, Weebly, Strikingly, and WordPress. Each offers unique features, with WordPress notable for its free website builder incorporating generative AI capabilities. However, users should be aware ...
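A sitemap of the kind described above, an XML file listing URLs with optional metadata, is simple enough to generate directly. This is a minimal sketch (a hypothetical helper, not from the episode) emitting only the required <loc> element plus an optional <lastmod>:

```python
import xml.etree.ElementTree as ET

def build_sitemap(entries):
    """Build a minimal sitemap XML string from (url, lastmod) pairs.

    lastmod may be None to omit the element; only <loc> is required
    by the sitemap protocol.
    """
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in entries:
        node = ET.SubElement(urlset, "url")
        ET.SubElement(node, "loc").text = loc
        if lastmod:
            ET.SubElement(node, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")
```

In practice a CMS plugin regenerates this file automatically, which is exactly the "work with your website provider" advice above; the point of the sketch is just that the format itself has no magic in it.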

#TWIMshow - This Week in Marketing
Ep196 - Google's SEO Starter Guide Update: Streamlining for Clarity and Efficiency

Jan 29, 2024 · 15:27


Episode 196 contains the Digital Marketing News and Updates from the week of Jan 22-26, 2024.1. Google's SEO Starter Guide Update: Streamlining for Clarity and Efficiency - Google is currently reworking its popular SEO Starter Guide, initially released in 2010 and updated in 2017 to significantly streamline the guide, making it more accessible and relevant for today's website owners.The current SEO Starter Guide, which is about 8,500 words, will be reduced to less than half its size in the upcoming revision. Lizzi Sassman from Google mentioned that the new guide would be a little over 3,000 words. This reduction is achieved by cutting out repetitive and redundant information, aiming to eliminate duplication and streamline the content.Key aspects of the updated guide include:Focus on Modern Users: The guide will be more concise, focusing on general concepts rather than detailed technical instructions. This change reflects the rise of user-friendly content management systems like WordPress and Wix, which have simplified many aspects of SEO.Elimination of Redundancy: The new guide aims to avoid duplicating information available in more extensive resources on Google's Search Central site. It will serve as a one-stop introductory resource for SEO basics.Potential Impact on Guide's Ranking: Gary Illyes from Google speculated that the guide's ranking in Google search results might drop due to the reduced word count. However, the emphasis is on making the guide more user-friendly rather than maintaining its length for ranking purposes.Feedback-Driven Revision: John Mueller of Google highlighted that reader feedback has been instrumental in reshaping the guide. The goal is to make it more suitable for modern websites and accessible to those new to SEO.This updated guide will be a valuable resource for understanding the fundamentals of SEO in a more digestible format. 
It will provide clear, impactful advice without overwhelming readers with excessive details or outdated practices.2. Google's Stance: No Guaranteed Traffic in SEO - A recent statement by Google's John Mueller on January 17, 2024, has sparked attention among business owners and SEO professionals. The essence of Mueller's statement is that no one, not even experts, can guarantee increased traffic to a website as a result of specific changes. This was in response to an inquiry about whether removing certain parameters from a website would lead to an increase in traffic. Mueller's unequivocal response was, "Nobody can guarantee you traffic, sorry."This highlights a crucial aspect of digital marketing – the unpredictability and non-guaranteed nature of SEO (Search Engine Optimization). SEO involves optimizing your website to rank higher in search engine results, ideally increasing visibility and traffic. However, the algorithms that search engines use are complex and constantly evolving. This makes it challenging to predict exactly how changes to a website will impact its traffic.Many SEO professionals use estimates and formulas to predict the ROI (Return on Investment) of making specific changes to a website. They might estimate that improving rankings could lead to increased clicks and potentially more revenue. However, these are just estimates and should not be considered guaranteed outcomes. The digital marketing landscape is dynamic, and what works today might not work tomorrow.3.  Google's Stance on HTML Structure for SEO Rankings - It's crucial to understand the factors that influence your website's visibility on search engines like Google. One common area of focus is the structure of a website's HTML code. 
On January 26, 2024, an insightful update was shared regarding Google's stance on this matter, which is particularly relevant for those managing their own websites or working with digital marketing professionals. Gary Illyes from Google clarified on the latest episode of the 'Search Off The Record' podcast that the HTML structure of web pages does not significantly impact search rankings. This revelation addresses a common misconception among website owners and SEO specialists who often prioritize meticulous HTML structuring in the hopes of boosting their search rankings.

Illyes emphasized the value of diversity in website designs and structures, suggesting that if every website had the same HTML structure, the internet would become monotonous. He acknowledged that while basic elements like headings, title tags, and well-organized paragraphs are beneficial, obsessing over the intricate details of HTML structuring is largely unnecessary for SEO purposes.

In 2018, John Mueller of Google also remarked that while a clear content structure is helpful for users, it does not directly influence ranking. This reinforces the idea that user experience should be the primary focus rather than the complexity of HTML structure. Furthermore, Google has stated that overusing elements like H1 tags or constantly rearranging them has little to no effect on a site's ranking. This information is especially useful for small business owners who might be allocating resources to fine-tune HTML structures under the assumption that it significantly impacts SEO.

In summary, while maintaining a basic, user-friendly HTML structure is important, overemphasizing its complexity does not yield significant benefits in terms of SEO rankings. Business owners are advised to focus on creating valuable content and a pleasant user experience, rather than getting caught up in the intricacies of HTML coding for SEO purposes.

4.
Google Bot Does Not Read Content Within HTML Comments - On January 25, 2024, Google's John Mueller addressed HTML comments: parts of the website code not visible to users that can contain notes or additional information for developers. Some believed these comments could influence Google's understanding of a site's content. However, Mueller clarified that Googlebot, the search engine's crawling software, does not read or utilize the content within these HTML comments for indexing or ranking purposes. This clarification came as a response to a query on Reddit, where an individual inquired about the potential benefits of including content in HTML comments to enhance text recognition from images on their website.

Mueller's response underlines an essential principle of web content creation: the importance of putting content directly on web pages, rather than in hidden elements like HTML comments. For business owners, this insight is particularly valuable. It emphasizes the need to focus on creating high-quality, visible content that users and search engines can easily access and understand. This approach not only ensures better engagement with potential customers but also aligns with Google's guidelines for optimal website performance in search results.

Business owners should stay away from gimmicks and instead prioritize content that adds real value to their website visitors, as this is what Google's algorithms are designed to recognize and reward in search rankings. This update reiterates the ongoing need for transparency and user-centric strategies in digital marketing practices.
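To make the distinction concrete, here is a minimal, hypothetical HTML fragment (the product and wording are invented for illustration): the comments are invisible to visitors and ignored by Googlebot, while the visible heading and paragraph are what actually gets crawled and indexed.

```html
<!-- Developer note only: Googlebot ignores this entirely, so any
     keywords placed here contribute nothing to indexing. -->
<main>
  <h1>Handmade Leather Wallets</h1>
  <!-- TODO: add seasonal banner here -->
  <p>
    Describe the product in visible, on-page text. This is the content
    search engines read, index, and can show in search results.
  </p>
</main>
```

If text currently lives only inside an image or a comment, moving it into visible markup like the paragraph above is the approach Mueller's answer points to.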

#TWIMshow - This Week in Marketing
Ep195 - Google's Guide to Seamless Website Transition Without SEO Hiccups


Play Episode Listen Later Jan 22, 2024 11:58


Episode 195 contains the Digital Marketing News and Updates from the week of Jan 15-19, 2024.

1. Google's Guide to Seamless Website Transition Without SEO Hiccups - Google's John Mueller has provided advice for businesses transitioning to a new website without affecting their SEO. This guidance is crucial for small business owners who are considering or in the process of moving to a new website.

Mueller emphasized the importance of either removing or updating the old website. Keeping both the old and new sites live with conflicting information can confuse users and harm SEO. Inconsistent details, such as different business hours or addresses on both sites, can frustrate customers and make it challenging for search engines to index and rank pages correctly.

To ensure a smooth transition, Mueller suggests implementing redirects from the old site to the new one. This approach aids users in finding the correct website and helps search engines transfer any existing signals to the new domain, potentially boosting its standing. Mueller recommends getting help from a web developer or hosting provider to implement the redirects properly and suggests keeping them in place for at least one year.

Redirects are crucial when transitioning from one website domain to a new one. They ensure users who visit the old domain are automatically forwarded to the new domain, and they pass on the value of links pointing to the old domain. This link equity transfer is vital for the new website to retain its search engine ranking.

If redirects to a new domain aren't possible, Mueller suggests updating the old site with a notice alerting visitors that the content has moved to a new domain. If that can't be done, take down the old site completely. He assures that the new site won't be penalized by search engines for outdated content on the old domain.
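In practice the redirect is usually configured in the web server or hosting control panel, but the mechanics can be sketched with Python's standard library alone. This is an illustrative sketch, not the method from the episode, and the domain name below is a hypothetical placeholder.

```python
# Minimal sketch of a permanent (301) redirect from an old site to a new
# one, using only the Python standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading

NEW_DOMAIN = "https://www.new-example.com"  # hypothetical new domain


class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 tells browsers and crawlers the move is permanent, which is
        # what lets existing signals transfer to the new URL.
        self.send_response(301)
        self.send_header("Location", NEW_DOMAIN + self.path)
        self.end_headers()

    def log_message(self, *args):
        # Keep the sketch quiet; the default handler logs every request.
        pass


def start_redirect_server(port=0):
    """Start the redirect server on a background thread.

    Port 0 asks the OS for any free port; the chosen port is available
    as server.server_port.
    """
    server = HTTPServer(("127.0.0.1", port), RedirectHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A 301 (rather than a temporary 302) is the status Mueller's advice implies, since the move is permanent and the old URLs should hand their link equity to the new ones.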
However, both the old and new websites may still appear in searches for a while, which may need clarification for users. After making the transition, it's important to monitor the process closely. Use analytics software to check that redirects function correctly and traffic is sent to the right places. Keep an eye on your search engine rankings to catch any unexpected dips that might indicate a problem with the transition. Informing users about the change through emails, social media, and website announcements is also crucial.

In summary, moving to a new website is a significant change for any business. Handling the shift carefully is vital to maintaining search engine rankings and providing users with a continuous experience. Mueller's guidance should assist in making the transition from an old site to a new one go smoothly.

2. Google's Clarification on the 'Index, Follow' Meta Tag: What It Means for Your Website - Google's John Mueller recently provided clarity on the use of the 'index, follow' meta tag, a topic of interest for many website owners and SEO professionals. The HTML meta element communicates metadata: machine-readable information that a crawler like Googlebot can read. Mueller's clarification sheds light on a common misunderstanding in the SEO community.

The 'index, follow' meta tag has been widely used in the belief that it instructs search engine crawlers to index the content of a webpage and follow its links. However, Mueller clarified that this tag is essentially redundant for Google, as indexing and following links are default behaviors of search engine robots. The tag looks like this: <meta name="robots" content="index, follow">.

Mueller explained that the 'index' directive in the robots meta tag has no function for Google and is completely ignored. Similarly, the 'follow' directive is unnecessary because Googlebot automatically follows links unless instructed otherwise.
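A short, hypothetical head section makes the point concrete: the first robots tag merely restates Google's defaults and is ignored, while a directive such as noindex (shown commented out) would actually change crawler behavior.

```html
<head>
  <title>Example Page</title>
  <!-- Redundant: indexing and link-following are already the defaults. -->
  <meta name="robots" content="index, follow">
  <!-- Meaningful: this directive WOULD change behavior, telling crawlers
       not to index the page or follow its links. -->
  <!-- <meta name="robots" content="noindex, nofollow"> -->
</head>
```

Removing the redundant tag changes nothing for Google; only the restrictive directives documented by Google carry any effect.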
Google's documentation confirms that the default values for robots tags are 'index, follow', and these don't need to be specified. Furthermore, he wrote, "Google has https://developers.google.com/search/docs/crawling-indexing/special-tags & https://developers.google.com/search/docs/crawling-indexing/robots-meta-tag to document the meta tags that have functions. You can use anything else, it'll be ignored. [A made-up robots meta tag] is an option, if you want to throw people off."

If the robots meta tag you want to use isn't listed in Google's documentation, then Googlebot is going to ignore it. So don't try to be cute or innovative unless you want to throw people off. The 'noindex, nofollow' tag is still valid and tells the search engine crawlers not to index the content on the webpage and not to follow any links.

For small business owners managing their websites, this clarification means that including the 'index, follow' meta tag does not impact how Google crawls or indexes their pages. The focus should instead be on ensuring that the website is accessible, with high-quality content that serves the audience's needs. While Bing treats the 'index, follow' tag similarly, it does allow explicit statements of 'index' or 'follow' if desired, as per Bing's documentation. However, for Google, the emphasis is on streamlining the website's HTML and removing unnecessary elements that do not contribute to its performance in search results.

3. Understanding the Impact of Changing from WWW to Non-WWW on SEO - In a recent discussion, Google's John Mueller clarified a common query among website owners and digital marketers about the impact of changing a domain from 'www' to 'non-www' on search rankings. This is particularly relevant for business owners who are considering such changes to their websites.

Mueller stated that switching from a 'www' to a 'non-www' domain should not significantly affect a website's rankings.
He suggested that Google is adept at recognizing such changes, and the rankings are unlikely to suffer substantially as a result. This response came after a query about a large site that changed its domain URL to remove 'www' and implemented 301 redirects via user agent Mozilla. The site owner noticed a worsening impact over time, leading to concerns about Google's recognition of this form of 301 redirect.

Mueller clarified that server-side redirects like 301 do not use user agents, indicating a possible misunderstanding in the implementation. He emphasized that changing from 'www' to 'non-www' doesn't really change much in terms of SEO. If a website is experiencing significant changes in rankings, it is likely due to other factors.

It's important for business owners to understand that while such domain changes are recognized by Google, the key is in the proper implementation of redirects. Google is quick to pick up on changes like 'http to https' or 'www to non-www,' and these types of changes should not negatively impact a site's rankings. However, changes in URL structures or significant parts of the site might lead to slower reactions from Google.

4. Maximizing SEO: The Importance of Text Placement Over Images - In a recent

#TWIMshow - This Week in Marketing
Ep194 - Google's Clear Message: No 'Perfect Page' Formula for Search Rankings


Play Episode Listen Later Jan 15, 2024 13:51


Episode 194 contains the Digital Marketing News and Updates from the week of Jan 08-12, 2024.

1. Google's Clear Message: No 'Perfect Page' Formula for Search Rankings - Google's Search Liaison, Danny Sullivan, debunked a common myth in SEO: the existence of a 'perfect page' formula for achieving high rankings in search results. Sullivan emphasized that there is no universal formula or set of rules that websites must adhere to for high placement in search results. This clarification challenges the widespread belief, dating back to the early days of SEO, that specific word counts, page structures, or other optimizations can guarantee success in search rankings.

According to Sullivan, third-party SEO tools often suggest creating pages in certain ways to succeed in search. However, he asserted that these tools cannot predict rankings accurately. Their advice is usually based on finding averages among top-ranking pages, but Google's algorithm values both commonalities and unique differences in content. Following such generalized advice does not guarantee a top ranking.

Instead of focusing on mythical formulas, Sullivan recommended prioritizing helpfulness and relevance to users. For instance, including an author byline should be based on its usefulness to readers, not because it might supposedly boost rankings. His key advice is to put readers and audience first, as being helpful to them aligns more closely with the various signals used by Google to reward content.

For small business owners, this means shifting focus from trying to crack a non-existent 'perfect SEO formula' to creating content that genuinely serves its audience. This reader-first approach is more likely to resonate with the diverse and evolving signals used by search engines. It's not just about organizing content; it's about making your site more accessible and relevant to both your audience and search engines.

In summary, Google's message is clear: there is no perfect blueprint for guaranteed rankings.
However, creating content that genuinely serves its purpose continues to be rewarded. This approach encourages small business owners to focus on quality and relevance, rather than chasing elusive SEO shortcuts, ultimately leading to a more robust and effective online presence.

The question for you: who will you listen to and follow? Danny Sullivan, Google's Search Liaison, or a third-party tool vendor who makes money as long as it can keep you as a customer?

2. Author Bylines Don't Impact Search Rankings - Google's Search Liaison, Danny Sullivan, stated, "Author bylines aren't something you do for Google, and they don't help you rank better." This statement addresses the belief that including author bylines and bios on a website can directly influence its ranking in Google's search results. Sullivan emphasized that these elements are for the benefit of the readers, not for the search engine.

He further explained that while publications with author bylines may exhibit other characteristics that align with Google's criteria for useful content, the bylines themselves do not contribute to better rankings. This clarification dispels the myth that specific structural elements like author bylines can be used as a shortcut to achieve higher search rankings.

John Mueller of Google also previously stated (Sep 20, 2021) that author bylines are not a requirement for Google Search. This reinforces the idea that while bylines can enhance the credibility and transparency of content for readers, they are not going to give you a ranking boost. For small business owners, this means prioritizing the creation of content that genuinely serves the needs and interests of their audience. By focusing on being helpful and providing value to readers, businesses are more likely to align with the various signals Google uses to reward content. This approach not only enhances user experience but also supports long-term SEO success without relying on misconceptions about specific page elements.

3.
Navigating the Japanese Keyword Hack: Google's John Mueller Offers Recovery Advice - In a recent interaction on Reddit, Google's John Mueller provided crucial advice for website owners who fall victim to the Japanese keyword hack. This is especially relevant for small business owners who manage their websites and are concerned about online security and SEO.

The Japanese keyword hack is a type of SEO spam attack in which a website is injected with thousands of foreign-language pages, often in Japanese or Chinese, without the owner's knowledge or consent. This attack can significantly impact a site's search rankings and integrity. In the reported case, a website owner discovered over 20,000 such pages suddenly indexed on their site.

Mueller's advice for recovery begins with identifying how the breach occurred. It's crucial to understand the vulnerability that allowed the hack, to ensure it is properly secured. He suggested considering automatic updates or switching to a hosting platform that handles security as potential solutions.

For SEO implications, Mueller advised that once a site's most important pages are cleaned of unwanted content, they can be reindexed quickly. He reassured that old hacked pages, if not visible to users, do not pose immediate problems and can remain indexed for months without issue. Importantly, Mueller clarified that spammy backlinks pointing to these invisible indexed pages do not require disavowing. Instead, the focus should be on cleaning up the site's visible content and preventing internal search results from being indexed.

Regarding the issue of spammy backlinks causing internal search pages to be indexed, Mueller clarified that this was separate from the hacking issue. He recommended against disavowing the links, as the pages would naturally drop from search results over time.
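One common safeguard for the internal-search exploitation described here is keeping search result URLs out of crawlers' reach via robots.txt. A minimal, hypothetical example (the URL patterns are placeholders and depend on how a given site builds its search URLs):

```text
# Hypothetical robots.txt: keep internal search result pages
# (e.g. /search?q=...) from being crawled, so spammers cannot get
# thousands of junk query URLs picked up.
User-agent: *
Disallow: /search
Disallow: /*?q=
```

Note that robots.txt only prevents crawling. For search pages that are already indexed, a noindex robots meta tag or X-Robots-Tag header is the more direct removal path, since Google must be able to crawl a page to see its noindex directive.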
Proactively blocking search results pages from indexing, using methods like robots.txt or noindex, is advised for both new and existing sites to prevent exploitation by spammers.

For small business owners, this incident underscores the importance of website security and regular maintenance. Regular security updates, malware scans, and link audits should be part of routine website management. It's a shared responsibility between website owners and search engines to keep search results free of hacked and spammy content.

In summary, Mueller's advice highlights the need for proactive security measures and vigilant maintenance to protect websites from hacking and spam attacks. By focusing on these areas, small business owners can safeguard their online presence, maintain their search rankings, and ensure a trustworthy experience for their users.

4. Google's Updated Guidance on Search Snippets - Google has recently updated its documentation to clarify how its algorithm selects snippets for search results. Traditionally, it was believed that the snippet in search results, which consists of a title, URL breadcrumb, and a brief description, was derived mainly from the meta description of a webpage. However, Google's updated guidance indicates that the primary source of the snippet is the page content itself, not the structured data or the meta description.

This update represents a shift in how meta descriptions might be written and how content is optimized. The previous version of Google's documentation implied that the snippet was mostly derived from the meta description, suggesting that on-page content could also be selected for the snippet.
The new documentation makes it clear that the page content is the main source of the snippet, with the meta description being used only when it describes the page better than other parts of the content. Google has removed a substantial amount of content from the previous version of the documentation, including a paragraph that advised site owners on how to suggest content for snippets. The new wording emphasizes that snippets are primarily created from the page content, and the meta description is used only when it might give users a more accurate description of the page than content taken directly from the page.

5. Google's Temporary Reprieve for Select Sites on Third-Party Cookie Phase-Out - Google Chrome has started restricting third-party cookie access, a significant change that began on January 4, 2024, and is expected to reach 100% of users globally by Q3 2024. Recognizing the challenges this transition poses, Google is offering a limited "Deprecation Trial" to allow websites and businesses additional time to migrate away from third-party cookie dependencies. This trial is specifically for non-advertising use cases and will temporarily re-enable third-party cookie access for eligible third-party services until December 27, 2024.

To qualify for these trials, services must meet strict eligibility criteria set by Google. Services categorized as advertising-related will not be approved, and origins matching known ad-related domains, including subdomains, will be rejected. Only services with proven functional breakage, not just data collection issues, are eligible. Applicants must provide detailed steps to reproduce the broken functionality in bug reports, and Google will only accept requests where apparent breakage is validated. For approved services, unique access tokens can be added in Chrome to enable trials.
Google has granted a grace period until April 1, 2024, for approved sites to deploy their tokens, addressing the short window between registration opening and the initial blocking of cookies for 1% of traffic. This extension is not intended to relieve data collection inconveniences but to allow services with functional breakage to keep using third-party cookies. Business owners relying on third-party services or cookies should conduct an audit of their site's usage and prepare contingency plans. With the rollout already underway, it's crucial to address potential impacts before a larger percentage of visitors are affected.

6. Google Search Console Retires Crawl Rate Tool - Google has officially removed the crawl rate setting tool from Google Search Console. The crawl rate limiter was a legacy tool within Google Search Console that allowed website owners to communicate to Google how frequently their site should be crawled. It was primarily used when a website experienced server load problems due to frequent crawling by Googlebot. However, Google has deemed this feature no longer necessary due to improvements in its crawling logic and other tools available to publishers.

With the removal of this tool, Google now relies more on its enhanced crawling logic to determine the appropriate frequency for crawling websites. Googlebot now automatically adjusts its crawl rate in response to how a site or server responds to its requests. For instance, if a server consistently returns HTTP 500 status codes, or if the response time for requests significantly lengthens, Googlebot will automatically slow down its crawling.

This means that Google's automated systems are now more adept at managing crawl rates without manual intervention. While this may reduce the level of control website owners have over crawl rates, it also simplifies the process, as Googlebot will intelligently adjust its behavior based on the site's response.
Google has set the minimum crawling speed to a lower rate, comparable to the old crawl rate limits. This adjustment aims to continue honoring the settings that some site owners had previously set, especially where search interest is low, so that the crawlers do not waste the site's bandwidth.

7. The Evolving Role of Hashtags on LinkedIn - On January 12, 2024, Andrew Hutchinson reported in Social Media Today about the changing effectiveness of hashtags on LinkedIn. For a long time, LinkedIn did not support hashtags. However, in 2018, the platform reactivated hashtag discoverability and began encouraging users to categorize their posts using hashtags. The goal was to segment content better, enabling LinkedIn to show users more relevant content. But as algorithms have evolved, the necessity for hashtags has diminished. Modern social platform systems are now adept at understanding the context of a post's text, visuals, user history, and all keywords included, making hashtags less critical for content categorization.

LinkedIn's current stance is that while hashtags can help viewers easily identify what a post is about and find related posts, they should be closely related to the topic of the post to be most effective. LinkedIn also considers conversation topics and keywords to surface relevant information. This suggests that LinkedIn is now less reliant on hashtags for maximizing discovery. However, when searching for conversations on a specific topic, users can still use topics or hashtags.

For business owners, this means that while hashtags in LinkedIn posts are less relevant than before, it's still important to be mindful of the keywords mentioned in the main post. Understanding and tapping into the right conversation streams based on target topics is crucial. Following popular pages and people within your business niche can provide insights into common hashtags related to your sector.
Additionally, searching for hashtags in the app can help identify trending topics and discussions in your industry. Research by LinkedIn expert Richard van der Blom indicates that using 3-10 hashtags in LinkedIn posts used to significantly boost reach, but hashtags now provide no additional reach boost. This change suggests that LinkedIn is gradually deemphasizing hashtags as a discovery tool, focusing more on topical relevance than on maximizing reach. LinkedIn's system is not designed for virality, so hashtags do not necessarily function on LinkedIn the way they do on other social media platforms. This could be part of LinkedIn's strategy to prevent users from gaming the system with hashtags.

8. Expanding E-Commerce Horizons: X's New Partnership with Shopify - On January 9, 2024, X announced an expanded partnership with Shopify. This collaboration is set to open new opportunities for Shopify merchants to promote their products on X's platform, aligning with X's vision of becoming an "everything app."

The partnership is designed to enable all Shopify merchants to reach a broader customer base by leveraging the power of X Business. While X already had a deal with Shopify in 2022 to display products in-stream, this new development promises to facilitate broader product awareness actions, enable easier catalog uploads, and provide more ways for Shopify merchants to maximize ad outcomes.

Shopify President Harley Finkelstein highlighted the importance of reaching customers wherever they are. He emphasized that more platforms mean more choice, entrepreneurship, and opportunities for business growth. Although many details of the partnership are yet to be disclosed, the collaboration is expected to offer significant benefits for retailers. X's work with Shopify could potentially facilitate new opportunities for retailers in-stream. This would be particularly impactful if X's vision of integrating peer-to-peer payments and transfers in the app materializes.
Such a feature could lead to innovative ways for retailers to expand their audience and leverage X's reach. For small business owners, this partnership represents a significant opportunity. It suggests that leveraging platforms like X, in conjunction with e-commerce giants like Shopify, could be a strategic move to expand their digital footprint and reach new customers. The potential for in-stream payments and transfers could further enhance the customer experience and streamline the purchasing process.

9. Microsoft Has a New Ad Creation Tool - On January 11, 2024, Microsoft announced a significant update to its retail media tool, introducing a new Creative Studio element powered by generative AI. The new Creative Studio allows users to generate entirely new ads in various formats using conversational AI prompts. This AI-powered solution is designed to boost creativity and productivity, particularly benefiting retailers and advertisers by simplifying the process of creating banner ads. The tool's ease of use is particularly advantageous for smaller businesses that may lack the resources to run effective banner campaigns.

One of the key features of this tool is its ability to create ads based on just a product URL. Users can then further customize the creative for different channels. Microsoft ensures that the AI-generated ads will automatically align with each retailer's style guide. Additionally, users have the flexibility to customize ads by updating and emphasizing selected words and phrases, cropping and cleaning up backgrounds, and modifying other ad elements, all based on text prompts.

#TWIMshow - This Week in Marketing
Ep191 - Google Reiterates: Domain Age Does Not Impact Search Rankings


Play Episode Listen Later Dec 18, 2023 15:32


Episode 191 contains the Digital Marketing News and Updates from the week of Dec 11-15, 2023.

1. Google Reiterates: Domain Age Does Not Impact Search Rankings - In a recent discussion, Google's John Mueller addressed a long-standing question in the SEO community: does the age of a domain name impact Google search rankings? This topic, often debated among SEO professionals, has been clarified by Mueller, providing valuable insights for business owners looking to understand the nuances of search engine optimization.

Key insights from the discussion:
- Domain Age and Rankings: Mueller stated that the age of a domain name does not impact Google search rankings. This clarification dispels a common belief among SEOs that older domain names correlate with top rankings.
- Misinterpretation of Google's Patent: The misconception about domain age as a ranking factor may have originated from a misreading of a Google patent titled "Information Retrieval Based on Historical Data." The patent, however, focuses on identifying spam sites using domain-related information, not on boosting the rankings of legitimate domains.
- The Role of Domain Data: The patent mentions using domain data to catch throwaway domains used by spammers. It indicates that valuable, legitimate domains are often paid for several years in advance, unlike spam domains. However, this information is used to predict the legitimacy of a domain for spam detection, not for ranking purposes.

This clarification from Google means that focusing on the age of your domain as a ranking strategy is misguided, and it highlights the importance of accurate information in the SEO industry. Instead, attention should be directed towards creating high-quality, relevant content and optimizing your website for a better user experience. It is a reminder to always seek clarity and accuracy in SEO strategies, ensuring that efforts are directed towards genuinely effective methods.

2.
Google Chrome: Phasing Out Third-Party Cookies for Enhanced Privacy - On January 4, 2024, Google will begin testing a new feature that restricts third-party cookies by default in Chrome, impacting 1% of users globally. This rollout is part of Google's broader initiative to phase out third-party cookies entirely by the second half of 2024, marking a significant shift in online privacy and digital advertising practices.

Key aspects of Google's update:
- Testing of Tracking Protection: The new tracking protection tool will be rolled out to a small percentage of Chrome users, starting with 1% globally. This feature is designed to limit third-party cookies, which have been a fundamental part of the web for nearly three decades.
- Impact on Websites and Advertisers: Websites that depend on third-party cookies for advertising or other purposes might face challenges due to this rollout. It's crucial for site owners to prepare in advance for a web without third-party cookies.
- Google's Privacy Sandbox Initiative: This initiative aims to develop technology that safeguards online privacy while providing tools for successful digital businesses. The primary objectives are to phase out third-party cookies and reduce cross-site tracking, maintaining free online content and services.

For small business owners, especially those relying on digital advertising, this update signals a need to adapt to new privacy-focused online practices. The phasing out of third-party cookies will require a shift in how businesses target and reach their audiences online. Preparing for these changes is essential to ensure continued effectiveness in digital marketing strategies.

3. Google's Local Search Algorithm Update: Prioritizing Open Businesses - On December 16, 2023, Google announced an update to its local search ranking algorithm, emphasizing the 'openness' signal for non-navigational queries.
This change means that Google now ranks open businesses higher in local search results than those that are closed, particularly during the business's operating hours. This update is a response to the evolving needs of users who are increasingly looking for immediate services or products.

Key aspects of the update:
- Strengthened Openness Signal: Google's local search algorithm now considers whether a business is currently open as a more significant factor in local pack rankings. This change applies to non-navigational queries, where users search for types of services rather than specific brands.
- Impact on Business Visibility: Businesses that are open at the time of a user's search query are more likely to appear higher in search results. This update benefits businesses that operate 24/7, as they remain visible even when competitors are closed.
- Advice Against Manipulating Business Hours: Google's Search Liaison, Danny Sullivan, advised against changing business hours to appear open 24/7 unless true. This manipulation could lead to penalties or adjustments in the ranking signal in the future.

For small business owners, this update underscores the importance of accurately listing business hours on Google. Ensuring your Google Business Profile reflects true open hours can improve your visibility in local search results, especially during your operating hours. This change can significantly impact customer footfall and inquiries, making it crucial for businesses to update their profiles accurately.

4. Google Confirms: Publishing Content in Both PDF and HTML Formats is Acceptable - On December 13, 2023, Google's John Mueller clarified a common query regarding the publication of content in both PDF and HTML formats.
In a recent AskGooglebot video, Mueller confirmed that it's perfectly fine to publish content in both formats, as Google's systems can index them separately, even if the content is technically a duplicate.Key Insights from John Mueller's Statement: Independent Indexing of PDF and HTML: Google can find and index both PDF and HTML pages separately. This means that even if the content in both formats is the same, they can be shown independently in search results. Managing Duplicates: If Google's systems identify the content as duplicates, they usually defer to the HTML page version. However, website owners have control over this through various methods like using a 'noindex' HTTP header or robots meta tag to block indexing of one format, or using the 'rel=canonical' link element to indicate a preference. Practical Use Cases: Mueller highlighted that the choice of format often depends on what the audience prefers or requires. For example, restaurant menus are best suited as HTML pages for mobile viewing, while forms or guidebooks might be more practical in PDF format. Google's stance on publishing content in both PDF and HTML formats offers businesses an opportunity to diversify their content strategy. By understanding how Google indexes and handles these formats, business owners can make informed decisions about content publication, ensuring their information reaches the audience effectively in their preferred format.5. Google Takes Action Against Parasite SEO - On December 11, 2023, Google's Search Liaison Danny Sullivan announced that the search company has taken steps to address the issue of 'parasite SEO.' This term refers to the practice of leveraging high-authority third-party websites to rank content that otherwise wouldn't perform well in search results. 
Sullivan indicated that while the specific change related to the upcoming helpful content update isn't live yet, Google has implemented other measures to combat this problem.Key Points from Google's Announcement: Addressing Third-Party Content Abuse: Google has enhanced its systems to better handle third-party content that falls under the category of parasite SEO. This includes content hosted on reputable sites but created primarily for SEO manipulation. Ongoing Efforts: Sullivan emphasized that Google will continue to refine its approach to dealing with such issues, indicating an ongoing commitment to maintaining the integrity of search results. Advice for Content Creators: The advice given by Google regarding third-party content remains relevant. Creators are encouraged to focus on producing helpful, people-first content to be recognized as valuable by Google's algorithms. Impact on Business Owners: This development is particularly important for small business owners who rely on organic search traffic. Understanding and adhering to Google's guidelines is crucial for maintaining visibility and credibility in search results. Google's proactive steps to address parasite SEO highlight the importance of ethical SEO practices. For business owners, this serves as a reminder to focus on creating high-quality, user-centric content rather than resorting to manipulative tactics. Staying informed about these changes and aligning with Google's guidelines is key to achieving long-term success in search engine rankings.6. Googlebot Crawling Experiment: Surprising Insights for Website Owners - On December 14, 2023, an intriguing experiment conducted by technical SEO expert Kristina Azarenko revealed the impact of disabling Googlebot from crawling a website. 
This experiment, which ran from October 5 to November 7, offers valuable insights for business owners about the importance of Googlebot crawling and its effects on website performance.Key Findings from the Experiment: Impact on Favicon and Video Search Results: The website's favicon was removed from Google search results, and video search results took a significant hit, which still hadn't recovered post-experiment. Stability and Volatility in Search Positions: While positions remained relatively stable, there was slightly more volatility in Canada. Slight Decrease in Traffic: The website only saw a slight decrease in traffic during the experiment. Increase in Reported Indexed Pages: Despite pages having 'noindex' meta robots tags, they ended up being indexed because Google couldn't crawl the site to see those tags. Multiple Alerts in Google Search Console (GSC): The site received multiple alerts in GSC, such as 'indexed though blocked by robots.txt' and 'blocked by robots.txt.' This experiment highlights the critical role of Googlebot in maintaining and updating a website's presence in search results. For business owners, it underscores the importance of ensuring that Googlebot can crawl their site effectively. Any impediments to crawling can lead to unexpected issues, such as outdated information in search results or the indexing of unintended pages.7. Google Ads Unveils Expanded Video Reach Campaigns - On December 12, 2023, Google Ads announced a significant enhancement to Video Reach Campaigns (VRC), offering businesses more formats to maximize their brand awareness on YouTube. This update reflects the evolving ways people engage with YouTube, from mobile Shorts to TV screen browsing, presenting brands with diverse opportunities to connect with their audience.Key Features of the Update: Expanded Video Formats: Video Reach Campaigns now include the ability to scale video creatives to in-feed and Shorts, in addition to in-stream ads. 
This expansion allows advertisers to reach viewers across various YouTube formats, increasing the potential for brand exposure. AI-Powered Efficiency: Leveraging Google's AI, multiformat ads in VRC deliver enhanced reach and efficiency. Testing has shown that campaigns opting into all three inventory types (in-feed, in-stream, and Shorts) achieved an average of 54% more reach at 42% lower CPM (Cost Per Mille) compared to in-stream-only campaigns. Case Study - Bayer's Midol Campaign: Bayer utilized VRC for their Midol multi symptom relief campaign, opting into in-feed, in-stream, and Shorts ads. A head-to-head test against in-stream-only VRC revealed that the multiformat approach delivered 30% more reach at a 45% lower CPM.8. Google Ads Enhances Business Operation Verification Process - On December 12, 2023, Google Ads announced the rollout of a comprehensive guide to assist businesses in completing their Business Operation Verification. This new resource aims to clarify the verification process, providing detailed explanations and examples to help advertisers understand and successfully navigate the verification requirements.Key Features of the Update: Step-by-Step Guide: The guide offers a clear, step-by-step explanation of the Business Operation Verification process, including reasons for verification failure and example scenarios where verification might be required. Easy Accessibility: The guide is accessible through the billing icon in Google Ads and via in-account prompts, ensuring that advertisers can easily find and use the resource. Clarification on Verification Requirements: The document outlines specific scenarios and reasons why a business might need to undergo verification, helping advertisers to better prepare and complete the process efficiently. Google's Commitment to Transparency: This update reflects Google's ongoing efforts to maintain a secure and transparent advertising environment. 
By providing additional information to advertisers, Google aims to enhance the overall experience and effectiveness of the advertiser verification program. Google Ads' introduction of a detailed guide for Business Operation Verification is a proactive step towards supporting advertisers in meeting compliance standards. For business owners, this resource is a valuable tool in optimizing their advertising efforts and maintaining a credible presence on Google's platform.9. Meta's Threads App Embraces Decentralization with ActivityPub Integration - On December 13, 2023, Meta announced a significant step towards decentralization and interoperability for its Threads app by integrating with ActivityPub, a decentralized social networking protocol. This move allows Threads content and profiles to be accessible via other Fediverse-based apps, marking a notable shift in Meta's approach to social media.Key Aspects of Meta's Integration with ActivityPub: Threads Content Viewable on Mastodon: The integration enables Threads content to be viewable on Mastodon and other ActivityPub-based systems, effectively plugging Threads into the Fediverse concept. The Fediverse Concept: The Fediverse is a collection of independent, federated servers working together to facilitate open social media access. It aims to reduce centralized control, allowing each server to establish its own parameters and algorithms. Meta's Shift Towards Decentralization: Historically, Meta has been known for its 'walled garden' approach. This integration represents a significant move towards decentralization, potentially connecting Threads content to a range of tools and platforms that utilize the ActivityPub standard. Interoperability and Data Portability: The integration aligns with Meta's commitment to data portability and control over content. It also facilitates a broader connection into the Mastodon network, expanding Threads' audience reach and engagement. 
Meta's integration of Threads with ActivityPub is a bold step towards a more open and interconnected social media ecosystem. For business owners, it presents an opportunity to leverage the expanded reach and interoperability offered by this integration, potentially enhancing their social media strategy and audience engagement.
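The crawling experiment above illustrates a subtle interaction worth remembering: a robots.txt disallow stops Googlebot from fetching a page at all, so Googlebot never sees an on-page 'noindex' tag, and the blocked page can still end up indexed. A minimal sketch of how a crawler evaluates robots.txt rules, using Python's standard urllib.robotparser (the robots.txt rules and URLs here are hypothetical, for illustration only):

```python
from urllib import robotparser

# Hypothetical robots.txt for example.com -- an assumption for illustration.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def googlebot_may_crawl(url: str) -> bool:
    """Return True if these robots.txt rules allow Googlebot to fetch the URL."""
    return parser.can_fetch("Googlebot", url)

# A blocked page can still be indexed: Googlebot never fetches it,
# so it never sees an on-page 'noindex' tag.
print(googlebot_may_crawl("https://example.com/private/report.pdf"))  # False
print(googlebot_may_crawl("https://example.com/menu.html"))           # True
```

The takeaway matches the experiment's finding: robots.txt controls crawling, while 'noindex' controls indexing, and the second only works if the first permits the fetch.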

#TWIMshow - This Week in Marketing
Ep188: Overlooked Details That Make or Break Your SEO


Nov 27, 2023 · 21:00


Episode 188 contains the Digital Marketing News and Updates from the week of Nov 20-24, 2023.

1. Overlooked Details That Make or Break Your SEO - Search engines look at many factors when determining how to rank web pages in search results. While flashy new SEO trends come and go, focusing on foundational website quality and technical basics tends to pay off more in the long run. Google's John Mueller, Martin Splitt, and Gary Illyes recently delved into the concept of site quality in a podcast, offering valuable insights for business owners and digital marketers. Their discussion demystifies site quality, emphasizing its simplicity and practicality.

- Site Quality Is Not Complex: The Google experts encourage reading the site quality documentation, asserting that understanding and achieving site quality is not as complicated as it may seem. Gary Illyes remarks, "It's not rocket science," suggesting that the basics of site quality are accessible to everyone.
- No Specific Tools for Site Quality: Unlike technical issues, there are no direct tools to measure site quality. Traffic metrics may indicate changes, but they don't pinpoint specific quality issues. This means business owners need to assess their content's effectiveness and relevance themselves.
- Reframing the Approach: Illyes advises reframing the problem by focusing on whether a page delivers what it promises to users. This user-centric approach is key to improving site quality. It's about creating content that helps users achieve their goals.
- Quality in Terms of Value Addition: Adding value is crucial for site quality. In competitive search queries, it's not enough to be relevant; your content must offer something unique and valuable that stands out from what's already available. Mueller explains that simply replicating what's in the search results doesn't add value. Instead, aim for content that exceeds the existing baseline.
- Breaking into Competitive SERPs: Illyes suggests an indirect approach to compete in tough SERPs. Choose realistic battles and focus on areas where you can genuinely offer something different and better.

In summary, Google's experts highlight the importance of user-focused content, uniqueness, and value addition in achieving site quality. For business owners, this means focusing on creating content that genuinely helps users and offers something beyond what's already out there.

2. Master the SEO Basics: Google's Advice for Effective Website Optimization - In the ever-evolving world of search engine optimization (SEO), it's easy to get caught up in the latest trends and advanced tactics. However, Google's Search Relations team, featuring Martin Splitt, Gary Illyes, and John Mueller, emphasized the importance of mastering basic technical SEO issues first. This advice is particularly relevant for business owners who might not be deeply versed in the intricacies of SEO.

Technical SEO involves optimizing the architecture and infrastructure of a website to enhance its crawling and indexing by search engines. This is crucial because, no matter how innovative your SEO strategies are, if search engines like Google can't properly crawl or render your site, your efforts won't yield the desired results. Illyes highlights the importance of ensuring that your content is accessible and useful, as these are key factors that Google considers.

Another significant point discussed is the common misconception that high traffic automatically means high-quality pages. Mueller advises looking beyond traffic metrics and focusing on user engagement and satisfaction, which are more accurate indicators of a page's usefulness and quality. It's important to focus on relevant queries and track lower-level pages to better understand a site's performance.

The key to creating high-quality content is to focus on what helps people achieve their goals when they visit your page. This could mean providing comprehensive answers to common questions, solving problems, or sharing engaging stories. Illyes suggests that quality might be simpler than most think: it's about writing content that genuinely helps your audience.

For business owners, the takeaway is clear: before diving into complex SEO strategies, ensure that your website's technical foundation is solid. Also, prioritize creating content that is not just high in volume but high in value to your audience. By focusing on these areas, you can significantly improve your website's SEO performance.

3. Rethinking SEO Success: Beyond Traffic Metrics - In episode 66 of Google's "Search Off the Record" podcast, Google's John Mueller and Martin Splitt discussed a crucial aspect of SEO: the real value of traffic metrics. The conversation highlighted a common misconception in the SEO community, equating high traffic with success. While many SEO professionals boast about traffic increases, Mueller and Splitt emphasized the importance of focusing on more meaningful goals, like conversions and business impact.

The podcast shed light on the tendency of SEOs to prioritize traffic statistics over return on investment (ROI) or the actual impact on earnings. Mueller speculated that this might be due to the delayed effects of SEO efforts on tangible business results. He pointed out that while traffic data is useful, it can be misleading if not analyzed in the context of its relevance and contribution to business goals.

The discussion also touched on the different types of traffic and their varying values. Not all traffic contributes equally to sales or brand building; some may be irrelevant or non-converting. Therefore, understanding the nature of the traffic and its actual impact on sales or business growth is crucial.

Mueller and Splitt's conversation serves as a reminder for SEO professionals to align their strategies with broader business objectives, rather than just chasing traffic numbers. It calls for a more nuanced approach to SEO, where success is measured not just by the quantity of traffic, but by its quality and contribution to the business's bottom line.

4. Google Clarifies the SEO Value of 404 Pages - Google recently shed light on the SEO implications of 404 error pages, offering valuable insights for business owners and digital marketers. A 404 error occurs when a page on a website cannot be found. Contrary to common belief, these pages can have a positive impact on a site's SEO if managed correctly.

Google's John Mueller explained that 404 pages are a normal part of the web. They signal to search engines that a page no longer exists, which is crucial for maintaining a clean and up-to-date site structure. Importantly, 404 errors do not directly harm a site's overall ranking in search results.

For business owners, this means that occasional 404 errors are not a cause for alarm. However, it's important to monitor these errors and ensure they are appropriate. For instance, if a product is no longer available, a 404 page is suitable. But if the page has moved, a 301 redirect to the new location is better for both users and search engines.

Understanding the role of 404 pages in SEO is vital for maintaining a healthy website. It's about balancing user experience and search engine signals. Regularly checking for 404 errors and addressing them appropriately can contribute to a more effective online presence. This insight from Google highlights the importance of website maintenance and understanding the nuances of SEO. It's a reminder that not all errors are detrimental and that proper management of these pages can support a site's SEO strategy.

5. AI in Content Creation: A Tool, Not a Threat - Google's Search Relations team, including Martin Splitt, Gary Illyes, and John Mueller, discussed the role of AI in content creation. They view AI as a valuable aid to human creativity, not a replacement. The team emphasized that AI is excellent for certain tasks but not a catch-all solution. They humorously noted how technology, like Google Plus, can quickly become outdated, highlighting the rapid evolution of tech.

The Google team believes AI can be particularly useful for overcoming writer's block or meeting tight deadlines. AI tools can suggest frameworks, phrases, and variations to speed up the writing process. However, they stressed that AI should be used responsibly and as a complement to human creativity, not as a substitute. This perspective encourages a balanced approach to AI in content creation, viewing it as a tool to enhance human efforts rather than overshadow them.

Key Takeaways:
- AI as a Creative Aid: AI is seen as a tool to enhance human creativity, especially useful in overcoming writer's block or accelerating the writing process.
- Balanced Perspective: The Google team advocates for responsible use of AI, emphasizing its role as a supplement to human creativity rather than a replacement.
- Rapid Technological Evolution: The discussion also touches on the fast-paced nature of technology, using Google Plus as an example of how quickly tech can become outdated.

6. Google's Guidance On SEO Tools - Google's John Mueller addressed a query regarding the use of SEO tools for content writing, specifically in the context of a Vietnamese travel agency blog. The question revolved around whether to include Vietnamese accents in keywords, as suggested by an SEO tool, considering the primary audience comprised American and Australian tourists unlikely to use these accents in searches.

Mueller's response emphasized the importance of writing in the language of the audience, particularly for headers and body text. He advised not to depend entirely on SEO tools for writing guidance but to conduct independent research. Mueller suggested examining the search engine results pages (SERPs) with and without accents (e.g., "quảng binh" vs. "quang binh") to understand better what ranks higher and is more relevant to the target audience.

The key takeaway from Mueller's advice is the significance of not relying solely on SEO tools. These tools are based on the current knowledge and trends in SEO, which can be limited and sometimes outdated. They were developed based on what SEOs believed to be effective at the time, such as keyword densities and reciprocal linking strategies, which eventually became less effective.

Mueller's guidance underscores the dynamic nature of SEO and the need for writers and marketers to use their judgment and stay updated with current best practices. While SEO tools can provide valuable insights, they should not dictate content creation. Instead, a balance between tool-guided insights and personal research and understanding of the audience should drive content strategy. This advice is particularly relevant for small business owners and digital marketing enthusiasts who aim to create content that resonates with their audience while also performing well in search engines. Understanding the limitations of SEO tools and the importance of audience-centric content can lead to more effective and engaging digital marketing strategies.

7. Follower Count: Not a Google Search Ranking Factor - There's a common belief among some digital marketers and business owners that a higher number of followers on social media platforms like Twitter or Instagram could positively influence their Google search rankings. This assumption stems from the idea that social signals, such as likes and followers, might be interpreted by Google as indicators of a site's popularity or credibility.

Google has explicitly stated that follower counts on social media are not a factor in determining search rankings. This clarification is significant because it helps refocus SEO strategies on more impactful practices. Google's search algorithms are complex and take into account numerous factors, but social media follower counts are not among them.

For small business owners and digital marketing enthusiasts, this information is vital. It means that while having a robust social media presence can be beneficial for brand awareness and customer engagement, it does not directly contribute to how well your website ranks in Google searches. Therefore, efforts should be more strategically directed towards proven SEO practices like content quality, website optimization, and building authoritative backlinks.

Key Takeaways:
- Social media is valuable for engagement and brand presence, not for SEO in terms of follower counts.
- Focus on creating high-quality, relevant content and optimizing your website for a better user experience.
- Building a strong backlink profile from reputable sources can significantly impact your Google search rankings.

Understanding what does and does not impact your website's ranking in search results is key to effective SEO. This clarification from Google serves as a reminder to focus on the core aspects of SEO that genuinely make a difference, rather than misconceptions like the impact of social media follower counts.

8. Google to Remove Crawl Rate Tool from Search Console in 2024 - Google has announced that it will be deprecating the Crawl Rate Limiter legacy tool within Google Search Console on January 8, 2024. This decision comes as Google believes the tool has become less useful due to advancements in its crawling logic and the availability of other tools for publishers.

The Crawl Rate Limiter allowed website owners to communicate to Google how often to crawl their site. It was particularly useful for sites experiencing server load issues due to frequent crawling by Googlebot. However, Google has improved its crawling algorithms to adjust automatically based on a site's server response. For instance, if a site consistently returns HTTP 500 status codes or if the response time significantly increases, Googlebot will automatically slow down its crawling.

Gary Illyes from Google explained that the tool's usefulness has diminished over time. He noted that the tool's effect on crawling speed was slow and it was rarely used. With its removal, Google will set a new minimum crawling speed, which will be comparable to the old crawl rate limits, especially for sites with low search interest. For website owners experiencing issues with Googlebot crawling, Google recommends referring to a specific help document and using a report form to communicate any concerns.

As a business owner, it's crucial to stay informed about changes in Google's tools and services, as they can impact your website's visibility and performance. The removal of the Crawl Rate Limiter tool signifies Google's confidence in its automated systems to manage site crawling efficiently. However, it also means that you should be more vigilant about monitoring your site's performance and be ready to use alternative methods to communicate any crawling-related issues to Google.

9. Google Removes Key Robots.txt FAQs - Google recently removed its robots.txt FAQ help document from its search developer documentation. This change has raised questions among webmasters and SEO professionals about the implications for website crawling and indexing.

Robots.txt is a file used by websites to communicate with web crawlers about which parts of the site should or should not be processed or scanned. Google's FAQ page on this topic was a valuable resource for understanding how to use this file effectively. Its removal means that some specific guidance and clarifications are no longer directly available from Google.

Key Takeaways from the Removed FAQs:
- A website doesn't necessarily need a robots.txt file; without it, Googlebot will generally crawl and index the site normally.
- The robots.txt file is recommended for controlling crawler traffic to prevent server issues, not for hiding private content.
- For controlling how individual pages appear in search results, use the robots meta tag or X-Robots-Tag HTTP header.
- Changes in the robots.txt file can take up to a day to be reflected in Google's cache and subsequently affect search results.
- Blocking Google from crawling a page using robots.txt doesn't guarantee removal from search results. For explicit blocking, use the 'noindex' tag.

As a business owner, it's important to understand that while the specific FAQs are no longer available, the fundamental principles of using robots.txt remain unchanged. It's crucial to ensure that your website's robots.txt file is correctly configured to guide search engines effectively. Remember, incorrect or unsupported rules in this file are typically ignored by crawlers, so accuracy is key. The removal of the robots.txt FAQs by Google underscores the dynamic nature of SEO and the importance of staying informed about best practices. Business owners should consult with SEO professionals or refer to updated resources to ensure their website's robots.txt file aligns with their digital marketing goals.

10. Google Ads to Update Location Asset Requirements: What You Need to Know - Google Ads is set to update its location asset requirements in December. A location asset in Google Ads is a feature that allows advertisers to include specific location details, like addresses and phone numbers, in their ads. This is particularly useful for businesses with physical locations, as it helps potential customers find them easily.

The upcoming change aims to clarify which types of location assets are not allowed, helping advertisers better understand the restrictions. The update will specifically address locations that are closed, not recognized by Google, or do not match the business running the ad. Additionally, assets with products or services that do not match the specified location will be disallowed.

This update is significant for business owners and digital marketers. Using location assets effectively in ads can significantly boost a business's visibility and conversion potential. However, not adhering to these updated requirements could result in leaving out vital details, potentially harming your return on investment.

11. Microsoft Advertising's Last-Minute Shopper Insights - As the holiday shopping season reaches its peak, Microsoft Advertising's Festive Season Marketing Playbook offers valuable insights for advertisers to capitalize on consumer spending. Here's a concise summary:
- Timing of Revenue Peaks: Despite some advertisers not yet seeing a peak in revenue, historical trends show significant spikes around Black Friday and Cyber Monday. This year, a 3-4% increase in holiday spending in the US is anticipated, potentially reaching up to $966.6 billion. The UK and Germany are also expected to see similar high spending, highlighting the global impact of the season.
- Shift in Consumer Behavior: A notable trend this year is the increased emphasis on deal-seeking. Over two-thirds of US shoppers are spending more time looking for coupons and deals, especially around the Cyber5 period (Thanksgiving, Black Friday, Small Business Saturday, Sunday, and Cyber Monday). Advertisers need to adapt to this trend and align their strategies accordingly.
- The Central Role of Search in Purchasing Decisions: Search remains a crucial component in guiding both online and in-store purchases. It's a pivotal tool for discovering new retailers, conducting pre-purchase research, and comparing prices. For example, Gen X consumers rely heavily on search to find the best prices. In the EMEA region, deal-seekers spend 33% more time searching than average shoppers, offering a significant opportunity for targeted advertising.
- Post-Cyber5 Opportunities: Search volumes remain high even after the Cyber5 period, presenting a continued opportunity for advertisers. Many holiday clicks and conversions happen during Cyber5 with lower cost per acquisition (CPA), so maintaining active advertising campaigns during this period can yield substantial benefits.
- Planning for Returns: The post-holiday return period is another critical aspect for businesses. Search volumes for returns peak shortly after Christmas and continue into the new year. Preparing for this influx and adjusting marketing strategies can help mitigate potential losses and maintain customer satisfaction.
- Strategic Holiday Planning Checklist: Microsoft suggests launching campaigns early, using remarketing and dynamic search ads, emphasizing value messages, leveraging AI for personalized offerings, and utilizing store support for profitable online growth.

#TWIMshow - This Week in Marketing
Ep 185: Mobile-First Indexing: Google's 7-Year Mission Accomplished!


Nov 6, 2023 · 17:15


Episode 185 contains the Digital Marketing News and Updates from the week of Oct 30 - Nov 3, 2023.

1. Mobile-First Indexing: Google's 7-Year Mission Accomplished! - Imagine a world where your smartphone is the key to unlocking the vast potential of the internet. That's the vision Google has pursued for nearly seven years, and they've just announced a significant milestone: the completion of mobile-first indexing. This means that Google now primarily uses the mobile version of content for indexing and ranking across all websites. Back in 2016, Google embarked on this journey, recognizing the shift towards mobile internet usage. By 2018, half of the websites in Google's search results were indexed this way. Fast forward to today, and Google has officially declared the process complete. What does this mean for you? It's simple: if your website isn't optimized for mobile, you're not speaking the same language as Google's search engine, and you're potentially missing out on valuable traffic. For the few sites that still don't work on mobile devices, Google will continue to use its legacy desktop crawler, but this is a temporary measure. The message is clear: mobile optimization is no longer optional; it's essential. As a business owner, this update is a call to action. Ensure your website is mobile-friendly, with responsive design and content that shines on smaller screens. This isn't just about staying in Google's good graces; it's about providing your customers with the best experience, no matter how they find you. Google's update is a reminder of the ever-evolving nature of the web and the importance of keeping pace with these changes. As mobile-first indexing becomes the norm, it's an opportunity to review your online presence and ensure that your business is set up for success in a mobile-centric world. P.S. Don't let your website get left behind in the desktop era. Embrace mobile optimization and open the door to a world of opportunities.

2. Google Launches Nov'23 Core Update - Google has announced its November 2023 Core Update. This update is the latest in a series of adjustments that Google makes to its search algorithms, which can significantly impact where your site appears in search results. The November update is particularly noteworthy because it's the fourth broad core algorithm update of the year, following closely on the heels of the October 2023 Core Update. These updates are part of Google's ongoing efforts to improve the searcher's experience by providing the most relevant and high-quality results. What's new with this update? Google has improved a different core system than the one adjusted last month. This means that even if you've made recent changes to your site, you might still see fluctuations in your search rankings. Google's guidance remains the same: focus on creating high-quality content that provides value to your users. If your site's rankings are negatively impacted by an update, Google advises that there aren't specific actions to take to recover. However, they do offer a list of questions to consider if your site is hit by a core update. It's also possible to see a bit of recovery between core updates, but the most significant changes typically occur after subsequent updates. In conclusion, Google's Core Updates are a reminder of the importance of having a robust content strategy that prioritizes user experience. By focusing on creating valuable, high-quality content, you're more likely to weather the storm of algorithm changes and maintain a strong presence in search results.

3. According to Google, Small Websites Can Win in the Ever-Evolving Search Landscape - In the digital age, where the internet seems dominated by large players, it's easy to wonder if small websites still have a fighting chance. Danny Sullivan, Google's Search Liaison with over 25 years in the search space, shared an encouraging perspective that's vital for you as a business owner to understand. Search engines have evolved tremendously since the pre-Google era, and with each update, concerns arise about the visibility of small sites. Yet history has shown us that small sites have not only survived but can thrive and become leaders in their niches. Sullivan himself has witnessed this growth firsthand, having run two successful small sites before joining Google. The key takeaway? Google's goal is to reward great content, regardless of the site's size. They are committed to the success of the open web ecosystem, where quality content leads to satisfaction all around: for searchers, content creators, and the search engines themselves. However, the web is dynamic, with changing content and shifting user expectations. Google continuously adapts, striving to improve search results. This includes recognizing valuable contributions from various sources, including forums and personal experiences, ensuring a diverse mix of results. For you, the message is clear: don't get caught up in an "expert arms race" or create content solely with Google's algorithms in mind. Instead, focus on what benefits your readers. Author bios, for instance, should be crafted for your audience, not for search engine optimization. Quality content that serves your readers' expectations aligns naturally with Google's ranking principles. Sullivan emphasizes that there's no checklist for success. Instead, he points to guidelines that help assess if content is genuinely people-first, such as providing original information or analysis. If your content consistently delivers value and leaves visitors thinking, "Wow, that was great. I learned something. That was helpful," you're on the right track. Remember, it's not about being labeled as an "expert" or having the perfect About page; it's about creating content that resonates with and serves your audience. As a business owner, focusing on delivering great content is your key to thriving in the search landscape. P.S. For more insights on creating helpful content, visit Google's guide here: Creating Helpful Content.

4. Unpacking the Real Impact of Keyword Stuffing on Your Site - In the realm of SEO (Search Engine Optimization), "keyword stuffing" has been a buzzword for all the wrong reasons. But what does it truly mean for your website's content? Danny Sullivan, Google's Search Liaison, sheds light on this topic, clarifying that keyword stuffing isn't about the number of times a word is used. Instead, it's about the context and quality of your writing. Keyword stuffing refers to cramming a webpage with keywords or numbers to manipulate a site's ranking in Google search results. This often results in content that sounds unnatural or out of context. For example, repeating "unlimited app store credits" to an excessive degree can be flagged as keyword stuffing. It's not the repetition per se that's the issue; it's when the repetition leads to non-sensible patterns and unhelpful writing. Sullivan advises against writing with Google's algorithms as your primary audience. Phrases shouldn't be repeated unnaturally just to ensure Google "gets it." Instead, focus on how people consume content. Google's sophisticated language analysis can understand meaning and concepts without the need for over-repetition. The best practice is to write naturally, as if you're explaining something to a person, not a search engine. This insight is crucial for you as a business owner. The content on your website should be crafted for your customers, providing them with value and a pleasant reading experience. If your content is engaging, informative, and written in a natural tone, you're already aligning with Google's preference for high-quality, user-focused content. Remember, the goal is not to outsmart search engines but to create content that genuinely serves your audience. By doing so, you'll naturally improve your site's SEO and foster a trustworthy relationship with your customers.

5. Write for Your Readers, Not for Google - In the digital world, where Google's algorithms can feel like an enigma, Danny Sullivan, Google's Search Liaison, offers a simple yet profound piece of advice: "Stop thinking 'What should I do for Google?' when writing content." Instead, focus on what your readers need and want. This shift in perspective is not just refreshing; it's crucial for creating content that resonates with your audience and, ironically, performs well on Google too. Sullivan's guidance comes as a reminder that the content you produce should cater to the interests and queries of your readers, not to the perceived demands of search engines. The essence of his message is to prioritize the quality and usefulness of your content over SEO tactics. Google's ranking systems are designed to reward content that serves readers well, which is why your primary goal should be to address the needs and questions of your audience. This approach is more than just a best practice; it's a strategic move in the ever-evolving landscape of SEO. By focusing on your readers, you're likely to naturally include the keywords and topics they're searching for, which aligns with how Google assesses and ranks pages. Sullivan reiterates this point, encouraging content creators to write in a way that's most helpful for their audience. For you, this means taking a step back from the technicalities of SEO and asking yourself: What information does my audience seek? How can I provide value in a way that's engaging and informative? The answers to these questions will guide you in crafting content that stands out, not just to search engines, but more importantly, to the people you aim to serve. In conclusion, let go of the "What should I do for Google?" mindset. Instead, embrace a reader-first approach to content creation. It's a strategy that will serve your business well, building trust with your audience and, as a result, with search engines too.

6. Vintage or Outdated? Google's Take on Old Content's Value - Understanding the importance of content relevance is crucial for any business owner. The digital space is often seen as a race to stay current, but Sullivan's insights suggest that the true value lies in the content's quality and utility to the reader, not its publication date. He states, "Just because something is older doesn't make it unhelpful." This is a significant point, especially for news publishers and content creators who worry about the need to constantly update their articles to maintain relevance. The conversation arose from a question about the feasibility of news publishers updating old articles. Sullivan's response was clear: if the content is written with the audience in mind and remains relevant, it doesn't lose its value over time. This perspective is part of Google's broader guidance, which emphasizes creating content that prioritizes the reader's experience. For you, this means that the evergreen content on your site (those pieces that provide timeless value) can continue to serve your audience and contribute to your site's authority. It's a reminder to focus on quality over quantity and to consider the lasting impact of what you publish. In summary, don't rush to discard or rewrite your older content just because of its date. Evaluate its ongoing relevance and usefulness to your audience. If it still answers questions, solves problems, or provides insight, it remains a valuable asset to your website.

7. The Truth About Schema: Does It Really Boost Your Google Ranking? - Are you looking to improve your website's ranking on Google? You might have heard that adding schema, or structured data markup, to your pages can give you a leg up in search results. However, Danny Sullivan, Google's Search Liaison, has clarified a common misconception: schema does not directly improve your site's ranking. Schema markup is code that you put on your website to help search engines return more informative results for users. While it's true that schema can enhance the way your page appears in search results (potentially increasing click-through rates), it's important to understand that it doesn't give your page a ranking boost. Sullivan emphasizes, "Using schema doesn't give you a ranking boost. It can help you be eligible for certain displays or enhancements, but it doesn't somehow boost you to the top of results." This clarification is crucial for business owners like you who are investing time and resources into SEO strategies. It's easy to get caught up in the myriad of tips and tricks out there, but Google's stance is clear: schema is about aiding search engines in understanding your content, not about climbing the search result ladder. Moreover, if your site violates Google's schema guidelines, it won't impact your ranking. Instead, it may disqualify your site from being eligible for those rich results or enhancements that schema can provide. Sullivan's message is a reminder to focus on creating quality content first and foremost. The use of schema should be viewed as a tool for improving user experience, not as a shortcut to higher rankings. As a business owner, it's essential to align your SEO efforts with practices that genuinely help your audience. By focusing on delivering valuable, high-quality content, and using schema as it's intended (to enhance user understanding), you're setting your site up for long-term success. Remember, the goal is to make your content helpful and accessible to your audience.
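To make this concrete, here is a minimal JSON-LD sketch of schema markup. The business name, URL, phone number, and address are hypothetical placeholders; the property names follow schema.org's LocalBusiness type:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "url": "https://www.example.com",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
```

Embedded in a page inside a script tag of type application/ld+json, markup like this helps search engines understand what your page is about; per Sullivan's point above, it makes you eligible for richer displays, not higher rankings.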
Schema can contribute to this by making your content more understandable to search engines, which in turn helps users find the great content you're providing.

8. Boost Your Video's Visibility: Google's New Structured Data Guidelines - Google has updated its guidelines for video structured data with new recommendations that could enhance your video content's visibility in search results. Structured data is a standardized format for providing information about a page and classifying the page content. For videos, this means using specific code to communicate details to Google about the video content on your site, which can then be displayed in rich search results. The key change is that Google now requires not just the date but the exact time a video was published, and also when the video expires, using the ISO 8601 format. This update aims to make the publication time more precise, which can be crucial for timely content. Google suggests including timezone information with this data; otherwise, it defaults to the timezone of Googlebot, which may not align with your local time. If you ignore these new recommendations, the only immediate consequence is the potential for a timezone mismatch. However, it's always wise to present structured data as Google expects. Think of it as fine-tuning your content for optimal performance: you're giving Google exactly what it's looking for, which can only benefit your site. In summary, take the time to update your video content with the precise publication time and timezone. It's a simple step that aligns with Google's best practices and could make a noticeable difference in your content's online presence.

9. Google's Sep'18 Top Money-Making Search Queries Revealed - Have you ever wondered which search queries are the golden tickets in the vast lottery of Google searches? A recent revelation from an ongoing antitrust trial has shed light on this very topic, and the insights are invaluable for any business looking to capitalize on Google's search engine. For a single week in September 2018, terms like "iPhone," "auto insurance," and "cheap flights" were the most profitable for Google in the United States. This information, previously shrouded in secrecy due to Google's tight-lipped policies on revenue data, was disclosed in a heavily redacted slide. While the exact figures remain undisclosed, the list of terms provides a fascinating glimpse into user behavior and market trends. Why does this matter to you? Understanding these high-revenue search terms can inform your digital marketing strategy. If your business is related to any of these lucrative categories, there's a clear opportunity to align your online content and advertising efforts with what users are actively seeking. The top revenue-driving searches ranged from tech products to insurance and entertainment services, indicating diverse areas where Google users are willing to spend their money. This diversity suggests that there's room for businesses of all kinds to find their niche in the search engine landscape. In conclusion, while the specifics of Google's earnings from these searches are not public, the takeaway for you is the importance of targeting the right keywords in your SEO and PPC campaigns. By focusing on terms related to your business that have proven profitable, you can potentially increase your visibility to high-intent customers and drive more revenue through your online channels.

10. Google Ads Click Tracker Policy - In the digital age, tracking the performance of your online ads is crucial for understanding customer behavior and maximizing your return on investment. Google Ads has recently updated its policies, and there's an important change you should be aware of: the requirement to use certified click trackers. A click tracker is a tool that records clicks on your ads, providing valuable data on how users interact with your advertising. Google has now mandated that, for accounts using click trackers for the first time after September 11, 2023, only certified click trackers will be accepted. Ads using non-certified trackers may be disapproved. This policy will be fully enforced over the next 12 to 18 months, with the timing depending on factors like the advertiser's history and account activity. Why is this important for you? Compliance with Google's policies ensures that your ads remain active and effective. Using a certified click tracker not only aligns with Google's standards but also guarantees that the data you collect is reliable and actionable. If you're currently using a third-party click-tracking service, it's time to check if it's on Google's approved list. If not, you'll need to switch to a certified provider or risk having your ads disapproved. This may seem daunting, but it's a step towards more transparent and trustworthy ad tracking. For those who provide click-tracking services, Google has opened applications for certification. Meeting their guidelines means you can continue to serve businesses like yours without interruption.
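To put item 8's timing guidance into practice, here is a short Python sketch of building video structured data with an ISO 8601 publication time that carries a timezone offset. The video title, the specific times, and the US Eastern offset are hypothetical; the property names follow schema.org's VideoObject type:

```python
import json
from datetime import datetime, timezone, timedelta

# Hypothetical publication time in US Eastern (UTC-5, ignoring DST for this
# sketch). Attaching the offset to the timestamp keeps Google from falling
# back to Googlebot's timezone.
eastern = timezone(timedelta(hours=-5))
published = datetime(2023, 11, 1, 9, 30, tzinfo=eastern)
expires = datetime(2024, 11, 1, 9, 30, tzinfo=eastern)

video_schema = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example product demo",       # hypothetical title
    "uploadDate": published.isoformat(),  # "2023-11-01T09:30:00-05:00"
    "expires": expires.isoformat(),
}
print(json.dumps(video_schema, indent=2))
```

Because the offset is part of each timestamp, there is no ambiguity about which timezone the publication time refers to.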

#TWIMshow - This Week in Marketing
Ep 183: Microsoft's New Automated Bidding Features for Your Business!

#TWIMshow - This Week in Marketing

Play Episode Listen Later Oct 23, 2023 13:41


Episode 183 contains the important Digital Marketing News and Updates from the week of Oct 16-20, 2023.

1. Microsoft's New Automated Bidding Features for Your Business! - Microsoft Advertising has rolled out two automated bidding strategies, Target CPA and Maximize Conversions, to all advertisers. These features aim to simplify the advertising process and are now available in all regions where Microsoft's Audience Ads are served. Automated bidding strategies help you optimize your ad campaigns without constant monitoring, freeing up your time for other business tasks. Target CPA allows you to set a cost-per-acquisition goal, and the system will automatically adjust your bids to meet that target. Maximize Conversions, on the other hand, aims to get you as many conversions as possible within your set budget. Both strategies offer flexibility in budgeting and performance measurement, putting you in control while also reducing manual effort. Microsoft's Audience Ads are native display ads that leverage Microsoft's data insights to target your desired audience effectively. These ads appear on various platforms like MSN, Outlook, and more.

2. Google's October 2023 Core and Spam Updates Completed! - Google has finished rolling out two significant updates that you should be aware of: the October 2023 Core Update and the October 2023 Spam Update. These updates can affect your website's visibility on Google, so let's break it down. October 2023 Core Update: This update took almost 14 days to roll out completely, starting on October 5th and ending on October 19th. It had a significant impact on SEOs and websites, causing a lot of volatility in search rankings. The update is global and affects all types of content. If you've noticed changes in your website's performance, this could be why. Google hasn't specified what percentage of queries were impacted, but it's safe to say that the reach is wide. October 2023 Spam Update: This update started on October 4th and ended on October 20th. It targets spam techniques that violate Google's policies, particularly in languages like Turkish, Vietnamese, Indonesian, Hindi, and Chinese. If your website was affected, Google suggests reviewing its spam policies. Interestingly, these two updates overlapped, making it challenging to pinpoint which update caused changes to your website's rankings. If you're not involved in spammy practices and see changes, it's likely due to the Core Update. If you do engage in spam and notice alterations, it's probably the Spam Update. Why should you care? These updates can either elevate your website's visibility or push it down the search rankings. If you've been hit, it's crucial to adapt your strategies accordingly. Need expert guidance? If all this sounds overwhelming, we're here to help. Contact us for expert advice on navigating these Google updates seamlessly.

3. YouTube's New AI-Powered "Spotlight Moments" - YouTube has launched an advertising feature called "Spotlight Moments," designed to boost your brand's visibility during key cultural events. This new feature uses artificial intelligence to place your brand's videos next to the most relevant and engaging content related to specific cultural moments, like Halloween or Christmas. Here's how it works: YouTube uses AI to identify popular videos related to these cultural moments. Your ads will then be served alongside this content, giving your brand a chance to shine during these high-traffic periods. All these videos will be curated into dynamically updated playlists on a sponsored hub called the "YouTube Culture Hub," which will also feature your brand's logo. "Spotlight Moments" offers a unique opportunity to increase your brand's visibility and reach your target audience during times when they are most engaged. The big question, of course, is what kind of return on investment (ROI) you can expect from this new ad package.

4. YouTube's New Product Targeting Features! - YouTube is introducing a new feature that allows creators to add timestamps to tagged products in their videos. This new feature aims to enhance your advertising strategy by targeting potential customers at the most opportune moments. When a timestamp is added to a tagged product, a shopping cart button appears at a time when viewer engagement is expected to be high. This precision in targeting can result in increased engagement and conversions. The feature is especially timely because YouTube plans to remove some ad controls for newly uploaded videos in November: individual controls for Pre-roll, Post-roll, Skippable, and Non-skippable ad types are going away. Moving forward, creators will only be able to decide between displaying ads before or after a video, and whether to have this option on or off; if they turn it on, YouTube will automatically decide which ad type to display as appropriate. In the meantime, timestamps give creators and sponsors more control. But that's not all. YouTube is also rolling out additional features to improve the shopping experience. One such feature allows creators to bulk tag affiliate products in video libraries, potentially earning revenue from older, high-traffic content. Additionally, a new reporting tool will soon be available in YouTube Studio to reveal which affiliate products are generating the most revenue for brands.

5. Google Ads Certification Now Requires Video Recording: What You Need to Know! - Google Ads has recently updated its certification process, adding a new layer of scrutiny for those taking the exam. The new rules require test-takers to record themselves during the exam, among other requirements. Specifically, you'll need to take a picture of your photo ID, a picture of your face, and record a video of your test-taking environment. These steps are part of Google's efforts to ensure exam integrity. During the exam, a live proctor will monitor your activity via webcam and screen-sharing software. If any unauthorized behavior is noticed, the proctor will pause your exam and message you. The guidelines also prohibit the use of open books, notes, scratch paper, and even restroom breaks during the exam. The new rules have stirred some reactions in the community, with some finding them overly strict, especially for those who work from home and may have other people around. According to Google, the changes aim to maintain the credibility of the Google Ads certification, making it a more reliable indicator of expertise. However, I disagree with Google.

6.

mixxio — podcast diario de tecnología
Hold my nanometer, here I go

mixxio — podcast diario de tecnología

Play Episode Listen Later Sep 29, 2023 15:50 Transcription Available


Bias based on your video call background / Intel Ireland kicks off 4 nm production / A surprise Raspberry Pi 5 / Nvidia raided in France / Design your own galaxy. Sponsor: The Creator, one of the most anticipated science fiction films, finally arrives in theaters. It premieres on September 29 and you can't miss it for anything in the world. Directed by Gareth Edwards (his first film after Rogue One), I'm sure The Creator will be an instant classic. Have you seen the trailer yet?

#TWIMshow - This Week in Marketing
Ep176- Google Ads Limited Ad Serving Policy: What You Need to Know

#TWIMshow - This Week in Marketing

Play Episode Listen Later Sep 4, 2023 15:40


Episode 176 contains the notable Digital Marketing News and Updates from the week of August 28 - Sep 1, 2023.

1. Google Announces Lighthouse 11 with New Accessibility Audits and LCP Bug Fix - Google PageSpeed Insights (PSI) is a free tool to help you find and fix issues slowing down your web application. PSI reports on the user experience of a page on both mobile and desktop devices, and provides suggestions on how that page may be improved. An open-source tool called Lighthouse collects and analyzes lab data that's combined with real-world data from the Chrome User Experience Report dataset. Google has released the latest version (v11) of Lighthouse, which includes a number of new features and improvements:

- New accessibility audits: Website accessibility is not currently a ranking factor and quite likely not a quality signal. However, it's a best practice for a website to function correctly for as many people as possible. Lighthouse 11 introduces thirteen new accessibility audits that help developers identify and fix accessibility issues on their websites.
- Changes to how best practices are scored: Lighthouse 11 has changed the way that best practices are scored. This makes it easier for developers to understand how their websites are performing and what they can do to improve their scores.
- Largest Contentful Paint scoring bug fixed: Lighthouse 11 has fixed a bug that was affecting the scoring of Largest Contentful Paint (LCP). LCP is a measure of how long it takes for the largest content element on a page to become visible.
- Interaction to Next Paint (INP) updated to reflect that it's no longer experimental: Lighthouse 11 has updated the INP metric accordingly. INP measures how quickly a page responds when a user interacts with it.

In my opinion, INP is in line to become an official Core Web Vital in 2024. These changes make Lighthouse 11 a more powerful and useful tool for developers and webmasters who want to improve the performance of their websites.

2. YouTube Creators Can Now Remove Community Guideline Strikes - YouTube has announced that creators will now be able to remove Community Guideline strikes from their channels by completing educational courses. This is a new policy that was introduced in June 2023, and it is designed to help creators learn about the Community Guidelines and how to avoid violating them in the future. To remove a strike, creators will need to complete a course that covers the Community Guidelines and how to create compliant content. The course is available in several languages, and it takes about an hour to complete. Once the course is completed, the strike will be removed from the channel. This new policy is a positive step for YouTube creators. It provides them with a way to learn from their mistakes and avoid getting strikes in the future. It also shows that YouTube is committed to creating a safe and positive environment for its users.

3. Your Site's Language Doesn't Protect It From Google's Penalty - Google can issue manual actions to any site, regardless of the language it is written in. This means that sites written in non-native English can still be penalized by Google if they violate the company's webmaster guidelines. Google Search Advocate John Mueller was asked if Google penalizes sites written by non-native English writers. Mueller responded that manual actions and algorithm changes are independent of the native language of the authors or the site language. He also said that Google does not have a list of "bad" languages, and that all sites are treated equally. This means that site owners who write in non-native English need to be just as careful as those who write in English as their first language. They should avoid any practices that could lead to a manual action, such as keyword stuffing, cloaking, and duplicate content.

4. How Googlebot Handles AI-Generated Content

#TWIMshow - This Week in Marketing
Ep174- Google Doesn't Use Third-Party SEO Tool Scores for Rankings

#TWIMshow - This Week in Marketing

Play Episode Listen Later Aug 21, 2023 16:29


Episode 174 contains the notable Digital Marketing News and Updates from the week of August 14-18, 2023.

1. Google Ads Sunsetting Enhanced CPC on Shopping Campaigns - Google Ads Shopping campaigns will no longer use Enhanced cost-per-click (eCPC) starting in October. Instead, Shopping campaigns using eCPC will behave as though they are using Manual CPC bidding, according to an email Google sent advertisers. Google explained that it is adopting more advanced strategies and campaigns as its technology improves. Google pointed out that eCPC was launched more than 10 years ago, and new strategies, such as target ROAS, Maximise conversion value, and fully automated campaigns like Performance Max, can help you achieve the same or better results. If you are using eCPC, Google recommends taking the following actions:

- In your Standard Shopping campaigns, try the one-click Target ROAS experiments for Shopping, which you can find in your campaign settings.
- Alternatively, you can trial Google Ads' newest fully automated solution, Performance Max campaigns.

2. Google Testing Small Advertisers Premium Support - Not sure if this is true, but Search Engine Journal reported that Google Ads has launched a new pilot program to provide enhanced customer service for a select group of small Google Ads customers. The goal of the paid pilot program is to "... provide agencies and advertisers with specialized one-on-one support tailored to specific customer needs." This marks a shift for Google, which has historically reserved this high-touch level of support for its largest advertising clients. My thoughts in the show. Listen and find out.

3. Google Makes it Easier for Customers to Find You on Social Media - Google has announced that businesses can now add social media links to their Google Business Profiles. This new feature allows businesses to make it easier for customers to find and connect with them on social media. To add social media links to your Google Business Profile, you will need to:

- Go to your Google Business Profile.
- Click on the Edit button.
- Click on the Contact tab.
- Under Social profiles, enter the URL for each social media platform where you have a presence.
- Click on the Save button.

Once you have added your social media links, they will be displayed on your Google Business Profile. Customers will be able to click on the links to visit your social media profiles.

4. Google: Domain Name Does Not Make Or Break Your SEO - We've covered this topic in the past; however, it came up this week, so we are covering it again. Here is what Google's John Mueller wrote: "An #seo question from the X-Twitter world: In the domain name, is the use of dash ( - ) recommended or not? It's fine. Pick a domain name for your brand for the long run, don't just collect keywords (the common reason for dashes). Build out a domain. For SEO, dashes are very minimally better in URLs than underscores. Don't change your URLs for them tho. Don't use spaces, commas, colons, etc in URLs. Your domain name is never going to make or break your SEO."

5. Is It Okay to Use noindex to Remove a Page from SERP Sitelinks? - Google's John Mueller answered a question about whether it's okay to use the noindex meta tag to remove a page from SERP sitelinks. Here is the original question: "Is noindex a good way to get a page out of a search result sitelinks? Should be an option in web console imho but a specific real estate agent's page is part of the website sitelinks where there are lots of other pages like About etc. that should be there instead. Should I temporarily noindex the agent's page to get it off the sitelinks?" What are SERP sitelinks? SERP sitelinks are clustered links in the search results from one domain and are typically shown when someone searches for the name or URL of a website. And a noindex meta tag is a directive, which means that search engines are obligated to obey the request. In the case of a meta noindex tag, this means that search engines are required to drop a webpage from the index. In simple terms, there's an indexing engine, which is the part with Googlebot that goes out, crawls the web, and acquires website content for possible inclusion into Google's index. And there's also a ranking engine. Mueller said that it's generally fine to use noindex for this purpose, but there are a couple of things to keep in mind: "I suspect (computers do weird things, so no guarantees :-)) what would happen is we'd drop it (during the noindex) and return it to normal (same state as before) when you remove the noindex. We wouldn't see a temporary noindex as a signal that you like it a little bit less — it's either indexable or not, the ranking side is separate from the indexing." Google lists a number of steps that publishers can take to keep less desirable pages out of the sitelinks in their official documentation:

- Make sure that the text you use as your page titles and in your headings is informative, relevant, and compact.
- Create a logical site structure that is easy for users to navigate, and make sure you link to your important pages from other relevant pages.
- Ensure that your internal links' anchor text is concise and relevant to the page they're pointing to.
- Avoid repetitions in your content.

However, it still does not solve the mystery as to why a less desirable page is showing up in SERP sitelinks. Maybe what is less desirable to the site owner is more desirable to Google?

6. Why Your Site Might Not Be Ranking in Google Despite Good SEO? - During the August 2023 SEO office hours, someone asked, "Why is my site not ranking despite low competition and good SEO? I have a sitemap, indexed pages, updated content, backlinks and on-page optimization. But my site is still not in the top 200 results for my keywords." Before I jump into what Google's John Mueller said in his reply, I want to point out that "good" is subjective here. It depends on the expertise, knowledge, and competence of the individual making that judgment. Until failure makes a knowledge gap obvious, it is difficult to be aware of what one does not know. So "good" SEO might mean different things. Anyway, now on to how Mueller responded. First he said, "I see this kind of question often. Google tends to focus on various technical aspects when it comes to talking about SEO, but you need to do more. A good way to think about this is to compare it to the offline world. When it comes to books, does a good cover photo, a reasonable sentence length, few misspellings, and a good topic mean that a book will become a best-seller? Or as a restaurant, does using the right ingredients and cooking in a clean kitchen mean you'll get a lot of customers? Technical details are good to cover well, but there's more involved in being successful." Hmm. So Mueller responded to the question in riddles. IMO, what he is saying is that in addition to technical details you need to have good quality content, aka helpful content. More of my thoughts in the show.

7. Google Doesn't Use Third-Party SEO Tool Scores for Rankings - Google has clarified its stance on third-party SEO tool scores, stating that Google does not use these scores to determine search rankings. "Once again, no, Google does not use scores from third-party SEO tools for search. However, that doesn't mean that they're all useless." This means that while these tools can be helpful for providing insights into a website's SEO performance, they should not be relied upon as a definitive measure of success. "Transparently calculated scores can be useful to estimate your website's standing or to point out actual issues.
They could help with the next steps or perhaps even qualify the work that was done.”John Mueller, a Google Search Advocate, made the clarification in a recent episode of the "Ask Googlebot" video series. He explained that Google's algorithms are constantly evolving, and that they take into account a variety of factors when ranking websites. These factors include the quality and relevance of the content, the number and quality of backlinks, and the user experience.Mueller mentioned Google's Chrome Lighthouse tool as a helpful tool that doesn't directly impact rankings. Lighthouse generates scores for website performance, but Google doesn't use them to rank websites in search results. However, Mueller explained the performance scores from Lighthouse could benefit you: “The score is transparently created based on various tests, which you can look at. With the overall score, you can estimate how well your website performs for those tests.” Website analytics scores can help you find problems, such as utilizing overly brief anchor text that could negatively impact site navigation. Though these concerns may not directly affect search engine ranking positions, they could still affect user experience and click-through rate.He said that these tools can be helpful, but they should not be used as a substitute for understanding how Google's algorithms work. Website authority scores, spam scores – Google doesn't use any of those in ranking. It's just an SEO myth that won't die.
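For reference, the noindex directive discussed in item 5 is normally placed in a page's head as a robots meta tag (it can also be sent as an X-Robots-Tag HTTP header). A minimal sketch:

```html
<!-- Asks all search engines to drop this page from their index -->
<meta name="robots" content="noindex">

<!-- Or, to target Google's crawler specifically -->
<meta name="googlebot" content="noindex">
```

Note that, per the discussion above, removing the tag later returns the page to its previous state; Google treats indexability as binary rather than as a ranking signal.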

#TWIMshow - This Week in Marketing
Ep172-Blocking Banner Ads From Getting Indexed


Aug 7, 2023 · 16:04


Episode 172 contains the notable Digital Marketing News and Updates from the week of July 31 - August 4, 2023.

1. OpenAI's GPTBot: The New Web Crawler That's Changing the Game - OpenAI has announced the launch of GPTBot, a new web crawler that gathers web data for its large language models. GPTBot is designed to be more efficient and effective than traditional web crawlers, and it can be used to collect data for a variety of purposes, including research, development, and marketing. You can use robots.txt to block GPTBot from accessing your website, or parts of it. To disallow GPTBot from accessing your site, add GPTBot to your site's robots.txt:
User-agent: GPTBot
Disallow: /
To allow GPTBot to access only parts of your site, add the GPTBot token to your site's robots.txt like this:
User-agent: GPTBot
Allow: /directory-1/
Disallow: /directory-2/
GPTBot documentation: You can read the documentation on GPTBot. GPTBot IP ranges: OpenAI also published the IP ranges that GPTBot uses. It only lists one, but I suspect they will add more over time.

2. Google Releases Responsive Search Ads Guide - Responsive search ads (RSAs) are a type of ad that allows you to create multiple headlines and descriptions; Google then tests different combinations to see which ones perform best. This can help you create more effective ads that are more likely to get clicks. To create an RSA, you can provide up to 15 headlines and 4 descriptions. Google will test different combinations of these headlines and descriptions, and the best-performing combinations will be used in your ads. Now Google has released a Responsive Search Ads guide to help marketers improve their campaign performance. The new RSA guide includes Google's advice and information on how:
- To write ads for maximum conversions. RSAs use Google AI to generate ad copy.
- To evaluate a campaign's effectiveness.
A Google spokesperson said: "Our goal is to simplify the work it takes to create high-quality ads that achieve your business objectives at scale. To do this, we're hard at work applying the newest AI-powered innovations across asset generation, customizing assets for relevance, and both simplifying and improving tools for managing ads." Read Google's Responsive Search Ads guide (PDF) for more information.

3. YouTube Shorts Adds Six New Features - YouTube Shorts, the short-form video platform from YouTube, has launched a number of new features to improve content creation. The new features rolled out on YouTube Shorts are:
- Collab – This new feature allows creators to film a Short alongside other YouTube or Shorts videos in a side-by-side format. Multiple layout options are available.
- Q&A stickers – Content creators can use this new feature to ask viewers questions. The audience can then leave their replies in the comments section, enabling creators to give them a shoutout as they can see who left the response.
- Mobile-first vertical live – This new feature is what will allow live creators to get discovered in the Shorts feed. YouTube is hoping this will enable creators to connect live with a new audience and build communities in a more modern way.
- Creation suggestions – YouTube can now automatically bundle the audio and effects from a Short you may be remixing. The platform can surface the same audio timestamp from the Short you just watched, and add the same effect as a creation suggestion. Content creators also have the option to mix and match.
- Shorts playlist – Creators can now save Shorts to playlists directly on YouTube.
- Transform horizontal videos to Shorts – This new feature is being trialled in the coming weeks. Users will be able to choose a video to remix and adjust the layout, zoom, and crop to turn it into a Short.

4. YouTube's New AI-Generated Video Summaries: A Quick Way to Get the Cliff Notes - YouTube is testing a new feature that uses artificial intelligence to generate summaries of videos. The summaries are displayed in a small box below the video player, and they provide a brief overview of the video's content. "We're starting to test AI auto-generated summaries on YouTube, so that it's easier for you to read a quick summary about a video and decide whether it's the right fit for you. To begin with, you may see these summaries on watch and search pages. While we hope these summaries are helpful and give you a quick overview of what a video is about, they do not replace video descriptions (which are written by creators!)." The summaries are generated using natural language processing, which allows computers to understand and process human language. The summaries are not perfect, but they can be a helpful way to get a quick overview of a video before you decide to watch it. Remember, generative AI produces a lot of misleading, untrue, and confused content.

5. Linking to High-Authority Sites Won't Help Your SEO - Google Search Advocate John Mueller recently clarified that linking to popular sites like Wikipedia does not impact search rankings, dispelling a myth that has persisted among SEO practitioners. He stated that linking to high-authority sites does not help SEO, and that the focus should be on providing useful links that benefit users. Mueller illustrated his point with a humorous example: "Here's my affiliate site about handbags – and here's a link to CNN & Wikipedia, please take me seriously now, k?" His statement underscores the misconception that linking to high-authority sites is akin to borrowing their credibility. Instead, he advised: "Does this link provide additional, unique value to users? Then, link naturally. Is this link irrelevant to my users? Then don't link to it."

6. Keyword Stuffing: How Many Times Should You Use a Keyword on a Page? - Keyword stuffing is the practice of repeating the same keywords or phrases too many times on a page in an attempt to manipulate search engine rankings. Google has become very good at detecting keyword stuffing, and pages that engage in this practice are often penalized in search results. In a recent Twitter thread, Google's John Mueller said that there is no need to worry about keyword stuffing if you only use a keyword 10-20 times on a page. However, if you unnecessarily overuse a keyword, then you may be engaging in keyword stuffing and could be penalized by Google. The question you should ask yourself: "Will the reader be ok with having to read the keyword multiple times?"

7. Google Releases Appeal Form For Ads Trademark Policy Violations - Google announced a change to its Google Ads trademark policy and will require advertisers to file an appeal form if their ad is flagged for trademark infringement. Previously, advertisers could simply click a button to appeal a flagged ad, but the new process requires them to provide more information about their ad and why they believe it is not infringing on a trademark. The change is intended to simplify and speed up the appeals process for advertisers, as well as to provide more clarity for trademark holders about why their trademarks are being used in Google Ads. However, some advertisers have complained that the new process is too cumbersome and time-consuming.

8. Blocking Banner Ads From Getting Indexed - Someone on Reddit asked Google's John Mueller if it is possible to prevent banner ads from getting indexed. The person was worried that Googlebot would consider banner ads duplicate content, since the same banner appears on every page of the site. John simply answered: "You cannot noindex a part of a page like that." And John is **somewhat** correct. However, if you have a good understanding of semantic HTML, you will know that to avoid potential confusion and issues, you should place your banner ads outside of the "main" element - or, if an ad appears within "main", wrap it in an "aside" element. While Google can generally identify which content is advertising and which is the main content, don't leave it to chance: use semantic HTML markup to make it clear for Google. It's an SEO best practice to make the page structure exceedingly clear for Google.
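To illustrate the semantic-HTML approach from item 8, here is a minimal page skeleton. The element names ("main", "aside") are standard HTML5; the class name and placeholder content are made up for illustration:

```html
<body>
  <header>Site navigation…</header>
  <main>
    <h1>Product review</h1>
    <p>The unique content of this page…</p>
    <!-- An ad placed inside main is marked as tangential with aside -->
    <aside class="banner-ad">Ad markup here…</aside>
  </main>
  <!-- Ads outside main are clearly not part of the primary content -->
  <aside class="banner-ad">Ad markup here…</aside>
  <footer>Footer links…</footer>
</body>
```

The same banner repeated site-wide then sits in elements that signal "supporting content" rather than the page's main content.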

Believe you can because you can!
Unearthed SEO Opportunities: How Googlebot Ignores Slow Pages with Julia Nesterets (#591)


Aug 3, 2023 · 52:52


Speed matters. In our next episode, we’re talking to Julia Nesterets, who will guide us on an untrodden SEO path – the correlation between slow pages and Googlebot’s behavior. In an era where user experience is everything, we often overlook how our site speed can make or break our SEO efforts. But there’s an underlying…

Voices of Search // A Search Engine Optimization (SEO) & Content Marketing Podcast
JavaScript that Doesn't Hurt SEO Performance -- Serge Bezborodov // JetOctopus


Jul 12, 2023 · 13:55


Serge Bezborodov, CTO and Co-Founder of JetOctopus, talks about log files and Googlebot. Though it's not usually the SEO's decision, JavaScript integration becomes inevitable once you have a large website. JavaScript presents several challenges to SEO due to its dynamic nature and potential impact on website crawlability, indexation, and user experience. Today, Serge discusses JavaScript that doesn't hurt your SEO performance.

Show Notes
Connect With:
Serge Bezborodov: Website // LinkedIn
The Voices of Search Podcast: Email // LinkedIn // Twitter
Benjamin Shapiro: Website // LinkedIn // Twitter
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Voices of Search // A Search Engine Optimization (SEO) & Content Marketing Podcast
Interlinking Structure's Impact on Googlebot -- Serge Bezborodov // JetOctopus


Jul 11, 2023 · 14:13


Serge Bezborodov, CTO and Co-Founder of JetOctopus, talks about log files and Googlebot. Effective internal linking plays a vital role in ensuring the crawlability of your website. By strategically placing links between relevant pages, you create a roadmap that search engine bots can follow to discover and index your content more efficiently. Today, Serge discusses interlinking structure's impact on Googlebot.

Show Notes
Connect With:
Serge Bezborodov: Website // LinkedIn
The Voices of Search Podcast: Email // LinkedIn // Twitter
Benjamin Shapiro: Website // LinkedIn // Twitter
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
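One practical way to act on the "roadmap" idea above is to model your internal links as a graph and check which known pages a bot following links from the homepage would never reach. A rough, hypothetical sketch (the page paths and link map are invented for illustration):

```python
from collections import deque

def unreachable_pages(links, start="/"):
    """Return known pages a crawler following internal links from
    `start` would never discover. `links` maps page -> linked pages."""
    seen = {start}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in seen:
                seen.add(target)
                queue.append(target)
    # Any known page not reached from the start page is an orphan
    return sorted(set(links) - seen)

# Hypothetical site: /old-promo links out but is linked from nowhere
site = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1", "/"],
    "/products": ["/products/widget"],
    "/old-promo": ["/products"],
}
print(unreachable_pages(site))  # ['/old-promo']
```

Orphaned pages like the one flagged here are exactly the kind of URLs that depend on sitemaps (or luck) rather than internal links to get crawled.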

Voices of Search // A Search Engine Optimization (SEO) & Content Marketing Podcast
Controlling Googlebot Thru Log Files Analysis -- Serge Bezborodov // JetOctopus


Jul 10, 2023 · 13:35


Serge Bezborodov, CTO and Co-Founder of JetOctopus, talks about log files and Googlebot. Unlike typical SEO crawlers, Googlebot has the ability to crawl pages that may not be present in your website's existing structure or indexed. What we want to avoid is valuable pages not being crawled and indexed while Googlebot's crawl budget is wasted on outdated or low-quality pages. Today, Serge discusses controlling Googlebot through log file analysis.

Show Notes
Connect With:
Serge Bezborodov: Website // LinkedIn
The Voices of Search Podcast: Email // LinkedIn // Twitter
Benjamin Shapiro: Website // LinkedIn // Twitter
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
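As a taste of what log file analysis looks like in practice, here is a simplified sketch that counts Googlebot requests per URL from combined-log-format lines. The regex and sample lines are deliberately minimal assumptions; a real analysis should also verify the bot via reverse DNS, since any client can claim to be Googlebot in its user agent:

```python
import re
from collections import Counter

# Minimal pattern for a combined-log-format line: request path + user agent
LINE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_hits(log_lines):
    """Count how often a Googlebot user agent requested each path."""
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("ua"):
            hits[m.group("path")] += 1
    return hits

sample = [
    '66.249.66.1 - - [10/Jul/2023:10:00:00 +0000] "GET /blog HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jul/2023:10:00:05 +0000] "GET /old-page HTTP/1.1" 404 0 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Jul/2023:10:00:07 +0000] "GET /blog HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample).most_common())  # [('/blog', 1), ('/old-page', 1)]
```

Run over weeks of real logs, a report like this quickly shows whether crawl budget is going to your valuable pages or to 404s and outdated URLs.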

Search Buzz Video Roundup
Search News Buzz Video Recap: Google Update, Twitter Blocked Google, Bing Removed From ChatGPT & More SEO, PPC and Analytics News


Jul 7, 2023


This morning, we are seeing significant movement with the Google search results ranking and algorithm. Twitter blocked unregistered users from seeing tweets, which blocked Googlebot and resulted in huge drops in Google Search traffic, but a few days later...

#TWIMshow - This Week in Marketing
Ep166 - Microsoft Launches AI-Powered Advertising Tool That Can Help You Increase Your Conversion Rates


Jun 26, 2023 · 16:15


Episode 166 contains the notable Digital Marketing News and Updates from the week of June 19 - 23, 2023. The show notes for this episode were generated using generative AI. And like always, I curated the articles for the show.

1. YouTube's New Thumbnail A/B Testing Tool - YouTube has launched a new thumbnail A/B testing tool that allows creators to test different variations of their thumbnail images to see which one performs best. The tool is currently in live testing with a small group of creators, but it is expected to be rolled out to more creators in the coming months. The thumbnail A/B testing tool is a valuable addition to YouTube's creator toolkit. By testing different thumbnail images, creators can ensure that they are using the most effective thumbnail to attract viewers and maximize their video views.

2. TikTok Expands TikTok Shops to All Businesses In The US, UK, Canada, and France - TikTok is expanding access to its TikTok Shops eCommerce program to all businesses in the US, UK, Canada, and France. The program allows businesses to sell products directly through TikTok, and it has been growing rapidly in popularity. In the first quarter of 2023, TikTok Shops generated over $1 billion in GMV. To join the TikTok Shops program, businesses need to have a TikTok account and a Shopify or Ecwid store. Once they are approved, they can start adding products to their TikTok Shop and linking their product catalog to TikTok. Businesses can then tag products in their videos and create shopping posts that allow users to purchase products directly from TikTok. The expansion of TikTok Shops to all businesses in these four countries is a major step for the platform. It will allow more businesses to reach a wider audience and sell more products through TikTok, which is likely to further boost the growth of TikTok Shops in the coming months and years.

3. Instagram Now Lets You Download Public Reels! - Instagram is now rolling out the ability for users to download publicly posted Reels content to their camera roll. This means that users can now save Reels to their devices for offline viewing, sharing, or editing. To download a Reel, simply tap on the share icon and then select the "Download" option. There are a few limitations to this feature. First, only Reels from public accounts can be downloaded. Second, creators can opt out of enabling downloads of their content in their Account Settings. Finally, some users have reported audio issues with some Reels content, which could be linked to Meta's music licensing agreements. Overall, this is a welcome addition to Instagram that will give users more flexibility in how they interact with Reels content. It will also make it easier for users to share Reels with others, even if they don't have Instagram.

4. LinkedIn's New AI Image Detector Catches 99% of Fake Profiles - LinkedIn has developed a new AI image detector that has a 99% success rate in catching fake profile photos. The detector uses a variety of techniques to identify fake images, including analyzing the image's pixels, comparing it to other images in its database, and looking for inconsistencies in the image's metadata. The advent of AI-generated profile images has made it easier to create fake LinkedIn profiles, which has exacerbated an already huge problem. In the first half of 2022, LinkedIn detected and removed 21 million fake accounts. Anecdotal evidence suggests that LinkedIn's AI image detector is working well: an affiliate marketer who deployed fake LinkedIn profiles said that their success rate dropped significantly after LinkedIn implemented the new detector.

5. Meta Expands Reels Ads To Instagram And Tests AI Features - Meta is expanding Reels ads to Instagram and testing AI features to boost engagement and broaden advertiser reach. Previously, Reels ads were only available on Facebook. Now, advertisers can run ads between Reels on Instagram, expanding beyond Facebook. Meta is also introducing app promotion ads to Reels to boost app downloads. Additionally, Meta is testing AI to optimize music in single-image Reels ads, aiming for better viewer engagement.

6. Meta's New Text-to-Speech Translation Tool Can Translate Text into Audio in 6 Languages - Meta has announced a new AI system called Voicebox that can translate text to audio in a variety of styles and voices. The system is based on a new method called Flow Matching, which allows it to synthesize speech with fewer learning and processing requirements than other similar systems. Voicebox can translate text into audio in six languages: English, French, German, Spanish, Italian, and Portuguese. It can also perform noise removal, edit content, and transfer audio style. The system is still under development, but Meta plans to make it available to developers in the future.

7. GA4 Update: Now You Can Control Conversion Credit - Google Analytics 4 has added a new feature that allows you to select which channels are eligible to receive conversion credit for web conversions shared with Google Ads. This means that you can now choose to give credit to organic and paid channels, even if the last non-direct click was not from a Google ad. This new feature can help you get a more accurate picture of the impact of your marketing campaigns. For example, if you're running a paid campaign and you see that organic traffic is also converting, you can now give credit to both channels. This will help you allocate your budget more effectively and improve your ROI.

8. Google Sues Rank and Rent Marketer: What You Need to Know - Google is suing an online marketer for violating its terms of service and for engaging in activities that mislead users and violate both federal and state laws. The marketer, who was a member of a public Facebook group called "Rank and Rent – GMB Strategy & Domination," is accused of creating fake businesses and fake websites, and then associating them with Voice over Internet Protocol (VoIP) phone numbers whose area codes correspond to the fake businesses' supposed locations. Google alleges that the marketer has been associated with over 350 fake Business Profile listings since mid-2021.

9. Google Launches New INP Report in Search Console to Help You Prepare for the FID to INP Transition - Google has launched a new report in Google Search Console to help site owners prepare for the upcoming change from First Input Delay (FID) to Interaction to Next Paint (INP) as a Core Web Vital metric. The report, which can be accessed by clicking this link and selecting a property, shows how well your site performs on the INP metric. INP measures how long it takes, after a user interacts with a page (a click, tap, or key press), for the next frame to be painted in response. It is a more accurate measure of user experience than FID because it considers the responsiveness of interactions throughout the page's lifetime, not just the delay before the first interaction is handled. The INP report in Google Search Console shows the following data for each page on your site:
- The average INP for the page
- The percentage of users who experienced an INP of 100 milliseconds or less
- The percentage of users who experienced an INP of 300 milliseconds or less
Google has not yet announced a specific date for the FID to INP transition, but it is expected to happen in March 2024. By using the INP report in Google Search Console, you can start preparing your site for the change and ensure that your users have a good experience.

10. Google Postpones Data-Driven Attribution Switch: What You Need to Know - Google has postponed the switch to data-driven attribution in Google Ads from June to mid-July 2023. This means that first click, linear, time decay, and position-based attribution models will still be available for new conversion actions until mid-July. However, once the switch is made, these models will be removed from all Google Ads reporting. Advertisers who want to continue using these models will have to manually switch to them. The reason for the postponement is that Google wants to give advertisers more time to prepare for the switch. Data-driven attribution is a more complex model than the other attribution models, and it requires more data. Google wants to make sure that advertisers have enough data to get accurate results from data-driven attribution.

11. Google Ads Now Lets You Track Store Sales and Optimize Your Bids - Google has announced that store sales reporting and bidding are now available across Performance Max campaigns within Google Ads. This means advertisers can measure total sales wherever customers prefer to shop and optimize their bids for in-store revenue. To use this feature, you'll need to upload and match your transaction data from your business to Google. Once you've done that, you'll be able to see how your ads translate into offline purchases. Store sales reporting and bidding offer a number of benefits for businesses, including:
- The ability to measure the true value of your ads in terms of in-store sales
- The ability to optimize your bids for in-store revenue
- The ability to gain insights into how your ads are driving offline purchases
If you're using Performance Max campaigns, I encourage you to take advantage of this new feature. It's a great way to track the impact of your ads on your offline sales and optimize your bids for maximum results.

12. Google's John Mueller Warns: Custom Elements in Head Can Hurt SEO - Google's John Mueller has advised against using custom elements in the head of a web page. Custom elements are HTML tags that are not part of the standard HTML specification. They can be used to add new functionality to web pages, but they can also disrupt how Google renders and indexes pages. Mueller's warning comes after a user on Twitter asked him if it was technically valid to have a custom element in the document head. Mueller responded that using custom elements in the head likely breaks page rendering in Google Search. This is because Google's crawlers may not recognize custom elements, which could lead to the page being indexed incorrectly or not at all. As an alternative to custom elements, Mueller recommends using JSON-LD tags. JSON-LD is a lightweight, JSON-based format that can be used to add structured data to web pages. This data can be used by Google to better understand the content of a page, which can improve how it appears in search results.

13. Your Homepage is Not Indexed? Fix It With These 3 Steps! - Google's John Mueller explained in a recent Twitter thread that there are a few reasons why a homepage might not be indexed by Google. These reasons include:
- The homepage may not be linked to from any other pages on the site.
- The homepage may have been blocked from crawling by robots.txt or a meta robots tag.
- The homepage may contain errors that prevent Googlebot from crawling it successfully.
- The homepage may not have enough content to be indexed.
Mueller also said that the crawl budget, which is the amount of time and resources that Googlebot has available to crawl websites, is not usually a factor in why homepages are not indexed. However, if a site has a large number of pages, the crawl budget could become a limiting factor. To fix a homepage that is not indexed, you can try the following:
- Add links to the homepage from other pages on the site.
- Remove any robots.txt or meta robots tags that are blocking the homepage from being crawled.
- Fix any errors that are preventing Googlebot from crawling the homepage successfully.
- Add more content to the homepage.
If you are still having trouble getting your homepage indexed, you can use Google Search Console to troubleshoot the issue.

14. Launch a New Domain Before Migrating Content to Reduce SEO Risk - Google's John Mueller says it can be beneficial to launch a new domain before migrating your site to it. This can help reduce some of the risks associated with migration, both internal and external. For example, if you launch the new domain and start building links to it, this can help establish its authority before you start migrating your content. Additionally, if there are any problems with the migration, you can keep your old domain up and running while you troubleshoot. Here are some of the benefits of launching a new domain before migrating your site:
- It can help reduce the risk of losing traffic during the migration.
- It can help build up the authority of the new domain before you start migrating your content.
- It gives you a chance to test out the new domain and make sure it is working properly.

15. Google's Gary Illyes: Don't Use AI for SEO - Google's Gary Illyes recently warned SEOs against using large language models (LLMs) and artificial intelligence (AI) to diagnose SEO issues. He said that these tools are not yet sophisticated enough to provide accurate insights into the complex factors that affect search rankings. Instead, Illyes recommends using traditional SEO methods, such as manual analysis of site data and Search Console reports.

16. Google Ads is Phasing Out DSAs: Are You Ready for PMax? - Google Ads is asking advertisers to "upgrade" from dynamic search ads (DSAs) to Performance Max (PMax) campaigns. This is because Google is gradually phasing out DSAs in favor of PMax, which is a more sophisticated campaign type that can reach a wider audience and generate more conversions. When you upgrade your DSA campaign to PMax, your existing assets, settings, and budget will be used to create a new PMax campaign. You can then continue to optimize your PMax campaign as you would any other Google Ads campaign. Some PPC experts believe that this is the beginning of the end for DSAs, and that Google will eventually force everyone to switch to PMax. If you're currently running DSA campaigns, it's a good idea to start planning your transition to PMax now.

17. Microsoft Ads Renames Platforms and Integrates AI - Microsoft is renaming several of its advertising platforms and integrating AI into its ad platforms to help advertisers automate campaign creation and improve campaign management efficiencies. The name changes include:
- Microsoft Advertising becomes Microsoft Ads
- Xandr becomes Microsoft Audience Network
- PromoteIQ becomes Microsoft Retail Media
Microsoft is also adding AI-powered predictive targeting to its ad platform. This will allow advertisers to target their ads more effectively to potential customers.

18. Microsoft Launches AI-Powered Advertising Tool That Can Help You Increase Your Conversion Rates - Microsoft has announced the launch of Predictive Targeting, an artificial intelligence-powered advertising tool that uses machine learning to help advertisers reach new, receptive audiences and drive higher conversion rates. Predictive Targeting is now available to all advertisers on the Microsoft Audience Network. The tool uses a variety of data points, including website traffic, search history, and social media activity, to create a profile of each user. This profile is then used to predict which ads are most likely to be clicked on by each user. Predictive Targeting is available through Microsoft's Advertising Platform, and it can be used to target ads across a variety of channels, including search, display, and video. For more information on Microsoft's predictive targeting feature, read its complete guide here.
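The crawl-blocking check in item 13 can be partly scripted. Here is a simplified, assumption-laden sketch that flags a blocking robots meta tag in a page's HTML; a real audit should also check robots.txt and the X-Robots-Tag response header, and the regex assumes the conventional attribute order (name before content):

```python
import re

# Matches <meta name="robots" ...> or <meta name="googlebot" ...>
# Note: assumes content= appears after name=, as is conventional.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\'](?:robots|googlebot)["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

def has_noindex(html):
    """True if the HTML contains a robots/googlebot meta tag with noindex."""
    return any(
        "noindex" in m.group(1).lower() for m in META_ROBOTS.finditer(html)
    )

print(has_noindex('<head><meta name="robots" content="noindex, nofollow"></head>'))  # True
print(has_noindex('<head><meta name="robots" content="index, follow"></head>'))      # False
```

Running a check like this against your homepage's served HTML catches the second cause on Mueller's list (a blocking meta robots tag) before you start digging into the harder ones.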

#TWIMshow - This Week in Marketing
Ep164 - Apple Amps Up Privacy: A Glimpse at iOS 17 and macOS Sonoma

#TWIMshow - This Week in Marketing

Play Episode Listen Later Jun 12, 2023 27:02


Episode 164 contains the notable Digital Marketing News and Updates from the week of June 5 - 9, 2023. The show notes for this episode were generated using generative AI; as always, I curated the articles for the show.

1. Google's Structured Data Validator vs Schema.org - During the June 2023 Google SEO Office Hours, Google's Martin Splitt answered a question about structured data validation and how Google's validator can show different results than the Schema.org validator.

Both Google and Schema.org offer tools for validating whether structured data is correct. Google's tool validates structured data and also offers feedback on whether the tested structured data qualifies for rich results in the search engine results pages. Rich results are enhanced search listings that make a listing stand out on the search results page. The Schema.org Schema Markup Validator checks if the structured data is valid according to the official standards.

Per Splitt, “Schema.org is an open and vendor-independent entity that defines the data types and attributes for structured data. Google, as a vendor however, might have specific requirements for some attributes and types in order to use the structured data in product features, such as our rich results in Google Search. So while just leaving out some attributes or using some type of values for an attribute is fine with Schema.org, vendors such as Google and others might have more specific requirements in order to use the structured data you provide to actually enhance features and products.”

In conclusion, Google's validator has a purpose that goes beyond checking whether the structured data is valid: it checks whether the structured data that Google requires (for potentially showing a webpage in enhanced search results) is valid. The Schema.org validator just checks for standards compliance and has nothing to do with how Google uses structured data. You can watch the June SEO office hours here.

2. 
Google's Latest Search Console Update Makes it Easier to Fix Video Indexing Issues - Google has released an update to its Search Console, aimed at refining video indexing reports. This enhancement promises to offer you more precise problem descriptions and actionable solutions to help boost the visibility of your videos in Google Search.

Previously, users encountered a generic "Google could not identify the prominent video on the page" error. Now, Google has decided to provide more specific details to overcome this problem. Here's what you need to know:
- Video outside the viewport: If your video isn't fully visible when the page loads, you'll need to reposition it. Make sure the entire video lies within the renderable area of the webpage.
- Video too small: If your video is smaller than desired, you should increase its size. The height should exceed 140px, and the width should be greater than 140px and constitute at least one-third of the page's width.
- Video too tall: If your video is taller than 1080px, it's time to resize it. Decrease the height to less than 1080px to comply with Google's new guidelines.

While you might still see some old error messages for the next three months, Google plans to phase these out, replacing them with these new, more detailed notifications. By adhering to these updates, you can maximize your video's prominence on Google Search and enhance user engagement. Happy optimizing!

3. 
Navigating the World of Domains: A Google Insider's Advice - Let's delve into the world of domain names and how they can impact your business's digital reach, guided by insights from Google Search Advocate John Mueller.

Mueller recently clarified the differences between generic top-level domains (gTLDs) and country code top-level domains (ccTLDs), following Google's decision to reclassify .ai domains as gTLDs, breaking away from their previous association with Anguilla. In essence, gTLDs (such as .com, .store, .net) are not tied to a specific geographical location, unlike ccTLDs (like .nl for the Netherlands, .fr for France, .de for Germany) that are country-specific. Mueller pointed out that if your business is primarily targeting customers within a certain country, a ccTLD might be the way to go. On the other hand, if you're aiming for a global customer base, a gTLD could be the better option.

Importantly, Mueller also highlighted the need to consider user perception. He posed a question to consider: will users click on a link they believe is meant for another country's audience? Furthermore, Mueller cautioned against using TLDs that may appear spammy, as they can harm your site's credibility. His advice underscores the importance of strategic decision-making when registering your domain, reminding us that the choice of a domain name is not just a technical one, but a business decision that can have a significant impact on your online presence.

4. Google's Verdict on the Impact of Security Headers on Search Rankings - In your quest for a secure website, you may have come across HTTP headers - bits of data that offer valuable metadata about a webpage to browsers or web crawlers. The most well-known among these are response headers, like the infamous 404 error or the 301 redirect. A subset of these headers, known as security headers, play a critical role in fortifying your site against malicious attacks. For instance, the HSTS (HTTP Strict Transport Security) header mandates that a webpage be accessed only via HTTPS, not HTTP, and ensures the browser remembers this preference for the future. While a 301 redirect can guide browsers from HTTP to HTTPS, it leaves your site exposed to potential 'man-in-the-middle' attacks. An HSTS header, on the other hand, ensures your browser requests the HTTPS version directly, effectively bolstering site security.

A question was recently posed to Google's John Mueller about whether integrating security headers, like HSTS, could influence website ranking. Mueller's response was clear: the HSTS header does not impact Google Search. This header's purpose is to guide users to the HTTPS version of a site. As for deciding which version of a page to crawl and index, Google uses a process known as canonicalization, which doesn't rely on headers like HSTS. So, while security headers might not boost your site's search ranking, their importance in maintaining a secure browsing experience for your users cannot be overstated. Remember, a secure website is a trusted website, and trust forms the foundation of any successful online presence.

5. Debunking 'Index Bloat': Google's Take on Effective Web Page Indexing - In a recent episode of Google's 'Search Off The Record' podcast, the Search Relations team at Google tackled the topic of web page indexing, putting a spotlight on the much-discussed theory within the SEO community: "index bloat." This theory, often cause for concern, refers to a situation where search engines index pages that aren't beneficial for search results. It includes pages like filtered product pages, printer-friendly versions, internal search results, and more. Advocates of the index bloat theory argue that such pages can confuse search engines and negatively impact search rankings. They link this issue to the concept of a crawl budget, which is the number of URLs a search bot will crawl during each visit. 
The theory proposes that index bloat can lead to an inefficient use of this crawl budget, with search bots wasting time and resources gathering unneeded data. However, Google's John Mueller challenged this theory, stating there is no known concept of index bloat at Google. According to Mueller, Google doesn't set an arbitrary limit on the number of indexed pages per site. His advice to webmasters is not to worry about excluding pages from Google's index, but instead, focus on creating and publishing useful content.

While some supporters of the index bloat theory have pointed to issues like accidental page duplication, incorrect robots.txt files, and poor or thin content as causes, Google asserts that these are not signs of a non-existent "index bloat," but simply general SEO practices that require attention. Some have suggested using tools like Google Search Console to detect index bloat by comparing the actual number of indexed pages to what's expected. Google's stance implies this comparison isn't indicating a problem, but is instead part of routine website management and monitoring. Google's official stance dismisses the idea of index bloat. Instead, the emphasis should be on ensuring the pages submitted for indexing are valuable and relevant, thereby enhancing the overall user experience.

6. Controlling Googlebot: Decoding Google's Search Relations Podcast Insights - In the latest episode of the 'Search Off The Record' podcast, Google's Search Relations team, John Mueller and Gary Illyes, delved into two key topics: blocking Googlebot from crawling certain parts of a webpage and preventing Googlebot from accessing a website completely.

When asked how to stop Googlebot from crawling specific sections of a webpage, such as the "also bought" areas on product pages, Mueller emphasized that there's no direct method to achieve this. "It's impossible to block crawling of a specific section on an HTML page," he clarified. However, Mueller did propose two strategies, albeit not perfect ones, to navigate this issue. One involves utilizing the data-nosnippet HTML attribute to stop text from being displayed in a search snippet. The other strategy involves using an iframe or JavaScript with the source blocked by robots.txt. But be wary, as Mueller cautioned against this approach, stating it could lead to crawling and indexing issues that are difficult to diagnose and solve. Mueller also reassured listeners that if the same content appears across multiple pages, it's not a cause for concern. "There's no need to block Googlebot from seeing that kind of duplication," he added.

Addressing the question of how to prevent Googlebot from accessing an entire site, Illyes provided a straightforward solution: simply add a disallow rule for the Googlebot user agent in your robots.txt file, and Googlebot will respect this and avoid your site. For those wanting to completely block network access, Illyes suggested creating firewall rules that deny Google's IP ranges. To sum up, while it's impossible to stop Googlebot from accessing specific HTML page sections, methods like the data-nosnippet attribute can offer some control. To block Googlebot from your site altogether, a simple disallow rule in your robots.txt file should suffice, though you can take further steps like setting up specific firewall rules for a more stringent blockade.

7. Sweeping Changes to Google Ads Trademark Policy: What You Need to Know - Google Ads is making significant changes to its Trademark Policy that could impact how your advertisements are run. Starting July 24, Google will only entertain trademark complaints that are filed against specific advertisers and their ads. 
This is a shift away from the current policy, where complaints can lead to industry-wide restrictions on using trademarked content. This change is a response to feedback from advertisers who found the previous system frustrating due to over-flagging and broad blocks. The new policy aims to streamline resolutions, making them quicker and more straightforward. In addition, it will provide greater clarity and transparency for advertisers, a much-needed improvement many have been advocating for.

As explained by a Google spokesperson, “We are updating our Trademark Policy to focus solely on complaints against specific advertisers in order to simplify and speed up resolution times, as opposed to industry-wide blocks that were prone to over-flagging. We believe this update best protects our partners with legitimate complaints while still giving consumers the ability to discover information about new products or services.”

Do note that any trademark restrictions implemented before July 24 under the current policy will continue to apply. However, Google plans to phase out these limitations for most advertisers gradually over the next 12-18 months. You can learn more about these changes by visiting the Google Ads Trademarks policy page here.

8. Double Menus, Double Fun: SEO Unaffected by Multiple Navigations - In a recent SEO office hours video, Google's Gary Illyes made it clear that the presence of multiple navigation menus on your website doesn't affect your SEO performance - be it positively or negatively. The question arose during the video discussion, asking whether having two navigation menus - a main one featuring important site categories and a secondary one focusing on brand-related extensions - could potentially harm SEO performance. Illyes' response was reassuring. He stated that it's highly unlikely that multiple navigation menus would have any impact on your website's SEO. In other words, whether you have one, two, or even more navigation menus on your page, Google's algorithms are sophisticated enough to recognize these elements and process them accordingly. So, rest easy and design your website to best serve your audience. Remember, whether your navigation is on the top, left, or bottom of your page, Google's got it figured out!

9. Google's Eye on XML Sitemap Changes: Resource Efficiency in Action - Google's own Gary Illyes recently reaffirmed that the tech giant is diligent about scanning XML sitemaps for updates before launching the reprocessing protocol. This practice is rooted in the desire to conserve valuable computational resources by avoiding unnecessary reprocessing of unchanged files. When asked whether Google compares current and previous versions of XML sitemaps, Illyes's response was a resounding yes. He explained that Google refrains from reprocessing sitemaps that have remained the same since their last crawl - a measure designed to prevent wastage of computing resources.

However, any modifications in your sitemap, whether in the URL element or 'lastmod', will trigger a new round of parsing and generally initiate reprocessing. Illyes pointed out that this doesn't automatically guarantee that the altered URLs will be crawled, as they must still pass through the usual quality evaluations like any other URL. Importantly, if a URL is deleted from the sitemap because it no longer exists, it doesn't imply that it will instantly be removed from the index or prioritized for crawling to expedite its deletion. Keep this in mind when making changes to your sitemap.

10. Boost Your Search Rankings: Google's Advice on Consolidating Pages - In a recent SEO office hours video, Google's Gary Illyes brought up a valuable point about web page consolidation. 
He discussed 'host groups', a term used when Google displays two results from the same domain in search results, with one listed below the other. Illyes suggested that when your website forms a host group, it indicates that you have multiple pages capable of ranking well for a particular query. In such cases, he recommended considering the consolidation of these pages, if feasible. This advice aligns with Google's host groups documentation, which recommends setting one of these pages as the 'canonical' if you'd prefer users to land on that page over the other. The concept of a host group comes into play when two or more consecutive text results from the same site rank for the same query and hence, get grouped together.

The rationale behind Google's recommendation for consolidation could be understood as an attempt to prevent your pages from competing against each other. When two pages vie for the same ranking, consolidating them could potentially boost the ranking of the remaining page. From an SEO perspective, having two listings could increase your click-through rate. However, the idea of consolidation is to create a more streamlined user experience and possibly enhance your page's ranking. Keep in mind that this is an approach to consider and may not suit every situation. Always consider your unique context and audience needs when making SEO decisions.

11. Unlocking Video Thumbnails in Google Search: Key Insights Revealed - Recent changes to Google's approach to video thumbnails in search results have prompted many queries. These alterations ensure that video thumbnails are displayed only when the video constitutes the main content on a webpage. This doesn't imply that the video must be the first element on your page. Instead, as Google's Gary Illyes explains, the video should be immediately noticeable — it should be "in their face right away." This user-centric approach enhances the user experience, eliminating the need for them to hunt for the video on the page. Illyes encourages web developers and SEO experts to consider the user's perspective. When visitors land on your page, they should not have to actively search for the video. It should be prominently displayed, akin to the approach of popular video platforms like Vimeo and YouTube. Remember, the aim of these changes is to reduce confusion and streamline the user experience by ensuring that videos are easy to find and view. Take inspiration from major video sites to better understand what Google's algorithms are seeking.

12. Enhanced Conversion Tracking with Microsoft Advertising's New Cross-Device Attribution Model - Microsoft Advertising is set to enhance its tracking capabilities with the introduction of a Cross-Device attribution model. Revealed in Microsoft's latest product update roundup in June, this model promises to provide more accurate insights into customer conversion journeys across multiple devices and sessions. With this new feature, if a customer clicks an ad on their laptop and later completes a purchase on their phone, Microsoft Advertising will attribute the conversion to the original ad click on the laptop. This development will ensure that your marketing efforts are accurately credited, regardless of the device where the conversion ultimately occurs. As a result of this new tracking model, marketers may notice a slight uptick in the number of conversions reported in their performance metrics. If you observe an increase in conversions, the new Cross-Device attribution model could be the driving factor. Keep an eye on your reports to understand the full impact of this latest update on your performance data.

13. New Verification Mandates for Microsoft Ads: Everything You Need to Know - Starting August 1st, Microsoft Advertising will be implementing a new policy to enhance transparency and security. 
Only ads from verified advertisers will be displayed on the platform. If you haven't yet met the Microsoft Ads verification requirements, it's crucial to complete them before August 1st to ensure your ads continue to run smoothly. The Microsoft Ads Advertiser Identity Verification program, which was launched in June 2022, is rolling out the following important dates:
- As of July 1st, all new advertisers must be verified before their ads can go live.
- If you haven't received an email from Microsoft about account verification by July 15th, you should reach out to Microsoft support.
- Starting August 1st, Microsoft Advertising will exclusively display ads from verified advertisers.

Once verified, all ads will showcase:
- The name and location of the advertiser.
- The business or individual responsible for funding the ad.
- Additional information explaining why a user is seeing a specific ad, including targeting parameters.

In addition to these updates, Microsoft Advertising is also launching a new feature - the Ad Library. This will enable all users to view ads shown on Bing that have gained any impressions in the European Union. Users will be able to search for ads in the Ad Library by using the advertiser's name or by entering words included in the ad creative. The details of the advertiser will be displayed in the Ad Library. Stay ahead of the game and get your account verified to enjoy uninterrupted ad delivery with Microsoft Advertising!

14. Unleashing New Opportunities: LinkedIn Introduces Direct Messaging for Company Pages - In a bid to foster more professional connections and interactions, LinkedIn is set to expand its messaging tools. The platform has now introduced a new feature that allows Company Pages to send and receive direct messages (DMs). This marks a major development as previously, one-to-one messaging was only available for individual LinkedIn members. LinkedIn's new feature, termed Pages Messaging, paves the way for members to directly contact brands. Conversations can cover a broad range of topics from products and services to business opportunities. To handle these two-way conversations, organizations will be equipped with a dedicated inbox, enabling them to manage and prioritize incoming inquiries that are most relevant to their business.

As a result of this feature, companies might see a significant increase in messages inquiring about opportunities. However, LinkedIn's 'focused inbox' system, which segregates DMs based on priority and topic settings, can help manage the influx. In addition, companies have the option to disable the Message feature if they wish. LinkedIn has been quietly testing this feature with a select group of users in the past month. Considering that over 63 million companies actively post on their LinkedIn Company Pages, this new feature could potentially revolutionize direct interactions and unearth fresh opportunities. Furthermore, LinkedIn is exploring the integration of an AI assistant to aid in lead nurturing. This could be a significant asset, allowing users to research the person they are communicating with without the need to manually browse through their profile or posts. While it might not be a 'game-changer', the new Company Page messaging feature, which is being rolled out from today, is certainly a noteworthy addition to consider in your LinkedIn marketing strategy.

15. Apple Amps Up Privacy: A Glimpse at iOS 17 and macOS Sonoma - In a continued commitment to user privacy, Apple has introduced fresh security enhancements in iOS 17 and macOS Sonoma, aimed at curbing intrusive web tracking. The new Link Tracking Protection feature is at the heart of this upgrade. Activated by default in Mail, Messages, and Safari (while in Private Browsing mode), Link Tracking Protection zeroes in on tracking parameters in link URLs, which are often used to monitor user activity across different websites. 
The feature scrubs these identifiers, thereby thwarting advertisers' and analytics firms' attempts to bypass Safari's intelligent tracking prevention functionalities. Typically, these tracking parameters are attached to the end of a webpage's URL, bypassing the need for third-party cookies. When a user clicks the modified URL, the tracking identifier is read, enabling the backend to create a user profile for personalized ad targeting. Apple's new feature disrupts this process by identifying and removing these tracking components from the URL, ensuring the user's web page navigation remains as intended. This operation is quietly executed during browser navigation in Safari's Private Browsing mode and when links are clicked within the Mail and Messages apps.

To strike a balance, Apple has also unveiled an alternate method for advertisers to gauge campaign effectiveness while preserving user privacy. Private Click Measurement, now accessible in Safari Private Browsing mode, enables the tracking of ad conversion metrics without disclosing individual user activity. In conclusion, Apple's latest efforts reflect a renewed commitment to user privacy, promising to make online experiences safer and more secure across their operating systems.
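The link-scrubbing behavior described above can be illustrated with a short sketch. The function below removes a few well-known click and campaign identifiers from a URL's query string while leaving the rest intact; note that the parameter list is my own illustrative assumption, since Apple has not published exactly which identifiers Link Tracking Protection strips.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative list only -- Apple has not disclosed which identifiers
# Link Tracking Protection actually removes.
TRACKING_PARAMS = {"gclid", "fbclid", "msclkid", "utm_source",
                   "utm_medium", "utm_campaign", "utm_term", "utm_content"}

def strip_tracking(url: str) -> str:
    """Remove known tracking parameters from a URL's query string."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/page?id=42&gclid=abc123&utm_source=mail"))
# -> https://example.com/page?id=42
```

Functional parameters such as `id` survive, which mirrors the article's point that navigation "remains as intended" while only the tracking components are dropped.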
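Earlier in these notes, Gary Illyes suggests a disallow rule for the Googlebot user agent as the way to keep Googlebot off a site entirely. A minimal robots.txt of that shape, verified with Python's standard-library parser (the file contents are a generic example, not any particular site's rules):

```python
from urllib.robotparser import RobotFileParser

# A generic example robots.txt that blocks Googlebot site-wide
# while leaving all other crawlers unrestricted.
robots_txt = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/page"))  # False
print(parser.can_fetch("Bingbot", "https://example.com/page"))    # True
```

As the notes point out, this only governs well-behaved crawlers that honor robots.txt; blocking at the network level requires firewall rules against Google's published IP ranges.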
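Illyes's remarks on sitemap reprocessing hinge on Google noticing a changed URL or last-modified element between crawls. A rough sketch of that comparison, using a toy two-entry sitemap (the URLs and dates are invented for illustration):

```python
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_entries(xml_text: str) -> dict:
    """Map each <loc> to its <lastmod> so two sitemap snapshots can be compared."""
    root = ET.fromstring(xml_text)
    return {url.findtext(f"{NS}loc"): url.findtext(f"{NS}lastmod")
            for url in root.findall(f"{NS}url")}

old = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/a</loc><lastmod>2023-06-01</lastmod></url>
  <url><loc>https://example.com/b</loc><lastmod>2023-06-01</lastmod></url>
</urlset>"""

# Simulate updating one page's lastmod date.
new = old.replace("/b</loc><lastmod>2023-06-01", "/b</loc><lastmod>2023-06-09")

changed = [loc for loc, mod in sitemap_entries(new).items()
           if sitemap_entries(old).get(loc) != mod]
print(changed)  # ['https://example.com/b']
```

Only the entry whose lastmod changed is flagged, which matches the behavior described: unchanged files are skipped, while a modified entry triggers reparsing (though, per Illyes, not a guaranteed crawl).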

#TWIMshow - This Week in Marketing
Ep161 - Google's John Mueller on "Gambling with SEO shortcuts"​

#TWIMshow - This Week in Marketing

Play Episode Listen Later May 22, 2023 18:49


Episode 161 contains the notable Digital Marketing News and Updates from the week of May 15-19, 2023. The show notes for this episode were generated using generative AI; as always, I curated the articles for the show.

1. HomeAdvisor Penalized by FTC for False Claims - HomeAdvisor, a popular online marketplace for home improvement services, has been penalized by the Federal Trade Commission (FTC) for making false claims about the quality of leads it sells to service providers. The FTC's complaint alleged that HomeAdvisor had made false, misleading, or unsubstantiated claims about the quality and source of the leads it sells to service providers. The complaint also alleged that HomeAdvisor often told service providers that its leads result in jobs at rates much higher than it can substantiate. Some of the false and misleading claims made by HomeAdvisor are:
- That its leads were "hand-picked" by HomeAdvisor experts.
- That its leads were "pre-qualified" and "ready to buy."
- That its leads were "more likely to convert" than leads from other sources.

The FTC's complaint also alleges that HomeAdvisor failed to disclose that its leads were often generated by bots or other automated means, and that they were not always from people who were actually interested in buying home improvement services. As part of the settlement, HomeAdvisor will pay $50 million to the FTC and will be required to change its business practices. These changes include:
- Disclosing that its leads may be generated by bots or other automated means.
- Disclosing that its leads may not be from people who are actually interested in buying home improvement services.
- Providing service providers with more information about the leads they are buying.

This settlement is a reminder that businesses must be careful about the claims they make about their products or services. 
Businesses that make false or misleading claims can be subject to penalties from the FTC. If you believe that you have been the victim of false or misleading claims about leads, you can contact the FTC at https://www.ftc.gov/complaint. If you need assistance with your marketing or advertising, please contact me for a free consultation. I can help you ensure that your business is compliant with all applicable laws and regulations. You can also contact me for a comprehensive audit to help you ensure that your business is not making false or misleading claims.

2. Meta Adds New Lead Generation Tools to Facebook - Meta is adding new lead generation tools to Facebook, making it easier for businesses to collect contact information from potential customers. The new tools include a lead form builder, which allows businesses to create custom lead forms; a lead magnet library; a lead generation chatbot; and a lead generation ad, which allows businesses to promote their lead forms to a wider audience. The lead form builder allows businesses to create custom lead forms that can be embedded on their website or in their Facebook posts. The lead magnet library provides businesses with a selection of pre-made lead magnets, such as ebooks, checklists, and white papers. The lead generation chatbot allows businesses to collect contact information from potential customers through a conversation in Messenger.

The new lead generation tools are available to all businesses that use Facebook. To use the lead form builder, businesses will need to create a Facebook Page and then add the lead form builder to their Page. To use the lead generation ad, businesses will need to create a Facebook Ad and then select the "Lead Generation" objective. The new lead generation tools are a great way for businesses to collect contact information from potential customers. The lead form builder is easy to use and allows businesses to create custom lead forms that fit their needs. Meta says that the new lead generation tools are designed to help businesses "grow their audience, increase engagement, and drive sales."

3. Meta Starts Refunding Advertisers for Overspending Glitch - Meta has started to issue refunds to advertisers impacted by a glitch last month that resulted in overspending and higher than usual CPAs. The glitch occurred on April 23, when Meta's automated system overspent advertisers' daily budgets in a matter of hours. CPAs also tripled. Meta has said that it is still investigating the cause of the glitch, but it has apologized for the inconvenience and promised to make it right. If you were affected by the glitch, you can request a refund by visiting Meta's website.

4. Google Merchant Center COVID-19 Update - Starting from June 2023, Google is making a significant change to its COVID-19 policy under its Sensitive Events Policy. The company will be lifting restrictions on COVID-19 related content, meaning that Shopping ads and free listings containing COVID-19 related terms will no longer be restricted. In addition, approval via LegitScript or Project N95 will no longer be required. This policy update will allow Shopping ads and free listings for certain types of face masks, vaccines, and other COVID-19 related products. However, it's important to note that all content related to COVID-19 will still be subject to all other Shopping policies. These policies prohibit content that could be harmful to users and the overall shopping ecosystem, including any form of misrepresentation. If you're unsure about how this update might impact your business, or if you need assistance in adjusting your marketing strategy in line with these changes, I'm here to help. Please do not hesitate to contact me for an audit and let's make sure your business continues to thrive in these changing times.

5. 
YouTube to Offer Unskippable Ads on Connected TV - YouTube is the most popular streaming service on TV screens in the U.S., with an estimated 150 million unique viewers. Now, the company is testing a new ad format that will allow advertisers to show unskippable 30-second ads on connected TV devices. The new ad format is called YouTube Select, and it is currently being tested with a small group of advertisers. If the test is successful, YouTube plans to roll out the format to more advertisers in the coming months. YouTube Select ads will be placed before, during, or after videos on connected TV devices. Viewers will not be able to skip these ads, which will last for 30 seconds each. Advertisers will be able to target their ads based on a variety of factors, including the viewer's interests, demographics, and viewing history.

The introduction of unskippable ads on connected TV is a significant development for the advertising industry. It is a way for advertisers to reach a large audience with their messages, and it is a way for YouTube to generate more revenue. These ads can be a more effective way to reach your target audience, since they will be guaranteed to see your ad. And if you are interested in running YouTube Select ads, I can do an audit, assess your needs, and then develop a strategy for using YouTube Select to reach your target audience.

6. Google Analytics 4 Audience Builder Refresh - Google Analytics 4 has recently refreshed its audience builder, adding new dimensions and metrics, enhanced ways to manipulate event value and event count, and a new option to match dates. Here are some of the new features:
- New dimensions and metrics: GA4 has added new dimensions and metrics to the audience builder, including:
  - Low engagement sessions: This dimension helps identify users showing low engagement with a website or app. For instance, it allows the creation of an audience segment of users having more than three low-engagement sessions within the past five days.
  - Session duration: This metric measures the average duration of a session.
  - Session count: This metric measures the number of sessions that a user has had in a given period of time.
- Enhanced alternatives for manipulating event value and event count: For instance, you can now create audiences based on the total value of events or the number of events that have occurred within a given period of time.
- New option to match dates: GA4 has added a new option to match dates when creating audiences. This option allows you to create audiences based on events that have occurred within a specific date range.

These enhancements make it easier to create more sophisticated audiences in GA4. If you're using GA4, I encourage you to check out the new audience builder and see how it can help you improve your marketing campaigns. If you need assistance with updating your Google Analytics 4 audience, please contact me for an audit.

7. Google Search Console to BigQuery Bulk Data Export Feature Does Not Include Historical Data - Google announced in February 2023 that it would be adding a new feature to Google Search Console that would allow users to export their data to BigQuery. This feature allows users to export all of their Search Console data to BigQuery, where it can be analyzed using SQL queries. BigQuery is a cloud-based data warehouse that offers powerful data analysis capabilities. 
By exporting their Search Console data to BigQuery, users can gain insights into their website's search performance that would not be possible with the Search Console web interface alone. For example, users can use BigQuery to analyze the keywords that are driving traffic to their website, the pages on their website that are performing the best in search, the countries and regions from which their website is receiving traffic, and the devices that users are using to access their website. This information can be used to improve the website's search performance and make it more visible to potential visitors. However, it has now been revealed that this feature will not include historical data. This means that users will only be able to export data from the day that they set up the export. This could be a problem for users who want to track their performance over time. Google has not said why it has decided not to include historical data in the export feature. However, it is possible that the company is concerned about the amount of data that would need to be stored. Reach out to me if you need assistance exporting your Google Search Console data to BigQuery so that you can track your performance over time, or if you need assistance with an audit of your website's search performance.

8. Google Releases New Crawler for Testing Tools - Google has released a new crawler, Google-InspectionTool, which will be used by Google Search's testing tools, such as the Rich Result Test and Google Search Console's URL inspection tool. Google-InspectionTool mimics Googlebot, except for the user agent and user agent token. If you are a crawler junkie and you analyze the crawling activity and bot activity in your log files, you might see Google-InspectionTool show up, especially if you use the Rich Result Test and URL inspection in Google Search Console. Google-InspectionTool is a valuable tool for webmasters and SEOs.
It can be used to test how Googlebot sees your website and to identify any potential problems. If you are having trouble with your website's search engine ranking or crawlability, Google-InspectionTool can be a helpful way to troubleshoot the issue. If you need assistance with your website's search engine ranking, please contact me for an audit. I can help you identify and fix any issues that are preventing your website from ranking well in Google.

9. Will SSL Boost Your SEO Rankings? - Google's John Mueller has stated that an SSL certificate does not boost a website's SEO. This means that having an SSL certificate will not improve your website's ranking in Google search results. An SSL certificate encrypts the data that is transmitted between your website and a user's browser. This helps to protect users' privacy and security. However, it does not have any impact on how Google ranks websites. If you are concerned about the security of your website, you should still get an SSL certificate. However, you should not do it in the hope that it will improve your SEO. If you need help with your website's SEO, please contact me for an audit. I can help you identify and fix any SEO issues that are affecting your website's ranking.

10. Google's John Mueller on "Gambling with SEO shortcuts" - I want to share with you a thought-provoking discussion from a recent Mastodon thread that I believe has valuable insights for everyone in the SEO and digital marketing community. The conversation started with Preeti Gupta (@ilovechoclates_) posing a hypothetical scenario in which a site, filled with scraped reviews from established platforms like Capterra, G2, and Trustpilot, is created to gain traffic and revenue.
The question raised was whether such a practice is ethical, especially the review scraping part. Simon Cox (@simoncox) responded that while he wouldn't engage in such practices on ethical grounds, the lack of a strong brand would likely render such efforts futile. He emphasized that there are no shortcuts in building a trustworthy business. John Mueller (@johnmu) replied that while sites taking such shortcuts might temporarily gain traffic from search, the majority won't. He compared it to a gamble, and cautioned that when things go south, there won't be any sympathy. He emphasized that attempting to cut corners with SEO, such as using low-quality content or artificial backlinks, can lead to more harm than good. Rather than providing a shortcut to better rankings, these tactics can actually harm your site's reputation and visibility in Google's search results. Mueller advises that sticking to Google's guidelines and focusing on creating high-quality, relevant content is the best approach to SEO. He stressed the importance of building a long-term, value-based presence rather than relying on short-term, potentially unethical strategies. This is a timely reminder for all businesses about the importance of ethical SEO practices. This discussion serves as a stark reminder that ethical, long-term strategies are at the heart of successful digital marketing. There are no shortcuts to building a reputable brand and a loyal customer base. Scraping content and attempting to profit from others' work might seem tempting, but such tactics are likely to backfire in the long run. If you're unsure about your SEO strategies, or if you could benefit from a comprehensive audit of your website's SEO, please don't hesitate to reach out to me. We're here to provide guidance and support to help you avoid potential pitfalls and ensure your site is fully optimized for long-term success in the digital space.
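Since the BigQuery bulk export (item 7 above) only contains data from the day it is enabled, one workaround is to snapshot each day's Search Console rows yourself and merge them into your own archive. The sketch below is a hypothetical illustration of that merge step; the row shape and function names are assumptions, not any real Google API.

```python
from typing import Dict, List, Tuple

Row = Dict[str, object]   # e.g. {"date": "2023-05-01", "query": "shoes", "clicks": 12}
Key = Tuple[str, str]     # (date, query) uniquely identifies a row in this sketch


def merge_snapshot(archive: Dict[Key, Row], snapshot: List[Row]) -> int:
    """Merge one day's exported rows into a local archive.

    Returns the number of newly added rows; rows already present
    (same date + query) are left untouched, so re-running a day's
    export is safe and history accumulates from the day you start.
    """
    added = 0
    for row in snapshot:
        key = (str(row["date"]), str(row["query"]))
        if key not in archive:
            archive[key] = row
            added += 1
    return added
```

Run daily, a merge like this builds the historical record that the export itself will not backfill.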
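For the crawler junkies mentioned in item 8 above: Google-InspectionTool announces itself with its own user agent token, so a first-pass check in your access logs can be a simple substring test. This is a minimal sketch; the sample user agent strings are illustrative, and a substring check alone does not prove the request really came from Google.

```python
def is_inspection_tool(user_agent: str) -> bool:
    """Rough check for Google-InspectionTool by its user agent token."""
    return "google-inspectiontool" in user_agent.lower()
```

Requests flagged this way typically correspond to someone running the Rich Result Test or the URL inspection tool against your pages.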

#TWIMshow - This Week in Marketing
[Ep157] - Google Adds Page Experience To ‘Helpful Content' Guidance


Apr 24, 2023 21:14


Episode 157 contains the notable Digital Marketing News and Updates from the week of Apr 17-21, 2023.

1. Twitter Requires All Advertisers To Pay For Verification First - Twitter has informed all advertisers that they'll have to sign up to either Twitter Blue or Verification for Organizations in order to keep running ads in the app. In effect, this now means that brands will have to pay Twitter $8 per month for a blue tick, or $1,000 per month for its Verification for Organizations offering – though brands that are already spending ‘in excess of $1,000 per month' will soon be given gold checkmarks automatically. The cheapest option would be to buy a Twitter Blue subscription for your brand, which will cost your business an extra $96 per year, and if you're planning to run Twitter ads, that's unlikely to have a huge impact on your annual budget. You'll also get a verified tick for your brand account, which could help to give your brand more legitimacy in the app, even though the checkmark doesn't seem to communicate the same level of authority or trust that it once did. Given that the blue checkmark can be bought by anyone, with no checking process involved, there's no actual verification in Musk's Twitter Blue process. That means that someone else could also register your brand name and get a blue tick for it. Hmm. The question is: should you pay for verification?

2. Instagram Allows You To Add Up To 5 Links In Your Profile Bio - Instagram has finally launched one of its most requested feature updates, giving users the ability to add up to five links in their IG bio, expanding on its capacity to drive traffic. In the announcement, Instagram wrote: “Starting today, the update will make it easier for creators and other users to highlight their passions, bring awareness to causes they care about, promote brands they love, showcase their personal business, and more.” This is bad news for products such as Linktree and other linking tools.
Instagram's opposition to external links has long been the key driver of usage for third-party link aggregator tools, but now, people will be able to replicate that capacity within the app itself, which will no doubt see many abandon their paid subscriptions to third-party apps. But then again, some of these tools enable branding options that could still act as an enticement, along with more link display options. It's also become such a standard behavior now that users don't find it jarring, so maybe some businesses will stick with third-party link tools, even with this new capacity available. To add multiple links to your IG profile, head to ‘Edit profile' > ‘Links' > ‘Add external link'. From there, you can drag and drop to order your links as you'd like them to appear in the app.

3. Google: Just Because A Site Is Good Now, Doesn't Mean It Will Be #1 Forever - Sayan Dutta asked Google's John Mueller: "Recently I am noticing that websites are being removed from Google News. My 3 years old site suddenly showing Not Live on Publisher Center. I saw that with a few of my sites." Google's John Mueller replied that just because a site appears to be doing super well in terms of Google ranking and SEO today doesn't mean it won't one day degrade in value. John added, "just because something's in Google News now doesn't mean it'll be there forever." Sometimes sites just lose their luster, the topic may no longer be as relevant, or the content quality does not increase as competitors' content quality increases. Sometimes sites change ownership and the new owners do not put in the work needed. Sometimes sites just can't keep up with the speed of innovation. There you go folks, SEO is not evergreen or perennial.

4. Google Adds New Return Policy Structured Data Support For Merchant Listing - A structured data type communicates to search engines that the data is about a specific data type.
Structured data types have “properties” that provide information about the data type. A new returns section has been added to the structured data type definitions within Google's product structured data document. This is for merchant listings, not yet product snippets, and these new property types apply to merchant listing experiences. This addition came on the same day that Google began showing shipping and return information in its search results. The new MerchantReturnPolicy type has two required properties (applicableCountry and returnPolicyCategory). Required properties must be present in the structured data in order to ensure eligibility for the rich results specific to MerchantReturnPolicy. Google's new section for the returns policy shopping experience eligibility explains that there is an alternate way to become eligible without having to configure the associated structured data. They recommend configuring the shipping settings return policies in the Google Merchant Center Help (details on how to configure it here).

5. Google Introduces New Crawler & Explains The Use Cases For Its Different Crawler Types - Google has added a new crawler to its list of Google Crawlers and user agents, this one named GoogleOther. It is described as a "generic crawler that may be used by various product teams for fetching publicly accessible content from sites." For example, it may be used for one-off crawls for internal research and development, Google explained. The GoogleOther crawler always obeys robots.txt rules for its user agent token and the global user agent (*), and uses the same IP ranges as Googlebot. The user agent token is "GoogleOther" and the full user agent string is "GoogleOther." Here is what Gary Illyes from Google wrote on LinkedIn: “We added a new crawler, GoogleOther to our list of crawlers that ultimately will take some strain off of Googlebot. This is a no-op change for you, but it's interesting nonetheless I reckon.
As we optimize how and what Googlebot crawls, one thing we wanted to ensure is that Googlebot's crawl jobs are only used internally for building the index that's used by Search. For this we added a new crawler, GoogleOther, that will replace some of Googlebot's other jobs like R&D crawls to free up some crawl capacity for Googlebot. The new crawler uses the same infrastructure as Googlebot and so it has the same limitations and features as Googlebot: hostload limitations, robotstxt (though different user agent token), http protocol version, fetch size, you name it. It's basically Googlebot under a different name.” At the same time, Google updated the Googlebot page and listed the different crawler types and their uses that may show up in your server logs. Using this information, you can verify if a web crawler accessing your server is indeed a Google crawler instead of spammers or other troublemakers. Google's crawlers fall into three categories:

Googlebot - The main crawler for Google's search products. Google says this crawler always respects robots.txt rules. Next time you find crawl-**.googlebot.com or geo-crawl-**.googlebot.com in your server logs, know that the Google Search crawler visited your site.

Special-case crawlers - Crawlers that perform specific functions (such as AdsBot), which may or may not respect robots.txt rules. The reverse DNS mask for these visits shows up as rate-limited-proxy-***-***-***-***.google.com.

User-triggered fetchers - Tools and product functions where the end-user triggers a fetch. For example, Google Site Verifier acts on the request of a user, or some Google Search Console tools will send Google to fetch the page based on an action a user takes. The reverse DNS mask for these visits shows up as ***-***-***-***.gae.googleusercontent.com.

P.S: Listen/watch the show to hear my perspective on why it is important for any website owner to review server logs and keep the troublemakers away.

6.
Google Removed Older Search Ranking Algorithm Updates From Its Ranking Systems Page - Google has updated its documented Google ranking systems page and completely removed the page experience system, the mobile-friendly system, the page speed system and the secure site system rankings. You can spot the difference if you compare the live page to the archived page. These removals make me wonder if any of these algorithm updates mattered at all to the overall Google ranking system.

7. Google To Remove Page Experience Report, Mobile Usability Report & Mobile-Friendly Tests From Search Console - In the coming months, Google will deprecate the page experience report within Google Search Console, the mobile usability report, and the mobile-friendly testing tool. The core web vitals and HTTPS reports will remain in Google Search Console, Danny Sullivan of Google announced. The original page experience report launched in Search Console in April 2021 and was designed for just mobile pages. Google added a desktop version with the launch of the desktop version of the algorithm in January 2022. Now that it is 2023, Google is going to remove that page experience report completely, and it "will transform into a new page that links to our general guidance about page experience," Danny Sullivan wrote. In December 2023, Google will also drop Google Search Console's mobile usability report (originally launched in 2016), the mobile-friendly test tool (launched in 2016) and the mobile-friendly test API. This is not because mobile friendliness and usability are not important; Google said, "it remains critical for users, who are using mobile devices more than ever, and as such, it remains a part of our page experience guidance. But in the nearly ten years since we initially launched this report, many other robust resources for evaluating mobile usability have emerged, including Lighthouse from Chrome.”

8.
Google Adds Page Experience To ‘Helpful Content' Guidance - Google added a new section on providing a great page experience to its guidance around how to create helpful content, Google explained. Google also revised its help page about page experience to add more details about helpful content. Here is what Google added to the helpful content guidance: “Provide a great page experience: Google's core ranking systems look to reward content that provides a good page experience. Site owners seeking to be successful with our systems should not focus on only one or two aspects of page experience. Instead, check if you're providing an overall great page experience across many aspects. For more advice, see our page, Understanding page experience in Google Search results.” There is a FAQ section at the bottom of the “page experience” documentation that you need to read through if you are maintaining or leading your SEO efforts. Here are some items from the FAQ section:

Without the Page Experience report, how do I know if my site provides a great page experience? The page experience report was intended as a general guidepost of some metrics that aligned with good page experience, not as a comprehensive assessment of all the different aspects. Those seeking to provide a good page experience should take a holistic approach, including following some of our self-assessment questions covered on our Understanding page experience in Google Search results page.

Is there a single “page experience signal” that Google Search uses for ranking? There is no single signal. Our core ranking systems look at a variety of signals that align with overall page experience.

Page experience signals had been listed as Core Web Vitals, mobile-friendly, HTTPS and no intrusive interstitials. Are these signals still used in search rankings?
While not all of these may be directly used to inform ranking, we do find that all of these aspects of page experience align with success in search ranking, and are worth attention.

Are Core Web Vitals still important? We highly recommend site owners achieve good Core Web Vitals for success with Search and to ensure a great user experience generally. However, great page experience involves more than Core Web Vitals. Good stats within the Core Web Vitals report in Search Console or third-party Core Web Vitals reports don't guarantee good rankings.
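The three crawler categories in item 5 above each have a distinct reverse DNS mask, so a first pass over your server logs can bucket visits by hostname pattern. The sketch below is illustrative only: the regexes follow the masks described above, hostname formats can change, and real verification also requires a forward DNS lookup confirming the hostname resolves back to the visiting IP.

```python
import re

# (pattern, category) pairs matching the reverse DNS masks described above.
PATTERNS = [
    (re.compile(r"\.googlebot\.com$"), "googlebot"),
    (re.compile(r"^rate-limited-proxy-[\d-]+\.google\.com$"), "special-case"),
    (re.compile(r"\.gae\.googleusercontent\.com$"), "user-triggered"),
]


def classify_google_host(hostname: str) -> str:
    """Return the crawler category for a reverse-DNS hostname, or 'unknown'."""
    host = hostname.lower().rstrip(".")
    for pattern, category in PATTERNS:
        if pattern.search(host):
            return category
    return "unknown"
```

Anything that claims to be Googlebot in its user agent but classifies as "unknown" here is a candidate troublemaker worth blocking.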

#TWIMshow - This Week in Marketing
[Ep156] - Are Backlinks Still Relevant In 2023?


Apr 17, 2023 27:05


Episode 156 contains the notable Digital Marketing News and Updates from the week of Apr 10-14, 2023.

1. Microsoft Ads Launches PLA Extensions - PLA Extensions are the first fully integrated retail media solution from Microsoft, allowing brands to serve product ads both onsite and offsite with a single budget while automatically optimizing for the best balance of performance and reach. What are onsite and offsite ads? Advertising your products on a retailer's website (known as “onsite”) is a strategy with a proven return on investment (ROI)—but advertising only onsite can limit your reach. Onsite advertising has genuine advantages, such as increasing awareness of your products with a retailer's loyal shoppers, as they're already shopping (with high intent to make a purchase) on the retailer's website. However, reaching only shoppers that are already searching for your products on a retailer's site limits your reach. PLA Extension allows you to place product ads both onsite and offsite with a single budget, while automatically optimizing for the best balance of performance and reach. While previously limited by ad supply constraints onsite, you can now reach significantly more in-market shoppers offsite. With PLA Extension, you'll have unified reporting and attribution without any heavy lifting. This feature unlocks omnichannel attribution in an automated and scalable way, enabling you to measure the impact of your onsite and offsite ads and see a more complete view of the shopper's journey. When you opt to use PLA Extension—if an onsite campaign is underspending—the feature will automatically extend your product listing ad to offsite product placements powered by the Microsoft Search Network and the Microsoft Audience Network, optimizing performance and reach across onsite and offsite product ad placements. For example, suppose John is searching for a new pair of boots on his favorite retailer's website.
He sees your ad for a particular brand of boots but becomes distracted before adding them to his cart and leaves the website without making a purchase. Later that evening, he's browsing online and sees your ad again on Bing. He clicks through the ad on Bing and is taken to your product page on the retailer's website, and this time, makes a purchase. Meanwhile, Sandra, another shopper in need of new boots who isn't familiar with your brand, has also seen your ad on DuckDuckGo and clicks through to make a purchase as well. Thanks to the PLA Extension feature, you've reached in-market shoppers both on and off the retailer's site with a single PLA campaign. PLA Extension drives awareness of your brand and points prospective customers back to your product landing page, growing your audience and increasing the likelihood of sales. The introduction of this feature in the Microsoft PromoteIQ solution is just one of the ways Microsoft is solidifying its vision to build the most complete omnichannel retail media stack.

2. GA4 Gives You The Ability To Change How You Count Your Conversions - Google Analytics 4 (GA4) now allows users to modify the counting method for conversions, introducing a “once per session” option, which is similar to how Universal Analytics (UA) operated. There are two different counting methods that you can select for a conversion event:

The “Once per event” (Recommended) setting means that Google Analytics 4 properties count an event as a conversion every time it occurs. This option is recommended because it reflects the behavior of users on your site or app, and allows you to distinguish between sessions where multiple conversions occurred and sessions where only one conversion occurred. For example: a user completes 5 conversions in one session. This setting counts 5 conversions.

The “Once per session” (Legacy) setting means that Google Analytics 4 properties count an event as a conversion only once within a particular session.
Once per session is how Universal Analytics properties count goals. Select this option if it's important for your GA4 conversion count to closely match your UA conversion count. Otherwise, select Once per event. For example, a user completes 5 conversions in one session. This setting counts 1 conversion. A session is a group of user interactions with your website or app that take place within a given time frame. In Analytics, a session initiates when a user either opens your app in the foreground or views a page or screen and no session is currently active (e.g. their previous session has timed out). By default, a session ends (times out) after 30 minutes of user inactivity. However, you can adjust the session timeout period. There is no limit to how long a session can last. An engaged session is a session that lasts longer than 10 seconds, has a conversion event, or has at least 2 pageviews or screenviews. If you don't make a choice, Google Analytics will automatically use the default counting method. The default depends on how conversion events were created: i. “Once per session” is the default counting method for all conversions that were created from Universal Analytics goals, either 1. in an automatically created Google Analytics 4 property or 2. using the goals migration tool in the Setup Assistant after April 2023; ii. “Once per event” is the default counting method for all other conversions. You can quickly tell which counting method each of your conversion events uses by going to the Conversion events table in Admin > Conversions. Any conversion with an icon next to it has the Once per session counting method. If there's no icon in the Conversion name column, then that conversion event uses the Once per event method. To change the conversion counting method, you need to have at least an “Editor” role on the property.

3.
In the U.S., Search Ads Accounted For 40.2% Of All Digital Ad Revenue In 2022 - According to the IAB's 2022 Internet Advertising Revenue Report, search ads accounted for $84.4 billion of the $209.7 billion in U.S. digital advertising revenues. In 2021, search ads accounted for $78.3 billion of the $189 billion in spending. The IAB report pointed out that the growth of search revenues wasn't as strong as other formats – it lost 1.2 percentage points in total revenue share. That's because display revenues were up 12%, digital video was up 19.3% and digital audio was up 20.9%, year on year. With 40.2% of all digital ad revenue in 2022, paid search is still the leading format. After rebounding in 2021, social media saw its smallest level of growth in 10 years, according to the IAB. In 2022, revenue from social platforms was $59.7 billion, up from $57.7 billion in 2021. Apple's App Tracking Transparency (ATT) feature was cited as one key reason for stalled growth. You can view the entire report here (note: the report is free, but you must log in or create an account to download it).

4. Google Released 2022 Web Spam Report - Every year Google releases its web spam report showing how much better the search company got at fighting search spam. Google has named its machine-learning-based system for spam identification SpamBrain. SpamBrain is Google's AI-based spam-prevention system that it launched in 2018 but never spoke about externally as SpamBrain until 2021. Here are some high-level numbers that Google shared in its 2022 report: SpamBrain detected 5 times more spam sites in 2022 compared to 2021; SpamBrain detected 200 times more spam sites in 2022 compared to when it first launched in 2018; SpamBrain was incorporated in the December 2022 link spam update; there was a 10 times improvement in hacked site detection; and SpamBrain can detect spam during crawling, so a page doesn't need to be indexed to be found to be spammy.

5.
Google Updates Policy For Video Thumbnails In Search Results - On April 13, 2023, Google published in the Search Central Blog that they have made a change so that video thumbnails only appear next to Google search results when the video is the main content of a page. This will make it easier for users to understand what to expect when they visit a page. Previously, they showed video thumbnails in two different ways. For pages where the video was the main content of the page, the video thumbnail appeared at the beginning of a listing. This remains unchanged. The other format was for when a video was present on a page but not the main element of a page. In that case, the thumbnail appeared after a listing. This second format is going away. According to Google, during their experiment phase this change had minimal impact on overall engagement for publishers. This change will impact search appearance reported metrics for videos in the performance report in Search Console. There will be annotations in the video indexing report and the video enhancements report. In my own experiments, I noticed that instead of video thumbnails, Google is replacing them with an image. To learn more about video indexing best practices, check out the video best practices guide.

6. Google Warns Against Cloaking HTTP Status Codes - Cloaking is a simple method to hide a website's natural appearance from search engines. This technique gives search engines a version of the webpage (or different content) that is different from what website visitors actually see when they visit the website. Black Hat SEO practitioners use cloaking to try to improve a website's ranking in search engines like Google. During Google's April 2023 SEO Office Hours, a website owner asked Gary Illyes, Analyst at Google, whether giving Googlebot a different HTTP status code from the one served to human visitors would be acceptable.
Specifically, the site owner wanted to serve an HTTP status code of 410 (Gone) to Googlebot while giving users a 200 (OK) status code. An HTTP status code of 410 informs search engines that a page has been deleted for good and needs to be removed from their index. In contrast, an HTTP status code of 200 means the request worked and the desired resource was successfully given. Giving different HTTP status codes to search engines and users is also considered “cloaking.” In response to the website owner's question, Illyes strongly advised against cloaking status codes, stating it's risky. He explained that multiple serving conditions could lead to potential issues, such as the site getting de-indexed from Google. Instead, Illyes recommends using a “noindex” robots meta tag to remove specific pages from Google Search. This approach is more straightforward and safer than setting up potentially problematic serving conditions. Cloaking is considered a direct violation of Google's webmaster guidelines.

7. Google: No SEO Reason To Delay Release Of Thousands Of Pages - Google's John Mueller was asked if it makes sense to publish 8,000 new pages at once or slowly publish them over time. That is, would publishing them all at once cause some sort of SEO issue with Google Search? The answer is no, go ahead and publish the 8,000 pages at once. As John Mueller said on Twitter, "If it's great content that the internet has missed and which users have been waiting for, why would you artificially delay it?" Basically, if you created thousands of great pages, just release them. If they are not great pages, then why release any of them? The number of pages doesn't necessarily matter; what does matter is whether they are good pages. You know, the quality over quantity thing...

8. Google: How To Investigate A Sudden Ranking Drop - Gary Illyes from Google said during the latest Google SEO office hours that it would be "really uncommon that you would completely lose rankings for just one keyword."
The question asked was, "Is there a way that my site was deleted from SERP for one certain keyword? We were 1st and now we are absent completely. Page is in the index." Gary said if you do see that happen, it "usually" means you were "out-ranked by someone else in search results." Your competitors might have improved their content or implemented better SEO strategies, leading them to outrank your website for the targeted keyword. But what happens if you really did disappear for that one keyword phrase? Gary's recommendation is to check the following: Check to make sure that this is not a regional thing where you rank well in one region but you don't rank in another region. "I would check if that's the case globally." "Ask some remote friends to search for that keyword and report back. If they do see your site then it's just a glitch in the matrix," he added. You can accomplish the same on your own using a virtual private network (VPN) to change your location and simulate searches from different countries. This will allow you to see if your website's ranking is consistent across various locations. And I would like to remind you that search results can be influenced by a user's location and search history. If a website's disappearance for a specific keyword is limited to a particular region or individual users, it could be due to geo-targeting or personalization factors. If your site is consistently absent from test searches, you may have a more significant problem. Check to see if you "did anything that might have caused it," such as: did you change the internal link structure or page layout, or acquire more links? Or did you use the disavow tool recently? Each of these may have some effect on ranking, so going through the checklist is probably going to help. At times it is a technical issue: crawling or indexing issues can cause your website to disappear from search results.
This could be due to problems with your robots.txt file, sitemap, or site structure, or issues like broken links, duplicate content, or slow page load times.

9. Are Backlinks Still Relevant In 2023? - Google's John Mueller said that when it comes to links SEOs make to manipulate search rankings and gain ranking position in Google, those links are mostly ignored by Google Search. This is a line Googlers have been saying since Penguin 4.0 came out in 2016 and continue to say. John said on Twitter, "Of course -- most of those links do nothing; we've spent many years ignoring that kind of thing." This was when he was asked, "Most of the Seo practitioners make backlinks for just manipulating search results and gain positions. If private tools like Semrush and Ahrefs can detect those IPs which are building backlinks, doesn't google track them?" So there you have it again: Google says they are excellent at ignoring links that are designed to try to manipulate Google Search rankings. But as a reminder, Google does not demote but rather devalues those links. P.S: Demote would mean that it would lower the rankings of a website for doing something bad. Devalue means it would likely ignore the link spam and not downgrade the rank of the website.
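The two GA4 counting methods in item 2 above can be made concrete with a small sketch. This is an illustrative model only, not GA4's actual data structures: each session is represented as a plain list of event names.

```python
from typing import List


def count_conversions(sessions: List[List[str]], event_name: str, method: str) -> int:
    """Count conversions for `event_name` across sessions.

    `sessions` is a list of sessions, each a list of event names.
    "once_per_event" counts every occurrence; "once_per_session"
    counts a session at most once, like Universal Analytics goals.
    """
    if method == "once_per_event":
        return sum(events.count(event_name) for events in sessions)
    if method == "once_per_session":
        return sum(1 for events in sessions if event_name in events)
    raise ValueError(f"unknown method: {method}")
```

Using the example from the item above, a single session containing 5 purchase events counts as 5 conversions under "once per event" but only 1 under "once per session".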
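For the status-code cloaking warning in item 6 above, one way to audit your own site is to compare, per URL, the status codes your server returned to Googlebot against those returned to everyone else. The sketch below is a hypothetical log-audit pass; the log-entry shape is an assumption for illustration, not a real log format.

```python
from typing import Dict, List, Set, Tuple


def find_status_cloaking(entries: List[Dict[str, object]]) -> List[Tuple[str, Set[object]]]:
    """Return URLs that served different status codes to Googlebot vs. others.

    Each entry looks like {"url": ..., "user_agent": ..., "status": ...}.
    """
    per_url: Dict[str, Dict[str, Set[object]]] = {}
    for e in entries:
        bucket = "googlebot" if "googlebot" in str(e["user_agent"]).lower() else "human"
        codes = per_url.setdefault(str(e["url"]), {"googlebot": set(), "human": set()})
        codes[bucket].add(e["status"])
    suspicious = []
    for url, codes in per_url.items():
        if codes["googlebot"] and codes["human"] and codes["googlebot"] != codes["human"]:
            suspicious.append((url, codes["googlebot"] | codes["human"]))
    return suspicious
```

A URL flagged here (say, 410 to Googlebot but 200 to browsers) is exactly the setup Illyes warned against; the safer fix is a single consistent status plus a "noindex" robots meta tag where needed.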

SEO Podcast by #SEOSLY
#36: Google Indexation With Tomek Rudzki From Onely & Zip Tie


Play Episode Listen Later Mar 29, 2023 65:27


#TWIMshow - This Week in Marketing
[Ep153] - Google: This Is How We Process Your Robot.txt


Play Episode Listen Later Mar 27, 2023 24:41


Get up to speed with the Digital Marketing News and Updates from the week of Mar 20-24, 2023.

1. TikTok CEO Makes His Case To US Congress - TikTok's fate in the US market is still unknown. However, TikTok CEO Shou Zi Chew appeared before the US House Committee on Energy and Commerce this week, giving Chew the opportunity to present TikTok's case and convince lawmakers that TikTok does not pose a national security threat. In his pre-prepared testimony, Chew wrote that "More than 150 million people in the United States use TikTok on a monthly basis, with the average user today being an adult well past college age. Their videos provide a lens through which the rest of the world can experience American culture. Examples include TikTok's role in bringing exposure to American musicians, artists, chefs, and many more. While users in the United States represent 10 percent of our global community, their voice accounts for 25 percent of the total views around the world."

It is interesting to note the average user age in Chew's example here, which is likely older than many would expect. The general consensus is that if the US imposes a TikTok ban, other regions will follow. A ban is not a certainty, but Chew's written testimony doesn't do a lot to provide new assurance.

2. LinkedIn Rolls Out 4 Updates For Business Pages - LinkedIn's latest features for business pages aim to enhance brand visibility, audience engagement, and hiring efficiency:

LinkedIn now lets you plan your business page posts up to three months ahead for steady interaction with followers. Available now on desktop, this feature will come to mobile soon.

LinkedIn is introducing live, audio-only discussions, eliminating reliance on external broadcasting applications. Listeners can engage in the discussion through emojis and request to speak if they wish to contribute verbally.

For businesses with fewer than 1,000 employees, LinkedIn now offers an automatic job posting feature. Once activated, the platform will automatically share one open role daily as a pre-scheduled post. The posts can be edited after they're shared.

LinkedIn Pages can now follow other Pages, making it easier to join conversations related to your field with a feed dedicated to content from the businesses you're following.

It's interesting to see LinkedIn doubling down on audio at a time when Facebook is killing it. I'm not suggesting that the platforms should necessarily copy each other, even though that seems to be happening a lot lately (especially Meta copying TikTok); however, audio is a hard space to win in, and the only one that seems to be doing a decent job is Clubhouse.

3. 6 New Features In Reddit Ads - Reddit has announced 6 updates to its Ads Manager to accommodate its growing global advertising business and cater to the diverse needs of its advertisers. These changes are effective immediately and offer specific improvements for self-service clients and international advertisers.

Simple Campaign Creation: Enables advertisers to create a campaign with a single ad in just three easy steps: create an ad, set up targeting and budgets, and select payment. This feature is particularly helpful for self-serve advertisers who want to launch ads quickly and seamlessly.

Automated Ad Creation: Allows advertisers to build and test multiple sets of ad variations to determine which messaging connects best with their target audience. This feature saves time previously spent creating new ads one by one.

Multi-Currency Support: Advertisers from more than 40 countries can choose from various international currencies, including the Australian Dollar (AUD), Canadian Dollar (CAD), Euro (EUR), British Pound (GBP), and New Zealand Dollar (NZD).

Improved Community Search: Lets advertisers find targetable communities based on topic relevance, ensuring better reach to relevant audiences.
Reporting Retention: Now supports a 12-month look-back, enabling advertisers to analyze their ads' activity over time.

Bulk Edit Improvements: Allow advertisers to change bids, budgets, and third-party reporting trackers for ads across multiple campaigns and ad groups simultaneously.

4. Instagram Adds Reminder Ads and Promoted Results in Search - Just two weeks back, in episode #151, I covered "TikTok Enters The Search Ads Market Even Though It Is Inching Towards A US Ban." In that show, I said that Instagram would soon copy this feature, and now my prediction has come true. Maybe I should quit my day job and become a forecaster. Haha, j/k.

Instagram is launching two new ad options: Reminder Ads, which let users opt into alerts ahead of an event, and ads in search results, helping to better connect with users in a discovery mindset. After a user opts in via the ad CTA, they'll receive three notifications for that event: the first a day before, another 15 minutes ahead of the start time, and a final alert as it begins. That will ensure you don't miss out, and while three reminders may seem a little much, if you're really keen (or forgetful), it could be of benefit. Reminders can be set up to three months ahead of time, and once you've added a reminder to a post, you can create additional posts with reminders for the same event without adding new event details. The event time will also be displayed in the local time equivalent.

The search ads will help to connect users based on contextual keywords. "Ads will show up in the feed that people can scroll when they tap into a post from search results. We plan to launch this placement globally in the coming months." You can learn more about Reminder Ads here, while Search Ads are being rolled out with selected accounts from this week.

5. Microsoft Introduces Category-Based Targeting For Retailers - Search advertising is evolving, especially in retail media. Keyword targeting has been ubiquitous in search advertising for a long time, as it is an effective way for advertisers to reach shoppers who search via keywords. But traditional keyword targeting has constraints within the retail media space: it may limit campaign reach and performance for advertisers, as well as the advertising revenue generated through a retailer's retail media programme.

To address advertisers' desire to maximize their reach to relevant shoppers on a retailer's website and app, Microsoft's retail media platform, PromoteIQ, has launched a new solution that targets retail shoppers based on the categories they browse and leverages keywords as a booster for campaign bids. This gives advertisers the ability to unlock insights into retail shoppers' behaviours, achieving stronger performance for both advertisers and retailers. With Microsoft PromoteIQ, advertisers can boost bids with keywords to show ads to shoppers who are searching for specific products on a retailer's website and app, for a higher chance of converting purchase intent into a sale.

Unlike traditional keyword targeting, which requires advertisers to research and build an exhaustive list of keywords per campaign, advertisers managing retail media campaigns that leverage category behaviours with the Microsoft PromoteIQ platform only need to test and retain a few high-performing keywords. For retailers, the efficiency in campaign management translates into more demand. Microsoft said that in their tests, "campaigns that boost bids by keyword whilst targeting by category exhibit 320% higher click-through-rate (CTR) than the campaigns without boosting bids by keyword."

6.
Countdown to GA4: Google Ads Can See A Significant Impact If Not Using GA4 - As the sun begins to set on Google Analytics Universal Analytics (UA), businesses should be gearing up for the transition to Google Analytics 4 (GA4). Google Ads Liaison Ginny Marvin reminded advertisers that "the GA4 deadline is fast approaching. Universal Analytics properties will stop processing new data on July 1. That means new data will stop flowing from UA properties into Google Ads which could significantly impact campaign performance."

For example, advertisers who are currently using Performance Max (PMAX) campaigns and pulling in conversions from UA may see a drastic decrease in campaign spend if a target ROAS/CPA is set (as the system won't detect any conversions). This is bad for the advertiser as well as for Google, since Google also makes money when your ads get served and clicked. If you are unsure whether you need to migrate, or have already migrated, then you can talk to us.

7. 3 New Features In Google Discovery Ads - According to Google's announcement, the 3 new features in Discovery Ads will help a merchant stand out on its most engaging ad surfaces:

Rollout of product feeds to all Discovery advertisers, which show consumers items based on their interests and intent. Specifically, retailers can use lifestyle images and short text with their Google Merchant Center catalog to deliver more relevant ads. Google claims that by adding product feeds to Discovery ads with sales or lead gen goals, advertisers can achieve 45% more conversions at a similar CPA on average. What's more, you can pair them with Video action campaigns to drive deeper consideration and engagement with consumers who are exploring content on YouTube.

Product-level reporting (launching later this month) will allow advertisers to track how their Google Merchant Center catalog items are performing in product feeds against metrics like impressions or clicks. For example, a retailer who's promoting a wide range of products can now see which types of products are generating more interest and take business-related actions.

Starting in Q2, Discovery advertisers can get a more accurate view of their campaign performance within the Google ecosystem through data-driven attribution (DDA). DDA gives credit for conversions based on how people engage with your ads, and uses your account data to determine which campaigns have the greatest impact on your business goals. You can then pair those insights with automated bidding strategies like Max Conversions to get even more conversions. Advertisers who switch to data-driven attribution from another attribution model typically see a 6% average increase in conversions.

To accurately assess the effectiveness of campaigns, Google introduced Conversion Lift experiments last year, one of its most recent measurement solutions, enabling advertisers to determine incremental conversions based on either users or geography. Now Google is rolling out conversion lift experiments for advertisers who are running both Discovery ads and Video action campaigns and want to measure their ads' impact based on geography. You do have to reach out to your Google account representative to learn how you can participate. In my opinion, unless you have a large budget, it's best to wait.

8. Google Ads Tests Blue Badges For Verified Advertisers In Search - Looks like the "blue check mark" is the flavor of the season. After Meta rolled out a blue checkmark program, Google is testing showing blue badge icons and labels on some search ads for advertisers who are verified by Google Ads. The blue label is a blue circle with ridges and a check mark within it. This is part of the ongoing Google advertiser verification program, and some users are seeing little blue check marks for advertisers who are verified.

9.
Google Clarifies Its 15MB Googlebot Limit - Google has updated the Googlebot help document on crawling to clarify that the 15MB fetch size limit also applies to each fetch of the individual subresources referenced in the HTML, such as JavaScript and CSS files. Google initially added information about this 15MB limit several months ago, which caused a lot of concern in the SEO industry. The help document now reads:

"Googlebot can crawl the first 15MB of an HTML file or supported text-based file. Each resource referenced in the HTML such as CSS and JavaScript is fetched separately, and each fetch is bound by the same file size limit. After the first 15MB of the file, Googlebot stops crawling and only considers the first 15MB of the file for indexing. The file size limit is applied on the uncompressed data. Other Google crawlers, for example Googlebot Video and Googlebot Image, may have different limits."

Furthermore, Gary Illyes posted on LinkedIn that "..The 15MB resource fetch threshold applies on the JavaScript resources, too, so if your JavaScript files are larger than that, your site might have a bad time in Google Search. See googlebot.com for information about the fetch features and limitations. Yes, there are JavaScript resources out there that are larger than 15MB. No, it's not a good idea to have JavaScript files that are that large."

Also note that the 15MB limit is just about fetching; it's totally separate from the indexing side. If the talk about CSS, JavaScript, fetching, and indexing is making you dizzy, then it's a sign that you need assistance from a reputable agency/expert.

10. Non-Supported Rel Link Attributes Do Nothing In Google Search - The rel link attribute has been covered in past episodes (Ep 141, 91). Given the amount of confusion and bad advice floating around the web, Google's John Mueller clarified that only the supported rel link attributes (such as nofollow and me) work; the rest have no impact on your Google search indexing or ranking. He went on to say that you can even use made-up rel values; Googlebot will just ignore them. #SEOmythBusted

11. Google: Keyword Stuffing Alone Does Not Make A Page Unhelpful - Google's John Mueller said that keyword stuffing alone would not make a page be deemed unhelpful. John added that Google is good at ignoring tactics like keyword stuffing, so that alone likely won't be the reason for ranking issues in Google Search. According to him, keyword stuffing is easy for search engines to ignore; it was one of the first things people did to manipulate the results back in the 90's :-).

The discussion on keyword stuffing came up after a poster inquired whether they got hit by the helpful content update because they have too much content on one page. I looked at the page the poster shared, and I'm not surprised that they got hit by the latest helpful content update, because the page seems to have thin content.

P.S: In 2018, John Mueller said something similar, saying that keyword stuffing alone wouldn't result in a penalty, and last year he said keyword-stuffed URLs don't lead to a penalty either.

12. Google: This Is How We Process Your Robots.txt - Google's John Mueller posted a clarification on how and when Google processes the removal requests, or exclusion requests, you make in your robots.txt. The action is not taken when Google discovers the change in your robots.txt; rather, first the robots.txt is processed, and then the specific URLs that are impacted are individually reprocessed by Google Search. Both have to happen: first, Google needs to pick up the changes in your robots.txt, and then Google needs to reprocess the individual URLs, on a URL-by-URL basis, for any changes in Google Search to happen. This can be fast or not, depending on how quickly the specific URLs are reprocessed by Google. In fact, many search engine ranking algorithms work this way, which is why when an update rolls out it sometimes takes about two weeks to fully roll out, because that is how long most important URLs on the internet take for Google to reprocess.
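The robots.txt rules that Google eventually reprocesses are evaluated URL by URL, which you can simulate locally with Python's standard-library robots.txt parser. The rules and URLs below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, supplied inline instead of being fetched from a site.
rules = [
    "User-agent: Googlebot",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# Exclusions apply per URL, mirroring how Google reprocesses each one individually.
print(rp.can_fetch("Googlebot", "https://example.com/private/report"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))       # True
```

A check like this only tells you what the rules say; as the episode notes, Google still has to re-fetch the robots.txt and then reprocess each affected URL before anything changes in search.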

#TWIMshow - This Week in Marketing
[Ep146] - Bing Shared The Importance of “lastmod” Tag


Play Episode Listen Later Feb 6, 2023 21:49


Get up to speed with the Digital Marketing News and Updates from the week of Jan 30 - Feb 3, 2023.

1. LinkedIn Now Lets You Add An SEO Title And Description - Publishing articles on LinkedIn? You can now customize SEO titles and descriptions for any LinkedIn articles you will publish or have already published. Here are the details on the 2 fields:

SEO title: You have 60 characters to work with. LinkedIn says: "We'll use your added SEO title in place of your article title for search engine result pages, such as Google search."

SEO description: You have 160 characters to work with here. LinkedIn says: "We'll use the SEO description in place of the first few lines of your article on search engine result pages. We suggest utilizing keywords, summarizing your writing, and aiming to write between 140-160 characters."

2. Twitter Will Share Ad Revenue With Twitter Blue Verified Creators - Elon Musk, owner and CEO of Twitter, announced that starting Friday (3rd Feb 2023), Twitter will share ad revenue with creators. The new policy applies only to ads that appear in a creator's reply threads. Unlike other social platforms, creators on Twitter must have an active subscription to Twitter Blue (depending on your location, $8/month or more) and meet the eligibility requirements for the Blue Verified checkmark. Eligibility includes having an active Twitter Blue subscription and meeting the following criteria:

Your account must have a display name, profile photo, and confirmed phone number.

Your account has to be older than 90 days and active within the last 30 days.

Recent changes to your account's username, display name, or profile photo can affect eligibility. Modifications to those after verification can also result in a temporary loss of the blue checkmark until Twitter reviews your updated information.

Your account cannot appear to mislead or deceive.

Your account cannot spam or otherwise try to manipulate the platform for engagement or follows.

3. Snapchat Releases Q4'22 Earnings Result - Snapchat has published its Q4'22 earnings result, and it seems they have added 12 million more active users while Snapchat+ subscriptions continue to rise. Highlights:

They currently have 375 million daily active users (DAU).

North American user growth is still flat, while European users saw a slight uptick. It's the "Rest of the World," specifically India, which is driving Snap's growth. While RoW is helping to boost the overall usage numbers, on the revenue side it's not contributing in a significant way.

Snapchat's revenue has increased, but it's still reliant on the US and Canada, with other markets trailing well behind on the revenue front. Snap's revenue per user has actually declined year-on-year, so while the user base is growing, it's not bringing in revenue at equivalent scale, and it's even going backwards in some respects.

In Q4'22, Snapchat+ (subscription service) reached over 2.0 million paying subscribers (0.5% of Snapchat's active user base).

4. Meta Released Q4'22 And Full-Year Earnings Results - Meta has released its Q4'22 and full-year earnings results. Facebook's monthly active user count rose to 2.96 billion in the quarter, a slight increase on Q3, with growth remaining essentially flat in every market except the "Rest of the World" bucket. This is a concern, since RoW does not contribute significantly to their revenue. Facebook's daily active figures are much the same - largely stagnant, though people in the Asia Pacific are coming back to the platform more often. Meta is closing in on 4 billion monthly active users across its four apps. The current global population is 8 billion, so around half of all the people in the world are active on Facebook, Instagram, WhatsApp and/or Messenger.
Meta says that ad impressions in the quarter were up 23% year-over-year, as it continues to find new opportunities for ad placements.

5. Google Released Q4'22 Earnings Report - Google's parent company Alphabet Inc. announced a 9% decline in revenue (the fourth straight quarter of declining profits) and a 1% drop in Search revenue. It earned $76 billion in sales during the Oct-Dec 2022 timeframe and $283 billion (a 10% increase from 2021) in annual revenue. Even YouTube's advertising sales fell by nearly 8%, and the operating margin dropped to 26% from 31%. On Jan 20, 2023, Alphabet laid off 12,000 workers (estimating between $1.9 billion and $2.3 billion in employee severance costs) because they knew they were not growing the way they expected. In hindsight, the layoff makes sense from an operations standpoint. But what does this mean for digital search marketing? Well, the tech giant is still a major player in the digital advertising landscape, and its investments in AI show its commitment to continued growth and innovation.

6. Google's Mueller Criticizes Some SEO Service Providers - Negative SEO companies are those that will build spammy links to a client's competitor in order to make the competitor's rankings drop. John Mueller took a strong position against SEO companies that provide negative SEO, and other agencies that provide link disavow services outside of the tool's intended purpose, saying that they are "cashing in" on clients who don't know better. His advice: "Don't waste your time on it; do things that build up your site instead." The disavow tool was released back in 2012 (after the Penguin update that caused the link-selling bust) and was meant to be used to disavow links if Google notifies you of 'unnatural links' to your site - not to enlist the services of a negative SEO company.

7. Per Google, There Is No Optimal Keyword Density - During recent Google SEO office hours, someone asked, "Does Google use keyword density?" To that question, Google's John Mueller said: "Well, no, Google does not have a notion of optimal keyword density. Over the years, our systems have gotten quite well at recognizing what a page is about, even if the keywords are not mentioned at all. That said, it is definitely best to be explicit. Don't rely on search engines guessing what your page is about and for which queries it should be shown. If your homepage only mentions that you "add pizazz to places" and shows some beautiful houses, both users and search engines won't know what you're trying to offer. If your business paints houses, then just say that. If your business sells paints, then say that. Think about what users might be searching for and use the same terminology. It makes it easier to find your pages, and it makes it easier for users to recognize that they have found what they want. Keyword density does not matter, but being explicit does matter and contrary to the old SEO myth, story, joke, commentary, you don't need to mention all possible variations either."

8. Google: Switch To GA4, Or We Will Do It For You - Google has once again reminded us that Universal Analytics (UA) will stop processing data on July 1, 2023. Google shared in an email that starting in March, for any customer who does not set up a GA4 property with basic settings, Google will configure one with a few basic settings consistent with the existing Universal Analytics property; this includes certain conversion events, Google Ads links, and existing gtag or analytics.js tags. If you don't want Google automatically creating GA4 properties for you, you may opt out via the instructions in this link. My recommendation: please migrate to GA4 ASAP if you have not already done so, to maximize historical data and take advantage of the latest analytics capabilities.
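Circling back to the keyword-density question above: density is a trivial ratio to compute, which is part of why it is so easy for search engines to ignore. A quick illustrative calculation (the sample text is invented):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of words in `text` that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample = "We paint houses. House painting by painters who love houses."
print(round(keyword_density(sample, "houses"), 3))  # 2 of 10 words -> 0.2
```

Per Mueller's answer, there is no target value for this number; being explicit about what the page offers matters, chasing a ratio does not.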
9. Google Recommendation On How To Use "lastmod" Tag - The presence of the lastmod tag may prompt Googlebot to change the publication date in search results, making the content appear more recent and more attractive to click on. As a result, a lot of site owners are fooling around with the lastmod date. So during this week's office hours Q&A section, Google Search Advocate John Mueller and Analyst Gary Illyes recommended that website owners use the lastmod tag only when the webpage was modified significantly enough to require re-crawling. Do not use lastmod for minor corrections or updates.

10. Bing Shared The Importance of "lastmod" Tag - Through a blog post, Microsoft Bing shared the importance of the 'lastmod' tag and how Bing is going to rely on it for crawling. Here is what they have written:

"The "lastmod" tag is used to indicate the last time the web pages linked by the sitemaps were modified. This information is used by search engines to determine how frequently to crawl your site, and to decide which pages to index and which to leave out. The inclusion of the "lastmod" tag in your sitemap is crucial as it allows search engines to easily determine when a page was last updated. Without it, search engines may delay crawling updated content or may over-crawl your website as they cannot accurately determine if the content has been modified."

Bing also clarified that some site owners are using the same date for sitemap generation and last modification; however, that is incorrect. "The date must be set to the date the linked page was last modified, not when the sitemap is generated. The "lastmod" tag should be regularly updated when the page is updated for search engines to understand the frequency of updates. To ensure this, it is recommended to generate your sitemaps at least daily. We are revamping our crawl scheduling stack to better utilize the information provided by the "lastmod" tag in sitemaps. This will enhance our crawl efficiency by reducing unnecessary crawling of unchanged content and prioritizing recently updated content. We have already begun implementing these changes on a limited scale and plan to fully roll them out by June."
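The practical upshot of both the Google and Bing guidance above is to derive lastmod from each page's own last-edit date, never from the sitemap build time. A minimal sketch of generating a sitemap entry this way; the URL and date are invented, and you would substitute whatever per-page "last edited" timestamp your CMS tracks:

```python
import xml.etree.ElementTree as ET
from datetime import date

def sitemap_entry(loc: str, page_last_edited: date) -> ET.Element:
    # lastmod reflects when the *page* last changed meaningfully,
    # not when this sitemap file happened to be generated.
    url = ET.Element("url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = page_last_edited.isoformat()
    return url

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
urlset.append(sitemap_entry("https://example.com/pricing", date(2023, 1, 15)))
print(ET.tostring(urlset, encoding="unicode"))
```

Regenerating the sitemap daily (as Bing suggests) is then harmless, because unchanged pages keep their old lastmod values.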

#TWIMshow - This Week in Marketing
[Ep145] - Google: When Fake URLs Are Generated By Your Competitor


Play Episode Listen Later Jan 30, 2023 16:13


Get up to speed with the Digital Marketing News and Updates from the week of Jan 23 - 27, 2023.

1. Key Takeaways From Microsoft FY23 Q2 Earnings - Microsoft Corp. announced its financial results for the quarter ended December 31, 2022, which showed an increase in revenue of 2% to $52.7 billion. LinkedIn revenue increased 10%, driven by Talent Solutions, and the platform has once again seen "record levels" of in-app engagement in the most recent quarter, reporting 18% growth in total user sessions. Per LinkedIn: "We once again saw record engagement among our more than 900 million members. Three members are signing up every second. Over eighty percent of these members are from outside the United States." Hey LinkedIn, how many of these accounts are fake or bot accounts? At the same time, LinkedIn has warned that this number will likely decline in 2023 due to a broader slowdown in hiring, particularly in the tech sector, where many of LinkedIn's job postings stem from.

Total ad revenue increased 10%. Microsoft also announced its plans to increase its ad revenue from $10 billion annually to $20 billion. If achieved, that would make Microsoft the sixth-largest digital ad seller worldwide. Remember, Microsoft has a partnership with Netflix and allows advertisers to purchase ads through its demand-side platform Xandr. Microsoft will take a reseller fee, and experts predict that the partnership will be a huge revenue driver, easily clearing $10 billion in ad sales or more. You can read the full earnings statement from Microsoft here.

2. LinkedIn Trying To Boost Newsletter Discovery By Showcasing Which Newsletters A Member Has Subscribed To - LinkedIn is looking to make it easier to find relevant newsletters in your niche by adding a new option that will enable members to view what newsletters another member is subscribed to in the app. Per LinkedIn: "We've heard from members that newsletters on LinkedIn are a great way to gather new insights and ideas on professional topics that they care about. We've also heard that members are looking for better ways to discover even more newsletters that would be relevant to them. To aid in this discovery, we are making newsletter subscriptions visible to others, including on profiles. Starting February 11th, 2023, you'll be able to see which newsletters members find value in, the same way you can see your shared interests, pages and groups."

IMO, this is a double-edged sword unless LinkedIn gives me a way to control which subscriptions are revealed publicly versus kept private. On the other hand, creators and influencers can charge for newsletter subscriptions and promote them for a fee. I would not do this myself, but to each his own.

3. Twitter Launches "Search Keyword Ads" - Twitter has introduced a new ad unit called "Search Keywords Ads," which allows advertisers to pay for their tweets to appear at the top of search results for specific keywords. Search Keywords Ads are similar to promoted tweets but with the added benefit of appearing in search results. This will allow advertisers to reach a wider audience, as users searching for specific keywords (a more accurate signal of user intent) will now be exposed to sponsored tweets. Advertisers can find Search Keywords Ads as a new campaign objective within the Twitter Ads interface. My question: how long before Instagram copies this?

4. Google Optimize Discontinued. Now What? - If you have not heard of Google Optimize before today, I do not blame you. Google Optimize (formerly called Google Website Optimizer) is an analytics and testing tool created by Google.
It allows you to run experiments aimed at helping online marketers and webmasters increase visitor conversion rates and overall visitor satisfaction. And now Google has decided to discontinue this service on September 30, 2023. "Google Optimize and Optimize 360 will no longer be available after September 30, 2023. Your experiments and personalizations can continue to run until that date," Google wrote. Google added, "We launched Google Optimize over 5 years ago to enable businesses of all sizes to easily test and improve your user experiences. We remain committed to enabling businesses of all sizes to improve your user experiences and are investing in A/B testing in Google Analytics 4."

I am unhappy about this announcement because Optimize worked seamlessly with GA, and its retirement will leave a significant gap in the market for affordable and beginner-friendly A/B testing options. However, I'm hopeful that Google will integrate some of these features into GA4.

5. Google Ads Now Supports Account-Level Negative Keywords - Creating a list of negative keywords allows you to block your ads from showing for terms that are irrelevant to your brand, making it easier for your ads to reach your desired audience and resulting in more successful conversions. To prevent unwanted impressions or clicks from certain search terms across multiple campaigns, advertisers can now create a negative keyword list at the account level and then apply it to relevant campaigns. This will save you the effort of adding the same negative keywords to individual campaigns and make it easier to manage future changes to negative keywords across campaigns. But be forewarned that a limit of 1,000 negative keywords can be excluded per account, so do not go crazy! You can read the full announcement from Google here.

6. US Justice Department Sues Google Again, Wants To Dismantle Its Ad Division - The U.S. Department of Justice (DOJ) officially filed an antitrust lawsuit against Google on January 24. The DOJ alleges Google has a monopoly on the current digital advertising ecosystem. Eight states so far have joined forces with the DOJ on the lawsuit: Virginia, California, Colorado, Connecticut, New Jersey, New York, Rhode Island, and Tennessee. Remember that this lawsuit is separate from the first DOJ lawsuit against Google back in 2020. In the 153-page document, the DOJ argues that Google has created an advertising environment that unfairly favors its Alphabet-owned products. Here is what the DOJ wrote:

"Google, a single company with pervasive conflicts of interest, now controls: (1) the technology used by nearly every major website publisher to offer advertising space for sale; (2) the leading tools used by advertisers to buy that advertising space; and (3) the largest ad exchange that matches publishers with advertisers each time that ad space is sold. Google abuses its monopoly power to disadvantage website publishers and advertisers who dare to use competing ad tech products in a search for higher quality, or lower cost, matches. Google uses its dominion over digital advertising technology to funnel more transactions to its own ad tech products where it extracts inflated fees to line its own pockets at the expense of the advertisers and publishers it purportedly serves."

If Google is found guilty in this lawsuit, it could have an impact on the broader advertising sector. You can read the full lawsuit document here.

7. Two Imp Elements For Google Discover Follow Feed - Google updated its Google Discover feed guidelines and wrote that the "most important content for the Follow feature is your feed title element and your per item link elements." In addition to that, make sure that your feed (aka RSS) is kept up to date, as you would your sitemap. Remember that the Google Discover Follow feed feature offers relevant content to Chrome Android users and represents an important source of traffic that is matched to user interests.
It is one way to capture a steady stream of traffic apart from Google News and Google Search.

8. Google: Don't Use Relative Paths In Your rel-canonical - A canonical URL lets you tell search engines that certain similar URLs are actually the same, because sometimes you have products or content reachable at multiple URLs, or even on multiple websites. Using canonical URLs (HTML link tags with the attribute rel=canonical), you can have these on your site without harming your rankings. Gary Illyes from the Google Search Relations team posted another PSA on LinkedIn; this one says "don't use relative paths in your rel-canonical." Gary wants you to use the full, absolute URL when it comes to rel-canonical. This is not new advice; Google said this in 2018 and in 2013. This is such a common mistake that Google's John Mueller wrote, “One of the problems is that it's relative to where the content was found. www or non-www? http or https? staging subdomain? staging domain? random other domain that's hosted by the same company? If you want to pick something specific, it's good to be specific.” If the terms canonical, relative, or absolute path have made you dizzy, then perhaps you should seek the advice or help of an expert.

9. Google: When Fake URLs Are Generated By Your Competitor - Mike Blazer asked John, "Bulk generate non-existing URLs on a competitor's site that lead to 5XX server errors when opened. Googlebot sees that a substantial number of pages on that domain return 5XX, the server is unable to handle requests. Google reduces the page #crawl frequency for that domain." John Mueller from Google said that bulk-generating fake URLs on a competitor's site should not lead to negative SEO and ranking issues for that site. "This is not something I'd worry about," he added. P.S.: The audio version of the show contains my analysis and opinion on this topic.
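To see why Gary's rel-canonical advice matters, here is a minimal Python sketch (the domains are hypothetical) showing how the same relative canonical path resolves to different absolute URLs depending on which variant of the page Googlebot happened to fetch:

```python
from urllib.parse import urljoin

# The same relative canonical path resolves to a different absolute URL
# depending on where the page was fetched: www vs non-www, http vs https,
# or a staging subdomain.
relative_canonical = "/products/widget"

for base in (
    "https://www.example.com/products/widget",
    "http://example.com/products/widget",
    "https://staging.example.com/products/widget",
):
    print(urljoin(base, relative_canonical))
```

Declaring the full absolute URL in the rel=canonical tag removes this ambiguity entirely, which is exactly Mueller's point about being specific.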

#TWIMshow - This Week in Marketing
[Ep137] - Google On NoIndex Pages + Crawl Budget


Play Episode Listen Later Dec 5, 2022 16:08


1. LinkedIn App Now Allows You To Schedule Posts - According to LinkedIn, “We're starting to roll out post scheduling on desktop and Android so that our creators can easily plan the content they want to share next, with iOS coming soon. This means you can schedule text posts, videos, and images up to three months in advance.”

2. Pinterest Stops Creator Rewards Program - Pinterest has announced that as of Nov 30th, it is ending its Creator Rewards program, which offered cash bonuses when creators completed goals such as hitting certain engagement metrics.

3. New Ad Targeting Options In Twitter Ads - Twitter announced some new ad targeting options, which look fairly similar to its existing ad goals, but with some important differences. The first update is within its ‘Conversions' objective, with advertisers now able to focus their promotions on users who are more likely to take specific actions in response. Per Twitter: “Website Conversions Optimization (WCO) is a major rebuild of our conversion goal that will improve the way advertisers reach customers who are most likely to convert on a lower-funnel website action (e.g. add-to-cart, purchase). Our user-level algorithms will then target with greater relevance, reaching people most likely to meet your specific goal - at 25% lower cost-per-conversion on average, per initial testing.” Twitter has also launched its ‘Dynamic Product Ads', which enable advertisers ‘to showcase the most relevant product to the right customer at the right time'. Finally, Twitter is also launching its updated Collection Ads format, which enables advertisers to share a primary hero image along with smaller thumbnail images below it. “The primary image remains static while people can browse through the thumbnails via horizontal scroll. When tapped, each image can drive consumers to a different landing page.” You can read more about Twitter's latest ad updates here.

4.
TikTok Announces Fall Semester Curriculum Of Its Creative Agency Partnerships (CAP) University - TikTok has announced the Fall Semester curriculum of its Creative Agency Partnerships (CAP) University program, which aims to ‘teach agency creatives how to show up on the platform'. CAP University provides in-depth training and insight for marketing and ad partners to help them maximize their use of the platform for their clients' promotions. The initiative was first launched back in April with an initial course run, but now TikTok has updated its lesson plan for the next phase. The most significant new addition is ‘Content to Cart', which explores the potential of eCommerce in the app via its evolving set of product and shopping showcase tools. You can learn more about CAP University's Fall Semester curriculum here.

5. Two New Metrics In GA4 Reports - Google has added two new metrics to GA4 properties: views per session and average session duration. Views per session tracks the number of app screens or web pages people look at during a single visit, while average session duration measures the time users spend on the website.

6. Per Google, HTTP/3 Doesn't Impact SEO - HTTP/3 is a new standard in development that will affect how web browsers and servers communicate, with significant upgrades for user experience, including performance, reliability, and security. Google Search Advocate John Mueller has now debunked theories that HTTP/3 can directly impact a website's SEO. According to Mueller, “Google doesn't use HTTP/3 as a factor in ranking at the moment. As far as I know, we don't use it in crawling either. In terms of performance, I suspect the gains users see from using HTTP/3 would not be enough to significantly affect the core web vitals, which are the metrics that we use in the page experience ranking factor. While making a faster server is always a good idea, I doubt you'd see a direct connection with SEO only from using HTTP/3.
Similar to how you'd be hard-pressed to find a direct connection to using a faster kind of RAM in your servers.”

7. Google On “SEO Compliance Score" - Google's Search Liaison, Danny Sullivan, said (on Mastodon) there is no such thing as an “SEO compliance score” from Google. This was in response to user @bertran claiming that Google has an SEO compliance score. It just goes to show how many made-up things exist in the SEO world. Furthermore, Danny shared that the "Search Essentials is covering things we've long said: 1) avoid technical errors that prevent indexing content. 2) don't spam. 3) consider some best practices about producing content meant for humans."

8. Google Has Algorithms To Detect & Demote AI-Altered Plagiarized Content - There are “gurus” peddling AI tools that can (re)generate content that will supposedly rank you on top. However, Duy Nguyen from Google's search quality team said that Google has "algorithms to go after" those who post AI-plagiarized content, and those algorithms can "demote site scraping content from other sites." Duy Nguyen said, "Scraping content, even with some modification, is against our spam policy. We have many algorithms to go after such behaviors and demote site scraping content from other sites."

9. Google's POV On Long & Short Content - During a recent Google SEO office hours, the topic of long and short content came up. According to Gary Illyes, Google is not more or less likely to crawl or index shorter or more niche content than other types of content. The fact that content is shorter, and perhaps easier to crawl, does not guarantee that Google will index it more quickly or more favorably than lengthy content. "Niche content can also be indexed.
It's not in any way penalized, but generally content that's popular on the internet, for example, many people linked to it, gets crawled and indexed easier," Gary added. Then another user asked if splitting a long article into multiple interlinked pages results in thin content. But first, what is thin content? Some people think that thin content means a webpage without much content on it. But in reality, thin content is essentially when a site's text, info, or visual elements are not relevant to the visitor's intent or do not provide them with what they are looking for. Thin pages are characterized by a lack of originality, a tiny difference from other pages, and/or a lack of unique added value. For example, a product page that a retailer copies from the manufacturer's site with nothing additional added to it. Doorway pages are also considered a form of thin content, because often these web pages are designed to rank for specific keywords - for example, a page created to rank for a keyword phrase plus different city names, where all the pages are virtually the same except for the names of the cities. When a user asked, “Would it be considered thin content if an article covering a lengthy topic was broken down into smaller articles and interlinked?” Lizzi Sassman from Google answered: “Well, it's hard to know without looking at that content. But word count alone is not indicative of thin content. These are two perfectly legitimate approaches: it can be good to have a thorough article that deeply explores a topic, and it can be equally just as good to break it up into easier to understand topics. It really depends on the topic and the content on that page, and you know your audience best. So I would focus on what's most helpful to your users and that you're providing sufficient value on each page for whatever the topic might be.” However, pagination (splitting one lengthy topic across multiple interlinked pages, requiring the site visitor to click to the next
page to keep reading the content) is fine. Google Search Central has a page about pagination best practices.

10. Google: Stop Wasting Your Time Disavowing Random Links Flagged By Disavow Tools - In SEO, disavowing means notifying Google to ignore harmful or low-quality links that point to your site and are beyond your control. Third-party tools use proprietary algorithms to score backlinks according to how spammy or toxic the tool company feels they are. According to Google Search Advocate John Mueller, “disavowing random links that look weird or that some tool has flagged, is not a good use of your time. It changes nothing. Use the disavow tool for situations where you actually paid for links and can't get them removed afterwards.” Obviously this raises the question: how do you know if the links were paid for, especially when you had hired an agency or consultant for a backlink project? My thoughts are in the show recording.

11. What Google Thinks Of Backlinks & Rankings - So what are backlinks? A backlink in SEO is a link from another website to your website, so that when clicked, the user is taken to your website. During a recent Google SEO office hours, Duy Nguyen from Google's search quality team said that "backlinks as a signal has a lot less significant impact compared to when Google Search first started out many years ago. We have robust ranking signals, hundreds of them, to make sure that we are able to rank the most relevant and useful results for all queries." According to Duy, it is a waste of time and effort to generate backlinks. Here is what he said: "link building campaigns, which are essentially link spam according to our spam policy. We have many algorithms capable of detecting unnatural links at scale and nullifying them.
This means that spammers or SEOs spending money on links truly have no way of knowing if the money they spent on link building is actually worth it or not, since it's really likely that they're just wasting money building all these spammy links and they were already nullified by our systems as soon as we see them."

12. Google On NoIndex Pages + Crawl Budget - Crawl budget is the number of pages Googlebot crawls and indexes on a website within a given timeframe. The vast majority of sites out there don't need to worry about crawl budget unless you run a very big site, added a bunch of pages, or have lots of redirects. On Dec 2, 2022, Lizzi Sassman from Google updated the crawl budget management help document to kill two myths around crawl budget:

“1. Any URL that is crawled affects crawl budget, and Google has to crawl the page in order to find the noindex rule. However, noindex is there to help you keep things out of the index. If you want to ensure that those pages don't end up in Google's index, continue using noindex and don't worry about crawl budget. It's also important to note that if you remove URLs from Google's index with noindex or otherwise, Googlebot can focus on other URLs on your site, which means noindex can indirectly free up some crawl budget for your site in the long run.

2. Pages that serve 4xx HTTP status codes (except 429) don't waste crawl budget. Google attempted to crawl the page, but received a status code and no other content.”

Coincidentally, during the latest SEO office hours, someone asked whether a large number of noindex pages will adversely impact a website's indexing or crawling, whether we need to be mindful of the ratio of indexed to noindexed pages, and whether to worry about noindex pages linked from spammy sites. Here are the official responses from Google on these questions: “Noindex is a very powerful tool that search engines support to help you, the site owner, keep content out of their indexes.
For this reason, it doesn't carry any unintended effects when it comes to crawling and indexing. For example, having many pages with noindex will not influence how Google crawls and indexes your site.”

“No, there is no magic ratio to watch out for. Also, for a site that's not gigantic, with less than a million pages, perhaps, you really don't need to worry about the crawl budget of your website. It's fine to remove unnecessary internal links, but for small to medium-sized sites, that's more of a site hygiene topic than an SEO one.”

“Noindex is there to help you keep things out of the index, and it doesn't come with unintended negative effects, as we said previously. If you want to ensure that those pages, or their URLs more specifically, don't end up in Google's index, continue using noindex and don't worry about crawl budget.”
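As a rough illustration of what "keeping things out of the index" looks like mechanically, here is a minimal Python sketch (not Google's actual implementation; the page markup is invented) of how a crawler might read the robots meta tag to decide whether a fetched page is indexable:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects directives from <meta name="robots" content="..."> tags."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            for token in attrs.get("content", "").split(","):
                self.directives.add(token.strip().lower())

def is_indexable(html: str) -> bool:
    """A page is indexable unless a robots meta tag says noindex."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" not in parser.directives

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_indexable(page))  # False: kept out of the index, links may still be followed
```

Note that, as the help document says, the crawler still has to fetch the page once to discover the noindex rule; the directive controls indexing, not crawling.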

Earned Media Hour with Eric Schwartzman
Ex-Google Search Quality Team Member Opens Kimono


Play Episode Listen Later Nov 18, 2022 45:58


Fili Wiese is a former Google Engineer and member of the search quality team. I spoke to him after his keynote at the Affiliate Meet Market in Berlin in October 2022. Tech SEO Topics Covered: How to tell Googlebot what to crawl and what to ignore How to manage your crawl budget How to retain… The post Ex-Google Search Quality Team Member Opens Kimono appeared first on Eric Schwartzman.
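One of the topics Fili covers, telling Googlebot what to crawl and what to ignore, is usually done through robots.txt. Here is a small sketch using Python's standard-library robots.txt parser; the rules and domain are made up for illustration (blocking a faceted-search path while leaving the rest of the site crawlable):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: Googlebot is blocked from /search/ only.
rules = """\
User-agent: Googlebot
Disallow: /search/
Allow: /

User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/products/"))        # True
print(parser.can_fetch("Googlebot", "https://example.com/search/?q=shoes"))  # False
```

Keeping low-value URL spaces like internal search results out of the crawl is one common way to spend crawl budget on the pages that matter.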

SEO Podcast by #SEOSLY
#1: Introduction to the SEO Podcast by #SEOSLY & Weekly SEO News


Play Episode Listen Later Oct 31, 2022 27:49


#1: Introduction to the SEO Podcast by #SEOSLY & weekly SEO news. Welcome to the first episode of the SEO podcast by #SEOSLY.

English Google SEO office hours from October 2022 - This is an audio-only recording of the experimental Google SEO office hours from October 2022. The answers are compiled by the Google Search Relations team. By Google Search Central. LISTEN.

Search Central docs and SEO (Search Off the Record) - Find out if Google Search Central implements its own SEO best practices and guidelines on its website and documentation! Hosts John and Lizzi discuss SEO tools, sitemaps, hreflang links, robots.txt, click-through rates, and more. By Search Off the Record. LISTEN.

The State Of SEO: Survey Data To Plan Your Next Year In SEO - Inform your SEO strategy for 2023 with data, insights, and findings from the SEO community. More than 3,600 SEO pros responded. By Ben Steele (SEJ). LEARN MORE.

Google October 2022 Spam Update Complete - On October 19, 2022 Google released the October 2022 spam update. This update was global and affected all languages. The rollout was complete as of October 21, 2022. By Google Search Central. LEARN MORE.

SEO Audits with Olga Zarzeczna (Search with Candour) - I was a guest on the Search with Candour SEO podcast where Jack Chambers-Ward and I talked about - you guessed it - SEO audits. Check out this episode and subscribe to the Search with Candour SEO podcast.

Top SEO Mistakes from Olga Zarzeczna (Edge of the Web) - I was pleased to be a guest on the Edge of the Web SEO podcast again. This time Erin and I spoke about SEO mistakes I come across in SEO audits. Don't miss that episode and subscribe to the Edge of the Web SEO podcast.

Old Google Page Speed Algorithms Are No Longer Used - Google's John Mueller confirmed this morning that Google no longer uses the old Google page speed algorithms from 2010 or 2018. Instead, he said, Google only uses the page experience update, looking at the core web vitals metrics. Reported by Barry Schwartz. LEARN MORE.
Google Analytics 4 gets new features & reporting - Google announced (in this post) new features and reporting for Google Analytics 4, including behavioral modeling in real-time reporting, custom channel reporting for data-driven attribution, and integration with Campaign Manager 360. Reported by Barry Schwartz. LEARN MORE.

New Google Business Profile Web Search Menu Now Rolling Out - Google is giving business owners the ability to quickly edit their business profile directly in web search through new action buttons (previously this required many more clicks). Reported by none other than Barry Schwartz. LEARN MORE.

Google is removing a lot of reviews from Google Business Profiles - Joy Hawkins is reporting, and a lot of people are confirming, that Google has been doing waves of review takedowns. Reported by Joy Hawkins. LEARN MORE.

#1: Just because something is human-written doesn't make it helpful & good content - In case you need a friendly reminder from John Mueller about helpful content, here it is: "Just because something is human-written doesn't make it helpful & good content. I'd really focus on making things awesome, unique, compelling, that people recommend to friends - not just something that's technically ok."

#2: Google: You'd Be Lucky To End Up In The Same Place After Changing URLs - John Mueller said when it comes to changing URLs, "if you're lucky, it'll end up in the same place." Meaning, if you do everything right and Google does everything right, which is rare to have both align, then you'd be lucky to have the new URL rank in the same position as the old URL in Google Search.

Zero-Clicks Study - The topic has had plenty of coverage and attention, but often without a great deal of context, so that's why we've carried out a new zero-clicks study with a more refined sample size from our own data sources.

13 essential SEO skills you need to succeed - Critical thinking. Problem-solving. Persistence. These are just three critical SEO skills. Discover 10 more here.
By Danny Goodwin (Search Engine Land).

Crawl efficacy: How to level up crawl optimization - Crawl budget is a vanity metric. Your goal should be to guide Googlebot in crawling important URLs fast once they are published or updated. By Jes Scholz (Search Engine Land).
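Scholz's "crawl efficacy" idea, measuring how quickly Googlebot picks up new or updated URLs rather than how many URLs it crawls, can be sketched in a few lines of Python. The URLs and timestamps below are invented; in practice you would pull them from your CMS and your server access logs:

```python
from datetime import datetime

# Hypothetical data: when each URL was published, and when Googlebot
# first requested it (as seen in server access logs).
published = {
    "/blog/new-post": datetime(2022, 10, 20, 9, 0),
    "/products/widget": datetime(2022, 10, 20, 9, 0),
}
first_crawled = {
    "/blog/new-post": datetime(2022, 10, 20, 11, 30),
    "/products/widget": datetime(2022, 10, 22, 9, 0),
}

for url, pub in published.items():
    delay = first_crawled[url] - pub
    print(f"{url}: crawled {delay.total_seconds() / 3600:.1f}h after publish")
# /blog/new-post: crawled 2.5h after publish
# /products/widget: crawled 48.0h after publish
```

A URL that waits two days for its first crawl is a better optimization target than a raw crawl count ever reveals.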

#TWIMshow - This Week in Marketing
[Ep131] - Google/YouTube Audio Ads Now Available


Play Episode Listen Later Oct 24, 2022 21:50


1. Highlights From Snapchat Q3'22 Earnings - Here are the highlights from Snapchat's Q3'22 earnings report (analysis is in the podcast):

Revenue increased 6% YoY to $1.13 billion in Q3'22.
Average revenue per user (ARPU) was $3.11 in Q3'22, compared to $3.49 in Q3'21. ARPU in the US was $8.13; ARPU in the rest of the world was $0.89.
Daily Active Users (DAUs) were 363 million in Q3'22, an increase of 57 million, or 19%, year-over-year. Snap is still seeing the majority of its growth in the 'Rest of World' category, where it's added 45 million users year-over-year.
Snapchat+ reached over 1.5 million paying subscribers in Q3'22 and is now offered in over 170 countries.

2. Uber Testing “Journey Ads” - Marketers Can Target Riders By Destination - Uber, the ride-hailing company, is expanding its advertising business and is testing ‘Journey Ads' shown to riders based on their travel history and their geographic locations. The journey ads will be shown in the Uber app at least three times during the rider's journey. Uber will allow brands to place ads using data drawn from riders' travel history and their precise geographic destinations. For example, if a user books an Uber to a specific retailer, cinema, or airport, an advertiser could buy ads centered on that location. An Uber spokeswoman said, "Uber does not share user data with advertisers. The information that Uber shares is limited to aggregated information or other non-personal data, such as the percentage of users who clicked on an ad, or the number of users who visit the retail locations of advertisers using our platform. Users can opt out of targeted ads on the Uber app at any time."

3. New Features For Facebook Groups Announced At Communities Summit 2022 - Facebook has announced a range of new group features during the Communities Summit 2022, including new Stories sharing tools, view-only chats, custom member profile displays, and more.
Reels in Groups - With Reels now in Groups, community members can share information, tell stories, and connect on a deeper level. Group admins and members can also add creative elements such as audio, text overlay, and filters on top of their videos before sharing to bring their stories to life.

The ability to share a public Facebook event for your community to your Instagram story - Whether you're a group admin hosting a meet-up to celebrate a community milestone, or a group member sharing your passion with friends, this feature can help you showcase your community more broadly.

Facebook is also testing new customization options for people's 'About Me' section specific to each community, in order to better highlight relevant information for that audience, along with a new indicator that highlights whether you're open to private messaging. Facebook is also testing the ability for group admins and moderators to create view-only chats to send one-way communications to all of their members (without having to maintain or respond to messages in the chat), and a new process to highlight top contributing members, who'll be able to earn points by taking on group engagement responsibilities. Facebook is also trying out a new element within Admin Assist that will enable group admins to automatically remove posts containing false information, as determined by Meta's third-party fact-checkers.

4. Instagram Announces ‘Creator Portfolio' Option - Instagram has officially announced its ‘Creator Portfolio' option, which will enable creators to build what's essentially a media kit package, in order to showcase their audience reach as well as previous brand work. It's similar to Instagram's Creator Marketplace listings, where influencers are presented in a collective showcase, though a Creator Portfolio enables individual users to share their profile directly with potential partners in the app.

5.
Google: The Importance Of Image Filenames And One Big Mistake To Avoid - Google's documentation doesn't say whether image filenames are ranking factors, but it does say that Google takes note of them to help figure out what an image is about. For that reason, Google recommends giving images meaningful filenames. Meaningful filenames are also good practice for organization purposes: you can see the filename and know what the image is about. In episode 48 of the Search Off the Record podcast, Google's Lizzi Sassman and John Mueller discussed the importance of image filenames. Here is the gist of their conversation:

Descriptive image filenames are helpful from an Image Search standpoint because they help Google understand what an image is about.
Alt text and the text surrounding the image provide a stronger and more important signal about the image than the filename.
Changing the filename of an already-indexed image has a “minimal effect” and likely won't make it better.
Changing the filename of an indexed image may result in the renamed image going uncrawled and unindexed for months.

6. Google: The One Thing To Do Before Implementing Geo IP Redirects - Geo-IP-based redirection is the process of automatically redirecting a website visitor based on their geolocation (country, state, or city). It works by detecting the visitor's location from their IP address, matching it against a database of IPs and locations. In the latest #AskGooglebot video, Google's Search Advocate, John Mueller, shares that “this is a pretty complex topic and it can cause headaches depending on how it's implemented. So I'd recommend getting help from experts or reaching out to the Search Central help forum for advice. The most important aspect here is that Googlebot usually crawls and indexes a website just from a single location.
If you're showing different content by location, only one version will be indexed for search. Because of that, if there's something you consider important on your website, make sure it's in the default content shown to all users. And finally, if there's any page you want to be findable in search, make sure you're not blocking users in other locations from reaching that page.”

7. Google: To Have A Unique URL You Need A Character After The "?" - Google's John Mueller said on Twitter, "Using just "/?" at the end of a URL is equivalent to not having anything after the slash. You need at least 1 character after the "?"."

8. Google: There Is No Need To Use English URLs For Pages Not In English - Google's John Mueller wrote on Twitter, "There's no need to use English URLs if the content isn't in English. That said, keywords in URLs are a bit overrated, so I wouldn't worry too much about it." The same information was conveyed back in 2018 in a video.

9. Google: Spam Update Rolling Out - It has been 11 months since Google last released a spam update, and now Google has announced the October 2022 spam update. The spam updates help document reads: “While Google's automated systems to detect search spam are constantly operating, we occasionally make notable improvements to how they work. When we do, we refer to this as a spam update and share when they happen on our list of Google Search ranking updates. For example, SpamBrain is our AI-based spam-prevention system. From time-to-time, we improve that system to make it better at spotting spam and to help ensure it catches new types of spam.” Here are the important things you need to know about the October 2022 spam update:

Name: Google October 2022 Spam Update
Launched: October 19, 2022 at around 11 am ET
Rollout: It will take several days to fully roll out
Targets: It improves Google's spam detection techniques; Google said "sites that violate our policies may rank lower in results or not appear in results at all."
Penalty: It penalizes spam techniques that are against Google's spam policies.
Global: This is a global update impacting all regions and languages.
Impact: Google will not disclose what percentage of queries or searches were impacted by this update.
Recover: If you were hit by this, Google said you should review its spam policies to ensure you are complying with them.
Refreshes: Google will do periodic refreshes of the spam update. It can take many months to recover, Google said.

10. YouTube Ads: Moment Blast & Expanded Product Feeds For Discovery Ads - YouTube is launching a new offering called “Moment Blast,” designed to reach viewers during events like live sports, movie releases, or product launches. Moment Blast gives advertisers prime positioning on YouTube Select content on connected TVs (CTV) and other devices, plus a Branded Title Card and optional Masthead placement, and is intended for brands looking to raise awareness during those key moments. In the announcement, YouTube says, “People have always connected in front of the TV screen, but YouTube gives them the unique chance to bond over shared passions — like watching live-streamed concerts, fitness classes or even religious ceremonies together. They feel a similar connection to the ads they get, too.” YouTube also claims that 59% of respondents in a survey feel that ads they see on YouTube are more relevant than ads on linear TV or other streaming apps. Furthermore, YouTube will be expanding product feeds to Discovery ads to help advertisers scale their social media creative and reach more engaged viewers. Soon, product feeds will also include local offers, allowing brands to show real-time availability for products in their Google Merchant Center so people can find the most convenient place to buy. Creators will also be able to transform their content into virtual storefronts.
Additionally, more creators will have the ability to tag products in their videos and Shorts. These features will be available on November 10.

11. Google Ads Updates “Unavailable Video” Policy - This week, Google quietly updated its “Unavailable Video” policy to give advertisers a clearer understanding of the disapproval reason and how to rectify the issue quickly. The four reasons why you may receive an ad disapproval under the “Unavailable Video” policy are:

Video status changed to “deleted” after submission.
Video status changed to “private” after submission.
Video marked as a live premiere.
Video marked as sponsors- or members-only.

12. Google Ads: Seasonal Video Ad Templates Are Now Available - To support your holiday and seasonal marketing efforts, new seasonally themed video ad templates are available in Google Ads. The templates have designs and music tracks that reference the following holidays and moments: Diwali, Hanukkah, Christmas, and seasonal sales events like Cyber Monday and Black Friday. Expect to see additional templates for more seasonal moments throughout the coming months. As with all of their templates, you can customize them with your images, logo, brand colors, and text. Videos you make may be used in any campaign that accepts video assets, like Performance Max campaigns, Video action campaigns, and App campaigns. These campaign types automatically scale to YouTube Shorts inventory, where vertical video assets are most effective. You can browse all of the templates by clicking here.

13. Google Ads: 3 New Reporting Columns - Google Ads has rolled out three new report columns that make it easier to understand how conversions impact campaign performance. Before this update, you would have had to apply custom segments to see each conversion action and the categories they fall under. Now the data is more readily available.
The new columns are:

Results: shows the number of conversions you've received across your primary conversion actions for each of the standard goals in your account. In addition, you'll also see the impact this campaign is driving against goals it's not optimizing towards.
Results value: shows the calculated conversion values you've received across your primary conversion actions for each of the standard goals in your account.
Conversion goals: shows the goals listed in your campaign-level settings that drive performance.

14. Google/YouTube Audio Ads Now Available - In its announcement, Google cites data from Edison Research finding that YouTube is the second-most popular destination for listening to podcasts. Now all Google advertisers can reach podcast listeners with audio ads on YouTube. Audio ads are also served to people who listen to music on YouTube, though there's an option to limit targeting to podcast listeners if that's who you're primarily interested in reaching.
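Mueller's point about the trailing "/?" (item 7 above) is easy to verify with Python's URL parser: a bare "?" contributes nothing, so the two forms parse identically, while a single character after it produces a distinct URL (example.com is, of course, a placeholder):

```python
from urllib.parse import urlparse

# A trailing "?" with no query string parses to exactly the same
# components as the bare URL, so the two are equivalent.
a = urlparse("https://example.com/page?")
b = urlparse("https://example.com/page")
print(a == b)  # True

# One character after the "?" makes it a distinct URL.
c = urlparse("https://example.com/page?v")
print(c.query)  # v
```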

Tallest Tree Digital Podcast
Title Tags, "Link Juice," and the Death of AMP

Tallest Tree Digital Podcast

Play Episode Listen Later Oct 9, 2022 51:23


In this episode Cord & Einar discuss title tag length, "link juice," the death of AMP, measuring forms with GA4, how Googlebot handles iframes, and how to think about copies of original content on the web.

Sources cited:
- Search Engine Land: What should the title tag length be in 2023?
- SEO Roundtable: Google: Anything That Talks About Link Juice Should Be Ignored
- Lily Ray

Search Off the Record
Let's talk image SEO

Search Off the Record

Play Episode Listen Later Oct 6, 2022 33:37


Do you want to know how to increase traffic to your website with images? What about making your website more accessible for screen readers with alt text for an image? Our hosts John and Lizzi chat about optimizing for image search. They cover everything from naming your image file, how often Googlebot crawls images, increasing visibility of your page through image search, to improving alt text and image accessibility.  Resources: Episode transcript → https://goo.gle/sotr048-transcript  Google images best practices → https://goo.gle/3CAevUD  SEO for Google Images | Search Central Lightning Talks → https://goo.gle/3e76PQo  Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team. #SOTRpodcast
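The alt-text advice in this episode is easy to spot-check programmatically. Below is a minimal sketch (the class and function names are my own, not something from the episode) that flags `<img>` tags with a missing or empty alt attribute, using only Python's standard-library HTML parser:

```python
from html.parser import HTMLParser


class AltTextAuditor(HTMLParser):
    """Collects <img> tags that lack a non-empty alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attr_map = dict(attrs)
        # Flag images with no alt attribute, or an empty/whitespace one.
        if not (attr_map.get("alt") or "").strip():
            self.missing_alt.append(attr_map.get("src", "(no src)"))


def audit_alt_text(html: str):
    """Return the src of every <img> in the HTML that has no usable alt text."""
    auditor = AltTextAuditor()
    auditor.feed(html)
    return auditor.missing_alt
```

For example, `audit_alt_text('<img src="cat.jpg"><img src="dog.jpg" alt="A dog">')` flags only `cat.jpg`. A real audit would also check that the alt text is descriptive, which no parser can do for you.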

#TWIMshow - This Week in Marketing
[Ep124] - Google Shares Insights On Factors That Determine Which Content Is Indexed

#TWIMshow - This Week in Marketing

Play Episode Listen Later Sep 5, 2022 15:30


1. Quick Updates -
- Meta Provides New Insights Into How its Video Distribution Algorithms Work
- Meta Invites Applications for the Third Phase of its ‘Community Accelerator' Program
- Instagram Best Practices For Recommended Content
- Twitter Shares New Insight into the Value of Utilizing Ad Format Combinations in Your Tweet Marketing
- Google Updates Privacy Threshold For Analytics Search Queries Report
- Danny Sullivan at Google Tweets That “Helpful Content” Is WIP

2. Twitter ‘Circles' Option Is Available To All Users - Because sometimes your Tweets aren't for everyone, you can add up to 150 people to your Circle and share with just them. “Before you post on Twitter, you'll now see an option to share your Tweet with either your circle or your full followers list. Circles can contain up to 150 people, and you can adjust who's in and who's out at any time. Don't worry, no one will be notified of any changes you make to your circle.” Members of a Circle will be alerted that tweets are only viewable by those in the group via a green indicator attached to each Circle tweet.

3. Microsoft Ads Re-extends RSA Migration To Feb. 2023 - In April 2022, Microsoft extended the original June 30 deadline to August 29. Now Microsoft has announced that it is extending that deadline to February 1, 2023, in response to advertisers' need for more time. Expanded Text Ads (ETAs) will continue to serve alongside RSAs, but advertisers will no longer be able to create new or edit existing ETAs. You can read the announcement here.

4. Google Wants You To Add Product Information To Your Business Profiles! - Google has added a new section for products to the Google Business Profile guidelines. The section says "If you run a retail business, you can show nearby shoppers what you sell by adding your in-store products to your Business Profile at no charge." You can either add products to your Business Profile manually through the Product Editor or with Pointy - a free hardware device from Google that sits between your barcode scanner and point-of-sale system so it can add your products to Google. Products submitted via Product Editor or Pointy must adhere to the Shopping Ads Policy. Google does not allow content related to regulated products and services, including alcohol, tobacco products, gambling, financial services, pharmaceuticals and unapproved supplements, or health/medical devices. Submitting products that violate Google's policy may result in removal of the entire product catalog, including products that aren't in violation. You can read the guidelines over here.

5. 4 New Features In Google Shopping Campaigns - Google just announced four new features for advertisers to implement in their ad campaigns and merchant feeds.
- Conversion value rules for store sales and store visits - Advertisers can now set store visits or sales default values at the campaign level. Before this update, Google Ads applied conversion value rules equally to all conversion actions. In addition to setting specific conversion values for store visits and sales, you can select the values at the campaign level. If you're running multiple campaigns promoting store visits, you can assign a higher value to one than the other. Additionally, you can set rules for store visits or sales based on geographic location, audiences, or devices. The ability to adjust values by location or device means you can increase the value of store visits for customers in New York versus customers in other areas, for example. You can set conversion value rules by logging in to your Google Ads account and navigating to Measurement > Conversions > Value rules. Then, click create conversion value rule and fill in the required information.
- Product-specific insights - Product-specific insights are available at the account level and help advertisers spot underperforming offers, identify products with missing feed attributes, and compare bidding with your top competitors. Product insights work on Shopping and Performance Max campaigns and are intended to leverage ads performance data to optimize products and provide visibility on what actions to take to fix issues.
- Deals Content API - The Deals Content API is intended to make uploading and managing deals easier at scale. Merchants and advertisers can now add their sales and promotions to their listings via the Content API, which makes it even easier for merchants to upload and manage their deals at scale.
- Shipping & Returns Annotations - Merchants will now be able to list a dynamic expected delivery date (“Delivery by XX/YY”) and free returns right on their ads. Advertisers can also easily add their return policies.

6. Google Publishes 6 SEO Tips For E-commerce Websites - Alan Kent, a Developer Advocate at Google, shared six SEO tips that combine structured data and Merchant Center to get the most out of your website's presence in search results.
- Ensure Products Are Indexed - Googlebot can miss pages when crawling a site if they're not linked from other pages. On ecommerce sites, for example, some product pages are only reachable from on-site search results. You can ensure Google crawls all your product pages by utilizing tools such as an XML sitemap and Google Merchant Center. Creating a Merchant Center product feed will help Google discover all the products on your website. The product page URLs are shared with the Googlebot crawler to potentially use as starting points for crawls of additional pages.
- Check Accuracy Of Product Prices In Search Results - If Google incorrectly extracts pricing data from your product pages, it may list your original price in search results, not the discounted price. To accurately provide product information such as list price, discounts, and net price, it's recommended to add structured data to your product pages and provide Google Merchant Center with structured feeds of your product data. This will help Google extract the correct price from product pages.
- Minimize Price & Availability Lag - Google crawls webpages on your site according to its own schedule. That means Googlebot may not notice changes on your site until the next crawl. These delays can lead to search results lagging behind site changes, such as a product going out of stock. You should aim to minimize inconsistencies in pricing and availability data between your website and Google's understanding of your site due to timing lags. Google recommends utilizing Merchant Center product feeds to keep pages updated on a more consistent schedule.
- Ensure Products Are Eligible For Rich Product Results - Eligibility for rich product results requires the use of product structured data. To get the special rich product presentation format, Google recommends providing structured data on your product pages and a product feed in Merchant Center. This will help ensure that Google understands how to extract product data to display rich results. However, even with the correct structured data in place, rich results are displayed at Google's discretion.
- Share Local Product Inventory Data - Ensure your in-store products are found by people entering queries with the phrase “near me.” First, register your physical store location in your Google Business Profile, then provide a local inventory feed to Merchant Center. The local inventory feed includes product identifiers and store codes, so Google knows where your inventory is physically located. As an additional step, Google recommends using a tool called Pointy. Pointy is a device from Google that connects to your in-store point-of-sale system and automatically informs Google of inventory data from your physical store. The data is used to keep search results updated.
- Sign Up For The Google Shopping Tab - You may find your products are available in search results but do not appear in the Shopping tab. If you're unsure whether your products are surfacing in the Shopping tab, the easiest way to find out is to search for them. Structured data and product feeds alone aren't sufficient to be included in the Shopping tab. To be eligible, provide product data feeds via Merchant Center and opt in to ‘surfaces across Google.' For more on any of the above tips, see the full video from Google.

7. Google Shares Insights On Factors That Determine Which Content Is Indexed - Gary Illyes and Martin Splitt from Google recently published a podcast discussing what's known as a crawl budget and what influences Google to index content. Gary Illyes said that the concept of a crawl budget was created outside of Google by the search community, and most sites don't need to worry about it. According to Gary, part of the calculation for a crawl budget is based on practical considerations, like how many URLs the server allows Googlebot to crawl without being overloaded. Another interesting point was that crawling involves different trade-offs: there are limits to what can be stored, so it all boils down to the issue of “spending our resources where it matters.” Because Google can't index everything, it tries to index only the content that matters, factoring in how frequently it is updated. "Google can infer from a site overall which areas they might need to crawl more frequently. E.g. if there's a blog subdirectory & there are signals that it's popular/important, then Google might want to crawl there more." "And it's not just update frequency, it's also about quality. E.g. if G sees a certain pattern is popular (folder), & people are talking about it & linking to it, then it's a signal that ppl like that directory." Listen to the podcast here.
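The rich-result tip in this episode hinges on product structured data. As a rough illustration (the helper and its field choice are mine; see Google's documentation for the full set of required and recommended properties), a minimal schema.org Product JSON-LD block could be generated like this:

```python
import json


def product_jsonld(name, sku, price, currency,
                   availability="https://schema.org/InStock"):
    """Build a minimal schema.org Product JSON-LD snippet of the kind
    Google's rich-result guidance describes. Field selection here is
    illustrative, not exhaustive."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "sku": sku,
        "offers": {
            "@type": "Offer",
            "price": str(price),        # schema.org expects price as a string
            "priceCurrency": currency,
            "availability": availability,
        },
    }
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")
```

The resulting `<script>` block goes in the product page's HTML; pairing it with a Merchant Center feed, as the episode suggests, covers both discovery and presentation.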

Search Off the Record
Should I worry about crawl budget?

Search Off the Record

Play Episode Listen Later Aug 25, 2022 29:04


There is a lot of confusion around crawl budget so Martin, Lizzi, and Gary from Search Relations are here to bring clarity for everyone unsure about it. Are you worried about your website not being crawled quickly? Not sure what consumes and wastes your crawl budget? Learn when you should and shouldn't be worried about your crawl budget. This episode covers misconceptions and frequently asked questions concerning crawl budget.   Resources:  What Crawl Budget Means for Googlebot → https://goo.gle/3pIwXTM   Episode transcript → https://goo.gle/sotr045-transcript    Search Off the Record is a podcast series that takes you behind the scenes of Google Search with the Search Relations team.   #SOTRpodcast

NotiPod Hoy
Podcast and audio training takes hold at universities

NotiPod Hoy

Play Episode Listen Later Jun 28, 2022 4:30


The essentials you need to know:
- A “Master's in podcasting and digital audio” launches in Spain.
- Facebook, Twitter, or Discord: which is the best platform for building a community around a podcast?
- RTVE's ‘Cámara abierta' dedicated a program to sound, talking with the music journalist and with Mar Abad.
- How to add audio narration to Substack posts?
- New book in Spanish: “Deconstruyendo los medios” by Pepe Cerezo, director of Evoca Media.
- Googlebot only crawls and indexes the first 15 MB of HTML content.

Join our new Twitter group ‘Hablemos de los pódcast', a community where podcasters share trends, news, and experiences to help grow a podcast. Connect here!

Recommended podcast: Brazalete Negro, a show about the dark side of football: stories of losers on the pitch and in life, cursed teams, vanished players, and tragic stadiums. Produced by Panenka Podcast and Radio Primavera Sound.

#TWIMshow - This Week in Marketing
[Ep114] - Updates From Google SEO Day

#TWIMshow - This Week in Marketing

Play Episode Listen Later Jun 27, 2022 17:59


1. Shopify Announces Audiences During Their Semi-Annual Showcase - Shopify Editions is Shopify's new semi-annual showcase demonstrating the speed and breadth of innovation at Shopify to bring merchants into the future. Per Shopify, they are powering a radically different model of commerce: Connect to Consumer (C2C). To help merchants embrace C2C, they unveiled an entirely new B2B offering, launched Tokengated Commerce, partnered with Google to help consumers shop local, brought commerce to Twitter, and launched Tap to Pay on iPhone - just to name a few. Shopify Audiences, a new marketing tool, helps Shopify Plus merchants find new customers. Using Shopify's unique view of purchase intent and its merchant network, it identifies buyers who are looking for your products. You can then create and export high-interest audiences to ad platforms like Facebook. Other notable things announced at Shopify Editions:

a.) B2B on Shopify makes it seamless for Shopify Plus merchants to sell to other businesses on the same platform that they use for D2C. Say goodbye to spreadsheets, one-off invoices, and manual data entry. Not to mention, they're also partnering with ERP providers such as NetSuite, Brightpearl, and Acumatica to automatically integrate merchants' essential data with their B2B solution.

b.) With Shopify's offline GMV growing nearly 80% year over year in Q1 2022, they figured out that in-person shopping is coming back in a big way - and they want a piece of the pie. That's why they're one of the first commerce platforms to launch Tap to Pay on iPhone in the U.S. in partnership with Stripe. Currently in early access with select Shopify point-of-sale merchants, Tap to Pay on iPhone will be fully available in the U.S. in the coming months. With Tap to Pay on iPhone, they're lowering the barrier to entry for Shopify merchants to expand into offline retail for the first time without needing extra hardware or investment. Per Shopify, “Think farmers markets, new pop-up experiences, test shops for established merchants to expand to new physical locations. Enabling Tap to Pay on iPhone will broaden the possibilities in offline commerce, giving merchants who haven't yet tried in-person selling an easy way to connect with customers IRL.”

c.) Local Inventory on Google - turning browsers into local buyers. Nearly all U.S. consumers have searched for local businesses online, with more than a third searching multiple times every week. Shopify is about to improve those searches in a big way with local inventory sync on Google, available through Shopify's Google channel. Now, Shopify merchants can automatically let nearby customers know when a product is available in store. Shopping local has never been more convenient. Local inventory sync on Google is globally available via Shopify's Google channel to merchants using Shopify's point-of-sale app. You can find all the details here.

2. Twitter Launches New Shopify Integration - Social commerce continues to be a growing opportunity for merchants. Orders placed with Shopify merchants through partner integrations quadrupled YoY in the first quarter of 2022. Sometimes, a brand's biggest audience is the one they haven't tapped. Twitter's hundreds of millions of users represent potential connections for independent merchants, and that's why Shopify is the first commerce platform to partner with Twitter as it continues to scale its Twitter Shopping ecosystem. In fact, shopping-related Tweets saw 40 billion impressions over the last year alone. Using Shopify's new Twitter sales channel, merchants can reach consumers directly from their Twitter profiles, creating a frictionless path to purchase in today's digital town square. This enables Shopify merchants to list their products on their Twitter Professional Profiles, with each item, when tapped, redirecting users to the Shopify product page to make a purchase. Twitter's shopping features include Twitter Shops and Shop Spotlight, and, starting today, both are available for free to all Shopify U.S. merchants selling to U.S. consumers. “The Twitter sales channel makes it quicker and easier to meet our customers wherever they are,” said Jessica Stevens, Senior Social Media Manager at Trixie Cosmetics. “The automatic syncing is going to help us save so much time, and the sales channel allows me to easily connect the two platforms that we already tap into to sell products and engage with customers.”

3. Twitter Releases Twitter Write, Notes Feature - Finally available on some user profiles, Twitter's new "Notes" feature offers a quick, seamless way to include lengthy text pieces in your tweets. The Notes UI is quite simple and has all the standard features of a blog post composer, such as the ability to add header pictures, insert photos and links within the text, and a quick way to embed tweets. Once posted, Notes display as a Twitter card that directs users to the complete article. Note titles are limited to 100 characters, and the body of a Note can run up to 2,500 words, so you have much more room for longer-form content. Additionally, unlike tweets, Notes can be edited after they are published, and an "Edited" label will be appended at the top of the Note.

4. LinkedIn Updates Event Engagement Options & Simplifies ‘Repost' Process - In order to promote on-topic conversation and interaction within LinkedIn Events, LinkedIn is introducing new comment engagement options. You can now interact with participants before, during, and after a session using the Comments tab of a LinkedIn event, and reply to particular comments in-stream. With more than 24,000 events being added to the app each week as of this writing, the option is intended to take advantage of LinkedIn Events engagement. Furthermore, LinkedIn recently added LinkedIn Events to its simplified "Boost" ad option, giving users another method to promote their LinkedIn event listings. (P.S. Users are starting to see the enhanced Comments options in-stream as the option rolls out.) Additionally, LinkedIn is making it simpler to share updates by adding a new, streamlined "Repost" option to the "Share" menu. When you tap the "Share" prompt, a new "Repost" option will soon be available, allowing you to share without leaving a comment (currently your only option). By removing the requirement for users to add their own commentary to every re-share, this will make it simpler to share posts.

5. Will Google Crawl URLs In Structured Data? - Does Google crawl alternative kinds of links, like those found in structured data? Google's John Mueller answered: “So for the most part, when we look at HTML pages, if we see something that looks like a link, we might go off and kind of like try that URL out as well. That's something where if we find a URL in JavaScript, we can try to pick that up and try to use it. If we find a link in kind of a text file on a site, we can try to crawl that and use it.” Do not blindly assume that just because a link is in structured data it will or will not be indexed. He recommends that if you want Google to crawl that URL, make sure there's a natural HTML link to it, with clear anchor text that gives some information about the destination page. If you don't want Google to crawl that specific URL, then block it with robots.txt, or on that page use a rel=canonical pointing to your preferred version.

6. Google Crawls And Indexes The First 15MB - The Googlebot help page has been updated to reflect the fact that Googlebot will only crawl the first 15MB of a page before stopping. Therefore, if you want to make sure that Google ranks your website appropriately, make sure the first 15MB of the page can be crawled and indexed by Googlebot. Generally speaking, you should aim to keep your pages as lightweight as possible for both users and search engine crawlers; here, Google is being very upfront about how much of your website it will use. An excellent way to test this is the URL Inspection tool in Google Search Console, which shows what portions of the page Google renders and detects.

7. Google: Images Can Impact Your Rankings - You might not be aware, but the images on your website can affect how well it ranks in Google Search results. If the image dimensions on a web page are not specified, the page's content may shift around while the image loads. Google Search interprets this as a bad signal. The solution is as easy as including the width and height attributes in the image's HTML code.

8. Google: Keywords In Domain Names Are Overrated - Keywords in domain names are overrated, according to John Mueller of Google. He repeated what he and other Google employees have said countless times over the years, advising site owners to choose something for the long term and for their business instead.

9. Google: It's OK To Link Your WhatsApp Number - John Mueller from Google stated on Twitter that including a link to your WhatsApp number on your website does not hurt your SEO. Contrary to what some SEO toolkits claim, linking to a WhatsApp, phone, or fax number is acceptable, and Google does not evaluate your site differently depending on the type of number you link to.

10. Google: Do This For Product Variant Pages - When a product is offered in a variety of sizes or colors, ecommerce sites generally need to take URL structure into consideration. A product variant is any combination of a product's attributes, and Google supports a broad variety of URL configurations for variants. To help Google understand which variant is best to show in Search, choose one of the product variant URLs as the canonical URL for the product. If you decide to display many product variants on a single page (i.e., the variants all share the same URL), the following restrictions apply:
- The page may be ineligible for Product rich results in search results, because the experience is only supported for pages holding a single product (and product variants may be treated as distinct products by Google Search).
- Experiences such as Google Shopping cannot take a user to a specific variant of a product on your site, so the user must select the variant they wish to purchase on your site before checkout. This can lead to a poor user experience if the shopper already selected the variant they wanted in Google Shopping.
If you choose to use a distinct URL per variant, Google recommends using either:
- A path segment, such as /t-shirt/green
- A query parameter, such as /t-shirt?color=green

11. Google Changes Requirements For Local Service Ads - Google has increased the requirements for Local Services Ads, mandating five reviews - up from the prior requirement of just one review. According to Ginny Marvin of Google's AdsLiaison, "The procedure for placing an ad with Local Services Ads has not changed; rather, the quantity of reviews has increased from 1 to 5. More customer testimonials increase credibility and increase the likelihood of connecting with prospective customers." Industries that now need five reviews are:
- Real estate agents and brokers
- Personal injury lawyers
- HVAC companies
- Home cleaning companies
- Tree care/services companies
- Roofing

12. Updates From Google SEODay - John Mueller, a Google Search Advocate, discussed improvements to the search engine's scoring system and the effect of page experience on rankings at an online session at SEODay 2022. Among other changes made public, Google now bases desktop search results on a site's desktop experience and mobile search results on a site's mobile experience. The search engine calculates experience scores using three main metrics: Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). Google has also added a new page experience metric, Interaction to Next Paint (INP). Other things that came out of SEODay:
- Users can create custom reports and obtain a fresh perspective by combining analytics and Search Console data.
- Videos and images take on a more prominent role in Search. “We see that people love videos and authentic images in search results, so we try to show them more,” Mueller said.
- Google now offers a WordPress plugin for creating Web Stories - collections of pages that often include video.
- Authentic product reviews are factored into rankings. “People have high expectations of reviews they find online, so we've also worked specifically on updates to algorithms with regards to ranking these product reviews,” he said.
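The 15MB crawl cutoff discussed in this episode is easy to sanity-check for a given page. A minimal sketch (the helper below is my own illustration, not a Google tool) that measures an HTML payload against the documented limit:

```python
# Googlebot's documented per-file HTML fetch limit (first 15MB only).
FIFTEEN_MB = 15 * 1024 * 1024


def within_crawl_limit(html: str, limit: int = FIFTEEN_MB):
    """Return (byte_size, fits) for a page's HTML.

    Only the HTML file itself counts toward the limit; images, CSS,
    and scripts it references are fetched and measured separately.
    """
    size = len(html.encode("utf-8"))
    return size, size <= limit
```

Content past the cutoff never reaches indexing, so the practical takeaway is the same as the episode's: keep pages lightweight and put critical content early in the document.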

Webcology on WebmasterRadio.fm
Google Core Update Finished Rolling Out

Webcology on WebmasterRadio.fm

Play Episode Listen Later Jun 9, 2022 55:18


Google Core Update finished rolling out
https://www.seroundtable.com/google-may-2022-core-update-done-rolling-out-33562.html

EU publishers and webmasters can't block US traffic without also blocking Google and Googlebot
https://www.seroundtable.com/eu-publishers-block-us-users-google-33557.html

Don't add your company name to image alt text. It's redundant, undescriptive, and not helpful
https://www.seroundtable.com/google-company-name-in-blank-image-alt-text-33538.html

Yoast on IndexNow: “They changed how it worked a bit (or explained it better) where we only have to ping one endpoint. With that, we basically replace the xml sitemap ping with a call to IndexNow and it's done.”
https://www.searchenginejournal.com/yoast-on-why-its-adopting-indexnow-protocol/453516/

Support this podcast at — https://redcircle.com/webcology/donations
Advertising Inquiries: https://redcircle.com/brands
Privacy & Opt-Out: https://redcircle.com/privacy
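The Yoast/IndexNow item in this episode boils down to pinging a single endpoint per changed URL. As a sketch (the endpoint and the `url`/`key` parameter names follow the public IndexNow protocol as I understand it; the helper name is my own, and actually sending the request is left to the caller), the ping URL can be built like this:

```python
from urllib.parse import urlencode


def indexnow_ping_url(page_url: str, key: str,
                      endpoint: str = "https://api.indexnow.org/indexnow") -> str:
    """Build the GET URL that notifies an IndexNow endpoint that one page
    changed. `key` is the site-verification key that the protocol expects
    you to host as a text file on your own domain."""
    return endpoint + "?" + urlencode({"url": page_url, "key": key})
```

A plugin like Yoast's fires one such request per published or updated URL, which is what replaces the old XML-sitemap ping.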

SEO para Google
327: Basic SEO Dictionary

SEO para Google

Play Episode Listen Later Mar 2, 2022 32:33


Diccionario SEO básicoSEO: El SEO (Search Engine Optimization) son técnicas para salir en las mejores posiciones de ⚠️GOOGLE⚠️, multiplicar visitas y generarEl Algoritmo de Google es la forma que tiene el buscador de posicionar las páginas ante una búsqueda, es decir, es lo que decide si sales primero, segundo o en la segunda página.Los más famosos son Google Panda, Google Penguin, Google Medical Update...Directrices de Google, Centro de búsqueda de Google o Search Central: Las recomendaciones que indica Google en sus guías de uso de buenas prácticashttps://developers.google.com/searchJohnMueller: Jefazo de búsquedas de Google. La información interna de Google proviene de gente como John Mueller, analista de tendencias de los webmasters de Google. John Mueller dirige varias reuniones de la Central de Webmasters de Google donde responde a preguntas de webmasters y expertos en SEO.Escucha su podcast https://pod.link/1512522198Una página error 404 es la que se muestra al usuario cuando éste hace click en un enlace roto. En ella se le informa de que la página no existe y se le da la opción de volver hacia atrás o usar un buscador en las versiones predeterminadas. Es recomendable contar con una página 404 personalizada para tratar de retener al usuario en la web ofreciéndole enlaces a contenidos alternativos que le puedan interesar.Búsqueda por voz: Uso de dispositivos para hacer búsquedas con la voz. Siri, Asistente de Google, Alexa...Paginación: Las paginaciones sirven para separar los elementos de un catálogo o listado en varias páginas. Esto tiene varias ventajas, entre la que destaca el hecho de que se reduce el tiempo de carga de una URL, en especial cuando hay demasiados elementos que mostrar. Es importante configurarlas correctamente para evitar errores de rastreo e indexación.El tráfico orgánico de una página web son todas las visitas que recibe desde los buscadores de Internet, referencias en otras páginas, menciones en redes sociales, etc. 
De este tráfico quedan excluidas todas las visitas que llegan desde publicidad de pago ya que no derivan del posicionamiento natural de la URL en los resultados de búsqueda.Yandex es un motor de búsqueda orientado a las consultas realizadas en caracteres rusos. Por ello, es muy conocido con el nombre de "Google ruso" ya que, además, más del 50% de los móviles Android en Rusia pertenece a Yandex. Es especialmente interesante en cuanto a que ofrece mejores resultados en ruso que Google y tiene medidas muy duras contra el spam.Un PBN es una red privada de blogs con distinta IP y distinto dominio que se utiliza para crear enlaces entre ellos y mejorar así el posicionamiento. De esta forma, se intenta sortear a Google para hacer que estos blogs crezcan mucho más rápido que si se hiciera de forma orgánica. Sin embargo, esta estrategia se debe planificar con mucho cuidado para que se vea natural a ojos de Google.Google Trends es una herramienta gratuita de Google que muestra las tendencias de búsqueda de una keyword durante los últimos doce meses. También permite ver temas relacionados con esa palabra clave, comparar dos o más keywords y segmentar la información geográficamente por países y comunidades autónomas, en el caso de España.El Mobile First Index da relevancia a la versión móvil de una página web a la hora de mostrar los resultados de búsqueda. Como consecuencia, las páginas web que tienen una versión optimizada para móviles tienen más opciones para posicionar en los resultados de las búsquedas que se hacen desde estos dispositivos.Una granja de enlaces es una página web que sirve para crear links hacia otras páginas con el fin de mejorar el posicionamiento de éstas. Esto se traduce en un sitio web de poco contenido y sin valor, con un excesivo número de enlaces salientes hacia páginas de diversas temáticas que no tienen relación. 
Esta práctica está considerada Black Hat SEO y está penalizada por Google.El Pogo Sticking es la secuencia por la que un usuario hace click en una URL de los resultados de búsqueda, entra en esa web y pulsa el botón de retroceso para regresar a los resultados y escoger otro. Es un factor de relevancia para Google ya que interpretará que esa web no responde a la intención de búsqueda del usuario. Se puede usar como técnica black hat seo si se hace con algún programa.Las keywords LSI (Latent Semantic Indexing, indexación semántica latente) son términos que ayudan a comprender el contexto de la keyword principal a través de la semántica, es decir, que no se trata de sinónimos ni de palabras clave derivadas. Por ello, facilitan que Google comprenda mejor el contenido de una web, por lo que lo indexará correctamente para mostrarlo en los resultados de búsqueda.El keyword research es una investigación que se realiza para encontrar aquellos términos de búsqueda que son relevantes para posicionar un proyecto web a partir de una o más palabras clave. De esta forma, se localizan las keywords por las que queremos ser encontrados en los motores de búsqueda para atraer tráfico, ya sea de orgánico o de pago.Las migas de pan son un elemento que indica la estructura de enlaces que se ha seguido desde la portada hasta llegar a la URL actual, lo que permite retroceder de nuevo a un link concreto sin usar el botón Retroceso del navegador. Así, el usuario puede comprender la jerarquía de contenidos de la web y cómo llegar hasta los contenidos que le interesan, facilitándole la navegación.El interlinking es la práctica de crear links internos en una web, es decir, enlazar contenidos entre las URL de un mismo dominio. 
In this way, similar content is connected, the site architecture is built, authority is passed from a stronger page to a weaker one, and navigation is made easier for both users and search engine robots.
Grey Hat SEO: Actions used to accelerate the normal pace at which a site ranks without clearly breaking Google's guidelines, so there is no real risk of a penalty. One example is buying artificial links while taking precautions such as varying the anchor texts and leaving some time between links.
Index: To make your pages appear in Google.
Authority: The weight and credibility Google assigns to a website based on many parameters, but above all the links it receives from other sites. The more authority a site has, the better the links it gives and the more authority it passes on. A link from elpais.com is better than one from a local newspaper because it carries more authority.
Crawl budget: The time Google's robot allots to reviewing our website.
Crawling: The route a small software bot (a crawler) follows to read and analyze a site's code and content, hopping from page to page through the links it finds. In Google's case, Googlebot (Google's crawler) crawls and examines our sites and then adds them to its index.
Googlebot: Google's spider; the robot that visits our website.
Thin content: Low-value content.
Disavow links: The action of asking Google to ignore fraudulent, low-quality, or artificial inbound links that point to your site. It should be considered very carefully, since it can have serious consequences for rankings.
Deep linking: An internal linking strategy used to promote the website's own content.
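The crawling process described above — a bot discovering new pages by following the links on each page it reads — can be sketched in a few lines. This is a minimal illustration using only Python's standard library, not a description of how Googlebot actually works:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Relative links are resolved against the page's URL.
                    self.links.append(urljoin(self.base_url, value))

# A crawler repeats this step on every page it fetches, queueing each
# discovered link in turn (subject to robots.txt and its crawl budget).
html = '<a href="/blog">Blog</a> <a href="https://example.org/">Out</a>'
parser = LinkExtractor("https://example.com/")
parser.feed(html)
print(parser.links)  # ['https://example.com/blog', 'https://example.org/']
```

The follow-the-links loop around this extractor is what consumes the crawl budget mentioned above: the more links a bot must resolve and fetch, the less of its allotted time remains for the rest of the site.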
It consists of placing links to posts, videos, or images so that both users and search robots reach the desired content directly, without being sent back to the homepage.
Evergreen content: Content that never goes out of date.
SEO cluster: A way of presenting a group of related pieces of content built around the same search intent.
Subdomain: A way of having a related site attached, as an annex, to a main website. Subdomains take the form http://subdominio.dominio.com.
PageSpeed Insights: A page for analyzing your website's speed.
Anchor text: The clickable text of a link.
Backlinks: The inbound links that point from other pages to your own.
Black Hat SEO: Techniques not allowed by Google that are used to climb positions in Google.
Keyword density: The percentage of times a word (or series of words) appears in a text relative to the total word count.
Meta Robots tag: An HTML tag used to tell search engines to treat a URL in a particular way, for example "no-index, no-follow".
Negative SEO: A Black Hat technique that consists of getting a competitor penalized, for example by buying low-quality links pointing at their site.
Traffic: The visits a website receives.
Cloaking: A widely used Black Hat SEO technique that consists of showing different content depending on whether it is a user or a search engine robot reading it.
Keyword cannibalization: Occurs when several pages of the same website compete for the same keywords.
Duplicate content: Occurs when the same content appears at multiple URLs. In principle it is not grounds for a penalty, unless a high percentage of your site consists of duplicate content.
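The keyword-density calculation defined above is simple arithmetic. The following is a rough, hypothetical helper (assuming a plain whitespace split, case-insensitive matching, and a single-word keyword), not a tool any search engine actually uses:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)

sample = "SEO guide: learn SEO basics and advanced SEO tips"
print(round(keyword_density(sample, "SEO"), 1))  # 33.3  (3 matches out of 9 words)
```

A real analysis would also need to handle punctuation, multi-word phrases, and word stems, but the ratio itself — occurrences over total words, times 100 — is all the metric measures.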
CTR: The CTR (Click Through Rate) is the number of clicks a link receives relative to its number of impressions. It is always expressed as a percentage (for example, 5 clicks on 100 impressions is a 5% CTR) and is a metric normally used to measure the impact of a digital campaign.
Canonical tag: Introduced by Google, Yahoo! and Bing in 2009 to solve the problem of duplicate or similar content in SEO; it indicates the main page you want to have indexed.
Microformats: Microformats are a form