Episode 164 covers the notable Digital Marketing News and Updates from the week of June 5 – 9, 2023. The show notes for this episode were generated using generative AI, and, as always, I curated the articles for the show.

1. Google’s Structured Data Validator vs Schema.org – During the June 2023 Google SEO Office Hours, Google’s Martin Splitt answered a question about structured data validation and why Google’s validator can show different results than the Schema.org validator.

Both Google and Schema.org offer tools for validating if structured data is correct. 

Google’s tool validates structured data and also offers feedback on whether the tested structured data qualifies for rich results in the search engine results pages. Rich results are enhanced search listings that make a listing stand out on the search results page. The Schema.org Schema Markup Validator checks whether the structured data is valid according to the official standards.

Per Splitt, “Schema.org is an open and vendor-independent entity that defines the data types and attributes for structured data. Google, as a vendor however, might have specific requirements for some attributes and types in order to use the structured data in product features, such as our rich results in Google Search. So while just leaving out some attributes or using some type of values for an attribute is fine with Schema.org, vendors such as Google and others might have more specific requirements in order to use the structured data you provide to actually enhance features and products.”

In conclusion, Google’s validator has a purpose that is different from just checking if the structured data is valid. It’s checking to see if the structured data that Google requires (for potentially showing a webpage in enhanced search results) is valid. The Schema.org validator is just checking for standards and has nothing to do with how Google uses structured data.
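To illustrate the distinction, here is a minimal (hypothetical) Product markup sketch. The Schema.org validator would accept it as valid structured data, but Google’s Rich Results Test would flag it, because Google’s Product rich results documentation (at the time of writing) expects additional properties such as offers, review, or aggregateRating:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "Valid per Schema.org, but missing the offers/review/aggregateRating that Google wants for rich results."
}
```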

You can watch the June SEO office hour here.

2. Google’s Latest Search Console Update Makes it Easier to Fix Video Indexing Issues –
Google has released an update to its Search Console, aimed at refining video indexing reports. This enhancement promises to offer you more precise problem descriptions and actionable solutions to help boost the visibility of your videos in Google Search.

Previously, users encountered a generic “Google could not identify the prominent video on the page” error. Now, Google has decided to provide more specific details to overcome this problem. Here’s what you need to know:

  1. Video outside the viewport: If your video isn’t fully visible when the page loads, you’ll need to reposition it. Make sure the entire video lies within the renderable area of the webpage.
  2. Video too small: If your video falls below the minimum size, you should increase it. The height must exceed 140px, and the width must exceed 140px and constitute at least one-third of the page’s width.
  3. Video too tall: If your video is taller than 1080px, it’s time to resize it. Decrease the height to less than 1080px to comply with Google’s new guidelines.
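Putting the three thresholds together, a compliant embed might look like the following sketch (the file path is hypothetical; the 640×360 dimensions are one example that satisfies the stated minimums and maximum, assuming the video sits within the viewport on load):

```html
<!-- Height over 140px but under 1080px; width over 140px
     and at least one-third of the page width. -->
<video src="/media/demo.mp4" controls
       width="640" height="360"
       style="max-width: 100%;">
</video>
```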

While you might still see some old error messages for the next three months, Google plans to phase these out, replacing them with these new, more detailed notifications.

By adhering to these updates, you can maximize your video’s prominence on Google Search and enhance user engagement. Happy optimizing!

3. Navigating the World of Domains: A Google Insider’s Advice –  Let’s delve into the world of domain names and how they can impact your business’s digital reach, guided by insights from Google Search Advocate, John Mueller.

Mueller recently clarified the differences between generic top-level domains (gTLDs) and country code top-level domains (ccTLDs), following Google’s decision to reclassify .ai domains as gTLDs, breaking away from their previous association with Anguilla.

In essence, gTLDs (such as .com, .store, .net) are not tied to a specific geographical location, unlike ccTLDs (like .nl for the Netherlands, .fr for France, .de for Germany) that are country-specific. Mueller pointed out that if your business is primarily targeting customers within a certain country, a ccTLD might be the way to go. On the other hand, if you’re aiming for a global customer base, a gTLD could be the better option.

Importantly, Mueller also highlighted the need to consider user perception. He posed a question to consider: will users click on a link they believe is meant for another country’s audience?

Furthermore, Mueller also cautioned against using TLDs that may appear spammy, as it can harm your site’s credibility.

His advice underscores the importance of strategic decision-making when registering your domain, reminding us that the choice of a domain name is not just a technical one, but a business decision that can have a significant impact on your online presence.

4. Google’s Verdict on the Impact of Security Headers on Search Rankings – In your quest for a secure website, you may have come across HTTP headers – bits of data that offer valuable metadata about a webpage to browsers or web crawlers. They travel alongside the most familiar HTTP response signals, such as the infamous 404 error or the 301 redirect.

A subset of these headers, known as security headers, play a critical role in fortifying your site against malicious attacks. For instance, the HSTS (HTTP Strict Transport Security) header mandates that a webpage be accessed only via HTTPS, not HTTP, and ensures the browser remembers this preference for the future.

While a 301 redirect can guide browsers from HTTP to HTTPS, it leaves your site exposed to potential ‘man-in-the-middle’ attacks. An HSTS header, on the other hand, ensures your browser requests the HTTPS version directly, effectively bolstering site security.
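For reference, an HSTS response header is a single line. The max-age value below is a common choice (one year, in seconds), and the includeSubDomains directive is optional:

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains
```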

A question was recently posed to Google’s John Mueller about whether integrating security headers, like HSTS, could influence website ranking. Mueller’s response was clear: the HSTS header does not impact Google Search. This header’s purpose is to guide users to the HTTPS version of a site. As for deciding which version of a page to crawl and index, Google uses a process known as canonicalization, which doesn’t rely on headers like HSTS.

So, while security headers might not boost your site’s search ranking, their importance in maintaining a secure browsing experience for your users cannot be overstated. Remember, a secure website is a trusted website, and trust forms the foundation of any successful online presence.

5. Debunking ‘Index Bloat’: Google’s Take on Effective Web Page Indexing – In a recent episode of Google’s ‘Search Off The Record’ podcast, the Search Relations team at Google tackled the topic of web page indexing, putting a spotlight on the much-discussed theory within the SEO community: “Index Bloat.”

This theory, often cause for concern, refers to a situation where search engines index pages that aren’t beneficial for search results. It includes pages like filtered product pages, printer-friendly versions, internal search results, and more. Advocates of the index bloat theory argue that such pages can confuse search engines and negatively impact search rankings. They link this issue to the concept of a crawl budget, which is the number of URLs a search bot will crawl during each visit. The theory proposes that index bloat can lead to an inefficient use of this crawl budget, with search bots wasting time and resources gathering unneeded data.

However, Google’s John Mueller challenged this theory, stating there is no known concept of index bloat at Google. According to Mueller, Google doesn’t set an arbitrary limit on the number of indexed pages per site. His advice to webmasters is not to worry about excluding pages from Google’s index, but instead, focus on creating and publishing useful content.

While some supporters of the index bloat theory have pointed to issues like accidental page duplication, incorrect robots.txt files, and poor or thin content as causes, Google asserts that these are not signs of a non-existent “index bloat,” but simply general SEO practices that require attention.

Some have suggested using tools like Google Search Console to detect index bloat by comparing the actual number of indexed pages to what’s expected. Google’s stance implies this comparison doesn’t indicate a problem in itself; it’s simply part of routine website management and monitoring.

Google’s official stance dismisses the idea of index bloat. Instead, the emphasis should be on ensuring the pages submitted for indexing are valuable and relevant, thereby enhancing the overall user experience.

6. Controlling Googlebot: Decoding Google’s Search Relations Podcast Insights – In the latest episode of the ‘Search Off The Record’ podcast, Google’s Search Relations team, John Mueller and Gary Illyes, delved into two key topics: blocking Googlebot from crawling certain parts of a webpage and preventing Googlebot from accessing a website completely.

When asked how to stop Googlebot from crawling specific sections of a webpage, such as the “also bought” areas on product pages, Mueller emphasized that there’s no direct method to achieve this. “It’s impossible to block crawling of a specific section on an HTML page,” he clarified.

However, Mueller did propose two strategies, albeit not perfect ones, to navigate this issue. One involves utilizing the data-nosnippet HTML attribute to stop text from being displayed in a search snippet. The other strategy involves using an iframe or JavaScript with the source blocked by robots.txt. But be wary, as Mueller cautioned against this approach, stating it could lead to crawling and indexing issues that are difficult to diagnose and solve.
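The data-nosnippet approach Mueller mentions might look like this sketch (the product link is hypothetical). Note that it only keeps the marked block out of search snippets; it does not stop Googlebot from crawling or indexing that content:

```html
<p>Customers also bought:</p>
<div data-nosnippet>
  <ul>
    <li><a href="/products/widget-2">Widget 2</a></li>
  </ul>
</div>
```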

Mueller also reassured listeners that if the same content appears across multiple pages, it’s not a cause for concern. “There’s no need to block Googlebot from seeing that kind of duplication,” he added.

Addressing the question of how to prevent Googlebot from accessing an entire site, Illyes provided a straightforward solution. Simply add a disallow rule for the Googlebot user agent in your robots.txt file, and Googlebot will respect this and avoid your site. For those wanting to completely block network access, Illyes suggested creating firewall rules that deny Google’s IP ranges.
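The disallow rule Illyes describes is a two-line entry in robots.txt at the site root:

```
User-agent: Googlebot
Disallow: /
```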

To sum up, while it’s impossible to stop Googlebot from accessing specific HTML page sections, methods like the data-nosnippet attribute can offer some control. To block Googlebot from your site altogether, a simple disallow rule in your robots.txt file should suffice, though you can take further steps like setting up specific firewall rules for a more stringent blockade.

7. Sweeping Changes to Google Ads Trademark Policy: What You Need to Know –  Google Ads is making significant changes to its Trademark Policy that could impact how your advertisements are run. Starting July 24, Google will only entertain trademark complaints that are filed against specific advertisers and their ads. This is a shift away from the current policy, where complaints can lead to industry-wide restrictions on using trademarked content.

This change is a response to feedback from advertisers who found the previous system frustrating due to over-flagging and broad blocks. The new policy aims to streamline resolutions, making them quicker and more straightforward. In addition, it will provide greater clarity and transparency for advertisers, a much-needed improvement many have been advocating for.

As explained by a Google spokesperson, “We are updating our Trademark Policy to focus solely on complaints against specific advertisers in order to simplify and speed up resolution times, as opposed to industry-wide blocks that were prone to over-flagging. We believe this update best protects our partners with legitimate complaints while still giving consumers the ability to discover information about new products or services.”

Do note that any trademark restrictions implemented before July 24 under the current policy will continue to apply. However, Google plans to phase out these limitations for most advertisers gradually over the next 12-18 months.

You can learn more about these changes by visiting the Google Ads Trademarks policy page here.

8. Double Menus, Double Fun: SEO Unaffected by Multiple Navigations – In a recent SEO office hours video, Google’s Gary Illyes made it clear that the presence of multiple navigation menus on your website doesn’t affect your SEO performance – be it positively or negatively.

The question arose during the video discussion, asking whether having two navigation menus – a main one featuring important site categories and a secondary one focusing on brand-related extensions – could potentially harm SEO performance.

Illyes’ response was reassuring. He stated that it’s highly unlikely that multiple navigation menus would have any impact on your website’s SEO. In other words, whether you have one, two, or even more navigation menus on your page, Google’s algorithms are sophisticated enough to recognize these elements and process them accordingly.

So, rest easy and design your website to best serve your audience. Remember, whether your navigation is on the top, left, or bottom of your page, Google’s got it figured out!

9. Google’s Eye on XML Sitemap Changes: Resource Efficiency in Action – Google’s own Gary Illyes recently reaffirmed that the tech giant checks XML sitemaps for updates before reprocessing them. This practice is rooted in the desire to conserve valuable computational resources by avoiding unnecessary reprocessing of unchanged files.

When asked whether Google compares current and previous versions of XML sitemaps, Illyes’s response was a resounding yes. He explained that Google refrains from reprocessing sitemaps that have remained the same since their last crawl – a measure designed to prevent wastage of computing resources.

However, any modifications in your sitemap, whether to a URL entry or its lastmod date, will trigger a new round of parsing and generally initiate reprocessing. Illyes pointed out that this doesn’t automatically guarantee that the altered URLs will be crawled, as they must still pass through the usual quality evaluations like any other URL.
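As a reference point, the two fields that signal a change are the loc and lastmod elements of a sitemap entry (the URL below is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/updated-page</loc>
    <lastmod>2023-06-09</lastmod>
  </url>
</urlset>
```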

Importantly, if a URL is deleted from the sitemap because it no longer exists, it doesn’t imply that it will instantly be removed from the index or prioritized for crawling to expedite its deletion. Keep this in mind when making changes to your sitemap.

10. Boost Your Search Rankings: Google’s Advice on Consolidating Pages – In a recent SEO office hours video, Google’s Gary Illyes brought up a valuable point about web page consolidation. He discussed ‘host groups’, a term used when Google displays two results from the same domain in search results, with one listed below the other.

Illyes suggested that when your website forms a host group, it indicates that you have multiple pages capable of ranking well for a particular query. In such cases, he recommended considering the consolidation of these pages, if feasible.

This advice aligns with Google’s host groups documentation, which recommends setting one of these pages as the ‘canonical’ if you’d prefer users to land on that page over the other.
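Declaring a canonical is a one-line addition to the head of the non-preferred page, pointing at the page you want users to land on (the URL below is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/preferred-page" />
```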

The concept of a host group comes into play when two or more consecutive text results from the same site rank for the same query and hence, get grouped together.

The rationale behind Google’s recommendation for consolidation could be understood as an attempt to prevent your pages from competing against each other. When two pages vie for the same ranking, consolidating them could potentially boost the ranking of the remaining page.

From an SEO perspective, having two listings could increase your click-through rate. However, the idea of consolidation is to create a more streamlined user experience and possibly enhance your page’s ranking.

Keep in mind that this is an approach to consider and may not suit every situation. Always consider your unique context and audience needs when making SEO decisions.

11. Unlocking Video Thumbnails in Google Search: Key Insights Revealed –  Recent changes to Google’s approach to video thumbnails in search results have prompted many queries. These alterations ensure that video thumbnails are displayed only when the video constitutes the main content on a webpage.

This doesn’t imply that the video must be the first element on your page. Instead, as Google’s Gary Illyes explains, the video should be immediately noticeable — it should be “in their face right away.” This user-centric approach enhances the user experience, eliminating the need for them to hunt for the video on the page.

Illyes encourages web developers and SEO experts to consider the user’s perspective. When visitors land on your page, they should not have to actively search for the video. It should be prominently displayed, akin to the approach of popular video platforms like Vimeo and YouTube.

Remember, the aim of these changes is to reduce confusion and streamline the user experience by ensuring that videos are easy to find and view. Take inspiration from major video sites to better understand what Google’s algorithms are seeking.

12. Enhanced Conversion Tracking with Microsoft Advertising’s New Cross-Device Attribution Model –  Microsoft Advertising is set to enhance its tracking capabilities with the introduction of a Cross-Device attribution model. Revealed in Microsoft’s latest product update roundup in June, this model promises to provide more accurate insights into customer conversion journeys across multiple devices and sessions.

With this new feature, if a customer clicks an ad on their laptop and later completes a purchase on their phone, Microsoft Advertising will attribute the conversion to the original ad click on the laptop. This development will ensure that your marketing efforts are accurately credited, regardless of the device where the conversion ultimately occurs.

As a result of this new tracking model, marketers may notice a slight uptick in the number of conversions reported in their performance metrics. If you observe an increase in conversions, the new Cross-Device attribution model could be the driving factor. Keep an eye on your reports to understand the full impact of this latest update on your performance data.

13. New Verification Mandates for Microsoft Ads: Everything You Need to Know –  Starting August 1st, Microsoft Advertising will be implementing a new policy to enhance transparency and security. Only ads from verified advertisers will be displayed on the platform. If you haven’t yet met the Microsoft Ads verification requirements, it’s crucial to complete them before August 1st to ensure your ads continue to run smoothly.

The Microsoft Ads Advertiser Identity Verification program, which was launched in June 2022, is rolling out the following important dates:

  • As of July 1st, all new advertisers must be verified before their ads can go live.
  • If you haven’t received an email from Microsoft about account verification by July 15th, you should reach out to Microsoft support.
  • Starting August 1st, Microsoft Advertising will exclusively display ads from verified advertisers.

Once verified, all ads will showcase:

  • The name and location of the advertiser.
  • The business or individual responsible for funding the ad.
  • Additional information explaining why a user is seeing a specific ad, including targeting parameters.

In addition to these updates, Microsoft Advertising is also launching a new feature – the Ad Library. This will enable all users to view ads shown on Bing that have gained any impressions in the European Union. Users will be able to search for ads in the Ad Library by using the advertiser’s name or by entering words included in the ad creative. The details of the advertiser will be displayed in the Ad Library.

Stay ahead of the game and get your account verified to enjoy uninterrupted ad delivery with Microsoft Advertising!

14. Unleashing New Opportunities: LinkedIn Introduces Direct Messaging for Company Pages – In a bid to foster more professional connections and interactions, LinkedIn is set to expand its messaging tools. The platform has now introduced a new feature that allows Company Pages to send and receive direct messages (DMs). This marks a major development as previously, one-to-one messaging was only available for individual LinkedIn members.

LinkedIn’s new feature, termed Pages Messaging, paves the way for members to directly contact brands. Conversations can cover a broad range of topics from products and services to business opportunities. To handle these two-way conversations, organizations will be equipped with a dedicated inbox, enabling them to manage and prioritize incoming inquiries that are most relevant to their business.

As a result of this feature, companies might see a significant increase in messages inquiring about opportunities. However, LinkedIn’s ‘focused inbox’ system, which segregates DMs based on priority and topic settings, can help manage the influx. In addition, companies have the option to disable the Message feature if they wish.

LinkedIn has been quietly testing this feature with a select group of users in the past month. Considering that over 63 million companies actively post on their LinkedIn Company Pages, this new feature could potentially revolutionize direct interactions and unearth fresh opportunities.

Furthermore, LinkedIn is exploring the integration of an AI assistant to aid in lead nurturing. This could be a significant asset, allowing users to research the person they are communicating with without the need to manually browse through their profile or posts.

While it might not be a ‘game-changer’, the new Company Page messaging feature, which is being rolled out from today, is certainly a noteworthy addition to consider in your LinkedIn marketing strategy.

15. Apple Amps Up Privacy: A Glimpse at iOS 17 and macOS Sonoma – In a continued commitment to user privacy, Apple has introduced fresh security enhancements in iOS 17 and macOS Sonoma, aimed at curbing intrusive web tracking. The new Link Tracking Protection feature is at the heart of this upgrade.

Activated by default in Mail, Messages, and Safari (while in Private Browsing mode), Link Tracking Protection zeroes in on tracking parameters in link URLs, which are often used to monitor user activity across different websites. The feature scrubs these identifiers, thereby thwarting advertisers’ and analytics firms’ attempts to bypass Safari’s intelligent tracking prevention functionalities.

Typically, these tracking parameters are attached to the end of a webpage’s URL, bypassing the need for third-party cookies. When a user clicks the modified URL, the tracking identifier is read, enabling the backend to create a user profile for personalized ad targeting.
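A hypothetical before-and-after sketch of what Link Tracking Protection does to a link (Apple has not published an exhaustive list of the parameters it strips, so the click_id parameter here is purely illustrative):

```text
Before: https://www.example.com/offer?id=42&click_id=a1b2c3d4
After:  https://www.example.com/offer?id=42
```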

Apple’s new feature disrupts this process by identifying and removing these tracking components from the URL, ensuring the user’s web page navigation remains as intended. This operation is quietly executed during browser navigation in Safari’s Private Browsing mode and when links are clicked within the Mail and Messages apps.

To strike a balance, Apple has also unveiled an alternate method for advertisers to gauge campaign effectiveness while preserving user privacy. Private Click Measurement, now accessible in Safari Private Browsing mode, enables the tracking of ad conversion metrics without disclosing individual user activity.

In conclusion, Apple’s latest efforts reflect a renewed commitment to user privacy, promising to make online experiences safer and more secure across their operating systems.