Get up to speed with the Digital Marketing News and Updates from the week of Feb 20-24, 2023.
1. LinkedIn Released Five New Features – According to a post from Keren Baruch, Director of Product at LinkedIn, here are the platform updates that will help your business:
- Going forward, you get to choose the content type your Activity section shows first.
- Last year, LinkedIn introduced the option to schedule posts. And now they’re extending that feature to your newsletters and articles as well.
- A one-click subscribe URL and an embeddable button make it easier for authors to share their newsletters and drive subscriptions.
- Customize the way your articles appear on search engines. Go to any article you’ve created and click the “Publishing menu” in the top left corner. From there, click “Settings” and you’ll be able to customize the SEO title and description that appear in searches. This lets you control how your content shows up in search results, making it more discoverable to people interested in what you have to say.
- When you search for newsletter authors, you’ll see their newsletter right in the search results under their name.
2. The GA4 Auto-migration Deadline – Google is sunsetting Universal Analytics (UA) on July 1, 2023. Starting in March, Google will automatically create a Google Analytics 4 (GA4) property with basic settings for any customer who has not yet set one up.
If you do not opt out of auto-migration by February 28, 2023, Google will transition your UA account to GA4 without any custom strategy. To ensure accurate tracking and analysis, you should make the switch to GA4 now and customize the setup as needed.
GA4 is much more than just a new “version” of Google Analytics. It’s a completely new platform, built from the ground up to collect, process, and report on data differently than before. Migrating to GA4 is a complex, involved process that requires strategic planning and expert implementation.
3. Automating Google Search Console Data Export – Due to limitations related to serving latency, storage, and processing resources, Search Console caps the amount of data that can be displayed or exported. The maximum you can export through the Search Console user interface is 1,000 rows of data. Currently, the upper limit for data exported through the Search Analytics API (and through the Looker Studio connector) is 50,000 rows per day per site per search type, though not every site will hit that ceiling.
Now Google has rolled out a feature that lets you automate a daily bulk export of your Search Console performance data to BigQuery. This includes all of your Search Console performance data but not the anonymized queries, and it is not subject to the daily row limit, so you can extract more data using this method.
According to Google, this lets you push your data to an external storage service where you can run complex queries and do deeper analysis in a more automated fashion.
Remember, BigQuery is Google’s fully managed, serverless data warehouse that enables scalable analysis over petabytes of data. It is a service that supports querying using ANSI SQL. It also has built-in machine learning capabilities.
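To make that concrete, here is a minimal Python sketch of the kind of query you could run against the exported data with the google-cloud-bigquery client. The project ID is a placeholder, and the dataset, table, and column names (searchconsole.searchdata_url_impression, data_date, url, clicks, impressions) reflect the default bulk-export setup as I understand it, so treat them as assumptions and adjust them to match your own export.

```python
# Rough sketch: top pages by clicks over the last 28 days from the
# Search Console bulk export in BigQuery. Table/column names assume
# the default export dataset "searchconsole"; adjust as needed.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="your-gcp-project-id")  # placeholder project ID

sql = """
    SELECT
      url,
      SUM(impressions) AS impressions,
      SUM(clicks)      AS clicks,
      SAFE_DIVIDE(SUM(clicks), SUM(impressions)) AS ctr
    FROM `your-gcp-project-id.searchconsole.searchdata_url_impression`
    WHERE data_date BETWEEN DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY) AND CURRENT_DATE()
    GROUP BY url
    ORDER BY clicks DESC
    LIMIT 100
"""

# Run the query and print one line per URL.
for row in client.query(sql).result():
    print(row.url, row.impressions, row.clicks, round(row.ctr or 0, 4))
```

Running this assumes you have installed the client library and authenticated with Google Cloud (for example via `gcloud auth application-default login`).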
Even though Google said this is likely more helpful for larger sites with larger datasets, I personally believe smaller sites will benefit from automating the export and retention of historical data.
P.S: If terms like data warehouse, BigQuery, and SQL are making you dizzy, then it’s time for you to seek the assistance of a reputable data-driven marketer.
4. Per Google, Embedding Reviews From Other Sites Does Not Help With Rankings – Google’s John Mueller was asked if it would help with web rankings in Google to embed reviews from sites like Facebook, Bing, Google, etc., on your website. John said no, not for web search, and he also warned us against using structured data on those third-party reviews.
Remember, back in 2016, John said the same thing. And in 2019, Google announced that “Self-serving reviews aren’t allowed for LocalBusiness and Organization.”
So this is not new information. It’s just that people tend to forget, or newbies simply don’t know. And bye-bye, review companies.
5. Google Responds If Core Web Vitals (CWV) Should Be A Priority For A Small Business – Brenda Malone asked Google’s John Mueller, “Since Google Search Console ONLY shows Core Web Vitals and Page Experience metrics for ‘sufficiently popular sites’ that have enough CrUX data, I am wondering. Do you think that small, less than 1K page, mom-and-pop websites should stress out about CWV since Google SAYS they only count CWV if there is CrUX data?”
Google’s John Mueller replied that for small and local businesses, “in most cases,” Core Web Vitals work should not be at the top of their list. He said this because the page experience ranking factor is not huge and, more importantly, for sites with very few pages Google usually doesn’t have CrUX data, in which case that data simply isn’t used. He then went on to say:
“Should you improve speed/CWV anyway? Maybe. Users do notice it, so if your site is super-slow, users might bail. Should it be your top priority? I don’t know – to me it would depend on what you want to achieve with the site (is it a “business-card”, or more interactive?) & how bad it currently is.
…it sometimes feels like folks focus too much on it. I get it – it’s something measurable in the SEO world where very little is measurable. Prioritizing is hard, but especially for smaller, local businesses, in most cases this shouldn’t be top of the list.”
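If you’re wondering whether Google actually has CrUX field data for your site (the prerequisite John describes), you can ask the Chrome UX Report API directly. Below is a rough Python sketch; the API key and origin are placeholders you would replace, and the metric names follow the CrUX API as I understand it.

```python
# Rough sketch: check whether CrUX field data exists for an origin.
# The API key is a placeholder; create one in the Google Cloud console
# and enable the Chrome UX Report API first.
import requests  # pip install requests

API_KEY = "YOUR_CRUX_API_KEY"        # placeholder
ORIGIN = "https://www.example.com"   # the site you want to check

resp = requests.post(
    f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}",
    json={
        "origin": ORIGIN,
        "metrics": [
            "largest_contentful_paint",
            "cumulative_layout_shift",
            "first_input_delay",
        ],
    },
    timeout=10,
)

if resp.status_code == 404:
    # No CrUX data: Google doesn't have enough real-user samples for this
    # origin, which is the "small site" situation John describes.
    print("No CrUX field data for this origin.")
else:
    resp.raise_for_status()
    metrics = resp.json()["record"]["metrics"]
    for name, data in metrics.items():
        p75 = data.get("percentiles", {}).get("p75")
        print(f"{name}: p75 = {p75}")
```

A 404 response is the signal that Google simply doesn’t have enough real-user samples for your origin, which is exactly where most small, mom-and-pop sites land.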
P.S: If the terms CrUX & CWV are new to you, then I recommend you seek the advice of a reputable search marketer, unless you are preparing for a career in search marketing.
6. Will Your Rankings Drop If You Have A Keyword-Rich Domain? – In the last episode (ep #148), I covered how Google’s John Mueller is not a fan of keyword-rich domains (or exact-match domains). Now a website owner asked John Mueller if he can attribute his drop in rankings (from position #1 to #100) to having an exact-match domain. By the way, the website owner asking this question owns www.cbd-uk.com and used to rank for “cbd uk.”
To this question, John Mueller replied that an existing site would not drop in ranking just because it uses an exact match domain. “Changes like that would be due to other reasons,” he added.
Furthermore, John advised the website owner (who used to earn a livelihood from selling CBD products) to check Search Console for any manual action. Otherwise, the drop is likely due to a recent update to Google’s algorithms that now views the site differently (for example, Google has recently launched updates for neutralizing spammy links, fighting webspam, and demoting unhelpful content).
This is great evidence of what I tell my clients: spend $$ on Google Ads even if you are ranking #1 in organic search. You never want to be at the mercy of Google’s algorithm.