chrislynch.link

Digital Marketing and SEO News

Would you pay to tweet?

I’m amazed this video isn’t being shared more widely…

Under the new rules that Twitter might roll out, paying users of Twitter will “outrank” non-paying users when it comes to visibility. The original purpose of the blue tick, which was to verify people with public identities were who they said they were, will be replaced by a two-tier system of paying and non-paying users. Elon Musk is right that, for a lot of Twitter users, $8 is what they pay for a latte – but will they pay that to use Twitter, or will they flee to an alternative platform they can use for free?

Uncomfortable Fact: “Your” community belonged to Twitter

(That’s Bill Gates, not Elon Musk but… you get the picture)

I’m seeing many people in my writing community express genuine concern that they will lose access to their community as a consequence of what is happening with Twitter. The sad truth is that if you think you have/had a community on Twitter… you’re wrong. It was never “your” community. It belonged to Twitter. The platform belonged to them, the data belonged to them, and they sold it. Now, the business model may be changing. It could be the greatest “bait and switch” in social media history so far – create a community, sell it, then get the users to buy it back on a never-ending rental model. Nice work if you can get it, and an object lesson: if the product is free, then YOU are the product.

Back when I wrote “The Truth about SEO” and was working as a consultant with businesses and brands who were building their online presence, I repeatedly warned about the dangers of building your business on someone else’s platform. If you don’t own your customer/community data then it means that someone else does.

Should you pay for “Twitter Blue”?

My guess is that if Twitter goes ahead with this the platform will be barely functional for anyone who wants to tweet unless they pay the $8 a month. You could write the tweet of all tweets, a magnum opus in 280 characters that would break the heart and steal the soul of all who read it, but nobody would read it because, without that blue tick, you’re getting buried under Crazy Joe’s Second-Hand Tyre and Fried Chicken Emporium. That may be less of a problem than it sounds, however.

50% of Twitter users post less than 5 times a month. When these “lurkers” do tweet, it’s most likely to be as a reply to someone else’s message rather than starting a new thread of their own. By contrast, it’s around 10% of Twitter users who create 80% of the content on the site.

These are the users that Elon Musk is targeting with his new Twitter; the low and mid-tier influencers who haven’t reached “blue tick” status yet but that have invested significantly in their online presence and don’t want to start again on a new platform. Combine that group with businesses who will see (or be advised) that getting a “blue tick” is a mandatory part of presenting their brand on Twitter and you have a decent size customer base for the new incarnation of “Twitter Blue”.

So, should you pay? If you’re a “lurker”, absolutely not. If you only use Twitter to keep in touch with a small group of friends and like-minded people? Probably not. If you’re using Twitter to build an online brand or promote yourself in any way though, a blue tick will soon be mandatory.

Why not Plan B: Pay the Creators?

Here’s a question… If 10% of users are creating the content that makes Twitter the place the other 90% want to visit, why would you risk losing them? Surely it makes more sense to double down on those users and ensure that they stay on the platform? Seems logical, but the reasons Musk may not be thinking in this way are twofold:

  1. He thinks Twitter is too big for celebrities and public figures to walk away from
  2. He thinks of the rest of humanity as “useful humanoid robots”

Take a look at the second clip below where Musk waits a painful amount of time for applause as he predicts a future of unlimited economic growth because of an unlimited number of robots (that somehow our fixed-size, fixed-resource planet will accommodate).

The truth that Musk has probably latched on to here is that most of the 10% of Twitter users who create content are replaceable. There are more “useful humans” still available to him, even if some of the existing creators decide to walk away.

How to protect yourself against the Twitterpocalypse

I’ve always been an advocate of posting content to multiple channels and of using your own website as your primary online presence. It’s a more difficult route because “walled gardens” like Twitter, Facebook, etc. don’t like you posting links that take users off their platform (if Twitter Blue means the algorithm no longer hates links, maybe I’ll be putting my $8/month down!) but it is ultimately more sustainable.

Having a presence on multiple platforms reduces the risk of losing access to a community and increases the number of people you are able to reach. You may think Mastodon is too much like hard work or that TikTok dances are too hard on the hips but coming to platforms late only makes it harder to develop your presence.

A Prediction: Twitter’s Future is Pay to Play

I predicted that Twitter might become a paid service some time ago, along with the risk posed by decentralised platforms such as Mastodon and Jack Dorsey’s “Bluesky”.

As a subscription service, Twitter could have a vibrant future but I suspect that this would be short-lived. We are seeing proof that social media platforms have a shelf life; each new generation of consumers tends to gravitate towards platforms that their parents don’t use. Facebook and Twitter are both suffering heavy attrition to TikTok, for example (a platform that, currently, also does not charge its users to post content). Unless it finds a way to attract new users, the benefits of being on Twitter will diminish. If those users who may feel compelled to stick with Twitter and pay to maintain their communities there also start to lose interest, or find that $8 isn’t worth it, Twitter could eventually start to feel more like a graveyard than “the world’s town square”.

Unlike Twitter’s blue check mark however, you normally only have to pay for your grave once.

Social Media: It’s about control…

I saved this image when I first saw it back in October. Since then, Elon Musk has bought Twitter, and in the days and weeks that have followed the acquisition, there has been a crash in advertising revenue, mass layoffs, and an exodus of users. Amongst those losing their jobs are content moderators who have been at the forefront of the battle to keep misinformation and hate speech off Twitter.

Advertisers are leaving Twitter in droves and it’s up to users if they want to follow suit. For some, the platform is already becoming too toxic. Others, particularly “influencers” will be wondering if their personal brand is damaged or enhanced by being present on Twitter.

The problem for many users is – where do you go if Twitter is no longer for you?

As the image above outlines, all of the social networks are owned by someone… Except one.

Enter Mastodon

A literal “elephant in the room”, Mastodon is a decentralized social network similar to Twitter that, by design, has no one owner. Anyone with enough technical skill can set up a Mastodon server. That server becomes part of the federated network of Mastodon servers, meaning users on any server can follow and see content from users on any other server.

The experience is not quite the same as Twitter – discovery is more difficult and although Mastodon registrations have increased dramatically since Elon Musk purchased Twitter, there are still only a fraction of the users on Mastodon that there are on Twitter (and they are spread across a large number of servers, so you need to hunt them down).
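That cross-server “hunting down” works because Mastodon resolves handles using the WebFinger protocol (RFC 7033): given a user@server handle, any client can ask the server’s well-known endpoint where that account lives. Here’s a minimal sketch of building the lookup URL; the handle and server names below are invented examples, not real accounts.

```python
# Sketch: building a WebFinger lookup URL for a Mastodon handle.
# Mastodon servers expose /.well-known/webfinger for account discovery;
# the handle used below is a hypothetical example.
from urllib.parse import quote

def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL for a handle like 'user@example.social'."""
    user, _, server = handle.partition("@")
    if not user or not server:
        raise ValueError("expected a handle in user@server form")
    resource = quote(f"acct:{user}@{server}", safe="")
    return f"https://{server}/.well-known/webfinger?resource={resource}"

print(webfinger_url("someone@mastodon.example"))
```

Fetching that URL returns a JSON document describing the account, which is how a user on one server can follow a user on another.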

If you are looking to connect and converse with other people with similar interests, Mastodon has a lot to offer. If you’re looking to reach a large number of people, you’re going to find the community small compared to Twitter (however, this does amplify your voice and mean you’re more likely to get interactions, so it’s not necessarily a bad thing).

The other thing to keep in mind with Mastodon is that it is moderated, just like Twitter is (or was), but moderation is server-specific. When you first register with Mastodon you need to pick a server and you will inherit the moderation rules of that server. If you truly want “free” social media, your only option is to run your own server, set your own rules, and then rely on federation (through the “Fediverse” of Mastodon servers) to spread your message across the network of Mastodon servers.

Setting up a Mastodon server is fairly complex, but something I think I may try… just so that if I bump into Elon and he mentions that he owns a social media platform I can say “oh yeah, I’ve got one of those as well”.

DMCA request kills Moz in Google Search index

A little while ago I wrote about how easy it was to launch a DMCA (Digital Millennium Copyright Act) complaint against a website and how writers and other creatives could use this to get websites that were stealing their content removed from the Google index. (And, if you’re not on Google, frankly you aren’t anywhere…)

It occurred to me at the time that DMCA requests could be used maliciously if Google weren’t paying proper attention. I didn’t include it in the article because it seemed a bit far-fetched that Google would just remove a site without properly investigating it. I was wrong.

Moz.com, one of the most popular websites about SEO and digital marketing, vanished from the Google index recently after a DMCA request that included Moz.com was actioned by Google. The error was quickly spotted by eagle-eyed SEO pundits and reported to Google, prompting a response from Google’s Danny Sullivan. Whilst Sullivan’s response is short and somewhat lukewarm, the replies to it make for interesting reading – Moz.com evidently isn’t the first site to be hit like this.

Google were quick out of the blocks to resolve the issue and moz.com was back in the index less than 12 hours later, a much quicker response than other site owners have reported getting.

Why should it matter to you?

This matters because if it can happen to a big, well-known site like Moz.com then it can happen to anyone. Moz.com have got clout in the SEO space and if its traffic had been significantly damaged by this, Google would have heard about it loud and clear from the SEO community. This was a highly visible error and it was corrected quickly. If other reports are to be believed, less prominent sites have a much harder time getting DMCA penalties reversed.

What this means is that any “bad actor” can target your site, bury its domain in a long list and send in a DMCA request that could get you kicked out of Google’s index. The process of your site reappearing in the index won’t even begin until you challenge the complaint, and you could be in for a long wait. This seems like a great place for Google to be working on cleaning up their index without human intervention but, as has already been established, Google is not great at working out who the original creators of online content are.

What can you do about it?

Short answer? Absolutely nothing. Just keep a close eye on the amount of organic traffic you are getting from Google and investigate thoroughly any significant dips.
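If you want to automate that “close eye”, even a crude baseline comparison is enough to flag a dip worth investigating. This is only an illustrative sketch – the seven-day window and 50% threshold are my own arbitrary choices, not official guidance, and the traffic figures are made up:

```python
# Minimal sketch: flag a sudden drop in daily organic sessions.
# The 7-day window and 50% threshold are illustrative assumptions.
from statistics import mean

def sudden_drop(daily_sessions: list, window: int = 7, threshold: float = 0.5) -> bool:
    """True if the mean of the last `window` days falls below
    `threshold` times the mean of the preceding days."""
    if len(daily_sessions) <= window:
        return False  # not enough history to compare against
    baseline = mean(daily_sessions[:-window])
    recent = mean(daily_sessions[-window:])
    return baseline > 0 and recent < threshold * baseline

# 30 steady days, then a collapse - the kind of dip worth investigating:
history = [1000] * 30 + [900, 300, 120, 100, 90, 80, 70]
print(sudden_drop(history))  # → True
```

Run something like this against an export of your daily organic sessions and you will at least find out about a de-indexing before your customers do.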


Original article: https://searchengineland.com/dmca-request-removes-moz-from-google-search-index-384943

Bigger boys made me do it: Why copying “leaders” is sometimes a bad idea

If you’ve worked for any serious amount of time in eCommerce or digital marketing, you’ll have come across a client who wants to copy functionality or design from another company. They probably also think it’s easy to do this.

(True story, I once had a customer push me aside to grab my laptop so he could “show me this copy and paste thing”. This numbskull legitimately thought that if he copied a button from Amazon and pasted it into a Word doc he sent me, I could then paste it into his site and the functionality would magically follow it.)

Anyway, setting aside The Wizard of Copy and Paste, there’s still a lot of “let’s look at what X are doing about Y”, especially when it comes to legislation and regulation. Not sure how best to interpret EU Cookie Law or GDPR? Well, why not just copy what Google’s doing? I mean, those guys must have it right… right?

Well, as I’ve had the pain of pointing out on many an occasion, Google doesn’t always play by the rules. Case in point – Google’s super confusing approach to cookies.

I’ve never liked Google’s approach to cookies, especially their semi-implied assumption that allowing their cookies on their website means I also want to allow them in other places using Google services. I may not mind Google knowing about me but I might have a problem with one of the sites I visit using that same permission and assuming that it’s safe to drop Google Analytics on me without additional consent.

Seems obvious, and anything over thirty nanoseconds spent checking the ICO’s very clear guidelines would show that this approach is wrong, but I’ve come across more than a few developers who think this is OK “because that’s what Google do”.

Wake-up call: big tech usually aren’t following the rules. In fact, most of the time, big tech companies (including Google) are why new rules are being made.

If you’re looking for evidence of this, look no further than the latest fine issued to Google by the EU and the subsequent changes to how Google handles notifications regarding cookies…

We’ve all been there – you land on a website and the button to “Accept All” cookies is nice and clear and big and bold. Meanwhile, the button to reject cookies is as well concealed as the plans to demolish Arthur Dent’s house in The Hitchhiker’s Guide to the Galaxy (by which I mean they were in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying “Beware of the Leopard”).

And that’s OK right? Because, you know, that’s what Google does… right?

Nope. After a bruising court battle and yet another large fine, Google have been told that their cookie notification message must make the option to reject cookies just as clear, obvious, and accessible as the option to accept them, and that their current labyrinthine implementation was in violation of current EU laws. Consequently, as announced in this blog post, Google are rolling out an innovative new approach to cookie notifications where it’s just as easy to say “no” as it is to say “yes”.

Here’s Google’s innovative new look. It has two buttons.

What it means for all of us

Hopefully this clear directive from the EU, and Google’s grudging capitulation to it, will drive forward change in the way cookie notifications are designed and herald the end of increasingly complex user interfaces that do everything they can to obfuscate the process and frustrate users into simply clicking “Accept All”.

What it means for your website

If you’re a website builder of any stripe, take this as a clear message that you can’t rely on simply emulating the actions of “bigger boys” instead of reading and understanding relevant legislation for yourself. More importantly, it is vital when looking at regulations that you try to absorb the “spirit” of the legislation/regulation as well as the letter of it.

Just because you can find a loophole, doesn’t mean you have to force yourself (or your client) through it.

Has any visitor to a website ever really thought “Wow, that’s clever!” when they are forced to navigate an annoyingly complex cookie notification? Or do they think, “Wow, that’s so annoying” or “Wow, that’s deliberately complicated, how stupid do they think I am?”

Frustrating your user is never a good idea. Trying to outsmart them is even worse.


Find out more: https://www.theverge.com/2022/4/21/23035289/google-reject-all-cookie-button-eu-privacy-data-laws

Confirmed: AI-Generated Content Is Against Google’s Guidelines

Throughout my career in digital marketing, there were a few rules that I always held to be immutable. One of them was this – If you can automate it, it’s probably SPAM. This rule steered me the right way on many occasions when tempting shortcuts were offered to me. I’d lived through the dark times of SEO, you see, times when Google was more easily fooled than it is today and SEO forums were awash with hacks, grifts, and ways to “trick” Google.

One of these tricks was “content spinning”: taking a copy of an existing article that was ranking well and replacing words and terms to create an article that was essentially the same but technically different. This used to fool Google. It doesn’t anymore.

Whatever happened to the good old days?

Like any trick, it was inevitable that Google would develop a means to detect content spinning and guard against it. How successful they were at this is possibly a topic for debate – after all, there are still issues with plagiarised content outranking the original – but fundamentally Google came out strongly against content spinning and the practice has all but died out following strong moves by Google in 2010 and 2011 against “content farms” and sites featuring predominantly duplicate content.

You’d think in 2022 we’d be a little wiser and know that shortcuts, tricks, and automated ways of altering sites to improve SEO are invariably engineered out by Google. Sadly, we are not. There’s a new snake oil business in town… and it’s called Artificial Intelligence.

The Rise of Skynet AI Content Generation

If you’re in the digital marketing, eCommerce, or web development space you’ve probably run across tools like Jasper (formerly Jarvis), Rytr, AI Writer… the list goes on and on. It’s actually not hard to get into the AI content game. Thanks to widely accessible models like GPT-3, it’s getting easier and easier to spin up your own AI toolset. Of course, some of the output is pure nightmare fuel but it’s more than good enough that major news agencies have been using AI to generate content.

Legit copywriters, the type who have lungs and have to eat and stuff, have been up in arms about AI content generation. After all, this kit is taking aim directly at their livelihoods, offering a cheaper and more convenient option to content-hungry websites and digital marketers. Like the Luddites of old, they’ve been ready to smash the looms – if only there were actually looms and not just a load of code floating around in the untouchable “cloud” somewhere. But, are they just the old guard refusing to make way for the new?


Is Human Copy Writing Dead? Better ask Google.

Personally, I’ve been pretty skeptical about AI content generation for a long time but that is changing. We are at a stage where AI already produces passable content and it is improving all the time. Eventually, there is little doubt that it will be able to replicate or replace human copywriting for basic topics using reference material. There is even evidence that AI can be used to generate fiction, given sufficient input data, although the output there is of a significantly… lower quality. Imagination and originality are, for now at least, out of the reach of artificial intelligence. Phew.

So, is that it? Is it game over for copywriters? Well… no. Riding over the horizon is a most unlikely savior.

Google.

The Man from Google, He Say “No”

Like the Lone Ranger, John Mueller has come riding over the horizon to save the day with an unambiguous declaration that AI-generated content is contrary to Google’s guidelines. This actually isn’t something new but SEOs, especially those who like to find ways to try and outsmart Google, often need it spelled out for them.

“For us these would, essentially, still fall into the category of automatically generated content which is something we’ve had in the Webmaster Guidelines since almost the beginning.

And people have been automatically generating content in lots of different ways. And for us, if you’re using machine learning tools to generate your content, it’s essentially the same as if you’re just shuffling words around, or looking up synonyms, or doing the translation tricks that people used to do. Those kind of things.

My suspicion is maybe the quality of content is a little bit better than the really old school tools, but for us it’s still automatically generated content, and that means for us it’s still against the Webmaster Guidelines. So we would consider that to be spam.”

John Mueller: Google Search Advocate

More importantly, people and businesses that are currently investing in SEO projects that are utilizing these tools need to take note – Google has a habit of not only fixing its algorithm so that it can’t be manipulated by factors that it doesn’t like but also of penalizing sites that have been using the techniques that Google are engineering against.

Google may not be able to detect AI-generated content now but it’s safe to bet that if they are currently using humans to detect it then they are also using the inputs and outputs from those humans to train their own AI. Like a pair of Rock’em Sock’em Robots, Google’s own AI is on a collision course with AI Generated Content. Personally, my money’s on Google.

It Doesn’t Matter Who is Right, It Matters Who Has the Deepest Pockets… and it’s Google

Back in 2015, Google announced that sites needed to be mobile-enabled in order to rank. Web developers went into meltdown, updating sites with a frenzy that hadn’t been seen since the Millennium Bug. And then… nothing happened. The Mobilepocalypse never occurred and it was many years until the mobile index overtook the desktop index as the primary driver for search engine results.

It didn’t matter – the point was that since 2015 the accepted wisdom has been to make sites mobile-enabled because this is “Google Best Practice”. For SEOs, web developers, and eCommerce consultants, what Google says… goes.

And if Google say AI Content Generation is gone… it’s gone. It just doesn’t know it yet.

(And Google are happy to take money from the likes of Jasper for Adwords in the meantime. Just sayin’)


Read More: https://www.searchenginejournal.com/google-says-ai-generated-content-is-against-guidelines/444916/

How Google’s Pirate Update can kill off sites stealing your content

Google has revealed that it now has a specific penalty that it applies to sites that receive repeated upheld DMCA (Digital Millennium Copyright Act) takedown requests. In other words, it has a special button it can press to kill off sites hosting pirated content.

According to Google, sites hit with the “pirate penalty” can see their traffic from Google drop by an average of 89%. Quite why the reduction isn’t 100% is a different question, but it’s good to see Google taking real action against websites hosting pirated and copied content. The Pirate update actually dates back to 2014, but this is the first time in a while Google has reported on its efficacy.

In a new document released Feb 2022, Google said “we have developed a ‘demotion signal’ for Google Search that causes sites for which we have received a large number of valid removal notices to appear much lower in search results.”

It’s a little vague what constitutes a “large number” but this new penalty is an important reminder not to take it lying down if your copyrighted content is being stolen and reused/shared on the web without your permission. (Especially as Google has a habit of ranking copied content above the original.)

You can find more information on how to file a DMCA Takedown request with Google here.

All (Good) Things Must Come To an End. Is Facebook dying?

Facebook isn’t cool any more and users are leaving in droves. Ad revenue is down and shares are plummeting just at the point when Marky Mark Zuckerberg needs every penny he can scrape together to build “the metaverse”; a virtual world in which people can work, play, and live out a fantasy existence where Facebook remains implausibly relevant for anyone under 25.

Incredibly, it wasn’t helping Donald Trump rise to power, being one of the significant factors in the success of the Vote Leave Brexit campaign, the Cambridge Analytica scandal, or being home to all your favourite anti-vax, swivel-headed loons that blew a hole in Facebook. It was short-form video and the rise of the teenage dance craze of TikTok.

Yes, not since line dancing, Lindy Hop or the Mashed Potato has copying dance moves been the cornerstone of such a mass movement. Facebook tried to catch up, launching its TikTok clone “Reels” on Instagram, but it’s never captured the zeitgeist like TikTok has.

The architects of “surveillance capitalism” at Facebook made two fatal missteps. They didn’t realise that people like to be watched even more than Facebook likes to watch them, and they worked too hard to keep what they had instead of thinking about where they needed to go to stay relevant. As Facebook battled governments, technology companies, and their own users over privacy concerns, TikTok captured a user base who were quite happy to dance around in their pants online… just as long as Grandma wasn’t watching.

Let’s eat, Grandma!

We’ve reached a stage where Facebook has perhaps simply existed too long.

When I joined Facebook I was in my 20s. Now I’m in my 40s and I’ve got kids. I don’t think Facebook is cool. My kids definitely don’t. And they sure as hell don’t want to be on the same social network as their old man.

No new users + Users leaving the platform + New Privacy rules = Trouble for Facebook.

This is, perhaps, the fate of all social media networks.

When I grow up, I want to be an influencer

Back when I joined Facebook, there was also no such thing as an “influencer”.

Celebrities endorsed products, not people who were only famous for already (somehow) being famous. Today, being “Instafamous” has replaced being really famous, and if you’ve got an audience online it’s incredibly easy to monetise it.

Building an audience on a new network is much easier than on an established one. Facebook, Twitter, Instagram, and YouTube are all crowded spaces. TikTok, however briefly, was virgin territory and a great opportunity for influencers to build their audience. We are into our second generation of online influencers now, savvy Internet natives who have grown up watching the generation before and who understand the game.

So, what happens next?

Facebook isn’t going to die tomorrow. A long lingering death beckons, a thousand cuts delivered by an army of regulators, competitors, and disengaged users.

Could someone buy it? Maybe, but you’re looking for a buyer with a huge amount of money and the appetite to take on a tarnished brand. After MySpace and Tumblr, I’d expect most businesses to be a little shy about the prospect of buying an ailing, but expensive, social network.

As much as I hate to say it, the metaverse might be Facebook’s only viable play. There are no formats left to move to – text, audio, video are all sewn up and easy to replicate. The metaverse is a new format, a new space that Facebook could control in a way it hasn’t been able to control the real world (no matter how much it has tried).

Did the web get too big for Google to Google?

In their article “Google Considers Reducing Webpage Crawl Rate“, Search Engine Journal reported that Google may soon visit websites to look for new and updated content a lot less frequently than it currently does. It’s an interesting article because none of the talking heads from Google really comes clean on just why they are considering this… but I think I might have the answer. I’ve definitely got a theory…

What Google said about Reducing Crawl Rate

“… what I mean is that computing, in general, is not really sustainable. And if you think of Bitcoin, for example, Bitcoin mining has real impact on the environment that you can actually measure, especially if the electricity is coming from coal plants or other less sustainable plants. We are carbon-free, since I don’t even know, 2007 or something, 2009, but it doesn’t mean that we can’t reduce even more our footprint on the environment. And crawling is one of those things that early on, we could chop off some low-hanging fruits.”

Gary Illyes – Google

There’s no doubt that Google crawl a lot of webpages only to find that nothing has happened. Nothing at all. Fundamentally it’s wasteful but they only know they’ve wasted their computing time and bandwidth after they’ve come to your website and found that the last time you updated it was in late 2017 with a “coming soon” post. Existing technologies like XML sitemaps and RSS feeds can address this problem though, easily providing Google with a way to check if a site has been updated without having to crawl every single page.

So… what’s the problem? Well, the major problem that Google are talking about is the sustainability of massive computing operations like theirs and their carbon footprint. But… Google are already carbon neutral (Gary Illyes’ statement that Google are carbon-free is actually incorrect; they went carbon neutral in 2007 and plan to be carbon-free by 2030).

With the vast amount of money Google has though, it could easily become a carbon-negative business. Changing what they do and how they do it seems like more work than they need to do when they could, alternatively, just plant a small forest or two. You can plant a tree for £1 at https://moretrees.eco/.

You have to ask… what gives? Why is crawling suddenly such a bad thing?

Theory: The Internet is Growing Faster than Google Can Cope With

I’m not convinced by Google’s “green-washing” of the reduction in crawl rate. I think the reasons for it are much simpler than Google are letting on. I think the Internet is getting too big for Google to handle.

Back in 2021, https://websitesetup.org/ estimated that the web was growing by 576,000 new websites per day. If Google want to continue to index all of the web, that means they need to index an extra 576,000 sites every single day. Over a year, that’s an increase of 210,240,000 sites. And that was 2021. The internet is expanding at an accelerating rate, with more and more people creating more and more content every single day.

In 2020, the amount of data on the internet was estimated to have hit 40 zettabytes (a zettabyte being roughly a trillion gigabytes). Compare this to an early estimate by Eric Schmidt, then CEO of Google, that the web held a measly 5 million terabytes or so of data and that, even then, Google had taken 7 years to index just 0.004% of it.
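Those growth figures are easy to sanity-check with a few lines of arithmetic:

```python
# Sanity-check the growth figures quoted above.
new_sites_per_day = 576_000
per_year = new_sites_per_day * 365
print(f"{per_year:,} new sites per year")  # → 210,240,000 new sites per year

# A zettabyte is roughly a trillion gigabytes, so 40 ZB expressed in GB:
zettabyte_in_gb = 10**12
print(f"{40 * zettabyte_in_gb:,} GB")  # → 40,000,000,000,000 GB
```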

To paraphrase Douglas Adams…

The internet is big. Really, really big.

me, by way of douglas adams

You have to wonder if it is physically (digitally?) possible for Google to index all of it. You also have to wonder if it’s economically viable.

There are more bytes of data than there are people on the planet

Google’s core business model remains showing adverts to people who click on them. We all use Google for free and, according to the now infamous study by SparkToro, an increasing percentage of searches on Google generate no click at all. No click = no revenue for Google. Although only roughly half the population of the planet uses the internet, even if every single one of them is generating revenue for Google by clicking on links, the expansion of the internet is massively outstripping the expansion of the human race.

Is there a tipping point at which Google simply won’t be able to index all, or even a significant percentage, of the internet at all? Will the number of sites needing to be indexed not only outstrip Google’s ability to index them (according to Eric Schmidt, it already did) but also outstrip the commercial imperative to do so?

Remember – every crawl costs money, every addition to the index costs money, every search costs money, and Google only gets paid when you click.

How can Google fix it?

There are already technologies out there, including the humble XML sitemap and RSS feeds, that allow websites to direct Google and other search engines to their new and updated content. Will Google issue another one of its famous pieces of “guidance” that websites will need to provide these, or some other form of feed, if they want to be indexed?

Some pundits more on the fringe of these matters have even questioned if, in the future, Google might charge website owners to index their site. This would effectively make the entire Google search index “pay to play” and every single entry you see on Google would be paid for and, therefore, an advert. Think that sounds ridiculous? Google already did this with Google Shopping, converting it from a free service to a paid service back in 2012, a decision they reversed in 2020 in response to the financial crisis caused by the Covid-19 pandemic.

Of course, Google could just follow the likes of Bing and start to support IndexNow… (but that’s a whole other story).

How does this affect your website and what should you do?

Google may already not be visiting your site as often as it used to. Updating your site on a regular basis has been SEO best practice for some time, but keeping a regular schedule of adding and updating content is likely to become more important as Google look to get crawling under control. If Google thinks your site is dormant, it will come and check it less frequently. That might not be a problem if your site is genuinely dormant, but it will be a problem for you when you do have new content and you’re waiting for Google to pick it up.

If you want to stay ahead of the curve on this one, you need to:

  1. Ensure your website has a functioning XML sitemap.
  2. Ensure your website has RSS feeds.
  3. Ensure your blog posts appear on your homepage.
  4. Blog frequently.
  5. Share your content regularly and repeatedly across social media channels.
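On point 1, a sitemap is just an XML file that follows the sitemaps.org schema, and generating one takes only a few lines. Here’s a minimal sketch using Python’s standard library; the URLs and dates are placeholder examples, and most CMSs (including WordPress, via plugins) will do this for you.

```python
# Minimal sketch: generate a sitemap using the standard sitemaps.org schema.
# The URLs and lastmod dates below are placeholder examples.
import xml.etree.ElementTree as ET

def build_sitemap(pages) -> str:
    """pages is a list of (url, lastmod-date) pairs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap_xml = build_sitemap([
    ("https://www.example.com/", "2022-11-01"),
    ("https://www.example.com/blog/latest-post", "2022-11-05"),
])
print(sitemap_xml)
```

Save the output as sitemap.xml at your site root (and reference it from robots.txt) so crawlers can find your new content without crawling every page.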

Google’s John Mueller gives… Jurassic Park SEO Advice?

The Jurassic Park Test was one of the first pieces of SEO advice I wrote. It was one of the first chapters I wrote for The Truth About SEO and one of the first videos I recorded for YouTube. Here I am, all the way back in 2019 waxing lyrical about dinosaurs and their relevance to SEO… (God, I looked younger back then!)


So, imagine my delight and surprise when I saw this article on SEO roundtable quoting Google’s own John Mueller quoting the inimitable Doctor Malcolm from Jurassic Park. Or, to put it another way, imagine my delight and surprise when I saw John Mueller from Google quoting The Truth About SEO quoting the inimitable Doctor Malcolm from Jurassic Park.

Thanks, John! If you need any more tips, let me know.

Google lights blue touch paper on Page Experience for desktop

The clock is now officially ticking for any website developers who’ve been saying “Page Experience only affects mobile” to clients, with Google setting their deadline for when it will start to roll Page Experience into desktop search as a ranking factor.

It’s coming, and it’s coming in February 2022.

Although Google are saying that this is not a “major update”, there are likely to be a large number of websites that see their rankings shift, and if you’ve got a client base who are switched on and monitoring their rankings then you can expect some emails and phone calls come February.

This is also big news for WordPress developers and users, as WordPress has been notoriously poor on Core Web Vitals and Page Experience, to the point that the core WordPress team has initiated a project to improve the performance of WordPress. (In my view, not a moment too soon – when you’re losing ground to the abomination that is Wix… you’ve got problems).

If you want to see how your site is performing, the Google PageSpeed Insights tool should tell you everything you need to know: https://developers.google.com/speed/pagespeed/insights/
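If you’d rather check programmatically (say, for a whole client portfolio), the same data is exposed through the PageSpeed Insights API (v5). Below is a sketch of building the request and reading the Lighthouse performance score; the response structure in parse_score reflects my understanding of the v5 format, so verify it against Google’s API documentation before relying on it.

```python
# Sketch: querying the PageSpeed Insights v5 API instead of the web tool.
# The response shape assumed in parse_score should be checked against
# Google's current API documentation.
from urllib.parse import urlencode

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def request_url(site: str, strategy: str = "desktop") -> str:
    """Build the API request URL for a site, for 'desktop' or 'mobile'."""
    return f"{API}?{urlencode({'url': site, 'strategy': strategy})}"

def parse_score(response: dict) -> float:
    """Extract the 0-100 Lighthouse performance score from a v5 response."""
    score = response["lighthouseResult"]["categories"]["performance"]["score"]
    return score * 100  # the API reports it as a 0-1 fraction

print(request_url("https://www.example.com"))
```

Fetch that URL (an API key is recommended for regular use) and feed the JSON to parse_score to track scores over time rather than checking pages one by one in the browser.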