Are you relying on AI-generated content to boost your website's ranking or planning to use it? If...
90.6% of Content Gets No Organic Traffic
Acquiring customers through organic search has changed significantly over the last several years. Google has rolled out multiple algorithm updates, and the way we search continues to evolve. We ask questions that are more specific and more complex, and our searches often contain more than one word. In fact, 94.7% of all queries asked on Google have a search volume of 10 or less per month.
Organic traffic remains the lifeblood of most websites and is the main source of conversion for many businesses. People may first learn about a brand or a product on social media, on TV, or even in a movie. But when they want to buy that product, they almost always search for it, and Google often becomes the last step in completing a conversion. That’s why it’s so important for your website to be discoverable and ranking high on Google.
Despite the fact that SEO is one of the most powerful tools in digital marketing, 90.6% of content gets no organic traffic. This statistic comes from an Ahrefs study of one billion pages.
Think about that for a second. Almost all content created by businesses and individuals receives no organic traffic from Google. In this blog post, we'll explore why it may be the case and what you can do to ensure your content is seen by your target audience.
Moz, an SEO software company, created the Mozlow hierarchy of SEO needs, a play on Maslow’s pyramid of human needs. It’s a great starting point for anyone who wants to understand the core principles of ranking without going into major technical detail.
Let’s look at the SEO factors whose importance is often overlooked, causing most websites not to rank on Google.
The very first thing is to make sure your website is accessible to search engines so they can index your content. It sounds so obvious, but a lot of websites don’t follow this very important first step. They impatiently wait for content to appear on the first page of Google, but it never will – Google simply can’t access their website.
If you are about to scroll further as you think this doesn’t apply to your website, don’t rush – we suggest you check if all of your content is open to a crawler.
How to check if Google can access your website?
- Google Search Console
- Use Google search and enter site:url (for example, site:abcd.com, with no spaces). The pages that show up in the results are the ones Google can “see.”
Keep in mind that indexing takes time – it usually takes Google several weeks to index a new website.
What to do if your website isn’t accessible to Google?
- Check the robots.txt file to see if there are any directives preventing Google bots from accessing your website
- If you're using a content management system, check to see if there is a setting that is preventing your website from being indexed
- If you're not sure why your website isn't accessible to a Google bot, you can submit a sitemap to Google through the Search Console. The company that developed your website will know how to do it.
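As a quick sanity check on the first point, you can parse your robots.txt with Python’s standard library and test whether Googlebot is allowed to fetch a given URL. This is a minimal sketch: the robots.txt content and the example.com URLs are hypothetical placeholders for your own site’s values.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- replace with your site's actual file,
# normally served at https://yourdomain.com/robots.txt
robots_txt = """
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may crawl the blog but not the /private/ section
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
```

Running this against your real robots.txt quickly reveals a stray `Disallow: /` left over from a staging environment, one of the most common reasons a site never appears in Google at all.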
You are creating content for two types of visitors: people and robots (a Google crawler bot).
Of course, you are producing content for your readers, your consumers, your followers. It goes without saying that it needs to be engaging and interesting. But for your audience to get to your website and discover all this amazing work that you do, you first need to pass the review of a Google bot.
The huge amounts of data that Google has, as well as recent advances in natural language processing, make it easier for Google to identify content that your readers are interested in and searching for.
Here are a few tips and recommendations on how to create content that both of your website “audiences” – people and robots – will find relevant, resulting in a high position in a SERP (Search Engine Results Page).
- Don’t rely on your own assumptions about what a reader is interested in
This is the number one mistake that companies make, and a key reason why 90.6% of websites get no traffic. We often think we know what people will find interesting and what they want, but our assumptions are too often far from reality.
Fortunately, there are many popular SEO discovery tools that reveal what people are actually searching for.
Identify several keywords or phrases that are relevant to your business, and you will receive a trove of data. Be creative, and experiment with related searches and topics. You are no doubt going to get a lot of inspiration and ideas. Don’t discard keywords and phrases that have low search volumes. As mentioned above, almost 95% of all things that people search on Google get asked 10 or fewer times per month. Creating compelling content is not always easy, but it is worth the effort. Read Intentful’s recommendations on how to produce content that matches customer intent.
- Use long-form text and make sure the context is clear
What is the ideal word count for successful SEO? There are many studies on the topic, with most experts agreeing that 800 to 2,000 words is an optimal length. If you are thinking, “Who is going to read all of that in an age of ever-shrinking attention spans?”, the reasoning behind this word count may surprise you.
It’s not just the word count that matters. Your content needs to be long enough for a robot to understand the context and be able to extract core information.
Also, long-form content doesn’t have to be a boring wall of text that no one will read. Work with your UI designer to format the page so that it is easy to scan and read.
While a Google bot is a "target audience" for the discovery and ranking of your content, you still write for humans. If they land on your website but find the information difficult to read, they won’t come back.
- Avoid duplicate content
Another huge issue that often gets overlooked is the unintentional use of duplicate content. It is very important to have only unique, original content. Unintentional duplication can happen when you publish text provided by a business partner (e.g., a product manufacturer supplies the description for a PDP, a Product Detail Page), or when a press release is sent to multiple news outlets and they all run the same text. If you use the same content as another website, or publish the same content on multiple pages of your own website, your content won’t rank high and might not rank at all.
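One way to catch unintentional duplication across your own pages is to fingerprint each page’s text and compare the hashes. This is a minimal sketch under that assumption; the page URLs and copy are hypothetical:

```python
import hashlib

def content_fingerprint(text: str) -> str:
    """Normalize whitespace and case before hashing, so trivially
    reformatted copies of the same text still match."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical pages -- e.g. a manufacturer-supplied PDP description
# reused verbatim on two of your own product pages
pages = {
    "/products/widget-a": "The best widget.  Durable and affordable.",
    "/products/widget-b": "The best widget. Durable and affordable.",
    "/blog/widget-review": "Our honest review of the widget.",
}

seen = {}
for url, text in pages.items():
    fingerprint = content_fingerprint(text)
    if fingerprint in seen:
        print(f"Duplicate content: {url} matches {seen[fingerprint]}")
    else:
        seen[fingerprint] = url
```

Exact hashing only catches verbatim copies; rewording even one sentence changes the fingerprint, which is also the practical fix for duplicated partner copy.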
- Keep your content fresh and updated
This is also important for both people and robots. Add new sections, blog posts, and other types of pages on a regular basis, and you'll see your site start to climb up the search engine results pages. If that’s not always possible, updating existing content with just a short description or paragraph rewrite is often helpful. If you are adding new content, make sure these pages are included in a sitemap and open to Google (see above regarding the crawl accessibility).
Once you apply the tips above to the content you create, your ranking will start to improve.
The use of keywords is important, but don’t get too fixated on keywords alone – it is really the context and the intent that matter most: adding one or two keywords will not help if the context of the page is irrelevant. Also, don’t overdo it – “keyword stuffing” is a bad idea.
If you are wondering how many keywords to include per piece of content, 1 keyword per 200 words of copy is a good rule to keep in mind. Adding a keyword in the title and in the meta description is helpful, too.
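The rule of thumb above (roughly one keyword occurrence per 200 words) is easy to check programmatically. Here is a small sketch; the keyword_usage helper and its stuffing threshold of twice the recommended count are illustrative assumptions, not an official formula:

```python
def keyword_usage(text: str, keyword: str) -> dict:
    """Report keyword usage against the ~1-per-200-words rule of thumb.
    The 2x 'stuffing risk' cutoff is an illustrative assumption."""
    words = text.lower().split()
    count = text.lower().count(keyword.lower())
    recommended = max(1, len(words) // 200)
    return {
        "words": len(words),
        "keyword_count": count,
        "recommended": recommended,
        "stuffing_risk": count > 2 * recommended,
    }

# Hypothetical draft copy for a travel page
draft = "Planning a trip to New York? Our New York travel guide covers " + "more copy " * 50
print(keyword_usage(draft, "New York"))
```

A report like this is only a guardrail; as noted above, context and intent matter more than hitting an exact keyword count.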
Also, please keep in mind that a keyword may be used in different contexts. For example, "New York" might be:
- New York City
- New York pizza
- New York and Company, a clothing brand
- New York Palace cafe in Budapest, Hungary
It’s just another reason to make sure keywords are always used in the right context.
Let’s assume we are building content for a website that specializes in travel to New York City, so “New York” is our starting point for keyword research. Running “New York” through an initial target keyword analysis, we find that only a few of the top keywords relate to New York as a travel destination. But by adding just a little bit of context, the results become much more relevant to travel.
Great user experience
As with content, there are two types of visitors that will be evaluating the UX of your website: people and robots, so don’t neglect this step. The core UX factors include:
- Fast loading, interactivity, and visual stability
You may be tempted to test if the website loads fast on your laptop and on your phone, but that’s not enough. Google uses multiple tests to evaluate many signals. Learn more about Core Web Vitals and why they matter. We also highly recommend that you ask your web team for a Core Web Vitals report, or you can check it yourself. Most websites fail the Core Web Vitals test both on mobile and desktop.
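Once you have the raw metric values from a Core Web Vitals report, you can rate them against the “good” and “needs improvement” thresholds Google publishes for the three metrics: Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS). A sketch, with hypothetical field data:

```python
# Google's published thresholds for the Core Web Vitals metrics:
# (good, needs_improvement) -- LCP in seconds, FID in ms, CLS unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "FID": (100, 300),
    "CLS": (0.1, 0.25),
}

def rate(metric: str, value: float) -> str:
    """Classify one metric value as good / needs improvement / poor."""
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"

# Hypothetical field data for one page
report = {"LCP": 3.1, "FID": 80, "CLS": 0.05}
for metric, value in report.items():
    print(metric, value, "->", rate(metric, value))
```

A page has to clear the “good” threshold on all three metrics to pass, which is why so many sites fail the test on both mobile and desktop.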
- Your page is mobile-friendly
Even if visitors come to the desktop version of your website, Google now ranks primarily based on the mobile experience (mobile-first indexing).
- A user makes a minimum number of clicks to get to the information they want
If your website has multilevel navigation, we highly recommend adding breadcrumbs structured data.
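Breadcrumbs are exposed to Google as BreadcrumbList structured data (JSON-LD, per schema.org) embedded in the page in a script tag. Here is a sketch that generates the markup; the trail and URLs are hypothetical placeholders:

```python
import json

# Hypothetical breadcrumb trail for a page three levels deep
crumbs = [
    ("Home", "https://example.com/"),
    ("New York", "https://example.com/new-york/"),
    ("Hotels", "https://example.com/new-york/hotels/"),
]

# schema.org BreadcrumbList: each crumb becomes a ListItem with a 1-based position
breadcrumb_ld = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(crumbs, start=1)
    ],
}

# Embed the output in the page inside a <script type="application/ld+json"> tag
print(json.dumps(breadcrumb_ld, indent=2))
```

Besides helping crawlers understand your navigation depth, this markup is what lets Google show the breadcrumb trail in place of the raw URL in search results.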
- There are no intrusive interstitials
No one likes them. Not a Google bot, not people.
All of this is essential for keeping users engaged and ensuring they have a positive experience. For more details on user experience, we encourage you to review Google’s Page Experience guidelines.
Share-worthy content – earning backlinks
One of the main factors that search engines use to rank websites is the number and quality of backlinks that a website has. When another site links to yours because your content is valuable or interesting, it is seen as a vote of confidence in the world of SEO. The more backlinks you have, the better. The quality of your backlinks is also important. A backlink from a high-quality website is worth more than a backlink from a low-quality website.
Use SEO tools like Ahrefs or SpyFu to see which websites link to yours, and also to check out the competition.
Title, URL, and description
A lot of websites have empty titles, making it more difficult for a Google bot to understand what a page is about. It is another reason why so many websites rank poorly on search engines. The issue became so significant that at one point Google announced that it would start generating page titles itself.
- Make sure you have titles and meta descriptions for all pages on your website
The title should have a keyword and be relevant to the page content.
- The URL should be short, ideally, contain a core keyword, and be easy to remember
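You can audit these fields with a small script. This sketch uses Python’s built-in HTML parser to flag a page whose title or meta description is missing or empty; the sample HTML is a placeholder for a fetched page:

```python
from html.parser import HTMLParser

class TitleMetaChecker(HTMLParser):
    """Collects the <title> text and the meta description of a page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page with an empty title and no meta description
checker = TitleMetaChecker()
checker.feed("<html><head><title></title></head><body>...</body></html>")
if not checker.title.strip():
    print("Missing or empty <title>")
if not (checker.description or "").strip():
    print("Missing meta description")
```

Run over a list of your URLs, a check like this catches exactly the empty-title pages described above before Google has to invent a title for you.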
To increase the chances of your website appearing in the featured snippets – the sections above all organic search results – your developers need to add special markup. Learn more about structured data in the official Google guidelines.
According to a great summary by Backlinko, the featured snippet gets an average CTR of 8.6%. Why is that important if the first organic result gets a higher CTR? Because by ranking for a featured snippet, you get a very decent CTR without ranking high in organic search – and that magic happens because of structured data.
Growth through content
Once you get the technical basics addressed, it all comes down to producing and updating unique and relevant content matched to customer intent. Creating great content is not easy, and you need a lot of it to keep up with the digital age. At Intentful, we use the latest in AI technology to create targeted, high-quality content at scale.