One of the interesting debates circulating in SEO circles these days is the concept of content accuracy as a ranking signal.
The idea makes sense. Google is so “smart” that its spiders figure out which page contains the most accurate information. If that fails, the algorithm decides based on which page is most popular. The logic follows that the page most people use and trust must be accurate.
A Google Insider Refutes the Theory
It sounds great, but a source close to the search engine giant says that’s not the case.
Google employee Danny Sullivan stated explicitly that it isn’t. He admitted on Twitter that machines have no clue when it comes to determining the accuracy of a content page. Instead, they rely on “signals” that align with the company’s ideas about the relevancy of the topic and the authority of the source.
Topic Relevancy – How closely the page matches a specific search query.
Authority – How credible the source is as a producer of factual information.
Popularity Stems from Authority and Relevancy
Both of these concepts almost always produce popularity, which clouds the issue. It isn’t necessarily the fact that a page is popular with users that makes it rank highly; users love a site because its content is authoritative and highly relevant to their queries.
The practical takeaway for anyone interested in SEO is that it’s still vital to develop accurate content. It may not be the sole, or even the most important, ranking factor, but visitors expect it, and it builds the trust that generates repeat visits.
Trust Comes from Accuracy
The trust that users show makes search engines more likely to trust those pages, and when they do, they’re more likely to send even more organic traffic. So the question becomes: why would anyone want to rank intentionally inaccurate content in the first place?
One thing this knowledge offers search professionals is relief: SEOs don’t have to spend time worrying about accuracy as a direct ranking factor. They should still strive to publish high-quality, authoritative, well-researched content, but the deciding factor won’t be whether the information is best in class.
High-Quality Content Is Authoritative
Still, accurate information is how a site becomes authoritative and topically relevant. Providing precise, objective answers is the most reliable way to build an audience, which is why delivering the best answer is part of the DNA of SEO.
Popularity Doesn’t Apply to New and Trending Sites
Popularity has always been controversial as a ranking factor, primarily because new pages don’t have a chance to be as popular as established ones. For trending or viral topics, there may be no authoritative sources worth referencing at all. Sullivan has noted that about 15% of the queries Google handles each day are brand new, so popularity can’t serve as a ranking factor for them.
That will remain an issue, but it doesn’t mean Google’s claim is right. The company may use different factors when ranking new versus established websites, and established niches with little change likely have different ranking criteria than new ones.
Now you have a clear idea of why this topic is controversial. In most people’s minds, quality and accuracy are closely related. If Google doesn’t know which page is accurate (and it admits it doesn’t), how can it determine quality? It would appear that the whole ranking system is a series of best-guess assumptions: if a site gets a lot of links and visits, the reasoning goes, it’s probably okay.
A General Guideline for Accuracy
Researchers verify sources for accuracy by following several principles:
Authority – Is the advice coming from a medical doctor? Does the person have an advanced degree or hold a recognized title? The core question is how much credibility the source holds in a traditional sense.
Purpose – Why was the content created in the first place? If it exists to advertise a product, it may not be as accurate as a pure research piece. Understanding the purpose gives a sense of how high-quality the content is across the board.
Objectivity – Objectivity was a standard consideration long before the web. If the person creating the content is not objective, their word is harder to trust. It’s very tough for an algorithm to determine this, but at some level it must try.
Accuracy – Accuracy focuses on facts. Does the piece contain easy-to-verify information such as addresses or dates, and is that data correct? Although Google claims it doesn’t check accuracy, it pulls from data sources all the time and must be making judgments about them.
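To make that checklist concrete, here is a minimal, purely illustrative sketch in Python. The field names, weights, and scoring are assumptions made up for this example; they are not drawn from Google or from any published research workflow.

```python
from dataclasses import dataclass

@dataclass
class SourceEvaluation:
    """Answers to the four guideline questions for a single source."""
    credentialed_author: bool   # Authority: doctor, advanced degree, recognized title?
    promotional_purpose: bool   # Purpose: written mainly to advertise a product?
    objective_tone: bool        # Objectivity: free of obvious bias?
    facts_check_out: bool       # Accuracy: dates, addresses, figures verified?

def credibility_score(source: SourceEvaluation) -> float:
    """Toy score between 0 and 1; the weights are arbitrary assumptions."""
    score = 0.0
    score += 0.3 if source.credentialed_author else 0.0
    score += 0.2 if not source.promotional_purpose else 0.0
    score += 0.2 if source.objective_tone else 0.0
    score += 0.3 if source.facts_check_out else 0.0
    return score

# Example: a vendor blog post with a credentialed author but unverified facts.
print(credibility_score(SourceEvaluation(True, True, False, False)))  # 0.3
```

In practice, a human reviewer answers these questions by reading the source; the point is simply that each principle can be reduced to a yes-or-no check.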
Don’t Neglect Technical Details
Google, a technology company at its core, likely also favors sites with few technical problems. It provides plenty of tools, such as Search Console, that help webmasters see whether their sites are functioning at top levels. If a site runs into persistent problems, Google may decide it lacks some of the credibility and authority needed to rank.
SEOs are also able to dig into the technical requirements that keep sites near the top. They use rich snippets, meta tags, alt tags, and much more to help a search engine spider understand a page, and they build smart hierarchies of menus and links to achieve the same purpose. With many of these details in place and the right kind of content, companies begin to rise in the rankings.
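As a rough illustration of the kind of technical check described above, the following Python sketch uses only the standard library’s html.parser to flag a missing title, a missing meta description, and images without alt text. It’s a toy audit written for this article, not a Google tool, and the class name and checks are invented for the example.

```python
from html.parser import HTMLParser

class BasicSEOAudit(HTMLParser):
    """Collects a few on-page signals: <title>, meta description, and <img> alt text."""
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_meta_description = False
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.has_meta_description = bool(attrs.get("content"))
        elif tag == "img" and "alt" not in attrs:
            self.images_missing_alt += 1

page = "<html><head><title>Example</title></head><body><img src='a.png'></body></html>"
audit = BasicSEOAudit()
audit.feed(page)
print(audit.has_title, audit.has_meta_description, audit.images_missing_alt)
# Expected output: True False 1 (no meta description, one image without alt text)
```

Real audits go much further, but even a simple pass like this catches the kind of omissions that make a page harder for a spider to understand.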
Set a Course for the Top of the SERPs
SEO is an evolving field that requires constant monitoring. It’s worth finding the best SEO specialists for the job because otherwise it will consume your life! It’s almost impossible to tackle SEO part-time because it’s so demanding. The results are fantastic, though, because nothing converts like search engine traffic. The people who show up are in the market and ready to buy. Give them what they want, and it’s easy for businesses to win new customers from search engines every day.