

There are things that, no matter how loudly or how often we repeat them, are simply not true. And in the case of SEO, the terrain is fertile ground for mythology, for several reasons:
All this means there are a series of mantras we hear often, whether out of ignorance, out of a lack of real (or only very partial) project experience, or simply from being out of date. I often hear them from clients, which is more or less to be expected, but I also detect them coming from professionals, and I always wonder whether they really believe what they say or there is some interest behind the lie. What can you do.
The fact is that I wanted to compile some of the most common SEO myths, and I have tried to refute them. It is not always a matter of discarding an idea completely, but at least of making people think about it, putting it in context, weighing it up. Granted, the most used phrase among SEOs is "it depends", whether we are Galician or not, but you will allow me to be a little emphatic in some cases.
To begin with, I have 8, but the way things are going, this list is likely to grow:
What is being defended? Many think that SEO optimization is done only once, especially during the creation of a website or a migration. It is common for clients to ask us for specific, one-off SEO actions, to leave the site set up and done.
Counterargument: SEO is an ongoing process. Google's algorithms are constantly evolving, and so are user preferences. It's a game in which we play as well as we can with our players and everything we've trained. But consider: there is a referee who is very strict and has probably been bought; the rivals also play, we are not alone; and the rules, the pitch and the weather conditions change constantly. Think about what would happen if you played a Champions League final with the players, strategies and fitness of 10 or 20 years ago against a team of today. Well, that's it: what worked yesterday may not be effective today. And what we do today may only take effect in a few months, so we have to watch its evolution, the evolution of the terrain, what the competition does, and so on. Constantly updating content, technically reviewing the site, and tracking search trends are all necessary to maintain a good position.
Of course, if you do a well-executed sprint of SEO work, the website should do well, but given how things work, their very nature calls for continuous work, monitoring and adjustments. A few years ago we made this infographic, and I think that, with some nuances, it is still valid:

Example: changes to Google's Core Web Vitals (loading speed and usability) show how the algorithm relies on dynamic technical factors. If regular adjustments are not made, your site may lose visibility. Or consider the impact AI is having on SEO: most websites are not prepared for a GEO stage.
What is being defended? If the content is of quality, Google will index it and rank it well, without having to worry about the technical part (speed, indexing, schemas, HTML, etc.).
Counterargument: Technical SEO is critical. Good content can be rendered invisible if it doesn't sit on a correct technical structure. Google treats things like site structure, loading speed and mobile-friendliness as critical factors. We have seen dozens of deserving pieces of content absolutely invisible to Google's eyes due to very basic technical SEO errors: blocking via robots.txt, noindex, poorly implemented canonicals, or poor JS rendering. It still happens today. Content SEO matters a lot, but it makes no sense to work on it without a solid foundation of technical SEO.
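One of those "very basic" errors, a stray noindex, is trivial to detect programmatically. As a minimal sketch (using only Python's standard-library `html.parser`; the class name and sample page are illustrative, not a real tool):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag on the page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "robots":
                self.directives += [d.strip().lower() for d in a.get("content", "").split(",")]

def is_indexable(html: str) -> bool:
    """True unless the page carries a noindex directive."""
    p = RobotsMetaParser()
    p.feed(html)
    return "noindex" not in p.directives

# A quality article made invisible by a single tag:
page = '<html><head><meta name="robots" content="noindex, follow"></head><body>Great content</body></html>'
print(is_indexable(page))  # False
```

A real audit would also fetch robots.txt and check canonicals, but the principle is the same: the check is mechanical, and skipping it can silently void all the content work.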
What is being defended? With the rise of AI, some believe that SEO is no longer relevant.
Counterargument: SEO isn't dead, it has evolved. We have witnessed many announced deaths of SEO that never happened: blogs, social networks, etc. Now it is simply AI's turn. But the data is stubborn: SEO is still essential for organic visibility, since the largest volume of traffic and searches on the Internet still goes through search engines by far. And even if you think AI or GEO positioning will replace SEO, which is highly doubtful, you should know that without solid, well-executed SEO work you have no chance of surfacing in the results of LLMs like ChatGPT.
What is being defended? A few years ago, many thought that external links (backlinks) were the key to ranking well. Then we went through periods when many claimed they had no impact at all. They are the same people who now resort to fake rankings to show up in AI answers, rankings in which they magically always come first.
Counterargument: Backlinks are important, but they're not everything. Google also values other factors such as content quality, user experience, relevance and the technical quality of the site. A balanced approach is essential. In general, our experience tells us that in niches that are not very competitive, links may not be necessary. In tough sectors, however, they will always be needed to compete on equal terms. But they are the icing on the cake: the project must have a good foundation (infrastructure, architecture, technical SEO, content, social signals, etc.), and then, in tie-breaking situations, a few good backlinks that the creature will thank you for. And at the current stage of SEO, at least of SEO for AI, links are still very relevant, although mentions take on a whole new dimension of importance.
What is being defended? That a well-executed SEO strategy guarantees reaching the first position on Google.
Counterargument: No one can guarantee Google's #1. The algorithms are very complex and depend on many factors, including competition, domain authority, user experience, and constant changes in Google's algorithm. In principle, a good SEO strategy, well deployed and always under review and adjustment as the landscape shifts, will allow us to compete and gain visibility. Now, if someone promises you that your website will appear first for certain searches (and often within a certain time!), they are taking you for a ride. Either they don't understand SEO, or they know they're promising something they can't guarantee. In both cases, I think it is a good reason to run away from them. The SEO sector has grown quickly and chaotically due to the changes on the Internet and the huge demand from companies, and that has led to the appearance of charlatans and scammers of every kind. Patience and time will put everyone in their place.
What is being defended? Some believed that stuffing more keywords onto the page would improve rankings.
Counterargument: an overly zealous reading of that idea leads to keyword stuffing, which can be penalized. Google prioritizes content quality and keyword relevance to the user's search intent over sheer quantity. But semantic search isn't perfect either, and the basics should always be covered: each URL must be oriented towards SEO objectives, which translate into search patterns we want to capture, and which are synthesized into keywords. Obviously these must be kept in mind and must appear in the article and in the relevant places (slug, title, description, headings, and in the body wherever you see fit). From there, write naturally and knowledgeably about the subject, and you will probably apply good SEO practices without realizing it: long-tails, related entities, playing with synonyms, etc. At the other end of the magical-arts spectrum, you also cannot rank for a keyword that doesn't appear even once on your website. Let's make an effort, but miracles belong in Lourdes.
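The "relevant places" check above (slug, title, headings) is easy to automate. A minimal sketch with Python's standard-library `html.parser`; the function names and sample page are hypothetical, for illustration only:

```python
from html.parser import HTMLParser

class KeywordPlacementChecker(HTMLParser):
    """Records the text found inside <title> and <h1> tags."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.title = ""
        self.h1 = ""

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

def keyword_placement(html: str, slug: str, keyword: str) -> dict:
    """Checks (case-insensitively) whether the keyword appears in the
    slug, the <title> and the <h1>. Presence, not repetition, is the goal."""
    c = KeywordPlacementChecker()
    c.feed(html)
    kw = keyword.lower()
    return {
        "slug": kw.replace(" ", "-") in slug.lower(),
        "title": kw in c.title.lower(),
        "h1": kw in c.h1.lower(),
    }

page = ("<html><head><title>Technical SEO basics</title></head>"
        "<body><h1>Technical SEO basics for beginners</h1></body></html>")
print(keyword_placement(page, "/technical-seo-basics", "technical seo"))
```

Note the design choice: the check is boolean per location, not a count. Counting would nudge you back towards stuffing, which is exactly the myth being refuted.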
What is being defended? Longer content has a better chance of ranking well on Google.
Counterargument: Quality always prevails over quantity. Google rewards content that satisfies search intent, and sometimes the search is satisfied in one word or four. If a long piece of content doesn't add value or doesn't respond effectively to the search, its length won't benefit you. Content should be as long as it needs to be. That said, a little context and story before serving the main dish always helps, of course. So whether you end up with short or long content, go ahead. Now: if you plan a long piece, bear in mind that nowadays it can rank both generally, for the overall theme of the page, and specifically, for each sub-section. For Google or the AIs to be able to pick up that specific content, you need to structure it and mark up the code well, so that the hierarchies are understood and each block of content, or each question and answer, works as part of a whole but can also be extracted and understood independently.
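That hierarchy is exactly what a crawler or retriever reconstructs from your h1–h6 markup. A minimal sketch of such an outline extractor, again using only Python's standard-library `html.parser` (the class name and sample markup are illustrative assumptions):

```python
from html.parser import HTMLParser

HEADINGS = {"h1": 1, "h2": 2, "h3": 3, "h4": 4, "h5": 5, "h6": 6}

class OutlineExtractor(HTMLParser):
    """Builds a (level, heading text) outline from h1-h6 tags, roughly the
    way a crawler segments a long page into independently quotable blocks."""
    def __init__(self):
        super().__init__()
        self._level = None
        self._buf = ""
        self.outline = []

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self._level, self._buf = HEADINGS[tag], ""

    def handle_endtag(self, tag):
        if self._level is not None and HEADINGS.get(tag) == self._level:
            self.outline.append((self._level, self._buf.strip()))
            self._level = None

    def handle_data(self, data):
        if self._level is not None:
            self._buf += data

page = """<article><h1>SEO myths</h1>
<h2>Myth 1: SEO is done once</h2>
<h2>Myth 2: content is enough</h2></article>"""
p = OutlineExtractor()
p.feed(page)
print(p.outline)
```

If your sub-sections only look like headings visually (styled divs instead of h2/h3), this outline comes back empty, and so does the machine's understanding of your page structure.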
What is being defended? That Google immediately penalizes any duplicate content.
Counterargument: Although duplicate content can affect rankings, Google doesn't penalize it automatically. Duplicate content may not rank well, but Google can still index it and show it if it is relevant enough.
Example: there are thousands. But take press releases: the same text is published simultaneously, identically or almost identically, in dozens of media outlets. And you'll often see several of them indexed in Google, Discover, or Google News. Obviously, as far as possible we should avoid duplicate content, but let's not obsess over it. There are also ways to label it.
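The standard way to "label it" is a `rel="canonical"` link pointing at the original, so search engines consolidate signals on one URL instead of guessing. A minimal sketch of detecting it, using Python's standard-library `html.parser` (the sample URL is made up):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Finds the href of a <link rel="canonical"> tag, the usual label
    that tells search engines which copy of a duplicate is the original."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if a.get("rel", "").lower() == "canonical":
                self.canonical = a.get("href")

# A syndicated press release declaring where the original lives:
syndicated = ('<html><head><link rel="canonical" '
              'href="https://example.com/original-press-release"></head></html>')
f = CanonicalFinder()
f.feed(syndicated)
print(f.canonical)
```

If `canonical` comes back `None` on a page you know is a duplicate, that is the moment to add the tag rather than to panic about penalties.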
SEO has evolved. Old strategies based on outdated myths are not enough to succeed in today's search ecosystem. Moving forward with up-to-date practices and respecting the user's search intent is critical. As algorithms continue to change, it's essential to understand how to integrate technical factors, content optimization, and user experience for lasting success.
