Here is the transcript to read along.
Hello and welcome to a brand new podcast!
First of all, thanks for tuning in. In the broadest sense, this podcast is about everything to do with online marketing. I will stay as close as possible to the hot topics and forecast a bit what impact they will have in the future. So if you call a domain your own, you should definitely give it a listen.
I’m Marco. I have about 13 years of online marketing under my belt and have seen a correspondingly large number of projects, so I can share a few stories from behind the scenes. So let’s get right to it and dive into the first of today’s top 5 topics.
Topic 1 – Spelling errors can decrease the quality of a web page
On October 18, Google said in a Hangout that when you publish content and don’t exclude it from the index, you are effectively asking Google to rate it, so you are responsible for that content. That means if users on my site comment diligently, possibly producing spelling errors and perhaps not-so-nice content, this definitely plays a role in the quality of the site. Google makes no direct distinction between the actual content of the page and the comments; it simply assumes that everything on the page is content you want indexed. A funny countermeasure would be to make every user who registers for the comment section pass a spelling test first, and only those who pass are allowed to comment. Just kidding, of course. But either way, I have to think about who is writing on my site. That depends a lot on how many users the site has and on the topics it covers. With very controversial topics, I sometimes attract the much “beloved” trolls, who do not exactly stand for good spelling, and that can pull the website’s rankings down, simply because of the content that ends up there.
My conclusion in any case: once you open an area where users can comment, you should definitely interact with the users and keep an eye on the quality of what ends up on your site. This is content that is visible to everyone, and if the users produce a lot of bad content, you should ask yourself why that content is being produced and what you can do about it.
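To make the joked-about idea a bit more concrete: a very crude heuristic for spotting comments that might pull quality down is to flag those where most words are not in a dictionary. This is only a sketch; the tiny `KNOWN_WORDS` set stands in for a real word list, and the threshold is an arbitrary assumption, not anything Google has published.

```python
import re

# Stand-in dictionary; in practice you would load a full word list
# or use a proper spell-checking library.
KNOWN_WORDS = {
    "this", "is", "a", "great", "article", "thanks", "for", "sharing",
    "i", "totally", "disagree", "with", "you",
}

def misspelling_ratio(comment: str) -> float:
    """Return the fraction of words not found in the dictionary."""
    words = re.findall(r"[a-z']+", comment.lower())
    if not words:
        return 0.0
    unknown = sum(1 for w in words if w not in KNOWN_WORDS)
    return unknown / len(words)

def needs_review(comment: str, threshold: float = 0.5) -> bool:
    """Flag a comment when more than `threshold` of its words look misspelled."""
    return misspelling_ratio(comment) > threshold

print(needs_review("Thanks for sharing, great article!"))  # False
print(needs_review("grate articel thx fr shareing"))       # True
```

A flagged comment would go to manual review rather than being rejected outright, since unusual but legitimate words would also trip a filter like this.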
Basically, it makes sense to involve users on many pages, because ultimately the best feedback comes from the users. And since we are already on the subject of content, let’s move directly to the next topic.
Topic 2 – Automatic translations can achieve rankings as good as manual translations
Google does not recommend simply taking over automatically translated texts without checking them, which is just common sense. However, if you look at the translation tools of the last few years, they keep getting better. A favorite of mine is definitely DeepL: compared to Google Translate, it often simply recognizes the meaning much better and delivers much cleaner, more understandable translations. Until 2017, Google rejected machine translations as a matter of principle. A year later they backtracked a little and said it depends on the intended use, and now they have actually reached the point of saying “yes, automated translations are okay, but they can lead to manual actions if the quality is poor.” That is understandable, because beyond a certain volume it becomes necessary to react. Of course, I wonder how the whole thing is controlled: whether Google runs the content through its own translation tool, so to speak, and cross-checks it for logical errors, reporting the page past a certain number, or whether the page is simply checked for frequent errors and flagged for review once there are too many matches.
My opinion is relatively clear: quality comes before quantity. That has been confirmed again and again over the last few years, and not only with regard to content. At the same time, I can absolutely understand when shop pages with a lot of content succumb to the temptation to use automated translations in some form.
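For anyone curious what such an automated translation call looks like in practice, here is a minimal sketch against DeepL’s public REST API (the `/v2/translate` endpoint). The API key is a placeholder, and the request is only constructed here, not actually sent.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Endpoint for DeepL's free API tier; the paid tier uses api.deepl.com.
DEEPL_ENDPOINT = "https://api-free.deepl.com/v2/translate"
API_KEY = "your-api-key-here"  # placeholder, not a real key

def build_translate_request(text: str, target_lang: str = "EN") -> Request:
    """Build (but do not send) a POST request translating `text`."""
    data = urlencode({
        "auth_key": API_KEY,
        "text": text,
        "target_lang": target_lang,
    }).encode("utf-8")
    return Request(DEEPL_ENDPOINT, data=data, method="POST")

req = build_translate_request("Qualität geht vor Masse", "EN")
print(req.full_url)      # https://api-free.deepl.com/v2/translate
print(req.get_method())  # POST
```

Sending the request with `urllib.request.urlopen(req)` would return a JSON body containing the translation; the point here is only to show how little code a bulk-translation pipeline needs, which is exactly why the temptation exists.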
Topic 3 – Flash is dead
This time it is actually not a lurid clickbait headline about something dying: Flash really will disappear from Google’s search results, and that after more than 23 years. I actually grew up with this technology and got to know it in its very early stages, when it was very, very complicated to put together a few seconds of animation in hours of work. But the fact is, there have been many security holes and many problems with the technology over the years, and Google has now simply made short work of it. Honestly, if you look around the web, you hardly see it anymore; I think last year I found one page where I consciously noticed it. Firefox disabled Flash by default in version 69, and in 2017 Google banned Flash from advertising. If all the big players are already pulling the ripcord, it is really only a matter of time. Google also says Flash can hinder the conversion of a website to mobile-first indexing and is ultimately simply no longer up to date. The recommended techniques are of course HTML5 and co., which are simply the techniques of choice today.
Conversely, this means that SWF files will soon no longer be visible in the Google index. So if you are actually still using such an old technology, now is the time to switch at the latest.
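As a first step in such a migration, you can scan your pages for leftover `.swf` references. Here is a small stdlib-only sketch using Python’s `html.parser`; the HTML snippet is made up for illustration.

```python
from html.parser import HTMLParser

class SwfFinder(HTMLParser):
    """Collect (tag, attribute, value) triples that point at .swf files."""
    def __init__(self):
        super().__init__()
        self.swf_refs = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if value and value.lower().endswith(".swf"):
                self.swf_refs.append((tag, name, value))

html = """
<object data="intro.swf" type="application/x-shockwave-flash"></object>
<embed src="banner.SWF">
<video src="intro.mp4" controls></video>
"""

finder = SwfFinder()
finder.feed(html)
for tag, attr, value in finder.swf_refs:
    print(f"<{tag}> still references Flash via {attr}={value}")
```

Run over a crawl of your site, a check like this gives you a concrete to-do list of pages to convert to HTML5 `<video>`, `<canvas>`, or plain CSS animations.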
Topic 4 – Google Site Kit
In case you haven’t heard of it, it is a plugin for WordPress from Google itself. WordPress was first released in 2003 and now has a market share of about 50 percent, making it the most widely used CMS you can get, and now there is an official plugin from Google called Site Kit. What can it do? It has a lot of advantages and makes things much easier for the user, without needing an agency involved in any way. You can verify Search Console more easily, without major code adjustments, and you get easy access directly from the dashboard, with the source data coming from Google itself. If you are trying to rank well in Google, you can link Analytics, AdSense and so on, pull in the whole data pool, and get the data directly from Google. You can even create roles and permissions if necessary and customize the interface. Google itself recommends checking the data weekly: reviewing the performance of current posts and how they affect users. In any case, all the services you use in the Google world should be linked.
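One of the things Site Kit automates is Search Console site verification; one common verification method places a `google-site-verification` meta tag in the page head. A quick stdlib-only sketch to check whether such a tag is present on a page (the HTML snippet and token are made up):

```python
from html.parser import HTMLParser

class VerificationChecker(HTMLParser):
    """Extract the content of a google-site-verification meta tag, if any."""
    def __init__(self):
        super().__init__()
        self.token = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name") == "google-site-verification":
                self.token = d.get("content")

html = '<head><meta name="google-site-verification" content="abc123"></head>'
checker = VerificationChecker()
checker.feed(html)
print(checker.token)  # abc123
```

With Site Kit, you never have to paste this tag in by hand, but a check like this is handy for confirming that verification survived a theme change.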
My conclusion is relatively clear: since the plugin is free, you should install it in any case, at least if you are interested in displaying all your Google data centrally in WordPress and working with it in a simplified form. You simply cannot get more relevant information in a simpler way if you want to optimize for Google, whether you are a company or a normal private person. And with the last topic we come to a somewhat more complex issue: BERT.
Topic 5 – BERT
Some of you may have already seen it: BERT has appeared in the news. The long form is “Bidirectional Encoder Representations from Transformers”, and ultimately it describes a new technique Google uses to handle search queries. Google is now simply trying to understand more of the context, evaluating the search query not so much word by word but in its full context. The technology itself was open-sourced earlier and is now finding its way into Google’s own systems. Since the whole thing requires a bit more computing power, Google has also upgraded technically. According to Google, it is actually the biggest jump in the last five years and one of the biggest jumps ever. Of course, we will have to see how it affects the rankings in the next few weeks; that much is clear. It has to be said that this is initially intended for the English market, with other languages following later, so it will take a little while. The impact is directly on search results and featured snippets, and according to Google, one in ten English search queries should be better understood with it. This is still a very new technology, so let’s see what happens and what the long-term effects will be. Ultimately it is particularly interesting because of the ranking implications, and the way Google wants to learn to understand search queries, with its strong connection to voice search, will remain a very interesting topic in the coming months and even years. We will definitely keep an eye on it and provide more and deeper information.
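The difference between word-by-word evaluation and context can be made tangible with a toy example: a simple bag-of-words view treats two queries as identical even when reversing two words reverses the meaning. The example queries are made up, and this is of course a vast simplification of what BERT actually does; it only illustrates what word-by-word matching throws away.

```python
from collections import Counter

def bag_of_words(query: str) -> Counter:
    """Represent a query as an unordered multiset of its words."""
    return Counter(query.lower().split())

q1 = "flight from berlin to london"
q2 = "flight from london to berlin"

# Same words, opposite direction of travel:
print(bag_of_words(q1) == bag_of_words(q2))  # True  (bag-of-words cannot tell them apart)
print(q1 == q2)                              # False (word order carries the meaning)
```

A context-aware model keeps the word order and the relations between words like “from” and “to”, which is exactly the kind of query the old word-by-word approach handled poorly.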
This was the first podcast of its kind. I hope it was helpful and maybe even a bit instructive here and there. If you have any questions or ideas for the next podcast, feel free to send an email to email@example.com. I will get back to you, and thank you very much for listening.