How Webmasters use social sites to create a keyword monopoly
The average user assumes that a Google search for a keyword returns a list of ten different websites that best match the query. This common belief is wrong. Clever marketers have found ways to make their website appear in all, or at least many, of the search results on that page.
There are several approaches. The first is to get a so-called double ranking: if two pages of a website appear on the same results page, the lower-ranking one is moved directly below the higher-ranking one. If you hold the number 1 spot, the second page gets the number 2 spot even if it would otherwise only qualify for number 9 or 10.
That's two of the ten spots already. What happens when you submit a site to a social news or bookmarking website? Right, it creates a new entry in the search engines that points to the original URL the marketer wants to promote. Many social sites such as Digg rank so highly with Google that their entries often appear in the top 10 or 20 for a keyword.
There is another method that currently works: Google treats subdomains as if they were separate domains, so marketers create keyword-rich subdomains and push those in the search engines as well. Even if you expect to see ten different results, the reality can be different.
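One way to see this effect in practice is to group a results page by registrable domain rather than by full hostname. The sketch below uses a hypothetical list of result URLs and a naive "last two host labels" heuristic (which would misfire on suffixes like `co.uk`) to count how many of the visible "different" results actually belong to one site:

```python
from urllib.parse import urlparse
from collections import Counter

# Hypothetical top results for one keyword; subdomains look like
# separate sites to the casual reader.
results = [
    "http://widgets.example.com/page",
    "http://cheap-widgets.example.com/page",
    "http://www.example.com/widgets",
    "http://other.org/widgets",
]

def base_domain(url):
    """Naive registrable domain: the last two labels of the hostname."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

counts = Counter(base_domain(u) for u in results)
for domain, n in counts.most_common():
    print(domain, n)
# example.com owns 3 of the 4 listed results
```

Here three of the four "different" results collapse into a single owner once subdomains are folded together, which is exactly the monopoly effect described above.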
Update: The situation has evolved further in recent years. With Google adding more of its own products to the search results, webmasters now use those products to dominate a results listing. If they see YouTube links hard-coded into a listing, they create and upload videos to YouTube and then optimize them to make their videos appear on Google's front page.
Doesn’t Windows 8 know that www. or http:// are passé?
Well, it is a bit difficult to distinguish between name.com domains and files, for instance.
I know a service made by Google that is similar to Google Bookmarks:
http://www.google.com/saved
@Ashwin – Thankfully you deleted my comment; who knows how many “gamers” would have disagreed!
@Martin
The comments section under this very article (3 comments) is identical to the comments section found under the following article:
https://www.ghacks.net/2023/08/15/netflix-is-testing-game-streaming-on-tvs-and-computers/
Not sure what the issue is; I have seen it under some other articles recently but did not report it back then.
Omg a badge!!!
Some tangible reward lmao.
It sucks that redditors are going to love the fuck out of it too.
With the cloud, there is no such thing as unlimited storage or privacy. Stop relying on these tech scums. Purchase your own hardware and develop your own solutions.
This is a certified reddit cringe moment. Hilarious how the article’s author tries to dress it up like it’s anything more than a png for doing the reddit corporation’s moderation work for free (or for bribes from companies and political groups)
Almost all unlimited services have a real limit.
And this comment is written on the dropbox article from August 25, 2023.
First comment > @ilev said on August 4, 2012 at 7:53 pm
For God’s sake, fix the comments soon, please! :[
Yes. Please. Fix the comments.
With Google Chrome, it’s only been 1,500 for some time now.
Anyone who wants to force me into buying something that I can get elsewhere for free will certainly never see a single dime from me. I don’t know how stupid their marketing department has to be to impose these limits on users instead of offering a valuable product to the paying faction. But they don’t. Even if you pay, you get something that is also available for free elsewhere.
The algorithm has also become less and less savvy with, for example, English/German translations. It used to be that the bot could sort of sense what you were trying to say and render it in different colloquialisms, which was even fun because it was like, “I know what you’re trying to say here, how about…” Now it is in parts too stupid to translate the simplest sentences correctly, and the suggestions it makes are at times as moronic as those made by Google Translate.
If this is a deep-learning AI that learns from users’ translations and the phrases they choose most often – which, by the way, is a valuable, monetarily worthwhile contribution from every free user to this project: they invest their time and texts, providing the very data the AI needs to perform as nicely as they brag about in the first place – then alas, the more unprofessional users have discovered the translator, the worse the language of this deep-learning bot has become, as it now learns the drivel of every Tom, Dick and Harry out there. That is why I now get their Mickey Mouse language as suggestions: the inane language of people who can barely spell the alphabet, it seems.
And as a thank you for our time and effort in helping them and their AI learn, they’ve lowered the limit from what was once 5,000 to now 1,500…? A big “fuck off” from here for that! Not a brass farthing from me for this attitude and behaviour, not in a hundred years.