Cutts explains when Google decides to make a ccTLD global.
Google’s SEO/Webspam leader Matt Cutts has released a new video about using a country-code top-level domain (ccTLD) for a website not targeted to that country.
His message is the same that Google has been giving for many years, but he added some more color about how the company decides what should become generic.
Google makes some domains globally targeted because they’ve been widely adopted for global use rather than country use. .co and .tv are examples.
What if you want to use .li for Long Island or .ky for Kentucky?
Cutts said you probably shouldn’t get ahead of yourself.
For .li, the country code for Liechtenstein, Cutts said they looked into it and found that the domain is widely used in Liechtenstein. Thus, it wouldn’t be right to give it a global meaning.
He also gave .ky (the Cayman Islands) as an example.
Here are the ccTLDs Google treats as generics (as of today): .ad, .as, .bz, .cc, .cd, .co, .dj, .fm, .io, .la, .me, .ms, .nu, .sc, .sr, .su, .tv, .tk, .ws.
Note that .ly, a popular extension that belongs to Libya, is not on the list.
Cutts recommends taking down a parking page about a month before launching a new site.
Google web spam czar Matt Cutts just published a video in which he asks himself a question (rather than taking it from the community): should I keep a domain name parked before I launch a web site?
In short, Cutts says no.
He works in a reference to eNom’s backpack girl and then goes on to explain that Google has a filter to try to keep parked domain names out of its search results.
This filter doesn’t immediately know when a domain changes from a parked page to a “real” web site, so he recommends putting up a placeholder page 3-4 weeks before launching a site.
That placeholder page can be something as simple as “coming soon” with a few lines of text — just make sure it’s not an ad-filled parking page.
Patent covers method and systems for SEO suggestions and search engine submission.
The United States Patent and Trademark Office has issued a patent (pdf) to Go Daddy for “Method for Improving a Web Site’s Ranking with Search Engines”.
U.S. patent number 8,271,488 describes a system for helping web site owners edit their web pages for better search results and then automatically submitting the sites to multiple search engines.
If some of this seems outdated, that’s because the patent application was filed in 2003.
The images in the patent show Go Daddy’s former search engine product called Traffic Blazer. The company now offers a product called Search Engine Visibility that has similar functions.
The patent describes a method in which a web site owner wants to rank for certain keywords in a search engine. The system suggests how the owner can edit his or her web site to rank better for those keywords, such as adding a keyword to the title tag. It then automatically submits the site to multiple search engines.
In another embodiment, the system would automatically edit the customer’s web page for better search engine rankings.
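As a rough illustration of the first embodiment, here is a minimal sketch of this kind of keyword checker. The function name and the three rules are hypothetical, chosen for illustration; they are not Go Daddy’s actual code or criteria:

```python
import re

def seo_suggestions(html: str, keyword: str) -> list[str]:
    """Suggest edits so a page better targets `keyword`.
    Hypothetical rules in the spirit of the patent's first embodiment."""
    kw = keyword.lower()
    suggestions = []

    # Rule 1: the keyword should appear in the <title> tag.
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    if not title or kw not in title.group(1).lower():
        suggestions.append(f"Add '{keyword}' to the <title> tag")

    # Rule 2: the keyword should appear in the meta description.
    meta = re.search(
        r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
        html, re.I | re.S)
    if not meta or kw not in meta.group(1).lower():
        suggestions.append(f"Add '{keyword}' to the meta description")

    # Rule 3: the keyword should appear more than once in the visible text.
    body = re.sub(r"<[^>]+>", " ", html).lower()
    if body.count(kw) < 2:
        suggestions.append(f"Use '{keyword}' more often in the body copy")

    return suggestions
```

A real system would then pass the edited pages to a submission step, which is the part of the patent that has aged the most.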
A number of businesses still offer search engine submission services, even though the importance of submitting a site for inclusion in search engines has decreased over the past decade. Automatically analyzing web sites and making suggestions for better search rankings is still popular, although much of the attention has shifted from what’s on the web site to external factors like who’s linking to it.
A handy guide to determine if Google is gunning for you.
Last week Google announced some changes to its algorithm that affect a whopping 12% of search results.
Google says the update targets “low-quality sites – sites which are low-value add for users, copy content from other websites or sites that are just not very useful”.
While some say this is aimed at so-called “content farms”, here are the warning signs that this change is aimed at your web site(s):
- You created your web site with the click of a button (or a few buttons)
- You didn’t create any original content for your web site
- Your web site asks questions but doesn’t provide answers — and is waiting for a “user” to generate the answer for you (ahem)
- When users do answer those questions, the answers suck (ahem)
- You developed a hundred sites in a weekend
- Although your web site presents existing information in a “unique” way, hundreds of other web sites have also tried to present that same information in a “unique” way
- Your web site is a splog
- Your content is similar to that found on other web sites; you just re-write it in a different way
- Your only plan for traffic generation is search traffic
Search engine files patent application for methods of detecting link spam.
Ever since people caught on to how Google uses the number of incoming links to a web site in its ranking algorithms, they have tried to game the system. From selling links to creating link farms, SEOs have focused much of their attention on link building over the years. As a result, search engines have had to counter by trying to separate good links from bad.
Yahoo has filed a patent application for discovering abnormal link structures and demoting the rank of web pages based on these abnormal incoming links.
Titled “Detection of Undesirable Web Pages”, patent application 20100094868 (pdf) was filed in October 2008 and published today.
The patent describes a statistical method of determining when links pointing to a web page have been artificially generated. The method determines a normal range of links across a number of factors, and then looks for patterns that do not conform to the natural change in links over time:
As the value of the normalized entropy metric associated with a set of inlinks referencing the destination page approaches an outer limit of an acceptable range (e.g., 0 or 1), the likelihood that the set of inlinks to the destination web page is “unnatural” increases. In other words, there exists an inference that some of the inlinks among the set have been created for the purpose of artificial promotion of the destination web page rather than based on the genuine interests from a diverse set of independent users.
Some of the factors considered include:
- IP address of the link source
- Top-level domain of each link
- Language of each link (e.g. English, French, German)
- Autonomous system (i.e. a networked system of computing devices)
- Anchor text of links
- PageRank of incoming links
- Link age-attenuation weightings
Of course, many search engine optimization experts already try to skirt these measures by spreading links about in more natural ways. Further, I’d be surprised if some of Yahoo’s competitors weren’t using some of these same tactics before the patent application was filed.