Last Friday, Donuts CEO Paul Stahura posted on CircleID an argument for strong renewal rates in new top level domain names.
He even put a number on it: “…we conservatively predict an initial renewal rate of greater than 80% for Donuts gTLDs.”
First of all, kudos to Donuts for sticking its neck out with a prediction. It might have its own motivations for doing so, but a lot of TLD registries are backing off of public predictions of any kind after their initial registration predictions were way off.
I’ve had many conversations over the past few months about what renewal rates for new TLDs will be. I’ve talked to domainers, registries and registrars.
I think I can make an intelligent argument for all three scenarios: renewal rates below existing TLDs, renewal rates comparable, and renewal rates higher.
Stahura makes a pretty good case for the latter, and you should read it in its entirety before reading on.
One of the arguments Stahura makes is one that I haven’t heard before, and it really turns a key assumption on its head.
If you look at renewal rates of existing TLDs, the longer a domain has been registered, the more likely it will be renewed. I’ve always considered this as a negative for new TLDs; every domain that comes up for renewal next year will have been registered for just one year.
Stahura looks at it differently. He looks at the registration date compared to the birth date of the TLD. Domains registered within the first year of .com availability are almost always renewed, perhaps because they are better names. Domains registered many years after TLD availability are less frequently renewed because they are worse names.
I’m not sure if I buy it, but it’s an interesting argument I haven’t heard before.
Stahura takes it further, pointing out that the bigger the registration numbers for a TLD, the lower the renewal rate of the later-registered domains. It makes sense that the first 10,000 names registered are better than the next 10,000.
Having reviewed the zone files of some larger TLDs, such as Donuts’ own .guru (77,000 registrations), I can say that some of the later registrations seem like they’d fit better on Donuts’ .WTF.
Another key argument from Stahura that I’ve heard from the “high renewal rate camp” is that higher prices mean higher renewal rates.
It’s a bit counter-intuitive. He bases this on the idea that lower-priced domains are bought for their traffic (that’s the case for some, anyway), and the idea that people are stupid.
OK, he doesn’t say people are stupid, but he writes:
A buyer paying $20 for a name has a sunk cost incentive to renew when it comes time for renewal. Registrants think twice about deleting names that cost them $20 and think three times if they cost $200, but don’t think twice about deleting a name they got for $1 or free.
Of course, it is completely irrational to consider the sunk cost of a domain, or really any investment. But people are irrational. They’re stupid when it comes to sunk costs.
Show of hands: how many people reading this consider how much they’ve sunk into a domain before deciding to renew it?
I’m raising my hand, even though I know it’s completely irrational.
The only consideration that should come into the decision to renew a domain name is if it makes sense to pay the future costs of owning the domain, not the past costs.
Yet I, and many others, consider sunk costs all the time. It clouds our decision making.
It’s this exact irrationality that can be used as a counter-argument to Stahura’s point about sunk costs.
When it comes to renewing domain names, we’re more likely to renew domains that we’ve owned for a longer time. We look at the sunk costs of owning them over the years. “I’ve owned this domain for ten years, there’s no way I’m giving up on it now!”
Yes, the initial cost of most new TLDs was higher than .com, but they’ve only been registered for one year. The investment-to-date for many new TLD domains is less than for many .coms. This could make renewal rates lower.
I don’t have many new TLD registrations, but I know this: I’ll scrutinize the more expensive ones much more closely than the cheap ones when it comes time to renew.
At least I think I will. Only time will tell.
What Donuts wants and what happens are two different stories.
.guru and a few others had some brutal registration attempts early on, along with trademark names that will drop. Auto-renewal may save them a bit, via ill-informed renewals.
Rubens Kuhl says
The other good thing about Donuts’ prediction is that they are not a publicly listed company, so they can’t be accused of saying it just to inflate company stock. I think their prediction will be very close to what happens; the only factor that may lower it a bit is uninformed domainers who registered domains for traffic value dropping them after doing the “duh” math.
I actually think 80% is pretty conservative.
Other than .xyz, which clearly has inflated first-year registrations because of the Network Solutions giveaways, I think many of the people who decided to take a chance on nTLDs will want to hold them for at least one renewal period, just to see if the market grows.
For a domain that cost $10, paying another $10 to see if nTLDs take off is not a bad investment. For the premium domains, however, it’s a harder choice. On one hand, if the renewal fees are much larger than the domains are worth, dropping them would be the smart move; on the other hand, there’s the pull of sunk costs, so some registrants will renew just because they feel they are so heavily invested already.
As for me, I plan to renew virtually all of my new TLD domains. Some of the .xyz domains may not be worth keeping, but there’s this part of me that thinks they may actually have a shot.
People who got in too deep with no plans will be forced to let domains go. Those who planned knew it would take a number of years for these to gain traction (if they ever do). I only bought 7 or so and already sold one that covered my full investment, so I guess I’ll renew. But honestly, I don’t have much faith in these new gTLDs and have since stopped looking up new releases to see what’s available or caring what is coming out when. It was all too much, too fast, and I really don’t know how many people could afford to invest in the new gTLDs while also keeping up with buying new .coms and paying renewals. You can dump tens of thousands of dollars with no possible return. Too risky.
“…I don’t have much faith in these new gTLDs and have since stopped looking up new releases to see what’s available or caring what is coming out when.”
I’ve been trying to ignore the new release list too. It’s like domain crack. I made a few exceptions, mainly because I saw some with a hint of potential.
“So the reason for domainer interest is the other component of domain name value: the semantic content of the name itself. It’s meaning these domainers are buying, not traffic.
The domainers bought the names because they believe they’ll be able to re-sell the names for a higher value later”
Except that a) it’s extremely difficult to sell domains outbound, and b) with almost no exceptions (perhaps .net and .org), the only names end users seek out and try to buy are .com names. “Hey, who owns propertyconsultant.com? I want that name.”
In other words, if you own a name in .com, you frequently get end users writing to you to buy it. For the same domain in another TLD, that activity drops to near zero. I’m not saying it never happens, but it’s extremely rare.
Likewise you can own “propertyconsulting.guru” but nobody is going to say “hey let me write to the owner of propertyconsulting.guru” and try to buy it from them.
And while it is possible for someone to search a sales site and find that name, the chance of enough people doing so to make decent money is quite small.
“Registrants think twice about deleting names that cost them $20 and think three times if they cost $200, but don’t think twice about deleting a name they got for $1 or free.”
That’s also because anything you pay a large amount for, you think about more than something you pay a small amount for. It’s a bigger decision.
Someone who pays $10,000 for a domain has likely decided it has great value. That has less to do with sunk costs than with the amount of time spent contemplating the purchase.
“I’m raising my hand, even though I know it’s completely irrational.”
It’s not irrational at all. If I tell you you can buy dnwire.com (hypothetical; I don’t own it) and you pay $500 for it, you’ve already decided it has value to you. So of course you are going to value it highly when you renew.
How about this “show of hands” to prove my point.
Let’s say I own dnwire.com and it’s worth $500. And I give it to you. Does the fact that you didn’t pay me for the name and I gave it to you lessen the value in your mind?
Let’s say dnwire.com is deleted and you are tracking it and you snap it up for $69. Does that mean you will sell it for $100? No because you value it at more than $69. And so on.
Joseph Peterson says
I’d applaud Mr. Stahura for acknowledging that domainers “act as an unpaid sales force by selling domain names”. Good.
During the past year, while courting mainstream opinion, some registries went out of their way to distance themselves from domain resellers — sometimes in contemptuous fashion, saying that they had priced their products higher in order to exclude parasitic resellers.
Months ago, I had to go out of my way to emphasize that “registries could consider domainers as an outsourced sales department. In fact, domainers are generally salesmen who pay the registries up-front in order to publicize their products for sale.” So it’s heartening to see a registry giving domainers credit rather than simultaneously marketing to them and repudiating them.
Donuts makes a detailed and partially persuasive case for high renewal rates. Aspects of that argument, however, fall apart on closer inspection.
For instance: “[T]he purchase motivation for any [nTLD] buyer is different than those purchasing in .COM”. How so? Mr. Stahura says, “[R]egistrations … are based on the semantic value of the underlying terms, and not merely the traffic the name might generate”.
There’s no such distinction.
Very few newly registered .COM domains will yield high parking revenue, and that isn’t why domainers are buying them. Whether I register HerbalBeer.com or Herbal.beer, SnowboarderClub.com or Snowboarder.club, my rationale is exactly the same. In my view, the idea that nTLD domains and .COMs are registered for different reasons is sheer nonsense.
The chief differences in the renewal decision boil down to these factors: (1) nTLDs are more expensive than matching .COMs; (2) nTLDs currently sell less frequently than .COMs; and (3) nTLDs currently sell for less than .COMs. In other words, lower expected returns and higher carrying costs. So if domainers are rational, they will drop nTLD domains preferentially — assuming the choice is between 2 domains of equal semantic quality.
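Those three factors reduce to a simple expected-value comparison. Here is a toy sketch of that rational renewal rule (all function names and figures are hypothetical, invented for illustration, not taken from the article):

```python
def should_renew(renewal_fee, sale_price, sale_probability):
    """Rational renewal rule: renew only if the expected future return
    exceeds the future carrying cost. Note that sunk costs (whatever the
    domain originally cost) appear nowhere in this calculation."""
    return sale_price * sale_probability > renewal_fee

# Purely illustrative numbers:
# a .com renewing at $10/yr with a 1% chance of a $2,000 sale
print(should_renew(10, 2000, 0.01))    # True  (expected $20 > $10)

# an nTLD renewing at $30/yr with a 0.5% chance of the same sale
print(should_renew(30, 2000, 0.005))   # False (expected $10 < $30)
```

Under equal semantic quality, the nTLD loses on both inputs at once: the fee is higher and the sale probability lower.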
But I agree that people are irrational and will think about “sunk” costs. A lot of teetering-on-the-brink drops will be renewed for a year or two before the registrants are comfortable cutting their losses. So 2015 renewal rates will benefit from that deferred decision to a large extent. That’s equally true of first-year .COM renewal decisions. And that rate is 52%.
It’s true that the earliest nTLD registrations will be of higher quality overall. And that should lead to a higher renewal rate. But they also tend to cost much more, which will lead to the opposite.
Also, the name spaces for these keyword nTLDs offer far fewer good domain options than, say, .CO, one TLD Mr. Stahura uses for comparison.
This is new stuff, really. So we’ll see what happens.
Frank Schilling says
80% may be high but it will be over 65% .. if you got LLL.NTLD or WORD.NTLD you are going to keep it.. The first 100,000 registrations in most strings will be big generic strings. You’re not letting those go if you’ve got low fixed price renewals.
Obviously a few of the earlier gTLDs slipped through the cracks; now every gTLD registry has caught on.
Today world.host and green.host were released at $6,000 to $12,000 per year… Let’s be realistic…
Artificial hoarding of unlimited supply with a few parties trying to secure all the demand.
Even in .property and .diet, every keyword abbreviation was secured beforehand. All that was left was .click leftovers, which domainers paid $7 each for and will try to market and sell before they eventually drop them.
You have to read between the lines, people. Trust your instincts; too many insiders have too much money backing these releases to tell you the truth.
You have gtlds that cost more than a new car per year, for a computer generated link.
I would think that also varies with how many of them someone initially buys.
If you buy 1 LLL.nTLD, you would be more inclined to keep it (just the one) than if you bought 100 LLL.nTLDs.
As such, 1,000 people owning 1 name each is better for renewal rates than 10 people owning 100 names each.
Of course for names such as lll or word they are obviously more likely to get purchased if the original registrant drops them.
Not that anyone would do this, of course, but a way for a registry to game this would be, prior to the expiration date, to have a shill buyer buy its own names and make that public info. That would make people think the names have value, and they would (if paying attention) be more likely to think they could hit the jackpot.
That said the value of a lll.ntld is most certainly going to be under 10k.
Joseph Peterson says
100,000 is a severe overestimate.
Even the OED only contains 171,476 English words in current use; and most of those will be meaningless or undesirable when paired with a second word as TLD — e.g. Incomprehensible.tips, Avuncular.hosting, or Cogitate.guru.
Most nTLDs have fewer than 100,000 domains registered; and it’s already demonstrably true that many (if not most) of these domains are NOT “big generic strings”.
That’s easily refuted by a glance at any list of registered nTLD domains.
“The first 100,000 registrations in most strings will be big generic strings”. Really?
.GURU has a little over 77,000 registered domains. Compared to very limited niche extensions like .BIKE, .TATTOO, .BEER, or .BLACKFRIDAY, most of us would consider .GURU to be fairly versatile — regardless of quality, simply in terms of the number of viable word combinations. After all, one can be the guru for any noun. But you can’t add just any word to .BIKE or .BEER and expect it to make sense.
So we should see 77,000 “big generic strings” in .GURU at the moment, if your statement is correct. With 23,000 more to go. And what are people buying? According to Registered.today, they’re buying stuff like this:
Are those the “big generic strings” you were referring to? “You’re not letting those go”!?!?
Frank Schilling says
Dude.. every number .. every 4 letter .. foreign words .. compound phrases .. span the dot names.. .. 100k is not a lot .. Maybe in some narrow strings it will be 10k but on generics it will be 1mil .. in the mix 100k average? When I started with no names I entered Rick Schwartz’, (then open) forum and recall him saying all the good names were gone and I said something like: “Tell the guy making $5 an hour at a fast food restaurant that all the good ones are gone when they can buy a good name for $100 and flip it for $500 (one week’s paycheck) ” .. some version of that applies here. 100k good ones on average.. maybe not good enough for your liking but good all the same.
Joseph Peterson says
Every number? Well, there are an infinite number of numbers. 3 digit numbers? OK, add 1,000 domains. 4 digits? Add 10,000. Numerical domains past 4 digits are seldom registered outside .COM as things are.
Foreign words? Not in English nTLDs. Rarely makes sense.
What do you mean by spanning the dot? Hacks like Retra.in or Cheque.red? Or phrases with the dot between words, which is what I’m mainly talking about? As for hacks, I have a complete list. For nTLDs, there are just 14,000 domain hacks, and most of them are bad.
Frank, anybody can throw out a large number like 100,000 domains without producing any actual list of those domains. You can say, “Oh, they’re all too good for someone to drop them.” But unless we can see the list of 100,000 or 1 million domains in a given nTLD that are too good to drop, then how can we judge?
Every nTLD will have exhausted viable single-word domains in English before the 100,000 point — often before the 10,000 or even 1,000 point. The Oxford English Dictionary sets an upper limit of less than 200,000. To get there, we’d have to see EVERY English word, however obscure, registered in these new TLDs; and that’s patently absurd.
If we include all numerical domains of 1 through 4 digits (counting leading zeros), that adds 11,110 domains. But who really expects 5931.photography and 9782.guru and 0250.host to be registered?
I should include the approximately 20,000 cities / metropolitan areas in the USA.
So we end up with 3-word domains (counting the TLD as the third word). Those have had a hard time selling in .COM on the whole; so most prudent observers would anticipate relatively few of those being worth the renewal fee for domain investors. Some are worth investing in; in fact, I bought some from Uniregistry today. But the quantity of 3-word domains worth buying outside of .COM will always be constrained by demand-per-supply dilution.
Neologisms and coined phrases typically don’t sell in the aftermarket outside .COM, .ORG, and ccTLDs. With so many new TLDs, why would someone coming up with the brand name “Habitrol” tomorrow pay extra for Habitrol.guru if they could just as easily register Habitrol.click for less than $20? The domainer inside the Habitrol.guru ticket booth would just see his customer step around him!
So where are these massive numbers coming from, Frank?
Every word in English (171,476)
+ Every numerical string 1 to 4 digits in length (11,110)
+ Every 3-letter non-word string (16,792)
+ Every city / metropolitan area in the USA (20,000)
= 219,378 domains.
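For what it’s worth, that tally checks out arithmetically. A trivial sanity check (the 11,110 figure corresponds to numeric strings of one to four digits, leading zeros included):

```python
# Components of the upper-bound estimate above
english_words   = 171_476  # OED words in current use
numeric_strings = 11_110   # 10 + 100 + 1,000 + 10,000
lll_non_words   = 16_792   # 26**3 = 17,576 minus the 3-letter words
us_cities       = 20_000   # approximate US cities / metro areas

total = english_words + numeric_strings + lll_non_words + us_cities
print(total)  # 219378
```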
That is the most generous estimate possible. Anything beyond that would be one of the following:
(1) nTLD domains of 3+ words
(2) Neologisms that ignored .COM
(3) Non-.COM acronyms of 4+ letters
(4) Numerical strings of 5+ digits
(5) Foreign words + English TLDs
That 220,000 number is a good rule of thumb for an upper limit on short, intuitive domain registrations. And it is VERY generous, since mostly it consists of stuff like this:
Many nTLDs have a core of less than 50,000 or 5,000 or even 500 desirable names.
Yes, some nTLDs will surpass 200,000 domains. But it’s very difficult to say with a straight face that those will be simply too valuable to drop.
Frank, I completely agree with you that there are still opportunities to find and flip good domains. Any day of the week, I can still find .COMs worth buying at $10. And I’m buying your company’s nTLD domains too.
But the number of “big generic strings” that are too good to drop in any given nTLD is much smaller than that off-the-cuff estimate of 100,000.
Frank Schilling says
“So where are these massive numbers coming from, Frank?”
“Every word in ‘English’ (171,476) ”
… the world is a lot bigger than English .. look through the NameJet “most bid” list and you’ll see Chinese phonetic translations such as youmai.com and dengshi.com .. (they mean nothing to us, but many of those have 30 or 40 bidders) there are tens of thousands of foreign-language top terms that make sense .. but let’s ignore all of them and go with 33% of the dictionary making sense per string on average, including plurals of those dictionary terms, so that = 60k names there
“+ Every numerical string 1 to 4 digits in length (11,110)”
…half of these = 5k
“+ Every 3-letter non-word string (16,792)”
…most of these = 13k
“+ Every city / metropolitan area in the USA (20,000)”
… cut these in half and double it again to include foreign cities in the same language = 20k
There .. You’re at 98k premium names.. Now add thousands of compound phrases which make sense in each string: fastcash.finance dripdrip.plumbing rumbleseat.cars and “cute” names and “sayings” and so on.. you’re well over 100k premium generics per string with value to more than one person.
I will be the first to concede that “some” new gTLDs have a core of fewer than 50,000 desirable names, but on average across all G’s, the number will be at least 100k. In the future, there are going to be more registrants and more domainers. Your perception of what constitutes a “premium name” is going to change as more people arrive, much in the same way that the worthless swamp land of 1930s Florida is today’s million-dollar-an-acre suburb in Boca.
Joseph Peterson says
Just in case anybody misconstrues this, the two of us are not in some sort of boxing match. Both of us are just scribbling crude best guesses about numbers on an online napkin. Neither of us will be exactly right. Personally, I just enjoy the discussion.
Point by point:
Yes, I agree the world is bigger than English. Close to 30% of the domains I’ve personally owned at one time or another are non-English, and I report on non-English domain sales every week here at DomainNameWire.com.
But it doesn’t follow that an English keyword TLD like .PHOTOGRAPHY or .WEBSITE will automatically be valuable when paired with German or Spanish or Chinese. As you say, Frank, the world is bigger than English. Most of these new TLDs — being English words — are too restrictive to represent that world. We find mostly non-English, often meaningless TLDs being internationally recognized and sought after. Your example of NameJet selling Chinese Pinyin domains such as YouMai.com proves that .COM is global, but it proves no such thing for .PLUMBING or .TATTOO.
At first glance, “33% of the dictionary that makes sense per string on average” seems like a conservative number. But let’s look at a random sample of that dictionary. I’ll start at the 1127th word in alphabetical order (since today is 11/27), and I’ll pick every third word:
I haven’t cherry-picked obscure words here. That’s what the dictionary actually looks like.
So what you’re saying, Frank, if I understand correctly, is that 1/3 of words in the English dictionary (which is 100% of the list above) can be paired with pretty much EVERY new TLD and make sense. Some nTLDs will make use of a smaller list than the words above, but other nTLDs can embrace even more words — for example, between “Adhesive” and “Adhesives” we can add “Adhesively” and “Adhesiveness”. The point is that “on average” the whole list above should “make sense per string”.
Which nTLD string can be added to all of those? Even in something quite versatile like .GURU or .CLICK, most of us would be hard pressed to identify 1 word out of those 20 to register. And that’s 1/20 of 33%, which is just 1.65% of the dictionary. For nTLD strings that are subject-specific like .PHOTOGRAPHY or .HOSTING, the word hoard may be substantially smaller than even that 1.65%.
Half of every numerical string up to 4 digits in length? On average, for each new TLD? That’s very ambitious!
Here is the odd-numbered half:
1127.guru + 1127.hosting + 1127.wiki + 1127.plumbing + etc.
1129.guru + 1129.hosting + 1129.wiki + 1129.plumbing + etc.
1131.guru + 1131.hosting + 1131.wiki + 1131.plumbing + etc.
1133.guru + 1133.hosting + 1133.wiki + 1133.plumbing + etc.
1135.guru + 1135.hosting + 1135.wiki + 1135.plumbing + etc.
1137.guru + 1137.hosting + 1137.wiki + 1137.plumbing + etc.
4-digit numbers make up about 90% of those you were counting, since there are 10 times as many 4-digit numbers as 3-digit numbers.
77% of the 3-letter acronyms will be registered across each new TLD on average? You realize that’s hundreds of nTLDs we’re talking about, right?
Starting at position 1127 (for November 27) alphabetically, this is what we’re looking at:
BRJ, BRK, BRL, BRM, BRN, BRO, BRP, BRQ, BRS, BRT, BRU, BRV, BRW, BRX, BRY, BRZ, BSA, etc.
So 77% of those will be registered in .GURU … and in .HOSTING … and in .HOST … and in .WIKI … and in .PLUMBING … and in .PICS … and in .TATTOO … and in .SEXY. That’s on average, to be fair. Some will have fewer than 77%.
But think about what is required for the average LLL registration numbers across nTLDs to be 77%. For every single TLD that sees only 20% of these 3-letter acronyms registered, 3 other nTLDs must have 100% LLL registration for the average amongst those 4 to be 80%.
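The averaging arithmetic behind that example is easy to check. A minimal illustration (the per-TLD rates are invented for the example):

```python
# One nTLD at 20% LLL registration forces the other three in a
# group of four to hit 100% just to bring the group average to 80%.
rates = [0.20, 1.00, 1.00, 1.00]
average = sum(rates) / len(rates)
print(average)  # 0.8
```

Since no TLD can exceed 100%, every laggard drags the achievable average down hard, which is why a 77% average across hundreds of nTLDs is such a strong claim.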
I do basically agree with your math on the GEO city strings. They’re plentiful and could be valuable in many TLDs. Your own company’s .PROPERTY for example.
I still don’t see “100k good ones on average” for each new TLD. Why not? Because I’m not prepared to consider the following domains to be any good:
(1) Language mismatches
(2) 33% of randomly selected dictionary words
(3) Meaningless 4-digit numbers registered with a shotgun
(4) The same 3-letter acronyms registered in hundreds of new TLDs
Without those domains — or others basically like them — I’m afraid your count won’t add up as intended, Frank.
Not all worthless swamp land is predestined to become Boca Raton. The globe is finite; name spaces can be enlarged ad infinitum.
NCR domains says
“I’ll start at the 1127th word in alphabetical order (since today is 11/27), and I’ll pick every third word”
No, that’s not the way to do it. To prove your point, you should list the 60 consecutive words starting at 1127, then pick the good ones and see whether they amount to one third, not pick every third word.
Not to disagree with the essence of what you are saying.
Picking every third word out of 60 vs 20 consecutive words is just another way to say he’s taking 20 random words, so the essence is the same.
That said, I think picking every third word rather than every word is actually a better approach because of the tendency for similar words to sit next to each other. Jumping to the 1127th word ensures no cherry-picking, and skipping a preset number of words eliminates a skewed result from multiple words that share the same root.
Another alternative would be picking the 1127th word under each letter. The main idea is that there aren’t as many dictionary words that would make great domains as some would think.
NCR domains says
Yes, that is a better way of randomly picking words. Now you have 20 samples, and the question is whether 33% of them (about 7 words) make sense in any extension.
Picking 20 words randomly out of 60 is not the same as picking the 20 best ones out of 60 random words. But I just checked that part of the dictionary; out of 100 words, there must be a couple at most that are worth more than the registration fee. Frank’s 33% estimate is way too high; I would say a couple of percent at most.
There are many more words in the dictionary than words we commonly use, which number around 10,000 or 20,000. While 33% of names might make sense with a random gTLD out of the 20,000 words in common use, there surely aren’t 33% out of the whole dictionary, since most of those we never use and don’t even know the meaning of.
Here are 50 consecutive words from the dictionary; if you want to register 33% of those, be my guest:
But that’s okay; you can easily recoup the $185,000 application fee by selling at premium prices those thousands of names you reserved.
I’d say that in most gTLDs, only a few thousand names at most match nicely with what’s to the right of the dot. If you pick an average or poor match, you will probably never sell it at a high price, since someone can just pick the same SLD for reg fee in another TLD for an equally average or poor match. To prevent this you would have to register your SLD in all the hundreds of new gTLDs, but then you might as well pay the price for a great match in one TLD.
Joseph Peterson says
That’s a fair criticism. Very lazy and inaccurate of me to write that. Taking 100% of every 3rd word is not the same as taking the best 33% of all words.
My thought was that actually using a random number generator to pick words from throughout the alphabet — although random — would look to most people as if I were cherry picking obscure words. So I went with a recognizable, arbitrary pattern that most people would accept as random and unbiased, picking every third word.
But my main point stands.
With a sample covering 1/3 of the interval in question, whatever percentage of words within that sample looks high quality can be extrapolated to the whole list. So if 1 word out of those 20 is of adequate quality, then something like 5% of the dictionary could reasonably be presumed to be of adequate quality.
Yes, a true random sample would be much more accurate. Perhaps by chance I picked a bad patch of the alphabet. But my goal isn’t to establish a precise number — just to indicate the lower magnitude of that number.
Joseph Peterson says
Here are some useful numbers. According to Wikipedia:
“A 1995 study shows that junior-high students would be able to recognize the meanings of about 10,000–12,000 words, whereas for college students this number grows up to about 12,000–17,000 and for elderly adults up to about 17,000 or more.”
The number of distinct words that Shakespeare wrote more than once in all his works combined is just 17,158. He used 31,534 words altogether but many of those just once each.
But it’s also true that “[K]nowledge of 5000 word families is necessary for 99.9% word coverage.” We know a lot of words we rarely use.
Most of these 12,000–17,000 words will not make sense or be desirable when combined with a given nTLD term.
So a reasonable estimate for the number of single English words that would be desirable in a versatile nTLD like .CLICK or .LINK would be a small fraction of the 12,000-17,000 total words we know. Maybe 5,000?
.GURU and .CLUB would have even lower totals because (in general) we need nouns or adjectives as opposed to verbs, adverbs, etc. Throw out common words like “adjectives”, “opposed”, “verbs”, “throw”, “common”, etc. They don’t fit. That leaves — what — maybe 3,000?
And subject-matter-specific nTLDs like .PLUMBING or .FISH or .ENGINEERING make use of a vocabulary that is still more limited. Say 1,000 or so — tops.
“Tell the guy making $5 an hour at a fast food restaurant that all the good ones are gone when they can buy a good name for $100 and flip it for $500 (one week’s paycheck)”
If it were so easy to flip for $500 and buy for $100 my guess is that you would have an entire call center of people doing just this instead of or in addition to operating a registry. Of course maybe you are doing this.
I’ve been in this business since the mid-’90s and have made money, and I deal with all ends of the market, from owning to buying and so on. While anything is possible, buying for $100 and selling for $500 on any kind of consistent basis is not something someone who is flipping burgers is going to be able to do, so let’s be realistic.
Now of course if you want to argue that you could buy 1,000 names at $100 and statistically sell enough of them (at more than $500, of course) to make money, I would possibly agree with that. But the burger-flipper isn’t going to do that. Besides, you’d still have to have a thesis on which domains to buy. And even the LLL domains that I own are 95% unsold after almost 17 years. Luckily, those were names purchased for under $100.
Domainer Extraordinaire says
“Dude.. every number .. every 4 letter .. foreign words .. compound phrases .. span the dot names.. .. 100k is not a lot .. Maybe in some narrow strings it will be 10k but on generics it will be 1mil .. in the mix 100k average? When I started with no names I entered Rick Schwartz’, (then open) forum and recall him saying all the good names were gone and I said something like: “Tell the guy making $5 an hour at a fast food restaurant that all the good ones are gone when they can buy a good name for $100 and flip it for $500 (one week’s paycheck) ” .. some version of that applies here. 100k good ones on average.. maybe not good enough for your liking but good all the same.”
You’re counting on an infinite number of foolish domainers. Those wannabe domainers have small bank accounts. They are already realizing that finding buyers for worthless domain names is very hard.