The Top Policy Mistakes ICANN Made In The New gTLD Program
Now that everyone has had time to fully digest the big reveal of the new gTLD applicants and the strings applied for, it's time to look at the biggest issues ICANN has created for the domain name system.
We are not going to look at mistakes ICANN made in the implementation of the program (such as the TAS debacle), just the fundamental policy mistakes that, in our opinion, have the potential to cause a lot of problems for the program, for ICANN and ultimately for the applicants.
1. Allowing generic TLDs to be run as a closed registry.
ICANN could easily have avoided what has already become the biggest criticism of the program: allowing one company to own a generic term on a closed basis for its sole benefit and use.
Amazon has applied to operate over 75 new gTLDs, all on a closed basis, including .App, .Books, .Music, .News and .Shop, meaning that no one besides Amazon would be able to register any domains under any of the extensions it is awarded.
Basically, Amazon would own the entire space to the right of the dot for any of the extensions it wins.
Google has applied to operate some closed gTLDs for generic terms as well.
There is already an outcry from the media, bloggers and other groups.
It is not outside the realm of possibility that the DOJ and/or its European counterparts will take a look at this, possibly pushing back the whole program or at least tying up some of the best generic TLDs, for which there are many applicants.
ICANN could simply have prevented this situation by saying so in the Guidebook.
However, ICANN didn't, and now the rest of the world is waking up to the fact that one company can own the entire vertical to the right of the dot. They are not happy about it.
It's going to be a problem.
2. Creating rules and pricing for the Uniform Rapid Suspension System (URS) that apparently no provider wants to offer services under at the price set by ICANN.
ICANN set up the URS under pressure from trademark rights groups to make it a quick, easy and cheap way of taking down domains, with a target price of $300-$500 per complaint.
The problem is that none of the usual suspects who would handle such matters seem to have any interest in doing so at the prices ICANN suggested, so ICANN is now stuck with the unsettled mess of the URS and is considering revising rules it spent years developing.
It's going to be a problem.
3. Not limiting the number of applications by any one company.
It always seemed logical that ICANN would impose some sort of limit on how many strings one company, or related companies with some amount of cross ownership, could apply for, so as not to allow a set of companies to control too much of the right of the dot.
I don't know what the proper limit should have been; 10, 25, 50, 100 or 200. But clearly ICANN didn't set any limit, and over 20% of the 1,409 separate strings applied for could wind up being owned by just one company.
Once again, when any one company controls a substantial part of any market, it's likely to raise eyebrows.
4. Confusingly similar strings.
ICANN's policy on confusingly similar strings is, well, pretty confusing.
Here is what the Guidebook has to say about String Confusion:
“ICANN will not approve applications for proposed gTLD strings that are identical or that would result in user confusion, called contending strings”
“The String Similarity Panel will review the entire pool of applied-for strings to determine whether the strings proposed in any two or more applications are so similar that they would create a probability of user confusion if allowed to coexist in the DNS.”
When asked how string confusion is going to be determined, here is what Kurt Pritz, the head of the gTLD program, had to say at the ICANN meeting in Prague:
“So the standards are published the way they are. ”
“So quite extensive discussion occurred all about this issue and how to measure that, whether it should be measured mathematically or through some algorithm or some matching program.”
“And it was decided really that confusion is a human reaction. ”
“And confusion and the likelihood of confusion should be determined by, you know, reasonable average people that are looking at strings that are familiar with the script and language of that. ”
“And so the standards very brief. ”
“But in the Guidebook there has to be a likelihood that user confusion would result if both these TLDs would be submitted. ”
“And that’s the extent of the guidance given to the evaluators.”
Wow, not much help.
When I went to law school, they taught students to draft contracts in exact language, avoiding vagueness and terms open to many interpretations; otherwise there is a good chance your client will wind up in court arguing over the language of the contract if any issue comes up with the other party.
Setting totally subjective standards that can't even be put into words, and leaving it all in the hands of an examiner, is just asking for trouble.
Is .inc confusingly similar to .ink (both applied for)?
If you say them out loud they sound identical; however, they are both well-known terms with completely different meanings.
Is .Ngo confusingly similar to .Ng (ccTLD of Nigeria) or .No (ccTLD of Norway)?
According to Alexa Raad of Architelos.com, “although ICANN has an algorithm to compare two strings to see if there is more than 30% “similarity” it is still a judgement call, so it would have to be any string that is either visually or verbally similar enough to cause confusion for the internet user.”
I think ICANN should have gone with detailed standards and should have given the examiners much more guidance on what constitutes confusingly similar strings.
ICANN has an algorithm in place but hasn't set any guidance on how much weight the examiner should give that algorithm in making the decision.
It’s going to be a problem.
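The 30% "similarity" figure Raad mentions suggests an edit-distance-style comparison. As an illustration only (this is not ICANN's actual algorithm, which has not been published in detail), here is a minimal Python sketch of a Levenshtein-based visual similarity score. Note how it rates .inc and .ink as only about 67% similar even though they sound identical when spoken, which is exactly the gap such an algorithm leaves for a human examiner to fill:

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            # deletion, insertion, or substitution
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost))
        prev = curr
    return prev[-1]

def visual_similarity(a: str, b: str) -> float:
    """Edit distance normalized to a 0.0-1.0 similarity score."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

# "inc" vs "ink": one substitution out of three characters,
# so visually ~67% similar, yet phonetically identical
print(visual_similarity("inc", "ink"))   # ~0.667
# "ngo" vs "ng": one deletion, also ~67% similar
print(visual_similarity("ngo", "ng"))    # ~0.667
```

A purely visual metric like this says nothing about sound or meaning, which is why the .inc/.ink and .Ngo/.Ng questions can't be settled by a threshold alone.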
5. Not limiting the number of new gTLDs in the first round.
There are 22 gTLDs as we speak, so expanding the space by an unlimited number seems quite risky to Internet stability, which is part of ICANN's mission statement.
I suggested at the time that ICANN select a reasonable number, maybe doubling the space each year, so that in year one there would be 22 new gTLDs, then 44 in the second year if there were no issues, 88 in the third, and so on.
Maybe that wasn't the proper method, but it seems it would have been a good idea if there were some limit.
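To make the suggested schedule concrete, here is a small sketch of the arithmetic (the 22-gTLD base and the yearly doubling are the suggestion from this article, not anything ICANN adopted):

```python
def new_gtlds_allowed(year: int, base: int = 22) -> int:
    """New gTLDs admitted in a given year under a doubling cap: 22, 44, 88, ..."""
    return base * 2 ** (year - 1)

def total_gtlds(year: int, base: int = 22) -> int:
    """The 22 existing gTLDs plus everything admitted through that year."""
    return base + sum(new_gtlds_allowed(y, base) for y in range(1, year + 1))

print(new_gtlds_allowed(1))  # 22 new gTLDs in year one
print(new_gtlds_allowed(3))  # 88 new gTLDs in year three
print(total_gtlds(3))        # 22 + 22 + 44 + 88 = 176 total
```

Even a conservative schedule like this grows the namespace eightfold in three years, so the point is not the exact numbers but that growth would be staged and reversible if problems appeared.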