Domain pulse 2017

Two weeks ago I had my ten minutes of fame in Vienna, presenting a few facts & figures regarding new gTLDs. The presentation (video and slides) is included in this post, but unfortunately it is in German. Because of that, I will give you a quick summary of what I said:

 

  1. New gTLD registration numbers are constantly going up (more or less, with peaks smoothed out). (Page 2)
  2. We have had four noteworthy peaks since 2014, and .xyz is (the biggest part of) the reason in all four of them. It’s either “Oh look, Google is now Alphabet and registered a .xyz domain name!” or “Hey, wanna buy a .xyz domain for more or less free?”. While there are seemingly useful domain names being registered throughout those peaks, the majority consists of names like “tnnby”, “wyj198” or
    appel-ic1ocud
    appel-iclobnd
    appel-icloccd
    appel-iclocdd
    appel-iclockd
    appel-iclocrd
    appel-iclomud
    appel-iclonqd
    appel-iclonsd
    appel-icloppd
    appel-iclouisd
    appel-iclouncd
    appel-iclouod
    appel-iclouwd
    appel-iclwcud

    or simply “aqqkfqaqqq”. (Pages 2-7)

  3. The Western world is way more conservative about spending money on new gTLDs than the Eastern world. There is little to no difference between the US and Europe in the way investments are made, although the US feels more vocal about it (through news outlets, blogs, etc.). Note: This refers to the way of investing, not the amount of money spent. (Pages 3-7)
  4. Anonymous registrations are rising because Registrars have begun enabling anonymous registration of domain names by default. Most anonymous registrations are attributable to Asia. (Page 7)
  5. Local registration times reflect working hours (both time span and volume) in each region. (Pages 8-11)
  6. Two pie charts display the distribution of new gTLD types by number of TLDs and by number of domains. (Page 12)

You can find the video of the presentation (again, in German) here and the presentation itself here. Thanks for reading!

 

P.S.: Make sure to also check out the presentation (video in English, plus the slides) by Patrick Myles, Data Analyst at CENTR, which contains a bunch of interesting data as well!

Ductus Exemplo

That one different gummy bear

Sometimes when you buy a pack of sweets, some of them are damaged or look weird. One time, I even had them mixed with a different type of sweets. This may happen to you once a year or less, but the people working in the factory surely see it happen more often.

This holds for everybody: doctors see more injuries, bartenders hear more weird stories, and we see more inaccurate WHOIS entries and compliance violations than most other people.

At some point, it became so frequent that I got very interested in how this works in detail. ICANN provides binding (hah!) agreements between Registrars and Registries. Let’s take postal codes as an example.

ICANN says (through the RAA): Postal codes need to be in the format specified in RFC 5733 and in accordance with the UPU address format for the country in question (in this example: China).
IETF says (through RFC 5733): Postal codes are represented as character strings with a defined minimum and maximum length.
UPU says (through the international addressing sheet for China): 6 digits.

You will probably agree that we have established that the postal code for a registrant from China needs to be six digits. Not two, not eight, no letters, no symbols and so on, right?
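
Just to make that rule tangible, here is a minimal sketch of such a check in Python (my own illustration, not code from ICANN, the UPU or our production systems; it assumes the registrant’s postal code has already been pulled out of WHOIS as a string):

```python
import re

# Per the UPU addressing sheet for China, a valid postal code is exactly
# six digits. Anything shorter, longer, or containing letters/symbols fails.
CN_POSTAL_CODE = re.compile(r"^[0-9]{6}$")

def is_valid_cn_postal_code(postal_code: str) -> bool:
    """Return True if the WHOIS postal code field satisfies the CN format."""
    return bool(CN_POSTAL_CODE.match(postal_code.strip()))

if __name__ == "__main__":
    for sample in ["100000", "10000", "ABC123", "100 000"]:
        print(sample, is_valid_cn_postal_code(sample))
```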

Well, at the time of writing this post, there are 11,100,156 new gTLD domains registered to registrants from China.

905,693 of them are in violation of the UPU address format, RFC 5733 and thus ICANN’s RAA. That’s 8.1%. And that’s postal codes in China only. Worldwide, we also have

  • 2,211 domains with invalid ISO 3166-1 country codes
  • 205,952 telephone numbers with non-existent country codes
  • 263,906 domains that don’t even have a telephone number entered at all
  • about 50 email addresses that would fail even the simplest email syntax check (containing spaces, missing @, missing host name, etc.)
  • 941 missing email addresses
  • 137,138 disposable email addresses

And we’re far from being done with our checks yet.
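
By the way, the “simplest email syntax check” mentioned above really is not much more than this minimal Python sketch (an illustration of the idea, not our actual validation code):

```python
import re

# Deliberately lenient check: it only rejects obviously broken entries
# (spaces, missing @, missing host name). It is NOT a full RFC 5322 validator.
SIMPLE_EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def passes_simple_email_check(address: str) -> bool:
    return bool(SIMPLE_EMAIL.match(address))

if __name__ == "__main__":
    samples = ["registrant@example.com", "no at sign", "broken@", "a b@example.com", ""]
    for sample in samples:
        print(repr(sample), passes_simple_email_check(sample))
```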

Knowing this makes ICANN’s specifications look like a joke to me. To put it into perspective, here is my favourite piece from ICANN regarding this:

  1. Validate the presence of data for all fields required under Subsection 3.3.1 of the Agreement in a proper format for the applicable country or territory.
  2. Validate that all email addresses are in the proper format according to RFC 5322 (or its successors).
  3. Validate that telephone numbers are in the proper format according to the ITU-T E.164 notation for international telephone numbers (or its equivalents or successors).
  4. Validate that postal addresses are in a proper format for the applicable country or territory as defined in UPU Postal addressing format templates, the S42 address templates (as they may be updated) or other standard formats.
  5. Validate that all postal address fields are consistent across fields (for example: street exists in city, city exists in state/province, city matches postal code) where such information is technically and commercially feasible for the applicable country or territory.
  6. Verify:
    1. the email address of the Registered Name Holder (and, if different, the Account Holder) by sending an email requiring an affirmative response through a tool-based authentication method such as providing a unique code that must be returned in a manner designated by the Registrar, or
    2. the telephone number of the Registered Name Holder (and, if different, the Account Holder) by either (A) calling or sending an SMS to the Registered Name Holder’s telephone number providing a unique code that must be returned in a manner designated by the Registrar, or (B) calling the Registered Name Holder’s telephone number and requiring the Registered Name Holder to provide a unique code that was sent to the Registered Name Holder via web, email or postal mail.
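
Just so the “unique code” verification in item 6 doesn’t sound abstract: it boils down to something like this hypothetical sketch (my own illustration; the function names are made up, and email/SMS delivery and storage are left out entirely):

```python
import secrets

# Hypothetical sketch of the "unique code" flow quoted above: the Registrar
# sends a one-time code (by email, SMS or call) and the Registered Name
# Holder has to return it in the manner designated by the Registrar.
def issue_verification_code() -> str:
    return secrets.token_urlsafe(8)  # unique, hard-to-guess code

def verify_returned_code(issued: str, returned: str) -> bool:
    return secrets.compare_digest(issued, returned.strip())

if __name__ == "__main__":
    code = issue_verification_code()
    print("send this code to the registrant:", code)
    print("verified:", verify_returned_code(code, code))
```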

And those inaccurate records belong to valid and active domains. A quick check revealed that only 1% of those domains are in clientHold or a similar status, which would allow such false entries to exist. But even then you could question the acceptance of such practices, since even a domain on clientHold is taken and can’t be registered by someone else for the time being.

Now, this post isn’t supposed to be just a rant. We want to lead by example and act within our capabilities to make the whole new gTLD domain space a better place (I know it sounds cheesy, but whatever). So we approached ICANN with the goal of finding a way to provide them with the most up-to-date inaccuracy data. Basically, we want to enable ICANN to fast-forward the process of identifying “faulty” domains and the Registrars/Registries who just don’t care and are thus “poisoning” (yep, strong word) the whole thing.

Believe me when I tell you that, while working with every type of client from the domain industry, I have realised that if everyone did their job with just a little more effort, this whole thing would not only be way easier for you and me, but also way more fun. Instead of reacting to a bazillion errors and fixing problems that are thrown in our faces because some Registry is changing creationDate entries (suddenly dated back, missing, etc.), thus making our statistics worthless, we could provide you with so many more interesting and amazing statistics that you’d neglect your own wife because you “just wanna browse nTLDStats a bit” instead of coming to bed.

That being said – good night!

 

The nTLDStats way

If you look at other websites providing statistics for new gTLDs, you’ll find similar numbers all along. Of course you’d think that those statistics must be right, since they basically all look the same. And then there’s nTLDStats, the website that always has different numbers. Sometimes not so different from the other websites, but then there are also times when there is a big drop on other statistics websites while there is none on nTLDStats. I remember when .xyz had that drop after their first year. Everyone else displayed that drop immediately as the domains were removed from the zone file. We didn’t. It’s “German Gründlichkeit”. We care about very accurate statistics, which is why we take the domain life cycle into account. And that’s only one of the little things we take into consideration while providing statistics.

Sure, most people only want to see “how the TLD is doing”. For them it doesn’t matter whether a TLD with over 2 million domains has 2,000 domains more on nTLDStats than on other statistics websites. But then again, if we didn’t provide such accurate data, no one would. And we have a handful of customers who actually care about that seemingly negligible difference.

Key-Systems and Google Registry

Let me give you another example: transfers. We track domain transfers. No one else does. And the more I think about it, the more I am convinced that it’s something everyone should do, because it is important. Let’s use Key-Systems as an example. They registered about 179,000 new gTLD domains. Additionally, roughly 6,000 have been transferred to Key-Systems. Of those 185,000, about 15,000 have been transferred away. That’s 8%, and in new gTLDs, that’s quite something. Where did they go? Say “hello” to Google (sorry, I’ll never get used to Alphabet)! 53% of Google’s current domains are transfer-ins: 50% from Key-Systems and 3% from other Registrars, while the remaining 47% are actually registered through Google directly. And the transfers are still happening: we see domains moving from Key-Systems to Google in March, April and May.

 

And the last example: parking! Not that important anymore, but back in 2014, people were actually interested in whether new gTLD domains were being used or not.

These stats (and others) are only available because we’re very thorough when it comes to providing statistics. And the best part is that you can access that data, too.

 

Here’s how:

API

Our API is free and currently in beta. It provides most of the things you can see on our website. The only requirement is to create an account, with which you can then request an API token (both free of charge). It’s already used by many people and companies and you can become one of them.
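
To give you an idea of what using it looks like, here is a hypothetical Python sketch. The endpoint path and the Bearer-token header are assumptions of mine, so please check the API documentation for the real routes and authentication once you have your token:

```python
import json
import urllib.request

# Hypothetical example of calling the nTLDStats API. The endpoint path and
# the Bearer auth header below are placeholders based on my own assumptions,
# not documented API details; use the endpoints from the API docs instead.
API_TOKEN = "your-api-token"
API_URL = "https://ntldstats.com/api/<endpoint>"  # placeholder endpoint

def fetch(url: str, token: str) -> dict:
    """Perform an authenticated GET request and decode the JSON response."""
    request = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    print(fetch(API_URL, API_TOKEN))
```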

 

Customized data reports

Our website already provides you with a lot of insight. Sometimes, that’s not enough, though. For clients who need to be well-informed and up-to-date, we provide customized reports. Those reports are created out of several so-called “data streams”. Here’s an example:

Data Stream 1: new gTLD registration numbers
Data Stream 2: Registrar

With those two streams, we can provide you with a report telling you how many domains of one (or more) TLD(s) a registrar actually registered. You could also use just a single Data Stream, e.g. Registrar, to get the registration (and/or drop/transfer/renew, etc.) statistics for one (or more) registrar(s). The more Data Streams are added to one report, the more complex the calculations become. But at the same time, you’ll get a lot more insight.
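
To give you a feel for what combining Data Streams means, here is a purely conceptual sketch (an illustration of the idea with made-up registrar names, not our actual reporting pipeline):

```python
from collections import Counter
from typing import Iterable, Tuple

# Conceptual sketch only: two Data Streams (registrations and registrar)
# combined into a per-registrar, per-TLD registration count.
def combine_streams(registrations: Iterable[Tuple[str, str]]) -> Counter:
    """registrations: (registrar, tld) pairs, one pair per registered domain."""
    return Counter(registrations)

if __name__ == "__main__":
    sample = [("Registrar A", "xyz"), ("Registrar A", "club"), ("Registrar B", "xyz")]
    for (registrar, tld), count in combine_streams(sample).items():
        print(f"{registrar} registered {count} .{tld} domain(s)")
```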

No matter how complex a report is, though, clients can issue a new one manually with the click of a button. Once requested, our system starts calculating and notifies the client when the report is done (usually within 5-40 minutes). The report is filled with the latest data, which is at most a few hours old before our systems update again. Depending on which Data Streams a report is made of, new data can be available after as little as an hour. And the best part is: you can generate a report whenever you want, as often as you wish. There are no limitations.

So if you need statistics, we really should be your only choice (says the COO of the company). Also, Twitter. (If you don’t get it: IMDB.)