
RidcullyTheBrown writes "A story from the Sydney Morning Herald is reporting that ICANN is under pressure to introduce non-Latin characters into DNS names sooner rather than later. The effort is being spearheaded by nations in the Middle East and Asia. Currently there are only 37 characters usable in DNS entries, out of an estimated 50,000 that would be usable if ICANN changed naming restrictions. Given that some bind implementations still barf on an underscore, is this really premature?" From the article: "Plans to fast-track the introduction of non-English characters in website domain names could 'break the whole internet', warns ICANN chief executive Paul Twomey ... Twomey refuses to rush the process, and is currently conducting 'laboratory testing' to ensure that nothing can go wrong. 'The internet is like a fifteen story building, and with international domain names what we're trying to do is change the bricks in the basement,' he said. 'If we change the bricks there's all these layers of code above the DNS ... we have to make sure that if we change the system, the rest is all going to work.'" Given that some societies have used non-Latin characters for thousands of years, is this a bit late in coming?

That's one possible problem. Then there are characters that are technically equivalent but have different representations. (Accented vowels, for instance: you can encode them directly, or you can encode the accent and the vowel separately.) You need some way to make sure they both go to the same place, no matter whether it's UTF-8, -16, -32 or whatever else people throw at it.
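The two representations mentioned above can be demonstrated with Python's stdlib unicodedata module; NFC normalization is the usual fix (a sketch of the equivalence issue only, not the full IDN name-preparation pipeline):

```python
import unicodedata

# "é" can be encoded precomposed (U+00E9) or as "e" + combining acute (U+0301).
precomposed = "caf\u00e9"
decomposed = "cafe\u0301"

print(precomposed == decomposed)  # False: different code point sequences

# NFC normalization composes both into the same canonical form,
# so lookups can treat them as the same name.
nfc_a = unicodedata.normalize("NFC", precomposed)
nfc_b = unicodedata.normalize("NFC", decomposed)
print(nfc_a == nfc_b)  # True
```

This is exactly why any IDN scheme has to mandate a normalization form before names are compared or registered.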

And, of course, you need to make sure when someone types this into a browser some major DNS server someplace won't crash.

I'm all for adding non-latin characters. But I do recognize that it should be a slow process.

It depends on your operating system. The "standard" way is to hold Ctrl+Shift and then type the hexadecimal representation of the Unicode code point that you want, but that conflicts with a lot of keyboard shortcuts people use, so implementors often alter it a bit (for example, with GTK+ you press Ctrl+Shift+U and then type the code point).

If your keyboard has a compose key then you can often compose a glyph from two similar looking glyphs. For example, for an o with an umlaut, " o -> ö (though I expect Slashdot will filter that character out).

Macintosh users have an Option key that they can use to make weird glyphs (option-8 for the infinity symbol, option-g for the copyright symbol, etc). On most operating systems, various other combinations of the Ctrl/Shift/Meta/Alt/AltGr modifier keys and regular keys will allow you to type more glyphs. Most desktop environments also have an on-screen keyboard type program that eases experimentation in this area.

Users of complex (e.g, Asian) scripts have a host of input methods to choose from and configure.

Finally, if all else fails, create a text file full of your favourite non-ASCII characters and resort to the tried and tested method of copying and pasting! :)

Accented vowels would be a problem, at least in Spanish. Though their use is "mandatory", people with mediocre spelling don't use them on the internet. Even people who do use them don't always do so: even though the placement of accents is mostly regular, there are many (and very common) irregular placements.

Let's say for instance we have an online shop for tea called "Sólo Té" (Tea Only). Both accents are due to irregular rules ("Sólo" = "Only" and "Solo" = "Alone", "Te" is a personal pronoun and "Té" = Tea). Some people would try the current www.solote.com, others would try the correct www.sóloté.com, some would try www.sólote.com and yet others www.soloté.com depending on their spelling capabilities.

What this basically means is that in order to make sure everybody finds your domain and to avoid phishing you have to register four different domains.

A solution to this problem could be what Google does right now with accents: map them to the unaccented vowel. Thus "Solo Te" and "Sólo Té" would both find the "Sólo Té" store.
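That accent-folding idea can be sketched in Python using Unicode decomposition (a simplification; real search engines use more elaborate folding tables):

```python
import unicodedata

def strip_accents(text: str) -> str:
    """Map accented letters to their base letters, e.g. 'Sólo Té' -> 'Solo Te'."""
    # NFD splits each accented letter into base letter + combining mark,
    # then we drop the combining marks (Unicode category 'Mn').
    decomposed = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in decomposed if unicodedata.category(ch) != "Mn")

print(strip_accents("Sólo Té"))   # Solo Te
print(strip_accents("sóloté"))    # solote
```

Folding all four spelling variants through such a function would let www.solote.com, www.sóloté.com, www.sólote.com and www.soloté.com land on one canonical registration.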

Though their use is "mandatory", people with mediocre spelling don't use them on the internet.

I don't have mediocre English spelling, and I would use the correct accented characters in English words like "naive" - except I don't know how to type those characters. Like many people, I know how to type the characters that are on the keyboard. Additionally, because there's no need for me to type characters outside the ones printed on the keys on my keyboard to make the internets come down my tubes, I have no incentive to learn how to type any differently than I already do.

If you are Spanish-speaking (which was my example), not knowing how to place accents is no excuse. Accents are a fundamental part of the language, unlike in English, where they're only required for foreign words written in their original form.

In Argentina some people have keyboards with a Spanish-language layout (that is, with the extra letters), and some learn the character codes and use the Alt key (along with the code typed on the numpad) to place accents and the letters Ñ and ñ (which are mandatory

Kind of an interesting point. Maybe we should just let Google run the DNS system, and just replace it with a giant search engine. If we make actually typing in a web address hard enough, then that's what we're effectively doing anyway: people will just start typing everything (including the domain name of sites they want to go to) into the Google Search box at the top of their browser window, instead of the actual address bar.

Actually, DNS arguably is a giant search engine, which simply works on a 1:1 relationship and uses a distributed database (you input one piece of information, and it gives you some corresponding piece of information back). Replacing it with a 'fuzzier' search engine that would give you back a number of results, ranked by relevance, isn't that huge a leap.

Don't change the fundamental DNS, which is a programmer's and administrator's tool, not an advertising medium. It is founded, like programming languages, on a fundamental 7-bit ASCII character set, and was never intended to carry NLS text.

A far better solution is some form of VDNS that translates NLS text names into the proper domain name at the system level. That also allows the same domain to have multiple language translations to reflect localized product and service names.

We seriously need to kick the general political community in the arse. They keep trying to impose technical decisions, and it fails as miserably as any corporate PHB's uninformed decisions. ASK the techies to propose solutions instead of shoving ill-conceived ideas down our throats.

For example -- once you mandate multibyte domains, you implicitly mandate multibyte URL components. Goodbye direct mapping of names to the directories, file systems, and servers.

The problem is that it was designed for natural language text in the US back when some computers could deal with the new fancy feature of lower-case letters and others couldn't, and when humans tended to get confused about that sort of thing even though they all spoke English, and some computers could deal with 8-bit bytes and punctuation while others were very limited. I don't know if the IBM 48-character character sets were still around, but 64-character was still widespread, and EBCDIC was certainly st

The Internet is not just the web - you might remember that there are other applications such as email, ftp, ssh, telnet, ping, traceroute, and some people use programs other than browsers to access these things.

The reason ICANN wants to do lots of testing (after having dragged their feet for years before getting started) is that IDNs fundamentally change how DNS works, and it's really important not to break too much when you do that (not that ICANN traditionally worried about that.) It's *not* simple, and

I agree. The characters are graphemes, so we should treat them that way. Perhaps the whole idea of encoding alphabets is a relic (and biased toward phonetic alphabets, at that)? Today's computers have enough power to operate on pictures as UI, so why don't we switch to shape-based data processing?

That would instantly bridge the digital divide between present and history (think digitization of ancient documents) and between various cultures.

As a bonus, we get to ditch keyboard-induced RSI, that feeling of being constrain

But it's not working, mainly for all those people who want non-Latin characters. It's been broken from the beginning. Sure, there are historical reasons why we have the system we do, but change is definitely needed. Twomey is right that a change can't be rushed and needs to be done right (for reasons of security, compatibility, stability, etc.). However, the change does need to occur, and there needs to be some level of pressure to ensure that it happens.

> Won't this open up the system to many more phishing attacks involving addresses which include non-Latin characters which look similar to Latin ones?

Even worse, although your problem is reason enough to postpone this change: it will break the very idea of the Internet as a commons when URLs can't even be typed on all keyboards. There are good reasons why DNS didn't even include the whole ASCII set. Least common denominator is a good design decision. Every character currently allowed is easy to generate on ALL keyboards, can be printed in an unambiguous way by EVERY printing system, etc. Remember that a lot of wire services aren't even 7-bit ASCII clean; email addresses on a lot of news wires have to use (at) instead of @.

More bluntly, of what use are the parts of the Internet I can't even type the domain name for? As things now stand I CAN, and have, snarfed firmware directly from .com.tw sites where I couldn't read any of the text. Learned things from sites where I couldn't read anything but the code text and command lines. Seen images and understood even when the captions were meaningless to me. I'm sure the reverse is equally true: those who do not speak English still benefit from the English majority of the Internet the same way. All this because DNS is currently universal. Break that universal-access feature and, frankly, they can just as easily ignore ICANN, get the hell off the Internet, and make their own walled-garden network based on IPv6 technology.

At a minimum, unicode DNS should be restricted to IPv6 ONLY. No sense wasting scarce IPv4 resources on supporting walled off ghettos.

'when URLs can't even be typed in on all keyboards'

As far as Japanese goes, there are very usable technologies that allow you to type kanji using a standard Latin keyboard. It works pretty well, and I'm not sure what other languages have such options available, but since most of Asia uses the same kanji system I'm pretty sure that at least Asia has viable typing options.

'of what use is the parts of the Internet I can't even type the domain name for?'

It's of no use... to you. But then again, can you read Japanese, Korean, Arabic, Sanskrit or any other non-Latin language? No? Then your usability isn't in question here.

As far as Japanese go, there are very usable technologies that allow to type in kanji. Using a standard latin keyboard. It works pretty well, and i'm not sure what other languages have such options available, but since most of Asia uses the same kanji system I'm pretty sure that at least Asia has viable typing options.

I wonder how you got +4 mod points... this makes no sense at all!!

Let's suppose you are Japanese and you travel to Brazil. Never mind whether you can speak Portuguese; suppose you need to send an e-mail using your company's webmail server from a computer at the hotel, and suppose this webmail server has kanji characters in its URL. How are you going to type them? Believe me, Brazilian Portuguese Windows has no support for Asian languages (at least not by default, and actually I don't know if it's even possible with a regular Brazilian Windows XP). What now?

As far as Japanese go, there are very usable technologies that allow to type in kanji. Using a standard latin keyboard. It works pretty well, and i'm not sure what other languages have such options available, but since most of Asia uses the same kanji system I'm pretty sure that at least Asia has viable typing options.

I must have missed where Japan conquered 51%+ of the area east of the Ural mountains.

AFAIK (and I'm not an expert), China, Japan, Korea and Vietnam used very similar writing systems descended from Chinese Hanji characters. Vietnam and Korea (South Korea at least) later adopted other alphabets. So really, only China and Japan commonly use Hanji/Kanji, and even then, the CJK unification of hanji/hanja/kanji characters really annoyed a few purists when similar characters were merged in Unicode.

So, other than hanji/kanji, there are hangul (S. Korea), the kana (Japan -- yes, they have more than one writing system!), the Thai alphabet, the Cyrillic alphabet (former USSR), the Arabic alphabet (Middle East), Hebrew (Israel), the Brahmic scripts (India) and the Georgian alphabet. (And this is just off the top of my head; I wouldn't be surprised if there were a few more writing systems in use in Asia!)

And then, just to confuse the problem, there are the various forms of encoding. Admittedly, unicode would probably be one of the better methods, but there are a lot of pre-unicode encodings in common use.

When you expand the problem to be worldwide, there's also the Ethiopian and Greek alphabets that are used in their respective regions. There's also a ton of latin-based alphabets, which introduces many more characters than are currently used in the DNS system. (Including characters that look a lot like existing characters!)

And then you have the problem of alphabets used only by very small groups, such as Cherokee (Oh, I'm going to get flamed!). There are very few people who can write in Cherokee, but does that mean that the Cherokee language shouldn't be part of the DNS system?

Just because the letters aren't printed on your keyboard doesn't mean it won't type them. Have a look at the list of keyboard layouts in your OS. Sure, it's an inconvenience for you, but less of an inconvenience than it is to the people for whom it is a barrier to entry. Or you could use Google - a lot of people don't even bother typing in domain names any more, they just search.

The whole point about this is that it avoids walled gardens, because the DNS records are still held by ICANN. The alternative is that China decides it's had enough, and creates its own root servers, causing a very real split.

"It will break the very idea of the Internet as a common when URLs can't even be typed in on all keyboards"

You know, when one sees comments like that, it's not strange that non-7bit ascii countries find themselves rather exasperated with the rate of progress. If you take a few seconds to actually research the issue you'll find both a suggestive lack of multi-thousand key keyboards, as well as a whole host of solutions to that problem.

I mean, I can cut'n'paste chinese and japanese into vi, save the file with a unicode filename, and it'll just work. Earlier valid technical reasons are gone, everyone else has solved this; now the excuses start sounding really hollow.

Won't this open up the system to many more phishing attacks involving addresses which include non-Latin characters which look similar to Latin ones?

Potentially, yes. But I'm not too bothered about that. Protecting people from their own stupidity is rarely a good long term strategy. However, i18n for DNS is a particularly bad idea for purely pragmatic reasons. Currently, anyone anywhere in the world can go to any URL in the world in their web browser. If we allow the full range of unicode characters, that s

They are going to have to wait for the system to be capable of using more characters. It's just a fact of life that the encoding scheme implemented has a limited set of characters, readable by the technically adept people who built the thing.

It's a great idea to enable lookup by character strings using alphabets from other languages, but if it takes time to implement a global standard, that's just too bad.

If they are desperate to implement lookup in their own character sets then let them get on with it - b

Yes, countries that use non-English characters should be able to interact with the rest of the world using their natural language. No, they shouldn't rush the change and risk a possible crash of a large portion of the Internet. Be patient, young padawans: soon you will be able to have DNS names with any character you can think of, and it will be reliable and actually work.

Besides, think of how well prepared DNS will be to start supporting lookups in extra-terrestrial languages when the time comes if we do this now! We'll be completely compatible with Martian, Klingon, Minbari, and Vulcan networking systems the day we meet them! We should be able to view each other's pron almost immediately!

"Yes, countries that use non-English characters should be able to interact with the rest of the world using their natural language."

Why... No, really. You speak as if this is a good thing. Why should they be able to use their natural language rather than English? Why shouldn't they be restricted to a limited area of local-language-speaking people?

The reason the Internet is useful is because everyone speaks TCP/IP. Incompatible protocols are to be actively discouraged because they balkanise the network. Langu

Languages are anachronisms; the only reason we have more than one is that physical distance between locations and the difficulty of travel allowed them to evolve independently.

So why does every language have strata of slang and jargon that may well be incomprehensible to outsiders? In south-east England, a fairly small area, one has a wide range of speech depending on economic status and social circle. If one has a few people speaking a common language, it won't stay uniform for long, even if everyone's still in the same place.

So get rid of them, insist on a common language.

Sure, and why don't we just all wear the same clothes, just because different styles or colours can be taken too seriously (on gang turf, for example)? And let's all eat the same food, no need for various cuisines when flavourless mush can keep us alive.

Languages make the world more interesting. I very much enjoy traveling about and seeing how the locals communicate, the phonological inventory and morphological quirks they employ, the different judgements on eloquent speech they hold. If all this disappeared, it would be very dull.

And your claim that languages are "too difficult" is a peculiar opinion of some in first world nations. The vast majority of human beings are multilingual, see e.g. Edwards, John. Multilingualism [amazon.com] (London: Penguin, 1994). It should only take a person a couple of weeks to achieve a basic conversational level in a foreign language, which can easily be done before each time you set off on vacation. I've never had a problem learning enough of the language to talk with the locals about their culture and mine, and I think my language skills are actually fairly humdrum in comparison to a lot of people I've met.

And if all national tongues disappear in favour of some world language imposed by fiat, what would happen to all the literature written in them? Poetry translates infamously poorly. People have spent millennia composing art in words, one of the skills that makes us the unique species we are. Are we to throw all of those great monuments away?

Strawman: neither of those examples is a communication protocol that benefits from the network effect. Language is.

Language may be employed in various ways. Not only to communicate, but also to obfuscate (as some Roma do with their use of Romani) or to explore new possibilities of form (conlangers, bits of Sándor Weöres and James Joyce).

People make the world more interesting. It's nice to be able to talk to them.

People aren't solitary individuals, they belong to larger societies that shape them. Understanding his language is part of understanding a person.

Nope. Spanish, Italian, German or another Romance or Germanic language I could probably pick up as required. Chinese is apparently particularly difficult.

Chinese's difficulty is mainly at the level of official orthography. I studied Chinese at Defense Language Institute while in the Navy, where we concentrated only on the spoken language and learnt but a few characters, and after the first two months I no longer felt any barriers. Granted, I occasionally had to ask a person to explain what they meant, but still in Chinese of course, and I employed many circumlocutions, but it's not hard at all to learn enough Chinese to talk to Chinese people about themselves and their culture.

It would be consigned to academia, where all dead languages go.

The Finno-Ugrian minorities of Russia, which are my chief object of study now, do not want their languages and literature "consigned to academia". They want their works preserved, they desperately seek more funding of publication (and an end to local government censorship), and they experience great pain over the monolingual policies of the Russian state--most of the Mari men of letters, for example, were murdered under Stalin. Are you to tell those suffering peoples to "just get over it"? One finds in Russia that the locals who did "get over" the loss of their language also have higher rates of suicide, alcoholism, and existential crisis, while those who are fighting to preserve their language and feel a connection to the past have a much more positive outlook.

Actually, back in the day bind _did_ tolerate underscores. I remember the anguish we had flushing machines called things like fileserver_one out the day that Vixie et al decided to enforce the standards. The DNS standards say [-a-z0-9], with dot as a delimiter. It's not the place for implementations to play hooky with that. For years there were hacks in bind to allow you to choose between accepting underscores in master zones (bad idea), secondary zones (quite bad idea) and recursive queries (sometimes
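The [-a-z0-9] rule cited above (RFC 952, relaxed slightly by RFC 1123 to allow a leading digit) is easy to express; a minimal Python validator for illustration:

```python
import re

# One DNS label: letters, digits, hyphens; must start and end alphanumeric;
# 1-63 octets. Underscore is not in the grammar, which is why strict
# implementations reject names like fileserver_one.
LABEL = re.compile(r"^[a-z0-9]([a-z0-9-]{0,61}[a-z0-9])?$", re.IGNORECASE)

def is_valid_hostname(name: str) -> bool:
    labels = name.rstrip(".").split(".")  # trailing dot marks the root zone
    return all(LABEL.match(label) for label in labels)

print(is_valid_hostname("fileserver-one.example.com"))  # True
print(is_valid_hostname("fileserver_one.example.com"))  # False: underscore
```

(This checks the host-name grammar only; DNS itself will carry arbitrary labels, which is exactly the looseness the bind check-names hacks were arguing about.)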

ICANN is trying to give a technical answer to a political problem; even if the reasoning is valid, it is not a very good idea. With the UN, it will be handled by international committees, and we will all be long dead before they finally agree on which country will be on that committee.

Perhaps, but I can't fault ICANN for this one, as much as I might like to. Like it or not, most internet technologies have their roots in Latin-script countries, which means systems developed there may not be tweaked to work with outside language schemes.

If the fault lies with anyone, it's with the individual contributors of the tech. Or better, with the non-Latin countries' apparent lack of interest in some of the core projects needed to push this through ICANN (specifically DNS, httpd).

Although you're right, many of the Latin-alphabet countries have at least some 'unique' letters, mostly ligatures, like the German ringel-S (a long-s/short-s ligature) and the Dutch 'long ij' (an i/j ligature, distinct from y). Many of these letters don't appear in the alphabet itself.

- Don't be too surprised when people around you start building their own houses rather than choosing to pay rent.

DNS upheaval has been a long time coming, and the current anti-American sentiment worldwide isn't exactly helping to stabilize it. We're already seeing all sorts of adhoc routing setups that deal with shortcomings of an ameri-centric DNS. My guess is that within the next few years, ICANN's 'control' of the internet will be in name only as everyone else in the world will have moved on to alternative routing and domain systems.

I think you are confusing anti-American-GOVERNMENT sentiment with anti-American-PEOPLE sentiment. Oh, and don't forget, we built the Internet. We were there first. We laid the groundwork and did the first R&D. It was only later that other countries started to get involved. And at any point in those phases, they could have suggested these changes. Instead, they wait until the house is built and everyone else is hanging their family pictures to complain about the choice of land and demand everyone s

Please, there have been complaints about DNS not supporting most languages' (even Latin-based) character sets since the birth of the web, so it's completely untrue that we waited till everything was built. After well over a decade of patient waiting, it seems that actual pressure was required to get this change through.

I think that might be jumping the gun. American or not, the internet plays a huge role in the functionality of the modern world. Just imagine the chaos if international office networks went from "I can't open this word document you sent me because it's in a different format" to "I can't get email from you because you're on a different internet". American DNS control or not, decentralizing the internet like you suggest might happen could be one of the worst things that could happen for global communications.

Read the news. Is organized religion currently a net win, or a dead loss?

I like your sig... it's just not accurate. You've focused too much on a particular component of the larger problem and have failed to recognize the actual whole of the issue. Here's a correct understanding of the problem.

Read the news and some history. Is organized humanity currently a net win, or a dead loss?

For all you people saying "There's no problem, just do it": I say watch out... there will be a rush of attacks and spoofs as soon as this is opened up. The letter "a" appears in the Unicode character set multiple times, and some of the variants are almost indistinguishable. I'm not just talking about someone registering släshdot.org, I'm talking about someone registering slashdot.org (the a is U+FF41 instead of the normal a). Good luck telling the attacks apart from the real sites.
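The fullwidth-'a' trick is easy to demonstrate in Python. NFKC compatibility normalization catches this particular variant (which is why IDNA runs names through a normalization step), though it does nothing for confusables in other scripts, such as the Cyrillic 'а':

```python
import unicodedata

ascii_name = "slashdot.org"
spoofed = "sl\uff41shdot.org"   # U+FF41 FULLWIDTH LATIN SMALL LETTER A

print(ascii_name == spoofed)       # False: different strings, similar glyphs
print(unicodedata.name("\uff41"))  # FULLWIDTH LATIN SMALL LETTER A

# NFKC folds the fullwidth 'a' back to plain ASCII 'a'.
print(unicodedata.normalize("NFKC", spoofed) == ascii_name)  # True

# But Cyrillic 'а' (U+0430) has no such decomposition: it survives NFKC,
# so normalization alone cannot stop cross-script homograph attacks.
print(unicodedata.normalize("NFKC", "sl\u0430shdot.org") == ascii_name)  # False
```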

What do you mean by "if the unicode of the URL does not match the default unicode of the browser"? The point of Unicode is that it is uniform: there's only one. It is broken up into sections, and perhaps that's what you meant to say, but even that won't work. Let's take Japanese as an example, and I will give you two reasons why it won't work.

Perhaps if you assume I am Japanese, you will assume that my "default unicode section" is the section containing the Japanese characters. So this works fine if I go to

I'd be in favor of the change just because anything that undermines the Unix Tower of Babel -- the dependency on ASCII, which complicates text handling sooooo much even though Windows solved the problem soooo long ago -- is good. Even Java gets it. Even Apple (finally) gets it. Unix Is Teh Problem.

And the ASCII problem isn't just bad because it forces people to use inefficient encodings like UTF-8 (THREE bytes per character?) It's bad because it allows people to write code like:

(a line repeated, with subtle variations, several hundred times in the code of a certain ubiquitous editor).

And, lo and behold, the above does not work, but once it appears in a few thousand places it's impossible to fix, and a vast towering structure of fixes made by people who don't really understand why it's an issue is built.

So, even though the proposed change would be hugely inconvenient for a huge number of people, I'm in favor, because I want the world to grow the fork up and understand that text != byte array some time while I'm still alive.

i18n on Windows is far from "solved". I do admit that MS had a huge benefit when they started pushing Unicode. (It takes a company with Microsoft's level of clout to push around national governments.)

And the ASCII problem isn't just bad because it forces people to use inefficient encodings like UTF-8 (THREE bytes per character?)

Perhaps you don't realize that UTF-8 is moving on to become the most dominant character encoding, and legacy cruft such as UTF-16 (designed to deal with design flaws in Windows) is being phased out.

Even languages that would end up as mostly 3-byte characters tend to benefit from the savings on single-byte characters for control and formatting markup.

I'm not going to harp on about it, but a few basic web searches could enlighten you here.

Code like that *works* in UTF-8, which is one of the things that makes it beautiful (among many others).

It allows you to deal with world character sets when it matters, and allows you to ignore them when it does not. (For example, a lexical analyzer that specifies its tokens does not want to support punctuation from every language ever conceived.)
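The point that byte-oriented code caring only about ASCII delimiters keeps working on UTF-8 follows from how UTF-8 assigns byte ranges; a small Python sketch:

```python
# UTF-8 guarantees that bytes 0x00-0x7F appear only as ASCII characters:
# every byte of a multi-byte sequence has its high bit set. So a byte-oriented
# scanner that only looks for ASCII delimiters never splits a non-ASCII character.
text = "naïve café / 日本語".encode("utf-8")

fields = text.split(b"/")  # splitting on the ASCII '/' byte is safe
print([f.decode("utf-8") for f in fields])  # ['naïve café ', ' 日本語']

# No byte of a multi-byte sequence collides with the ASCII range:
assert all(b >= 0x80 for b in "é日".encode("utf-8"))
```

This self-synchronizing property is what lets old byte-string code survive internationalization mostly untouched.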

And if you think code like that doesn't exist in the Windows world, you are sadly quite naive. In my experience internationalizing applications, it's typically far easier to update Unix applications, which on occasion need nearly no changes at all, compared to the laborious grind and near-total rewrite often needed for MS Windows applications.

That's a good start. Registrars shouldn't accept such names in the first place, though: is there a valid reason to ever have a domain name with stray characters mixed in from different languages?

If a standard were to specify that a domain name must use a subset of Unicode that is self-consistent, and that browsers should turn the address bar red to warn anytime a domain uses characters not in the user's selected languages' subsets, that would go a long way towards minimizing the phishing problem.
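A rough sketch of such a mixed-script check in Python. The stdlib does not expose the Unicode Script property, so this approximates it via the first word of each character's Unicode name; a real implementation would use proper script-property data:

```python
import unicodedata

def scripts_used(label: str) -> set:
    """Approximate the set of scripts used in a domain label.

    Uses the first word of each character's Unicode name as a stand-in for
    the Script property (e.g. 'LATIN SMALL LETTER A' -> 'LATIN').
    """
    scripts = set()
    for ch in label:
        if ch.isdigit() or ch == "-":
            continue  # digits and hyphen are script-neutral in domain labels
        scripts.add(unicodedata.name(ch).split()[0])
    return scripts

print(scripts_used("example"))       # {'LATIN'}
print(scripts_used("ex\u0430mple"))  # {'CYRILLIC', 'LATIN'} -- mixed: warn!
```

A browser applying this rule would flag "exаmple.com" (Cyrillic а) while leaving all-Cyrillic or all-Latin names alone.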

Whatever happened to Punycode (Unicode in a special DNS-characters-only encoding format)? There was some hoopla about the scheme, which would require browsers to show Punycode-encoded URLs with the appropriate characters on the screen, but some naysayers said it was a phisher's dream since many glyphs throughout Unicode look alike. I figure this issue has nothing to do with Unicode per se, but with phishing vs. certified sites in general, but I haven't heard a peep from the Punycode camp for over a year
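Python's standard library actually ships both pieces: a raw Punycode codec (RFC 3492) and an IDNA ToASCII/ToUnicode wrapper that adds the "xn--" ACE prefix. A quick illustration:

```python
label = "s\u00f3lot\u00e9"  # "sóloté", the accented shop name from upthread

# Raw Punycode: ASCII letters/digits plus an encoded tail for the accents.
puny = label.encode("punycode")
print(puny)

# Full IDNA normalizes each label and adds the 'xn--' ACE prefix,
# producing a name that plain ASCII-only DNS can resolve today.
ace = "s\u00f3lot\u00e9.com".encode("idna")
print(ace)

# And it round-trips back to the Unicode form a browser would display.
assert ace.decode("idna") == "s\u00f3lot\u00e9.com"
```

Note that the stdlib codec implements the older IDNA 2003 rules; the scheme itself is alive and well, it is the display question (when to show the Unicode form vs. the xn-- form) that browsers argue about.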

Imagine the land rush that'll ensue if DNS allows non-Latin characters. Trademark transliteration? A haven for domain squatters and an upcoming surge of legal fees for trademark lawyers, if you ask me.

Nice for localising, sure, but how usable will Japanese, Indian, or Arabic script URLs -- for example -- be for those who do not have access to the respective character sets or keyboard layouts?

Given that some societies have used non-Latin characters for thousands of years, is this a bit late in coming?

Let's be clear: the domain name system only uses English characters. There are lots of languages in Europe (Italian, Spanish, French...) which are closer to Latin than English (which isn't really a Latin language at all) and which are not currently represented, because you can't use accents in domain names, or other letters such as the Spanish eñe (n with a squiggle, actually a distinct letter). English speakers often think accents aren't important, but they can completely change a word's meaning.

True, but the English subset of the alphabet has another feature that matters in this regard: it's a lowest common denominator that all computers on the planet are capable of producing. I can type any letter easily on a computer in China, Israel, Jordan, Russia, Spain, India, etc. I can't necessarily input a given Chinese character, Arabic letter, or Cyrillic letter. Why does this matter? Well, one argument is that it doesn't, much: if I want to view a Chinese website I'm probably in China and can input Chin

"English speakers often think accents aren't important but they can completely change a word's meaning."

Yes, I am an English speaker, and throughout the day I often stumble across the recurring idea that accents are of no particular use in determining the meaning of a word. I would go so far as to say that I often think accents just aren't important. I'm glad Slashdot has you around to set things straight.

The internet was originally conceived, designed, and implemented in the USA at a time when hardware was at a premium, and corners were cut to conserve that limited resource. DNS was just one of the results of that era. However, it is the most visible because it is the front-end means for people to find each other. That means there is now a very well established standard, used by people across the entire globe, that is very difficult to change.

Changing all the DNS servers in the world to switch from ASCII to Unicode is NOT trivial. The fact that some societies have used non-Latin characters for thousands of years is completely and utterly irrelevant. THEY didn't make the internet. They simply bolted themselves on to an existing infrastructure.

I agree that progress needs to be made to accommodate non-Latin characters, but to have people whining about "how they want it, and want it now"... That's just ridiculous. It's like waltzing into a house that was built 40 years ago and having a tantrum because the stairs are too steep and the house is too squished. Major structural renovations take time, effort, and careful planning. And there is nothing you can do to avoid that, short of implementing cheap stop-gap measures that are virtually guaranteed to cause even bigger unintended headaches later on.

Set up a private Latin name prefix for the non-Latin names,
i.e. NONLATINPREFIX plus an encoding (UUEncode, say) of the non-Latin name, e.g.
(arabic word for horse in arabic script)=AER5ER8EDG
so you would have NONLATINPREFIX-AER5ER8EDG.com as a domain name, which would resolve correctly if someone typed in (arabic word for horse in arabic script).

1. This allows a simple web extension to serve non-Latin countries.

2. Doesn't require any change to the DNS system (other than some naming policy changes).

3. Allows links to be embedded in normal web pages so that they can be cut and pasted by anyone with Latin functionality. So a Japanese person could cut and paste a link to some Arabic site they don't have the font for.

4. While this is a kludge, it has some major advantages over rebuilding the DNS system.

Oh, there is a funky problem: letters that look identical in different languages could allow for spoofing. So if the link had a dual-language character set, the browser would need to multicolor it so that it would look pretty odd -- so that microsoft.com and mîcrosoft.com would be clearly distinguishable (reverse the î character so that it's white on black).
But I still think it's workable.
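This is, in spirit, exactly what IDNA's "xn--" prefix ended up doing. A toy sketch of the scheme above -- the NONLATINPREFIX name and the base32 choice are made up here purely for illustration; real IDNA uses Punycode, which is far more compact:

```python
import base64

PREFIX = "nonlatinprefix-"  # hypothetical marker, standing in for IDNA's "xn--"

def encode_label(label: str) -> str:
    """Map a non-Latin label to a DNS-safe ASCII label."""
    raw = base64.b32encode(label.encode("utf-8")).decode("ascii")
    # DNS labels are case-insensitive and may not contain '=' padding,
    # so strip the padding and lowercase the result.
    return PREFIX + raw.rstrip("=").lower()

def decode_label(ascii_label: str) -> str:
    """Reverse the mapping for display in the browser."""
    raw = ascii_label[len(PREFIX):].upper()
    raw += "=" * (-len(raw) % 8)  # restore base32 padding
    return base64.b32decode(raw).decode("utf-8")

arabic_horse = "\u062d\u0635\u0627\u0646"  # Arabic for "horse"
ascii_form = encode_label(arabic_horse)
print(ascii_form + ".com")  # resolvable by any plain-ASCII DNS server
assert decode_label(ascii_form) == arabic_horse
```

The round trip is what makes point 3 work: the ASCII form travels through any legacy page or mail client, and only the display layer needs to know the trick.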

... parallel multi-nets. I guess servers will have multiple domain names for the same IP address, one for each culture they wish to address. No matter what, the English-language net will continue to be *the* Internet, a global forum, a direct connection between common people from all parts of the world (hey there! :) ).

All the other nets will have quite a marginal significance. Nations will try to boost them in order to keep their citizens indoctrinated with their own traditional values, but things that do not fly by themselves...

I thought this was what Unicode was for. The only three sacred characters that I wouldn't want messed with are ':', '/', and '.'. How come we don't have a Unicode DNS solution so countries could use the entire Unicode address pool for domain names? I've read postings basically bashing the non-English world for not being involved with the original tech and so being left out. So that's a valid reason to discriminate now? What used to get me excited about Slashdot was the unique solutions that you could find in the comments...

What's this going to do for security? Didn't we have phishing attacks recently that consisted of Unicode characters being inserted into e+bay.com, for instance, that didn't get displayed -- the domain e+bay.com being different from ebay.com?
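The better-known variant is the homograph attack: a Cyrillic 'е' (U+0435) is visually indistinguishable from a Latin 'e' but produces a different domain name entirely. A quick way to see the difference (Python used here purely for illustration):

```python
import unicodedata

latin = "ebay.com"            # ordinary Latin 'e'
spoofed = "\u0435bay.com"     # CYRILLIC SMALL LETTER IE up front

# The two render almost identically, but are different strings...
print(latin == spoofed)       # False

# ...because the leading code points come from different scripts.
print(unicodedata.name(latin[0]))    # LATIN SMALL LETTER E
print(unicodedata.name(spoofed[0]))  # CYRILLIC SMALL LETTER IE
```

This is why mixed-script labels are the part of IDN that browsers and registries have to police, not just the encoding mechanics.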

"A domain name is a unique address that allows people to access a website, for example, smh.com.au"

No, a domain name is a sequence of characters mapped to an IP address. It was designed so that you won't have to remember 66.35.250.150 instead of slashdot.org. This wasn't a problem when the original Internet consisted of just four computers. DNS was never designed to provide identity. There was also the case of a stock trader hacking a DNS server and redirecting traffic from a legitimate financial site to his own, where he had duplicated the real site, only with bogus information.

"He said that this could create problems where, for example, a character in Urdu looks identical to one in Arabic"

It sure could. How about totally replacing DNS with a system of online identities?

I'm in a country that sits between Europe and the Middle East; we have a few non-Latin characters in our alphabet, and it still creates problems when conferring domain names.

No wonder the Middle East (Arabic) countries especially want this: the majority of inexperienced internet users there will be more likely to use these domain names easily, hence the sites using those domains will offer a greater incentive for controlling what users see, because these domains will be under national control.

Not only this, but we as IT people will be very unwilling to change all our software to adapt to the new situation, because of the horrible development/testing/implementation involved, and hence won't be accepting these domains as valid in our network traffic -- which will create a second internet which is, as described above, less free.

Adding Unicode to DNS names would make phishing much more difficult to detect unless all the browsers, email clients and other tools are modified to indicate that a URL may not be what the user thinks it is. It is bad enough as it is, and remember, most Internet users are not as savvy as those of us on Slashdot. I foresee a lot of security implications in adding this.

Amusingly, when the issue is non-UTF8 character sets, or censorship, or anything else that upsets the non-Western countries, they start shouting threats like ``Turkey will start its own top-level domains'' or ``Iran will disconnect from the Internet''. Which I'm sure is terribly impressive in UN-type meetings where we're supposed to pretend that all countries' opinions matter, but in the real world is an entirely hollow threat.

Were some random non-UTF8 country to make interworking with the rest of the Internet...