virt1203 wrote: Yes, it would solve all the problems for all world languages: injecting UTF-8 (Unicode) into the chat window.
Well, here's why I see this as technically difficult. I develop business applications; since 1997 I've been involved with both database-backed applications and 'roll your own' databases written in ASM for multimedia broadcasting applications, and I do some gamedev on the side. It would still be my dream job, but I'm married with a family, so the odds are low I can move into a new industry and lose salary for it.. anyhow.
MMOs are a database application through and through. For a database-backed app to support UTF-8/Unicode character sets, the first 'port' is to convert the database, which is a large engineering project in and of itself and has to be done very carefully to avoid scrambling the data completely. That's one issue.
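To make 'careful' concrete: even in the simplest case, where the old data is all Latin-1, the conversion comes down to transcoding every stored string, and running that step twice on the same row (or missing one table) is exactly how data gets scrambled. A minimal sketch of that one transcoding step, assuming Latin-1 source data (the function name is mine, purely for illustration):

```cpp
#include <string>

// Transcode a Latin-1 (ISO-8859-1) string to UTF-8. Every code point
// 0x00-0xFF maps directly; bytes >= 0x80 become two-byte sequences.
std::string latin1_to_utf8(const std::string& in) {
    std::string out;
    out.reserve(in.size());
    for (unsigned char c : in) {
        if (c < 0x80) {
            out += static_cast<char>(c);                 // ASCII passes through
        } else {
            out += static_cast<char>(0xC0 | (c >> 6));   // lead byte
            out += static_cast<char>(0x80 | (c & 0x3F)); // continuation byte
        }
    }
    return out;
}
```

Run that a second time on already-converted data and every accented letter turns into two garbage characters, which is why a real migration has to track exactly which tables have been converted.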
Secondly, MMOs are a very network-intensive application. You can't run a good MMO server off a cable modem connection the way you (arguably) can with, say, a Quake game server. Moving to Unicode costs bandwidth: UTF-8 leaves plain ASCII at one byte per character, but accented Latin letters take two bytes and most Asian scripts take three, and if you switch to UTF-16 instead, every character on the wire doubles. In the past, when optimizing the bandwidth of games I worked on, we optimized down to the BIT level how much traffic we sent on the wire, so a change that fattens every string is a very difficult decision to come to. That being said, connections are only getting faster and bandwidth only more available. And with clever tricks to manage caches of strings, you don't even send them on the network (sadly except for chat, as you can't predict what a player will say, so you have to send the full version, unless you can come up with a compression algorithm that isn't slower than just sending it on the wire).
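To put rough numbers on the per-character cost, here's a toy comparison (function names are mine, purely illustrative):

```cpp
#include <cstdint>
#include <cstdio>

// Bytes needed to encode one code point in UTF-8 (surrogates ignored).
int utf8_bytes(uint32_t cp) {
    if (cp < 0x80)    return 1;  // plain ASCII stays a single byte
    if (cp < 0x800)   return 2;  // most Latin/Greek/Cyrillic letters
    if (cp < 0x10000) return 3;  // most CJK characters
    return 4;                    // everything beyond the BMP
}

// Bytes needed in UTF-16: two, or four for a surrogate pair.
int utf16_bytes(uint32_t cp) { return cp < 0x10000 ? 2 : 4; }

int main() {
    printf("U+0041 'A':       utf8=%d utf16=%d\n", utf8_bytes(0x0041), utf16_bytes(0x0041));
    printf("U+00E9 e-acute:   utf8=%d utf16=%d\n", utf8_bytes(0x00E9), utf16_bytes(0x00E9));
    printf("U+4E2D (CJK):     utf8=%d utf16=%d\n", utf8_bytes(0x4E2D), utf16_bytes(0x4E2D));
    return 0;
}
```

So for an English-heavy protocol UTF-8 costs almost nothing extra, while a fixed 16-bit encoding really does double everything.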
Thirdly, MMOs are games. Game programmers sometimes need to do 'bad' things when programming for speed's sake. This means the impact of changing the character encoding is a QA nightmare: every line of code is subject to being broken by such a change. When working with binary data (game models, image files, sounds), the 8-bit char is used interchangeably as a byte, since C/C++ lack a byte type natively and 8 bits is a byte, so we assume some things, like one char being one character. If a character is suddenly 16 bits in some code and 8 bits in other code, or one character suddenly spans several bytes, weird subtle bugs are going to be everywhere. Unless the original developers were very methodical and had experience porting 16-bit code to 32-bit code or the like (i.e. the really old guys, who aren't that prevalent in game development anymore; old in this industry is over 30).
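A tiny, hypothetical example of the kind of assumption that breaks, using escaped bytes so the file encoding doesn't matter:

```cpp
#include <cstdio>
#include <cstring>

int main() {
    const char* ascii = "sword";                   // 5 characters, 5 bytes
    const char* utf8  = "\xC3\xA9p\xC3\xA9e";      // "epee" with accents: 4 characters, 6 bytes

    printf("%zu\n", strlen(ascii));  // prints 5
    printf("%zu\n", strlen(utf8));   // prints 6, not 4

    // Any code that truncates a name at a fixed byte offset, sizes a
    // buffer as "max characters", or walks a string one byte at a time
    // can now split a character in half and corrupt the text.
    return 0;
}
```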
Depending on how much forethought went into the very first few lines of code way back when, making this change is highly non-trivial and could require a considerable amount of time and effort to achieve.
But from Phelan's post it sounds as if Gameforge would at the very least consider it; I just wanted to be a bit of a naysayer in terms of why it might not happen. It would be great if we all thought of all the languages of the world before writing a single line of code, but sadly we all tend to be a bit biased toward whatever language matters to US.. and historically it's often been the case that apps can't be ported to some languages at all; they'd have to be rewritten completely, which is just very expensive.