
Lurks asks: "My company, Codeplay, is set up to develop new and innovative compiler technology for the games industry. Our C compiler, VectorC, is a cutting-edge vectorizing compiler aimed at games and multimedia applications that typically demand high performance through hand-optimized assembly. I'm writing to ask the burning question on our minds: is it worth porting VectorC to Linux? In fact, we're already targeting Linux as part of the PlayStation 2 version, albeit not generating x86 code, of course. A Linux port would see us converge this work with our Win32 compiler, and such an undertaking would certainly be popular with our Linux-loving techies! One caveat worth mentioning now is that the current version of VectorC is plain C only; version 2.0, with full C++ compatibility, is due early next year."

"Of course, commercial realities will rear their ugly heads, and we must consider that Linux perhaps isn't an obvious choice for a compiler aimed at games and multimedia applications. Given the certain degree of hostility towards commercial closed-source products of this nature, perhaps the idea should be consigned to the pet-project back burner for the future rather than rolled out as a commercial product?"

Do you mean converting your compiler to run under Linux, or making it able to compile code FOR Linux (and if so, under what platform)?

If it's a PlayStation/games development compiler... then the only reason to port it to Linux is if you want to offer Linux as a development environment for PlayStation games. Is it? What's the benefit?

It works under Win32 and compiles for Win32 too, not just PlayStation. And although the company (at least one guy from it on the Aces' Hardware forums) does not like to promote VectorC for scientific programming, it is at least as useful for serious purposes too. I guess he is talking about a compiler where both the running and the target platform are Linux. The compiled code is damn fast under Win32; I'm considering buying it or the Intel compiler, which I haven't been able to test yet.

And although the company (at least one guy from it on the Aces' Hardware forums) does not like to promote VectorC for scientific programming, it is at least as useful for serious purposes too.

Scientific applications usually need more than single-precision arithmetic, while the vector instructions available in 3DNow! and SSE are single-precision only. SSE2 has double-precision as well, so it may be more useful on the Pentium 4 for scientific applications :)

Games generally aren't so picky about the precision, and so more can be done to optimize without breaking the code.

You are right on the double-precision thing, but that is not all VectorC does. Scheduling is much better than the alternatives (reminder: I haven't yet tested Intel's compiler), and software prefetching, CPU-dependent cache management, etc. are also important factors. I experienced huge speedups with double-precision x87 code in my benchmarks too. Actually, if you are not using the same values over and over such that you can keep them in registers for a long time, 3DNow! is only a bit faster than x87 code on the Athlon. Cache management and instruction scheduling are all that count under moderate data loads.

Scientific applications usually need more than single-precision arithmetic... so it may be more useful on the Pentium 4 for scientific applications :)

You're obviously quite right. We get a surprising amount of interest from the scientific/academic community. High tech compilers that vectorize are nothing new in this area, but generally they're Fortran compilers for mainframes etc.

I'm guessing that with the power of desktops these days and the prevalence of C/C++, micro-based scientific computing is looking more attractive.

It's quite spooky to see all these high energy research labs (Eek!) in our web logs. I wonder how many of these people understand the precision issues with most vector units out there.

Perhaps the Pentium 4 will become a standard choice for scientific computing because of its double-precision-capable vector unit. Just as well we put a lot of effort into SSE2.

I have 1.2.1, I just haven't benchmarked it yet. Chances are I won't until I check Intel's compiler. Those results might not be indicative of your expected performance, but performance on my program is what counts for me (notice that the values are given for a specific code snippet, and I did not claim to have done extensive benchmarking). Memory reads in terms of bytes are the same for the 64-bit and 32-bit precision conditions, which might be leading you to think something is wrong with my values. Also, I'm registered and do ask my questions when they come up.

It's still pretty early days for us, it could well be something broken! Showing your code to our lads could either a) point out something you need to do to make things faster or b) point out something we need to fix/improve.
The only thing that makes me think something is wrong with your values is that we just don't see that sort of lackluster performance in our tests. If we did, we'd fix the compiler!
Mat 'Lurks' Bettinson, Codeplay Ltd.

Even if your compiler did FPU ops in no time, it wouldn't get much better. Of the 16-second completion time with 3DNow! and 17 seconds with the normal FPU, 12 seconds is spent on memory accesses. So, if you isolate the FPU operations, your compiler is doing 570% better than hand-optimized Borland code and 950% better than compiler-optimized Borland code. Now tell me how to do operations without operands and I'll give your compiler credit for a 950% speedup...

I don't know if it will ever be able to compile the Linux kernel, but I used VectorC in one of my projects and it improved performance significantly without compiling the whole thing.

If the gcc-compatibility is up to par, then it should be possible at some time in the future to compile the Linux kernel with VectorC. However, remember that there are currently _no_ compilers other than GCC which can do this (please correct me if I am wrong), as the kernel source uses a number of GCC-specific extensions to C.

It will already compile some other large pieces of software; I myself have used it to compile MAME, and it gave a significant speedup on most games over compiling it with Cygwin GCC (all on Win32, obviously).

You appear to be equating Linux with x86. This is incorrect. Debian GNU/Linux is available now on half a dozen architectures with another half dozen ports well underway.

You also need to understand that the Linux community is very diverse. The hostility to commercial products that you have seen comes from people who don't represent the majority: nobody represents the majority (except perhaps Linus).

And only two of the "released" ports (ia32 & ppc) of Debian run on hardware that is modern and fast enough to be worth doing sick optimizations on. Not to mention that a compiler is inherently non-portable, and only becomes portable through a large amount of work. The more optimizations a compiler does, the less portable it becomes.

While we're on the topic of common misconceptions, could everybody read the dictionary, the FSF confusing words lists, or the OSI web pages sometime?

<b>Commercial is not the opposite of Open Source</b>. Never has been, never will be. I don't pay for Red Hat Linux, but parts of it (such as the installer) are produced with commercial benefit in mind, similar to free-to-air television. Red Hat Linux is (IMHO, anyway) a good commercial Open Source app. There are many others - Zope, large chunks of Zend, etc.

<b>Assuming that all Open Source projects are inherently non-commercial is not only false but very rude to the companies that produce Open Source software with their own financial gain in mind, but which also benefits the community.</b>

I am also in the early alpha stages of developing a C/C++/COBOL compiler for GNU/Linux that produces code 1000% faster than GNU/GCC. Although I am in the early stages of development and have not written any code yet, please send me your venture capital cheques.

For those who don't know much about it, VectorC is a compiler targeting fairly specific types of system - namely those with a bit of SIMD, like your P3/P4/etc., or small-scale parallelism like the PS2 (is there an Xbox-specific version to help with shaders yet? dunno).

Anyway, I can see this not going down hugely well on Linux, apart from the commercial thing, which is a big turnoff to many of the free-as-in-beer users ;o) Why? Well, I just don't see a huge market of Linux users with the type of processors which get the advantage from that kind of compiling. Cross-compiling for the PS2, yeah, we do that in my office (though not from Linux). But one of the big selling points of Linux is that it works on lower-end processors, which don't have SIMD.

Anyway, it's a good compiler, if you need it. Just think about whether you do or not...

...but many (I'd venture to say most) of the users HAVE a P3/P4/Athlon/etc. machine in the first place. Don't assume that because it's one of its big selling points it's not used on other machines, or that because it's popular with the free-beer crowd, others (like game developers or commercial distributions) won't be interested in it.

To hell with games. There's been a lot of buzz about big clusters of Linux machines to do scientific number crunching. Stop me if I'm wrong, but wouldn't those people be able to make good use of a good vectorizing compiler?

...and the fact that they've invested in a large computing cluster means they'd be willing to spend good money on a compiler IF it provides a significant performance improvement.

If you have applications in the embedded world, then you probably have a market. They won't care about the cost. A compiler is a non-recurring cost. Linux is also non-recurring. In the embedded world, lots of software *is* a recurring cost: operating systems, protocols, webservers, etc...

Hasn't it been ported to Linux? I seem to recall the Evil Satellite TV Company I was working for evaluating the cost of Purify for their embedded Linux development and then rejecting it due to the cost (Which was something like 10 grand per seat.) Maybe it was some other product, but I'm pretty sure they said Purify.

The top-end tools cost that much no matter WHO you talk to - and that's what we're talking about here. Insure++ tells you which blocks of code got run, to see if you might have missed something in normal execution that might come back to bite you down the line.

While it's pricey and places itself out of the reach of normal (i.e. Small business and open source/free software) developers, it's a must if you need to ensure reliability. We're buying at least one license for it at CoolLogic in the very near future...

I don't know what a vanilla Unix Purify license goes for (my employer has a negotiated lower rate), but it's less than $10K US. It is several thousand dollars per user, floating across hosts (and maybe Unix platforms) but not floating across users; expensive.

I briefly read over the specs on their website, and I find it quite humorous that all of the benchmark code is asm. So essentially, they have an assembler. Now, we have absolutely no idea what level of optimization they used, and they did not compare benchmarks against gcc at full optimization. What I would love to see is a comparison between gcc with -O3 and their compiler, then a comparison on a larger production-quality program. I quite frankly can't see the market for this either.

I briefly read over the specs on their website, and I find it quite humorous that all of the benchmark code is asm. So essentially, they have an assembler.

Err, I don't know how you got that impression. The benchmark code is in C. Just click on the tests on the left. If you click on the RIGHT you see the output assembly from the major compilers on Win32 platform.

And you're right, we haven't benchmarked against GCC because it's not a contender on Win32. Also your point on seeing benchmarks on large 'production quality program' is well taken and we're working on that. We've got some real-world games/engines compiling nicely but since we can't release the source for those it's of dubious use wouldn't you say?

Quake 1 is what a lot of people like to test us with, compiling it themselves.

I guess you misunderstood the way VectorC works. You can just compile the code the usual way, or, as a preferred alternative, you can start the interactive optimizer, which lets you see what asm your C code produces under various optimization options while you edit the C. The interactive optimizer (at least in my demo version) does not allow you to edit the asm; it just helps you see where suboptimal code is produced and why, so you can change either the C code or the compiler options to produce faster code.
gcc is very slow on Intel/AMD. It is pointless to even compare (nevertheless, I did: gcc 2.95 is sloooow, and gcc 3.x is even slower on my matrix multiplication function). The real competition is between the Intel compiler and VectorC.

Today, there are two compilers that are well known - GCC, which is the default and most used on Linux, and Intel's ICC, which is commercial, though there is a free non-commercial version available from Intel. So far I have heard mixed reports from people about ICC's efficiency in terms of code generated, speed of binaries, size of binaries, etc. (Slashdot users who use ICC - please post your conclusions.)

Now - the next big compiler that will come out (commercially) is from Borland. Early reports from various testers suggest their C/C++ compiler is kicking both ICC and GCC in the ass, but again - I'll believe it when I see the numbers, although Borland has a reputation for issuing quite fast compilers.

So - if you decide to release a compiler, you'll need to think about 3 points:

1. GCC compatibility - you'll need it if you want to be used by open source users OR to allow developers to move their apps which used GCC over to your compiler.

2. A free version (free as in beer) - in order to be really accepted and widely used by Linux users, you'll need to issue a free version for developers to use. Intel learned quite early on that if they wanted their compiler to be accepted by Linux users, they'd need to release a free version. Borland is also rumored to be releasing a free version of their upcoming C/C++ compiler (the command-line version, not the GUI).

3. Competition - well, not much to say here, but you've got companies like Borland and Intel, plus GNU GCC, along with the Portland Group's compiler and CodeWarrior (Metrowerks) - plenty of competition. Do you really want to get in?

Yes, that's the idea. We accept broad compatibility is a requirement and we've done a lot of work on that sort of thing on several platforms.

A free version (free as in beer)

That's a possibility. However, VectorC is something other than a general compiler, although that's ultimately the goal. My current thinking is that if you want a free general-purpose compiler, then there's nothing wrong with GCC. If you have a need to write high-performance code, you're probably doing so because you have a serious application in mind. You're faced with hand-coding assembly or... VectorC. If you don't enjoy hand-coding assembly, then VectorC starts to look attractive (we hope).

Then again, plenty of programmers enjoy that; I don't quite understand them myself :)

3. Competition... do you really want to get in?

Of course, otherwise Codeplay never would have started! To be honest, our focus is still games and in that area (specifically vectorization) compiler technology hasn't really moved forward.

Don't do it.
There is no money to be made in Linux market.
Just take a look at Loki problems.
What we have here is a single (yeah, with almost 100% market share) Linux game developer unable to make money, and you are trying to sell a compiler directed at game developers?
It simply doesn't make any sense.

Linux game developer unable to make money, and you are trying to sell a compiler directed at game developers?
It simply doesn't make any sense.

We're aware the Linux games market is non-existent. Fortunately, VectorC is still useful for multimedia processing (which is basically what games are!) and there's a bit of that going on under Linux, right?

So far, all the people that have made comments have been thinking in terms of things like Loki or old machines that wouldn't use your stuff well.

Multimedia comes to mind.

Serious number crunching comes to mind.

While gaming's "nonexistent" (I won't bore you with what I know about all of this - suffice it to say that the malaise in Linux gaming is less due to a lack of a market and more due to a lack of a channel to sell through, and in some cases hardware (i.e. 3D cards...) to run it on), it's about to have an upturn. It would be NICE for a company using your compiler to make their game go to be able to make the Linux version run just as fast, if they so chose.

GCC's ok at either. There ARE better compilers for some things, though.

Depending on the price of the compiler and when it offers C++ support, I might be very interested in purchasing a license to it.

"That's a possibility. However, VectorC is something other than a general compiler, although that's ultimately the goal. My current thinking is that if you want a free general-purpose compiler, then there's nothing wrong with GCC. If you have a need to write high-performance code, you're probably doing so because you have a serious application in mind."

If there is no free (as in beer) version then how can one decide if the performance benefits are worth the $$$ ?

"in order to be really accepted and widely used by Linux users, you'll need to issue a free version for the developers to use. "

Sorry, but that is just stupid.
It is a compiler we are talking about here - a tool used exclusively by developers.
There is no "Linux user acceptance" here; if developers get to have it for free, that is it - you've just killed your entire market.

"do you really want to get in? "
I don't think they want to get in, for the simple reason that there is almost no money to be made in the Linux market selling compilers.

First, just because there isn't a large Linux game market right now doesn't mean there won't be one in the future. Since this compiler is only at version 2.0, now is the PERFECT time to get into Linux. That way, this compiler can grow with the Linux gaming market.

There is no "Linux user acceptance" here; if developers get to have it for free, that is it - you've just killed your entire market.

There are all kinds of free licensing schemes which could apply to the free version: "free for non-commercial use", etc. Budding game developers might want to try the free version. Professional developers would almost certainly be willing to fork out the cost of the commercial version. After all, that's a small expense compared to the cost of developing a video game nowadays.

I don't think they want to get in, for the simple reason that there is almost no money to be made in the Linux market selling compilers.

Sure they want to get in. They want to get in if the future of Linux gaming is bright. Sure, that's questionable. But then most business ventures are questionable; some just make clearer business sense than others. If the Linux game market really sees a boom in the next five years, then having had their feet in the game-centric-compiler door before the boom hit will have been an excellent decision. It's a risk, but I'd be willing to say it's a worthwhile risk. It's like gambling on fair odds with a high payout.

I think the real question here isn't so much if THEY want to invest the time in a Linux version, but rather, if the Linux community really wants to see Linux move in that direction.

It's partially my opinion that Linux users actually don't WANT Linux to become so mainstream that a large portion of the software for it is commercial, and only binary distributions exist. That sort of market is generally what happens to a mainstream OS, and it goes somewhat against the grain of Linux.

I personally think a game-centric compiler for Linux is a great idea. It's certainly no worse an idea than SDL [libsdl.org], and SDL is definitely coming along nicely. The future of Linux gaming looks a whole lot better today than I would have thought it would a few years back. There's still a long way to go, though.

I've been checking ICC out for a while (both on Linux and on Windows) and it's a dream to use. It's quite fast, generates nice code (20% or so faster than GCC on most things I've tried with it, like BYTEmark) and has really, really nice error messages. It isn't as compliant with C++ as GCC 3.0, but it's really good nonetheless; it's just missing some of the esoteric parts of the standard.

I don't think so; how many applications really use the GCC extensions?
>>>>>>
GNU software tends to. GLIBC uses them as does the Linux kernel. Without good GCC compatibility, a compiler is of limited usefulness as a general purpose compiler on Linux.

If the optimizer is better than GCC's, then add/merge those components into GCC to improve it. This will save your company maintenance resources and time in the long run (i.e. $$$) if the GCC community picks it up.

Porting this to Linux seems to be a good idea, but the inherent problem is your business model. Who is your target audience? Are you targeting corporations such as Lokisoft [lokisoft.com] who will use your compiler to port/create games for Linux? (Side note - the Lokisoft page is down; I dunno if that is a fluke, if I have the wrong URL, or if they've packed up and left.)

The problem is that if you target corporations like Loki, you may not be able to sell enough units, or whatever, to justify the cost of porting. These Linux gaming companies seem to fold faster than omelettes at Waffle House.

If, on the other hand, you just ported it and released it at random into the Linux/OSS community, you would be doing the community a favor, and independent cells of programmers could attempt to port/write games for Linux.

The problem with this solution is also the cost: If you release it open source for linux, you would be somewhat of a hero, or philanthropist, to the OSS community; however, you may not be able to justify the cost of porting it, if your idea is to make money by porting to linux.

I guess it depends on what time frame you think you can port it to linux in - if it would take you and your team an extra two days of programming, it may be worth it, as both a PR move and a gift to the OSS community. However, if it will take extra months of coding, just bear in mind that philanthropy doesn't pay bills.

Don't mean to be cynical, but you have to consider each decision as it relates to the almighty dollar.

You basically said what I was going to say, except I'll put it a little differently: Why are you [compiler dude] asking Slashdot? Why don't you do some market research on your potential customers and see what the market size is?

It's not complicated. If the return on investment is greater than the return on things you can otherwise use your time or resources on, then do it. Otherwise, don't.

We'd love to! AltiVec is seriously cool if you're in the business of getting excited about CPU vector units like we are.

We went to Apple and said "Guys, we'd really love to do a Mac version, and you could really do with having a compiler that uses the cool bits in your CPUs. Wanna sling us a bit of cash?" The phone's been strangely quiet. Oh well.

On the bright side, there's a games platform with a PPC processor. It doesn't have AltiVec (boo!) but it does have a basic vector unit. You can bet we're working on that pretty hard. So that'll have ramifications for moving to other platforms later on down the line, when we have PPC code generation going sweet.

If you're talking multimedia processing, etc., like you did in an earlier comment, you're going to want to talk to the Yellow Dog Linux people - that's one of the markets they're going after with their small clusters of G4 computers (as in as many as eight top-end G4s in the same space as Apple's tower...). IBM would be interested for the same reasons.

Well, it's a bit more complex than that, isn't it? Us porting to a platform involves the following kinds of issues:

Is the platform easy? I.e., do we have the code generation for that platform already?

Is it a heavy games and multimedia platform?

Are the tools substandard on that platform?

If we're not doing well in the above, then we'd not consider the platform a priority. Unless, of course, someone wants to give us money to do it. That's not out of the question where it's a proprietary platform such as a games console or, to an extent, the Apple Mac.

Apple is a good target on one level because the vector unit in the G4 is the best vector unit in any consumer CPU. There's also games development and, more importantly on that platform, there's a lot of performance-sensitive multimedia application development. Apple like to crow about how good their CPU is, so one would have thought they'd like a tool on their platform to prove it.

Regarding your other suggestions, maybe down the line we'll be looking at more business-orientated high-performance computing, but that's not why Codeplay started up. From experience, we have our work cut out talking to the various corporate players, coming from an unknown company with 'play' in its name and 'games' on its web site. I think to be successful there we'd have to spin off some high-performance computing company or something. And definitely no chins on the web site :)

Linux is a special case. For all other platforms, we take a straight business viewpoint based on the numbers. To be frank, we've got a LOT of work/platforms on our plate as it is without finding new platforms unless they were going to pay.

The Playstation 2 tools will pay the bills. Linux would just be 'cool', so long as it didn't cost us significant money and didn't generate bad PR. That's my view at the minute.

It has been said before - who is your audience? Linux is a strange market, as a lot of people are used to the convenience of -free- software. That's perhaps the reason Loki was having a hard time doing business.

I don't know what Borland's business performance is with, for example, Kylix, but so far I haven't seen any free software made with Kylix (which doesn't mean it doesn't exist), and it makes me believe their Kylix project was less successful than expected.

Before Borland/Inprise started the Kylix project, they published a survey on their site to investigate market demand. Maybe I'm mistaken, but I think the enthusiasm in the survey results has not translated into the expected ROI for Borland. With the dot-bomb and the new-economy collapse, investment in the Linux OS has gone down as well, except for those who chose Linux as a strategic weapon against their competitors (IBM vs Microsoft).

Still, the point I want to make is, you need to determine your audience, but be cautious in your investigation methods and don't forget to do the general market watch.

I agree - I haven't heard much out of Borland (or anyone else, for that matter) about cool things done in Kylix. Still, the freeware JCL library http://www.delphi-jedi.org/ [delphi-jedi.org] will be out on Kylix soon, and I will get around to porting my freeware to Kylix sometime after that. I've been busy at work, ya know how it is.

Anyway, to drift back on topic, it is worth remembering that Kylix 2.0, if/when it ships (and I hope Borland is taking a long-term view on this one and keeps plugging at it), will include Borland C++ Builder. Yup, Borland C++ on Linux will be another C++ option.

Maybe it's just me, but if your specialty is high-performance vector operations targeted specifically at graphics-intensive applications like games, where vector operations are CPU-bound, then why not just code a superfast set of assembly routines for vector operations, tuned for the various processors, and provide them as a C library that the world can use with its compiler of choice?

It seems like the alternative with VectorC is for me to implement my own vector libraries anyway and then hope your compiler is both clever enough to figure out how to optimize my code, and robust enough not to break on other constructs.

I'm not trying to be hostile, just curious. Perhaps there are tons more programs besides graphics / games that benefit from VectorC's optimization? Otherwise, it looks like you've implemented an entire C compiler to get a few features you could add to an existing one otherwise.

I've recently started working with a couple of 3D engines, and I would really like to compile them with something like VectorC instead of GCC. I would certainly like to see some of the free-software developers who make games use a compiler like VectorC if they can afford it, especially developers like Loki Games, id Software, etc. Of course, I would probably just use GCC until I had something very much worth presenting to the public (Open Source or otherwise), and for me, getting the compiler before the final stages of development would be a question of how much VectorC costs and how much the version 2 upgrade would cost - but that is just me. As I said, there are some people out there who need it. What about 3D app makers like Alias|Wavefront (Maya), NaN (Blender), and several other companies I can't remember at the moment (the creators of Shake and Tremor, etc.)? I don't know what compilers are available for PPC/PPC64, and obviously not many people care about that when it comes to games, but I'm sure someone will eventually make an equivalent of VectorC for PPC/PPC64 if it hasn't already been done. When it comes to x86, though, I would really like to see it ported over (it would be nice to get the PS2 and x86 package together).

While supporting Linux may not show any direct impact in the game market as of yet what it would do is allow your company to market the product as a cross platform game compiler. Before your compiler would be of any use in that area you would need to have a PPC port, ideally both Linux and Mac. Also the most important part of a cross compiler is optimized libraries that run on every supported platform. This allows for a game company to develop once and target a number of different platforms with a simple compile. The ability for a game company to widen their audience with minimal work and investment will make your compiler an attractive option whether or not Linux has a huge game market.

Another area where Linux would be a boon is compile farms. If a Linux version of your compiler came out that could target not just Linux but every other platform you support, then companies could set up cheap compile farms to build large programs in the background while developers' workstations remain free for development or testing. The beauty of it is that the compiler itself can be targeted to Linux, and not specifically to an architecture like x86, and compiled to work on any Linux platform (which is almost any platform out there). With the PS2 running Linux, a bunch of those boxes could be set up for this.

If game companies could target any platform they wish without having to invest it makes them happy. The game industry is fickle. One year this is the platform to target, the next it is completely different. Having Linux as a target may not cause a developer to start pumping out Linux games but what it does do is leave the option open if the industry should shift that way. It makes developers sleep easier at night knowing their code will not become obsolete by the time they wake up.

... for mainstream acceptance. C++ compilers all have their own name-mangling schemes, which makes different compilers fundamentally incompatible. So the compiler may be able to find a niche market among developers of proprietary applications, but it's very difficult to persuade people to switch from g++, since that's what all the precompiled C++ libraries that ship with Linux use.

I like the idea of a specialized compiler being available if I need it (I'm stepping into the 3D waters).

But I have a small warning for you: if your compiler really does produce awesomely better code in that particular area, and that area becomes popular with the GCC crowd, you are going to see a lot of work spent on optimization. In other words, your product may be a spur to make GCC competitive with VectorC.

And I would also point out that I think that if VectorC, available for Linux/someArch, does NOT stir the GCC developers to improve, then your specific market doesn't exist on Linux.

Really sucks: either you eventually face competition from GCC, or your product bombs on Linux.

I am developing an extremely authentic and highly innovative gaming technology for the information superhighway called SuperDuperGameAccelerator. It's really cool. It will accelerate games and cook your dinner too. I was wondering if it's worth porting it to Linux. In fact I'm already porting it to Linux; I just wanted to announce it to the world by posting on Slashdot. You know, after the dot-com crash we can't get any more funding, so we could use all the free advertising we can find. And I figured Slashdot editors are too stupid to recognize the thinly disguised commercial...

Where the lame seem to think that every little request for something coming from a company is a thinly veiled ad, etc.

Linux WOULD be nice - I could see some uses for your compiler in the embedded space (which is one place Linux is definitely taking off in...), depending on the embedded application. Your benchmark info doesn't seem to show how big the executables were - how much bigger/smaller is your code compared to the other compilers'?

If I've got a consumer device that needs a little multimedia processing (say some DSP work), I can either add a special-purpose extra part that adds considerably to the bill of materials on the device, or I can up the muscle of the chip a little and do it all in software for less impact on the BOM. Something like a current-technology ignition controller or a PLC might not need this sort of thing, but we're moving into an arena where "embedded", "PC", and "game console" have no real distinctions per se.

I know, I'm working in that area of embedded design. If it were for Linux on x86 and PPC I'd convince my employer we needed it for at least part of our product offerings right now.

Sounds like an excellent way to reduce code bloat. I suppose it's too much to hope that they take the "better C than C" parts of C++ and leave it at that.

I'm actually half serious here. I've worked on large projects in both C and C++, and the ones which were most successful were the ones where people didn't get carried away trying to use every new, buggy, inefficient feature of C++ in an attempt to prove that they could.

I'm all in favour of a compiler which restricts its features to those it does well instead of providing half-baked implementations of C++'isms just to bump up the feature list.

People are going to mark this as flamebait and trollish (with a karma of 50 I really don't care), but trust me on this one: don't bother. People are already making games and multimedia software for Linux, and it isn't selling very well. A commercial compiler for stuff people cannot sell in the first place is pretty much just a big waste of time.

The question should have been "Does Linux need another vectorizing compiler?" Currently, I'd say the answer is yes, because Intel needs some competition. I, for one, am 95% sure that a port of this compiler to Linux would get snapped up by anyplace that is doing high-performance computing.

I do high-performance computing, and I'd love to be able to try out Vector-C on some of our P-4 and Alpha Linux clusters, if I could. Right now, we use Intel's icc or gcc on x86 and Compaq's ccc or gcc on alpha, respectively. Pretty soon we are going to be looking at Itanium as well. Some of the time we are hand-hacking assembly just like the game programmers are, which is kind of sad; we would rather be compiling C. What Mat Bettinson said is definitely the case: "micro based scientific computing is looking more attractive."

Despite what some people here are saying, it's not an issue if it can't compile the kernel, or if it's not 100% gcc compatible, because most of the high-performance computing applications I've seen don't spend much time in the kernel. However, you do have to make it work with both glibc 2.1 and 2.2 (please please please). The hacks we came up with to make icc work on our glibc-2.2 RH7 boxes are ugly and fragile.

Language issues: C++ is almost never a big deal in HPC, but C/FORTRAN support is great. Having at least partial C99 support is best, because then you get float *restrict foo, et al. Also, remember that not all HPC codes are floating-point; some of us write integer-intensive and/or memory-intensive codes.

It's not an issue if it's not free-beer or free-software, because research grants will probably be happy to pay reasonable amounts for it -- maybe a couple hundred bucks, say -- but you have to remember that Intel is giving icc betas away for basically nothing, so you can't charge too much. This is not a troll, just trying to be realistic here.

Disclaimer: I am not speaking for my employers. I am not a person who gets to decide how grant money is spent (yet). These are just my opinions.

Even if Codeplay was to use the Edison Design Group C++ front end [edg.com] -- highly likely, as it's famous throughout the industry as an extraordinarily compliant, high quality front end, and seemingly a perfect match to the existing VectorC back end -- I'm highly skeptical this schedule could be met.

On the other hand, a lot of performance-minded projects stick with plain C. (I'm not commenting on whether or not that's the right decision; I'm observing what decisions are made in the industry.)

Even if Codeplay was to use the Edison Design Group C++ front end [edg.com] -- highly likely, as it's famous throughout the industry as an extraordinarily compliant, high quality front end,...

Implementing C++ properly natively is a requirement so that we can attempt to do the sorts of things VectorC does which no other compiler does. Simply put, if we used someone else's translator - we'd have C++ capability but almost certainly would be no faster than other C++ compilers.

... and seemingly a perfect match to the existing VectorC back end -- I'm highly skeptical this schedule could be met.

I assure you, we are. The work is almost complete; what remains is the significant internal testing and bug fixing, which we estimate will require the rest of the year.

Speaking as a game developer who codes primarily on Linux for Linux, I would certainly buy a copy once C++ support is available, if the code produced was fast.

I have lost track of the amount of time I have spent going over code again and again to make it a bit more efficient and remove bottlenecks. I would gladly pay for a product which would enable me to ship binaries that were faster.

The question of ABI compatibility on C++ is very tricky, though. A C++ compiler would be of limited use if it did not use the same ABI as g++, though with the release of gcc 3, this ABI has at least stabilised.

It really depends on how you want to work it. Binary-only releases are popular in certain Linux segments that don't want to hassle with compiling. The problem, of course, is that you're only a kernel update away from having a program that doesn't work.

I think there are aspects of the Linux universe that would benefit from a well-done multimedia compiler: DVD players, general home theater computer environments, Linux set-top boxes, etc. Games, of course, would welcome this too.

But the real problem, of course, is how you balance making a commercial distribution for Linux that is okay from the open source and GNU communities' perspective.

There are a couple of ways to go. You can go the Qt way and have both a free and a commercial version. The hope is that the various projects will use your libs and compiler, and thus it would be popular with commercial users who would actually pay you.

Second would be a binary-only distribution. Not as popular, mind you, because the person who puts together the binary distribution (which I assume would be a person who bought the package) may bite off more than they wanted to chew keeping things up to date. In order for this to work in open source you'd need a lot of compiler directives, the idea being that you could compile the project with GCC, it just wouldn't be as efficient.

Third would be a hybrid method: some sort of pre-compile on the closed stuff, and an included client compiler that would bring everything system-dependent together. Perhaps even making the compiler and libs free to distribute, but keeping the development environment closed.

My personal preference would be number one. Anyone know how well Qt does?

I'm sure plenty of people will disagree with my conclusions, but I believe them to be sound. If you want to be successful on Linux (or any other free OS), you need to be libre/free.

That means making the source code available under a GPL-like license.

However, you can still protect your market at the same time. Just make it a requirement that anything compiled with the libre version of VectorC, or a derivative, also be licensed under the GPL or any of the similar licenses recognized by the OSI [opensource.org].

Then dual-license the compiler so that anyone who pays for the commercial license is free to do whatever they want with the resultant binaries, no licensing restrictions.

That way you can contribute to the community by providing a libre compiler, benefit from the community who will likely contribute bug-fixes and enhancements, and still make money from the people who would be willing to pay in the first place.

If you want to be successful on Linux (or any other free OS), you need to be libre/free.

That depends on how you define success.

This will never displace gcc as the compiler used by most people - because gcc is good enough, open, and free.

But it has a fine niche market as a "pay extra, get better performance" option for people with serious crunch to do: graphic games, scientific computing, financial modeling, etc. It will continue to hold this niche unless/until gcc or some other free and open compiler achieves comparable performance AND cross-platform ability to all game platforms, or some commercial competitor outdoes it - at which point the product's market would be in jeopardy regardless of whether it had been ported to Linux.

At a minimum it should be able to produce code to RUN on Linux. Otherwise it's not supporting the game authors who want to release a Linux version of their product.

But I think that a version that runs ON Linux, if not overpriced (or as an extra-cost extra on an existing non-Linux release) should pay for the port and make a tidy profit. At a minimum some game designers will want to work directly on the Linux platform rather than being limited to cross-platform development.

In short, I'd take a very careful look at the benchmark code and methods of compiling before making conclusions about VectorC's performance. It might still be great; however, past benchmarks that have been floated have been questioned.

DISCLAIMER: I have not used VectorC, or directly examined the test code or assembly code generated for any of the tests. I have no direct assessment of the benchmarks - I'm pointing to others' assessments in response to another article about the compiler.

There is another compiler worth considering. Metrowerks has an auto vectorizing compiler for the Intel platform, Code Warrior, which both runs under and targets Linux (Intel & PowerPC). Code Warrior also targets Windows, Mac Classic, OS X, GAMECUBE, PS2 and most other game platforms. Benchmarks can be found on the Metrowerks site (http://www.metrowerks.com), but they're a bit old. We'd be happy to provide up-to-date benchmarks for anyone who's interested.

The real strength, however, is CodeWarrior's optimization and code generation.

I'm biased, though, as I'm a compiler engineer (PowerPC) for Metrowerks, so take all of this with a grain of salt.

--Doug

Editor : Adam Barker
Yes, I can't spell so I need one of those editor types.

Hi, I'm a game developer. I looked at your compiler at Siggraph and GDC.

...when I spoke to your sales rep at GDC and Siggraph, he indicated that C++ wasn't ready, as you said. He also didn't think your symbol information would be compatible with VTune or the other standard Windows profilers, and he couldn't tell me of any test suites that your compiler had passed, though he named a couple which were "close." I wouldn't use a compiler which doesn't pass basic conformance testing, and I certainly wouldn't take on a new compiler if I don't know that I can profile its output to prove that it's working.

When I tried creating some simple code and looking at the disassembly on the sample machine at Siggraph, the compiler choked on some valid code (it seemed confused by the critical 'volatile' keyword), and the assembly generated was extremely naive about cache use and couldn't even hoist redundant operations out of loops.

So far as I can see, I'm supposed to dump my compiler for something that lets me use half a dozen instructions I can get with inline assembly or Intel's _free_ compiler, where Windows is concerned.

Shouldn't you finish your tools and make them work on one platform before you go trying to pitch them on others? Have your tools really advanced so far in the last few months that you're ready to split resources?

He also didn't think your symbol information would be compatible with VTune or the other standard Windows profilers

That is not correct; you can use VTune just fine. That's exactly how we see people using VectorC today. Your points on conformance testing are well taken. That is being done with the C++ version, and obviously we'll announce results when we release that compiler.

... the compiler choked on some valid code...

Quite possibly you ran into a 1.0 bug. There's a dramatic difference with the recent 1.2 release. VectorC can now compile entire games and middleware engines and demonstrate a boost in performance in many cases. That's not something I would have said at the beginning of this year.

So far as I can see, I'm supposed to dump my compiler for something that lets me use half a dozen instructions I can get with inline assembly or Intel's _free_ compiler, where Windows is concerned.

Well, clearly I can't agree with your viewpoint on what a proper vectorizing compiler is capable of doing. Now, if you're prepared to go do all the inline assembly for the various SIMD implementations on x86 CPUs, then indeed you have no use for VectorC. However, that's pretty rare for game developers working to deadlines, in my experience.

Shouldn't you finish your tools and make them work on one platform before you go trying to pitch them on others?

VectorC works marvellously on Win32 and is already being used by game and middleware developers to good effect. However, I like your tone; you think you could come over and whip me with an apple in my mouth? :)

Compiling for (and running on) Linux is not new:
Intel has compilers for IA32 and IA64.
ARM has compilers for the ARM architecture.
IBM has optimised GCC for PowerPC (a smart move for IBM).
MIPS pays for optimisations and support for GCC (Algorithmics).

The difference is that you will never be able to compile the kernel with anything other than GCC, because it changes so often; and in terms of support, kernel hackers will tell you to jump if you try reporting bugs that are solely compiler-related and the compiler is not GCC.

Compiling for (and running on) Linux is a matter of sticking to the standards (like ELF) and the ABI; all are documented (and they don't change all that often, although they did change this year for GCC 3.x).

Really, you have to have someone signed up, because most people have tried this and then gone back to GCC, since it's easy and the performance is not all that different! (People end up writing inline assembly anyway if they only have one target, which beats anything a compiler can do, because the compiler does not know what the code does at run time while the developer can at least guess.)

It's fun to see that the big vendors are split: MIPS and IBM went with GCC after trying out commercial compilers, while Intel and ARM adopt a dual policy of funding GCC work while also maintaining a closed compiler.

No single silicon vendor ignores GCC. It is one of the backbones of *BSD, Linux, HP-UX (kernel compiled with GCC), VxWorks (you receive hefty manuals about GCC and GDB when you fork out for the Wind dev suite) and lots more.

Users include NASA, ESA, the Chinese government, the WHO and many more (people's lives depend on GCC every second).

If you want to do research, then GCC is a good place to start.

We all owe a great deal to the people who have put their time and effort into these projects, and to the people who demanded to use them in projects.

First, do the benchmarks, and see if it's needed/useful under Linux (I'm sorta assuming x86 here). The benchmarks page mentioned above (it's here [codeplay.com], btw) lists their VectorC compiler, Intel's compiler, and MS Visual C. Add a column to that labeled "GCC 3.0". Let me see just how much of a performance boost I'll get out of your product over the one I'm using (as opposed to these other products I'm not using).

Yeah, this'll probably require doing much of the porting work before determining whether there's a market for it. It's called R&D, and it does sometimes lead to dead-ends. Deal.

Well, as someone who is currently working on a commercial (or shareware, the line seems to have blurred) game for Linux, I have to say I would definitely be interested in your product. Not so much for the current project, because it is turn-based strategy and doesn't need super-fast efficiency, but for future products I would love the chance to use a faster compiler for 3D, etc. Price is an issue, of course; it's annoying to see software that costs more than the computer it runs on, even if "businesses do have the money to spend".

Secondly, I'd like to say that Slashdot isn't the best place to ask about commercial software. :)
I can tell you this: we each have to support Linux in our own way. Ever since I was a kid coding in BASIC on an Atari 600XL I've dreamt of having a software company, and I'm not going to give up on that dream because I've taken a liking to open source. Of course I'll continue to release open source projects, much in the way Loki has. But if you like Linux enough to bring a product over, you should do it. You are just as much a part of the community as the people you are asking, and therefore have just as much right to shape the community.

Also, one quick comment about everyone saying Loki failed so you have no audience: you people are missing the point. Loki "failed" (if we must put the nail in its coffin) because it was Linux ONLY. Presumably game developers could use VectorC to create games for Linux AND Windows (any Mac OS X version?). I certainly don't expect to sell a whole heck of a lot of copies for Linux, or really even to be well received, at least until I can afford to do some of the bigger projects that I want to do. But at this point Linux isn't meant to be the big moneymaker. I just use Linux almost exclusively, and I'm sure there are others who do, who might want to play my game. They shouldn't have to boot Windows to do that.

So yeah, I'd say you have a good chance with VectorC. It would provide cross-platform options for the developers that use it, and if you promote a library like SDL, that will give them even more reason to compile for Linux as well (and thus buy the Linux version) by lowering the amount of work involved in moving the code from Windows.

It's time to move on from C and C++. (Take a look at the SANS Top 20 list of exploits and tell me how many of those items would be there if we used modern, safe languages!) Security, portability, and code reuse are vital for building robust large systems. Please, no more C compilers, and no more C!

I said it's time to move on from C *and* C++, not from C *to* C++.;) I agree with you, C++ is an absolute horrorshow.

I don't think Java is the best choice out there, but it is orders of magnitude better than C++. (As for speed, see http://www.bagley.org/~doug/shootout/craps.shtml) It may not actually be as fast as C++ or C, but I say that's fast enough.

I have worked on compute-intensive software for years, both on free and on commercial UNIX systems, and I don't see the point. GNU C/C++ is a decent compiler. Usually, between algorithmic improvements, profiling, hand-tuning a few inner loops, and using optimized libraries (BLAS, 3D graphics, etc.), I have found I can get pretty close to machine performance. Furthermore, supposedly "high performance" C/C++ compilers require a bevy of non-standard flags and keywords to optimize a lot of C/C++ code, because C/C++ semantics inhibit many optimizations. On the other hand, running a compiler that is not the main compiler on the platform in question is often a major headache.

There is, of course, some market for C/C++ compilers that promise to do much better, as there is a market for a lot of things that promise "to make life easier". So, someone may buy this, but I wouldn't hold my breath.

I also note that your benchmarks don't show any comparisons with GNU C/C++.

This reminds me of a talk I had with our CEO when interviewing. I asked him if a port of our debugger to Linux was viable. He said sure, it was in the works, but didn't expect it to make any money at all. I thought the other way around- people have Linux on lots of systems because it's free, and now they're realizing they might need some good tools for it.

I went to SC2000 (the supercomputing tradeshow) last year. Almost everybody I talked to was running MPI on Linux on PCs. Linux has made huge inroads in the scientific community, which is working on, ahem, clusters of Linux boxes. It's actually quite amazing, considering the Linux market in this segment was close to nil a few years ago. Something about cash-strapped institutions not having to pay high OS licensing fees...

Anyway, it goes to show you that even though there is a free alternative (gdb/ddd) you can sell tools for Linux if you do things the free things can't, or do it better.

No, it isn't. It's a high-performance auto-vectorizing compiler. The only other auto-vectorizing compiler is Intel's, and that's a) so limited in what it will vectorize as to be of little practical use, and b) only vectorizes for SSE, oddly enough.

I have an idea, how about we just use GCC with the -mcpu=i686 and -march=i686 flags instead of paying for this.

With all due respect, VectorC blows away highly optimizing compilers like MSVC and Intel C/C++ when it comes to vectorization, let alone GCC.
Now, we're not saying you have to pay for this; most likely you do not have a need for what VectorC is good at. It most certainly is not aimed at general compiling a la GCC.
It's an alternative to hand-coding assembly language when high performance is necessary. Admittedly my story could have been a bit more descriptive here, but I wanted to keep it short.

Shame on /. story posters - this "story" is just a shameless plug.

Or maybe it was actually gauging the waters, exactly as I said. We have no products under Linux, so "plugging" here under this topic would be fairly pointless.
So cynical, some people!
Mat 'Lurks' Bettinson, Codeplay Ltd.

I don't think you'll see too many "How Dare You" posts - what you will see is a good number of "Yawn - why do we need this?" and "I wouldn't buy it - but I think you should port it anyway".

The fact is that even the very best optimising compiler is only going to buy you about the same performance increase as waiting six months and letting Moore's law do the work.

Anyone who is developing open-sourced software had better be sure that they'll get the needed speed from vanilla GCC, because their users will only be using GCC. Hence your market is probably limited to commercial closed-source software that *needs* that small additional speed boost.

Hence only commercial games companies who support Linux. That's a VERY small audience. I don't see how you could ever recoup your costs. Maybe you should give it away for free and hope to get a huge wave of enthusiasm for the product that would spill over into sales for the Windoze version?

I think you are wrong about Linux users. The problem is that many professional developer tools are very expensive for the independent mom-and-pop shop. Really good compilers and CASE tools can run into the tens of thousands of dollars, expenses which cannot be justified for companies working on smaller contracts. Many Linux developers are jack-of-all-trades consultants doing contract work for small and mid-sized businesses. Programming is most likely only one aspect of the services which they provide.

Is it so hard to understand that some applications will not come into existence through Open Source efforts? Specialty stuff like this compiler will never exist for Linux if you insist on 100% adherence to doctrine.

IF Linux ever gains the popularity we all hope it will, it will stop being the land of all open source. Games will not be open sourced and other really good products that people worked hard enough to earn money from won't be either. Realize that. It's already happening. People need to feed their families, so they sell their hard work.

This guy here is selling a product on Win32. I assume they are making money. The question is simply: if we provide Linux users the option to use our software on that platform, will anyone buy it? They are doing a good thing here. They are taking an application that presumably gives the Win32 platform some advantage and offering to make it available to Linux users. I hate to say this, but if MS Office was ported to Linux, not only would it sell, but the platform would become more popular. Just because you as an individual flee from closed source does not mean that a company should not market a closed source product to the Linux platform.

If you're talking to just the Linux crowd, it'd be a harder sell (I'd buy it, but I'd prefer letting you know that I'm NOT doing Windows development with it...).

If you're talking to the games development companies, it'd be a plus, not a minus. If they use VectorC, they can expect to target two different PC OS platforms as well as gaming consoles (as you probably well know, while C is portable, most compilers don't behave consistently - it's always better to use the same compiler for all target environments...). If you do a Mac OS X version or a PPC Linux version, that would make for two more platforms as well.

$750 for the professional edition and $80 for the standard. That's more than reasonable a cost for the product. I'd at least buy the standard edition for Linux if it were available for both x86 and PPC.