I don't really get it. Someone spends his time creating a tool for his own stuff and then releases it for anyone to use, and people complain. Not because it's broken or anything, but because it's not done the way they want it. I don't think ggn said anywhere that it's a replacement for anything. It's just a tool you can use to cross-compile your programs. If you need mintlib or anything else and you want to use it with brownout, then you'll obviously need to compile it yourself.

What I think you miss, Thorsten, is that this was never supposed to be used to compile MiNT stuff, as I understand it at least. It's supposed to work for TOS, and it's mostly geared towards demo and game development. If it helps people do MiNT stuff, then great. And I don't see anyone stopping you from using whatever you want. Last but not least, there is no app store to tell anyone how they should create their stuff. This is still Atari.

I can't say I understand the details of how/why it works and whether or not this is a good thing. What I do know is that having downloaded and built the toolchain, I'm now able to produce binaries for Pole Position that show marked performance improvements over binaries produced from the same source using the (also excellent) Vincent Rivière toolchain.

This toolchain definitely works as advertised and is a welcome addition to the Atari scene as far as I'm concerned. Thanks ggn!

Curious, and by no means do I wish to offend ggn... I just find it odd that someone knows how to make changes to bcc and yet doesn't know how to create a patch file. I can't really say much, since I have only dabbled a tiny bit in Python and can mangle my way through bash, but generally speaking, 'git diff' will print a diff you can use as a patch file.
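For reference, a minimal sketch of that workflow, demonstrated in a throwaway repository (file names and commit messages are made up for illustration):

```shell
# Sketch: turn local changes into a patch file with git, then apply it.
cd "$(mktemp -d)"
git init -q .
echo "old line" > file.c
git add file.c
git -c user.email=you@example.com -c user.name=you commit -qm "initial"

echo "new line" >> file.c          # make a local change
git diff > my-changes.patch        # capture it as a unified diff
git checkout -- file.c             # restore the pristine state

git apply --check my-changes.patch # dry run: will it apply cleanly?
git apply my-changes.patch         # apply the patch
grep "new line" file.c             # the change is back
```

The resulting `my-changes.patch` is a plain unified diff, so it can also be applied with the classic `patch -p1` outside of git.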

On that note, what are the chances any of this gets published upstream?

ThorstenOtto wrote:you can't use *any* of the existing libraries. Everything has to be recompiled, including mintlib, gemlib, and every gnu package you can think of. I don't see any of those libraries.

That problem doesn't kill you. As long as that's the truth, you have to deal with it. But: does gcc 7 for the Atari ST compile these libraries without annoying bugs, and can you run the executables with enough confidence?

Please tell me, I would like to know. As long as the behaviour of the produced executables is unpredictable, it's not worth recompiling all that stuff.

Not out of the box, but that is only because mintlib currently simply does not support it: partly because of differences in the ELF assembler output, and partly because of newer gcc features that have to be dealt with. But we are working on it.

m0n0 wrote:and can you run the executables with enough confidence?

The actual compiler is the same, so the final results should be comparable. But of course there have been substantial changes in the overall code generation compared to 4.6.4, and I would bet that not all of them have been tested for the m68k backend. That's another reason why I prefer a solution that can replace the older 4.6.4 version without many changes; only recompiling existing programs will tell whether that version produces something usable.

m0n0 wrote:As long as the result of the produced executables is unpredictable, it's not worth to recompile all that stuff.

"predictable" is not the problem. Wether the compiled program works the way as before (hopefully faster) is the question

I (Vincent Rivière) just want to say that I welcome the GCC brown edition. It is a different approach than the FreeMiNT one. And different visions of the same object lead to better understanding.

- The FreeMiNT approach is the native one: port GCC to a new operating system, patch the linker to generate native executables so GCC can naturally output executables for the current system, then recompile everything with that: libraries, tools, etc. Of course, the same can be achieved with a cross-compiler (my preferred method).

- The Brown approach is the embedded one. Everything is cross-compiled to generic ELF, then a post-processor converts the resulting executable to the native format. There are no libraries or tools on the target machine, because we don't care: we just want to run the resulting executable on the target machine.
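As a rough sketch of that pipeline (the tool prefix and the brownout invocation below are illustrative assumptions, not the actual CLI):

```shell
# Hypothetical Brown-style build: compile to plain ELF, then convert.
m68k-elf-gcc -O2 -fomit-frame-pointer -c demo.c -o demo.o
m68k-elf-gcc demo.o -o demo.elf        # generic ELF executable
brownout -i demo.elf -o DEMO.TOS       # post-process the ELF into a TOS binary
```

The point is that only the last step knows anything about Atari: everything before it is a stock embedded ELF toolchain.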

For 15 years, GCC has only been used on Atari by FreeMiNT people. Even though FreeMiNT's GCC (including my own cross-tools) can easily produce TOS-only executables, I have the feeling that it has never been used by TOS people. Surprisingly, it seems that Brown GCC has gained immediate enthusiasm from TOS people. I can only welcome this situation, as it brings more people to GCC. And if this encourages people to write cleaner code, compatible with GCC and ELF features, that can only be a good thing.

Regarding my m68k-atari-mint cross-tools: yes, they are currently stuck at GCC 4.6. This is simply because that version is good enough for my own needs, as well as for FreeMiNT people. It has been extremely hard, over the years, to get something stable. Every new version needs more work, more testing, more bug reports, more risk of regression... and in the end brings not much benefit. And most of all, in recent years I have had less time to care about GCC stuff, so things stayed as they were. But most importantly, I have published all my work, including full history, on GitHub: m68k-atari-mint-gcc and m68k-atari-mint-binutils-gdb. So people braver than me can continue the work with newer GCC, which is currently happening with Thorsten and MiKRO.

Regarding a.out: even if it is *not* dead, it is more and more deprecated. The biggest troubles I encountered when porting the MiNT patches to newer binutils/GCC were definitely upstream bugs related to a.out. That's just how it is: GNU people don't care about a.out anymore. So anyone still using a.out features will hit bugs, because that code is no longer tested upstream. So definitely, using ELF intermediate files is the way to go. This is why Brown GCC is one step ahead of FreeMiNT GCC on that issue.

On the other hand, I agree with the Brown detractors: the Brown solution is not what native GCC users would expect. But it *is* the general embedded solution: executables are generated with any standard ELF tools (old GCC, new GCC, or anything else), then the post-processor converts them into the target format. So any compiler upgrade is decoupled from the Atari stuff. Hence more simplicity.

Regarding legal stuff: I didn't look precisely at what is (or is not) provided (I didn't even try Brown GCC myself), so I have no idea whether there is a real issue or not. But basically, when you redistribute binaries, you must respect the upstream license, otherwise the upstream copyright holder (namely: the FSF) may complain. When you redistribute binaries of GPL software, the following principles are roughly enough:
- do not distribute binaries built from original sources mixed with any GPL-incompatible sources
- provide your sources if a user of your binaries asks for them
The above rules are pretty easy to respect. Of course you are free to use any license of your choice for your own tools (i.e. brownout), with your own rules.

So basically, I just say: continue Brown GCC on your way, that's your project, as an alternative to FreeMiNT GCC. If it was a bad thing, it wouldn't be used.

Not yet. I will integrate other people's efforts on m68k-atari-mint GCC 7 when it's mature. Currently I only provide GCC 4.6.4, that one has been stable for years.

leech wrote:Granted last time I tried compiling something (QED) it couldn't find cflib, which I'm sure I have installed.

It worked for me; I recompiled QED myself without trouble. A potential issue could have been the default settings in my Cygwin cross-tools installer: in older versions the default installation type was set to Typical (without CFLib), while in more recent versions it is set to Full (with CFLib).

I have started playing with a little app I wrote in 2010. I thought I might see if I could work out how to build it with gcc 7 and the Brown tools, but I see it is Windows-only and I was hoping for a native Atari version.

I know most builds are slow even on the 060 machines but my app is tiny and takes less than 30sec to build and link.

Ran into a couple of issues when building with the script on a Solaris machine: sudo needs to default to pfexec on Solaris, and make needs to default to gmake so that GNU make is used instead of Sun's make. Other than that it seems to be compiling GCC fine, at least so far!

Why use the brown gcc? You can just use the usual cross-compiler from Vincent's site. If you need a newer compiler version, you can try http://tho-otto.de/crossmint.php. The key feature of the brown toolchain, link-time optimization, does not work for mintlib.

Hey, is there a problem switching --enable-decimal-float to --disable-decimal-float in your GCC build script? I'm building on a SPARC target which doesn't support decimal float. I won't know the results for a while, so I figure I'd ask too.

LuigiThirty wrote:Hey, is there a problem switching --enable-decimal-float to --disable-decimal-float in your GCC build script? I'm building on a SPARC target which doesn't support decimal float. I won't know the results for a while, so I figure I'd ask too.

Unless you try to compile some Atari program that needs decimal floats with the resulting compiler, that shouldn't be a problem. But it shouldn't be necessary either: the host does not have to support decimal floats in order to enable that switch.

It refused to build saying my host compiler didn't support decimal floats until I disabled it.

Never looked into that, but it's a bit strange, considering that this is a cross-compiler. I.e. when that feature was first introduced, the previous gcc used to build the new gcc didn't have support for decimal floats either. But actually, I've not yet seen any software that really needs this feature.

I got GCC 7.2 working and built some hello world applications! Awesome!

I'm adapting my Pure-C GEM game source over to Mintlib and GEMlib. Having some trouble with crashes though, and being without the Pure-C debugger is a pain. Is it possible to get symbolic debugging going with a TOS application built in GCC? Or are my chances dire enough that I should keep working on my homemade GDB stub for TOS?

The most common problem when switching GEM programs from Pure-C to gcc is that you need to declare most variables as short instead of int. Turning warnings on with -Wall (and fixing the reported problems, of course) might help.

LuigiThirty wrote:Is it possible to get symbolic debugging going with a TOS application built in GCC?

If you mean generating the debug information that PD needs: no, that's not possible, unless someone reverse-engineers the format PD needs and implements it in GCC. You can of course generate other debug information with gcc, but unfortunately that does not help much, because gdb for MiNT is in a poor state and does not work at all most of the time.

LuigiThirty wrote:Or are my chances dire enough that I should keep working on my homemade GDB stub for TOS?

I started writing a GDB stub that's linkable with TOS applications. (It's also in Pure-C and its assembler format so I'll need to adapt it to GCC and GAS...) So far it hooks the bus error, address error, and illegal instruction vectors and dumps the 68K registers to the screen when it crashes. At that point you can connect to the ST over RS-232 with GDB and examine registers and read/write memory. That's about it so far.