I wonder where he got this. Has this platform been ported to some other FPGA? Did gehstock port it to MiST? Or did he just use the ZX81 core with the ROM of the Jupiter Ace? I believe the Jupiter Ace has the same board as the ZX81.

Different people have different styles of programming. It doesn't mean other published sources are perfect; neither is mine. Publishing the code means:
1) You respect the person you got the code from. Remember, the original author spent a lot of time developing it.
2) Someone can pick up the code and improve it instead of porting from scratch.

That excuse is bullshit, actually. What is most awesome about MiST is its openness: I can pick any core I like and improve it. Other similar projects based on closed (or promised-but-never-published) sources have either already disappeared or stay very marginal. What Gehstock is doing spoils this picture. Now he is doing some ports and improvements; later he will lose interest and disappear, and all these cores will be abandoned without further improvement.

There are plenty of current Cores which don't work on all MiST versions, and more Cores get made instead of 'bullet proofing' the older ones. So what Gehstock is doing does not sound too bad.

I know it is all volunteer stuff, but community users have invested around 200 euros. I am up for a testing programme to 'bullet proof' troublesome Cores. I know the MiST does not have an OS as such, but could a RAM timing programme be written so that, using community MiSTs, we can get a min/max of RAM timings?

I don't think it's that simple; it probably needs somebody to check the timings on the original hardware. That said, if developers can outline some busywork to "farm out", I'm sure there will be some volunteers.

Sorgelig wrote:
Publishing the code means:
1) You respect the person you got the code from. Remember, the original author spent a lot of time developing it.
2) Someone can pick up the code and improve it instead of porting from scratch.

And depending on the license, there is a legal aspect too. All my sound cores have a GPL license. That means that if you use them and want to distribute the results you have to release your source code and your license cannot be more restrictive than GPL itself.

Higgy wrote:There are plenty of current Cores which don't work on all MiST versions, and more Cores get made instead of 'bullet proofing' the older ones. So what Gehstock is doing does not sound too bad.

I know it is all volunteer stuff, but community users have invested around 200 euros. I am up for a testing programme to 'bullet proof' troublesome Cores.

I am totally in favour of this proposal. We have too many cores that have issues. What attracted me to the FPGA world was the possibility of having exact replicas of the old hardware on modern electronics. But we are not quite there. Maybe we need to develop some sort of quality seal, like:

MiST candidate: a core that offers some functionality but is not considered ready

MiST approved: a core that covers all the functionality of the hardware it is meant to clone

MiST star: a core that on top of providing the functionality, meets the required FPGA constraints so it will work correctly on all MiST boards for all operating conditions.

MiST sun: a core that, on top of being a MiST star, provides clear documentation about usage and about its internal architecture, so other people will be able to improve it in the future.

Of course this is just a suggestion. Note that for the first two categories, we only need users who want to spend some time testing out the cores. Maybe intermediate categories would help.

Many thanks @jotego for your comments. I have tried to get this subject discussed in a previous dedicated Topic but I have not been successful in getting comments/backing from programmers.

It can be difficult. Programmers need to remember these are not personal comments/attacks. As a community we all want fully working Cores, and having a 'design guide' to ensure that Cores work on all MiST PCB revisions/RAM chips only benefits the community.

I am very methodical and solve engineering problems for my job, but I don't have any programming or FPGA knowledge, so although I can create forms, collate information and analyse results, I can't say for sure "Core X has SDRAM timing issues with Y chips on the v1.2 PCB vs Z chips on the v1.3 MiST PCB". I can only look at what the community provides me with and put forward the results.

Another big variable across MiST operation is the SDcard: FAT or FAT32, capacity (2GB, 8GB, 32GB?), speed rating (Class 2, 4, 10, etc.) and single Core or multiple Cores with the MiST Menu (and which Menu version is being used). I say, to make things easier, just have the Core being 'tested' on the SDcard and nothing else.

The ZX-Uno forums have a great spirit of helping each other; we can improve on that front too.

Just as the Core programmers get satisfaction from writing a Core, I get satisfaction from problem solving. We have all the knowledge and the people to help; it is just a matter of getting people to work together, and getting the Core programmers to spend some time 'rubber stamping' what they might think is a fully working Core.

To me there are three Cores which need looking at to start with: Sam Coupe, MSX and Archimedes. Identifying who the 'Core Lead' is would be a good start, or someone who is able to modify the existing work.

My proposal would be to start with the Sam Coupe. What is the issue, you ask? Well, this is a classic example: the latest Core (Release 20170206) does not work properly, but the one before it does (Release 20170112). Other people need to check whether they experience the same.

And again, if it works you don't just say "it works". We need the variables: PCB version, SDcard info, a link to where you downloaded the BIOS (if the Core requires one, etc.), and a link to the file you tested with.

And again, there are differences between a working Core and one where some special demo or coding trick does not quite work. I am trying to focus on the 'it works on one MiST but not at all on another MiST' cases (when they 'should' all be the same). For example, the BBC Micro Core: a lot works, and works well. A few games/demos don't work, but they do work on the ZX-Uno, so we might be able to take info from another Core, or unfortunately it might come down to the hardware used. Fixes have already been made by working together, so the BBC Micro Core is a great example of what can happen.

Sorry for the long post, but we have got to talk, and it is better if we all work together to solve issues.

Higgy wrote: Identifying who is the 'Core Lead' would be a good start, or someone who is able to modify the existing work.

Some have no clear current owner, and some have been functionally abandoned and should probably find new owners to update them for things like the YPbPr/RGsB output update and other general compatibility updates as the base MiST firmware changes. I can think of a few whose ownership is murky and which haven't been updated in ages (but could seriously do with updates): PlusTOO, FPGA64 (there are a few folks working on tweaks, but the C64 core has no clear "owner"). I don't know if the VIC-20 core has an actual current owner either. I think you may find a lot of the cores are murky at best. But I do agree with your idea, very much so!

Maybe as part of the "quality seal" listing, we should indicate whether a core needs to be adopted by someone. Perhaps that's even more important, as many of these cores need love.

It's a community, and most development has been done for fun. Instead of walking around sorting the cores, it would be good if more people started to learn programming. Since FPGA work has more of a circuit background than traditional programming, some users who are far from programming but good with circuits can try their luck with FPGAs as well.

It's from a user perspective though, so it doesn't address code quality and such. It's also important, IMO, to keep it fun for developers, so imposing too many standards can be heavy. Unless someone is willing to start rewriting cores themselves to meet their new standard.

One thing I can suggest for those who can't code FPGA would be to come up with / find / write test suites for each system, probing different aspects on real hardware, in emulators, and on FPGA. That will be useful for preservation for everybody, not just the MiST/FPGAs.

We could maintain a list of links / test files on each core doc wiki page.

More coders is always good. But in this case, coders taking a more professional approach is probably even more important. It seems that most cores were developed by very talented software coders who learned FPGA coding very fast. But hardware is not software; it is not just a matter of a different language or of parallel vs. sequential computation. As we are saying, most cores seem to violate modern synchronous design practices and lack proper timing constraints, let alone documentation. The consequences are obvious.

You know, it is not just that a core might not work reliably, or not work in some situations. A bad core might actually damage the hardware. Fortunately the chances are very low, because in most cases that might damage the hardware the core won't work at all, and then the problem would be caught very early by the developer. But...
