This is purely theoretical. What are the chances at any time of somebody developing the next Windows, macOS, or Linux? Is it an absolutely zero chance? How many years will it take for a team of at minimum 5 members? Is that still too small?

What happened to championing variety? It seems that everything has only a few narrow choices *cough* DIRECTV/Dish Network *cough* Don't get me started on that.

It is very, very small, IMAO, but not zero. Unfortunately, it depends on too many factors, most of which have nothing to do with the specific OS or even with computers in general, to estimate it any more precisely than that.

I will say this: if one does come up, it will probably involve at least one dramatically new technology; it will be outstanding at something above and beyond the others currently in use, in a way that cannot be easily or effectively retrofitted onto the existing ones; and beyond that, its success will not be primarily decided for technology-driven reasons (it has to have the 'killer app' aspect to get its foot in the door, but the rest has little if anything to do with the qualities, or lack thereof, of the OS itself).

And it will have either a charismatic and utterly ruthless - to the point of psychopathy - figure in charge of it, with the same sort of cult-of-personality Jobs had, or else be driven by ideological passion like that shown by Stallman and his devotees (but for a cause that is either utterly marginalized today or which doesn't exist yet). Probably both.

I would say that there is maybe a 1% chance of it happening in the next ten years, and if it does, it will be one no one saw coming.

Assuming anyone is still alive at all ten years from now, and we still possess the level of civilization and industry needed for it to even matter, that is. I give odds of about 1:1500 against on that. When I'm feeling optimistic.

_________________
Rev. First Speaker Schol-R-LEA;2 LCF ELF JAM POEE KoR KCO PPWMTF
μή εἶναι βασιλικήν ἀτραπόν ἐπί γεωμετρίαν
Lisp programmers tend to seem very odd to outsiders, just like anyone else who has had a religious experience they can't quite explain to others.

Assuming you just mean general-purpose/desktop OSs, all three you mention have been under constant, active development for at least two decades, and have been worked on by literally hundreds of people, if not thousands.

So for a new OS, you need some way of getting hundreds of experts working on it, and then in 5-10 years maybe you have something as capable as Windows or Linux.

MS's way of accomplishing this is dropping a butt-load of $$ and hiring hundreds of people; presumably very few others are willing or able to do that!

Linux took advantage of a unique circumstance, maybe not even intentionally. Linux 1.0 dropped when there was a legion of disaffected Unix programmers, sick of the anti-competitive and anti-open-source practices of the old-school Unix vendors, and with a pathological hatred of M$. Torvalds offered a working, open-source kernel when no one else could. So Linux basically got decades of expert effort for free: they used Linux, so they improved Linux, and everyone benefited. Any new OS looking to go the same way would have to offer something exciting enough to lure away some of the Linux devs.

All this makes me say the chances of a totally fresh OS, with no GNU, Linux, Windows or FreeBSD pedigree coming out and carving a chunk out of the desktop market is low enough to safely call it zero.

Note that while 386BSD (aka Jolix, an ancestor of FreeBSD and NetBSD) was around, it was not obviously untrammeled - or at the very least, there were accusations of it using AT&T sources in places. There were also legal hassles between the principals of the project itself, which eventually led to NetBSD and FreeBSD forking off from it, and later to OpenBSD forking from NetBSD in turn. While it was there as a not-quite-free Unix, its future seemed uncertain in ways Linux's did not, despite Linux having very little to offer at the time.


I agree with Schol-R-LEA, but I don't think it would necessarily have to involve new technology; it might just satisfy some use that users didn't know they wanted. My personal bet is something that takes what Linux has managed even further and builds a platform capable of making most computing fully 'disappear' into the world: a sort of smart distributed system that no one even realizes is everywhere. I also don't think it'll be someone who set out to make an OS that manages this; more likely a lot of individual factors will line up, like perhaps a business acquired by a large embedded-systems manufacturer that ends up dominating the IoT space.

_________________
"If the truth is a cruel mistress, then a lie must be a nice girl"
Working on Cardinal
Find me at #Cardinal-OS on freenode!

Build something that does something better than, or equal to, something else in some niche respect, for a better price and with better support, and people will use it. Don't aim to be better than everything, or even better at general tasks; you'll go nowhere with that, because you won't be able to figure out what order to do things in, or what you need for a release.

Find something you want to do really well and make it happen. And learn some dang marketing/interviewing skills. I've seen a lot of really smart people with bright ideas on paper who would be a great asset if only they could sell themselves and their ideas. "I know you think your <thing> is good, but why should I think it's that good?"

Can't sell even a niche OS to a market of that niche if you can't convince people to buy into it.

Well, it would be very hard to become the next big OS. Windows, Mac, and Linux have been around since the 80s and 90s. Do you think OSDev is simple? It is not like making a 2D game. Also, these operating systems have millions of lines of code.

If you think OSDev is like baking a cake, then you're not suitable for it.

mac: I am aiming to sell my OS and have actual users. It's extremely hard - and I don't even have hardware available on the market yet, so I have to emulate it. I didn't want to go native x86 (a saturated market, thousands of OSs, and a dying platform) or ARM (no unified IO, and I will not write 200 images for 120 platforms); I wasn't satisfied with them either. So I decided to turn SUBLEQ into a living thing; there I don't have to worry about competitors. But I can only sell 5-10 license keys yearly, which is nowhere near enough to earn back the time spent. However, I have finished a bootable emulator for x86, and I will have other platforms soon - maybe 2 or 3 platforms yearly. Maybe there will even be an FPGA and/or transistor-level implementation by the end of summer, done by a group. But having the technology is not enough; I also must find market segments where I can sell it.

An FPGA? I can believe that; that's just Verilog coding. An ASIC, or an actual custom IC? Pardon me while I go shovel that BS into my garden...

I don't know if you are lying, or if someone is lying to you, but there is zero probability of that happening. I don't know what time scale you are talking about, but unless this is the culmination of several years' work by electronic engineers, it simply isn't believable. And if you mean that they are starting now and will get it done before September - no. Even Intel couldn't do that. There are fixed time constraints.


An FPGA? done before September - no. Even Intel couldn't do that. There are fixed time constraints

They could, should they decide to make a single-instruction CPU, drop stupid legacy things, and do like video games: ship an unfinished product and deliver its bugfixes via patches.

If you don't care about performance at all, then you can do nothing and say it takes 1000 years to execute one instruction. You'll be long gone before anyone realises you lied.

If you do care about performance, then it's the same complexity as a sane CPU except that you have the additional hassle of needing some extra "try to convert this pattern of idiotic SUBLEQ nonsense into a usable sequence of useful micro-ops" logic after the decoder (so that a large quantity of SUBLEQ instructions can be converted to a single "shift left" micro-op, or a single "AND" micro-op, or...).
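To make the fusion argument concrete, here is a toy Python subleq VM (purely illustrative; the memory layout and halting convention are my own assumptions, not Geri's design or any real hardware). It shows that even a single "shift left by one" (x *= 2, i.e. x += x) costs three whole subleq instructions, which is exactly the kind of pattern a serious implementation would want to collapse into one micro-op:

```python
# Toy subleq VM. One instruction: (a, b, c) means
#   mem[b] -= mem[a]; jump to c if the result is <= 0, else fall through.
# Jumping past the end of the program halts the VM.

def run_subleq(mem, program):
    ip, steps = 0, 0
    while 0 <= ip < len(program):
        a, b, c = program[ip]
        mem[b] -= mem[a]
        ip = c if mem[b] <= 0 else ip + 1
        steps += 1
    return steps

# "Shift left by one" (x *= 2) takes THREE subleqs, using a zeroed
# scratch cell Z:  Z -= x (Z = -x);  x -= Z (x = 2x);  Z -= Z (clear Z).
X, Z = 0, 1
mem = [6, 0]
program = [
    (X, Z, 1),  # Z = -x; result <= 0, so c just points at the next slot
    (Z, X, 2),  # x = x - (-x) = 2x; positive, so it falls through anyway
    (Z, Z, 3),  # clear scratch; 0 <= 0, so the jump to 3 halts the VM
]
steps = run_subleq(mem, program)
print(mem[X], steps)  # 12 3
```

Three dynamic instructions (and a burned scratch cell) per shift is the "idiotic SUBLEQ nonsense" a pattern-matching decoder would try to recognize and emit as a single shift micro-op.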

Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.

I am assuming that Geri is expecting that it will a) run the code exactly as written (without concern for efficiency, because, hey, a simple opcode has to run fast, right?) without doing any of that sort of thing, and b) be able to run the subleq opcode in a single, short cycle, with no pipelining, instruction and data caching, branch prediction, microcoding, etc., and get as good or better performance than a 'super complex' design that, you know, has instructions that can be implemented in a simple circuit such as an adder, a negator, a NOR, or a move to register.

I didn't choose those by accident; they are all operations required in a naive implementation of the Subleq instruction. The adder is needed twice, in fact. Assuming you are using two's complement, the subtract is a negation followed by an add, the negation in turn being a NOT followed by an increment (basically a single-bit add-with-carry on the zero bit). This is followed by the compare, and since this is subleq and not subeqz, it needs a general comparison, which is usually implemented as a subtract followed by a compare to zero. The compare-to-zero itself is just a NOR with n inputs, where n is the bit width, producing a single-bit result (combined with the sign bit, this gives you 'less than or equal to zero').

You will also need to decode the implicit operands into a set of three registers: two for the values to be operated on and a third for the target. You would also need an instruction register, and every instruction needs to end with the IP being either incremented or loaded with the branch target - in practice, it might be easier to do the increment in parallel into a fourth hidden register, and use the result of the multi-input NOR to select which of the two target-address registers to load into the IP.
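The datapath just described can be sketched in software. This is a toy bit-level Python model of one subleq step (my own illustrative layout: the three operand addresses sit at ip, ip+1, ip+2, and the word width N is an arbitrary choice; none of this is anyone's actual hardware):

```python
# Bit-level sketch of a naive N-bit subleq datapath (illustrative only).

N = 16
MASK = (1 << N) - 1

def negate(x):
    # Two's complement negation: NOT, then increment.
    return (~x + 1) & MASK

def add(a, b):
    # The adder, used for both the subtract and (conceptually) the IP increment.
    return (a + b) & MASK

def is_zero(x):
    # The n-input NOR: 1 iff every bit of x is 0.
    return int(x == 0)

def subleq_step(mem, ip):
    # Decode the three implicit operands into three "registers".
    a, b, c = mem[ip], mem[ip + 1], mem[ip + 2]
    diff = add(mem[b], negate(mem[a]))   # subtract = add the negation
    mem[b] = diff
    sign = diff >> (N - 1)               # MSB = "negative" in two's complement
    take_branch = sign | is_zero(diff)   # "less than or equal to zero"
    # Select branch target or incremented IP (here, ip+3: past the operands).
    return c if take_branch else ip + 3

# One step: operands at cells 0..2, data at cells 3 and 4.
mem = [3, 4, 9, 5, 5]                    # mem[4] -= mem[3] -> 0, so branch to 9
print(subleq_step(mem, 0), mem[4])       # 9 0
```

Note how even this naive single step already needs the negator, two uses of the adder (counting the IP update), the wide NOR, and the operand/target register moves, which is the point being made above.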

All of this is without considering the massive amount of content-addressable memory you would need for caching these operands, because even with the sort of process you are likely to have access to, a load from memory would take at least a few hundred CPU cycles.

This is just off the top of my head; there are surely more efficient implementations. The thing is, anything like that would require time and effort, and Geri doesn't seem to grasp that just because an idea is simple doesn't mean it is a good one. As both Brendan and I have said repeatedly, the overall result of this in silicon would be at least as complex as any other current-day microprocessor, and would still never perform as well.

But all that is beside the point. Even for a straightforward, naive implementation such as I just described, designing the circuits is only the first step in developing a new IC, and by no means the longest. For an FPGA, yes, that is most of it, but it isn't even close for a custom ASIC, never mind a full IC implementation. And for that ground-up IC, the process costs millions (in either USD or euros) just for creating a mask, as I have already stated. There are maskless processes, true, but those cost even more - those costs are something you are not going to get around, period.

Also, Geri? Please tell me you haven't given these people any money. The way you describe what they told you seems terribly suspicious to me, though that might be just my cynicism talking.


Last edited by Schol-R-LEA on Fri May 12, 2017 11:00 am, edited 2 times in total.

Also, if no one eventually shows up with a working implementation on FPGAs, I will do it personally after learning it. However, I really think there will be implementations soon. (There are libraries to puzzle anything together, and if a real expert shows up, he can optimize and clean up the implementation if he wishes.)
