I program exclusively in Debian or Ubuntu, but I run it in VirtualBox on my Win 10 desktop, so I never have to worry about native graphics drivers, and if the system gets borked I can just roll back a snapshot.

I've actually been pretty impressed by how much better laptop support has gotten over the last 5 years, but Linux still feels fragile in a way that Windows hasn't for a long time, like I'm always worried it will fall apart at the slightest provocation. Admittedly, I feel this is mostly the X server and the rubbish GUI tools rather than anything to do with the Linux kernel itself.

Linux on the desktop. I love the idea of completely replacing MS with free software, but Linux (at least Debian-style distros) has the worst user experience imaginable, and it doesn't appear to be getting much better.

What's wrong with the UX on Debian-based distros? I think the only complaint you could make is the slow release cycle, but that's subjective, as some people prefer the tried and true.

_________________
com.sun.java.swing.plaf.nimbus.InternalFrameInternalFrameTitlePaneInternalFrameTitlePaneMaximizeButtonWindowNotFocusedState
Compiler Development Forum

On one system, if I log out and let it turn off the display, then upon returning and entering the password, the screen at first appears to turn on and work normally; but as soon as I enter the last password character and hit Enter, the screen stops being updated, as if everything hung right there (that was my impression the first time, when I ended up rebooting), although the system is otherwise functional. Restarting lightdm (or whatever it is) fixes this at the expense of losing all open windows/apps, which sucks nearly as bad as a forced reboot, except you don't corrupt the file system.

Connecting my iPhone hung the system once.

Connecting another USB device with some connectivity issues (not sure which of the two sides is at fault; it happened once) may cause the UI to open an error message window on every disconnect/read error: hundreds or thousands of identical windows in mere seconds, which makes the UI unusable and leaves it in a broken state.

You would expect icons of multiple instances of the same program/directory to group together on the "task bar". How about losing an icon because it somehow got adopted into the group of the web browser's icons, with which it has nothing in common? You can see the process in ps or top all right, but not its window or icon on the screen, because you're not looking where it's hiding!

Keyboard delay/repeat settings sometimes change by themselves. I can't completely rule out my project (a large and complex build, or QEMU), but at the moment I blame Ubuntu. Either way, I need to move the sliders in the settings a bit to restore comfortable values.

Tabs in Terminal routinely stop being draggable/movable with the mouse. I'm not sure if I'm doing something wrong every now and then and they lock in response to my erroneous actions, but I have to right-click and select Move Left/Right if I'm not happy with the position.

Botched text copy-and-paste. Between some sources and destinations I can use both keyboard and mouse; between others, only the mouse (I get garbage if I use the keyboard to paste text; AFAIR, it's when copying from Firefox to Terminal).

There are some other things, but these are the worst offenders in terms of WTFishness and overall effect.

I've actually been pretty impressed by how much better laptop support has gotten over the last 5 years, but Linux still feels fragile in a way that Windows hasn't for a long time, like I'm always worried it will fall apart at the slightest provocation. Admittedly, I feel this is mostly the X server and the rubbish GUI tools rather than anything to do with the Linux kernel itself.

I feel like Windows is the fragile one; it always seems to replace the drivers I install with the drivers it wants to install. Admittedly, the proprietary nvidia drivers are a bit rubbish on Linux, but when I tried Wayland on an Intel iGPU I was quite pleased with how smoothly it ran. It was nice to drag windows around without a ton of tearing.

StudlyCaps wrote:

I run it in a VirtualBox on my Win 10 desktop so I never have to worry about native graphics drivers and if the system gets borked, I can just rollback a snapshot.

I think that filesystems like btrfs and ZFS support snapshotting, maybe that'd be good for you.

alexfru wrote:

On one system, if I log out and let it turn off the display, then upon returning and entering the password, the screen at first appears to turn on and work normally; but as soon as I enter the last password character and hit Enter, the screen stops being updated, as if everything hung right there (that was my impression the first time, when I ended up rebooting), although the system is otherwise functional. Restarting lightdm (or whatever it is) fixes this at the expense of losing all open windows/apps, which sucks nearly as bad as a forced reboot, except you don't corrupt the file system.

What graphics drivers are you using? I have issues like this with the proprietary nvidia drivers.

alexfru wrote:

Connecting my iPhone hung the system once.

I don't use an iPhone so I can't really comment, but with my Android phone using MTP, everything just works.

alexfru wrote:

Connecting another USB device with some connectivity issues (not sure which of the two sides is at fault; it happened once) may cause the UI to open an error message window on every disconnect/read error: hundreds or thousands of identical windows in mere seconds, which makes the UI unusable and leaves it in a broken state.

Yeah, I think Ubuntu goes a bit over the top with the bug-report error messages. I think if the same one pops up multiple times, it gives you the option to suppress future messages of that type.
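For what it's worth, those popups come from apport, Ubuntu's crash/error reporter, and it can also be switched off entirely. On stock Ubuntu this is controlled by a one-line config file (sketch; path and key as shipped on the Ubuntu releases I'm aware of):

```ini
# /etc/default/apport
# 0 disables the "internal error" popups and crash uploads;
# set back to 1 (and restart the apport service) to re-enable.
enabled=0
```

That's a blunt instrument, since it also suppresses genuinely useful crash reports, but it stops the dialog storms.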

alexfru wrote:

You would expect icons of multiple instances of the same program/directory to group together on the "task bar". How about losing an icon because it somehow got adopted into the group of the web browser's icons, with which it has nothing in common? You can see the process in ps or top all right, but not its window or icon on the screen, because you're not looking where it's hiding!

I think this is a problem caused by application developers. It seems the class name in the .desktop file isn't set, and then Unity just groups things in a weird way.
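Concretely, the fix on the application side is usually a single line in the launcher file. This is a hypothetical example (application name, Exec command, and WM_CLASS value are all made up for illustration):

```ini
# /usr/share/applications/myeditor.desktop (hypothetical example)
[Desktop Entry]
Type=Application
Name=My Editor
Exec=myeditor %F
# Unity/GNOME Shell match open windows to a launcher by WM_CLASS;
# if StartupWMClass is missing or wrong, windows can end up grouped
# under whichever other launcher happens to match.
StartupWMClass=MyEditor
```

You can check a window's actual class with `xprop WM_CLASS` and click on the window in question.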

alexfru wrote:

Keyboard delay/repeat settings sometimes change by themselves. I can't completely rule out my project (a large and complex build, or QEMU), but at the moment I blame Ubuntu. Either way, I need to move the sliders in the settings a bit to restore comfortable values.

I've never changed those settings, so I can't really comment, but it's probably a bug on their end and not yours. Maybe it's worth filing a bug report so that somebody can take a look at it.

alexfru wrote:

Tabs in Terminal routinely stop being draggable/movable with the mouse. I'm not sure if I'm doing something wrong every now and then and they lock in response to my erroneous actions, but I have to right-click and select Move Left/Right if I'm not happy with the position.

I never even knew that the terminal had tabs; I usually just end up using tmux.

alexfru wrote:

Botched text copy-and-paste. Between some sources and destinations I can use both keyboard and mouse; between others, only the mouse (I get garbage if I use the keyboard to paste text; AFAIR, it's when copying from Firefox to Terminal).

There are two clipboards under X11: the PRIMARY selection, which gets filled when you select text and is pasted with the middle mouse button, and the "normal" CLIPBOARD selection used by Ctrl+C/Ctrl+V. I think you can change this behaviour, though.

I don't pay much attention to what Ubuntu does (I think that they're throwing out their Unity desktop in favour of GNOME now), but maybe things have improved since 14.04 was released 3 years ago.


I've been running Ubuntu since 2010, and I definitely don't have any of those issues (and, for crying out loud, I connected an iPhone once and it was probably more cooperative than any Android phone we ever plugged in). But yeah, it can be hit and miss (definitely more hit than it used to be back when I started using it).

And I'm not sure about the UX complaint; at least if you disregard Unity (I'm using GNOME), it's not that far off from how Windows used to be. In fact, I bet I'd probably find Windows 10 alien by now. And my mum is probably just as accustomed to Ubuntu; her biggest problem usually boils down to it loading slowly because I've hogged the CPU in the background under my account =D (I tend to open everything and the kitchen sink, OK?). That, and the hardware itself not being in great shape either; it needs some serious maintenance =/

As far as UX goes:

- On 2 or 3 PCs I've found that dirty regions in text areas don't refresh correctly, which makes it almost impossible to use text editors or xterm. I never found a solution to this that worked for me.
- I've had X refuse to auto-detect the monitor's EDID when Windows does just fine (I had to dump the EDID in Windows, then point Ubuntu at it as a file).
- I've installed packages (some in the core repos, some not) which failed to install correctly and damaged apt's dependency resolution. This is usually repairable, but it's not fun.
- Manually editing xorg.conf is just awful. Plus, the low-graphics fallback mode when you stuff something up (or when nvidia's config UI stuffs something up) often outputs at some crazy high DPI, making it impossible to see anything, or with UI elements off screen.
- A brand new computer had a dodgy setup in its internal device enumeration, so the controller for the motherboard NIC didn't initialize, which killed the startup process dead. Windows on the same computer: no issues. I had to disable a bunch of devices in the BIOS, which, luckily, that motherboard let me do.
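For the EDID case, the workaround described (dump the EDID under Windows, then feed the blob to the driver) maps to an xorg.conf snippet like the following with the proprietary nvidia driver. The connector name "DFP-0" and the file path here are examples that vary per machine:

```
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"
    # Use the EDID dumped from Windows instead of probing the monitor.
    # Replace "DFP-0" and the path with your actual connector and file.
    Option     "CustomEDID" "DFP-0:/etc/X11/edid.bin"
    Option     "UseEDID"    "true"
EndSection
```

Drivers other than nvidia have their own mechanisms (e.g. kernel `drm.edid_firmware`), so this snippet only applies to that one setup.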

I know there are probably fixes and workarounds for these issues, but they are hard to find and take a lot of time just to accomplish something that should be straightforward.

Please don't think I just like dumping on Linux, though; I really do like it. I've had certain PCs where it's run very well with minimal problems, and I'd always choose it for server work. However, in my experience, when I use Linux I spend more time maintaining the system than actually doing the work I intended to do, and I don't think that's a good thing for a desktop OS. This comes more from a place of disappointment that there isn't a better alternative than from malice.

Linux on the desktop, or better yet "the year of Linux on the desktop", is not so much a "lost cause" as a punchline: despite a very tolerant and loyal community, the phrase has definitely become one of the biggest computer jokes/memes.

Disclaimer: I am a full-time Linux user. Windows gets uptime only for those applications that won't run under Wine / PlayOnLinux. (That'd be some games, basically.)

But my favourite computing lost cause is closely tied to the Amiga user experience, or the shareware era in general. Let's take video conversion software as an example.

GUI and CLI having equal rights. (I should be able to work interactively from the GUI, interactively from the command line, or script-driven. Linux software is either GUI-only, or CLI-only with a GUI bolted on haphazardly by some other team, with all the incompatibility / discontinuation / version-mismatch fun that comes with it. As the GUI is definitely a second-class citizen on Linux, GUIs tend to be... lackluster.)

Easy things easy, complex things at least achievable, even for laymen. (Linux usually provides you with command-line tools that might be mastered with a thick book on the subject and a couple of months spent figuring out what everything is about, but without so much as an example on the manpage; plus a wide variety of GUI bolt-ons you can dig through if you are lucky. See above for how much fun those can be.)

Consistency. The very strength of Linux, constantly adapting, means that not-so-baseline applications tend to "grow stale" pretty fast, especially if they tie into kernel functionality. That online tutorial going into great detail on how to do X only works with version 1.x of the software, which has been discontinued, and 2.x works completely differently. Ha-ha, tough luck, cope; here's the new thick book for you to study.

Too many Linux software titles were written, basically, to cater to the specific needs of the author, and then left behind. "You got the source, fix it if you want." There's a distinct lack of pride of ownership in there; or an excess of it, which also happens.

There are a few exceptions (mostly mainstream titles like LibreOffice, GIMP, ...), but generally I feel that the craftsmanship of the shareware heyday has been lost.

So today users are stuck between the "corporate" force-feeding of the Microsoft world and the "works for me, fix it yourself" attitude of the Linux world. Old survivors of the shareware era, like DOpus, are getting fewer and further between.

I like the central repositories and package management of Linux, though.

Hell, Geri should take a look at it, because it might be the thing to make a modified form of SubLeq a feasible architecture.

One cross is enough for me to carry on my shoulders, but I checked out what it is anyway.

This is the first time I've even heard of these "belt" architectures, and I don't yet understand them. I checked some of the videos and got little real information, only something about a special branch predictor that lets the CPU decode from the instruction pointer in the positive and negative directions simultaneously.

This is very far from my conception of architectures; it seems like just another billion-transistor gold burial dome, where the focus is on theoretical CPU performance.

Their webpage says they don't have compilers, hardware, or an OS, but they have been developing it for 14 years.

This reminds me of the 3D game plans of those world-conquering geeks who, after 14 years of googling, have a blank 3D space where they can fly around with the WASD keys and call it a game; except these guys have even filed some patents and founded a corporation for the purpose.

14 years... either that's the case, or it can't be done, because their architecture is only viable in theory.

On the Linux question about mysteriously falling-apart graphics drivers, desktop glitches, hangs, and iPhone hangs, regarding the comments of alexfru and others:

- The brutal bugs of Debian-based distros (like Ubuntu) do not exist in the original Debian.

- Ubuntu and other shoddy Black Tie Stephen 2.0 Debian clones are made by people who don't really have knowledge about anything; they just put everything in if it looks nice to a middle-aged female manager in a diaper factory. They don't test, they don't measure, and their stuff doesn't work at all; 90% of it, including the window manager itself, is sluggish bloatware.

- Normal Debian does not have this issue, as it comes with the default Mesa drivers, including a VESA SVGA fallback and a multithreaded OpenGL software-renderer fallback, which works perfectly on almost 100% of computers, including netbooks.

- The Firefox/Chromium copying issue is a bug in Firefox/Chromium, not in Linux itself.

I wasn't trying to sell you on the Mill, Geri, just get you to look at the approaches used, to see if any of them would apply to what you are trying to accomplish. I get the impression that you started with one of the later videos, which would probably leave a lot of it unclear.

My point was that this is what a real CPU architecture project looks like. Given the resources in play, fourteen years is not at all an unreasonable time scale for developing a new architecture, even one that is faux-simple such as SubLeq. For a full-time development team on a payroll, yes, it would be a long time, but that isn't the case here: most of them are volunteers with full-time jobs elsewhere, and the few full-time employees (Godard and Edwards are pretty much it, I think) are not getting paid, but are working for potential future stock value.

Also, skepticism is quite reasonable here. I am hoping it works out, and were I a semiconductor engineer I might well do some of the volunteer work to see it happen, but I doubt that even Godard is expecting it to work out. Working to see that it might, yes, but that doesn't mean the sort of ego commitment you seem to have in SubLeq.

Note that I said 'faux-simple'. A real-world implementation of an OISC is not likely to be simple just because the instruction set is small; quite the opposite, in fact. A performant version of SubLEQ could very easily exceed the transistor count of a Kaby Lake i7, though I don't claim to be an expert on this, so I couldn't make a realistic assessment.

Don't mistake 'simple in concept' for either 'simple to use' or 'simple to implement'. This is a problem I have run into repeatedly myself (I'm a Lisper with a preference for Scheme, remember?), and while I cannot say for certain that this would be an issue for you, it certainly seems likely.
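The 'simple in concept' side is easy to show: a SUBLEQ interpreter fits in a few lines. Here is a minimal sketch in Python (the three-cells-per-instruction layout and the halt-on-out-of-range-branch convention are common SUBLEQ choices, not something taken from Geri's specific design):

```python
def subleq(mem):
    """Run a SUBLEQ program in place.

    Each instruction is three cells (a, b, c):
    mem[b] -= mem[a]; if the result is <= 0, jump to c,
    otherwise fall through to the next instruction.
    A jump target outside memory halts the machine.
    """
    pc = 0
    while 0 <= pc < len(mem):
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Tiny demo: compute B += A with three instructions and a zeroed
# scratch cell Z. Cells 9..11 hold the data: A=3, B=4, Z=0.
A, B, Z = 9, 10, 11
prog = [A, Z, 3,    # Z -= A         (Z becomes -A)
        Z, B, 6,    # B -= Z         (B becomes B + A)
        Z, Z, -1,   # Z -= Z = 0; branch taken to -1: halt
        3, 4, 0]    # data cells A, B, Z
subleq(prog)
print(prog[B])  # prints 7
```

One opcode, yet even this toy hints at why a fast hardware version gets hairy: every instruction is a memory read-modify-write followed by a data-dependent branch, which is exactly the pattern deep pipelines handle worst.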

_________________
Rev. First Speaker Schol-R-LEA;2 LCF ELF JAM POEE KoR KCO PPWMTF
μή εἶναι βασιλικήν ἀτραπόν ἐπί γεωμετρίαν ("There is no royal road to geometry")
Lisp programmers tend to seem very odd to outsiders, just like anyone else who has had a religious experience they can't quite explain to others.

Last edited by Schol-R-LEA on Wed May 10, 2017 11:33 am, edited 1 time in total.

Schol-R-LEA: well, OK, but they haven't put anything together except some slideshows; that's probably why you mentioned them here anyway. Maybe it's the same issue that already happened with Itanium: people couldn't write optimal compilers for it, even if in theory the CPU was fast. Maybe (or maybe not) they have a simple C compiler that is unable to exploit the optimizations in real-world situations. If I'm interpreting it correctly, this is a belt architecture with a VLIW instruction set; like the lame carrying the hunchback. The only relatively successful VLIW design was Elbrus, and they hang from the boobs of the Russian government; otherwise they would be unviable too. Maybe the whole VLIW approach is just doomed to death.

Ah, Geri, are you using Google Translate for this? Because much of this paragraph makes no sense in English at all.

Also, it seems that I got caught in the middle of an edit again. Here is what I added:

For a full-time development team on a payroll, yes, it would be a long time, but that isn't the case here: most of them are volunteers with full-time jobs elsewhere, and the few full-time employees (Godard and Edwards are pretty much it, I think) are not getting paid, but are working for potential future stock value.

Anyway, skepticism is quite reasonable here. I am hoping it works out, and were I a semiconductor engineer I might well do some of the volunteer work to see it happen, but I doubt that even Godard is expecting it to work out. Working to see that it might, yes, but that doesn't mean the sort of ego commitment you seem to have in SubLeq.


Last edited by Schol-R-LEA on Wed May 10, 2017 11:39 am, edited 2 times in total.

OK, not a problem. Knowing that helps explain some things. I will keep this in mind for the future.

And I got caught out again in an edit. Here is what I was adding:

I did post links to the videos earlier, but I don't know how much of them you saw. It wasn't really a slideshow; it was mainly a lecture. On their own, the slides would make no sense even to someone who knew the architecture.

