Posted by Soulskill on Friday November 30, 2012 @03:50PM from the time-to-start-thinking-about-this-again dept.

Volanin writes "I have been using Linux for the last 15 years, both at home and at work (mostly GNOME, and now Unity). Recently I gave in to temptation and bought myself a MacBook Retina 15". As you can read around, Linux still has no good support for this hardware, so I am running it inside a virtual machine. Running in scaled 1440x900 makes the Linux fonts look absolutely terrible, and running in true 2880x1800 makes them beautiful, but every UI element becomes so tiny it's unworkable. Is there a desktop environment that handles resolution independence better? Linux has had support for SVG for a long time, but GNOME/Unity seem adamant about defining small icon sizes and UI elements with no way to resize them."

OSX isn't built on the Linux kernel. It's the OSX kernel - which is based on one of the BSDs, not Linux. But it's not the kernel that's important, it's the software that comes with it - and OSX is very different from, say, Ubuntu.

I have a Mac. I run Linux on it. Couldn't stand using OSX - I found it quite terrible for me - but I needed something with serious processing capability, so I got myself a dual-socket Mac Pro. I've also got a MacBook, also running Windows, purchased because it was the only laptop I could find anywhere with a decently high-resolution screen. I dislike Apple's software, and believe the company's business methods are quite oppressive towards their customers and potentially even a threat to free technology in general.

16075 packages that work, or 16075 packages that are available? Seriously, the last time I used a Mac with Fink, so much was broken that it just wasn't funny. From the complaints I've seen, this is still true, although probably not quite as horrid as back then.

Actually, the OS-X kernel - XNU [wikipedia.org] - is the successor to NeXTSTEP's kernel. Mach 2.5 got replaced by Mach 3.0, the BSD parts of it were replaced by the FreeBSD userland, and the driver kit by a C++ API called I/O Kit. (Wonder why they didn't use Objective-C here too?)

But I agree w/ the GP, though not for the reasons he states. OS-X is a far better system and lacks nothing that Linux has, unless one considers Quartz to be a disadvantage compared to running X11.

Not really: GTK desktops like, say, XFCE don't do that. Also, traditional WMs weren't designed for that, and the themes were typically made by l33t hackers who were somehow convinced that minimising the number of pixels in the bitmaps they used to draw their windows was cool.

KDE got a lot of flak for the early 4.x versions, because they felt terrible. But what they did (replacing many internals, reworking the architecture) has now yielded a very flexible UI. Plasma (KDE's UI) is fully based on SVG and looks good on pretty much any screen, be it a notebook, a workstation, or even a tablet. And it's not the CPU/memory hog people generally claim.

"Does "alpha" and "beta" mean anything other than greek letters to you?"

But the KDE base software was *not* beta at all. Certainly quite a lot has changed since, but kdelibs is basically what it was.

When you deal with a whole distribution, how would you say, "Hey, the basic internals are good enough, but now all the applications running on top of this foundation will have to be reviewed"? Hint: this has happened before; just look at Qt's versioning history to understand.

I've never tried it at really high resolutions, but everything I've found online says KDE supports resolution independence. And it's just so much better and more usable in so many ways than those other environments you've been using.

Have to throw in my support here. I've been using KDE since 1.x; I've tried other desktops but can't seem to use one without missing my KDE - so much so that programs compiled to bring up GTK widgets (browsers) actively piss me off. The Qt version of the file browser and so many other things are just more versatile and elegant.

HiDPI on Linux is a work in progress... and even when it *does* work, battery life goes down the crapper. Also, Thunderbolt hot-plug hasn't been figured out, but it will work as long as your Ethernet dongle is plugged in ahead of power-on. Wifi requires b43-fwcutter, etc.

It's the same as Linux on any other bleeding-edge hardware (and from a very Linux-unfriendly company)... so the entire thing has to be reverse-engineered from scratch.

Want it done faster? Buy rMBPs for the developers actually working on the drivers.

Like all things Linux, they'll get it figured out eventually. Until then, the best way around it is to just run VMware Fusion and run Linux inside of that... it solves all the above issues and really isn't that big of a performance hit. Probably not the "purist" answer you were after, but it's the easiest way to get it done in the meantime.

Your comment shows a lack of understanding as to what DPI is supposed to be used for: DPI shouldn't control scaling.

DPI stands for dots per inch, and you should configure that setting to match the actual number of dots per inch of your display. Then the SW environment should support some sort of sliding scale to let you change the size of any UI elements.
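To make that concrete, here's a quick sketch of the arithmetic, using the 15.4" 2880x1800 rMBP panel as the example (the numbers are just illustrative):

    # true DPI of a panel = diagonal pixels / diagonal inches
    awk 'BEGIN { w=2880; h=1800; diag=15.4
                 dpi = sqrt(w*w + h*h) / diag
                 printf "panel: %.0f dpi\n", dpi
                 # a point is 1/72 inch, so a 10 pt font is 10/72*dpi pixels tall
                 printf "10 pt font: %.0f px here, %.0f px at 96 dpi\n",
                        10/72*dpi, 10/72*96 }'
    # prints: panel: 221 dpi
    #         10 pt font: 31 px here, 13 px at 96 dpi

Set the DPI to the truth and a "10 pt" font stays 10 points tall on any screen; the sliding scale for UI size is then a separate knob on top.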

Sadly, most desktop platforms don't do this correctly and bind the DPI to the size of UI elements. I will admit that resolution independence isn't easy; Microsoft didn't really start down that path until Windows 7, and Apple didn't start to get close until Mountain Lion.

Having used a Retina display Mac, it irritates me that they don't just have a slider to set UI scale; instead you select from several pre-set resolutions. I suspect this is because many applications still try to plot stuff pixel by pixel and so can't scale arbitrarily. It's not easy for most SW to be truly resolution independent, and most developers seem to skip handling it sanely on all platforms.

--dpi dpi
       This also sets the reported physical size values of the screen,
       it uses the specified DPI value to compute an appropriate
       physical size using whatever pixel size will be set.

Or maybe:

--scale xxy
       Changes the dimensions of the output picture. Values superior
       to 1 will lead to a compressed screen (screen dimension bigger
       than the dimension of the output mode), and values below 1 lead
       to a zoom in on the output. This option is actually a shortcut
       version of the --transform option.
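In practice that looks something like this; the output name "eDP1" is an assumption here, so list yours with "xrandr -q" first:

    # tell X clients the real DPI instead of the default 96
    xrandr --dpi 221
    # values below 1 zoom in: a 1440x900 desktop on the 2880x1800 panel
    xrandr --output eDP1 --scale 0.5x0.5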

Didn't the GNOME desktop switch to scalable SVG rendering way back in 2004 or so (starting from Raph Levien's work on Gill back in 1999)? There were all kinds of articles back then about how awesome SVG was and how all GNU/Linux desktops would be using it soon. I thought Nautilus was designed with SVG support in mind? What happened to all that work, and when did GNOME switch back to prehistoric bitmapped stuff? That's kind of sad.

It's actually working. The situation is messy, but workable. (As usual for Linux.)

-- X.org people found out that automatic DPI detection is mostly useless because there are too many monitors out there that report incorrect information. X supports a DPI override switch, which would be a nice place to manually adjust this, but...

-- The GNOME people decided to ignore what X reports and hard-coded a 96 DPI definition.

-- On top of their hard-coded DPI, GNOME has a "text scaling factor" property (default 1.0). Increasing it causes compliant applications to render fonts and other UI elements in larger formats. The main motivation for this was to improve accessibility for visually impaired people, but it also serves people with high DPI screens. This value can be changed via the accessibility options or by installing the gnome-tweak-tool (or editing gconf). Only GTK/GNOME applications will honor this, and even then compliance isn't perfect, as some still use bitmaps for icons. But it's good.
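From a terminal, that property can also be flipped directly with gsettings (a sketch for a GNOME 3 desktop; the 2.0 factor is just an illustration):

    gsettings set org.gnome.desktop.interface text-scaling-factor 2.0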

So, for people with high DPI screens:

- Force the X DPI setting to a proper value. This will help with some applications (including most Qt/KDE ones, I think).
- Change the GNOME text-scaling-factor to something that matches the value above. I.e., if you set your X DPI to 200, then set your text-scaling-factor to 2.08 (200/96).
- For Firefox or Chromium, you'll need to manually adjust the zoom level.
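Putting that together as a minimal shell sketch for a 200 DPI screen (adjust the numbers to your panel):

    # 1. force the X DPI (alternatively, set a matching DisplaySize
    #    in the Monitor section of xorg.conf)
    xrandr --dpi 200
    # 2. matching GNOME factor: 200/96 ~= 2.08
    gsettings set org.gnome.desktop.interface text-scaling-factor 2.08
    # 3. Firefox/Chromium zoom still has to be set in the browser itself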

I'd really, REALLY like to get my hands on a powerful Linux laptop with such a high resolution screen... if I could afford it I might even settle for the virtual machine solution on the Mac, but a full-up Linux laptop with such a screen would be ideal.

During certain kinds of software development, it isn't uncommon to accumulate a dozen or more terminals and application windows displaying relevant content. Given good eyesight, there simply is no substitute for a high PPI screen when doing such work.

Check that X11 has worked out the correct DPI of the display; not all displays pass this information through correctly, and I'm not sure if virtual machines do... You can see the current DPI by using xdpyinfo.
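Something like this (the output shown is illustrative - a display stuck at the 96 DPI default despite a 2880x1800 mode):

    $ xdpyinfo | grep -B1 resolution
      dimensions:    2880x1800 pixels (762x476 millimeters)
      resolution:    96x96 dots per inch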

X11 itself is pretty good at resolution independence, but individual apps using bitmapped graphics all over the place are not.

especially because "retina" is just asinine Apple marketing lingo. Almost every LCD panel produced is purely off-the-shelf and available to any customer who wants it. In particular, there are lots of devices that have pixel densities as high as or higher than the particular models Apple selected from the catalog...

Why do you think everything Linux has to be low-end shit? Some folks want higher res, and the OP took one of a couple of routes to get it. Sorry his choice of hardware struck such a nerve. At what price point do you say money isn't wasted, or do you just not like high-end hardware?

I think a much fairer statement would be "no one who develops Linux software gives a rat's ass about Apple proprietary shit."

Fairer still would be to say "Apple Haters would self-mutilate if it put Apple in a bad light".

immediately run out and spend $3000 to validate my $3000 purchase.

You may not be aware, but Slashdot is just chock full of technical users who can use the web.

When they do so they would find the MacBook Pro Retina to be $1699, not your absurdly inflated figure.

They also, being technical users, would be asking themselves "could not a developer wanting to test resolution independence simply buy a high DPI desktop monitor and test that way also?"

Why yes. Yes they could. Too bad that you, a non-technical Apple Hater Troll, will be unable to even comprehend that question or think of similar cases before you post in the future and beclown yourself yet again.

You are kind of like the court jester who comes in and spills grape juice on your shirt on purpose. Every. Single. Day. Did you not notice that people stopped laughing long ago? And that the looks you get now are all ones of pity and horror?

There were several major projects started about a month after Retina laptops came out: Retina for Firefox, Retina for OpenOffice (LibreOffice had support day 1), Retina Ubuntu... So now you are just dead wrong. Everyone in the Linux community knows that Apple hardware is a pretty good guide to the features they are going to need to support down the road for Linux. Moreover, a huge percentage of Linux developers use Apple hardware.

As for the rest about "wasted money" and "shiny," I'll leave that to whomever.

On consumer desktop equipment, yes. Consumer mobile equipment is starting to see ludicrous DPI even in middle of the road devices, and commercial medical displays have offered very high DPI for some time.

I think you misunderstood what he said. If you kept reading, you would read, "Consumer mobile equipment is starting to see ludicrous DPI even in middle of the road devices." Mobile devices like the iPhone, iPad, Galaxy Nexus, Nexus 4, Samsung S3, Nexus 7, Nexus 10, Kindle Fire HD, and so on are all providing very high DPI displays. It is a real shame that HDTVs have made 1080p displays so cheap that it is now the standard for most desktops.

If you pay for your own electricity, upgrading to a similar LCD might pay for itself over one to two years, and you get a huge amount of desk space back. (The cat will complain about no longer being able to sleep on the monitor, however.)

They don't make any, but they are buying just about all of the available displays of that type. It's like when the iPad came out and every other large electronics company wanted to make one as well, but found Apple had bought all available production of some components. The same thing has happened with e-ink, where one Russian company bought the full run of LG's flexible screen, and it's going to be a year or so (if ever) before Onyx are selling the one they had announced.

It's not just about high dpi displays either. You can have a high resolution on a large screen while still wanting very large fonts and UI elements. It helps you see it better when your eyesight is not very good, if you have partial blindness, etc. So you can help both those with degenerative vision and those with amazing mutant vision at the same time.

Hey troll, like Apple or not, they're addressing a glaring problem by bringing out the Retina display. Our screen resolution has stagnated and even regressed due to HDTV and the buzzword compliance of 1080i. I can only hope that throwing down the gauntlet as they have will push other hardware makers to bring out their own 4K displays.

What's your point? Computer monitors do 1080p, not 1080i, and you were saying the industry was standardizing on HD buzzword compliance. Blu-ray goes up to 1080p and works on HDTVs. Why are you bringing ATSC broadcast standards into this?

Hey troll, like Apple or not, they're addressing a glaring problem by bringing out the Retina display.

That or they're creating the problem by purchasing every high resolution computer display available on the wholesale market for their own devices, making them prohibitively expensive for other manufacturers.

What glaring problem? The problem they're addressing is screen DPI, which is basically a non-problem, and not screen size, which is something I'd love to see get larger and is what you really mean when you say "resolution has stagnated."

Right now I'm stuck with a 1920x1200 monitor, and I'm glad to have that because no one makes them any more. If I were to "upgrade," I'd have to replace it with a 1920x1080 monitor. What I'd like to have is an even larger monitor, like the really nice but still way too expensive 2560x1600 monitors. (Still over $1000.)

What Apple did instead was up the pixel density, which is nice, I guess, but not really useful. Those high-DPI displays are great for a cell phone or other devices you hold in your hand, but not really great for a laptop.

Really, I'd rather see a bigger push for larger-sized monitors so I get more usable room out of the display, rather than see the DPI pushed up. All "retina" gives you is the same UI, just with four times the pixels. It may look "shiny" but it sure isn't any more useful.

I just picked up a "WQHD" (Widescreen Quad-"HD" for values of HD meaning 1280x720, so a total of 2560x1440) 27" IPS LCD monitor online for $300 US. It's very bare-bones (DVI input only, no webcam or USB hub or anything, etc.) but considering a 1920x1080 monitor at 27" is hard to come by for $200, it's an excellent price for the much less common resolution.

They make them in Korea and ship them out under a handful of brand names. A search on "wqhd monitor" will find you several places you can buy them from.

the really nice but still way too expensive 2560x1600 monitors. (Still over $1000.)

If you don't have a business case to justify $1000 for a monitor that you'll probably use for 5+ years, then you don't really need it.

I'm eyeing the Eizo 22" [amazon.com] - about $850, and it has a bit higher DPI along with the high resolution. The 2560x1600 screens are in the 30" range - the DPI isn't very good. That's fine for people with vision loss, but two screens at 1920x1200 are going to be better for most uses.

I don't know. Screen resolutions have been getting very large. E.g., I've got 1920x1200 at work, and a few years ago I would have considered that something only available for several thousand dollars at least. What Apple is really doing is providing higher DPI: high resolution but on very small screens. They're sort of solving the problem of people wanting to see more things on a laptop, but without having larger laptops.

The drawback is that I think most people really can't make use of that high DPI.

"Low resolution" is a hardware problem - you want higher-res, you need smaller pixels and more of them, and the only software that would affect that would be the software in the machines used in the design and manufacturing process for the displays.

I did this recently. After avoiding KDE because it didn't look nice, I tried it again after the GNOME devs "pulled an Apple" and said that we shouldn't be able to theme or add extensions to our desktop. It takes a bit more setup to make it exactly as you like... but you can make it *exactly* as you like. You also only need to do it once. It's well worth the minimal effort it requires.

Lulz. "If your head gasket is warped, instead of whining to a mechanic, why don't you forge yourself a new engine block?" Yeah, you can, and I'm glad the option is there, but coding your own drivers is absurdly impractical for the great majority of users.

The problem he is describing isn't a problem with support for the Retina display. It is a problem in the design of the software itself, a design limitation that makes it difficult or impossible to 'fix.' If you have a piece of software that does half of what you need and is well designed, and one that does 90% but is poorly designed, adding the missing 50% to the first may be easier and more rewarding than struggling to somehow tack on the last 10% in the second case.

you know you're both wrong and trolling, so why did you press the "submit" button at all?

Linux does fine with high-density displays. Actually, where it does worse is on extremely low-density displays. I have some 42" 1366x768 displays that take some painful tweaking to set up, since environments like KDE try to be smart about the ruler size of fonts, not noticing that these screens really do have pixels big enough to throw a rubber chicken through...
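One workaround sketch for that tweaking: pin the font DPI yourself via X resources instead of letting the environment derive it from the panel's (truthful but unhelpful) physical size:

    # force fonts to a sane pixel size regardless of the panel's real DPI
    echo "Xft.dpi: 96" >> ~/.Xresources
    xrdb -merge ~/.Xresources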