
gbl08ma writes "According to various sources, the ISO image for the upcoming Long-Term Support Ubuntu version 'Precise Pangolin' will not fit on a regular CD, since the image is expected to weigh around 750MB instead of the usual ~700MB. The idea is that users should either flash the image to a USB flash drive or burn it to a DVD. The extra room on the disc image could allow for integration of more GNOME3 components and Canonical applications. There was also a proposal to use a 1.5GB DVD image as the default download for Ubuntu 12.04."

The Linux kernel is only a few megabytes. The whole thing fits easily in the L2 or L3 cache of a modern server processor. That's Really Freaking Important for geeks like me who have to build stuff at scale.

I want to ignore your troll, but I can't. You raise an important issue, even if your motivations are suspect.

For 20 or 30 years we've had the meme "Intel giveth and Microsoft taketh away." That's shorthand for the fact that Microsoft operating systems get slower at the same rate Intel processors get faster, for a net progress of zero. It doesn't have to be that way any more.

Software does a whole lot more than it used to. Faster CPUs mean you can have software that operates at about the same speed as the old version, only it's about 100 times more feature rich and useful. That isn't zero progress.

It's not progress when today's Linux desktops run slower than XP, Vista, or Win7 on the same machine. It's bloat.

It's not progress when the same machine that ran KDE perfectly fine a few years ago struggles to run a lightweight DE such as LXDE today - because of bloat.

no, ubuntu has been turning into a bastardized bloated piece of shit distro and now this is the final nail in the coffin.
ubuntu kiddies need to use a real linux distro and stop trying to use "windows lite", it's slow, memory leaks all over the place and overall just a poor distro
ubuntu is an embarrassment to the linux community

no, ubuntu has been turning into a bastardized bloated piece of shit distro

Let me break that up:

Bloated - You could call it that, but it's also the closest to what the average desktop user would want right out of the box. The fact is I can install it for someone and they immediately have office, music players, etc. and a few toys. For that kind of user it's nice.

Turning into shit - The progression from 9.04 (stable, solid) -> 9.10 (compatibility issues out the ass) -> 10.04 (ok, a little more together, some issues but more shiny) -> 10.10 (why do I feel like this is the last stop?) -> 11.04 (wow, Unity is terrible; they should at least still have GNOME installed by default) -> 11.10 (Unity still sucks and GNOME 3 isn't near functional, shit is broken left and right, installing binary drivers all of a sudden breaks things, tons of functionality missing, strangely broken packages left and right, WTF!?). So yeah, turning into shit.

and now this is the final nail in the coffin.

The thing is Ubuntu still works and it does have a lot of polish when compared to vanilla Debian. For a lot of people it's that polish that makes the difference.

ubuntu kiddies need to use a real linux distro and stop trying to use "windows lite", it's slow, memory leaks all over the place and overall just a poor distro. ubuntu is an embarrassment to the linux community

I'm not really sure you could call it "windows lite", especially since Shuttleworth seems to be bent on making it look and feel like some sort of artistic deconstruction of OS X. Not really sure on the memory leaks thing either, but perhaps that's a Unity thing, and I don't use Unity so I just don't know. If you are a classic Linux user I'd say it's fair to call it an "embarrassment", as Ubuntu has continually chosen to discard functionality and replace it with their own brand of flashy/popular/easy -- but that's also lowered the barrier to entry and attracted a lot of new users. The fact is I can say "Linux" and now people don't look at me funny; now they get an image of an orange or purple desktop with a bunch of widgets and compositing effects.

"Like what? Using more CPU for audio than for playing a video? (yes, I did this test)"

I highly doubt that unless you're using something like VDPAU, in which case it's hardly a fair test as the CPU isn't _doing_ any video decoding.

However, it's true that PA uses more CPU time than ALSA by default; know why? Because it does higher quality resampling. Given that low quality resampling can introduce audible artifacts into your sound stream, I'm all in favour, thanks. But if you want, you can change PA to use the same cheap resampling.
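For what it's worth, PulseAudio's resampler is a one-line setting. A minimal sketch of the relevant knob in daemon.conf (the exact set of method names available depends on how your PA build was compiled):

```ini
; /etc/pulse/daemon.conf -- trade resampling quality for CPU time.
; "trivial" is the cheapest (and worst-sounding) method; most builds
; default to one of the speex-float variants.
resample-method = trivial
```

Restart PulseAudio after changing it to see the CPU difference.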

Answer: because developers got a stupid idea, and no matter how many times the users tell them it's stupid, they won't listen. I suppose technically this isn't hating the users, but in practice the results are indistinguishable--perfectly reasonable concerns from users go unheeded.

My son was born after we stopped using our VHS tapes, and he thinks they must have stored thousands of movies, given their size. And yeah, blank DVDs are now easier to buy, and I usually netboot the ubuntu livecd anyway.

Here is Shuttleworth's blog talking about Ubuntu on tablets; he wants to add tablets, not move to them:

By 14.04 LTS Ubuntu will power tablets, phones, TVs and smart screens from the car to the office kitchen, and it will connect those devices cleanly and seamlessly to the desktop, the server and the cloud.

Unity, the desktop interface in today’s Ubuntu 11.10, was designed with this in mind.

Funny thing is: I live in what many people call a 3rd world country (Romania) and I have no data caps on my Internet; and nobody I know has any data cap on their Internet line. Of course, mobile data is capped, but everything else isn't.

Older hardware which (surprisingly!) still does well with Linux, but doesn't have the capability to boot from USB - that's why you would need a CD. A DVD is probably a good-enough alternative as well since DVD drives have been pretty standard for many, many years.

Yeah, but the problem with that is this: what's the first thing to go out on a DVD/CDRW combo or a DVD burner? The ability to read DVDs. I don't know how many machines I've had through the shop that would read and burn CDs just fine but the DVD was crapped out.

So what is wrong with giving folks choice? Isn't that what FOSS is supposed to be about, choice? Why not have a 2-CD set AND a DVD with everything but the kitchen sink, why not that?

Of course I'll probably get hate for daring to even say the user should have choice. I don't know what happened to the community, but it just doesn't seem like a nice place anymore. Now it seems too many have this "You'll take this and do it our way and damned well LIKE it or STFU and go back to windblowz luser" attitude, like FOSS is an exclusive club and they're the gatekeepers or something.

I used to love keeping up with what's new, and thought back in '03 that by this time we'd see Linux boxes in every store, but somewhere along the way the ground turned sour, and the community seems to me to be more about being in a club than helping FOSS spread to the masses.

The problem with FOSS is that everyone wants the benefits, but no one wants to be part of it. And then you complain when they don't do it the way you like it.

That's not entirely true. I'm, for the most part, a FOSS user and I love the benefits it provides. Yet, while I want to be part of it and contribute (and I know I can't be the only one), I don't. Why? I have neither the resources nor the required skill set to do it. I'm definitely not rich, so hiring someone to make the mods for me is out of the question. And being a PhD student, having the time to make said implementations is out of the question, let alone learning the required languages and skills.

Umm, you have a choice? You can use an old release, go to a different distro (that different distro could be better targeted at old hardware even), or package up your own release (or take the released image and remove some packages to trim it down for your own needs).

In this case I don't see it making a lot of sense to make it not CD size just for 50 megabytes worth of data but I also don't think the user is entitled to ubuntu on a CD or that it's a project requirement for ubuntu.

You still have so many choices for installing Ubuntu, that you won't find in many non-FOSS products:
- You can use the alternate text-mode Ubuntu installation CD.
- You can boot Windows and then install Ubuntu from there, using the Windows installer.
- You can install an older version of Ubuntu and dist-upgrade it in place.
- You can boot the USB image using a GRUB floppy or CD image.
- You can borrow a USB DVD reader for the first installation (hey, if you have defective hardware, you might expect to be required to have the proper tools to overcome your problems).
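The GRUB route is handier than it sounds: GRUB 2 can loop-mount an Ubuntu ISO and hand it straight to the casper initramfs. A sketch of such a menu entry, assuming the ISO sits in the root of the partition GRUB reads (the filename is illustrative):

```
menuentry "Ubuntu live ISO" {
    set isofile=/ubuntu-12.04-desktop-i386.iso
    loopback loop $isofile
    linux (loop)/casper/vmlinuz boot=casper iso-scan/filename=$isofile noprompt
    initrd (loop)/casper/initrd.lz
}
```

This works from a GRUB CD or floppy even on machines whose BIOS refuses to boot USB directly.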

Gosh, someone sure likes to swap? I remember this kinda stuff from the days of floppies... a two floppy OS was NOT fun.

The trick for a distro has always been about supporting the old and the new. At a given point it is time to give up your 386 with its 1x CD drive and buy a new computer. At least if you want to use a distro that has made it VERY clear that it is no longer aimed at weirdos. After all, how are you going to run Unity on that old PC of yours?

Actually you're thinking of the SD card. The SD card is simple flash memory with a toggle switch, and it relies on the host controller to a) recognise that the card has the switch flipped to read-only and b) send a signal to the OS.

USB on the other hand is not a direct link to the storage medium, and has hardware flash controllers onboard. The more expensive ones implement write protection properly; the cheaper ones actually hard-limit the R/W line going to the chip. The cheap solution is robust, but it shows: because the OS doesn't know the drive is read-only, trying to write to it ends with a weird write-failure message.

That one "$2.48 for new" price you're latching on to doesn't include the $5 in shipping. View those offers taking that into account (most other sellers are free shipping) and you get $7.48, the real "street value" online.

Discounting shipping, I've seen piles of 4GB drives selling at $4 at Walgreens on clearance, which with tax would be $5.

Either way, USB drives are dirt cheap and have been for years. I'm also a bit wary of the notion that malware will infest a USB drive formatted on ext3 because you were careless enough to use your $5 drive to transfer photos to a pharmacy, instead of reserving another stick for that purpose.

My Vaio Z doesn't boot from USB. That's incredibly weird, since it's a "premium" high-spec machine, but it makes using Linux a pain in the ass. (So does their disregard of TRIM in favor of a custom SSD garbage-collection system, and their proprietary switchable graphics, and their out-of-the-box RAID 0'd SSD's, and...) It's like Sony had a serious case of NIH syndrome.

CD's just work. Newer stuff may be nice but PCs really are not standardized in any meaningful way. Booting via USB tends to be one of those things that is spotty. Some will boot on USB but only certain USB devices (ie, hard drives and removable disk drives, but not other mass storage like thumb drives). Some PCs may do this just as a security measure. PCs are not thrown away and replaced every year either, and we've gone from thumb drives being tiny and expensive to large and cheap in less than the typical life time of a PC.

Also this is Linux. Linux is very often put on older computers that people would otherwise throw away because it won't run the latest Windows very well. Those older PCs are much more likely to not support booting from thumb drives.

This might get me downrated, but honestly, I don't think Ubuntu is for everyone. I do think that Canonical wants to stay relevant with those folks who have 5 year old or younger machines.

If you need a Linux distro that fits on a CD, there are other options, but just about every machine from the past 5-6 years boots off a USB key or DVD drive. Some newer machines like netbooks and MacBook Airs don't come with optical drives (and never have; hell, I have a Toshiba Portege from 2001 without optical media).

My Athlon 64 has 1.5GB of RAM, was bought in 2005 and still runs great. And it runs Wheezy, not some old crap like Windows 98 (cue some Fedora fan saying "but Debian is old crap"). Since the Phenom II/Core2, computers have grown too powerful for our simple, daily needs like viewing Wikipedia or Youtube videos. Also, my PC is far more powerful than Atom netbooks and runs circles around any ARM phone or tablet. Since everyone seems to be designing OSs with them in mind now, I'd say my old hunk of junk is pretty well covered.

If the plan to use a 1.5GB default image goes through, that will wreak havoc on the mirroring network. That's essentially doubling the size of the default ISO, and it will likely leave some annoyed users waiting on the download. They're doing it wrong if they can't fit it on a CD.

Does that really matter that much? I can't recall the last time I didn't opt for the torrent download. It's always been the fastest way for me to get it, and I suspect the same is true in most situations.

Last I looked, Mint comes in at slightly larger than a similar aged Ubuntu release with a similar feature set. The current release is a DVD image for the full version already, only cut down versions are available on CD.

Good if it's true. I recently abandoned Ubuntu and installed Mint because of Unity. Nice, clean, and works fine out of the box (or USB stick). But I don't see how it could be more popular than Ubuntu (yet).

Practically speaking, and setting aside every small petty argument:
What would it take to make Ubuntu (or any other Linux distro) a mainstream desktop OS? (Highlight DESKTOP.)
If you were in charge of it, or could give it direction, what would you do to make it work, be accepted, and be profitable?
I am hoping this will be an interesting exercise.

I've been using Ubuntu since version 6, now 10.04. It took some time to get audio right on my machine (Fujitsu Siemens Xi2428, an older laptop), but one of the things I like most about Linux is that once it's configured properly, it will stay that way. It will not get slower over time, or suddenly change behavior, like Windows (although the last version of Windows I used is XP, and I still do, in VirtualBox).

I think Ubuntu 10.04 is a very nice looking desktop OS, it just works, everytime, no surprises. It's ideal for

Windows Vista was a hog, but Windows 7 will run on any system that Ubuntu does, and runs well on the same systems, although you may have to disable Aero. The Windows 8 developer preview is actually faster and uses less memory than Windows 7, but it does require a "DirectX 9" graphics card (most anything 2002+), as the graphics are 100% 3D-accelerated.

Win7 is also remarkably stable from what I've seen for the past 2 years or so. It's not subject to the junk XP was, like having to run ipconfig /flushdns (or rebooting).

Mainstream? As in, used by a majority of worldwide users? It would take a couple decades being pushed by a leading technology company. It's a moot question, though, because the desktop is no longer the primary platform on which users use software. Today that platform is the web, and so far we have managed not to fork or otherwise divide the web, which is a real possibility.

API. Successful modern platforms (Android, iPhone) have a nice, clean API. Ubuntu doesn't have one; it is a chaotic mess of hundreds of APIs for various tools. That was OK back in the '90s, but not today. If Ubuntu wants to win developers over from Android / iPhone, it should provide a single, unified, simple API (with an Eclipse-based IDE) and a full ecosystem of applications developed with it. If I was in charge, I would go with Java, with additional Ubuntu/Linux jars for lower-level and system tasks.

I was going to respond "get for-pay apps in the Software Center" but then I looked and that's in there now. I just came back from Debian so I didn't know. There needs to be a lot more of these for-pay apps, and they need to not suck. Ubuntu needs commercial apps that make money, and not just a little. They need an "Angry Birds" breakout success story to bait the masses of developers needed to make a successful ecosystem. That's probably the only thing Old Sweaty was ever right about. This isn't going t

I've found that to be rather hit or miss. The utilities don't work reliably, and often I end up having to redo it several times before it works. At least that was true of the utility they used to recommend for booting Ubuntu off a USB stick.

-overburn Allow wodim to write more than the official size of a medium. This feature is usually
called overburning and depends on the fact that most blank media may hold more space than
the official size. As the official size of the lead-out area on the disk is 90 seconds
(6750 sectors) and a disk usually works if there are at least 150 sectors of lead out,
all media may be overburned by at least 88 seconds (6600 sectors). Most CD recorders
only do overburning in SAO or RAW mode. Known exceptions are TEAC CD-R50S, TEAC CD-R55S
and the Panasonic CW-7502. Some drives do not allow to overburn as much as you might
like and limit the size of a CD to e.g. 76 minutes. This problem may be circumvented by
writing the CD in RAW mode because this way the drive has no chance to find the size
before starting to burn. There is no guarantee that your drive supports overburning at
all. Make a test to check if your drive implements the feature.

Overburn is a great option, but it does _not_ work with name-brand media. Cheap CD media has a lot of overburn space because the manufacturing process is, well, cheap and the tolerances are not strict. Therefore the manufacturers leave a lot of overburn space that may or may not be useful. With these you can get anywhere from 750-780 MB on a disk. The name-brand CDs are manufactured to ISO 900x specs, so they can bring the tolerances way down. You might not be able to go above 720 MB on some of these.

Note that no matter which type of disc you overburn, the end might not be readable! I hope that something nonessential is way out there at the end, and that the installer knows how to handle an unreadable portion at the end of the disc.
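To put the man page's sector numbers in perspective, here's the arithmetic, assuming standard 2048-byte Mode 1 data sectors:

```python
# CDs run at 75 sectors per second; a Mode 1 data sector carries 2048 bytes.
SECTORS_PER_SECOND = 75
BYTES_PER_DATA_SECTOR = 2048

def overburn_capacity(seconds):
    """Extra bytes gained by burning `seconds` worth of sectors into the lead-out."""
    return seconds * SECTORS_PER_SECOND * BYTES_PER_DATA_SECTOR

extra = overburn_capacity(88)  # the man page's "at least 88 seconds (6600 sectors)"
print(extra, round(extra / 2**20, 1))  # 13516800 bytes, about 12.9 MiB
```

So even the "safe" 88-second overburn buys you roughly 13 MB, well short of the ~50 MB gap between a 700MB CD and the rumoured 750MB image.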

This is ridiculous. CDs cost the same as DVDs, and if your computer has an optical drive and is new enough that you should be using normal Ubuntu instead of one of its builds designed for low-spec systems, then you have a DVD drive (and a few gigs is nothing for a USB stick). I have been burning CD images to DVDs for 5+ years now, because unless you want compatibility with really old systems there is no reason not to, and let's face it, Ubuntu is not really even compatible with those systems in the first place.

So I cannot even imagine one person being inconvenienced by this.

Now, significantly increasing the size will affect download time, but once it is on a HD, 700MB or 1.5GB are both so insignificant that it does not really matter.

I don't mind if software grows in size, but these days it just seems to grow out of control while keeping roughly the same feature sets.

Given that Ubuntu is on revision 12.x and has only now BARELY outgrown the media it originally shipped on, they're doing a hell of a lot better than most. Some growth is expected, and it's not really the fault of the Ubuntu team (they're at the mercy of package developers too).

By comparison, Adobe Reader used to be a 5MB installer. Now the damn thing is well over 50MB, and it still does the exact same thing: read PDFs. Talk about bloatware...

Size creep is inevitable. The current 700MB image has already undergone about a 700-times size creep from early operating systems, and it will continue. Ubuntu is designed for running on the common everyday computer, and as time goes on that gets more powerful. Not to mention there are still tons of hardware specs it does not function perfectly on, with more being created every day, so if nothing else, adding more drivers to the image will creep the size.

I recently reinstalled Oneiric/ia64 on a few machines because of the recently included gcc-arm packages, which were always a pain to self-compile.

I added an HTTP proxy (Squid) on my local fileserver/NAS and used the netinstall through that proxy: no duplicate downloads, up-to-date packages for everything, and I could start installing without waiting for the ISO to finish downloading.

I highly recommend that method, and it should work for most distributions out there, possibly even for WindowsUpdate;-)
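For anyone wanting to replicate that setup, the moving parts are a Squid instance allowed to cache large objects, plus pointing apt on each client at it. A sketch, with the hostname and port as assumptions about the local network:

```
# /etc/squid/squid.conf (excerpt) -- let Squid cache .deb packages,
# which are far bigger than the default object-size cap
maximum_object_size 512 MB
cache_dir ufs /var/spool/squid 10000 16 256
```

```
# /etc/apt/apt.conf.d/01proxy on each client
Acquire::http::Proxy "http://nas.local:3128/";
```

The Debian/Ubuntu installers also ask for an HTTP proxy during the mirror-selection step, so the netinstall itself goes through the same cache.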

It's about time distro makers stopped restricting their content to what they can cram within the artificial limitations of 700MB. Pretty much every desktop and laptop computer since, what, 2004 has had a DVD reader.

I've always felt that the Ubuntu DVD ISOs were a bit of an afterthought. Hopefully this will now change.

Slackware ships on a DVD, and a full install is about 5-6 GB. But it certainly isn't bloated. It's one of the quickest and most stable distributions I've used, so I hesitate to say that adding more stuff to the Ubuntu install justifies people calling it bloated. Ubuntu's selection of software is still conservative in quantity. If anything would be blamed on bloat, it would be implementing it in such a way that it negatively affects your system's performance. So if they're adding unnecessary things to the system startup, or a lot of background processes that you don't use, then that would be bloat. (In Ubuntu's case, this has been happening, but it started long before they ever decided to ship a release that was too large for a cd.)

How hard is it to put the optional stuff on a second CD? Make sure you can run a low spec PC off the first CD and put all the higher spec stuff on the second one. People will have the choice to use either the DVD, only the first CD, or the two (or more) CDs. RedHat has been doing multiple CDs for years and years....

If they don't include GNOME 2 in the new Ubuntu LTS version, we will see mass migration to Linux Mint. Linux Mint already gained 40% in a single month 'cause of the Unity & GNOME 3 debacle. I wonder when the Ubuntu decision makers are going to realise how bad their new desktop environment is.

Why doesn't Ubuntu just do as Fedora has done and use xz compression on the squashfs image? The live image for Fedora is now 565 MB, but it would have been more than 700MB using gzip compression as Ubuntu does. Reading from CD/DVD, or even flash drives and hard drives (except SSDs), is so slow compared to the CPU today anyway that it would probably be faster in most cases.
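The gzip-vs-xz gap is easy to demonstrate. This isn't squashfs itself, but Python's zlib (DEFLATE, the algorithm gzip uses) and lzma (the algorithm xz uses) on a synthetic, vaguely filesystem-like payload show the shape of it; the sample data below is made up:

```python
import lzma
import zlib

# Synthetic stand-in for image contents: lots of similar-but-not-identical
# lines, like the package paths and metadata inside a squashfs.
data = b"".join(
    b"/usr/share/doc/package-%07d/copyright changelog.Debian.gz README\n" % i
    for i in range(50_000)
)

gz = zlib.compress(data, 9)         # DEFLATE at its maximum level
xz = lzma.compress(data, preset=9)  # LZMA2 at its maximum preset

print(len(data), len(gz), len(xz))  # xz comes out noticeably smaller
```

The trade-off is compression (and to a lesser extent decompression) CPU time, which is exactly why it matters less when the disc read is the bottleneck anyway.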

Debian still does this, but I am guessing that Ubuntu wants to fit everything into one standardized package with the "one size fits all" mentality. At the same time they keep it as low as possible because not everyone has unlimited broadband internet.

This is not however strictly true, because Canonical *does* provide a minimal ISO ( here [ubuntu.com] ) which contains the kernel, userland and ethernet related stuff, after which you can do a netinstall of whatever you want.

Debian allows it, but all the software for a standard desktop is on CD1. Last time I tried CentOS it demanded 4 CDs for an install with all optional packages disabled (no desktops, no daemons), which confused and enraged me - if you're going to have a CD based install at all, then one CD should be enough to get started.

Ehh, but the thing is the latest Ubuntu is unlikely to work on the kind of old hardware that only has a CD-ROM device or can't boot from an external harddisk. AFAIK, Lubuntu and Xubuntu do fit everything in a CD.

By Pentium 4 or better, that likely means it requires SSE2 instructions, which means Athlon 64 is the minimum on the AMD side. 1GB of RAM is hard to find or get on 2001-2002 P4's as well due to the use of RDRAM. So you're basically looking at 2003-era systems as a minimum to run Ubuntu.

But finding an 8 year old or better system as a hand-me-down, at a yard sale, or even by dumpster diving isn't difficult at all. Never really has been. Most systems like that will actually still work once the typical spyware-infested XP install is removed.

Considering a brand new 4GB USB flash drive is a whopping $2.47 on Amazon (or $5 at Walgreen's) it's not that big of a deal to get one of those either.

Ubuntu made the right choice by dumping what is now an arbitrary 700MB limit. I'm sure plenty of people also "saw the light" of Linux on 1.44MB floppies in the late 90's as well, but it's almost 2012, and both eras are over now.

Any system that has been made since circa 2001 (i.e. the past 10 years) has been able to boot from USB.

Wrong. I know of some OEM boxes from the '04 era that can't, and some BIOSes from even later than that make it unnecessarily difficult - Gigabyte, IIRC, wrote its BIOS to force you to guess what kind of mass-storage your USB stick should emulate. Guess wrong and your USB stick won't boot.

Then use Debian, use Puppy Linux, use BasicLinux, use whatever. It's your choice, whether you're running an 8-core AMD Bulldozer, a $250 netbook that leaves any 2003-era system in the dust, or something from the 1990s that belongs in a museum (or landfill).

I only wish you luck on getting any modern software, such as an ACID2-compliant browser like Iceweasel or Chromium, to run on a Pentium 1 with 48MB of RAM. Such things do not constitute Windows 98 era junkware. If you're reading this with lynx, more power to you!

It's been a while, but these days it's fairly common for laptops to come without an optical disc. I'll probably get my desktops without as well as my Samsung USB drive is more than enough on the rare occasions where I really need an optical disc.

I rip all my discs to disk and load them from there, leaving my optical drive to pretty much just take up space.

I have a nice little PXE netboot environment on my home server. Currently it supports:

ubuntu-11.04-desktop-i386

ubuntu-11.10-desktop-i386

systemrescuecd-x86-2.2.0

debian

GParted Live

Every time a new release comes out, or I find a new distro which supports PXE, I unpack it in /var/lib/tftpboot and add it to the menu. The Debian installer loads first and includes a neat menu system which makes it easy to install different distributions. When I bought my new netbook I naturally checked that it supports PXE.
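For reference, a typical entry in pxelinux.cfg/default for netbooting one of those Ubuntu live images over NFS looks roughly like this (the paths, server IP and NFS export are assumptions about my setup, not anything Ubuntu ships):

```
LABEL ubuntu-11.10-desktop-i386
    MENU LABEL Ubuntu 11.10 Desktop (i386)
    KERNEL ubuntu-11.10-desktop-i386/casper/vmlinuz
    APPEND initrd=ubuntu-11.10-desktop-i386/casper/initrd.lz boot=casper netboot=nfs nfsroot=192.168.1.2:/var/lib/tftpboot/ubuntu-11.10-desktop-i386 --
```

The kernel and initrd come over TFTP; casper then mounts the unpacked image directory over NFS as the live filesystem.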