
time passes by, let the memory live forever

Monthly Archives: July 2009

Introduction

What’s wrong with a good ol’ joypad?

Throughout the 80s and 90s, the preferred method of input for the majority of games consoles was the tried-and-trusted joypad. Although joypads remain the de facto input standard for current-generation consoles such as Microsoft’s Xbox 360 and Sony’s PlayStation 3, they face an increasing threat from motion-sensing controllers.

It’s worth noting that the technology for motion-sensed input has been available for many years, but the control method first showed signs of becoming mainstream in 2006 with the launch of the Nintendo Wii console.

Unlike rivals Sony and Microsoft, Nintendo opted to take a different approach to gaming and has since reaped the benefits. Its motion-sensing controller, dubbed the Wii Remote and bundled with each Wii console, has become hugely popular with both new and existing gamers due to its pick-up-and-play nature. It is often cited as the primary reason for the Nintendo Wii’s prominent lead as this generation’s biggest-selling games console, having sold over 50 million units.

What’s the appeal?

While there’s nothing inherently wrong with a joypad – and we all still love a bit of button-mashing – the general consensus is that joypads are primarily accessible to users familiar with gaming. With profits in mind, the goal for the likes of Microsoft, Nintendo and Sony is to get everyone and anyone gaming.

A motion-sensing controller, therefore, is better suited to gamers of all ages, and, as the Wii Remote has shown during its first few years on the market, there’s plenty of appeal for those who’ve never gamed before. With Nintendo capturing a massive market of ‘casual’ gamers, both Microsoft and Sony – whose current-generation consoles have largely targeted hardcore users – are hoping to one day claim their slice of the casual-gaming pie. How will they do it? Well, the plan is to follow in the footsteps of Nintendo with motion-sensing devices of their own – only, with a twist.

In June 2009, at a major annual games conference, both Microsoft and Sony unveiled upcoming motion-sensing devices that are expected to make their debut on the Xbox 360 and PlayStation 3, respectively, in 2010.

So, let’s take a look at all three offerings, finding out how they work, and what they offer.

Nintendo Wii Remote

The Wii Remote – also known as the WiiMote – which you’ve no doubt seen, or even used, is bundled with Nintendo’s Wii console and acts as its primary controller.

Its unique ability to detect motion and act as a pointing device is made possible by a combination of modern-day technologies. It might be about the same size as a regular TV remote, but there’s plenty happening in this little device.

First and foremost, the Wii Remote needs to be able to detect where a user is pointing, allowing users to click on certain areas of a screen – similar, in a sense, to how a mouse cursor is used to navigate a computer. In the case of the Wii Remote, this is achieved by a sensor bar equipped with a series of LEDs (pictured below) and an optical sensor built into the remote.

The Wii Remote’s built-in optical sensor acts like a camera, locating the sensor bar’s infrared LEDs in its field of view. By plotting where the two spots of light fall (one from each end of the sensor bar), the Wii console is able to determine where the remote is pointing in relation to the screen.
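As a rough sketch of the idea – using Python, a made-up camera resolution and an assumed coordinate convention, not anything from Nintendo’s actual firmware – two tracked light spots can be turned into a cursor position and a roll angle like this:

```python
import math

# Assumed camera resolution; the real sensor's numbers may differ.
CAM_W, CAM_H = 1024, 768

def pointer_position(dot_a, dot_b):
    """Turn two tracked IR dots into a cursor position and roll angle.

    dot_a, dot_b: (x, y) pixel positions of the sensor bar's two light
    clusters as seen by the remote's camera. Returns normalised screen
    coordinates in [0, 1] plus the remote's roll, in degrees.
    """
    mid_x = (dot_a[0] + dot_b[0]) / 2
    mid_y = (dot_a[1] + dot_b[1]) / 2
    # The camera view is mirrored relative to the screen: move the remote
    # right and the dots drift left, so both axes are inverted.
    sx = 1.0 - mid_x / CAM_W
    sy = 1.0 - mid_y / CAM_H
    # The tilt of the line joining the two dots reveals the remote's roll.
    roll = math.degrees(math.atan2(dot_b[1] - dot_a[1], dot_b[0] - dot_a[0]))
    return sx, sy, roll

# Two level dots dead-centre in the camera view: cursor lands mid-screen.
print(pointer_position((472, 384), (552, 384)))  # (0.5, 0.5, 0.0)
```

The spacing between the two dots carries extra information too: the further apart they appear, the closer the remote is to the sensor bar.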

All sounds very clever – and it is – but it has one obvious drawback, too: the remote must keep the sensor bar within its field of view in order for the pointing system to work.

In addition to knowing where a user is pointing, the Wii Remote can also calculate how it’s being moved. This is done with accelerometers – tiny chips containing a sliver of silicon, anchored at one end and free to move at the other, suspended in an electric field created by a pair of capacitor plates. Accelerating the remote in one direction causes the silicon to move and disturb that field. The change is translated into motion, and because the Wii Remote measures acceleration on three axes, it can recognise a variety of gestures such as moving side to side, twisting, and pushing and pulling.
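As a hedged illustration of what a three-axis reading buys you: when the remote is held still, the only acceleration acting on it is gravity (about 1 g), so the way that 1 g splits across the three axes reveals the remote’s tilt. The axis naming below is an assumption, not Nintendo’s convention:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (in degrees) from accelerations, in g,
    along three axes - valid only while the device is roughly still."""
    pitch = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    roll = math.degrees(math.atan2(ax, az))
    return pitch, roll

# Lying flat, gravity sits entirely on the z axis: no pitch, no roll.
print(tilt_from_accel(0.0, 0.0, 1.0))
# Tipped forward so gravity sits on the y axis: roughly 90 degrees of pitch.
print(tilt_from_accel(0.0, 1.0, 0.0))
```

The catch is that an accelerometer alone can’t tell a gentle tilt from a linear shove – both change the reading the same way – which is one reason adding a gyroscope improves tracking.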

On top of all this, the data captured by the optical sensor and accelerometers needs to be sent back to the Wii console without wires. To achieve that, the Wii Remote contains a built-in Bluetooth chip that allows for two-way communication with the console.

Sounds brilliant, and it is, but hardcore gamers have argued that the Wii Remote isn’t completely accurate and doesn’t offer control as precise as, say, a joypad’s. Hoping to increase the accuracy of the remote, Nintendo in June 2009 launched an expansion device dubbed Wii MotionPlus.

The optional accessory, pictured above, plugs into any existing Wii Remote and features a multi-axis gyroscope that, when combined with the accelerometer and sensor-bar, should offer a far more accurate tracking mechanism.

That’s the Wii Remote, and it has thus far set the standard for motion-based games control. Nonetheless, both Sony and Microsoft think they can top it in 2010.

Sony PlayStation Motion Controller

In-the-know readers will be aware that Sony’s PlayStation 3 console launched with a motion-sensing controller back in 2006. The peripheral, dubbed the SixAxis Wireless Controller, is essentially a DualShock joypad equipped with sensors that allow it to track rotation along the three axes of roll, pitch and yaw, as well as acceleration along the X, Y and Z axes.

The SixAxis was deemed by many in the media to be a last-minute attempt to scupper Nintendo’s Wii Remote. It has so far failed to gather widespread interest from game developers and its use for motion detection has become something of a rarity. The SixAxis was later succeeded by the DualShock 3, a similar joypad with added rumble functionality.

Trying its hand at the motion-sensing game once again, Sony will next year launch its second attempt in the form of a PlayStation Motion Controller, pictured below.

Not a whole lot is known about the device, and it doesn’t yet have a finalised name. Sony, however, has publicly stated that the controller will be available in spring 2010. As of now, we’ve only seen it demonstrated in prototype form. So, what do we know so far?

The PlayStation Motion Controller prototype is a handheld device similar in size to the Wii Remote, and it’s quickly becoming known as “the wand” due to its illuminated orb.

As it turns out, that orb is key to the controller’s functionality. It’s equipped with LEDs that can shine in a variety of colours, and it’s this coloured sphere that is tracked by the controller’s accompanying component – the PlayStation Eye.

Unlike the Wii Remote – which, remember, uses an infrared light-emitting sensor bar to track position – the PlayStation Motion Controller will be detected by the PlayStation Eye webcam. Because the controller’s top is spherical, it looks the same from any angle, so the webcam can determine its position and distance by tracking where the sphere appears in the image and how large it is.
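The ranging trick is simple pinhole-camera geometry: a sphere of known size appears smaller in proportion to its distance. The numbers in this Python sketch are invented for illustration and are not PlayStation Eye specifications:

```python
FOCAL_LENGTH_PX = 600.0   # assumed focal length of the camera, in pixels
SPHERE_DIAMETER_CM = 4.0  # assumed real diameter of the glowing orb

def distance_cm(apparent_diameter_px):
    """Distance to the orb, from how many pixels wide it appears."""
    return FOCAL_LENGTH_PX * SPHERE_DIAMETER_CM / apparent_diameter_px

print(distance_cm(24.0))  # an orb 24 px wide is 100.0 cm away
print(distance_cm(12.0))  # half the apparent size, twice the distance: 200.0 cm
```

Combined with where the orb sits in the frame, that single size measurement is enough to place the controller in three dimensions.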

With the orb allowing for the ‘wand’ to be tracked in three dimensions, Sony’s controller then uses additional WiiMote-like technology to sense motion. Although Sony is yet to detail the finer workings of the controller, early indications suggest that the device is equipped with accelerometers and multi-axis gyroscopes that offer motion-tracking functions similar to the Wii MotionPlus.

At the prototype stage, Sony’s PlayStation Motion Controller appears to be more evolutionary than revolutionary, but the promise of ‘high-precision, sub-millimeter’ accuracy should be far more appealing to developers than the lacklustre SixAxis.

That leaves Microsoft, so let’s see what it has to offer.

Microsoft Project Natal

Microsoft’s attempt to reach out to the vast casual gaming audience has taken on a unique form. Although motion-sensing remains a key ingredient, Microsoft hopes to create a gesture-based experience without the need for users to hold any form of peripheral.

Sounds bold, but it reckons it can do it with a piece of kit codenamed “Project Natal”.

In its simplest form, Project Natal (pictured above) is a horizontal peripheral measuring roughly 25cm. It’s designed to sit below or above a television screen and connects to an Xbox 360 console. Once connected, it will spot human presence via facial recognition and allow users to control both the Xbox 360 interface and games using gestures or voice commands.

Sounds promising, so how does it do it? Inside Project Natal are a number of components that come together to deliver the full experience – including a webcam and multi-array microphone, a depth sensor and a processor running proprietary software.

All seems a bit complicated, but putting it simply, Project Natal emits an array of beams that hit objects in its path and bounce back to the device. With the data returned, the on-board processor’s proprietary software is able to map a surprisingly accurate image of everything in its field of view.
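In effect, each pixel of the resulting depth image carries a distance, and simple back-projection turns it into a point in 3D space. This Python sketch assumes an arbitrary resolution and field of view – they are not Natal’s real specifications:

```python
import math

WIDTH, HEIGHT = 320, 240                   # assumed depth-image resolution
FOV_X = math.radians(57.0)                 # assumed horizontal field of view
FOCAL = (WIDTH / 2) / math.tan(FOV_X / 2)  # implied focal length, in pixels

def to_3d(px, py, depth_m):
    """Back-project pixel (px, py), with its measured depth in metres,
    into an (x, y, z) position relative to the sensor."""
    x = (px - WIDTH / 2) * depth_m / FOCAL
    y = (py - HEIGHT / 2) * depth_m / FOCAL
    return x, y, depth_m

# A pixel dead-centre in the image maps to a point straight ahead.
print(to_3d(160, 120, 2.0))  # (0.0, 0.0, 2.0)
```

Do that for every pixel and you get the accurate map of the scene that the body-tracking software then works on.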

Exactly how accurate? Well, irrespective of objects in a room – including, for example, a couch – Project Natal is able to map the full body of multiple players, with project director Kudo Tsunoda suggesting that it could track a user’s individual fingers depending on distance from the device.

The theory is that a user is able to walk in front of their console, be instantly recognised, and control the system and its games by moving parts of their body in a 3D space mapped by Natal.

Is it too good to be true? Although Microsoft has publicly shown Natal tech demonstrations, it remains to be seen how well it can be implemented in real-world usage. However, game developers have already shown a keen interest, and Project Natal has quickly become the focus of media attention. It’s expected to become available in 2010, but despite its apparent technological advantage, we’d assume it would be more costly to produce than, say, a Wii Remote or a PlayStation Motion Controller, and that may be Natal’s biggest obstacle as it attempts to lure a casual-gaming audience in the midst of a recession.

Summary

The age-old joypad continues to live on as the preferred input method for the experienced gamer, but following the pioneering efforts of Nintendo, motion-based control looks set to become a key gaming element in the coming years.

With all three of the biggest games console manufacturers pushing the technology forward, both casual and hardcore gamers alike can expect to be faced with multiple motion-sensing controller choices in the near future.

here’s a quite nice comparison of the three motion-sensing devices that are coming to gamers’ wishlists soon.. article from Hexus.Gaming.

BLACKSBURG, Va. — Texting while driving increases the risk of a crash much more than previous studies have concluded with motorists taking their eyes off the road longer than they do when talking or listening on their cell phones, a safety research institute said Monday.

The Virginia Tech Transportation Institute used cameras to continuously observe light vehicle drivers and truckers for more than 6 million miles. It found that when drivers of heavy trucks texted, their collision risk was 23 times greater than when not texting.

Dialing a cell phone and using or reaching for an electronic device increased risk of collision about 6 times in cars and trucks.

Recent research using driving simulators suggested that talking and listening were as dangerous as texting, but the “naturalistic driving studies clearly indicate that this is not the case,” a news release from the institute said. The risks of texting generally applied to all drivers, not just truckers, the researchers said. Complete results were expected to be released Tuesday.

Right before a crash or near collision, drivers spent nearly five seconds looking at their devices, which was enough time at 55 mph to cover more than the length of a football field.
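The arithmetic is easy to check in Python, taking 4.6 seconds as a stand-in for “nearly five” and 300 ft (the distance between a football field’s goal lines) as the yardstick:

```python
FEET_PER_MILE = 5280
FIELD_LENGTH_FT = 300                       # goal line to goal line

speed_ft_per_s = 55 * FEET_PER_MILE / 3600  # 55 mph is about 80.7 ft/s
distance_ft = speed_ft_per_s * 4.6          # distance covered eyes-down

print(round(distance_ft))                   # about 371 ft
print(distance_ft > FIELD_LENGTH_FT)        # True
```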

“Talking/listening to a cell phone allowed drivers to maintain eyes on the road and were not associated with an increased safety risk to nearly the same degree,” the institute said. “These results show conclusively that a real key to significantly improving safety is keeping your eyes on the road.”

The institute recommended that texting should be banned for all drivers and all cell phone use should be prohibited for newly licensed teen drivers. Fourteen states do ban texting while driving.

The study also concluded that headset cell phone use is not substantially safer than hand-held because the primary risks associated with both are answering, dialing, and other tasks that take drivers’ eyes off the road.

Voice activated systems are less risky if they are designed well enough so drivers do not have to take their eyes off the road often or for long periods.

A call to the institute was not immediately returned Monday night for more details.

i have to agree with this although i still do it. gotta stop doing this. it’s dangerous! article from GoogleNews.

Michael Owen scored four times during Manchester United’s tour of the Far East. Photograph: Aly Song/Reuters

Ryan Giggs believes Michael Owen provides Manchester United with the kind of out-and-out striker they have not had since Ruud van Nistelrooy left for Real Madrid and is confident the former Newcastle United player will improve the balance and potency of the team’s attack. Although Giggs acknowledges that the real tests for Owen are yet to come, he is encouraged by the four goals the forward scored in as many games during United’s tour of east Asia.

This week’s Audi Cup in Munich, in which United face Boca Juniors and either Bayern Munich or Milan, should give a fairer indication of how one of Sir Alex Ferguson’s most unexpected signings will begin the campaign. Only in Seoul – the one game in which he did not find the net – did Owen encounter serious opposition. Nevertheless, Giggs is convinced the 29-year-old can improve United’s attack.

“I don’t think there was any real scepticism among the players about his arrival. The manager explained that he provides something we haven’t got,” said Giggs. “Kiko [Federico Macheda] is probably the most like Michael in the sense that he is an out-and-out goalscorer but he is still very young. Wayne and Dimitar Berbatov do things outside the box.
“Michael is a pure predator and that is something we have not really had since Ruud left. Before that we had Andy Cole and Ole Gunnar Solskjaer but our chances-to-conversion rate was not so good last season and Michael should improve that. He is a poacher who scores all types of goals. They all come from inside the box – that’s where he does his work and that is where he comes alive.”
In his five years at Old Trafford, Van Nistelrooy never once found the net from outside the area and Giggs detects other similarities. “The good thing about Michael over the years is that if he misses a chance, it doesn’t bother him one bit. That’s what great goalscorers do – they are always convinced they will get another chance. Ruud had that same mentality too.”

Should Owen start the season in this kind of form, even the cold-eyed Fabio Capello, who began his reign as England manager sceptical about a man who has scored 40 times at international level, might just bring him back. Owen has not played since Capello’s last defeat – a 1–0 friendly reverse against France in March 2008 – but is still prepared to back himself to overcome Sir Bobby Charlton’s tally of 49 goals.

“I am sure if he is playing and scoring for United, the rest will come,” said Giggs. “But looking at his goals-to-games ratio, if he gets another 15 internationals, then he will probably overtake Sir Bobby.”

and now the legend talks of owen. i believe in fergie’s decision. and with giggs’ words.. what else can a fan say?? next season seems very interesting. i think im going to like this new team. there won’t be a single-star show anymore. manchester united will come back as a team. article from Guardian.co.uk.

A photographer has captured a stunning photo of the space shuttle Endeavour docked with the International Space Station crossing the face of the sun.

You couldn’t just aim your digital camera at the sky and get results like this. Thierry Legault, who is known for his amazing astronomical imagery, uses specialized solar filters to capture the images.

When the shuttle docked with the ISS on July 15, the combined crews set a new record for space-vehicle occupancy. The 13 people aboard the station are the most that have been aboard the same vehicle in space. The astronauts have installed a “porch” on the space station for space-exposed experiments. The new addition effectively completes the Japanese Kibo laboratory.

Astronauts are deploying a variety of other scientific installations, too. One public-interest project, the Tomatosphere II, exposes millions of tomato seeds to space, which are then returned to Earth and distributed to classrooms across North America.

If you like Legault’s photograph, make sure to check out his other work, including his shot of the space shuttle Atlantis solo-transiting the sun.

PETALING JAYA (July 26, 2009): Close to 200 people turned up to pay their respects at the funeral of talented filmmaker Yasmin Ahmad, who passed away at 11.25pm on Saturday.

The multi-racial crowd, many of them in tears, at the funeral this morning was testament to Yasmin’s ability to reach across ethnic and cultural boundaries in both her films and the television commercials she created for Petronas.

The prayers took place at about 10am at Masjid Abu Bakar As Siddiq at Section 19/7A, Subang Jaya before her body was taken to the Muslim burial ground at USJ22.

Yasmin, who collapsed after suffering a stroke and undergoing surgery for a cerebral hemorrhage on Thursday, had been a creative force in the advertising and film industries, having won awards and accolades both locally and internationally.

Her interracial love story Sepet (2004) was accorded the Best Film Award and the Best Original Screenplay Award at the Malaysian Film Festival 2005. It also bagged the Asian Film Award at the Tokyo International Film Festival 2005, and the Grand Prix Award at the Creteil International Women’s Film Festival in the same year.

Her other films included Gubra (2006), Mukhsin (2006), Muallaf (2008) and Talentime (2009).

The 51-year-old Muar-born Yasmin, who was married to Abdullah Tan Yew Leong, began her career as a copywriter with Ogilvy & Mather before joining Leo Burnett, where she rose to become its executive creative director.

Yasmin’s sudden death leaves a void in the local filmmaking industry, and her legacy will be long remembered.

Award-winning actress Azean Irdawaty, who worked with Yasmin on Talentime, told theSun she was in Singapore when she heard about Yasmin’s death. She rushed back just in time for the funeral.

“I am glad that I managed to see and kiss Yasmin for the last time,” said Azean, who, in her rush to make it to the mosque, ended up leaving her luggage at the airport.

Azean, who has been diagnosed with breast cancer, remembered an incident during the press conference for Talentime, when the actress said she did not mind if the movie was to be her last.

“It is ironic that it ended up being Yasmin’s last film instead,” she told theSun in a phone interview.

Fellow director Othman Hafsham, when met at the funeral, was effusive in his praise for Yasmin.
“She is a director who speaks her mind and her films reflect this,” he said.

“Only in this country were her movies considered controversial. But in international film festivals, they all accepted her movies. Indeed she has become a role model for many aspiring directors out there,” he said.

Actress Ida Nerina, who is currently recovering from a bad fall, sent a text message to theSun to share her thoughts on Yasmin.

“Yasmin’s passing is a tragic loss to us, the selfish living,” she said. “Yasmin Ahmad was more than just a talented creative director. She was a loving daughter, wife, mother, sister and a teacher to many of us who are fortunate to have had her to sweep us off our feet.

“Yasmin was fiercely patriotic and a true Malaysian. I think she has taught many of us to trust our gut when being creative. But truth be known the most important lesson Yasmin Ahmad taught me was humility,” she added.

Independent filmmaker and author Amir Muhammad was also similarly affected by Yasmin’s death.
In an SMS, he said: “I still haven’t fully processed this yet. So I can only say I will miss her terribly but her courage and her compassion will be felt for a very long time among an audience that will only get bigger.”

More tributes to Yasmin Ahmad

>> Mahesh Jugal Kishor, who worked with Yasmin in Talentime: “First of all I’d like to apologise if I cry talking about her.

“She is never a director with you. She always acts like a family member to her cast and crew. She is like mother to me. She is like a friend to me.

“What impressed me about her is that she can relate to everyone regardless their age. She will talk to the elders with so much respect. But when it comes to the younger people, she can be like one of us. She can talk our lingo.

“As a director she is not rigid. She gives the actors their creative freedom to interpret their characters.”

“I must tell you that I am not coping with her death very well.”

>> Elza Irdalynna, who starred in Talentime: “She has a way to make her film shoots a fun affair. It was like going on a vacation. She was a hilarious woman. She loved to play the piano and to sing.

She had a very beautiful voice.

“I spoke to her a few hours before she suffered the stroke. She graciously agreed to put on a special screening of Talentime for my boyfriend, who is from the United States. We were supposed to meet her on Monday to see the movie together. But it didn’t take place.”

>> Actress Maya Karin: “I have never worked with her. But I find she has a beautiful soul and she has been kind to everybody, and that is reflected in her movies.”

>> Actress Azean Irdawaty wrote a special poem dedicated to Yasmin:

She was an angel
God sent us
No, he lent us
To be a storyteller
Who spun stories of
Magic, joy and enchantment
Who we often lose sight of
Her fight was brief
Her leaving was a grief
Just like her movies
The end came too soon
But her wisdom remain
like words to a tune
That we will hold dear
Forever you will be near.
I love you Yasmin
I will miss you always.

this is sad news for us malaysians. yasmin ahmad was a great film maker. her ideas were different from others. i watched most of her movies and loved most of them. her way of telling a story comes from a different perspective, allowing us to open our minds further. im gonna miss her ads for petronas, tnb and others during our festive seasons. her ads have always been the best on tv.

Our continuing Linux-vs.-Windows series turns now to the absolute basics — the most universal, and occasionally most important, task you will undertake with any computer. Whatever software and OS you use, whatever you do with the machine, sooner or later you’re going to install, update or upgrade something. How does the process compare on the two platforms?

(Again, Mac OS folk, you’re not the topic of discussion here. If you want to comment on the .dmg experience or other aspects of tending your Apple orchards, please do so in comments, civilly.)

Windows applications these days, whether downloaded or installed from optical disc, tend to include installation wizards; at the very least, there’s likely to be a setup.exe program in there. Click and go (or just “go” if it’s an autorun).

Linux packages come in a few different wrappings, depending on your preferred flavor. The installation process for *nix packages used to be rather tedious. Many experienced users are familiar with .tar balls, which are similar to .zip files under Windows. The .tar, .gz, and .tgz extensions all indicated that you had before you an archive, which had to be unpacked and the readme or install file sought in the collection within.

Fortunately, we’re past all that now, thanks to package management — a development that brought Linux installation management on an ease-of-use par with other operating systems by making the install process part of the operating system, not part of the individual package. The first iteration of the genre was given the unfortunate name “pms” (package management system). Perhaps in deference to the greater needs of humanity, it was followed quickly by RPP (Red Hat Software Program Packages), Red Hat’s first essay in the field. Red Hat later turned to RPM (RPM Package Manager, thank you Unix recursivity fiends), still a going concern for the Red Hat/Fedora/RHEL contingent. Yum (YellowDog Updater Modified) is one of the most popular package managers for that crowd.

The Ubuntu project (which is based on Debian — the full family name is actually Debian Gnu/Linux, so you know) focused especially on making the install and upgrade process pain-free — and in doing so actually provides a model that closed-source OS vendors would do well to follow. Like its progenitor Debian, Ubuntu uses the APT (Advanced Package Tool)-based Synaptic package-management tool to handle installations (and, as we’ll see, updates and upgrades). APT, which is one of the centerpieces of the Debian/Ubuntu usability philosophy, is the interface to the wide world of DEB packages; instead of every individual package toting its own installation, the smarts for the process lie in the OS itself. Specifically, APT manages and resolves problems with dependencies — a ticket out of the dreaded “dependency hell,” in fact. APT sits atop dpkg, a Debian package manager.

To APT, a repository looks like a collection of files, plus an index. The index tells APT (and, therefore, Synaptic) about a desired program’s dependencies, the additional files required to make the thing run. (Windows users, think “dynamic link library” here.) Synaptic checks the local machine to see if any of the listed dependencies need to be retrieved along with the program itself, and it tells you before installation if it will retrieve those for you.
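Conceptually, that index lookup is just a walk over a dependency graph. Here’s a toy Python sketch with an invented index – real APT does far more, handling versions, conflicts and alternatives:

```python
# Invented package index: each package lists what it depends on.
INDEX = {
    "open-yahtzee": ["libwxgtk", "libc6"],
    "libwxgtk": ["libgtk2", "libc6"],
    "libgtk2": ["libc6"],
    "libc6": [],
}

def resolve(package, installed=frozenset()):
    """Return every package that must be fetched so `package` will run."""
    needed, stack = set(), [package]
    while stack:
        pkg = stack.pop()
        if pkg in installed or pkg in needed:
            continue  # already on the system, or already scheduled
        needed.add(pkg)
        stack.extend(INDEX[pkg])  # follow the dependency chain
    return needed

# A bare system needs the game plus its whole dependency closure...
print(sorted(resolve("open-yahtzee")))
# ...while a system that already has libc6 fetches only the rest.
print(sorted(resolve("open-yahtzee", installed={"libc6"})))
```

The “already scheduled” check is what keeps circular dependencies from looping forever, and it’s also why a shared library gets fetched once no matter how many packages depend on it.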

More importantly, APT handles problems in which a package’s dependencies conflict with each other, are circular to each other, or are otherwise out of control. Windows users will easily recognize that mess: It’s DLL Hell by another name — and if Windows’ Add/Remove Programs function (Programs and Features in Vista) behaved nicely, it would actually do this sort of dependency tracking rather than simply enquiring of the setup.exe files it finds on the computer.

Jeremy Garcia, founder and proprietor of LinuxQuestions.org, notes that “the newer repository-based Linux distributions have gone to great lengths to mitigate the dependency hell issue… it’s something I rarely hear complaints of anymore.”

For our purposes, let’s look at how the process works in Ubuntu. When installing a new app, the easiest method is to fire up Synaptic Package Manager and type in the name or even just a few description terms concerning what you want: “yahtzee game,” for instance. Synaptic knows of several software repositories — collections of software that are carefully maintained and checked for malware and such — and users can add third-party repositories to check if they choose. By default, all repositories sign their packages, providing a level of quality assurance.

Some people find the Linux package management tools and repositories confusing, and some of that is due to the creative (and sometimes silly) naming of the tools. When the rubber hits the road, though, it’s not that complicated. RPM, YUM, RHN, and several others all relate to management of packages in RPM format. APT, Synaptic, Ubuntu Update Manager, Canonical’s commercial package manager — all of these relate to management of packages in the Debian DEB format. And if you happen to want a package that’s only available in RPM format for your Debian system (or vice versa) you can use a utility called Alien to translate between the two package formats, and keep everything on your system under the watchful eye of your chosen package manager.

There are four types of repositories in the Ubuntu universe: main, restricted, universe and multiverse. Main repositories hold officially supported software. Restricted software is for whatever reason (local laws, patent issues) not available under a completely free license, and you will want to know why before you install it. (“Free” in this case doesn’t mean free-like-beer but free-like-speech; if a package may not be examined, modified, and improved by the community, it’s not free.) Ubuntu has sorted matters out in this fashion, but once again it’s a wide Linux world out there, and you’re apt to encounter other terminology if you choose other flavors of the OS.

Software in the “universe” repository isn’t official, but is maintained by the community; sometimes particularly popular and well-supported packages are promoted from universe to main. And many “multiverse” wares (e.g., closed-source drivers required to play DVDs on an open-source system) are not free-like-speech; you’ll need to be in touch with the copyright holder to find out your responsibilities there. Many repositories of any stripe are signed with GPG keys to authenticate identity; APT looks for that authentication and warns users if it’s not available.
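In practice, those component names appear side by side in the system’s package-source list, /etc/apt/sources.list. A typical pair of entries looks something like the following, where jaunty is just an example release name:

```
deb http://archive.ubuntu.com/ubuntu jaunty main restricted universe multiverse
deb http://security.ubuntu.com/ubuntu jaunty-security main restricted universe multiverse
```

Each deb line names a repository mirror, a release, and which components of that release APT should draw packages from.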

To the end user, all this looks like: Open Synaptic. Type in search term. Select stuff that looks cool. Click “Apply.” Done. Because the default repositories are actively curated, there’s very little danger of malware; because the packages themselves must conform with Debian’s install rules, the end user needs to do little to nothing to complete the process; because APT manages every file and configuration component completely, every package can be updated or removed completely without breaking the rest of the system. The applications are even sorted appropriately; my new Open Yahtzee game (I really do knock myself out for you people, don’t I) appeared under Games with no prompting from me at all.

If you’re determined, you can still do old-style installs in Debian, circumventing APT. If you’re compiling your own software or installing some truly paleolithic code, you can end up scattering files and such all around your system, none of it tracked by APT. But you’ve really got to try.

Linux offers a few choices for managing your updates, but in Ubuntu, again, the method of choice is Synaptic. The system periodically checks online for updates to all the applications it sees on your system. Updates fall into four categories: critical security updates, recommended updates for serious problems not related to security, pre-released updates (mmmm, beta), and unsupported updates, which are mainly fixes for older, no longer generally supported versions of Ubuntu. Most users will automatically update only those patches falling in the first two categories.

The process is otherwise identical to install — click and go. It is recommended, by the way, that users always do updates before upgrades to either individual programs or the OS.

Backing up applications

Windows users installing from optical disc are wise to keep those in case they’re needed later (along with any required license keys, of course). Because the repository model works as it does, Linux users may choose to simply rely on those servers. However, the program APTonCD makes it quite simple to create discs with backup copies of the packages installed on your system. The Ubuntu system must, however, be told about the specific disc from which you wish to install. That’s a three-step system-administration process, and you will need to have the actual disc in hand and ready to drop into the machine. (It can also help you burn discs of packages you don’t have on your system — if you wanted to hand someone a nice clean install disk, for instance — as Jeremy Garcia points out, “There are many corporate environments where Internet access is not available, for a variety of reasons.”)

Or, if you like, on the basic Synaptic menu, there’s an option to export a list of every blessed item on your system that APT is tracking. Take that list to another machine, import it, and Synaptic will install everything on the list, including appropriate updates, from the repositories.
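
The command-line equivalent of that export/import trick is a dpkg one-liner on each end. A sketch, assuming both machines run the same Ubuntu release:

```shell
# On the source machine: dump the list of installed packages
dpkg --get-selections > package-list.txt

# On the target machine: mark everything on the list for installation...
sudo dpkg --set-selections < package-list.txt

# ...then let APT fetch and install it all from the repositories
sudo apt-get dselect-upgrade
```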

Rolling back applications

Though the Internet often provides little recourse when one seeks an earlier version of a Windows program one didn’t have the sense to back up before an ill-advised install, both Linux platforms offer varying degrees of rollback ease, depending on which applications you’re dealing with.

Debian / Ubuntu leaders made a decision that new packages would not automatically uninstall older versions. This may or may not present tidying-up challenges for tiny-disked systems — in my own experience I find that the Computer Janitor utility does an adequate job of keeping things in check — but it certainly makes it easier to revert to an earlier version of a particular program.

Updating the OS

Minor updates to Windows are pushed out about once a month, or more often when Microsoft chooses to release an out-of-cycle patch. Major updates — the Service Packs — are less frequent. Desktop systems are often configured to automatically install updates when they become available, while Windows servers are typically configured to notify (but not install) updates so that proper testing can occur.

In Linux, the operating system and the kernel are constantly being updated. That doesn’t mean you need to update every time something changes, and as with Windows there are perfectly good reasons to wait — as with other operating systems, an upgrade can occasionally cause confusion with dependencies and break third-party software (especially, Murphy’s Law being what it is, on production machines). On the other hand, tiny performance improvements, support for newer gadgets, and assorted bug fixes may mean you find the prospect of frequently freshened kernels appealing, especially if you’re not doing the installation for any machine but your own. And many sysadmins would in any case like to automate the update process as much as possible for civilian users.

One good reason to update your kernel is to prepare for a larger upgrade; while a major version upgrade itself can’t easily be rolled back once installed, the kernel, modules, or specific applications all can. A cautious or curious sysadmin could get a preview of how a newer version of the OS treats an older application by upgrading the kernel, checking its behavior, then testing individual applications to see how they behave, rather than upgrading the whole shebang and hoping for the best.

In related thinking, smart Linux users make their /home/ — the directory for data and documents — on a separate partition from the OS installation. That way, changes to the OS — up to and including switching to an all-new Linux distribution — don’t necessarily require you to reconfigure all applications and reload all your documents, photos, and other user data.
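
That separate /home amounts to one line in /etc/fstab; the device name and filesystem type below are purely illustrative and will differ on your machine:

```
# /etc/fstab -- keep user data on its own partition
# <device>    <mount point>  <type>  <options>  <dump>  <pass>
/dev/sda3     /home          ext3    defaults   0       2
```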

Upgrading the OS

Late October is going to be a big time for you no matter which OS you use; Windows 7 is expected for release on the 22nd, while Ubuntu is expected to level up to Karmic Koala (did we mention the amusement factor in Ubuntu’s naming system?) on the 29th.

Whether or not you think Windows or Linux has the edge here is perhaps dependent on what you expect from a large install process. With Windows, the process goes relatively well if you remember to do your BIOS upgrades before you start the process (and are sure your current version can be upgraded to the new one). Linux upgrades must be done in lockstep, and if you’re more than one version behind you’ll have to install all the intervening versions until you’re up to date. On the other hand, upgrades for Linux can be done over the Net if you like; when upgrading Windows, by contrast, you’re wise to get offline completely.
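
On Ubuntu, that over-the-Net release upgrade is a single tool. A sketch of the usual sequence:

```shell
# Bring the current release fully up to date first
sudo apt-get update && sudo apt-get upgrade

# Then let Ubuntu's upgrade tool step you to the next release
sudo do-release-upgrade
```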

Backing up the OS

Windows users who purchase their machines with the OS pre-loaded used to be supplied with rescue disks in case of disaster; these days, it’s on a partition on the hard drive itself (and heaven help you if the drive fails). Various good options exist for backing up one’s Registry and system files in case of trouble. But things can get a little awkward (or expensive) when it’s time to start absolutely fresh with a clean install. (And every machine needs to do that now and then.) Did you save your license key? If the Debian / Ubuntu effort made nothing else simpler, the “free” part ensures a lot less drama when chaos strikes.

Rolling back the OS

It happens: You need to be where you were, not where you are. In Windows, you’re hosed; format and start again. In Linux… you’re still hosed. You can, however, roll back the kernel as mentioned. (In fact, you’re not really rolling back the kernel itself; the upgrade process leaves the old kernel in there, available from the boot loader just in case. They’re only about 10-15 MB, after all; you have room.) That’s rather helpful for testing purposes, and can save you some unpleasant surprises with individual applications; careful use of the testing technique described above may well spare you the need to roll back at all. (Also, this is an excellent time to have done that /home/ partition we mentioned.)
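
You can see the kernels your boot loader has to choose from without rebooting; a quick sketch:

```shell
# Which kernel is running right now?
uname -r

# Which kernel packages are installed? ('ii' marks installed entries)
dpkg --list 'linux-image-*' | grep '^ii'
```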

I am fairly sure RPM supports rollback, although I think it’s disabled by default; I’m not sure about dpkg. In many cases it’s also possible to force-install an older version of the package you’re having issues with.
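
With APT, force-installing an older version looks like this; the package name and version string below are placeholders you’d replace with real output from apt-cache:

```shell
# See every version of a package the repositories currently offer
apt-cache policy mypackage

# Pin-point install one of those versions; the string must match
# a version still available in a configured repository
sudo apt-get install mypackage=1.2.3-1ubuntu1
```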

So what’s the verdict? For maintenance of applications and the operating system with minimum pain and maximum control, the answer to this Can Linux Do This question is YES, and well enough that Microsoft and other closed-source shops ought to be taking notes.

Comments

methuselah

Jul 23, 2009 – 8:56 PM

Nicely written. But the APT process might sound a little scary to someone who has never seen it. In Ubuntu, adding a program can be as simple as this (for those who don’t wish to type ANYTHING, not even a portion of a program name):
Click on Applications
Click on Add/Remove
Click on the desired program Category
Scroll down and select the program(s) you wish to install
Click on Apply changes.
I’ve found installing and maintaining a Linux system is generally faster and easier than a Windows system. (Especially if you don’t have a bunch of driver disks!) Being able to do all the updates from one place in Linux is so much easier than having to go out to each and every programmer’s site for updates in Windows. I’ve heard others ask Microsoft programmers why Windows can’t offer the same convenience to users, but the ones I’ve spoken with just didn’t seem to understand.
I hope Microsoft will eventually emulate the ease of Linux maintenance.

hkb

Jul 17, 2009 – 10:49 AM

I don’t believe it’s valid to make the comparison ’cause there’s no one way to update/upgrade “Linux”. This is a critical issue for me, and it varies from one distribution to another; there’s no doubt that some distros make updates and upgrades far easier than Windows does: take a look at Foresight and Arch.

JaDaDowntown

Jul 17, 2009 – 2:36 AM edited

I have been using Linux for over a decade and Unix for more than two. If someone asks me about updates, Linux vs. Windows, you first need to ask: what is Linux? Linux is just a kernel, and everything else consists of projects added around the kernel to build an operating system.
But I will take the hit now and ask you, Angela Gunn, about the command-line tools: Arch Linux and its “pacman”, Debian and its “apt”. Have you ever used them?
Have you ever compared Windows with the graphical tools, openSUSE’s YaST or Debian’s Synaptic?
In the end you will find that everyone has a different taste, sees things differently, and uses different tools.
You had better ask: “Can a Linux distribution manage updates and upgrades more easily than Windows?” The answer is yes!

Joco

Jul 17, 2009 – 12:18 AM

I find Ubuntu repositories safe and very convenient to use. However, I don’t agree with your suggestion to try installing software from source; quoting your article: “If you’re compiling your own software or installing some truly paleolithic code, you can end up scattering files and such all around your system, none of it tracked by APT. But you’ve really got to try.”
No thanks, I have just been through two failed compiled programs. The experience was so frustrating that even with help from experienced users, I could not get the programs to work. Not only do they not work, there is no uninstall procedure! The “sudo make uninstall” is just for show (tried with Ekiga 3.2.5).
I thank the programmers for their generosity in giving away their programs. But there should be a convenient way to help them release their work in well-known package formats. Installing Linux programs from source code is way, way too complicated.

bopb99

Jul 17, 2009 – 6:57 AM

Linux needs a universal standardized package format.

DancesWithWords

Jul 16, 2009 – 9:17 PM edited

Ubuntu and Debian are fine. I use a derivative of Ubuntu called Crunchbang Linux on my Asus 901 EEE PC, along with Synaptic. The modern package managers of Linux are as good as, and dare I say better than, what Windows has offered for some time. However, unlike most computer users, I don’t want my computer to be a toaster, which is what 99.99 percent of the world wants. I want all the advantages of the modern package system while seeing gobs and gobs of code scroll up my screen. Hence my work computer, home computers, and server all run Gentoo Linux, whose Portage extends the ports system found in BSD. If you’re not a coding wizard, at least you can pretend, with all the code flying by on your screen as you compile every single package needed to run a complete system. Great for the control freak and the patient. 😉 Most want their computers to work… just work… like a toaster. I want to know why it works.

bousozoku

Jul 16, 2009 – 4:49 PM

Having used the Ubuntu distribution of Linux since 2007, I find that Linux distributions can do anything, though not always in a straightforward, simple, or transparent way.
However, support for third-party applications is usually good. I have waited on the various Canonical/Ubuntu developers to roll in application updates, but those generally don’t happen quickly. In fact, they can be several releases back, as some applications change quickly. Firefox 3.5 was still not available as of a few days ago, and the normal Mozilla way of updating has been disabled.
Still, Windows doesn’t care for third party software at all and neither does Mac OS X. Of course, you can get a package manager and play behind the scenes but they’re generally software ports from Linux and *BSD. A single point would be extremely nice.

lastjuan

Jul 16, 2009 – 3:22 PM

Sorry to sound so frivolous (well not really), but you should have used more acronyms like PITA, WOW or WTF. They are very related to the Windows update process…

marians

Jul 16, 2009 – 12:56 PM

Talking about Ubuntu and kernel updates … a kernel update usually requires a reboot. Well, not anymore, at least in Ubuntu. The very nice and free-for-Ubuntu KSplice application makes “rebootless kernel updates” a dream come true. Being available for Ubuntu only, it’s just a start. Linux users expect KSplice to provide the package for other distributions too. Having KSplice on a CentOS server (or any other Linux server) would make any admin extremely happy.
Something that Windows (and even OSX) still does not have.
And yet they don’t have to deal with the hell of video and/or sound drivers.
Upgrading to a new release makes me wonder every time if the existing ATI/NVidia drivers will work as they did before.
Easy things like setting up TV-Out or using your microphone in some trivial application (ex: Skype) are still a pain for the Linux user.
Not to mention the big confusing cloud caused by a non-unified sound architecture.
PulseAudio, ALSA, OSS, bla-bla… too many, much too many, and none of them doing the right thing as easily as Windows or Mac does. I still dream of a Linux desktop.
Mark Shuttleworth wants Ubuntu to be on par with OSX in two years from now. Well, I guess he is extremely optimistic.
GNOME is a sitting duck, lagging way behind what a modern desktop needs to be/look like.
KDE devs just seem to be playing with the whole package for their own amusement. (All I see of KDE is a huge “look ma, we can do this; it doesn’t matter if it’s just useless eye candy.”)
Upgrading to the new KDE 4.x made lots of Linux users curse.
Compiz (as in Compiz, Beryl, Compiz Fusion and the whole history behind) brings some new air to the desktop. But try watching a video or play a game while having Compiz activated …
Well, this is not part of the subject. Reverting to the subject itself, yes, Linux update/upgrade can be a painless process most of the times. Easier than Windows. Not as stupid-proof as Mac though 🙂

garretthylltun

Jul 16, 2009 – 10:26 PM

Just when I thought my newer PC could handle the latest KDE… Boy, was I unhappy with that thought. KDE 4 was not the direction I thought they were going, but it’s the direction they went. And my newer PC was like my older PCs of days gone by trying to run KDE 3: almost impossible. So again, I’m back to using either Gnome or XFCE. Good old XFCE, always there when you need it, and always light on resources. I recently busted out my old 500 MHz box, tossed Xubuntu on there, and lickety-split, it ran like a champ. I can’t put XP on it, I can’t put Vista on it, I can’t put OS X on it, and I can’t put KDE or Gnome on it, but XFCE was there yet again to save the day.
Anyway.. Long story short, I’ve given up on KDE now with the advent of version 4. 😦
BTW, check out Enlightenment DR17 if you get a chance. Eye candy without the KDE Bloat. 🙂

PC_Tool

Jul 16, 2009 – 10:52 AM

Agreed.
While it would be ideal for Windows Update to manage updates to all installed programs, considering they weren’t all “installed” from a Microsoft repository, the issues become onerous.
Where does it get the updates from? How does it know?
Developers for Windows don’t necessarily like following the rules. If Microsoft were to say, “You must use this installer (package manager), and you must include this information”, they’d be up in arms. If Microsoft told them, “You must upload the installation media and updates to this repository”, the users *and* developers would go insane.
Without the repository, you run the risk of the files no longer being where the developer said they’d be during install. This would break the functionality *and* be 100% out of Microsoft’s control. Not something they’d want to allow to happen. Talk about a PR nightmare….

elitegangsta

Jul 16, 2009 – 11:49 AM

Exactly.

Orange

Jul 16, 2009 – 1:33 PM edited

It’s only the installation package format that’s mandated. If MS were to adopt this model, I don’t think they would try and create an uber-repository of all available windows software, and it certainly doesn’t resemble that on the Linux side. Canonical (the business behind Ubuntu) handles this nicely: On a default Ubuntu install, there are thirty-something repositories already defined. Some are maintained by Canonical (analogous to Microsoft maintaining a Windows OS and Office repository), and the rest are maintained by third parties, analogous to Adobe and Google maintaining their own repositories. You want more? Add more repositories for the software you want.
For a real example, Skype maintains a repository for Deb/Ubuntu versions, and the software they make available is on their own servers. It’s just that they are using the DEB file format for their installer. E.g. their instructions are:
1. Add the Skype repository*: deb http://download.skype.com/linux/repos/debian/ stable non-free
2. Reload or update the package information
3. Install the skype package.
From that point on, Skype is installed on your Ubuntu system, AND updated automatically along with everything else on the system. To the end-user, it looks like one process, but in reality each package looks for updates from its own maintainer; Canonical, Adobe, etc etc. Now if I could just point Office 2003 (running under Wine) to a Microsoft repository…. 🙂

PC_Tool

Jul 16, 2009 – 4:12 PM

I shouldn’t have worded the first sentence of that last paragraph quite the way I did…
Supplant “Without the repository” with “Without the Microsoft repository”.
If it’s out of their control, Microsoft will not actively code to support it. Losing a repository and ending up with a broken update looks bad for Microsoft, even if they had nothing to do with the repository in question.
Now, there’s *nothing* stopping a 3rd party group of folks from doing something like this (other than scale, of course), but of course, without the built-in functionality…who’s going to use it…or even hear about it?

Ciprian.Dobrea

Jul 16, 2009 – 8:32 PM

Too true.
Someone could indeed launch a tool similar to Steam for desktop applications, or Steam could be extended to manage more apps than just games.
They already have a great way of managing licenses and a pretty nifty online store for apps (games).

This is an ongoing debate: Linux vs. Windows. I pasted it with the comments as it’s quite informative. Do take a look at the original article from BetaNews.

An object, probably a comet that nobody saw coming, plowed into the giant planet’s colorful cloud tops sometime Sunday, splashing up debris and leaving a black eye the size of the Pacific Ocean. This was the second time in 15 years that this had happened. The whole world was watching when Comet Shoemaker-Levy 9 fell apart and its pieces crashed into Jupiter in 1994, leaving Earth-size marks that persisted up to a year.

That’s Jupiter doing its cosmic job, astronomers like to say. Better it than us. Part of what makes the Earth such a nice place to live, the story goes, is that Jupiter’s overbearing gravity acts as a gravitational shield deflecting incoming space junk, mainly comets, away from the inner solar system where it could do for us what an asteroid apparently did for the dinosaurs 65 million years ago. Indeed, astronomers look for similar configurations — a giant outer planet with room for smaller planets in closer to the home stars — in other planetary systems as an indication of their hospitableness to life.

Anthony Wesley, the Australian amateur astronomer who first noticed the mark on Jupiter and sounded the alarm on Sunday, paid homage to that notion when he told The Sydney Morning Herald, “If anything like that had hit the Earth it would have been curtains for us, so we can feel very happy that Jupiter is doing its vacuum-cleaner job and hoovering up all these large pieces before they come for us.”

But is this warm and fuzzy image of the King of Planets as father-protector really true?

“I really question this idea,” said Brian G. Marsden of the Harvard-Smithsonian Center for Astrophysics, referring to Jupiter as our guardian planet. As the former director of the International Astronomical Union’s Central Bureau for Astronomical Telegrams, he has spent his career keeping track of wayward objects, particularly comets, in the solar system.

Jupiter is just as much a menace as a savior, he said. The big planet throws a lot of comets out of the solar system, but it also throws them in.

Take, for example, Comet Lexell, named after the Swedish astronomer Anders Lexell. In 1770 it whizzed only a million miles from the Earth, missing us by a cosmic whisker, Dr. Marsden said. That comet had come streaking in from the outer solar system three years earlier and passed close to Jupiter, which diverted it into a new orbit and straight toward Earth.

The comet made two passes around the Sun and in 1779 again passed very close to Jupiter, which then threw it back out of the solar system.

“It was as if Jupiter aimed at us and missed,” said Dr. Marsden, who complained that the comet would never have come anywhere near the Earth if Jupiter hadn’t thrown it at us in the first place.

Hal Levison, an astronomer at the Southwest Research Institute, in Boulder, Colo., who studies the evolution of the solar system, said that whether Jupiter was menace or protector depended on where the comets came from. Lexell, like Shoemaker-Levy 9 and probably the truck that just hit Jupiter, most likely came from an icy zone of debris known as the Kuiper Belt, which lies just outside the orbit of Neptune, he explained. Jupiter probably does increase our exposure to those comets, he said.

But Jupiter helps protect us, he said, from an even more dangerous band of comets coming from the so-called Oort Cloud, a vast spherical deep-freeze surrounding the solar system as far as a light-year from the Sun. Every once in a while, in response to gravitational nudges from a passing star or gas cloud, a comet is unleashed from storage and comes crashing inward.

Jupiter’s benign influence here comes in two forms. The cloud was initially populated in the early days of the solar system by the gravity of Uranus and Neptune sweeping up debris and flinging it outward, but Jupiter and Saturn are so strong, Dr. Levison said, that, first of all, they threw a lot of the junk out of the solar system altogether, lessening the size of this cosmic arsenal. Second, Jupiter deflects some of the comets that get dislodged and fall back in, Dr. Levison said.

“It’s a double anti-whammy,” he said.

Asteroids pose the greatest danger of all to Earth, however, astronomers say, and here Jupiter’s influence is hardly assuring. Mostly asteroids live peacefully in the asteroid belt between Mars and Jupiter, whose gravity, so the standard story goes, keeps them too stirred to coalesce into a planet but can cause them to collide and rebound in the direction of Earth.

That’s what happened, Greg Laughlin of the University of California at Santa Cruz, said, to a chunk of iron and nickel about 50 yards across roughly 10 million to 100 million years ago. The result is a hole in the desert almost a mile wide and 500 feet deep in northern Arizona, called Barringer Crater. A gift, perhaps, from our friend and lord, Jupiter.