Category Archives: Technical

Fluff can be a PC killer, as I recently discovered, but the solution to the problem is straightforward.

As a true geek, I built my own PC, but it’s no speed demon with its mid-range quad-core processor. However, I’d been suffering from intermittent PC shutdowns that would always happen when I was at the PC and never when it was on but not in use. It was very irritating because you’d be right in the middle of something and then you’d be dumped out. I put it down to buggy software.

Last week I started to rip a few DVDs for tablet viewing, and every time, the PC would shut down within about 30 seconds of starting the file conversion. At this point I began to think that the processor might be overheating, forcing a shutdown before it was damaged. Upon opening the case, nothing looked particularly out of the ordinary; there was a bit of fluff but nothing you’d think of as being a problem. It was only when I looked more closely at the heat sink on the CPU that I saw many of the spaces between the thermal vanes were clogged with fluff.

Out with the vacuum cleaner and a good hoovering later, I powered the PC on and started a fresh rip. This time the PC didn’t shut down and I was able to rip solidly for at least an hour. Problem solved!

Tip of the Day – if you are experiencing intermittent crashes or shutdowns, open your PC and give your CPU’s heat sink and fan a clean with the vacuum cleaner.
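If you want to rule overheating in or out before you even open the case, you can watch the CPU temperature climb while doing something demanding like a rip. On Windows a utility such as SpeedFan will show it; on Linux the kernel exposes the same readings under /sys/class/thermal. Here is a rough sketch of reading them (the path layout is the standard sysfs one, but the zone names and count vary by machine):

```python
from pathlib import Path

def cpu_temperatures(base="/sys/class/thermal"):
    """Read temperatures in degrees Celsius from Linux thermal zones.

    Each zone's `temp` file holds millidegrees Celsius; zones that
    can't be read are skipped. Returns {zone name: temperature}.
    """
    temps = {}
    for zone in sorted(Path(base).glob("thermal_zone*")):
        try:
            millideg = int((zone / "temp").read_text().strip())
        except (OSError, ValueError):
            continue
        temps[zone.name] = millideg / 1000.0
    return temps
```

Run it before and during a rip; a reading that races upwards followed by a shutdown is a strong hint that the heat sink needs that hoovering.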

And if any Americans out there are wondering what “fluff” is, I believe that you know it as “lint”.

Last weekend, I upgraded my NAS from 2 TB to 4 TB and it was all too easy. The NAS is a Buffalo LinkStation Duo, but as the drives are mirrored, I only get half the total 2 TB capacity, i.e. 1 TB. I was getting pretty close to having the full terabyte of data on the unit, so I decided it was time for a storage upgrade. However, the last time I upgraded another model of NAS, it involved much chicanery and re-installing of firmware via USB, so I proceeded with trepidation.

Not so this time. It was mostly lots of waiting interspersed with a few minutes of activity, followed by first time success. Disappointingly little geekery was required.

Step 1. Buy a pair of 2 TB SATA hard drives. The LinkStation already had Seagate drives installed, so I played it safe and bought some Seagate Barracuda drives. Wait a couple of days for the drives to arrive in the post…

Step 2. Backup the data from the NAS to an external USB drive. My favourite tool for this is rsync because it simply copies files (no archives or zip files) and you can stop and start the backup as you like. You can even keep using the NAS up until the last minute before running one final rsync to copy the latest changes over. Leave the backup to run overnight…
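The reason rsync can be stopped and restarted at will is that it skips files that already look up to date at the destination. rsync itself is the right tool for the job, but as a rough illustration of the idea, here is a toy Python version of that skip-unchanged logic (comparing size and modification time, as rsync does by default):

```python
import shutil
from pathlib import Path

def sync(src, dest):
    """Copy files from src to dest, skipping ones that look unchanged.

    A toy version of what rsync does: compare size and modification
    time, and only copy files that differ, so an interrupted backup
    can simply be re-run. Returns the list of files copied this pass.
    """
    src, dest = Path(src), Path(dest)
    copied = []
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dest / f.relative_to(src)
        if target.exists():
            s, t = f.stat(), target.stat()
            if s.st_size == t.st_size and int(s.st_mtime) <= int(t.st_mtime):
                continue  # unchanged since the last run, skip it
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)  # copy2 preserves timestamps
        copied.append(str(target))
    return copied
```

Re-running after an interruption only copies what is new or changed, which is exactly why you can keep using the NAS and do one final pass at the last minute.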

Step 3. Shutdown the LinkStation via the web interface.

Step 4. Remove hard drives, insert new ones.

Step 5. Power up the Linkstation and log on via the web interface.

Step 6. Format the drives in turn. Configure as RAID 1. Wait for the best part of a day while the array synchronises…

It was pleasantly straightforward to upgrade the NAS and a big change from the last occasion I had to swap a disk. For sure it takes a couple of days to do the swap, but the time is spent shuffling data around, not actually working on the unit. Definitely a recommended upgrade.

From time to time you may feel like pounding your keyboard in frustration, but Mionix has actually invented a keyboard that makes that possible. Now, we aren’t recommending that you take out your rage on an inanimate object, just pointing out that they claim that this thing is pretty much indestructible.

Every key on this new keyboard is mounted on a steel plate and rated for 50 million keystrokes. What they are selling here is a long-lasting keyboard, not one you can hammer at will. This keyboard, they claim, should last you 10 years. The keyboard cable is heavy duty, and it even has a built-in audio jack and USB port.

In addition to the keyboard, Mionix also has a new mouse and mouse pad, both of which have some unique, highly tested features. The new keyboard carries an MSRP of $149.99, the mouse is $79.99, and mouse pads begin at $24.99. You can find out more at Mionix.

If you’ve driven at all, you’ve probably been in a traffic jam at least once in your life. If you live in the Northeast or around any major metropolitan area such as Chicago, Dallas, or Los Angeles, you may feel like you are in one almost daily. Sometimes there is a visible cause such as a car accident or roadwork, but other times a traffic jam seems to appear for no reason at all. Scientists and engineers have been studying this phenomenon for years. In 2007, ScienceDaily published an article explaining how this can easily happen: a truck switching lanes can cause the traffic behind it to slow below a critical speed. The traffic around the incident clears and moves forward, but the problem rolls back like a wave, creating the traffic jam. There is a good graphical representation of this at SmartMotorist.

So scientists have known what happens in a traffic jam for a while; the question is how jams can be prevented. There are three types of traffic flow: free flow, where traffic moves at the maximum speed allowed; synchronized flow, where the traffic density forces vehicles to move at a slower but still constant speed; and jams, where speed drops to zero once traffic density reaches a certain threshold. So how do you prevent the third? If you have been in a traffic jam you will quickly recognize that most people have one of two reactions. The first are the defensive drivers, who leave more space between themselves and the vehicle in front than necessary. The second group are the offensive drivers, the kind that drive up so close behind you that you can see the spinach they had for dinner. What you want is for vehicles entering the traffic jam zone to act more defensively and enter the problem zone more slowly, while those in front leave the jam quickly, causing the traffic jam to dissolve. What is the best way to do this? One possible solution is to have cars talk to each other through an automated system, sharing their speed and position with the cars around them. As the cars in front slow down, this would hopefully convince the cars approaching the area to slow down too, while the cars ahead of the congested area leave faster, keeping the flow going. This is the idea discussed in a Technology Review article published by MIT.
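That wave-like roll-back of a jam is easy to reproduce in a toy simulation. The sketch below is a minimal implementation of the classic Nagel-Schreckenberg cellular automaton (a standard textbook model, not anything from either article): cars on a ring road accelerate, brake to avoid the car ahead, and occasionally dawdle at random, and above a critical density that dawdling snowballs into stop-and-go waves with no accident and no bottleneck.

```python
import random

def step(positions, velocities, road_length, v_max=5, p_slow=0.3, rng=random):
    """One parallel update of the Nagel-Schreckenberg traffic model."""
    n = len(positions)
    order = sorted(range(n), key=lambda i: positions[i])
    new_v = list(velocities)
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % n]                      # next car on the ring road
        gap = (positions[ahead] - positions[i] - 1) % road_length
        v = min(velocities[i] + 1, v_max)                 # 1. accelerate towards the limit
        v = min(v, gap)                                   # 2. brake to avoid the car ahead
        if v > 0 and rng.random() < p_slow:               # 3. random human hesitation
            v -= 1
        new_v[i] = v
    for i in range(n):                                    # 4. all cars move at once
        positions[i] = (positions[i] + new_v[i]) % road_length
    velocities[:] = new_v

# A dense ring road: at this density, clusters of slow or stopped cars
# form and drift backwards against the direction of travel.
road, cars = 100, 35
positions = list(range(cars))          # cars packed together at the start
velocities = [0] * cars
rng = random.Random(42)
for _ in range(200):
    step(positions, velocities, road, rng=rng)
```

Dropping the density well below roughly 1/(v_max+1) makes the jams disappear, which is the free-flow regime described above.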

There are, of course, several problems that need to be resolved for this to work. First is security: you want to make sure you have a system that can’t be hacked. Second, at this point it is unclear how many cars need to have a system installed for it to be effective. Also, systems that are manufactured by different companies need to be able to talk to each other. Finally, people have to actually use the information that they are provided in the way they are supposed to. As more and more cars enter our highways, both in the United States and around the world, developing technology like this becomes increasingly important. This type of technology is still in its infancy, but if it becomes reality, it will have far more impact on productivity and the economy than any social network.

Smashing Magazine is celebrating its fifth birthday and as a wee treat, has prepared a “Best of Smashing Magazine” ebook and is giving it away free. The articles are all about web design, Photoshop, typography and user interfaces (or the user experience as it seems to be called now).

It’s no lightweight either – there are 409 pages of beautifully prepared material packed with information and examples. The first article, “30 Usability Issues”, makes interesting reading even if you aren’t a web designer. By being more educated about design, as a consumer you can be more aware and critical of websites and other media. Did you know that the Macintosh logo is an example of the Law of Prägnanz? No, neither did I, but you’ll have to read the article to find out what it means.

Other articles include, “Setting Up Photoshop for Web and iPhone Development”, “What Font Should I Use?” and “10 Principles of Effective Web Design”. There’s the occasional overlap between the articles but it’s never repetition for the sake of it.

The ebook is available from iTunes or, in .pdf, .mobi and .epub formats, direct from Smashing Magazine. Warning – it’s a 55 MB download as it contains all three versions of the ebook.

It is very important that you listen to my podcast on July 2nd, as within that show will be instructions on how to win a Lenovo ThinkCentre A63 computer system with dual screens. As you know, I have been talking about this system on my show for a while now, and it is going to make a great computer for a listener of my show. The contest details will be announced within the show. Here’s the kicker: the contest period is very short. I will announce a winner on July 6th, so you will have to make sure that you download and listen to the podcast over the 4th of July holiday weekend.

I will be talking more about the Lenovo ThinkCentre this coming week, along with releasing an in-depth review in preparation for the giveaway. In addition to our giveaway, a host of other sites are giving away the same prize package. Check out the rest of the sites for more details; the timelines of their contests are listed below.

Yes, there was sarcasm in the title. But what the FCC proposed really is a great idea: tell people when they are about to hit their limits.

I remember when I got my first overage bill. It was $130 more than expected. While I was a bit perturbed, I understood and paid it off. My pocketbook definitely took a hit.

Of course, I did have a landline, and my Nokia phone (which was still just for calling people) had only 250 minutes and $0.40 a minute after that. I could also walk 20 feet to the west and all of a sudden get “roaming” charges. So going over on a plan was easy – in 1998.

Nowadays, the landline is gone in lieu of Skype and Google Voice. The cell phone is the primary contact for calls, texts, emails and Facebook posts. I have an unlimited text and data plan, and if I go over in minutes, I have a backlog of rollover to keep me safe. Then again, I pay $130 a month…

Stopping the Overage:

37 years after the first cell phone call was made, companies are finally realizing that someone might go over their minutes. OK, the FCC is realizing this and trying to make the phone carriers comply. If the user hits their limit, they get a message stating that.

The user then can choose if they want to rack up additional charges or turn their phone off until the next month starts. Wait – you can turn a phone off?

Similar but Sad: Data Plan overage

Vodafone in the UK – who nixed their unlimited data plan – announced they will be offering a free text service to warn people when they hit their limit. So those of you in the UK who watch your soaps or Dr. Who on the phone during lunch might not be able to watch more than 1 episode for the whole month.

Remember the Chicago Bears fan last year who watched the game from his netbook on a cruise ship? He got $3,000 in overage fees on his wireless data plan.

It’s all about a text

I get texts from AT&T whenever my bill is ready, or if I haven’t paid last month’s yet. I suppose it’s that time of the month to see that text message aga… oh wait. Here it is. They are so eager to make sure you pay your bill, but not that eager to let you know if you stretch your limits.

Automation

It’s not like someone has to sit by their phone and text everybody that goes over: “Dude: you’re hitting your limit.” We have automated scripts that can do that. Just like my bank has an automated script to tell me when my account hits below… oh wait. Just got THAT text message, too.
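To be concrete, the script really is this simple. The function below is a sketch of my own, not anything a carrier actually runs; the 90% warning threshold is an arbitrary assumption:

```python
def overage_alert(used_minutes, plan_minutes, warn_at=0.9):
    """Return an alert text when usage nears or passes the plan limit.

    Returns None while usage is comfortably inside the plan, a warning
    once usage crosses warn_at (90% by default), and an overage notice
    at or past the limit.
    """
    if used_minutes >= plan_minutes:
        return "You have used all {} plan minutes; overage rates now apply.".format(plan_minutes)
    if used_minutes >= warn_at * plan_minutes:
        return "Heads up: you have used {} of your {} minutes.".format(used_minutes, plan_minutes)
    return None
```

Hook that up to the same pipeline that already texts “your bill is ready” and the problem is solved.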

I totally agree on an alert system. I can’t log into the website every day to see where I am in minutes. Then again, if I ever go over…

The Mitre Corp has produced the 2010 CWE/SANS Top 25 Most Dangerous Programming Errors, which identifies the most commonly encountered coding errors that can potentially lead to web sites being hacked or PCs being compromised. Some of the errors are well-publicised in the technical press, e.g. “cross-site scripting”; some are downright stupid, e.g. “use of hard-coded credentials”; and others are the result of carelessness, e.g. “improper validation of array index”.

However, what makes this document better than the usual Top-X lists is that it provides guidance to programmers on how to prevent or mitigate the errors. For example, to avoid cross-site scripting it suggests, “Use languages, libraries or frameworks that make it easier to generate properly encoded output. Examples include Microsoft’s Anti-XSS library, the OWASP ESAPI encoding module and Apache Wicket“. There’s additional information for the technically-minded that goes through the different stages of software development starting with initial design, through to compilation, implementation and testing.
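As an illustration of how small the fix can be, here is the same idea using nothing but Python’s standard html module. The document itself points at Anti-XSS, ESAPI and Wicket; this sketch just shows the underlying principle of encoding output before it reaches the page:

```python
import html

def render_comment(user_text):
    """Embed untrusted text in HTML with special characters encoded.

    Encoding <, >, & and quotes means a payload such as
    <script>alert(1)</script> is displayed as text instead of executed.
    """
    return "<p>" + html.escape(user_text, quote=True) + "</p>"
```

The same one-line habit, applied at every point where user input meets HTML, closes off the bulk of cross-site scripting holes.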

One of the best pieces of advice is in the discussion around checking for unusual or exceptional conditions, “Murphy’s Law says that anything that can go wrong, will go wrong. Yet it’s human nature to always believe that bad things could never happen, at least not to you. Security-wise, it pays to be cynical. If you always expect the worst, then you’ll be better prepared for attackers who seek to inflict their worst. By definition, they’re trying to use your software in ways you don’t want.”

So, if you are into web programming in any way, this has to be mandatory reading to keep the bad guys at bay. Even if you are not, the discussion elements for each of the errors are illuminating in showing exactly what is going wrong and why it’s bad. Just skip over the technical bits in between.

I decided to devote a large part of last weekend to upgrading my main system to Windows 7. In the interest of science, I decided that I would read no guides or tips beforehand; I would test how easy it was using only the information and instructions that came in the packaging.

So the stage was set for the install. The system I am upgrading was very powerful when I built it 5 years ago. While I do most of my web surfing on it, the main use for the system is to manage my media, either syncing it to my portable player or streaming it to devices on the network. It started this process with Windows XP Media Center Edition installed; I had a brand new copy of Windows 7 Home Premium upgrade to work with.

Stage 1 – Preparation

Even the packaging for Windows 7 made clear that an in-place upgrade was only possible if you were upgrading from Vista; coming from XP meant a clean install. The claim, though, was that even though the main programs would need to be re-installed, the settings would be maintained. I have never had a software upgrade that ran well, so my confidence was not high. Given that the test is to see how easy it is to have a usable system after the upgrade, I took a few notes first on the beginning state.

When I performed a Vista upgrade on a relative’s computer, the main issue I had was a lack of drivers for all the installed devices. At that stage it had already been 3 months or so since Vista was released, and it was months more after that before all devices had working drivers. I have a number of extra components installed, so I was interested to see how many would work after the upgrade.

Microsoft has released an upgrade advisor to check which parts of the system are supported under the new environment.

The report showed that Outlook Express would not be available and the game port would not work. No great loss for these, as I do not use either of them. More worrying, though, was that my network card was listed as not compatible. I had recently put in a new wireless-n router, so I took the precaution of buying a new wireless card prior to starting the upgrade.

As you can see from the image to the right, the majority of my devices came up as being supported.

Once I had a level of confidence that I could support Windows 7 on this PC I was ready to start upgrading. The only change I made to the system was to upgrade the RAM to 2GB.

Stage 2 – Settings transfer

First step of the XP upgrade process is to run the Windows Easy Transfer program. This is designed to take all the settings and files from the old to the new system. There are options to save these to disk, USB media or a network share. If the Windows 7 and XP installs are on different computers the transfer can also be done directly across the network. In my case I set the target as a directory on another drive in the system.

While the process completed with no errors, it took a long time. Even though there was only about 260 GB of data, the process started at 5:05pm and didn’t finish until 12:51am, nearly 8 hours later. Given that the processor was much busier than the disk during this time, it appears that this was more than just a simple file copy.

Stage 3 – Install

The actual install of Windows 7 itself was a breeze. I chose to install to a brand new directory so I could still boot XP if everything went pear-shaped. Even though I was using an upgrade version, the install didn’t complain and there was very little interaction needed over a 15-minute process. Within half an hour of shutting down XP, I was running the Windows 7 side of the Easy Transfer. This time I didn’t wait for the finish; I left it to run and went to bed.

In the morning the transfer had finished and I had a working system. The next step was to check whether it was functional. The Easy Transfer report showed a few strange issues, including “programs without identified manufacturers” that counted 5 Microsoft programs among them. Pleasingly, even though the upgrade advisor claimed my system wasn’t up to Aero, it was running happily.

Stage 4 – Is it working?

A Device Manager report showed that there were 5 devices that did not automatically find a driver.

-MS Keyboard with Fingerprint reader

-Soundblaster Audigy

-Avermedia TV tuner

-SB Gameport

-DLink USB wireless-g NIC

This last was of course the problematic one, as it prevented me getting onto the Internet to find drivers. Thankfully, I had prepared with my brand new replacement NIC. Such cunning, such foresight, such misplaced optimism. This is where I ran into my first actual problem with the install. The Netgear WNDA3100 wireless-n USB NIC came with a Vista driver that would not load and management software that crashed 5 seconds after it loaded. As I no longer have any UTP running from the router to my study, this would have been a problem without the miracle of multiple computers and flash memory.

Doing a few searches, I found that it may not have been Microsoft’s fault. The general feeling around some of the forums was that the Netgear 64-bit drivers were flaky to begin with, and people had had similar problems with Vista. I managed to find someone who had hacked a driver to actually work, located at http://www.wnda310064bit.webs.com/. So thanks to the unnamed author, who gets some link love and a $10 donation.

Now back on the Net, I ran the automatic driver search that Windows 7 provides. This managed to find drivers for 2 of the remaining issues, the SB Audigy and the TV tuner, which both now worked. This left just the gameport, which I was unconcerned about, and the fingerprint reader on the keyboard. While the whole keyboard is listed in the report, the reader is the only function that does not work. How ironic that the only device that caused me a lasting problem on a Windows OS was a Microsoft product.

I was also having problems with the system freezing coming out of hibernate. I am currently avoiding this by disabling the auto-hibernate feature.

Stage 5 – Application re-install

The big test was next. The two biggest worries I had going into this were Firefox and iTunes. Firefox has a number of plugins, Greasemonkey scripts, and heaps of favorites and links.

I was very pleased with this install though. Not only were all of the mentioned features there instantly after install, the new version of Firefox remembered all of the tabs I had open under XP when I shut down. I had left a number of tabs open as what I thought would be an unfair test of the upgrade and was pleasantly surprised with the result.

The iTunes install went just as well, with all of my songs and playlists surviving intact. Most importantly, all my podcast subscriptions, listening stats and player sync details came up automatically. I did need to re-authorise a couple of songs though, which highlighted again for me the danger to consumers of DRM. Almost all my digital media is DRM-free because I stayed with CDs until iTunes offered DRM-free downloads. I do have 5 iTunes DRM’ed songs that I bought for my daughter because she bugged me at a weak moment. I have already used 3 of my 5 total re-authorisations, and they were only purchased 2 years ago.

Conclusion

The other programs I re-installed had no significant history to remember. All in all, it was a relatively quick and painless process. A benchmark claims that the system is about 15% slower running Windows 7, which is not bad for a 5-year-old system jumping 2 OS generations in one step. The browsing and podcast syncing, which are the main functions of the system, are running just as well as under XP. The next couple of weeks will show whether problems start to show up, and I’ll report back on my progress.

For the upgrade process, though, I will give Microsoft an 8.5 out of 10. They lose some marks for the length of elapsed time the whole process took, most of which was waiting for the Easy Transfer process to finish. This was the only real negative, though, in what was a painless process that delivered a better than expected result.

The rumour machine is abuzz with the prospect of Apple releasing a tablet, and a number of other players, most notably TechCrunch and Archos, have tablet systems either out or on the drawing board. It will be interesting to see if anyone can finally make one that is worth owning.

The first tablet PC I used was a Compaq Concerto somewhere in the mid 90’s, a 486-based tablet version of a standard Compaq notebook. I believe the much-glorified Apple Newton preceded it; however, while both these products were much hyped at the time, neither delivered any real value and both were cancelled without replacement.

In the 15 years or so since they first appeared, tablets have made periodic returns, always to a brief flurry of enthusiasm that eventually came to nothing. The most successful iteration was the slight tangent into the PDA space. While that eventually proved a dead end as well, it did directly contribute to the creation of the smartphone market, which has been increasingly buoyant.

I am not confident that the track record of tablets gives great hope of success this time, but there are a couple of new technologies that give this iteration a better chance. The first is the multitouch touchscreen, which will make the platform more usable and increase the number of applications. The second is page flow. While it is more an application of technology than a radically new technology, it does change the feel of scanning through multipage documents or lists of items, giving them a more natural feel.

The other positive aspect is the range of operating system options that exists now. A major problem previously was trying to use a full-size operating system on a platform that needed to be lightweight. A modified iPhone OS or Android platform could offer the functionality needed while being lightweight enough to run well on less powerful hardware and still get good battery life.

I think this article in PCMag might have a good handle on what the Apple platform might end up looking like. One factor I definitely agree with is that a new tablet has to offer a different experience from a standard laptop, which needs to be more than just adding a different interface. There is also a good rundown of the tablet market in this NYTimes article.