We can easily think that Apple is pretty dumb not to include USB storage drivers on their gadgets.

We can easily think that Google is pretty dumb not to have included a micro-SD memory slot on their 7 inch Nexus tablet.

We can easily think that desirable functions that are obviously left out of all sorts of hardware and software products are omitted due to the stupidity of the associated manufacturers.

Admittedly, manufacturers can be pretty dumb from time to time, but frequently these omissions are for one of a few reasons:

1) They hope to include the function in a future model and get you to buy that one too (think iPhone here).

2) They have a goal which is counter to what you want to do. Google wants everyone to use the cloud, and incrementally increasing the tablet's internal storage would make that cloud push less important.

3) Same symptom, different disease. Apple charges an arm and a leg for incremental increases in internal memory, and would lose that ability if you could add memory on your own - so USB storage and user-added memory chips go against the grain.

4) The less expensive the printer, the higher the per-sheet printing cost tends to be. There are no free lunches (or few, anyway); printer companies are actually chemical companies (from a profit-center standpoint), and they have to make money somewhere.

5) GSM phones in the US are generally locked to a specific vendor. Manufacturers of the phones (or SIM-enabled devices) frequently restrict availability in the States until they make a deal with a carrier or two, and then restrict sales to only that channel (as Samsung has done with the SIM versions of their Tab 2 tablets).

Simply because you see an obvious design flaw or omission in a product doesn't mean that you are smarter than their engineers - only that their marketing department is sure that THEY are.

"We can easily think that Apple is pretty dumb not to include USB storage drivers on their gadgets."

Well, before starting off with false assumptions, "we" definitely doesn't include me. It's not easy to assume "dumb" when it comes to Apple's decisions, and I'm not second-guessing them... and clearly neither is the public. They may, however, simply think they need iTunes because MS isn't supporting Apple - it's just different enough, by Apple's design, to favor Apple, and like you I think that's deliberate - clever like a fox. It has Steven convinced it's MS that hasn't provided support rather than the other way around, and I imagine it builds for many others the "it just works" vs. winpc=no_go experience bias. The halo effect is powerful - like the force.

Years ago, when I first got into computers as a hobby (late seventies), there was something of a debate about whether or not chips should be socketed or soldered. The argument in favor of socketed chips was that users could easily replace chips that failed, or that they wanted to upgrade.

The argument in favor of manufacturers soldering chips to the logic board was that soldered chips were much less likely to experience failures.

(As I recall, every single chip on the logic board of my first computer was socketed.)

Manufacturers and most customers would probably agree that a design that led to fewer hardware failures was inherently superior. Over time every manufacturer moved to soldered chips, and most customers were better served by soldered chips.

Getting back to your point number 3: Adding a slot for removable memory introduces a degree of operational complexity. Users suddenly have to keep track of what's on internal memory, and what's on the removable memory.

Consider the Microsoft Surface RT. It has an SD card slot, but the OS doesn't let you install apps on that slot. Imagine what would happen if a user removed a card without realizing that a currently-running app was on it.

Even limiting the cards to data introduces problems. What happens when you're editing a document, and you remove the card?

Oh, you say, that's not a problem. The user should remember which apps and data are where. Sure, we can do that, but who wants to?

You can design a device that best serves the needs of average users or techie users, but it's very hard to do both.

Yes, removable storage can add complication, but at no different a level than floppy disks, USB thumb drives, external hard disks, or DVDs do.

Sure, because of the size of modern SD chips and the flexibility of Android, there is the potential for some users to get confused. That said, anyone who has spent any amount of time in tech support can offer hours of entertaining stories about confused customers.

While there is some benefit to protecting customers from their own stupidity (Apple has made a practice of this), I suspect many of these design limitations are more likely put in to serve the interests of the manufacturers (as they tend to foster sales of that manufacturer's exclusively distributed products).

Yes, removable storage can add complication, but at no different a level than floppy disks, USB thumb drives, external hard disks, or DVDs do.

Given a choice between a single hard drive large enough to store all my operating systems, apps, and data, or the alternative of multiple drives, I'd choose the former. (And, in fact, that's exactly how I work.)

The only reason I have a 2nd drive on my desk is for backups. (And if you think about it, the whole idea of backups is a concession to the unreliability of hard drives, operating systems, file systems, and apps. If apps always wrote correct data, and operating systems and file systems also stored that data correctly, and if hard drives never experienced hardware failures, then we'd only need backups to protect us from our own clumsiness, like accidentally deleting a file.)

(Well, I suppose another reason for backups is unforeseen disasters like thefts, floods and fires. But the solution for those is some kind of remote backup, not another drive on my desk.)

I suspect many of these design limitations are more likely put in to serve the interests of the manufacturers (as they tend to foster sales of that manufacturer's exclusively distributed products).

That would make sense only if a manufacturer offered the device in multiple storage configurations. But the iPod shuffle comes only with 2GB of storage. If you want more, you'll have to look at competing products from other vendors.

The iPod classic only comes with 160GB of storage. The MacBook Air is available in 4 models, but they all have precisely 4GB of RAM. The iPod nano comes in a large variety of colors, but all the models have exactly 16GB of storage.

These limitations don't keep people in Apple's ecosystem. On the contrary, if you need a different amount of storage or RAM, you have no choice but to leave Apple for a competitor.

I think the more logical explanation is that fewer models means using fewer part variations, simplified manufacturing lines, and fewer SKUs, all of which reduce costs. Yes, they're doing it for their own financial reasons, but part of their logic is that these specific configurations will meet the needs of the greatest number of users.

(Apple uses the same WiFi chips in all their Macs. I once ordered a Dell laptop and had a choice of 3 or 4 different WiFi chips. Even if Dell sells more laptops, I bet Apple gets better pricing through buying 10 million model X chips than Dell does buying 4 million each of models W, X, Y, and Z.)

Expandability also requires tradeoffs like weight and space. The design goal of the Air was to be as small as possible. Having RAM slots would require a larger logic board. Likewise, having a non-removable battery allowed them to make it an irregular size to fill as much available space as possible. (Take a look at the battery in this picture: http://guide-images.ifixit.net/igi/tNeWnWVjptZlHXJd.large)

Why doesn't the Air have built-in Ethernet? Because WiFi serves essentially the same purpose, and an ethernet port is physically taller than the computer itself! (Lenovo solved this problem by making their computer thicker at the point where the Ethernet port sits; look at this photo and tell me if you think it looks better for that compromise: http://www.engadget.com/2012/07/13/lenovo-ideapad-u310-revie...)

Now, there are some users who might be better served by having a user-replaceable 5-hour battery, but I suspect their numbers pale in comparison to the number of users who would rather have a non-replaceable 7-hour battery. And again, if you need replaceable batteries, then Apple's decision has the effect of forcing you to buy from their competitors.

Personally, I find wired Ethernet to be vastly superior to WiFi, unless of course you really do need mobility. (I find that there's pretty much no time where I'm actually moving while using the Internet... usually I'm sitting somewhere, and that somewhere could have a jack.)

Especially public WiFi, which is getting bogged down, and will be bogged down far worse as all these MIMO devices that tie up multiple channels (like the new Kindles) become popular and everyone tries to stream HD video. I'd much rather have a gigabit Ethernet connection at McDonald's or Starbucks or wherever... just plug in and it works, no need for passwords to keep people outside the store off their network.

I think it's incredibly weird that twisted-pair Ethernet is still using only the same-old original connector (the 1975 mod-plug, no less!). I mean, come on, even USB alone has, like, 27 different styles of connectors. Why can't twisted-pair Ethernet have a new slim connector that would fit on all these small devices? Maybe throw some power on the connector while they're at it (something easier than power-over-Ethernet). Or if you don't add power, make an induction-coupled connector, since Ethernet is transformer-coupled anyway. That'd be great for places like McDonald's, since it'd be pretty much tamper-proof and could be hosed down without any damage.

If you have a 10 or 20 Mbit/sec internet connection, and your 802.11N is running at 144 Mbit/sec, switching to Ethernet won't really make a difference. Only if you're slinging bits between devices within your network does it matter.
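That bottleneck logic is easy to sanity-check. Here's a minimal sketch (my own, not from the thread) using the link rates mentioned above; the figures are nominal link rates, and real-world throughput would be lower:

```python
# End-to-end throughput is capped by the slowest hop in the chain.
def effective_throughput(link_rates_mbps):
    """The bottleneck of a chain of links is simply the minimum rate."""
    return min(link_rates_mbps)

wifi_n = 144       # 802.11n link rate from the comment above (Mbit/s)
ethernet = 1000    # gigabit Ethernet
internet = 20      # typical broadband connection

over_wifi = effective_throughput([wifi_n, internet])
over_wire = effective_throughput([ethernet, internet])
print(over_wifi, over_wire)  # both capped at 20 by the internet link
```

Swapping WiFi for gigabit Ethernet changes nothing here, because the internet connection is the minimum in both chains.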

I find that public WiFi gets incredibly slow when there's a lot of users, and especially so when there's a lot of other WiFi hotspots active in the general vicinity as well. WiFi speed at resorts is typically poor and often atrocious... especially when the signal is low (which it very often is). The phones at the resorts are wired, and they work fine; which leads me to wonder if wired Ethernet connections instead of WiFi would work fine there too.

Clearly, wired Ethernet at a public place would not be slowed down at all by the presence of other establishments nearby. But I'm not sure how much improvement there'd be in a public place with no other WiFi signals within range. Certainly I find such places can be very slow, but is that because the WiFi is slow, or because the shared Internet connection is maxed out?

"Or if you don't add power, make an induction-coupled connector, since Ethernet is transformer-coupled anyway"

Not so trivial to make Gb Ethernet work over anything that's not a very-near-field, tiny isolation-core transformer... and then it would still need a cable, unless you want your tablet flat and touching that grimy McDonald's table top. I'd vote for a cheap fiber optical link with no ESD concerns, no pins to bend, and that can be wiped down. Whatever comes after WiFi, unless Apple gets on board there's not going to be any new connection-method unity. But we digress... B

Actually I have a question regarding this. The observation is empirical, but I have noticed on some WiFi networks, as soon as an iPad is connected, it knocks all the Windows laptops off. I have noticed this at a number of public locations in Asia and Australia, so it's likely some specific router configuration or the ability of the Apple device to hog bandwidth.

Yes, I assumed a short cable would be used. Perhaps a regular connection on the tablet (but with a slimmer connector than a mod-plug) and the inductively-coupled connector on the other end of the cable (and at the public port).

I'd vote for a cheap fiber optical link with no ESD concerns, no pins to bend, and that can be wiped down.

Cool idea.

Whatever comes after WiFi, unless Apple gets on board there's not going to be any new connection-method unity.

Given Apple's history of poor engineering and the occasional business blunder of extraordinary magnitude, I'm frankly surprised they've lasted this long. The long-running joke was "Apple is a company struggling to put itself out of business, but they are too incompetent to pull it off." I guess their incredible marketing has saved them time and time again. Remember when Sony used to be the "don't bother to compare, just buy Sony" company... before they faded into just another brand? That's kinda how I see Apple's future.

The phones at the resorts are wired, and they work fine; which leads me to wonder if wired Ethernet connections instead of WiFi would work fine there too.

Maybe, but when you're dealing with networks about which you have little or no knowledge, it's hard to draw any useful conclusions.

Consider that a properly configured network prioritizes VOIP packets over ordinary data, so that no matter how many people are transferring data, the voice calls should always sound good.

Second, if the hotel/resort/whatever is sharing a single slow connection among many users, it won't matter much if they're using ethernet or WiFi because the weak link will be the internet connection.

Clearly, wired Ethernet at a public place would not be slowed down at all by the presence of other establishments nearby

True. Just to be clear, it won't make any difference if Starbucks installs ethernet ports at every table. If there are 25 people in Starbucks sharing a wired ethernet network, performance would likely be no better than if they all used WiFi.

(There are unusual exceptions; I know of one place with a huge pipe to the internet (I think it's an OC48). At that location, there are two WiFi networks, and one requires a password. The "guest" network only runs 802.11g speeds, and it's always swamped. But they do have Ethernet ports in some locations, and I always use them when they're available.)

"Just to be clear, it won't make any difference if Starbucks installs ethernet ports at every table. If there are 25 people in Starbucks sharing a wired ethernet network, performance would likely be no better than if they all used WiFi."

Not so sure - WiFi is inherently a more limited pipe (than today's cheap wired gigabit routers), so there is more contention for bandwidth/resources and more collisions. B

I do not know what the fastest routers are these days, but my new PC has two 10 Gigabit NIC ports (plus two 1 Gigabit ports on the motherboard). I have nothing to plug into them yet. My FiOS connection does quite well with a 100 Megabit NIC. My other computer has only two NICs, both 100 Megabit.

Just to be clear, it won't make any difference if Starbucks installs ethernet ports at every table. If there are 25 people in Starbucks sharing a wired ethernet network, performance would likely be no better than if they all used WiFi. — stevenjklein

Not so sure - WiFi is inherently a more limited pipe (than today's cheap wired gigabit routers), so there is more contention for bandwidth/resources and more collisions.

That's pretty much my thinking. Especially considering stevenjklein's comment (above) was in response to my comment about your WiFi not only having to compete with other users at Starbucks, but with WiFi users in nearby stores as well. So you've got the 25 Starbucks users, 20 users at a frozen yogurt store next door, and 40 users at the McDonald's behind them... all competing for the same WiFi frequencies. And some of those users are Kindle HD owners using multiple channels simultaneously to stream high-def videos... so each one of those is like two (or more) regular WiFi users.

I still think it's weird that the only connector for Ethernet is still this ancient phone company thing that's too big to fit on so many modern devices. Especially if you look at how many connectors there are for, say, USB or HDMI.

That's pretty much my thinking. Especially considering stevenjklein's comment (above) was in response to my comment about your WiFi not only having to compete with other users at Starbucks, but with WiFi users in nearby stores as well. So you've got the 25 Starbucks users, 20 users at a frozen yogurt store next door, and 40 users at the McDonald's behind them... all competing for the same WiFi frequencies. And some of those users are Kindle HD owners using multiple channels simultaneously to stream high-def videos... so each one of those is like two (or more) regular WiFi users.

WiFi officially has 14 channels, but they overlap. In practice it's possible to use four of them with little to no interference with each other: 1, 6, 11, and 14 if you're using 802.11b or b/g; on 802.11g and 802.11n you can alternatively do 1, 5, 9, and 13.

(802.11n with channel bonding is more demanding: use channel 3 or 11.)

So the Starbucks, frozen yogurt stand, and MickeyDee's should not be competing for bandwidth. In theory. Unfortunately, the hotspot devices aren't necessarily smart enough to automatically space themselves out nicely.

Some devices don't even know about channels above 7. Because of that and the fact that I move around a lot (live in a motorhome) I'd put my hotspot on 14... except it's one of the devices that stops at 7. (It also isn't smart enough to avoid channels already in use, if I let it pick the channel itself.)
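As a sanity check on those channel plans, here's a small sketch (my own, not from the thread) that computes 2.4 GHz channel center frequencies and flags plans whose channels overlap, assuming roughly 22 MHz-wide channels for 802.11b and 20 MHz for g/n:

```python
def center_mhz(channel):
    """Center frequency of a 2.4 GHz WiFi channel, in MHz.
    Channels 1-13 are spaced 5 MHz apart starting at 2412 MHz;
    channel 14 is a special case at 2484 MHz."""
    return 2484 if channel == 14 else 2407 + 5 * channel

def overlaps(a, b, width_mhz):
    """Two channels overlap when their centers are closer than one channel width."""
    return abs(center_mhz(a) - center_mhz(b)) < width_mhz

def plan_is_clean(channels, width_mhz):
    """True if no pair of channels in the plan overlaps."""
    return not any(overlaps(a, b, width_mhz)
                   for i, a in enumerate(channels)
                   for b in channels[i + 1:])

print(plan_is_clean([1, 6, 11, 14], 22))  # True  (classic 802.11b plan)
print(plan_is_clean([1, 5, 9, 13], 20))   # True  (20 MHz OFDM plan)
print(plan_is_clean([1, 3], 20))          # False (channels 1 and 3 collide)
```

Both four-channel plans mentioned above come out clean under these widths, which is why adjacent hotspots need not fight each other in theory; neighbors parked on, say, channels 1 and 3 do collide.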

The uplink speed from the hotspot can also be an issue. 802.11g networks can pretty easily generate a load of 34mbps; 802.11n channel-bonded networks, 200mbps. Apparently the best DSL service currently available from AT&T delivers 24mbps capacity. Cable modems go to 150mbps.

(By the way, we're spoiled. 34mbps is eight average-length novels, as ASCII text, per second. And we aren't satisfied.)
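For what it's worth, the novels figure roughly checks out. A quick back-of-the-envelope sketch; the novel size here is my assumption (about 80,000 words at an average of 6 bytes per word, including the space, as ASCII), not a figure from the thread:

```python
# Sanity check on "34 Mbit/s is roughly eight novels, as ASCII, per second."
words_per_novel = 80_000   # assumed average-length novel
bytes_per_word = 6         # assumed average word length plus a space
bits_per_novel = words_per_novel * bytes_per_word * 8  # ASCII: 8 bits/byte

link_bps = 34 * 1_000_000
novels_per_second = link_bps / bits_per_novel
print(round(novels_per_second, 1))  # roughly 8.9
```

With slightly longer words or a bit of protocol overhead, that lands right around eight novels per second.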

"I still think it's weird that the only connector for Ethernet is still this ancient phone company thing that's too big to fit on so many modern devices. Especially if you look at how many connectors there are for, say, USB or HDMI."

The phone (RJ) legacy definitely comes from twisted-pair wiring's roots in phone cabling. As big as it is, at least anyone with a $20 crimp tool can make one - that wouldn't be true for terminating some mini molded connector like USB et al. Pros & cons both. Cheers, B

The phone (RJ) legacy definitely comes from twisted-pair wiring's roots in phone cabling. As big as it is, at least anyone with a $20 crimp tool can make one - that wouldn't be true for terminating some mini molded connector like USB et al. Pros & cons both.

But if that's a concern, they could make a connector that could be as easily attached to the cable but is big in only one dimension rather than two. So you'd still need a wide port on the device, but not a tall one.

But if that's a concern, they could make a connector that could be as easily attached to the cable but is big in only one dimension rather than two. So you'd still need a wide port on the device, but not a tall one.

The dongle from my old (circa 1998) PCMCIA Ethernet card had exactly that.

"But if that's a concern, they could make a connector that could be as easily attached to the cable but is big in only one dimension rather than two. So you'd still need a wide port on the device, but not a tall one."

Is the juice worth the squeeze? Again, it's easy to terminate twisted-pair wiring, so any change that makes it less so had better offer significant benefits, when hardwired is already a hard sell. All JMO of course, B

I find that public WiFi gets incredibly slow when there's a lot of users, and especially so when there's a lot of other WiFi hotspots active in the general vicinity as well. WiFi speed at resorts is typically poor and often atrocious... — Radish

Second, if the hotel/resort/whatever is sharing a single slow connection among many users, it won't matter much if they're using ethernet or WiFi because the weak link will be the internet connection. ... Just to be clear, it won't make any difference if Starbucks installs ethernet ports at every table. If there are 25 people in Starbucks sharing a wired ethernet network, performance would likely be no better than if they all used WiFi. — stevenjklein

Not so sure - WiFi is inherently a more limited pipe (than today's cheap wired gigabit routers), so there is more contention for bandwidth/resources and more collisions. — Philipo

My wife used WiFi to connect her laptop. The front desk had given us a long speech about how to use the WiFi along with a card of instructions and another set of last-minute warnings ("remember the capitalization is..."). After a bunch of clicking and typing and agreeing to long sections of fine print, she was all set with Internet connectivity.

Me, I plugged my laptop into the Ethernet jack. Other than the Windows warning ("Is this a private or a public network?") which my wife also got, it just works. No clicking on stuff. No user names. No passwords. No agreeing to pages of fine print. Plug it in. It works.

So I figured I'd run a speed test between the two connection methods, but I decided to wait until today (since it's Valentine's Day and the hotel will be busier).

So today... the wife's laptop can't connect at all over WiFi. Nothing's working. I'm not using my laptop, so I give her my Ethernet cord. She plugs it in, it works, no questions, no nothing. It just works.

So I didn't bother running a speed comparison, since at that point the WiFi speed would necessarily be zero.