* Graphics card glitches - I paid close to 4K so I don't have to deal with quality control issues.

* Touchpad is just too large. I found myself resting my palm on it all the time, and sometimes clicking without realizing it. Also, if you like lying down and working (which I do sometimes because of a lower back problem), the size of the touchpad will make you work extra hard to avoid accidental clicks.

* Had the machine for ~10 days, used the touch bar less than that. Definitely not worth the money. Hopefully in the future they'll have a 15" option without it.

* The Boot Camp experience just sucks (this was my primary reason for returning it). Currently, there's no way to gracefully switch between the discrete and integrated GPUs, so the battery life is terrible, like two-and-a-half-hour maximum battery life terrible. gpu-switch doesn't work either. In fact, if you use gpu-switch you'll have to rebuild both macOS and Windows, as the machine will just hang when you try to boot into either.

* Recovery mode has many issues with network connectivity. A few times, I had to tether/connect to my iPhone hotspot for it to go through.

* Sharp edges everywhere.

The specs are very underwhelming too, but I was willing to tolerate lower specs for higher build quality. I actually just picked up an XPS 15 9550 from Microcenter: the 2.6GHz model with 16GB (expandable to 32GB), a 512GB SSD, and a 4K touchscreen for $1350 (an open box; new for $1499).

I don't want to discount your experience by any means. And there's no denying that the price is very steep for the new MBP. But, other than price, I'm really satisfied with the new 15" MBP.

In particular, I do actually really appreciate the larger trackpad. But I'm a heavy user of BetterTouchTool and have always regarded the trackpads as one of the main reasons to get a MacBook. I don't even bother with three-finger drag now thanks to the size of the trackpad.

I think the Touch Bar should be considered for what it is: a replacement for static function keys. Apple of course hyped it like they hype everything. But considered realistically in context I consider it a success. I actually do use it some. Some of the simplest things work the best; for instance, I really like the options presented when taking a screenshot. I also enjoy using it for music control, scrubbing through music, and switching between music sources (including YouTube tabs). Nothing revolutionary, but then again, how could it ever be given what it is?

Your note about three-finger dragging puzzles me. I’d thought now that the trackpad is larger, this particular mode of dragging would be more useful. With smaller trackpads I would constantly run out of space to end the drag without lifting my fingers. That’s why I got used to click-and-holding with my thumb and repeatedly swiping with the index finger to complete the motion. What am I missing here?

Double-tapping and leaving the finger on the trackpad lets you drag the window or other item until you short-tap again. Drag lock even allows you to lift your finger and continue dragging from a different position on the trackpad.
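For anyone who wants to script these settings, they can also be toggled from the command line. This is a sketch assuming the defaults domain and keys recent macOS versions use; these keys have moved around between releases, so verify against your version:

```shell
# Tap-to-drag and drag lock for the built-in trackpad;
# log out and back in for the changes to take effect.
defaults write com.apple.AppleMultitouchTrackpad Dragging -bool true
defaults write com.apple.AppleMultitouchTrackpad DragLock -bool true

# Three-finger drag, the gesture discussed in this thread
defaults write com.apple.AppleMultitouchTrackpad TrackpadThreeFingerDrag -bool true
```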

I don't run out of space anymore. I don't even have to do what you just described (repeatedly swiping with the index finger). I can often just click down with my index finger and drag a window all the way to where I want it without reaching the edge of the trackpad. I could never do this in the past, so I came to rely heavily on repeatedly three-finger-dragging windows around.

I also used to prefer three-finger drag because it was physically hard to click down and hold the click while dragging. But the Force Touch trackpad makes that much, much easier, and it also makes it so that I can always initiate the click anywhere, even at the top of the trackpad.

So it's really the combination of the larger trackpad and the Force Touch design that finally prodded me to stop using three-finger drag.

Thanks for clearing that up. My error was to have only drag operations in mind where letting go mid-way is not an option. So, yes, a dragged window stays where it is. I was thinking about a dragged item and how it bounces back to its original position or ends up in the wrong container if you let go too soon.

Not to rathole on this, but now I'm confused. When I click-drag something on macOS, I can click once and repeatedly swipe with my index finger, lifting it from the touchpad each time to continue. This is true with every operation I can think of: selecting text, dragging windows, dragging items, etc.

In theory, a three-finger swipe is basically the GUI equivalent of a click and drag, so the behaviors should be identical.

I am a heavy Mac user and I could understand the Apple premium. There was no real competition before, but now that's changed. There are good offerings from Dell, Lenovo, Razer, HP, Asus and many others now.

I didn't even notice my mistake until I received the machine. Oh well, USB-C is a nice-to-have but not mandatory (HP blocks eGPUs anyway). Skylake vs. Kaby Lake is a very small difference, except for the graphics.

Except that Best Buy also made a mistake and sent me an i5/8/256 version (Silver instead of Ash Silver), which was promptly returned. I used the refund to buy parts for a desktop ITX PC on Newegg. I'll be using a Chromebook or whatever when I can't be bothered to sit at a desk.

What were you using before? I just upgraded from a late-2008 MBP, and the difference is night and day. Computers only get noticeably better roughly every six years these days, so if you were using last year's model, then of course it's going to seem like a ton of money for no real benefit.

From my perspective, this thing is at least 2x as fast in real-world use, substantially lighter, and has a way better screen and speakers. And the keyboard is awesome and the build quality is outrageously good.

I get that people are upset they can't edit 5K or whatever, but you can get a desktop computer for that at less than half the price. Obviously they will be even better in three years once Intel has chips appropriate for the MBP that also support LPDDR4, but for now this seems like clearly the best computer on the market and very clearly aligned with where the industry is going.

For instance, if you are an entrepreneur, any computer will have enough power and other things become important. An entrepreneur needs to travel a lot, so long battery life is essential, low weight is essential, small size is essential, and everything just working (software-hardware integration) is essential.

You can't move a desktop easily.

As an entrepreneur and engineer I use MacBooks a lot. Heavy loads are done on servers. In the past I used to compile my own Gentoo (Linux) installs and build my own desktops and servers from discrete components to get the best bang for the buck. Not anymore.

MacBooks are great computers in overall design, if you are careful enough to avoid first-generation designs (that applies to any product from any company). Once they iron out all the bugs, it just works.

I can't speak about the MBP as I have only ever bought Windows machines, but I was in the market for a new laptop for my new software dev business (a one-man startup at the moment) and I looked at a few. Although I should say that my brother is still using his 2008 17" MBP to this day, so they do know how to build them!

My shortlist was the XPS 13, XPS 15, Surface Book and Latitude E7470.

I discounted the XPS firstly for having too many coil whine issues even after 3 generations[1]. In addition, I had read about the key travel being short (1.2mm I think) but it wasn't until I actually tried one in PCWorld that I realised it's horrific to type on for any length of time. Not something I had experienced in the past. The 4k screen is wonderful though.

The Surface Book, while having a nice keyboard and sumptuous screen, has a terrible warranty: 1 year hardware support out of the box and for 3 years it was around £350. Even then, all they would do is send you a second hand replacement and take yours away. I read about some people that had been sent badly scratched replacements even though theirs was perfect. Too risky for a £2000+ machine.

The Latitude E7470 ticked all the boxes: easily expandable, tough, 14", and a screen res of 2560 x 1440. Also, I found one in the outlet store (scratch and dent) for £800 with a 3-year onsite next-day warranty. I haven't received it yet, but I have a 7-year-old one at home that still runs as a Windows Server 2012 R2 machine, and apart from having crap battery life, it works fine.

I hope your XPS is ok, it's stunning to look at in the flesh but I didn't want to take the risk that my £2000 machine started whining and to have Dell say "it's by design", and I hated the keyboard, so I stuck with the slightly less glamorous but no less capable Latitude.

Great move. I keep hearing about small issues here and there on forums and Reddit for this first-gen notebook. It looks like many others are paying a premium to Apple to be their beta testers. I really hope Apple releases a non-Touch-Bar version in parallel for future refreshes.

> * The Boot Camp experience just sucks (this was my primary reason for returning it). Currently, there's no way to gracefully switch between the discrete and integrated GPUs, so the battery life is terrible, like two-and-a-half-hour maximum battery life terrible. gpu-switch doesn't work either. In fact, if you use gpu-switch you'll have to rebuild both macOS and Windows, as the machine will just hang when you try to boot into either.

Also a problem on previous models. I was working on my own GPU switching solution but gave up due to lack of time and the fact most of my applications work just fine under OS X.

I checked, but I couldn't find out if Dell still has that Sputnik programme they did a few years back, where you could order one of these with Ubuntu LTS preinstalled. I'm wondering if everything "just works" yet.

First: Chargers may in the future offer cryptographic signatures for authentication against a whitelist on the device.
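A handshake along those lines could be sketched like this. It's a toy model using a shared-secret HMAC instead of real public-key signatures, and the charger ID, secret, and whitelist layout are all invented for illustration:

```python
import hashlib
import hmac
import os

# Device-side whitelist: charger IDs mapped to their provisioned secrets.
# A real scheme would store public keys and verify signatures instead.
WHITELIST = {"acme-87w": b"factory-provisioned-secret"}

def charger_respond(secret: bytes, challenge: bytes) -> bytes:
    """The charger proves knowledge of its secret over a fresh challenge."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def device_accepts(charger_id: str, respond) -> bool:
    """The device challenges the charger and checks the response."""
    secret = WHITELIST.get(charger_id)
    if secret is None:
        return False  # unknown charger: refuse to negotiate power
    challenge = os.urandom(16)
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(respond(challenge), expected)

# A whitelisted charger authenticates; a counterfeit one does not.
genuine = device_accepts("acme-87w",
                         lambda c: charger_respond(b"factory-provisioned-secret", c))
counterfeit = device_accepts("acme-87w",
                             lambda c: charger_respond(b"wrong-secret", c))
```

The fresh random challenge is what stops a counterfeit charger from simply replaying a previously observed response.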

Second and most problematic:
The MacBook is a good citizen here, but many laptops (HP business series, Dell XPS series) only support USB-C PD with profiles 4 and 5 (20V/3A+).
This rules out the car dongle as well as cheap USB power banks.

The connector is always the same, the customer cannot deduce charger/device compatibility. The experience will suck.
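That compatibility lottery boils down to whether the two sides share at least one power level. A minimal sketch; the specific voltage/current pairs are illustrative, taken from the examples in this thread:

```python
def can_charge(charger_levels, device_levels):
    """True if charger and device agree on at least one (volts, amps) level."""
    return bool(set(charger_levels) & set(device_levels))

cheap_power_bank = {(5.0, 3.0)}                           # low profile only
full_pd_charger = {(5.0, 3.0), (12.0, 3.0), (20.0, 3.0)}  # covers high profiles
xps_13 = {(20.0, 3.0)}                                    # insists on 20V/3A
macbook = {(5.0, 3.0), (20.0, 3.0)}                       # also accepts 5V/3A

# Same connector, different outcomes:
bank_charges_macbook = can_charge(cheap_power_bank, macbook)  # works
bank_charges_xps = can_charge(cheap_power_bank, xps_13)       # silently fails
```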

In other words, exactly like the early days of micro/mini USB? I distinctly recall having chargers from BlackBerrys that wouldn't support my first Android phones due to being underpowered. I would imagine over time we'll see something almost identical, only with two unofficial categories instead:

One set of chargers will be for mobile devices and just support the highest standard we see in them.

One will be for laptops and the same - just supports the highest profile for them.

Depending on cable configuration, pinout, wall plate and structured wiring system that 8P8C might be usable (or not) for multiple different types of data networking, from the assorted ethernet speeds to E1 to token ring, or for a serial console, or delivering power and audio to a remote speaker, or hdmi-over-utp, or even -48V telephony, and let's not even get started on the only-subtly-different but actually incompatible RJ45 connector, or people sticking RJ11 plugs in 8P8C ports.

I almost always hear "RJ45" to identify the 8-conductor ethernet female or male connector - depending on the context. It is universally understood, and there is no confusion about it. There is nothing wrong with using it in everyday conversation.

I have never once heard the phrase "8P8C" used to refer to an ethernet jack. Not once (outside of this thread) - but I have heard it used that way when referring to various 8-pin telco connections - it was a common term of art in the 90s when describing telco installations that used that configuration. When talking about Ethernet, and people are trying to be specific, they usually reference EIA-TIA-568B/A.

There are certain words, like "bandwidth", that might technically mean the width of the band (typically in Hz) but have grown over time to refer to data rate as well. And that's cool; language is versatile that way.

This interesting tangent about common parlance for connector names demonstrates another way in which USB Type-C's adoption trajectory is characteristically similar to 8P8C: people are already giving it a technically incorrect common name, "USB-C".

Chargers and devices need not support all charging standards/profiles, and thus may disagree on working together. A fully capable charger (supporting all USB PD profiles up to 100W) looks exactly the same as a profile-1-only one, and both may even say "USB PD" in the specs.

Thank god the MacBook accepts the widespread 5V/3A USB-PD power level and even USB BC.

I don't know why you are getting downvoted (I guess it's the "fuck you" at the end), but I absolutely agree: if a cable fits in a port, it should just work. Anything else is horrible, user-hostile design. Apple sells an LG USB-C display, and if you use the USB-C cable bundled with the MacBook Pro, it doesn't work. And you don't get an error message; it just doesn't work.

Some companies, for example Nintendo, figured this out a long time ago. Notice how with their consoles, if the disc/cartridge fits in the console, the console will always play it, even between generations. The customer shouldn't have to research arcane names and study symbols on cables - if it fits, it should just work. And USB-C is just a mess at the moment.

> Notice how with their consoles, if the disc/cartridge fits in the console, the console will always play it, even between generations.

Well, that's not quite true. Both the Wii and Wii U have a standard-sized disc slot; on the Wii you can insert small GameCube discs into that slot and they'll play, but on the Wii U they won't. On the portable side, 3DS cartridges do have a tab to prevent them from fitting into a DS, but that wasn't the case for the handful of games exclusive to the short-lived DSi.

Recent Nintendo consoles have also had compatibility issues with standard storage devices. The Wii supported SD cards, but wasn't compatible with SDHC cards, which are almost all cards with a capacity of 4GB or higher. This was eventually rectified with a software update... but the update only applied to the system menu, not to games which could access SD cards themselves, including notably Super Smash Bros. Brawl. The Wii U, for its part, supports storing games on external hard drives, but doesn't provide as much power over USB as most hosts do, requiring the use of a USB Y cable and a separate USB power source even for drives that don't normally require external power.

I haven't come to a conclusion w/r to those "shiny" new MacBook features yet, but I'm not an Apple user either. Time will tell though.

I hope Dell will improve its power circuitry in the future and support USB BC and PD 5V/3A. Let's hope it's just that current power chipsets lack those modes because of time to market pressure and Apple is ahead of its competitors here.

I can understand your anger at Apple, I hear the same a lot from design and audio professionals... You may have got some downvotes for that last statement.

Having had the MacBook 12" for a year and a half now and a Nexus 6P for a year, it's really been quite wonderful. I can charge my laptop or phone using the same charger – of course not as fast as with the OEM charger, but wonderful for being on the move. I love that I can use a typical battery pack to charge my MacBook. I really wish everything of mine were USB-C, and it will be soon.

Right now it feels wonderful with OEM chargers, but you do have me worried about the future buying replacements and accessories.

There will be the cheap Chinese chargers that will suck. Then customers will complain, and Belkin and others will notice and make good chargers with proper marks on the packaging. It won't take too long before customers and shops know what to buy or sell.

I have that car adapter as well as a USB power bank. Neither charges the Dell XPS 13, because it wants 20V/3A. The docs for both chargers and for the Dell are light on the USB-PD details. The chargers are great for phones, so I didn't return them.

However, there's not a single car charger or power bank that does USB-PD at 20V/3A at the moment. So sad.

In my own (possibly and probably inaccurate) opinion, I feel as though "hackers" aren't the people who need professional gear to do professional (or daily) tasks. They're the ones who can make the most out of as little equipment as possible.

"See that toaster over there? It's been reprogrammed to automatically deposit my cat's food every fourth hour and have it warm as well."

"That 2009 dinosaur of a smartphone sitting in the corner? It's an IP surveillance camera."

"That first generation Xbox, it's powering the zoom feature of the Hubble Telescope."

That last one might have been a bit of an exaggeration, but my point is that something isn't great because it's the latest. Something is great because someone increased its value after using it, or created something of higher value than the equipment used to create it.

> Hackers do not benefit from a closed box with non-expandable performance.

To address this point... some might. But not all will. And I certainly think that fewer will than the generations before. I really hope USB-C will be as great as these companies say it will, but until then I'll happily use my different ports that work as they are expected to.

All professionals need professional gear to do professional work. That's kind of the definition of a professional - someone who can afford the right tools, and knows how to use them to get a job done quickly and competently.

Turning a toaster into a cat feeder is tinkering, not professional hacking. There's nothing wrong with tinkering. But it's the difference between wiring up a Raspberry Pi as a heating controller, and building a company that sells fully licensed and certified heating controllers all over the world with support infrastructure.

One is a hobby project; the other... isn't.

A useful definition of a professional tool is one that lets you forget you're using it because it's so transparently intuitive you never have to think about its needs.

I don't think the 2016 MBP does that. The ports are (literally) a side issue. The problem is more that Apple are thoughtlessly losing their reputation among professionals, because Cook, Schiller and co don't seem to be thinking hard enough about what they're doing, and don't appear to understand what their professional customers are looking for.

...Which is not something super-thin for the sake of it, or with a gimmicky touch bar. It's something expandable with ports that "just work", no physical or metaphorical rough edges, with the option to have decent memory (i.e. 32GB) and a reasonable processor speed bump.

This shouldn't be hard or controversial, but for some reason it seems to be beyond Apple's understanding.

I'm hardly a hater. I bought the 12.9" iPad Pro last week, and I'm loving it. But the laptop format is challenging: you either stay conservative, or you go fully experimental with (say) a dual-display clamshell, or even a touch panel instead of the trackpad.

Half-hearted innovations like the touch bar glued onto an ungenerous spec look like gimmicks for the sake of it, not serious attempts to improve professional productivity.

Don't get me wrong, but professionals are those who deploy Win3.11 machines running on embedded i386 cores for big money today. It's more about meeting some specs, getting certifications and providing reliability than it is about technical details.

Your definition describes more the bleeding-edge users who are constantly limited by technology and could justify paying a couple thousand for a 10% increase in performance. This group overlaps with, but is not equal to, the professional users.

Many real professionals will love the new MBP lineup while many more will hate it.

I still don't understand this seemingly rigid idea of what a "pro" needs in a computer. (High performance, who cares about the battery life or form factor.) Surely it depends what your line of work is.

I'm a programmer, mostly web-based apps and related servers. For my usage, I need an SSD (~512GB) and a recent CPU with 16GB+ of RAM. The touchpad is what has kept me on the MBP... Looking at the Razer Blade Pro, I love the keyboard layout, but it's totally overkill for my needs.

An integrated GPU with a higher-end i5, a big SSD, and lots of RAM for half the price would be more appealing to me... even a lower-res screen would be okay for my needs... love the form factor though.

I didn't say an integrated GPU and 32GB are taxed... I said I needed an SSD greater than 256GB (512GB), and at least 16GB of RAM. However, in many corp environments, if you need those, you get a hefty machine.

If it's being taxed, it's probably because some idiot spec'd an HDD that pulls resources too slowly. For modern JS dev you really need an SSD more than anything else, mainly because the build/watch process tracks many thousands of small files, which is significantly worse on an HDD: 60+ seconds vs. under a second for any change to take effect in the browser. That can be as much as an hour a day wasted. The 5 hours of wasted time in a few weeks cost more than the upgrade to an SSD.
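The back-of-the-envelope math behind that claim, assuming around 60 rebuild-triggering saves per day (the per-change timings are from the comment above; the saves-per-day figure is my assumption):

```python
changes_per_day = 60   # rebuild-triggering saves in a working day (assumed)
hdd_rebuild_s = 60     # watch/build cycle on a spinning disk
ssd_rebuild_s = 1      # the same cycle on an SSD

wasted_s_per_day = changes_per_day * (hdd_rebuild_s - ssd_rebuild_s)
wasted_h_per_day = wasted_s_per_day / 3600    # ~1 hour/day lost to the HDD
wasted_h_two_weeks = wasted_h_per_day * 10    # ~10 hours over 10 workdays
```

At any plausible hourly rate, a couple of weeks of that waste covers the cost of the SSD upgrade.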

-- edit

As to 512GB: after all the software, that can take 100GB... creative assets well over that, depending on the projects... it's easy to hit 240GB between the OS, software, projects, and assets.

Beyond that, show me off-the-shelf hardware that can be configured with 16GB+ of RAM and a 512GB SSD that doesn't have the other stuff I don't need.

>The problem is more that Apple are thoughtlessly losing their reputation among professionals, because Cook, Schiller and co don't seem to be thinking hard enough what they're doing, and don't appear to have an understanding of what their professional customers are looking for.

I just wanna know what cable they use when they need to plug their iPhone into their MacBook. I don't seem to recall Apple selling a usb-c to lightning cable.

I find it weirder that the people whining about this are usually not impaired by it in their day jobs. I hear people shouting about wanting GPUs and multicores and 16GB, and then that they are doing React dev on it professionally. What do you need that monster for, doing that kind of dev? When I do 3D game dev, ML, or image processing/recognition I need something heavy (although... I really only need that locally for the first item on that list). The rest I can do on almost anything made after 2009, including React dev.

For me (closer to the React guy), I want an SSD and plenty of RAM... a high-end Core i5 with an integrated GPU will do... but in many corp environments there are only a handful of build options, so you get the uber machine or 8GB of RAM with a spinning-rust drive.

I mean, I can "get by" on anything - an iPod Touch that runs Emacs 23, just to pick an example off my desk. When the chips are down, I'll get the job done with it. When the chips aren't down, and I have something more closely approaching my druthers with regard to the tools I use, I'm not going to go for the iPod.

I agree that USB-C might be helpful in that respect, but it totally depends on the drivers. If there is a device that I can make hiccup through timing attacks it depends on what liberties I have in the driver.

For Bluetooth for example you have Ubertooth. If the Bluetooth radio on my laptop would be accessible on a low enough level there would be no need to use other hardware to execute attacks.

I'm just a hobbyist with a JTAG programmer and some digital analysis gear, nothing fancy. A professional reverse engineer is another beast, with microscopes, etc.

A hacker in the sense of someone exploiting vulnerabilities in web applications benefits not from a single computer but from many. I don't think he/she would care about the specs too much.

> They're the ones who can make the most out of as little equipment as possible

... like a $4k laptop?

It's not like Apple is the only company to support USB-C on laptops... in fact, as far as I can tell, every major manufacturer has it on their most recently-released laptops.

> To address this point... some might. But not all will.

How, exactly, would someone... anyone... BENEFIT from closed-box performance? Just because someone might not benefit from a laptop you can upgrade doesn't mean the converse... that they somehow benefit from having one that you can't upgrade.

(side note: I can think of ways... decreased cost, decreased size... but the laptops certainly don't cost less, and any difference in size w/ a 15" laptop is negligible)

This version of what a hacker is makes it sound like Apple is saying "This is a sub-par piece of equipment, but you'll make the most of it because you're a hacker who's more productive with less!"
I agree with the definition on its own merits.

I think it's funny how signaling things like "expandable performance" and "deep key travel" get tied to being a hacker. The only thing you can upgrade even on an expandable laptop is memory and disk, which a hacker is probably maxing out to begin with. (Even expandable laptops are pretty limited by the chipsets these days in how much RAM they can handle). And given the high resale value of Macs, your typical Silicon Valley worker will probably spend more on lattes than on simply selling their Mac every couple of years and buying a new one.

HP ZBook first gen here: I can also replace the CPU and GPU. I didn't do it and never will, but it's in the user manual.

I upgraded the RAM and swapped the DVD drive for a 1TB SSD. The HDD is still in there, set to auto-shutdown after 5 seconds. I use it for storing large files I don't need on the SSD but that can be handy to have online sometimes, like raw videos from my camera.

I'd like to replace the keyboard with one without the number pad. Possible in theory, but there is no part on the market that fits. The only 15" laptops without a number pad at the time were the Mac and, I think, the XPS, which was overheating. Maybe the latter's problem is solved now.

So in this context, it's almost entirely controlled through software. One can be a hacker with hardware, but as of now I doubt USB-C is going to make anyone as much of a hacker as a specific type of software might. And this new MacBook has very little to do with changing how real "hackers" might do things.

1. A person who enjoys exploring the details of programmable systems and how to stretch their capabilities, as opposed to most users, who prefer to learn only the minimum necessary. RFC1392, the Internet Users' Glossary, usefully amplifies this as: A person who delights in having an intimate understanding of the internal workings of a system, computers and computer networks in particular.

2. One who programs enthusiastically (even obsessively) or who enjoys programming rather than just theorizing about programming.

3. A person capable of appreciating hack value.

4. A person who is good at programming quickly.

5. An expert at a particular program, or one who frequently does work using it or on it; as in ‘a Unix hacker’. (Definitions 1 through 5 are correlated, and people who fit them congregate.)

6. An expert or enthusiast of any kind. One might be an astronomy hacker, for example.

7. One who enjoys the intellectual challenge of creatively overcoming or circumventing limitations.

8. [deprecated] A malicious meddler who tries to discover sensitive information by poking around. Hence password hacker, network hacker. The correct term for this sense is cracker.

The term ‘hacker’ also tends to connote membership in the global community defined by the net (see the network). For discussion of some of the basics of this culture, see the How To Become A Hacker FAQ. It also implies that the person described is seen to subscribe to some version of the hacker ethic (see hacker ethic).

It is better to be described as a hacker by others than to describe oneself that way. Hackers consider themselves something of an elite (a meritocracy based on ability), though one to which new members are gladly welcome. There is thus a certain ego satisfaction to be had in identifying yourself as a hacker (but if you claim to be one and are not, you'll quickly be labeled bogus). See also geek, wannabee.

This term seems to have been first adopted as a badge in the 1960s by the hacker culture surrounding TMRC and the MIT AI Lab. We have a report that it was used in a sense close to this entry's by teenage radio hams and electronics tinkerers in the mid-1950s.

>No, sorry, the new MacBook pro sucks for hackers. It's great for prosumers who like gadgets and benefit from USB-C. Hackers do not benefit from a closed box with non-expandable performance.

You'd be surprised. Linus Torvalds, a hacker one would presume, used to have a MacBook Air as his primary laptop and praise it as the best laptop ever made, saying that other companies had failed to produce something as good (though after 2014 he moved to a Chromebook). And the reasons he gave for praising it are exactly what people think Apple puts too much emphasis on: thinness, lightness, and battery life.

Of course, that's a single data point (though for more data points you can go to any software conference, where Apple laptops usually dominate among both the speakers and the audience, even though they are less expandable and have less performance than some gaming/desktop-replacement laptops).

An Apple laptop can fit well with some hackers, for at least two reasons, I think:

First, tinkerer != hacker. A hacker can be a tinkerer, but it's not the same thing. There are people who love to hack on specific things, create new stuff etc, but could not care less about dealing with the hardware or customizing their window manager. A lot of hackers I know in fact tend to be quite minimalistic in those areas.

Second, a hacker isn't necessarily all about raw performance and 8GB graphics cards. A lot of hacker types and great programmers in C or whatever can do wonders with very little hardware.

Now, other kinds of professionals, like 3D artists, number crunchers and such, might definitely want more GPU/RAM. But even for a lot of creative professionals, stable and "what we know" trumps "latest and greatest". Most professional music studios I know, for example, run setups two or three generations old, and never jump to the latest OS until 1-2 years after it's out.

In the photography world, where I dabble (and once did professionally), we have this notion of "measurbators" and "pixel peepers": those who are obsessed with ISO performance, megapixel count, sharpness, synthetic tests of camera gear and so on, but seldom create output of any particular worth. I guess the same goes for PC people obsessed with benchmarks beyond a certain point, especially if their workflows don't need them. A hacker, in this regard, would be the opposite. This guy is a hacker, photography-wise.

USB is standard, so now "hackers" are supposed to have a raging lust for standards on Apple hardware?

"Don't worry, the dongles are USB standards too." I don't give a rat's ass whether the dongles are standard or not. I am not lugging around dongles to connect my phone to my laptop.

I use Ubuntu and a OnePlus 3, and USB 3.1 Type-C works like a charm at 10 Gbps speed. I got 100 pieces of really good-quality cable for around $2.50 each and have used up around 25 around my house, office and car (removed link).

I am not spending 1 bazillionton dollars on the same or worse quality Apple cables.

After a week with a 13 inch, what I've mainly noticed is how physically unwelcoming it is:

The edges are very sharp, and the air vents on the bottom are right where you grab the laptop to pick it up, which gives it a knife-like feel. Also, by expanding the trackpad far beyond its useful size, there is now no gap between it and the space bar. I have discovered that I have a habit of resting my thumb just below the space bar, and I now tend to bump the pointer by accident. The arrow keys are now a continuous run of keys, with no way to orient quickly like before (where the side arrows were slightly smaller and made it obvious where the up key was without looking). Finally, the keyboard action is very short, as you would expect with such a thin laptop.

I do most of my work with an external keyboard and monitor, so it isn't that big a deal to me, but I can see it being hard on people who use their laptop exclusively.

> I have discovered that I have a habit of resting my thumb just below the space bar, and now I tend to bump the pointer by accident

This is surprising and disappointing. I really enjoyed using Apple's touchpads with tap to click because their accidental press recognition was good enough that I never really worried about this. I typically had to turn tap to click off when using Linux because I would randomly click while typing.

I wonder why they haven't been able to get it to work as well on the new touchpad.

However, unless I'm mistaken, I don't think all drivers have palm detection?

Also, unfortunately, while I would have enjoyed configuring the system to my personal taste, with the limited free time I have these days I'd rather just do what I was going to do on my laptop instead of spending time tinkering to get it right. Apple does very well in this respect, since it almost always knows when I want to tap versus when I just have my palm resting there.

"...what I've mainly noticed is how physically unwelcoming it is: The edges are very sharp..." - I noticed this too on the early Microsoft Surface power bricks (not sure if it applies to the later Surface models): they had unusually sharp edges, which struck me at the time as unnecessary and not human-centric. I wondered whether the designer ever had the opportunity to interact with the tactile experience before it went into production. Has the design process broken down, so that designers no longer see a production prototype before they sign off?

Yes, I have the 2014 version of the 13-inch MacBook Pro. The increased sharpness is very noticeable to me, particularly where the underside vents are now placed, creating that knife-like feel when you pick it up.

Sharp edges seem to be a trend that has migrated to other manufacturers as they ape Apple's design. My three-year-old Lenovo and new work-supplied Dell are both unforgiving to type on for the same reason.

I am disappointed by the Touch Bar. It is such a small, dim screen, and interacting with it requires a lot of attention. There is no tactile feedback, so touch typing is impossible. QuickType and emojis are useless for me. It might be a good accessibility feature, but QuickType is much too slow, and the upward movement of the entire hand/arm interrupts the flow of typing. I think the Touch Bar is great for designers who can benefit from a 'general purpose touch/slider input device' (e.g. for color mixing, navigating timelines, parameter fine-tuning). It might also reduce cognitive load a bit, since it reduces the need to memorize keyboard shortcuts. On the other hand, users blessed with a good memory will probably not benefit much in that regard, because pressing a key combination is much quicker than scanning and touching the Touch Bar.

The only attractive feature of the Touch Bar model really is Touch ID. If Apple would sell the Function Keys model with a Touch ID power button and one more USB port, I would happily buy it. But right now, I am a little bit confused and baffled by Apple's new MacBook Pro product line. I think I am going to wait another generation to see whether Apple gets back on track and whether the Touch Bar can stand the test of time.

I use the function keys a lot for changing volume, screen/keyboard brightness, and pausing/playing music. I also have Terminal and Firefox mapped to F3 and F4, I heavily use ESC for all kinds of things (vim, closing UI dialogs), and I occasionally use the function keys for shortcuts such as finding the next occurrence with F3 in Foxit Reader, F10 to compile, F7 to show the desktop, and F6 to switch between Karabiner profiles. For me it would be quite a loss.

I bought a new 13" MBP with touchbar and I'm returning it on Monday. I don't like the keyboard and I _really_ don't like the touch bar, and I seem to only get about 3 hours of battery life. I'm going to stick with my early 2014 MacBook Air until Apple figures their stuff out.

Making any judgment like this on battery life after a single day or even a few days of use is VERY unwise.

The main reason is that your system needs to index your entire drive, which draws MUCH more power than typical use until the process is complete. You should give a new system at least a week to shake out before freaking out about battery life.

I've also noticed the battery life is quite bad. My first one might have been defective, as it only gave about 3 hours of battery life (from the computer's own estimate), which is the same as my 4-year-old 2012 MacBook Pro.

My second 13" Touch Bar machine now reports about 5 hours, which is still not great.

Also, for what it's worth, I love the new keyboard. It took me a few days to get used to it but just yesterday I measured myself typing on it at 130 wpm, the same speed I can do on normal keyboards. It's very satisfyingly clicky.

Yes, I've been hearing about the battery life from several people now. I was very excited to get one but I tried it at the store and really didn't like the keyboard. If, in the future apple just decides to do the whole keyboard as a screen, then I guess I'll just have to always use an external keyboard.

Same here. On the other hand, I find it very plausible that some people will genuinely like it, but a lot of the good things I've heard about the keyboard come from people who just received their computer. I generally take that kind of feedback with a grain of salt, as buyer's remorse bias is a very well understood phenomenon :D

Just to add some anecdata: I tested the keyboard for 10 minutes with TypeRacer in an Apple Store and I found it to be OK. The clicking certainly does provide enough feedback and since I am used to mechanical keyboards it also does not bother me how loud it is (I actually like loud keyboards). I also liked the short key travel.

I remember when I first got my MacBook with the Force Touch trackpad (which doesn't physically move but instead uses haptics to simulate the 'click' feeling) and how much I hated it initially. The illusion of the click worked fine; it just wasn't nearly strong enough. A week later I loved it and now hate the previous buttons, which just feel mushy and archaic.

Similar story with the iPhone 7 home button, except I've just come to accept and ignore it, though I wish we still had a proper physical home button.

The smaller battery is disappointing. Yes, the CPU uses less power, but that only counts if your MBP is more or less idle. In the end, footprint matters, not thickness … Apple gave up too much battery capacity, IMHO.

Hell yeah! I sincerely regard those 2 ports on the cheaper model as plain hardware crippling. Much like AV manufacturers punish you with fewer audio/HDMI ports if you go for their entry models.

What's the use case for needing 3 USB ports? Wherever you normally sit (home or work), just get a USB hub/dock and use the single connection to the laptop for keyboard, mouse, charging, and driving a monitor.

When out and about, when do you need 3? I can't ever recall seeing someone at a coffee shop, conference, etc. needing 3 USB ports.

I think the idea is you would have a USB-C hub that has all of those ports on it, so you come to your desk to work and plug in a single plug and away you go with charging/monitor/keyboard/mouse/ethernet.

Three seems like a minimum to me: power, mouse (yeah, Bluetooth, but the only ones I like require a dongle), and phone. My current laptop has two USB ports, and it covers about 99% of my needs, but it's also frequently full.

I've been on the non-touch 13" for about a week now and I'm really happy with it. I moved over from a 2012 11" MBA. The form factor of the 13" is great for grabbing 30 minutes here and there during travel to do some work.

Battery life is good, and 16GB is a big step up from the 8GB I had on the MBA. I've been running a decent-sized system in minikube with no problems.

The only issue I still seem to have is that Hangouts absolutely obliterates the battery. A one-hour call took battery life from 9:45 to 1:15!

I bought a 13" MBP with the Touch Bar too, and I am also getting bad battery life, on the order of 3-5 hours depending on the workload. I've ordered a 13" without the Touch Bar in hopes that the lower-wattage CPU, lack of the Touch Bar, and bigger battery will result in much more battery life.

With AS? Yeesh. Mine (previous gen) will drain in about 2 hours with that or PyCharm open. IntelliJ stuff is super power-hungry unless you put it in power-save mode (and then it's not even as useful as a "plain" editor like Sublime Text or something).

I mean, I like all the power-hungry features, and I'm usually plugged in - I'd prefer they keep adding useful things rather than worry about my battery. But it does hurt at times :)

Maybe keyboards were already a solved problem. People had no desire for change here. I find recent Apple quite out of touch with its market. They push further because it was 'Apple genius' before, without checking whether the crowd still dreams about that. A bit lazy and self-centered.

This is a terrible attitude for a technologist to have. Cellphones were a "solved problem" before the first iPhone came out. Laptop monitors were a "solved problem" before the first retina MacBook Pro came out. Look where we are now.

Of course, this doesn't mean every change made to an existing technology will be an improvement and/or be embraced right away and by everyone, but I see it as a Good Thing that a major technology player is still experimenting with things that everyone else has written off as "already solved."

Cellphones I can agree with; the Retina display, not at all. This is not just about technology but about how Apple approaches its products. The latest MBP feels too much like technology solving non-problems: the bestest trackpad (too big, "no" palm rest), the thinnest keyboard (no tactile feedback, you need to spend a second making sure your fingers are in the right place), a dynamic Touch Bar that is too dynamic and forces people to leave the content and look down. All this, surrounded by the feeling that what Apple has done since the iPod/iPhone, cross-coupling all their technological improvements across the product line (the iPod helped the iPhone, the iPhone helped the MacBook Air, etc.), is now applied blindly because it worked before and Apple has to occupy the terrain and grab the spotlight.

That's reading a lot into my words. I was happy to "get used" to the new keyboard because I thought it was awesome that my laptop had become thinner and lighter... two things that I value more than having a keyboard that might be slightly better.

Thinner and lighter only matter when carrying it around. Keyboards matter when you're using the computer. Most folks I know who use their computer 8 or more hours a day do far more with the keyboard than they ever do lugging the machine around. So the keyboard really matters, even in small increments.

I'm also going to make what I think is a reasonable assumption and say that the vast majority of folks carry their laptops around in bags. In that case, the 7 ounces saved (for reference, about the same weight as a 45-watt power brick, minus the extension cord) is nearly meaningless, and the half inch on each dimension is completely meaningless.

The difference between 13" and 15" for size and weight is much greater than the size and weight savings afforded by the new keyboard. Also, both the 13" and 15" MacBook Pros have the same keyboard.

On a side note, I think that a 13" monitor hits a sweet spot. It fits in a lot of bags, has just enough screen size to be productive without requiring an external monitor, and the performance difference between 13" and 15" computers is usually quite small (if it exists at all).

> The difference between 13" and 15" for size and weight is much greater than the size and weight savings afforded by the new keyboard. Also, both the 13" and 15" MacBook Pros have the same keyboard.

It's a cumulative process. My first MacBook was a white 13.3" polycarbonate Core 2 Duo. The new 15" MBP is a pound lighter, almost half as thick, and only half an inch deeper and an inch wider. Samsung has a 15" laptop that's under 3 pounds, making it very palatable even for people used to a 13-incher.

At what point along that evolution should Apple have stopped striving to get smaller?

To me that sounds like you're the target market for what was the Air, or what is now the MacBook. The MacBook Pro is their top of the line and shouldn't diminish something as important as the keyboard, which people use all day, for an extra millimeter off the thickness. They've crossed the line into form over function.

I tried switching to the 12" earlier this year. I ended up giving up due to the machine's poor performance relative to my needs, but I did get used to the keyboard; once I did, I was able to type pretty fast on it. (Then again, my work and home setups feature Das Keyboard Pros, one with brown switches, one with blue.)

When I first saw the touchbar, I immediately thought of a plastic, overheating, underpowered piece of crap laptop you find at any Retail store. Literally looking at that touchbar makes me think of having to uninstall 20 HP bloatware apps from a family's shiny new craptop.

This entire article is raving about a single, relatively small but important detail that Apple has been notoriously hostile towards, and worst at, until now: cross-manufacturer port compatibility. So hostile that they went out of their way to find ridiculous loopholes in their compliance with the EU's Common EPS Memorandum of Understanding on micro-USB-B.

Yes, it's a great feature, but giving Apple so much credit for introducing it is the ultimate irony.

The EC organized a voluntary Memorandum of Understanding on mobile phone chargers, in which the major phone companies stated their intent to standardize on a common charger (depending on who you ask, to avoid the EU making a binding rule otherwise).

Apple signed this too but, as we all know, didn't actually add a micro-USB-B connector to their phones like everybody else, instead just providing an adapter, which was not forbidden but arguably against the spirit of the memorandum.

Citing Wikipedia: "Some observers, noting Apple's continued use of proprietary, non-micro-USB charging ports on their smartphones, suggested Apple was not in compliance with the 2009 Common EPS Memorandum of Understanding. The European Commission, however, confirmed that all MoU signatories 'have met their obligations under the MoU,'[15] stating specifically, 'Concerning Apple's previous and present proprietary connectors and their compatibility with the agreement, the MoU allows for the use of an adaptor without prescribing the conditions for its provision.'"

Be careful with the idea that the charger is just a regular USB-C charger. These laptops will draw 3-4 amps at 20 volts. Most phone chargers are designed for 1-3 amps at 5 volts, so they would only provide a trickle of charge to these laptops. Older chargers (and computer ports!) can be damaged by these higher power draws. Also, the USB-C cable the Macs come with is rated for a hefty 5 amps; some of the cheap phone cables out there could actually pose a fire danger if made to carry 5 amps.
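To put the wattage gap in perspective, here's a trivial sketch using the figures from the comment above (power is just volts × amps; the charger values are illustrative round numbers, not official specs):

```python
def watts(volts, amps):
    """Power (in watts) is voltage times current."""
    return volts * amps

# A typical phone charger: 5 V at 2 A
phone = watts(5, 2)        # 10 W
# The laptop draw described above: 20 V at 3-4 A over USB-PD
laptop_min = watts(20, 3)  # 60 W
laptop_max = watts(20, 4)  # 80 W

print(phone, laptop_min, laptop_max)  # 10 60 80
```

So a phone charger offers roughly a sixth to an eighth of what the laptop wants, which is why it only trickle-charges.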

> I’m sure it’s only a matter of time until Amazon is flooded with cheap versions of this idea that tweak it just enough to avoid patent issues. I look forward to buying $3 breakaway USB-C cables in the future.

Has this guy not seen what's going on with Amazon and the cheap USB-C cables flooding it[0]? They're literally destroying people's laptops by letting devices draw too much power and frying either end.

"It’s not the Nexus’ fault that my MacBook got fried — it was just doing what it was supposed to do: ask for as much power as it can get. It’s not the MacBook’s fault either — its ports weren’t designed to handle delivering that much juice nor to know that they shouldn’t even try. It is the fault of the cable, which is supposed to protect both sides from screwing up the energy equation with resistors and proper wiring."

This is simply not true. Both the device supplying power and the device receiving it should be regulating; the cable should just be "dumb". I've done a bunch of testing on various chargers using cheap USB power meters and 2A USB dummy loads. For instance, iPhones/iPads will monitor the charger's voltage and, if it starts sagging, reduce the amperage they draw. (My own testing has only been with "classic" USB Type-A stuff, though.)

The only cable I've heard of that actually fried anything was one that was wired completely wrong putting power on the data pins or something like that.

Are you familiar with the work of Benson Leung (a Google hardware engineer working on the Pixel) reviewing various USB-C cables and accessories? He's extensively demonstrated that there are USB-C cables on the market that are dangerous.

Yes, he is who I was referring to when I mentioned "The only cable I've heard of that actually fried anything was one that was wired completely wrong putting power on the data pins or something like that."

edit: I will admit I am partially wrong, though - checking a sample of his reviews reveals the case of Type-A-to-C cables, where, as a bridge between the standards, the cable presents itself as a charger - in those cases, yes, it should not just be "dumb". I still maintain that well-behaved chargers should not supply more power than they are capable of, and well-behaved devices should (and many do) monitor their charging environment and back off when the voltage sags.

Only chargers that don't follow the spec can be damaged by the power draw. That is the problem with adapters and A-to-C cables that use the wrong resistor: a USB-C device will pull 3A from a USB-A charger that can only provide 2.4A at most.

Also, the charge cable is electronically marked for 5A. Regular USB-C cables only do 3A, which limits them to 60W with USB-PD.

Are the MacBook USB-C ports special in some way that makes them USB-C and also a non-standard charger port?

If not, then they must be compliant USB-C ports and therefore would not supply an incorrect voltage. As far as I understand the fundamentals of electricity, current capacity doesn't matter: a 5V USB power source that can supply 100 amps would still only supply whatever the device was designed to draw when charging.

But all of that is moot if the MacBook ports are "special" USB-C ports and somehow they negotiate a voltage higher than 5V.

The USB Power Delivery (USB-PD) spec allows devices to negotiate up to 20V. The key word, however, is "negotiate": this is no different from how older USB charging solutions work (start at 5V + 100mA, then negotiate to whatever the client device can handle and the host will allow).

I'm always amused by how uninformed many people are about USB and power supply. If devices can't negotiate something that works for both ends, they just stop. It's a shitty situation to be in if your device fails to charge, but devices that follow the USB-IF rules will never draw more current than the power source can supply, and the power source will never supply more than the device has requested.

A "dumb" device that doesn't negotiate gets the minimum 5V@100mA that the USB spec allows. Dedicated chargers often decide not to drop the power after the negotiation window is over, but a proper host device (laptop, etc.) will drop all power if a device fails to negotiate.

So yeah, if you plug something into the MacBook charger it will either not charge (I don't know whether it supports traditional USB power specs, since it's a Type-C-only charger) or charge like any other charger already does.
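The negotiate-or-stop behavior described in these comments can be sketched as a toy model (purely illustrative; the real USB-PD handshake is a stateful wire protocol, and the profiles below are made up):

```python
def negotiate(source_profiles, device_preferences):
    """Toy model of the USB-PD handshake: the source advertises the
    (volts, amps) profiles it can supply, the device picks the first
    advertised profile it can accept, and neither side ever exceeds
    what was agreed. Not the real wire protocol."""
    for profile in device_preferences:   # device's order of preference
        if profile in source_profiles:
            return profile               # agreement reached
    return None                          # no agreement: no power delivered

# A hypothetical 60 W USB-PD charger and a laptop that prefers 20 V:
charger = {(5, 3), (9, 3), (20, 3)}
laptop = [(20, 3), (9, 3)]
print(negotiate(charger, laptop))     # (20, 3) -> 60 W
print(negotiate(charger, [(20, 5)]))  # None: charger can't do 5 A
```

The point is simply that a spec-following pair can never end up in an over-draw situation, because delivery only happens at an explicitly agreed level.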

Re: "dumb" devices, this is how it's supposed to work, yes, but I don't think I've ever seen a charger that won't happily dump 1A into a resistor wired across the power pins - including brand-name (Apple) chargers and, yes, the USB ports on a MacBook Pro. I've never seen a modern device supply only 100mA.

For devices that DO attempt to negotiate, yes, in those cases you'll usually see things work according to spec.

I haven't tried USB-C yet, it would be interesting to see if devices have gotten stricter.

I'll just go ahead and contradict the running opinion and say: I like my 2016 MacBook Pro 15".

I've owned a few Macs and this one is my favorite. Despite my initial impressions, I really like the keyboard. I'm also quite happy with the trackpad, which I've found to be excellent as usual and not prone to picking up stray contact.

The only place I'm not "ecstatic" or "pleased" is with the Touch Bar - and to be clear, I'm not displeased. I just don't really notice it. It's there doing its thing and I'm using the laptop, doing my thing. Occasionally I need the escape key and tap where I'd expect it to be, and while I don't get the tactile feedback, it works.

All in all, the Touch Bar is a net zero to me. I didn't lose anything by losing my Fn keys, but I don't feel I gained anything with the Touch Bar except Touch ID, which is nice.

I find it really odd that people are praising the new keyboard. I must admit that I have only tested it in the Apple Store, but I find it terrible; the short travel feels really bad to me. Is this something you just get used to?

I'm not sure - I use a gaming keyboard (SteelSeries) on my desktop at home, and while it is mechanical with deep travel, you barely have to tap a key to get it to register. That is, you don't need to fully press a key to trigger it. At work I use a CODE keyboard, which is also mechanical with a pretty deep press.

Lots of people complained about the original Apple 'chiclet'-style laptop keyboard when it was first introduced circa 2006, for the same reasons. In my experience with many different keyboards over the years, once you get used to shorter key travel it's hard to go back to taller keys.

As far as laptop keyboards go, it's pretty great. The greater issue is the palm rejection on the giant touchpad, which is considerably worse than before (when it was never a problem). This is something that can be solved in software, though, so I'm not too worried.

I've had a 13" for almost a week. I think it's great. 95% of my time is spent in iTerm/VS Code/Xcode/Android Studio.

Some remarks:

- The keyboard, especially the arrow keys, took me 2 days to get used to, but now it feels weird typing on an old MacBook. I actually love it and prefer it now.

- The thumb + touchpad thing mentioned elsewhere here was definitely a big problem in the first day or two. It isn't anymore (guess I got used to it? Not sure, because I didn't consciously try to avoid it).

- USB-C is freaking awesome. I bought an adapter with Ethernet, HDMI, USB 3 and SD that actually replaces the 3 adapters I used to carry around. And because of USB-PD, I only have one cable to plug into the Mac and everything is there, including power.

- I don't miss magsafe as much as I thought I would. Although I would happily buy an adapter if it is thin enough (some are coming).

- The Touch Bar is actually pretty great, although since it's a touch screen, the lack of tactile feedback can be annoying at first. Pressing ESC is annoying at first too, but I got used to it and don't mind it now.

- The Touch Bar would be an _awesome_ medium for notifications (such as long-running terminal jobs, etc...)

- I didn't get any of the battery life issues people are talking about. I actually get 8-10 hours out of it easily (i.e. I plug it in at the end of the day because I forgot it was unplugged).

- The thinner bezels around the screen somehow make it look bigger (even though the visible area is the same size and the screen/lid itself is smaller).

- HiDPI is freaking great. Finally I can use a 4K monitor smaller than 32" and still get a retina display (1440p HiDPI and other intermediate resolutions up to the native 3840x2160 are fully supported).

I've never really used the F-keys since my Visual Studio days (F5/F9/F10/F11), when I used to do .NET 2.0 on Windows. However, with the Touch Bar, when you press Fn the F-keys are displayed instantly (you can also choose to have them displayed by default).

For Android Studio, I also remap the keys to Cmd+something for consistency with Xcode. For instance, I mapped build to Cmd+B, etc...

For those of us who use a variety of JetBrains products, the F-key shortcuts are the consistency. Having them faked on a minuscule screen, rather than being findable by touch, would make the new MacBook unusable for me.

Sad, as I find OS X to be the best in a bleak landscape of OSes, but given Apple's obtuse insistence on knowing what's best for everyone, to the extent of removing critical components of the universal standard keyboard, my current MB Pro will certainly be my last. Ugh, Windows!

USB-C is awesome. It is the future, and in some ways it's good that Apple is being the way it is.

But... I wish Apple would put USB-C in the iDevices (honestly, why do the iPhone and iPad use Lightning when USB-C exists??). It is annoying that I can ditch all my cables except that one fucking extra Apple cable.

Time will tell, but I wonder if reliability will be the issue. USB-C has 24 contacts crammed in there. That's a lot of tiny parts for something that gets manhandled and lint packed as much as a phone charger connection.

If I was on the hook to warranty replace $600 devices when the internal connector gets ruined I might take my time jumping on board too.

It isn't like Lightning is super reliable. I have had many official cables fall apart within a year of normal business travel, and at least one port on an iPhone 6 fail, most likely due to something getting into the port (although Apple never stated that, obviously).

I doubt USB-C is that much worse than Lightning. Even if it were, say, 10% less reliable as a cable, it would still be much better to have a USB-C port, simply because you're more likely to have a USB-C cable around if everything else uses it.

Because Lightning is significantly thinner, less wide, and mechanically superior in every way. There's a much firmer "snap" when you plug it in, the connection stays connected, and the port itself is more durable as well, since it doesn't have a thin plasticky tongue in the middle holding all the contacts.

I'd think (but have zero experience here; would love to hear otherwise if I'm wrong) that C would be better for RF interference purposes. C has the contacts inside a solid shroud, while Lightning has them exposed. It seems like Lightning would need extra shielding on the port side, whereas C takes care of some/all of it in the cord?

My (crazy) theory is that they will remove all ports from the iPhone soon and rely on wireless charging and data. It may be necessary to achieve Ive's (well documented) dream of a screen that covers the entire front, edge to edge.

They haven't switched to USB-C because people will be super annoyed if they only support it for a couple of years.

It also explains the headphone jack removal. Removing two ports at the same time would make people super mad. And they also need to stimulate the wireless headphone market.

> It may be necessary to achieve Ive's (well documented) dream of a screen that covers the entire front edge-to-edge

Source on this?

I didn't know this is what Ive wants to achieve, but it's something I've assumed every device manufacturer would be racing towards (even if only on a niche device to test market reaction), and I've wondered why no one seems to be boldly pushing it.

They'll probably change to USB-C eventually though, and the smart time to do that was in the same year that they did on all their other hardware, which is also the same year that they dropped the headphone jack.

IMHO they should have done it with the iPhone 7, as they ditched the 3.5mm jack, forcing people to buy headphones that will only work with an Apple iPhone or iPad. Hell, not even their laptops have Lightning ports, so those headphones are literally useless for anything else.

I also find it amusing that they released the new MacBook Pro with a 3.5mm jack when they clearly don't need it. It makes even less sense on a laptop than on a phone, IMHO.

I don't get it. It's bad for hackers because of the tepid software updates, the increasingly developer-unfriendly application environment, the lack of full touch in an age where every other manufacturer offers it as standard, and the awkward meshing with its own ecosystem.

USB-C is pretty great, actually. Still pretty raw for the mainstream crowd, but it's not like that for tech-literate consumers.

Mobile and web are touch-native now. It's fantastic for mobile developers to be able to test with an input method similar to the one their users actually use.

I'm not sure what makes touch not poweruser and "hacker" friendly, but it's not the 80s anymore. Most of us have significant portions of our workload on the web and the web is, by and large, an extremely touch-friendly medium.

Hello. This subtle "you seem different from my dated 80s image of hackers so you aren't one" game you're playing is boring, and if you check out the score no one is buying what you're selling.

It's faster to reach up to the screen to tap a link and return to the home row than it is to move to the trackpad and do the same, unless the pointer is quite close to the target. Both have similar repetitive strain implications.

> The new charging block that comes with the MBP looks exactly the same as any traditional MBP charger

Actually, no: the convenient little 'arms' for winding the cord are missing. And the cord itself is rather stiff and not very flexible … and there's no green/red charging status light either. It's OK as a USB-C charger, but it looks and feels different from a traditional MBP charger.

The "arms" on MagSafe adapters are actually bad for the cable - wrapping it around them puts undue stress on the cable, which causes it to fray faster. A stiffer, less flexible cable should also help resist fraying.

It seems like Apple took a proprietary, easily damaged charging cable and replaced it with a standard, less easily damaged one.

As someone who took good care of my MacBook charging adapters and cables, I found them perfectly durable. Those arms are only problematic when one wraps the cable immediately, instead of giving it an extra loop before beginning the winding.

I'm away from my laptop at the moment so I can't see which brand/model I have exactly, but there are a number of accessories for chargers that I find more convenient than the built-in arms, especially because I use the "long" power cable. It's something like this: https://www.amazon.com/Cable-Organizer-Macbook-Power-Adapter...

I'd honestly love to use my laptop's Ethernet dongle with my iPad or iPhone, but that won't happen. Apple has just recently doubled down on Lightning with the Apple TV Remote, the iPhone 7 headphones and the new Magic Trackpad/Mouse/Keyboard trio. For iPad accessories, they've even introduced the all-new Smart Connector instead of a USB-C port on the side of the iPad.