The original Galaxy Note I played with was an AT&T model, and as a result was built around the same platform (by platform I mean the combination of SoC and baseband) as the Skyrocket, AT&T's SGS2 with LTE. That platform was Qualcomm's Fusion 2 chipset, the very popular combination of an APQ8060 SoC (45nm dual core Scorpion at 1.5 GHz with Adreno 220 graphics) and an MDM9x00 baseband (Qualcomm's 45nm first generation multimode LTE solution). The US-bound Galaxy S 3 variants were built around that platform's successor, Qualcomm's MSM8960 SoC (28nm dual core Krait at 1.5 GHz with Adreno 225 graphics and an onboard 2nd gen LTE baseband). The result was quick time to market with the latest and greatest silicon, improved performance, onboard LTE without two discrete modems, and lower power consumption.

The Galaxy Note 2 does something different, and finally brings Samsung's Exynos line of SoCs into devices bound for the USA, where air interfaces are a combination of LTE, WCDMA, and CDMA2000. It's clear that the Note 2 was on a different development cycle, and this time the standalone 28nm LTE baseband I've been talking about forever was available for use. That part is MDM9x15, the same baseband found in the iPhone 5, Optimus G, One X+, and a bunch of other upcoming handsets. If you haven't read our other reviews where I've talked about this, the short version is that MDM9x15 is natively voice enabled (MDM9x00 was not, unless you ran it in a Fusion platform), smaller, and lower power than its predecessor. The result is that there's finally a multimode FDD-LTE, TDD-LTE, WCDMA (up to DC-HSPA+), EVDO (up to EVDO Rev.B), and TD-SCDMA baseband out there which doesn't require a two chip solution. I could go on for pages about how this is primarily an engineering decision at this point, but the availability of MDM9x15 is why we're finally seeing OEMs ship handsets built around SoCs other than Qualcomm's while still including LTE.

Anyhow, for a lot of people this will be the first time experiencing Samsung's current Exynos 4 flagship, the Exynos 4412: four ARM Cortex A9 cores at a maximum of 1.6 GHz alongside an ARM Mali-400MP4 GPU, built on Samsung's 32nm HK-MG process. To the best of my knowledge, the Note 2 continues to use a 2x32 bit LPDDR2 memory interface, same as the international Galaxy S 3, though PCDDR3 is also an option for Exynos 4412.

I’ve put together a table with specifications of the Note 2 and some other recent devices for comparison.

| Physical Comparison | Apple iPhone 5 | Samsung Galaxy S 3 (USA) | Samsung Galaxy Note (USA) | Samsung Galaxy Note 2 |
|---|---|---|---|---|
| Height | 123.8 mm (4.87") | 136.6 mm (5.38") | 146.8 mm | 151.1 mm |
| Width | 58.6 mm (2.31") | 70.6 mm (2.78") | 82.9 mm | 80.5 mm |
| Depth | 7.6 mm (0.30") | 8.6 mm (0.34") | 9.7 mm | 9.4 mm |
| Weight | 112 g (3.95 oz) | 133 g (4.7 oz) | 178 g | 180 g |
| CPU | 1.3 GHz Apple A6 (Dual Core Apple Swift) | 1.5 GHz MSM8960 (Dual Core Krait) | 1.5 GHz APQ8060 (Dual Core Scorpion) | 1.6 GHz Samsung Exynos 4412 (Quad Core Cortex A9) |
| GPU | PowerVR SGX 543MP3 | Adreno 225 | Adreno 220 | Mali-400MP4 |
| RAM | 1 GB LPDDR2 | 2 GB LPDDR2 | 1 GB LPDDR2 | 2 GB LPDDR2 |
| NAND | 16, 32, or 64 GB integrated | 16/32 GB NAND with up to 64 GB microSDXC | 16 GB NAND with up to 32 GB microSD | 16/32/64 GB NAND (?) with up to 64 GB microSDXC |
| Camera | 8 MP with LED Flash + 1.2 MP front facing | 8 MP with LED Flash + 1.9 MP front facing | 8 MP with LED Flash + 2 MP front facing | 8 MP with LED Flash + 1.9 MP front facing |
| Screen | 4" 1136 x 640 LED backlit LCD | 4.8" 1280 x 720 HD SAMOLED | 5.3" 1280 x 800 HD SAMOLED | 5.5" 1280 x 720 HD SAMOLED |
| Battery | Internal 5.45 Whr | Removable 7.98 Whr | Removable 9.25 Whr | Removable 11.78 Whr |

The Galaxy Note 2 is also one of the first handsets on the market, other than Nexus devices, to ship running Android 4.1. This puts it at a definite advantage in some tests, as we'll show in a moment, thanks both to improvements from Project Butter and what appear to be even newer Mali-400 drivers. I pulled the Note 1 out of my drawer, updated it to Android 4.0.1, and ran all the same tests again.

First up are some of the usual JavaScript performance tests, which are run in the stock browser. Anand added a few, and personally I think we have almost an overabundance of JavaScript performance emphasis right now. Again, these results are strongly influenced by the V8 JavaScript engine and its JIT (just in time) compiler bundled with the stock browser on Android. OEMs spend a lot of time optimizing V8 for the nuances of their particular architecture, which can make a substantial difference in scores.

The usual disclosure here is that Android benchmarking is still a non-deterministic beast due to garbage collection, and I’m still not fully satisfied with everything that is available out there, but we have to make do with what we’ve got for the moment.

Next up is GLBenchmark 2.5.1 which now includes a beefier gameplay simulation test called Egypt HD alongside the previous Egypt test which is now named Egypt Classic. Offscreen resolution gets a bump to 1080p as well.

Here we see Mali-400MP4 performing basically the same as I saw in the international Galaxy S 3, which is no surprise; it is after all the same SoC. Other than a slight bump in the Egypt Classic offscreen performance numbers, there aren't any surprises. We see Exynos 4412 putting up a good fight, but Adreno 320 in APQ8064 is still something to look out for on the horizon. I'd run Taiji as well, but we'd basically just see vsync at this point.

Vellamo 2.0.1 is a new version of the previously well-received Vellamo benchmark, developed by Qualcomm initially for in-house performance regression testing and check-in, later adopted by OEMs for their own testing, and finally released on the Google Play Store. This is the first time the 2.0 version of Vellamo has made an appearance here, and after vetting it and spending time on the phone with its makers, I feel just the same way about 2.0 as I did about 1.0. The usual disclosure applies: this is Qualcomm's benchmark, and that stigma will only go away after the app is open sourced for myself and others to code review. That said, from the deconstruction of the APK I've done, and further inspection of the included JS, I'm confident there's no blatant cheating going on; it simply isn't worth it.

Vellamo 2's biggest new addition is a 'Metal' test which, as the name implies, includes some native tests. These are C code compiled with the standard Android toolchain and the -O2 optimization flag into both ARMv7 and x86 binaries. There's Dhrystone for integer benchmarking, a native Linpack, Branch-K, Stream 5.9, RamJam, and a storage subtest.

Exynos 4412 and Android 4.1 make a definitely potent combination, which puts the Note 2 close to, if not at, the top in a ton of CPU bound tests. My go-to heavily threaded application is still Chrome for Android, which regularly lights up all four cores completely. Even though our testing is done in the stock browser (since this almost always has the faster, platform-specific V8 library), my subjective testing is in Chrome, and the Note 2 feels very quick.

Comments

Of course you could consider this an empirical display of the difference between Apple supporters and Apple haters. The Apple supporters don't seem to feel a compulsive need to wander into a non-Apple thread and tell everyone how much Android, Samsung, Google, AMOLED, S-Pen and TouchWiz all suck.

Either way, yes, a thread that doesn't feel like a wanna-be gang fight between two groups of 8-year-olds is a pleasant experience!

As the owner of a 5" Dell Streak, I'm looking to replace it with another large phone, and the Note 2 seems just perfect. One question though: can I use another launcher (I don't like TouchWiz) without losing the S-Pen features?

"we load webpages at a fixed interval until the handset dies, with display set at exactly 200 nits as always. The test is performed over both cellular data and WiFi. The new test has decreased pause time between web page loads and a number of JavaScript-heavy pages."

What's the pause time?

In such a test, the system that is fastest to idle will typically draw the least power over time. One can only interpret a device's performance relative to other devices on the same test. No one should assume they'll actually get 10 hours of WiFi/cellular browsing, whatever time your device of choice gets.

Do humans really do what the battery life test does? I don't know if we're input saturated on how fast we can web browse, but I don't think we're at that saturation point just yet. So if a phone downloads and renders web pages faster, we would just browse more pages in the same time frame. The devices that burn more power during download and render may end up with shorter battery life simply because the user is browsing more, and faster.

You guys are definitely promoting the idea of a wider "dynamic range" on battery performance. The battery life test is perhaps a light use case, based on my gut feel. Minimally, I think, at least for web browsing, a minimum battery life figure should at least be established for devices.

Lighter and lighter browsing workloads would tail off towards standby time. You may want to establish or guess at what the max work rate for humans is while browsing the web, like using an average reading speed or maybe somewhere in the 80th percentile of reading speed.

Obviously, it's more than a one-parameter problem with games, GPS, etc., but maybe that can be tackled later.

Of all the smart phones I know about, this is the one I'm most interested in. However, I'm a Windows kind of guy, tried Android briefly and I certainly see why people would choose it over iOS, but in my opinion it's not great. Just my personal view, of course. I'm Windows trained and Windows is the most "intuitive" for me, largely because of that I'm sure.

So, in case anyone from Microsoft is reading - can we get Nokia to make something like this with Win 8 as the OS?

On Verizon? (I have to say though, Verizon's methodology for keeping their phones up-to-date doesn't thrill me. Of course, cell phones, particularly smart phones, are about as private as a house made entirely of screen doors anyway, so I'm not sure that's all that important as things exist today.)

Really, you have a Note 2 without split screen functionality? So you cannot see and work with 2 apps at the same time?? That's weird! Here in Holland I cannot walk the street without seeing billboards showing off this feature... those Apple fans don't know what hit them... first iOS 6, which feels like a 2010 version of Android to Android users, and now this Note 2 can even show 2 apps at the same time, side by side!

Really weird that the US versions don't have this firmware yet. But like you said in other reviews, the US market is totally different. Here we see Samsung phones in the shops; there you see AT&T or T-Mobile or Verizon phones. Although here in NL phones are almost always sold with a subscription, and with the name of your provider on them, at least it's an actual Samsung phone and not a T-Mobile phone.

What? They're all still Samsung/HTC/etc branded in the US... What are you going on about? The carriers do meddle with updates and the firmware, but the thing isn't sold as an AT&T Galaxy S or something, it still says Samsung and it's advertised by Samsung (for better or worse, their anti-Apple commercials are almost as bad as Apple's old anti-PC commercials)...

Hell, my Sprint HTC EVO 4G LTE doesn't even have Sprint silkscreened anywhere on the front (an amazing show of restraint on their part). AT&T's often the worst about branding; they've put the name AND logo on some Moto phones (centered, no less).