
10/17/2013

I have been using a Galaxy Nexus for quite some time, and it worked fine as an everyday/development phone until its IMEI number got blocked on T-Mobile recently. If your IMEI number is 004999010640000 (you can see your IMEI by dialing *#06#), then this post might help you restore your connectivity. It did for me: I got my GN off eBay long ago; it was a DoCoMo phone from Japan and had been unlocked with the F*ckDoCoMo app by the seller.

For the confused audience

You used a Galaxy Nexus on T-Mobile (or another network) and one day it suddenly stopped registering on the cellular network. After rounds of unhelpful customer service calls, you were told that your IMEI number (004999010640000) is blocked on the T-Mobile network (and possibly other networks too), and that the customer representative could do absolutely nothing about it (aside from filing an ineffectual unblock request). If your blocked IMEI is different from the one stated above, this post will not help: it does NOT teach you how to change to an arbitrary IMEI. Legality aside, software-based IMEI modification on the Galaxy Nexus seems to remain an unsolved problem (unlike on some other phones).

Solution

The gist of my approach is to restore the original, unique IMEI number (it is printed on the sticker on the back of your phone, under the battery) from your phone's /factory/.nv_data.bak file. And if you do your backups right, you shouldn't lose any of your data. The usual disclaimer for hacking guides applies: it worked for me and I believe it works in principle, but I cannot guarantee that it will work for you, and I cannot be held responsible for any damage.

The basic ideas/facts behind the solution are:

/factory/.nv_data.bak (and maybe /factory/nv_data.bin too?) stores the system backup of your IMEI (and other information). Note that since this file is hidden (its filename starts with a dot), it only shows up in a listing with "$ ls -a"

both files store the IMEI in some non-obvious way. This prevents easy IMEI modification with a hex editor, and it is why we cannot just make up a new IMEI number for your GN, unlike on some other phones which lack such a protective mechanism.

the Android system uses salted md5 checksums stored in /factory/.nv_data.bak.md5 and /data/radio/nv_data.bin.md5 to detect tampering with the aforementioned files. We don't know how to generate these salted checksums, but a glitch/oversight in ICS (Ice Cream Sandwich, Android 4.0.x) can effectively generate them for us: it logs an offending checksum along with its correct counterpart.

if both nv_data files fail the checksum test, the Android system will create a "default" /data/radio/nv_data.bin containing the infamous generic IMEI (along with the unlock status). In fact, the F*ckDoCoMo app unlocks the GN via this side effect of tampering with nv_data and leaving the checksums out of sync. The nice thing is that the app edits your /factory/.nv_data.bak in a sensible way, so your phone will remain unlocked with this nv_data file.

If your /factory/.nv_data.bak has not been overwritten inadvertently -- a big "if" that can only be tested by experimentation -- then we should be able to recover your original IMEI by acquiring the correct checksum for /factory/.nv_data.bak.

Instructions

(I assume that you are familiar with adb, fastboot, and other basic tools, or can easily acquire such knowledge via the Web. Upon request, I will edit this post to expand on or clarify any unclear steps. The XDA threads mentioned at the end are also helpful.)

0. back up your entire phone to /sdcard with the ClockworkMod recovery (CWMR) boot image's backup utility. In addition, since there is a risk that you will lose your unique IMEI forever, you should carefully back up the entire /factory and /data/radio directories onto your laptop/desktop/cloud/etc. You can do that with "$ adb pull".

1. install a rooted ICS-based ROM (root is probably not strictly necessary since you can get it in CWMR, but it will make life easier), e.g. cm-9.0.0-maguro-stable from CyanogenMod. The glitch we rely on is fixed in later versions. (I tried Jelly Bean 4.2 and it was no good.)

2. (You will need root to do this and the following steps.) Take note of the content in /data/radio/log/nv.log

Find the original md5 that matches /factory/.nv_data.bak.md5 (in my case, it is 92d39f2375e581f8bd8095486c8758b1) and then copy the computed md5 on the same line (in my case, a4bc620886c51f712caaa377adf90cf24 is wrong -- it is a4bc620886c51f712caa377adf90cf24). The computed md5 is the salted md5 checksum we want. FYI: the other set of md5's corresponds to /factory/nv_data.bin. You don't really need to worry about them unless .nv_data.bak is corrupted and nv_data.bin is not, in which case you might want to try the process with /factory/nv_data.bin instead.
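The log-scraping in this step can be sketched in a few lines of Python. The exact wording of nv.log varies between ROMs, so the pattern assumed below (the complaining line contains both the stored and the computed md5 as 32-hex-digit strings) is an assumption, not a parser for every ROM:

```python
import re

# Any 32-hex-digit token, i.e. a candidate md5 checksum.
HEX32 = re.compile(r'\b[0-9a-f]{32}\b')

def find_computed_md5(log_text, stored_md5):
    """Return the md5 paired with stored_md5 on the same log line, if any."""
    stored_md5 = stored_md5.lower()
    for line in log_text.splitlines():
        hashes = HEX32.findall(line.lower())
        if stored_md5 in hashes:
            # the other hash on the line is the computed (salted) checksum
            others = [h for h in hashes if h != stored_md5]
            if others:
                return others[0]
    return None
```

You would pull /data/radio/log/nv.log to your computer first and run this against its contents with the md5 from /factory/.nv_data.bak.md5.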

3. Remount /factory as read/write and then replace .nv_data.bak.md5 with the correct salted md5 checksum from last step.

4. $ reboot. Now /data/radio/nv.log should contain no new complaints about failed checksums, and the IMEI should be restored to some non-generic number! (If you don't care about restoring your data or the ROM/Android version you were using, you are done now.)

5. restore your backup from /sdcard using CWMR. Note that /data/radio (but not /factory) is replaced by your restored backup, so your IMEI is the generic one again!

6. since we now have the correct salted md5 checksum in /factory/.nv_data.bak.md5 to go with /factory/.nv_data.bak, you only need to remove /data/radio/nv_data.bin* and then reboot to let the system automatically restore from /factory/.nv_data.bak. After rebooting, you should see the same non-generic IMEI as in step 4, with your phone fully restored!

If this works and you want to thank me, buy me a cup of coffee and/or follow me on twitter: @falcondai :)

Additional Resource

You can check whether your IMEI is blocked on T-Mobile here: http://www.t-mobile.com/VERIFYIMEI.ASPX (004999010640000 is confirmed to be blocked. Such a generic IMEI should never have been blocked... maybe someone reported a phone with that IMEI as stolen to T-Mobile?) This IMEI number seems to be a default for many Samsung phones.
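As an aside, the generic IMEI is structurally well-formed: like any IMEI, its last digit is a Luhn check digit, and 004999010640000 passes the standard Luhn test. A quick sketch to verify:

```python
# Luhn checksum validation, as used for IMEI check digits: starting from
# the rightmost digit, double every second digit, sum the digits of the
# results, and require the total to be a multiple of 10.
def luhn_valid(number):
    digits = [int(d) for d in str(number)]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:   # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9   # same as summing the two digits of d
        total += d
    return total % 10 == 0

print(luhn_valid('004999010640000'))  # True: a structurally valid IMEI
```

So whatever blocked it, it was not malformed; it is simply a default burned into many phones.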

According to some people on Reddit, this generic IMEI number was blocked on T-Mobile as early as 2013/2: http://www.reddit.com/r/Android/comments/17yizs/anyone_else_running_a_custom_rom_and_have_your/ But mine somehow managed to survive until recently, i.e. about a month ago (2013/9). I had called T-Mobile to add the generic IMEI number to my account explicitly after the first time it showed symptoms (likely back in February), and that solved the problem for me until it happened again.

Most, if not all, software-based IMEI modification discussions/methods do not generalize and almost certainly won't help in the case of the Galaxy Nexus, due to the obfuscation in nv_data. (If you find anything applicable, feel free to share it in the comments.)

6/20/2013

After having played around with Twitter data for a while, a question occurred to me: how does Twitter sample the supposedly random tweets that it sends out through its sample streaming API?

I vaguely remember that the official documentation used to say "1% random sample" somewhere, but I can no longer find that statement. So I decided to investigate the question experimentally. The result turned out to be far more fascinating than I expected (witness the appearance of 666).

This task would be trivial if I had firehose access, but I do not. I initially thought of crawling tweets with IDs near the ones received in the sample stream and then counting, but I quickly found out how terribly inefficient that was: tweet IDs are often very sparse. Then, thanks to Twitter's commitment to open source, I found their tweet ID generator on GitHub, wittily named snowflake (after a snowflake's large number of possible configurations, I suppose). To make globally unique ID generation distributed, snowflake's essential idea is to combine a timestamp with a unique worker ID, so that each worker can ensure uniqueness independently.

The first thing I noticed about snowflake is that whereas the 'created_at' property of the returned JSON tweet objects provides timing information at per-second resolution, one can recover per-millisecond timing information from a snowflake ID! With this more precise timing information, an intriguing pattern emerges from the tweets in the sample stream: within each second, all received tweets fall within a 10-millisecond-wide window. So we get 10/1000 = 1% of the millisecond timestamps, which translates to roughly 1% of all tweets (assuming good randomness in tweet creation times), confirming the claim in my memory. But the surprises do not stop there: the sampling window is the same for every second! It is fixed exactly between the 657th and the 666th millisecond. Hence the 666 in the title. I wonder what the story is behind choosing 666 and this particular scheme of "random" sampling.
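For the curious, recovering the millisecond timestamp from a snowflake ID is a couple of bit shifts. This sketch follows the published snowflake layout (41-bit timestamp of milliseconds since Twitter's custom epoch, 10-bit worker/machine ID, 12-bit per-millisecond sequence):

```python
# Twitter's custom epoch in milliseconds (2010-11-04, when snowflake went live).
TWITTER_EPOCH_MS = 1288834974657

def melt(snowflake_id):
    """Return (timestamp_ms, worker_id, sequence) decoded from a snowflake ID."""
    timestamp_ms = (snowflake_id >> 22) + TWITTER_EPOCH_MS  # top 41 bits
    worker_id = (snowflake_id >> 12) & 0x3FF                # middle 10 bits
    sequence = snowflake_id & 0xFFF                         # bottom 12 bits
    return timestamp_ms, worker_id, sequence

def ms_within_second(snowflake_id):
    # the 657-666 ms sampling window shows up in this value
    return melt(snowflake_id)[0] % 1000
```

Running ms_within_second over IDs from the sample stream is all it takes to see the fixed window.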

To make the post more complete, I should add that: 1. snowflake is used not only for tweet IDs but also for direct message IDs; 2. before snowflake was activated sometime on 11/4/2010, Twitter used incremental IDs (the earliest surviving tweet being ID 20).

To start playing with snowflake, you can use my little Python module to create and melt a snowflake ID. (Indeed, you might soon find that not every tweet is delivered even within that 10-millisecond window.)

If you find this interesting, leave a comment. We can also talk on twitter: @falcondai

5/22/2013

In the official Xbox One reveal yesterday (2013/5/21), it was mentioned that the new Kinect sensor employs a time-of-flight camera to acquire depth images instead of using the Light Coding technology of the original Kinect (which is patented by PrimeSense). This amazed me because I had heard how expensive time-of-flight cameras are: the SwissRanger 4000 (a popular choice in academic research), with 176x144 resolution at 50 FPS, costs more than $4,000. So I started digging and found out how ignorant I was of the recent developments in the field.

To understand how time-of-flight (TOF) cameras acquire a depth image, the Wikipedia page is a good place to start. The basic idea is that you measure the round-trip time (RTT) of photons that are emitted by the sensor and reflected back. Given the speed of light, 3x10^8 m/s, the sensor needs to resolve a time difference of about 6.7 picoseconds to measure a 1-millimeter difference in depth (1 mm is the best precision the original Kinect could achieve). This high precision requirement, i.e. high-frequency RF, makes the chip and circuit design more challenging and costly. The surprise for me was that 3DV Systems (and Canesta) seemed to have found a way to lower the cost drastically and planned to release an RGB-depth sensor, called ZCam, for under $100. (But before 3DV could sell it, the company was bought by Microsoft.)
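The timing arithmetic above can be checked in a few lines:

```python
# Back-of-the-envelope check: a depth change of 1 mm changes the
# round-trip path of the light by 2 mm, so the required timing
# resolution is dt = 2 * depth / c.
c = 3.0e8           # speed of light, m/s
depth_res = 1e-3    # desired depth resolution: 1 mm
dt = 2 * depth_res / c
print(dt)           # ~6.7e-12 s, i.e. about 6.7 picoseconds
```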

Let's venture deeper into the hardware. Thanks to WIRED Magazine's exclusive report, we can see the performance and internals of the new Kinect sensor.

From the picture of the circuit board, we can see that the three crucial components (the RGB camera, the infrared (IR) sensor, and the IR illuminator) are arranged not unlike in the original Kinect, except that the IR sensor and illuminator are no longer exposed.

The external of the new Kinect sensor unveiled with Xbox One

Photo of the internal of a new Kinect from WIRED with labels added by me

Photo of an original Kinect sensor with components labeled.

Thus the IR sensor and the RGB camera are still separate (also confirmed by the different fields of view when switching between the two streams in the WIRED video: http://youtu.be/Hi5kMNfgDS4?t=5m27s). If a sensor could capture both RGB and IR simultaneously (or just switch quickly between the two, at 60Hz/60Hz or 30Hz/30Hz), texture-mapping alignment in KinectFusion-type applications would improve.

In case you are wondering, the reason I argue that the IR illuminator is the big rectangular block to the left of the IR sensor (instead of the other way around) is that the active IR image shows shadows from IR illumination on the left, but the screen we see is a mirror image: http://youtu.be/Hi5kMNfgDS4?t=5m8s. This places the IR illuminator to the left of the IR sensor (which, by the way, should have a lens like the exposed RGB camera).

Compared to the original Kinect, the most noticeable improvement in the new Kinect is the almost shadow-less depth image (see the video: http://youtu.be/Hi5kMNfgDS4?t=37s), due to the closer placement of the IR sensor and illuminator. In fact, TOF technology allows more flexibility in illumination placement and design (and thus a better depth image). For comparison, google "kinect shadow" and take a look at the images.

Looking forward, the natural question to ask is: when will depth-sensing technology go mobile? Canesta seems to have had a prototype product that can fit into the form factor of a phone: https://www.youtube.com/watch?v=5_PVx1NbUZQ -- and that video is two years old.

The trail that led to this new Kinect sensor design is clear in retrospect. Microsoft acquired 3DV Systems and Canesta a few years ago, both of which had worked extensively on TOF technologies. The acquisitions obviously clear up some patent concerns for Microsoft (bye, PrimeSense...). The downside is that we, as developers and consumers, might not see a more open-source-friendly alternative with similar technology anytime soon, and we will have to rely on Microsoft to release a good SDK, and live with Windows, when using the new Kinect in commercial applications.

2/15/2013

Visualizing geo-location data is often more involved than visualizing other types of data, because you probably want to put it into the context of places instead of just latitude-longitude values. There are many solutions to this problem, and mine might not be the best or the easiest; please share your thoughts in the comments if you have a better one. My solution is to place your data on a Google map with their static image API: it can provide street views too!

Using the Google Maps Static Image API
To illustrate how easy this is, I will use some examples as well as links to the relevant entries in the API reference. The basic idea is that all data (and other parameters) are passed to Google in the URL string of a request, and the rendered static image is returned as the response.
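As a concrete sketch, here is how a request URL for a set of latitude-longitude points can be built. The parameter names (size, zoom, markers, sensor) come from the Static Maps API; the marker styling options and usage limits are in the reference:

```python
import urllib.parse

def static_map_url(points, size='512x512', zoom=None):
    """Build a Static Maps URL with a marker at each (lat, lon) point."""
    base = 'http://maps.googleapis.com/maps/api/staticmap'
    params = [('size', size), ('sensor', 'false')]
    if zoom is not None:
        params.append(('zoom', str(zoom)))
    # multiple locations in one markers parameter are separated by '|'
    params.append(('markers', '|'.join('%.6f,%.6f' % p for p in points)))
    return base + '?' + urllib.parse.urlencode(params)

# e.g. one marker near the University of Chicago (hypothetical data point)
print(static_map_url([(41.7886, -87.5987)], zoom=14))
```

Fetching the resulting URL (with a browser, curl, or urllib) returns the rendered PNG; omitting center/zoom lets the API fit the map to the markers.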