In this presentation, Google's image and video compression codecs were compared to major image and video compression standards, including JPEG, JPEG XR, JPEG 2000, x264, and HEVC (H.265). It is concluded that

Sony entered the binocular market on Friday, introducing its DEV-3 and DEV-5 digital binoculars, both capable of capturing 7.1-megapixel still images and 1080 HD videos in either 2D or 3D.

The binoculars are the “world’s first digital binoculars with HD video recording, zoom, autofocus and SteadyShot image stabilization,” according to the company’s announcement.

SteadyShot image stabilization is the optical stabilization system found in Sony Handycam camcorders and Cyber-shot cameras — it helps keep images clear and stable, even when viewing at high magnifications.

The binoculars come with a rechargeable battery pack that allows up to approximately three hours of 2D recording on a single charge. Also included with both models is a battery charger/adaptor, A/V connecting cable and USB cable for PC connection.

There are only slight differences between the two binocular models. While both the DEV-3 and DEV-5 feature 10x optical zooms, the DEV-5 can digitally zoom up to 20x. The biggest difference is the DEV-5's built-in GPS receiver, which enables it to automatically geo-tag videos and photos. Lastly, the DEV-5 comes with a number of fancy accessories not included with the DEV-3, including a carrying case, neck strap, finder cap, a lens cover and large eye cups.

You really have to be an enthusiastic birdwatcher or sports fanatic to shell out for these puppies, though. The DEV-3 comes in at $1,400, and the DEV-5 will be priced at a whopping $2,000. We're not convinced that a $600 price jump is justified for the addition of the DEV-5's GPS feature, advanced zooming and a few accessories.

Both binoculars will be available in November 2011 at Sony retail stores, the online Sony store and at authorized retailers.

Hewlett-Packard purchased Palm last year for over a billion dollars, primarily to get its hands on the WebOS operating system to power its tablets and smartphones. It has turned out to be much too little, too late. Despite WebOS being a new operating system with many attractive features, HP's tablet offering, the TouchPad, has been a major bust, selling in the hundreds and leaving major retailers complaining about their inventory and wanting HP to take it back. So HP announced yesterday that it is getting out of the tablet and smartphone business. WebOS may yet find a home (the most likely buyer would be someone currently betting on Android and worried that, now that Google has to make real money on Android to justify its $12.5B acquisition of Motorola Mobility, they should hedge that bet; but it will be an expensive hedge). I don't know why HP expected to be a big hit out of the gate with its WebOS strategy, and if it didn't have the stomach for a lengthy race and thought it was a sprint, I don't know why it bothered to get into the business in the first place.

HP, which is the largest PC manufacturer in the world, also announced that it may get out of PCs — presumably in the same way IBM did, by finding a home for the division in a company that is more geared up for producing consumer products.

They are also buying Autonomy, the largest software company in the UK, for $10B, positioning themselves more in services and servers and competing head to head with IBM and Oracle. Of course, CEO Leo Apotheker would probably prefer to buy his old company, SAP, but he can't afford it, since it is worth as much as HP.

Analysts didn't like it, and many downgraded HP; as a result, HP is down 20% (destroying $12B or so of market cap). So forget SAP being worth as much as HP: it's now worth $10B more.

So what a story! The big fight by Carly Fiorina (against Walter Hewlett, Bill Hewlett's son) to buy Compaq. Oh yes, and people's phones being bugged. Out she goes. In comes Mark Hurd. Weird sexual shenanigans and out he goes (and pops up at Oracle). In comes Leo Apotheker (whose prior experience was all in running software businesses such as SAP, and who for a time was hiding to avoid being subpoenaed in a lawsuit with Oracle). I wonder how long he'll last.

Friday, August 19, 2011

Dylan McGrath

8/17/2011 3:40 PM EDT

SAN FRANCISCO—Texas Instruments Inc. said Wednesday (Aug. 17) that its OMAP multimedia applications processor line is not for sale, contrary to widely circulated rumors.

A spokesperson for TI said via email that the company is aware of speculation in the press about the sale of the OMAP business and wanted to set the record straight. Rumors about the potential sale of the company's OMAP division are inaccurate, the spokesperson said.

"To be clear, these are rumors, plain and simple," the spokesperson said. "They are not true, and were not started by TI. TI remains committed to our core wireless business, which encompasses the OMAP applications processors and wireless connectivity solutions. And, we are committed to helping our customers succeed in the marketplace."

Rumors have been circulating for several weeks that TI was considering the sale of its OMAP division. Rumored suitors for the OMAP division included Broadcom Corp. and Advanced Micro Devices Inc. Last week, Will Strauss, principal analyst at market research firm Forward Concepts Inc., speculated that even Intel Corp. might be a potential fit for the ARM-based OMAP line.

TI previously declined to comment on the rumors, citing long-standing company policy not to comment on rumors or speculation about mergers, acquisitions or divestitures.

Thursday, August 18, 2011

Over the last few years we have observed the increasing attraction of FPGAs as signal processing engines, and there are systems on the market that employ an FPGA in conjunction with a processor chip in standalone systems. Nowadays, during the architecture phase of a project, system designers always face a common question: which is the better choice for the system, a DSP or an FPGA? This article lists five important parameters, or rather guidelines, to help make the right choice.


1. Performance

Identify the sampling rate of the system under consideration, then work through a few questions:

• If the sampling rate is more than a few MHz, an FPGA is the natural choice.
• What is the data rate of the system? If it is more than perhaps 20-30 Mbytes/second, an FPGA will handle it better.
• How many conditional operations are there? If there are none, an FPGA is perfect; if there are many, a software implementation may be better.
• Does your system use floating point? If so, this is a factor in favor of the programmable DSP.
• Are libraries available for what you want to do? Both DSPs and FPGAs offer libraries for basic building blocks like FIRs or FFTs, but more complex components may not be available, and this could sway your decision toward one approach or the other.

The table below shows a direct comparison for the performance criterion.
MMAC: the number of fixed-point 32-bit or single-precision floating-point multiply-accumulate operations that can be executed per second, in millions.
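The sampling-rate guideline and the MMAC figure above can be combined into a quick sizing check. A minimal Python sketch, where the 500 MMAC/s budget and the 64-tap FIR workload are hypothetical figures chosen only for illustration:

```python
# Back-of-the-envelope sizing: does an N-tap FIR at a given sample
# rate fit within a DSP's multiply-accumulate budget?
# The 500 MMAC/s budget is a hypothetical figure for illustration.

def fir_mmacs(sample_rate_hz: float, taps: int) -> float:
    """Millions of multiply-accumulates per second for a direct-form FIR."""
    return sample_rate_hz * taps / 1e6

def fits_on_dsp(sample_rate_hz: float, taps: int, budget_mmacs: float) -> bool:
    """True if the workload fits within the processor's MMAC budget."""
    return fir_mmacs(sample_rate_hz, taps) <= budget_mmacs

# A 64-tap FIR at 1 MHz needs 64 MMAC/s: comfortable on a 500 MMAC/s DSP.
print(fits_on_dsp(1e6, 64, 500))    # True
# The same filter at 20 MHz needs 1280 MMAC/s: a point in favor of an FPGA.
print(fits_on_dsp(20e6, 64, 500))   # False
```

This is only a first-order screen; real budgets depend on memory bandwidth and overheads, but it captures why sample rates above a few MHz push the decision toward an FPGA.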

2. Power consumption

In some high-performance signal processing applications, FPGAs can take advantage of their highly parallel architectures and offer much higher throughput than DSPs. As a result, an FPGA's overall energy consumption may be significantly lower than that of a DSP processor, in spite of the fact that its chip-level power consumption is often higher. Unfortunately, there is a dearth of accurate, apples-to-apples energy consumption data for FPGAs and DSP processors, making it difficult to compare their energy efficiency.
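The throughput-versus-power point can be made concrete with a little arithmetic. A hedged sketch with purely illustrative numbers (neither row reflects a real device):

```python
# Energy per processed sample = chip power / throughput.
# Illustrative, hypothetical numbers: a device drawing more power but
# processing far more samples per second can still use less energy per sample.

def energy_per_sample_nj(power_w: float, throughput_msps: float) -> float:
    """Nanojoules per sample, given chip power (W) and throughput (Msamples/s)."""
    return power_w / (throughput_msps * 1e6) * 1e9

dsp_nj  = energy_per_sample_nj(power_w=0.5, throughput_msps=10)    # 50 nJ/sample
fpga_nj = energy_per_sample_nj(power_w=2.0, throughput_msps=200)   # 10 nJ/sample
print(dsp_nj, fpga_nj)
```

Here the hypothetical FPGA draws 4x the power yet finishes each sample for a fifth of the energy, which is exactly the effect described above.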

3. Form factor and size

When sample rates grow above a few MHz, a DSP has to work very hard to transfer the data without any loss. This is because the processor must use shared resources such as memory buses, or even the processor core, which can be prevented from taking interrupts for some time. An FPGA, on the other hand, dedicates logic to receiving the data, so it can maintain high rates of I/O.

A DSP is optimized for use of external memory, so a large data set can be used in the processing. FPGAs have a limited amount of internal storage, so they need to operate on smaller data sets; however, FPGA modules with external memory can be used to eliminate this restriction.

A DSP is designed to offer simple re-use of its processing units: for example, a multiplier used for calculating an FIR can be re-used by another routine that calculates FFTs. This is much more difficult to achieve in an FPGA, but in general there will be more multipliers available in the FPGA.

If a major context switch is required, the DSP can implement it by branching to a new part of the program. In contrast, an FPGA needs to build dedicated resources for each configuration. If the configurations are small, several can exist in the FPGA at the same time; larger configurations mean the FPGA needs to be reconfigured, a process which can take some time.
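The resource-reuse point can be illustrated in a few lines: an FIR and a DFT bin both reduce to the same multiply-accumulate primitive, which is why a DSP can time-share one multiplier between them, while an FPGA typically dedicates separate hardware to each. A small Python sketch:

```python
# Both an FIR output and a DFT bin reduce to the same multiply-accumulate
# primitive; on a DSP one MAC unit is time-shared between such routines,
# whereas an FPGA would typically dedicate logic to each.
import cmath

def mac(acc, a, b):
    """The shared primitive: accumulate a * b."""
    return acc + a * b

def fir_output(samples, coeffs):
    """One output of a direct-form FIR, built from mac()."""
    acc = 0.0
    for s, c in zip(samples, coeffs):
        acc = mac(acc, s, c)
    return acc

def dft_bin(samples, k):
    """The k-th DFT bin, built from the same mac() primitive."""
    n = len(samples)
    acc = 0 + 0j
    for i, s in enumerate(samples):
        acc = mac(acc, s, cmath.exp(-2j * cmath.pi * k * i / n))
    return acc

print(fir_output([1.0, 2.0, 3.0], [0.5, 0.5, 0.5]))  # 3.0
print(abs(dft_bin([1.0, 1.0, 1.0, 1.0], 0)))         # 4.0
```

The same `mac()` serves both routines, mirroring how a DSP schedules different algorithms onto one multiplier over time.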

4. Design reliability and maintenance

This is one area where we can always debate which approach is more reliable and easier to maintain. Experts say that given two engineers of similar expertise, one working on an FPGA and the other on a DSP, the FPGA-based system would come out better in terms of reliability and maintenance. The reasons lie in the differences between the digital signal processor and FPGA engineering development processes. There is a fundamental challenge in developing complex software for any type of processor: in essence, the digital signal processor is a specialized processing engine that is constantly reconfigured for many different tasks, some DSP-related, others more control- or protocol-oriented. The complexity of each task is more or less equivalent whether the design uses a digital signal processor or an FPGA implementation. Both routes offer the option to use third-party implementations of common signal processing algorithms, interfaces, and protocols, and each offers the ability to reuse existing intellectual property (IP) in future designs. FPGAs, however, offer a more native implementation for most DSP algorithms: each task is allocated its own resources and runs independently, and it intuitively makes more sense to process each step of a continuously streaming signal chain in an assembly-line fashion, with dedicated resources for each step. By comparison with DSP software, FPGA designs tend to be updated much less frequently, and it is generally an unusual event for a manufacturer to issue a field upgrade of an FPGA configuration file.

5. Cost: development time, time to market and risk

This is another potential item of debate. Some are of the opinion that programming FPGAs is difficult, usually requiring a hardware-oriented language such as Verilog or VHDL, and that FPGA solutions can take an order of magnitude longer to code than DSP solutions, which impacts development costs and increases time to market. A DSP can take a standard C program and run it, and this C code can have a high level of branching and decision making (for example, the protocol stacks of communications systems), which is difficult to implement within an FPGA. At the same time, one can argue that most signal processing systems start life as a block diagram of some sort, and actually translating the block diagram to the FPGA may well be simpler than converting it to C code for the DSP. So it seems it all depends on the availability of expertise.
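The kind of branch-heavy control code that runs naturally as C on a DSP, but maps awkwardly onto FPGA fabric, looks something like the protocol framer below. The frame format (a 0x7E start byte, then a length byte, then the payload) is hypothetical, chosen only for illustration:

```python
# Branch-heavy control logic like this protocol framer is natural as
# software on a DSP but awkward to express as FPGA hardware.
# Hypothetical frame format: [0x7E, length, payload bytes...].

def parse_frames(stream):
    """Extract payloads from a byte stream of [0x7E, len, payload...] frames."""
    frames, state, need, buf = [], "IDLE", 0, []
    for b in stream:
        if state == "IDLE":
            if b == 0x7E:            # wait for the start-of-frame marker
                state = "LEN"
        elif state == "LEN":
            need, buf, state = b, [], "DATA"
        elif state == "DATA":
            buf.append(b)
            if len(buf) == need:     # frame complete, emit payload
                frames.append(bytes(buf))
                state = "IDLE"
    return frames

print(parse_frames([0x00, 0x7E, 0x02, 0x41, 0x42, 0x7E, 0x01, 0x43]))
# [b'AB', b'C']
```

Every `if`/`elif` here is a data-dependent branch, cheap in software but requiring explicit state-machine hardware in an FPGA, which is the tradeoff the paragraph above describes.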

Summary

In the end, choosing between an FPGA and a DSP depends on several factors beyond those listed above. In fact, there is no universal recipe for this decision; it is a tradeoff business, and it is the job of the architect to choose the platform that best meets the requirements of a specific system. I have tried to give some insight into choosing the appropriate device for your design, and I hope this helps.

So why isn’t everyone using FPGAs for DSP?

• Lack of experience using these devices for intense computational applications.
• Algorithms developed for microprocessors can be difficult to translate into hardware.
• Immaturity of design tools for FPGA-based DSP design.
• Success of an FPGA DSP design is heavily dependent on the experience of the designer, not only in implementing designs in FPGAs, but also in tailoring algorithms for hardware efficiency.

Saturday, August 13, 2011

Recently the Wi-Fi on my Viewsonic G Tablet, on which TnT Lite 4.2.0 was installed, could not be enabled. It looks just like the typical Wi-Fi errors reported on the web. I thought it might be related to the OS, so I decided to install the latest Android Honeycomb.

Thanks to roebeet at slatedroid.com, I just followed the instructions in

Many initiatives are underway for greater video quality. Japan's NHK is working on the Super Hi-Vision video format, a successor to the familiar high-definition broadcast system, while 4K × 2K TV offers approximately four times the pixel count of Full HD. Fujitsu announced that it has developed H.264-based codec equipment for systems transmitting Super Hi-Vision (SHV) video, with resolution 16 times higher than Japan's current Full Hi-Vision video. As we enter an era of exploding video data, 3D, and other high-resolution imaging, video compression technology is going to become more and more important.

HEVC, also called H.265, aims to achieve approximately four times the compression performance of MPEG-2 and approximately twice that of MPEG-4 AVC/H.264. The following slides show this initiative:
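The compression ratios quoted above imply roughly the following bitrates at comparable quality; the 8 Mbit/s MPEG-2 starting point is a hypothetical example figure:

```python
# Rough bitrate implications of the quoted compression ratios:
# HEVC ~4x MPEG-2 and ~2x H.264/AVC at comparable quality.
# The 8 Mbit/s MPEG-2 reference bitrate is a hypothetical example.

mpeg2_mbps = 8.0
h264_mbps = mpeg2_mbps / 2    # H.264 ~ twice MPEG-2's efficiency
hevc_mbps = mpeg2_mbps / 4    # HEVC targets ~4x MPEG-2, ~2x H.264
print(h264_mbps, hevc_mbps)   # 4.0 2.0
```

Halving the bitrate again relative to H.264 is what makes HEVC attractive for SHV-class resolutions, where raw data volumes are 16x today's Full HD.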

Tuesday, August 9, 2011

ABI Research revealed the total application processor market evolution up to 2016.

Eric Esteve suggests that smartphones will continue to grow over the next five years at a 26% CAGR. This means that in 2016 we can expect just under 900 million smartphones to reach the market. This will generate an application processor market growing at the same rate in units, and a bit less in value due to price erosion. There will be a direct impact on EDA and IP sales, as these SoCs are ever more challenging and complex to design, and the mobile industry's time-to-market demands force ever more IP to be outsourced.
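As a sanity check on the projection, a 26% CAGR sustained for five years multiplies volume by about 3.2x; working backward from the roughly 900 million units forecast for 2016 gives the implied base-year volume:

```python
# Compound growth check on the figures cited in this post:
# a 26% CAGR over five years, against the ~900M-unit 2016 forecast.

growth = 1.26 ** 5                      # five years of 26% growth
implied_base = 900e6 / growth           # implied starting-year volume
print(round(growth, 2))                 # ~3.18x
print(round(implied_base / 1e6))        # ~283 million units
```

The implied base of roughly 283 million units sits close to the 2010 shipment figure cited below, so the forecast is at least internally plausible.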

According to a survey conducted by Eric Esteve, 173 million smartphones shipped in 2009. At the beginning of 2011, we got access to the 2010 figure: 304 million units! 75% year-over-year growth in a market measured in hundreds of millions is something we have never seen before in the electronics industry. See more in

Thursday, August 4, 2011

ARM's move into the broad Tablet and PC space is based on lining up as many partners as possible to attack Intel from multiple angles. It’s a strategy not so different from what Intel employed in the early PC days. However, the strategy is unraveling as Apple and Samsung have reached market share domination without ARM’s merchant partners. The end game is still playing out as partnerships and alliances continue to form. The long term impact on ARM will be slowing revenue and earnings growth...

The final piece of Apple's strategy is to arrange a nice marriage between Qualcomm and Intel for the benefit of just Apple. Apple will demand that Intel run Qualcomm silicon in its fabs to get maximum performance at the lowest power and cost. The Qualcomm silicon will be used for iPhones and tablets in some die-stacking package. Getting CPU and communications silicon cost down is key to Apple's long-term battle with Samsung. ARM wins in the tablet and iPhone wars hanging on Apple's coattails, but it is less of a win than if the merchant players were all in the game.

Intel wins in the tablet and iPhone business through an expanded foundry business for Apple that includes building ARM chips for the iPhone and ARM + x86 combo CPUs for tablets and MacBook Air notebooks. Add to that the Qualcomm foundry business and it is a significant revenue upside. Qualcomm may have the option of increasing its business with Intel's foundry for the purpose of selling chips into Samsung, HTC, Nokia, etc. As a long-time participant in the mobile PC business, I think it is the shifting alliances that will be the most fun to watch over the next two years.