
georgewilliamherbert writes "EETimes is reporting that Dalsa has shipped a record-breaking 111-megapixel CCD image sensor to customer Semiconductor Technology Associates. The chip was paid for by a U.S. Navy SBIR project. At four inches across, a bit big for camera phones, but the 10560x10560 format will probably get professional digital camera users drooling."

Thanks for trying, but you're not really saying anything. A 500 megapixel image printed on a 10 kilometer by 10 kilometer screen and viewed at a distance of 1 meter will be easily distinguishable by the human eye. As another poster pointed out, it's actually not area that matters but angles, or if you use area you should consider the viewing distance also.

If you're doing something for a small print piece you want a high DPI (i.e., 300). If it's a poster you can use a lower DPI. If it's a billboard you can use a significantly lower DPI.
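Those tiers follow from simple geometry; a minimal sketch, assuming the common one-arcminute figure for visual acuity (the specific viewing distances are illustrative assumptions):

```python
import math

def required_dpi(viewing_distance_in, arcmin=1.0):
    """Approximate print DPI beyond which individual dots become
    indistinguishable to an eye resolving `arcmin` arcminutes."""
    dot_size_in = viewing_distance_in * math.tan(math.radians(arcmin / 60))
    return 1.0 / dot_size_in

print(round(required_dpi(11.5)))  # reading distance: ~299 DPI
print(round(required_dpi(72)))    # poster at 6 feet: ~48 DPI
print(round(required_dpi(600)))   # billboard at 50 feet: ~6 DPI
```

Required DPI scales inversely with viewing distance, which is why the 300-DPI rule only applies to things held at arm's length.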

I'm a graphic designer, and the other day I committed the industry's cardinal sin... I had a comp printed at Kinkos. I was printing a fairly large bus shelter poster at 150 DPI. The newb behind the counter had the audacity to bitch about DPI, even though (I would imagine) it was fairly obvious that I do this for a living.

If you're developing something large in Photoshop you do -not- want to play around in 300 DPI. People read those things from a few feet away, and I don't care if you have a new dual-core dual G5: you do -not- want to wait for a 30x40in bitmap to rotate on a multilayered 300 DPI document.

That said, high res photography is important. You may only want to highlight a small piece from a large image, and you can't do that unless you have good source material.

His post refers to the resolution of the eye itself... which may not answer the original question...

"Per unit area." I believe, although he didn't express it quite right, that what he's interested in is how many dots per inch at a given viewing distance on the print before the human eye cannot tell the difference.

He wants to know how much camera is actually overkill when all he wants is a picture of his girlfriend for his desk.

The answer, of course, is "it depends." I haven't seen his girlfriend so I don't know what the appropriate resolution would be.

Yes, I can believe it wasn't designed by someone who knew anything at all about optics. We have a blind spot which the brain fills in. This is because part of the optic nerve is in front of the retina. Repeating someone's ignorant statements isn't exactly the most intelligent thing to do.

Someone who does know about eye design is the ophthalmologist Dr. George Marshall, who said: 'The idea that the eye is wired backward comes from a lack of knowledge of eye function and anatomy.' He explained that the nerves

Well, resolution of the human eye would be detected in angles (degrees/minutes/seconds), not area, but nonetheless I think I can give the approximate answer you're looking for based on some general rules for selecting screen size for high definition. For a 720p screen, you generally want the screen size to be about half as wide as the distance you're sitting from it, such that a 1280x720 image is considered more or less fully resolved about 10' away from a 60" (width, not diagonal as TVs are usually quote
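The half-screen-width rule of thumb can be checked with a little trig; a sketch using the comment's own numbers (1280x720 on a 60"-wide screen at 10 feet):

```python
import math

def arcmin_per_pixel(screen_width_in, horiz_pixels, distance_in):
    """Angle subtended by one pixel at the viewer's eye, in arcminutes."""
    pixel_width = screen_width_in / horiz_pixels
    return math.degrees(math.atan(pixel_width / distance_in)) * 60

# screen half as wide as the viewing distance: 60" wide, 120" away
print(round(arcmin_per_pixel(60, 1280, 120), 2))  # ~1.34 arcmin per pixel
```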

Oh yes, I was aware of that after I did the math, though it is interesting that 1080p is really "plenty" for video. While I calculated the detail we can discern to be about 1.3 arcminutes, I think the actual number is more like .3 (as in about a third of an arcminute, not a little more than one).

The eye has around a hundred million nerve inputs, so the per-frame resolution can't be higher than that. However, the real resolution is actually considerably lower (the signals are essentially downsampled on their way into the brain). A high-speed head-mounted display (sufficiently close to the eyes) with only 2-3 megapixels would probably be sufficient to completely satisfy your eyes.

It's not a noob question, but it does try to liken things that are not alike. Unfortunately, the human eye and cameras are different beasts that tend to frustrate nearly every attempt at comparison. This is in large part due to the fact that when most people say "the human eye" they actually mean the "eye-brain system," which is far more complicated than just the eye, which is itself already complex enough to do plenty of the frustratin'.

In any case, the issue with throwing the brain into the mix is that it does a lot of "post-processing" on the images that stream in from the eye and gives us a mental picture much different from what the eye itself is actually able to pick up. Also, the eye has different kinds of vision: in the center of the field of view, in a very narrow range in fact, we see with acuity. Outside that very narrow range, our brain fills in a lot of the details that we think we see from moment to moment, but they are actually not being "seen" in the same sense as what's in the center of view. (Of course, this comment will inevitably beget the philosophical discussion: what does it mean to "see," exactly?)

If you doubt that your eyes only see with acuity in a fairly tight circle around the direct center of your field of vision, try this experiment: pick up a book, open it to a random page, and fixate your eyes on a word somewhere in the center. Now, see how many words you can read around that word without moving your eyes to look directly at those words. The words you can make out fall in your acute vision field. (You'll find that if you move the book farther away, you can read more words because they fall within the same angle; this works up until the book gets so far away that the overall level of acuity you enjoy isn't high enough to make out any of the words at all.) The rest of your field of view is in your non-central field (as I'm calling it). Your peripheral vision is the part of your field of view for which your brain does not bother filling in any detail; you're only vaguely aware of it in the visual sense, provided it's not moving.

What our non-central vision lacks in acuity it makes up for in motion detection. That's why hunters often say that when you first spot prey in the distance that's fairly well camouflaged with its surroundings as it moves about, don't look directly at it, but look slightly to the side. That way, when it starts moving again you'll see it and you can put it in center vision again, but once it stops, look off to the side again. Stargazers often use this trick as well: if you look directly at a faint star, after a couple of seconds you'll question whether it's actually where it was just a moment before. But if you look slightly off to the side, your eyeball moves around and twitches enough that it creates apparent "motion" of the faint star you're trying to see, and you can pick it up again. (Incidentally, this is the reason why our eyes are in constant motion... if you've ever tried to hold your eyes exactly still you know how difficult it is to keep from twitching them constantly. It's because our brain requires that motion to keep the motion-detecting parts of your eyeballs feeding the detail your visual cortex craves. You'll also find that if you manage to keep your eyes from twitching for an extended period, 10 or 15 seconds, the level of detail in your non-central vision starts to fall off, sometimes even fading to black... this isn't very noticeable until you start twitching again and suddenly see color and detail spring back.)

Anyway, the point is, no matter what one says about the eye in relation to a camera, someone will be bound to argue (and, in some sense, almost certainly be right). It's kind of a useless endeavor to try to get a megapixel rating for the eye, or figure out what its dynamic range is, etc. A more fair comparison would be hooking a camera up to a computer, then periodically having the camera move slightly and snap a shot, then the computer takes it and stitches it into a composite of the entire scene comprised of s

It's interesting that this came up, since last week I was reading an article on the resolving power of the human eye as it pertains to photography and how to choose output resolutions. Short answer: 50 lines per degree of view [digitalsecrets.net]. From there you can do some right-triangle trig to figure out how many line pairs should be perceivable for some output format based on how close you're going to be to it. For an 8x10 image, the author says 2300 pixels in the long dimension or 230 PPI would cut it (I didn't double-check his math). I tend to wonder if you don't have to introduce a factor of two in there somewhere, since to reproduce a "line" of resolution seems like it ought to require two pixels.
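The right-triangle trig the author describes is short enough to sketch, including the factor of two in question (treated here as two pixels per line pair, Nyquist-style; the 13" viewing distance for an 8x10 print is an assumption):

```python
import math

def ppi_needed(distance_in, lp_per_degree=50, px_per_line_pair=1):
    """Pixels per inch needed to render `lp_per_degree` line pairs
    per degree at a given viewing distance. Set px_per_line_pair=2
    to apply the Nyquist-style factor of two."""
    inches_per_degree = distance_in * math.tan(math.radians(1))
    return lp_per_degree * px_per_line_pair / inches_per_degree

print(round(ppi_needed(13)))                      # ~220 PPI, near the author's 230
print(round(ppi_needed(13, px_per_line_pair=2)))  # ~441 PPI with the factor of two
```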

Of course, that's an oversimplification; hence the long answer. The human eye doesn't have a fixed number of "megapixels" that you could easily convert to a measurement of a photo or really even of another camera. First, you have the problem that the eye's "resolution" isn't evenly spread across the field of view: it's concentrated near the center and thins out in the periphery. This is why if you concentrate and try to pay attention to something that's not in the center of your field of view (that you're not looking directly at), it won't be as clear as when you look directly at it. (The exception is in very low light: your indirect vision is better at night.) However, your brain reassembles the image and makes you think that you're seeing one great big full-res panorama, when in reality at any one time you're only seeing a small part in "full rez," with the rest of your field of vision at something less, but with the full version available on demand (by looking at it).

If you could actually do a 'screen grab' of the image your eyes were actually feeding into your brain, at any particular time, I think it would be a lot lower-quality than many people suspect. Almost without question, it would be lower quality than many photographs of the same scene. The depth of field is short, the resolution is concentrated in the center, as is the color, and there's a hole in the dead center of the image because of your optic nerve's placement on your retina. Your sense of sight works as well as it does, in large part, because of all the caching and postprocessing that's done transparently by your brain to the incoming information stream.

Really, when we compare a photo to our "sight," what we're really comparing is the photo to our brain's recollection of how it saw a particular scene, which might be very different from what our eyes actually took in, and further still from the 'objective truth' (if you believe in such a thing, that is) of what actually was there at that moment. The easiest example is color saturation: we tend to see and remember things as being far more colorful than they actually are: an "accurate" photo will therefore look dull compared to memory, so we compensate by oversaturating our photos to make them look more 'realistic.'

It's only possible to make comparisons between our eyes and mechanical cameras, and between our overall sense of sight and recording systems, for very limited cases. Even to answer a relatively simple question like "what's the eye's maximum megapixels?" completely would probably stretch the boundaries of currently understood optometry, neuroscience, and psychology.

The problems that prevent digital sensors from blowing away film are that pixel densities that approach film resolution are too noisy, and digital sensors don't have the ability to handle as wide a range of light intensities as film does.

Well, the noise problem is also based on the size of the sensor, so cameras with larger sensors can usually produce very clean hi-res images. I don't think they are quite as high as film yet, but for most uses I don't think it will make a noticeable difference.
I do agree on the dynamic range problem of digital sensors. The trick for now is to think of it like shooting slide film (which is close to a digital sensor in capturing about 5 stops of light). I am really hoping they get this problem solved before I b

No, the size of the sensor has nothing to do with noise. The grandparent poster was correct: it's the DENSITY of the sensor that affects the amount of noise you get. DSLRs have the advantage, not because their sensors are necessarily larger, but because the pixels aren't packed so tightly together. You could hypothetically use the same processes they use to make those tiny 8MP compact-camera CCDs to make an APS-C sized CCD for a DSLR. You'd have tons of (hypothetical) resolution, but the noise would

Given the same number of pixels, a larger sensor = lower density. If you used the same process to make a sensor like you described, you'd just get more pixels, rather than better pixels. I guess I should not have assumed; I meant that for the same number of pixels a bigger sensor will produce a cleaner image, but I thought that was obvious. Also, if you really want to get technical, it's not even the density, but the size of the individual pixels that makes the difference.

I have used a film scanner to scan all the film I have ever shot in my life, and I now use a digital SLR for all of my photography, so I can tell you a few things that I have observed. First, my film scanner has a scan resolution of 2700 DPI. For a 35mm film frame, that is roughly 51MB for an uncompressed frame at 16 bits per color channel; in terms of megapixels, I believe it's just over 10 megapixels. One thing I noticed is that even my 100-speed film has very observable film grain at this dot pitch. My digital SLR has some distortion when I look at the raw high-res image, but it's not nearly the same. So my conclusion is that even older DSLR CCDs have better grain resolution than traditional film. As a note, I used relatively cheap color film; more expensive, black and white, or slide film may well be better than the DSLRs of today. I once thought of shooting all slide film for better color depth and resolution, but felt it was too much of a PITA to scan it all by hand.
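As a sanity check on those scan numbers (assuming the nominal 36mm x 24mm frame; the actual scanned area and file overhead will shift the result a bit):

```python
MM_PER_IN = 25.4
DPI = 2700

w_px = 36 / MM_PER_IN * DPI  # ~3827 pixels across the frame
h_px = 24 / MM_PER_IN * DPI  # ~2551 pixels down the frame
megapixels = w_px * h_px / 1e6
# 3 channels x 16 bits (2 bytes) per channel, uncompressed
uncompressed_mb = w_px * h_px * 3 * 2 / 2**20

print(round(megapixels, 1))    # ~9.8 MP ("just over 10")
print(round(uncompressed_mb))  # ~56 MB, in the ballpark of the 51MB figure
```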

Next note. There are odd color aberrations with DSLRs that I still see today that do not exist even in the crappiest of color film that I scanned. There's a look that all digitals have that a trained eye can see. I haven't received any shots taken from truly high-end professional DSLRs to see if they have solved this problem, but even D30s have it.

Final comment is regarding color depth, undersaturation, and oversaturation, since they are all related. Film is still by far superior in this regard. DSLRs still undersaturate long before standard color film. Oversaturation is still a problem: look at the full-res pixels of anything shiny; it stands out pretty badly. Skin tones have always been a huge problem, and I have no clue why, since skin tones are typically in the midrange. Color depth and saturation still have a lot of room for improvement with DSLRs.

So I guess all I really needed to say is that I've observed that grain seems to be mostly solved with DSLRs... but none of the other issues have been yet.

Oh yah.. film speed is another big one. When I crank up my DSLR to 1600ISO it really sucks. Much worse than 1600ISO film. Maybe this is where the film grain comment comes from?

Out of curiosity, what DSLR do you have? I have a film SLR and a digital point-and-shoot and am thinking of buying one of the cheap Rebel 350 XTs, as they are now cheap enough to intrigue me (and really as cheap as a high-end digital point-and-shoot). I was going to wait until the full-frame bodies dropped, but I have a feeling it will be 3 years before they drop $1000.

That is interesting that you find the original Rebel better than film, then. I have the (original) Elan 7 film body and an old Canon G3, which I actually gave to my gf (under the excuse of upgrading). I'm not a big fan of the "multiplier effect," but the XT is basically the same price as the old Canon G3 was, and the cheap lens that comes with it seems like it is wide enough for use.

Well, I have the Canon G3, which wasn't really usable past ISO 200, so I am sure it's better than that. :) It's remarkable that the Rebel XT is about what I paid for the Canon G3 back in the day! Of course I really want the Canon 5D, oh to dream. :)

Yeah, at that small a sensor size, it's really hard. I had a G1 before the 300D... with the G1, I would only shoot at ISO 50 unless I absolutely had no other choice. With the 300D I might use 200 or 400 if needed. With the 20D, I use 400 and 800 when needed and 1600 when I'm desperate.

Interesting. Are you talking about the D30 at ISO 1600? If so, I'd really be curious to see a comparison with the current generation of cameras, because as I understand it, the 20D's generation of sensors and newer have greatly improved signal-to-noise ratios.
On saturation, isn't that at least partially an issue with the software/firmware?

I have the original Digital Rebel. You are probably right that it is related to firmware; I have the most recent version. But at the end of the day, when I compare my film scans to my digitals, I get my observations above. So your point is that it's not necessarily the CCD to blame, but possibly software. Point well noted.

As another point of merit, I have found that Canon is by far the leader in this area. My wife's now nearly four-year-old Canon P&S takes better pictures than a lot of brand new P

I think I am more prone to blame the software side for saturation, since I shoot RAW almost exclusively, and thus any under- or oversaturation is really the result of my decisions in the RAW developing.

From my research and asking people who've shot both, Canon is definitely the leader in terms of R&D. I think it's because they are the only company that really makes all of the key components, from the glass to the sensors. Nikon seems to be fairly good with certain images. From what I understand, the Ni

As odd as this may sound, I don't care enough to shoot RAW. I know it's better, but JPEG is good enough for me. I know... I'm not a purist. For the record, I would never steer anyone away from a Nikon digital SLR. They are very good. It is just my opinion that Canon is dominant in nearly all aspects of the photo market. In every review and personal comparison I've ever done, it seems Canon is either nearly tied for top or way ahead. Of course, this is not a good situation for any of us. I hope Niko

Oh yah.. film speed is another big one. When I crank up my DSLR to 1600ISO it really sucks. Much worse than 1600ISO film. Maybe this is where the film grain comment comes from?

This is because of the difference in how high ISO speeds work in digital vs film. High-ISO film is more sensitive to light because the photosensitive grains are larger -- the digital equivalent would be bigger pixel sensors. Digital cameras implement high-ISO mode by increasing the amplification on the pixel sensors, which makes the

Well, sure sounds like that'll BLOW AWAY 35mm film and definitely be about comparable to 4x5 film.

I was actually looking for a funny link, but this guy [kenrockwell.com] makes a great point -- a good scanner and a roll of that 4x5 film [kenrockwell.com] -- yes, four inches by five inches, absolutely huge compared to a 35mm roll -- will get you 100 megapixels of resolution for a couple thousand bucks.

It reminds me of a story I saw (on PBS or Discovery Channel) about modern medicine in developing countries. People will pay extra for a "digital X-Ray", even though the cheap equipment produces a digital image that has far less resolution than a plain old film X-Ray. But it's "digital", so it must be better.

Digital X-Rays involve several orders of magnitude less radiation exposure than film X-Rays. That, and the instant development allowing you to know right away if you need to take another shot, are what make digital X-Rays worthwhile. The resolution is more than adequate for either digital or film X-Rays.

Oh really? Okay, next time you get an X-ray, if you're concerned about your radiation exposure, compensate by:
a. Not worshipping the sun for a few days (okay, so you're a geek and don't remember what the sun looks like, so see b. below)
and
b. (if you're a geek outside of work) stop sitting 6" away from your monitor or television for a few days

That should offset any radiation dose of a typical routine dental X-ray. Now, if you had an X-ray done for more serious issues (injuries, etc.) then I would think tha

The quality of a digital X-ray is as good as the old ones. You won't fail to make any diagnoses because of the changes. The advantages, however, are:
1. Cost - much lower
2. Radiation - much lower
3. Image manipulation - increases diagnostic yield in a variety of ways
4. Transmission - to other specialists, near instantaneously (depending on connection speed - the usual rate-limiting factor is getting someone in front of the receiving screen to interpret the images)
5. Can't lose them (not quite true, but easier to

Digital X-Rays involve several orders of magnitude less radiation exposure than film X-Rays.

Film X-rays do that too, since the inside of the film cartridge is coated with a phosphorescent compound that emits visible light upon X-ray irradiation. Ever wonder why your X-rays are all blue? It ain't 'cause of the X-rays or your bones.

It reminds me of a story I saw (on PBS or Discovery Channel) about modern medicine in developing countries. People will pay extra for a "digital X-Ray", even though the cheap equipment produces a digital image that has far less resolution than a plain old film X-Ray. But it's "digital", so it must be better.

The advantages of a "digital X-ray" are that you don't have all those wonderful film processing chemicals around, the results are near instant, it requires less radiation than traditional film X-rays, and it's more convenient. The hospital near my house is 100% digital. As soon as the image is taken it is uploaded to a server where both the radiologist and doctor can look at it, whether they are at the hospital, at the doctor's office next door, the hospital across town, or halfway around the world if need be.

4x5 film doesn't come in rolls; it comes in sheets that you load into a holder, one to a side. You have to load the film in complete darkness and hope that the holders won't leak. When taking the picture, you focus with a ground glass situated where the film will be, then close the lens, insert the holder into the camera, pull out the dark slide, and take your exposure; you should also be taking lots of notes. Because there is so much manual labor for each exposure, there is a whole different mindset to Large Format photography: you will go out expecting to take a half dozen exposures, while the digital camera encourages the practice of just shooting anything and everything, then sifting through the thousand or so exposures for the good ones.

The owner of a camera shop near where I live once had the opportunity to use a Large Format Polaroid camera [polaroid.com], which exposes Polaroid film that is 20 by 24 inches. He described it this way: "Take your megapixels and shove them up your ass!"

In 10 years cheaper digital cameras will exceed the quality of large format photography.

I would dispute that assumption. Due to the limitations of lenses that exist in the physical world, you cannot simply make the pixels on a CCD arbitrarily small so that you can have more of them. Even if you could conquer the noise problems that go along with the small pixels in the consumer grade 6MP and 8MP sensors, which are much smaller than the sensors that you find in the more expensive DSLRs, you would run into

Actually they look like decent cables for the money.. gold plated connectors with heavy shielding, although 3' is a bit short. They probably used "digital" to attract the attention of people connecting S/PDIF components, but it doesn't look like they're trying to price gouge based on imaginary features. At least not more than the usual Sears markup. Now these guys [thecablepro.com], on the other hand...

Well, sure sounds like that'll BLOW AWAY 35mm film and definitely be about comparable to 4x5 film.

ISO 100 film has a grain size of approximately 5 microns, which corresponds to a resolution of 36MP. Standard 4k scanning (12.5MP) captures all the detail in anything short of the pro-est of the pro, and 8k scanning (54MP) all but guarantees that even future advances in scanner technology won't be able to extract any further detail from a 35mm negative.

You would need godlike optics, bright light, and a perfectly still subject and camera to come anywhere near that 36MP with ISO 100 35mm film, but it represents a sort of upper limit at that speed. 4x5in film therefore has an effective resolution (at something comparable to ISO 100) of 500MP.

So, this can effectively replace 35mm film in terms of resolution. It falls a bit short of replacing truly professional-quality film, however. But then, how often do you need to print out your personal pics at literally billboard size?


Maybe my grinning face is the ONE YOU NEED TO CALL IF YOU'VE BEEN INJURED!

Or at least so I hear...somebody over at Luminous Landscapes ran a comparison [luminous-landscape.com] of a PhaseOne P45 39-megapixel back against drum-scanned 4x5 Velvia 50. These are guys whose standard print size is 30"x40", so fine detail is pretty crucial to them, as is color accuracy. Bottom line? The film had a slight edge, but not enough to offset the huge increase in convenience and versatility of digital. Granted, the P45 alone lists for $32,990 at Calumet, plus another $6-10,000 or more for the camera and lenses, but app

the 10560x10560 format will probably get professional digital camera users drooling.... I imagine the memory card vendors, hard drive vendors, backpack vendors, and chiropractors will be drooling over this as well :-)

I doubt many professional photographers are drooling over this. The market, at least in terms of commercial photography, is about at its limit of need with today's 32+ megapixel cameras. Manufacturers are now pushing the envelope for satellite and other advanced imaging. In most commercial applications, the current state of the art in cameras, combined with transfer and storage requirements, is more than sufficient.

It may be at its limit for the number of megapixels, but there are still a lot of things to improve, like the maximum color range a digital camera can record. With 16 bits per color channel, we would be able to record a lot more information, so we wouldn't be limited as much when we try to capture a high-dynamic-range picture. There are tools like those in Photoshop CS2 to give you high dynamic range, but it would be a lot better to have it directly in the camera.


The CCD cameras [sbig.com] used by astronomers routinely produce 16 bits per pixel.
Most of these are monochrome devices: to shoot a colour picture you must shoot
pictures through red, green and blue filters, then combine them.

The key advantages for astronomy are zero reciprocity failure (film
loses sensitivity in long exposures; CCDs don't), high quantum efficiency (almost all the photons
intercepted by the sensor are noticed)
and excellent linearity (you can digitally subtract extraneous light, like city lights).

However, even in astronomy, there is a hard core who still do film. There
are many reasons: some people just like the look, others enjoy the craft
of wet darkroom work, and so on.

My favourite camera is a 4x5 press camera, a
Crown Graphic [graflex.org].
It takes perfect 1950s newspaper photographer pictures. And I develop and
print them myself.

I may be misunderstanding your point, but we currently DO have the ability to capture 12 bits per channel. Of course, if you are shooting JPEGs then you already limit the bits to 8 per channel. If you shoot RAW the camera stores 12 bits/channel, and if you convert to TIFF it embeds them in 16 bits/channel for a true 36-bit image (inside a 48-bit space).

Going up to 16 would be a nice thing, but as far as I am concerned, 12 is more than enough. Sure, there are situations when I can see posterization or other nasty art
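The difference those extra bits make is easy to quantify: each additional bit doubles the number of tonal levels per channel.

```python
# Discrete levels per channel at the bit depths discussed above
levels = {bits: 2 ** bits for bits in (8, 12, 16)}
print(levels)  # {8: 256, 12: 4096, 16: 65536}

# 12-bit RAW has 16x the levels of 8-bit JPEG per channel,
# and 16-bit capture would add another factor of 16.
print(levels[12] // levels[8], levels[16] // levels[12])  # 16 16
```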

That's not entirely true. I have an uncle who shoots large format and pays $50 a scan for images about this size. The CCD in this article is about the same size as his large-format film, so it's really not out of the question, although it's a highly unlikely and rare need for sure. So in general I completely agree with you, but there are a sick few who would actually use it and be able to justify it for the work they do. I don't expect to see anything remotely close to this in a large format camera any t

You'd be surprised who could take advantage of this single-chip design. Previous 120-megapixel CCD cameras were based on multiple-sensor array systems, and while I don't know where they were used most, I can think of a number of uses, e.g. no-flash, indirect-light photographic archiving of ancient documents, maps and tapestries, as well as advertising photography for use on billboards and other large posters.

True, you don't need 120 megapixels to take a high school prom photo, but there are probably enough prof

Tell that to the Gigapixl project. :) Seriously, professionals will have geeks amongst them, and geeks love anything that's new, sparkly, and does cool stuff. Furthermore, "serious" professionals are subject to market forces, and the market is more likely to buy bigger, better, flashier photographs, even if they can't tell the difference, simply because the adverts look more impressive. Impressions, even when illusions, sell. And photographers know that.

It depends. Do you need a smaller camera that uses less power? The basic physics is that the more pixels you cram into the same space, the worse your signal-to-noise ratio. That's why a 6MP DSLR will produce cleaner images than an 8MP point-and-shoot.

What I meant is, it's probably much cheaper to make 16 small sensors with the same total area and number of megapixels as one huge sensor. If you could get the seams down to just a few pixels wide, it might be a good tradeoff.

If used with a really expensive, high-speed analog-to-digital converter (ADC) capable of digitizing ~2700 million pixels per second, it could reach a good 24 fps, but that's about 40 times what is needed by an HDTV camera CCD (1920x1080 interlaced at 60 fps). Normally this conversion speed is only available in a specialized high-frame-rate setup. Image sensors work by converting light to electric charges; more light in an area makes the pixel hold more charge.
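That readout arithmetic checks out; a quick sketch, assuming "interlaced at 60 fps" means 60 fields, i.e. 30 full frames, per second:

```python
sensor_px = 10560 * 10560         # ~111.5 million pixels
rate_24fps = sensor_px * 24       # pixels/second to read out at 24 fps
hdtv_rate = 1920 * 1080 * 30      # 1080i: 60 fields/s = 30 full frames/s

print(round(rate_24fps / 1e6))        # ~2676 Mpix/s ("~2700 million")
print(round(rate_24fps / hdtv_rate))  # ~43x the HDTV rate ("about 40 times")
```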

the 10560x10560 format will probably get professional digital camera users drooling.

Megapixels are nice, but I would trade high-res for a high-quality lens any day of the week. For example, NASA's Spirit rover took those stunning photos (that we all drooled over) with only a one-megapixel image sensor. [space.com]

The best part about this announcement isn't the 100 megapixel size. Photographers can already buy large format digital backs for view cameras with 300 megapixel resolution (albeit for a hefty price). But they use multiple CCDs and require external power supplies and HDDs. This new chip opens up intriguing possibilities for a self-contained high resolution camera that requires much less power to operate. Still, a CCD of that resolution will generate raw image files of about 350 megabytes each, so portability will necessarily be compromised to a degree by storage requirements.

The link to the SBIR page appears to be defunct because it carried session-specific data. I wasn't about to ask the submitter for his cookie, and I tried finding info about the Dalsa project on the SBIR site without any luck, so here's a press release [dalsa.com] from the company that built it.

It sounds like the interest for the Navy is along the lines of astro-navigation, but I'm not really sure. It's definitely not something general photographers need or even want. It's kind of pointless if your lenses aren't comparably impressive, or if you're not printing it out at a couple of feet in size, displayed where someone can get close enough to appreciate the quality. Plus once you take all that data, you have to store it. I'm not sure how RAW images are stored, but if my math serves, a 24-bit BMP at that size would take about 300 MB per image.
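The storage math checks out, assuming 3 bytes per pixel (24-bit) and ignoring BMP row padding and headers:

```python
# Uncompressed size of one 10560x10560 frame at 24 bits/pixel.
w = h = 10560
bytes_per_px = 3  # 24-bit color
size_bytes = w * h * bytes_per_px
print(f"{size_bytes / 1e6:.0f} MB ({size_bytes / 2**20:.0f} MiB)")  # 335 MB (319 MiB)
```

So "about 300 MB" is in the right ballpark; it's roughly 319 MiB per uncompressed frame.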

Huh. I checked the link a couple of times, across a couple of computers, to make sure it was persistent, but I guess it was session-based after all and expired. My apologies to the readers.
Control Number for the contract is: N043-226-0074
You can SBIR search for that control number at: DOD SBIR/STTR AWARDS - Custom Search [dodsbir.net]

No, not with a Speed Graphic-sized camera. The problem is the optics: the resolving power of a lens is proportional to its diameter divided by the wavelength of light. You would need a lens with a diameter measured in meters on the front of that Speed Graphic camera. This is in fact where these huge CCDs are typically used: on telescopes with optics that are a few meters in diameter. Optics can't really be made faster than about f/2, so the focal length would be measured in meters too. So you would be

Okay, why do they say things like "100 megapixel *limit*" (emphasis mine)? The sound barrier was a limit: a point on the speedometer that seemed impossible to pass, where things changed and a whole different school of thought was necessary to overcome it. A 100-megapixel sensor, while an unholy and awesome creature, is nothing more than the latest and greatest CCD; they broke the 100-megapixel mark. Having said that, bravo for them.

The fine article appears slashdotted, so I don't know if they cover this. The application which leaps to my mind for this detector is astronomy. Astronomers will pay big money for a better detector; I've seen a US$200k chip (2k x 2k pixels) in about 1990, for use in the Sloan Digital Sky Survey camera. Even at these prices, it is cheaper to get the same quality upgrade by improving your detector than by building a bigger telescope.

Astronomers run their CCDs at liquid nitrogen temperatures (to reduce thermal noise), and for UV astronomy they use "thinned" chips (etch/grind away the back of the chip so you can illuminate it from that side - otherwise too many photons are lost before reaching the light sensitive volume.) I'm not sure what other features astronomical CCDs require which might not be present in this chip. Pixel size shouldn't matter too much (except in its effect on noise) as you can design your camera to scale the image to suit the detector.

My brother-in-law is a PhD in Electrical Engineering who works at Dalsa (he probably designed the chip in question, actually). He says they mostly design for satellite imaging, astronomy, and Hollywood. Dalsa won a technical Oscar a few years back.