Not a prediction for what this story is talking about, but here are a few things I'd like to see Apple take on:

* Living room TVs as part of a cable partnership and content deal that would blow away the cable box and its current UI and shift to anywhere/anytime/any-device content delivery.

* Partnership with several automakers to integrate iOS into dashboard features -- navigation, AirPlay integration, automatic syncing of tracks, playlists, and apps to an in-car hard drive, etc.

* Partnerships with home security, appliance makers, HVAC and electrical controllers, etc., to integrate iOS into specific home functions. I could see wall-mounted panels that run iOS apps to control the thermostat, home audio, security, etc. It would be app-driven, so individual vendors could compete on the strength of their own controller apps. Apple would target middle-class buyers rather than the high-end custom market with easy-to-use, wireless, it-just-works technology. Appliance manufacturers would love the technology because it would spark a wave of iOS-compatible purchasing.

All of those are areas where Apple could generate licensing revenue from manufacturers without depending on non-Apple devices to "run" iOS.

In regards to the plenoptic cameras, I'll say this - perhaps that will be the next exciting wave of tech in the field, but for now, a brief perusal of the info suggests it's a ways from practical application in post-production.

Too slow, too expensive. Classic "prototype" tech.

It took some time before we could carry the current still-image capabilities over into a motion-picture process. That's always been the stumbling block. Stills are simply a different game.

Having said that, I'm sure that if this focus magic is truly viable in the big picture, it'll inevitably be coming down the pipeline quickly.

Imagine: a plenoptic DSLR with pro-audio capabilities would be like having your very own cave troll. Nothing would stand in your way!

Quote:

In regards to the plenoptic cameras, I'll say this - perhaps that will be the next exciting wave of tech in the field, but for now, a brief perusal of the info suggests it's a ways from practical application in post-production.

All you really need is for one company to breach the gates and get the community's creative juices flowing. I would not expect one company to write all the software used in digital photography, and frankly we have not seen that; software technologies actually drove hardware sales.

Quote:

Too slow, too expensive. Classic "prototype" tech.

The above drove me to the Kodak comment. Living in Rochester, NY, it was not an uncommon refrain to hear to justify Kodak's arm's-length dealings with digital. It is also a classic example of a company that can't see a year or two into the future.

Quote:

It took some time before we could carry the current still-image capabilities over into a motion-picture process. That's always been the stumbling block. Stills are simply a different game.

Having said that, I'm sure that if this focus magic is truly viable in the big picture, it'll inevitably be coming down the pipeline quickly.

If the tech can follow the path blazed by conventional digital, it will become significant quickly. I believe the one big thing with digital wasn't the cameras or sensors, but rather software. Easy access to the image data, along with an explosion in software to work with that data, really ignited digital photography.

It looks like plenoptic is more involved technology-wise, but even so, if an easy-to-work-with format for the picture data can be made accessible to all, I can see rapid adoption. However, if we see the company controlling access in a way that makes building tools for the RAW data difficult, then we have a problem. Success really depends upon software these days.

Quote:

Imagine: a plenoptic DSLR with pro-audio capabilities would be like having your very own cave troll. Nothing would stand in your way!

Cave troll? iOS device, maybe?

I actually see the concept of the plenoptic camera as being pretty cool. If priced reasonably I may buy one myself. Unfortunately I don't expect the first camera out the door to be outstanding. Well, let's put it this way: not outstanding considering the number of pixels available. In this regard I think Lytro is being a little coy, as it looks like they have a big sensor with few pixels.

Quote:

The above drove me to the Kodak comment. Living in Rochester, NY, it was not an uncommon refrain to hear to justify Kodak's arm's-length dealings with digital. It is also a classic example of a company that can't see a year or two into the future.

Kodak makes me sad. They tried to remain relevant by buying out other companies involved in digital photography. I don't know if they are even designing CCD chips anymore.

I suspect that the other guys are going after a volume market, but even if not, that is one expensive camera.

Hopefully so, though the Lytro website was not encouraging at all in that respect. In fact they seemed to be making up excuses. Mind you, I'm not megapixel-crazed, but even an extremely high-quality camera cannot make up for a lack of pixels.

This thread's the first I heard of the Lytro guys. They seem to be touting a new sensor, which is HUGE bucks to develop. Even if it works as advertised it has a very steep acceptance curve because it doesn't look to be compatible with stuff people are already comfortable with.

Quote:

I was left with a slightly different picture. Frankly, though, the website leaves a lot to be desired. There is nothing of value on the site when it comes to the camera itself. Further, all of the pictures posted are too tiny to be of use. Now, I understand this is generation-one technology, and very bleeding-edge at that, but I've yet to see sound descriptions of what the camera outputs and its current status in R&D.

No way can you get true zoom from a static light-field. That is the standard Hollywood mistake of generating new detail from thin air. You can crop and enlarge all you want, but eventually resolution will get pixellated at medium and larger zoom levels. You can play all kinds of other games though based on what is captured from the scene.

Quote:

If they could end up that small that would be great; it just looks like it is a few years off.

idunno. With the kind of computational photography I'm familiar with you may eventually just place a lens adaptor on the back of an existing lens and onto an existing camera and get the benefits in post. That spells cheap death for wholesale custom hardware.

Quote:

Kodak makes me sad. They tried to remain relevant by buying out other companies involved in digital photography. I don't know if they are even designing CCD chips anymore.

Interestingly, they're doing that right in Rochester. They actually make some very impressive chips, but even here I'm not sure they grasp the market. For example, their interest in the 4/3rds market, which I see as another boondoggle.

Interestingly, the concept of the 4/3rds aspect ratio isn't bad; it's rather that they strived for an extremely small sensor. Here it looks like they underestimated cell-phone camera quality and ignored the needs of pros.

Speaking of buying out, they did or do own one of the Japanese camera companies that was focused on point-and-shoot. The problem there is that the point-and-shoot market is dry as a bone. It appears that their crystal balls were left unpolished when looking toward the future. Really, what we have in Kodak is a series of moves and missteps that they never recovered from.

Part of Kodak's problems may have been a form of industrial incest. For example, Kodak had a relationship with a local university (the University of Rochester) that would train executives and grant said executives MBAs. Now, one can argue the value of an MBA, but any program that ends up being tied too closely to a local company becomes a problem. More so, it appears that the program's goals were more about creating subservient lower-level managers who did as they were told.

How tight was this relationship? Well, a Fuji executive enrolled in the program, but Kodak had so much influence that they actually had the student kicked out. That is an incredible amount of influence to have over a university. While the influence over the university was pathetic, it also demonstrated significant shortcomings in Kodak's view of the world. For example, how do you really derive value from a program if the only people involved in it are Kodak people? You set yourself up from the beginning to control what sorts of opinions, strategies and plans will be discussed. I think a lot of this stems from Kodak's ability, for a very long time, to control the markets, so they made darn sure their executive training was also controlled. In effect they created a system where everybody knew their place, thought alike and took orders well.

The big problem, as we are now all seeing, is that this created an environment where whole businesses within Kodak got broadsided. Film, cameras and a bunch of other businesses got skewered because no one had any long-term vision there. Or, if they did, they were not permitted to execute on their beliefs.

In any event, what does this have to do with Apple? Easy: who will be the new visionaries? Also, who will be the driving force in the business world? Sometimes these are not the same people, but Steve certainly served in both roles. Will Apple make the same sorts of mistakes Kodak has, and get so tied up in their ways that they miss which direction the future moves in? It really is a worrisome issue long term.

Everybody wants this convergence, but we still want to get TV on that wonderful new HDTV + whatever!

Wrong!

As someone in this thread keeps insisting, for that there are already plenty of TVs in da house. Even the poorest of the poorest countries have got plenty of TV sets per household for that type-a-thing. That's old news, my friend.

Just observe what's happening with YouTube and the other Cloud projects. Maybe you'll get a hint of what's coming in the very near future. On YouTube, they're gearing up for paid content, meaning that if you're a teacher you'll be able to put your teaching videos up for paid access. YouTube and the others setting up Cloud operations -- such as Apple -- are going to become the de facto content providers.

Regarding the oh-so-many-leeches TV business, what happens is that the law of least resistance will route around the so-called content providers (FOX etc.) / old-paradigm TV. Apple and others will NOT get into such a mess.

What's the big deal about iTunes providing TV series episodes? They do! Movies? They do! Add to that news -- I've been getting my news, such as the Japan tsunami, via YouTube! More and more TV channels have an app (and website) with up-to-date news clips of whatever's going on. I can get that on my iPhone. I can get that on my iPad. Why not watch them on 55"? Entertainment such as music videos? YouTube has them. You get the idea.

The new paradigm is already here; it's just being fine-tuned at the moment, a model being refined. Steve Jobs is on top of that; he personally wants to influence what the new-paradigm household will be.

As for Apple and Steve Jobs, everything has always been about timing. And the timing is right, now!

Because it's about the ecosystem! Integration. Apple's art.

GAMES - did I mention games?! Check this out, I've mentioned this as an obvious possibility in a previous post on this thread and it's already coming out on iOS 5! All of this is obvious.

Yes, it'll remain possible to add the $99 box to existing TVs, because Apple/Jobs don't want to block people from getting into the Apple ecosystem. But it'll be only natural to make one or two Apple-branded "TV" models that have the Apple TV integrated and maybe some added functionality -- maybe a bit more hardware/software than the $99 box -- that'll make such a device a sort of iOS-driven 55" entertainment iMac remotely controlled by your iPad/iPhone.

My guess is that Steve sees iCloud becoming the next paradigm of video content provision, leaving behind all the inefficient, greedy "conventional" broadcasters. That's what he does: he revolutionizes industries because he sees a few years ahead.

And this time no one will ridicule the Apple innovation - they will be screaming about it right and left.

One more thing: much like the iPhone is not really a phone but a computer that, yes, makes calls, this new line of Macs won't be TVs; they'll only look like them!

Stevie 'invented' the iPhone, and developers came up with games and apps -- a 'build it and they will come' sort of thing. Now work your way up to a 55" device (or 40") that, much like the iPad works with iPhone apps, already has thousands of working apps. It would also have the Apple TV environment, etc. Is 1920x1080 very far from the iPad's resolution? Sooner or later this will happen, and I think that's now. You cannot have iCloud + iTunes + all the other stuff without an Apple logo on a big screen -- it would be a hole in the ecosystem. It's got to be closed, whole; it cannot be a bunch of parts. That's the PC world, where little to no synergy exists, where chaos and viruses proliferate.

Why is it so hard to understand that the Internet + personal computer is taking over / integrating everything? If I were Warner or Fox I would be going crazy setting up the server farm ASAP.

This is inevitable. And I believe articles such as these are not really rumors at this point; they're only being careful not to screw it up for Apple. It's kind of obvious they all already know what's going on.

Furthermore, it's also obvious that the timing of Steve's resignation is a function of this (new line of Mac/TV) announcement that is kind of rumored in this piece -- Steve got out of the way (apparently) so that it seems Cook is the one already innovating and leading through this new, gigantic, industry-disruptive move. What a genius move!

Steve indeed runs a very tight ship. I bet he has the next 10-20 years already planned for. I've never seen a business so intelligently and carefully run! And dare I say, a lot of that merit may be attributed to Cook, I'm sure.

One reason these people are not screaming right and left about what they know is the stock price; they want in on the deal of the century. If this proves to be true, AAPL will rise like crazy, because Cook does not carry a life-threatening disease and he is the model of trustworthiness and reliability. Expect AAPL to double in a very short time.

Quote:

Interestingly, the concept of the 4/3rds aspect ratio isn't bad; it's rather that they strived for an extremely small sensor. Here it looks like they underestimated cell-phone camera quality and ignored the needs of pros.

Four Thirds is the sensor size (1.33" diagonal), not the aspect ratio. And it's not terribly much smaller than the size of a Super 35 film frame. There are pro video cameras that have 1/3" (0.33") sensors, some have 1/2" sensors, and they've proven to be very popular.

Quote:

This thread's the first I heard of the Lytro guys. They seem to be touting a new sensor, which is HUGE bucks to develop. Even if it works as advertised it has a very steep acceptance curve because it doesn't look to be compatible with stuff people are already comfortable with.

No way can you get true zoom from a static light-field. That is the standard Hollywood mistake of generating new detail from thin air. You can crop and enlarge all you want, but eventually resolution will get pixellated at medium and larger zoom levels. You can play all kinds of other games though based on what is captured from the scene.

idunno. With the kind of computational photography I'm familiar with you may eventually just place a lens adaptor on the back of an existing lens and onto an existing camera and get the benefits in post. That spells cheap death for wholesale custom hardware.

"No way can you get true zoom from a static light-field. That is the standard Hollywood mistake of generating new detail from thin air."

Are you saying that you can only get zoom from optical lenses -- separated by space as they are today?

If so, is it possible to approximate zoom using mirrors so that the lenses could be incorporated into an iPhone (parallel to the subject rather than perpendicular) -- where nothing need extend from the lens on the back of the phone?

I think 10x or 12x would satisfy most needs.

Edit: BTW, thanks to all the posters on this thread -- I've learned quite a bit about stuff I never knew existed! In fact I am so pleased, I just did a click-through on a header ad.


Essentially, yes. To get zoom where the detail of an object is enhanced, you need physical optical changes in the light gathering before it hits the light sensor. Once the image is on the light sensor, it is all information theory from there, with the amount of info on the sensor always being constant.

Digital zoom cannot add any new information or detail; it just makes the existing info a little easier to see. Optical zoom captures less detail from the entire scene but a concentrated amount in the zoomed portion -- an equivalent overall amount of information. So there is more raw information (detail) available about the stuff in that zoomed portion of the scene, but no information available outside the zoomed area (to satisfy No Free Lunch).
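To make that concrete, here's a toy sketch in Python (NumPy assumed available; the array sizes are made up for illustration) showing that digital zoom -- crop and enlarge -- never creates new samples:

```python
import numpy as np

# A tiny 8x8 "sensor" readout; each value is one captured sample.
sensor = np.arange(64).reshape(8, 8)

# Digital zoom: crop the central 4x4 region and enlarge it 2x by
# nearest-neighbour repetition. Pixel count goes back up to 8x8,
# but no new scene information is created.
crop = sensor[2:6, 2:6]
zoomed = np.kron(crop, np.ones((2, 2), dtype=int))

print(zoomed.shape)            # (8, 8): same pixel count as the full frame
print(np.unique(zoomed).size)  # 16: still only the 16 cropped samples
```

However many pixels the enlarged image has, its content is bounded by the 16 samples that were cropped -- that's the "no free lunch" point in a nutshell.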

Quote:

Four Thirds is the sensor size (1.33" diagonal), not the aspect ratio. And it's not terribly much smaller than the size of a Super 35 film frame. There are pro video cameras that have 1/3" (0.33") sensors, some have 1/2" sensors, and they've proven to be very popular.

Unless my sources are screwed and my memory has gone bad (extremely possible), 4/3rds does more or less specify the aspect ratio. The diagonal on most of these sensors is about 22 mm, which is far less than 1.33". These sensors are very small and are in fact smaller than APS-C and Foveon sensors. More importantly, when the standard debuted I remember it being sold as an aspect ratio.

Another way to look at this is that the 4/3rds frame is about the same size as the old 110 film frame. That makes it pretty obvious that Kodak has had a hand in this format.

Looking at the 4/3rds system, I see an interesting concept, but the platform is obviously limited by its frame size. Of course, some see that as an advantage. The problem is that small-frame sensors are terrible for creative use of depth of field. Frankly, even 35mm sensors leave a lot to be desired here if you have ever shot with a medium-format camera. Combine that with the ever-increasing quality of cell-phone cameras and I expect 4/3rds to go the way of many of the point-and-shoot platforms. I suppose it is a personal thing, but these days if your cell phone doesn't do it, a move to a much larger format makes lots of sense, mainly because of the extra control you get.

However, I can see a lot more iOS devices coming. Some might even compete with traditional Macs, but they won't be called Macs. I can see Apple undercutting the lowest-cost x86 hardware with very compact, low-power hardware. iOS is just starting to mature into a more robust and useful way to do things. Forgot to mention Apple TV here: add another $25 in parts and that machine would make a nice network access node, a stationary FaceTime machine, or do any number of other useful tasks around the house.

Given that I still think the entire Mac desktop lineup needs to be overhauled. It just doesn't meet enough user needs to be viable much longer. The Mac Pro is the worst example here.

It's because it's not even ideal for pros relative to some other machines out there. They buy it because it runs OS X. If the OS weren't a factor, there are more appealing workstations built today for less money. I still think that line will die. The iMac has survived because people like bigger screens than a laptop will allow.

On the laptop end I just wish they could get them to run a bit cooler. If they did that it would motivate me to buy a new one.

Quote:

Originally Posted by hmm

It's because it's not even ideal for pros relative to some other machines out there. They buy it because it runs OS X.

Well, nothing is perfect. In any event, OS X is the reason most people buy Macs; this surprises whom?

Quote:

If the OS weren't a factor, there are more appealing workstations built today for less money. I still think that line will die. The iMac has survived because people like bigger screens than a laptop will allow.

I don't like the word die! I prefer to think of it as a refactoring.

Quote:

On the laptop end I just wish they could get them to run a bit cooler. If they did that it would motivate me to buy a new one.

This will always be a trade-off. Frankly, it looks like Apple tries to put as much compute power into its laptops as the thermals will allow. In your case you might want to wait for Ivy Bridge, but that does not guarantee cool operation; Apple could very well crank up performance again, bringing back the same thermal levels.

In the case of the Mac Pro, the machine should be competitive on its own merits rather than something you're forced into just to run OS X. There's no reason at its price point that it shouldn't have a nice, functional design with up-to-date hardware.

On the laptop end, their recent designs have been very interesting. On the Windows end, a lot of laptops look pretty similar and are defined primarily by specs. Apple's refinements in terms of case rigidity and weight reduction really are very cool. They're very focused on user experience, and I think the heat thing is a factor there. I hope they do use Ivy Bridge to bring down the temperatures experienced under heavy workloads.

Quote:

In the case of the Mac Pro, the machine should be competitive on its own merits rather than something you're forced into just to run OS X. There's no reason at its price point that it shouldn't have a nice, functional design with up-to-date hardware.

Which it HAS. The case makes it absolutely silent, and all we need is a Sandy Bridge speed bump to tide us over until Ivy Bridge and we'll be right as rain.

Nothing gets really interesting until Skymont. Because after that, we're forced into quantum computing by the laws of physics.

No. Memristors will significantly lengthen the pre-quantum-computing era. We have been building chips with one hand tied behind our backs: there are four fundamental circuit elements, and to date we have only used three of the four, plus transistors. Well, memristors finally make the full scope of theoretical EE available.

Memristors will lower power requirements and raise speed because they are faster than a series of transistors and can natively maintain state with the power off. HP and IBM both made significant production-related advances in the past year -- as in, finding repeatable ways to actually make them in quantity. Because they can often be combined in ways that eliminate transistors while also taking up less real estate than those transistors, you can get denser circuits with lower power requirements for the same functionality.

I would be very surprised if there aren't a good 30-40 years of solid computational speed and throughput advances to go as the new circuit designs are optimized. That's a long time before we have to worry about feature scale and betting the farm on PFM qubits.

The future's going to be so much weirder than we can imagine. None of these functions project in a straight line. The rate of change, and the ubiquity of hardware and software at a decreasing cost per unit of computation, makes every trend exponential in its compounding effects. And that's in computational science alone; the rippling effects in the biological sciences, physics, engineering, and materials science will further transform the landscape of human reality.

Quote:

Four Thirds is the sensor size (1.33" diagonal), not the aspect ratio. And it's not terribly much smaller than the size of a Super 35 film frame. There are pro video cameras that have 1/3" (0.33") sensors, some have 1/2" sensors, and they've proven to be very popular.

Seems to me that "four thirds" may refer to both the aspect ratio and the size of the sensor.

It should be noted that 4/3" is the nominal 'vacuum tube equivalent' size of the sensor, not the actual diagonal measurement. The actual diagonal of the sensor (d) is found from the nominal tube size (n) by the formula d [mm] = 16 [mm/inch] * n [inch] (where the factor of 16 mm/inch has some archaic origin). This works out to approximately 21.3 mm for 4/3rds, which is pretty close to the specified 21.6 mm.
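The conversion is easy to check numerically; a quick Python sketch, using the 16 mm/inch factor described above (the function name is just for illustration):

```python
# Nominal "vacuum tube equivalent" size to approximate sensor diagonal:
#   d [mm] = 16 [mm/inch] * n [inch]
def sensor_diagonal_mm(nominal_inches: float) -> float:
    """Approximate diagonal in mm from the nominal tube size in inches."""
    return 16.0 * nominal_inches

print(round(sensor_diagonal_mm(4 / 3), 1))  # 21.3 -- close to the specified 21.6 mm
print(round(sensor_diagonal_mm(1 / 3), 1))  # 5.3 -- a 1/3" video-camera sensor
```

The same formula gives roughly 8 mm for the 1/2" sensors mentioned earlier, which shows just how much smaller those pro-video chips are than a 4/3rds frame.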

At which point Apple will own our music, files, browsing history, location data, and various other scary tidbits. Not to mention, Apple was recently the highest-valued company on Earth. Yes, Earth. Apple is growing to a daunting size and sitting on more cash than the U.S. government. This is all coming from a 20-year Apple user and shareholder who has benefited greatly from their growth. I love them and appreciate what they have done, but they're starting to frighten me a bit.

I'm not interested in iSkyNet. The day Apple goes 100% iCloud is the day I go 100% Linux.

Hear! Hear!

I have been entertaining that idea since Apple introduced the walled garden in earnest, and the moment for the switch is getting closer, it seems.
(Looked at Linux UIs lately? They have improved vastly!)

So it kinda makes you guys look silly with all this hand-wringing.

Quote:

Originally Posted by Chuck O. Jones

Hear! Hear!

I have been entertaining that idea since Apple introduced the walled garden in earnest, and the moment for the switch is getting closer, it seems.

What really has you so worried? Frankly, technology changes too fast to even remotely believe all the guesses about the future. Besides, you have to balance positives and negatives here.

For example, some people call the iPhone a walled garden. It is in some ways, but not excessively so, as there is a huge stockpile of apps for the device. In the end you have to balance the positives against the negatives. Frankly, my iPhone's reliability as a phone is paramount, thus the limitations are welcome.

On Mac OS X there is a different set of "rules" as to what is acceptable on the platform. Apple's intentions here are not written in stone and thus are subject to fear and speculation. However, things like the App Store can be likened to Linux package managers -- package managers done right from the user's perspective. So far I like what I see.

Quote:

(Looked at Linux UIs lately? They have improved vastly!)

Listen, I understand Linux really well, having run it as a primary OS since Fedora/Red Hat 5, including various distributions on both desktop and laptop hardware. I really like Linux, but on the other hand I have grown to really hate the organization responsible for the GPL. Still, since I bought my first Mac in 2008 I've not looked back. For general user needs a Mac simply beats every other platform out there. Throw in technical usage, such as software development, and it gets even better.

Now, from the user's perspective, iCloud really has some impressive potential. If it works as described and matures well, I can see the service becoming a huge sales driver for Apple. In a way your documents become virtual, not stuck to one device. I don't want to declare victory yet, but I think you have to look at this from the perspective of a user. Many will trip over themselves to get the features promised, and likely demand even more afterward.