
Vidix and r500 and the future of computing in general

Not to toss alcohol on the ATI fire here, but has anybody been able to get VIDIX to initialize an ATI R500+ chipset?

I thought I found a solution but my enthusiasm was faster than reality...

On a completely different note:

When will we see a fully autonomous video device that can operate independently of the mommy, or even the family, CPU?

Of course that means it needs its own proprietary video core, but it makes no sense that more than one processor has its hands in the video bank... That's how accounts get overdrawn and compositing errors are born...

Answering my own question:

When we have an OPEN STANDARD, PUBLIC PROPERTY, fully autonomous, command-driven serial bus transaction processor capable of delivering up to 100 Gbit/s (on the order of 10 gigabytes per second), firmly attached to a PUBLIC STANDARD (short for both terms above) system board design. Then all devices must be "autonomous" to play ball; otherwise they just sit there feeling stupid. Then the hardware mfg only has to support what they build, making the entire system "OS"-independent... and the future of computing is borne on the winds of change and laughter and life, here and ever after...
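For scale, and assuming the figure above was meant as 100 gigabits per second of raw link rate, the two numbers in the post can be reconciled with simple arithmetic. The 8b/10b encoding overhead used below is just an illustrative assumption (it is the scheme older serial links such as early PCIe generations used), not something the post specifies:

```python
# Hypothetical bandwidth arithmetic for the proposed bus; the 100 Gbit/s
# figure comes from the post above, the encoding scheme is an assumption.
GIGABIT = 1_000_000_000  # decimal gigabit, as conventionally used for link rates

link_rate_bits = 100 * GIGABIT        # 100 Gbit/s on the wire
link_rate_bytes = link_rate_bits / 8  # 12.5 GB/s raw

# Serial links lose some raw rate to line encoding; with 8b/10b,
# only 8 of every 10 transmitted bits are payload:
payload_bytes = link_rate_bits * 8 / 10 / 8  # 10.0 GB/s usable

print(f"{link_rate_bytes / 1e9:.1f} GB/s raw")
print(f"{payload_bytes / 1e9:.1f} GB/s after 8b/10b encoding")
```

So "100 gigabits" and "10 gigabytes per second" are consistent once encoding overhead is taken into account.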

There have been products like that for video input (tuner via USB etc) -- I think the reason you don't see similar products on the output side is that it doesn't make much sense to separate video output from 3d output since they both go to the same monitor.

The other, more practical problem is that the chip design is often largely finished while the video standards are still in flux, so the only way to have *all* the video processing on chip *and* maintain the required flexibility is to give the chip a big general-purpose processor, and that raises the cost of the whole system.

We are obviously heading in that direction with Fusion, but that will eliminate the need for a separate CPU at the same time.


Quote:
There have been products like that for video input (tuner via USB etc) -- I think the reason you don't see similar products on the output side is that it doesn't make much sense to separate video output from 3d output since they both go to the same monitor.

I'm not following you here...

I wasn't suggesting separating 3d from video, only that we separate the large-scale functions (main processor, video, storage, auxiliary, etc.) and make them independent of OS drivers by giving each device an autonomous proprietary core processor, so that the device is completely self-contained. Applications would then use main-processor code to request services directly from those major subsystems over a multiplexed serial bus.
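As a thought experiment, the service-request model described above might be sketched like this. Everything here — the frame layout, channel numbers, command names, and device classes — is invented for illustration; no such bus standard exists:

```python
from dataclasses import dataclass

# Hypothetical frame for a command-driven, multiplexed serial bus.
@dataclass
class BusFrame:
    channel: int   # multiplexing: which attached device this frame addresses
    command: str   # command-driven: the service being requested
    payload: bytes # arguments or data for the command

class AutonomousDevice:
    """A device that interprets its own commands; no OS-side driver."""
    def handle(self, frame: BusFrame) -> bytes:
        raise NotImplementedError

class VideoDevice(AutonomousDevice):
    def handle(self, frame: BusFrame) -> bytes:
        if frame.command == "present":
            return b"frame presented"
        return b"unknown command"

class StorageDevice(AutonomousDevice):
    def handle(self, frame: BusFrame) -> bytes:
        if frame.command == "read":
            return b"sector data"
        return b"unknown command"

class SerialBus:
    """Routes frames to devices by channel number. The only thing the
    host side needs to know is the public frame format, not how any
    device implements its commands."""
    def __init__(self):
        self.devices: dict[int, AutonomousDevice] = {}

    def attach(self, channel: int, device: AutonomousDevice) -> None:
        self.devices[channel] = device

    def send(self, frame: BusFrame) -> bytes:
        return self.devices[frame.channel].handle(frame)

bus = SerialBus()
bus.attach(1, VideoDevice())
bus.attach(2, StorageDevice())
print(bus.send(BusFrame(1, "present", b"")))  # b'frame presented'
print(bus.send(BusFrame(2, "read", b"")))     # b'sector data'
```

The point of the sketch is that the OS only speaks the frame format; swapping a device for a competitor's implementation changes nothing on the host side, which is exactly the kind of hardware/OS decoupling argued for here.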

As for adding expense: yes, it would. But there is also an offsetting drop in per-unit cost when a device has a longer life cycle, even more so when gross profit matters less than overall function and the future of society as a whole matters more. At that point the discussion digresses into business philosophy rather than the technical aspects of design.

We, as a society, aren't there yet in terms of social cooperation, but eventually we will be, and it does no harm to examine future strategies, some of which can be implemented in whole or in part now.

From my point of view we really need to break the crippling dependence between a manufacturer's hardware and a specific OS's drivers and software, and the linkage at the moment is the bus. A next-generation, command-driven, multiplexed serial system bus with an analog-derived clock (analog waveform, digital pass-filter) would go a long way toward such independence.

But it would require cooperation and public support for standards, something the leader in the software industry has shown itself less than willing to take part in, having been sued in nearly every major jurisdiction on the planet for "unfair business" practices.