
The latest data from Top 500, a website that tracks the world’s most powerful supercomputers, has pretty much confirmed this with the release of their November 2015 list.

The world’s most powerful supercomputer, the Tianhe-2 – a Chinese supercomputer, though built on American technology – has now held its place for 2.5 years in a row. The US Cray XK7 supercomputer, built three years ago, maintains its second place today. Relative to June 2013, there has not even been a doubling in aggregate performance, whereas according to the historical trendlines, doublings have typically taken just a bit over a single year. This is unprecedented, since Moore’s Law applies (applied?) to supercomputers just as much as it did to standard electronics.

Apart from serving as a convenient bellwether for general trends, futurists are well advised to follow supercomputers for two reasons.

Technological Projections

The first is their obvious application to the development of radical technological breakthroughs, from the extraordinarily complex protein folding simulations vital to uncovering medical breakthroughs to the granddaddy of them all, computer superintelligence. The general “techno-optimist” consensus has long been that Moore’s Law will continue to hold, or even strengthen further, because the Kurzweilian view was that the exponent itself was also (slowly) increasing. This would bring us an exaflop machine by 2018 and the capability to do full human brain neural simulations soon afterwards, by the early 2020s.

But on post-2012 trends, exponentially extrapolated, we will actually be lucky just to hit one exaflop in terms of the aggregate of the world’s top 500 supercomputers by 2018. Now the predictions of the first exaflop supercomputer have moved out to 2023. Though perhaps not much in conventional life, a “delay” of 5 years is a huge deal so far as projections built on big exponents are concerned. For instance, assuming the trend isn’t reversed, the first supercomputer theoretically capable of full neural simulations moves out closer to 2030.
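The arithmetic behind these shifting milestone dates is easy to sketch. Below is a small Python illustration; the doubling times (roughly 1.1 years for the historical trend, ~2.5 years for the post-2012 slowdown) and the use of Tianhe-2’s Rmax as the starting point are illustrative assumptions, not fitted Top500 numbers:

```python
import math

def years_to_reach(target_flops, current_flops, doubling_years):
    """Years until an exponential trend starting at current_flops hits target_flops."""
    return doubling_years * math.log2(target_flops / current_flops)

EXAFLOP = 1e18
TIANHE2 = 33.86e15  # Tianhe-2 Rmax, ~34 petaflops

# Historical trend: a doubling roughly every 1.1 years (assumed)
fast = years_to_reach(EXAFLOP, TIANHE2, 1.1)
# Post-2012 trend: doublings now taking ~2.5 years (assumed)
slow = years_to_reach(EXAFLOP, TIANHE2, 2.5)
print(f"exaflop in ~{fast:.1f} years on the old trend, ~{slow:.1f} on the slowed one")
```

Under these assumed rates, the same ~30x gap to an exaflop takes roughly five years on the old trend but over a decade on the slowed one, which is the mechanism behind the 2018-to-2023 slippage described above.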

In terms of developing superintelligence, raw computing power has always been viewed as the weakest limit, and that remains a very reasonable view. However, the fact that even in this sphere there appear to be substantial unforeseen obstacles means a lot of trouble for the traditional placement of superintelligence and even the technological singularity at around 2045 or 2050 (not to even mention the 2020s as per Vernor Vinge).

National Power

Supercomputers can also be viewed as an instrument of national power. Indeed, some of the most powerful supercomputers have been used for nuclear testing (in lieu of real life). Other supercomputers are dedicated to modeling the global climate. Doing it better than your competitors can enable you to make better investments, even predict uprisings and civil wars, etc. All very useful from a geopolitical perspective. And of course they are very useful for a range of purely scientific and technological applications.

As in so many spheres in the international arena, the overwhelming story here is of the Rise of China.

From having 0-1 supercomputers in the Top 500 during the 1990s and a couple dozen in the 2000s, China surged past a waning Japan in the early 2010s and now accounts for 109 of the world’s top supercomputers, second only to the USA with its 199. This just confirms (if any such confirmation is still needed) that the story of China as nothing more than a low-wage workshop is laughably wrong. An economy like that would not need 20%+ of the world’s top supercomputers.

COUNTRIES         COUNT   SYSTEM SHARE (%)   RMAX (GFLOPS)   RPEAK (GFLOPS)   CORES
United States     199     39.8               172,582,178     246,058,722      10,733,270
China             109     21.8                88,711,111     189,895,013       9,046,772
Japan              37      7.4                38,438,914      49,400,668       3,487,404
Germany            32      6.4                29,663,941      37,844,201       1,476,524
United Kingdom     18      3.6                11,601,324      14,230,096         724,184
France             18      3.6                12,252,180      14,699,173         766,540
India              11      2.2                 4,933,698       6,662,387         236,692
Korea, South       10      2.0                 7,186,952       9,689,205         283,568
Russia              7      1.4                 4,736,512       6,951,848         208,844
Brazil              6      1.2                 2,012,268       2,722,150         119,280

Otherwise the rankings are approximately as one might expect, with the Big 4 middle-sized developed Powers (Japan, Germany, UK, France) performing modestly well relative to the size of their populations and the rest – including the non-China BRICS – being almost minnows in comparison.

Well, there are other things to consider when talking about computing performance.

For example, concurrency is more important than in the past, but it’s up to software creators to take advantage of that. That can bring up computing performance considerably if done correctly.

The second thing is the promise of quantum computers (which also requires knowledge of parallel programming). These will increase computing performance far beyond what Moore predicted, but it will require more work from the software side.

Software makers have more tools than ever to increase the performance of their products, and hardware makers are relying on those tools to be used.
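The caveat that concurrency only pays off when software exploits it is essentially Amdahl’s law: the serial fraction of a program caps the achievable speedup no matter how many cores are thrown at it. A minimal sketch, with made-up serial fractions purely for illustration:

```python
# Amdahl's law: the speedup from n cores is bounded by the serial fraction s,
# since only the parallelizable portion (1 - s) benefits from extra cores.
def amdahl_speedup(serial_fraction, n_cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# Hypothetical serial fractions, chosen only to show the ceiling effect.
for s in (0.5, 0.1, 0.01):
    print(f"serial={s:.0%}: 16 cores -> {amdahl_speedup(s, 16):.1f}x, "
          f"1024 cores -> {amdahl_speedup(s, 1024):.1f}x")
```

Even a 1% serial fraction caps the speedup below 100x regardless of core count, which is why supercomputer performance depends so heavily on the software side.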

Quantum computers (which don't exist and may never exist, even though they seem theoretically possible) have nothing specifically to do with parallel programming and are not applicable to all, or even to a large subset of problems.
[These will increase computing performance far beyond what Moore predicted] makes no sense at all, since Moore (or rather the people who turned him into a law) predicted a rate of increase of, not a level of, performance.
This is an excellent and overdue post.

Interesting that you should bring up parallel computation, Nikita. Back in the 90s when I was a college student, I worked as an assistant at a research center for parallel computation, and even then, all the researchers there were saying that Moore's law wouldn't last forever and that parallel computing was the future.


consider the F-35 Shitstack. A fighter-bomber so sophisticated (30,000,000 lines of code) that it can (in theory) virtually fly itself. But each time one glitch is ironed out, another appears. Lately the F-35 has taken to ejecting its human pilot at highly inopportune moments

In software, more is not always better. I do not know that much about the F-35, but from what I have read, its main problem is that it’s driven by different factions in the US military that all want to add their own features to it. This has also been going on for a long time now, so the software is probably a reflection of these conflicting needs, and probably as messy and clunky as the huge bureaucracy behind this project.


I scooped Tyler Cowen on the general subject of “lol, singularity” in Takimag back in 2009 (link above). Of course technological progress ain’t what it used to be. Western Civilization is stagnating in every way: no noteworthy works of art, no innovations in technology, no breakthroughs in science which increase human power over, or even understanding of, nature. Device physics, essentially a small subset of optics, is the only measurable form of progress. Soft X-ray lithography is a sort of hard physical limit on this, and we’re already there.

Despite the fact that I crunch numbers for a living, I haven’t needed a new computer in 5-6 years. The new ones are not significantly faster than the old ones. There are a few technological punts which might change this; most of them involve changing memory architectures so data access will be faster. Frankly, some parts of computer science are actually regressing. You can develop “innovative” technologies by looking at how people wrote code in the 1960s and reading Knuth.

I'm not sure if it's as great as Beethoven or Bach, but Arvo Pärt's music seems to me to be at least on Josquin des Prez's level. Or higher, considering that Josquin didn't have the oeuvre of half a millennium before him, which is I think quite difficult, because once you have a conventional way of composing music, it limits your imagination and either you'll create unlistenable shit out of spite just to break the rules (which many composers did from Schönberg on), or keep within the tradition (which some great composers like Shostakovich did), but then you weren't as original as Arvo Pärt arguably is.

But it might be just because I got really excited by a modern composer whose music isn't unlistenable shit. (I'm aware that maybe my lack of musical education or my lack of knowledge of other modern composers makes my opinion less than valuable for many.)

Some additional comments:
1. Power efficiency is very important. Some supercomputers (including #1 on the list) aren’t being used at full capacity because they’re so expensive to run.

>The Tianhe-2, which reaches 33.86 petaflops per second, was ranked No. 1 in the Top500 List for the third time in the row in late June. But some researchers say the system is not being used to its full potential because of its expense. Researchers have to pay the electricity costs to use the supercomputer, which can run between 400,000 to 600,000 yuan a day (between about $64,000 to about $96,500 a day), the story said.
>Some critics also say the developers of the Tianhe-2 focused too much on hardware, and not enough on software, forcing researchers to write their own code to use the system, the article said. Some researchers would have to spend years or a decade to write the necessary software, said Chi Xuebin, deputy director of the Chinese Academy of Science’s Computer Network and Information Centre.
>“It is at the world’s frontier in terms of calculation capacity, but the function of the supercomputer is still way behind the ones in the U.S. and Japan,” Chi told the newspaper. “It’s like a giant with a super body but without the software to support its thinking soul.”
>Tianhe-2, developed by the National University of Defense Technology and housed at Sun Yat-sen University in Guangdong, has served 120 clients at 34 percent of its capacity, the story said. Researchers have deployed the supercomputer for railway design, earthquake simulation, astrophysics and genetic studies.

(It looks to me like China wanted a supercomputer at #1 on the list, and threw enough hardware at the problem to get there, but this created some problems and is probably not representative of the state of supercomputers as a whole. I suggest treating Tianhe-2 as an outlier.)

2. Kurzweil loves to talk about “S-curves”, which are somewhat unfalsifiable, but I do think there are some promising techs out there. If we figure out more general forms of parallelization, “true” 3d-chip design (not just FinFET-style 2d layering), neuromorphic advances, etc, things could keep going at a good clip even if our miniaturization processes stall.
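The electricity figures quoted in point 1 can be sanity-checked against the currency conversion; a quick sketch, assuming the late-2015 exchange rate of roughly 6.2 yuan per US dollar (the rate here is an assumption, not from the quoted article):

```python
# Convert the quoted Tianhe-2 daily electricity cost from yuan to dollars.
CNY_PER_USD = 6.2  # assumed late-2015 exchange rate
low_cny, high_cny = 400_000, 600_000  # yuan per day, as quoted

print(f"${low_cny / CNY_PER_USD:,.0f} to ${high_cny / CNY_PER_USD:,.0f} per day")
```

The result lands close to the “$64,000 to about $96,500” range in the quote, so the two sets of figures are consistent.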




I think a great deal of what might have formerly been considered as great art continues to be produced in the modern world.

This Korean prodigy is making ultra-realistic paintings: http://www.dailymail.co.uk/news/article-2384759/Artist-Joongwon-Charles-Jeongs-hyper-real-paintings-look-like-photos.html

Even much ballyhooed products such as video games have music that might have been considered great had it been written a few hundred years ago: https://www.youtube.com/watch?v=Ns7fNPiNiNc

So why do we not tend to notice it? Simple: because the Mona Lisa and Mozart's Symphony No. 40 were original, towering colossi on a flat landscape. Now they are almost a dime a dozen and the land is strewn with towers (and skyscrapers). Further progress is hard because we may have largely reached the limits of what humanity is capable of in principle - at least so far as traditional media are concerned, anyway.



I disagree, in that great music is not simply music that's pleasant to listen to, and great painting is not something that looks just like a photograph. Myron's Discobolus is perhaps among the greatest sculptures of all time, but a real discus thrower never actually looks like that. Leonardo's Last Supper doesn't look at all the way the scene would actually have looked, if for nothing else than because they are all sitting on just one side of the table, yet it's a great painting.

There's depth to art that is different from the photographic quality of paintings etc.

Have you heard about Mansudae, the North Korean art studio? Due to isolation it has been preserved from the debacles of modern art, like other communist states before it. Their overseas project division has created many monuments in African nations, like the African Renaissance Monument. Their one project in the West is the restoration/recreation of the Fairytale Fountain in Frankfurt.

So imposing Diversity in every field is finally affecting computers too. It’s been decades since we had major breakthroughs in fields like medicine.

So, literally Diversity is DIEversity. The technology that could have saved your life or the life of a loved one, has been delayed. And if the critical mass of White people is disrupted, may never show up.

But hey, nobody can call you a rayyyccciss! Not sure what good that does you in the grave.

Typical of whites to try and grub after every morsel of success or accomplishment and to hate on what others have done.

The breakthroughs in medicine that you are trying to claim for white people were just the low-hanging fruit. It required intelligence, not genius. And if whites did not discover those breakthroughs, some other society would have. Probably without going to war with the entire planet or enslaving it in debt like the white people.

This sour grapes and narcissistic outlook is characteristic of a society that is declining or dying out.

Technology is not going to save your life, or anyone's life. Technology is going to destroy life - all life. Technology is a death cult whose enthusiasts and proponents envision basically turning the Earth into a giant computer:

http://www.amazon.com/Age-Em-Work-Robots-Earth/dp/0198754620/

To the extent that technology is delayed or may never show up, that's a good thing.

Doesn't really explain why Europe hasn't picked up the slack though. Compared to the US, European higher education and the European computer industry have been very homogeneous. If it was just about diversity, you'd think that Europe would have zoomed ahead.


As for that Korean fellow, if that is what you’re into then you really owe it to yourself to check out Chuck Close.

So you think all USG's best capabilities must be super-secret, just because. Does this apply only to computing? Has it always been the case? If not, when did it start? And does the same thing hold for other governments? You are not succeeding in making your statement appear rational.

If in 1970, 1980 or 1990 the Pentagon had better chips than the general public, we would know it by now. Over decades stuff leaks out. The existence of the Enigma Project was eventually leaked. Look how many books there have been about the Manhattan Project. And for how long.

Since no info about a past military/civilian chip gap has come to light, it's very unlikely that it existed in past decades. Which makes it unlikely that it exists today.

You seem to have the mentality of a small child.
“My daddy is so strong I won’t even tell you how strong he is!”

No, I'm saying the US DoD, NSA and D of Energy do NOT have the mentality of a small child and therefore they have no need to say - "oh yeah well we have much stronger computers and here they are!!".

Why would they need to expose their elite computing power other than to show off or get involved in a nationalistic rivalry? Why?

There is no need or law or strategic advantage in disclosing this information to anyone.




A one-quarter-ton communication satellite is now outperforming the previously used 175,000 tons of transatlantic copper cables, with this 700,000-fold reduction in system-equipment weight providing greater message-carrying capacity and transmission fidelity, as well as using vastly fewer kilowatts of operational energy.

The computers are in space, so the law applies. Invest in space, where your money will go further. US is investing in space via Russia due to American corruption and drive to create more debt without any increase in value. The middle east is an investment in a toilet without water. Fill your tank with gas and have supertoilet. War created a 700,000-fold decrease in system and trillions in debt. We have supertoilet paper to wipe rears. Winter evictions and shooting are up, which means more agency jobs. With arsons we’ll also have a case load for investigations. Not many investors left. Law of the land of bureaucracy applies still. It’s slower, worse and more expensive and the plumbing is bad. That creates inspection jobs at city hall!

Don’t send any more data crooks to Russia, we have plenty! Send to Germany to watch Syrians.


on-target. Promethean, White Western civilization is dying, to be replaced by a combination of imitative Judeo-Asiatic stasis, Islamic ringworm, and Black chaos


If in 1970, 1980 or 1990 the Pentagon had better chips than the general public, we would know it by now. Over decades stuff leaks out. The existence of the Enigma Project was eventually leaked. Look how many books there have been about the Manhattan Project. And for how long.

Since no info about a past military/civilian chip gap has come to light, it’s very unlikely that it existed in past decades. Which makes it unlikely that it exists today.

I worked on a military project that used the first 386 chip back in the 1980s. At the time the commercial 386 ran at 16 megahertz while the milspec 386 ran at 12.5 megahertz. Over time, with government money, the radiation-hardening program and the VHSIC program closed the gap between milspec parts and commercial parts. However, I don't remember, in all the projects I worked on, military processors significantly outperforming commercial processors; in fact it tended to be just the opposite, mainly because of the temperature requirements of milspec parts.

I also was contacted many times about some project in Washington that I was never interested in. They were only interested in me because I worked at Burroughs and put on my resume that I had some experience with Algol. This was an old language popular in Europe but not much in the US. It was taught in college programming classes but seldom used. Burroughs used it extensively back then. The computers were just plain old Burroughs mainframes, likely a 7700 series, and probably lots of them. It seemed like it might have been some kind of secret program, since I never got a good sense of what it was on just a phone call, so I never went for an interview. I didn't think much of the idea of leaving Southern California for a purely data-crunching job. Most of my programming has been what used to be called real time but is now called embedded systems. I like to see that the software is doing something.

If militaries and intelligence agencies use supercomputers at all, it’s simple common sense that their characteristics (if not existence) will be classified. Why give your adversaries a clue about your modeling capabilities?

If you look at today's supercomputers, they are not like those that Seymour Cray designed. They are all massively parallel machines that probably anybody who wants to spend the money can build. Did you see the column "number of cores"? Many are built with hundreds or thousands of Pentium-type cores, all able to share the same memory. I doubt there are any secrets we have that China wouldn't. The real issue is whether you can break a problem up so that you can use a parallel processor. There are bottlenecks where everything prior to a point has to be calculated before you can go to the next step. No matter how many cores you have, they will all sit idle until the slowest thread finishes.
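The serial-bottleneck point here is essentially Amdahl's law; a minimal sketch (the 5% serial fraction is illustrative, not a measured figure for any real machine):

```python
# Amdahl's law: the speedup from n cores is capped by the fraction s
# of the work that must run serially (everything "prior to a point"
# that the other cores have to wait for).
def amdahl_speedup(n_cores: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# With even a 5% serial section, piling on cores saturates near 20x:
for n in (100, 1_000, 1_000_000):
    print(n, round(amdahl_speedup(n, 0.05), 1))
```

This is why "number of cores" alone says little about real throughput: the limit 1/s holds no matter how many cores you buy.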

Typical of whites to try and grub after every morsel of success or accomplishment and to hate on what others have done.

The breakthroughs in medicine that you are trying to claim for white people were just low-hanging fruit. They required intelligence, not genius. And if whites had not discovered those breakthroughs, some other society would have. Probably without going to war with the entire planet or enslaving it in debt like the white people.

This sour grapes and narcissistic outlook is characteristic of a society that is declining or dying out.

Don't worry, other societies will go on just fine without you.

Like it or not, Whites (Greeks) gave us science. Most societies and even animals have technology, but science relied on a certain evolution in man's brain. That event reached a critical mass in ancient Greece.

If the principles of modern medicine were low-hanging fruit, then why weren't they picked sooner? Modern medicine only really goes back a hundred and fifty years or so, to Pasteur; before then the West was probably as clueless as anywhere else. Having a microscope must have helped. Jenner's discoveries also. But it strikes me that the germ theory of disease was only achieved after a long and difficult struggle.

And if whites did not discover those breakthroughs some other society would have

Ah so! I love these Asian Supremacists who assure us the likes of Newton, Gauss, Von Braun, Halley, Pasteur, Roentgen, Christiaan Barnard, Galileo, Lorentz, Poincare, Maxwell, Claude Shannon, etc., etc., etc. were all just "low-hanging fruit pickers," not geniuses. Meanwhile, China, which has a civilization stretching back five thousand years, did NONE of these things. How come is that? How come you failed to PICK that low-hanging fruit? You had, after all, five thousand years to do it. What? Just too lazy?

I think a great deal of what might formerly have been considered great art continues to be produced in the modern world.

This Korean prodigy is making ultra-realistic paintings: http://www.dailymail.co.uk/news/article-2384759/Artist-Joongwon-Charles-Jeongs-hyper-real-paintings-look-like-photos.html

Even much-ballyhooed products such as video games have music that might have been considered great had it been written a few hundred years ago: https://www.youtube.com/watch?v=Ns7fNPiNiNc

So why do we not tend to notice it? Simple - because the Mona Lisa and Mozart's Symphony No.40 were original, towering colossi on a flat landscape. Now they are almost a dime a dozen and the land is strewn with towers (and skyscrapers). Further progress is hard because we might have largely reached the limits of what humanity is capable of in principle - at least so far as traditional media is concerned, anyway.

Have you heard about Mansudae, the North Korean art studio? Due to its isolation it has been preserved from the debacles of modern art that befell other communist states. Their overseas project division has created many monuments in African nations, like the African Renaissance Monument. Their one project in the West is the restoration/recreation of the Fairytale Fountain in Frankfurt.

“The technology that could have saved your life or the life of a loved one, has been delayed. And if the critical mass of White people is disrupted, may never show up.”

More like a critical mass of specific types of cognitive abilities in White people. Abstract logical thinking and strong property rights come to mind.

Technology is not going to save your life, or anyone’s life. Technology is going to destroy life – all life. Technology is a death cult whose enthusiasts and proponents envision basically turning the Earth into a giant computer:

Doesn’t really explain why Europe hasn’t picked up the slack though. Compared to the US, European higher education and the European computer industry have been very homogeneous. If it was just about diversity, you’d think that Europe would have zoomed ahead.