His book continues his war on digital utopianism and his assertion of humanist and individualistic values in a hive-mind world. But Lanier still sees potential in digital technology: He just wants it reoriented away from its main role so far, which involves “spying” on citizens, creating a winner-take-all society, eroding professions and, in exchange, throwing bonbons to the crowd.

Much of the book looks at the way Internet technology threatens to destroy the middle class by first eroding employment and job security, along with various “levees” that give the economic middle stability.

“Here’s a current example of the challenge we face,” he writes in the book’s prelude: “At the height of its power, the photography company Kodak employed more than 140,000 people and was worth $28 billion. They even invented the first digital camera. But today Kodak is bankrupt, and the new face of digital photography has become Instagram. When Instagram was sold to Facebook for a billion dollars in 2012, it employed only 13 people. Where did all those jobs disappear? And what happened to the wealth that all those middle-class jobs created?”

But more important than Lanier’s hopes for a cure is his diagnosis of the digital disease. Eccentric as it is, “Future” is one of the best skeptical books about the online world, alongside Nicholas Carr’s “The Shallows,” Robert Levine’s “Free Ride” and Lanier’s own “You Are Not a Gadget.”

So there’s still a lot of human effort, but the difference is that whereas before, when people made contributions to the system that they used, they received formal benefits, meaning not only salary but pensions and certain kinds of social safety nets. Now, instead, they receive benefits on an informal basis. And an informal economy is like the economy in a developing-country slum: it’s reputation, it’s barter, it’s that kind of stuff.

Yeah, and I remember there was this fascination with the idea of the informal economy about 10 years ago. Stewart Brand was talking about how brilliant it is that people get by in slums on an informal economy. He’s a friend so I don’t want to rag on him too much. But he was talking about how wonderful it is to live in an informal economy and how beautiful trust is and all that.

And you know, that’s all kind of true when you’re young and if you’re not sick, but if you look at the infant mortality rate and the life expectancy and the education of the people who live in those slums, you really see what the benefit of the formal economy is if you’re a person in the West, in the developed world.

So Kodak had 140,000 really good middle-class employees, and Instagram has 13 employees, period. You have this intense concentration of the formal benefits, and that winner-take-all feeling is not just for the people who are on the computers but also for the people who are using them. So there’s this tiny token number of people who will get by from using YouTube or Kickstarter, and everybody else lives on hope. There’s not a middle-class hump. It’s an all-or-nothing society.

And then, you know, in a similar vein, at that time early audio recordings, which today would sound horrible to us, were indistinguishable from real music to people who did double-blind tests and whatnot.

So in the beginning photography was kind of a labor-saving device. And whenever you have a technological advance that’s less hassle than the previous thing, there’s still a choice to make. And the choice is: do you still get paid for doing the thing that’s easier?

We kind of made a bargain, a social contract, in the 20th century that even if jobs were pleasant, people could still get paid for them, because otherwise we would have had massive unemployment. And so to my mind, the right question to ask is: why are we abandoning that bargain that worked so well?

Of course jobs become obsolete. But the only reason that new jobs were created was because there was a social contract in which a more pleasant, less boring job was still considered a job that you could be paid for. That’s the only reason it worked. If we decided that driving was such an easy thing [compared to] dealing with horses that no one should be paid for it, then there wouldn’t be all of those people being paid to be Teamsters or to drive cabs. It was a decision that it was OK to have jobs that weren’t terrible.

I mean, the whole idea of a job is entirely a social construct. The United States was built on slave labor. Those people didn’t have jobs; they were just slaves. The idea of a job is that you can participate in the formal economy even if you’re not a baron, that everybody can participate in the formal economy. There are annoyances with the formal economy, because capitalism is really annoying sometimes.

But the benefits are really huge: you get a middle-class distribution of wealth and clout, so the mass of people can outspend the top. If you don’t have that, you can’t really have democracy. Democracy is destabilized if there isn’t a broad distribution of wealth.

And then the other thing is that if you like market capitalism, if you’re an Ayn Rand person, you have to admit that markets can only function if there are customers and customers can only come if there’s a middle hump. So you have to have a broad distribution of wealth.

It was all a social construct to begin with, so what changed, to get to your question, is that at the turn of the [21st] century it was really Sergey Brin at Google who just had the thought of, well, if we give away all the information services, but we make money from advertising, we can make information free and still have capitalism.

But the problem with that is it reneges on the social contract where people still participate in the formal economy. And it’s a kind of capitalism that’s totally self-defeating because it’s so narrow. It’s a winner-take-all capitalism that’s not sustainable.

My understanding was that the U.S. heads of business got the nod to go ahead and start manufacturing things *other* than weapons, because our industrial capabilities weren't annihilated relative to so many others.

There’s always academic tenure, or a taxi medallion, or a cosmetology license, or a pension. There’s often some kind of license or some kind of ratcheting scheme that allows people to keep their middle-class status.

In a raw kind of capitalism there tend to be unstable events that wipe away the middle and separate people into rich and poor. So these mechanisms are undone by a particular style of network design called the digital open network.

Music is a great example where value is copied. And so once you have it, again it’s this winner-take-all thing: the people who really win are the people who run the biggest computers, plus an incredibly tiny number of token people who get very successful YouTube videos. Everybody else lives on hope or lives with their parents or something.

I guess all orthodoxies are built on lies. But there’s this idea that there must be tens of thousands of people who are making a great living as freelance musicians because you can market yourself on social media.

And whenever I look for these people – I mean when I wrote “Gadget” I looked around and found a handful – and at this point three years later, I went around to everybody I could to get actual lists of people who are doing this and to verify them, and there are more now. But like in the hip-hop world I counted them all and I could find about 50. And I really talked to everybody I could. The reason I mention hip-hop is because that’s where it happens the most right now.

The interesting thing about it is that people advertise, “Oh, what an incredible life. She’s this incredibly lucky person who’s worked really hard.” And that’s all true. She’s in her 20s, and it’s great that she’s found this success, but what this success amounts to is that she makes maybe $250,000 a year and rents a house that’s worth $1.1 million in L.A. And this is all breathlessly reported as a great success.

But for someone who’s out there, a star with a billion views, that’s a crazy low expectation. She’s not even in the 1 percent. Even the tiny token number of people who make it to the top of YouTube aren’t making it into the 1 percent.

I think in the whole of music in America, the number is in the low hundreds. It’s really small. I wish all of those people my deepest blessings, and I celebrate the success they find, but it’s just not a way you can build a society.

The other problem is they would have to self-fund. This is getting back to the informal economy, where you’re living in the slum or something and you’re desperate to get out, so you impress the boss man with your music skills or your basketball skills. And the idea of doing that for the whole of society is not progress. It should be the reverse. What we should be doing is bringing all the people who are in that situation into the formal economy. That’s what’s called development. But this is the opposite of that. It’s taking all the people from the developed world and putting them into the cycle of the informal economy of the developing world.

We don’t realize that our society and our democracy ultimately rest on the stability of middle-class jobs. When I talk to libertarians and socialists, they have this weird belief that everybody’s this abstract robot that won’t ever get sick or have kids or get old. It’s like everybody’s this eternal freelancer who can afford downtime and can self-fund until they find their magic moment or something.

The way society actually works is there’s some mechanism of basic stability so that the majority of people can outspend the elite so we can have a democracy. That’s the thing we’re destroying, and that’s really the thing I’m hoping to preserve. So we can look at musicians and artists and journalists as the canaries in the coal mine, and is this the precedent that we want to follow for our doctors and lawyers and nurses and everybody else? Because technology will get to everybody eventually.

I have 14-year-old kids who come to my talks who say, “But isn’t open source software the best thing in life? Isn’t it the future?” It’s a perfect thought system. It reminds me of the communists I knew when I was growing up, or of Ayn Rand libertarians.

It’s one of these things where you have a simplistic model that suggests this perfect society so you just believe in it totally. These perfect societies don’t work. We’ve already seen hyper-communism come to tears. And hyper-capitalism come to tears. And I just don’t want to have to see that for cyber-hacker culture. We should have learned that these perfect simple systems are illusions.

I am culturally a man on the left. I get a lot of people on the left. I live in Berkeley and everything. I want to live in a world where outcomes for people are not predetermined in advance.

The problem I have with socialist utopias is that there’s some kind of committee trying to soften outcomes for people, and I think that imposes models of outcomes on other people’s lives. So in a spiritual sense there’s some bit of libertarian in me. But the critical thing for me is moderation. If you let that go too far, you end up with a winner-take-all society that ultimately crushes everybody even worse. So it has to be moderated.

Libertarians tend to think the economy can totally close its own loops, that you can get rid of government. And I ridicule that in the book. There are other people who believe that if you could get everybody to talk over social networks, if we could just cooperate, we wouldn’t need money anymore. And I recommend they try living in a group house and then they’ll see it’s not true.

And that it doesn’t concentrate power too much. If we can just get to that point, then we’ll really be fine. I’m actually modest. People have been accusing me of being super-ambitious lately, but I feel like in a way I’m the most modest person in the conversation.

See, now I like this guy. This is like the political equivalent of aiming for the realist view in geopolitics. We separate what is likely from what is unlikely and aim not for "the best" situation, but a situation where the worst aspects have been mitigated.

It's backwards thinking that both parties would have a hard time integrating into their (ughhh) brand.

Yeah, no kidding. I was there. I gotta say, every little step of this thing was really funded by either the military or public research agencies. If you look at something like Facebook, Facebook is adding the tiniest little rind of value over the basic structure that’s there anyway. In fact, it’s even worse than that. The original designs for networking, going back to Ted Nelson, kept track of everything everybody was pointing at so that you would know who was pointing at your website. In a way Facebook is just recovering information that was deliberately lost because of the fetish for being anonymous. That’s also true of Google.

Books are really, really hard to write. They represent a kind of summit of grappling with what one really has to say. And what concerns me is that when Silicon Valley looks at books, they often think of them really differently, as just data points that you can mush together. They’re divorcing books from their role in personhood.

I was in a cafe this morning where I heard some stuff I was interested in, and nobody could figure out what it was. It was Spotify or one of these … so they knew what stream they were getting, but they didn’t know what music it was. Then it changed to other music, and they didn’t know what that was. And I tried to use one of the services that determines what music you’re listening to, but it was a noisy place and that didn’t work. So what’s supposed to be an open information system serves to obscure the source, the musician. It serves as a closed information system. It actually loses the information.

And if we start to see that with books in general – and I say if – if you look at the approach that Google has taken to the Google library project, they do have the tendency to want to mush things together. You see the thing decontextualized.

I have sort of resisted putting my music out lately because I know it just turns into these mushes. Without context, what does my music mean? I make very novel sounds, but I don’t see any value in sharing novel sounds that are decontextualized. Why would I write if people are just going to get weird snippets that are mushed together, and they don’t know the overall position or the history of the writer or anything? What would be the point in that? The day books become mush is the day I stop writing.

Realizing how much better musical instruments were to use as human interfaces helped me to be skeptical about the whole digital enterprise. Which I think helped me be a better computer scientist, actually.

Sure. If you go way back I was one of the people who started the whole music-should-be-free thing. You can find the fire-breathing essays where I was trying to articulate the thing that’s now the orthodoxy. Oh, we should free ourselves from the labels and the middleman and this will be better.

I believed it at the time because it sounds better; it really does. But I know a lot of these musicians, and I could see that it wasn’t actually working. I think fundamentally you have to be an empiricist. I just saw, in the real lives I know — both older and younger people coming up — that it was not as good as what it had once been. So there must be something wrong with our theory, as good as it sounded. It was really that simple.

“All resources become the common heritage of all of the inhabitants, not just a select few.” This arrangement is made possible by aggressive use of advanced technologies to create an abundance of resources and thereby negate the need for any sort of rationing.

put money directly in people’s hands so they can spend it and keep the market economy going. The main difference is that instead of making the income unconditional, Ford advocates doling out money according to an incentive scheme that encourages behavior society desires.

if we can find a way to directly upgrade human minds—such as through the use of brain-computer interfaces—then workers would be able to keep pace with technological change and readily adapt to new jobs and industries as quickly as they crop up.

Yes, there will be fewer jobs available, and certainly people’s incomes will suffer, but technology will simultaneously bring down the cost of living at a fast enough rate that people will survive just fine without the need for government intervention or economic restructuring.

"On the internet and in the media there has been growing discussion of technological unemployment. People are increasingly concerned that automation will displace more and more workers-that in fact there might be no turning back at this point. We may be reaching the end of work as we know it."

I have a theory about digital distractions, a theory that's supported by anecdotes from the mobile learning pioneers at Abilene Christian University and by data from CU-Boulder grad student Bethany Wilcox. My theory is that if all that students are expected to do with their mobile devices during class is take notes, then they'll distract themselves with Facebook et al. because note-taking doesn't keep their minds busy enough.

The term "digital ink" usually refers to the ability to draw using a digital device, typically with a stylus. If I could pass out stylus-equipped iPads to all my students, that would be great. Since I can't, I have students participate in "digital ink" activities on the cheap.

I'm a big fan of teaching with clickers. (I did happen to write the book on this topic.) When I teach a course, I ask my students to purchase clickers because I know I'll use them regularly throughout the course.

"When I talk with most instructors about cell phones or laptops in the classroom, I hear great concern about digital distractions. If you let students use mobile devices such as smart phones, tablets, or laptops, they'll surely start checking Facebook, watching ESPN, or shopping for shoes. (I don't know what's up, but when this issue gets addressed in the media, shoe shopping is ALWAYS mentioned.)"

Sixty percent of the jobs in the US are information-processing jobs, notes Erik Brynjolfsson, co-author of a recent book about this disruption, Race Against the Machine. It’s safe to assume that almost all of these jobs are aided by machines that perform routine tasks. These machines make some workers more productive. They make others less essential.

The turn of the new millennium is when the automation of middle-class information processing tasks really got under way, according to an analysis by the Associated Press based on data from the Bureau of Labor Statistics. Between 2000 and 2010, the jobs of 1.1 million secretaries were eliminated, replaced by internet services that made everything from maintaining a calendar to planning trips easier than ever.

Economist Andrew McAfee, Brynjolfsson’s co-author, has called these displaced people “routine cognitive workers.” Technology, he says, is now smart enough to automate their often repetitive, programmatic tasks. “We are in a desperate, serious competition with these machines,” concurs Larry Kotlikoff, a professor of economics at Boston University. “It seems like the machines are taking over all possible jobs.”

In the early 1800s, nine out of ten Americans worked in agriculture—now it’s around 2%. At its peak, about a third of the US population was employed in manufacturing—now it’s less than 10%. How many decades until the figures are similar for the information-processing tasks that typify rich countries’ post-industrial economies?

To see how the internet has disproportionately affected the jobs of people who process information, check out the gray bars dipping below the 0% line on the chart, below. (I’ve adapted this chart to show just the types of employment that lost jobs in the US during the great recession. Every other category continued to add jobs or was nearly flat.)

Here’s another clue about what’s been going on in the past ten years. “Return on capital” measures the return firms get when they spend money on capital goods like robots, factories, software—anything aside from people. (If this were a graph of return on people hired, it would be called “Return on labor”.)
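To make the distinction concrete, here is a toy sketch in Python with made-up numbers (my illustration, not figures from the article): the same revenue gain looks very different depending on whether it came from spending on machines or on people.

```python
# Toy illustration of "return on capital" versus "return on labor".
# All numbers are hypothetical assumptions, not data from the article.

def simple_return(gain: float, spend: float) -> float:
    """Net return per dollar spent, as a fraction."""
    return (gain - spend) / spend

# A hypothetical firm books $1.0M of extra revenue either way, but spends
# $0.8M on software and robots versus $0.95M on wages to get it.
print(f"Return on capital: {simple_return(1_000_000, 800_000):.0%}")  # 25%
print(f"Return on labor:   {simple_return(1_000_000, 950_000):.0%}")  # 5%
```

On these assumed numbers the firm earns five times as much per dollar spent on capital goods as per dollar spent on people — exactly the kind of incentive that pushes firms toward automation.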

Notice: the only industry where the return on capital is as great as manufacturing is “other industries”—a grab bag which includes all the service and information industries, as well as entertainment, health care and education. In short, you don’t have to be a tech company for investing in technology to be worthwhile.

For many years, the question of whether or not spending on information technology (IT) made companies more productive was highly controversial. Many studies found that IT spending either had no effect on productivity or was even counter-productive. But now a clear trend is emerging. More recent studies show that IT—and the organizational changes that go with it—are doing firms, especially multinationals, a great deal of good.

One thing all our machines have accomplished, and especially the internet, is the ability to reproduce and distribute good work in record time. Barring market distortions like monopolies, the best software, media, business processes and, increasingly, hardware, can be copied and sold seemingly everywhere at once. This benefits “superstars”—the most skilled engineers or content creators. And it benefits the consumer, who can expect a higher average quality of goods.

But it can also exacerbate income inequality, says Brynjolfsson. This contributes to a phenomenon called “skill-biased technological [or technical] change.” “The idea is that technology in the past 30 years has tended to favor more skilled and educated workers versus less educated workers,” says Brynjolfsson. “It has been a complement for more skilled workers. It makes their labor more valuable. But for less skilled workers, it makes them less necessary—especially those who do routine, repetitive tasks.”

“Certainly the labor market has never been better for very highly-educated workers in the United States, and when I say never, I mean never,” MIT labor economist David Autor told American Public Media’s Marketplace.

“The spread of computers and the Internet will put jobs in two categories,” said Andreessen. “People who tell computers what to do, and people who are told by computers what to do.” It’s a glib remark—but increasingly true.

This time it’s faster

History is littered with technological transitions. Many of them seemed at the time to threaten mass unemployment of one type of worker or another, whether it was buggy whip makers or, more recently, travel agents. But here’s what’s different about information-processing jobs: The takeover by technology is happening much faster.

Brynjolfsson thinks he knows why: More and more people were doing work aided by software. And during the great recession, employment growth didn’t just slow. As we saw above, in both manufacturing and information processing, the economy shed jobs, even as employment in the service sector and professional fields remained flat.

Especially in the past ten years, economists have seen a reversal of what they call “the great compression”—that period from the second world war through the 1970s when, in the US at least, more people were crowded into the ranks of the middle class than ever before.

There are many reasons why the economy has reversed this “compression,” transforming into an “hourglass economy” with many fewer workers in the middle class and more at either the high or the low end of the income spectrum.

The hourglass represents an income distribution that has been more nearly the norm for most of the history of the US. That it’s coming back should worry anyone who believes that a healthy middle class is an inevitable outcome of economic progress, a mainstay of democracy and a healthy society, or a driver of further economic development.

Indeed, some have argued that as technology aids the gutting of the middle class, it destroys the very market required to sustain it—that we’ll see “less of the type of innovation we associate with Steve Jobs, and more of the type you would find at Goldman Sachs.”

So how do we deal with this trend? The possible solutions to the problems of disruption by thinking machines are beyond the scope of this piece. As I’ve mentioned in other pieces published at Quartz, there are plenty of optimists ready to declare that the rise of the machines will ultimately enable higher standards of living, or at least forms of employment as foreign to us as “big data scientist” would be to a scribe of the 17th century.

But that’s only as long as you’re one of the ones telling machines what to do, not being told by them. And that will require self-teaching, creativity, entrepreneurialism and other traits that may or may not be latent in children, as well as retraining adults who aspire to middle class living. For now, sadly, your safest bet is to be a technologist and/or own capital, and use all this automation to grab a bigger-than-ever share of a pie that continues to expand.

"Everyone knows the story of how robots replaced humans on the factory floor. But in the broader sweep of automation versus labor, a trend with far greater significance for the middle class-in rich countries, at any rate-has been relatively overlooked: the replacement of knowledge workers with software.

One reason for the neglect is that this trend is at most thirty years old, and has become apparent in economic data only in perhaps the past ten years. The first all-in-one commercial microprocessor went on sale in 1971, and like all inventions, it took decades for it to become an ecosystem of technologies pervasive and powerful enough to have a measurable impact on the way we work."

If a newcomer thinks it can win by competing at the high end, “the incumbents will always kill you.”

If they come in at the bottom of the market and offer something that at first is not as good, the legacy companies won’t feel threatened until too late, after the newcomers have gained a foothold in the market.

He offered as an example the introduction of cheap transistor radios. High fidelity, vacuum-tube powered incumbents felt no threat from the poor quality audio the transistors produced and missed the technological shift that eventually killed many of them.

Instead of coming in at the low end of the market with a cheap electric vehicle, Tesla Motors competes with premium offerings from legacy automakers.

“Who knows whether they will be successful or not,” he said. “They have come up with cars that in fact compete reasonably well and they cost $100,000 and god bless them.”

“But if you really want to make a big product market instead of a niche product market, the kind of question you want to ask for electric vehicles is, I wonder if there is a market out there for customers who would just love to have a product that won’t go very far or go very fast. The answer is obvious.

“The parents of teenagers would love to have a car that won’t go very far or go very fast. They could just cruise around the neighborhood, drive it to school, see their friends, plug it in overnight.”

Because that kind of electric car offers something that doesn’t threaten incumbents and provides a low-end solution, Christensen says it has a greater chance of surviving and ultimately upending the auto market than Tesla’s flashy Roadsters and sedans.

Christensen said he thinks the venture capital world needs to be disrupted because it is focused too much on making big killings on big investments at a time when there are plenty of good smaller investments to be made on companies that will be disruptive.

“Venture capital is always wanting to go up market. It’s like the Rime of the Ancient Mariner. 'Water, water everywhere and not a drop to drink.' People in private equity complain that they have so much capital and so few places to invest. But you have lots of entrepreneurs trying to raise money at the low end and find that they can’t get funding because of this mismatch. I think that there is an opportunity there.”

“For 300 years, higher education was not disruptable because there was no technological core. If San Jose State wants to become a globally known research institution, they have to emulate UC Berkeley and Caltech. They can’t disrupt,” he said on Wednesday.

“But now online learning brings to higher education this technological core, and people who are very complacent are in deep trouble. The fact that everybody was trying to move upmarket and make their university better and better and better drove prices of education up to where they are today.

"Basically, his theory of disruption centers around how dominant industry leaders will react to a newcomer: "It allows you to predict whether you will kill the incumbents or whether the incumbents will kill you.""

For a kid who wasn't into fantasy books, I didn't approach it as an adventurer. I approached the game as me: I'm standing in front of a house, not unlike my own, a wrapped gift that you have to figure out how to unwrap. It was like a Choose Your Own Adventure in which no other pages were suggested. Here's where you are, kid. Now what?

"If we are honest, we must admit that one aspect of the technium is to make holes in our heart. One day recently we decided that we cannot live another day unless we have a smart phone, when a dozen years earlier this need would have dumbfounded us. Now we get angry if the network is slow, but before, when we were innocent, we had no thoughts of the network at all. Now we crave the instant connection of friends, whereas before we were content with weekly, or daily, connections. But we keep inventing new things that make new desires, new longings, new wants, new holes that must be filled."

Karl Schroeder, science fiction author of the novel Permanence, writes about the Fermi Paradox. The Fermi Paradox says that if there is an infinite universe, there must be an infinite number of civilizations at advanced stages that would emit evidence of their presence; but as far as we can see in any direction, there are none. The skies should be full of aliens, but they are not. Why not?

The question I’m going to spin entertaining lies around is this: what is network security going to be about once we get past the current sigmoid curve of accelerating progress and into a steady state, when Moore’s first law is long since burned out, and networked computing appliances have been around for as long as steam engines?

if you start trying to visualize a coherent future that includes aliens, telepathy, faster than light travel, or time machines, your futurology is going to rapidly run off the road and go crashing around in the blank bits of the map that say HERE BE DRAGONS.

Democracy is a lousy form of government in some respects – it is particularly bad at long-term planning, for no event that lies beyond the electoral event horizon can compel a politician to pay attention to it.

In general, democratically elected politicians are forced to focus on short-term solutions to long-term problems because their performance is evaluated by elections held on a time scale of single-digit years.

Assuming that they survive the obstacles on the road to development, this process is going to end fairly predictably: both India and China will eventually converge with a developed world standard of living, while undergoing the demographic transition to stable or slowly declining populations that appears to be an inevitable correlate of development.

In 2006, for the first time, more than half of the planet’s human population lived in cities. And by 2061 I expect more than half of the planet’s human population will live in conditions that correspond to the middle class citizens of developed nations.

by 2061 we or our children are going to be living on an urban middle-class planet, with a globalized economic and financial infrastructure recognizably descended from today’s system, and governments that at least try to pay lip service to democratic norms.

For example, we’re currently raising the first generation of kids who won’t know what it means to be lost – everywhere they go, they have GPS service and a moving map that will helpfully show them how to get wherever they want to go.

The whole question of whether a mature technosphere needs three or four billion full-time employees is an open one, as is the question of what we’re all going to do if it turns out that the future can’t deliver jobs.

We’re still in the first decade of mass mobile internet uptake, and we still haven’t seen what it really means when the internet becomes a pervasive part of our social environment, rather than something we have to specifically sit down and plug ourselves in to, usually at a desk.

This leaves aside a third model, that of peer-to-peer mesh networks with no actual cellcos as such – just lots of folks with cheap routers. I’m going to provisionally assume that this one is hopelessly utopian.

wireless bandwidth appears to be constrained fundamentally by the transparency of air to electromagnetic radiation. I’ve seen some estimates that we may be able to punch as much as 2 Tb/sec through air; then we run into problems.
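For a rough sense of where an estimate like 2 Tb/sec might come from, here is a minimal back-of-envelope sketch (my illustration, not Stross’s derivation) using the Shannon-Hartley capacity theorem; the usable-spectrum and signal-to-noise figures below are assumptions chosen only to show the order of magnitude.

```python
# Back-of-envelope Shannon-Hartley capacity: C = B * log2(1 + S/N).
# The bandwidth and SNR below are illustrative assumptions, not measurements.

import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Theoretical maximum channel capacity in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

bandwidth_hz = 100e9              # assume ~100 GHz of air-transparent spectrum
snr_db = 30.0                     # assume a generous 30 dB signal-to-noise ratio
snr_linear = 10 ** (snr_db / 10)  # 30 dB is a factor of 1000

capacity = shannon_capacity_bps(bandwidth_hz, snr_linear)
print(f"{capacity / 1e12:.2f} Tb/s")  # ~1 Tb/s: the same order of magnitude
```

Doubling the assumed spectrum doubles the capacity, while improving the SNR helps only logarithmically; either way you land in the low single-digit Tb/sec range, which is why estimates in this neighborhood run into physical rather than engineering limits.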

whether lifelogging becomes a big social issue depends partly on the nature of our pricing model for bandwidth, and how we hammer out the security issues surrounding the idea of our sensory inputs being logged for posterity.

Only the rapid development of DNA assays for SARS – it was sequenced within 48 hours of its identification as a new pathogenic virus – made it possible to build and enforce the strict quarantine regime that saved us from somewhere between two hundred million and a billion deaths.

we can expect eventually to see home genome monitoring – both looking for indicators of precancerous conditions or immune disorders within our bodies, and performing metagenomic analysis on our environment.

If our metagenomic environment is routinely included in lifelogs, we have the holy grail of epidemiology within reach; the ability to exhaustively track the spread of pathogens and identify how they adapt to their host environment, right down to the level of individual victims.

In each of these three examples of situations where personal privacy may be invaded, there exists a strong argument for doing so in the name of the common good – for prevention of epidemics, for prevention of crime, and for prevention of traffic accidents. They differ fundamentally from the currently familiar arguments for invasion of our data privacy by law enforcement – for example, to read our email or to look for evidence of copyright violation. Reading our email involves our public and private speech, and looking for warez involves our public and private assertion of intellectual property rights … but eavesdropping on our metagenomic environment and our sensory environment impinges directly on the very core of our identities.

The new Ultimaker 3D printer made in the Netherlands has arrived in the US. The machine, which prints bigger and faster than MakerBot printers, was created by three Dutch makers who met at the Fab Lab in Utrecht, Holland two years ago.

Newspapers and defense journals along the Pacific Rim and elsewhere are replete with foreign speculation on the future activities of a more internationally active and aggressive Chinese navy, to say nothing of more sober discussions of the constraints and limitations facing potential Chinese naval ambitions with a single carrier (for now) and no history or culture of carrier operations.

The attention on the Varyag is, in many ways, misplaced. China is historically a land power. Its biggest security challenges remain at home, across a vast territory that will continue to require large expenditures for manpower, equipment and transportation. China’s historical flirtation with a navy that travels far beyond its immediate neighborhood has been limited. Even the famous voyages of Zheng He could be called frivolous, rather than a serious attempt to dominate seas around the world or even the region.

China’s extensive geography and high population are its core strength and greatest defense. Even if an invasion from the sea is initially successful, China has the human resources to ultimately either absorb the conqueror (the one land power that was successful in invading China — the Mongols — eventually became subsumed into Chinese culture), or to outlast the invader through a long war of attrition.

China’s naval expansion, in that case, is not part of a strategy to engage in a naval arms race with the United States or challenge U.S. dominance of the seas. Rather, Beijing intends to build a defensive buffer around China’s maritime periphery.

the attention to China’s new aircraft carrier, its deep-diving submarine, its space exploration, and similar activities also helps Beijing distract audiences, domestic and global, from real problems inside the country.

Beijing’s top concern is avoiding an economic and social crisis, and Chinese leaders know that it may be only a matter of time before the Chinese economy faces the same structural limitations that its East Asian counterparts already faced.

"China is once again on the verge of sending its first aircraft carrier to sea. In recent days, the Chinese media has expanded on comments, made during a Defense Ministry press conference, openly confirming that China is refitting the Varyag and preparing to enter the small club of nations with aircraft carriers."

"Seagram's advertised its VO Canadian whiskey back in the mid-1940's with a series of extremely manly magazine ads about "Men Who Plan Beyond Tomorrow". They were unspecified futuristic thinkers who liked the fact that Seagram's was patient enough to age VO for six years. Each of the ads depicted a different miracle that would transform postwar America and they were glorious."

Hemingway's personal typewriters (he had more than one) are treated like relics. They are roped off, no touching them; they've become objects of pilgrimage, fetching more than $100,000. Yet the venerated typewriter itself is indistinguishable from other units made on that assembly line.

This relic magic operates at full throttle in the world of modern celebrity collectors. The $3 hockey puck used in the 2010 gold medal Olympics championship game later sold for $13,000 because of the unique properties it acquired during the game.

Yet as we approach the tenth anniversary of the disasters of 9/11, there is an official campaign to assign supernatural potency to the remains of the World Trade Center. The twisted bits of steel salvaged from the site of the fallen towers are being treated as holy relics, taken on long processions for public viewing, while the disaster site itself is being described as a "sacred place."

There is certainly value in keeping old things. Museums that collect artifacts, like say the Computer History Museum, contain both original prototypes and arbitrary production-run units, and these contain great historical information and lessons. But it doesn't (or shouldn't) matter who touched or used them previously. Manufactured artifacts can't be relics. They are all clones.

Of course, there is no difference, which is why we place so much emphasis on provenance ("it's been in our family forever!"). In the end, a historical technological artifact is one of the reservoirs in the modern world where superstition still flows freely.

By Kevin Kelly at The Technium: "Superstition is alive and well in the high tech world. It is visible most prominently in our technological artifacts, some of which we treat like medieval relics. Recently, supernatural superstition has crept into American treatment of 9/11."

The key to the D-Dalus' extreme maneuverability is the facility to alter the angle of the blades (using servos) to vector the forces, meaning that the thrust can be delivered in your choice of 360 degrees around any of the three axes. Hence D-Dalus can launch vertically, hover perfectly still and move in any direction, and that's just the start of the story.
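As a toy illustration of what vectoring thrust through a full sphere means in practice (my sketch under assumed angle conventions, not the D-Dalus's actual control math), two steering angles are enough to point a thrust of fixed magnitude in any direction:

```python
# Toy thrust-vectoring sketch: resolve a thrust magnitude and two steering
# angles into x/y/z components. The conventions here are assumptions for
# illustration, not the D-Dalus's actual control scheme.

import math

def thrust_vector(magnitude: float, azimuth_deg: float, elevation_deg: float):
    """Return (x, y, z) thrust components for a chosen direction on the sphere."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (magnitude * math.cos(el) * math.cos(az),
            magnitude * math.cos(el) * math.sin(az),
            magnitude * math.sin(el))

print(thrust_vector(100.0, 0.0, 90.0))   # straight up: vertical launch
print(thrust_vector(100.0, 0.0, 0.0))    # level thrust forward
print(thrust_vector(100.0, 180.0, 0.0))  # level thrust backward: braking
```

Because the turbines spin at constant rpm and only the blade angles change, redirecting the thrust this way can be done very quickly, which is what makes the hover-and-dart maneuverability plausible.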

"The D-Dalus (a play on Daedalus from Greek mythology) is neither fixed wing or rotor craft and uses four, mechanically-linked, contra-rotating cylindrical turbines, each running at the same 2200 rpm, for its propulsion."

"Contemplating the shortcomings of the younger generation has ever been a hobby of the elder. As I start to transition to the latter population (perhaps a bit early for my age), I've found myself worrying more and more about the kids, and how little they seem to appreciate things. That kind of complaint is neither constructive or original. But the fact is that the kids are growing up pretty weird these days, because of the way technology has outpaced our institutions of learning and standards of knowledge."

"TVs that hang on walls? Automatic lights? Food cooked in seconds? American power companies sure had the future figured out! Except for one little thing... we're still waiting on those driverless cars. The futuristic family from this ad bares a striking resemblance to the family of Disneyland TV's 1958 episode, "Magic Highway, USA."

ELECTRICITY MAY BE THE DRIVER. One day your car may speed along an electric super-highway, its speed and steering automatically controlled by electronic devices embedded in the road. Highways will be made safe — by electricity! No traffic jam... no collisions... no driver fatigue."