With Industry 4.0, radical social and economic disruption is upon us. This technological triumph, however, is equally haunted by troubling fears about its future. And, once again, we are reminded that the machine is not solely a technological question.

When the first coal-fired steam engines were being readied to carry passengers in early Victorian England, many alarms were sounded: the trains would traumatise livestock, belch toxic smoke and devastate ripening crops, and people, it was believed, would surely suffocate if carried at speeds that surpassed the then mind-boggling maximum of 20 mph.

The social thinker and philanthropist John Ruskin (1818-1900), whose writings inspired Mahatma Gandhi (1869-1948), even warned that the railways would certainly bring about a 'deterioration of moral character'.

Earlier, the Luddite uprising of 1811-1816 had almost overrun Britain. Multitudes of skilled weavers and artisans whose livelihoods were being eliminated by textile factories and industrial machinery rose in ferocious rebellion, and their desperate 'machine-smashing' campaign was finally put down only by the military. Allegedly, at one point, more British soldiers were fighting the Luddites than were deployed against Napoleon Bonaparte's army in Europe.

But the steam-age Industry 1.0 remained remorseless. It ushered in mechanical production and cleared the ground for the next technological leap. From the 1870s, Industry 2.0 began, when electricity started moving assembly lines in factories and boosted manufacturing. Cybernetics, or the digital revolution, took off in the 1960s, and Industry 3.0 was thus born with mainframe computing (1960s), personal computing (1970s and 80s) and the Internet (1990s).

It was at the 2011 Hannover fair in Germany, the world's leading trade show for industrial technology, that the term 'Industry 4.0' was coined. In the opinion of Klaus Schwab, founder and executive chairman of the World Economic Forum, three aspects mark this new momentum: technologies are evolving at an exponential rather than linear pace; digitalisation is not merely changing the 'what' and the 'how' of doing things, but also profoundly altering 'who' we are becoming; and lastly, industry and society are undergoing 'systematic' transformations.

In 2015, the WEF identified 21 technological tipping points which, within a decade, would radically overhaul our existing sense of what constitutes biology, digital connectivity and physical existence. This 'brave new world', in fact, is already bursting forward through a raft of technologies such as Artificial Intelligence (AI), robotics, Blockchain, the Internet of Things, Genome Engineering, Neurotechnologies, Big Data, Machine Learning and 3D Printing, to name a few.

At the heart of this dazzle, however, lies the abstract mathematical power of computing: the cybernetic protocol for turning points of information into digital data and systematically amassing the latter into meta-data. The difference, for Bruce Schneier in Data and Goliath, is that while data is content, meta-data provides context. Put differently, the 'data-exhaust' of an individual can be collected, collated and endlessly interrogated to establish patterns and organise control for influencing behaviour. But such data-surveillance is no longer a capacity vested only in governments and big corporations.

The proliferation of closed-circuit cameras, smartphones, miniature recording devices and social media has inescapably put almost every presumed private act into the fishbowl of public viewing. In Industry 4.0, 'generalised surveillance' thus creates a data-vulnerable society of all-against-all.

Meta-data, moreover, can be spun into formulas and equations to produce algorithms: sequences of steps, or mathematical procedures, for realising a specific output. Algorithms make machine learning and deep learning possible and are thereby profoundly transforming the fields of robotics and AI.

Robots are now rigged for speech, pattern recognition, complex task performance and communication with other robots via the digital cloud. Robots in Industry 4.0 have become versatile, dexterous and, significantly enough, able to constantly learn from data.

AI technologies, on the other hand, are increasingly being primed as 'prediction machines'. Prediction, according to Ajay Agrawal, Joshua Gans and Avi Goldfarb, the authors of What to Expect From Artificial Intelligence, is not only an input into decision-making, but can create economic value by reducing uncertainty. As a telling example, we have Amazon, the archetypal AI economy, which aims to colonise the future by feeding and fattening on meta-data in order to pre-empt your every possible want.

Will these tireless robots and efficient AIs phase out human jobs in manufacturing and the services? The jury is still out, given that many types of human-machine jobs are also emerging. Instead, we might be better off asking a different question: what is the nature of work in 4.0?

Increasingly, as cold capital replaces flesh-and-bone labour, what passes for work has become contractual and part-time: temporary gigs and piece work. This impermanence is aggravated by a climate of job redundancies and technological obsolescence.

Susanne Klien, an anthropologist at Hokkaido University, has for several years now been studying these fragile 'alternate lifestyles' in post-growth Japan. The loss of secure long-term employment, she points out, has led individuals to live mostly for the moment. Their abandonment of medium- and long-term planning, it seems, is the only way to hedge against an unsure, unknowable and rapidly changing future. The word on everyone's lips is 'precarity': an emotional condition caused by insecurity, anxiety and the loss of predictability.

While 4.0 cannot be reduced to a gloom-and-doom story alone, the guarantees that once made the middle class (education, health, affordable housing and pensions) are being lost not only to neo-liberal globalisation but increasingly to automation as well. The writings of Martin Ford and even the observations of Erik Brynjolfsson and Andrew McAfee, authors of The Second Machine Age, clearly warn us that 4.0 is rudely polarising and cleaving societies: a few start-up superstars ascend to the top while the rest circle precariously at the bottom.

At the CeBIT expo in Hannover in March 2017, Japanese Prime Minister Shinzo Abe unveiled his government's ambition for achieving what he termed 'Society 5.0'. Though still an evolving notion, Society 5.0 intends to reorient Industry 4.0 by restating the urgency of a 'human-centred society'. Japan, in other words, wants social values (the full play of culture, norms and history) to seize and steer hyper-connectivity rather than the reverse.

But in India, if Prime Minister Narendra Modi's recent speech at the inauguration of the Centre for the Fourth Industrial Revolution in New Delhi is anything to go by, technology is to remain triumphant. In part, his banal understanding of technology seems to be in full step with his government's sustained anti-intellectual mood. In the last couple of years, the teaching of humanities and social sciences in India has been all but gutted and serious questioning choked off at universities.

Industry 4.0, hence, moves brusquely into India without moderation or qualification by academic reflection. By stripping the humanities and social sciences of their stock of critical abilities, India renders technology a purely technological rather than a social challenge.

In all likelihood, a machine-colonised future awaits the Indian youth: the magnificent demographic dividend that ends up as 'flowers in the dustbin'.

The writer is an associate professor, Graduate School of Asian and African Area Studies, Kyoto University