Books; People; Ideas: These are a few of my favourite things. As I live between day-to-day compromises and change-the-world aspirations, this is the chronicle of my journey, full of moments of occasional despair and opportune discoveries, of connections and creations, and, most of all, of my quest for knowledge as conversations.

Saturday, December 31, 2016

For all our differences, the divide between Liberals and Conservatives, the Left and the Right, can be summarised thus:

1. The Liberals want the State to tax those who earn and
provide Welfare to those who do not, so that those who earn can keep
themselves forever on the treadmill and those who do not can be happy
with the handouts, and this should keep everyone off politics.

2. The Conservatives do not want to tax those who earn and do not want to give Welfare to those
who do not, so that the former is happy and the latter is on the
treadmill, and this should keep everyone off politics.

These conversations are so common that one may think this was always the case. However, as we know, the politics of welfare is not primordial, but rather an industrial-age phenomenon. At its core, it assumes that everyone can, and should, be able to find work, and that it is either Unfortunate (Liberal) or Criminal (Conservative) not to be able to find work.

Love them or hate them, these ideas no longer work once some of their fundamental assumptions cease to hold true.

For example, that one can find work, if one is ready to make the effort. We may be entering an epoch of excess labour.

Or that, all a person needs is relevant skills. While this may be true in a qualified sense, the unspoken assumption is that a person can be taught any skill, at any time, without regard to his or her abilities or interests, and that s/he is infinitely mobile and can live anywhere in the world, where such skills may be required.

Or that wages can be flexible downward. This is one of the great blind spots of economic reasoning: that lives can be ordered around supply and demand curves, and that people would understand and accept this.

Or, following this, that most people seeking work will find work, and that the ratio of working people to those claiming welfare will be something like 15:1, which is roughly what the ratio is in Britain today. This would ensure that we can tax wages to pay for welfare.

And, finally, that more capital investment creates more jobs. This is why our systems of taxation and welfare treat dividends, arising out of investments, specially.

But these assumptions do not hold true, and we know that. People cannot be reskilled on demand, nor can they be moved around flexibly. A downward spiral of wages is often the moment of political upheaval and revolution. With automation and global supply chains, the ratio of working to non-working people is genuinely under threat, and our healthy ratios are often the result not of a functioning economy but of statistical black art, as we stop counting people who give up. And, crucially, capital investments today do not necessarily create jobs; in many cases, they destroy jobs by employing labour-saving technologies.

All this makes the case for an urgent paradigm shift in Welfare. And, without being disrespectful to those in need, we have to move beyond Welfare. In a scenario where most people could be on the dole, the dole is the new normal, and we may need to find a new word altogether.

We already have an idea that could work: Universal Basic Income. This is an idea whose time has perhaps come. If, for example, the UK scrapped welfare and moved to a universal basic income, it could pay about £6,000 to every working-age person in the country without adding a penny to its Welfare budget. That may sound unjust, if we stop paying the sick and the infirm, and instead start paying the able-bodied who could serve at a Tesco till instead. However, one has to contend with the possibility that Tesco may indeed want to automate those tills - anyone paying attention will have seen the precipitous drop in manned tills over the last six years or so!
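The back-of-the-envelope arithmetic behind that £6,000 figure can be sketched out. The welfare budget and working-age population numbers below are rough, assumed figures for illustration only, not official statistics:

```python
# A rough sketch of the UBI arithmetic above.
# Both input figures are illustrative assumptions, not official data.
welfare_budget_gbp = 250e9       # assumed annual UK welfare spend, ~£250bn
working_age_population = 41e6    # assumed UK working-age population, ~41m

ubi_per_person = welfare_budget_gbp / working_age_population
print(f"UBI per working-age person: £{ubi_per_person:,.0f} per year")
# roughly £6,000 per person per year
```

Even a modest change in either assumption moves the result by hundreds of pounds, which is why the £6,000 here is best read as an order-of-magnitude claim rather than a precise policy figure.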

One could argue that £6,000 isn't a lot, and that one could not live on it. But a number of other things could be done to pay more to those who need more, as well as to realign the system of taxation. Most middle-class wealth creation in the last ten years has happened in urban property, a very tangible form of wealth which is easily taxable. Also, there is no reason to treat dividends specially once we accept that capital investments are not creating jobs and income, and are solely about private profits. And the argument that this means a flight of capital is only true for a few countries, which have set themselves up as a sort of tax and regulation haven. For any country with a significant market, tying the payment of taxes to market access would make companies stay put - and indeed, wreck the party for tax havens.

These are not new ideas, but ones that are already being discussed and tried out in some parts of the world. So far, though, these ideas remain on the fringes and are being implemented in a piecemeal manner. Universal Basic Income also does not make sense unless the State takes responsibility for Health Care and Education. Now, one could argue that the State does not do anything well, but, as we have seen, this is just neo-liberal mythology that we should all now be tired of. The State may have been bad at running businesses - hotels, banks etc. - but the private sector has been exceptionally poor at providing public services, especially health and education. Where the State has failed in education is usually where it has allowed a 'mixed economy', letting the private sector operate side by side (where there are no private schools, such as in Northern Ireland or Canada, the State schools have performed very well).

However, old models die hard, and Welfare would be the hardest to change. It is the 'anti-politics' machine, supporting whole political ideologies and vast bureaucracies in its intricate folds and elaborate expanse. It is one thing in which everyone has a vested interest. But, as we know, mental models are no match for technologies - and as practices evolve on the ground, they are eating away the foundations of a system that was so lovingly built over the last two centuries.

Tuesday, December 20, 2016

I wrote a post earlier about my reading enterprise in 2016 (see here), something of a narrative account of my book diary and of a Goodreads Reading Challenge that I indulged myself with. In aggregate, it showed a failure - I only finished half as many books cover to cover as I set out to - and a fragmentation of goals and enterprises, as I followed several different agendas and did not complete any of them in any form.

In a way, this is fine. I wanted to follow my heart in what I chose to read, and it really could not have been otherwise. In a sense, I want my relationship with my books to be impulsive yet profound, momentary yet forever remembered, free of commitment but laden with meaning. This is what I am, perhaps: long ago, someone promised me a relationship with 'all the dimensions, but no destination', and though all the details of that conversation have faded, the idea has remained with me, now morphed into a quest to live in books.

But, then, books are there for a reason. I do not read to entertain myself. In fact, I view this idea of reading as entertainment - the whole industry of pulp fiction and the practice of reading to fill time - with suspicion: it is that dreaded consumer fashion seeping into my sacred space. I make it a point to declare that, since I do not read to fill time, the common and awkward question, 'where do you find time to read', is somewhat irrelevant.

But, then, why do I read? I have noticed that every time I write about reading, I unconsciously make a case that I read to learn. In the earlier post, I spoke about my project on Creative Cities, which has receded into the background as other interests took over, and my interest in economic development, enlightenment and ideas, and indeed in history and politics. This may fit into a paradigm: that I am reading for learning. This makes me less crazy, though not completely out of jail, as learning may no longer be viewed as an entirely wasteful enterprise (it was interesting to hear a business associate recently describe me, rather affectionately, as 'bookish').

This is indeed my paradigm too, or rather my excuse: that I am spending all this time and money on books because I am learning. The alternative, describing all this as a hobby, would make me appear profligate and even somewhat lazy. But this leads me to another question: what am I learning? This is where perhaps 'all the dimensions, but no destination' is most appropriate. My learning is indeed thematic and has specific objectives, but those objectives are not, as one would expect, neatly connected with some outcome, career-specific or otherwise.

In fact, one of the key things about 2016 was that, despite engaging in a significant reading enterprise, I did not read any business books. None! I did buy a few, though I would describe the ones I bought as technology books rather than business books. I have generally been interested in the development of technology and how it affects jobs and wider society, both historically - hence my readings on Capitalism, the Industrial Revolution and even the Middle Ages - and in our own time. My work, which is all about bridging education with employment or enterprise, made it all the more interesting: there was a reason for me to try to develop a well-considered argument!

As I approach 2017, then, I see a fuller development of this interest. 2017 is likely to be a significant year globally, with the politics of countries shifting rapidly amid a Trump Presidency, a real Brexit, and increasing conflicts and realignments of powers and interests; for me, personally, 2017 is the year when I start again, after the necessary penance for the failed experimentation is complete, on a new enterprise. In more than one way, 2017 is a year of decisions for me, and chief among them is a call I have to take about whether I should go on living in Britain, or whether I should look to migrate again, either back to India (or to Southeast Asia, which is more interesting) or to Mainland Europe.

Big questions like this, rather than tame ones such as which job I should do, usually define my learning agenda. I expect to continue a few things I am doing - such as my pursuit of the History of Ideas - and add a few new things, perhaps learning a language and a culture (German and Russian are the two alternatives I am contemplating now). I also expect to align my writing with these explorations, and to write more than just blog posts (a plan I have made and abandoned many times) - and indeed, to seek to align my work more closely with them.

Monday, December 19, 2016

There is a talent problem in our economies. Speak to any employer in almost any country, and they will tell you how hard it is to find good employees with appropriate skills. And this happens despite a massive expansion of the public education system and rising literacy, and an unprecedented level of access to Higher Education and Skills Training. There is also a near-total consensus that education has an economic goal - jobs, or a career - and most employers have existing programmes and expansive plans to engage with education providers at all levels in search of talent. And yet the problem persists - and is getting worse.

Here, I think, we should make the case for a paradigm shift. The recruitment of talent currently happens within a supply chain paradigm. Even in the best of cases, where employers are deeply engaged with education institutions - shaping the curriculum, spotting student talent early and interviewing on campus - they are still looking at educators as passive suppliers of talent, whose job is to educate people to the employers' specifications.

Sure enough, there are other models that exist, but only at the very high end of the talent pole. The research scientists, for example, would transition from education to employment entirely differently, as the labs and projects are often jointly owned and run, between the employers and the universities, and the dividing line between work and learning is far less clearly drawn. This is perhaps because, at that level, the 'competencies' are transient and dynamic, and on the employers' side, the stakes are higher and the economic case is easy to make.

It is difficult to see how such a system could work for an Indian outsourcing firm recruiting thousands of people every year, or for a small business needing a few entry-level hires. But the existing model of finding talent - that of being a customer to the universities - is not working either. This is because the 'competencies' have become more dynamic than ever, even in entry-level jobs, and the business models of both large and small employers at the lower tiers of the value chain are now susceptible to dramatic change. The Indian outsourcing companies, unable to compete on costs in the face of automation, are trying to climb up the value chain, moving from a 'jobs' to an 'ideas' business; they cannot achieve this without overhauling their talent recruitment model. At the other end of the scale, small, niche companies are trying to move from 'services' to products, reducing unit costs but creating greater complexity of design. And, for companies across the spectrum, new challenges are emerging - demanding greater sensitivity to markets and customers, more transparent operations and deeper compliance, and cultural sensitivity fit for a global workplace - which the current recruitment model insufficiently addresses.

The employers are seeking to solve this by doing more of the same - deeper engagement in curriculum design, more campus interviews etc. But the solution may lie elsewhere. It is important to look at how new industries are emerging in different places. This often happens anchored around an IBM, a GE or a General Motors, but not driven by them: Rather, an ecosystem emerges as the Supply Chain transforms into a Value Chain!

The difference between the Supply Chain and the Value Chain is that, in the latter, innovation happens at every level of the chain rather than only at the customer end. The traditional model - Big Company Labs innovating and then handing down specifications for suppliers to conform to - is the old model, and it is now being replaced by what you may call the 'Cisco model', where an ecosystem of providers, working competitively or collaboratively, builds innovation at every step of the process. In this model, the end customer is an active participant, a conduit to market, and a guarantor of relevance.

A similar model - the next talent innovation, if we may call it that - is needed to solve the problem of finding talent. The employer should not merely participate in curriculum-making, as this is a static process, often obsolete even before it is put into practice. Besides, in the real world, changes happen within practice - in small steps, in the emerging use of tools - rather than in the overarching theoretical models that curriculum deals with. Instead of giving inputs to the curriculum, the employer should perhaps be the curriculum, allowing co-ownership of the educational experience. An active strategy to transform the educational experience, with employers standing guarantee for innovation, is one way to encourage innovation at every step of the educational experience - and to create a value chain for talent.

Sunday, December 18, 2016

I started 2016 with a surprising discovery: that Bill Gates reads a book a week! I love reading books - yes, the old-fashioned paper books - and spend most of my time and money on them. And yet I struggle to read as much as I would like to, as life intervenes. The work, the chores, the celebrations and the worries, moments social and solitary, all present their different challenges between me and an undivided and unwavering commitment to my books. And yet here is a man who earns about $150 every second - if that's one benchmark of how valuable his time must be - and who, as Michael Sandel explained, might find stopping to pick up a $50 bill he spots lying on the pavement a waste of his time, claiming that he accords the highest priority to reading, and even sets aside time for it when life gets too busy!

I know the usual explanation: we are not Bill Gates. Yes, when you earn $150 a second, you do not drive your own car. You can choose whom you socialise with. And, indeed, playing Chess with Warren Buffett is a different life altogether from drinking with friends. But never have such explanations - excuses, shall we say - deterred me from trying. One way of looking at it is that Bill Gates can read more because he has achieved everything one could dream of in life; the other is to think that I should read more because I have not yet. The latter argument is the one for me, as it is the one which argues in favour of doing something, rather than making the case for changing nothing.

So, I did go into 2016 with a modest commitment to read 52 books from cover to cover. And, as the year draws to a close, I have failed - by a margin! I completed only 27, cover to cover, and will perhaps do a couple more in the Holiday season. So, about half of what I wanted to do, an insignificant number compared to the total of 302 books I bought during the year, in addition to a total of 223 books borrowed from the five libraries that I regularly visit. In fact, the borrowing figure itself shows my lack of progress in the reading enterprise - I am entitled to borrow 67 books at one time, and the lowly aggregate actually suggests that I rolled a lot of them over because I could not finish them! Indeed, there are 148 books which I abandoned after working through them almost to the end, and another 102 which I abandoned less than half-way through, in addition to 6 that I could claim to be currently reading. And I must also add to this about 37 that I used only for specific purposes, reading a chapter or so, because they were needed for my studies in the History of Ideas (it is fair to count them, as I counted the library borrowings on the other side of the equation). So, I had a total of 525 new books, bought or borrowed, this year, and I managed to use 212 as intended (with perhaps 6 more to come), abandoning 102 altogether. This is indeed not the full picture, as not all the books I read are ones I bought during 2016, but I can live with this data for the time being.
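The tallies above can be checked with simple arithmetic; all the figures below are taken from the diary itself:

```python
# Totals from the 2016 book diary quoted above.
bought, borrowed = 302, 223
finished = 27                # read cover to cover
abandoned_near_end = 148     # abandoned almost at the end
read_for_chapters = 37       # used for a chapter or so, for study
abandoned_early = 102        # abandoned before half-way
currently_reading = 6

total_new_books = bought + borrowed
used_as_intended = finished + abandoned_near_end + read_for_chapters

print(total_new_books)    # 525
print(used_as_intended)   # 212
```

The 212 'used as intended' counts the near-the-end abandonments and the chapter-only readings alongside the 27 finished books, which is the accounting the diary itself adopts.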

Looking back at this data now, my big problem in book-reading is definitely fragmentation. This shows up even in the small number that I managed to finish. There are several threads I followed during the year. I followed up on my general interest in the American War of Independence and its Founding Fathers (I read the biographies of Franklin and Adams last year, as well as a number of books on the years between 1776 and 1789) and completed an interesting biography of Jefferson and a fascinating book on the American Constitution, Unruly Americans, which basically argued that the Constitution itself was conceived as an instrument to preserve the credit status of the young republic, rather than for any of the lofty purposes that we ascribe to it. Despite reading these, there are several I could not pursue yet: I started reading a biography of Washington and abandoned it half-way, and a couple of books by Gordon Wood and Joseph Ellis also remained untouched. Towards the middle of the year, I was interested in Hamilton, no doubt prompted by the success of the musical, but I had drifted so far away in my interests by then that I did not buy Ron Chernow's book, which remains on my Wish List.

The other theme that I started the year with was Money and Banking. One of the books I read early in the year was a story of the Central Bankers, Lords of Finance, which, despite its bulk and the vast scope of its story, was fascinating to me. I started but did not complete several other books on the subject during the year, Philipsen's The Little Big Number, on GDP, and Angus Deaton's The Great Escape being the most notable. This is one thing that I am perhaps going back to in the remaining weeks of this year, as books such as these definitely interest me. I have on my list, to follow up on in 2017, the fascinating The End of Alchemy by Lord Mervyn King (of which I have two copies: one bought in India at a cheap price and another, bought on impulse at an event where Lord King was speaking, so that I could get it signed by him), and Robert Gordon's The Rise and Fall of American Growth.

When I started 2016, I wanted to do a serious writing project. I chose, at that point, to explore Cities of Ideas - the cities that flourished, at points in history, as places of creative ideas and enterprises. I read the journalistic The Geography of Genius early in the year, not least because it talked about the mini-enlightenment of Kolkata, and it established in my mind a link between the Scottish Enlightenment and Kolkata. I also read a bit about Vienna, completing A Nervous Splendour and working half-way through Thunder at Twilight, and about Paris, particularly the fantastic At the Existentialist Café, before being waylaid into reading about Camus, in the brilliant A Life Worth Living. I also started reading Joseph Roth and finished his The Radetzky March, but not, as I had started doing, Robert Musil's The Man Without Qualities, or Roth's The String of Pearls, which I borrowed from the libraries.

If a reason is to be given for my abandonment of the Creative Cities project - though I am sure I will eventually come back to it in 2017 - it is that I developed a new interest: in the phenomenon of the Enlightenment itself. So I did read Anthony Pagden's The Enlightenment and Why It Still Matters cover to cover, perhaps twice if I count the number of times I had to stop and re-read, and Frank Turner's European Intellectual History from Rousseau to Nietzsche. This, combined with a number of shorter studies - on Heidegger, for example - dominated the summer months for me: I was forever talking about the technological attitude and the attitude of care, and was trying to understand everything through that lens. This also led to my enrolment, finally and courageously, at Birkbeck College to study the History of Ideas, though, as I must admit, it turned out to be a different thing from the 'Intellectual History' that I was looking to study.

Much of my later reading this year was strategic, as someone coming to studying history formally late in life must do, to catch up on the basics: this accounts for my reading the history of the French Annales school, and the gripping narrative presented by Henry Ashby Turner, in a very House of Cards style, of Hitler's ascension to power in January 1933. Then there is this whole enterprise of studying Nationalism - Benedict Anderson's Imagined Communities and its critics, and the history of the Italian Risorgimento - which dominated my book-buying, library borrowing and reading in the last three months, without a corresponding impact on my finished-books list, at least not yet.

So, on to '17: it is funny that I must come back to where I started, Gates' commitment to read a book a week! This is perhaps my character - I take the motto "Fail, Fail Again, Fail Better" too seriously - and I am going to try again. Indeed, as I intend to continue my studies through 2017, this should become easier, more so as I move to doing research projects defined by me rather than trying to complete a curriculum of some kind, as I had to do in the Autumn term this year. I know what's coming up early in 2017 - a few weeks of intensive work on Nationalism and Nation States, and then a few more weeks studying Darwin - but I intend to return to my Creative Cities project as an aside when I get to breathe again. I do expect to return to travelling by the summer months - I have not travelled for the last 12 months, and while that may have helped me recover my sleep and health, it reduced my reading time - and hopefully this will allow me to pursue my own reading agenda again.

Friday, December 16, 2016

Everyone, it seems, loves an Org Chart. The little boxes of power, those straight lines of responsibility, that one-page definition of the hustle of start-up life - neat, tangible and reassuring! Org Charts are loved by those who make them, as they see themselves securely placed in one box or another, and by those who demand them - investors, accreditation agencies and bankers - so that they know how to give credit and how to apportion blame! When they are given out publicly, as is usual in countries that thrive on hierarchy, customers treasure them for writing complaints to the big man at the top, and salesmen treasure them to cut to the chase.

But it is also one of those old-fashioned things that everybody loves to hate. Particularly in start-ups, where the rough and tumble of daily life often does not follow neat structures and fixed boundaries, a secure spot towards the top is as desirable as a lovely cabin on the upper decks of the Titanic. In a world where rolling up one's sleeves and getting things done - rather than hiding behind one's paygrade - is the ultimate virtue, Org Charts are often a liability.

So, when a new entrepreneur asked, my suggestion was to go without an org chart and designations. When he insisted - his is a small education business going for accreditation and the inspection agency demanded to see one - my suggestion was to call everyone a 'Partner', defining people by their responsibilities rather than their 'designations'. To this bright idea, he came up with two significant objections almost immediately.

First, a practical one: designations attract people. It is much easier to hire talent if a good designation can be offered. In the same vein, with undifferentiated designations, it is harder to hire senior people.

Second, it is easier to communicate a person's decision-making remit with a designation. This makes things easy and transparent for outsiders - partners, banks, investors and accreditors - to understand the organisation. Everyone being a Partner is problematic for everyone outside.

These are common-sense objections, but I have two corresponding questions about the business of Org Charts.

1. Do designations attract the right kind of people, who will be successful in a start-up environment?

2. Do designations communicate the remit of decision-making within a start-up, and can outsiders reliably make out what a person does from the short-hand of a designation?

The answer to the first question is negative. People who value designations crave a kind of recognition, institutional rather than meritocratic, which is hard to achieve in a start-up. Besides, this view is based on a fundamental misconception about the nature of start-ups - that a start-up is a small version of a big company - which is not at all true. Start-ups are a different organism, with different challenges and growth paths, and running one with a formula that may work for large companies - that designation attracts talent - may be a recipe for disaster.

The answer to the second question is also negative. We already knew that different industries had different designation systems - a General Manager in a Manufacturing Organisation would usually command more budget and money than a Vice President of an Advertising Firm, who would, in turn, be equivalent to a Managing Director in a Bank - but this all gets mixed up in the start-up world. This is because most start-ups borrow their designation systems indiscriminately, most commonly from the heritage of the Founder but often also from their customer companies. So it is hard to tell, more often than not, whether a Director is senior or junior to a VP in a start-up.

In fact, an elaborate designation system may say something else about a start-up. If you are the customer and you sit in a meeting attended by more than one Managing Director, you may be forgiven for thinking this is a classic case of all Chiefs and no Indians. The designation system itself changes the organisation, more often than not: it shifts the focus away from execution and responsibility to meetings and plans, as this is what people with senior designations are expected to do most of the time.

So, in the end, my advice to my correspondent was to define the organisation by responsibility, not designation. And, indeed, do so not to follow a fashion, but with a deliberate strategy to focus on execution.

This means, potentially, giving everyone a say on everything, but that is a good thing, rather than a bad thing, for a start-up. In a start-up, outside intelligence and perspectives are often limited, and more eyes find more bugs, as they say in software.

But, with defined responsibilities, there would be one person responsible for execution in each case, and as long as that person gets primacy, as s/he is accountable, the business will work out fine (capability, in this formulation, is assumed, and getting rid of designations means that the organisation needs to be more meritocratic, not less).

And one final thing about customers and partners, the two principal outsiders every start-up needs: the smart ones do not look at Org Charts, at least not anymore. They understand the organisation, and feel the way it works. An elaborate system of designations makes them take fewer risks and make worse offers, as they sense that an overly bureaucratic start-up may not last, and is therefore not worth sharing risks with.

Thursday, December 15, 2016

Massimo d'Azeglio is usually credited with coining the expression "We have made Italy, now we must make Italians" (though scholars have now indicated that he never wrote this, and that the expression originated only later in its current form). Whoever said it, it represents what we may call the 'problem of Italy' - a new nation state without a corresponding sense of citizenship and belonging. Indeed, most of Italy's modern history is marked by disunity: between North and South, between the Left and the Right, between industry and the peasantry, and so on. The existence, and implausibility, of the Nation State in Italy - something that the expression 'making Italians' indicates - has been the basis of much discussion, not just in academia but in politics: it is no surprise that 'making Italians' was appropriated and popularised by the Fascists, who took the project upon themselves.

In context, one has to note that d'Azeglio - the poet, hero and statesman of Risorgimento Italy - did write something in his autobiography (which would eventually morph into the current form of the expression) about making Italians of character. That statement did, by definition, accept the existence of Italians and Italy, unlike the later 'making Italians' expression. But the later expression, even if vindicated only with retrospective justification, also shows what we may call the 'problem of nationalism': the need to reconcile the political idea and state-making with the ideal of community and individual identities. There is only a fine line between the idea of 'making Italians' and fascism, between the passion for education and the terror of a 'national community' that the Nazis would speak of later. In the case of Italy, this 'making' project would soon become a quest for a 'greater Italy', and what may have started as an enterprise to include would, in no time, turn into a quest to exclude. No wonder that scholars would trace the origin of this quote - of making Italians - not to 1860s Risorgimento Italy, but to the 1890s and the Italian defeats in Ethiopia.

And that is perhaps appropriate, as Ernest Renan pointed out that a 'nation' is often built not upon the celebration of victory, but upon the solidarity of common suffering. In good times, it is usually a scramble for profit and perks; only suffering, defeat and disasters of various kinds can bring people together, if only in the memory of happier days and past glories. We have some historical corroboration: the path to the United States was forged through the calamities of the Civil War, for example. Against this was pitted the curious optimism of post-colonial nationhood, woven around the assertion of liberty and self-determination - the Wilsonian ideal.

This may, at this point, bring us to the 'problem of India'. Here, a nation was made on the presumption of its ancient heritage and origin, but with a cultural unity challenged, in equal measure, by linguistic fragmentation and political disunity. The origin of the new nation was marked by suffering - the immense, genocidal tragedy of Partition - but this suffering divided, rather than united, the people. The state that emerged was closely modelled on the Imperial one, keeping its symbols and ceremonies, its Civil Service and its customs, juxtaposing a distant, elitist model of administration with the political ideals of the Republic. In this new formation, 'making' Indians was not an urgent task: they were already made, it was assumed, and independence was the fulfilment of a 'tryst with destiny', an end in itself. A powerful modern state emerged, with Republican rhetoric and Democratic institutions, the struggles for self-governance culminating merely in an Indian Raj. The obsession, as it emerged, was with the place the new nation might occupy in the world of nations, rather than with its own nation-making.

Indeed, India was an idea, as some apologists of this lofty vision would say: an idea meant to define the ideal and illuminate the path, with gradual prosperity and education bringing the excluded and the marginalised into the mainstream. The Indian state, solid, enlightened and powerful, was to guide the lives of its citizens, deliberately and comprehensively, overseeing them from the commanding heights of the economy and through a moral presence in every aspect of society. It was a kind of paternalistic nationhood, built around prosperity and progress.

We live at a time when this vision is challenged. With the ever-expanding powers of the state hijacked by a lumpen-elite, the vision of progress degenerated into a morass of corruption and citizenship morphed into a confusion of everyone for himself, the Indian nation now faces the existential questions that were postponed. Time for a Cultural Revolution, some would believe, and accordingly, one was indeed made: an overnight cancellation of 86% of the Indian currency in circulation, in a country where more than 90% of transactions are in cash. It was a perfect storm of economic upheaval, political action and cultural shift, brought about by deliberate and authoritarian action, with complete disregard for the protocols of Cabinet Government and Parliamentary procedure. It meant the suspension of basic economic rights - the ability to withdraw one's own money from a bank account - for the vision of an organised, digitised economy. It was not only Mao who was capable of a utopia!

This, then, should make Indians. A collective sacrifice - going about with little to spend, queueing up at the banks for hours, with the greater good in mind - is one way to build a nation. It is only fitting that the goals of this great enterprise shifted, quite swiftly, from controlling the black economy to curbing terrorism to creating a digital economy - and, in most cases, to all of the above, or whatever is most attractive at a given point. But, as it appears now, this is one of those calamities which divides, rather than makes, a nation. The demonetisation meant an organised transfer of wealth from the poor, who used the cash, to the wealthy, who stand to gain most from the increased liquidity of the banks and lower interest rates on borrowing. Years hence, people may well look at this demonetisation exercise and use a Churchillian expression in reverse: never has so much been stolen from so many by so few!

Monday, December 12, 2016

I am something of a veteran of being on the losing side of elections. And, with interests in politics globally, I am on the losing side more often than most. I had no business taking sides in the US or Filipino Presidential elections, or the referendum in Italy, but I did want an outcome and ended up on the losing side. Closer to home, I voted Remain and was stunned by Brexit, and was more disappointed than surprised by the Indian choice of Prime Minister in 2014. It is not a good time for people with 'Liberal' sympathies, and I am sure to be in for some more disappointments in 2017, including, by the look of it, some major ones in France and Germany.

However, I am writing this not to moan about my plight, but to reflect on what one does when elections produce unpalatable results. I did indeed express my disappointment and question the merit of Direct Democracy on the morning after Brexit, a genuine feeling that I came to regret with time. In fact, now that disappointment with election results is increasingly the norm rather than the exception (the only recent exception being Austria's rejection of Norbert Hofer), I think the time has come to think seriously about the 'Trump Syndrome': that feeling of resentment at the outcome of a democratic election which, taken any further, would undermine the democratic system as a whole.

I would have called this the Brexit Syndrome, as the feelings and their consequences were very similar. However, in the aftermath of Trump's election, this has become serious business, with the White House dropping hints of Russian involvement, electoral college members filing lawsuits, the Green Party pushing for recounts, and even suggestions that Trump could be impeached upon assuming office (which has a near-zero chance of passing a Republican House and Senate). So it is befitting that we call it the 'Trump Syndrome', as the question is crystallising around Trump and the Liberal panic.

One should see that, valiant as they may be, all these post hoc attempts to undermine Trump's election undermine the democratic process itself. I am no admirer of Mr Trump, and I am as panicked in anticipation of his presidency as anyone else. However, I find the issues being raised around the validity of his election rather counter-productive. The popular vote matters, but there have been Presidents who lost the popular vote before. The claim of Russian hacking comes without proof that it affected the outcome in any way, and it points, in fact, to a big failure of the outgoing administration, which failed to prevent it (would Obama want to be remembered as the President who let the Russians control a US Presidential election?). The argument that Trump's ownership of foreign assets violates the Emoluments Clause is quite far-fetched, as Trump did not acquire those properties using the Presidential office; and treating legitimate business purchases (such as renting a premises or a hotel stay) as bribery or favours would open the floodgates to other conflict-of-interest claims, which could affect anyone, including the Congressmen and Senators voting on the matter, who may own shares in corporations trading globally. Finally, the silliest of all, the complaint that the FBI's late reopening of the enquiry into Hillary Clinton's email server allowed Trump to win is to ask the world to stand still so that Clinton could win the election - and a strange way to overlook her naivety (or complicity in something sinister) in maintaining those servers in the first place. Each of these complaints has effectively told the world not just about Trump's flawed election, but about a broken system of democracy - and has therefore had the opposite effect to that which its backers intended.

So, what is the correct response when democratic elections produce unpalatable results? Undermining democracy, or questioning the process itself, is hardly the best way to go about it. Yes, in a way, this has been America's way of conducting democracy elsewhere - countries could be democratic as long as they produced friendly governments (Egypt being a recent example) - but this is hardly the way to maintain a stable polity. The other way is to keep an accountable system in place, wherein the winning party is under checks and balances and the losing party has to go through self-questioning and correction. The British system is not perfect, but it is quite close to what I am describing. And this was my realisation after my frustrated comment about democracy on the morning after the Brexit vote: that there was a lot of turmoil, and a very bad Cabinet came of it, but the system is self-critiquing and potentially self-correcting.

Which is not what could be said of the Democratic Party in America, or of the Indian National Congress in India. There has been very little discussion of how the Democrats lost the election primarily because they bent over backwards to ingratiate themselves with the Clintons. And, indeed, in India, the Gandhis remain beyond questioning, even though they preside over election debacle after election debacle.

Seen from this perspective, it may be that the unpalatable results are not that bad after all. A madman may win, but only if his opponents really falter. The point I need to take home is that the Liberal position itself is unsustainable, and that the likes of the Clintons and the Gandhis have nothing to offer the people. In fact, the Trumps and Modis of the world are as much a creation of the bankruptcy of Liberal politics as of anything else, and the people, after all, have chosen wisely.

Sunday, December 11, 2016

Whenever I speak about Universities As Networks, the idea smacks of being the 'cool new thing': I am immediately met with the claim of tradition - that universities have existed in their current form for 'hundreds of years' - with the implication that this institutional form is resilient and not going to change anytime soon.

The point is, of course, that the critical thinking which universities claim to instil in their learners is not expected to be applied to the institutions themselves. This claim of faux-tradition - that universities have been around in some unchangeable form for hundreds of years while everything around them changed - often goes unquestioned. So a little scrutiny of the origins and traditions of the universities is quite useful for our conversation.

And humbling, too: because if anyone seriously thought that universities as networks is a cool new concept invented for the Internet age, a quick tour of the medieval universities would quickly disabuse them of the presumption. That is indeed what universities were as they emerged in the Middle Ages: very much a network!

The most startling thing about those first European universities (universities existed in the Arab world, India and China before this) - in Bologna, Paris, and later in other places - whose traditions we so fondly boast about, is that they owned no buildings. And, because they did not really have a campus, they were often mobile: in fact, the University of Paris once decamped from the city for several years over a quarrel with the town authorities! These institutions rented rooms - in churches, monasteries and even in brothels - and ran day-long sessions of lectures followed by disputations (the modern-day seminar).

It is also notable that the term 'university' did not arise from a reference to the Universe or to Universality (or their Latin equivalents), but from 'universitas', a union of students and masters formed primarily for collective bargaining purposes. I twist this original meaning a little if I say it was a Network, but only a little - because that was its purpose: to be at a university did not mean being on a campus, but belonging to a network, one which negotiated collectively and enjoyed a privileged status.

The history of universities is a specialist area of interest, and does not receive much popular attention. However, as I read Charles Homer Haskins and Hastings Rashdall, I am fascinated by these ancient institutions - their forms, their legacies, and how similar the debates were through the ages - and I see that the universities were indeed formed as a network. They were not all similar - there was no central authority and no 'Bologna process' - and what happened in Paris (an institution run by its Masters) was not like what happened in Bologna (an institution run by its students). And I am making no claim that the universities we build now should go back in time and replicate these institutions, though it seems that Bologna did have a student-centred university, stealing the thunder from today's For-Profit institutions, with some pretty disastrous implications for those who taught there. And, finally, one must look beyond the European tradition when speaking about universities in Asia; we are no longer in the Nineteenth century, when everything had to be seen through European eyes. Yet the lessons from the first universities are useful, not least because they are the starting point of the long tradition that the advocates of the status quo usually invoke: universities have always been products of their circumstances, not something that stood outside their time and context.

Monday, December 05, 2016

If we follow the talk, the future of jobs is bleak. Software is eating the world, as Marc Andreessen loves to claim, and he means it literally. Economists are now studying, in all seriousness, the probability that jobs such as Taxi Driver and Restaurant Waiter will be automated. Technology seems to be reaching a tipping point that will transform the workplace.

In this context, educators are right to be worried about the future of their students. There are things being said to mollify the duly anxious: that while the emergent technologies may destroy some jobs (by one estimate, 47% of the occupation groups employing three-quarters of the current workforce), they will create new jobs we have not yet seen. While this is most certainly true, and new professions will arise, such statements hide the fact that new-economy jobs are just too few to offset the loss of old ones: Kodak, with its 130,000 employees, giving way to Instagram, with a dozen employees at the time, is illustrative of how these new jobs may play out.

However, it is important to remember that the current conversations about technologies and jobs are shaped by historical experience, mainly that of the Industrial Revolution. The underlying assumption of the technology evangelists is that it will all work out fine, just as it did then; those who fear the job-destroying technologies also follow the arguments used then, such as the collapse of aggregate demand, and believe that what we see now is merely a more advanced, perhaps final, stage of that play.

There is an important difference, though, between then and now. The technologies of the industrial revolution were designed to augment human productivity, and once education fully caught up with them - most West European nations achieved near-total literacy by the end of the Nineteenth century - the new technologies created jobs and spread prosperity. No one was talking about 'machine learning' then, primarily because there were ample profits to be found in expanding productive capacity. This is different from the conversation we are having now.

The other important difference is the 'learning curve', which I discussed in my earlier post about which jobs may matter in the next three to five years (see Which Jobs Matter?). The argument is that technologies are not autonomous; they need learning - by human operators, or in the machine-learning form - to be productive. This takes time, and often a deployment of technology has to pass through a number of intermediate stages before it affects practice. The current conversation about an 'automation apocalypse' may therefore be premature.

There is a reason why this is so. Current technology development is often pursued independent of use, which is different from how technologies emerged in the industrial revolution. This phenomenon is driven by Venture Capital, which is a new thing: while we may cite examples of 'merchant adventurer' associations from pre-modern times, venture capital in its current form, with its unique motivations, is a completely new engine of technology development.

This is usually seen as a good thing, as new products and services can emerge without having to battle vested interests and conservative mindsets, as in the past. However, the 'dent-in-the-universe' ambitions of venture capital also have a serious downside: technology creation and deployment become autonomous and disconnected from economic realities, causing bubbles to appear from time to time, leading to the destruction of value and to social disruption. The venture capital industry (and I use the term in a broad sense, including family offices, sovereign funds, private equity, hedge funds and so on - an industry tasked with recycling the surplus generated by the world economy) controls the public sphere - not just the media, but also democratic politics and interest groups, and the intellectual ambitions and funding of the universities - and therefore creates its own universe of ideas and conversations.

This, in itself, insulates the development of technology from 'feedback', except at the hard edges of the business cycle. The existence of a huge private surplus may free the development of technology from the trials and tribulations of daily life, leading to the celebration of an unprecedented acceleration of history, but the underlying logic of technology eventually strikes back, as it should. This is because, even if we do not notice it, the learning curve is still very much in action, and while rhetoric may outshine reality on a given day, reality hits home eventually and inevitably.

What does all this mean for our practical concern about students and jobs? It means that following the technology talk is often pointless. The starting point for thinking about student employability is not the new and exciting technologies, but the sectors in transition. There is no denying that we have made significant progress in technologies, but not so much in altering our practices and ways of doing things. It does indeed seem that the world economy is entering a period of lower rates of profit, slower growth, tighter financial conditions and disruptive politics, which is likely to set off another negative feedback cycle in technology utopianism. What will remain, however, is the logic of technology: the steady progress of learning and the slow integration of tools into everyday practices. The jobs that facilitate those - an Accountant who is adept in technology, a Supply Chain professional, a Multimedia News Reporter - will be the ones to look out for.

Friday, December 02, 2016

There are things we know: that as technologies change rapidly, there is a hollowing out of Middle Class jobs. Some jobs, like the Telephone Operator, have become extinct; some others, like Secretaries and Receptionists, have become less ubiquitous; and yet others, like the Book-keeper, are being driven into obsolescence. Just as automation of an earlier kind marginalised the factory worker (Charlie Chaplin in Modern Times, remember), automation is now coming for middle class lives and suburban lifestyles. Even the jobs created by technologies - the Call Centre worker and others - now face competition from newer generations of technologies, such as Voice Recognition. And the indication is that this will intensify further, and transform domains hitherto deemed safe: jobs such as Accountants, Taxi Drivers, Legal Clerks and even Waiters and Cooks. The economies that benefited greatly from globalisation's last wave - India comes to mind - will be greatly disrupted by its latest turn.

Then, there are things we don't know. The big question, of course, is what happens to all the people. Are we looking at a world full of the unemployed, a new global underclass? The answer that the apologists of technological progress give is also based on a big do-not-know: technological progress creates its own jobs, as it has for the last several decades. We do not yet know what these jobs will be - who would have known about a Search Engine Specialist even a few years ago - but we know there will be such jobs. And, whether we are optimists, fatalists (we have always found a way, haven't we?) or doomsayers, we still do not know how the politics of the labour market will play out: will we continue to tax the income of labour while treating dividends, the income of capital, more leniently, even when such capital is deployed to replace labour rather than create jobs (which is the rationale for the tax incentives)?

However, this post is not about the technologies and politics of the job market, but about how an educator (or an institution) should approach this issue. What jobs will really matter? With so many what-ifs and no clear answers, it is tempting not to attempt an answer at all. But an answer is needed and demanded, by none other than the students, who are increasingly conscious of the high costs of education and seek assurance of some kind. While educators often say that the only certainty is uncertainty, they are essentially hiding the other side of that certainty - student debt! And, while some courageous educators make the case for an education for character, claiming that character can help people live through changing circumstances, this fails to sufficiently answer why someone should go to a university and incur costs to build character, because the world is not short of trying circumstances itself.

My point is that the question - what jobs will matter in the future - is an important one for educators to attempt to answer. While there are uncertainties about what precisely those job titles will be, an educational enterprise must start with an understanding of the future labour market. This is because education is a future-oriented activity, and someone needs to attempt to unscramble the convergent forces of politics, technology and ideas to create 'models' of what the job market will look like.

So, to attempt it again: there are things we know and things we do not. For example, we do not know precisely which technologies will emerge, or how soon or how late they will affect jobs and careers. However, the educator's project does not have to look many years into the future. In fact, one thing we do know is that we should shorten our horizon and work with a few years at a time, three to five years at most. And, with a three-to-five-year horizon, we can know enough about the technologies in development and, from past experience, about the process of technology diffusion. With this, we can indeed build a workable model for the prediction.

With this in mind, we should know better than to say that technologies will create new jobs we do not know about. And yet such a stance is popular, popping up in PowerPoint on the conference circuit all the time. The reason is a kind of technology fetishism that defines our popular discourse: talking about technological progress in this mystical way is actually glamorous. The hard truth is that cutting-edge technology, even with all those data visionaries and nanotech biggies, will have very little impact on jobs, in the immediate term or even the long term. They may be excellent jobs, but they are not the ones we educate for. In the end, those jobs do not really matter.

The jobs that will matter in the next three to five years are those focused not on the creation of technology, but on the diffusion of existing or emerging technologies. There are vast sectors of our lives that will embrace more and more technology, and use it better and more efficiently. We could have predicted the coming of the Search Engine Specialist jobs five, even ten, years before they became a reality: we could have looked at our Yahoo! or Alta Vista and spoken about them transforming marketing in 1998 (and we actually did). The jobs that make tech an everyday affair are unglamorous - does anyone want to be the BT technician who is usually only seen hunched over a cable box? - but these are the jobs that matter.

The answer, therefore, is that we know which jobs will matter, but we do not want to say so. This is because, even in all its uncertain glory, the attraction of the unknowable technology-creation job makes better advertising copy than the very predictable jobs of technology diffusion. But just as the beautiful model in the advertisement makes the average housewife's life a little more miserable (even if she, in the process, buys more soap), the mystery of technology nirvana actually makes the poor student's frustrations worse. An educator, therefore, is duty-bound to demystify the Labour Market and talk about which jobs really matter.

How To Live

"Far better it is to dare mighty things, to win glorious triumphs even though checkered by failure, than to rank with those poor spirits who neither enjoy nor suffer much because they live in the grey twilight that knows neither victory nor defeat."

- Theodore Roosevelt

Last Words

We shall not cease from exploration
And the end of all our exploring
Will be to arrive where we started
And know the place for the first time.