The data that we misinterpret, because there’s too much noise and not enough signal.

The data that we misattribute, because we mistake correlation for causality.

The data that we misuse, because we want them to support an agenda based on falsehoods.

Without Data Literacy, we end up in one of the following scenarios with regard to Data:

we don’t collect it;

we ignore it;

we look at it, but don’t apply it;

we apply it incorrectly;

we extract the wrong meaning from it;

or we twist it to support our (wrong) ideas.

Data Literacy can help us solve those problems, but it’s only one part of the puzzle. Anyone can throw a few numbers together to make a quick statistic, or compile tons of them into massive spreadsheets, but without any real meaning to be extracted we’re left with numerical gibberish, or “data salad,” if you will. This is where contextualization, narration, and design/visualization come into play, each described here in terms of how it can enable Data Literacy.

[--]

Employing methodologies and frameworks from the social sciences and humanities can get at key questions like:

Who created the data, for what reason, under what conditions, for which purpose? What are the barriers, entry points, and backgrounds that impact their ‘data exhaust’?

Who is gathering, analyzing, interpreting, explaining, and visualizing the data — what are their goals, seen and unseen biases, and personal backgrounds they bring to bear on these exercises?

Who is the ultimate audience or audiences? What framing do you have to employ to best communicate the findings — and what happens if they don’t understand or agree?

What impact do things like the current zeitgeist, their geopolitical position in the world, or previously held beliefs play in the audience’s willingness to engage? Ability to understand?

[---]

Therefore, if we are to move from:

Big Data / Even Bigger Data to More Meaningful Data

Data Science to Data Literacy

… we must employ the art, craft, and science of narration. The danger in not doing this leaves the understanding, application, and adoption of data and Data to those who are skilled in the art of collecting, storing, and parsing it. In my experience in and around the field over the last decade, at best you will get an academically-presented assessment and at worst, you will get an obtuse, ‘inside baseball’ report.

If I really hoped to make major progress in AI, the best place to do this wouldn’t be another AI lab. If I really wanted to build a better thinker, I should go study philosophy. […] I quit my technology job to get a Ph.D. in philosophy. And that was one of the best decisions I ever made.

Friday, February 27, 2015

My skin began to change for the better. It actually became softer and smoother, rather than dry and flaky, as though a sauna’s worth of humidity had penetrated my winter-hardened shell. And my complexion, prone to hormone-related breakouts, was clear. For the first time ever, my pores seemed to shrink. As I took my morning “shower” — a three-minute rinse in a bathroom devoid of hygiene products — I remembered all the antibiotics I took as a teenager to quell my acne. How funny it would be if adding bacteria were the answer all along.

The findings demonstrated only an association, not cause and effect, so it was not clear whether these behaviors directly led to fewer allergies. But it may be the case that these behaviors expose children to innocuous bacteria, which can help strengthen their immune systems, said Bill Hesselmar, an assistant professor at the University of Gothenburg and lead author of the study.

Finally, Rob Knight's brilliant TED talk:

And we've just over the last few years found out that the microbes in different parts of the body are amazingly different from one another. So if I look at just one person's microbes in the mouth and in the gut, it turns out that the difference between those two microbial communities is enormous. It's bigger than the difference between the microbes in this reef and the microbes in this prairie. So this is incredible when you think about it. What it means is that a few feet of difference in the human body makes more of a difference to your microbial ecology than hundreds of miles on Earth.

And this is not to say that two people look basically the same in the same body habitat, either. So you probably heard that we're pretty much all the same in terms of our human DNA. You're 99.99 percent identical in terms of your human DNA to the person sitting next to you. But that's not true of your gut microbes: you might only share 10 percent similarity with the person sitting next to you in terms of your gut microbes. So that's as different as the bacteria on this prairie and the bacteria in this forest.

So these different microbes have all these different kinds of functions that I told you about, everything from digesting food to involvement in different kinds of diseases, metabolizing drugs, and so forth. So how do they do all this stuff? Well, in part it's because although there's just three pounds of those microbes in our gut, they really outnumber us. And so how much do they outnumber us? Well, it depends on what you think of as our bodies. Is it our cells? Well, each of us consists of about 10 trillion human cells, but we harbor as many as 100 trillion microbial cells. So they outnumber us 10 to one. Now, you might think, well, we're human because of our DNA, but it turns out that each of us has about 20,000 human genes, depending on what you count exactly, but as many as two million to 20 million microbial genes. So whichever way we look at it, we're vastly outnumbered by our microbial symbionts. And it turns out that in addition to traces of our human DNA, we also leave traces of our microbial DNA on everything we touch. We showed in a study a few years ago that you can actually match the palm of someone's hand up to the computer mouse that they use routinely with up to 95 percent accuracy.

Seek out that particular mental attribute which makes you feel most deeply and vitally alive, along with which comes the inner voice which says, 'This is the real me,' and when you have found that attitude, follow it.

Thursday, February 26, 2015

Everything in physiology follows the rule that too much can be as bad as too little. There are optimal points of allostatic balance. For example, while a moderate amount of exercise generally increases bone mass, thirty-year-old athletes who run 40 to 50 miles a week can wind up with decalcified bones, decreased bone mass, increased risk of stress fractures and scoliosis (sideways curvature of the spine)—their skeletons look like those of seventy-year-olds. To put exercise in perspective, imagine this: sit with a group of hunter-gatherers from the African grasslands and explain to them that in our world we have so much food and so much free time that some of us run 26 miles in a day, simply for the sheer pleasure of it. They are likely to say, “Are you crazy? That’s stressful.” Throughout hominid history, if you’re running 26 miles in a day, you’re either very intent on eating someone or someone’s very intent on eating you.

Wednesday, February 25, 2015

If our well-being depends upon the interaction between events in our brains and events in the world, and there are better and worse ways to secure it, then some cultures will tend to produce lives that are more worth living than others; some political persuasions will be more enlightened than others; and some world views will be mistaken in ways that cause needless human misery.

Tuesday, February 24, 2015

Tomorrow morning starts the grueling Orthodox lent. No animal product for 40 days. Note that both Ancient Greeks and Levantine Semites never ate meat without some kind of sacrifice to the God(s), something that persists in Kosher-Halal rituals. Meat was limited to festivals ("carnival").

I initially thought that the intermittent protein deprivation followed by overcompensation was meant to draw benefits from Jensen's inequality/antifragility (whether the process is kidney-rest, anti-inflammatory or autophagy for cancer control/hormonal as held by Valter Longo, it doesn't matter because we know the statistical structure of natural life gave hunters intermittent meat and steady vegetables and we are not supposed to have steady red meat).

But it can't be just that. It just hit me that I missed a central point. This relief was also to help ... THE ANIMALS, the ecology. Animals too need a break from milk/egg production, etc. And because of nonlinearity their population may need some kind of natural surge (hint: look at Lotka-Volterra predator-prey models).
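As a side note on that hint: a minimal sketch of the classic Lotka-Volterra predator-prey model, which shows the kind of nonlinear population surges alluded to above. All parameter values here are illustrative assumptions, not taken from the text.

```python
# Lotka-Volterra predator-prey model, integrated with simple Euler steps.
# Parameters (alpha: prey growth, beta: predation, delta: predator gain,
# gamma: predator death) are arbitrary illustrative values.
def lotka_volterra(prey0, pred0, alpha=1.1, beta=0.4, delta=0.1, gamma=0.4,
                   dt=0.01, steps=5000):
    """Integrate dx/dt = alpha*x - beta*x*y and dy/dt = delta*x*y - gamma*y."""
    prey, pred = [prey0], [pred0]
    for _ in range(steps):
        x, y = prey[-1], pred[-1]
        prey.append(x + dt * (alpha * x - beta * x * y))
        pred.append(y + dt * (delta * x * y - gamma * y))
    return prey, pred

prey, pred = lotka_volterra(10.0, 5.0)
# Populations oscillate rather than settle: a surge in prey is followed
# by a surge in predators, then a crash in both.
```

The point of the model for the argument above is the nonlinearity: neither population tracks the other smoothly; instead the system cycles, so periodic relief (a "break") changes the dynamics, not just the averages.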

Monday, February 23, 2015

Man surprised me most about humanity. Because he sacrifices his health in order to make money. Then he sacrifices money to recuperate his health. And then he is so anxious about the future that he does not enjoy the present; the result being that he does not live in the present or the future; he lives as if he is never going to die, and then dies having never really lived.

Sunday, February 22, 2015

Einstein’s most famous contribution to science, the general theory of relativity, was published in 1915. He won the Nobel Prize in 1921. Yet, rather than assume he was a finished product, Einstein continued to work and contribute to the field for 40 more years.

Up until the moment of his death, Einstein continued to squeeze every ounce of greatness out of himself. He never rested on his laurels. He continued to work even through severe physical pain and in the face of death.

Everyone has a gift to share with the world, something that both lights you on fire internally and serves the world externally, and this thing–this calling–should be something you pursue until your final breath. It could be your actual job, as it was for Einstein. It could be a creative hobby, as it was for Vivian Maier. It could be the care you provide to those around you.

Whatever it is for you, our lives were meant to be spent making our contribution to the world, not merely consuming the world that others create.

[---]

Hours before his death, Einstein’s doctors proposed trying a new and unproven surgery as a final option for extending his life. Einstein simply replied, “I have done my share, it is time to go. I will do it elegantly.”

We cannot predict the value our work will provide to the world. That’s fine. It is not our job to judge our own work. It is our job to create it, to pour ourselves into it, and to master our craft as best we can.

We all have the opportunity to squeeze every ounce of greatness out of ourselves that we can. We all have the chance to do our share.

One of the participants at the Kawasaki meetings, Sumie Maekawa, says she and her husband, who have no children, see their Aibo as a daughter. Ms. Maekawa, who is 72, talks to the Aibo every day, travels with it and makes clothing for it. She and her husband agreed that whichever of the two lives longer should be cremated alongside the dog, which also is named Ai, in expectation of a family reunion in the afterlife.

“I can’t imagine how quiet our living room would have been if Ai-chan wasn’t here,” Ms. Maekawa said, using an honorific suffix applied to girls’ names. “It will be sad when the day finally comes when Ai-chan is unable to stand up.”

Saturday, February 21, 2015

Oliver Sacks is one of my favorite people; most of us can only wish to have a privileged life like his and the gift of saying an eloquent goodbye like him. Here's his My Own Life:

It is up to me now to choose how to live out the months that remain to me. I have to live in the richest, deepest, most productive way I can. In this I am encouraged by the words of one of my favorite philosophers, David Hume, who, upon learning that he was mortally ill at age 65, wrote a short autobiography in a single day in April of 1776. He titled it “My Own Life.”

“I now reckon upon a speedy dissolution,” he wrote. “I have suffered very little pain from my disorder; and what is more strange, have, notwithstanding the great decline of my person, never suffered a moment’s abatement of my spirits. I possess the same ardour as ever in study, and the same gaiety in company.”

I have been lucky enough to live past 80, and the 15 years allotted to me beyond Hume’s three score and five have been equally rich in work and love. In that time, I have published five books and completed an autobiography (rather longer than Hume’s few pages) to be published this spring; I have several other books nearly finished.

Hume continued, “I am ... a man of mild dispositions, of command of temper, of an open, social, and cheerful humour, capable of attachment, but little susceptible of enmity, and of great moderation in all my passions.”

Here I depart from Hume. While I have enjoyed loving relationships and friendships and have no real enmities, I cannot say (nor would anyone who knows me say) that I am a man of mild dispositions. On the contrary, I am a man of vehement disposition, with violent enthusiasms, and extreme immoderation in all my passions.

And yet, one line from Hume’s essay strikes me as especially true: “It is difficult,” he wrote, “to be more detached from life than I am at present.”

[---]

I feel a sudden clear focus and perspective. There is no time for anything inessential. I must focus on myself, my work and my friends. I shall no longer look at “NewsHour” every night. I shall no longer pay any attention to politics or arguments about global warming.

This is not indifference but detachment — I still care deeply about the Middle East, about global warming, about growing inequality, but these are no longer my business; they belong to the future. I rejoice when I meet gifted young people — even the one who biopsied and diagnosed my metastases. I feel the future is in good hands.

I have been increasingly conscious, for the last 10 years or so, of deaths among my contemporaries. My generation is on the way out, and each death I have felt as an abruption, a tearing away of part of myself. There will be no one like us when we are gone, but then there is no one like anyone else, ever. When people die, they cannot be replaced. They leave holes that cannot be filled, for it is the fate — the genetic and neural fate — of every human being to be a unique individual, to find his own path, to live his own life, to die his own death.

I cannot pretend I am without fear. But my predominant feeling is one of gratitude. I have loved and been loved; I have been given much and I have given something in return; I have read and traveled and thought and written. I have had an intercourse with the world, the special intercourse of writers and readers.

Above all, I have been a sentient being, a thinking animal, on this beautiful planet, and that in itself has been an enormous privilege and adventure.

When we are young, we spend much time and pains in filling our note-books with all definitions of Religion, Love, Poetry, Politics, Art, in the hope that, in the course of a few years, we shall have condensed into our encyclopaedia the net value of all the theories at which the world has yet arrived. But year after year our tables get no completeness, and at last we discover that our curve is a parabola, whose arcs will never meet.

Friday, February 20, 2015

Science meant looking -- a special kind of looking. Looking especially hard at the things you didn't understand. Looking at the stars, say, and not fearing them, not worshiping them, just asking questions, finding the question that would unlock the door to the next question and the question beyond that.

Thursday, February 19, 2015

Just as with individuals there are different kinds and degrees of asshole nations, not all of which are equally morally condemnable, and which require different handling. A typology of assholishness allows us to distinguish the merely badly behaved from the ugly from the genuinely horrid.

Let's start by considering assholism that is domain specific misbehaviour rather than a dominant moral identity. Japan's support for whaling can certainly be described as assholish. Having voluntarily joined the International Whaling Commission and (eventually) signed up to its moratorium on whaling, Japan's government effectively raises a middle finger to the international community's concern to protect these endangered species by not only issuing fake ‘scientific research' permits to its whaling fleets but also providing them with millions of dollars of subsidies per year.

Yet Japan is generally a pretty well-behaved member of the international community of nations. In many respects, most notably in its commitment to its pacifist constitution and its funding for development aid, it takes care to recognise the moral reality of other nations. Its assholish behaviour is limited to a relatively few areas like whaling (and import tariffs and school history books that gloss over its brutal S. East Asian empire). Japan is not a completely unreasonable country: One doesn't have to worry constantly about what new self-serving stunt it will pull; and one can hope, eventually, to reason with it even on those subjects where it presently refuses to listen to criticism. Like a good many other reasonably normal countries, Japan is only a partial rather than a complete asshole nation.

Like us, cetaceans have special brain cells, spindle cells, that are associated with communication, emotion, and heightened social sensitivity. These cells were once thought to be unique to us, but research is now showing that whales and dolphins may have up to three times more spindle cells than humans.

Wednesday, February 18, 2015

The moral: don’t invade a country if you are too lazy to learn the language. If you can’t understand what people are saying, you are operating blind. I’ve been told by American officials that up to 95% of the Iraqis imprisoned in American brigs were probably guilty of nothing. They were ratted out, perhaps by someone who owed them money, and the gullible Americans just locked them up. Imprisoning the innocent created unnecessary enemies for the occupation. In 2003, most Iraqis were pleased at Saddam Hussein’s ouster. They could have been predisposed to support American aims, if the Americans hadn’t alienated so many of them for little reason. It is impossible to successfully conduct a war if you can’t distinguish friend from foe because they all look the same to you. If more American soldiers understood Arabic, their insight and awareness of Iraqi culture could have made a huge difference.

Tuesday, February 17, 2015

In dwelling, live close to the ground. In thinking, keep to the simple. In conflict, be fair and generous. In governing, don't try to control. In work, do what you enjoy. In family life, be completely present.

Sunday, February 15, 2015

The average American currently consumes three to five times the needed amount of daily protein, and most of us aren't training to be competitive athletes or bodybuilders. While we take protein to be a vessel of robust health, we continue to lead sedentary lives. And when we don't exercise, the body quickly stores any and all excess protein as fat.

"In fact, consistent protein overload will flood the kidneys, and cause digestive issues, nausea, harm to your brain and nervous system, and unusual weight gain. You’ll also be putting your body at risk of developing more serious long-term health problems, such as a buildup of amino acids, insulin, ammonia, and other toxic substances in your bloodstream."

Likewise, when dietitians recommend foods high in protein, they are not talking about bacon and hamburgers. Non-animal sources of protein include the Japanese vegetable dish edamame, which packs 16 grams of protein per cup, chia seeds, quinoa, lentils, Greek yogurt, tempeh, nut butters, and chickpeas.

A quiet secluded life in the country, with the possibility of being useful to people to whom it is easy to do good, and who are not accustomed to have it done to them; then work which one hopes may be of some use; then rest, nature, books, music, love for one's neighbor — such is my idea of happiness.

Saturday, February 14, 2015

When we started in the PhD program in Computer Science at Stanford, Prof. Rajeev Motwani, who was the “default PhD advisor” for all incoming PhD students told us that our only priority should be to find a research topic that we care about. In fact, Prof. Motwani was so adamant that this was the only thing that mattered, that he said that we shouldn’t worry about the myriad of other requirements for the PhD. He explained that rarely had he seen anyone not complete their PhD for any other reason.

Rajeev taught me that getting a PhD is as much about “finding a problem” as “solving the problem.” The first and often the most time-consuming part of the PhD is finding a problem that you care deeply enough about to spend several years of your life researching. Ultimately the goal of a PhD should be to “make an incremental contribution to human knowledge.” Sounds simple enough, doesn’t it?

There are many things about doing a PhD that have helped me to better understand how things work or should work in a startup environment. But one of the most important has been finding a problem worth solving.

[---]

Frequency, Density, and Pain have become three variables that I now look at to analyze almost any problem. They tend to be good measuring sticks to see how the problem you’re solving stacks up.

Frequency: Does the problem you’re solving occur often?

Density: Do a lot of people face this problem?

Pain: Is the problem just an annoyance, or something you absolutely must resolve?

Put another way, asking the Frequency, Density, and Pain questions can help you identify how often people have this problem, how many people have it, and whether they care. If your business is stuck at a plateau and you can’t figure out why you’re not making headway, it might be worth thinking about these variables to determine whether you’re solving the right problem, a big enough problem, or a frequently occurring problem.
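The three-variable check above can be sketched as a toy scoring function. The 1-5 scale, the multiplication, and the function name are my own illustrative assumptions, not a method prescribed in the text.

```python
# Hypothetical scoring sketch for the Frequency / Density / Pain check.
# Each axis is rated 1-5; multiplying (rather than adding) means one
# weak axis drags the whole score down -- an illustrative choice.
def problem_score(frequency, density, pain):
    """frequency: how often it occurs; density: how many face it;
    pain: how badly it must be resolved. Each rated 1-5."""
    for v in (frequency, density, pain):
        if not 1 <= v <= 5:
            raise ValueError("each rating must be between 1 and 5")
    return frequency * density * pain  # maximum possible score: 125

# A frequent problem faced by many that people must resolve scores high:
print(problem_score(5, 4, 5))  # 100
# A rare annoyance faced by some scores low:
print(problem_score(1, 3, 1))  # 3
```

A plateaued business scoring its current problem low on any one axis has a concrete signal about which of the three questions it is failing.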

Thursday, February 12, 2015

The “on” switch for your brain’s camcorder, according to this study, is knowing that you must recall the information. The fine-tuning of this memory mechanism is more precise than we realize, so much so that we can remember a specific detail about a thing or event without remembering much else about it. The researchers call this “attribute amnesia.”

Wyble added, “This result is surprising because traditional theories of attention assume that when a specific piece of information is attended, that information is also stored in memory.”

Wednesday, February 11, 2015

Yet we have not always approached the history of nazism in this way. Indeed, the predominantly moral perspective from which Hitler and the Germany he created are currently viewed is a relatively recent one. For a long time after the end of the war he launched in September 1939 and lost five and a half years later, Hitler was a comparatively neglected topic for historians, as were the Nazi movement and the Nazi state. Evidence was piled up for the Nuremberg trials, but the focus was very much on “war crimes”, the years before 1939 were more or less out of the visual range of the prosecutors, and the death camps at Treblinka, Auschwitz and elsewhere were not the central point of the investigation.

The trials were quickly forgotten, at least for the time being. In Germany, a kind of collective amnesia followed, undermined only by resentment at the trials themselves, the intrusive process of “denazification”, the brutal expulsion of 12 million ethnic Germans from Eastern Europe at the end of the war and the mass bombing of German cities in its later stages. In the countries formerly occupied by Nazi Germany, such as France, people wanted to remember the resistance. In the Eastern bloc, communist governments celebrated (and exaggerated) the role of communist resisters but preferred to try to integrate ex-Nazis into the new society they were building rather than come to a reckoning with their crimes. In Britain, people remembered the war, the stoicism of the population during the blitz and the achievements of the British armed forces, but not much besides.

It wasn’t until the late 1960s that things began to change. For Germans, the key question was how and why the Nazis had come to power. The Federal Republic, with its capital in the Rhenish university town of Bonn, had gained legitimacy through the “economic miracle” of the 1950s, but was still not much older than Germany’s first democracy, the Weimar Republic, had been when it had given way to Hitler’s Third Reich. People asked nervously “Is Bonn Weimar?” Political scientists and historians examined the reasons for the vulnerability of Weimar’s institutions and found, reassuringly, that the answer was “No”.

Molecular biology has shown that even the simplest of all living systems on the earth today, bacterial cells, are exceedingly complex objects. Although the tiniest bacterial cells are incredibly small, weighing less than 10⁻¹² g, each is in effect a veritable micro-miniaturized factory containing thousands of exquisitely designed pieces of intricate molecular machinery, made up altogether of one hundred thousand million atoms, far more complicated than any machine built by man and absolutely without parallel in the nonliving world.

Tuesday, February 10, 2015

Beware the irrational, however seductive. Shun the ‘transcendent’ and all who invite you to subordinate or annihilate yourself. Don’t be afraid to be thought arrogant or selfish. Picture all experts as if they were mammals. Never be a spectator of unfairness or stupidity. Seek out argument and disputation for their own sake; the grave will provide plenty of time for silence.

Monday, February 9, 2015

A passion to make the world a better place is a fine reason to study social psychology. Sometimes, however, researchers let their ideals or their political beliefs cloud their judgment, such as in how they interpret their research findings. Social psychology can only be a science if it puts the pursuit of truth above all other goals. When researchers focus on a topic that is politically charged, such as race relations or whether divorce is bad for children, it is important to be extra careful in making sure that all views (perhaps especially disagreeable ones, or ones that go against established prejudices) are considered and that the conclusions from research are truly warranted.

Sunday, February 8, 2015

Thanks to the speed of CRISPR research, the accolades have come quickly. Last year MIT Technology Review called CRISPR “the biggest biotech discovery of the century.” The Breakthrough Prize is just one of several prominent awards Doudna has won in recent months for her work on CRISPR; National Public Radio recently reported whispers of a possible Nobel in her future.

[---]

Doudna and other researchers did not pluck the molecules they use for gene editing from thin air. In fact, they stumbled across the CRISPR molecules in nature. Microbes have been using them to edit their own DNA for millions of years, and today they continue to do so all over the planet, from the bottom of the sea to the recesses of our own bodies.

We’ve barely begun to understand how CRISPR works in the natural world. Microbes use it as a sophisticated immune system, allowing them to learn to recognize their enemies. Now scientists are discovering that microbes use CRISPR for other jobs as well. The natural history of CRISPR poses many questions to scientists, for which they don’t have very good answers yet. But it also holds great promise. Doudna and her colleagues harnessed one type of CRISPR, but scientists are finding a vast menagerie of different types. Tapping that diversity could lead to more effective gene editing technology, or open the way to applications no one has thought of yet.

[---]

At the time, Koonin, an evolutionary biologist at the National Center for Biotechnology Information in Bethesda, Md., had been puzzling over CRISPR and Cas genes for a few years. As soon as he learned of the discovery of bits of virus DNA in CRISPR spacers, he realized that microbes were using CRISPR as a weapon against viruses.

Koonin knew that microbes are not passive victims of virus attacks. They have several lines of defense. Koonin thought that CRISPR and Cas enzymes provide one more. In Koonin’s hypothesis, bacteria use Cas enzymes to grab fragments of viral DNA. They then insert the virus fragments into their own CRISPR sequences. Later, when another virus comes along, the bacteria can use the CRISPR sequence as a cheat sheet to recognize the invader.

Scientists didn’t know enough about the function of CRISPR and Cas enzymes for Koonin to make a detailed hypothesis. But his thinking was provocative enough for a microbiologist named Rodolphe Barrangou to test it. To Barrangou, Koonin’s idea was not just fascinating, but potentially a huge deal for his employer at the time, the yogurt maker Danisco. Danisco depended on bacteria to convert milk into yogurt, and sometimes entire cultures would be lost to outbreaks of bacteria-killing viruses. Now Koonin was suggesting that bacteria could use CRISPR as a weapon against these enemies.

To test Koonin’s hypothesis, Barrangou and his colleagues infected the milk-fermenting microbe Streptococcus thermophilus with two strains of viruses. The viruses killed many of the bacteria, but some survived. When those resistant bacteria multiplied, their descendants turned out to be resistant too. Some genetic change had occurred. Barrangou and his colleagues found that the bacteria had stuffed DNA fragments from the two viruses into their spacers. When the scientists chopped out the new spacers, the bacteria lost their resistance.

Barrangou, now an associate professor at North Carolina State University, said that this discovery led many manufacturers to select for customized CRISPR sequences in their cultures, so that the bacteria could withstand virus outbreaks. “If you’ve eaten yogurt or cheese, chances are you’ve eaten CRISPR-ized cells,” he said.

Saturday, February 7, 2015

When Kim the computer scientist writes a program, her aim is to learn something about the underlying algorithm. The object of study in computer science is the computing process itself, detached from any particular hardware or software. When Kim publishes her conclusions, they will be formulated in terms of an idealized, abstract computing machine. Indeed, the more theoretical aspects of her work could be done without any access to actual computers.

When Chris the computational scientist writes a program, the goal is to simulate the behavior of some physical system. For her, the computer is not an object of study but a scientific instrument, a device for answering questions about the natural world. Running a program is directly analogous to conducting an experiment, and the output of the program is the result of the experiment.

When Dana the developer writes a program, the program itself is the product of his labors. The software he creates is meant to be a useful tool for colleagues or customers—an artifact of tangible value. Dana’s programming is not science but art or craft or engineering. It is all about making things, not answering questions.

Should these three activities be treated as separate fields of endeavor, or are they really just subdivisions of a single computing enterprise? The historian Michael Mahoney, an astute observer of computing communities, suggested that a key concept for addressing such questions is the “agenda.” The agenda of a field consists of what its practitioners agree ought to be done, a consensus concerning the problems of the field, their order of importance or priority, the means of solving them (the tools of the trade), and perhaps most importantly, what constitutes a solution…. The standing of the field may be measured by its capacity to set its own agenda. New disciplines emerge by acquiring that autonomy. Conflicts within a discipline often come down to disagreements over the agenda: what are the really important problems?

[---]

The grizzled curmudgeon in me wants to object that this instant cartography is not real programming, it’s just a “mashup” of prefabricated program modules and Internet resources. But building atop the achievements of others is exactly how science and engineering are supposed to advance.

Still, a worry remains. How will the members of this exuberant new cohort distribute themselves over the three continents of computer science, computational science, and software development? What tasks will they put on their agendas? At the moment, most of the energy flows into the culture of software development or programming. The excitement is about applying computational methods, not inventing new ones or investigating their properties. In the long run, though, someone needs to care about LR(1) parsers.

Guy Lewis Steele, Jr., one of the original MIT hackers, worried in the 1980s that hackerdom might be killed off “as programming education became more formalized.” The present predicament is just the opposite. Everyone wants to pick up the knack of coding, but the more abstract and mathematical concepts at the core of computer science attract a smaller audience. The big enrollments are in courses on Python, Ruby, and JavaScript, not automata theory or denotational semantics.

I would not contend that mastery of the more theoretical topics is a prerequisite to becoming a good programmer. There’s abundant evidence to the contrary. But it is a necessary step in absorbing the culture of computer science. I am sentimental enough to believe that an interdisciplinary and intergenerational conversation would enrich both sides, and help in knitting together the communities.

On two occasions, I have been asked [by members of Parliament], 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able to rightly apprehend the kind of confusion of ideas that could provoke such a question.

Friday, February 6, 2015

Differences in these parts of the brain can account for some variability between individuals, but what about differences that seem to be defined by age?

Many areas of the brain grow and develop as you age, and the areas responsible for social emotions are no different. Between the ages of four and five, you start to develop the ability to understand that people around you could be having thoughts or emotions that are different from your own. Further important changes occur during a period that society often seems to single out as the pinnacle for being different: adolescence.

Adolescence – the period extending from puberty to the point of independent stability – is often portrayed as a very dramatic time with a new emphasis placed on the importance of friendships and social input. Researchers have found that during this period many adolescents value the input of their peers even over that of their family.

[---]

Move up just a few years to young adults and there is already a shift, with this group watching five times as much television as online video. At least some part of that difference can perhaps be accounted for by changes that occur in this period to the brain itself.

One of the areas going through important structural changes in this period – with additions of gray matter and changes in shape – is the area that deals with “social emotions.” Social emotions are those that require you to consider what others might be thinking – like guilt or embarrassment – rather than your own emotional experience – like fear. When researchers ask adolescents and adults to explain certain emotions, both groups feel and describe them in the same way. But the activity that is happening in the brain, and the way that information is being processed, differs between the two groups.

I consider as lovers of books not those who keep their books hidden in their store-chests and never handle them, but those who, by nightly as well as daily use thumb them, batter them, wear them out, who fill out all the margins with annotations of many kinds, and who prefer the marks of a fault they have erased to a neat copy full of faults.

Wednesday, February 4, 2015

And the wise ones bet heavily when the world offers them that opportunity. They bet big when they have the odds. And the rest of the time, they don’t. It’s just that simple.

[---]

How many insights do you need? Well, I’d argue that you don’t need many in a lifetime. If you look at Berkshire Hathaway and all of its accumulated billions, the top ten insights account for most of it. And that’s with a very brilliant man—Warren’s a lot more able than I am and very disciplined—devoting his lifetime to it. I don’t mean to say that he’s only had ten insights. I’m just saying that most of the money came from ten insights. So you can get very remarkable investment results if you think more like a winning pari-mutuel player. Just think of it as a heavy odds against game full of craziness with an occasional mispriced something or other. And you’re probably not going to be smart enough to find thousands in a lifetime. And when you get a few, you really load up. It’s just that simple.

[--]

And it makes sense to load up on the very few good insights you have instead of pretending to know everything about everything at all times. You’re much more likely to do well if you start out to do something feasible instead of something that isn’t feasible. Isn’t that perfectly obvious? How many of you have 56 brilliant ideas in which you have equal confidence? Raise your hands, please. How many of you have two or three insights that you have some confidence in? I rest my case.
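Munger’s pari-mutuel logic reduces to expected value. A toy calculation with my own illustrative numbers, not Munger’s: the same 3-to-1 payoff is a break-even bet at a 25% win probability and a heavily favorable one at 40%.

```python
# Toy expected-value check (illustrative numbers only): a bet is "mispriced"
# when the payoff odds are better than the true win probability implies.
def expected_value(p_win, payoff, stake=1.0):
    """Expected profit per unit staked: win `payoff` with probability p_win,
    lose the stake otherwise."""
    return p_win * payoff - (1 - p_win) * stake

fair = expected_value(0.25, 3.0)       # 3:1 payoff at 25%: break-even
mispriced = expected_value(0.40, 3.0)  # same payoff at 40%: positive EV
```

When the edge is real and rare, it pays to size the position accordingly; that is all "load up" means here.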

Does a man of sense run after every silly tale of hobgoblins or fairies, and canvass particularly the evidence? I never knew anyone, that examined and deliberated about nonsense who did not believe it before the end of his enquiries.

Tuesday, February 3, 2015

Today machines can recognize, say, a dog jumping. But what if someone is holding a piece of meat above the dog? We recognize that that’s a slightly different concept, a dog trick. And the piece of meat isn’t just a piece of meat, it’s a treat—a different linguistic idea. Can we get computers to understand these concepts?

Deep learning algorithms are very good at one thing today: learning input and mapping it to an output. X to Y. Learning concepts is going to be hard.

One thing Baidu did several months ago was input an image — and the output was a caption. We showed that you can learn these input-output mappings. There’s a lot of room for improvement but it’s a promising approach for getting computers to understand these high level concepts.
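The "X to Y" framing Ng describes can be sketched in a few lines. This is my own minimal illustration, not Baidu’s system: the same input-to-output idea, with a linear least-squares fit on synthetic data standing in for a deep network.

```python
# Minimal sketch of supervised learning as an input -> output mapping.
# (My own synthetic example; real systems map images to captions the same
# way in principle, just with far richer models.)
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))            # inputs
Y = 2.0 * X[:, 0] + 0.5 + rng.normal(scale=0.05, size=200)  # noisy outputs

# Learn the X -> Y mapping by least squares on [x, 1].
A = np.hstack([X, np.ones((200, 1))])
w, *_ = np.linalg.lstsq(A, Y, rcond=None)
pred = A @ w  # the learned mapping applied to new inputs
```

The caption example in the interview is the same pattern at scale: the input is an image, the output a sentence, and the model is trained to reproduce that mapping.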

[---]

Do you see AI as a potential threat?

I’m optimistic about the potential of AI to make lives better for hundreds of millions of people. I wouldn’t work on it if I didn’t fundamentally believe that to be true. Imagine if we can just talk to our computers and have it understand “please schedule a meeting with Bob for next week.” Or if each child could have a personalized tutor. Or if self-driving cars could save all of us hours of driving.

I think the fears about “evil killer robots” are overblown. There’s a big difference between intelligence and sentience. Our software is becoming more intelligent, but that does not imply it is about to become sentient. The biggest problem that technology has posed for centuries is the challenge to labor. For example, there are 3.5 million truck drivers in the US, whose jobs may be affected if we ever manage to develop self-driving cars. I think we need government and business leaders to have a serious conversation about that, and think the hype about “evil killer robots” is an unnecessary distraction.

Monday, February 2, 2015

In his Apology for Raymond Sebond (1576), Michel de Montaigne ascribed animals’ silence to man’s own wilful arrogance. The French essayist argued that animals could speak, that they were in possession of rich consciousness, but that man wouldn’t condescend to listen. ‘It is through the vanity of the same imagination that [man] equates himself with God,’ Montaigne wrote, ‘that he attributes divine attributes for himself, picks himself out and separates himself from the crowd of other creatures.’ Montaigne asked: ‘When I play with my cat, who knows if she is making more of a pastime of me than I of her?’

Montaigne’s question is as playful as his cat. Apology is not meant to answer the age-old question, but rather to provoke; to tap into an unending inquiry about the reasoning of animals. Perhaps, Montaigne implies, we simply misunderstand the foreign language of animals, and the ignorance is not theirs, but ours.

Montaigne’s position was a radical one – the idea that animals could actually speak to humans was decidedly anti-anthropocentric – and when he looked around for like-minded thinkers, he found himself one solitary essayist. But if Montaigne was a 16th century loner, then he could appeal to the Classics. Apology is littered with references to Pliny and a particular appeal to Plato’s account of the Golden Age under Saturn. But even there, Montaigne had little to work with. Aristotle had argued that animals lacked logos (meaning, literally, ‘word’ but also ‘reason’) and, therefore, had no sense of the philosophical world inhabited and animated by humans. And a few decades after Montaigne, the French philosopher René Descartes delivered the final blow, arguing that the uniqueness of man stems from his ownership of reason, which animals are incapable of possessing, and which grants him dominion over them.

[---]

In his curious book The Criminal Prosecution and Capital Punishment of Animals (1906), E P Evans traces multiple accounts of animal executions from Roman times well into the Renaissance: sows, horses, insects and donkeys all met their death at the hands of well-compensated executioners throughout Europe. Animals were condemned for falling foul of human law: murder, theft, even witchcraft.

But why torture and publicly execute a beast supposedly incapable of logos? The personification of condemned animals is striking precisely because it disturbs; the whole spectacle seems designed to yank them from the world of dumb, unthinking beasts; to demonstrate their capacity for human consciousness, one that entails an awareness of human morality, justice and language. After all, what’s the purpose of retributive justice if the offending party is unable to feel the weight of the anathema?

Perhaps the answer is found in the bloody history of 18th century Paris. During the Great Cat Massacre, apprentice printers captured and hung hundreds of cats. They did so as a gruesome form of protest – to hold someone accountable for their unfair treatment, their physical and financial suffering at the hands of their human masters. With little or no agency, apprentices put the ugly words of their masters into the mouths of innocent cats. The cat killers gave the felines the words they needed to hear and, though their intentionality was quite different, their need for animal logos belonged to the same fantastical world as the internet’s doge.

- More Here

I feel lonely in this century when conversing (or debating) about the consciousness of animals; I can only imagine what Montaigne went through five centuries earlier. Sir, I salute you; you have been my inspiration and will be until my last breath.

Sunday, February 1, 2015

I have learnt so much from Andrew in the past few years that my life will never be the same. I can only imagine the fire he instills in younger kids (the ones who read him, of course); we will miss your blog, Andrew, but I am looking forward to a new book from you soon... it would be a dream come true if you could ever team up with Peter Singer....

This designation should go to someone who actually has helped change the world, rather than just changing lots of minds. It also should go to someone who has embodied key trends of the time, noting that for both standards I am focusing on the United States.

Based on those standards, I am inclined to pick Andrew Sullivan, who is most recently in the news for his announcement that he is quitting after fifteen years of blogging.

Any discussion of Sullivan's influence must begin with gay marriage. Thirty-six states and the District of Columbia already have legalized gay marriage, representing a majority of the American population, with possibly Alabama and others to follow. A broader Supreme Court decision for nationwide legalization may be on the way. More generally, gay rights have taken a major leap forward.

[---]

Sullivan was a very early blogger, and in that arena he was tireless too, so dedicated that he now claims blogging is endangering his health. For many years, it was common for his site to put up fifteen or more posts a day, a remarkably large percentage of them interesting or in some way provocative or informative. He embodied the classic blogosphere like no other writer, as he fine-tuned and mastered the art of the blog as an ongoing critical — and indeed substantive — dialog with oneself. He was an inspiration for many writers, myself included, and he gave many of us our first big links and our first taste of how to deal with a crashed site from heavy traffic.

That's two big wins right there, and how many other public intellectuals can come up with one? I thought long and hard before selecting Andrew for the designation of most influential public intellectual. Perhaps Paul Krugman has changed more minds, but his agenda hasn't much changed the world; we haven't, for instance, gone back to do a bigger fiscal stimulus. Peter Singer led large numbers of people into vegetarianism and veganism and gave those practices philosophic respectability; he is second on my list. A generation ago, I would have picked Milton Friedman, for intellectual leadership in the direction of capitalist and pro-market reforms. But that is now long ago, and the Right has produced no natural successor.

Dawes observed that the complex statistical algorithm adds little or no value. One can do just as well by selecting a set of scores that have some validity for predicting the outcome and adjusting the values to make them comparable (by using standard scores or ranks). A formula that combines these predictors with equal weights is likely to be just as accurate in predicting new cases as the multiple-regression formula that was optimal in the original sample. More recent research went further: formulas that assign equal weights to all the predictors are often superior, because they are not affected by accidents of sampling.
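Dawes’s point is easy to demonstrate on synthetic data. The sketch below is my own illustration, not the original study (the sample sizes, weights, and noise level are all assumptions): a regression fit on a small training sample is compared out of sample against an equal-weight sum of standardized predictors.

```python
# Illustrative sketch of Dawes's "improper linear model": equal weights on
# standardized predictors vs. an in-sample-optimal regression.
import numpy as np

rng = np.random.default_rng(0)

def standardize(X):
    return (X - X.mean(axis=0)) / X.std(axis=0)

# True outcome depends weakly on three predictors plus heavy noise.
n_train, n_test, p = 50, 2000, 3
true_w = np.array([1.0, 0.8, 0.6])

def make_data(n):
    X = rng.normal(size=(n, p))
    y = X @ true_w + rng.normal(scale=3.0, size=n)
    return X, y

X_tr, y_tr = make_data(n_train)
X_te, y_te = make_data(n_test)

# Regression weights, optimal in the training sample (may overfit it).
beta, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

# Dawes's rule: standardize each predictor, then weight them equally.
pred_ols = X_te @ beta
pred_equal = standardize(X_te).sum(axis=1)

r_ols = np.corrcoef(pred_ols, y_te)[0, 1]
r_equal = np.corrcoef(pred_equal, y_te)[0, 1]
print(f"out-of-sample r, OLS: {r_ols:.3f}  equal weights: {r_equal:.3f}")
```

On noisy data like this the two rules typically land close together out of sample, which is Dawes’s observation: the equal-weight formula gives up almost nothing, and it cannot be fooled by accidents of the training sample.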

About Me

I have this "little" 75 lb chocolate colored guy named Max and he has been the catalyst for my metamorphosis. Ever since he came into my life, I have been trying to subdue that ape inside me. Blogging is a proclamation of my ignorance to the world, the willingness to learn, and an effort to get rid of my cognitive dissonances.