The basic idea is that prison is a bad idea for most criminals by whatever standard you use. If you believe in a retributive theory of punishment, then you must admit that prison is a very expensive way of punishing people for their crimes and that there are simpler and cheaper alternatives: financial penalties and physical punishment are just as retributive and a whole lot cheaper. If you abide by a rehabilitative theory of punishment, then prison is a horrible idea. Prison has a poor track record of rehabilitating prisoners and much more frequently makes people more likely to engage in future criminal activity. The last thing you want to do to help someone is lock them up with a bunch of hardened criminals.

The only people you really want to lock up are those who are a danger to society. It makes sense to lock up Charles Manson, BTK and the Son of Sam, but not so much someone incarcerated for marijuana possession, or for that matter Bernie Madoff and Martha Stewart. Prison is expensive, and thus the taxpayers, who have no culpability in these crimes, are the ones punished.

Moskos asks, "Given the choice between five years and ten lashes, wouldn’t you choose the lash?" to which I'd say definitely yes. Given the problems of rampant overcrowding, prison violence and pervasive rape, I'd say prison is a horrible place to go, and I'd gladly endure the pain of flogging to avoid it.

I wouldn't go so far as to say that flogging is necessarily the best form of non-prison punishment. Fines, for one, make sense in many cases, such as drug possession and financial and property crimes; damage to reputation is an underappreciated punishment (think, for example, of stockades, or a highly visible letter sewn on someone's clothing, a la The Scarlet Letter, or a swastika carved in someone's forehead, a la Inglourious Basterds); people can be monitored and tracked far more cheaply than they can be imprisoned; and perhaps other forms of physical punishment might be better still. The best solution for each case would require some careful thought (hmm, what about branding people on the arm for certain crimes?). But what is obvious is that prison is simply not a good idea for many of the crimes for which it is applied.

I had noted earlier that "trial and error is the most consistently powerful method for attaining knowledge," and had suggested that, just as experimental trials have been powerful in advancing science, they'd also be powerful in improving public policy.

But I had a few caveats, mostly on the limits of when trial and error can be applied, and the most important was the last: "it's a bit unrealistic to expect that governments, after a long history of ignoring scientific evidence and expert opinion, to suddenly start building policy on evidence." This is a statement that invites expansion, and we might mention many reasons for it, such as "diffuse costs/concentrated benefits," "rent seeking" and "regulatory capture."

Bryan Caplan also adds "Short Time Horizon," pointing to a nice explanation by Tim Harford in his new book Adapt. Politicians have short terms and thus need to create visible benefits within short time periods. This leads to a condition we might call "Immediate Benefits/Deferred Costs." And it makes running trials unappealing, because trials take so long and frequently lead to unpopular conclusions.

Sunday, May 29, 2011

Justin E. H. Smith opined in the New York Times last week about the loss of curiosity in philosophy. There are some fair points in the article, though I must disagree with his emphasis on curiosity, or the lack thereof, as the problem.

For one, philosophy has changed radically since the early days. Back in the times of Ancient Greece, when the term "philosophy" was coined, it simply meant the pursuit of knowledge, in the broadest sense of the term, including math, science, ethics, ontology, logic, epistemology and so on. Over time, the pursuit of knowledge has become more specialized. This process hasn't been guided by a loss of curiosity, as Smith seems to suggest, but by differences in the effectiveness of truth-seeking techniques in the various sub-disciplines of the pursuit of knowledge. Rapid advancement in the rigor of mathematical logic and theorems meant that this field became an area of specialization on its own very early on. Later, during the scientific revolution, important thinkers such as Francis Bacon, Rene Descartes, Galileo Galilei and Isaac Newton introduced to philosophy new techniques of observation, experimentation and mathematical rigor, which permitted ideas to be more decisively tested and led to real progress in these fields. But such techniques didn't apply to all fields of knowledge-pursuit, only to certain ones. They really only applied to physics, originally, but expanded to other disciplines. Important discoveries led to the development of chemistry and biology as independent disciplines in their own right. Other disciplines now considered part of science, such as medicine, psychology and economics, didn't immediately benefit from these new techniques; they advanced slowly for a while, even after the scientific revolution, but eventually they too developed and emerged as their own sub-disciplines with their own specialists. As these various disciplines of math, physics, chemistry, psychology, economics and biology began to make rapid advances in their theoretical underpinnings and established understanding, they became inaccessible to the broadly curious dilettante. To be a competent physicist, for example, has long required considerable education and training before one can expect to make substantive contributions to the discipline, and such has been the case with all of these various disciplines that have peeled off and established themselves independently of the broader pursuit of knowledge.

Some disciplines haven't benefited from these advances, especially fields like ontology, ethics, metaphysics, axiology, aesthetics and political science, fields which are now considered to be part of "philosophy." But all this "philosophy" is, really, just the leftover husk after the various other disciplines have been peeled away. No real advancement has been made in these fields since the beginning. We haven't made any new advances in, say, ethics, in the past 2000 years. The reason, I believe, is that there simply are no good ways of disproving the relevant philosophical theories. Assuredly, one can make a defense in support of an idea; for example, if I wanted to argue that there is a substantial dualism between mind and body, I could concoct an argument like Descartes did. All a person could do in response is refute the argument, or perhaps offer a different argument. But refuting an argument doesn't refute the conclusion (truths can be defended with fallacious arguments; pointing to the fallacies in the arguments disproves only the argument, not the truth), and any counter-argument will itself inevitably be open to refutation. Judging between the merits of various arguments and counter-arguments is highly subjective, and so we're left without any definite determination of which idea is superior. In certain scientific fields, though, we've found very strong ways of refuting theories: running experiments. For example, if you want to test the caloric theory of heat, you run an experiment. A well-designed experiment can all but completely disprove a theory: all you have to do, for example, is show that frictional heat provides an endless supply of heat (whereas caloric theory proposes that each thing contains a finite amount of caloric). Positively proving theories is difficult in science and fraught with uncertainty, but the ability to get rid of bad theories strengthens the certainty behind those that remain. If we could narrow down the field of possible candidates for a theory of mind, for example, then we could focus our energy on those that remain, weeding out more bad ones and better refining the good ones. That would be progress. But we can't do that in philosophy.

Now all of this advancement in other fields, as I said, has led to specialization, which is quite a good thing, since it allows the overall collaborative project of science to advance much more rapidly in many more directions. But it also has drawbacks: if the various disciplines, sub-disciplines and sub-sub-disciplines become too specialized, then no one will have sufficient general knowledge to integrate them, and people who become too focused on narrow disciplines might produce poor theories that would have been improved by taking into account the big picture. These are trade-offs that science accepts for the sake of greater and more rapid advancement. They're partially addressed by popularizers and generalists who try to summarize all of this knowledge and bring it to people outside the field, but it is still a limitation. The problem with philosophy is that such trade-offs don't make sense. Since philosophy hasn't progressed, for example, in the philosophy of mind since the days of Descartes, all that specialization in philosophy leads to is more theories in the philosophy of mind. In short, whereas in scientific fields the great expansion in knowledge has led to an expansion in specialists (since no one person can take in all this knowledge), in philosophy specialization has only been driven by the need for a person to have a grasp on all the competing and, quite frequently, equally plausible theories. Specialization makes sense if progress is being made, but not if we're just spinning out more theories.

In his article, Smith focuses on the problem of curiosity as what plagues philosophy. Though I would diagnose the problem somewhat differently, he and I both agree on what's fundamentally wrong with philosophy: it's too specialized, too narrowly focused. He believes that philosophers should be unafraid to research irrelevant trivia, just for the sake of curiosity. I believe that philosophers should be generalists, not just within the fields that are still left to philosophy, but within all fields. Of course, such broad knowledge means that a philosopher will never have deep knowledge within any one area, but that's what we have the scientists for. And within the area of philosophy, since no advancement in knowledge nor any sort of certain conclusion is forthcoming, the exploration of broad theories of everything is just as worthwhile as the mincing apart of ever more detailed theories of mind.

When philosophers specialize too much, they risk making themselves obsolete. Philosophy has too much become a discipline of very well-educated professionals simply talking to each other, without their theories translating into broader application for the public as a whole. Within science, the hope is that specialization will lead to applications that benefit people, even if such applications aren't immediately obvious. But such is not the case with philosophy. Only when one tries to come up with theories that are meaningful to people in the broadest sense, and that help them make sense of the complex world we live in, will philosophy be relevant. Philosophers are artists of the mind, of a sort, and the more tools from the great toolbox of human knowledge they can employ, the better the works of art they create.

Saturday, May 28, 2011

I had recently been reading through the works that are generally considered part of H. P. Lovecraft's dreamland stories, including "The Case of Charles Dexter Ward," "Nyarlathotep," and "The Dream-Quest of Unknown Kadath." When looking for online copies of these works, I stumbled upon the rather tangled copyright questions concerning Lovecraft's works: are any of his stories under copyright, and if so, which ones, and who owns the copyright? I should say at the outset that I reached no resolution on whether Lovecraft's works are in the public domain, and I believe no resolution is forthcoming.

All of Lovecraft's works entered the public domain in the EU in 2008, since the EU decided that copyright in all cases lasts merely 70 years after the death of the author, and Lovecraft died in 1937. Project Gutenberg Australia has virtually all of his fictional works up, since the works are apparently in the public domain in Australia. But in the US it's more uncertain. Wikisource has an even more complete collection, though it notes that some works are under copyright, and there is an extensive Wikipedia entry on this question. At the US Gutenberg site, the only works available are some early ones; in fact, they've deliberately declined to offer any of the later works. Librivox.org, similarly, only records works from 1922 and earlier (with the sole exception, so far, of The Shunned House, which first appeared in an amateur press and didn't originally have its copyright registered). You would think copyright would be a moot point since Lovecraft died 74 years ago, but that would assume that copyright law actually makes sense. As one Gutenberg staffer says, Lovecraft is a rather interesting case.

To start off, any book published prior to 1923 is in the public domain. Some of Lovecraft's early works fall in this period. But all of his most famous works, including "The Call of Cthulhu," "The Dreams in the Witch House," "At the Mountains of Madness" and so on, were written after 1923. Before 1976, all works had to be registered with the copyright office to avoid falling into the public domain, and many of Lovecraft's works (the ones published in amateur presses) were almost certainly never registered. Additionally, any work published between 1923 and 1963 not only had to be originally registered, but had to have its copyright renewed sometime between 1950 and 1992 to avoid falling into the public domain. If it was renewed, then it is copyrighted until 95 years after publication. Unfortunately, there's no official database that explicitly lists which works published before 1963 had their copyrights renewed. The Copyright Office has an online database of works renewed after 1977, but if a work was renewed between 1950 and 1977, finding the renewal requires searching through the office's physical paper records. The Stanford library has tried to address this with its Copyright Renewal Database, which aims to put all of the renewals into a digital, searchable form. If we wanted to answer, for example, whether "The Call of Cthulhu" is in the public domain, we can search for "Lovecraft," "Cthulhu" or "Weird Tales" (the publication in which it first appeared). The only renewals we find are those of August Derleth and Donald Wandrei, Lovecraft proteges who edited and published several early collections of Lovecraft's works. They registered and renewed the copyrights on some collections they published of Lovecraft's works. But copyrights on such collections normally only cover new material and contributions, namely the arrangement, editing, introductions and any new stories; they wouldn't apply directly to Lovecraft's original work. Thus, if one went back and republished the original Cthulhu story as it appeared in Weird Tales in 1928, those copyrights presumably wouldn't apply. And no further evidence of other renewals has been found.
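The renewal rules described above amount to a small decision procedure, which can be sketched in a few lines. This is only an illustration of the rules as this post states them, not a statement of the law; the function `us_pd_status` and its parameters are hypothetical names for the purpose of the sketch.

```python
def us_pd_status(pub_year, registered, renewed, current_year=2011):
    """Sketch of the pre-1978 US copyright rules as described above.

    Returns "public domain" or "copyrighted" for a work first
    published in pub_year, given whether its copyright was
    registered and (for 1923-1963 works) renewed.
    """
    if pub_year < 1923:
        return "public domain"      # anything published before 1923
    if not registered:
        return "public domain"      # never registered, e.g. amateur presses
    if 1923 <= pub_year <= 1963:
        if not renewed:
            return "public domain"  # registered but copyright never renewed
        # renewed: protected for 95 years after publication
        return "copyrighted" if current_year < pub_year + 95 else "public domain"
    return "copyrighted"

# Cases paralleling the post's examples (renewal status per the post's findings):
print(us_pd_status(1922, True, False))  # early work -> public domain
print(us_pd_status(1928, True, False))  # 1928 Weird Tales story, no renewal found
print(us_pd_status(1943, True, True))   # renewed posthumous collection -> copyrighted
```

Under these rules, the whole dispute reduces to the two inputs that are uncertain in Lovecraft's case: whether a given work's copyright was registered, and whether it was renewed.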

There may be, nonetheless, some exceptions, namely among the works originally published posthumously by Derleth and Wandrei. For example, Derleth and Wandrei registered and renewed the copyright to Beyond the Wall of Sleep, in which "The Dream-Quest of Unknown Kadath" first appeared and in which "The Case of Charles Dexter Ward" was first published in its complete form. If they held full rights to these works at the time of publication, they may have successfully renewed the rights to them. But for the rest of the stories, there appears to be no evidence of renewal.

Unfortunately, there have been, and still are, parties claiming the rights. Chris J. Karr has a long article detailing these claims. As he explains, there are 23 works of Lovecraft written prior to 1923 and 15 works that never had their copyrights registered, but the remaining 27 works are uncertain. Their rights are claimed by Arkham House Publishers, a publishing house founded by Derleth and Wandrei, and by Lovecraft's reconstituted literary estate.

It is generally believed that Lovecraft retained all rights to his works published from 1926 forward (though we don't have documentation to confirm this). In the period 1923-25, he published a few works in amateur presses like The Tryout that didn't register their copyrights, and six works in Weird Tales, which did register its copyrights. Weird Tales did transfer whatever rights it held to Derleth and Wandrei and Arkham House in 1947, which would probably only include the short stories "The Festival," "The Hound," "The Temple," "The Unnameable," "The Horror at Martin's Beach," and "Under the Pyramids." As already stated, there is no evidence that the copyrights were renewed on these stories.

As for the remaining 21 works, published from 1926 to 1937, to which Lovecraft retained the rights: upon his death in 1937, those rights passed to his only heir, his aunt, Annie Gamwell. Gamwell in her will left the royalties to Derleth and Wandrei, but the copyrights she held passed to her heirs, Edna Lewis and Ethel Morrish. Lewis and Morrish subsequently transferred at least some of their rights to Arkham House in an agreement. The problem is that the language of the agreement in which Morrish and Lewis ostensibly gave rights to Derleth and Wandrei is not clear about what rights were being transferred, and many dispute whether the copyrights were actually transferred by it. The question is whether granting "the right to publish H. P. Lovecraft's work" and to "sell second serial rights" constitutes a transfer of the full copyrights or merely amounts to permission to publish and collect the royalties.

All of this would be moot if the copyrights weren't renewed. Even though there is no evidence of renewal, it's possible that when Derleth and Wandrei renewed the copyrights to the collections in which Lovecraft's works were republished, this constituted renewal of the copyrights on those works; it's also possible that renewals were made that have simply not been found. Karr concludes that all of Lovecraft's works are in the public domain, based on arguments used by Arkham House in a much later lawsuit with Wandrei. The lawsuit was over disputed royalties apparently owed to Wandrei. Arkham House argued that Wandrei was not owed royalties because he lacked the rights to the relevant works, the copyrights never having been renewed. Karr takes this as definitive, since it is a statement by the publishing house itself that it did not own the rights to any of the works of Lovecraft that might still be under copyright. I'm not so confident that Karr is right, since it's entirely possible that Arkham House was mistaken, and such a statement made in the context of a lawsuit needn't be legally binding; indeed, Karr notes that Arkham House and the reconstituted Lovecraft estate still claim to own the copyrights (see, for example, the copyright notices of Lovecraft Properties LLC, the reconstituted Lovecraft estate, here and here). But, without any records of renewal, it's about the strongest evidence available.

For my money, I'd say all of Lovecraft's works are probably in the public domain (possibly excepting writings published posthumously, such as "The Dream-Quest of Unknown Kadath" and "The Case of Charles Dexter Ward"), but there really is no settled answer to the question of their copyright status. Were the rights properly renewed? Were they transferred to Arkham House? Copyright law has been stretched so far into the past that lost documentation, orphaned works and uncertainties about ownership become more and more problematic. All we can say is that if someone were to challenge the copyright ownership, a court would come to a decision, but the decision could go either way. It's doubtful it'll be worth anyone's time and money to try to resolve this in court, and this makes the situation de facto as if the copyrights still hold, since no one wants to risk getting sued (though the claimed copyright holders haven't, at least so far, been zealous in pursuing lawsuits). So, as long as no one challenges their claim, the persons claiming the rights own them by default. And the issue won't really be resolved until 2032, when the last of Lovecraft's works published in his lifetime will enter the public domain (excluding the few posthumous publications, which appeared as late as 1944), unless there's another copyright extension, which would just push all of this mess forward.

All of this really illustrates the absurdity of copyright law, which only grows more absurd the further copyrights are extended. Today, copyright law covers works published before 1978 for 95 years after their publication and 70 years after the death of the author for works published since 1978. Such a term undermines the original intention of copyright law, namely to encourage the creation of art. Not only do rights devolve to people who have no part in the creation of original works, but the longer the term, the more copyright holders invest in protecting valuable copyrights and the less they invest in creating new copyrightable works. But, just as I said before, assuming that copyright law should actually abide by its purported rationale assumes that copyright law actually makes sense.

I had recently been reading through the works that are generally considered part of H. P. Lovecraft's dreamland stories, including "The Case of Charles Dexter Ward," "Nyarlathotep," and "The Dream Quest of Unknown Kadath." When looking for online copies of these works, I stumbled upon the rather tangled copyright questions concerning Lovecraft's works: are any of his stories under copyright, and if so, which ones, and who owns the rights? I should say at the outset that I reached no resolution on whether Lovecraft's works are in the public domain, and I believe no resolution is forthcoming.

All of Lovecraft's works entered the public domain in the EU in 2008, since the EU decided that copyright in all cases lasts merely 70 years after the death of the author, and Lovecraft died in 1937. Project Gutenberg Australia has virtually all of his fictional works up, since the works are apparently in the public domain in Australia. But in the US it's more uncertain. Wikisource has an even more complete collection, though it notes that some works are under copyright, and there is an extensive Wikipedia entry on this question. At the US Gutenberg site, the only works available are some early works, and in fact they've deliberately declined to offer any of the later works. Librivox.org, similarly, only records works from 1922 and earlier (with the sole exception, so far, of "The Shunned House," which first appeared in an amateur press and didn't originally have its copyright registered). You would think copyright would be a moot point since Lovecraft died 74 years ago, but that assumes copyright law actually makes sense. As one Gutenberg staffer says, Lovecraft is a rather interesting case.

To start off, any book published prior to 1923 is in the public domain. Some of Lovecraft's early works fall in this time period, but all of his most famous works, including "Call of Cthulhu," "Dreams in the Witch House," "At the Mountains of Madness" and so on, were written after 1923. Before 1976, all works had to be registered with the copyright office to avoid falling into the public domain, and many of Lovecraft's works (the ones published in amateur presses) were almost certainly never registered. Additionally, any work published between 1923 and 1963 not only had to have been originally registered, but had to have that copyright renewed sometime between 1950 and 1992 to avoid falling into the public domain. If it was renewed, then it is copyrighted until 95 years after publication. Unfortunately, there's no official database that explicitly lists which works published before 1963 had their copyright renewed. The Copyright Office has an online database of works renewed after 1977, but if a work was renewed from 1950-1977, that requires searching through the Copyright Office's physical paper records. The Stanford library has tried to address this with its Copyright Renewal Database, which puts all of the renewals into a digital, searchable form. If we want to know, for example, whether "Call of Cthulhu" is in the public domain, we can do searches for "Lovecraft," "Cthulhu" or "Weird Tales" (the publication in which it first appeared). The only renewals we find are those of August Derleth and Donald Wandrei, Lovecraft proteges who edited and published several early collections of Lovecraft's works. They registered and renewed the copyrights on some collections they published of Lovecraft's works.
But copyrights on such collections normally only cover new material and contributions, namely the arrangement, editing, introductions and any new stories; they wouldn't apply directly to Lovecraft's original work. Thus, if one went back and republished the original Cthulhu story published in Weird Tales in 1928, those copyrights presumably wouldn't apply. And no further evidence of other renewals has been found.
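The rules just described reduce to a small decision procedure. Here's a toy sketch in Python (my own simplification, not legal advice: it ignores notice requirements, foreign publication and other wrinkles, and "renewed" is precisely the fact that's so hard to establish):

```python
# Simplified US copyright status for works published before 1978,
# following the rules described above. Not legal advice.
def us_status(pub_year: int, registered: bool, renewed: bool) -> str:
    if pub_year < 1923:
        return "public domain"              # pre-1923: public domain outright
    if not registered:
        return "public domain"              # never registered: fell into the public domain
    if pub_year <= 1963 and not renewed:
        return "public domain"              # 1923-63 and never renewed: public domain
    return "copyrighted until %d" % (pub_year + 95)  # renewed: 95 years from publication

# "Call of Cthulhu" (Weird Tales, 1928): registered, but no renewal found.
print(us_status(1928, registered=True, renewed=False))  # public domain
print(us_status(1928, registered=True, renewed=True))   # copyrighted until 2023
# Lovecraft's last lifetime publications (1937): 1937 + 95 = 2032.
print(us_status(1937, registered=True, renewed=True))   # copyrighted until 2032
```

Everything below turns on which branch each work actually falls into.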

There may be, nonetheless, some exceptions, namely in works originally published posthumously by Derleth and Wandrei. For example, Derleth and Wandrei registered and renewed the copyright to Beyond the Wall of Sleep, in which "The Dream Quest of Unknown Kadath" first appeared and in which "The Case of Charles Dexter Ward" was first published in its complete form. If they held full rights to these works at the time of publication, they may have successfully renewed the rights to them. But with the rest of the stories, there appears to be no evidence of renewal.

Unfortunately, there have been and still are copyright holders who claim the rights. Chris J. Karr has a long article detailing these claims. As he first explains, there are 23 works of Lovecraft published prior to 1923 and 15 works that never had their copyright registered, but the status of the remaining 27 works is uncertain. Their rights are claimed by Arkham House Publishers, a publishing house founded by Derleth and Wandrei, and by Lovecraft's reconstituted literary estate.

It is generally believed that Lovecraft retained all rights to his works published from 1926 forward (though we don't have documentation to confirm this). In the period 1923-25, he published a few works in amateur presses like The Tryout, which didn't register their copyrights, and six works in Weird Tales, which did register its copyrights. Weird Tales transferred whatever rights it held to Derleth, Wandrei and Arkham House in 1947, which would probably only include the short stories "The Festival," "The Hound," "The Temple," "The Unnameable," "The Horror at Martin's Beach," and "Under the Pyramids." As already stated, there is no evidence that the copyrights on these stories were renewed.

As for the remaining 21 works, published from 1926-37, the rights Lovecraft retained were transferred, upon his death in 1937, to his only heir, his aunt, Annie Gamwell. Gamwell willed the royalties to Derleth and Wandrei, but the copyrights she held passed to her heirs, Edna Lewis and Ethel Morrish. Lewis and Morrish subsequently transferred at least some of their rights to Arkham House in an agreement. The problem is that the language of the agreement in which Morrish and Lewis ostensibly gave rights to Derleth and Wandrei is not clear about what rights were being transferred, and many dispute whether copyrights were actually transferred at all. The question is whether granting "the right to publish H. P. Lovecraft's work" and to "sell second serial rights" constitutes giving them full copyrights or just amounts to giving them permission to publish and collect royalties.

All of this would be moot if the copyrights weren't renewed. Even if there is no evidence of renewal, it's possible that when Derleth and Wandrei renewed the copyrights to collections in which Lovecraft's works were republished, this constituted renewal of the copyright on those works; it's also possible renewals were made that have simply not been found. Karr concludes that all of Lovecraft's works are in the public domain, based on arguments used by Arkham House in a much later lawsuit with Wandrei. The lawsuit was over disputed royalties apparently owed to Wandrei. Arkham House argued that Wandrei was not owed royalties because he lacked the rights to the relevant works, since the copyrights were never renewed. Karr takes this as definitive, since it is the statement of the publishing house itself that it did not own the rights to any of the works of Lovecraft that might be under copyright. I'm not so confident that Karr is right, since it's entirely possible that Arkham House was mistaken, and such a statement in the context of a lawsuit needn't be legally binding; indeed, Karr notes that Arkham House and the reconstituted Lovecraft estate still claim to own the copyrights (for example, the copyright notices of Lovecraft Properties LLC, the reconstituted Lovecraft estate, here and here). But, without any records of renewal, it's about the strongest evidence available.

For my money, I'd say all of Lovecraft's works are probably in the public domain (possibly excepting writings published posthumously, such as "Dream Quest of Unknown Kadath" and "The Case of Charles Dexter Ward"), but there really is no answer to the question of their copyright status. Were the rights properly renewed? Were they transferred to Arkham House? Copyright law has been stretched so far into the past that lost documentation, orphaned works and uncertainties about ownership become more and more problematic. All we can say is that if someone were to challenge copyright ownership, a court would be able to come to a decision, but the decision could go either way. It's doubtful it'll be worth anyone's time and money to try to resolve this in court, and this makes the situation de facto as if the copyrights still hold, since no one wants to risk getting sued (though the copyright holders haven't, at least so far, been zealous in pursuing lawsuits). So long as no one challenges their claim, the persons claiming the rights own them by default. And the issue won't really be resolved until 2032, when the last of Lovecraft's works published in his lifetime will enter the public domain (and I'm excluding the few posthumous publications, which appeared as late as 1944), unless there's another copyright extension, which would just push this whole mess forward.

All of this really illustrates the absurdity of copyright law, which only grows more absurd the further copyrights are extended. Today, copyright covers works published before 1978 for 95 years after their publication, and works created since 1978 for 70 years after the death of the author. Such a term undermines the original intention of copyright law, namely to encourage the creation of art. Not only do rights devolve to people who had no part in the creation of the original works, but the longer the term, the more copyright holders invest in protecting valuable copyrights and the less they invest in creating new copyrightable works. But, as I said before, assuming that copyright law should actually abide by its purported rationale assumes that copyright law actually makes sense.

Friday, May 27, 2011

In one of Nietzsche's notebooks from early 1886 there's a passage in which he wrote "The Titles of Ten New Books." And then he lists out the names of all these books that he's thinking about writing: "The Will to Power," "The Artist," "We Godless Ones," "Midday and Eternity," "Beyond Good and Evil," "Gai Saber," "Music," "Experiences of a Scribe" and "The History of Modern Darkening." Of these ten books, the only one that Nietzsche would end up writing is "Beyond Good and Evil," which he would complete later in 1886. "The Will to Power" he spent considerable time working on, but ultimately he never finished it before the onset of his insanity in January of 1889. The problem for Nietzsche wasn't a lack of ideas or things to write about, it was a lack of time to bring these ideas to fruition. He wanted to write them all, I'm sure, but it took time to write some of them and other projects intervened. He certainly kept himself very busy for the next three years, producing several books, but he just couldn't write everything he wanted to.

I had mentioned earlier that ideas are cheap. Though I hope I have a lot more time ahead of me than Nietzsche, I've still got more ideas for stories and novels than I can write right now, and will probably come up with new ideas before I put all of my current ideas into execution. I'm sure Nietzsche would've gladly traded his overabundance of ideas for some more time. It's basic supply and demand: too many ideas means low value. The only people who are complaining about people stealing their ideas are people that don't have many.

Thursday, May 26, 2011

Peter Thiel, a co-founder of PayPal, has picked 24 finalists to eschew college for two years. The 24 finalists will receive $100,000 and spend the time trying to develop business ideas. The idea basically is that college is simply not worth it for some students (it's expensive and time-consuming, might be a bubble), and they'd be better off finding success through a different route. Thiel's fellowship is not popular with everyone, but it raises questions at a time when many are seeing an expansion of the number of students going to college as a worthwhile goal, and the stats are showing that more and more college grads are going into careers that don't require college degrees.

An extensive study released Wednesday in the journal Business and Politics found that the investments of members of the House of Representatives outperformed those of the average investor by 55 basis points per month, or 6 percent annually, suggesting that lawmakers are taking advantage of inside information to fatten their stock portfolios.

Mark Perry concurs; we should eliminate insider trading laws. Addressing an unfair situation (where some have unfair access to information) with an unfair law isn't an improvement. I think it's fair to say: you can't legislate fairness.
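As a sanity check on the quoted figure, 55 basis points a month annualizes to roughly the 6 percent the study cites (a bit more if you compound). A quick sketch:

```python
# Annualizing the 55 bp/month outperformance quoted above.
monthly = 0.0055                        # 55 basis points = 0.55% per month
simple = monthly * 12                   # simple annualization: 6.6%
compounded = (1 + monthly) ** 12 - 1    # compounded annualization: ~6.8%
print("simple: %.1f%%, compounded: %.1f%%" % (simple * 100, compounded * 100))
```

Either way, an edge of that size, sustained, is enormous by money-management standards.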

Monday, May 23, 2011

Been reading through The Whiskey Rebellion and liking it. One of the interesting things I came across was how, at the time of the Whiskey Rebellion (1790s), it was not uncommon for whiskey to be used as currency in the rather isolated regions of the western United States (at the time, that meant the Appalachian region). Paper currency was still not too hot at the time, because there had been a lot of currency devaluation (namely, printing reams of money) during the Revolutionary War and people were only just learning to trust the dollar; and gold and silver weren't practicable because very little of either made it out that far. So, people were left with what they had, and they found whiskey to be the best available currency.

There are a lot of currencies that people have used over the years, whether it be tobacco, grain, shells or canned fish; people tend to find a workable currency even in rather limited and difficult circumstances, like living in a prison. Gold has tended to be the preferred currency throughout history when people can get at it, and there are a number of reasons why. It's 1) rare, 2) divisible, 3) portable, 4) durable and 5) recognizable. Compared to gold, whiskey's less than optimal, but it still stacks up pretty well. It's not as rare as gold, but in the isolated regions of the Appalachians it was more than sufficiently rare (especially since people liked to drink away their stock); it's divisible; it's portable (much more so than grain); it's fairly durable (it has a very long shelf life, though as a liquid it is subject to loss by spillage); and it's recognizable (in that part of the country people started learning to recognize the taste of whiskey from a young age). And if we look at other widely used currencies, we usually see that they stack up pretty well, or at least better than the available alternatives.

The strange thing, though, is that some fiction authors simply don't understand the nature of money and come up with some truly poor choices. Most fiction authors, if they need a made-up money, will just simplify things by using a fiat currency that works pretty much like our dollars and call it something uncreative like "credits," but some authors try to get creative and use a commodity currency. The problem is they don't know what makes a good commodity currency. The worst offender I think I've come across is the graphic novel Bone by Jeff Smith, which partly takes place in a village that uses chicken eggs as currency. Eggs fall low on four of the five qualities of a good money: they're easily breakable, they're common, they have only a modest shelf life and they're not divisible. A simple agrarian economy would probably find a much more workable form of money, like grain or spirits or, if available, precious metals.
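To make the comparison concrete, the five qualities can be treated as a crude scorecard. The numeric ratings below are just my rough quantification of the informal assessments in these paragraphs (0 = poor, 1 = middling, 2 = good), not data from any source:

```python
# Crude scorecard for commodity monies on the five qualities named above.
# Ratings are illustrative only (0 = poor, 1 = middling, 2 = good).
QUALITIES = ("rare", "divisible", "portable", "durable", "recognizable")

RATINGS = {
    "gold":    (2, 2, 2, 2, 2),   # strong on every quality
    "whiskey": (1, 2, 2, 1, 2),   # less rare, spillable, otherwise decent
    "eggs":    (0, 0, 1, 0, 2),   # common, indivisible, fragile, short-lived
}

def score(good: str) -> int:
    return sum(RATINGS[good])

for good, rating in RATINGS.items():
    print(good, dict(zip(QUALITIES, rating)), "total:", score(good))
```

On this tally, frontier whiskey lands much closer to gold than to eggs, which is the whole point.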

Another example is the 1951 The Day the Earth Stood Still, in which Klaatu's alien civilization uses diamonds for currency. I imagine the writer thought he was being clever, trying to show that these aliens were wiser than us because they'd latched onto a currency superior to ours, since it's more durable and much more portable. It's true that it is portable - a high-quality 1 carat diamond, weighing in at 200 mg, would cost thousands of dollars, whereas 200 mg of gold, at today's prices, would only cost about $11 - but what it makes up for in portability it loses in divisibility. Even the smallest diamonds are out of the price range of most normal purchases, and are so small as to be inconvenient. Diamonds also fail on recognizability, not because they're not distinct, but because they can be too easily forged. Cubic zirconia is simply not distinguishable from diamond by most people, and small diamonds are difficult to examine. Gold can be faked too, but its color is distinct enough that faking usually comes in the form of covering a cheaper metal with a thin veneer of real gold or mixing it with baser metals. But even more important, gold can be stamped. A distinct and difficult-to-forge stamp from a reputable mint goes a long way towards preventing forgery. Such is not the case with diamonds. You could probably mitigate the problems of inconvenience and forgery by putting small diamonds in the assay cards they use for small gold bars, but that completely undermines the portability advantage. In other words, given the drawbacks of diamonds, gold would still serve better: even in the limited space of an aircraft, you could find a place to stuff a few pounds of gold coin, and only a few pounds would be worth more than most people in this country earn in a year.
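For what it's worth, the gold figure checks out to the right order of magnitude. Assuming a spot price of roughly $1,500 per troy ounce (about where gold sat in mid-2011; the exact price here is my assumption):

```python
# Value of a 1-carat diamond's weight (200 mg) in gold, at an assumed spot price.
GRAMS_PER_TROY_OZ = 31.1035
spot_per_ozt = 1500.0                          # assumed USD per troy ounce
price_per_gram = spot_per_ozt / GRAMS_PER_TROY_OZ
value_200mg = 0.2 * price_per_gram             # 1 carat = 200 mg = 0.2 g
print("200 mg of gold ~ $%.2f" % value_200mg)  # around ten dollars
```

So a diamond really does pack a few hundred times the value of gold into the same weight; the trouble, as argued above, is everything other than portability.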

So, I guess I'm just saying, if you want to get creative and imagine a fictional universe using an alternative commodity currency you need to bear in mind what makes a good currency, and even then, boring ol' cliche gold is still probably your best option (King David used foreskins to buy the love of his wife, so I guess there's always that [1 Samuel 18:25-27]).

The rain fell so thick,
Like a forest of reeds hung from the sky.
She parted the raindrops,
Like parting a bead curtain with her hands,
And we darted for shelter,
Where little drips still dripped from off her nose.
And we kissed each other,
With a kiss like a great splash in a puddle.

It's nice to have lots of freedom to be creative, so that you can do exactly what you want. For example, if you live in a country where there are free speech restrictions, this can inhibit your ability to express certain ideas, which can feel stifling. On the other hand, external limitations can spur creativity. External authorities setting limits is exactly what you don't want, but self-imposed limitations can present challenges and push you towards ideas you wouldn't have otherwise. Just as Frost said free verse is like playing tennis with the net down, one could say that limitation is the mother of inventiveness.

One exercise I have found particularly fruitful is adapting dreams. The thing that's great about dreams is that they're so random, filled with nonsensical details that don't fit together. So, what I did, as an exercise in creativity, was to take all the details of a dream that I remembered and try to make them into a sensible story (using as many of the details as I could), by adding background, context, sci-fi or fantasy elements and so on. So, if I had a dream where I was walking over rolling hills of snow in my bare feet, and my feet felt warm even on the snow, and to get into my apartment building I had to crawl through a very narrow cave like a spelunker (that's actually the dream I had this afternoon while taking a nap), then I would try to make a story out of those details that wasn't random, but was comprehensible. Maybe my apartment had been destroyed by an earthquake, so I had to crawl through the rubble; maybe I had an ailment that made my metabolism unbearably high, such that I had to be in light clothing out in the cold just to keep a normal temperature; maybe it'd just become really warm after a huge snowstorm; or whatever. I've tried this many times, and some of the results are off the wall.

Certainly it's not the only way you could try to spur creativity, but it's one way to get really unique creative ideas.

A long time ago Aristotle emphasized that the end and goal of all human action is something called Eudaimonia, which roughly translates as happiness, but probably more accurately means flourishing or living a good life. It means success in all those areas that are important to us, such as accomplishments, material wealth and good human relationships.

Psychology has only recently begun to catch up to this idea. For a long time, the emphasis was on more transient notions of happiness - emotional joy, excitement, pleasure. But this narrow perspective leads into some obvious puzzles:

Why did couples go on having children even though the data clearly showed that parents are less happy than childless couples? Why did billionaires desperately seek more money even when there was nothing they wanted to do with it?

I'd mentioned this problem earlier, in particular with relation to the joys of having kids. The solution, for psychology, is to focus instead on Eudaimonia. Dr. Seligman, for one, has taken this approach, and has broken down flourishing into five main areas:

positive emotion, engagement (the feeling of being lost in a task), relationships, meaning and accomplishment.

I tend to think the latter three are probably the most important, with the former two falling more on the transient side of happiness, but, nonetheless, it's a good list.

There are numerous implications for these ideas. I see a major implication for welfare and charity in the idea that only self-made success brings satisfaction. In other words, welfare and charity need to be pursued as a means to help someone find success on their own, and any welfare that doesn't aim towards or produce those ends is harmful.

It's not always easy to accurately determine the authorship of works from the ancient world. Sometimes works are unsigned or, quite commonly, they're misattributed. For example, of the 43 dialogues attributed to Plato, 14 are generally considered not to have been written by Plato, and an additional 3 are uncertain and debated. The remaining 26 scholars are quite comfortable attributing to Plato and haven't much doubted, meaning only about 60% of "Plato's" works were definitely written by the historical Plato. This misattribution is not just a matter of readers mistaking who wrote these works; often the authors themselves deliberately claimed that the work was written by someone else, such as Plato, a practice called pseudepigraphy. A similar situation occurs with the Pauline epistles in the New Testament. Of the 13 letters claimed to be written by Paul, only 7 are widely agreed to be definitely the work of Paul; the 3 pastoral epistles (1st & 2nd Timothy and Titus) are widely believed not to be Paul's; and 3 other epistles (2nd Thessalonians, Colossians and Ephesians) are hotly debated. This raises the question of why people would do such a thing and what status we should give these works.

There may be many reasons for doing such a thing. For one, people may assign authorship to another when they are working in that author's tradition; it may become conventional to attribute works to the founder of the tradition. For example, it was quite common for authors in the Pythagorean school to attribute authorship to Pythagoras long after he died, making it very difficult for us today to determine what exactly the original Pythagoras contributed to philosophy and mathematics and what is the work of his followers. The reason for this may be part humility, part homage, part tradition.

Generally speaking, people have wanted to assume this as the model and motive for the Pauline pseudepigraphy because it's the most benign and virtuous. In other words, the Pseudo-Pauline authors were simply followers of Paul who believed themselves to be extending Paul's words.

But there are other, less benign reasons why people misattribute authorship. For one, a person might do it purely for personal gain. It has been argued, for example, that Moses de Leon, who originally published the Zohar in Spain in the 13th century, was actually its author, despite attributing the work to a famous 2nd century rabbi, Shimon ben Yochai. According to one account, his wife confessed after de Leon's death that he was the author and that he attributed it to the much more famous rabbi because it would be more profitable that way. Why would anyone listen to an obscure Spanish Jew? But a famous rabbi who lived over a thousand years ago, that is someone worth listening to. Such was also the case with James MacPherson's Ossian and Thomas Chatterton's Rowley poems; both 18th century British poets claimed to have discovered long-lost poems from centuries past.

And sometimes people misattribute authorship in order to attribute false beliefs to someone. For example, the famous Russian forgery The Protocols of the Elders of Zion was written in 1903 to give credence to the idea of a Jewish plot to control the world. The book was presented as if it were the work of Jewish leaders spelling out their plans for world domination, clearly meant to give fodder to anti-Semites who genuinely believed in the idea of a Jewish world conspiracy.

As stated, pseudepigraphy was a common practice in the ancient world, and it certainly extends to the books of the New Testament. Scholars have long known about these issues. Doubted works include the pseudo-Pauline letters, as well as 2nd Peter and certain significant passages later interpolated into the Gospels. Most notable among the interpolations are the story of the woman taken in adultery, the last 12 verses of the Gospel According to Mark, and the so-called Comma Johanneum. But though scholars acknowledge that these are probably not original, they have been reluctant to call any of them forgeries.

Bart Ehrman has excited a lot of controversy by stating that these pseudepigrapha and interpolations are appropriately called forgeries, claiming that more than a third of the books of the New Testament were forged. From a certain perspective, none of this is new. As I said, scholars have long doubted the authenticity of the pieces I've mentioned, as well as others. And, though there is debate about some of them, there are still passages that are widely acknowledged to be unoriginal, the work of much later scribes and authors.

But, by calling them forgeries, Ehrman is making a stronger claim. For one, he's saying that these works are deceptive and wouldn't have found their way into the canon of scripture if ancient readers had realized their true authorship. Ancient authors were concerned with authority, and did believe that certain people had genuine authority that others did not. They were on the lookout for purported forgeries and rejected them, like Marcion's version of Luke.

Additionally, Ehrman argues that such pseudepigrapha were often motivated by clear ulterior motives. In the Pseudo-Pauline 1st Timothy, Paul supposedly wrote: "I do not permit a woman to teach or to assume authority over a man; she must be quiet" (1 Tim 2:12), and a scribe may even have interpolated into the genuinely Pauline 1st Corinthians the command that "women should remain silent in the churches" (1 Cor 14:34). In short, the authors may have been using the authority of more notable figures to give weight to their own ideas and make them appear to have divine sanction.

One wouldn't want to say that of all such passages, though. For example, it's hard to argue that the story of the woman taken in adultery was added with some ulterior motive. It appears simply to emphasize Jesus' prominent message of forgiveness, which is bolstered by other parts of the gospels. And there are other interpolations which may have crept in accidentally. It was common at the time for scribes to add notes in the margin. Since scribes would sometimes add in the margin text they'd accidentally omitted, later scribes would sometimes take this marginalia and integrate it into the text, believing it was supposed to be part of the text. The story of the woman taken in adultery may have started as a marginal note recounting a well-known story, which a later scribe integrated into the text. After passing through the imperfect hands of so many transcribers, corruptions seeped in.

So, what do we make of all this? For one, it confirms something we were already aware of: that the days of early Christianity were a time of great controversy and divisiveness within the budding church, with many groups and theologies jostling for preeminence. On the other hand, whether we want to call these pseudepigrapha "forgeries" is a difficult question, but it does highlight the point that these are the works of genuinely fallible men.

Thursday, May 19, 2011

For Fatma

Every morning lying beside me still
My little wife in peaceful sleep,
Until that little light from windows streaming in
Awakes her to a smile and a morning kiss,
That whisks me into morning blisses.
And every morning still the same
Forever on: delight, elation,
Growing deeper, love inflating.
Every morning getting better
As each one more she besides me lays.

Several months ago, Mike Masnick at Techdirt asked the question "Have We Reached A Tipping Point Where Self-Publishing Is Better Than Getting A Book Deal?" The answer, as he sees it, is definitely yes, at least for many writers. The real services that a publisher provides are editing and marketing, and if you can manage these on your own, or can independently hire someone to do them for you, then self-publishing really is a better deal. In fact, Masnick noted a few months later that "More Authors are Realizing They Can Make A Damn Good Living Self-Releasing Super Cheap eBooks." For many authors, selling cheap ebooks in the 99¢ to $2.99 range is much more profitable than going the traditional route, simply because of the large cut that publishers take and the small pittance they give their writers. Again, that's not to say it's always a bad idea, but it should make a writer think twice about going the traditional publishing route. Last week Kristine Kathryn Rusch went through a long explanation of how poorly publishers really pay writers and argued that, since publishers no longer control the whole market (there are now ways to publish and distribute without them), it's simply a better idea to get those services independently: you can hire editors to edit, designers to design your cover, and marketers to market, or do these things yourself. According to Yahoo, publishers nowadays earn a 16.1% net profit margin, which is very impressive, ranking them 21st out of 215 industries. And David Friedman similarly argued that the traditional services provided by publishers are increasingly being provided competitively outside of publishers, such that publishing might eventually disappear (or at least radically change).

Most notably, for the digital self-publisher, good money can be made at very low prices. Glenn Reynolds at Instapundit, among others, noted back in March the 99¢ Kindle ebook trend, and it got picked up by others. Technium showed us the math of how lowering the price from the already cheap $2.99 to 99¢ can translate into much higher profits, and Coyote Blog noted how lowering the price of his ebook helped even his far-from-bestselling title.
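The pricing math above can be illustrated with a quick back-of-the-envelope sketch. The royalty tiers here are assumptions about Amazon KDP's structure at the time (roughly 35% for books priced below $2.99 and 70% for books priced $2.99 to $9.99), consistent with the 35-cents-per-99-cent-book figure this post works with, but not something to rely on without checking current terms:

```python
# Back-of-the-envelope royalty math for cheap ebooks.
# ASSUMPTION: a two-tier royalty structure of roughly 35% below
# $2.99 and 70% from $2.99 to $9.99, as Amazon KDP used at the time.

def royalty_per_book(price: float) -> float:
    """Author's per-book royalty under the assumed two-tier rates."""
    rate = 0.70 if 2.99 <= price <= 9.99 else 0.35
    return price * rate

high = royalty_per_book(2.99)  # about $2.09 per copy
low = royalty_per_book(0.99)   # about $0.35 per copy

# Sales multiplier needed for the 99-cent book to earn the same
# total royalties as the $2.99 book:
multiplier = high / low
print(f"$2.99 earns ${high:.2f}/copy; $0.99 earns ${low:.2f}/copy")
print(f"the 99-cent book must sell about {multiplier:.1f}x as many copies")
```

Under these assumptions the 99¢ book has to sell roughly six times as many copies just to break even, which is why the strategy pays off only when the lower price expands sales dramatically.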

While all of this is going on, ebooks have only been growing in popularity while traditional paper books decline. Just today Amazon announced that ebooks now outsell all forms of print books combined, in terms of number of copies sold: for every 100 paper books (paperback and hardcover together), they sold 105 ebooks. It helps that ebooks are cheaper, but the overall trend is unmistakable. Traditional paper publishing is declining.

Admittedly, not all of this applies to me all that well. I'm an author at the very beginning of his career, with a limited following, who's going to struggle to market his work. But nonetheless there's a lot to recommend going the cheap-ebook route: I can publish much quicker, don't have to spend months finding a publisher, will fully control the rights, and might not even get that much better marketing from a publisher anyway. Thus, I've published my first release, a collection of short stories titled The Merpeople of Old Lagoon & Other Stories, for National Short Story Month at 99¢. That's the cheapest I can sell an ebook for (except free), and it only leaves me 35¢ per book. So I won't really be seeing anything on this for a while, but we'll keep at it and hope for the best in the long run.

And this summer I'll release a few other works that need to be finalized. And then it'll just be the long hard road of getting my name out. One can't say that the advantage of digital self-publishing is that it's easy.

Wednesday, May 18, 2011

If you're like most people, you probably assume that insider trading is unambiguously bad. You've seen Wall Street, you see Gordon Gekko as the villain, and you probably couldn't imagine that anyone would support insider trading. You'd be surprised, then, that among economists the virtues of insider trading are rather controversial. There's a thorough summary of the debate and the literature on the topic here. Considerable empirical research has gone into the question of the effects of insider trading, without any clear consensus. Many economists think we should drop insider trading laws, and some think we shouldn't.

Attempts to answer the question through empirical research have proven difficult because we have only a limited grasp of the extent of insider trading. There is no doubt that the few persons convicted of insider trading, like Martha Stewart and Raj Rajaratnam, represent a small sliver of the overall practice. Discovering whether someone had special insider knowledge motivating a trade is hard even in the best of circumstances, and it's harder still to discover when insider knowledge leads a person not to make a transaction (for example, if someone was thinking of buying or selling but got wind of some info which led them to decide against it).

One plausible problem with insider trading is that it shifts resources from outsiders to insiders. This would only happen when securities, such as stock, are increasing in price, but the idea is that insiders can use their earlier access to knowledge to buy a greater amount of stock at a lower price, before the rise. The rejoinder to this is that insider trading could permit companies to lower the salaries of management. Because the information managers have access to would be more valuable in the absence of insider trading laws, direct compensation (salary and benefits) could be reduced and the benefits of this pay cut passed on to shareholders.

The main argument for why insider trading is advantageous is that any securities market, like the stock exchange, serves as the consensus among investors on the real value of a company's stock. The stock price is a crucial source of capital for a company, so underpricing can disadvantage a worthy company just as overpricing can unfairly benefit an unworthy one. The market seeks out the correct price through the price changes caused by investors' buying and selling. But since investors always face imperfect knowledge, price will be an imperfect measure of value. The more knowledge those investors have, the better they'll be able to set the price. Insider trading would thus make the price of shares more accurate. The case often cited is that of Enron, which by 2001 had become grossly overpriced, and which crashed in price as soon as it was revealed that the company's accounting practices had masked its profound financial unsustainability. Insider trading would've brought Enron down to reality much quicker, forced it into bankruptcy much sooner, saved many shareholders (those who invested in the company late in the game and lost their shirts) lots of money, and benefited the economy by transferring its resources into the hands of other companies that could put them to more productive use.

From a moral perspective, the counterarguments are that insider trading is fraud and that it's unfair. The idea that it's fraud derives from the earliest Supreme Court decision on insider trading, before there was even a law on the books against it. The Supreme Court, in Strong v. Repide (1909), decided that when a person has access to insider information, doesn't make it public, and then transacts based on that information, that person is committing fraud. This argument relies upon a supposed duty to make such information public. In other words, since I must make such information public, not making it public is a deliberate withholding of information, which is a form of deceit. But without this duty to disclose, the case for deliberate deceit doesn't hold. Certainly there might be cases where such a duty makes sense, such as if you told someone some information and then failed to apprise them of a change in circumstances, or if a person specifically requested information and you withheld it. But it's hard to make the case that all instances of withholding information represent fraud. Additionally, there are clear cases where disclosure isn't feasible: a person may not be in a position to make insider information public (for example, if contractually prevented from disclosing it), or may be acting on non-verifiable information (for example, rumors that can't be confirmed), which it might be risky to reveal, especially if the information turned out to be untrue. In short, it seems unlikely that insider trading necessarily involves fraud, and if there is fraud in particular cases, then the relevant law should apply to that fraud.

The other, probably more common, argument is that insider trading is unfair. The idea is that since the insider has unequal access to certain information, using that information to their advantage is unfair to everyone else who doesn't have access to it. This is undoubtedly true, but such unfairness is also a common feature of life. I use my superior knowledge of philosophy to secure positions teaching philosophy to people less knowledgeable. Real estate agents use their superior knowledge of the real estate market to make themselves useful to potential buyers and sellers. People negotiating transactions frequently try to control information to get the best deal.

Additionally, it's worth asking: unfair to whom? As Robert Murphy explains:

For example, suppose a Wall Street trader is at the bar and overhears an executive on his cell phone discussing some good news for the Acme Corporation. The trader then rushes to buy 1,000 shares of the stock, which is currently selling for $10. When the news becomes public, the stock jumps to $15, and the trader closes out his position for a handsome gain of $5,000. Who is the supposed victim in all of this? From whom was this $5,000 profit taken?

The $5,000 wasn't taken from the people who sold the shares to the trader. They were trying to sell anyway, and would have sold it to somebody else had the trader not entered the market. In fact, by snatching the 1,000 shares at the current price of $10, the trader's demand may have held the price higher than it otherwise would have been. In other words, had the trader not entered the market, the people trying to sell 1,000 shares may have had to settle for, say, $9.75 per share rather than the $10.00 they actually received. So we see that the people dumping their stock either were not hurt or actually benefited from the action of the trader.

...

In fact, the only people who demonstrably lost out were those who were trying to buy shares of the stock just when the trader did so, before the news became public. By entering the market and acquiring 1,000 shares (temporarily), the trader either reduced the number of Acme shares other potential buyers acquired, or he forced them to pay a higher price than they otherwise would have. When the news then hit and the share prices jumped, this meant that this select group (who also acquired new shares of Acme in the short interval in question) made less total profit than they otherwise would have.

In short, in any particular instance of insider trading there will be winners and losers. Some people may actually benefit from another person's insider trading, just as I gave the example of people who bought Enron stocks in 2001 just before it tanked and lost huge amounts of money (if the company's stock had fallen earlier, these people would've benefited). The people who benefit or lose do so by complete accident; there's no fairness to the distribution of benefits and losses in these cases; it's just how it happened.

Additionally, it's also frequently assumed that insider trading is some sort of guarantee. But the reality is that there is still risk involved. The insider information might affect stock prices in surprising ways, not to mention that the insider information may simply turn out to be false.

Thus, though we can't deny that there is a certain unfairness to insider trading, these considerations should mitigate that unfairness, and considering that a market will perform better with better information, it seems that overall insider trading is beneficial. In Wall Street Gordon Gecko had told Bud Fox that the stock market is a zero-sum game, but this is completely inaccurate. The stock market has grown in size considerably since 1987 when the movie was released. A healthy functioning market will in fact be a positive-sum game, and will perform all the better with better information, generally benefiting those who participate in it. There undoubtedly will be winners and losers and inevitable unfairness, but overall insider trading is a net benefit.

If you're like most people, you probably assume that insider trading is unambiguously bad. You've seen Wall Street, view Gordon Gekko as the villain, and probably can't imagine that anyone would support insider trading. You might be surprised, then, that among economists the virtues of insider trading are genuinely controversial. There's a thorough summary of the debate and the literature on the topic here. Considerable empirical research has gone into the effects of insider trading, without any clear consensus emerging: many economists think we should drop insider-trading laws, and some think we shouldn't.

Attempts to answer the question empirically have proven difficult because we have only a limited grasp of the extent of insider trading. There is no doubt that the few people convicted of it, like Martha Stewart and Raj Rajaratnam, represent a small sliver of the overall practice. Discovering whether someone had special insider knowledge motivating a trade is hard in the best of circumstances, and it's even harder to discover when insider knowledge leads a person not to make a transaction (for example, when someone was thinking of buying or selling but got wind of some information that led them to decide against it).

One plausible problem with insider trading is that it shifts resources from outsiders to insiders. This would only happen when securities, such as stock, are rising in price, but the idea is that insiders can use their earlier access to knowledge to buy a greater amount of stock at a lower price, before the rise. The rejoinder is that insider trading could permit companies to lower the salaries of management: because the information managers have access to would be more valuable in the absence of insider-trading laws, direct compensation (salary and benefits) could be reduced and the savings passed on to shareholders.

The main argument for why insider trading is advantageous is that a securities market, like the stock exchange, serves as the consensus among investors about the real value of a company's stock. Stock is a crucial source of capital for a company, so underpricing can disadvantage a worthy company just as overpricing can unfairly benefit an unworthy one. The market seeks out the correct price through the price changes caused by investors' buying and selling. But since investors always face imperfect knowledge, price will be an imperfect measure of value; the more knowledge investors have, the better they'll be able to set the price. Insider trading will thus make the price of shares more accurate. The case often cited is that of Enron, which by 2001 had become grossly overpriced and crashed as soon as it was revealed that the company's accounting practices had masked its profound financial unsustainability. Insider trading would've brought Enron down to reality much quicker, forced it into bankruptcy much sooner, saved many shareholders (those who invested late in the game and lost their shirts) a lot of money, and benefited the economy by transferring resources into the hands of other companies that could put them to more productive use.

From a moral perspective, the counterarguments are that insider trading is fraud and that it's unfair. The idea that it's fraud derives from the earliest Supreme Court decision on insider trading, handed down before there was even a law on the books against it. In Strong v. Repide (1909), the Court decided that when a person has access to insider information, doesn't make it public, and then transacts based on that information, that person is committing fraud. This argument relies upon a supposed duty to make the information public: since I must make such information public, not making it public is a deliberate withholding of information, which is a form of deceit. But without this duty to disclose, the case for deliberate deceit doesn't hold. Certainly there are cases where such a duty makes sense, such as if you told someone some information and then failed to apprise them of a change in circumstances, or if a person specifically requested information and you withheld it. But it's hard to make the case that all instances of withholding information constitute fraud. There are also clear cases where disclosure isn't even an option: a person may be contractually prevented from disclosing certain information, or may be acting on non-verifiable information (rumors, say) that it would be risky to reveal, especially if it turned out to be untrue. In short, it seems unlikely that insider trading necessarily involves fraud, and where there is fraud in particular cases, the relevant fraud law should apply.

The other, probably more common, argument is that insider trading is unfair. The idea is that since the insider has unequal access to certain information, using that information to their advantage is unfair to everyone who doesn't have access to it. This is undoubtedly true, but such unfairness is also a common feature of life. I use my superior knowledge of philosophy to secure positions teaching philosophy to people less knowledgeable. Real estate agents use their superior knowledge of the real estate market to make themselves useful to potential buyers and sellers. People negotiating transactions frequently try to control information to get the best deal.

Additionally, it's worth asking: to whom is it unfair? As Robert Murphy explains:

For example, suppose a Wall Street trader is at the bar and overhears an executive on his cell phone discussing some good news for the Acme Corporation. The trader then rushes to buy 1,000 shares of the stock, which is currently selling for $10. When the news becomes public, the stock jumps to $15, and the trader closes out his position for a handsome gain of $5,000. Who is the supposed victim in all of this? From whom was this $5,000 profit taken?

The $5,000 wasn't taken from the people who sold the shares to the trader. They were trying to sell anyway, and would have sold it to somebody else had the trader not entered the market. In fact, by snatching the 1,000 shares at the current price of $10, the trader's demand may have held the price higher than it otherwise would have been. In other words, had the trader not entered the market, the people trying to sell 1,000 shares may have had to settle for, say, $9.75 per share rather than the $10.00 they actually received. So we see that the people dumping their stock either were not hurt or actually benefited from the action of the trader.

...

In fact, the only people who demonstrably lost out were those who were trying to buy shares of the stock just when the trader did so, before the news became public. By entering the market and acquiring 1,000 shares (temporarily), the trader either reduced the number of Acme shares other potential buyers acquired, or he forced them to pay a higher price than they otherwise would have. When the news then hit and the share prices jumped, this meant that this select group (who also acquired new shares of Acme in the short interval in question) made less total profit than they otherwise would have.
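Murphy's arithmetic can be checked with a quick sketch. All the figures below come from his hypothetical example, not from real market data:

```python
# Worked arithmetic for Murphy's hypothetical Acme trade.
# Every number is taken from the quoted example above.

shares = 1_000
buy_price = 10.00    # price when the trader buys on the overheard tip
news_price = 15.00   # price after the news becomes public

trader_gain = shares * (news_price - buy_price)
print(trader_gain)   # 5000.0 -- the "handsome gain" in the quote

# Counterfactual for the sellers: without the trader's extra demand,
# Murphy supposes they might have had to settle for $9.75 per share.
counterfactual_price = 9.75
sellers_benefit = shares * (buy_price - counterfactual_price)
print(sellers_benefit)  # 250.0 -- the sellers were, if anything, helped
```

The point the numbers make concrete: the trader's $5,000 gain isn't subtracted from the sellers, who by Murphy's counterfactual actually pocketed $250 more than they otherwise would have.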

In short, in any particular instance of insider trading there will be winners and losers. Some people may actually benefit from another person's insider trading; recall the example of people who bought Enron stock in 2001 just before it tanked and lost huge amounts of money (had the company's stock fallen earlier, those people would've benefited). The people who benefit or lose do so by complete accident; there's no fairness to the distribution of benefits and losses in these cases. It's just how it happened.

Additionally, it's frequently assumed that insider trading is some sort of guarantee. But the reality is that there is still risk involved: the insider information might affect stock prices in surprising ways, not to mention that it may simply turn out to be false.

Thus, though we can't deny that there is a certain unfairness to insider trading, these considerations should mitigate it, and considering that a market performs better with better information, it seems that insider trading is beneficial overall. In Wall Street, Gordon Gekko tells Bud Fox that the stock market is a zero-sum game, but this is completely inaccurate; the stock market has grown considerably in size since 1987, when the movie was released. A healthy, functioning market is a positive-sum game, and it performs all the better with better information, generally benefiting those who participate in it. There will undoubtedly be winners and losers and inevitable unfairness, but overall insider trading is a net benefit.

Tuesday, May 17, 2011

One thing you must often think about in storytelling is when to reveal surprising information to the audience. Some storytellers withhold it, usually for one of two reasons: to create a big surprise at the end, or to create a mystery that keeps the audience's attention. But there's also something to be said for revealing surprises much earlier, to create anticipation. Allow me to explain.

Twist endings are sometimes overused, but a well-done twist ending can really make a story: it's a punctuation mark at the end that leaves you with a strong impression when you put down a book or leave a movie theater. Many things can make a good twist, but I think the real test of whether a twist is good is whether knowing it significantly changes the experience of the story. Among films we can think of a number of celebrated twist endings (Psycho, The Sixth Sense, Fight Club, Les Diaboliques, The Usual Suspects, La Jetée, Twelve Monkeys); when we go back and watch these movies, we realize each is a very different movie the second time. This is because the twist is integral to the story. When you first see The Sixth Sense, you see the story of a psychologist trying to redeem himself in the midst of a failed marriage while trying to help a troubled child. The second time around, you instead see a deluded ghost bonding with a clearly reluctant kid while he watches his wife painfully mourn his passing. This is a good way to use a twist: a good twist should stretch back and change the whole story. If a twist seems tacked on, dispensable, or confusing, it's probably not a good idea.

Creating a mystery has a different use: keeping your reader or viewer interested in finding out the secret. This is standard practice in mystery stories, but it is the stock-in-trade of many other stories as well. Since we are innately curious creatures, we have a hard time leaving a mystery unsolved, which makes it difficult to stop a movie or put down a book while questions remain unanswered. Of course, it's important to make the mystery intriguing enough that we want to discover the truth. But the main thing to remember about mysteries is that the revelation must be commensurate with expectation. In other words, if the audience is led to believe there's some big mystery, they shouldn't find out that it's something small and mundane; that's a big disappointment. So if you're going to build a mystery up, you need to make it something big: either one earth-shattering revelation or many smaller revelations that combine into something complex and intricate.

But sometimes you want to avoid mystery and surprise and instead reveal information early to create anticipation. This usually works through dramatic irony: the reader or viewer is told something a character has yet to find out. It works well with particularly dramatic and emotionally charged revelations. Think of the classic example, Romeo and Juliet. The story was already popular when Shakespeare adapted it for the stage, so the ending was probably well known to his audience; but even for those who didn't know how it would end, the play announces up front that it will end tragically, with the two lovers dying. The effect is to create expectation of the emotional event, giving the emotions more time to build and rise, thus making our reaction to the event stronger. Horror movies frequently do the same with rising tension before a big scare. It's even become a cliché for horror movies to use music to signal something ominous, getting the audience tense so that we jump all the more when something suddenly pops up on screen.

We can also illustrate this with a more recent example that handles it particularly well: the pilot episode of Twin Peaks. In the first few minutes we learn that Laura Palmer is dead, and almost immediately the scene cuts to Laura's mother, who is unaware of what has happened and wonders where her daughter is. At this point we can easily anticipate what's coming: the mother is going to find out her daughter is dead and is going to be devastated. A pall hangs over the scenes of the mother calling around asking people if they've seen her daughter, which immediately makes her a sympathetic character. Because we know what's coming, we can see the revelation approaching as the parents are about to learn the truth. Our sadness is given more time to grow and thus is felt much more strongly.

All three techniques (the twist, the mystery, and the early reveal) have their place and fit well in certain types of stories. The important point is not to overuse any of them, and to know when one or the other is called for.

Monday, May 16, 2011

In the late nineteenth century it was a common belief that sex drained the life energy out of a person (see, for example, here). One can imagine this theory being expounded by men who noticed how tired and lethargic they were after sex and who, believing that we are sustained by some unknown life energy, concluded they must have lost a little bit of their life in the process. The French had the now-familiar term "la petite mort," "the little death," which originally referred to this loss of life energy. It wasn't just a metaphor: it was believed that too many of these little deaths could lead to a big death. A parallel between semen and blood was drawn, as if every ejaculation were comparable to blood loss, and too many ejaculations could eventually kill you. In fact, there were case studies in late-nineteenth-century medical literature of nymphomaniacs simply dying of exhaustion.

All the more reason, then, to worry about hypersexuality, given its supposedly serious health consequences. Surprisingly, it was female sex obsession, nymphomania, that was considered more common, much more so than the male version, satyriasis. The term nymphomania apparently derives from the 1775 work of a French researcher, Dr. Bienville, who wrote the first full study of the subject, Nymphomania, or a Dissertation Concerning the Furor Uterinus. His proposed list of causes included "eating rich food, consuming too much chocolate, dwelling on impure thoughts, reading novels, or performing 'secret pollutions' (masturbation)." Like many of his followers in the coming centuries, he believed that excessive sexuality in women, as well as in men, was a disease and something that could be cured.

Times have changed, and beginning in the mid-twentieth century the field of psychology began to abandon the idea of hypersexuality as a mental disorder or disease. For one thing, it's no longer believed that there are any health downsides to having too much sex. And there is simply too much variability in people's libidos and in the ways people express their sexuality for particular practices to be considered a disorder or disease. Some people like to have more sex than others.

Nonetheless, some people still consider high levels of sexual activity a possible addiction. In Der Spiegel, Frank Thadeusz writes against this idea of sex addiction and against self-help groups that propose to treat it as a disease, like alcoholism or drug addiction.

Now, first of all, we should distinguish between a habit and an addiction. We all have many habits, whether it's always reading the newspaper over breakfast, taking a certain route to work, or checking email every fifteen minutes. Usually, when we speak of an addiction, we mean a habit that seriously interferes with your life: not just a bad habit, like hitting the snooze button every morning when the alarm goes off, but something truly harmful, like a drug or alcohol addiction. When excessive sex was believed to have serious health consequences, calling it an addiction might have made sense, but not anymore.

Even then, labeling it a disease might not be helpful, since a disease is normally something you're a victim of. People may like to see themselves as victims of their bad habits, but that is too much an evasion of responsibility. Certainly you don't choose whether a cold virus or a cancer starts spreading, but you do choose whether to drink alcohol or have sex. That isn't to say it's an easy choice, or that a strong urge might not pull you toward something (people overcoming addictions genuinely struggle and often fail), but it is still within the realm of choice.

On the other hand, we should be a bit reluctant to condemn groups that try to help people break bad habits. There is certainly room for helping people who have, say, a bad overeating habit, or what they believe to be an unhealthy obsession with sex. Breaking such habits can be really hard, and though people get over them in different ways, support can genuinely help some people. The problem is when these groups promote an ideology that misunderstands or misrepresents the problem. Sex addiction is really a label that only works when self-applied, and proper responsibility needs to be accepted: if you believe your sexual habits are interfering with your life, then cutting back is good. But when you start to think another person is abnormal simply because their habits deviate from your own, that's not good.

Glenn Reynolds has a long post, with a response from a teacher, arguing that high school English should be much more focused on writing persuasive arguments. I can't say I disagree. I'm someone who loves literature and fiction and reads (and writes) a lot of it, but being able to present and defend a position is a much more valuable skill, and so long as we remember that a teacher can't teach everything and that teaching priorities must be set, we should admit that writing argumentative essays should be a much higher priority. If there's one really important thing a person can get out of philosophy, it's learning the history of philosophical arguments for various ideas, more so than the ideas themselves.

Saturday, May 14, 2011

I had a dream once that I saw a mystical teacher and his three apprentices, all dressed in spartan, beige-colored habits and bearing closely cropped hair, jogging through a field. It was the half-light of dusk, and the field was of long brown grass that glowed golden in the light of a sun sinking over the distant mountains. Their goal was a pilgrimage to a faraway, high-up temple of some vague ominousness, a dark place of uncertain evil. There the youngest of the three apprentices, the initiate, would be put through a dangerous ritual, a purgation, wherein he'd be cleansed of evil. But along this path the four of them were being stalked by a small but malicious creature that lurked in the long grass and sought to pick them off one by one.

I woke up at this point, but the question, which perhaps you can answer, is what was going to happen next.

Robin Hanson had a post a couple of weeks ago asking whether people would support redistributing GPA in the same way most people support redistributing income; namely, people in the top 10% or so would give some of their GPA to people in the bottom 10% or so. People were largely against GPA redistribution, though most think redistributing income is fine. Hanson's point was that people are naturally hypocritical because, though they support one rather than the other, they couldn't readily give reasons for this belief.

In fact, even if we look at it carefully, the difference is hard to find. People might think it's unfair to take GPA from someone since they earned it, but the same goes with money, since people who are wealthy have earned their wealth as well. One might respond that though many rich people earned their wealth, many were born into fortunate circumstances, such as upper-middle-class or wealthy parents, and thus didn't earn it. But then one just responds that since GPA is heavily influenced by intelligence, and intelligence is largely determined by innate ability, which is unfairly distributed, you don't really earn your GPA either. One might then respond that redistributing GPA would muck up the incentives, making high-achieving students work less hard for high marks (since their GPA will be taken away) and making poor students work less hard to pull themselves out of the bottom 10% (since they'll get free grades anyway). But the same applies to wealth redistribution, which affects incentives in the same way.

The difference really comes into focus when we recognize the different roles and different importance of the two: grades are primarily informational (they convey how much a student learned and how well they performed) and they're not really critical (you won't die without good grades); money, on the other hand, is a medium of exchange and a store of value, and it is critical in a market economy (without it you can't get food, clothing, or shelter, and thus potentially face death). These aren't perfect arguments, and we could quibble about these differences too, but they're pretty good and get at the key point: because wealth is essential for functioning in society (because of its role in the economy), redistributing money is a much bigger help to those in the bottom 10% than redistributing GPA; one might even say redistributing wealth is necessary for people in dire need. The question is whether people who prefer the redistribution of wealth but not of GPA are recognizing this difference and are just unable to articulate it. Since understanding things you can't articulate is not unheard of, this is plausible.

I suspect, though, that this is not the reason most people feel this way. I think it's rather that people feel (accurately or inaccurately) that they've earned their grades and that people who are wealthy haven't really earned their money; or perhaps they think the unequal distribution of money is uniquely unfair (unlike other unequal distributions, such as intelligence, physical attractiveness, athletic ability, height, or thinness); or perhaps they just assume that, since wealth redistribution is the norm whereas GPA redistribution isn't, that's the way it should be (status quo bias). Any attempt to rationalize or explain such assumptions is going to be post hoc.

In short, we can believe that people have pretty good reasons they can't articulate, or believe (as I think more likely) that they simply have unquestioned assumptions they've never thought about (though if they're clever, they'll find good ways to rationalize them). I lean toward the latter because I know that when I was younger I too, like most people, supported the redistribution of wealth but wouldn't, if ever asked, have supported redistribution of GPA, and my beliefs have changed (not about redistribution of GPA) precisely because I started to think about these assumptions.

There are many who argue that in education we should focus more on creativity and independent thinking. The reasoning is basically that these will always be valuable skills, while focusing too much on facts and memorization is counterproductive since much of that knowledge may quickly become obsolete. The problem with this line of argument is that it's not entirely clear how, or even whether, these things can be taught.

The Guardian yesterday compiled a number of opinions from various writers who briefly weighed in on the question of whether creative writing can be taught. Their opinions range from "not really" to "somewhat." Unfortunately, they don't go into the question in too much depth and I think that more can be said.

There are a few things that should be noted. First, even if we don't know whether writing skill can be taught, we know it can be learned. No one is born a Shakespeare. Access to the childhood writings of great writers consistently shows that they generally wrote as poorly as other children. Somehow they learned a few things about writing along the way.

Second, the more knowledge you retain and the better you are able to use that knowledge, the more creative you'll be. Memory, particularly working memory, is a big component of overall intelligence. Many people like to draw a diametrical division between rote memorization and creativity, but the reality is that the better a person's memory functions, generally the more creative they are. And the best way to improve memory is to memorize things. That doesn't mean rote memorization is the best way to improve memory, just that memory skill is important. Also, the larger the storehouse of knowledge you can draw from, the more creative you'll be. That means that learning lots of facts can be very beneficial.

Third, a teacher can only be involved in a fraction of a student's learning. There are only so many hours of teaching time, and these few hours are sufficient to impart only limited knowledge. Students need to be learning on their own, especially if they ever want to attain a skill that requires a great many hours, such as sophisticated, well-developed writing ability. K. Anders Ericsson famously estimated, based on considerable research, that it takes on average 10,000 hours or 10 years to master something (if not more, in highly competitive fields). That's a lot more hours than I, or any teacher, probably wants to spend teaching a student.

Fourth, you can lead a horse to water, but you can't make him drink. In other words, you can't force a student to learn anything they don't want to learn. In fact, a student doesn't even need to stubbornly refuse to learn; simply not paying attention or not studying is enough. In short, the student needs to be involved in the learning process. A good teacher can motivate students, but the student still needs to meet the teacher halfway.

Fifth, writers are not entirely aware of, or able to explain, what they know. There are many skills that we develop and master non-linguistically and non-consciously. For example, the ability to throw a football forty yards with pinpoint accuracy cannot be explained, because it's a matter of training your muscles and your nervous system to coordinate themselves in such a way that you can simply think, "Throw this ball to hit that target," and it will happen. Your muscles and nervous system are conditioned in such a way that you generally perform worse if you think too consciously about it; decisions need to be streamlined so that they can basically be made below the conscious level, quickly and without deliberation. The way you condition them is with practice, repetition, and feedback, and a lot of it. A talented writer's sense about what is appropriate, what fits, what the best way to say something is, what should happen in a plot, what a character should do, and so on, is also frequently non-conscious; such decisions have been developed through the experience of lots of reading and writing. This means that writers are only aware of what they are doing in a limited sense; as such, they really have only a partial understanding of what they're doing right, not to mention that they might also fall into bad habits that they mistakenly think are favorable. Thus, their ability to pass on useful information will also be limited.

Sixth, being creative means doing things differently. In other words, you can teach someone how other people have done things, but being creative means doing things in a different way. By definition, if you teach someone something and they parrot it, they're not being creative.

If we take these points in combination, I think we get a good idea of what it means to teach creative writing: at best, a teacher can facilitate a student's self-learning. By definition you can't really teach creativity, but you can help students become more creative. And the best way to teach students would be to convey the more readily teachable knowledge, like writing techniques, tropes, conventions, and genres, through giving examples and reading quality writing, while trying to motivate them to broaden their knowledge independently. In short, as to the question of whether creative writing can be taught, the answer is: "sort of, but not really."

Friday, May 13, 2011

I'm pretty convinced that trial and error is the most consistently powerful method for attaining knowledge. There are just way too many ways for a person to fail and make mistakes in reasoning. So it's interesting that Ben Goldacre over at the Guardian should write an article titled "How can you tell if a policy is working? Run a trial." In other words, if you're considering making a law, run a trial to see what the best way to write the law is.

A couple of caveats that Goldacre doesn't mention, of course. For one, trials are not always feasible; sometimes they're too expensive or simply unethical. Second, trials can only tell you how well a certain policy achieves a specified goal; they can't tell you unconditionally what the best policy is. For example, if a policy increases safety but compromises privacy, it's an open question whether the increased safety is worth the compromised privacy, and it certainly isn't self-evident that lawmakers should always be making these decisions on behalf of the populace. Third, always remember to include a control. In other words, if you're trying to evaluate Law A vs. Law B, you want to make sure you include (if possible) a control that evaluates what happens without either law; sometimes not creating a law to begin with is better. Lastly, it's a bit unrealistic to expect governments, after a long history of ignoring scientific evidence and expert opinion, to suddenly start building policy on evidence. Alas, there's been too much research in public choice theory to make one think that politicians will consistently enact the laws that best reflect the evidence.
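The point about including a control can be sketched with a toy simulation. All the effect sizes below are invented purely for illustration; a real trial would measure actual outcomes:

```python
import random

random.seed(0)  # fixed seed so the toy results are reproducible

def simulate_outcome(effect, n=1000):
    """Average measured outcome for n subjects in one trial arm.
    `effect` is a hypothetical shift in the mean outcome."""
    return sum(random.gauss(effect, 1.0) for _ in range(n)) / n

# Three arms: two candidate laws plus a no-law control.
# Effect sizes are made-up illustration values, not real data.
arms = {"Law A": 0.3, "Law B": 0.1, "control": 0.0}
results = {name: simulate_outcome(effect) for name, effect in arms.items()}

best = max(results, key=results.get)
# Only adopt a law if it actually beats doing nothing.
if results[best] <= results["control"]:
    best = "control"
print(best)
```

Without the control arm, `max` would always pick one of the two laws, even in a world where neither does any good; the control is what lets "pass no law" win.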

When I lived in New York City I had a roommate named Eric who worked in a comedy club. He was a tall twenty-something, the child of a white mother and a black father, with a very sarcastic sense of humor and an easy-going personality. He aspired to be a playwright and had been writing plays and trying to get productions of his work off the ground.

A couple of times Eric had been visited by an old friend of his who was studying at Brown in Providence. She was getting her PhD in English and intended to become an English professor. She was also of mixed parentage, half-Filipino, half-Latina, a short woman in her early twenties with a frankly annoyingly squeaky voice, but nonetheless intelligent and funny.

This was the second or third time she'd come to visit, and she always stayed in Eric's basement bedroom next to mine, since there was no other room for her in the house that five of us, including Eric and me, shared. It had been a spontaneous, unplanned visit that she'd decided upon at the last minute, choosing to spend one of her weekends with Eric in New York instead of back in Providence.

What neither I nor Eric realized was that she was romantically interested in Eric. They'd been friends for quite a while, so Eric thought nothing of her visits, and they were able to share a bed in his small bedroom without any sex. She, on the other hand, didn't realize that Eric had just begun dating an older co-worker who waitressed at the comedy club. This co-worker was a woman with considerably more history than Eric, a twice-divorced woman whose most recent divorce occurred after she found her husband in bed with another woman. An impetuous and fiery woman, she'd left her husband right then and there, despite having nowhere to live. She ended up living at a women-only shelter, which was still her home at the time.

That weekend evening, while Eric was working, Eric's friend and I went to the comedy club to see a show. We talked along the way, traveling from our place in Queens to the comedy club in Midtown Manhattan. We arrived a bit early and walked around. Then we went inside and waited in the downstairs bar for the previous set to finish up before we could take our seats in the main room. This whole time we were talking—about the usual, about literary topics, about what life was like at Brown, about what was going on in Eric's life. A few minutes before the show was about to start, she mentioned the perils of dating co-workers, and I mentioned casually that Eric had just started dating a co-worker of his. Her whole demeanor changed when she heard this, and she suddenly became completely silent.

We then immediately went up to hear the show. The show overall was lots of fun, and she and I laughed the whole way through, but this was only a reprieve. After the show she was sullen and untalkative. Whereas before the two of us had had no problem keeping up a conversation, now there were long silences, while I tried to get her talking and she didn't say anything. We rode back on the subway to Woodside, Queens, and I insisted that she tell me what was wrong. She at first refused, but eventually admitted that she needed to tell someone her secret.

As we walked around the neighborhood, she told me about her growing fondness for Eric. They'd gone to school together and become close friends. There'd been some ambiguity about the status of their relationship, and it had seemed at one point, late in their senior year, that Eric was trying to transition from friends to lovers. He'd taken her out on a date, and she balked, wanting to remain only friends. After he'd moved to New York and she'd gone to graduate school in Providence, her affections toward Eric grew, and she'd been hoping to come down to New York to try to seduce Eric and make a romantic connection. She told me not to tell Eric about all this, and I promised I wouldn't.

She had to spend the night again, sharing the bed in Eric’s room, but then she promptly left in the morning, telling Eric that something had come up and she needed to get back to Providence. He hadn’t questioned this, and though it was apparent that something was wrong, he attributed it to whatever this thing that had come up was. I kept my silence about her secret, while he told me about how uncomfortable it was to share a rather small bed with someone. You just can’t sleep as well. He said that he’d actually hoped that I would’ve seduced her so that he could have a bed all to himself. I laughed at the joke, but still kept my silence.

Tuesday, May 10, 2011

Beginning when we were about six or seven, my parents began sending my sister and me out to California to spend several weeks with relatives. Generally it was two weeks with my mother's sister Mary and her family and two weeks with my father's brother Bruce and his family. We always had fun with Bruce and his family, but our experience at Mary and her husband Nick's place was a bit different.

Informally, this time with Nick and Mary was known as "Nick's work camp," because there was invariably a significant amount of physical labor involved in the vacation. Nick and Mary lived on a plot of land in the Trinity Alps region of northern California, a sprawling area in the woods, with ponds and streams and wild blackberries, filled with several homemade buildings and piles of junk. I use the term "junk" very broadly here, since it included everything from non-functioning cars and tractors to drawerfuls of nuts and bolts. Nick would buy this stuff off of people and then hopefully resell it to others, sometimes fixed up, sometimes not. There tended to be more stuff flowing in than out, so the collection of stuff only grew.

The work we were put to varied a lot, from one summer helping them build one of their homemade buildings (a simple three-walled structure used for storing bike parts and fixing up bikes), to helping them dredge for gold at the bottom of a pond. But one of the most memorable jobs we were given was moving a huge pipe. The pipe in question was a section of steel pipe, some three feet in circumference and fifteen feet long, that probably weighed in at half to three quarters of a ton. The challenge was to move this pipe from the spot in the woods where it'd been left to Nick and Mary's property. Fortunately, Nick had a huge flatbed truck that could carry that much weight. But getting the pipe from the ground up onto the raised bed of the truck was the really difficult part. We had to first roll the pipe out of the ditch where it'd been dumped, then pull it up onto the bed with a number of ropes and a couple of ratchet hoists. It was a lot of grunt work and struggling that achieved its goal but left us beat afterwards.

I'm reminded of it as I read about a man, Wally Wallington, almost single-handedly constructing a cement replica of Stonehenge. He explains it on his homepage, and you can see how he's able to very efficiently move and raise stones vastly heavier than the pipe we were moving. Techniques like his can help explain how many ancient structures built from massive multi-ton stones could have been erected. It also helps explain the mysterious construction of Edward Leedskalnin's Coral Castle in Florida. It makes a person tempted to try to build a Stonehenge in their own backyard.

I think about how such techniques might have been applied to our relatively simpler problem to make it easier and doable with a smaller crew. But I guess when you've got enough people to do it with brute force, why bother with more delicate techniques?

Monday, May 9, 2011

I was just reading through Aristotle's Rhetoric and found Aristotle discussing judicial torture. He doesn't take sides on evidence extracted under torture (since this is a book about rhetoric), but he nonetheless shows a fairly critical attitude. In context, he is pointing out arguments that, in a law court, could be used either to promote or to dismiss such evidence. He ends up explaining at greater length why such evidence is untrustworthy, saying: "we can destroy [evidence extracted under torture's] value by telling the truth about all kinds of torture generally; for those under compulsion are as likely to give false evidence as true, some being ready to endure everything rather than tell the truth, while others are equally ready to make false charges against others, in the hope of being sooner released from torture" (1377a2-6). What's interesting is that cases of false information extracted under torture were apparently well enough known, even in Aristotle's day, that a person could cite examples of them.

This issue has come up recently because some information extracted under the torture of Khalid Sheikh Mohammed eventually led to the discovery of the hiding place of Osama bin Laden. Some people have used this as evidence that torture has its value. The argument for torture is a form of Utilitarian argument, basically stating that torture is justifiable in certain circumstances, when the benefit of the information extracted vastly outweighs the evils of the torture. For example, if you torture someone to get information that stops a terrorist plot that would kill thousands of people, then it's justified. Such an argument might be a bit tenuous in this case, since it's not clear that killing Osama will undermine Al Qaeda or make a dent in the threat of terrorism, but there still might be some cases where it applies.

Steve Chapman over at Reason writes about the weakness of this argument. The problem is that it relies on a couple of key assumptions: that information extracted under torture is reliable, and that torture is the best way to extract information. But the unreliability of torture has been known at least as far back as Ancient Greece. Since it's all too common for innocent people who know nothing to be tortured into giving false information just to make the torture stop, about the only circumstance in which torture might be reliable is if you have a suspect who you know is holding some information but who refuses to divulge it.

Even here, torture doesn't appear to be the best strategy. People are human and are generally willing to divulge information to people they trust and bond with. If Stockholm Syndrome is real, then you can gain the trust and empathy even of prisoners and get them to willingly speak openly about what they know. In fact, it can work even for people who don't really feel empathy. Remorseless serial murderer Pedro Lopez confessed to his several hundred murders after police mercilessly beat him into subm... Oh wait. No, they didn't do that. They just put a priest in the cell with him, the priest earned his trust, and Pedro Lopez confessed. He even took police out to show them the burial places of the victims he'd killed in the area. Steve Chapman points out that gaining trust is probably more productive, since it'll lead the subject to divulge more volubly, whereas under torture subjects will try to give as little information as possible. In other words, if you try to torture some kernel of information out of a guy, he might finally divulge it, but he'll completely close up from then on. Gaining someone's trust and bonding with them requires a bit more finesse and creativity, but it's worth the effort and will generally lead to more information.

The case that torture is horrible and unethical is a pretty easy one to make, but torture is sometimes defended on Utilitarian grounds, arguing that there are cases where the ends justify the means. Unfortunately, even these arguments seem highly doubtful and should lead us to conclude that torture is never justifiable.
