Posts tagged ‘About’

Appropriation [uh-proh-pree-ey-shuhn] in the arts is the use of pre-existing objects or images with little or no transformation applied to them. The use of appropriation has played a significant role in the history of the arts (literary, visual, and musical).

Appropriation can be understood as ‘the use of borrowed elements in the creation of a new work.’ In the visual arts, to appropriate means to properly adopt, borrow, recycle, or sample aspects (or the entire form) of man-made visual culture. Most notable in this respect are the ‘Readymades’ of Marcel Duchamp: ordinary manufactured objects that the artist selected and modified as an antidote to what he called ‘retinal art.’

Fair use is a limitation and exception to the exclusive right granted by copyright law to the author of a creative work. Examples of fair use include commentary, search engines, criticism, news reporting, research, teaching, library archiving, and scholarship.

It provides for the legal, unlicensed citation or incorporation of copyrighted material in another author’s work under a four-factor balancing test (purpose and character; nature of the copied work; amount and substantiality; and effect upon the work’s value). Along with the public domain, fair use is one of the ‘traditional safety valves’: techniques that balance the public’s interest in open access with the property interest of copyright owners.

Assemblage [uh-sem-blij] refers to a text ‘built primarily and explicitly from existing texts in order to solve a writing or communication problem in a new context.’ The concept was first proposed by Johndan Johnson-Eilola (author of ‘Datacloud’) and Stuart Selber in the journal, ‘Computers & Composition,’ in 2007. The notion of assemblages builds on remix practices, which blur distinctions between invented and borrowed work.

Johnson-Eilola and Selber discuss the intertextual nature of writing, and they assert that participation in existing discourse necessarily means that composition cannot occur separate from that discourse. They state that ‘productive participation involves appropriation and re-appropriation of the familiar’ in a manner that conforms to existing discourse and audience expectations.

Information wants to be free is a slogan of technology activists invoked against limiting access to information. According to criticism of intellectual property rights, the system of governmental control of exclusivity is in conflict with the development of a public domain of information. The iconic phrase is attributed to American writer Stewart Brand who, in the late 1960s, founded the ‘Whole Earth Catalog’ and argued that technology could be liberating rather than oppressive.

The earliest recorded occurrence of the expression was at the first ‘Hackers’ Conference’ in 1984. Brand told Steve Wozniak: ‘On the one hand information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other.’

The reliability of Wikipedia (primarily of the English-language edition), compared to other encyclopedias and more specialized sources, is assessed in many ways, including statistically, through comparative review, analysis of the historical patterns, and strengths and weaknesses inherent in the editing process unique to Wikipedia.

Several studies have been done to assess the reliability of Wikipedia. A notable early study in the journal ‘Nature’ found that in 2005, Wikipedia’s scientific articles ‘came close to the level of accuracy in Encyclopædia Britannica’ and had a similar rate of ‘serious errors.’ The study was disputed by ‘Encyclopædia Britannica,’ and ‘Nature’ later responded to this refutation with both a formal response and a point-by-point rebuttal of Britannica’s main objections.

Infornography is a portmanteau of ‘information’ and ‘pornography’ used to define an addiction to or an obsession with acquiring, manipulating, and sharing information. People ‘suffering’ from infornography enjoy receiving, sending, exchanging, and digitizing information. The definition (without explicitly using the term itself) is also widely applied in many cyberpunk settings, where information can almost be considered a currency of its own, in a sense facilitating the development of an alternate world for ‘escapism.’ Megacorps, hackers, and other kinds of people use information to thrive; they can subtly be called infornographers.

The term was popularized by the 1998 Japanese TV cult cyberpunk series ‘Serial Experiments Lain,’ an avant-garde anime influenced by philosophical subjects such as reality, identity, and communication. The series focuses on an adolescent girl living in suburban Japan, and her introduction to the ‘Wired,’ a global communications network similar to the Internet. Communication, in its wider sense, is one of the main themes of the series, not only as opposed to loneliness, but also as a subject in itself. Director Nakamura said he wanted to show the audience — and particularly viewers between 14 and 15 — ‘the multidimensional wavelength of the existential self: the relationship between self and the world.’

Dionysian imitatio is the influential literary method of imitation as formulated by the Greek author Dionysius of Halicarnassus in the first century BCE, who conceived it as the rhetorical practice of emulating, adapting, reworking, and enriching a source text by an earlier author. It marked the beginning of the doctrine of imitation, which dominated the Western history of art until the 18th century, when the notion of romantic originality was introduced.

The imitative literary approach is closely linked with the widespread observation that ‘everything has been said already,’ a sentiment recorded by Egyptian scribes as early as around 2000 BCE. The ideal aim of this approach to literature was not originality, but to surpass one’s predecessors by improving on their writings and setting the bar higher.

In contemporary psychology, the ‘Big Five’ are five broad domains or dimensions of personality which are used to describe human personality: openness, conscientiousness, extraversion, agreeableness, and neuroticism. Openness involves active imagination, aesthetic sensitivity, attentiveness to inner feelings, preference for variety, and intellectual curiosity.

A great deal of psychometric research has demonstrated that these qualities are statistically correlated. Thus, openness can be viewed as a global personality trait consisting of a set of specific traits, habits, and tendencies that cluster together.
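The claim that facets ‘cluster together’ can be made concrete with a toy calculation. The sketch below uses entirely synthetic scores for two hypothetical openness facets (invented for illustration, not real psychometric data) and computes their Pearson correlation by hand; a high coefficient is the statistical signature of clustering.

```python
# Toy illustration of facet clustering: synthetic scores on two openness
# facets that co-vary, and their Pearson correlation computed by hand.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical facet scores for ten respondents; curiosity tracks
# imagination with a little noise, as facets of one domain tend to.
imagination = [2, 3, 3, 4, 5, 5, 6, 7, 8, 9]
curiosity   = [3, 2, 4, 4, 4, 6, 6, 8, 7, 9]

r = pearson(imagination, curiosity)
print(f"r = {r:.2f}")
```

With these made-up numbers the coefficient comes out high (around 0.9), which is the kind of pattern psychometric studies report across real facet scales.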

In psychology, novelty seeking (NS) is a personality trait associated with exploratory activity in response to novel stimulation, impulsive decision making, extravagance in approach to reward cues, and quick loss of temper and avoidance of frustration. It is considered one of the temperament dimensions of personality. Like the other temperament dimensions, it has been found to be highly heritable. High NS has been suggested to be related to high dopaminergic activity (which plays a major role in reward-motivated behavior). When novelty seeking is defined as a decision process (i.e., in terms of the tradeoff between foregoing a familiar choice option in favor of exploring a novel one), dopamine has been shown directly to increase novelty-seeking behavior. Specifically, blockade of the dopamine transporter, causing a rise in extracellular dopamine levels, increases the propensity of monkeys to select novel over familiar choice options.

A research study found that novelty seeking had inverse relationships with other temperament and character dimensions, particularly harm avoidance and, to a more moderate extent, self-directedness and self-transcendence. Novelty seeking is positively associated with the five-factor-model trait of extraversion and, to a lesser extent, openness to experience, and is inversely associated with conscientiousness. It is also positively related to impulsive sensation seeking and to psychoticism in Eysenck’s model.

The Liberal Arts is a curriculum of seven subjects, the first three of which are called the trivium (grammar, rhetoric, and logic). Its literal meaning in Latin could have been ‘appropriate to the street corner, commonplace, vulgar.’

In medieval Latin, it came to refer to the lower division of the Liberal Arts (the other four subjects formed the quadrivium, namely arithmetic, geometry, music, and astronomy, which were more challenging). Hence, ‘trivial’ in this sense would have meant ‘of interest only to an undergraduate.’ The meaning ‘trite, commonplace, unimportant, slight’ occurs from the late 16th century, notably in the works of Shakespeare.

A digital mashup refers to digital media content (e.g. text, graphics, audio, video, animation) drawn from pre-existing sources to create a new derivative work. Digital media have made it easier for potential mashup creators to create derivative works than was the case in the past, when significant technical equipment and knowledge were required to manipulate analog content. Mashups raise significant questions of intellectual property and copyright. Beyond questioning the law, mashups also question the very act of creation. Are artists creating when they use other individuals’ work? How will artists prove their creative input?

A major contributing factor to the spread of digital mashups is the World Wide Web, which provides channels both for acquiring source material and for distributing derivative works, often at negligible cost. Web- or cloud-based mashup applications are a combination of separate parts brought together using the open architecture of public application programming interfaces (APIs). For example, a mashup between Google Maps and Weather.com could be made available as an iPhone application, where the content and the context of that content are drawn from outside sources through the published APIs.
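This pattern can be sketched in a few lines of Python. Everything below is hypothetical: the function names, response shapes, and stubbed data are invented for illustration, and a real mashup would make HTTP calls to the providers’ published APIs instead of reading local stubs.

```python
# Minimal sketch of an API mashup: location data from a (stubbed) mapping
# service combined with conditions from a (stubbed) weather service to
# derive a new composite record. All names and data are hypothetical.

def fetch_map_data(place: str) -> dict:
    """Stub standing in for a geocoding API call."""
    stubbed = {"London": {"lat": 51.5074, "lon": -0.1278}}
    return stubbed[place]

def fetch_weather_data(lat: float, lon: float) -> dict:
    """Stub standing in for a weather API keyed on coordinates."""
    return {"temp_c": 14, "conditions": "overcast"}

def mashup(place: str) -> dict:
    """Merge the two independent sources into one derivative record."""
    coords = fetch_map_data(place)
    weather = fetch_weather_data(coords["lat"], coords["lon"])
    return {"place": place, **coords, **weather}

print(mashup("London"))
# {'place': 'London', 'lat': 51.5074, 'lon': -0.1278, 'temp_c': 14, 'conditions': 'overcast'}
```

The point of the sketch is the shape of the work: neither source knows about the other, and the new value lies entirely in the combination.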

A long tail refers to the statistical property whereby a larger share of the population rests within the tail of a probability distribution than would be observed under a normal distribution. This has gained popularity in recent times as a retailing concept describing the niche strategy of selling a large number of unique items in relatively small quantities – usually in addition to selling fewer popular items in large quantities. The concept was popularized by Chris Anderson in an October 2004 Wired magazine article, in which he mentioned Amazon.com and Netflix as examples of businesses applying this strategy.
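A minimal numerical sketch of this property, assuming a Zipf-like power law in which the k-th most popular item sells in proportion to 1/k (the catalogue size and the cutoff for ‘hits’ are arbitrary choices for illustration):

```python
# Toy long-tail illustration: item popularity follows a Zipf-like power
# law (sales of the k-th ranked item proportional to 1/k), then we ask
# what share of total sales sits in the tail past the top 1,000 "hits".

N = 100_000                             # catalogue size (arbitrary)
sales = [1.0 / k for k in range(1, N + 1)]
total = sum(sales)

head = sum(sales[:1000]) / total        # share held by the top 1,000 hits
tail = sum(sales[1000:]) / total        # share held by the 99,000 niche items

print(f"head share: {head:.2%}, tail share: {tail:.2%}")
```

Under these assumptions the 99,000 niche items past the top 1,000 still account for roughly 38% of total sales, which is the retailing intuition behind the long tail; under a sharply concentrated popularity distribution, that tail share would be negligible.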

However, a 2008 study by Anita Elberse, professor of business administration at Harvard Business School, calls the Long Tail theory into question, citing sales data which shows that the Web magnifies the importance of blockbuster hits. Also in 2008, a sales analysis of an unnamed UK digital music service by economist Will Page and high-tech entrepreneur Andrew Bud found that sales exhibited a normal distribution; they reported that 80 percent of the music tracks available sold no copies at all over a one-year period.