Most people don't have the time to become "serious" philosophers (write a book, etc.); this site is an outlet for the straggling insights that they are able to share. Whether your thoughts are manicured or inchoate, if you want to impart your perspectives, perceptions, intuitions, or counter-intuitions on life, politics, nature, the color yellow, anything at all, this is the place to do so. I'll share insights as I have them, and continue to contribute as thoughts come to me.

Monday, November 2, 2009

The important question is not whether a person procrastinates (most people do). The relevant question is what a person does during that procrastination period. What matters is how we use the time: at best, we procrastinate away from a same-level anti-goal by fulfilling a higher-level goal.

Sunday, September 13, 2009

Awkwardness, naturally, is uncomfortable. But perhaps it is time to desensitize ourselves to this social distress and recognize the utility (and therefore the positivity) of a construct that gets a bad rap.

To understand how awkwardness may be useful, I will first acknowledge that there are several different strains of awkwardness: the kind that results when a guy, out with one girl, runs into a second girl he's seeing is conceptually different from the kind that ensues when a girl refuses a social situation that presents a moral dilemma. The focus here is on the latter type. In this case, what does awkwardness truly represent? It is a dissonance between one's personal beliefs and the circumstances presented, made available, or preferred by the person one is interacting with. To leave minimal ambiguity: such a moment might occur when a romantic couple gets into an argument and one of the dyad feels such discomfort that, despite his or her deepest values contradicting whatever the other believes, he or she acquiesces and distances him- or herself from those values.

The way the situation is presented above, of course, makes it seem obvious that deviating from one's values would be a poor choice, and it may even seem like an easy problem to field. But how often has anyone encountered a similar situation of varying degree? It is more common than you think. One example: you don't want to make a "scene" with a store clerk by saying something, even though they dropped your fresh fruit such that you are now the owner of bruised fruit (maybe you could get new fruit), or stretched a sweater as they wrapped it such that it will no longer look good on you (again, it could be replaced). These are situations one may encounter on a typical day.

While some of these instances are relatively benign and low-impact, others carry more serious ramifications for failing to transcend the awkwardness (a girl saying "no" to a guy in applicable situations, etc.). Regardless of the gravity of the consequences, however, it is important to train ourselves to assess the situation properly. Each instance is an opportunity to access one's beliefs; they are accessible, for they are a requisite for the feeling of awkwardness. To the extent that you allow the dysphoria of awkwardness to overwhelm you, you may not make a good decision, for that decision likely goes against your values. The point need not be belabored: in short, if you resign yourself to the fact that awkwardness is going to be awkward, you can to an extent immunize yourself against its deleterious effects on decision-making. You can then draw the (obviously) logical conclusion to stay with, instead of stray from, your values. Becoming more comfortable with awkwardness is the antecedent to a host of superior decisions that you will, in retrospect, be happy to have made.

Monday, September 7, 2009

Sometimes the "right thing" to do is not the correct thing to do... oh, when conscience is maladaptive! The goodness of conscience must be importantly qualified by the fact that we only get one life and must do what makes sense for us. This may seem to contradict my post on doing the right thing the wrong way being superior to doing the wrong thing the right way. But contrasting the correct thing with the "right thing," in light of that qualification, is not necessarily the same as endorsing the wrong thing :)

Tuesday, August 18, 2009

Conversation overheard in a NYC lobby between a four year old and her mother:

Girl: Mom! Mommmm!
Mother: Yes, dear...?
Girl: What is 'arm' without the 'r'?
Mother: 'Am.'
Girl: Nooo. It's 'm.' I said 'arm'!
Girl again: What is 'arm' without the 'r'?
Mother: 'Am.'
Girl: Nooo! I said 'arm'! The answer is 'm,' Mom!

This cycle went on for another two rounds, each campaigner sticking to her initial answer. The mother eventually stopped answering, and the girl eventually stopped asking. It's interesting, and maybe sad, that the mother appeared too worn to gain the perspective that fresh ears and a bit of developmental and social psychology afforded me in disambiguating the misunderstanding.

My first guess when the girl asked, "What is 'arm' without the 'r'?" was the same as her mom's: 'am.' Then I realized, through the girl's answer, that her mom and I had something in common that we did not share with the girl: we could both read. The girl's answer relied on phonetics as opposed to spelling. What the not-yet-literate girl was really asking was, "What is 'arm' without the 'ar' (sound)?" Her answer, 'em' (m), then, was based on sound, not spelling. The mother, however, was unable to take her child's perspective and recognize the difference in approach to solving the riddle. I was going to point this out to the mother, but instead just enjoyed having caught the exchange.

This episode quickly reminded me of something in psychology called the Stroop Effect.

Answer this in your head: What is the color of this text: green

If you are reading this, then you're literate, and like other literate people you likely first thought 'green,' if you didn't actually say it as well. Yet most young children are quick to correctly indicate the color of a word that does not match the word itself: for instance, the word 'green' written in pink lettering above. Why? Not-yet-literate children have little problem identifying the word as pink in color (the letters g-r-e-e-n mean nothing to them), but literate people get caught on the discrepancy between what their logical brains are reading and the color stimulus itself. This conflict slows their answer of 'pink,' and adults will often mistakenly say 'green.'
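The structure of the task is simple enough to sketch. Below is a minimal, hypothetical Python illustration of how congruent and incongruent Stroop trials differ (the color list, trial format, and function name are my own invention for this sketch, not part of any standard Stroop software):

```python
import random

# Hypothetical set of color words / ink colors for this sketch.
COLORS = ["red", "green", "blue", "pink"]

def make_trial(rng, congruent):
    """Build one Stroop trial: a color word rendered in some ink color.

    On a congruent trial the word and ink match; on an incongruent
    trial (the interesting case) they deliberately conflict.
    """
    word = rng.choice(COLORS)
    ink = word if congruent else rng.choice([c for c in COLORS if c != word])
    return {"word": word, "ink": ink, "congruent": congruent}

rng = random.Random(42)
trials = [make_trial(rng, congruent=(i % 2 == 0)) for i in range(20)]

# The correct response on every trial is the ink color, never the word.
```

The incongruent trials are the ones that catch literate readers: the word and the ink disagree, and reading wins the race unless the reader slows down to name the ink.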

It is important to be sensitive to the superiority of doing the right thing the wrong way over doing the wrong thing the right way. Foundation is everything; subsequent missteps are easier to correct.

Saturday, June 27, 2009

It seems that the lines- the social rules, courtesies, and hierarchies- that once gave us an opportunity to discriminate (in the non-prejudiced sense) one person from another have been softened to the point of imperceptibility. Two linked topics of great concern are privacy and celebrity.

People may assume that because we live in a world where little of one's life cannot be made public through some medium- whether Facebook, blogs (whoops!), or Twitter- people somehow no longer care about keeping these things private. Formalized, the concern is that people think others don't value privacy anymore. I do value privacy.

A friend the other day took it upon himself to find my phone and start reading through my text messages. Really... really?? If this doesn't sound outrageous to you, then perhaps you have become desensitized to the softening of social courtesies that I am speaking about. I mean this in the least judgmental tone possible, for I think it's easy for anyone to get caught up in it. But a phone is a personal device for keeping track of various communications, and its contents are likewise personal. If I wanted you to see my conversations with people, I would have put them in a Word document, printed off a personal copy for you, and maybe even highlighted the good stuff for easy reader viewing.

This is far from saying that I have anything to hide. Being the person I am, there is essentially no drama for vicarious social vultures to feed upon. That said, this does not in any way grant permission or mitigating circumstances for viewing what is private.

It boils down to this, which I know I have previously written about: we need some intangible things, beyond our bodily divisions, that separate us from others in order to give us a sense of being separate. These features that make us us, and separate, are not necessarily in the name of making us "special," "unique," or "better" than others; but, for God's sake, it seems like a human right not to be mere operators of common knowledge about one another, when some of that information technically belongs to us alone. We become estranged from our own identities if this happens.

The second part, or maybe example, of these softening lines is the distinction between celebrity and non-celebrity. This line has all but disappeared. Celebrities want to be normal: they join reality shows outside their professional domain to experience a rush on the dance floor. They send Twitter messages to people on the sidewalks and interact with fans. News anchors laugh with one another in too human a way and joke when they mess up. Celebrities create Facebook profiles that they probably do not (and should not) have time to maintain if they want to remain great at what they originally gained recognition for. Likewise, non-celebrities want to- and can- become "celebrities" through any extreme act. They sing horribly or dress scantily on American Idol and may appear on the finale show and get a Golden Idol from Ryan Seacrest. Survivor alumna Elisabeth has gone on to host The View and marry pro quarterback Tim Hasselbeck. Even celebrities don't stay put, but move laterally within a profession characterized ever more broadly as 'entertainment.' They are singers who really want to be actors; actors whose dream it really is to produce; ex-reality stars who always wanted to host a talk show and who, with enough visual appeal, attempt to conceal a lack of talent in the delivery of their lines and the crudeness of their articulation, and try to pass off their opinions as authority.

A certain standard has to be reestablished, and merit-based status rewarded. In this attention free-for-all, the loudest person will always be right. And we are all being taught to shout.

I am all for standing out in the crowd and differentiating myself, but I'd really like to do it in an 'honest' way. Competitions in track, for instance, are determined by time or distance; if I want to stand out and be the best in an event there, I have to do so against certain guidelines. The 'lines' help us keep track: we can differentiate ourselves from others, and even from our previous performances. Some people argue against lines because they confine us, but even the most creative people who "drew outside the box" made use of the lines by virtue of having drawn outside them!

The lines give us a reference and combat the free-for-all. They help establish a standard for who is what (I think people are afraid of being confined to labels and so want the aforementioned lateral movement; but again, we need certain intangible boundaries that give us a sense of self). Finally, lines provide structure for social expectation and comfort- they inform us that it is not okay to disrespect the boundaries of personal space and information. It is time to reestablish standards, recalibrate our expectations, and re-sensitize ourselves to our rights as individuals.

Saturday, June 6, 2009

We are all a lot more capable of doing things than we think we are, for better and for worse. The range of actions in which we may engage is much wider, on both sides of our norm, than we can consciously anticipate or entertain. Society often promotes this idea in a one-sided form- that we are capable of doing good things, advancing our skills, or breaking records beyond what we originally thought possible. Yet surely we have all also done things we said we'd never do, things that were incomprehensible or unfathomable to us until we found ourselves doing them.

We often deny the unconscious influence of the situation, but it is there. Consequently, our judgment of others' actions should perhaps be tempered, calibrated in light of this fact to assign their intentions a little less weight than we normally would, for it could just as well be we who venture out into the tails of the behavioral bell curve.

Saturday, May 23, 2009

One's need for approval is great leverage for another's power.

In fact, the motivation for many actions is predicated on the need for approval. Of course, this is no endorsement of exploiting that precept; on the contrary, knowledge of it can be used to extend consideration to the person who suffers on the short end of the equation.

Monday, May 4, 2009

We are all artists. We all start out with the same, monochromatic palette: time. Every day, we paint our canvas with this time, and every day, whether we wanted to or not, whether we felt like it or not, a canvas is completed. How artfully, how skillfully we use this limited time determines the painting- a masterpiece or a mere mediocre facsimile- that we sign as the day draws to an end. Tonight, what will the canvas you put your name on look like?

Thursday, April 23, 2009

Private information is crucial to maintaining a sense of self, for a 'self' has boundaries. In an age of sharing and even advertising oneself across social networking sites, self-presentation and proving oneself have often led us to keep nothing private. As a result, however, we suffer, for the boundaries that define a self are made permeable.

In the process of making an excellent public case for our worth, the more we share, the less we have left that only we know. And that can be a problem, because confidence likely comes from being able to say, "there's more where that came from." When we put everything on our social or professional resume, there is nothing left to back up our public assertions.

Consequently, over-sharing may foster the Imposter Phenomenon, whereby we believe we are not as great as others think we are and fear that we cannot live up to the reputation we have come to possess. That inflated reputation is not surprising, given how good we have become at selling ourselves. A lesson: we should not always act on the impulse to make a case for our greatness at first impression, because we may not be able to live up to it if we have nothing new to offer later. That is not to say that people are not great and don't have great things to share. But sharing too much and keeping nothing private will likely induce a sense of being a fraud. After all, knowledge- especially self-knowledge- is power.

If we continue to let over-sharing cause this inflation, we may fall from grace in the form of reduced confidence. And all we had to do was practice not saying so much.

Thursday, April 16, 2009

...But we become great because of how we carve out our existence in life, and that sometimes utilizes restraint, and moderation. You see, greatness is not always achieved by defying rules and trail-blazing, but by how we use these parameters to leverage and define ourselves...

Monday, March 2, 2009

Whether or not we have a proclivity for formal education, we cannot deny our inextricable relationship with learning, nor that we need it and find it useful. Through learning, for instance, upon hearing a noise whose origin we cannot visibly discern, we often automatically know which noises we can ignore, which we should run toward and attend to, and those... from which we should run like hell! This calculus of inaction versus action, and of the valence of action, applies to myriad other everyday encounters, and for its facilitation we owe due respect to our ability to learn, associate, and detect covariation among stimuli.

Tuesday, February 24, 2009

When we are motivated to form an impression- in general, or one of a particular valence more specifically- we unintentionally activate a biased information-processing system that will give us exactly what we are looking for. The on-line processing mode encodes information differently; it both retrieves early informational entries more often and biases subsequent evaluations on an unrepresentative recall of information, contrary to what the complete, objective information set would offer. This is the science behind what many of us knew all along: if you really want to be accurate, you can't go in with expectations! Easier said than done...

Saturday, January 31, 2009

Structure is the scaffold to which meaning can affix itself. Structure represents a deliberate coordination of stimuli in an overstimulated world. The resulting coordinated bundles of stimuli signify something simply by virtue of having been assembled (involuntary, innate meaning), and can additionally take on meaning through volition and subsequent external input.

Friday, January 30, 2009

Sorry, coming through, but I'm going to draft-dodge time, if that's alright with you... I don't intend this to be viewed as insubordination of a sort, or even as a rejection of a system I don't think makes much sense. Rather, I see it as the sensible adoption of a metric that is more adaptive for us in the long run.

If we continue to use time, well then, we all get old. Our number gets bigger whether we like it or not. And does anyone want to be "old"? The connotations aren't pretty. We can escape, though, by dodging time just as Bill Clinton dodged the draft: we simply don't have to participate if we don't want to.

There are other association metrics besides time that are inaccurate in their own ways- metrics that constrict our open minds, conflate unrelated issues, and produce judgment errors about what people should or should not be. For example, the Zac Brown Band acknowledges one such oversight in their "Chicken Fried" lyrics: "There's no dollar sign on a peace of mind." You can't buy serenity; that you must earn with a different currency. People likewise assume some relationship between weight and happiness. For those who think being skinny is a unique predictor of happiness (in regression statistics, the equivalent of producing a significant squared semi-partial correlation), please talk to all those unfortunate anorexic girls and ask them how much satisfaction they get out of a good meal with friends...

So why do we allow time to dictate our perspective? The guys in Rent had it right when they asked:

"How do you measure, measure a year?
In daylights, in sunsets, in midnights,
In cups of coffee,
In inches, in miles, in laughter, in strife.

In five hundred twenty-five thousandSix hundred minutesHow do you measureA year in the life?

How about love?"

And just as money and weight are fallible prophets of contentment and happiness, respectively, so too is time a poor predictor of the "age" of a state of mind. I've been called "a baby" and "ma'am" within 24 hours of each other, the evaluations coming from a man 30 years my senior and a girl who has been around ten years fewer than I have; apparently they could not agree on how "old" I was. So instead of letting them duke it out, I think I'll negotiate my age and not worry about either label. I'll try to avoid labeling "older" or "younger" altogether, because I think there is even an age associated with labeling (and it's old age): you can only label something when you think you've got it figured out. In contrast, I plan to stay in an apprentice mindset, taking on new things and adjusting my thinking as the terrain changes. That mindset allows you to call something what it is- an approach I'd argue is more adaptive than assuming certain qualities or an amount of accrued knowledge based on objective age. Plus, I'm pretty sure an "apprentice" is associated with youth. So I'll stick with that.

Being excused from the parameters of time additionally allows us to be "opposite" things at the same time. I think dodging time is a pretty wise move... and yet I'm young. But now I can be both.
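A footnote for the regression parenthetical earlier in this post: the squared semi-partial correlation of a predictor is the unique share of variance it explains in the outcome beyond the other predictors, which equals the drop in R-squared when that predictor is removed from the model. Here is a rough Python sketch on purely synthetic data (the variable names and coefficients are illustrative inventions, not claims about real weight-happiness data):

```python
import numpy as np

def r_squared(y, X):
    """R-squared of an ordinary least squares fit of y on X (intercept added)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) ** 2).sum()

rng = np.random.default_rng(0)
n = 500
thinness = rng.normal(size=n)    # hypothetical predictor of interest
other = rng.normal(size=n)       # hypothetical covariate
happiness = 0.2 * thinness + 0.8 * other + rng.normal(size=n)  # synthetic outcome

# R-squared with and without the predictor of interest.
r2_full = r_squared(happiness, np.column_stack([thinness, other]))
r2_reduced = r_squared(happiness, other.reshape(-1, 1))

# Squared semi-partial correlation of `thinness`: the unique variance
# in `happiness` it explains beyond the covariate.
sr2 = r2_full - r2_reduced
```

A significance test on that difference is what would tell you whether a predictor like thinness adds anything unique; the post's point is that, in real life, it doesn't.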

Tuesday, January 13, 2009

The day before the highly anticipated, emotionally charged BCS national title game, which our university's team was playing in, I excitedly asked another graduate student where they'd be watching the game. "What game?" they responded.

This slice is, scarily, a representative one of what it means to be a PhD candidate or, more generally, a graduate student: technically you are still a citizen of humanity... just, well... twice removed.

Let me backtrack. There is a saying that when you are in college (i.e., undergrad), you're in a bubble- not keenly aware of world events, important political issues, or frankly of the weather 50 miles outside the perimeter of Eden. All that really matters is campus life and the school's sports teams' performances, and national holidays are only significant if they equate to a day off from class. And while this bubble may be impervious to the "real world," its danger does not seem to reach precarious levels.

Graduate school, on the other hand, affords the recipe for an actually jeopardous existence. Graduate life is not only safely sealed off from the real world; it is additionally sheltered from the events constituting campus culture (like, say, a national championship bid!). Effectively, we find ourselves in a "double bubble." And that is dangerous: while the first bubble perhaps shields us from the unpleasant things of the world- thus a blasé bubble, a good bubble- the second bubble amounts to armor against even the serenity afforded by the first, outer bubble.

Sure, there is the occasional outlier grad student who's a triple threat- up to date on the war on terrorism, attending all of the university's sporting events, and somehow still finding time to do groundbreaking research (I know this may not sound like a traditional triple-threat combination to anyone outside of academia, by the way :P)- but even this is a rarity. We graduate students are thus a distilled population operating in a vacuum-like setting, supposedly generating relevant knowledge for those a world or two away.

So just as you have those distant cousins, once, twice, four times removed (whatever number pleases you), with whom you seldom interact and to whom you're not really related in the emotionally attached, kinship kind of way, the bubble life of grad students- that is, the double bubble- requires that we qualify our citizenship of humanity with an asterisk. And that asterisk makes us citizens of humanity twice removed.

Tuesday, January 6, 2009

It is futile to attempt to control the past, for we cannot; in this sense, the past does not matter. The future, of course, does seem to matter; however, it does not matter today, for it cannot be directly controlled. The only thing we really can control is today (i.e., day n), and therefore today is the only time that matters- the most important day of our lives. Likewise, tomorrow (day n+1) will be the most important day of our lives... tomorrow. So the future is important- or, I should say, will be important- but only as a function of the current day, one day at a time.

Furthermore, today is not merely a connector between yesterday and tomorrow, though we'd think so from the amount of time our thoughts spend in those two extremities. What if we simply extinguished from our vocabulary whichever of the three words (yesterday, today, tomorrow) our thoughts dwell on the least? How empty would we feel if we eliminated the crucial word- today- that fills the space between yesterday and tomorrow? We'd probably gasp for the word 'today' as we would for air while suffocating. It is through such an exercise that we realize that today is the most important time.