About

Saturday, April 27, 2013

I have to say I’m not a
fan of having something explained away as not being “needed.” The phrase
proves fairly vague and, well, not very descriptive. From my perspective, it’s a
fiction book. None of it is needed. The world would not be worse off if it were
never written. I don’t care if it’s Lord of the Rings or Romeo and Juliet; if we didn’t have it, we’d just have something
else to take its place. Entertainment has its purpose, but it takes too many
forms to be considered a necessity, because it’s not like we’re going to die
without books, or be unable to find satisfactory substitutes.

But, while editing, we
find ourselves with the problem of what’s important and what isn’t, what should
be cut and what should be changed. To solve an issue with the work, there are
an infinite number of choices to be made. We don’t like the character? We could
delete him, change his gender, cut some dialogue, alter the backstory, edit the
perception the other characters have of him, or even change his name, each option
being a completely viable possibility.

When we say something
“isn’t needed,” it’s a nice way of trying to tell the author there is a
problem, and that problem is solved by cutting. If, in reality, an extra scene
had no negative consequences, then there would be no reason to get rid of it.
Often, however, what the advisor means is that the scene has indirect costs
just by existing, at least when he’s telling the truth. Things like: the story
is too long and this is an easy cut. Or, more likely, he is not being upfront,
and he really means there is something wrong with it (it’s boring) and it
happens to be an easy cut. If, however, it’s not boring and the book isn’t too
long, but it’s also not especially interesting and the book isn’t too short,
then leaving it as is is as much of an option as changing it. In the
hypothetical situation where it stands completely neutral, it really, truly
doesn’t matter whether it’s in there or not.

The thing is, when it’s
someone else telling you to delete something, there are certain benefits. The big
one is that they’re narrowing down your options. The problem really arises
not when someone else has told us our scene is irrelevant, but when we have to
decide, for ourselves, what we should cut, or simply how to flesh it out,
because the issue is not to decide if you agree or disagree, it’s coming up
with an opinion in the first place.

-Every scene should
develop one or more of the following: character, tone, setting, plot, or theme.

Now, technically, an
author could easily say that anything furthers one of the five. The important
thing is to not lie to yourself, especially when the only one you’re talking to
is yourself.

Do you feel the scene
actually develops one of these elements, or do you feel that the scene just could? If it’s the latter, it means that, in all
likelihood, it doesn’t.

It can get confusing
because a good story has such complex and complicated settings, characters,
plots, and themes that, yes, you could attribute a character going to the
bathroom to one of those things. The best way to tackle this is to remember
that you are a good writer and to pretend that workload isn’t an issue.
That way, you won’t feel bad about cutting a scene you’ve decided is pointless,
and the effort of having to do it won’t be as big of a factor.

Remember:

-A million things
happened to these characters in their lifetime. Why is the narrator choosing to
describe this moment at this time?

-To illustrate the
deliberateness and, more importantly, the “I put a lot of thought into this”
feeling that is indicative of quality, everything requires more than one
reference. If the point is that your character is shy, his shyness needs to be
illustrated at least three times, even if it’s just a form of development and
not relevant to the plot. Does the scene only say one thing, and does that
thing ever come up again?

-Nothing changes from
the scene before to the scene after.

Is there any piece of
information in the scene that changes something in the story? Are the
characters now angry with each other, which motivates their unwillingness to
help? Can they no longer continue in the same direction they could before? Do they
have a new motivator that gives them a plan and some energy?

Every scene should have a
growing effect on the story, whether it be mood, objectives, strategy, or even
the reader’s understanding and perception of the world or the characters. If a
scene “resets,” akin to the way a television episode would, often that means it
can be cut easily.

Of course, that’s not
always the case. Sometimes a scene will constantly be referenced later on, or
major events will happen in it, in which case a cut would mean a complete
rewrite of later work.

If the latter is the
case, it still doesn’t mean that the scene is okay as is. The optimal choice, when
cutting is not an option, is to add more information (about the setting, the
characters, or any of the other elements) and have your story deal with the
ramifications of the event. Some characters are now mad at each other, some are
grieving or scared, some feel guilty, they are now wanted by the law, or
another character has a vendetta against them.

Sometimes the best change will demand a whole new direction for the book to
take, and it’s important to recognize that while that isn’t the only choice, it
is still something to consider, despite it going against your original vision.
It’s a hard path to take, so many authors will limit themselves to less
desirable routes for the sake of less work or even fear of ruining what they
already have.

-When this piece of
information is repeated, fleshed out, and shown several times, does it deliver
something else the reader didn’t know before?

On the other end of the
spectrum, sometimes a scene can be superfluous if it reveals a piece of
information everyone already knows. Now, as I just said, it is beneficial to
discuss a topic, such as a character’s vanity, more than once, but there is a
difference between cementing and being repetitive.

Every time a singular
piece of information is re-discussed in order to reinforce an idea, it should
also add something the audience didn’t know, say it in a different and
entertaining way, or set off a chain of cause and effect.

The first time John’s
vanity is indicated is when the novel starts and he has to go home to
change his coat after he drops mustard on it. We understand that he cares about
his appearance. The second time, we see his girlfriend standing outside the
bathroom for two hours, waiting to get in. We learn that not only is John
obsessed with looking good, he’s selfish about it too, and we’ve reinforced
that his looks are prioritized over other people’s time. Last, we see him
swearing as he realizes he’s left his comb behind, then proceeding to be rude
to his waiter. This reinforces his vanity, but it also gives the waiter
motivation to keep the wallet John has left on the table, even though he could
easily catch him. Without his wallet, John can’t take a taxi home, so he has to
walk, which explains why he gets beaten up and sent to the hospital, where a
life-changing event happens.

Reinforcement is better
when it’s not just reinforcement.

-Keep to the point—i.e.
the themes.

No one does anything
without a motive. Whether it be deciding to sit down, calling a friend, or
murdering someone, we all have a reason why. That includes why we are telling a
story.

Susie tells Mary that she
broke up with Danny. Susie wants many things from Mary, mostly just along the
lines of reassurance and comfort. That’s a motive. Writing a book is no
different, except we have time to think about why we’re saying what we are.

Now, people who actually
write have lots of reasons to do it. We do it for fun, for respect, as
catharsis, out of a love of fiction, to make a career, to help people, to get
famous, etc. But no matter how altruistic our reasons are, the reasons why we
write aren’t the point we want the readers to see.

Every story has a “point,” even if people like to pretend that art doesn’t.
This point, or theme, can often be considered the moral of the story, but
really it is what we want the audience to understand. It could be anything from
a simple “Isn’t James Bond cool?” to “Look how terrible racism is!” Of course,
you’d want to word it and take it more seriously than I am, but the intention
is the same.

It is helpful to consider
the theme to be like a decorating theme, i.e. the color schemes that make the
design look deliberate. When you walk into a room where someone clearly got most
of their furniture from random, separate places, with a maroon couch, a brown
carpet, and white curtains, it could look really nice, but it also reads like
what it is: I got what I liked and hoped it matched. Same thing when it comes
to a book. If you have a bunch of good ideas together that don’t really seem to
correlate, it looks like there’s less thought involved. It may not be a bad
thing, it may even make magic, but that will only be in parts, and the general
picture will look like you did exactly what you did.

Many authors don’t decide
on a theme until after the first draft is complete. A lot of people say that
they don’t do thematic writing at all, probably because it sounds pretentious.
I personally come up with open-ended and sometimes arbitrary themes like, “fear
of the unknown,” to deliberately tie all the events together with one theme and
motivation. Doing whatever works for you is important, but even when we choose
not to have a “theme” specifically, we still have some sort of point.

Ask yourself why you are
telling this story and what you want the audience to understand. Then when the
question of “what’s needed” comes up, you’ll better know the answer.

-Is this something I
wanted to happen, or just something I felt should happen?

Despite always having a
reason for writing what we do, many times we’re not aware of the reasons. The
conscious choices we understand, but the subconscious ones, the more common
ones, are often made without any sort of awareness. Sometimes it has to do with
the vision and atmosphere we wanted; sometimes it’s because we’re following a
tradition. The subconscious decides things for both great and shallow reasons.

Whether it be insisting
on writing a woman as nice, a traveling group coming to a rickety bridge, or
starting the book with the character waking up, there are a lot of choices that
we might make because, without thinking, we felt like that’s how it should be.

When criticized, it is
common for the speaker to indicate that what the author did was “wrong,” which
elicits the understandable response of “Nu-uh.” In most situations, a choice is not right or wrong,
but contextually unsupported, or simply badly executed. And, because quality is
defined by comparison, it may just be that there is nothing wrong with the
execution save that its commonality demands a higher level of implementation.
Which means that, though a great story could start with someone waking up, it’s
much harder than if we were to start it otherwise.

In any case, it becomes
hard for the author to admit that he has been influenced by society, and that
makes him want to say that, yes, deciding to put the family in suburbia was
part of his vision, and no, it was not a subconscious default.

Being honest with
yourself and separating what you care about from what you don’t helps
you determine the best solution to your problem. You might find that, though
you weren’t aware of it, you want your character to come from married parents
because you don’t want her issues with men to be considered daddy issues. Or
you might decide that, hey, you just did that because that’s how normal family
life is supposed to be, but considering it seems illogical that both parents would accept their daughter choosing to travel
to a hell dimension, you might just get rid of one so it seems more convincing.

When looking at a scene
that seems to be irrelevant, and you can’t decide if you should get rid of it
or fix it, being honest and objective will give you the choice you won’t regret.

-Remember that
irrelevant details can be beneficial.

All of that being said,
having things happen that aren’t important can be a good thing. First and
foremost, only discussing relevant events announces that all events are
relevant, which makes it harder to foreshadow, drop hints, and even legitimize
how the character puts two and two together. Little details of actions, features,
and even incidents can make life seem more real.

This little contradiction
of “have irrelevant details” and “get rid of what isn’t needed,” makes it all
the harder to figure out if a scene is important or not. Just because a scene
isn’t related to the plot doesn’t necessarily mean it should be taken out; it
might behave as the perfect red herring or even as a contrast, despite that not
being its original point.

The question you need to
ask is: what are the costs and rewards of leaving it in?

Fiction is a luxury that
hasn’t been around in the form we know it for long. We use it for
entertainment, we use it to learn, we use it to explore the world around us,
but when it comes to concrete rules and regulations, there are no natural
limits. An author can do whatever he wants when he writes; it just isn’t
necessarily going to be successful. The important thing to remember when trying
to improve your work is that you know best, and that includes knowing what you
think is wrong. When trying to cut down, utilize your real opinions, and then
the question of “Is this needed?” can be answered.

Friday, April 19, 2013

Self-limitation is a strange process. It’s something that
the artist does in a multitude of ways and at magnitudes from large to minuscule.

Writers limit themselves by putting up boundaries. By, you might
even say, putting ourselves in a box. And I’m not just talking about ideas,
though that’s a part of it. It’s how we tackle our careers, our image, and even
our vision. There are many ways that we can, and do, prevent ourselves from
writing creatively.

The worst part is that it’s a process we are completely unaware
is happening. The subconscious is sitting there,
trying to make things normal—as it does—trying to categorize everything, and
bam, it decides, “It has to be done this
way.” And we, so in love and brainwashed by our handsome instinct, just listen
to it, completely unaware of how controlling it’s being.

The five most common forms of limitation are trying
to be original, trying to be realistic, trying to be an instinctual genius,
trying to be pure in our vision, or even just trying to be elitist.

Now, I must clarify that there are many times in which these
limitations are legitimate, understandable, and even sensible. There’s a clear
place in which you might read your own work and go, “This isn’t realistic,” and
obviously find that a problem. What I’m advocating is not that originality,
realism, purity of vision, or anything else is a bad thing; I’m saying that
it’s not the only thing.

Take for instance the first boundary: the obsession
with being purely original. I claim that originality is not a
quality of good writing, but a means to achieve a quality of good writing.
Which is to say, if something is meaningful and entertaining, but not original,
there’s no problem. It just seems hard to picture a cliché work being meaningful and entertaining. In any case, the issue
arises with the fact that if the author is so focused on writing what hasn’t
been written before, he’s not going to be writing what he wants.

We get over it. Eventually. After writing two or three
novels, it starts to occur to the author how ridiculous it is. Some people are
prodigies and do it within the first six pages. But if you’re not at that stage
yet, let me expedite the process and say it’s important that the story is
yours, not that it’s “original.” So, while writing something that’s already
been written isn’t a good thing, it’s okay to take what’s already been made and
change it until it’s something that only you could have come up with.

The limitation of being original removes a ridiculous number
of creative options. If we wanted to be over the top about it, we could even
say, all the options, as everything has been done before.

Realism is more painful and less obvious. This is a prominent problem, especially
within the acting community. The artist, without being aware of it, decides
that their art needs to be a pure reflection of reality, and refuses to think
of anything that could be considered, “Silly.”

Again, there are places it can make sense. Someone who is
only “silly,” doesn’t have any commitment to the piece, and neither does the
audience. We want to grasp onto something real, even in comedy and absurdism.
If the work has no sincerity, no one’s going to care. But, on the other hand,
art isn’t always realistic, and doesn’t always need to be. I call it the Jim
Carrey method. It’s a stylized version of reality that is over the top enough to
be entertaining, but not so much so that the audience is brought out of
immersion.

A prime example of my own form of limitation is “realistic
fighting.” Instead of having my characters (who are in supernatural worlds, I
might add) do flips and jump off buildings, punch people through tables, and
gracefully take a hit, I have them get the crap beaten out of them.

When I realized this, I could sit down and consciously ask why I felt inclined to make these fights so (as I
perceived them) realistic rather than entertaining. It’s not like anyone who watches
action movies is sitting there crying out, “A fridge wouldn’t save you from a
nuclear blast!”

Oh wait.

But, all jokes aside, watch an action flick. Look at all
these moves that the actors clearly didn’t do without ropes and special
effects. And ask yourself if you, the audience, care that it’s not something
you’d see an everyday cop do. Sure, sometimes it’s bothersome, such as in the
case of our dear friend Mr. Jones, but it doesn’t mean that you should limit
yourself to realism without the committed decision of “I’m going to write the
most realistic story ever.” If that’s not the main point of the book, then it’s
not something you need to care about.

But the biggest pain we have to face, if not the most
common, is our obsession with being a born genius. Americans, as I have said,
admire innate talents far over any learned ones. We love child prodigies, no
matter if they’re old and mediocre now. We give them far more credit than the
person who worked his ass off learning rhythm and tone.

I perceive it as akin to women who buy clothing two sizes too
small. The reason why female styles don’t have a universal sizing system is
because the companies know that if it says we’re a four instead of a six, we’re
more likely to buy it. Which is precisely how a size zero came into existence.

One of the prime mistakes of shopping, however, is when
someone purchases the size she wants to be rather than the size she is. By
being in denial about it, by choosing to ignore reality, to act as if the world
is the way it should be rather than the way it is, her size (which may have
been perfectly attractive before) now exaggerates flaws. Pants too small
will give you a muffin top, and where in the appropriate jeans the curves would
look good, the woman now just looks fat.

This is the same thing artists do with innate
talent. We want to be born geniuses so badly that we are in denial about the
reality. And the reality is that, no matter how much you believe in nature over
nurture, no matter how many talents you were “born” with, there are some things
that an author is not going to be able to do right off the bat.

Yet, instead of addressing it, saying, “I can’t depend on my
handsome subconscious to lead me down the right path and I’m going to have to
do some actual thinking,” authors will be in denial about it.

For example, creativity stems from two places. One is that the
subconscious’s definition of “normalcy” is different from the average
definition of “normalcy.” Whereas most people will name “carrot” when asked for
a vegetable, some might say “corn,” or even “avocado.” This is the one we
want to be true, not only because it’s easier, but because it’s what American
culture respects. (Noting that 1. not all my readers are American, and 2. lots
of other cultures respect it too.) The reality, however, is that any
author who subconsciously defines all normalcy differently than others do
wouldn’t be able to communicate. This means that you and I are both going to
come up with the same knee-jerk reaction to a question at one point or another,
and, being that a quality of good writing is not seeming repeatable, you could
see how this might be a bad thing.

The trick is not to limit yourself to being innately
creative, but to recognize that you can be intellectually creative. By noticing
the trick (everyone is going to say carrot), you can logically choose to be more
interesting.

And it’s not a lie, and it’s not a con, and it’s not a bad
thing. If an author believes that being aware of his knee-jerk reactions (say,
making a family live in a suburban house) and changing them to something more
creative (say, putting them in a yurt) is lying about who he is, or whatever
nonsense he wants to come up with, then he needs to accept that he’s as
creative as he is, end of story. But the person willing to work on it, willing
to take from the subconscious and conscious, will become as creative as he
wants to be.

Next we have the purity of vision. This is something I
recommend to most authors, or at least, having a vision. But we can get a
little ridiculous about it.

I once imagined a future scene in which my female character
was driving and my male character was in the passenger’s seat. When I actually
got to the part, however, it worked out that he was the one driving and she was
in the passenger’s seat. I started to work out a way and a reason for them to
switch, when I realized that not only was it going to add about 2,000 words to
my already long book, but I didn’t really have a reason that it had to be that
way.

Recognizing that allowed me to open my little mind.

This is a problem that not only affects inane decisions; it is also something
that will often cause indecision.
In order to maintain the purity of the concept, we often avoid making active
choices, because active choices, no matter how minor, will change the
atmosphere, character, and tone, as well as sometimes even plot. I choose to
say, “she laughed,” instead of, “she said,” and the image of her speaking is
completely altered.

Except that maintaining a tone by sticking to
non-influential words (such as said) adds no information. The story indicates
what it’s about and then consistently maintains that same tone and atmosphere
throughout the book, and no one cares, because nothing is changing or being added.

This shows in things like backstory, where we didn’t
envision his parents, and then suddenly there’s a benefit to having them be, I
don’t know, lawyers. But because this new piece of information can alter how we
see the character, authors are inclined not to add it, wanting to leave the
world as vague as possible. Vagueness leaves all options available, so it’s
appealing to an author to try to keep all his doors open. Yet it’s more

As for elitism, the ideas of “I can’t do that” and “I can’t
do this” because “it’s not the image I want to have” create boundaries out of
snobby (if not accurate) perceptions. It ranges from “I don’t write short films,”
to “I don’t do nonfiction,” to “I’m a
writer, not an intern!” It can even orient around ideas. “I refuse to write
about vampires,” or “I’m not going to write for an audience. I will only write
for me.” Or even, “I refuse to write for myself, I will only write for an
audience.” “Never use said,” “Always use said,” “Semicolons are off limits,” “How
dare you start with a dream sequence?” “You’re not allowed to write for blank,
blank, or blank reasons…”

Again, sometimes it’s beneficial to have boundaries.
Refusing work because you don’t get paid can help your reputation and how much
people respect you. It can also limit your options. As long as you are aware of
it and making active choices, rather than being in denial, then the
limitations aren’t an issue.

The trick about limiting yourself is not to do it in either direction.
An author can’t prevent himself from using adverbs if he wants variation. An
author also can’t limit himself to only using adverbs. Nothing is right and
wrong, no matter how much your subconscious or your college professor tells you
otherwise. Limiting yourself because of this idea of “right and wrong,” because
of “what you should be doing,” because of “how it should be,” is limiting
creativity.

The real problem is that people limit creativity both by
refusing to do something and by refusing to do something new. The problem with
using said is not the actual word, but the inflexibility to try new things, to
find a different way to say “he spoke at her” than the tried and true way.

The best way to overcome the box is to be aware of it. Recognize your
assumptions and your hang-ups, and always question the reward-cost ratio. Start
telling your subconscious, “there is no should,” and start asking, “How does
this help me?”

You are allowed to do anything you want. Now you just have
to become more aware of the options.

Sunday, April 14, 2013

In my college theatre
department, I had a “professor” whose such great knowledge allowed him to
bypass his absent B.A. and teach young and very impressionable minds. His
resume was lengthy, and though I have never been certain if his work in the
60’s was actually impressive or he just knew we were naïve enough not to be
able to tell, he has a great list of self-accolades.

He once told us this story (and I would very much like to know how we got onto
it) about the time he was flown out to a new city to direct a show. He agreed,
on the grounds that there
would be someone to “take care of him.” See, when he was making “so many
creative decisions,” he “lost the ability” to make them for himself. This I
believe. He barely had the capacity to think on a clear day.

In any case, he got what he
wanted, and they hired him an assistant.

So the day he arrived in town,
he went to rehearsal and was worn out by the end, as he knew he would be. The
day was so long and the decisions so plentiful, he was just gone by the time he
got out.

He turns to his assistant and
says (in a way I imagine a more annoying Boo from Monsters Inc. would say), “I’m hungry.”

So the assistant turns to him
and asks, “Where would you like to go?”

And my teacher wanted to start
crying.

Now, it is ridiculous for him
to think that the assistant would be able to decide where to take him. She had
no idea how much he wanted to spend, if he wanted quick, cheap, or good, or
even what his tastes were. It makes absolute sense that she wouldn’t be able to
give a good answer until she knew more about him. All of this expectation is
unreasonable. Except that’s what he’s paying her for.

This is an analogy to the
writer/reader relationship. The reader is someone who is looking for an
opinion, an idea. They are hungry, but they aren’t sure what they want. Or
rather, they are bored, and they aren’t sure what they want to think about.
What they really desire is for someone else to make a decision and allow them
to then accept or reject it.

When authors (or, the more
typical, playwrights) say, “It’s about whatever you want it to be about,” what
they’re really saying is, “I don’t know. Where do you want to eat?”

This is a really frustrating
conversation that everyone’s been in. We all sit around waiting for someone
else to introduce an option. Whether it’s because we can’t think of anything or
because we don’t want to be shot down, it’s the same for the arts. When a
creator doesn’t want to say, “Let’s do this,” it’s because he doesn’t want to
be told, “That’s a stupid idea!”

But most importantly, it
illustrates why it is okay to have overbearing opinions and, yes, even tell
readers what they should be thinking. That’s what they want. Sometimes they
only want it so they can feel superior to you, but they want it nonetheless.

And the idea that someone
not knowing what he wants to think means he doesn’t like thinking is the
same as saying that someone not knowing where he wants to eat doesn’t like eating.
Sometimes we’ve just eaten at the same place far too long and now we want something
new. In fact, that’s usually it.

Whether we notice it or not,
we’re always thinking. We spend a great portion of our day daydreaming and
fantasizing. What happens is, just like eating, we get too much of something
and we get sick of it. So we want something new.

People can get inspired by a lot
of things that aren’t entertainment, but entertainment is the one place that
we’re paying to get inspired. As a writer it’s important to realize that liking
something is being inspired by it. It
could be something as “deep” as wanting to go save the third-world countries,
or something as shallow as wanting to become an international spy.

Watch children. They don’t
censor themselves. Where adults like to hide how they fantasize, kids are right
out in the open about it. And you know what they do after they watch a Superman
movie? They pretend to be Superman. Or they pretend to be with Superman. They
are inspired to be Superman or do what Superman does. Just like your much
older readers do.

In essence, a good story
needs to change the flow of thought, meaning that what the reader was thinking
before he started it is not what he’s thinking after he finishes it.

Which is why “it’s about
whatever you want it to be about,” makes for a bad story. Sure, a lot of times
people will take a work in and use it to mean whatever they want it to mean,
but that’s not because the author is allowing for that to happen. At least not
obviously. It’s because the story appears to be concrete enough that it
supports whatever theories other people have. Sure, the bench may be an
illusion, but that’s different than just saying, “Pretend a bench is there.”

It’s the difference between them
misinterpreting what you’re saying and you actually talking for them. We go back to
the restaurant analogy. The two of you are sitting in a car, and Reader can’t think
of anywhere she wants to go. Whether it’s because she’s sick of her usual places or
they’re closed, her ideas aren’t working for her right now. She’s blanking out.
So Author says, “How about fast food?”

Now, Reader’s thinking, “Yeah!
Dairy Queen!” when really, Author meant McDonald’s. Needless to say, Reader
went in a different direction, but she was stimulated into having an idea, which
came from Author’s idea. Sometimes the writer might be okay with the
alternative idea—“Dairy Queen is good”—and sometimes not—“I got food poisoning
there last time!”—but, if we say that the author’s goal was to stimulate the
reader into thinking, well, he did his job.

But this is different than had
Reader not been able to think of anything, asked Author, and Author said, “I
will go wherever you want to go.”

Well, thanks. Way to tackle the
problem.

And here’s the thing. While a
friendly relationship allows each party to decide if they want to take
charge, becoming a storyteller (and for a price) is announcing that you are the leader.

As the leader, you get some benefits.
You get control and you get respect, as well as the added bonus of never having
to go where you don’t want to (metaphorically, too). But then you have an
obligation to your followers. You make decisions so they don’t have to. You put
an idea out there to be judged so they don’t have to. They get the grace of
sitting around and judging other people, without having to be the one to make a
suggestion and risk being judged themselves. And if they don’t like that, if
they aren’t interested in the joy of “not thinking,” and they want to make
decisions, they can, easily. It’s called writing.

Which basically means that if
they’re choosing to read, they want decisions made for them to accept or
reject. If they aren’t interested in that, then they’ll write their own book.

You don’t want to be the person
in the backseat who says, “I don’t care where we go,” and then shoots down all
their ideas. What that does is give you all the benefits and none of the
consequences, which puts all the negatives on Reader’s shoulders. And why would
she pay for that?

And this “I don’t care where we go. No,” mentality is all an author can do if he’s not going to make a suggestion. He can’t take a concrete piece of material and have it evolve to match whatever it is readers want. The reader suggests, “It’s about racism!” But then the book says something that doesn’t support that. That’s shooting her idea down.

People try to be really vague about their answers, muttering something that might sound like a yes to one individual but a no to everyone else. When the author is talking to a crowd who are all making different suggestions, he can’t agree with any of them, because that would be disagreeing with everyone else. But muttering an answer isn’t an answer, and no one’s going to be satisfied with it. And anyway, the conversation is going to carry on, and so, even if Reader believed you agreed with her, you’re going to have more and more details that contradict that.

Consistently being vague is boring and will unravel in the end. Even if you’re convincing in the beginning, the conversation will eventually become:

“Do you want to go to a fast food
place?”

“Yes.”

“Or there’s that Italian
restaurant on Main.”

“Sure.”

“Well which is it?”

“Yes.”

“Are you just saying yes to
everything I say?”

“Yes.”

Sure, he might think he’s being
the “nice guy” and giving her control over what they do, but in reality, he’s
just passing responsibility onto her shoulders. If he doesn’t make a
suggestion, he can’t get rejected. So it becomes her job to put herself out
there. And if that’s what she wanted to be doing, she’d be doing it already.
While foisting yourself and your opinions on other people can be irritating,
refusing to give them is just as annoying.

Friday, April 5, 2013

Metagaming is a term used in roleplaying to refer to a mindset in which the player (a real person) recognizes that they are playing a game and has their character (a fictional person) act accordingly.

Now, when the goal of the activity is to fantasize, it’s pretty obvious how this would be a problem. Metagaming leads players in Dungeons and Dragons to do things like randomly rob a bank, knowing the Dungeon Master will have trouble acting “realistically,” e.g., having the cops arrest the character, take him to jail, let him sit there for six years, then put him on trial before releasing him on a technicality. Or just murdering him.

Certainly, the murdering thing would do the trick, forcing the player to make a new character before telling him to stop being a butt, but then there’s hostility, which is not something that benefits the fun.

I recently read an article on Cracked.com called “The 6 Dumbest Mistakes of Supposedly Smart Movie Characters.” Though the article makes fairly valid points, it brought to mind a huge issue that writers have to contend with, one that I am going to call metareading.

We have this contradiction when making a “believable” world
where the person the author has to convince knows the character is in a story,
but the character the author needs to motivate does not.

This doesn’t seem like it would be that big of a deal. We’d
think that if the character and the world were well written, their actions
would make sense to a reader, and that readers, being smarter than the average
bear, would recognize that the protagonist wouldn’t understand his own
invincibility. Yet, despite our intelligence and respect for the storyteller, readers function mainly by gut reaction, not logic. In that vein, stories have certain “standards of protocol” which the audience expects in fiction but not in real life.

Let’s consider this protagonist’s inability to die. Though
he is not technically invincible (in most cases), we are fully aware that Harry Potter isn’t going to kick the bucket in the middle of book three. Yet Harry doesn’t know that he’s scheduled for four more novels, being allegedly ignorant that his fate is being controlled by a middle-aged British woman. Of course he would
be scared; he has a murderer standing in front of him with the ability to
massacre him in two words. The problem is that the readers aren’t scared. They
are simply standing around waiting to see how he gets out of it. Aristotle
defines tension as doubt as to outcome. Harry doubts his outcome, but John Doe
doesn’t for one second.

I remember when I was younger questioning this very issue.
I’d be watching something like Kim Possible or Shrek and not really
understanding why everyone was so worried all the time. Of course, I was like,
eight, and anyone older has a better understanding of abstract thought, but for
many people, this question can take form in many different contexts. The problem shows up in anything relating to mood or motivation, whether it be of high magnitude (life or death), medium magnitude (asking a girl out), or minute magnitude (having a blonde Jewish woman).

Even if the characters are aware of their ridiculous escapes
being too good to be true, they would be more inclined to think they’re lucky
bastards or there is a scheme afoot than that they aren’t going to die.

Take the red shirts from Star
Trek. Every fan watching the show knows the poor soul is doomed, and that
is because there is an explicit pattern to it. But we also know the writer’s motivation: the red shirts are brought along because he wants to kill someone off, and it has to be someone unimportant. When looking at this from a character’s perspective, it becomes a little harder to know what he or she would think. Not only do the characters not perceive themselves as unimportant (so that pattern is lost on them), all they know is that every time a mission goes down, only Captain Kirk and Spock return. Either they would think that the world is a very dangerous place and only the best warriors survive, or that Kirk is secretly murdering off the low-level crew members. Due to our own egotism, however, it is likely that even after watching every red shirt before us die, we’d still have hope and believe in our own abilities. Or we’d go kicking and screaming before we’d ever go.

Basically, the reader knows the red shirt is going to die. The red shirt can’t
act as though he thinks he’s going to die. At least not any more so than Kirk.
And though the reader logically knows that the red shirt doesn’t think he’s going to die, she still screams at the screen when he’s too much of an idiot to know “don’t go in there!”

Unfortunately, demand for this sort of formulaic writing style is actually fairly high, especially in the movie world.

I’ve had many filmmakers make a comment akin to, “When I’m
reading a script, I turn to page 15 and if there’s not an inciting incident, I
know it’s bad.”

Inside the fictional world, the patterns of plot structure and tension seem irrational. Someone being exposed to an inciting incident and then three subsequent disasters would break down into hysterics. Four ridiculously big events that happen to no one else happen to me, all within the course of a week. What am I supposed to think about that?

Plot structure is formulaic and there is a demand for it.
This makes metareading all the harder to prevent. We can’t always take from
reality because expectations of fiction differ from expectations of reality.

An average reader could tell the difference between a movie script and a transcript, even if both had the same tame subject matter and casual format. That’s because in reality we talk differently than people talk in fiction. And though many authors have attempted to reunite an audience with realistic conversation (nonlinear, filled with stutters and pauses, repetitive, and inane), it’s not usually interesting.

The real reaction is not something a reader expects or even
wants. And despite some believing incorporating an honest response instead of a
contrived movie one would be a creative direction, it’s important to remember
that films don’t have real gunshots for a reason. People don’t necessarily
recognize reality when they see it, and many realistic shows have been accused
of the exact opposite.

Then we have to take into consideration the reader’s point
of view (an objective, safe place where all the information necessary is being
presented and all of the unnecessary information is being cut out) and the character’s
point of view.

For instance, if a 40-year-old woman were to be introduced to a vampire for the first time as the inciting incident, she would have a very different perspective on the world than the reader. See, both learn to understand the “rules” as they experience them, except that the character has had 40 years’ worth of lessons teaching her that vampires do not exist, and the reader has had fifteen minutes. It would be
along the lines of watching The Simpsons
your entire life and having them suddenly reveal that they can fly. Not only is
that stupid, but unbelievable. You’d think it was a dream, or have to metaread
and believe the authors were replaced by monkeys. Either way, it wouldn’t be
something you accepted gracefully.

So you have a scene in which this woman is being introduced
to a supernatural world for the first time and she, like most people in our
world, isn’t buying it. She thinks that it’s a scam, or an insane man who
thinks he’s a vampire, even after she witnesses him suck the life out of
someone. The reader, who has only
been privy to the supernatural laws of the world, feels this is ridiculous.

This was an actual discussion I had with a friend of mine. We were talking about Stephen King’s Jerusalem’s Lot, and he complained about the woman’s denial. Well, in the vein of grief and shock, denial is a very typical ordeal that people can hold onto for a surprisingly long time. My argument is that, when putting myself in that position, as much as I would want to believe he was really a vampire, I would be skeptical all the way until he had proven it beyond a doubt.

Next, the reader, who has been given deliberate information and can think clearly (because he’s not in danger), might very well ignore the fact that the character, who has had three months elapse and a million other things happen in that time, might simply panic and not be thinking straight. When the viewer can take an objective look and see things rationally, he may expect the character to do it too. For that matter, we always believe characters to be smarter than us and, well, to have confidence. Where most humans live their lives in a state of doubt (Did I make a fool of myself or was I just charming?), characters tend to be able to assess situations very quickly, and, quite frankly, we expect them to do it.

I can’t readily prove metareading happens because it is an internal action, and I cannot say how often it occurs either. The only evidence I have lies in the future: start watching, start talking to people, hear out their criticisms, and consider whether they are, perhaps, having expectations of fiction that wouldn’t make sense in reality. I know that sooner or later it will come up.

So, as you wait for evidence far beyond what I can give, it comes down to this: it is not the job of the reader to stop metareading. Sure, he will very much enhance his experience by doing so, but the reality is people mostly metaread because they want to. They’re looking for something wrong. They want the book to be bad. This makes it the author’s job to prevent it from happening as best she can, simply because she cannot count on people doing what they should and actually trying to enjoy the book.

So what does the author do? As I said, she can’t just count
on being realistic because we aren’t necessarily going to believe reality when
we see it. And, unfortunately, there are people out there who, knowing that the
protagonist won’t die, can’t understand why he’s crying.

When I say it is the “author’s job,” it doesn’t mean it is an author’s priority. Necessarily. Sometimes trying to idiot-proof a story will ruin it, and what I’m about to say should be taken with a grain of salt for that very reason. But when it comes to improving a story, it’s a crucial factor to take into consideration.

The problem with metareading, as I said before, has to do with the reader’s intention when he does it. Like the guy in the Dungeons and Dragons game, he’s not robbing the bank because he’s “playing his character” (though that’s what he says); he’s robbing it because he has no respect for the DM. Or, on the flip side, a reader might be trying to give the benefit of the doubt and just can’t because the continuity issues are that bad. In both cases, the solutions are the same.

Number one, hide your motivation. This is pretty much true
in the majority of writing. We want the readers thinking inside the story, i.e.
“I want the villain to get punched in the face!” not “The writer really wants
me to want the villain to get punched in the face!” The latter thoughts (unless, of course, the reader is trying to have them) mean the reader is not absorbed in the story; he is metareading, constantly being reminded that this is a story written by someone.

If you’ve ever watched a made-for-TV Disney movie, you’ve probably seen an example of this done badly, where a bully is clearly a bully for the sake of being a bully and not for any purpose beyond that.

The best way to hide the writer’s motivation is by having
multiple motivations. When a character does something, he has a reason he did
it: an objective, if you will. Objectives don’t always have to make complete
sense, and they can be vague and irrational. In fact, they are often not even
understood by the character himself.

A single action, however, can serve several of the writer’s motivations at once, and diluting your reasons makes any one of them less obvious. So, we have Draco get into a nasty fight with
Hermione so as to make us hate him a little more. But not only that, it also 1)
Motivates future pranks. 2) Informs us of Hermione’s background. 3) Informs us
of the historical bias against Mudbloods. 4) Leads them both to getting into
trouble which explains Hermione’s absence at an important moment. 5)
Foreshadows how Hermione will deal with a similar problem in the future,
thereby legitimizing it.

(I’d like to point out that I’m making this up. I don’t
remember Harry Potter well enough to give a real analysis.)

Not only does this indicate a hell of a lot more thought
(which is a quality considered when determining “good” writing), it makes it
seem less contrived, in a strange way.

Next are basic writing chores like continuity. Continuity is extremely important because readers’ trust is hard to gain back. You make one little mistake, like calling a stalagmite a stalactite, and instantly you’re an idiot. If you’re lucky, you’ve already earned enough respect for them to get over it, but even so, you don’t get that many strikes.

Then understand common writing traditions in order to
clearly indicate when you are going against them. A case in point is the
likable protagonist. We assume, as readers, we are supposed to side with the
main character. There are many books where that isn’t the case, but if an
author is going to go that route, she really needs to indicate that it is
deliberate.

When reading The Devil
Wears Prada, I started on page one as a friendly audience. I really enjoyed
the movie. But before the first chapter was over, she had lost me, due to a few minor discrepancies that could easily have been attributed to the personality of the protagonist rather than to mistakes on the author’s part.

There was nothing in it that the benefit of the doubt couldn’t cover. She was driving an 80,000 dollar car, and she couldn’t even ask her boss where it was without getting fired. Yet she was smoking in it.
Okay, maybe Miranda smokes in it too. Within the first chapter she ruins her
shoes, her pants, and then a second pair of shoes. Alright, she’s an idiot, but
maybe the author wants me to think
that. She is Jewish, but also blonde. Fine. There are blonde people who are
also Jewish, or maybe she dyes her
hair. Whatever. But then she, with her tiny bedroom, goes out without measuring it and assumes a queen bed would fit. Okay, I would never measure a bedroom either—or at least, I wouldn’t have until I read that—but even I, as a person of the exact same age and experience, knew that fitting a double bed into a regular-sized room is pushing it. Again, perhaps I’m
not supposed to like the character.

The protagonist could have been an idiot, and it could have been a character choice, but it seemed more along the lines of the author making decisions she wanted to make (such as the blonde hair with uncommon genetics) without having a full view of the world first. Basically, it felt like a lack of thought.

Had she really wanted to make the character seem stupid, she would have been going against tradition, which means she would have had to prove it was deliberate. When an author recognizes certain expectations (by metareading his own work) and then references them as he ignores them, it seems less like he doesn’t know what he’s doing and more like he’s being creative.

The problem with metareading is the problem with writing in general. In order to create a good story, we must combine elements of what people want and what people will believe, of the peculiarity of truth and the comfort of expectation, and know when to pander to jerks and when to tell them to go screw themselves.