The course website and blog for the Fall 2016 instance of Penn State's SC200 course

Monthly Archives: November 2016

Exercise is defined as any activity requiring physical effort carried out to sustain or improve health and fitness. Throughout our lives, people have been preaching the importance of exercise in order to maintain a healthy body. One aspect of physical activity that is rarely discussed is its effect on a person's brain. I feel that exercise of any type makes me more mentally alert than when I am sitting around being lazy. Is it possible that exercise can improve a person's mental awareness? It turns out that scientists have been asking this for quite some time.

An article written in the Physical & Health Education Journal focuses on the possible mental benefits of exercise. Scientist Atkinson Russell makes it clear that he believes exercise can improve a person's mood, concentration and memory. A team of scientists led by Russell conducted an experiment to support his claim. In 2012 they evaluated children from various schools around the United States on their mental abilities. Once sufficient data was collected, the scientists introduced a new physical engagement program to the schools that involved several fun activities. Teachers were encouraged to let their students participate in the program at least once a day before a lesson. After a year, the team found that the students' mental abilities had improved faster than the normal rate for kids their age. They were more engaged in schoolwork and more involved within the classroom. This study provides decent evidence to suggest that exercise can have mental benefits. However, factors such as limited sample size and possible confounding variables raise doubts about its legitimacy. While this may show that casual exercise improves young children's ability to learn, it does not prove that physical activity can directly benefit someone's brain.

The New York Times published various studies done by a team from the University of Illinois that all give convincing evidence to support my claim. In their first trial, 21 students were randomly selected and given a test that evaluated their memory. The students were then randomly instructed to either sit, run, or lift weights for about thirty minutes. When the team tested the students again, they found that the ones who ran performed significantly better than the other two groups. In a similar study, the team assigned exercise programs to a group of elderly people. One group was instructed to exercise by walking; the others were told to stretch. After six months of daily exercise, the team found that the group of walkers was more mentally active and improved noticeably on tests of memory. Both studies provide decent evidence to support my conclusion. They were done professionally and carefully. Although reverse causation can be ruled out, there could be a confounding variable causing the mental improvements. Despite the small sample sizes, the findings from both of the University of Illinois studies strengthen my claim that exercise can make a person smarter.

None of the three studies that I looked at provided enough evidence to conclude that exercise makes a person smarter. However, they present valid results that can persuade a person who is skeptical about my claim. To a reasonable person, there is enough evidence from these studies to make it obvious that exercise definitely has some mental health benefits. Although it may not make you a genius, it definitely gives your brain a boost.

I understand that this might have been a risky/uncomfortable topic to research, but after hearing people claim that if a guy has big hands or big feet, he definitely has got it big down there too, I was honestly curious. I have heard numerous people say these things for years, and I guess I never took the time to actually figure out whether or not what they were saying was true. Before even researching my topic, I stopped to think about what I thought the results would be. Honestly, I don’t see how everything can be related. I think you’ve either got it or you don’t—no matter what size your hands and feet are. But obviously, this blog would be incomplete without research to back up my hypothesis.

So, is it true? Does the size of a male’s hands and feet determine how big he is? Here, the null hypothesis would state that the size of men’s hands and feet has no correlation to how big they are. The alternative hypothesis would state that the size of a man’s hands and feet does relate to how big they are.

As I was doing research, I came across an article that mentioned one of the Republican debates. Marco Rubio had insulted Donald Trump by claiming he was small because he had extremely tiny hands. After all of my research, however, I found that Marco Rubio's comments had no real meaning, because there is no proven evidence that men with small hands or feet are also small in that region. Debby Herbenick, a sex researcher at Indiana University, stated that there have been conflicting results about whether or not the idea is true.

Although there did not seem to be a clear relationship between the two variables, another hypothesis was proposed as I was doing my research. Apparently, if a man's index finger and ring finger are mismatched, he is more likely to be bigger. South Korean researchers working in a team from several universities studied this topic and examined data from 172 men who were in the hospital for urology surgery. During the study, the lengths of the men's fingers and the size of their testes were individually measured. According to the results, the men with ring fingers longer than their index fingers were, in fact, bigger. The study also concluded that men with different-sized index and ring fingers were more likely to be fertile.

Although I did not think there was a chance that anything in a man’s hands or feet could be related to other parts of his body, I turned out to be wrong, according to various studies performed. Of course, there could be errors in the studies performed, but all of them seemed to have similar results. I am still hesitant to believe some of the studies I have just read about, but I guess I have been proven wrong!

Every year it seems like people are getting smarter than they were before. In high school my grade was always enrolled in more challenging classes than the year before us, while at the same time, the year below us was progressing even faster than we did. I always assumed that this was due to improved teaching abilities and better learning methods, thinking that it had nothing to do with actual intelligence. However, is it possible that the younger generations are actually evolving to become smarter than their predecessors? Or is it a phenomenon resulting from other confounding variables?

An article published in The Telegraph explains a phenomenon known as the "Flynn Effect," which has been studied for more than thirty years. The effect is named after American scientist James Flynn, who in the 1980s noticed that companies that routinely gave out IQ tests had to increase their difficulty every year in order to keep the average score at 100. Does this mean that people are getting smarter every year? Or is the changing world making the outdated questions too easy? The Better Angels of Our Nature, written by Steven Pinker, claims that this effect has been observed in more than thirty countries. IQ scores improve by an average of 3 points a decade, meaning an average teenager today would have had an IQ score of 118 in the 1950s. Clearly, the measured intelligence of generations is steadily increasing as time goes on.
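That 118 figure follows directly from the 3-points-per-decade rate. A minimal sketch of the arithmetic (the function name and the ~60-year gap between the 1950s and today are my own choices; the per-decade rate is the one cited from Pinker):

```python
# Sketch of the Flynn-effect arithmetic: IQ norms drift upward over time,
# so a present-day score maps to a higher score on decades-old norms.
# Assumes a steady gain of 3 IQ points per decade, as cited above.

def score_on_old_norms(current_score, years_elapsed, gain_per_decade=3.0):
    """Equivalent score against norms set `years_elapsed` years ago."""
    return current_score + gain_per_decade * (years_elapsed / 10)

# An average teenager today (score 100) against 1950s norms (~60 years back):
print(score_on_old_norms(100, 60))  # -> 118.0
```

Six decades at 3 points each adds 18 points, which is where the 118 comes from.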

Now the question is what has caused this change in intelligence. Could it be that as time progresses, human brains evolve to become more intelligent? Or is there another confounding variable, one that improves mental ability, that didn't exist decades ago? According to a study conducted by the University of Aberdeen, this increase in intelligence is most likely due to improved diets, health and the modernization of society. The team looked at two groups of children: those who grew up before World War Two and those who were children during the war. What they found was that the children who grew up during the war were on average more intelligent than those born before it. What was different between these two groups? The more intelligent kids grew up eating more nutritious meals and less junk food, and experienced more social change than those before them. After the war, their towns were modernized with improved schools and more political awareness. The team believes that their upbringing is what caused them to become more intelligent than the older generation. Instead of evolution, it was their improved lifestyles that ultimately led to their increased mental ability. Although this study only looks at one group of children, there is clearly a trend occurring. What makes this study so convincing is that it cannot be due to reverse causation, and there is so much evidence that it is hard to say it is coincidence. Similar trends have been observed among societies for the past century.

The IQ gains experienced by the children of World War Two are not unique to their experience. Situations like these are believed to be what causes generations to become smarter every year. It is not due to evolution; instead, factors such as improved diets and health, modernization of communities, and social awareness all lead to more intelligent generations. The data presented in these articles is enough to conclude that children are becoming smarter than older generations, not due to evolution, but as a result of a more intelligent society.

Christmas has always been my favorite time of year. Over winter break, while shopping for a Christmas tree, I was surprised by the size and health of the evergreen trees. Almost 500 massive, thick, green trees soared into the air waiting to be put in someone's living room. This got me thinking about the fate of trees as global warming increases. It is obvious that our environment is not as healthy as it used to be. Pollutants fill the air and heat up the earth more and more each year. I asked myself, what effect has this had on trees over the past decades? I hypothesized that the increase of pollutants in our atmosphere, such as CO2, most likely deters the growth of trees. However, what I found was quite surprising.

An article posted in The Telegraph claims that plants and trees are actually growing faster as a result of the increased carbon dioxide levels in the air. Since plants survive by converting CO2 into sugars through photosynthesis, more CO2 in the air should mean healthier plants. Professor Martin Perry, head of plant science at Rothamsted Research, believes that "There is no doubt that the enrichment of air with CO2 is increasing plant growth rates in many areas."

The Telegraph article also includes a study done by the University of Leeds on tree growth over time. The team measured the girth of 70,000 trees across Africa and compared them to similar data collected four decades ago. What they found was that trees had been getting bigger and growing faster as time progressed, all while being in an environment that gained 0.6 tons of CO2 per year. This correlational study does not prove that increased CO2 causes tree growth, but it shows that greenhouse gases have not negatively affected trees. This study was only done on one specific area; however, the sample size is large enough to convince me that it is no coincidence. We can rule out reverse causation, but it is possible that a confounding variable both causes the CO2 increase and improves tree growth.

According to the Food and Agriculture Organization, there are actually more trees today than there were 100 years ago. The FAO explains that tree growth has exceeded harvest almost every year since the 1920s; our current forest volume is 380% greater than it was at that time. What makes this important in contradicting my claim is the fact that CO2 levels (http://myweb.wwu.edu/dbunny/pdfs/CO2_past-century.pdf) have increased by 600 billion metric tons since 1920. CO2 levels directly correlate with the increase in tree population, providing a reason to believe that the relationship is causal. Like the first study, this correlation could be caused by a confounding variable such as rainfall, sunlight, or human activity. However, it is enough to show that my original claim was false.

Both studies give sufficient evidence to conclude that CO2 levels do not negatively affect tree growth. In fact, it is more likely that they actually improve the health and size of trees. With so much focus on the negative impact of increased greenhouse gases, it is interesting that they have some benefits too.

About a week ago, while browsing through shows to watch on HBO, I stumbled across a miniseries titled "The Pacific." The show follows a group of World War II soldiers as they fight on numerous islands throughout the Pacific Ocean. One aspect of the show that I became curious about was the Japanese and German militaries' ability to brainwash their soldiers and civilians. I did not think that brainwashing was actually possible; however, watching the show caused me to second-guess this thought. Intrigued, I decided to look into the science behind brainwashing to figure out how it works and to what extent it can impact a person's actions.

According to an article from Science, most psychologists agree that brainwashing is possible. They claim that it can impact a majority of human minds if done correctly. The article explains that the process behind brainwashing is mainly psychological, occurring when a subject becomes completely dependent on their superior. Brainwashing can be done by withholding food, water and sleep from a person until they break down. Relying on someone else to survive can alter a person's brain, causing complete devotion toward captors and their beliefs. The article explains the process behind brainwashing and gives anecdotal evidence of its success in situations of war. This evidence is intriguing, but it was not strong enough to conclude that brainwashing is scientifically possible. Could the prisoners' change of beliefs be caused by the torture of captivity? Or do they actually lose control of their own thoughts? Curious to get to the bottom of this, I continued to research.

Since the concept of brainwashing is immoral, there are no recent studies that I could find focusing on it. An article published in Psychologists World cites a study done by Robert Lifton and Edgar Schein in the 1950s. The team evaluated American POWs who were "brainwashed" by their captors during the Korean War. They reached the conclusion that the anti-American statements made by the soldiers were not a result of actual brainwashing. Instead, they claimed that the change in beliefs was a defense mechanism from within the brain. In order to survive the ruthless torture, the prisoners subconsciously began agreeing with their captors' beliefs. The prisoners were not actually "brainwashed." Although they appeared to be different people, it was simply a mechanism of survival. Their core beliefs began to reemerge once they were in a safe environment.

This study is old and was only done on a select few people from the same war with similar backgrounds. It isn't the most credible source to form an opinion from, but it does provide solid evidence that complete brainwashing is not actually possible. While psychologists might claim that it is possible, there are no confirmed victims of complete brainwashing. I believe that most cases of "brainwashing" are a defense mechanism the brain uses to protect itself from torture. After researching the topic, I don't believe that total brainwashing is possible.

Technology is constantly improving. TVs are becoming clearer, cars are more eco-friendly and food is grown more efficiently. With so much change occurring, there has to be some technology that can negatively impact the human body. One big health question mark is newly developed noise-cancelling headphones such as Beats and Bose. They are bigger, louder and clearer than ever before. Headphones that produce more noise are bound to have negative impacts on a person's hearing. With this thought in mind, I researched to determine if there was any evidence to actually prove that using headphones can negatively impact someone's hearing.

This claim is very straightforward: wearing headphones more frequently can have a negative impact on a person's long- and short-term hearing. Since the effect occurs over time, reverse causation can be ruled out as a possible explanation. Also, it is hard to think of a confounding variable that would cause a person to wear headphones more often as well as negatively impact their hearing. For these reasons, it is easy to say that direct causation is the most likely explanation behind the claim. This makes it easier to research and determine whether it is true.

A 2016 study posted in the International Tinnitus Journal aimed to find a correlation between teen listening habits and hearing loss. A team studied 131 high school students and asked them questions regarding headphone usage, average volume, and average listening time. The study found that 79% of the population used headphones on a daily basis, with 37% claiming they listen at a high volume. The team also asked them questions related to their hearing ability. What they found was that teens who reported listening to music at higher volumes with headphones were more likely to report difficulty concentrating, the need to ask people to repeat themselves, and various other symptoms relating to degraded hearing. The team concluded that teens in this study were using personal headphones at rates higher than recommended, and it would recommend Hearing Health Promotion Programs for a majority of those polled.

Although this study provides some convincing evidence, an article written by two doctors in Health Scope claims that there are not enough studies done to determine whether or not headphones can negatively affect a person's health. Despite the lack of studies, the doctors believe that it is more likely than not that headphones affect a person's hearing. The article provides evidence to support their claim. Decibel levels typical of headphones (around 85 decibels) usually cause complications in a person's hearing if they are exposed for more than 15 minutes. This level of sound has been shown to numb ears and impact a listener's hearing over a long period of time. Another reason the doctors believe that headphones are harmful is the electromagnetic waves emitted by the speakers. These waves are believed to negatively affect the brain; however, there is no solid evidence to prove this.
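To put decibel figures like these in context, occupational guidelines relate sound level to a safe daily exposure time. A rough sketch using the NIOSH criterion (85 dBA allowed for 8 hours, with each 3 dB increase halving the allowed time); note this baseline is the NIOSH standard, not a figure from the Health Scope article, which cites a stricter 15-minute threshold at 85 dB:

```python
# Safe daily exposure time under the NIOSH occupational noise criterion:
# 85 dBA is permitted for 8 hours, and each +3 dB halves the allowed time.

def safe_exposure_hours(level_db, ref_db=85.0, ref_hours=8.0, exchange_db=3.0):
    return ref_hours / 2 ** ((level_db - ref_db) / exchange_db)

for level in (85, 94, 100):
    print(f"{level} dB -> {safe_exposure_hours(level):.2f} h")
# 85 dB allows a full 8-hour day; 100 dB (loud headphones) only ~15 minutes.
```

The halving-per-3-dB rule is why turning the volume down even a little buys a lot of extra safe listening time.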

The study provides evidence to support my original claim that headphones are bad for a person's health. However, it only samples 131 kids of similar age. With such a small sample size, it is hard to claim that this study provides enough evidence to confirm the claim. Although the authors of the Health Scope article are convinced that headphones are not good for health, they admit that there are not enough studies to prove this. They provide evidence from various experiments done on noise, but none of these studies directly link headphones to hearing loss. Although the general consensus in the science community is that headphones affect hearing, no studies can directly prove this. Despite this lack of evidence, there is enough medical data to convince a reasonable person to turn their music down a little in an attempt to preserve their hearing.

If you are anything like me then you laugh when you are in an uncomfortable situation. This doesn’t happen because you are laughing at the other person or because you don’t know how to react… it’s because this is your normal reaction, but why?

People like to mask their feelings because they don't want others to really know how they feel, so people may laugh in times of nervousness because they are trying to balance their anxious feelings. In one study, the researcher Oriana Aragon explains that emotion needs to be regulated: if you begin to laugh obnoxiously for no reason at an inappropriate time, it can mean you have had enough of a certain stimulus and you don't need it anymore.

This study was both experimental and observational, because Aragon watched people in social situations and, by studying their brains, learned that the stimulus is overworked. Too many signals are being sent to the brain at one time, causing the laughter to take place.

(Photo by https://automaticimprov.wordpress.com/2012/09/24/beingfunn/baby-laughing/).

Neuroscientist V.S. Ramachandran explains in his book that we signal ourselves that horrible things we've just encountered aren't that bad because we want to believe this. Making ourselves believe that the situation is not that bad can be a defense mechanism, explaining our behavior of masking what's really going on. We put a wall up because our anxiety overrides us and we do not want to look weak to others.

X Variable- Stimulus (What’s going on in your life?)

Y Variable- Laughing at inappropriate times

Confounding Variables- Did you think of something funny? Did someone look at you weird? Etc.

This experiment's result could be due to chance, as any could be, but this one more so because people laugh at the most random things. Reverse causation isn't present here, because you can't laugh out of nervousness and then have your stimulus decide to overreact. The stimulus needs to have a reaction first, and then the laughing would occur.

Looking at this theory of nervous laughter, when people are able to make light of traumatic events in their life, it is a sign of healing: they have come to terms with the fact that they are able to be happy again. That said, there doesn't have to be a traumatic event in your life for you to have nervous laughter. It can just be who you are. Laughing makes people feel better, and it is one of the few universal things in life; after all, we all want to believe that everything is going to be okay.


Now that I have left my home and begun my new, independent life at college, my parents are always finding ways to remind me to stay safe and healthy. Since any college campus is a hub for binge drinking culture, this is one of their main concerns. Recently, before I go out to a party, my mom will shoot me a text saying, "remember to eat a big dinner before you go out if you plan on drinking!" Since I am sometimes rushing to get ready for my nighttime event, I wonder if spending time getting and consuming this big meal is actually beneficial. I have heard multiple times that if you drink on an empty stomach, the alcohol will affect you more than it would on a full stomach. However, as we have learned many times in class, just hearing something from the general population does not necessarily mean it is the truth. For example, the statement could be skewed due to the power of the anecdote. It is possible that one person did not eat all day and then went out and got too drunk, which could just be a coincidence. It is always necessary to turn to science with these types of questions.

Possible answers:

Null hypothesis: The amount of food consumed by a person before drinking does not play a role in how much the consumption of alcohol is going to affect him or her.

Alternative hypothesis (general belief): The amount of food consumed by a person before drinking plays a role in how much the consumption of alcohol is going to affect him or her.

Before doing any research on the topic, I was thinking about how it would affect people's daily habits if the null hypothesis were correct. I concluded that at this point it might not change anything at all. It would probably be very difficult to convince someone that it is true. As we have talked about in class, once something is a strong, popular belief among the majority of people, it is extremely hard for scientists to change their minds.

Studies:

The study that I found is from the Journal of Clinical Pharmacology and describes two different experiments that were performed at Indiana University Hospital. The first consisted of 20 people (10 males and 10 females) and tested the alcohol elimination rate. The subjects underwent two sessions in which alcohol was infused into their veins. The first session came after a 12-hour period of not eating, and the second after a normal meal. In order to make the study accurate, all of the subjects ate the exact same amount of calories.

Result: Those who ate before the alcohol infusion eliminated alcohol at a rate 25% higher than when they had an empty stomach. There were also large differences between males and females.

The second experiment was performed in a similar fashion, except there were four sessions. In the first, the subjects again did not eat before the infusion. The second infusion was given after a high-fat meal, the third after a high-carb meal, and the fourth after a high-protein meal.

Result: Those who ate before the alcohol infusions had a 45% increased rate of alcohol elimination. There were no significant differences across the three different types of meals that were consumed.

Mechanism: This article tells us that on an empty stomach, alcohol is absorbed into the bloodstream faster than it is if you have recently consumed food.
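The reported differences can be illustrated with a small sketch. Note that the baseline fasted elimination rate below is a hypothetical round number chosen only for illustration; the study reports relative increases (25% and 45%), not this absolute value:

```python
# Illustration of the fed-vs-fasted elimination rates reported above.
# FASTED_RATE is a hypothetical baseline (g of alcohol per L of blood
# per hour), NOT a value from the study; only the ratios come from it.

FASTED_RATE = 0.10

fed_rate_normal_meal = FASTED_RATE * 1.25  # first experiment: 25% faster after a meal
fed_rate_any_meal = FASTED_RATE * 1.45     # second experiment: 45% faster,
                                           # regardless of meal composition

print(f"fasted: {FASTED_RATE:.3f}, fed (exp. 1): {fed_rate_normal_meal:.3f}, "
      f"fed (exp. 2): {fed_rate_any_meal:.3f}")
```

Whatever the true baseline is, eating first scales the elimination rate up by the same factors.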

Conclusion: All of the advice that I have heard from people ranging from my best friends to my parents, along with the scientific evidence stated above, has led me to conclude that it is a smart and rational idea to eat an appropriate amount before drinking alcohol.

Social media is embedded in modern culture. Facebook, Twitter, Instagram, Snapchat: the list goes on and on. Within the last decade there have been increasing numbers of social media platforms, and their influence has expanded as well. Young people commit more time to it than any other age group. On average, people between the ages of 8 and 18 spend 11 hours and 18 minutes a day on some form of recreational media. That is almost half of their day! Seeing such a striking statistic made me wonder: does social media negatively affect adolescent development?

From the three most relevant papers I found on this subject, it appears that social media affects adolescent development in three major ways: it affects our behavior, our health, and our self-esteem.

To begin with, there is a theory of behavioral learning called social learning theory. It is the notion that children imitate behaviors they observe, particularly ones that result in favorable outcomes. So the more time they spend observing the behavior of celebrities, or whomever they see on various media platforms, the more likely they are to normalize it and reflect it in their own actions. So if the president-elect says something that makes grabbing a woman by the hoo-hah acceptable, the adolescent observer will absorb that information and translate it into their own behavior (probably not that extreme, but in other ways). This also holds true for exposure to mature material such as sex, drugs, and violence. Per a paper by Victor C. Strausberger, Amy B. Jordan, and Ed Donnerstein, being exposed to these concepts at such a young age can result in the mirroring of that behavior. This leads to increased aggression in adolescents and increased participation in risky behavior.

Our physical health is also impacted by social media. Engaging in social media is often a sedentary act. People watch YouTube and scroll through Pinterest while sitting on their couches. There is a correlation between such heavy social media usage and a sedentary lifestyle. This type of lifestyle has several negative health implications, including, but not limited to, increased risk of heart attack, loss of lean muscle tissue, and depression. These are not the only health risks correlated with social media usage. So much screen time can also be linked to developmental disorders such as ADHD. Youth who use social media heavily also tend to develop sleeping disorders. These are not all the negative physical health issues that can arise, but they are too numerous to list in this single blog post.

Lastly, social media can negatively affect one's self-esteem. A study conducted on Dutch adolescents using a friend-networking site examined the relationship between the site's use and the self-esteem of the participants. Contrary to what I expected, the study found that the effect of this social media platform on self-esteem was directly related to the type of feedback the users would receive. The more positive comments they got, the better they felt about themselves, and the opposite occurred for negative comments. This is logical, but it still surprised me, so I looked to see if any other studies had similar findings. As it turns out, this is not the only study that found social media can be beneficial. I think that my own bias got in the way when I was making assumptions before reading the study.

After analyzing these papers, I do not think that there are any issues with the science behind them. In my research, I found copious numbers of papers on the subject, probably because of its current prevalence, so I do not think it suffers from the file drawer problem. Also, because they looked at specific things, I do not think it suffered from the Texas sharpshooter problem either. And they are all published works, which means they have been subjected to peer review. The bit that I did find challenging was the experiment relating to self-esteem. It was based on the participants' self-reported reflections on how they were impacted, and self-esteem is a soft endpoint and therefore difficult to measure effectively.

Overall, yes, social media does have a negative effect on adolescent development for the most part. The exception is the feedback one receives on one's social media profile and how it affects adolescent self-esteem. Based on this, I think the best thing for SC200 students to do would be to consider whether the risks of these effects, and the effects themselves, are worth it to them. If nothing else, they should try to be more aware of their social media consumption and its influence on their lives.

I have been trying to convince my mom to let me get a tattoo. It’s not that she thinks tattoos are bad, she just thinks I’m too young. Plus she says my arms are too skinny so once I get bigger arms it’ll stretch out and look weird. So there are a multitude of reasons why my mom doesn’t want me to get a tattoo. I feel that if I educate myself more on tattoos and the science behind them, I can provide a valid argument to her and hopefully convince her to let me get one. So this can be like practice for my real presentation to my mom.

My knowledge of tattoos before I did research was minimal. I knew that the needle penetrated your skin really fast with ink and then you would have a tattoo. That's it. There is a lot more that goes into tattoos than you would think. According to an article that I read, your body is actually constantly responding to the ink to keep the tattoo there. The needle penetrates through the epidermis and into the dermis, where ink is released. This causes your body to send immune system cells to respond to the injury. The article states that the cells that are sent are called macrophages. They try to get rid of the foreign particles. Fibroblasts absorb the rest of the dye. You are able to see the ink through your skin because of the ink held in the fibroblasts and macrophages. So you see, my plan is to first hit her with the science behind tattoos. If she sees how invested I am in the subject, maybe she'll understand how badly I actually want it. Now, time to tell her more about the healing process and any risk factors.

The healing process, from what I read, can be annoying. Depending on the tattoo, it can take up to two months to heal according to an article that I read. It also stated that there are three stages to the healing process. In the first stage, the tattooed area swells and oozes liquids. In the second stage, the tattoo starts to itch and the scabs start to come off. In the last stage, the tattoo seems healed on the surface but the deeper layers still need to heal; this can be the longest stage of the three. The outer layer of skin heals relatively quickly because it is exposed and therefore the most important to close up. Overall, the healing process does not seem too bad. Annoying, yes, but other than that it's all good.

With any type of injury endured, there is a chance of infection. This is true for people who choose to get tattoos as well. A study was done on 4,227 students, ages 14 to 22, to see how conscious they are of the infections you can get from tattoos. The boys studied, for the most part, were unaware of the possible infections they could get from tattoos. They did not even know what steps to take to fix the issue or what warning signs they should be looking for. This tells me that not enough young people are educated on the possible risks of getting tattoos. According to a report I read put out by the FDA, possible risks include: infection, allergies, scarring, granulomas, and MRI complications. The first three risks are straightforward, but granulomas are little bumps or knots that can form around the tattoo, changing the pigment. As far as MRI complications, the pigment in the tattoo may react with the scanner's magnetic field, causing swelling or burning during the scan. These risks are preventable if you take proper care of your tattoo and give it the correct time to heal without much stress on the tattooed area.

Overall, I think my argument is pretty solid. My mom would be comforted knowing that proper care of my tattoo can help prevent the possible risks. Also, the healing time isn't too long and wouldn't affect me for a long period if, again, I take care of it. Hopefully my mom buys into what I have to say, but if not, at least I know cool things about tattoos now.

A study done at USC shows that daily use of aspirin could result in a longer life for older Americans. Since the study wasn't laid out for me in an abstract, but rather just published on a website, I decided to use their findings and create an abstract of their work.

Null Hypothesis: Aspirin does not lengthen the life of older Americans.

Alternative Hypothesis: Aspirin lengthens the life of older Americans.

Subject Scenarios: This study is ethically difficult to conduct, considering it impacts people's lives and longevity, so the researchers at USC used simulations and surveys to model the effects. USC researchers ran two scenarios that project the health and aging trajectories of older Americans. These were called "Guideline Adherence" and "Universal Eligibility".

Guideline Adherence: Focused on determining the potential health outcomes, savings, benefits, and drawbacks of following the task force's guidelines from 2009.

Universal Eligibility: Not realistic; aimed to measure the full potential benefits and drawbacks if all Americans 51 and older, regardless of the guidelines, took aspirin every day.

Results: Researchers found that following the daily low-dose guidelines would prevent 11 cases of heart disease and 4 cases of cancer for every 1,000 Americans between the ages of 51 and 79. In addition, the life expectancy of these Americans increased by 0.3 years. By the year 2036, an estimated 900,000 more Americans would be alive as a result of the aspirin regimen. On the negative side, the regimen did not reduce the rate of strokes, and it increased the rate of gastrointestinal bleeding by 25%.
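The per-1,000 figures above scale linearly, so a quick back-of-the-envelope check is possible. A minimal sketch in Python: the prevention rates are from the study as reported, while the 80-million population figure is a hypothetical round number of my own, not from the article.

```python
# Back-of-the-envelope check of the study's projections. The per-1,000
# prevention rates are as reported; the population in the example below
# is a hypothetical round number, not a figure from the article.

HEART_DISEASE_PREVENTED_PER_1000 = 11
CANCER_PREVENTED_PER_1000 = 4

def cases_prevented(population):
    """Scale the study's per-1,000 prevention rates up to a population."""
    scale = population / 1000
    return {
        "heart_disease": round(HEART_DISEASE_PREVENTED_PER_1000 * scale),
        "cancer": round(CANCER_PREVENTED_PER_1000 * scale),
    }

# Hypothetical example: 80 million Americans aged 51-79
print(cases_prevented(80_000_000))
# -> {'heart_disease': 880000, 'cancer': 320000}
```

Even with a generous population assumption, the prevented-case counts stay far below the "900,000 more Americans alive" projection, which depends on the cumulative life-expectancy gain rather than prevented cases alone.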

Conclusion: I would reject the null hypothesis. However, there are some flaws in this study. First, the fact that no actual people or animals were used in this study makes the results much more difficult to believe. The study may also have fallen victim to the Texas Sharpshooter problem, because the researchers wanted a particular result. Also, the results do not seem significant enough to recommend that people take aspirin daily. If a larger majority of the people were positively affected, then I could understand the recommendation. In my opinion, for this study to have been done more accurately, a placebo-controlled trial would have been a good alternative, to see the difference between taking an aspirin daily and not taking one.

I recently subscribed to Netflix, and ever since I have been addicted to a show called Orange is the New Black. One of the characters on the show never speaks because of a severe speech impediment, but she is a fantastic singer. I am curious about the difference in mechanisms between what it takes to speak and what it takes to sing. My voice is not particularly unpleasant, but it is a whole different story when I sing. Singing is a skill, while normal speech tends to come easily, yet we use the voice for both actions.

The American Academy of Otolaryngology explains well how the voice is produced. To talk, obviously, air is needed. During a breath, that air goes through the larynx and makes the vocal folds in the larynx vibrate. Those vibrations make the sound needed for speech, but that is not the only component. Every person has an individual shape of their nose, mouth, and throat, which changes the sound of their voice. That is why everyone has a different voice. The process for speaking is pretty straightforward.

Singing is not much different. According to The National Center for Voice and Speech (NCVS), there are only two differences. The first is that more air is needed to sing. You can feel that just from doing each: before singing, I naturally take a deep breath in, whereas I barely even notice the breaths I take in order to speak. NCVS also says more supraglottic space is needed to sing. Frank Gaillard says the supraglottis is just another part of the larynx, the same place where those vocal folds are. The same mechanism is used for singing and speaking.

One study at the Institute of Experimental Psychology compared singing and speaking. There were 24 participants between the ages of 21 and 33. First, they were instructed to speak a word with no specific pitch; then they had to match the pitch of a note that was played prior to their singing. The pitch of the note was based on the gender of the participant: all of the females matched one frequency and all of the males another. By blocking on frequency this way, the researchers excluded gender as a confounding variable. A major finding of the study was that the data showed it takes more control over your voice to sing than to simply speak. Physical structure and experimental data both support the idea that singing takes more vocal control than speaking.

My findings have left me a little bit more confused about the character in Orange is the New Black, but I guess that is okay. If more control is needed to sing, I would think that it would be harder to sing with a speech impediment, but that does not prove to be true in the instance of that specific character. On the other hand, I am not totally sure if the character represents an actual person and their situation.

I was sitting on my couch over break watching football and basketball, and I noticed the hairlines of some of the players and of people in some commercials. Everyone knows it's quite common for your hairline to recede as you get older, and you can tell it is receding by the high hairline and the indents on the top of the head. I was thinking: are there any other factors besides age that could possibly cause your hairline to recede? Do certain treatments cause your hair to thin out? Is hair loss genetic, or is it due to chance and just a matter of time?

A receding hairline is due to hair loss, experienced more commonly by men than women, and usually occurs around your mid-30s but can happen as early as your early 20s. You will begin to notice a loss of hair over your temples that creates a widow's peak on your forehead, and eventually a bald spot forms on your head. This article breaks down the process of hair loss and its numerous causes, including hormones, genetics, age, and stress. A sex hormone called DHT (dihydrotestosterone), which is responsible for male sexual characteristics during puberty, is a direct cause of hair loss in some men. When it continues to increase after puberty, it can build up around hair follicles and limit their growth by restricting blood flow, causing hair to fall out. Genes are another cause of receding hairlines. Scientists found that the AR (androgen receptor) gene, which a boy inherits on the X chromosome from his mother's side, can cause hair loss depending on the strength of the gene, and it can start to show as early as a boy's teenage years. Age is obviously a factor in hair loss, with 65 percent of men showing signs of hair loss in their later years and 25 percent showing signs around age 30, according to the National Institutes of Health.

A study led by Dr. Keith D. Kaufman followed a group of male patients ages 18 to 41 who experienced pattern hair loss. Two 1-year trials were observed, in which one group of men received an oral hair loss supplement while the other group received a placebo, with participants blinded to which they got. The 1,553 men who received the oral supplement showed improvement each year, while the 1,215 men in the placebo group continued to lose hair. The oral treatment did not only slow down the hair loss; it also improved the quality of the men's hair while increasing growth. Assignment was randomized among the men, and the age at which they began to notice hair loss was balanced between the supplement and placebo groups, as well as between U.S. and international participants. Another statistic was that in every category, more than 70 percent of the men had a family history of hair loss. Photographs of every patient were captured to record their progress, and in one case a man's baseline photo showed a bald spot that was completely gone after 24 months of the oral treatment. The oral treatment was designed to counteract the process of DHT, which the researchers said was the main contributing factor in these men.

In conclusion, after reviewing the study, it would be interesting to see how the hair loss of patients who did not suffer from excess DHT would react to this supplement. Would that cancel out the step of getting rid of the DHT and go straight to increasing hair growth? Since hair loss is so common among men, there are many store-bought products that can be used, but there were also teenagers in that study, so would they just have to wait to use it or take a different approach? And what approach do you take if you carry the AR gene; can that be treated?

Today's question dives into whether the Great Barrier Reef is actually "dead" and what is causing it to die.

Scientist examines healthy coral

I stumbled upon this question as a trending topic in Facebook's news section a few weeks back, in the middle of November. Although Facebook may not be the best source for news because of its abundant "click bait" headlines, the headline I read declared the Great Barrier Reef dead. After surfing through the Buzzfeed reactions and the endless opinionated, misleading posts known as the Facebook comment section, I decided to do my own research on the topic.

The Great Barrier Reef as seen from Outer Space

The Great Barrier Reef is located off the northeastern coast of Australia and is the largest coral reef system in the world, stretching 2,300 kilometers with countless species of fish, turtles, starfish, and sharks. Known for its beautiful corals and unique underwater ecosystem, the Great Barrier Reef is widely believed to be receding and dying due to a phenomenon known as coral bleaching, named for the barren white color the corals take on when algae and nutrients are absent. Corals can survive a bleaching event, but afterward they are far more vulnerable to the environment. Marine biologists speculate about why coral bleaching is rapidly spreading across the Great Barrier Reef, and here are the leading candidates:

1. Change in oceanic temperatures (global warming) – Thought to be the leading cause of coral bleaching: as water temperatures rise, corals expel the algae known as zooxanthellae living in their tissues, causing the coral to turn completely white.

2. Overexposure to sunlight – High exposure to sunlight results in high solar irradiance, which contributes to coral bleaching.

3. Abnormally low tides – Exposure to the air during extreme low tides can cause bleaching in shallow corals.

CNN recently posted an article disclaiming that the Great Barrier Reef is dead, stating there is a difference between "dead" and "dying". I believe that arguing over whether the Great Barrier Reef is dead or dying is petty, and more effort should go toward preserving the reef and conserving the living corals rather than debating the state of the world's biggest living structure.

I have been to the hospital many times since I was born. I have bad asthma and a terrible immune system, so at least once a year I have been to a hospital for health reasons. Many people don't like hospitals or are even afraid of them. My father is an OB/GYN (Obstetrics and Gynecology), so I have also been inside hospitals many times without any intent of treatment, just hanging around my father as he did his work. As a result, I feel comfortable in hospitals, even going so far as to say I feel at home in one. Aside from the time I spent with my father inside hospitals, I feel comfortable in them while there for treatment because of how the staff treats you while you're there. Nurses and doctors do their best to make you feel comfortable and at ease while getting exams or treatment. Last year I hurt my leg and needed surgery. I had to be put under anesthesia, and that made me nervous. The nurses and doctor performing my surgery were extremely friendly, even telling me jokes to calm my nerves and reassuring me that I would be fine.

My question, "Is future healthcare becoming less human?", is inspired by news articles I read every year about the future of healthcare and how robots and robot technology are becoming more popular in the field of medical treatment and study. I have read over and over again how robots are becoming more integrated into the treatment of patients, from performing something as simple as drawing blood to something as complex as performing surgery. I ask this question because, at the rate robots are becoming a part of hospitals, will the patient still feel the compassion, dedication, and care that nurses, doctors, and other human staff members often provide?

According to MedicalFuturist.com, a site and blog run by Dr. Bertalan Mesko, PhD, the future of robot healthcare will increase the level of security and care patients feel and will even benefit the staff as well as the patients. Dr. Mesko says that machines do not need sleep or food, which benefits the patient because a machine is ready to provide care and perform procedures at a moment's notice. He also described the machines as the opposite of discriminatory, racist, or prejudiced, so the treatment they give will never change based on the person and will always be at its best.

As anyone can imagine, surgery is never fun, and there are risks to having it done, but the benefits can potentially outweigh the risks. All you can do is hope you're in the most capable hands of a skilled surgeon. Now imagine your surgeon controlling a robot that makes pinpoint-precision cuts and handles other delicate tasks like neck and back surgery. However, the site and blog also claim that, just like most practices in medicine, robotic surgery is not 100% safe. In fact, severe injury and even death have occurred because of unskilled robotic technicians and surgeons. Power failure, although not common, is also a risk factor.

As seen from the topic reviewed above, thinking logically, the patient will always have human contact, with interjections of robot involvement, some minor and some major. The robotics field shows amazing hope for the future, but progress isn't without its failures along the way to success.

P.S. Another huge impact on robotics and human health science, ladies and gentlemen, behold the Exoskeleton.

Lately, I catch myself grinding my teeth very aggressively. I don't know why, I just do it. I didn't do it before, and now I have to be aware of the fact that I am doing it and stop. We all have that one habit that pops up from nowhere, like cracking our fingers, neck, or back, chewing on our fingers, and other small irritating behaviors that actually happen for a reason. Everybody concludes that these little movements are a natural result of stress. I disagree; I don't think I'm stressed, I just think my brain is playing mind games (no pun intended).

The medical term for grinding your teeth is bruxism. Apparently, if you grind your teeth during the day, you are likely to do it during your sleep as well. Bruxism happens mostly during sleep, especially after a day of consuming caffeine or alcohol. The habit is intertwined with other bad habits, such as chewing pencils or biting your nails. As I type this, I am very aware of the strain in my jaw and the fact that my brain orders my mouth to stop doing it every 15 seconds. It sort of sneaks up on you and you start doing it without even noticing; it's driving me crazy, especially because it leads to migraines.

If you also have this problem, I have a simple solution for you. I was at my old Penn State New Kensington campus and ran into nurse Elaine, one of my favorite people in the world. She taught me a simple trick, which is to position your tongue in the middle of your mouth, enabling your jaw to relax and preventing teeth grinding. It's a simple trick that goes a long way. I have applied it, and while I can't say you will stop entirely, at least you will be better able to control the otherwise uncontrollable urge to give yourself a dull headache.

As my first semester at Penn State comes to a close, I've been reflecting on how many absolutely adorable puppies I've seen around campus. However, these aren't your typical animals. These are dogs being raised to positively affect someone's life. In one of my previous posts, I discussed animal testing and why you could argue that it is a waste. That is definitely a controversial topic; however, the fact that we (as humans) NEED service dogs surely is not. More specifically, I am interested in how animals can help veterans, people who have sacrificed and possibly lost so much from their lives after not having to rely on anyone for anything up to that point.

Service dogs have been proven to help with physical disabilities such as hearing and vision difficulties, in addition to balance and anxiety issues. The Military Times has even been publishing recent studies on how dogs may be able to help veterans with PTSD. Marguerite O'Haire, an assistant professor of human-animal interaction at Purdue University, is conducting an experiment consisting of 100 post-9/11 veterans split into 50 with dogs and 50 without. Tests measuring different variables will be conducted and will provide the basis of a pilot study that could eventually lead to further exploration of the importance of service animals to veterans.

The null hypothesis is that dogs used to help veterans cope with PTSD will not be successful and won’t have a positive impact on their lives. The alternative hypothesis is that these animals will positively influence veterans and their overall quality of life by providing them with a constant companion.

For this longitudinal study, O'Haire addresses the fact that unbiased data is necessary in order to build a reliable foundation. The study was paused and then restarted in 2012 with a larger sample size. This emotional support service dog study was still ongoing when the article was published, but it is safe to assume that Purdue's research in pairing veterans with PTSD service dogs will make progress in finding ways to comfort fellow Americans whose selflessness cost them so much of their quality of life upon returning home.

It is clear how much potential this research has. To me, along with so many others that I know, pets are just companions: "extra" things that do make us happy and can arguably make life better, but that we don't necessarily rely on. But to those who are less fortunate, or who deserve an emotional outlet, canines have the potential to be so much more.

I am an avid user of allergy medicine. Ever since I was about ten, I have been developing new allergies to things that I love to eat. For example, around the age of twelve, I realized I had developed an allergy to peaches, my favorite fruit at the time. Over the years I have developed allergies to more and more foods. To make a long story short, I am constantly taking Benadryl, a common allergy medicine, to mitigate my allergies. Because I take Benadryl so frequently, I decided to look into it and asked myself a few questions: "Does Benadryl have any negative effects on my body or brain? If so, what are those negative effects? What is it in Benadryl that helps people fall asleep, and how does this work?"

After doing some research, I found my answer. Yes! According to various studies, over-the-counter drugs such as Benadryl are actually linked to dementia and Alzheimer's disease. I was not expecting to find something that shocking, but I did have an idea that taking Benadryl as often as I do is probably not the best for me.

The null hypothesis of this study is that over-the-counter drugs such as Benadryl do not actually have any effects on the human body or brain. The alternative hypothesis is that drugs such as Benadryl do, in fact, have effects on the body and brain. The question here seems to be: is there a correlation between over-the-counter drugs and dementia?

According to the Fisher Center for Alzheimer's Research Foundation, the longer you take the medication, the greater your risk of diseases such as Alzheimer's. Other than Benadryl, the studies have focused on antidepressant drugs and antihistamines (used in the treatment of allergies). Apparently, certain medications are known for blocking a specific brain chemical called acetylcholine, which transmits nerve signals throughout the brain and nervous system. Another well-known risk of the drugs is damage to particular features of cognition, which has been demonstrated in single-dose experimental studies. The drugs associated with dementia and Alzheimer's are known as "anticholinergic agents." These specific drugs are most commonly taken for things ranging from bladder problems to allergies and mood disorders. There have been various studies on whether specific drugs are linked to dementia and Alzheimer's, but the study performed by researchers at Group Health and the University of Washington was the first to show a dose-dependent relationship, meaning that taking the drugs for longer actually increased dementia risk.

To do the study, researchers at Group Health and the University of Washington examined medical and pharmacy records from 3,434 men and women who were involved in the Adult Changes in Thought study (ACT). The ACT was an ongoing study of brain aging and dementia in the Seattle area. Participants of the study were 65 and up and did not have dementia at the start of the study.

The researchers evaluated the participants every two years for signs of dementia, over a period of more than seven years. After the study was done, researchers found that overall, long-term use of the drugs significantly increased the risk of developing dementia, including Alzheimer’s disease. After the study, alternative drugs were recommended to patients that did not have anticholinergic effects. A similar study suggests that a person taking specific drugs for more than 3 years would have a greater risk for dementia.
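A dose-dependent relationship of this kind can be illustrated with a toy calculation: bucket participants by cumulative years of use, then compare the dementia rate per bucket. To be clear about what is assumed here, the three-year cutoff comes from the article, but the bucket names, the helper functions, and the toy data are all my own illustration, not the study's actual methodology.

```python
# Illustrative sketch of a dose-dependent analysis: group participants
# by cumulative exposure and compute the dementia rate in each group.
# The 3-year cutoff is from the article; everything else is made up.

def exposure_group(years_of_use):
    """Classify cumulative anticholinergic use into coarse buckets."""
    if years_of_use == 0:
        return "none"
    elif years_of_use <= 3:
        return "short-term"
    return "long-term"  # the group the article flags as highest risk

def dementia_rate_by_group(participants):
    """participants: list of (years_of_use, developed_dementia) pairs."""
    counts, cases = {}, {}
    for years, dementia in participants:
        g = exposure_group(years)
        counts[g] = counts.get(g, 0) + 1
        cases[g] = cases.get(g, 0) + (1 if dementia else 0)
    return {g: cases[g] / counts[g] for g in counts}

# Toy (fabricated) data shaped like a dose-dependent result:
toy = [(0, False), (0, False), (2, False), (2, True), (5, True), (5, True)]
print(dementia_rate_by_group(toy))
# -> {'none': 0.0, 'short-term': 0.5, 'long-term': 1.0}
```

A rate that climbs monotonically with exposure, as in the toy output, is the signature the Group Health/University of Washington researchers were the first to report for these drugs.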

After reading into multiple studies done on whether or not certain drugs are related to dementia and Alzheimer’s disease, I am still unsure as to whether or not my findings will change my habit of constantly taking Benadryl. While reading, I wondered if a possible third variable could be the fact that certain diseases might run in the family. Many of the studies did not focus only on Benadryl, but focused also on other drugs including antidepressant drugs. I do not have to worry about the negative effects for those drugs, but if others are taking antidepressants along with certain allergy medicines, they might want to start taking alternative drugs to reduce their risks of getting dementia or Alzheimer’s disease. I always find it interesting to see the correlations between certain things and the fact that Benadryl is actually shown to be linked to dementia blows my mind!

Sounds crazy, right? Why would a product have the opposite effect of its intended use? Well, it may not be as ludicrous as it seems. Chapstick companies make more money when you have chapped lips, so why fix the problem?

According to Caitlin Covington of Greatist.com, applying lip balm only temporarily soothes your lips and has a negative effect on their health in the long run. The lip balm interferes with the signaling between the dying cells and the mechanism that produces new healthy cells. As a result, the lips are prevented from replenishing their cells and healing. This makes the chapped lips worse, causing you to apply chapstick again. The process repeats over and over, and your lips become more chapped than they were before you started using the product.

Dermatologist Leslie Baumann, MD, blames the negative effects of chapstick on hyaluronic acid and glycerin, both common ingredients in products for chapped lips. These chemicals draw moisture away from the skin. While she claims these ingredients can be effective alongside other chemicals that prevent water loss, she recommends never using a chapstick that doesn't also include an occlusive agent. Occlusive agents are often found only in medicated or high-end lip balms.

Dry climates, altitude, and many other factors can chap your lips, but your body is built to heal on its own. Unless your chapstick is medicated or contains an occlusive agent like beeswax or shea butter, you are better off leaving them alone. Using chapstick without these agents is no better than licking your lips and can cause health problems rather than solutions.

Don't have access to or the funds for a pricey lip balm? Today.com has some other solutions. The first is staying hydrated; through my research, I've discovered this is pretty much the #1 item on every dermatologist's list. Other advice includes using a humidifier and using sugar to exfoliate the lips. As well as avoiding lip balms, they also recommend staying away from lipstick and gum.

As this dry winter weather approaches, I hope this will be a helpful guide. Do you have any personal stories that support or refute my findings? Let me know in the comments below.

Walking back into my dorm to my roommate was such a great feeling. It was nice to be back in the typical routine and to see our cozy room the way we had left it. The one thing that disturbed me, though, was a strange lingering odor. I checked the garbage, which was empty as I had suspected, but as soon as I was in the vicinity of the refrigerator, I gagged (literally). The two of us opened the door to the fridge and saw mold-infested jars and drinks. Milk, cream cheese, pasta sauce, yogurt: everything was thrown out. The thing reeked (it still does). How could we be so stupid as to unplug our fridge?! Major freshman rookie move. *Insert insults here*

But it didn't end there; the freezer was even worse. Ice cream had spilled over and spoiled, making our room smell like cheese and/or a dead body. Horrific. It still smells, despite our cleaning it multiple times and putting baking soda in it. Now to the point: how long does it take for food to spoil, and why does it smell so foul?

Since most of the products that got disturbingly rotten were dairy, I decided to start by searching how long milk lasts outside the fridge. This website explains the FDA guidelines for dairy consumption after it has been out at various temperatures. Milk left outside the fridge (at a temperature over 40 degrees Fahrenheit) is only safe to consume for two hours. I found that pretty shocking; it can't even last a day?! (It makes our 10-day milk and ice cream situation seem even more horrifying.) Likewise, if milk is left in a 90-degree area, it will take only an hour before you have to toss it. Keep this in mind if you ever leave milk out on the counter by accident!
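Those time-and-temperature cutoffs amount to a simple rule, which can be sketched in a few lines. A minimal sketch, assuming the two reported thresholds (over 40°F: two hours; 90°F and up: one hour) plus my own simplifying assumption that everything in between gets the two-hour window:

```python
# Sketch of the guideline described above: milk's safe window outside
# the fridge shrinks as temperature rises. The 40°F/two-hour and
# 90°F/one-hour figures are from the article; treating everything in
# between as a two-hour window is my simplifying assumption.

def safe_hours_for_milk(temp_f):
    """Hours milk can safely sit at a given temperature (Fahrenheit)."""
    if temp_f <= 40:
        return None          # refrigerated: go by the expiration date
    elif temp_f < 90:
        return 2.0           # over 40°F: the two-hour rule
    else:
        return 1.0           # 90°F or hotter: the one-hour rule

print(safe_hours_for_milk(72))  # room temperature -> 2.0
print(safe_hours_for_milk(95))  # hot car -> 1.0
```

By this rule, our unplugged fridge blew past the safe window roughly 120 times over.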

I'm sure it wasn't just the dairy that stunk up our fridge, and I'm sure the food didn't do it all by itself. This article states that most of the scents on rotten food come from the microbes spoiling it, including bacteria, yeast, and mold. Some scientists suggest (though it is not proven) that microbes create these scents so they can take over and consume the food while frightening humans away from it. An interesting theory, but who knows if it will ever be proved. And while some foods go bad and stink, others may go bad and have no smell at all; that does not mean they are safe to eat or drink. Likewise, milk, for example, can stink without yet being spoiled, so pay attention to expiration dates and to the temperature at which the product is and needs to be stored.

If you voted for Donald Trump, that's great. If you voted for Hillary Clinton or Jill Stein, I have terrible news for you. The incoming president has chosen a man named Myron Ebell, "wanted" in Paris for "destroying our future" by a French climate change advocacy group. Now, it doesn't matter if you voted Republican or Democrat, or even for Harambe; we all live on this Earth, and climate change is an issue that concerns literally all of humanity. When your town is flooded or hit by a hurricane, it's not going to care if you're a Republican. When the wind blows and storms and massive heat or massive cold strike large parts of this country, it will be too late to do anything. This is a massive issue for the many people who advocate more protection for the planet, and the name Myron Ebell should terrify you more than anything you have read in the past two years. As we all know, the Paris Climate Agreement, an already insufficient accord, was signed by the Obama administration last year. John Kerry, his secretary of state, brought his own granddaughter to the signing of the accord at the United Nations General Assembly. The message he tried to convey was that he is doing this to protect future generations, including his own family.

As sweet and symbolic as that was, the future of the accord is in serious jeopardy, not because Trump and Ebell can exit the agreement, but because they can remain in it and simply not abide by it, which would be even more dangerous. I was watching Leonardo DiCaprio's exquisite documentary on global warming, in which he shows viewers the effects of this issue that are happening already and will get much worse. So Myron Ebell is sort of the last thing we need. Why? Because he is a consistent denier of the whole thing. He persistently attacks scientists, researchers, and advocates for climate action. Like his boss, he even went after the pope for talking about it. His vision for the future is as scary as anything I personally know about the next four years.

Football is such a wonderful game, isn’t it? It brings families and friends and foes together for a series of sporting spectacles for ten hours on a Sunday, with a primetime show the preceding Thursday and the following Monday. Every single stadium roars loudly, tens of thousands of witnesses cheering for their side to win. Now take away the field, the officials, and the teammates, and replace the pads with a good old-fashioned sword and shield. Doesn’t that sound awfully familiar, just like something that occurred regularly in Rome over a thousand years ago?

This connection has been made by many people many times before, but for some reason people still aren’t sure whether or not football players are modern-day gladiators. That was my view when I first heard the comparison; like the writer, I too entered this debate with bias as an avid fan of football. As a little kid, my eyes couldn’t believe the plays that immortal legends such as LaDainian Tomlinson, Ray Lewis and Randy Moss made (yes, I actually thought they were immortal. I was a kid, cut me some slack).

Once I started to realize the real dangers of football, I no longer celebrated the huge hits Ray Lewis made that I once cheered wildly for; instead, I hoped his victim walked away unscathed. Football causes an alarming number of concussions, and many former players are suing the NFL over permanent head injuries or have already died as a result of them. That only adds fuel to the fire. Chronic Traumatic Encephalopathy (CTE) has contributed to the deaths of many players. It scared me when I first heard about it, and chills ran up my spine when I saw the movie Concussion in theaters. I realized how dangerous football really was. I played football with my friends up through high school, but now we all play touch so we don’t end up in an emergency room.

This article, written by legendary sports journalist Bill Plaschke, affirms the connection between gladiators and football players. Both receive cheers from tens of thousands of fans who come to huge stadiums to watch the contests. Huge NFL salaries don’t compare to the minimal payment, if any, that gladiators garnered. However, the NFL and NCAAF both profit from the players’ suffering, just like the owners of the gladiators did. Beyond that, both are fight-to-the-death “games.” The only difference is that gladiators had the luxury of dying on the spot; football players have to suffer for years after they wrap up their careers. They retire to some exotic place only to suffer and even die, whether “naturally,” like Ken Stabler or Mike Webster, or by suicide, like Dave Duerson or Junior Seau. Ask their families if football is just a game. Football players are dying in heaps, just like gladiators did when they “lost.” Now ask yourself: just how are football players different from gladiators? If you still can’t answer that, ask the two Broncos players kneeling over 2015 NFL MVP Cam Newton below.

For some weird reason, when you look at the history of scientists, especially revolutionary ones, you will find a bizarre common trait: the huge amount of hatred and anger they faced. When one takes a look at the history of other professions, one can’t find the same amount of historical persecution that scientists had to face. Today, scientists are admired and respected; some have even become famous celebrities, like Bill Nye or Neil deGrasse Tyson. But that was not the case hundreds, or even tens, of years ago. In this post I will give a few brief examples of how scientists were persecuted and try to explain why at the end.

Einstein does not need an introduction, yet his genius and innovation were rejected in Adolf Hitler’s Germany. Because Einstein was Jewish, Hitler believed him to be of a lesser race. Hitler’s forces seized his home in Berlin and publicly torched his books as a symbolic rejection of a Jewish scientist’s findings, no matter how revolutionary they were. We are lucky that Hitler did not embrace Einstein; had that happened, perhaps Hitler would have developed an atomic bomb for his Nazi state.

Galileo, who lived in the 1600s, said the Earth revolves around the Sun, a shocking finding at the time. He was mercilessly targeted by the Catholic Church, which designated him a “heretic” for contradicting the teachings of scripture. His scientific research was banned, and he remained under house arrest until he died.

Rhazes lived in the late 800s. He was a Muslim scientist who specialized in numerous areas of medicine, especially urology, and his medical contributions were priceless at the time, until a preacher ordered a hit man to strike him in the head for “apostasy.” He was blinded by the attack and was never able to pursue his studies further.

All of the examples I gave suggest that the persecution of scientists happened largely for religious reasons. Many preachers and “snake oil” salesmen despised science and fought it aggressively. Is it because science so often proved their teachings wrong? Is it because science employed the logic and reason they lacked? We will never know, but the fact that scientists were historically on the run only makes me trust them more!

As a child with severe ear problems, the carbonation in soda always gave me a rough time, so I moved away from Coke and Pepsi and discovered all of the delicious ways one can drink iced tea. Let’s just say that I should probably be in the Guinness Book of World Records for the amount of peach tea I have consumed. All of the fellow tea drinkers in this class will have heard the warnings from just about everyone about the danger of the ever-painful kidney stones. According to this article, the primary cause of kidney stones is dehydration. The same article brings up the fact that some ingredients that go into making iced tea are the same chemicals linked to kidney stone formation. Also, iced tea makes you more susceptible to kidney stones than hot tea simply because people drink more iced tea than hot tea. So I decided to put together a little mechanism for a possible study that could be conducted.

When it comes to possible reverse causation, one can argue that people who are susceptible to kidney stones drink more iced tea. And, as always, chance is a possibility.

Another possible cause of kidney stones is the type of sweetener used in the beverage. This study concluded that sugar-sweetened beverages may be more of a risk than any specific beverage. Multiple studies have been conducted, and scientists still cannot strongly conclude what is causing the rise in kidney stones. In reality, everything in moderation is okay; just remember to drink a lot of water and you will be at a lower risk of getting kidney stones. My personal conclusion is that, when it comes to kidney stones, it doesn’t matter which beverage you drink.

At some point or another in our lives, we have all had an overwhelming number of tasks to get done in a seemingly minimal window of time. To solve this, we attempt to multitask. I for one try to multitask quite frequently, albeit with things of varying levels of importance. I try to lock in and focus on my work, but a lot of the time text messages and Snapchat enter the fold. Sometimes I feel compelled to throw in headphones or answer a FaceTime call. Maybe my fantasy football roster could use an overhaul prior to the games on Sunday. Meanwhile, I somehow manage to get my work done well and on time. Seemingly, multitasking is a real thing then, right?

Well, the answer is no; what society in general understands to be multitasking isn’t even real. It’s actually a great way to become inefficient, and it can be dangerously distracting, as in the example from that source: using your eyes to try to text and drive at the same time. A less deadly example that I’ve actually attempted was to watch television while running on a treadmill and to hydrate with some Gatorade at the same time. I was able to watch television and run on the treadmill simultaneously (I’ll explain how in the next paragraph), but I was not able to take a drink. I had to slow down my speed to make sure I could actually grab my drink, and I had to look down to make sure I didn’t fly off the treadmill.

Now, how come I was able to run while I was watching television? Every single thing that you do is processed by some part of your brain. If you try to complete multiple tasks at the same time, and those tasks involve different parts of the brain, it is possible to do them at once. This exception has very clear and specific parameters: if one of the two actions is basically mastered, such as walking, running, or eating, another action using a different part of the brain can be done alongside it. Since watching television and running are two actions that don’t involve the same part of the brain, I was able to do them together. However, adding my sip of Gatorade into the equation was not good, as I was moving my legs and arms in different directions to do two different things. How could that part of my brain focus on either one of them?

What actually happens when people think they are multitasking, as described here and in the same link from the paragraph above, is called serial tasking. Your brain shifts focus from one task to another fairly quickly, but it doesn’t do multiple tasks at the same time. So, going back to the opening paragraph where I discussed my attempts at multitasking my work with various distractions, I was actually shifting my focus from my work to texting, Snapchat, music, or a call. I wasn’t actually doing any of those things at the same time, because each activity needs to be focused on to be completed. If I’m going to send a text, I cannot type on my laptop and my phone at the same time. If I’m going to take a picture on Snapchat, I have to divert my attention from my textbook to my phone briefly. The list goes on and on. What shouldn’t go on anymore is the belief that the human brain can truly do multiple tasks at once, what we all thought was multitasking, because the brain cannot do that.