The Tax Cuts and Jobs Act, passed along party lines by Congress and signed into law by President Donald Trump in December 2017, marks the most expansive overhaul of the tax code since the Reagan era. Economic conservatives could not have asked for a better Christmas gift, celebrating the bill’s deep cuts to the corporate tax rate, broadly lower income taxes and a variety of deregulatory and financial perks for small and large businesses across most industries. However, people yearning for the delivery of a central promise of the bill are left wanting; while the new law simplified elements of the byzantine tax code, the system remains essentially as complex as ever.

The reality is that Americans in 2018 will not be able to do their taxes “on a postcard,” a colloquialism used by Senate Majority Leader Mitch McConnell and Trump to capture the optimistic idea that taxpayers would no longer need to fill out – or pay someone to fill out – hundreds of sheets and documents for the IRS each year before Tax Day. The failure to streamline the tax system raises the question of whether Americans will ever see simpler procedures, or whether they should accept that today’s economy is too complex and the influence of special interest groups too strong for itemized deductions to ever fully disappear.

The history of the federal tax system is one of sustained growth, both in the number of citizens it reaches and in the size of its bureaucracy. Throughout the 20th century, tax rates and the number of deductions ballooned. Before World War II, only a small percentage of the wealthiest Americans paid income taxes; that number, about seven percent, expanded to nearly 70 percent of the population as the war was being funded, according to the economics website Marketplace. With the introduction of the Earned Income Tax Credit (EITC), a refundable tax credit for low-income people (especially those with children), the share of the population paying income taxes has since declined to 53 percent. As a result, changes to the federal tax code mostly affect non-retired earners who do not qualify for credit breaks.

The new bill also doubles the standard deduction for these non-retired earners, which admittedly promotes simplicity, as more people will be incentivized to take a flat deduction rather than itemizing line by line. Most people who have the choice would rather take a clear flat deduction than suffer through an endless sea of forms and painstakingly account for their receipts. Internal Revenue Service (IRS) data from 2013 (the most recent year with published data) show that 68.5 percent of households took the standard deduction, while only 30.1 percent chose to itemize. The latter number will shrink as a result of the tax bill, but for those who continue to itemize, the system will likely remain as convoluted as it was last year.

One of the tax system’s central controversies has repeatedly been individuals exploiting technicalities in tax breaks for certain behaviors, in an effort to avoid paying taxes. One example of this involves businesses shifting their official classifications from an industry such as “Consulting” to a less-taxed industry, in order to keep more of their revenue. Since certain new rules implemented by the Tax Cuts and Jobs Act are intended to simplify parts of the code, there will inevitably be situations where people will attempt to cheat the IRS by taking advantage of loopholes that reward certain types of behavior.

For example, an important provision in the new bill increases the deduction for pass-through entities – business vehicles exempt from corporate income tax, such as landlords’ real-estate holdings, legal partnerships and S-corporations. According to the Brookings Institution, 95 percent of all U.S. businesses can be classified as pass-throughs. While this particular change will be a boon for many small businesses and may spur economic growth, the majority of pass-through income flows to the top one percent of earners, who might try to position themselves as businesses rather than as individual taxpayers in order to pay a lower rate. A hedge fund, for instance, might absorb hundreds of millions of dollars a year and still qualify as a pass-through based on the status of its ownership – an arrangement that may seem unfair to earners making less but proportionally paying more. In turn, more rules would have to be put in place to prevent this disingenuous behavior, leading to more provisions, more paperwork and more bureaucracy.

Similarly, while the bill lowers most individual rates, it keeps the previous seven income brackets in place. As in the previous tax code, rates scale up gradually with income. For example, a family whose income places it in the 28 percent bracket will not pay a full 28 percent of its income to the IRS. Instead, the portion of its income that falls within the first bracket is taxed at the 10 percent rate stipulated by the new bill, the portions falling within the subsequent brackets are taxed at those brackets’ rates (15 percent, then 25 percent), and only the remainder is taxed at 28 percent. This marginal structure is just one more cause of confusion and frustration that the bill did little to eliminate.
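The marginal-bracket arithmetic above can be sketched in a few lines of code. The rates (10, 15, 25 and 28 percent) come from the example in the text; the dollar thresholds below are hypothetical round numbers chosen purely for illustration, not the actual statutory brackets.

```python
# Illustrative marginal tax calculation. Only the rates come from the text;
# the bracket thresholds are hypothetical round numbers for demonstration.
BRACKETS = [
    (0, 0.10),        # first dollars taxed at 10%
    (20_000, 0.15),   # income above $20,000 taxed at 15%
    (75_000, 0.25),   # income above $75,000 taxed at 25%
    (150_000, 0.28),  # income above $150,000 taxed at 28%
]

def tax_owed(income: float) -> float:
    """Tax each slice of income at its bracket's rate, not the whole amount."""
    owed = 0.0
    for i, (floor, rate) in enumerate(BRACKETS):
        ceiling = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if income > floor:
            owed += (min(income, ceiling) - floor) * rate
    return owed

income = 200_000
print(f"Total tax: ${tax_owed(income):,.0f}")
print(f"Effective rate: {tax_owed(income) / income:.1%}")  # well below 28%
```

A $200,000 income under these hypothetical brackets owes $43,000, an effective rate of 21.5 percent rather than the top 28 percent, which is exactly the confusion the paragraph describes.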

Experts have long proposed solutions to the complexity of the current tax system. One comes from investor Steve Forbes, who has long advocated a flat tax with no deductions or loopholes. This radical solution would eliminate the interest-group question entirely and make tax filing truly doable on a postcard, although it would inevitably cost the poor and elderly some benefits. However, the idea never gained much traction during the process of amending the bill. Instead, the debates over what changes to make to the existing code centered on haggling over individual bracket rates, arguments over which deductions to add or eliminate and a variety of other minutiae decided by different Congressmen and committees. A different path toward simplification, and the one most used in practice, is eliminating as many itemizations as possible. However, most discussions of removing specific line items faced pushback from one interest group or another.

Approaching the tax system by attacking line-by-line details parallels the myth of Hercules and the Hydra: cutting off one head of the beast leads to two more popping up in its place. For a meaningful simplification of the tax code to take place, politicians will likely have to start working on a bill from the top down, with the philosophy that all itemizations must go and that flat rates will be the new standard for each income group. If they can accomplish this while maintaining a safety net for low-income and elderly people, all families and earners will benefit. Until then, per the Taxpayer Advocate Service, Americans will continue spending a combined six billion hours a year filing taxes while paying compliance costs totaling roughly $195 billion. There’s a reason tax-preparation giant H&R Block’s stock actually rose after the bill passed, despite early fears the company would be forced to close its doors. The Tax Cuts and Jobs Act made some meaningful strides by doubling the standard deduction, but the code still has a long way to go before filing taxes on a postcard becomes possible.

As the United States economy struggled to avoid complete collapse in the height of the financial meltdown, leading players in the regulatory arena, as well as those in Congress, pushed for a solution that would prevent any future downturn from escalating to the near catastrophic levels of 2009. That solution emerged in the form of the Dodd-Frank Wall Street Reform and Consumer Protection Act, which passed Congress on party lines and was signed by President Obama on July 21, 2010.

Dodd-Frank formed a key part of President Obama’s plan for economic recovery. The law significantly overhauls the American financial regulatory system, creating a new Consumer Financial Protection Bureau, limiting risky trading and implementing a new regulatory regime for credit-rating agencies and insurance companies, among other measures. The law is arguably the biggest push for financial reform since the Great Depression, and given its length and breadth, it has generated substantial opposition.

One of the most controversial provisions of Dodd-Frank is the Volcker Rule, named after former Federal Reserve Chairman Paul Volcker. The rule bans most forms of proprietary trading, preventing financial firms from profiting directly from trading on the market for their own accounts, as opposed to profiting through client commissions. The rule is significant in that it can potentially prevent some of the dangerous behavior that helped cause the financial collapse. Because of ardent opposition by banks and the limited financial resources appropriated to regulatory agencies, the Securities and Exchange Commission and other financial regulators have requested additional authority to prosecute violators and impose penalties.

But Republicans in Congress see the Volcker Rule as an unwarranted and harsh crackdown on financial activities necessary to promote healthy economic growth. Chairman of the House Financial Services Committee Jeb Hensarling (R-TX) said previously in a statement that the final version of the Volcker Rule “is just the latest example of Washington’s regulatory overkill that ends up hurting more than it helps.”

In this new Congressional session, a bill to delay the Volcker Rule failed in its first attempt to pass the House of Representatives when Republicans used a procedural rule requiring a two-thirds majority to speed up its passage. That effort failed 276-146, but ultimately, a new attempt under regular order passed by a simple majority of 271-154. While twenty-nine Democrats voted with a unified Republican caucus in support of the bill, its passage into law is doubtful given fierce Democratic opposition in the Senate and a likely presidential veto. “One day into the new Congress,” said Senator Elizabeth Warren (D-MA), one of the leading proponents of financial regulation, “House Republicans are picking up right where they left off: trying to gut Wall Street reforms so that big banks can make more risky bets using taxpayer-backed money.”

The controversy over the Volcker Rule, and Dodd-Frank in general, comes down to a difference in political viewpoints between Democrats and Republicans on how much regulation is necessary to prevent a future financial collapse. Democrats cite the need for increased reforms and requirements on big banks in order to safeguard consumers and prevent large financial institutions from growing “Too Big to Fail” and bringing down the economy in the event of bankruptcy. Republicans, on the other hand, believe these increased regulations hamper financial growth and economic recovery, and they point to deregulation as necessary to promote growth. Political consensus on this fundamental difference appears doubtful, but smaller bills have previously garnered bipartisan support, indicating the possibility of compromise in the future.

The Dodd-Frank Act is one of the most sweeping pieces of financial reform legislation ever to become law, but only time will tell whether it can withstand sustained attacks from the financial industry. Furthermore, many believe it has not gone far enough to prevent a future financial collapse, a reflection of the legislative process as well as sustained Republican opposition. Now, with a wholly Republican Congress and only two years left in President Obama’s second term, the future of many of his initiatives remains uncertain. As the economy approaches full recovery, the impetus for additional regulations will undoubtedly diminish as legislators move beyond the financial collapse. The debate over Dodd-Frank and the Volcker Rule will surely continue, and as partisan politics heighten with the start of the 2016 election season, President Obama’s record will become a key component of the campaign.

Congress recently passed a bill confirming, for the first time, that “climate change is real and not a hoax.” With that out of the way, America can officially begin to view the future of energy in a new light.

According to recent data from the U.S. Energy Information Administration, fossil fuels accounted for nearly 82% of America’s energy consumption, a figure that has remained steady over the last few years. Recently, the Keystone XL pipeline has become central to a heated debate over the environmental costs versus the economic benefits of crude oil production.

The price per barrel of crude oil has dropped to its lowest level since 2008-2009. Fluctuating between 40 and 50 dollars, this figure is the result of oversupply by OPEC nations and the refusal of other producers to curtail production.

Source: macrotrends.net

At the same time, the American government continues to wrestle with the contentious Keystone XL bill, a proposal that has environmentalists up in arms.

Keystone XL is an extension of North American oil company TransCanada’s existing Keystone pipeline. The system covers nearly 3,000 miles, delivering Canadian oil to refineries in the Midwest and southern United States.

The proposed extension is a shorter, wider, and more direct route from Alberta to Steele City, Nebraska that passes through sensitive environmental areas, sparking controversy.

Environmental groups point to an array of concerns. The threat of oil spills and leaks raises alarm over a highly sensitive landscape that includes one of the largest reserves of fresh water in the world, the Ogallala Aquifer.

Source: National Atlas

Another worry is the extraction of crude oil from Canada’s oil sands. These tar sand regions provide a majority of the product running through Keystone XL, and research has shown that removing this crude yields more greenhouse gas emissions than conventional methods do. According to the Pembina Institute, a think-tank on Canadian energy, “average greenhouse gas emissions for oil sand extraction and upgrading are estimated to be 3.2 to 4.5 times as intensive per barrel as for conventional crude oil produced in Canada or the United States.”

The project will reportedly transport 830,000 barrels of oil a day and create jobs for people in the area, although the exact number is under heavy dispute.

As of February 2015, both the House and the Senate have passed different versions of the bill, but President Obama maintains his position against it, having vetoed the proposal once and publicly demonstrated willingness to do so again.

Today’s plunging oil prices have added a new wrinkle to the years-long debate over Keystone XL. While the fluctuations won’t halt production at existing tar sands operations, a prolonged slump may inhibit future production. According to Amy Harder of the Wall Street Journal, the most efficient existing oil sands projects have breakeven points below $40 per barrel, while new production areas may require up to $90 per barrel.

However, TransCanada’s CEO, Russ Girling, recently spoke out against these concerns, saying that “Keystone XL is a project that was needed when the price of a barrel of oil was less than $40 in 2008, when we first made our application, at more than $100 last year, and around $45 today.”

Experts agree that the project still makes economic sense, but it is no longer a pressing necessity. Recent data from the federal government show that the US imported more than 3 million barrels of oil from Canada per day, and U.S. refiners still hope to see the infrastructure built.

Bill Day, spokesperson for Valero Energy Corp., the biggest oil refiner in the United States, recently stated that the “ample supply of inexpensive crude oil would offset declining supplies from fields in Mexico and South America. That’s the case no matter what the benchmark price of crude is today.”

The fall of oil prices isn’t over. Most analysts believe that the price per barrel has the potential to drop to $30, and experts from Citi predict that prices will tumble even further before producers begin to limit production. The passage of Keystone XL will only contribute to the supply.

For the foreseeable future, oil prices will continue to fall before eventually bottoming out. Reliance on fossil fuels will decrease only infinitesimally over the next few years, while global demand will persist. Barring the rapid introduction and commercialization of a viable alternative energy source, the price per barrel will recover. The question is how strong the recovery will be.

The Federal Reserve has adopted a new stance on monetary policy: patience. With the federal funds rate at the zero lower bound since 2008, the Fed’s policy-setting committee, the FOMC, has delicately approached the idea of beginning to raise rates. Beginning in 2012, the Fed assured the markets that there would be a considerable period of highly accommodative monetary policy to bolster the economy’s progression toward the Fed’s dual mandate of maximum employment and stable prices.

During the last two FOMC meetings, the committee decided to drop the “considerable time” language from its statement and adopted a new vow to remain “patient in beginning to normalize the stance of monetary policy.” The statement assessed that labor market conditions were improving and that labor market slack, or underutilization, has continued to diminish. Following robust job gains at the end of 2014 and a significant drop in the headline unemployment rate, now down to 5.6 percent, a quick glance at the labor market looks more reassuring than the underlying reality warrants.

There are still 6.8 million workers who are employed part time for economic reasons, 2.3 million workers who are marginally attached to the labor force, and 740,000 workers who are discouraged from looking for work because they do not believe there are any employment opportunities. Over the past twelve months, there has been little change in underemployment. The U-6 alternative measure of broad labor market underutilization, at 11.2 percent, has remained twice the size of the headline U-3 unemployment rate. This is drastically above pre-recession norms, when the differential between the two rates was about 3 percentage points, compared to the 5.6 percentage point difference today.

In the past 35 years, the unemployment rate has generally been considered a good measure of labor market health (Barnes et al. 2007). However, the recent improvements in the headline (U-3) unemployment rate have overstated the health of the entire labor market. The FOMC learned this firsthand. In 2012, the committee decided to target a 6-1/2 percent unemployment threshold to assure the market that rate rises would be delayed. Even with an unemployment rate nearing 5-1/2 percent, the FOMC has been forced to backtrack and recognize that this threshold was shortsighted. At the 2014 annual Jackson Hole conference, Janet Yellen acknowledged that the “decline in the unemployment rate over this period somewhat overstates the improvement in overall labor market conditions.”

In order to assess the amount of residual labor market slack, I created an aggregate distance function summing eight different labor market indicators’ distances from their pre-recession norms. Over the past year, Chair Yellen has mentioned that a variety of labor market indicators sit on her “labor market dashboard.” Drawing on her speeches over the past year, I gathered eight indicators that Chair Yellen has highlighted when discussing labor market health: total unemployment, total underemployment (U-6), the labor force participation rate, the share of the unemployed who have been out of work for over 27 weeks, the employment-population ratio, the hires rate, the quits rate, and the job openings rate. For each indicator, I calculated the pre-recession average (2005–2007) and then found the proportional magnitude of the indicator’s distance from current levels.

As demonstrated by the indicator, there is still significant labor market slack, even though it is diminishing. The indicator applies no weighting across its components and assumes no structural changes in the labor market when using the 2005–2007 pre-recession period as a baseline. However, it is interesting to examine the relationship between labor market underutilization and wage growth. By plotting this labor market indicator against growth in total private average hourly earnings, I drew a best-fit wage curve through the relationship between wage growth and labor market slack.
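The aggregate distance calculation described above can be sketched as follows. Only the U-3 (5.6 percent) and U-6 (11.2 percent) values appear in the text; all other current levels and all of the 2005–2007 pre-recession averages below are illustrative stand-ins for the actual BLS and JOLTS series.

```python
# Sketch of the unweighted aggregate-slack indicator described in the text.
# name: (pre-recession 2005-2007 average, current level) -- illustrative
# values except U-3 (5.6) and U-6 (11.2), which the article cites.
indicators = {
    "unemployment (U-3)":          (4.8, 5.6),
    "underemployment (U-6)":       (8.3, 11.2),
    "participation rate":          (66.1, 62.9),
    "long-term unemployed share":  (17.5, 31.9),
    "employment-population ratio": (62.9, 59.2),
    "hires rate":                  (3.8, 3.6),
    "quits rate":                  (2.1, 1.9),
    "job openings rate":           (3.0, 3.3),
}

def proportional_distance(pre: float, cur: float) -> float:
    """Magnitude of the gap relative to the pre-recession norm."""
    return abs(cur - pre) / pre

# Sum the proportional distances across all eight indicators (no weighting).
aggregate_slack = sum(proportional_distance(pre, cur)
                      for pre, cur in indicators.values())
print(f"Aggregate distance from pre-recession norms: {aggregate_slack:.2f}")
```

A value of zero would mean every indicator had returned to its pre-recession norm; larger values indicate more residual slack. Note the unweighted sum lets one badly lagging series (here the long-term unemployed share) dominate the total, which is the weighting caveat raised above.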

Capitalism has been lauded as the market system best able to deliver better prices and quality for consumers. Fast-forward to the modern day, and capitalism increasingly means large “big box” discount retail stores buying up real estate, in stark contrast to small, family-owned businesses. Are these stores a sign of U.S. economic growth, or evidence that commercialism has overtaken the American Dream of immigrants starting their own businesses?

The importance of small business to the US economy

Big stores and name brands may have the greatest visibility in the US economy, but approximately 99.7 percent of all US employer firms are small businesses. Small businesses employ nearly half of the private workforce and since 1995 have created nearly two out of every three net new jobs in the United States. These small businesses include local mom-and-pop stores as well as private, local brand labels. They tend to have much smaller operational scale and hire fewer people than big retail stores, but because these stores are so plentiful in the local economy, they collectively form the largest employing sector in the US.

Aside from their role in the workforce, smaller businesses also do a better job of stimulating local economies than retail stores do. Small businesses, on average, return nearly half of their revenues to the local economy, whereas large chains inject only 14 percent of their revenues into it. Local businesses tend to buy products sourced geographically closer to their business, whereas larger retailers ship in products and supplies from more distant locales. To scale this out: spending $1,000,000 at local stores leads to $500,000 going back into the economy, which could also support at least 10 jobs, compared to the $140,000 that a large-scale retailer would recirculate. Include the multiplier effect, and it seems that smaller businesses can generate greater positive economic gains than retail stores do.
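The recirculation arithmetic above can be checked directly. The 50 percent and 14 percent shares are the figures cited in the text; the $1,000,000 spending amount is the text's own example.

```python
# Quick check of the local-recirculation arithmetic from the text:
# local businesses return ~50% of revenue locally, large chains ~14%.
spending = 1_000_000

local_share = 0.50
chain_share = 0.14

local_impact = spending * local_share
chain_impact = spending * chain_share

print(f"Local stores: ${local_impact:,.0f} recirculated")  # $500,000
print(f"Chain stores: ${chain_impact:,.0f} recirculated")  # $140,000
```

The gap, $360,000 per million spent before any multiplier effect, is the basis of the article's claim about local economic gains.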

The importance of retail stores to the US economy

Large corporations open up new local businesses. For instance, Apple in Cupertino, while employing 12,000 people, also created 60,000 additional jobs thanks to the creation of local businesses relating to Apple. In addition, the accompanying wealth created by corporations in an area has a multiplier effect in the economy, where each additional dollar of wealth can generate even greater sums of economic impact when people start spending it at gyms, grocery stores, local coffee shops, etc.

In addition, retail stores are signs of population growth and expansion in a city. For instance, the opening of a Heinen’s grocery in downtown Cleveland signaled that the retailers see potential in the revamp of the city’s population, even though Cleveland has not quite met the population threshold for major retailers to open branches in the city.

How they impact and interact

Because larger businesses can incur cost savings through economies of scale and vertical integration of their supply chain, they are able to provide lower prices than small businesses. Furthermore, they can deliver a greater variety and breadth of products. With the opening of the Heinen’s in downtown Cleveland, local small restaurant owners in the 5th Street Arcade, a popular lunch destination during the work week, are worried that they will start seeing decreased clientele as their customers stop by Heinen’s for lunch instead, due to Heinen’s greater selection of foods and cheaper prices.

On net, large retailers also tend to decrease the amount of employment in a region when they open. While a large store, such as Walmart, can easily create 320 jobs, its opening could also lead to 510 jobs lost through closed businesses. By these figures, for every job that a retail store contributes to the economy, approximately 1.6 jobs are lost.

However, while these statistics may paint a grim picture of retail, there could be potential upside. As explained, there are synergies that evolve from the opening of new retail stores in any given region, leading to perhaps the destruction of some jobs, but also the potential for new ones in the area. While the media and certain statistics may paint the image of big box retail taking over smaller businesses, there are also ways for both to coexist.

President Obama and the Castro regime in Cuba have recently begun talks about ending the trade embargo between the two countries, arguing that it is archaic and that both sides could benefit from a new trade agreement. Although Cuba is a communist state, something Castro is not willing to change, it still presents opportunity from an economic standpoint. Cuba is a country stuck in the 1960s (the embargo was put in place in 1961) that desperately needs new infrastructure. Because the embargo is written into law by Congress, it has proven hard for Obama to simply remove it completely. Instead, the Obama administration has had to propose ideas to Congress and remove sanctions on Cuba one at a time.

Recently Obama proposed a few rule changes to Congress, which capture his eagerness to take advantage of the economic opportunities Cuba has to offer. He proposed that the United States be allowed to export building materials, agricultural equipment, and telecommunication equipment. In addition to those requests, he also asked Congress to allow the United States to import Cuban cigars and rum.

This is definitely a step in the right direction. However, the U.S. government has to remember that although Cuba is anxious to renew ties with the United States, there may also be some skepticism about a new trade agreement, as the United States’ ultimate goal is obviously to transform the Cuban government into a democracy. When Castro made a speech outlining his demands of the United States at the Community of Latin American and Caribbean States summit, he stated that before a new agreement is reached, the United States must return Guantanamo Bay and pay reparations for all of the economic distress caused by the embargo. Some of this may be fair, but the United States does not have much to gain economically from Cuba, which has a GDP about the size of West Virginia’s. Thus, Cuba will probably end up conceding to any requests made by the United States.

Although the economic benefits for U.S. corporations in Cuba are relatively small compared to the size of the U.S. economy, there is still some upside for certain sectors and regions. For example, the port of Baton Rouge is expected to see a major economic impact benefitting the state of Louisiana due to the amount of wheat Cuba will potentially import. Since Cuba usually has to get its wheat from a distant source, the close proximity of Baton Rouge promises lower prices and efficient trading for Cubans. Also, with the increase in ships coming in and out of the port, Louisiana Commissioner of Agriculture and Forestry Mike Strain says, “docking fees and the purchase of groceries and supplies while the ships are docked in Baton Rouge will have an economic impact of $1 million annually.”

Some commentators say that although Cuba seems like a goldmine for corporations, given that it has been shut off from the world’s biggest superpower since 1961, the only thing U.S. corporations will find there is fool’s gold. The excitement about tapping the new Cuban market is simply a front behind which the U.S. government will begin to chip away at the Communist regime. The last attempt to dethrone the Castro regime came when the United States slapped the trade embargo on Cuba, which obviously didn’t work. Now it is up to the “economic promise” Cuba provides for the United States to start instilling the wonders of democracy in the minds of the Cubans.

In today’s global economy, every million counts. American corporations spend, or “lose,” hundreds of millions of dollars to taxation: in the United States, corporate income is taxed at 35 percent. The quick fix for U.S. companies seems to be the tax inversion, in which a U.S. company buys a foreign company and uses its newly acquired tax nationality to reduce tax costs globally. While the procedure is legal, many in the American political sphere have deemed it unpatriotic.

Burger King

Burger King may be one of the most famous American companies to seek a corporate tax haven. The fast food chain is in the process of completing its buyout of the Canadian coffee chain Tim Hortons. Burger King’s headquarters would be relocated from Florida to Canada, and the company would pay far less in taxes to Uncle Sam.

According to a critical report by the non-profit Americans for Tax Fairness, Burger King’s estimated savings from the merger are $117 million. Both Burger King and Tim Hortons will continue their operations separately; both will continue to use their respective brand names and serve the same products. The inversion changes the nationality of Burger King for tax purposes, and nothing else. The merged company would also have a shared official name, but that is merely a symbolic change.

Caveats

Tax inversion only works for companies that have significant revenues overseas. There are three main caveats to tax inversion as a quick fix. First, the United States has a policy of taxing foreign income, not just stateside profit. This is referred to as a “worldwide” system of taxation according to The Economist, and is a unique taxation system among the developed economies of the world. This worldwide tax essentially makes it so that U.S.-based companies pay 35 percent tax on money made anywhere in the world. Therefore, whatever Burger King makes from restaurants in Mexico, France, Lebanon or Bermuda would be under the same 35 percent tax rate. However, once Burger King finalizes the tax inversion process, it will be subject to the 35 percent rate for profit made within the US, 15 percent for the profit made in Canada, 0 percent for profit from Bermuda, and so on for any other country. For companies that make significant profit domestically, the tax inversion might not save much money.
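The difference between the worldwide and post-inversion (territorial) treatment described above can be sketched numerically. The 35, 15 and 0 percent rates are those quoted in the text; the per-country profit figures are hypothetical, and real foreign tax credits (which partially offset worldwide taxation) are ignored for simplicity.

```python
# Hypothetical profits by country, in millions of dollars.
profits = {"US": 500.0, "Canada": 200.0, "Bermuda": 100.0}
# Rates quoted in the text: 35% US, 15% Canada, 0% Bermuda.
rates = {"US": 0.35, "Canada": 0.15, "Bermuda": 0.00}

# Worldwide system: a U.S.-domiciled firm owes the U.S. rate on all profit
# (ignoring foreign tax credits, which offset part of this in practice).
worldwide_tax = sum(p * rates["US"] for p in profits.values())

# After inversion: each country's profit is taxed at that country's rate.
territorial_tax = sum(profits[c] * rates[c] for c in profits)

print(f"Worldwide:   ${worldwide_tax:.0f}M")
print(f"Territorial: ${territorial_tax:.0f}M")
print(f"Savings:     ${worldwide_tax - territorial_tax:.0f}M")
```

With these assumed numbers, the savings come entirely from the foreign slices of profit; shifting the hypothetical profit mix toward the US shrinks the savings toward zero, which is the caveat the paragraph ends on.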

Secondly, the Treasury Department has plans to make inversions less attractive going forward. Treasury Secretary Jacob Lew has added regulations to make inversions more difficult to accomplish. One example is preventing inverted companies from transferring cash or property from a controlled foreign corporation (CFC) directly to the new parent in order to completely avoid U.S. tax. Another is reinforcing the 80 percent rule, which requires that a U.S. company be worth less than 80 percent of the new merged company; the merger must happen with a foreign company that accounts for at least 20 percent of the new total value. While the rule was in effect before the new regulations were set, companies with a different distribution of shares were usually able to change the proportions to meet the rule; this will become more difficult now. Of course, many on Capitol Hill have argued that the onerous tax code, not a lack of regulations, is the source of the inversion phenomenon. While lawmakers are divided on how to improve the tax code, most have proposed cutting the top corporate rate. Republicans have proposed changing the worldwide tax to a stateside tax, also known as a territorial tax, which would mean that corporations pay the U.S. rate of 35 percent on domestic profit only.
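As a rough illustration of the 80 percent rule described above, the ownership test reduces to a simple proportion check. The company values below are hypothetical round numbers, not the actual deal figures.

```python
# Toy check of the 80 percent rule: after the merger, the former U.S.
# company must be worth less than 80% of the combined entity (i.e., the
# foreign partner must account for at least 20% of the total value).
def inversion_allowed(us_value: float, foreign_value: float) -> bool:
    """True if the U.S. company is under 80% of the merged entity's value."""
    combined = us_value + foreign_value
    return us_value / combined < 0.80

# A roughly even split (hypothetical $9B / $8B, in billions) qualifies...
assert inversion_allowed(9.0, 8.0)
# ...while absorbing a tiny foreign shell does not.
assert not inversion_allowed(9.0, 1.0)
```

This is why the pre-regulation workaround mentioned above involved reshuffling share proportions: moving value to the foreign side of the ledger is what flips the test.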

Third, and perhaps most important, is the public perception of tax inversion. Are U.S. companies unpatriotic for reducing some of the millions in taxes that flow into the Treasury? For some, the answer is clear. In 2014, Walgreens planned to merge with a Swiss company, a move that would have saved an estimated $783 million per year; the deal was called off because of the controversy and backlash it created. Still, companies need to compete in the global market, and U.S. companies are at a disadvantage relative to overseas competitors facing significantly lower tax rates. Some companies may decide that the negative publicity that comes with an inversion is more costly than the potential tax savings.

It hovered over the annual Dartmouth Homecoming Bonfire. “It’s a drone,” my friend explained. A drone? The only drones I’d ever really heard of were furtive aircraft used for reconnaissance missions and surveillance over enemy territory.

But as the Homecoming weekend came to a close and the green “18” finally rubbed off of my chest, I found the incredible video captured by this high-flying piece of technology. Looking further, I found that the drone market, like the footage I had just watched, is soaring. With sturdy frames and notable flight stability, these drones fly with surprising ability; GPS systems largely eliminate pilot error from the ground.

In response to the booming commercial drone market, companies like GoPro have established strong footholds within the industry. And while GoPro has tapped substantial profits from major drone producers, the producers themselves have emerged as the ultimate winners. Parrot, for example, a French tech company that specializes in drone production, marked a 130 percent spike in drone revenue.

So what is it that makes these drones so enticing? Drones offer a glimpse into the future of technology. They produce 3D landscape maps for agricultural research, giving farmers highly accurate aerial data. Companies like Amazon, the internet retail giant, hope to use drones in the delivery process. Some daring owners even flew their drone into an active volcano, and when the camera melted from the overwhelming heat, the drone, operating through a programmed safety feature, was able to return to its owner on its own.

While the civilian drone industry is booming, the military drone industry still dwarfs it. Currently, civilian drones make up only 11 percent of the drone industry, although analysts expect this share to grow to over 14 percent. Though these numbers seem low, in an aerial drone market expected to climb past $98 billion within the next decade, commercial drones would still hold an impressive $13.72 billion stake in earnings.

Although the fiscal outlook for the drone market is optimistic, obstacles remain. In recent dealings, the underfunded Federal Aviation Administration (FAA) has significantly handicapped the drone industry. Companies like Amazon have warned the U.S. government that they “will have no choice but to divert even more of [their] [drone] research and development resources abroad.”

On the FAA website, several rules restrict drone users, and ads litter the page explaining that “the Super Bowl is a no drone zone, so leave your drone at home.” Because these drones are classified as “model aircraft,” they fall under a specific set of rules: they must remain “below 400 feet, away from airports…and within sight of the operator.” Additionally, the FAA claims the authority to “take enforcement action against [those who]…endanger the safety of the national airspace system.”

Recently, in fact, a drone crashed onto the White House lawn, violating FAA rules that restrict flight over Washington, D.C. In an interview with CNN, President Obama even remarked that the incident only calls for more restrictive regulations on commercial drones.

And so, with the future of these drones unclear, we are left to grapple with two ends of a spectrum: on one end, a commercial drone industry with considerable potential in the technological world; on the other, the cautious yet considerably limiting FAA. As one entrepreneur told Fortune, “There’s still a lot of uncertainty, but the time for this industry is now.”

Unlike many central banks, the Federal Reserve has a dual mandate to promote maximum sustainable employment with stable prices. This dual mandate has been a guide for the Fed to implement accommodative monetary policy, especially during the poor economic and financial conditions over the past six years. Unfortunately, after targeting the Federal Funds Rate at the zero lower bound, completing Operation Twist, introducing new methods of forward guidance, creating two new interest rate modification tools, and implementing three rounds of quantitative easing, America’s economy is still underperforming. In order to fulfill its mandate, the Fed needs to expand its labor market perception of underemployment, consider targeting wage growth, and look towards new measures of labor market slack.

Over the past five years, the economy has consistently undershot the Federal Reserve’s two percent inflation target. In the last 70 months, the core Personal Consumption Expenditures price index (PCEPI), the Fed’s preferred inflation measure, has exceeded the two percent target in only three months. Furthermore, the Producer Price Index (PPI), a leading indicator of inflation, has remained well anchored under two percent. Even with $3.5 trillion in large-scale asset purchases and the federal funds rate held at the zero lower bound since 2008, inflationary pressures have remained absent. The Fed therefore faces no current conflict between its price and labor market goals.

With inflationary pressures out of the picture, the labor market has remained sluggish since the recession. In December 2012, the FOMC policy statement announced that the Committee had decided that an exceptionally low range for the federal funds rate “will be appropriate at least as long as the unemployment rate remains above 6-1/2 percent.” This new form of forward guidance was monumental; however, it proved shortsighted when the unemployment rate plummeted below the 6-1/2 percent threshold in less than a year, rendering it irrelevant. While the improvement in unemployment is positive, the rate fails to account for the significant amount of labor underutilization remaining in the economy.

Back in 2009, the economy was losing over 700,000 jobs per month, more jobs lost each month than there are residents in Vermont. Since the end of the recession, the private sector has regained over nine million jobs and the unemployment rate has fallen from 10.1 percent to 5.9 percent. Despite this improvement, broader measures of underemployment suggest significant labor market slack: there are currently 7.1 million Americans working part-time who would prefer full-time work, 2.2 million who are marginally attached to the workforce, and 698,000 discouraged workers who are not searching for a job but would want work if the labor market were stronger. As Janet Yellen put it during the September FOMC press conference, “there are still too many people who want jobs but cannot find them, too many who are working part time but would prefer full-time work, and too many who are not searching for a job.”

The Fed’s dual mandate does not only encapsulate the U-3 measure of unemployment. In order to foster maximum employment, the Fed needs to consider a large number of cyclical and structural factors when gauging labor market slack.

The labor force participation rate is currently at a 36-year low of 62.7 percent. This is the lowest participation rate ever recorded for individuals between 25 and 29 years old and the lowest ever recorded for men. According to research from the Chicago Fed, the participation rate is three-quarters of a percentage point below predictions based on historical projections (Aaronson et al., 2014). This disparity indicates there is likely an extra margin of slack in labor markets beyond the U-3 unemployment rate.

Another indicator of labor market slack is wages. Over the past few years, wage growth has remained stagnant at around 2 to 2.25 percent in the private sector, well below pre-recession levels of 3 to 4 percent. Moreover, Aaronson and Jordan demonstrated that wages would have been almost a full percentage point higher in 2014 had pre-recession labor market conditions been restored, indicating that wage stagnation is another symptom of labor market slack. Wage growth is a good tool for measuring slack since it applies to all workers and captures both economic growth and inflationary pressures. For example, four percent nominal wage growth in the long run would reflect two percent inflation along with additional economic growth of up to two percent. Unlike unemployment, wage growth also includes pressures from underemployment, discouraged workers, and inactivity in the labor force, all of which influence labor supply.

Blanchflower and Posen (2014) propose that the FOMC could use wage growth as an intermediate target for the employment stabilization side of the Fed’s dual mandate. Unlike unemployment, the rate of wage inflation is subject to less distortion by such factors as inactivity and discouraged workers, while it encompasses influences of underemployment in the economy. This new perspective could be a more successful threshold the FOMC could implement in order to judge labor conditions.

However, wage targeting presents a variety of issues. For one, the Bureau of Labor Statistics compiles multiple measures of wage growth, including average hourly earnings, the employment cost index, unit labor costs, and median weekly earnings; private companies like ADP also publish their own measures. The Fed would have to adopt a transparent and explicit definition of wage growth before using a wage benchmark. There are also concerns about sticky wages. Wage pressures have been shown to be rigid, especially in America, where employers often wait until the fourth quarter to raise wages. Olivei and Tenreyro have examined how wage rigidity in America plays a significant role in monetary policy transmission and the impacts of monetary policy shocks. Rigidity in wages poses an interesting dilemma for the FOMC if it wishes to use wages as a benchmark: it would be difficult for the Fed to accurately gauge wage pressures on a monthly basis, and monetary policy shocks could become more frequent if the Fed misreads the appropriate level of wage pressures.

Targeting wages would be a useful tool, but continues to pose significant dilemmas in perfecting the labor market slack picture. At the Annual Jackson Hole monetary policy conference, Chair Yellen commented that the “decline in the unemployment rate over this period somewhat overstates the improvement in overall labor market conditions… our assessments of the degree of slack must be based on a wide range of variables.” In order to get a better sense of labor market underutilization, the Kansas City Federal Reserve has produced a Labor Markets Condition Index (LMCI). The Fed’s new index is a “dynamic factor model” that displays monthly changes in 19 labor market indicators. The Fed has just started to publish the LMCI on a monthly basis, and many economists have considered the weighted index to be a useful tool for assessing the change in labor market conditions.

In recent monthly FOMC policy statements, the Fed has included a section emphasizing that it will not raise short-term interest rates “until the outlook for the labor market has improved substantially in a context of price stability.” Even though the Fed attests that the LMCI captures a broader range of labor market activity than the unemployment rate, the index has a correlation of -0.96 with the unemployment rate; in practice, despite being designed to indicate broader measures of labor market slack, it largely mimics the unemployment rate’s movements.

Recognizing these issues, I put together my own index measure of labor market slack. Looking at broad measures of labor underutilization, the index takes into account the movements of nine labor market indicators that Chair Yellen has cited as her favorites for inspecting labor market health: the U-3 unemployment rate, U-6 underemployment rate, hires rate, quits rate, layoffs/discharges rate, job openings rate, long-term unemployed share of total unemployment, labor force participation rate, and average hourly earnings. From these nine indicators I created an aggregate distance function, a sum of squared percentage deviations, seen below:

((a_c – a_p)/a_p)^2 + ((b_c – b_p)/b_p)^2 + … + ((n_c – n_p)/n_p)^2

The subscript “c” indicates current conditions, and the subscript “p” represents pre-recession averages (2005 – 2007). By comparing these measures to their pre-recession norms, we can measure remaining labor market slack. Through this index, it is clear that there is still significant labor market slack (see below).
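The calculation itself is straightforward; a minimal sketch, using only three of the nine indicators for brevity and hypothetical placeholder values rather than the actual data behind the index:

```python
# Aggregate distance index: sum of squared percentage deviations of each
# indicator from its pre-recession (2005-2007) average. The values below
# are illustrative placeholders, not the series used in the article.
current       = {"u3": 5.9, "u6": 11.8, "participation": 62.7}
pre_recession = {"u3": 4.8, "u6": 8.5,  "participation": 66.1}

slack_index = sum(
    ((current[k] - pre_recession[k]) / pre_recession[k]) ** 2
    for k in current
)
print(round(slack_index, 3))  # larger values mean more remaining slack
```

Because each term is squared, deviations in either direction add to the index, and indicators far from their pre-recession norms dominate the total.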

Using an index similar to this could be a method for assessing labor market slack in order to determine the proper time to raise short-term interest rates. Regardless of the Fed’s choice in labor market indexes, many indicators point towards excessive labor market slack remaining in the economy. With inflation undershooting the two percent target, the Fed faces no conflict in goals, and can continue its accommodative monetary policy stance in order to promote employment and economic growth.

This past September, NASA announced two landmark contracts with domestic aerospace firms. Boeing and SpaceX, two of the largest and arguably greatest innovators in space technology over the past decade, walked away with $6.8 billion to finalize their capsules and thruster systems so that they may provide transport to the International Space Station (ISS) for U.S. astronauts by 2017. However, September’s announcement did more than step up NASA’s current space programs; it signaled an unprecedented move to privatize the space industry.

Private U.S. aerospace companies were eager to express their approval of these latest contracts, which vastly expanded earlier NASA initiatives. In 2009, NASA launched its Commercial Crew Program (CCP) in an effort to “stimulate the private sector to develop and demonstrate human spaceflight capabilities that could ultimately lead to the availability of commercial human spaceflight services.” Major players such as Boeing, SpaceX, and Sierra Nevada Corporation have been beneficiaries of the program, each receiving funding in the realm of $500 million.

The impetus behind this latest move has its roots in 2011, when the United States government terminated the shuttle program. Since then, U.S. astronauts’ only option for reaching the International Space Station has been to catch a ride with the Russians, an alternative that is far from perfect. The funding cuts handed down from Congress hardly cover the hefty $71 million per seat that the Russians are charging. From the onset, the steep price spurred NASA to investigate other options. Along with these fiscal motivations, growing tension between the United States and Russia, recently exemplified by the animosity over Ukraine, added momentum to the push to return the space industry to U.S. soil.

The decision to hand over space travel to the private sector is nothing short of a clear course change for NASA, and many argue it is a change in the right direction. In the September contracts, Chicago-based Boeing and Los Angeles-based SpaceX walked away with $4.2 billion and $2.6 billion respectively. The funds are earmarked for certification and final development of each company’s capsule: Dragon for SpaceX and CST-100 for Boeing. Both companies have proven they can innovate, and NASA believes the competition between them will alter the face of spaceflight.

In a recent interview, SpaceX’s CEO, Elon Musk, said that these contracts are absolutely about driving down costs and eliminating the United States dependency on Russia. As a relative newcomer to the field, Musk also pointed out that this contract helped secure a spot for SpaceX as a “key anchor tenant” in NASA’s plans for the future. NASA’s new initiative is a crucial next step for pioneering companies like SpaceX and will allow them to prove themselves as top innovators. The key, Elon Musk believes, is that you’ve got to be committed, especially when you’re competing with the likes of Boeing, who plans on bringing aboard Amazon CEO Jeff Bezos and his company Blue Origin to help with new rocketry. As NASA turns its attention towards deeper space missions to Mars and asteroids, the companies are investing for the long run.

Mr. Musk, along with both NASA and other private aerospace firms, seems to have his focus on both the long term and the big picture. After all, the long game is the nature of the space industry: nothing happens overnight, and nothing is cheap or simple. Years of planning are often required to put up a successful mission, and both SpaceX and Boeing have sought to break new ground. Both have created capsules that splash down like conventional capsules but can be reused, saving a great deal of time and the enormous costs of starting from scratch after each mission. SpaceX’s Grasshopper rocket has pulled off some impressive feats in testing, showing that it can launch, soar to great heights, and employ its guidance sensors to re-land on the same pad from which it launched. Each company is deeply committed to inventing space technology that will cut costs and increase the efficiency of leaving our atmosphere, which requires an escape velocity of about 25,000 MPH. Talk about a high-stakes buy-in.

No time since the Apollo age has been more exciting for the space industry. New ideas, fresh faces, and private companies are mixing it up with the old guard at NASA. Beyond the standard cast of characters in the established corporate world, some of the world’s most innovative billionaires have made substantial investments in private spaceflight, earning themselves a spot among the “space cowboys.” But new faces and bold innovation still need to come to terms with old problems and the inherent risks of space travel. The Challenger and Columbia disasters remain part of public consciousness and are a reminder of how wrong a mission can go. Such disasters have put a great deal of pressure on NASA to go unmanned whenever possible. This is especially the case given the aging fleet of refurbished rocket engines and other parts now used by some companies, a practice SpaceX is openly critical of. Like many, critics fear such stopgap measures will tarnish the privatization process as a whole.

In the past, NASA missions required a great deal of time, energy, and planning, causing long gaps between missions. The goal of privatization is to make this a thing of the past: to make spaceflight more commonplace, less expensive, and more accessible. The development of new rocketry, fueled in part by fierce competition, sets a feverish pace that could catapult the United States back into manned space missions and routine space transport. NASA’s ultimate goal is to reach a point where spaceflight is a possibility for more than just astronauts. In a recent press conference, NASA administrator Charles Bolden announced that the contracts with Boeing and SpaceX bring with them the “promise to give more people in America and around the world the opportunity to experience the wonder and exhilaration of spaceflight.” The private sector has the capability to make this goal a profitable reality, even with a ticket price well below the $71 million the competition is charging.