When Hillary Clinton laid out her economic vision for her prospective presidency in a speech last July, she made sure to work in a shoutout to her husband’s economic record as president. “The results speak for themselves,” Clinton said. “Under President Clinton — I like the sound of that — America saw the longest peacetime expansion in our history.”

Now Clinton is doubling down on that message. On Sunday, she told voters in Kentucky that she would put her husband “in charge of revitalizing the economy” because “he knows how to do it.” Aides subsequently told The New York Times’ Amy Chozick that the former president would more specifically focus on parts of the country that are struggling.

Whatever Bill Clinton’s exact role in a Hillary Clinton administration would be, it’s no surprise that she is looking to tie herself to his economic legacy. Bill Clinton’s second term was the last time the U.S. economy was unequivocally strong; for most voters this November, it was the best economy they’ve ever known. But while Hillary Clinton wants voters to look back fondly on the first Clinton presidency, she should hope they don’t remember too much about what happened next.

Perhaps most importantly, the late 1990s were a period of shared prosperity. The strong labor market drove up wages for workers throughout the earnings ladder, while drawing in people who traditionally struggle to find work, such as convicted felons and the disabled. The racial wealth gap narrowed. Inequality continued to rise, but families of all income levels saw gains.

The bursting of the tech bubble in 2000, and the subsequent recession, revealed that the 1990s boom was, at least to some degree, a mirage, the result of cheap money and, in then-Fed Chairman Alan Greenspan’s famous phrase, “irrational exuberance.” The recession that followed the tech bust, however, was relatively mild. If that were the worst consequence of the Clinton era, it might seem a small price to pay for a decade of solid growth.

But the Clinton boom, and even some specific Clinton policies, also helped sow the seeds for the far more severe Great Recession of the late 2000s. Mortgage-backed securities and subprime loans weren’t invented in the 1990s, but they expanded greatly during the period, part of a broader “financialization” of the U.S. economy that contributed directly to the severity of the Great Recession. Critics on the right argue Clinton-administration policies promoting increased lending to low-income and minority applicants contributed to the subsequent bubble; critics on the left, including Bernie Sanders, argue that Clinton’s deregulation of the banking industry paved the way for the crisis.

Bill Clinton deserves, at most, a small sliver of the blame for the financial crisis. But he probably doesn’t deserve much credit for the late-’90s boom, either. The reality is, presidents have at best limited influence over the economy. Clinton’s economic policy was determinedly centrist: modest tax increases, free trade (including the signing of the North American Free Trade Agreement) and limited government regulation and spending (the latter due in part to the Republican Congress). Those policies no doubt affected the economy, for good or bad. But their impact pales in comparison to that of forces beyond Clinton’s control: the rise of the internet, the entrance of the baby boomers into their peak earning years, the “peace dividend” that came from the fall of the Soviet Union.

It is a stretch, then, for Hillary Clinton to argue that her husband — or anyone else — “knows how” to ensure a good economy. But there are still lessons to take from the late 1990s. Most importantly, that low unemployment is crucial to generating wage gains for low-income workers — and that a period of such low unemployment need not lead to runaway inflation. The surest way to create an economy that works for everyone is to make sure that anyone who wants one can have a job.

Time and a half

The Obama administration on Wednesday announced a major expansion of who qualifies for overtime pay under federal labor law. Under a new rule, which will take effect in December, virtually all workers earning less than $47,476 will qualify for overtime, meaning they must get paid time-and-a-half beyond 40 hours a week — even if they are on salary. The new threshold is more than double the current cutoff of $23,660, which hadn’t changed in more than a decade. (From now on, the threshold will rise automatically along with wages.)
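The time-and-a-half mechanics described above can be sketched in a few lines of code. This is an illustrative simplification, not the Labor Department's actual formula: the $47,476 threshold and the 40-hour workweek come from the rule, but the helper function, the sample salary and the assumption that a salaried worker's "regular rate" is simply annual salary divided by 52 weeks and 40 hours are mine.

```python
# Sketch of time-and-a-half overtime pay under the new salary threshold.
# Simplifying assumption: a salaried worker's regular hourly rate is
# annual salary / 52 weeks / 40 hours. Real FLSA calculations can differ.

OVERTIME_SALARY_THRESHOLD = 47_476  # new annual cutoff, effective December
WEEKS_PER_YEAR = 52
STANDARD_HOURS = 40

def weekly_pay(annual_salary, hours_worked):
    """Weekly pay for a salaried worker, adding time-and-a-half past
    40 hours when the salary falls under the overtime threshold."""
    base_weekly = annual_salary / WEEKS_PER_YEAR
    if annual_salary >= OVERTIME_SALARY_THRESHOLD or hours_worked <= STANDARD_HOURS:
        # Workers above the threshold (or anyone at 40 hours or fewer)
        # simply receive their base weekly salary
        return base_weekly
    base_hourly = base_weekly / STANDARD_HOURS
    overtime_hours = hours_worked - STANDARD_HOURS
    return base_weekly + 1.5 * base_hourly * overtime_hours

# A hypothetical $41,600-a-year worker (hourly equivalent: $20) who puts
# in 45 hours gets $800 base plus 5 overtime hours at $30 each.
print(weekly_pay(41_600, 45))  # 950.0
```

Under this sketch, the same worker doing the same 45 hours before the rule took effect would have received a flat $800, which is exactly the "working long hours without getting paid for it" the new rule targets.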

The scope of the change is huge: The Labor Department says the policy will extend overtime to an additional 4.2 million workers; other estimates put the number much higher. But no matter the number, workers shouldn’t necessarily expect a big boost to their paychecks. Some companies, no doubt, will raise pay (either by paying overtime or by raising workers’ salaries above the new threshold), but many more will likely rein in employees’ hours so they don’t work more than 40 hours a week.

In the short term, the new policy likely means more free time for workers and perhaps also more jobs for the economy, as companies hire additional employees to do the work formerly done as unpaid overtime. Over the longer run, many companies will likely adjust wages to account for the new overtime requirements — paying a lower base salary in order to offset overtime costs, leaving workers with pretty much the same take-home pay as before. But the new rule will definitely mean one big change for millions of workers: no more working long hours without getting paid for it.

The what economy?

As an economics writer, I sometimes feel like all anyone talks about these days is the “gig economy.” So it was a helpful corrective this week when a Pew survey found that nearly nine out of 10 Americans have never even heard the term. For the uninitiated: the gig economy is the latest catchall term for apps like Uber, Taskrabbit and Airbnb that provide a platform for individuals to offer their services to other individuals. (It’s not a very good name, but it’s at least better than the one it replaced, “sharing economy,” which was simply inaccurate. As we all learned in kindergarten, it doesn’t count as sharing if you demand something in return.)

Besides terminology, the Pew survey asked nearly 5,000 Americans about their use of various apps and other online services. Overall, the results weren’t too surprising: Users of apps such as Uber tend to be young, affluent urbanites, while a much larger group has used older services such as eBay and Craigslist. Depending on your perspective, the report showed either the amazingly rapid rise of online services or their still limited penetration. On the one hand, 85 percent of adults have never used a ride-hailing app, and a third have never even heard of them. On the other hand: Seven years after the launch of Uber, 15 percent of American adults have used it or a rival. That’s pretty incredible.

Number of the week

The typical woman working full time as a medical doctor earns $135,169 a year. The typical man? $209,596, or 55 percent more. That’s one of the widest wage gaps of any profession, according to a Wall Street Journal analysis this week. (Don’t miss the great interactive chart.)
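The "55 percent" figure follows directly from the two medians quoted above. A quick check of the arithmetic (the dollar figures are from the Journal analysis; the code itself is just an illustration):

```python
# Verifying the wage-gap arithmetic from the Journal figures quoted above.
median_male = 209_596    # typical man's annual pay, full-time doctor
median_female = 135_169  # typical woman's annual pay, full-time doctor

# How much more the typical man earns, as a share of the woman's pay
gap = (median_male - median_female) / median_female
print(f"{gap:.0%}")  # 55%
```

Note that the gap is expressed relative to the *woman's* pay; measured the other way (how much less the typical woman earns relative to the man), the same figures give about 36 percent.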

The Journal analysis finds that, in general, the wage gap is widest for high-paying, elite jobs. The finance industry features particularly wide disparities, for example. But the gap for doctors is particularly noteworthy because it’s comparatively easy to control for some of the factors that are often cited to explain the wage gap. Doctors have distinct specialties, for example, so it’s easy to distinguish between a man who has pursued a competitive, highly paid field such as orthopedic surgery and a woman who has chosen a lower-paying specialty such as pediatrics. Sure enough, gender differences in specialization do explain part of the gap — but not all of it. Nor do other factors such as hours worked or employment structure, according to Anthony Sasso, a health-policy professor who told the Journal he was “befuddled” by the gap.