
The rise and fall of AMD

The conclusion of our two-part series on AMD. Part one covered AMD's attempts to transform itself from a second-source supplier of Intel designs into a chipmaker powerhouse in its own right.

Athlon 64, and AMD’s competitive peak

Overall, the Opteron's architecture was similar to K7’s but with two key differences. The first was that the CPU incorporated the system’s memory controller into the chip itself, which greatly reduced memory latency (albeit at the cost of some flexibility; new CPUs had to be introduced to take advantage of things like dual-channel memory and faster memory types like DDR2). This showed that AMD saw the benefits of incorporating more capability into the CPU itself, an instinct that would inform the later purchase of GPU maker ATI Technologies.

The K8’s biggest benefit for servers, though, was its 64-bit extensions. The extensions enabled AMD’s chips to run 64-bit operating systems that could address more than 4GB of memory at a time, but they didn’t sacrifice compatibility or speed when running then-standard 32-bit operating systems and applications. These extensions would go on to become the industry standard, beating out Intel’s alternate 64-bit Itanium architecture—Intel even licensed the AMD64 extensions for its own compatible x86-64 implementation. (Intel's initial approach could only run x86 code with an immense performance penalty.)
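For reference, the arithmetic behind that 4GB ceiling is straightforward; here is a quick back-of-the-envelope sketch in Python (an illustration only, not anything from AMD's or Intel's documentation):

# Addressable memory with 32-bit vs. 64-bit pointers
addr_32 = 2 ** 32   # bytes addressable by a 32-bit address
addr_64 = 2 ** 64   # bytes addressable by a 64-bit address
print(f"32-bit address space: {addr_32 / 2**30:.0f} GiB")   # 4 GiB
print(f"64-bit address space: {addr_64 / 2**60:.0f} EiB")   # 16 EiB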

The K8 architecture was successful on the desktop in the form of the Athlon 64 lineup, but it was the Opteron server variants that brought AMD real success in the high-margin market. By the time Intel introduced dual-core Xeons based on the company's Core architecture in September of 2006, AMD had snapped up an estimated 25 percent of the server market. AMD continued to iterate successfully on K8 for a few years, performing several architecture tweaks and manufacturing process upgrades and even helping to usher in the multicore era of computing with the Athlon 64 X2.

The Opteron CPU, and the K8 architecture upon which it was based, helped AMD break into some new and lucrative markets.

Despite technical successes, AMD's financial situation had become precarious. Processor unit sales were falling, and margins on most chips dropped quickly after 2000. AMD also had problems with producing too much inventory; in the second half of 2002, AMD actually had "to limit shipments and to accept receipt of product returns from certain customers," it announced, because the chips it made weren't selling fast enough. The company had a net loss of $61 million in 2001, $1.3 billion in 2002, and $274 million in 2003.

What was sucking away the company's money? It was those darned fabs, just as Raza had feared. In the company's 2001 10-K, AMD estimated, "construction and facilitation costs of Dresden Fab 30 will be approximately $2.3 billion when the facility is fully equipped by the end of 2003." There was also another $410 million for AMD Saxony, the joint venture and wholly owned subsidiary that managed the Dresden fab.

By the following year, AMD upped its estimated costs to fund Dresden to $2.5 billion and added that by the end of 2001, it had invested $1.8 billion. The estimated costs continued to rise, as per the 2003 10-K: "We currently estimate that the construction and facilitation costs of Fab 30 will be $2.6 billion when it is fully equipped by the end of 2005. As of December 29, 2002, we had invested $2.1 billion in AMD Saxony." That same year, AMD plowed ahead with a new Dresden fab ("Fab 36"), investing $440 million into it by the end of the year.

The money for these huge investments all relied on AMD's ability to sell chips, and AMD's ability to sell chips was made easier by its competitive edge over Intel. Unluckily for AMD, Intel didn't take this challenge lying down.

Intel resurgent

Intel's Core 2 Duo took the wind out of AMD's sails, and the company has never quite recovered.

AMD's high point was, in most respects, one of Intel's lowest. "Clearly [AMD] had a very competitive product in Opteron in particular," Intel spokesperson Bill Calder told Ars, "and there was a lot of consternation inside of Intel and a lot of work going around trying to correct the problem and trying to counter not only in the market but in the press. At the time, there was quite a bit of focus on the competitive threat from AMD, but it was also very much a rallying call inside of Intel."

Even as AMD was beating Intel soundly with the Opteron server parts, the AMD64 extensions, and the Athlon desktop parts, Intel was sowing the seeds that would eventually grow into one of the company's most resounding successes: the Core architecture. By 2003, it was becoming clear that the NetBurst architecture that powered the Pentium 4 wasn't performing as well as the company had hoped—Intel had hoped to push the chips' clock speeds all the way up to 10GHz, but even at 4GHz, the Pentium 4's heat and power consumption were causing reliability problems. These same heat and power issues also made NetBurst ill-suited for use in the growing laptop segment. Rather than modify the Pentium 4's architecture to work better in laptops, the company went back to the drawing board and assigned a small team in Israel to work on a project known as Banias. This chip would later go on to be known as the Pentium M, the basis of Intel's successful Centrino marketing push (Centrino bundled a Pentium M processor, an Intel chipset, and Intel 802.11b and 802.11g wireless adapters).

Pentium M didn't start from scratch; it instead went back to Intel's Pentium III architecture and modified it to increase performance and efficiency. Pentium M also refined power saving technologies like SpeedStep, which dynamically adjusted the CPU's clock speed and voltage depending on how heavily the chip was being used.

The CPU was such a success for Intel in laptops that, when the NetBurst architecture's time was up, the company set about adapting the Pentium M's architecture for desktops and servers as well. It ramped up Pentium M's clock speed, added 64-bit extensions (licensed, of course, from AMD), and added a second CPU core, which provided the basic ingredients for the Core 2 Duo (the original Core Duo and Core Solo were sold only in laptops and lacked 64-bit extensions—Core 2 Duo was this architecture's first foray into non-mobile form factors).

This Core architecture accomplished several important goals: it gave Intel a fast, power-efficient 64-bit Xeon in the server market to stem Opteron's tide; it took back the symbolically important performance crown in the desktop market; and it was much more power-efficient than AMD's laptop chips right at the time when laptops began to outsell desktops for the first time. (AMD's power consumption in laptops became competitive only recently with 2011's Llano and 2012's Trinity parts.)

The Core architecture hit AMD where it hurt, but the biggest damage to AMD's long-term health came from Intel's execution strategy. Beginning around the same time, Intel moved to a system of smaller but aggressively timed processor updates that it called "tick-tock."

Every year, Intel would introduce a new processor lineup—the "ticks" would gently tweak a CPU architecture and move it to a smaller, lower-power manufacturing process, while the "tocks" would remain on the established manufacturing process and introduce more drastic architectural changes. This system limits the risk that a new process or architecture will run into significant problems during the manufacturing stage, and new processor iterations can be introduced so quickly that a competitor with a superior architecture won't necessarily be able to stay on top for years, as AMD did with K8.

No single Intel architecture, Core included, left AMD behind all by itself, but Core 2 kicked off a relentless string of well-executed Intel CPUs. While AMD's CPUs continued to improve, over time they were once more shut out of the high-end market and forced to compete mainly on price, mirroring the company's early struggles. It also didn't help that, just as Intel was churning out its best products in years, AMD was trying to swallow another company whole.


Andrew Cunningham
Andrew wrote and edited tech news and reviews at Ars Technica from 2012 to 2017, where he still occasionally freelances; he is currently a lead editor at Wirecutter. He also records a weekly book podcast called Overdue. Twitter: @AndrewWrites

109 Reader Comments

It's worth adding that they've recently re-acquired two prime talents from Apple and put them in roles that could theoretically drive things forward significantly: Jim Keller in the middle of last year, and Raja Koduri last week. http://www.anandtech.com/show/6907/the- ... rns-to-amd

I wonder if that x86-64 license to Intel is paid per chip? That would have been the move to make.

Quote:

Ruiz explained to Ars that, when he became the CEO of the company in 2002, AMD was simply trying to do too many things. "When I joined the company we were involved in memory, in logic, in microprocessors, communication products—there was quite a bit of stuff going on,” he said. “And so I felt that the talented people were just spread a bit too much, and perhaps one of the reasons they couldn't figure out which way to kick the ball was that there were just too many things that the company was trying to do. One of the first things I tried to do was to pare down the product line."

Shades of pre-Jobs-comeback Apple.

Quote:

According to both reports at the time and to Ruiz’s own book, Nvidia was considered a potential acquisition target first, since the company had plenty of graphics experience and some of the best chipsets for AMD's K7 and K8-based CPUs. But Jen-Hsun Huang, Nvidia's outspoken CEO, wanted to run the combined company—a non-starter for AMD's leadership.

I abandoned AMD after the ATI purchase because of drivers and openness. (I am also a Linux person, for context.) ATI's drivers were always miserable. I remember having to take a machine apart to find out the exact string of letters and numbers after "Rage". Compare this to Nvidia's unified driver, which just worked no matter what you had. There was forever talk of open ATI drivers on Linux, but there was never a sustained track record. Nvidia by contrast had a binary-only driver, but they have a good track record of hardware and software support. I am now Intel everything - the source code for all parts is right there. They even did things like PowerTOP.

I'm just one person with a small circle of influence, but that resulted in Intel and Nvidia getting a few thousand dollars of revenue that would have gone to AMD in earlier days (I had the K6, K6-2, Athlon 64 etc). This illustrates that software matters even to companies that make only hardware.

Mostly good, but there is a common misperception: the AMD/Intel 'license' for x86-64 traces back to the court case that AMD lost for using the 8087 FPU design beyond what it was licensed for. Settling that dispute meant the judge gave Intel a perpetual license to AMD's IP at no cost to Intel. So license, yes, that is true. But it isn't a license in the sense normally thought of, where A pays B to use B's IP in A's products. AMD had to pay Intel, but Intel could use anything AMD added to the x86 IP. Good for Intel: AMD does the heavy lifting on x86-64, and Intel steps in when it looks like it will succeed. Of course, they should have stepped in much earlier, but that is a different comment, and they never listened to me anyway. No matter how loudly I shouted at my screen.

The ATI purchase was just one of the weirdest things to watch from the sidelines. A $2+B company buys a $5+B company and within 18-24 months the combined company is worth less than $2B. How is that even possible? It still boggles my mind.

Great read that gave me a lot of insight into the days of chips before I built my first rig. Core was already out when I started high school, so it was pretty much Wintel all the way.

The story reminds me of how Sony lost the MD fight: poor management, not planning far enough into the future, spending too much on research, and making bad decisions. It's funny how people never seem to learn from past failures.

Also, while there was some mention of Fusion, does the market at large believe in the future of CPU/GPU unification?

The idea is good. But Atom and other low end Intel CPU/GPU solutions are viable now when they weren't at the introduction of the concept of Fusion. By the time AMD got reasonable parts out the market had moved and now they have good products that have no market.

AMD does have a pretty big win in the PS4. If the APU in that pans out, I could see a future for AMD. I agree with nVidia's stance that we are practically at the point of 'good enough' CPUs, where the determining factor in the customer experience on a platform is dictated by the GPU. If the unified memory space in the PS4's APU, shared between the CPU and the GPU units, really does make for a better, 'snappier', graphics-ier, shinier system/development platform, then the APU could be the savior of AMD. They really do need good execution though, really good execution.

AMD is failing where it is, so the grass is greener elsewhere. In this article the elsewhere is in servers, the one place where AMD had a day in the sun a decade ago. Unfortunately AMD hasn’t had differentiating IP for server products in years, so the only value-add has been cheap-cheap-cheap in a market where the CPU is a small fraction of the end-product cost.

So now AMD’s salvation, funded by future EBIT from a console strategy that was created and executed under the previous management team, is going to be micro-servers based upon someone else’s IP.

If the preceding doesn’t give you pause, exactly how big is this micro-server market going to be? Is Intel going to give up and let someone carve off a large chunk of the overall server market to make it worthwhile for AMD and all of the other ARM licensees? Exactly what advantage will AMD have over everyone else in this sliver of a market to drive preferential market share?

It isn’t what was presented that should give someone pause, it is what is missing.

Intel won't let AMD die - there are just too many pesky government regulators who would start crawling all over Intel if it were in sole control of the market. In fact, in this weakened state, AMD is the perfect competitor to Intel - they're big enough to keep the government away, small enough to not be a huge pain in the rear.

Heck, should AMD spiral further, I'm sure Intel has contingency plans - whether that involves buying up a pile of AMD chips and burying them, selling a few patents to them or whatever. Heck, for all we know, the PS4 and Durango are using AMD parts BECAUSE Intel shooed Sony and Microsoft away. Having a steady demand for AMD's chips should keep AMD afloat for a few more years, right where Intel wants them. (And it makes some sense too - Intel has fab capacity - they could make custom chips for Sony and Microsoft without breaking a sweat. And while it's true AMD has better graphics, I'm sure something could've been done.)

And yes, this isn't as unusual as it seems - a lot of companies have very incestuous relationships with their competitors. Google needs Apple, and vice-versa. Likewise, Google needs Microsoft, and vice-versa. Microsoft needs Apple, etc. Apple needs Samsung, Samsung needs Apple. Nobody is pure - there are piles of agreements and other things.

And when these things are done, since Intel cannot hold stock in AMD, they probably have tons of shell companies to work around that. And I'm sure shareholders are just as happy - sure it means lower profit, but a little pain is a lot better than huge government pain. Hell, the government can split Intel into chip design and foundry, an outcome Intel will definitely not want, having invested a lot in fabs.

Also, while there was some mention of Fusion, does the market at large believe in the future of CPU/GPU unification?

The idea is good. But Atom and other low end Intel CPU/GPU solutions are viable now when they weren't at the introduction of the concept of Fusion. By the time AMD got reasonable parts out the market had moved and now they have good products that have no market.

As an IT engineer about to start an MBA, this is the kind of business case that I would like to see analyzed and dissected fully.

My impression is that, initially, AMD had vision and drive (Sanders), but not internal order or execution capacity. The next CEOs did try to bring some order and execution, but failed to whip the team into good shape. The Abu Dhabi deal seems well done, though I am not sure how wise it is to speak about their "tricky" negotiations openly.

The idea behind Fusion chips, combining CPU and GPU, seems prescient considering the integrated SoCs that we have today, but they arrived late to that party. And now that low-power performance is the name of the game, it seems they are arriving late as well.

Regarding their presence in the next consoles, I remember reading some comments from an Nvidia executive saying that it wasn't worth their time to get into the PS4 and the next Xbox, because it would mean diverting resources they are dedicating to their mobile SoCs, and it would also be a low-margin deal.

Different industries aside, it kind of reminds me what happened with Nokia.

If the preceding doesn’t give you pause, exactly how big is this micro-server market going to be? Is Intel going to give up and let someone carve off a large chunk of the overall server market to make it worthwhile for AMD and all of the other ARM licensees? Exactly what advantage will AMD have over everyone else in this sliver of a market to drive preferential market share?

It isn’t what was presented that should give someone pause, it is what is missing.

Quote:

Micro-servers could be a chance for AMD to get a toehold in a lucrative, high-margin market, something the company’s balance sheet desperately needs

The way I gather it, that particular market is untested for everyone, Intel included, although cutting out the middleman and selling directly to customers will help. Cheap, cheap, cheap (within reason) helps with the high-margin part (buy low, sell high).

Also, while there was some mention of Fusion, does the market at large believe in the future of CPU/GPU unification?

The idea is good. But Atom and other low end Intel CPU/GPU solutions are viable now when they weren't at the introduction of the concept of Fusion. By the time AMD got reasonable parts out the market had moved and now they have good products that have no market.

Not many businesses were willing to bet big on a small part from AMD.

You obviously have never used an Atom

AFAIK, the new 22nm Atom with out-of-order execution and Ivy Bridge graphics is due at the end of 2013 or the beginning of 2014.

I've been wondering what Intel is up to. They bought Fulcrum, which offers very low latency communications chips. And lately they're doing a foundry deal with Achronix, an FPGA startup.

Intel is pushing Fulcrum chips into various communications boxes, and its deal with Achronix may be nothing more than their continuing effort to become a foundry to help offset the enormous costs of running a fab.

But there is one other thing that these two companies have in common: they use asynchronous logic. This is a niche design technology that allows you to design large digital blocks, even entire processors, without using any clocks or flip-flops. (There are registers--but they are loaded when the data is ready, not by a clock). It is the secret to the very low latency that Fulcrum gets with its chips.

Another advantage of asynchronous logic is extremely low power, which is just the technology that Intel needs to enter the phone and tablet market.

I'm totally just guessing here, but if it turns out asynchronous logic design capability gives Intel a major advantage in chip design, then AMD is going to be in even more trouble.

According to the SEC's complaint against Dell, Intel paid the computer maker rebates as part of a deal in which Dell agreed not to use microchips manufactured by Intel's rival AMD. We're not talking small change: The payments totaled $4.3 billion between 2003 and 2006.

That's actually not what landed Dell in hot water. Instead, Dell was charged with defrauding its investors by pretending that those payments were operating income. The maneuver artificially inflated Dell's balance sheet and helped it beat Wall Street's earnings estimates for four years.

Also, while there was some mention of Fusion, does the market at large believe in the future of CPU/GPU unification?

The idea is good. But Atom and other low end Intel CPU/GPU solutions are viable now when they weren't at the introduction of the concept of Fusion. By the time AMD got reasonable parts out the market had moved and now they have good products that have no market.

Not many businesses were willing to bet big on a small part from AMD.

You obviously have never used an Atom

AFAIK, the new 22nm Atom with out-of-order execution and Ivy Bridge graphics is due at the end of 2013 or the beginning of 2014.

Too little, too late? The netbook era has passed, and ARM is closing the gap.

I've been wondering what Intel is up to. They bought Fulcrum, which offers very low latency communications chips. And lately they're doing a foundry deal with Achronix, an FPGA startup.....

I'm totally just guessing here, but if it turns out asynchronous logic design capability gives Intel a major advantage in chip design, then AMD is going to be in even more trouble.

It's a good guess, but what you should really be paying attention to is 2.5D & 3D die-stacking, through silicon vias (TSV), and packaging technologies to support that. Imagine stacking APUs and memory connected to 4096 bit wide memory buses which deliver 512GB/s throughput to those CPU & GPU cores, burning less power than a current stand-alone APU, but outperforming them by a significant factor.

The capability to do this is manufacturing-dependent, and guess who has the best semiconductor manufacturing technology in the world. That being said, something like the preceding might be a great cost-reduction step for those next-generation consoles. It's a race to get there; pick your horse.
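(For what it's worth, that 512GB/s figure is easy to sanity-check; here is a quick Python sketch, where the 1GT/s effective transfer rate is my own assumption rather than anything stated above:)

bus_width_bits = 4096
transfers_per_second = 1e9                   # assumed effective transfer rate (1 GT/s)
bytes_per_transfer = bus_width_bits // 8     # 512 bytes moved per transfer
bandwidth_gb_per_s = bytes_per_transfer * transfers_per_second / 1e9
print(f"{bandwidth_gb_per_s:.0f} GB/s")      # 512 GB/s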

I can't help but wonder what AMD would be like today with Nvidia's CEO Jen-Hsun Huang at the helm. I've always thought he was a good CEO.

It was always hard for me to wrap my head around why AMD would have thought it was a good idea to sell off its Imageon tech, especially for so little. This was possibly AMD's worst single decision in recent years. I thought of it as throwing away their future for peanuts.

I've been wondering what Intel is up to. They bought Fulcrum, which offers very low latency communications chips. And lately they're doing a foundry deal with Achronix, an FPGA startup.....

I'm totally just guessing here, but if it turns out asynchronous logic design capability gives Intel a major advantage in chip design, then AMD is going to be in even more trouble.

It's a good guess, but what you should really be paying attention to is 2.5D & 3D die-stacking, through silicon vias (TSV), and packaging technologies to support that. Imagine stacking APUs and memory connected to 4096 bit wide memory buses which deliver 512GB/s throughput to those CPU & GPU cores, burning less power than a current stand-alone APU, but outperforming them by a significant factor.

The capability to do this is manufacturing-dependent, and guess who has the best semiconductor manufacturing technology in the world. That being said, something like the preceding might be a great cost-reduction step for those next-generation consoles. It's a race to get there; pick your horse.

I'd say IBM has some competitive semiconductor tech. They're just not interested in playing in Intel's space.

Loved AMD's CPUs back in the day: better performance at a lower price, although every one I ever had ran hot but never failed due to heat stress. Intel finally did step up their game and left AMD, with all their flubs, in the dust. Interesting points on the merger between AMD and ATI and how they never did learn to play well together; if they had, I think AMD would be in much better shape today.

It's nice to see what happened with all the players in the CPU game summed up for us laypeople. Great article.

AMD does have a pretty big win in the PS4. If the APU in that pans out, I could see a future for AMD. I agree with nVidia's stance that we are practically at the point of 'good enough' CPUs, where the determining factor in the customer experience on a platform is dictated by the GPU. If the unified memory space in the PS4's APU, shared between the CPU and the GPU units, really does make for a better, 'snappier', graphics-ier, shinier system/development platform, then the APU could be the savior of AMD. They really do need good execution though, really good execution.

At the very worst, AMD will be carved up and sold off. I know that sounds horrible, but it's unlikely that any of their important product lines - important to the market, if not the consumer - will disappear. After reading these articles and knowing what I've known about AMD for a few years now, I'm not so certain that would be an entirely bad thing - except perhaps for the executives.

This illustrates that software matters even to companies that make only hardware.

ATI has always had terrible software/drivers. Every time I've tried an ATI part, I've ended up regretting it and gone back to Nvidia. Not because the hardware performance was bad, but because the software stack has always been a mess. It's improved quite a bit in recent years, but it still pales in comparison to what Nvidia delivers.

When one has a hot item, one exploits it. It is the resulting cash that is the problem: without a clear "next hot item," the cash will drift away. Like American Motors in the late '50s: they went from the successful Rambler to a full lineup. Over at Ford, Lee went from the Falcon to the Mustang and cashed in on the muscle car era.

As a former employee, I am really sad to see that AMD is stumbling so profoundly. I do believe they had a great idea (with Fusion), and if they had been able to execute, it would have really taken off much more quickly. I still am cautiously hopeful that they'll turn themselves around. It's incredibly important to have them in the market as a competitive force against Intel. Also, I love AMD GPUs and hope to see them continue to stick around.

It really is true what was said in the article. I was on the "red" side. Once our ATI office moved into the AMD Boxborough one, people were very often introduced as being from the "red side" or the "green side". They were very hands-off for a long time regarding our process. My impression was that they didn't want to mess up a good thing, since the GPUs were about the most successful thing for the company and were one of the only things making it money for a long time. There was effort to integrate the tool flows and environments but it took a long time, and individual teams did have a propensity to really push back against environment/flow changes. Often each site would do its own thing and didn't want to conform to what the higher-ups did. Standardizing was hard, and it was such a long-standing aspect of the culture that it probably held back the company a lot in terms of bringing IPs together.

I still felt proud working for AMD though. Even if they were struggling to make a dent in the market in some areas, they had an excellent corporate culture when it came to being good community neighbors, employee-centric programs and activities, and advocating for standards-based solutions and consumer-based messaging. And I also believe they were in the right when it came to making sure the market players were acting fair and not being anticompetitive. You could consider some of it drinking the corporate kool-aid on my part, but I also followed online discussions on hardware enthusiast sites quite closely and often times when they were discussing things that AMD were doing, there was more merit to it than whatever the detractors could come up with (in my experience).

It looks like they always suffered from lousy leadership: that extra fab, buying ATI for too much money, the late and botched merger, selling off the Imageon technology, and the list keeps going. Those are all management issues... Sucks to be them, I guess...

I don't think you're giving Bobcat its due. It's shipped over 50 million units, had the fastest production ramp of any product in AMD history, was AMD's first APU to reach production, and had great margins thanks to being a tiny die on a mature process. In fact, this comment is being typed on a 1.6GHz Zacate which is still happily chugging along two years later (in spite of being in a laptop made by HP).

Hopefully AMD can make a resurgence. The same issues that drove IBM to insist on AMD as a second supplier still stand, as x86 is still a large chunk of the overall market.

I have always tried to get AMD, as they offered good price/perf most of the time and comparable products.

My first replacement build was going to be a cheaper 2.7GHz Athlon X2 to replace my K7/P4 (I ran both at the same time), but I ended up getting a slightly slower E3300 as a gift.

I would have gone for AMD this time around if my needs were not so single-threaded, though my GPUs have all been ATI/AMD except for my FX5200.

I always liked the idea of supporting the underdog and getting a better price/perf.

Having invested in too many fabs and then spinning them off was a bad idea, IMO. The best approach would have been to keep a few fabs and sell off the rest. Then they wouldn't have had to take these recent charges related to GF and also provide the financing to support GF. The acquisition of ATI was a good move, but ATI was overvalued by half.

by the end of 2013, CEO Rory Read wants 20 percent of the company's revenue to come from other markets.

Getting AMD chips in the new Xbox and PlayStation will surely help in achieving that goal. I'm surprised that AMD CPUs and GPUs being in the new consoles wasn't mentioned in the article.

"Indeed, the graphics division produced one of the company's rare bits of recent good news: AMD will supply both the GPU and CPU for Sony's PlayStation 4 and is widely expected to do the same for Microsoft's next Xbox. AMD also supplies the GPU for the outgoing Wii and the struggling Wii U. Game consoles are a relative drop in the bucket compared even to the dwindling PC market (Microsoft has sold a little over 70 million Xbox 360s since 2005, and the PC market can generally meet or beat that number in a single quarter), but having its silicon in all three of the major contenders is a chunk of change and a good bit of publicity for AMD."

It really is true what was said in the article. I was on the "red" side. Once our ATI office moved into the AMD Boxborough one, people were very often introduced as being from the "red side" or the "green side". They were very hands-off for a long time regarding our process. My impression was that they didn't want to mess up a good thing, since the GPUs were about the most successful thing for the company and were one of the only things making it money for a long time. There was effort to integrate the tool flows and environments but it took a long time, and individual teams did have a propensity to really push back against environment/flow changes. Often each site would do its own thing and didn't want to conform to what the higher-ups did. Standardizing was hard, and it was such a long-standing aspect of the culture that it probably held back the company a lot in terms of bringing IPs together.

I once (unknowingly) joined a company just after a hostile takeover. The most disturbing thing (and the first hint of issues) was being asked the question: "which side are you from?" on my first day.

I think in that sort of scenario, trying not to rock the boat just allows things to fester. Takeovers and mergers are disruptive and upsetting. You can't avoid having some people be upset by them. Your best option is to control and direct the disruption.

A strong leader will interrupt everything and forcefully set the direction for the whole organisation. But they need to do it fast, like ripping a bandaid off. The people who don't like that direction might leave, but you'll lose some people regardless. If you're lucky, your leader will be charismatic enough to get the majority of people on board. Then the organisation can set out in their (new) direction with gusto.