The "boot to black screen" issue as well as the broken sleep/wake have been fixed! Read below for details.

0. Introduction
Most of you may have noticed that the Buyer's Guide doesn't list a single compatible AMD Radeon card at the moment. This leads many people to assume they aren't compatible with OS X at all, but in fact Apple has included driver support for almost every relevant AMD graphics chipset from recent years, so there's a good chance your standard PC card will work out of the box!

However, before getting too excited, there are a few things to consider to achieve hassle-free long-term compatibility, so I've put together this guide to collect all the necessary information and to dispel some urban myths.

I'll explain how to find perfectly compatible cards and how to improve the overall experience on less compatible GPUs. This won't be a super-detailed step-by-step tutorial on every single aspect, but rather an overview of the most important topics with links to further information.

Note: I don't want any Nvidia vs. AMD discussions here: Both have their right to exist, and after reading the GPU recommendation section you'll know the pros and cons of Radeons so you can choose yourself.

1. General Information
To choose a compatible graphics card, it is helpful to know some details about the operating principles of the AMD drivers. Obviously I have never seen the source code of those drivers, so my knowledge isn't as deep as it could be, but for the scope of this guide it'll be fine.

AMD Kernel Extensions
The AMD drivers consist of a lot of kernel extensions, two of which are particularly interesting for the (possible) OS X compatibility of a specific GPU: AMD[5,6,7,8,9]000Controller.kext (one for each AMD Radeon GPU family) and AMDRadeonX[3,4]000.kext. The former is responsible for setting up basic 2D operation, setting correct resolutions, handling the connection ports of your card (routing the signals to the correct ports, detecting hot-plug actions, handling multiple screens) and more, while the latter is mainly responsible for the 3D acceleration of your card.
Both types of kexts contain a list of PCI device IDs in their Info.plist to detect and properly initialize connected GPUs. Having the device ID of a specific card in both files is necessary but not sufficient to make it work! Apple adds some device IDs every now and then, and while some of the added cards will actually work, others might have awfully buggy drivers because Apple added the IDs just for testing purposes. Do not buy a card assuming it's fully compatible just because you see the device ID in there!
There are some cards which are very similar to supported cards but still don't have their device ID in the drivers. They can usually be enabled either by modifying the Info.plist of both kernel extensions (not recommended, because the changes will be lost on the next OS X update and you'll break the kext signing) or by spoofing your GPU's device ID with Clover.
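For illustration, the device ID whitelist lives in the IOPCIMatch key of each kext's IOKit personality inside the Info.plist. A shortened, hypothetical entry (the IDs shown here are examples, not the real shipped list) looks like this:

```xml
<key>IOPCIMatch</key>
<string>0x67981002 0x68101002 0x68111002</string>
```

Each entry is the 16-bit device ID followed by AMD's vendor ID 0x1002; adding your own ID here is exactly the (not recommended) Info.plist modification mentioned above.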

Initialization Process
If the PCIe device ID of the installed card has a match in the Info.plist files, it can be initialized. There are two ways to make that happen; I'll call them EFI-Init and Auto-Init.

EFI-Init: On a genuine Mac, the graphics card carries an EFI ROM right next to the legacy vBIOS on the EEPROM chip. On startup, this EFI binary is executed by the machine's EFI, which introduces the card to the system and thus initializes the driver. On a Hack, this can be mimicked by the bootloader's graphics injection functionality. This way you can tell the system which framebuffer to use (see next section) or what the card should be called in the System Profiler. Some of you will also remember tools like ATY_Init.kext, which could basically do the same. Those are no longer relevant.

Auto-Init: With Lion, a new feature was added: automatic initialization of the drivers. Apple gave the system the ability to initialize a PCI graphics card just by its legacy vBIOS, without the need for any (faked) EFI stuff. In the beginning this had some compatibility issues, but as of Yosemite one can say that in most cases it works as well as the classic EFI-based injection method.

For Yosemite this means that the correct driver for your GPU will automatically be launched at startup if the device ID is in there. You don't have to do any setup for this to happen, but please remember that the drivers being loaded might still be crap and only give you a garbled or black screen.

Framebuffers
Each of the AMDx000Controller.kexts contains a set of so-called framebuffers for one GPU family. For the scope of this guide you can think of them as a set of port mapping tables, which tell the driver how each port of the graphics card is physically linked to the GPU chip. This is important, because a mismatch might result in a black screen, system freezes or a lack of certain features (e.g. no hot-plug detection or no audio).
Most of those framebuffers are Apple-specific (e.g. 1 LVDS port and 2 DisplayPorts for a MacBook), but in the past Apple was kind enough to include some framebuffers for generic PC video cards even though they didn't use any themselves. Those framebuffers usually match AMD's reference layouts (see below), so they won't help you in case of an odd custom-design card.
When using the graphics injection functionality of your bootloader, you'll tell the system (implicitly or explicitly) to use one of those framebuffers. This is fine if your card has a perfect match, but for non-reference cards usually you won't have much luck.
If you don't inject a specific framebuffer but rely on the driver's auto-init feature, the driver will fall back to the generic, so-called RadeonFramebuffer. This framebuffer is dynamically built from the card's vBIOS, so it will match any non-reference card. In the past there have been some issues with this approach, especially on multi-display setups and with certain apps (DVD Player, Steam), but as of today most of this seems to have been solved. Note that your card won't show up with its correct name, but will instead report something like "Radeon HD 7xxx" or "Radeon HD 5000 series". This is only cosmetic though and shows that you're currently using the RadeonFramebuffer; usually nothing to worry about.
If you want to use graphics injection on a non-reference card, you'll have to patch your framebuffer in the corresponding AMDx000Controller.kext binary. This can be either achieved directly on your filesystem (not recommended, see previous section) or on the fly with Clover.
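As a sketch, an on-the-fly Clover binary patch is declared under KernelAndKextPatches/KextsToPatch in the config.plist. The Find/Replace data below are placeholder bytes, not a real patch; take the actual hex values from the framebuffer patching guide:

```xml
<key>KernelAndKextPatches</key>
<dict>
	<key>KextsToPatch</key>
	<array>
		<dict>
			<key>Name</key>
			<string>AMD7000Controller</string>
			<key>Comment</key>
			<string>Framebuffer port layout patch (placeholder bytes!)</string>
			<key>Find</key>
			<data>AAAAAAAAAAA=</data>
			<key>Replace</key>
			<data>AAAAAAAAAAA=</data>
		</dict>
	</array>
</dict>
```

The Find/Replace values are base64-encoded binary in plist notation; Clover searches the named kext's binary for the Find bytes and swaps in the Replace bytes at boot, so nothing on your filesystem is modified.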
You can get a full list of AMD framebuffers in your OS using the PHP script provided in this thread.

2. Compatibility Chart
In this section I'll list the compatibility of the most relevant Radeon chipsets (as of Yosemite, if not stated otherwise). This also covers the best matching framebuffer for each card, the reference port layout and known issues. Please note that I obviously don't have the means to test every single card in every single vendor-specific variant myself, so there is always a chance that you'll encounter a specific card that doesn't behave as described here.
I'll focus on newer GPUs (HD 7000 and later), leaving out everything older than HD 5000, since most of those cards have already been dropped from OS X due to the lack of 64-bit drivers. I also won't cover mobile GPUs, uncommon variants or low-end office computer cards.

AMD product naming convention
Before starting, it is noteworthy that AMD has created a lot of confusion in the last years with their rebadging policy: The three latest GPU generations (by name: HD 7000 series, Rx 200 series and Rx 300 series) each contain GPUs with different chip architectures, which thus use different drivers, resulting in totally different OS X compatibility.
For example, a seemingly new R7 370 card uses the same 3-year-old GCN 1.0 based "Pitcairn" chip as an HD 7870, while the (similarly named) R9 380 uses the latest GCN 1.2 based "Tonga" chip. I'll try to list all of those doubles in the following compatibility chart. Additionally, I'll partition the list by GPU family to make clear which kext is in charge of driving which card.
To increase the confusion a little further, AMD has introduced HD 8000 cards for the OEM market, which are 1:1 rebrands again. Since they are quite rare, I won't cover them in this guide. If you're unsure about your HD 8xxx chipset, look it up on Wikipedia.

Definition of the term 'Reference Design / Layout'
The "reference design" is the PCB design AMD publishes when releasing a new card. Vendors like Sapphire or XFX can then decide to use this reference design for their cards or develop their own (somehow improved or cheaper) board layout.
In the past Apple has included Framebuffers for AMDs reference layouts which usually provide full functionality for those cards.
For the future this is quite unlikely, because Apple has abandoned the classic MacPro, which was the only Mac using standard PCI cards.
The physical ports of a card are not a sufficient criterion to decide whether it's a reference card or not. They're a good hint though, and the best you can get without dumping the card's vBIOS, so I'll include this information in the compatibility chart.

My personal opinion on this: If you need multi-screen support (especially Eyefinity = more than 2) and don't want to risk any trouble, get something as close to the reference layout as possible. If you're happy with one screen, get any variant you'd like to have; in most cases it will do the job just fine.

OOB = No means (unless otherwise stated) you'll have to spoof or add the device ID to enable support

'Framebuffer' is the framebuffer that matches the reference port layout best. Non-reference cards might work better with others.

'min OSX' refers to the OS X version which first added driver support for a specific chipset. The device ID might have been added in a later release, so your card might not work OOB on this version.

Note on XFX cards:
Many people are reporting problems with XFX cards, especially from the HD 7xxx / R9 generations. They use a custom vBIOS which can cause a crash upon booting, and this can't be fixed from within OS X.
A common solution is either flashing an alternate vBIOS onto your card (only do this if you can recover from a bad flash!) or using Clover to load a compatible vBIOS dynamically (without flashing).
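The dynamic route looks roughly like this in Clover's config.plist (a sketch: the vBIOS file itself goes into EFI/CLOVER/ROM, named after the card's vendor and device ID, e.g. 1002_679A.rom — check Clover's documentation for the exact naming convention for your revision):

```xml
<key>Graphics</key>
<dict>
	<key>LoadVBios</key>
	<true/>
</dict>
```

With LoadVBios enabled, Clover hands the driver the ROM file from disk instead of the one read from the card, so the card's own EEPROM stays untouched.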

Note on current AMD cards (3rd & 4th Gen GCN):
Many modern AMD GPUs are incorrectly initialized during the boot phase, which can lead to serious issues in OS X (e.g. boot to black screen or a crash after sleep/wake). This was first solved by the WhateverGreen Lilu plugin. Extensive research has been done by Mieze, resulting in a DSDT patch. This knowledge has been incorporated into Clover (starting with rev. 4296) and can be enabled from the config.plist like this:

Code:

<key>Graphics</key>
<dict>
	<key>RadeonDeInit</key>
	<true/>
</dict>

We recommend using the Clover solution.

Note on Fiji and Polaris 10 based GPUs (Deprecated since 10.12.6):
OS X still lacks native graphics acceleration for the R9 Nano / Fury / Fury X and RX 470 / 480. However, they can use the "Baffin" accelerator (e.g. by spoofing the device ID), which was originally made for Polaris 11 based GPUs (especially the Radeon Pro GPUs in the 2016 rMBP).

3. Finding or patching a compatible framebuffer
As stated before, choosing a card with reference layout should usually give you a 100% matching framebuffer out of the box. And even if your card differs from reference design or you've chosen a GPU without any "reference framebuffers" (e.g. R9 380), RadeonFramebuffer will do the job perfectly fine in most cases. But what to do if it doesn't?

Finding the best Framebuffer
A quite common way to find the best matching framebuffer is to inject every available framebuffer of a GPU family and manually test the compatibility of each port until you find one that suits your needs. While this might work, it's obviously a very time-consuming way to test compatibility, so I wouldn't recommend blind testing to anyone. It's way faster to analyze the compatibility of a specific framebuffer directly by having a look at your graphics card's vBIOS.
The mapping of a specific port is defined by four different identifiers: Encoder ID, Transmitter ID, Hotplug ID and Sense ID. All of them have to match for perfect compatibility. A mismatch of the hotplug ID will usually just disable the hot plug detection (you might need to put your computer to sleep to activate the port), while a mismatch of the other 3 IDs will result in a black or garbled screen. Getting this mapping from the BIOS is easy:

Run my fork of radeon_bios_decode (attached to this post). This will return the hotplug ID and sense ID. Note: Other tutorials claim that the hotplug ID equals the 'Connector index'. This might be true in special cases, but it's definitely not in general. Using my fork will reveal the real hotplug ID of each port.

The following screenshot shows the complete port mapping at the example of a Radeon HD 7950 Mac Edition with Hamachi framebuffer:

Each line of the framebuffer corresponds to one physical port, and as you can see, Hamachi matches perfectly! You'll certainly have noticed that the framebuffer contains a lot more information than just those 4 identifiers. I won't cover every single aspect here, just one short note on the connector type, which most other tutorials neglect: The first two bytes (byte-swapped!) of each line declare the type of the connector, e.g.:
0x4 => DualLink-DVI
0x200 => SingleLink-DVI
0x400 => (mini)DisplayPort
0x800 => HDMI
Due to the compatibility of (some of) those protocols, matching the connector type is way less important than one would think: For example, the DisplayPort specification guarantees backward compatibility with single-link DVI and HDMI. From my experience it's no problem to have a physical DVI port declared as DisplayPort as long as you don't exceed Full HD resolution. The same applies to some other combinations.

Patching the Framebuffer
If OS X doesn't offer a matching framebuffer for your card, you'll have to patch an existing one. This is covered in great detail in a separate guide.

4. Spoofing the Device ID
Due to popular demand, here's a short guide on enabling GPUs whose device ID is missing from the drivers, using Clover's "FakeID" feature. For the scope of this tutorial, let's assume you want to enable a Radeon R9 270, which isn't supported OOB. The device ID of this card is 0x6811.

First you have to find a similar graphics card using the same chipset which is supported OOB (have a look at the chart above). In our example this could be a Radeon R9 270X (0x6810).

The full ID of this graphics card consists of the device ID and the vendor ID (always 0x1002). So the full R9 270X ID is 0x68101002.

In Clover, you have to set:
- FakeID / ATI = full ID (here: 0x68101002)
- Inject / ATI = true
- FBName = framebuffer name (here: Futomaki; enter some random garbage in case you don't want to use a framebuffer, the driver will fall back to RadeonFramebuffer)
- FixDisplay = true
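Put together as config.plist fragments, the settings above look roughly like this (a sketch: FakeID lives under Devices, the injection settings under Graphics, and FixDisplay is one of the DSDT fixes under ACPI — older Clover revisions spell it with a bitmask suffix like FixDisplay_0100, so adjust to your version):

```xml
<key>ACPI</key>
<dict>
	<key>DSDT</key>
	<dict>
		<key>Fixes</key>
		<dict>
			<key>FixDisplay</key>
			<true/>
		</dict>
	</dict>
</dict>
<key>Devices</key>
<dict>
	<key>FakeID</key>
	<dict>
		<key>ATI</key>
		<string>0x68101002</string>
	</dict>
</dict>
<key>Graphics</key>
<dict>
	<key>Inject</key>
	<dict>
		<key>ATI</key>
		<true/>
	</dict>
	<key>FBName</key>
	<string>Futomaki</string>
</dict>
```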

On the next boot, you should have full acceleration with your unsupported GPU!
If it didn't work, you can verify the ID by looking in the System Profiler -> Graphics section. If the device ID listed there didn't change as expected, you might have a typo somewhere.

5. FAQ

Where can I download drivers for my Radeon XYZ?
AMD Radeon drivers are always provided by Apple as part of their regular system updates. You might find some dubious patched kexts on the internet which might be binary-compatible with your OS if you're lucky, but I'd advise against using those. Keep your system up to date and you'll always have the latest drivers. Understand what your problems are and apply the patches yourself with Clover so they'll survive system updates.
On the one hand this policy is a good thing, because you don’t have any hassle comparable to the Nvidia Web Drivers when a new OS X update arrives. On the other hand, it’s totally up to Apple when they include some new Radeons and which they choose to be supported OOB. Nvidia is usually a lot faster updating their Web Drivers to support new GPUs.

My Multi-screen setup won't work! What should I do?!
First, make sure all ports work with just one screen attached. If you want to use 3 or more displays, remember that you'll need native DisplayPort screens or active DisplayPort adapters (see http://support.amd.com/en-us/recommended/eyefinity-adapters)!
If you're using RadeonFramebuffer, try to inject a matching framebuffer (and vice versa).

I have found a Mac EFI ROM for my card on the internet. Will flashing it make my card more compatible / faster / better?
Nope, not at all. Your Hackintosh isn't able to utilize a MacPro EFI ROM, no matter whether your MoBo has a legacy BIOS or a modern UEFI, so it will fall back to the legacy vBIOS on your card either way. It won't change anything.

OS X boots up fine but everything is sluggish and/or the resolution is low!
Most likely the driver didn't load because the device ID of your card isn't in there (check the compatibility chart above). Spoof your device ID with Clover to match a similar, supported card or add your device ID to both kexts (not recommended).

My GPU shows up as 'HD 7xxx' (or similar)! Do I have to worry?
This just means that you're not injecting anything GPU related, no need to worry. Read the Framebuffer section in chapter 1 again. Other common placeholder strings are "HD 8xxx" or "R9 xxx".

Which Device ID does the Radeon XYZ have?
The vBIOS Database from TechPowerUp (Link: http://www.techpowerup.com/vgabios/) is always a good start. If you've already installed the card in your machine you might also consult the System Profiler.

What about El Capitan and Metal support?
As of today, there have been no significant driver changes in El Capitan compared to Yosemite.
Metal is supported starting with HD 7xxx series. Older cards are still supported, just without Metal.

How to enable dual cable 5K support (e.g. Dell UP2715K)?
It's necessary to edit AGDP, details are in post #1000. Thanks @LostVector for finding that out!

Do you have some additional information which you think should be included here? Found a mistake or an unclear explanation? Feel free to leave some feedback here!

Great thread, Fl0r!an.
Maybe you'll find some useful info on the R9 280X in my last guide. I tried framebuffer edits, but I ended up using the native one, with full support on every output (actually dual R9 280Xs) and HDMI audio as well (DSDT edits required, of course) through 10.10 up to 10.10.5.
Thanks!

I've been building Hackintoshes for the last 6 years or so and I've built 4 now. I always use Nvidia cards because I'm a fan boy and they are easy to get working (best of all, CUDA). The most recent system I built was on the X99 platform and it definitely was the toughest to iron out all the bugs. I built it with a GTX 780 Ti because it has more CUDA cores than the 980 and was $600 CAD.

I do a lot of video editing. It's 80% of my job, so I figured the 780 Ti would haul through 4K video. Well, it never has, and although it should be an amazing video editing card, it just isn't on OS X, no matter what anyone tells me. So I more recently sold it and "upgraded" to a GTX 980, mostly because of all the hype over how good the 970 and 980 are in OS X. Yes, they are easy to get working, and yes, Nvidia is awesome about quickly coming out with new drivers. But the 980 still struggled with 4K video in OS X. I'm not a noob. I did a fresh install of OS X and Clover. Still no improvement. So for kicks I decided to drop in an AMD card. I found an HD 7970 on Craigslist for $120.

HOLY CRAP! I can't believe what a difference it makes. Smooth OS X UI. Shreds 4K video editing. I honestly don't get it. Is it because it's natively supported in OS X? Is it because it handles OpenCL so much better? Obviously the 780 Ti and 980 stomp all over it in gaming on Windows, but WOW, in OS X it is awesome! And I'm an Nvidia fan boy, lol. I think I'll try to find another one or crossfire it with a 280X, since it's the same card. At least it'll give me some more power for gaming on Windows. But this is my honest opinion! It even runs better than my last system, which had a GTX 670 4GB, and that was natively supported in ML.

BTW: To get all ports working on the Gigabyte HD 7970 GHz I used the "Radeon" FB and ATI inject. Just throwing my thoughts and experience out there for anyone who is in my shoes. Thanks.
