We have received the board. I disabled the PCIe module in the device tree to avoid contention, and I can see that the external clock chip is outputting the required clock. Now I am looking at the software changes required for the PCIe module to support the external clock.
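For reference, the temporary disable is the usual status override in the board-level .dts — a sketch only, assuming the &pcie node label defined in imx6qdl.dtsi:

```dts
/* Board-level override: keep the PCIe node disabled while the
 * external clock bring-up is verified. The &pcie label is the
 * one defined in imx6qdl.dtsi. */
&pcie {
	status = "disabled";
};
```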

I have searched the forum and I can see a whole host of different threads with different advice, some of it targeting older kernel versions etc.

My question is: Is there example code, an app note, or a go-to forum post with the most up-to-date procedure for disabling the internal PCIe clock and enabling the use of the external reference clock?

We have received support from our supplier and from within NXP to understand why we were not seeing Gen2 reported from the driver.

The summary (as it stands just now) is that there is a bug in the speed reporting in pci-imx6.c.

If you read the PCIE_RC_LCSR register [IMX6QDRM.pdf, Rev 5, 06/2018, page 4254] once the system is fully booted, it holds the correct speed, even though the read during driver initialisation reported Gen1.

For example, we inserted two cards, one Gen1 and one Gen2, and read the following PCIE_RC_LCSR values once booted:

Gen1: 0x30110040

Gen2: 0x30120040

We used the devmem2 utility to read this value. For our Android build we had to enable the following in the kernel:

It recommends adjusting the ATEOVRD and MPLL_OVRD_IN_LO registers; however, it seems that on our original board, which uses the internal PCIe clock, these registers can only be read when a PCIe board is connected, otherwise the system hangs.

Studying the imx6qp dts and dtsi files does not help either. The schematic clearly shows an external clock source; however, the imx6qp.dtsi has the IMX6QDL_CLK_LVDS1_GATE setting, which I think is wrong and should be IMX6QDL_CLK_LVDS1_IN.
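If it helps anyone testing the same theory, the change being described would look roughly like this in a board overlay — a sketch only, assuming the clocks/clock-names ordering from imx6qdl.dtsi; swapping the "pcie_bus" source to IMX6QDL_CLK_LVDS1_IN is our hypothesis, not a confirmed fix:

```dts
/* Hypothesis: feed the "pcie_bus" clock from the LVDS1 pad as an
 * input (external clock chip) rather than the internally driven
 * LVDS1 gate. Ordering assumed from imx6qdl.dtsi. */
&pcie {
	clocks = <&clks IMX6QDL_CLK_PCIE_AXI>,
		 <&clks IMX6QDL_CLK_LVDS1_IN>,
		 <&clks IMX6QDL_CLK_PCIE_REF_125M>;
	clock-names = "pcie", "pcie_bus", "pcie_phy";
};
```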

I am running out of options.

Are there local support/apps engineers (central Scotland) we can access to help solve the problem?

As you can see, the code is only active for the IMX6QP case, not the IMX6Q case.

When we enable the same code for the IMX6Q we can see our PCIe board come up in Gen1 mode.

We are now investigating why it does not come up in Gen2 mode. Do you have any thoughts on this? Is this purely down to eye-diagram tweaking via swing settings etc. at this point, or is there some configuration we need to set?
