$\begingroup$Note for reference: it's not at all clear that the RHS should mean that.$\endgroup$
– Raphael ♦ Oct 27 '17 at 10:42

$\begingroup$Referring to my answer, this pertains to reading the original expression as $O_{n,k}(n) + O_{n,k}(k) = O_{n,k}(n + k)$. This is probably the intended interpretation, e.g. if $n$ and $k$ hold some meaning in the context in which the expression appears. In isolation, it's an ambiguous and hence rather vacuous thing to write down; but that's of course neither prithvi parre's nor Yves Daoust's fault.$\endgroup$
– Raphael ♦ Oct 27 '17 at 10:54

Note how $n$ is a bound variable inside the set expression: it is only a label for the abstract argument of the functions in the set. Therefore,

$\qquad O(n) = O(k)$.
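Spelled out (assuming the usual single-variable definition $O(g) = \{ f \mid \exists c, n_0.\ \forall m \geq n_0.\ f(m) \leq c \cdot g(m) \}$, since the definition referenced above is not restated here), both sides denote literally the same set:

$\qquad O(n) = \{ f \mid \exists c, n_0.\ \forall m \geq n_0.\ f(m) \leq c \cdot m \} = O(k)$;

renaming the bound variable changes nothing.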

Note how, with this definition, it's completely unclear what $O(n + k)$ is supposed to mean. That is not (only) an oversight in the definition; as referenced here and here, there simply is no rigorous definition that does all we want. The use of multi-parameter $O$ in the literature is usually sloppy, if not to say wrong, and only to be read in a hand-waving kind of way.

Now, in algorithm analysis such terms do not appear on paper on their own. Say, $n$ is the number of nodes of the input graph, and $k$ its diameter. We may perform two subroutines, one that runs in time $O(n)$ (it doesn't depend on the diameter) and another that runs in time $O(k)$ (it doesn't depend on the number of nodes). Then, we sort of understand what $O(n + k)$ is supposed to mean.
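In this concrete setting, the intended reading can be spelled out: if the first subroutine takes $g(n) \leq c_1 \cdot n$ steps and the second $h(k) \leq c_2 \cdot k$ steps (for large enough inputs), then the total is

$\qquad g(n) + h(k) \leq c_1 n + c_2 k \leq \max(c_1, c_2) \cdot (n + k)$,

which is what "$O(n + k)$" is gesturing at.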

Now, back to your original expression. The question is, what's variable and what's constant? To make the point, consider $O(n + k^2)$ instead. We can read it as

- $O(n)$ if $k$ is constant,
- $O(k^2)$ if $n$ is constant, and
- unsimplified (with unclear meaning) if neither is constant.

To make this explicit, I write $O_x$ to indicate that $x$ is the variable used in the definition of $O$ here.
In the above example, we get

$O_n(n + k^2) = O_n(n) = O_x(x)$,

$O_k(n+k^2) = O_k(k^2) = O_x(x^2)$, and

$O_{n,k}(n + k^2) =\ ?$.

Here we immediately see that the three are different things!
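That the first two readings genuinely disagree can be checked against the definition:

$\qquad x \mapsto x^2 \in O_x(x^2) \setminus O_x(x)$,

since no constant $c$ satisfies $x^2 \leq c \cdot x$ for all large $x$.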

Now we have notation to write down what we really mean in our setting:

$\qquad O_{n,k}(n) + O_{n,k}(k) = O_{n,k}(n + k)$.

Here, $n$ and $k$ are both independent variables!
(Let's ignore for a moment that $O_{x, y}$ is not defined by the definition I give at the top; it's inherently single-parameter. We're hand-waving, as I said.)
Note how this is different from

$\qquad O_n(n) + O_k(k)$,

which is what just inserting the definition of $O$ in the straightforward way would give you; in general, $O_n(n) + O_k(k) \neq O_{n,k}(n+k)$.

While the "result" read as a two-variable $O$-term doesn't have a lot of meaning on its own, we can use it as a stepping stone! Once we fix a functional relationship between $n$ and $k$ -- we may investigate $k \sim n$ vs $k \sim \log n$ vs $k = 10$, for instance -- we get back to single-variable $O$, which is perfectly well-defined.
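Continuing the $O(n + k^2)$ example, each of these substitutions lands back in ordinary single-variable $O$:

$\qquad k \sim n$: $O_n(n + n^2) = O_n(n^2)$,

$\qquad k \sim \log n$: $O_n(n + \log^2 n) = O_n(n)$,

$\qquad k = 10$: $O_n(n + 100) = O_n(n)$.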

This is not a mathematically rigorous approach, but as long as we don't accidentally "simplify" away terms in the hand-wavy two-parameter world that we shouldn't, inserting the functional relationship afterwards gives the same result as doing the whole analysis with it in mind -- it's just a substitution. Some care has to be taken.