Well, if I understand your question correctly, then the answer is yes. As 2-D and 3-D waves spread out to greater distances, their amplitudes must decrease because energy is conserved. In fact, this is why a radio signal gets weaker as you move away from the source of the signal.

This may possibly be a homework question, so I can't give you a full derivation, but maybe I can set you off on the right path and you can derive it yourself.

Start with a spherical wave from a point source, which has an intensity [tex]I_o[/tex] at a distance [tex]r_o[/tex] from the source.

Now, remember that: [tex]I=P/A[/tex] where P is power and A is area of the surface the wave is on. Since energy is conserved, power must also be conserved, so we have:

[tex]I_o=P/A_o[/tex] and [tex]I_1=P/A_1[/tex], where [tex]I_1[/tex] is the intensity at some farther distance [tex]r_1[/tex]. Now, can you use these equations and the expressions for the surface areas of the wavefronts at [tex]r_o[/tex] and [tex]r_1[/tex] to find the ratio of [tex]I_1[/tex] to [tex]I_o[/tex]? If you can, then how do you relate amplitude to intensity?
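The setup above can be sketched numerically. This is just an illustration with made-up numbers (the power and distances are arbitrary, not from the problem); the point is that the intensity ratio depends only on the distances:

```python
import math

# Arbitrary illustrative values: a point source radiating total power P,
# observed at two distances r_o and r_1.
P = 100.0            # total radiated power (W), conserved
r_o, r_1 = 1.0, 3.0  # distances from the point source (m)

# Spherical wavefronts: the same power P spreads over a sphere of area 4*pi*r^2.
A_o = 4 * math.pi * r_o**2
A_1 = 4 * math.pi * r_1**2

I_o = P / A_o
I_1 = P / A_1

# The ratio depends only on the distances: I_1/I_o = (r_o/r_1)^2,
# which is the inverse square law.
ratio = I_1 / I_o
print(ratio)             # 1/9 when r_1 = 3*r_o
print((r_o / r_1) ** 2)  # same value
```

Note that the power P cancels out of the ratio, which is why the inverse square law holds regardless of the source strength.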

[tex]I = \text{constant} \times \text{amplitude}^2[/tex]

Still, I don't think one can go that easily. The relation between amplitude and intensity is derived in my book (and other books at my level) using 1-D wave motion. How can we say that it remains true in 2 and 3 dimensions?

I have already thought of something, but can you tell me whether this is true:
the average energy passing through a point in any wave motion is always half the maximum energy passing through it (the maximum occurring when the phase is 0, 2π, etc.)?

[tex]I = \text{constant} \times \text{amplitude}^2[/tex]

Still, I don't think one can go that easily. The relation between amplitude and intensity is derived in my book (and other books at my level) using 1-D wave motion. How can we say that it remains true in 2 and 3 dimensions?

You can't; you need to make additional assumptions to carry out the derivation.

For example, the inverse square law assumes a source with spherical wavefronts. You can't apply the inverse square law to laser beams, whose wavefronts are nearly planar and spread very little.
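Putting the two ingredients together (conservation of power plus I ∝ amplitude²), one can sketch how the amplitude falls off with distance in each geometry. This assumes the wavefront area grows like r^(d−1), which holds for circular (2-D) and spherical (3-D) wavefronts but not for a collimated beam:

```python
import math

def amplitude_ratio(r_o, r_1, dimensions):
    """Amplitude at r_1 relative to the amplitude at r_o, assuming the
    wavefront area grows like r**(dimensions - 1) and that the intensity
    is proportional to amplitude squared."""
    intensity_ratio = (r_o / r_1) ** (dimensions - 1)
    return math.sqrt(intensity_ratio)

# 3-D (spherical wavefronts): I ~ 1/r^2, so amplitude ~ 1/r.
print(amplitude_ratio(1.0, 4.0, 3))  # 0.25

# 2-D (circular wavefronts, e.g. ripples on water): I ~ 1/r, amplitude ~ 1/sqrt(r).
print(amplitude_ratio(1.0, 4.0, 2))  # 0.5

# An idealized collimated laser beam has constant cross-section,
# so its amplitude does not decrease this way at all.
```

This is why ripples on a pond die out more slowly with distance than sound from a point source in open air, and why a laser stays bright over long distances.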