[DISCUSSION] Refactoring of Spark-related modules

Hi community,

Currently we have the spark-common and spark2 modules, and inside the spark2 module there are spark2.1 and spark2.2 folders.
This creates problems in the IDE for developers, and spark-common is no longer required since we have dropped Spark 1.5 support.
So I suggest we:
1. Merge the spark2 and spark-common modules, and name the result spark2
2. Move the spark2.1 folder to a new module called spark2.1 (it mainly holds CarbonSessionState)
3. Move the spark2.2 folder to a new module called spark2.2 (it mainly holds CarbonSessionState)

When building CarbonData, users can specify a build profile to select which modules to include.
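As a sketch of what the profile selection might look like in the parent pom.xml (the profile ids and module paths below are assumptions based on this proposal, not taken from the actual build files):

```xml
<!-- Sketch only: profile ids and module paths are illustrative assumptions -->
<profiles>
  <profile>
    <id>spark-2.1</id>
    <modules>
      <module>integration/spark2</module>
      <module>integration/spark2.1</module>
    </modules>
  </profile>
  <profile>
    <id>spark-2.2</id>
    <modules>
      <module>integration/spark2</module>
      <module>integration/spark2.2</module>
    </modules>
  </profile>
</profiles>
```

A user would then build with something like `mvn -Pspark-2.2 clean package`, and only the matching version-specific module would be compiled.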
Since CarbonSessionState is created by reflection from the spark2 module, this approach will not have a cyclic dependency problem.
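The reflection approach can be illustrated with a minimal Java sketch. The loader class and the target class name below are hypothetical; the point is only that the spark2 module needs the fully qualified class name at runtime, not a compile-time dependency on the spark2.1 or spark2.2 module:

```java
// Sketch: instantiating a version-specific class by reflection so the core
// module has no compile-time dependency on the module that defines it.
public class SessionStateLoader {

    // Load and instantiate the class by its fully qualified name.
    // Only the name string is needed at compile time, not the class itself.
    public static Object createSessionState(String className) throws Exception {
        Class<?> clazz = Class.forName(className);
        return clazz.getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        // A JDK class is used here purely to demonstrate the mechanism;
        // in CarbonData the name would come from the selected Spark version.
        Object o = createSessionState("java.util.ArrayList");
        System.out.println(o.getClass().getName()); // prints "java.util.ArrayList"
    }
}
```

Because the dependency only flows one way (the version-specific module depends on spark2, while spark2 refers to the version-specific class solely by name), no cycle can form.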

Re: [DISCUSSION] Refactoring of Spark-related modules

Hi Jacky,

I don't think it's a good idea to create new modules for the spark2.1 and
spark2.2 versions. We should not create a module for every Spark minor
version. Earlier we had the spark and spark2 modules because of a major
version change, and a lot of interfaces changed along with it. For a major
version we can create a module, but not for every minor version. Regarding
the IDE issue, it is just a matter of developers understanding how to
switch versions, so we can add an FAQ for that and also look into a
solution for the IDE problem.

As for merging the spark2 and spark-common modules: there is no harm in
keeping all the RDDs in the common package, since supporting a future
major Spark version such as Spark 3.0 may require separating the modules
again.

Re: [DISCUSSION] Refactoring of Spark-related modules

> Hi Jacky,
>
> I don't think it's a good idea to create new modules for the spark2.1 and
> spark2.2 versions. We should not create a module for every Spark minor
> version. Earlier we had the spark and spark2 modules because of a major
> version change, and a lot of interfaces changed along with it. For a major
> version we can create a module, but not for every minor version. Regarding
> the IDE issue, it is just a matter of developers understanding how to
> switch versions, so we can add an FAQ for that and also look into a
> solution for the IDE problem.
>
> As for merging the spark2 and spark-common modules: there is no harm in
> keeping all the RDDs in the common package, since supporting a future
> major Spark version such as Spark 3.0 may require separating the modules
> again.
>
> Regards,
> Ravindra.
>
> On 6 December 2017 at 11:00, wyphao.2007 <[hidden email]> wrote:
>
> > +1
> >
> > On 6 December 2017 at 11:44, "岑玉海" <[hidden email]> wrote:
> >
> > +1
> >
> >
> > Best regards!
> > Yuhai Cen
> >
> >
> > On 6 December 2017 at 11:43, David CaiQiang <[hidden email]> wrote:
> > +1
> >
> > -----
> > Best Regards
> > David Cai
> > --
> > Sent from: http://apache-carbondata-dev-mailing-list-archive.1130556.n5.nabble.com/
> >
>
>
>
> --
> Thanks & Regards,
> Ravi
>