Answer: Aiming for an alpha/beta for OR2018, and a final release later in 2018. The timeline may change based on the number of developers involved (obviously).

Question: What is the message for someone getting involved with DSpace 6 now, considering the existing UIs will no longer be supported in DSpace 7?

Answer: Keep in mind that DSpace 7 is coming soon, so you may not want to make significant theming changes to a DSpace 6 UI (either JSPUI or XMLUI).

However, there are no data model changes planned between DSpace 6 and 7. So, custom metadata fields will automatically carry over into DSpace 7.

Configuration will also not change between DSpace 6 and 7, so configuration changes/tweaks will also work in DSpace 7.

Also worth noting: we always support the last three major releases. Currently that means DSpace 4, 5 and 6. So even when DSpace 7 is released, we'll still support 5 and 6 for some time. If you are on DSpace 6, expect to be supported for at least two years after the DSpace 7 release. There will be no rush to upgrade immediately, and you'll have plenty of time to plan your upgrade locally.

We are looking for:

More developers. Get in touch if your staff is interested in getting involved!

A designer: currently all UI work is using core/default Bootstrap themes. We're in search of a Bootstrap designer to provide some exemplar theme(s) for DSpace 7.

Comments

Ideas on improving testing for (minor) releases

Testing instances that stay up to date automatically with the latest changes on the branches of the versions we still support, e.g. test servers for 4_x, 5_x and 6_x.
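A scheduled (e.g. cron-driven) job that resets each test server's checkout to its branch tip would be one way to implement this. Below is a minimal sketch in Python; the branch names follow the DSpace naming convention, but the repository path is a placeholder, and a real deployment would also need to rebuild and redeploy each instance after syncing:

```python
import subprocess

# Maintenance branches we still support (names follow the DSpace
# branch naming convention; adjust to the actual repository).
BRANCHES = ["dspace-4_x", "dspace-5_x", "dspace-6_x"]

def sync_branch(repo_dir, branch):
    """Reset a local checkout of `branch` to its upstream tip."""
    subprocess.run(["git", "fetch", "origin"], cwd=repo_dir, check=True)
    subprocess.run(["git", "checkout", branch], cwd=repo_dir, check=True)
    subprocess.run(["git", "reset", "--hard", f"origin/{branch}"],
                   cwd=repo_dir, check=True)

# A cron job could call sync_branch() on each test server's checkout,
# then trigger that instance's rebuild/redeploy step (not shown here).
```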

Downtime or broken features on demo.dspace.org should be strictly limited by design, so that it can always serve its purpose as a solid and positive advertisement for the qualities of DSpace. At any point in time, somebody comparing DSpace to other software could be looking at demo, and if we mess up their first impression, they might be put off from investing more time in exploring DSpace. Maybe this could even mean we should take away (full) admin access on demo from everyone, and direct people to the test servers for that type of testing?

Before the start of the (mini) testathons for a minor release, it would be helpful if committers could flag the tests in the test plan that have a higher chance of breaking than others. The problem with the current test plan is that it contains a lot of tests, with no distinction between high-priority tests and ones that are a bit less important.

If finding volunteers for (human) testing is a bottleneck, we'd have to look into automating as many of the tests as we can. For example, checking that the sitemap does not return a 404 is an easy candidate for automation.
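As a sketch of what such an automated check might look like, here is a small Python probe. The `/sitemap` path and the base URL are illustrative assumptions; the actual endpoints to probe would come from the test plan:

```python
import urllib.error
import urllib.request

def url_status(url, timeout=10):
    """Return the HTTP status code a GET of `url` ends up with."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # urlopen raises on 4xx/5xx; the status code is still available.
        return e.code

def sitemap_ok(base_url, path="/sitemap"):
    """True if the sitemap answers with something other than a 404."""
    return url_status(base_url.rstrip("/") + path) != 404

# Usage (against any running DSpace instance), e.g.:
#   sitemap_ok("https://demo.dspace.org")
```

A nightly job running a handful of such probes against demo and the test servers could flag regressions without waiting for a testathon.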

I would love to see more automated tests. I think we mainly need someone (or an institution) to start drafting a plan/proposal for how to achieve this.

For example, I know Fedora has its own Jenkins and Sonar (both hosted/maintained by U of Maryland on behalf of the project), which automate building/testing Fedora after each commit (and I think it even auto-deploys to http://test.fcrepo.org, though that URL is not responding for me at the moment). Fedora also has its own shared integration test codebase in their "fcrepo-labs" GitHub org.

So, there are plenty of ways we could do something similar and create comparable scripts under DSpace-Labs: https://github.com/DSpace-Labs/. I think we really just need a volunteer or two to begin running with this concept and see what we can start to automate better. If anyone has ideas on this, please do get in touch. I'd definitely want to find ways to support this going forward (and I'm sure the DSpace Steering Group would be similarly interested).