In the section Create Folder Structure, I'm a bit confused as to what <labkey-home> should be.

I've created the directory /usr/local/labkey, and then inside it I've added apps, labkey, backups, src, and src/labkey, so now I have a total of 3 labkey directories. During the installation, there's no part that has me adding anything to /usr/local/labkey/labkey or /usr/local/labkey/src/labkey. I've added all the jars to tomcat-lib/, and then it says to copy files to the LabKey home directory, which I thought was going to be /usr/local/labkey. But I also noticed that the docs describe /usr/local/labkey/labkey as the actual install directory of the LabKey Server; I don't know what that means. The description also says that /usr/local/labkey/src/labkey is the place where you store the downloaded binaries, yet no step outlines anything related to either. So, for all the directories that need to be moved, for example labkeywebapp, modules, and pipelines-lib, do these go under /usr/local/labkey or /usr/local/labkey/labkey? Is /usr/local/labkey considered <labkey-home>, or is /usr/local/labkey/labkey? And for a Linux install, does anything go under /usr/local/labkey/src/labkey?

Tomcat and postgres are up and running, and I have worked through to step 4, move the LabKey Server configuration file. When I got to this step, there was no /conf/Catalina/localhost/ directory in my install - I created one and then put the labkey.xml there.

I then continued through to step 11, Tomcat is running, and the landing page comes up. But when I go to /labkey, it throws a 404 Not Found error:

Type Status Report

Message /labkey

Description The origin server did not find a current representation for the target resource or is not willing to disclose that one exists.

Is this a known issue? Is it unusual for the "/conf/Catalina/localhost" directory to not exist after an install? I could use any help debugging this.
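For anyone hitting the same thing: conf/Catalina/localhost is often created lazily by Tomcat the first time a context is deployed, so creating it by hand is fine. A minimal sketch of the expected layout (paths are assumptions; point CATALINA_HOME at your real Tomcat install):

```shell
# Fall back to a scratch dir so this sketch runs anywhere; use your real Tomcat home in practice
CATALINA_HOME="${CATALINA_HOME:-$(mktemp -d)}"
mkdir -p "$CATALINA_HOME/conf/Catalina/localhost"
# The context file's base name becomes the URL path: labkey.xml -> http://host:8080/labkey
touch "$CATALINA_HOME/conf/Catalina/localhost/labkey.xml"
ls "$CATALINA_HOME/conf/Catalina/localhost"
```

If the 404 persists after a restart, the catalina log usually shows whether the context deployed at all; a docBase in labkey.xml pointing at the wrong labkeywebapp directory is a common cause.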

We have upgraded from LabKey 18.1 to 18.2, and the connection to the external data source we had running (a MySQL database) is not working anymore.
I attach the log file for documentation of the error messages.

First of all, the error clearly stated that it is necessary to change a parameter value in the connection string from "zeroDateTimeBehavior=convertToNull" to "zeroDateTimeBehavior=CONVERT_TO_NULL", which we did. But that did not solve the issue.
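For reference, the connection string lives on the external data source's Resource definition in labkey.xml. A sketch of what the corrected entry might look like; the resource name, host, database, and credentials below are placeholders, and only the zeroDateTimeBehavior spelling comes from the error message:

```xml
<Resource name="jdbc/mysqlDataSource" auth="Container"
          type="javax.sql.DataSource"
          driverClassName="com.mysql.cj.jdbc.Driver"
          url="jdbc:mysql://dbhost:3306/mydb?zeroDateTimeBehavior=CONVERT_TO_NULL"
          username="mysqluser" password="***"/>
```

Since CONVERT_TO_NULL is the MySQL Connector/J 8 spelling (the 5.x driver expects convertToNull), it may also be worth confirming that the driver jar under tomcat/lib actually got upgraded along with LabKey; a new-style URL against an old driver, or vice versa, would keep failing.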

Hi, is there any way to obtain (via command line or GUI) the currently running LabKey version? When accessing Admin Console -> Module Information -> Module Details, the version number indicated is 17.30 instead of 17.3 (see attached file). As it will be used in an IQ document, we'd like to obtain the right value. Thanks,

I'm installing a fresh server using the provided installation file (LabKey17.3-55140.16-professional-bin.tar.gz), but when moving the LabKey Server libraries as indicated at https://www.labkey.org/Documentation/wiki-page.view?name=configTomcat, there is no ant.jar in tomcat-lib (as extracted from the provided .tar.gz file).
Should I get ant.jar from somewhere else?

Hope you are well. How does one 'tell' LabKey Server how to read RAW files? msConvert is installed in the pipeline root bin directory (C:\Labkey\bin, where X!, etc. are). The server (installed on Windows) works fine with mzXML files, but when attempting to run on RAW files, clicking the X! search button doesn't actually do anything. To me this suggests the server doesn't actually think it can trigger a conversion. How does one let LabKey know it has msConvert and is perfectly qualified to do its own conversions?

Note - we are NOT using the enterprise pipeline, so this is all being run on a single windows head computer.

With the help of the documentation at [https://www.labkey.org/Documentation/wiki-page.view?name=configLdap], I was able to get Labkey working with LDAP. However, Labkey's authentication GUI assumes a fairly simple LDAP configuration and it is currently allowing all our internal users and external customers to log in regardless. I would like to limit Labkey login to just the people we approve. For this purpose, I've set up an LDAP group called 'labkey'.

My current working Security Principal is uid=${uid},ou=Webusers,dc=bcgsc,dc=ca.

Our LDAP hierarchy looks as such:

Users are uid=${uid},ou=Webusers,dc=bcgsc,dc=ca

And the group is at cn=labkey,cn=Webgroups,ou=Groups,dc=bcgsc,dc=ca

Attached is the graphical representation of what I am working with.

How can I get this working with the following Context elements in labkey.xml?
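Not an answer to the labkey.xml question itself, but one way to sanity-check a group-restriction filter before wiring it into LabKey is with ldapsearch. The sketch below only builds a candidate filter string; memberOf is an assumption about your directory (many servers instead store member or uniqueMember on the group entry), and the ldapsearch call is commented out since it needs a live directory:

```shell
# Candidate filter: the user must match the uid AND belong to the 'labkey' group
FILTER='(&(uid=${uid})(memberOf=cn=labkey,cn=Webgroups,ou=Groups,dc=bcgsc,dc=ca))'
echo "$FILTER"
# Against the live directory, substituting a real uid (requires LDAP access):
# ldapsearch -x -H ldap://your-ldap-host -b "ou=Webusers,dc=bcgsc,dc=ca" \
#   '(&(uid=someuser)(memberOf=cn=labkey,cn=Webgroups,ou=Groups,dc=bcgsc,dc=ca))'
```

If the search returns a user only when they are in the labkey group, the same filter logic is what the LabKey-side configuration needs to express.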

Dear All,
I have tried to install LabKey Server but without any success.
When I enter the address localhost:8080/labkey I get this error in the Tomcat manager.
I use Java 8 and Tomcat 8.5 on Debian stretch.
If anyone can help I would be very grateful. I found the installation very complicated.

I have been compiling TPP 5.0 on our Linux cluster (which also runs LabKey Server 16.3) and having a lot of problems. But an unexpected problem occurred when I found out that the LabKey Server stopped working.
Below is the error I receive in the web browser.
Permissions on the logs folder are tomcat:tomcat.

java.lang.IllegalStateException: Error rolling labkey-errors.log file, likely a file permissions problem in CATALINA_HOME/logs
at org.labkey.api.module.ModuleLoader.rollErrorLogFile(ModuleLoader.java:604)
at org.labkey.api.module.ModuleLoader.doInit(ModuleLoader.java:262)
at org.labkey.api.module.ModuleLoader.doInit(ModuleLoader.java:318)
at org.labkey.api.module.ModuleLoader.init(ModuleLoader.java:240)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:279)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:260)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:105)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4855)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5549)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:652)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:677)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1912)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

Any ideas on how to fix this? The Tomcat default homepage loads fine, and I can see in postgres that the labkey database has been built. A file with the relevant labkey.log and labkey-errors.log entries is attached.
Thanks,
Jim
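One note on the rollErrorLogFile failure: even when the logs directory itself is owned tomcat:tomcat, an individual labkey-errors.log created by another user (say, root during the TPP build or debugging) can block the rename. A minimal sketch of the check and fix; the tomcat:tomcat owner is taken from the post above, and the chown is left as a comment since it needs root on the real server:

```shell
# Point this at your real Tomcat home; the mktemp fallback just makes the sketch runnable anywhere
CATALINA_HOME="${CATALINA_HOME:-$(mktemp -d)}"
mkdir -p "$CATALINA_HOME/logs"
# On the real server, as root:
#   chown -R tomcat:tomcat "$CATALINA_HOME/logs"
chmod u+rwx "$CATALINA_HOME/logs"
# LabKey must be able to create, write, and rename labkey-errors.log in here
test -w "$CATALINA_HOME/logs" && echo "logs writable"
```

Running ls -l on the real logs directory will show whether any individual log file has the wrong owner.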

Hi,
I'm doing some analysis of our site users and am trying to make a view that shows the date a user was "deactivated" from the system. I can see when the account was "created" and when the "Last Login" was ... but under "grid views", is there a way to see when someone's account was "deactivated" or "disabled"?

Thanks!

-- Nat

Apologies if this belongs in the Community Forum ... please let me know if it does. thx.

Hi,
We have run into an issue around sending bulk emails this week where the outbound queue of our SMTP server is *very* slow. Currently at almost 36 hrs after the original announcement was made from LabKey, there are about 1k emails still in outbound queue. The list is slowly shrinking ...

Our user base of emails is now almost 9K in size, so not small anymore but not huge either. Our IT dept recommended we send out bulk emails in 1K batches (5 min apart), or put a 5 sec delay between individual emails ... oye, we're not getting into manual management of batch emails...

I've pushed on them regarding whether there are controls/adjustability on the SMTP server to automatically do this, and I'm waiting to hear back. They also suggested I contact LabKey as to whether there is any outbound email functionality built into this product.

I did search the support base prior to sending this but couldn't find anything related.

I searched around, but couldn't find an answer to this exact question.

We have a system that outputs excel files. The top of the file contains two columns of metadata, comprising six rows. There is a blank row directly below, and then on row 8, the data we wish to store begins.

Row 8 contains the headers we want in our data domain, so our data fields have been designed to match these. However, when I upload a file, Labkey appears to check the first row for header information (which only contains two non-blank columns). Thus, the file does not upload successfully.

Is there a way to tell Labkey to skip a few lines when searching for the headers? I saw on another post, someone suggested prepending the first seven lines with "#", but I am wondering if Labkey can be configured to ignore these automatically.
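If no such setting turns up, a pre-upload cleanup is straightforward once the sheet is exported to TSV/CSV. A sketch with a made-up file; the only real logic is tail -n +8, which keeps row 8 (the header) onward:

```shell
cd "$(mktemp -d)"
# Simulated export: six metadata rows, one blank row, then the real header on row 8
printf 'meta1\tvalue1\nmeta2\tvalue2\nmeta3\tvalue3\nmeta4\tvalue4\nmeta5\tvalue5\nmeta6\tvalue6\n\nSampleId\tResult\nS1\t42\n' > export.tsv
tail -n +8 export.tsv > cleaned.tsv   # drop rows 1-7 so the header row comes first
head -n 1 cleaned.tsv
```

head prints the header row, so the cleaned file now starts exactly where the data domain expects it to.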

We are splitting one of our LabKey sites in two. One will be for customers and the other for internal lab use. They and their databases will be hosted on separate VMs; however, we want them to share the same directory for file storage -- the same one they were using before the split. The idea is to allow us to reference the same data from both sites without duplicating storage.
1. Does anyone see a problem with this?
2. If we remove a project from one site would it delete the data files even if we wanted to continue to access them from the other site?

We have an assay designed which has a field for gene ontology. This can get very long (above 4000 characters) and one of our users was wondering if there was a variable somewhere in the settings that would allow us to increase the maximum number of characters for Text fields. Assuming our database can handle such a change, is this something that could be done? Moreover, would this still allow us to use built-in assay types?

I have just performed a new install of LabKey 16.3 on Windows Server 2012 R2. Everything looks good except that the home page is editable when I'm not signed in. The home folder permissions are set for "all site users" and "guests" as reader and nothing else. See attachments. I have another install where the home page is not editable and appears to have the same permissions set. That install is version 15.30.

We want to use LabKey in a university research environment in epidemiology.
We already have a computing server to run R, and we want to add a new LabKey server to our environment. This tool seems strong but a bit hard to configure, in particular when you try to run R remotely.

We would be delighted if you could help us find the appropriate documentation for this configuration. The documentation available on the website does not describe precisely this type of configuration and we have not found any information on your forum.

We're trying to configure our LabKey server (Linux) to run R scripts remotely on another server (Linux) on the same VLAN, but we get a Java null pointer exception.

On Linux, is it possible to run LabKey with Tomcat's conf directory as read-only? It is usually recommended to make this directory read-only (the user running Tomcat is the owner of this directory and its contents, with permissions 0400). I've tried it out and was unable to reach LabKey's homepage, so I had to revert to 0740, but I would still like to restrict how the configuration files can be accessed and changed by outside users.
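A likely explanation for the failure at 0400: on Linux, a directory needs its execute (search) bit set before anything inside it can be opened, so 0400 on conf itself stops Tomcat from reading any file in there, regardless of the files' own modes. "Read-only" usually means 0500 on directories and 0400 on files. A sketch on a throwaway copy (the layout is a stand-in for the real $CATALINA_HOME/conf):

```shell
CONF="$(mktemp -d)/conf"              # stand-in for the real $CATALINA_HOME/conf
mkdir -p "$CONF/Catalina/localhost"
echo '<Context/>' > "$CONF/Catalina/localhost/labkey.xml"
# Directories get read+search (0500); files get read-only (0400)
find "$CONF" -type d -exec chmod 500 {} +
find "$CONF" -type f -exec chmod 400 {} +
# Traversal still works, so the owner (the Tomcat user) can read its config
cat "$CONF/Catalina/localhost/labkey.xml"
```

One caveat: some Tomcat setups write into conf at runtime (e.g. Catalina/localhost on first deploy), so it's worth testing a full restart cycle with this scheme before locking it in.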

LabKey Server 16.3 now officially supports Tomcat 8.5.x, starting with the Tomcat 8.5.9 release. This Tomcat release includes a fix to an important issue that previously blocked our ability to approve use of the 8.5.x line. We expect to continue to support Tomcat 7.0.x and 8.0.x for the foreseeable future, although the Tomcat team now encourages use of 8.5.x.

Dear LabKey team,
We've tried to upgrade LabKey from a 16.2 installation to 16, but we get this message in the labkey log files and labkey doesn't boot properly:

ERROR ModuleLoader 2016-12-15 13:16:14,635 localhost-startStop-1 : Failure occurred during ModuleLoader init.
java.lang.AbstractMethodError: Method org/labkey/illumina/IlluminaModule.getSchemaNames()Ljava/util/Collection; is abstract
at org.labkey.illumina.IlluminaModule.getSchemaNames(IlluminaModule.java)
at org.labkey.api.module.ModuleLoader.getModuleDataSourceNames(ModuleLoader.java:1622)
at org.labkey.api.module.ModuleLoader.getAllModuleDataSourceNames(ModuleLoader.java:1613)
at org.labkey.api.data.DbScope.initializeScopes(DbScope.java:1008)
at org.labkey.api.module.ModuleLoader.initializeDataSources(ModuleLoader.java:917)
at org.labkey.api.module.ModuleLoader.doInit(ModuleLoader.java:330)
at org.labkey.api.module.ModuleLoader.init(ModuleLoader.java:240)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:279)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:260)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:105)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4908)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5602)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:899)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:875)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:652)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:677)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1962)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)

Do you have any hints about what could be the cause?
Thanks a lot,
Eva

Hello all, I set up LabKey for the first time on a new server. Everything seemed fine at first: I could go to http://<server-name>:8080/labkey and start setting up an e-mail and password for the administrator. Then, when I continued to the next step, I was immediately brought to the homepage (http://<server-name>:8080/labkey/home/project-begin.view?). When I try to sign in from the Sign-In link in the top right, I am brought back to the homepage without being signed in. If I try to log in with bogus credentials, it does recognize that it couldn't find that user. Sometimes I get a page saying that maintenance mode is going on, and I am given a button for Administrator login. However, I am almost instantly redirected back to the homepage.

My logs don't help much. I've included labkey.log, but my labkey-error.log is completely empty. PostgreSQL logs have a few interesting lines, I have attached it as well.

EDIT: Investigating further, I also ran bin/configtest.sh from Tomcat's directory which uncovered some errors, although I don't see anything else using port 8080.

I am building a continuous integration environment for our labKey development team so that we can better QA and manage our releases.

I have been able to successfully compile and push (copy) the web app to the staging server along with a restoring database from production. It seems that the web-server starts up correctly; however, upon first accessing the web-app, I receive the following error message.

"Module has already been set" I added the entire stack-trace below

I was wondering if anyone else had a similar experience and could point in the right direction.

There is a second labkey installation on the same server, but it seems to be working without issue.

Thanks,
Jason

----
java.lang.IllegalStateException: Module has already been set.
at org.labkey.api.view.BaseWebPartFactory.setModule(BaseWebPartFactory.java:177)
at org.labkey.api.module.DefaultModule.getWebPartFactories(DefaultModule.java:414)
at org.labkey.api.view.Portal.getPartsToAdd(Portal.java:1178)
at org.labkey.api.view.Portal.addCustomizeDropdowns(Portal.java:905)
at org.labkey.api.view.Portal.populatePortalView(Portal.java:1027)
at org.labkey.api.view.Portal.populatePortalView(Portal.java:950)
at org.labkey.core.portal.ProjectController$BeginAction.getView(ProjectController.java:327)
at org.labkey.core.portal.ProjectController$BeginAction.getView(ProjectController.java:280)
at org.labkey.api.action.SimpleViewAction.handleRequest(SimpleViewAction.java:80)
at org.labkey.api.action.BaseViewAction.handleRequest(BaseViewAction.java:178)
at org.labkey.api.action.SpringActionController.handleRequest(SpringActionController.java:409)
at org.labkey.api.module.DefaultModule.dispatch(DefaultModule.java:1281)
at org.labkey.api.view.ViewServlet._service(ViewServlet.java:190)
at org.labkey.api.view.ViewServlet.service(ViewServlet.java:124)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.labkey.api.data.TransactionFilter.doFilter(TransactionFilter.java:38)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.labkey.core.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:118)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.labkey.api.module.ModuleLoader.doFilter(ModuleLoader.java:1144)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.labkey.api.security.AuthFilter.doFilter(AuthFilter.java:205)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:218)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:169)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:956)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:442)
at org.apache.coyote.ajp.AjpProcessor.process(AjpProcessor.java:190)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:623)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)

Hi.
I have tested LabKey Server 16.1, and several tables (about 10 of the 23 module lists) give the message

500: Unexpected server error

Biostanice1.Index (primary key) has not been provisioned properly. Ensure the domain is established before constructing.

java.lang.IllegalStateException: Biostanice1.Index (primary key) has not been provisioned properly. Ensure the domain is established before constructing.
at org.labkey.list.model.ListTable.<init>(ListTable.java:105)
…

Is this similar to, or the same as, Issue 26423 (solved in version 16.2)?
So I installed version 16.2, and all the tables that had problems in 16.1 have disappeared?! (… object not found) What happened? Can you help me please?

In version 15.2 everything was OK. I created all tables with the default function in "Manage Lists - Create New List". I'm afraid to upgrade to the next version.

Thanks
lubomir pavliska

More cases and projects.
Some projects are OK, but in many cases it is the same - an error in the primary key. I don't understand it; in version 15 everything was OK, and those projects come from version 13.

I have been tasked with installing TPP (Trans Proteomic Pipeline) under Labkey 16.2 running on Windows Server 2012. The documentation on LabKey refers to TPP 4.6.3 but the current version is 4.8. The docs also point to the TPP source which is apparently not available. Can the TPP v4.8.0 .exe Installer version be used and if so, what process needs to be followed? Any help will be greatly appreciated.

While trying to connect to the Labkey server, I am getting an unexpected server error with the attached messages. It looks like Labkey is trying to run a script (viability-14.20-14.20.sql) that upgrades from 14.10 to 14.20. Since I'm running Labkey 16, I'm not sure why it is trying to run the script. Please let me know how I can fix this.

Hello, I installed only the LabKey core, FileContent, and Audit modules. The published installation package contains many modules, and deleting every unneeded module is cumbersome. How can I make my own personalized installation package?

Hello, I was wondering if anyone could guide me in the right direction for setting up a free instance of LabKey in the cloud using a virtual machine in Azure. I have set up a VM on Azure, and they have provided me with a custom domain under their services. If I go to their domain name, I get an error in my browser saying it took too long to respond and no connection to LabKey is available.

Now, what I am wondering is how I can set up the LabKey services so that when I put the Azure domain into a browser I will be prompted with the login page of LabKey. I am new to this, so I have no clue how to set it up. Right now I can just connect through Azure's portal, so I have direct access to the virtual machine with LabKey hosted on the computer/server. How do I set up the LabKey server settings so that accessing the domain directly would lead me to the login page of LabKey? Or rather, how do I configure a VM so that LabKey will be accessible? Any help would be appreciated!
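A first split of the problem, before touching LabKey settings at all: check on the VM whether Tomcat is listening, then check the Azure side. The port 8080 here is an assumption from the default install; if the local check passes but the domain still times out, the usual culprit is the VM's network security group not allowing inbound traffic on that port:

```shell
# Run on the VM itself: is anything listening on Tomcat's port?
if ss -ltn 2>/dev/null | grep -q ':8080'; then
  echo "tomcat listening on 8080"
else
  echo "nothing on 8080 - check that Tomcat is running"
fi
```

To serve on the bare domain without :8080 in the URL, you'd additionally put Tomcat behind a proxy on port 80/443 or change its connector port, but that only matters once the basic connectivity above works.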

I am setting up a new BI server for a biobank, and am designing it now. We are a Microsoft shop, so the choice of DBMS is already made :)

SQL Server 2016 has just been released by Microsoft, and is not mentioned as a supported release for current LabKey. We are going with it to get extra features not available in earlier releases.

At the same time we are looking for alternatives to Microsoft's tools for information delivery because we have data we cannot store in Sharepoint, and I am thinking to take a close look at LabKey for this.

How long might it be before Labkey will support SQL Server 2016? How risky is it to try and make them work together before it is officially supported? Are the risks just that things might not work, or are there data security implications?

Hi, I installed every required component successfully,
but when I configure the application it returns a configuration error. I am sending you a .log file and a screenshot.
I checked that the database is running, and that the database connection URL and user credentials in labkey.xml are OK.

Each one of our Projects has its own user base. Has anyone looked into providing Single Sign-On for users at the Project level? The current system only allows for one Single Sign-On server configuration. We would need to be able to point to multiple CAS servers.

I use the tutorial study/datasets HIV/CD4 .
If I import the demo study and try to make a time chart with e.g. the lab data, it says:
No calculated interval values (i.e. Days, Months, etc.) for the selected 'Measure Date' and 'Interval Start Date'

If I click View Data, I can see there are empty fields in days/visits/visit day.

This goes for any measure in any dataset I have tried

I have clicked recalculate visits, is there anything else obvious I could try?
If I check the Lab result data I can see the Day field.

Have successfully installed R, but I am having issues configuring the VFB server to work with LabKey Server. I followed the tutorial below, but it seems as though some steps might be different for FreeBSD... Please advise!

LabKey 15.3 is correctly running; now I want to install ImportableDemoStudy.folder.zip, but I noticed the instructions in the tutorial below do not match the options available in my LabKey version.

1.) "Confirm that Inherit from Parent Folder is checked" do not seem to find that option anywhere...
2.) "On the New Study page, click Import Study" do not seem to find the "import study" option anywhere...
3.) Thanks in advance! Screenshot attached!

An important bug has been fixed on LabKey 15.3 r42135.48 regarding WebDav and the recent updates to Tomcat, specifically Tomcat 8.0.30+ and Tomcat 7.0.67+

Due to unforeseen changes with Tomcat's handling of WebDav calls, some functions with the FileContent module (Files Web Part) would cause issues with creating new folders, generate Error 400 responses in the background, or even prevent LabKey from starting due to the WebDav controller failing during the initialization.

All WebDav bugs have been identified and fixed, so if you have plans on upgrading Tomcat to one of the two versions listed above, please make sure you download our revised LabKey 15.3 build.

I have also noticed that the Postgres driver jar is "PostgreSQL 9.4 JDBC4.1 (build 1201)", but I am using Postgres 9.2 installed from the CentOS SCL repositories (https://wiki.centos.org/AdditionalResources/Repositories/SCL). I was thinking about upgrading Postgres to 9.4, but in the LabKey compatibility table I see that Postgres 9.2 is supported, so I am not sure if this could be the problem.

First time LabKey user. I am attempting to install LabKey on Linux with the following details:
LabKey 15.3, Tomcat 8.0.30, JDK 1.8, Postgres 9.1.18 - all running on Debian 7.
PG and Tomcat are working fine. The LabKey installation is not, as it is throwing a server error - 500. It's able to run the SQL scripts, create a few schemas, and populate the core schema (the log has a message about not supporting PSQL version 9.1.18, but this does not seem to be the issue, although I am planning an upgrade later). The exception:

Longtime LabKey user, first-time installation problem (well, it's been a while). I followed the Linux instructions on the website (installing on CentOS 7), and when I start it up I get a 500 error. Versions all appear to be supported (javac 1.8.0_66, PostgreSQL 9.2.14, apache-tomcat-7.0.67). Tomcat appears to be running just fine. Any suggestions would be great.
Thanks
-Rich

I'm working with ETLs and have some questions. They spring from the fact that we have a collection of source tables on an external database server which need some basic cleaning/transforming before loading them into LabKey; e.g. we need to change some ',' and '.' to be able to convert a column to floating point.

1: If I use a stored procedure to populate a source table, how do I make sure that the stored procedure is executed before loading the data into the destination?

2: Is it possible to use a stored procedure which returns a table/select query as a source in the ETL step, so that it is loaded directly into the destination?

3: Is there a better way to do the transform step than stored procedures when using LabKey?

4: How does <incrementalFilter> work?

I couldn't find answers to these questions in the documentation, but if I have overlooked something please point me in the right direction.
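On the ',' to '.' cleanup specifically: if the data ever passes through a text export, that particular transform is cheap to do outside the database. A naive sketch (made-up sample file; it replaces every comma, so it only suits files where ',' appears solely as a decimal separator):

```shell
cd "$(mktemp -d)"
printf 'id;value\n1;3,14\n2;2,72\n' > raw.csv    # ';'-separated export with ',' decimals
sed 's/,/./g' raw.csv > clean.csv                # swap every ',' for '.'
cat clean.csv
```

For the stored-procedure route, the equivalent is typically a REPLACE(...) plus a cast in the procedure's SELECT, which keeps the transform next to the data instead of in a side script.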

Quick note just to let everyone know that LabKey Server's automatic upgrade process is becoming somewhat more automatic starting with the 16.1 release. Previously, the server would upgrade the core module at server startup, but would wait for an administrator login before upgrading other modules or initiating module startup. A trunk commit last week changed the server to initiate upgrade and startup of all modules at server startup time. Administrators can still log in to view upgrade/startup progress, but the server will no longer wait for a login.

This change streamlines the startup process, simplifies upgrade code, and facilitates scripted upgrades of LabKey Server. We suspect most administrators won't notice a difference, but let us know if you have any questions or concerns about this change.

I have installed the SAS and Java libraries and added them to the paths in the sasv9.cfg file

I have created a _netrc file in my home directory (there is a copy attached with a changed password, but the password starts with a '-' which is kept), and I made an environmental variable named HOME pointing to my home directory.

There doesn't seem to be anything in the other log-files. (labkey.log, labkey-errors.log)

For good measure I restart SAS after each change.

Furthermore, if I add the userName and password to the SAS-macro call, then it also works.

My own conclusion is that it must be the SAS interpretation of the _netrc file that fails, but because it works with Rlabkey I'm not sure what kind of error it could be. Do you have any idea what might cause this?

Is it possible to add our own pipelines (Perl/Python/R scripts) elsewhere than when importing files? Is it possible to have them with the other analysis scripts/tools in the Sequence Analysis (DISCVR-Seq) module?

I've spent some time reading about creating pipelines, but I can only find examples and descriptions of pipelines used when importing files, not pipelines that work on files already uploaded/analysed.

I have set up LabKey several times (13.3, 14.1, 14.2, 14.3), and the 15.3 webapp is running, but this time I can't get past this JDBC error on first access. The labkey user has superuser access. I added the defaultAutoCommit line to try to resolve it, but no go. Any ideas? Thanks.

A database connection is in an unexpected state: auto-commit is false. This indicates a configuration problem with the datasource definition or the database connection pool.

This is a problem with your configuration. Please contact your local server administrator for assistance, or LabKey Software at support@labkey.com for operational assistance with correcting the configuration error.

org.labkey.api.util.ConfigurationException: A database connection is in an unexpected state: auto-commit is false. This indicates a configuration problem with the datasource definition or the database connection pool.
at org.labkey.api.data.DbScope._getConnection(DbScope.java:737)
.
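For anyone comparing notes: the "defaultAutoCommit line" is an attribute on the datasource Resource in labkey.xml, and LabKey expects pooled connections handed out with auto-commit on. A sketch of the relevant placement; everything except defaultAutoCommit is a placeholder:

```xml
<Resource name="jdbc/labkeyDataSource" auth="Container"
          type="javax.sql.DataSource"
          driverClassName="org.postgresql.Driver"
          url="jdbc:postgresql://localhost:5432/labkey"
          username="labkey" password="***"
          defaultAutoCommit="true"/>
```

If the attribute is present and the error persists, a duplicated Resource definition (e.g. one in labkey.xml and another in server.xml's GlobalNamingResources) can mean the copy without the attribute is the one actually being used.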

I'm attempting to upgrade to 15.3 on Windows. The installer gives me the following message: "Postgres upgrade failed: 1. Unable to recover." Before that, a message flashed up complaining about invalid characters in the install path; it did not persist long enough for me to read it all :(

Now it would appear that my install is broken and I have no idea how to fix it without reverting to a backup.

I have been playing around with some ".mzXML" data obtained from "LTQ Orbitrap Velos" raw data on the LabKey server. I would like to find perturbed pathways based on the differential expression of mutant vs. wild-type (control). Even though I have followed the tutorials in the LabKey documentation, I'm still not sure about the steps that are necessary to do this kind of analysis. For example, I need to know which proteins are over- or under-expressed in either the mutant or the control. How can I use LabKey Server to achieve this? Any help will be appreciated. Thanks

According to the documentation there should be a "Change User Properties" link in the "Site Users" section, but it doesn't seem to be there. It is, however, in the Admin Console, but when I click it I just get an empty page with a 'back' and a 'home' button.
There are no errors in the log. Do you have any clue as to what the problem might be? Does it require any setup to use?

I clicked on search to carry out an X!TANDEM PEPTIDE SEARCH, but got the following error message: "A sequence database must be selected". I checked the Protein database section of this page and only found a <root> dropdown link and a "none found" message.

I noticed in the file section that there is no file in the "databases" folder even though I had already downloaded and imported the swissprot database as recommended in the documentation.

Can we use LabKey's built-in authentication service (i.e. database authentication) to pass credentials to other software/sites? We want to integrate with other software, but do not want to have two credentialing systems.
This seems to point to an SSO integration, but with LabKey as primary. We also do not want to use LDAP at this time. Would there be a capability or best practice to accomplish this from LabKey?

Every time I try to send a job to the remote pipeline server from the labkey webserver, it still tries to run it locally from the Sequence Analysis module (DISCVR-Seq module). The message I get in the pipeline log has: "Starting to run task 'org.labkey.sequenceanalysis.pipeline.SequenceAlignmentTask' at location 'webserver'" and it can't find the tool I'm trying to run, as it's not installed locally.

I've added -DsequencePipelineEnabled=true to the Tomcat7w startup options and configured the remote and local pipelineConf.xml files. ActiveMQ is set up on the Linux server (the remote pipeline server) and seems to work.

Is there something I missed, and is there some documentation on labkey.org I should check to help me debug this?

We wanted to give everyone an update on our support for SQL Server 2008R2. We've long listed it as deprecated (since at least version 14.2), but starting in 15.3 we're shifting to actively encouraging administrators to upgrade. In version 16.1, we will no longer support 2008R2.

This is in line with our standard approach of dropping support for versions of third-party software that are no longer supported by their vendor. Microsoft's end-of-life for 2008R2 is complicated by the vast number of versions, but has been largely unsupported for a couple of years already:

I am not sure what is wrong, but I tried to upload Comet search results from shotgun proteomics LC-MS acquired on a Thermo Fusion, and the upload failed with the following in the log file (below).
I was able to upload Comet search results and shotgun proteomics data from the Fusion before.
I noticed that in reading the mzXML file it shows MS3: 0, and I am sure that is incorrect.
Any suggestions?

org.labkey.api.util.ConfigurationException: DataSources are not properly configured in labkey.xml.
at org.labkey.api.module.ModuleLoader.initializeDataSources(ModuleLoader.java:887)
at org.labkey.api.module.ModuleLoader.doInit(ModuleLoader.java:318)
at org.labkey.api.module.ModuleLoader.init(ModuleLoader.java:230)
at java.lang.Thread.run(Thread.java:745)

Caused by: javax.naming.NameNotFoundException: Name [comp/env] is not bound in this Context. Unable to find [comp].
at org.apache.naming.NamingContext.lookup(NamingContext.java:819)
at org.apache.naming.NamingContext.lookup(NamingContext.java:167)
at org.apache.naming.SelectorContext.lookup(SelectorContext.java:156)
at javax.naming.InitialContext.lookup(InitialContext.java:417)
at org.labkey.api.module.ModuleLoader.ensureDatabase(ModuleLoader.java:902)
at org.labkey.api.module.ModuleLoader.initializeDataSources(ModuleLoader.java:864)
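For what it's worth, a "Name [comp/env] is not bound" lookup failure generally means Tomcat never created the JNDI context for the webapp, i.e. the descriptor wasn't picked up as a proper <Context>. The expected overall shape is roughly as follows; the docBase path and credentials are placeholders, not values from the original post:

```xml
<!-- Deployed as conf/Catalina/localhost/labkey.xml (hypothetical paths/credentials). -->
<Context docBase="/usr/local/labkey/labkeywebapp" reloadable="true">
    <Resource name="jdbc/labkeyDataSource" auth="Container"
              type="javax.sql.DataSource"
              driverClassName="org.postgresql.Driver"
              url="jdbc:postgresql://localhost:5432/labkey"
              username="labkey" password="CHANGEME" />
</Context>
```

If the <Resource> elements sit outside a <Context> root, or the file lives somewhere Tomcat doesn't scan, the comp/env context is simply never populated and the lookup fails exactly as in the trace above.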

Hi
I'm still having problems with getting a running Labkey installation using MS SQL Server.

The basic questions are:
1: What is the preferred process for installing a Labkey server using MS SQL Server when you don't have full permission/control of the server, and you want to be certain that the labkey admin has all the necessary permissions (at the moment the labkey admin is owner of the database)?
2: Can you describe what permissions are needed (besides being owner of the database) to successfully run labkey?
3: Must it be the labkey server that runs sp_updatestats, or is it OK if we have the option 'Auto update statistics' enabled on the database?

I'm not a sysadmin, so I don't have full control over the database; I have to ask our central IT department to do stuff, which takes 1-3 days depending on how busy they are.

There are (at least) two issues:

1: GROUP_CONCAT
It needs to be part of the core schema, but the core schema cannot be created before the initial run of the server.
I get an error if I try to create the core schema in advance and assign the GROUP_CONCAT functions to it before I start the server (a "core schema already defined" exception). After moving GROUP_CONCAT back to dbo and removing the schema, I get another error, "DROP ASSEMBLY failed because 'GroupConcat' is referenced by object 'GROUP_CONCAT_D'", but I was able to start labkey. Then it seemed to work, but when I installed a module I was back to the previously discussed "FastaAdmin already exists" exception.

2: sp_updatestats
We need permission to run this stored procedure (in our system we don't have this by default), and it must be assigned before creating any database objects. Does it matter who runs this procedure, as long as it is run?
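To make question 3 concrete, the manual refresh and the automatic database option being asked about look like this in T-SQL (the database name labkey here is a placeholder):

```sql
-- Manual, database-wide statistics refresh (run in the LabKey database):
EXEC sp_updatestats;

-- The automatic alternative asked about in question 3 (a per-database option):
ALTER DATABASE labkey SET AUTO_UPDATE_STATISTICS ON;
```

Note that sp_updatestats itself requires elevated membership (db_owner), which is presumably why the central IT request is needed.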

Our data is almost exclusively stored in SQL databases, and data is coming continuously into the tables.

What is the preferred way to import these data into datasets/assays/lists to be able to work with them inside labkey? E.g. we have a table with basic information about the patient (birthday, gender, enrollment date, etc.), with new patients being enrolled continuously. What would be the best way to use this table as a basic demographic table in the study?

I have looked at external schemas but it is not obvious for me how to do something like that in labkey.
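For reference, once an external schema is mapped into a folder it can be queried with LabKey SQL like any built-in table; a hypothetical query that could serve as the source for a demographics dataset follows (the ext schema name and the Patients table and column names are invented for illustration):

```sql
SELECT p.PatientId,
       p.BirthDate,
       p.Gender,
       p.EnrollmentDate
FROM ext.Patients p
```

A query like this, saved in the schema browser, can then be used as the source for snapshot or ETL-style loading into a study dataset.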

I adventurously started an upload of the full UniProt-TrEMBL annotation from uniprot_trembl.xml. It appears to be endless: it has been running since Aug 23, and it is still adding new entries.
Is there a way to abort/cancel this process?
In retrospect, subsetting uniprot_trembl.xml to only the relevant species would have been prudent, but at this point I am just trying to stop it.
Perhaps incorrectly, I tried to "Complete" the process in Admin > Site > Pipeline, which marked it as Complete, yet the process is still running.
Any advice?

Date: Could not convert 'Fri Aug 15 00:00:00 CEST 2008' for field Date, should be TimeStamp

The underlying format in Excel is a number (when converted to text).

If I convert to a TSV file it works OK.
I think this error propagates into the rest of the tutorials where timepoints are used, because they don't work as expected. Are there any settings that can cause/solve this?
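As a workaround while the Excel import misbehaves, the raw Excel serial number can be converted to a date by hand: Excel counts days from an epoch of 1899-12-30. A sketch using GNU date (the serial value is just an example):

```shell
# Convert a hypothetical Excel date serial to a calendar date.
# Excel's day-count epoch is 1899-12-30 (GNU date syntax assumed).
serial=39675
date -u -d "1899-12-30 UTC + ${serial} days" +%F
```

Converting the column to text or TSV, as the poster did, sidesteps the problem entirely.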

Our users have been creating many assays. The default setting for Assay Location when you create a new assay is Project (not Current Folder). So of course, the first dozen or so assays they set up used this default setting for the assay location. Now that we've started to organize things a bit further, we've set up folders underneath the project. When you add the Assay List web part to any folder, the assay list shows all project assays. We realized too late that we should not have used the default Project assay location, because we want the assay lists to show only the assays that belong to that particular folder. I haven't been able to find a way to go back and change this setting after the assay has been created. Is there something I'm missing, and if there is no way to change this through the application, how would we go about changing it through the backend database? Alternatively, we could even settle for a way to permanently filter the assay list so it shows only the assays for the current folder location.

Hello, new to the community, so apologies if this has been answered in the past, but I couldn't find it in the documentation/forum.

We have a study/registry we've kept in an Excel sheet and would like to transition it into LabKey. At a high level, all subjects are assigned an ID and have information collected at multiple events (separate label/column).

I tried "importing a dataset" but have gotten a number of errors. Is this the best way to set up a study from an Excel sheet? This one seems to keep popping up: "Only one row is allowed for each Participant/Visit. Duplicates were found in the database or imported data".

We are running LabKey 15.1, and we see in the Web interface, when setting permissions for a folder, the following Author Role description:
"Authors may read and add information in some cases, but may update and delete only information they added."

When trying such a role using the impersonate function, it seems that Authors can create data and view all data (not only data belonging to the current user), but they are not able to update or delete the data they created themselves.
Also, that seems to apply only to lists, not to datasets.
How does that correspond with the Author Role description? Are we understanding that incorrectly?

Is there a way to give a user permissions to edit/delete and to view only data created by him/herself, both in lists and datasets?

Is it possible to create messages (discussion threads) in datasets at the record level?

For lists, it is possible to allow one or multiple discussions per item (set in the list edit design).
But for datasets, we have only seen that at the end of the dataset view there is a "DISCUSSION" link, which allows submission of messages referring to the whole dataset. Is it in some way possible to create those discussions for single records of a dataset, as is possible in lists?

Something else:
It seems that visualization of messages in the details part of a list item does not work properly: once the discussion is hidden (or closed), hitting the "DISCUSSION" link does not allow showing it again.

Hi,
I have set a pipeline override for a project, but when I then go to upload MS2 (or Panorama/Skyline) data, a log file is not written (and the upload to a Panorama-type folder within the project fails) unless I give write privileges on the given folder on the Linux box to "all" (i.e. drwxrwxrwx).
If I have the privileges set at our default drwxrwxr-x (with tomcat being a member of the group which owns the folder), the log files cannot be written (and the Panorama/Skyline file is not uploaded). I would have assumed that if tomcat is a member of the group which owns the folder, it should be allowed to write there and this should work.

Could you please advise how the privileges need to be set in order to get this to work without giving write privileges to "all".
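For what it's worth, the usual way to allow group members to write without opening a directory to "all" is group write plus the setgid bit, so new files inherit the group. A sketch with placeholder paths and groups (the demo uses /tmp and the current user's group instead of the real pipeline root and tomcat's group):

```shell
# Demo directory; on a real server this would be the pipeline root,
# and the group would be the one the tomcat user belongs to.
demo=/tmp/pipeline-demo
mkdir -p "$demo"
chgrp "$(id -gn)" "$demo"   # stand-in for the group shared with tomcat
chmod 2775 "$demo"          # drwxrwsr-x: group-writable, setgid, no write for "all"
ls -ld "$demo"
```

Two other things worth checking: the tomcat process must be restarted after being added to the group (group membership is read at login/startup), and the tomcat service's umask must also permit group write, or files it creates will still come out group-read-only.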

(2) Hierarchy concept
In our view, a batch can have multiple runs.
However, after selecting a batch in "view batches", it shows the run file. When I use "Import Data" under "view runs",
it still creates a new batch and puts the newly imported file as one run associated with that new batch.
I can't find a document on how to tie multiple runs to a given batch.
I have also used select batch -> "Import Data" -> "save and import another run".
The file has a new entry, and the file name does not overlap with any other run I have already uploaded.
However, I got the following error, which did not make sense to me:
500: Unexpected server error
The same file was uploaded twice - all files must be unique
org.labkey.api.exp.ExperimentException: The same file was uploaded twice - all files must be unique
at org.labkey.api.study.assay.AssayFileWriter.savePostedFiles(AssayFileWriter.java:192)
at org.labkey.api.study.actions.AssayRunUploadForm.getAdditionalPostedFiles(AssayRunUploadForm.java:253)
at org.labkey.api.study.actions.AssayRunUploadForm.getPropertyMapFromRequest(AssayRunUploadForm.java:130)
at org.labkey.api.study.actions.AssayRunUploadForm.getRunProperties(AssayRunUploadForm.java:111)
at org.labkey.api.study.actions.UploadWizardAction$RunStepHandler.validatePost(UploadWizardAction.java:682)
at org.labkey.api.study.actions.UploadWizardAction$RunStepHandler.handleStep(UploadWizardAction.java:628)
at org.labkey.api.study.actions.UploadWizardAction.getView(UploadWizardAction.java:169)
at org.labkey.api.study.actions.UploadWizardAction.getView(UploadWizardAction.java:95)
at org.labkey.api.action.SimpleViewAction.handleRequest(SimpleViewAction.java:80)
at org.labkey.api.action.BaseViewAction.handleRequestInternal(BaseViewAction.java:179)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:153)
at org.labkey.api.action.SpringActionController.handleRequest(SpringActionController.java:414)
at org.labkey.api.module.DefaultModule.dispatch(DefaultModule.java:1032)
at org.labkey.api.view.ViewServlet._service(ViewServlet.java:190)
at org.labkey.api.view.ViewServlet.service(ViewServlet.java:124)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:731)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.labkey.api.data.TransactionFilter.doFilter(TransactionFilter.java:38)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.labkey.core.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:118)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.labkey.api.module.ModuleLoader.doFilter(ModuleLoader.java:1085)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.labkey.api.security.AuthFilter.doFilter(AuthFilter.java:182)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:423)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1079)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:620)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:318)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)

I am trying to install labkey on Ubuntu 14.04 Linux. I am using the tomcat7 installation from the distribution, and postgres 9.4 installed via the postgres repositories.

The connection to the database is made, and the labkey database is created, but it stays empty. In the browser I get the following exception:

org.labkey.api.util.ConfigurationException: Cannot connect to DataSource "labkeyDataSource" defined in labkey.xml. Server cannot start.
at org.labkey.api.data.DbScope.initializeScopes(DbScope.java:985)
at org.labkey.api.module.ModuleLoader.initializeDataSources(ModuleLoader.java:890)
at org.labkey.api.module.ModuleLoader.doInit(ModuleLoader.java:318)
at org.labkey.api.module.ModuleLoader.init(ModuleLoader.java:230)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:279)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:260)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:105)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4809)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5485)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:632)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:670)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1839)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Why am I in a transaction?
at org.labkey.core.dialect.PostgreSql91Dialect.configureToDisableJdbcCaching(PostgreSql91Dialect.java:1604)
at org.labkey.api.data.SqlExecutingSelector$ExecutingResultSetFactory.getResultSet(SqlExecutingSelector.java:316)
at org.labkey.api.data.BaseSelector.handleResultSet(BaseSelector.java:254)
at org.labkey.api.data.BaseSelector.getObject(BaseSelector.java:163)
at org.labkey.api.data.BaseSelector.getObject(BaseSelector.java:158)
at org.labkey.core.dialect.PostgreSql91Dialect.determineSettings(PostgreSql91Dialect.java:813)
at org.labkey.core.dialect.PostgreSql91Dialect.prepare(PostgreSql91Dialect.java:756)
at org.labkey.api.data.DbScope.initializeScopes(DbScope.java:973)
... 19 more

We have just installed a new instance of Labkey server 15.1 with postgres and are starting to feel our way around. We ran into an issue when comparing multiple sets of proteomics data. Selecting a few runs and comparing them (any comparison: ProteinProphet, Search Engine, etc. has the same issue) works fine, but when I try to sort on any column, the column comes back with blank content. Other columns for other samples are OK, but anything related to the samples whose column was used to sort is blank. Going into the Admin Console and looking at Errors, this is what I get.
Any suggestions?
Thanks,
Tomas Vaisar

org.labkey.api.data.SqlExecutingSelector$ExecutingResultSetFactory.handleSqlException(SqlExecutingSelector.java:436)
org.labkey.api.data.BaseSelector.handleResultSet(BaseSelector.java:268)
org.labkey.api.data.SqlExecutingSelector.getResultSet(SqlExecutingSelector.java:121)
org.labkey.api.data.TableSelector.getResults(TableSelector.java:274)
org.labkey.api.data.TableSelector$1.call(TableSelector.java:289)
ERROR ExceptionUtil 2015-07-16 06:30:24,304 http-bio-8080-exec-21 : Exception detected and logged to mothership:
org.springframework.jdbc.BadSqlGrammarException: ExecutingSelector; bad SQL grammar []; nested exception is org.postgresql.util.PSQLException: ERROR: relation "bogustable" does not exist
Position: 1871
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.translate(SQLErrorCodeSQLExceptionTranslator.java:276)
at org.labkey.api.data.ExceptionFramework$1.translate(ExceptionFramework.java:37)
at org.labkey.api.data.ExceptionFramework$1.translate(ExceptionFramework.java:31)
at org.labkey.api.data.SqlExecutingSelector$ExecutingResultSetFactory.handleSqlException(SqlExecutingSelector.java:441)
at org.labkey.api.data.BaseSelector.handleResultSet(BaseSelector.java:268)
at org.labkey.api.data.SqlExecutingSelector.getResultSet(SqlExecutingSelector.java:121)
at org.labkey.api.data.TableSelector.getResults(TableSelector.java:274)
at org.labkey.api.data.TableSelector$1.call(TableSelector.java:289)
at org.labkey.api.data.TableSelector$1.call(TableSelector.java:286)
at org.labkey.api.data.AsyncQueryRequest$1.run(AsyncQueryRequest.java:109)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.postgresql.util.PSQLException: ERROR: relation "bogustable" does not exist
Position: 1871
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2270)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1998)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:570)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:406)
at org.postgresql.jdbc2.AbstractJdbc2Statement.executeQuery(AbstractJdbc2Statement.java:286)
at org.apache.tomcat.dbcp.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
at org.apache.tomcat.dbcp.dbcp.DelegatingStatement.executeQuery(DelegatingStatement.java:208)
at org.labkey.api.data.dialect.StatementWrapper.executeQuery(StatementWrapper.java:916)
at org.labkey.api.data.SqlExecutingSelector$ExecutingResultSetFactory.executeQuery(SqlExecutingSelector.java:372)
at org.labkey.api.data.SqlExecutingSelector$ExecutingResultSetFactory.getResultSet(SqlExecutingSelector.java:322)
at org.labkey.api.data.BaseSelector.handleResultSet(BaseSelector.java:254)
... 6 more

I've been able to install a test labkey server on my home computer, and everything works perfectly except for one thing: when I import data from an Excel spreadsheet and link the subjectID column, there is an odd problem. Labkey recognizes all subjects (sub01, sub02, ... sub50), however if I click on the first one (sub01) I get an error: Could not find Subject sub01

This error is only shown on the first record in the database; the other subjects work great!

I am attempting a new v15.1 server installation on Mac OS X, and when I try to use Tomcat 8 (8.0.23) it fails to start the labkey webapp. catalina.out:

02-Jul-2015 10:03:16.729 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal One or more Filters failed to start. Full details will be found in the appropriate container log file
02-Jul-2015 10:03:16.729 SEVERE [localhost-startStop-1] org.apache.catalina.core.StandardContext.startInternal Context [/labkey] startup failed due to previous errors

labkey.log:

ERROR ModuleLoader 2015-07-02 10:03:16,726 localhost-startStop-1 : Failure occurred during ModuleLoader init.
java.lang.NullPointerException
at java.io.File.<init>(File.java:277)
at org.labkey.api.module.ModuleLoader.webappFilesExist(ModuleLoader.java:449)
at org.labkey.api.module.ModuleLoader.verifyProductionModeResources(ModuleLoader.java:433)
at org.labkey.api.module.ModuleLoader.doInit(ModuleLoader.java:313)
at org.labkey.api.module.ModuleLoader.init(ModuleLoader.java:258)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:279)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:260)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:105)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4574)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5193)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:586)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1750)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)

The version of Java is 1.8.0_25. The problem does not occur if I use a Tomcat 7 instance instead of Tomcat 8. Digging into the Tomcat 8 source code for org.apache.catalina.core.StandardContext, it appears that this may be due to passing Strings (specified in ModuleLoader's verifyProductionModeResources()) to getRealPath() that are not prepended with a '/', which results in it returning null.

I am trying to install Labkey on SL7 (equivalent to CentOS 7), using tomcat (7.0.54) and postgres (9.2.10) from the distribution's repositories. I get a stack trace with the You must have a DataSource named "labkeyDataSource" defined in labkey.xml error, even though I have configured the dataSource. See the attached labkey.xml.

The stack trace looks like this:

org.labkey.api.util.ConfigurationException: DataSources are not properly configured in labkey.xml.
at org.labkey.api.module.ModuleLoader.initializeDataSources(ModuleLoader.java:861)
at org.labkey.api.module.ModuleLoader.doInit(ModuleLoader.java:344)
at org.labkey.api.module.ModuleLoader.init(ModuleLoader.java:258)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:279)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:260)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:105)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4809)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5485)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:632)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:672)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1862)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.labkey.api.util.ConfigurationException: You must have a DataSource named "labkeyDataSource" defined in labkey.xml.
at org.labkey.api.module.ModuleLoader.ensureDatabase(ModuleLoader.java:929)
at org.labkey.api.module.ModuleLoader.initializeDataSources(ModuleLoader.java:838)
... 18 more
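One common cause when the Resource really is present: LabKey looks up the JNDI name jdbc/labkeyDataSource exactly, so a typo or a different name in the Resource element produces this very error. The name attribute inside the <Context> element must read as below (other attributes, such as driver, URL, and credentials, elided here):

```xml
<Resource name="jdbc/labkeyDataSource" auth="Container" type="javax.sql.DataSource" />
```

It is also worth confirming that the file Tomcat actually deployed is the edited one (conf/Catalina/localhost/labkey.xml), since a stale copy elsewhere can shadow the corrected configuration.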

Is LDAP authentication on port 389 with StartTLS currently supported by Labkey? I would like to use such a configuration, as ldaps:// is deprecated in favor of StartTLS [RFC 2830], but my attempts are failing.
