For anyone who is using the Bedrock library of TI functions, I thought it might be useful to have a central thread where issues relating to use or documentation can be aggregated. It would probably be useful if the issues were kept to a standard format as shown in the next few posts.

Details
The documentation refers to a non-existent parameter called pSkipConsols, which does not appear in any process at all. (Note that this is not the same as the pSuppressConsol parameter, which serves a completely different purpose.) Obviously if someone includes it in a process call it will crash the call chain, because calling a process with an undefined parameter kills the call on the spot.
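To illustrate the failure mode, a sketch of a call that dies for this reason (the target process and its other parameter names are illustrative, not a claim about Bedrock's actual signature):

```
# This call fails immediately: pSkipConsols is not a defined parameter
# of the target process, and TI aborts a call with an unknown parameter.
nRet = ExecuteProcess( 'Bedrock.Dim.Sub.Create',
   'pDimension',  'Account',
   'pSkipConsols', 1
);
```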

Details
The documentation refers to a parameter named pDeleteTempObj. The real parameter name is pDestroyTempObj. If you copy and paste from the manual, the call fails because the parameter does not exist.

In the case of Bedrock.Cube.Data.Copy there is also a comma missing after the pDeleteTempObj parameter value in the example, which will cause a compile error if you simply copy and paste it.
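A corrected call might look like the sketch below. The source and target parameter names are illustrative; the points being shown are the real pDestroyTempObj name and the comma after its value:

```
# Note the corrected parameter name and the comma after its value,
# both of which are wrong in the manual's example.
nRet = ExecuteProcess( 'Bedrock.Cube.Data.Copy',
   'pSrcCube',        'PnL',
   'pTgtCube',        'PnL Copy',
   'pDestroyTempObj', 1,
   'pDebug',          0
);
```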

Details
Per the documentation, pFileName parameter: "If no file name is provided, a combination of the cube, dimension and element suffixed by 'export.csv' will be used."
It actually gets generated as CubeName_Export.csv.
This will be a problem if you are generating multiple exports for multiple dimension / element combinations; each new file will simply overwrite the previous one.
The workaround is to always specify a value for the pFileName parameter.
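For example, you can build a distinct file name per dimension / element combination before the call, so successive exports don't overwrite each other. The process name and the parameters other than pFileName below are illustrative:

```
# Make the file name unique per cube / dimension / element combination.
sFileName = pCube | '_' | pDimension | '_' | pElement | '_Export.csv';

ExecuteProcess( 'Bedrock.Cube.Data.Export',
   'pCube',     pCube,
   'pFileName', sFileName
);
```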

Details
If a process has a coding error you may need to correct it in your own environment rather than waiting for the next iteration of Bedrock to be released with the corrections. The problem is that some Bedrock code has a cube view data source, which of course points to a cube that is not on your system. This can make modifying the code problematic. To rectify this you can create a Bedrock Test cube on your own system using the following code:
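(The dimension and element names in this sketch are assumptions; check the data source tab of the failing process for the view's actual cube and dimension names, which may differ between Bedrock versions.)

```
# Create a minimal stand-in cube so that processes with a cube view
# data source can be opened and edited. Names are placeholders.
IF( DimensionExists( 'Bedrock Test Row' ) = 0 );
   DimensionCreate( 'Bedrock Test Row' );
   DimensionElementInsert( 'Bedrock Test Row', '', 'Row 1', 'N' );
ENDIF;

IF( DimensionExists( 'Bedrock Test Measure' ) = 0 );
   DimensionCreate( 'Bedrock Test Measure' );
   DimensionElementInsert( 'Bedrock Test Measure', '', 'Value', 'N' );
ENDIF;

IF( CubeExists( 'Bedrock Test' ) = 0 );
   CubeCreate( 'Bedrock Test', 'Bedrock Test Row', 'Bedrock Test Measure' );
ENDIF;
```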

to trigger code reporting that temporary objects were destroyed. You will therefore get that report even if debugging is set to 0 (off). This one is slightly more annoying than issue 4 in two respects:
(a) The file in the logging folder incorrectly reports that it is being generated by the Prolog, which means that you can waste a lot of time trying to work out where the file is coming from. (The debug file name is correctly changed to Epilog only if the pDebug value is >= 1.); and
(b) Because the source is a cube view of a cube that you almost certainly won't have on your system until or unless you create it, and until then you can't edit the process to fix the issue. See issue 5 for instructions on how to create the cube.

However ProcessQuit does not in fact return an error status of any kind. If you check the return value from ExecuteProcess, you will find that a ProcessQuit; statement causes the process to return the value ProcessExitByQuit(). To get a ProcessExitSeriousError() return value, the Bedrock code would have needed to use the ProcessError; statement instead.

This will only be an issue if:
(a) You are relying on the return value for flow control; and
(b) You take the comment at its word that the code will return a serious error.
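If you do rely on the return value for flow control, a defensive check on the calling side covers both exits. The process and parameter names here are illustrative:

```
nRet = ExecuteProcess( 'Bedrock.Cube.Data.Copy',
   'pSrcCube', 'PnL',
   'pTgtCube', 'PnL Copy'
);

# ProcessQuit in the called process yields ProcessExitByQuit(), NOT
# ProcessExitSeriousError(); test for both rather than trusting the comment.
IF( nRet = ProcessExitByQuit() % nRet = ProcessExitSeriousError() );
   ProcessError;
ENDIF;
```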

Details
This is something that will rarely catch anyone out if they:
(a) Utilise the process only as per the description in the manual and
(b) Sanitise their input before calling it, since while the process performs an extensive range of validation checks, there is one check it doesn't do, as we shall see.

In my own case there was a residual issue from some contract work that was done long ago using a pre-release version of Bedrock, and which reared its head when I upgraded the code base to version 3.0 this year.

The process is supposed to create a subset of N level descendants of a particular consolidation, not including the consolidation itself. However the process does not test whether the "consolidation" really is a consolidation. If you pass an N element you won't get an error. You may or may not get a populated subset, as described below.

In our case there was a need to create a subset from an element selection in a websheet. If an N element was selected, that element would need to be the subset, otherwise the N descendants of the selected consolidation would be used as the subset. This is not in theory what Bedrock.Dim.Sub.Create.Consolidation.Leaf is supposed to do (nor its predecessor process, presumably), but it did it anyway. Then after the upgrade to 3.0 the calling process stopped working. Or, more disturbingly, it stopped working sometimes.

Specifically I found that if the element was an N element without an alias, then the calling process would still work. If the element had an alias, it would fail.

The reason: Bedrock.Dim.Sub.Create.Consolidation.Leaf is really just a wrapper around Bedrock.Dim.Sub.Create, passing it all of the received parameters but with 0 hard-coded as both the Level From and Level To parameters.

Bedrock.Dim.Sub.Create iterates through the All subset of the specified dimension to populate the subset. However it uses a subtractive rather than an additive approach: for each element in the dimension it checks whether the element fails to match the specified parameters, and if so, it ItemSkips that element. Any element not skipped by one of the tests is added to the subset.

In creating a subset from a consolidation element, the relevant test (in pseudo-code) is this:

If the consolidation is NOT an ancestor of the element, AND the element is NOT the same as the consolidation, then skip it.
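In TI terms the test looks something like this. This is a hedged reconstruction, not Bedrock's actual code; vElement stands for the element currently being iterated:

```
# Skip any element that is neither a descendant of the consolidation
# nor the consolidation itself.
IF( ELISANC( pDimension, pConsolidation, vElement ) = 0
     & vElement @<> pConsolidation );
   ItemSkip;
ENDIF;
```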

"Hang on", I hear you say, "But the consolidation itself would fail the second arm of that test (because it is the same as the consolidation) and therefore would not be skipped. Why isn't it always included in the subset?"

Because the later test for the element level will catch a real consolidation and skip it there. Remember that the call limited the level of subset members to level 0.

However...

If the specified "consolidation" is an N element and does not have an alias, then it won't be skipped by the test specified above. Nor will it be skipped by the level test, since it is level 0. The result is that you end up with a subset consisting of just that one element.

If on the other hand the "consolidation" has an alias then it won't equal the element, and it will be skipped by the first test. You therefore end up with an empty subset. The first you learn of this is when the subset becomes part of a view, the view becomes a data source, and the process that the data source is assigned to crashes and burns.

Remedies
The process should probably have a test added to confirm that the received element is a consolidation, throwing an error if it isn't, to keep the behaviour consistent. A test that the resulting subset actually has members (and an error if it doesn't) probably wouldn't hurt either. Of course, anyone using legacy code similar to ours would then need to update it.

In the absence of that, any calling code should do the check itself and only call the process if the "consolidation" really is one. If it isn't, the calling process should call the SubsetCreate / SubsetElementInsert pair of functions instead. (Or Bedrock.Dim.Sub.Create.ByElement if a Bedrock-only approach is preferred.)
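A sketch of that calling-side guard follows. The parameter names passed to the Bedrock process are assumptions; check the actual parameter list of your Bedrock version:

```
# Only call the Bedrock process for a genuine consolidation ('C' type);
# otherwise build the one-element subset directly.
IF( DTYPE( pDimension, pElement ) @= 'C' );
   ExecuteProcess( 'Bedrock.Dim.Sub.Create.Consolidation.Leaf',
      'pDimension',     pDimension,
      'pConsolidation', pElement,
      'pSubset',        pSubset
   );
ELSE;
   IF( SubsetExists( pDimension, pSubset ) = 0 );
      SubsetCreate( pDimension, pSubset );
   ELSE;
      SubsetDeleteAllElements( pDimension, pSubset );
   ENDIF;
   SubsetElementInsert( pDimension, pSubset, pElement, 1 );
ENDIF;
```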

Details
If used in conjunction with Bedrock.Dim.Export, the output file will include any rule-derived values from the element attributes cube. The import does not check whether the destination is writable, so if numeric attributes for consolidated elements are rule-derived, the import will fail. In all other cases the destination could simply not have the rules to start with (assuming it is the same as the source), but numeric overrides of consolidations are the exception.