Tips, Tricks, and Techniques

Last update: 30 April 2004

Superdescriptor and the last record

Question

When doing a READ by superdescriptor using FROM/THRU, you cannot reference the last record, even though it is within the FROM/THRU range. You can see the record when it initially comes in, but not later on. I know that last sentence is a bit confusing, so let me show you a real-life example of what I am talking about:
This example is a short, cut-and-pasted test program that simplifies a real program, so don't pick at the details (e.g. unreferenced fields, etc.). Thanks.

because 19970825 is before the requested end range of 19970923. And in fact it showed the record just fine while within the READ loop, but not outside of the loop, and not if I used AT END OF DATA within the loop.

To make it work properly I removed the THRU clause and did the test myself. This is a bit spooky; we use READ FROM/THRU all over the place!!!

Is there a known problem with THRU or ENDING AT with READ on superdescriptors? I searched QUEST and found nothing.

We are running Natural 2.2.7.
This is not a problem, a bug, or an error. As you probably know, the READ statement traverses the inverted list built for that superdescriptor starting at the first value. When the value in the THRU clause is reached or exceeded, the loop is escaped and the record(s) for the ISNs associated with that value are never actually read. Think about it: if there were no entry in the inverted list for the exact THRU value, you wouldn't want the records for the first value you DID come across that exceeded your end value to be read either.

Because the THRU value is reached and the loop is escaped, the previously read record is the one still contained in your view STUDENT-ENROLLMENT.

Perhaps you could make the #STDT-WSU-NUM-EE value be the next student number and leave #STDTENRM-EFCTV-BGN-DATE8-EE set to zero. Or you could recode the program to use ESCAPE BOTTOM logic to break out of the loop and not use the THRU clause.
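The ESCAPE BOTTOM alternative might look something like the sketch below. The view and field names are taken from the thread; the superdescriptor name and the #-variables are assumptions, not the poster's actual code.

```
* Sketch of the ESCAPE BOTTOM approach (no THRU clause).
* STDT-SUPER-DE, #FROM-VALUE and #THRU-DATE are hypothetical names.
READ STUDENT-ENROLLMENT BY STDT-SUPER-DE
  STARTING FROM #FROM-VALUE
  IF STDTENRM-EFCTV-BGN-DATE8 GT #THRU-DATE  /* our own end-of-range test
    ESCAPE BOTTOM          /* note: the view now holds the first record
  END-IF                   /* BEYOND the range, not the last one in it
  /* ... process the in-range record here ...
END-READ
```

The trade-off: you control exactly when the loop ends, but after ESCAPE BOTTOM the view still contains the out-of-range record that triggered the test, so you must save any in-range fields you need before escaping.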

Other (unsolicited) recommendations: use (D) format fields for dates instead of alphas or numerics. Also, if you usually need to get to the most current record chronologically, add a date-complement field (N8, with value equal to 99999999 - the YYYYMMDD of STDTENRM-EFCTV-BGN-DATE8) that you can use in your superdescriptor(s) in place of STDTENRM-EFCTV-BGN-DATE8. That way you will only have to read one record to get the latest one, instead of all of them for a given student (or whatever).
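The date-complement arithmetic described above can be sketched as follows (field names invented for illustration):

```
* Date-complement sketch -- #-field names are hypothetical.
DEFINE DATA LOCAL
1 #BGN-DATE8   (N8)      /* e.g. 19970825
1 #DATE-COMPL  (N8)
END-DEFINE
COMPUTE #DATE-COMPL = 99999999 - #BGN-DATE8
* 19970825 becomes 80029174, so a LATER date yields a SMALLER
* complement, and an ascending READ on the superdescriptor
* returns the newest record first.
END
```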
Brian Johnson
Cutler-Hammer
Pittsburgh, PA

++++++
Thanks folks, but I still have an issue here. I obviously didn't make my point.

First let me say I *KNOW* that Natural is doing the checking, not ADABAS, and that it actually reads one more record beyond the THRU value.

Second let me apologize for the misleading output. I was trying to disguise the student's id by over-typing it with 12345678. And I missed replacing it on the last line, where I said "IT SHOULD BE:...". That typo made it look like what I expected was greater than my requested ending value, when in fact it was not.

I would like to respond to Skip's point, when he said: The problem you are running into is that NATURAL keeps calling ADABAS for the "next" record until NATURAL gets the first record that is GREATER THAN the THRU value. NATURAL then exits the loop. And, as you found out, the data area has that last record in it.

In fact I never saw values greater than what I asked for in the THRU clause. That is OK. It is as it SHOULD be.

What I saw was that Natural showed me the records I wanted as long as I was within the loop, but when I left the loop and tried looking at the same variables, they were one *behind* where they should be. They showed me the SUMM (SUMMER) session, not the FAL1 (FALL) session.

So far, so good... but...then...
here is what Natural thinks is the last record read when I am *outside* the loop:

FOUND L 0002323500 12345678 19970602 1997 SUMM

What happened to the FAL1 record? It *IS* within the from/thru range. The last time I checked, 19970825 is less than 19970923. And it *was* available all the time I was in the loop! To me that is a bug. Either give it to me or don't!! Don't give it to me and then take it away!

Oh, and Skip, I DON'T WANT the data after the loop. I only want what I asked for in the from/thru range. Thanks for the suggestion anyway.

Skip had said:
One solution you could use to get at the data after the loop is to code something like the following in the READ loop:
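Skip's code sample did not survive in this copy of the thread. The pattern usually suggested is to copy the view fields into working-storage variables on each pass, so the last in-range record is still available after the loop exits. A sketch (the #SAVE-* names and SESSION-CODE are hypothetical):

```
* Save each in-range record as it is read; after the loop the
* #SAVE-* fields hold the last record that was actually wanted.
READ STUDENT-ENROLLMENT BY STDT-SUPER-DE
  STARTING FROM #FROM-VALUE THRU #THRU-VALUE
  MOVE STDTENRM-EFCTV-BGN-DATE8 TO #SAVE-BGN-DATE
  MOVE SESSION-CODE             TO #SAVE-SESSION   /* hypothetical field
END-READ
WRITE 'Last in-range record:' #SAVE-BGN-DATE #SAVE-SESSION
```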

After all the comments on this I would ask two more questions.
1) Did you get the same results if you qualified the IF *COUNTER > 0 statement with the READ loop's line-reference number?
2) Is the last record displayed in the READ loop the last value for the superdescriptor (i.e., did the READ reach end of file)?
Phil Hansens
Database Administrator (DBA)
(517) 636-9480

++++++

It is possible to use READ FROM/THRU with a super-d. All you need to do is include the super in your LOCAL or DEFINE DATA. Once this is done you can use the THRU clause.

++++++

Steve wrote:
Strange indeed. As noted by some of the responses, Natural should be 2
records ahead of where you seem to be. Not only should

L 0002323500 12345678 19970825 1997 FAL1

have been read, the next record that caused you to escape
the READ loop should have been read also.
When you say "here is what Natural thinks is the last record read
when I am *outside* the loop:" do you mean you are referencing
variables like MYVIEW.YEAR outside the loop?

This would really be strange then since they must have the
first non valid value, otherwise you would not have escaped the loop.

Yes, Steve, that is exactly what happened.
The variables were referenced with VIEWNAME.FIELDNAME outside of the
loop and they contained the previous record!
Someone guessed that perhaps the FAL1 record was end-of-file.
So, maybe it never made it to the local data area.
I also wondered about that.
I will try to get some time to check that possibility.

++++++
Thanks to all for testing that FROM/THRU stuff for me.
Our problem must be something to do with an end-of-file condition,
or a corrupted file where associator does not match data.
It seems to be localized to this file and record.
I tried it on other files and it works OK.
It certainly had me going for a while though.
Since this is a test record on a test file,
I don't think I'll pursue it too much more... other than asking the
DBA to run some file-verification utilities on it.
Although it is rare, it wouldn't be the first time we've had a corrupted file.

= = = = = =
I was alarmed by Darrell's report, so wrote the following simple program to
test this against the EMPLOYEES file. The result seems to substantiate that
following the read loop, the record greater than the THRU is the one
available within the program -- which makes the data or file Darrell is
dealing with suspect? This was with Natl 2.2.8.
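The test program itself was not preserved with this message; something along these lines against the standard EMPLOYEES demo file would exercise the same behavior (the range values here are chosen arbitrarily):

```
* Sketch of a FROM/THRU-style test against the EMPLOYEES demo file.
DEFINE DATA LOCAL
1 EMP VIEW OF EMPLOYEES
  2 NAME (A20)
END-DEFINE
READ EMP BY NAME STARTING FROM 'ADAM' ENDING AT 'BAKER'
  DISPLAY NAME
END-READ
* After the loop, NAME holds the first value GREATER than the
* ENDING AT value -- the behavior described in the replies above.
WRITE 'After loop:' NAME
END
```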