Hi All,
I have a report program which uses OPNQRYF to select records for a report based on user selections on a display screen, but it is taking a lot of time (20 to 30 minutes) to select the records and write them to the output file.
I am accessing a keyed logical file built over a physical file that contains 180 million records.
Can anyone please suggest ideas to reduce the time taken to select records from this huge input file when generating the output extract?
Your help is greatly appreciated.

Answer Wiki

It’s almost impossible to troubleshoot something like this in a vacuum… to give a good answer we would need to know the layout of the table(s) involved and the indexes, for starters. But there are a couple of things to consider.

Like @TomLiotta said, stop using OPNQRYF. It’s old and it does not take advantage of the SQL Query Engine (SQE) on the system; there are just too many benefits of using the SQE to cover them all here.
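As a rough sketch only (the library, file, and column names below are invented for illustration; substitute the real ones), the OPNQRYF selection could be replaced by an SQL statement run through embedded SQL or RUNSQLSTM, which lets the SQE choose an index:

```sql
-- Hypothetical replacement for the OPNQRYF selection.
-- SALESLIB/MASTERP and the column names are placeholders.
SELECT m.period, m.country, m.amount
  FROM saleslib.masterp m
  WHERE m.period  = :scnPeriod    -- host variables holding the
    AND m.country = :scnCountry   -- display-screen selections
  ORDER BY m.period, m.country;
```

With an index over the selection columns in place, the SQE can resolve a query like this with an index probe instead of a full scan of the 180 million rows.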


Discuss This Question: 15 Replies


Can anyone please suggest any ideas to reduce the time...
The thought that comes immediately to mind is to stop using OPNQRYF for this process; use SQL. And this seems much too large to be creating access paths at run time.
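To illustrate that last point (table and column names here are hypothetical), a permanent index over the selection columns means the access path is built once and maintained by the system, rather than being created on every run:

```sql
-- Hypothetical: build the access path once, not at run time.
-- Keyed on the two known selection columns.
CREATE INDEX saleslib.masterp_ix1
  ON saleslib.masterp (period, country);
```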
Also, if you already have a keyed logical file for access, why would you use OPNQRYF in the first place? If you can explain that part, it might be easier to come up with a full suggestion.
Tom

Just curious as to what parameters they are searching on. We had a similar situation with a large file. There were a lot of common parameters being used, so we created a few new logical files with keys based on their search fields. You can even add select/omit criteria to streamline it further. What took just over 10 minutes is down to about 10 seconds. Just a suggestion.

Hi Tom and others,
Thanks for your suggestions.
Here is the scenario in my program:
The user is allowed to select the period, country, and 3 more fields on the screen to pull the report.
In our existing program, OPNQRYF selects the records from the master file (a keyed logical file), joining other files as well, based on the screen selections. We then read the resulting records in a do-while loop, apply some more business conditions to filter them, write the final output to a file, create a CSV-formatted file, and send it to the user as an attachment.
The program is taking its time selecting the records at the OPNQRYF. I am not allowed to change the entire structure of the program, since it is a very important report, so I am thinking of putting in a small patch to reduce the time taken to select the records.

I am not allowed to change the entire structure of the program, since it is a very important report.
If you can't change the programming, then you probably can't change anything that will make much difference. Tell your manager that nothing can be done as long as the programming can't be fixed.
What are the other three fields that a user can enter?
Tom

What are the other 3 fields and business conditions being applied? It may be that that is where the bottleneck in processing the data is. Having never worked with a file that large myself, I am curious as to the scope of the file. For example, our sales detail file has just under 5 million records going back to 1998, for a company that does approximately $265 million a year.

The period and country fields might be good candidates for EVIs. The other three fields are total unknowns, but any one of them might also be an EVI candidate. EVIs are potential ways to significantly speed up the process.
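An EVI is created with ordinary SQL DDL. As a sketch with invented names, one EVI per low-cardinality selection column:

```sql
-- Hypothetical EVIs over the two known selection columns.
-- EVIs suit columns with relatively few distinct values,
-- such as country codes and fiscal periods.
CREATE ENCODED VECTOR INDEX saleslib.masterp_evi_country
  ON saleslib.masterp (country);
CREATE ENCODED VECTOR INDEX saleslib.masterp_evi_period
  ON saleslib.masterp (period);
```

The optimizer can combine EVI bitmaps across several selection columns, which is why each of the other three fields might also be worth one if its distinct-value count is low.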
Of course, it's not at all clear how OPNQRYF fits with "accessing keyed logical file" unless the LF is effectively being created by the OPNQRYF. An existing LF wouldn't seem to be useful at all except potentially for the initial compile. Without seeing any programming, it's pure guesswork.
As for the file size, I've seen much larger files for businesses smaller than $265M/yr. There are many reasons for larger files.
Tom

One more idea: create a logical file with multiple members and arrange the data so that you can select the correct member based on the user's input, then read that member and filter the records.
E.g., if the data is arranged by period and country, you can open the member that corresponds to the requested country and period, then filter on the remaining 3 fields. Please suggest whether this would work.

The fact that it's a keyed file doesn't help when the selection is not by the key, or is a range or list. You may need to build this CL using different logical files for different selection rules so the query engine doesn't have to search the entire data set.

Are you using OPNQRYF directly to read and write to a printer file, or CPYFRMQRYF? The use of OPNQRYF can be optimized in various ways, and it is particularly useful in programs because it allows you to use the normal file definitions. The first thing to do is reduce the number of fields to what is really necessary, using a proper format; then use logical files only if their arrangement corresponds with the OPNQRYF, otherwise use the physical file.

The use of OPNQRYF can be optimized...
That's true.
The use of plutonium for generating heat can also be optimized, but that doesn't mean that a block of plutonium makes a good heater for my living room.
So far, the OP hasn't told us anything useful about the table structure, nor answered questions about the other three values. We don't know what the records look like. We haven't even seen the OPNQRYF command yet.
The target program has a trivial description and sounds easy. It almost seems like it could be replaced by a single Qshell db2 statement. It's hard to tell how it could be so difficult to change.
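For illustration only (names invented, since we haven't seen the real files), such a statement might be a single query that does the selection, join, and extract in one pass:

```sql
-- Hypothetical one-statement version of the extract, runnable
-- via Qshell's db2 utility or an interactive SQL session.
SELECT m.period, m.country, m.cust, m.amount
  FROM saleslib.masterp m
  JOIN saleslib.custp  c ON c.cust = m.cust
  WHERE m.period = '202401'
    AND m.country = 'US';
```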
As far as we know, the OPNQRYF is already optimized to maximum possible degree. Maybe "20-30 minutes" is the best possible time on that system for that table.
Tom
