It's time to increase the number of caches in each Pocket Query

As the number of caches grows, the limit of 1,000 caches in each Pocket Query is too low. Sometimes more than 1,000 caches are published in a limited area in one day, which makes it very difficult to monitor and update with today's Pocket Queries. A limit of, let us say, 5,000 caches would be more appropriate.

As the number of caches grows, the limit of 1,000 caches in each Pocket Query is too low. Sometimes more than 1,000 caches are published in a limited area in one day, which makes it very difficult to monitor and update with today's Pocket Queries. A limit of, let us say, 5,000 caches would be more appropriate.

You may run up to 10 Pocket Queries in every 24-hour period, each returning up to 1,000 geocaches.

Since this is a feature request, you might want to have the topic moved to the Website Subforum.

Just looking at the math...

You're allowed 10 queries a day, with 1,000 Listings per query. You can store 40 queries, which you could cycle through over a 4-day period. So in essence, you can gather 40,000 Listings over 4 days.

[...] You can store 40 queries which you could cycle through in a 4 day period.[...]

That's wrong. From the PQ site:

You can create up to 1000 queries. [...]

Anyway. Instead of having to create several queries, with a lot of time wasted on adjusting or other splitting methods, it would be much smarter to have one query that fills the Garmin unit. The 500/1000 limits come historically from the limits of the early Garmin units (III, 60, 76), which were first 500 caches, then 1000. Today, 5000 is the standard GPX limit on newer units. The PQ limits should reflect that. Let the user decide whether a PQ should be a 500/1000 PQ or an XL PQ.

It is not about the daily limits or what you can find. It is about getting more in one PQ. Keep the daily limits in place and increase the PQ limit. Let's say you want to visit a large city, are not sure where you will go, and want to load up the whole city. A 1000-cache PQ does not do it, and you need to set up complex time-based PQs, which takes a lot of time and effort. Really, it would just be so handy to be able to grab a whole state. Just set daily, weekly or monthly limits to prevent abuse.

I was happy when they raised the number of caches in a PQ from 500 to 1000, and that is all I need! I'm not sure that PQs are high on the GS planning list; I think we're lucky we still have them. Yes, I run 29 PQs, which cover all of NJ and anything else within 65 miles. (Though I really wish I could exclude counties in the PQs; I doubt that I'll ever go back to Long Island.) Using GSAK, I refine it to the areas where I may search this weekend. I added Carbon County, PA for this weekend, though I seldom go there. Raising the PQ limit to 5000 caches might save me three minutes a week? Not worth the effort. Load 3000 caches into Gupy, and I'm set!

Nope. I'm happy with the way it works now.

It is not about the daily limits or what you can find. It is about getting more in one PQ. Keep the daily limits in place and increase the PQ limit. Let's say you want to visit a large city, are not sure where you will go, and want to load up the whole city. A 1000-cache PQ does not do it, and you need to set up complex time-based PQs, which takes a lot of time and effort. Really, it would just be so handy to be able to grab a whole state. Just set daily, weekly or monthly limits to prevent abuse.

We used to have raging battles over the number of PQs per day and the number of caches in a PQ. Since the advent of smartphone caching, those battles are long gone. Perhaps the OP should consider a smartphone instead of an offline database that is never current. If you're caching in areas where there is no cell coverage, then the current limits are more than adequate, except maybe for a few power trails or geo-art, and in that case a couple of days' worth is enough; they generally won't be out of date, and the logs, for the most part, are useless.

The other problem with increasing the PQ size limit is that you would then also have to increase the bookmark list cache limit to match the PQ size.

For the stated problem of a city with lots of caches and not knowing where you're going, a smartphone is the answer, not gigabytes of offline database.

My home PQ is set for within 11 miles. It consistently generates around 700 caches, and has done so for the 8 years I've lived in this area. I don't know what value a PQ with well more than 1000 caches would have for me.

My home PQ is set for within 11 miles. It consistently generates around 700 caches, and has done so for the 8 years I've lived in this area. I don't know what value a PQ with well more than 1000 caches would have for me.

I'm somewhat saddened by the number of "I can't see what use it would be for me, therefore it can't be any use for anybody" responses this request gets.

My home PQ is set for within 11 miles. Consistently generates around 700 caches, and has done so for the 8 years I've lived in this area. I don't know what the value a PQ with well more than 1000 caches would would be for me.

I'm somewhat saddened by the number of "I can't see what use it would be for me, therefore it can't be any use for anybody" responses this request gets

Sorry to hear it. However, the OP did not provide a very convincing reason that there would be any substantial benefit. By the time the vast majority of users (or the "anybody" you mentioned) would exhaust a 5,000-Listing Query, the data would be so outdated as to be irrelevant (i.e. due to disables, archivals, coordinate updates and so forth). I'd be interested to hear how someone would utilize such a large Query of Listings in a meaningful manner.

Sorry to hear it. However, the OP did not provide a very convincing reason that there would be any substantial benefit. By the time the vast majority of users (or the "anybody" you mentioned) would exhaust a 5,000-Listing Query, the data would be so outdated as to be irrelevant (i.e. due to disables, archivals, coordinate updates and so forth). I'd be interested to hear how someone would utilize such a large Query of Listings in a meaningful manner.

Here are two examples:

* I don't know where I'll be from one day to the next. I therefore have PQs for a very generous area around my home location. Currently there are 9 of them, covering some 8300 caches. I have to re-split them periodically as old caches disappear. Resplitting two PQs would be much easier than resplitting nine. And I'm not even in a very cache-dense area.

* I'm going on holidays to a new area (with a higher cache density). Again, I could be anywhere +/- 50km over a week (where I may have very limited network coverage). Now I need to create another 8 PQs to cover that area.

Both of these scenarios are real (the former describes my usual situation, and the latter is what I did while on a recent international trip.)

There are people who keep large databases of caches in things like GSAK. I don't, and I don't know what use that is either, but I don't presume that just because I don't have a reason to do this, then nobody has a valid reason.

Do I *need* bigger PQs? No, the current 1000-cache PQs can be made to work. Would bigger PQs cut down my workload? Absolutely.

It's not about exhausting the listing. It's about having a valid snapshot of what's around, in a situation where there are multiple thousands of caches to choose from.

Thanks EngPhil. I completely understand the benefits given the type of situations you described. It's been quite some time since the last update on the issue, and caching has changed quite a bit. I support the idea now.

The other problem with increasing the PQ size limit is that you would then also have to increase the bookmark list cache limit to match the PQ size.

That's a problem? I'd love to see the bookmark size increased. An increased PQ size does me little good, although my PN-60 could accept 1500-entry PQs if the system would generate them. (Unlike the GPSrs the rest of you are talking about, PN-60s look at one PQ at a time, so the larger the PQs, the less often I'd have to switch as I drove between areas.)

[....] However the OP did not provide a very convincing reason that there would be any substantial benefit. [...]

One advantage would be that you could load your unit in one go without needing third-party apps. That's pretty nice when on vacation or traveling for days or weeks without internet/PC availability.

Hans

There are some good insights into caching behaviour here. First, thank you to EngPhil for pointing out that not everyone caches the same way I do. Also, HHL has a very good point about being able to load one's GPS in a single shot without third-party apps. Comments about not being connected and having crazy travel schedules are also appreciated.

There are other metrics that have not been discussed including how far/long one travels to cache and where one is. Several cachers I know are willing to drive 2 hours one way to a day's caching ground. Others are unable to do the same. Cache density and style varies by region of the world. The US seems pretty dense. Southern US and desert US seem to have more power trails and geoart than New England and New York. Australia does not have the same density. This drastically impacts how people cache.

As someone who has a Geomate Jr. and a Garmin 64, it would be lovely to be able to load these with a 250k-cache Pocket Query that excludes my finds and other caches that I don't care about. That said, I don't think that Groundspeak is going to be thrilled with upping the Pocket Query size that much.

On the flip side, as someone who still owns a blue eTrex, upping the PQ size to 2k caches means that I cannot load a 2k Pocket Query directly. While these are not the dominant receivers out there any more, there are still people using them. See HHL's point about third-party apps. This also creates problems for new people joining the hobby with hand-me-down equipment. Also, unless one can download the whole database, there is no real guarantee that everyone will be happy.

A potential balance, similar to what another site does, is to allow a user to download a certain number of caches a day and break them up into chunks of their choosing. So I might want 10 PQs of 1000, or two of 2k and one of 6k. Or maybe I can only choose from 10 1k, 5 2k, 2 5k, or 1 10k download. Maybe I could download 10k caches a day but have 30 PQs that each overlap some.

The hope is to have fun wherever you are and minimize "friction" for all types of caching styles, equipment and locations. This includes both you and that person you just don't understand. That is what keeps it fun for all folks.

A potential balance, similar to what another site does, is to allow a user to download a certain number of caches a day and break them up into chunks of their choosing.

That would be a much better way of doing it, and it has been suggested in the past, but it hasn't been implemented for whatever reason. I saw a response on another thread some time ago which seemed to suggest that they're thinking of making the results of the search page downloadable. If that's the case, they're probably concentrating on that and have put PQ development on ice, so I wouldn't hold my breath waiting for anything to change on the PQ front.

Also, unless one can download the whole database, there is no real guarantee that everyone will be happy.

Well, some people are never happy, but it strikes me there are reasonable trade-offs available in this case. You actually do get the whole database if you're online with a smartphone, so the problem can be solved that way except when you go out of service. When you're going out of service, you have to revert to pre-planning and loading some specified set of caches, but since that will be a limited number, the existing PQ limit can handle it.

A potential balance, similar to what another site does, is to allow a user to download a certain number of caches a day and break them up into chunks of their choosing. So I might want 10 PQs of 1000, or two of 2k and one of 6k. Or maybe I can only choose from 10 1k, 5 2k, 2 5k, or 1 10k download. Maybe I could download 10k caches a day but have 30 PQs that each overlap some.

The API already provides downloads limited by number of caches per unit time, so you aren't asking for a new feature; you're just insisting it be implemented in the old, nearly obsolete PQ infrastructure. I don't use GSAK much, but even I can see it's a much better solution to this problem than any reorganization of PQ limits, even before I start to think about the complications this would add to the currently quite simple and obvious PQ user interface.

The hope is to have fun wherever you are and minimize "friction" for all types of caching styles, equipment and locations. This includes both you and that person you just don't understand. That is what keeps it fun for all folks.

The problem here is that I already see everything available, so I can't get worked up because someone can imagine a different way for an existing feature to be provided.

For the stated problem of a city with lots of caches and not knowing where you're going, a smartphone is the answer, not gigabytes of offline database.

As stated, the files are really small for 2016: around 1 kB per cache. A 10,000-cache PQ would be about 10 MB, which is really small and not going to hurt the system. I'm not sure why people feel the need to argue against better features; they are not going to hurt anyone. The limit right now is 10 PQs at 1000 each. Just change it to 10,000 per day and there is no effect on the servers.

A 10,000-cache PQ would be about 10 MB, which is really small and not going to hurt the system.

The limit right now is 10 PQs at 1000 each. Just change it to 10,000 per day and there is no effect on the servers.

The effect would be on processing effort rather than storage (which, arguably, should be less with fewer, larger PQs), but, again, it's 2016, not 2006. I'd fully support something along the lines of "up to 10 PQs per day, with a combined total of up to 10k caches between them."

If (at the extremes of those limits) one PQ of 10k caches uses more resources than ten PQs of 1k caches each, something is seriously broken.

ETA: I suspect the cost here would be in developing the code to check the total number of results against such a quota, rather than the actual ongoing resource usage.
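The "combined total" quota suggested above is cheap to check in code. As a rough sketch (the function and constant names here are hypothetical illustrations, not anything from Groundspeak's actual system):

```python
# Sketch of a combined daily quota: up to MAX_QUERIES_PER_DAY PQs,
# sharing a pool of MAX_CACHES_PER_DAY results between them.
# All names are hypothetical, for illustration only.

MAX_QUERIES_PER_DAY = 10
MAX_CACHES_PER_DAY = 10_000

def can_schedule(pq_sizes_today, requested_size):
    """Return True if a PQ of requested_size fits today's remaining quota."""
    if len(pq_sizes_today) >= MAX_QUERIES_PER_DAY:
        return False  # per-day query-count limit reached
    remaining = MAX_CACHES_PER_DAY - sum(pq_sizes_today)
    return requested_size <= remaining

# One 6k PQ plus two 2k PQs fits the 10k/day pool:
print(can_schedule([6000, 2000], 2000))   # True
# An eleventh query is refused regardless of size:
print(can_schedule([1000] * 10, 1))       # False
```

The point of the sketch is that the check itself is trivial; as noted above, the real cost would be wiring such a quota through the existing PQ scheduler and UI.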

Edited March 8, 2016 by EngPhil

My home PQ is set for within 11 miles. It consistently generates around 700 caches, and has done so for the 8 years I've lived in this area. I don't know what value a PQ with well more than 1000 caches would have for me.

I'm somewhat saddened by the number of "I can't see what use it would be for me, therefore it can't be any use for anybody" responses this request gets.

Did I say that? No, I just said it would be of no value to me. So sorry that expressing one's own thoughts is a problem for you.

I'd also like to see the number of caches in an individual PQ increased beyond the 1,000 limit that stands today. Anyone who spends time on the road caching knows that loading up a GPS unit with areas they are traveling through can be a tedious process, involving many separate PQs and creating all sorts of PQ routes.

While I appreciate the PQ route function, it would also be great to see alternative, simpler ways to create a PQ for downloading. For example, why not allow folks to download an entire county's worth of PQ data in one file? It would be much easier to pick a few counties that your journey travels through, rather than the trial-and-error of creating PQs from overlapping virtual geographic 'circles', trying to cobble together PQ routes, etc.

And for those people fortunate enough to cram their entire area into one PQ file of 628 caches, I'm jealous.

My home PQ is set for within 11 miles. It consistently generates around 700 caches, and has done so for the 8 years I've lived in this area. I don't know what value a PQ with well more than 1000 caches would have for me.

I'm somewhat saddened by the number of "I can't see what use it would be for me, therefore it can't be any use for anybody" responses this request gets.

I was going to say the same thing but I like yours better.

We tend to forget this is a global hobby, and what works for you is inconvenient for someone else in the world.

GC imposes a drastic limit and I think they should be more flexible.

I happen to wish to maintain reasonably up-to-date knowledge of caches within a range of my home location. I am not so much interested in new caches as in new logs and status changes. At the moment this is done with the tried-and-tested date-based PQ strategy; I currently need 18. My life would be a lot easier if I could run fewer, bigger PQs.

I happen to wish to maintain reasonably up-to-date knowledge of caches within a range of my home location. I am not so much interested in new caches as in new logs and status changes. At the moment this is done with the tried-and-tested date-based PQ strategy; I currently need 18. My life would be a lot easier if I could run fewer, bigger PQs.

Dude, that's nuts. You need to get out more.

Oh, wait - that would end up being 'geocaching'. Never mind.

:) Well, they don't all end up on my GPS, but in GSAK. It is then easy for me to output a few around the area I am going to end up in, e.g. a fun event tonight about 20 miles from home. Also, I can output them to other devices, e.g. Sat Nav and MemoryMap, which lets me see interesting areas not too far away.

I'm also in the camp of wanting a larger return on a PQ. I have to run two PQs to adequately cover where I live and where I work. I have to run two more PQs to cover where I might go on the weekends. Accounting for a bit of overlap, this picks up about 3500 caches in a decent area. If I'm planning on going to a nearby city, I have to run another PQ.

Is it unbearable to run four PQs? No. Do I run the PQs more than once a month? Rarely. Would it be more convenient for the PQ to return more caches? Yes. And this is just for a small-market city. I can't imagine how many PQs you would need to cover a larger city like Los Angeles.

I live in a very cache-dense area, and to cover it I need 6 PQs - of course I'll never find them all, but that's beside the point....

Is it? I would prefer that GS spend its development efforts on features that help us find caches, rather than something that only allows us to load waypoints on our GPS for caches that we might not find.

As the number of caches grows, the limit of 1,000 caches in each Pocket Query is too low. Sometimes more than 1,000 caches are published in a limited area in one day, which makes it very difficult to monitor and update with today's Pocket Queries. A limit of, let us say, 5,000 caches would be more appropriate.

You may run up to 10 Pocket Queries in every 24-hour period, each returning up to 1,000 geocaches.

To answer the last question: on Nov 20, 2015, in the desert of Southern California, there were 2026 caches placed within 50 miles of one another - the "Highway to H.E.L.L" series (see GC66HHX as an example). So yes, it would help to have more than 1000 per Pocket Query!

I live in a very cache-dense area, and to cover it I need 6 PQs - of course I'll never find them all, but that's beside the point....

Is it? I would prefer that GS spend its development efforts on features that help us find caches, rather than something that only allows us to load waypoints on our GPS for caches that we might not find.

Actually, this would help me find caches, since it would make it easier to see the ones that are available and interest me.

Is it? I would prefer that GS spend its development efforts on features that help us find caches, rather than something that only allows us to load waypoints on our GPS for caches that we might not find.

Different people cache in different ways. We don't all always go out with a specific list of caches for the day loaded onto the GPS; some of us load up an entire area and then have options while we're in the field. The "you won't find them all so it's pointless" argument is a straw man.

Is it? I would prefer that GS spend its development efforts on features that help us find caches, rather than something that only allows us to load waypoints on our GPS for caches that we might not find.

Different people cache in different ways. We don't all always go out with a specific list of caches for the day loaded onto the GPS; some of us load up an entire area and then have options while we're in the field. The "you won't find them all so it's pointless" argument is a straw man.

Exactly. I work full time and can only really cache one day a week, if that. So yes - I prefer to just get them all on the GPS quickly and not have to worry about whether or not I have the caches I want on my GPS when I'm out in the middle of the woods .... or any place else I might find myself.

I think the bigger problem is a 50 km radius limit. I prepared the data for a tour of the Caribbean and in the end it was more than 40 fields for just about 2000 caches!

Eh? The maximum radius for a PQ is 500 miles, or about 800 km.

[...]

Actually there is no radius limit in PQs when creating them by country or state search.

Hans

But if you wanted to do a search in the Caribbean (multiple small countries very close together), then it would be logical to do it by a centred PQ rather than doing multiple PQs for all the countries.

As the number of caches grows, the limit of 1,000 caches in each Pocket Query is too low. Sometimes more than 1,000 caches are published in a limited area in one day, which makes it very difficult to monitor and update with today's Pocket Queries. A limit of, let us say, 5,000 caches would be more appropriate.

I feel your pain. Power caches are a huge part of the problem with shrinking Pocket Query circles, but trying to get anything done about it is like pulling teeth. You can place power caches in an ignore list and kinda block the person who's placing them, which is pretty ineffective. I started a thread on the subject and am basically made to feel like I'm doing it wrong by having so many Pocket Queries and so many caches in my GPS. If they would classify power caches as a type of cache and add it to the Pocket Query creation page, then a huge part of the problem would be solved. But there seems to be no willingness to do so. Lots of arguments both for and against have been brought up, but all in all it's too hard and they don't know how to write the rules. I personally download 10 Pocket Queries daily, out of 399, to keep my GPS constantly up to date. GSAK has a lot of cool tools that make it easy.

Just recently I went to do some caching near Frederick, Colorado. I have about 255,000+ caches in my GPS. I arrived east of Frederick just to discover there were NO caches in the area I was going to. No phone service either, so I couldn't check the phone. I went home and imported a map with GSAK, only to find the reason was that the DIA area was saturated with power caches. I placed all of those in the ignore list, and suddenly there are caches in the area I wanted to cache in the first place.

In another case, I went to Las Cruces, New Mexico, expecting to see some caches in a certain area near there. I had previously checked the Pocket Query and I knew exactly where I wanted to go. A few weeks later, the query for that area was updated in the regular cycle that I use. When I went to Las Cruces, there were no caches in the GPS in the area where I was going. The reason was a new geo-art and a new power cache, which had caused my query circle to shrink to about half its original size.

I'm not seeing any difficulty with adding a power cache type and some power cache rules, but convincing the powers that be that they are a problem isn't so easy.

Geocaching used to be about going to cool places where you may not have gone if there hadn't been a cache. Now it's about numbers numbers numbers.

As the number of caches grows, the limit of 1,000 caches in each Pocket Query is too low. Sometimes more than 1,000 caches are published in a limited area in one day, which makes it very difficult to monitor and update with today's Pocket Queries. A limit of, let us say, 5,000 caches would be more appropriate.

You may run up to 10 Pocket Queries in every 24-hour period, each returning up to 1,000 geocaches.

To answer the last question: on Nov 20, 2015, in the desert of Southern California, there were 2026 caches placed within 50 miles of one another - the "Highway to H.E.L.L" series (see GC66HHX as an example). So yes, it would help to have more than 1000 per Pocket Query!

I recently put that one in the ignore list. There's another east of there with 1500+ caches.

[....] then it would be logical to do it by a centred PQ rather than doing multiple PQs for all the countries.

It's a lot easier (at least for me) to combine several "countries" into one PQ than fiddling around with several PQs.

Hans

I'm trying out this method. My concern is: does it give all the caches in an entire state, or only a select 1000? If it's only 1000, then its usefulness would be limited. If it's the whole state, then I could delete all the Pocket Queries I previously made, and it would help solve the power cache nightmare. They'll download tomorrow.

[....] then it would be logical to do it by a centred PQ rather than doing multiple PQs for all the countries.

It's a lot easier (at least for me) to combine several "countries" into one PQ than fiddling around with several PQs.

Hans

I'm trying out this method. My concern is: does it give all the caches in an entire state, or only a select 1000? If it's only 1000, then its usefulness would be limited. If it's the whole state, then I could delete all the Pocket Queries I previously made, and it would help solve the power cache nightmare. They'll download tomorrow.

It's limited to 1000 caches, just like any other PQ. I started another database and ran it there. Then I ran the API for those states. It would take forever to populate the map with that method. It's OK if you're limiting yourself to 1000 caches in the GPS, but if you're running 250,000+ caches it's not so good. I imported the states list that was just created to the preload list and only got about 500 new caches. That's because Washington, Oregon and California weren't completed yet in my Pocket Query list. Doing the same thing on a second day gave me 96 new caches. To make the states work, it would need a much higher cache limit. I am keeping the states in my PQ list because they might help fill in some blank areas eventually.

Edited April 19, 2016 by Jake81499

I'm going on vacation later this year. Within 10 miles of where I will be staying, there are 2,700 caches. Within 25 miles there are 8,800 caches. Am I going to find them all while on vacation? No. But in order to have the information, without resorting to Live Mode, I'd have to run a dozen PQs.

I'm actually kind of surprised the following suggestion hasn't been made yet here.....

For dense cache areas, set up PQs to query by Placed Date. Run them (each only needs to run once) starting from May 2000, each covering as long a period as necessary to max out at just under 1000 caches, then start the next PQ on the next day. Continue until you get to the PQ that includes today (for which I just leave the end date as Dec 31). Eventually (after many PQs have run over a few days, due to the max run count) you'll have every single active cache in the region of your choice.

For me, I now have all geocache listings in Ontario, Canada in both my GSAK and Geosphere offline databases. It's much easier to search and filter down to local areas (without online requests) if I'm out searching.

I only need to update the status of nearby caches for archivals and disables, or download recent logs to know their current find state. I can do that with the API or run a smaller direct PQ beforehand.

You leave a buffer for the PQs at just under 1000 so that you can re-run them as far back as you like in order to 'catch' recent publishes that might fall outside the bounds of your current PQ (for example, a cache 'placed' 6 months ago that was just published yesterday might show up 2 PQs back). The buffer allows you to run the PQ without having to adjust the end date to make sure you don't max out, and still get all its caches.
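The date-splitting above can even be automated. As a sketch (assuming you already have per-day publish counts, e.g. exported from GSAK; the `daily_counts` input and all names here are hypothetical), a greedy pass groups consecutive days into ranges that stay under the limit minus a buffer:

```python
from datetime import date

# Greedy grouping of consecutive days into placed-date ranges, each kept
# under the PQ limit minus a safety buffer, as described above.
# daily_counts is assumed input: a list of (date, caches_placed_that_day).

PQ_LIMIT = 1000
BUFFER = 50  # headroom so late publishes with older placed dates still fit

def split_date_ranges(daily_counts, cap=PQ_LIMIT - BUFFER):
    ranges = []              # list of (start_date, end_date, total_caches)
    start, total = None, 0
    for day, count in daily_counts:
        if start is None:
            start, total = day, count          # open the first range
        elif total + count > cap:
            ranges.append((start, prev_day, total))  # close the full range
            start, total = day, count          # open a new one
        else:
            total += count
        prev_day = day
    if start is not None:
        ranges.append((start, prev_day, total))
    return ranges

# Example: a power-trail publish day forces a split at the 950-cache cap.
counts = [(date(2015, 11, 19), 400),
          (date(2015, 11, 20), 600),
          (date(2015, 11, 21), 300)]
for r in split_date_ranges(counts):
    print(r)
```

Each returned tuple is one PQ's placed-date window; re-running the script as counts change is the automated equivalent of the periodic re-splitting mentioned earlier in the thread.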

It's a very convenient strategy. I'm now running at most 2 or 3 PQs on a day I want to go out aimlessly for some caching (and yes that still covers all of Ontario), to ensure I have everything. And I have a non-date-restricted PQ centered on home of unfound caches which I can run to automatically make sure all the statuses are updated.

The date-range PQ strategy is really best if you intend to keep an offline reference of all the caches in a particular area, though it takes a few days to run through the years of geocaching. Up-to-date listing info can still be handled by traditional localized PQs or API queries. And it's great for seeking out target caches more quickly if you're into challenge caching.

(Related to the OP's topic: bumping the PQ limit would have made getting all those caches a little faster by allowing a wider date range for each PQ.)

PS: my Ontario cache group is closing in on 51,000 caches (including archives since I began).