
I have figured out the problem: I used the same output for every input, and ES collapses the suggestions in that case.

I am still trying to improve performance. I am testing on a 64 GB RAM, 24-core server (32 GB allocated to ES 1.0.1). With only 2 records, a suggest request still takes 3 ms.

On Sunday, April 13, 2014 4:53:21 PM UTC+7, kidkid wrote:

There is something really strange going on.
I don't know whether anyone has worked with this feature, or whether it is just unstable.
If we index the same input with different output and payload values, only one result is found.
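The scenario from the question can be written out as plain data. This is only an illustrative sketch: the `title_suggest` field name and document shapes are assumptions, not from the thread. As the later posts clarify, the collapse happens because the two documents share the same output:

```python
# Two documents that share the same completion "input" -- the scenario
# from the question above. The field name "title_suggest" is hypothetical.
doc_a = {
    "title_suggest": {
        "input": ["elasticsearch"],
        "output": "Elasticsearch Guide",
        "payload": {"id": 1},
    }
}
doc_b = {
    "title_suggest": {
        "input": ["elasticsearch"],
        "output": "Elasticsearch Guide",   # same output as doc_a
        "payload": {"id": 2},              # different payload
    }
}

# With identical outputs, the suggester collapses both documents into a
# single suggestion, so only one result comes back for the shared input.
outputs = {doc_a["title_suggest"]["output"], doc_b["title_suggest"]["output"]}
print(len(outputs))  # 1 -> only one suggestion surfaces
```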



Hey Alexander,
Thanks for your reply.
Currently I also run optimize manually:
host:9200/completion_index/_optimize?max_num_segments=1
I think this is only a workaround; I would like to find something better.

Sometimes I also have a problem with updating the payload: when I change the payload, the completion does not change either.
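For reference, the manual workaround above can be wrapped in a tiny helper. The host and index names come from the post; the helper itself is hypothetical and just builds the ES 1.x optimize URL:

```python
# Hypothetical helper building the ES 1.x optimize request URL used as
# the workaround above. It only constructs the URL; issuing the HTTP
# request (e.g. via urllib) is left out.
def optimize_url(host, index, max_num_segments=1):
    return "http://%s/%s/_optimize?max_num_segments=%d" % (
        host, index, max_num_segments)

url = optimize_url("host:9200", "completion_index")
print(url)  # http://host:9200/completion_index/_optimize?max_num_segments=1
```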

On Monday, April 21, 2014 7:26:13 PM UTC+7, Alexander Reelsen wrote:

Hey,

The output is used to unify the search results; otherwise the input is used. The payload itself is just meta information.
The main reason you see the suggestion twice is that even though a document is deleted and can no longer be found, the suggest data structures are only cleaned up during merges/optimizations. Running optimize should fix this.
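The unification rule described here can be modeled in a few lines. This is a toy model of the behaviour as stated above, not Elasticsearch internals: suggestions are keyed by output when present, otherwise by input, and the first-entry-wins choice for the payload is an assumption of the sketch:

```python
def unify_suggestions(entries):
    """Collapse suggestion entries the way described above: keyed by
    "output" when present, otherwise by "input". Toy model only."""
    seen, unified = set(), []
    for entry in entries:
        key = entry.get("output") or entry["input"]
        if key not in seen:
            seen.add(key)
            unified.append(entry)  # first entry per key wins (assumption)
    return unified

entries = [
    {"input": "es", "output": "Elasticsearch", "payload": {"id": 1}},
    {"input": "es", "output": "Elasticsearch", "payload": {"id": 2}},
    {"input": "es", "output": "ElasticHQ",     "payload": {"id": 3}},
]
print(len(unify_suggestions(entries)))  # 2 -> only distinct outputs survive
```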





Hi Alexander Reelsen,
Just try my example above: when you update the payload or reindex (same doc id, same input, different output or payload), the completion behaves erratically.
Sometimes it suggests the updated document, sometimes the old one.

Currently I need to run _optimize?... to refresh the FST. I don't know whether there is a better way to refresh it.

I run 5-10 updates per second, and I think that is fine with
"max_num_segments=1".
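The flaky behaviour described here can be mimicked with a toy model of the suggest FST. This is purely illustrative, not ES code: an update appends a new entry without removing the old one, so both versions answer lookups until a merge cleans things up:

```python
# Toy model of the stale-suggestion symptom: after a payload update, the
# old version of the document stays in the suggest structures until
# segments are merged, so lookups may return either version.
fst = []  # (doc_id, payload) entries in indexing order

def index(doc_id, payload):
    fst.append((doc_id, payload))  # the old entry is NOT removed

def merge():
    # Keep only the latest entry per doc id -- what optimize achieves.
    latest = {doc_id: payload for doc_id, payload in fst}
    fst[:] = list(latest.items())

index(1, {"rank": "old"})
index(1, {"rank": "new"})  # update: same id, new payload
print(len(fst))  # 2 -> both versions can answer suggest lookups
merge()
print(len(fst))  # 1 -> only the updated payload remains
```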


If I may, I have a follow-up question to your response here. How does the completion suggester handle fields such as payload and score when it unifies the response based on output? Are scores combined? If the payloads differ, which one is returned?

Thanks for your help!

Alistair


I still have the same problem with the completion suggester returning duplicates of old and updated data on ES 1.4.x.
The only thing that has fixed it so far is _optimize?max_num_segments=1, which I assume has performance and possibly other impacts.

Is there another solution besides _optimize?max_num_segments=1?

Thx
Tom

On Monday, 28 April 2014 11:21:49 UTC+2, Kaspars Sprogis wrote:

Hi,

I have exactly the same problem.
I resolved duplicates and the reappearance of deleted items by running
"_optimize?only_expunge_deletes=true" on a daily basis.

However, I still have a problem with updates: even after items in the index have been updated, searches still return the old data.
The only solution I have found is running "_optimize?max_num_segments=1".

However, I have quite a lot of updates, and this worries me because of Alexander's note:

if you do regular updates on that index, you should not optimize down to one single segment

Alexander Reelsen, could you please advise why we shouldn't do that? Does it cause permanent damage to the merge logic, or does it affect search performance later?
What should we do instead?
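A toy cost model hints at why the two optimize variants behave so differently under frequent updates. The segment sizes below are invented, and this only captures the rough intuition that expunging deletes rewrites just the segments containing deletions, while merging down to one segment rewrites everything:

```python
# Invented segment layout: size in MB and number of deleted docs each.
segments = [{"size_mb": 500, "deletes": 0},
            {"size_mb": 200, "deletes": 120},
            {"size_mb": 50,  "deletes": 0}]

def rewritten_mb(segments, only_expunge_deletes):
    """Rough estimate of how much data an optimize call rewrites."""
    if only_expunge_deletes:
        # only_expunge_deletes=true: touch segments holding deletions
        return sum(s["size_mb"] for s in segments if s["deletes"] > 0)
    # max_num_segments=1: merge everything into a single segment
    return sum(s["size_mb"] for s in segments)

print(rewritten_mb(segments, only_expunge_deletes=True))   # 200
print(rewritten_mb(segments, only_expunge_deletes=False))  # 750
```

Under regular updates, forcing a single segment means repeatedly paying the full-rewrite cost, which is presumably the concern behind Alexander's note.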

