Google/FTS search by Michael Coles in .net 3.5

I am having a problem getting a .dll to work properly with the newest Irony dll.

I am using Michael Coles's SearchGrammar and ConvertQuery functions and the newest Irony DLL. I am using C# in Visual Studio 2008 Professional SP1. When done, this will run on .NET 3.5. There is one function which accepts a Google search string and returns the FTS equivalent as a string.

I have a test app in my C# project and it is returning: "Error in string literal [Phrase]: No start/end symbols specified.".

Ah, you're talking about the source attached to the article? You should go to the "Source code" page on this site and download the latest source zip. I just checked - the search grammar is there. You can see it without downloading by simply browsing the sources in the changeset.

Correction - I now understand the source of the trouble: you were trying to use not the source from the article but the alpha version on the Downloads page on this site. That doesn't work either; use the latest from the Source code page.

For now, without running the code, I see one problem: you don't check for errors after calling the Parse method, and this might be your problem. The parser does not throw an exception when it sees a syntax error; it tries to recover and parse further to uncover all errors - that's the behavior you see in the C# compiler, for example. So even if the Parse method finishes without an exception, you should first check for errors; if there were errors, the root node might not have been created at all, and that's why it blows up later in ConvertQuery.
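A minimal sketch of that check, assuming recent Irony sources (member names such as HasErrors() and ParserMessages vary between Irony versions, and the exact ConvertQuery signature here is an assumption):

```csharp
using System;
using Irony.Parsing;

public static string TryConvert(string googleQuery)
{
    var parser = new Parser(new SearchGrammar());
    ParseTree tree = parser.Parse(googleQuery);

    // Parse() does not throw on syntax errors; inspect the tree instead.
    if (tree.HasErrors())
    {
        foreach (var msg in tree.ParserMessages)
            Console.WriteLine(msg.Message);
        return null; // tree.Root may be null - don't pass it to ConvertQuery
    }

    return SearchGrammar.ConvertQuery(tree); // safe: the root node exists
}
```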

See if that helps; if it doesn't, I will try to run it later tonight - right now I can't.

To avoid rebuilding the parser data (LanguageData) on each request in a web app, you should build it once and save it in a server-wide cache, probably the Application object or whatever is there for shared objects. Initial data construction takes much more time (tens of milliseconds) than actually parsing and converting a query (microseconds). Language data is immutable (well, almost - there are some mutable but thread-safe parts), so it is safe to share it between multiple parsers running on different threads.
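The static holder could look roughly like this - a sketch only; the field and property names are my assumptions, not code from the article:

```csharp
using Irony.Parsing;

public partial class SearchGrammar : Grammar
{
    // volatile so the double-checked read below is safe on all memory models
    private static volatile LanguageData _languageData;
    private static readonly object _lock = new object();

    public static LanguageData LanguageInstance
    {
        get
        {
            if (_languageData == null)          // first check: skip lock on fast path
            {
                lock (_lock)
                {
                    if (_languageData == null)  // second check: only one thread builds it
                        _languageData = new LanguageData(new SearchGrammar());
                }
            }
            return _languageData;
        }
    }
}
```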

(This double-checking for null and lock in between is a standard "thread-safe singleton pattern")

Then you can create a Parser on each request:

var parser = new Parser(SearchGrammar.LanguageInstance);

In this case you will have a shared instance of the language data per app domain. ASP.NET may create several domains on one server (possibly - not sure), but this small duplication is OK, I think. The main point is that ASP.NET keeps a domain alive across multiple requests, so static data remains intact and the LanguageData can be reused when a request is processed in the same domain.
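Putting it together, a request handler on the search page might look like this (hypothetical control names; the HasErrors() and ConvertQuery members are assumed, as above):

```csharp
using Irony.Parsing;

protected void btnSearch_Click(object sender, System.EventArgs e)
{
    // Cheap per-request object: reuses the shared, already-built LanguageData.
    var parser = new Parser(SearchGrammar.LanguageInstance);
    ParseTree tree = parser.Parse(txtQuery.Text);

    if (tree.HasErrors())
        litResult.Text = "Invalid query";
    else
        litResult.Text = SearchGrammar.ConvertQuery(tree);
}
```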


===============================

I have been wrestling with signed DLLs and global assemblies, and it seems there is an easier method. Could you point me in the right direction as to how to actually get this into the web server cache? I have been up and down many a path and am now clear that I don't know how to get this into the web server cache.

I don't understand what the problem is here. This web-server-cache business is a performance optimization, no more. It will still work without all of it, just with tiny extra delays (a few milliseconds). Can you run it as is on a web page?! First make it run without optimizations, then move to the static field for LanguageData I described.
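In other words, the unoptimized version is just a method like this - everything rebuilt per call, correctness identical, only slower by those tens of milliseconds (again, the Irony/ConvertQuery member names here are assumptions):

```csharp
using System;
using Irony.Parsing;

public static string GoogleToFts(string googleQuery)
{
    // Rebuilds LanguageData on every call - wasteful but perfectly correct.
    var parser = new Parser(new SearchGrammar());
    ParseTree tree = parser.Parse(googleQuery);

    if (tree.HasErrors())
        throw new ArgumentException("Query has syntax errors", "googleQuery");

    return SearchGrammar.ConvertQuery(tree);
}
```

Get this working on the page first; swapping in the cached LanguageData later is a one-line change to the Parser constructor call.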