> Library search algorithms, by contrast, especially when harnessed to open
> source search engines like Lucene/Solr, can be verified by others for
> accuracy and objectivity.
>
Not really.
You need to preprocess/normalize the data before indexing it, which makes
a huge difference in the results. These engines also let you assign more
or less weight to different fields, either at ingest or during the search
itself, and altering even one weighting factor by a fraction of a point
can dramatically reorder the results.
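For a concrete example, here is roughly what query-time weighting looks
like in the classic (pre-6.x) Lucene API. The field names and boost
values are invented for illustration; only the API calls are Lucene's:

  import org.apache.lucene.index.Term;
  import org.apache.lucene.search.BooleanClause;
  import org.apache.lucene.search.BooleanQuery;
  import org.apache.lucene.search.TermQuery;

  public class BoostDemo {
      public static void main(String[] args) {
          // Query-time weighting: matches in "title" count four times
          // as much as matches in "body". Both numbers are hand-picked.
          TermQuery title = new TermQuery(new Term("title", "jazz"));
          title.setBoost(2.0f);
          TermQuery body = new TermQuery(new Term("body", "jazz"));
          body.setBoost(0.5f);

          BooleanQuery query = new BooleanQuery();
          query.add(title, BooleanClause.Occur.SHOULD);
          query.add(body, BooleanClause.Occur.SHOULD);
          System.out.println(query); // title:jazz^2.0 body:jazz^0.5
      }
  }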
Setting these values is more art than science, and the actual numbers are
meaningless outside the specific application at hand, since they
ultimately rest on what you know about the resources and the people using
them.
This means the values can fairly be called arbitrary: when you don't get
the results you want, you tweak them until you do.
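To see how shallow that tuning loop is, consider that in Solr's dismax
handler the weights are just a request parameter. A hypothetical query
(field names and numbers made up; only the syntax is Solr's) might be:

  http://localhost:8983/solr/select?defType=dismax&q=jazz&qf=title^2.0+subject^1.2+body^0.4

Nudge title from ^2.0 to ^2.3, rerun, eyeball the results, repeat.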
Since things constantly change, the fine-tuning process never ends.
There's no practical way to make it objective.
kyle