[Koha-devel] MARC record size limit

LAURENT Henri-Damien henridamien.laurent at biblibre.com
Tue Oct 12 18:20:58 CEST 2010


On 12/10/2010 14:48, Thomas Dukleth wrote:
> Reply inline:
> 
> 
> Original Subject:  [Koha-devel] Search Engine Changes : let's get some solr
> 
> On Mon, October 4, 2010 08:10, LAURENT Henri-Damien wrote:
> 
> [...]
> 
>> Zebra is fast and embeds a native Z39.50 server. But it also has some
>> major drawbacks that we have to cope with in our everyday work, making
>> it quite difficult to maintain.
> 
> [...]
> 
>> I think that everyone agrees that we have to refactor C4::Search.
>> Indeed, the query parser is not able to manage all the configuration
>> options independently. And the use of USMARC as the internal format for
>> biblios comes with a serious limitation of 99,999 bytes (the five-digit
>> ISO 2709 record length), which for big biblios with many items is not
>> enough.
> 
> How do MARC limitations on record size relate to Solr indexing, or to
> Zebra indexing, which lacks Solr/Lucene support in the current version?
Koha currently uses the ISO 2709 records returned by Zebra to display
result lists.
The problem is that if Zebra returns only part of the biblio, and/or
MARC::Record is not able to parse the whole record, then the biblio is
not displayed at all. We have biblio records containing more than 1000
items, and MARC::Record/MARC::File::XML fails to parse them.

So this is a real issue.
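For background: ISO 2709 reserves five characters in the leader for the total record length and four characters per directory entry for each field length, so a serialised record cannot exceed 99,999 bytes and no single field can exceed 9,999 bytes. A minimal sketch of the arithmetic for a biblio with embedded item fields (the byte sizes below are hypothetical, not measured from Koha data):

```python
# Sketch of why biblios with very many items break ISO 2709 transport.
# Assumptions: each embedded item field (e.g. a MARC21 952) serialises
# to roughly 150 bytes; the bibliographic part to roughly 2000 bytes.

LEADER_LENGTH_DIGITS = 5                           # leader positions 0-4
MAX_RECORD_BYTES = 10 ** LEADER_LENGTH_DIGITS - 1  # 99999

BIBLIO_BYTES = 2000       # hypothetical size of the bibliographic fields
ITEM_FIELD_BYTES = 150    # hypothetical size of one item field

def fits_in_iso2709(item_count):
    """True if the serialised record still fits in one ISO 2709 record."""
    return BIBLIO_BYTES + item_count * ITEM_FIELD_BYTES <= MAX_RECORD_BYTES

print(fits_in_iso2709(100))    # a modest record fits
print(fits_in_iso2709(1000))   # 1000+ items overflow the 5-digit length
```

With these (invented) sizes, a record crosses the limit somewhere around 650 items, which matches the symptom described above: records with 1000+ items cannot be round-tripped through ISO 2709 intact.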


> 
> How does BibLibre intend to fix the limitation on the size of
> bibliographic records as part of its work on record indexing and
> retrieval in Koha, or in some parallel work?
Solr/Lucene can return stored index fields, and those can be used to
display the desired data. Or we could do the same as we do with Zebra:
	- store the full record (the format could be ISO 2709, MARCXML, or YAML);
	- use that for display.
Or we could use GetBiblio to fetch the data from the database. The
problem then would be that storing XML in the database is not really
optimal for processing.

-- 
Henri-Damien LAURENT

