More than 230,000 documents in Hippo Repository
Two months ago I blogged about Importing lots of data into Hippo Repository. The import was not just an experiment; it was for a real website, and that website is now up and running!
For performance reasons we put the imported documents only into the www tree. The imported data is not managed through Hippo CMS, so no workflow is necessary. Putting the data into both the preview and live trees would mean twice as many property records, twice as much storage, and a much bigger Lucene index. We were also very strict about which WebDAV properties should be set on the imported data: each property takes a record in the database, which is not much per document, unless you have 230,000 of them. Not all data from the original web service was necessary for the site, and after trimming, the average document size decreased from 4 kB to 500 bytes.
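The property trimming described above can be sketched as a simple whitelist filter applied before import. This is a minimal illustration, not the actual import code; the property names and the `trim_properties` helper are hypothetical:

```python
# Hypothetical whitelist of the only WebDAV properties the site needs.
# Every property kept becomes a record in the database and an entry in
# the Lucene index, so a strict whitelist cuts both storage and index size.
ALLOWED_PROPERTIES = {"title", "date", "author", "body"}

def trim_properties(document: dict) -> dict:
    """Return a copy of the document containing only whitelisted properties."""
    return {key: value for key, value in document.items()
            if key in ALLOWED_PROPERTIES}

# Example document as it might arrive from the original web service
# (property names are illustrative, not from the real feed):
raw = {
    "title": "Example document",
    "date": "2008-01-01",
    "author": "jdoe",
    "body": "...",
    "mime-type": "text/xml",        # dropped: not needed by the site
    "source-system": "webservice",  # dropped: import bookkeeping only
}

trimmed = trim_properties(raw)
print(sorted(trimmed))
```

With 230,000 documents, dropping even two unneeded properties per document saves nearly half a million database records.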
Curious about the 230,000 documents?