

2012/2/8 Caolán McNamara <caolanm@redhat.com>:
On Tue, 2012-02-07 at 22:05 +0100, Markus Mohrhard wrote:
But keep in mind that this will not be fast. I needed
around one week to check a bit more than 4000 files

My times are dramatically faster than this for .doc files in writer at
least. I can load 4000 .doc files *under valgrind* overnight in about 10
hours. So apparently mileage varies quite a bit depending on hardware
and file format and debugging level.


That only works if you have no crashes or looping documents. Looping
in particular is a big problem in calc.

And if we want to be fast, a debug/dbgutil build is the wrong way,
but then we lose the advantages of gcc's safe iterators.

So I think 4000 known-good documents can easily be tested in one day,
or even faster on a "decent" machine, but taking 4000 random
documents from bugzilla needs some manual interaction and will
therefore take more time.
As mentioned in the last mail, I think we could speed that up by
copying the test code and the makefile several times and running the
tests in parallel. That way we could use more cores and would be more
resilient to crashes. (We should of course not commit this stuff
then.)
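The parallel, crash-resilient approach could be sketched roughly like this: one worker per file with a per-file timeout, so a looping or crashing document kills only its own process rather than the whole run. The directory layout, timeout value, and the soffice flags here are assumptions for illustration, not taken from the original mail.

```shell
#!/bin/sh
# Minimal sketch: load each test document in its own process, in parallel,
# with a timeout so a looping document cannot stall the entire run.
# "docs/" and the 300-second limit are hypothetical choices.
find docs/ -type f -print0 |
  xargs -0 -n1 -P"$(nproc)" -I{} \
    timeout 300 soffice --headless --convert-to pdf --outdir /tmp/out "{}"
# timeout exits with status 124 when it has to kill a hung worker,
# so hung documents can be distinguished from ordinary crashes in logs.
```

A hung worker is reaped by `timeout` (exit status 124) while the other workers continue, which is exactly the resilience the makefile-copying trick was after.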

Markus
