

On Wed, Jul 13, 2016 at 4:18 AM, Stephan Bergmann <sbergman@redhat.com> wrote:

> The idea behind the second choice is to periodically update and rebuild the
> bot's compilerplugins repo, so that sequences of LO builds on the bot are
> guaranteed to all use the same plugin.so and be able to reuse cached objects
> across those builds.  However, that would mean that Gerrit changes based on
> a relatively old code base could be built against a newer version of the
> compilerplugins repo, running into warnings/errors from new plugins, the
> fixes for which (across the LO code base) would not yet be included in the
> code base revision the change is based on.  So I think this approach isn't
> feasible.


Actually that is very feasible, since:
1/ the solution for these old-based changes is 'rebase'

2/ that has to be done anyway before the said patch is to be
integrated into master.


> (Another problem would be that e.g. the name of a class from the
> LO code base can be whitelisted in one of the plugins, to suppress
> warnings/errors from that class.  If the name of the class changes in the LO
> code base, the compilerplugins repo must be changed in sync.)
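For readers not familiar with how such whitelists look, here is a minimal sketch of the pattern Stephan describes: class names from core.git hard-coded inside the plugin itself. The names and identifiers below are made up for illustration, not taken from the actual plugins.

    // Hypothetical excerpt of a name-based whitelist inside a plugin.
    // If one of these classes is renamed in core.git, the entry silently
    // stops matching until the compilerplugins repo is patched in sync.
    #include <iostream>
    #include <set>
    #include <string>

    namespace {

    const std::set<std::string> kSuppressedClasses = {
        "SwSomeClass",   // illustrative only, not the real lists
        "ScOtherClass",
    };

    bool isSuppressed(const std::string& qualifiedName)
    {
        return kSuppressedClasses.count(qualifiedName) != 0;
    }

    } // namespace

    int main()
    {
        std::cout << isSuppressed("SwSomeClass") << "\n"; // prints 1
        return 0;
    }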

That sounds like the whole contraption is quite fragile and random... if
you have to 'whitelist' random class names
directly in the plugin, I'd suggest there is something
fundamentally wrong in the process.
Whitelisting, or more exactly a way to say to the plugin 'shut up, I know',
should be done by annotating the source code, not by patching the plugin.
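One possible shape for such source-side annotation: clang's annotate attribute attaches a string to a declaration, which a plugin can read back from the AST (AnnotateAttr). The macro name and annotation text below are invented for illustration; this is a sketch of the idea, not an existing LO mechanism.

    // Sketch: suppress a check by annotating the declaration itself.
    #if defined(__clang__)
    #define LO_PLUGIN_SUPPRESS(reason) __attribute__((annotate(reason)))
    #else
    #define LO_PLUGIN_SUPPRESS(reason)
    #endif

    // The suppression lives next to the class it concerns, and survives a
    // rename of the class without touching the plugin:
    class LO_PLUGIN_SUPPRESS("loplugin:somecheck: known exception") SwSomeClass
    {
    public:
        void doWork() {}
    };

    int main()
    {
        SwSomeClass().doWork();
        return 0;
    }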



> So my proposal would be as follows:  First, check whether enabling
> compiler_check=content or the compiler_check=string:X setup (and increasing
> the ccache size if necessary and possible)

The cache size is 100GB dedicated to clang... that is quite a chunk of
disk already.


> gives good-enough performance.
> If not, restrict commits to compilerplugins/ to be less frequent, and see
> whether that increases the ccache hit rate and results in good-enough
> performance.

I do not favor compiler_check=content... as this means calculating a
hash of the compiler _and_ the plugin every time.
The plugin.so alone is 150MB (which is quite insane, btw, considering
that clang itself is ~50MB).
I really do not need to waste too much time experimenting to know that
hashing 15,000 x 200MB = 3TB per build is going to be a waste.
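For anyone who does want to measure it, a rough way to sanity-check the per-invocation overhead is to read the compiler and the plugin binaries and hash their bytes, which is what a content check conceptually has to do on every ccache invocation. The paths and the use of std::hash below are placeholders, not what ccache actually does internally.

    // Back-of-the-envelope check of the compiler_check=content overhead.
    #include <chrono>
    #include <fstream>
    #include <functional>
    #include <iostream>
    #include <iterator>
    #include <string>

    static std::string readFile(const char* path)
    {
        std::ifstream in(path, std::ios::binary);
        return std::string(std::istreambuf_iterator<char>(in),
                           std::istreambuf_iterator<char>());
    }

    int main()
    {
        // Placeholder paths for the compiler and the plugin shared object.
        const char* files[] = { "/usr/bin/clang++", "plugin.so" };
        auto start = std::chrono::steady_clock::now();
        std::size_t bytes = 0;
        for (const char* f : files)
        {
            std::string data = readFile(f);
            bytes += data.size();
            (void)std::hash<std::string>{}(data); // stand-in for ccache's hash
        }
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - start).count();
        std::cout << bytes << " bytes hashed in " << ms << " ms; "
                  << "multiply by ~15,000 invocations per build\n";
        return 0;
    }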

compiler_check=string is no better, since that would actually run
these as an external process for each ccache invocation,
bearing in mind that a build is north of 15K of these...

What I suggest is:

1/ allow the plugins to be built standalone and delivered (ideally,
spin them off into a sub-module). Allow configure.ac to use an 'external'
version of the compiler plugin.
2/ work on the plugin in a branch... (or ideally in a sub-module), push
the needed core.git fixes in advance of merging the plugin changes.
3/ every so often merge the plugin changes... (if it is a submodule it
is just a matter of moving the submodule ref in core.git) (if 2/ was
followed, that does not break master, and master has been compatible
with it for some time).
4/ at that point a Jenkins job will use 1/ to deploy a new version of
the plugin... everything will be built with that from that point on
(again, if it is a submodule, it can be cloned and built
stand-alone... which would be less wasteful than maintaining a full
core.git workspace for that purpose on each concerned slave).
5/ too-old Gerrit patches could fail... but that is fine, they need to
be rebased anyway... and hopefully that will give an incentive to
people not to keep patches on an old base...


Norbert
