
On Tue, 12 Nov 2013 09:17:31 +0100,
Alexander Thurgood <> wrote:

On 11/11/13 12:19, Charles-H. Schulz wrote:

Hi Charles,

Next time I'm sure you can join us during the weeks when we
discussed the survey on the marketing and project lists :-)

Gladly, but the main reason I'm not on, or only rarely follow, the
marketing list is what I see as the zealous atmosphere that tends to
reign there. I have nothing against being keen to support and promote
a product, but I draw the line at losing all objectivity.
Unfortunately, that is how I perceive the marketing list to function.

Well, I truly hope we're changing that perception and that the
marketing list can be a place for collaborative work :-)

- penetration of the product;

I honestly would not think there's relevant data for this in the
survey and from the respondents.

Hmm, yes, I realize that, but that wasn't my intention in that
particular statement. What I meant is that this survey appears not to
have been promoted in a way that would maximise the number of
responses. I can imagine a number of reasons for this, but I wonder if
you remember, back in the old days of Sun StarOffice, when Sun ran a
user-oriented survey that was linked to the installation (or
post-installation / start-up / one month's use) of the product?

Although this might have seemed invasive to many at the time (I really
don't know), I actually feel that this is quite a good idea to borrow
from. Much in the same way that the download page now links you to
donations to LibreOffice, perhaps it would be possible to organise
future surveys via a similar mechanism? In other words, make it so
that, say, one month after the initial installation of LO, the user is
directed to a web page to participate in a survey relating to the
usage or desiderata of the product (from the user's perspective, of
course).

I do remember it well, but it certainly drew a lot of criticism too
(much more than the present one).

- reach of the survey;

Good question with no easy answer. The survey was localized in 5
languages besides English. The link was posted here and on several
other user mailing lists. The word was spread on the LibreOffice
Facebook page and on Google+, and to a lesser extent on Twitter.

Yes, and it felt to me that people who were already on the mailing
lists would be more inclined to respond to the survey anyway, since
that means of communication was used foremost (it is how I found out
about the existence of the survey). Certainly, that seems to have been
the behavioural response on this list. Again, this would be considered
normal behaviour for people who are already on the project mailing
lists and occasionally like to have a say in, or just follow,
contributions from others. "Preaching to the converted", I believe the
French say.

- the survey could have had a bigger and much deeper outreach if it
  had been pushed directly to the users, say at the installation
  phase, or even through a mechanism allowing users to respond to it
  via the StartCenter. That was obviously not the case, so in the end
  we reached the users who are on the project's mailing lists and
  connected to us through our social networks. This leaves out plenty
  of users, irrespective of their language.

Yes, I understand, hence my suggestion above to think back to how Sun
went about handling a similar situation.

My honest answer would be: not enough resources. 

- design of the survey;

What would you like to know? The survey was designed to be
progressive in its questioning, as all surveys should be. Beyond
that, don't read too much into survey methodologies; I'm not sure
they are that sophisticated, unless of course you would like to get
a particular answer in advance, and that's precisely what we wanted
to avoid.

Nonetheless, as others here have indicated, it did seem that the
questions were biased towards a particular goal, i.e. showing that the
website or the project's communication methods were not quite there
yet, or that the project hadn't managed to foster the required
"community impetus" due to a failure in one or more areas.

The questions were biased not because we wanted people to tell us
something we wanted to hear, but because the survey stems from
discussions from which an analysis was drawn, namely that we don't
engage users in our community and we don't talk to them enough. In
this sense, it is biased because the questions were framed around this
thinking. But to take only one example, the answers could have been
something like a majority telling us they don't have the time and
another large chunk telling us they already donated money. That would
not have led to the same conclusion and would ultimately have
invalidated the thinking.

- length and time for which the survey ran?

The survey started on the 31st of October and closed yesterday.

Thanks for taking the time to answer my different points Charles, I
appreciate it.

Thank you for your feedback!

Charles-H. Schulz 
Co-founder, The Document Foundation,
Kurfürstendamm 188, 10707 Berlin
Gemeinnützige rechtsfähige Stiftung des bürgerlichen Rechts
Legal details:
Mobile Number: +33 (0)6 98 65 54 24.


