Growth/Analytics updates/Welcome survey initial report/tr

As part of the Growth team's "Personalized first day" project, we deployed the Welcome Survey to Czech and Korean Wikipedias on 19 November 2018 at 19:00 UTC. The purpose of the survey is to gather some initial information about new users so that we can personalize their first days on the wiki and help them achieve their goals. Before deployment, we published an experiment plan detailing what we would measure and why. This page is our team's initial report on the survey results, and it will be followed by more detailed analyses addressing various questions from the experiment plan.

In this report, we give a quick overview of the survey and its responses, based on accounts registered between deployment and the end of the day (UTC) on 17 December 2018. We have calculated neither confidence intervals nor statistical significance for these results, and we make no claims about meaningful differences (for example, between the two Wikipedias or between groups of users). We also have not yet cross-tabulated the survey questions against each other, nor analyzed them together with the EditorJourney data. Instead, we present these as preliminary findings, discuss some potential steps they may suggest, and will conduct a more comprehensive analysis next quarter.

Highlights

 * Most users respond to the survey, with high response rates of 67% and 62% on Czech and Korean Wikipedias, respectively.
 * We are not currently concerned that the survey is causing new users to leave the site.
 * On Korean Wikipedia, the most common reason for creating an account is to read articles (29%), not to edit. This differs from Czech, where 18% gave that answer. The high numbers here may represent an opportunity to educate these users that editing Wikipedia is possible and easy.
 * In both languages, a majority of respondents have not edited Wikipedia before (51% in Czech and 63% in Korean). But these percentages also mean that many people have edited before (anonymously or with a different account), and may therefore already know how to edit.
 * Korean respondents were much more likely than Czech respondents to write in their own custom topics, as opposed to only selecting pre-filled options: 28% of Korean respondents added their own topics, compared to 9% of Czech respondents.
 * A surprisingly large number of respondents said they would like to be contacted to get help with editing: 36% in Czech and 53% in Korean. This is a strong statement that the potential and desire for human-to-human help exists. These respondents are an actionable list of users for outreach.
 * Few of the users who did not add an email address during account creation add one in the survey. The numbers are large enough for the option to be worthwhile (6% in Korean and 7% in Czech), but small enough that we could think of better appeals to encourage adding an email address.

Background
The original purpose of this survey was to collect information about users that we can use to personalize their experience. Read here for our thoughts on how we plan to act on this data in the next stage of the project.

For the four weeks following deployment, the survey was shown to a randomly selected 50% of users registering a new account on the target wikis (i.e., it was not shown to users who already had an account on a different wiki, so-called "autocreated" users). This A/B test between the survey group and a control group was set up so that we can determine whether the survey leads to a lower proportion of users making their first edit within 24 hours of registration (what we call "editor activation"). An analysis of the results of this experiment will be available soon.
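The assignment described above can be sketched as follows. This is a minimal illustration only, not the actual MediaWiki/GrowthExperiments sampling code; the function name and use of Python's `random` module are assumptions for the example.

```python
import random

def assign_bucket(is_autocreated: bool) -> str:
    """Assign a newly registered user to the Welcome Survey A/B test.

    Autocreated accounts (users who already had an account on another
    wiki) are excluded from the experiment entirely; everyone else is
    randomly split 50/50 between the survey and control groups.
    """
    if is_autocreated:
        return "excluded"
    return "survey" if random.random() < 0.5 else "control"
```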

A quick reference for what the survey looks like and the questions it contains can be found here. The survey is displayed in Czech and in Korean on the respective wikis.

Response rate
We showed the survey to 669 users on Czech Wikipedia and to 836 users on Korean Wikipedia. When a user sees the survey, they are presented with a set of questions, all of which are optional. They can then submit the survey by clicking "Finish" (even if they have not answered any questions), dismiss it and discard their answers by clicking "Skip this survey", or take some other action that leaves the page or the site, such as clicking a link in the left navigation pane or closing their tab. We call this last action "abandonment". The breakdown of these actions across the two wikis is as follows:

Table 1 shows that most of the users submitted the survey, which is great! As we will see below, users also answer our questions (rather than submit a survey with no answers). The abandonment rate appears to be fairly high, and at first we were concerned this meant that the survey was causing users to leave the website entirely, which would be a counter-productive outcome. To look into this, we dug into the data captured via our team's "Understanding first day" project, which gathers data on what new users view during their first 24 hours. We found that in Czech, only 47 users (7.0%) left the site, while in Korean it was only 99 users (11.8%). Both of these proportions are below the thresholds we had set for whether to change the survey or turn it off. This question will be answered more conclusively when we analyze the control group's rate of abandoning the site after account creation.
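The per-action rates in Table 1 are simply each outcome's share of all users shown the survey. A small sketch, assuming hypothetical counts: 669 Czech users were shown the survey and the report cites a roughly 67% submission rate, but the skip/abandon split below is made up for the example.

```python
def action_breakdown(counts: dict[str, int]) -> dict[str, float]:
    """Given counts of survey outcomes (submitted / skipped / abandoned),
    return each outcome as a share of all users shown the survey."""
    total = sum(counts.values())
    return {action: round(n / total, 3) for action, n in counts.items()}

# Hypothetical illustration for Czech Wikipedia (669 users shown):
czech = action_breakdown({"submitted": 448, "skipped": 70, "abandoned": 151})
# czech["submitted"] is about 0.67, matching the ~67% response rate above.
```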

It is also possible to split the response rates by whether the account was created on the desktop or mobile site, but we find that the proportions are generally similar.

Why did you create your account today?
''Why did you create your account today?''


 * To fix a typo or error in a Wikipedia article
 * To add information to a Wikipedia article
 * To create a new Wikipedia article
 * To read Wikipedia
 * Other (please describe)

Our first question asks why the user created an account, and provides several options, as well as an "Other" option where the user is given a text field to explain further. For our two target Wikipedias, the responses pan out as follows, with proportions based on the number of respondents in each language:

The first thing to notice is perhaps that the most frequent option differs between the two languages. In Czech it is creating a new article, selected by 32.6% of respondents, while in Korean it is reading (28.8%). In both languages, the other wiki's top option comes third on the list: reading was chosen by 17.5% of Czech respondents, and creating a new article by 19.7% of Korean respondents. It's interesting to learn that reading Wikipedia motivates a lot of account creation, since having an account does not materially change the reading experience. That may point to a misperception around account creation, but may also be an opportunity to engage users both as readers and potential editors.

Adding information to an article is consistently the second option in both languages, and has a comparable proportion of around 25%. The same goes for fixing a typo or an error, which is consistently fourth on the list with about 17% of the responses.

Have you ever edited Wikipedia?
''Have you ever edited Wikipedia?''


 * Yes, many times
 * Yes, once or twice
 * No, I didn't know I could edit Wikipedia
 * No, other reasons
 * I don't remember

The second question asks whether the user has edited Wikipedia before and lists five potential answers. Some users also submit the survey without responding to this question. Table 3 below gives an overview of the responses, and again proportions are based on the total number of survey responses. In both languages we find "No, I didn't know I could edit Wikipedia" is the most frequent option, and that a majority of respondents say they had not edited Wikipedia before (combining both "no" options: Czech: 50.5%; Korean: 63.2%).

Regarding the "No, I didn't know I could edit Wikipedia" response, it makes sense that many people would give this answer given how many say they are creating their account for the purpose of reading. But we were also surprised that the number was quite so high. One hypothesis is that the question might be interpreted to mean different things by different respondents. One possible interpretation is "No, I didn't know I could edit Wikipedia until this survey question pointed it out", and another is "No, I didn't know I could edit Wikipedia until recently, but once I discovered that I could, I decided to create this account." We will learn more about this question once we make cross-tabulations against the other questions, and we can consider clearer phrasings of these responses in the future.

It is also worth noting that the order of the responses is the same across both languages, and that it is different from the order the options are shown to the user. This means that the respondents did not simply choose the first answer in the list when responding, but are instead actively letting us know that they haven't edited Wikipedia before.

Select some topics you may wish to edit
''People can edit Wikipedia articles on topics that they care about. We've listed a few topics below that are popular for editing. Select some topics that you may wish to edit:''

Explicitly listed as checkboxes: Arts, Science, Geography, History, Music, Sports, Literature, Religion, Popular culture.

Available in a typeahead dropdown menu: Entertainment, Food and drink, Biography, Military, Economics, Technology, Film, Philosophy, Business, Politics, Government, Engineering, Crafts and hobbies, Games, Health, Social science, Transportation, Education.

The third part of the survey asks the respondents to select some topics that they may wish to edit. Nine topics are shown as checkboxes, and another eighteen topics show up when the user clicks on or types in the field. The field is free-form, allowing respondents to add additional topics. Respondents may choose and add as many topics as they like.

This analysis only covers the suggested topics. Future analyses will address the user-supplied topics, which require translation before they can be analyzed. We show one table below for each language. The table identifies the way a user can select a topic as either "checkbox", meaning it is one of the nine checkboxes; "prefilled", meaning it is one of the eighteen pre-filled topics found in the free-form field; or "other", meaning it is a topic added by the respondent.

We can see that the dominant topics are all ones listed as checkboxes. The least frequent checkbox is selected by 20.8% of respondents, while the most frequent topic in the free-form field is chosen by only 3.5% of respondents. It is noteworthy that respondents select multiple topics, as opposed to just one.

We see a similar trend in Korean as in Czech: the checkboxes dominate topic selection, although the difference between the least popular checkbox and the most popular pre-filled topic is smaller in Korean (11.0%) than in Czech (17.3%).

Are you interested in being contacted to get help with editing?
''We are considering starting a program for more experienced editors to help newer users with editing. Are you interested in being contacted to get help with editing?''

We find that in both languages, a surprisingly large number of users are interested in being contacted. 164 users in Czech (36.4% of all survey respondents) and 273 users in Korean (52.7%) answered "yes" to that question. This means that there is clearly interest among new users in getting help to edit Wikipedia, and that this is a potential avenue for community outreach. When we dig deeper into the survey responses, we will also compare the responses to this question with the answer to the question of whether the user had already edited Wikipedia, as well as why they signed up to create an account.

Adding an email address
Users who did not add an email address during their initial account creation are given a second opportunity to add their email address in the survey. We find that very few users do so, only 13 on Czech Wikipedia, and 20 on Korean. This corresponds to 6.5% of Czech users who did not already have an email address when shown the survey, and 5.7% of the Korean users.

Repeat survey responses
Though there is not an explicit workflow for doing so, users can take the survey multiple times by revisiting the survey URL. We only store their most recent responses, meaning that we regard their most recent answers as accurately reflecting their interests and opinions. At the same time, we store a count of how many times they have responded/skipped. Table 8 below shows how the number of responses is distributed, where the proportion is out of all users who either saved or skipped the survey. We can see that it is relatively rare for users to take the survey multiple times, and if someone does, it is typically only one more time. This means that we see little reason to discard responses based on users taking the survey multiple times and potentially changing their answers.
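The storage behavior described above (overwrite on resubmission, plus a counter) can be sketched as follows. The field names here are illustrative, not the actual schema.

```python
# In-memory sketch: one entry per user, holding only the latest answers
# and a count of how many times they have responded.
responses: dict[str, dict] = {}

def record_response(user: str, answers: dict) -> None:
    """Store a survey submission, overwriting any previous answers."""
    entry = responses.setdefault(user, {"answers": None, "times": 0})
    entry["answers"] = answers   # only the most recent answers are kept
    entry["times"] += 1

# A user retaking the survey: the second submission replaces the first.
record_response("ExampleUser", {"reason": "to read Wikipedia"})
record_response("ExampleUser", {"reason": "to add information"})
```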

Sanity checks
We have also run various sanity checks on our data in order to ensure that things are working properly. For example, we have calculated the distribution of users assigned into the survey and control groups, which ideally should be 50/50. This turns out to be the case: overall, on Czech Wikipedia the proportions are 49.7%/50.3% survey/control, and on Korean Wikipedia it is the other way around. We do find some variation when accounts are split into registrations from desktop and mobile (e.g. that it's 47/53 in some cases), but not enough to warrant a concern that the randomization has led to imbalanced or biased groups.

While working on this report, we have not yet dug carefully into the data to determine whether the responses appear to be truthful. For example, if a user answers that they did not know they could edit Wikipedia but also says they have edited Wikipedia many times, we should most likely discard their answers to at least those two questions, and potentially the entire survey. This is noted and will be done as part of a more thorough examination of the survey results in the near future.

Appendix A: Email added at registration
How did we determine how many users had not provided an email address at signup, in order to calculate that proportion? This is not trivial, because the MediaWiki database does not store a timestamp of when a user added their email address, nor is there an EventLogging schema in use for logging that kind of information. The only piece of information in the database that seemed related is the expiration timestamp of the verification token that is emailed to the user when they enter their email address.

We examined the difference between the timestamps of account registration and verification token expiration for accounts registered between January 1 and July 1, 2018 on both Wikipedias and found that it is typically set to slightly more than seven days. How much more is "slightly more"? In the vast majority of cases less than ten seconds, which we think is the delay between the system creating the account and the subsequent emailing of the verification token (at which point the expiration timestamp is set to "seven days from now"). We therefore adopted a simple heuristic for determining if the user supplied an email at registration: it happened if the difference between the two timestamps is less than "one week + ten seconds".
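The heuristic above can be expressed as a short comparison of the two timestamps. This is a sketch of the classification rule only; the function name and example timestamps are made up.

```python
from datetime import datetime, timedelta

# Threshold from the heuristic above: one week plus the ~10-second gap
# between account creation and the verification email being sent.
THRESHOLD = timedelta(days=7, seconds=10)

def email_added_at_registration(registration: datetime,
                                token_expiry: datetime) -> bool:
    """True if the gap between registration and the verification token's
    expiration suggests the email was supplied during signup."""
    return (token_expiry - registration) < THRESHOLD

reg = datetime(2018, 11, 20, 12, 0, 0)
# Token expiring ~7 days + 4 s after registration: email added at signup.
added_at_signup = email_added_at_registration(reg, reg + timedelta(days=7, seconds=4))
# Token expiring 9 days after registration: email added later.
added_later = email_added_at_registration(reg, reg + timedelta(days=9))
```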

Another thing we have to consider is that we do not have information about whether a user supplied an email address at registration but then decided to delete it. This means that they'll show up in our statistic as "did not supply an email at registration". We decided to assert that this is rarely done based on the fact that as of December 19, 64% of Czech registrations and 75% of Korean registrations between January 1 and July 1 did not have a verified email address. This suggested to us that users most likely either supply an email address that they do not check, or do not really care much about email verification, which we took to mean they are also unlikely to delete their email address.

Lastly, the proportion listed in the "added email" section above was not based on an upper limit for how quickly after registration a user can add their email address. This means that users who took the survey shortly after it was deployed have had more time to provide us with an address. In future calculations we will use a limit (e.g. one week), but in the meantime we will assert that if a user has not already provided us with an address, it is unlikely that they will return to do so (in other words, it is relatively unlikely that a user adds an email address after registration).