Welcome to Wikidata, Dipsacus fullonum!

Wikidata is a free knowledge base that you can edit! It can be read and edited by humans and machines alike, and you can go to any item page right now and add to this ever-growing database!

Need some help getting started? Here are some pages you can familiarize yourself with:

  • Introduction – An introduction to the project.
  • Wikidata tours – Interactive tutorials to show you how Wikidata works.
  • Community portal – The portal for community members.
  • User options – including the 'Babel' extension, to set your language preferences.
  • Contents – The main help page for editing and using the site.
  • Project chat – Discussions about the project.
  • Tools – A collection of user-developed tools to allow for easier completion of some tasks.

Please remember to sign your messages on talk pages by typing four tildes (~~~~); this will automatically insert your username and the date.

If you have any questions, don't hesitate to ask on Project chat. If you want to try out editing, you can use the sandbox to try. Once again, welcome, and I hope you quickly feel comfortable here, and become an active editor for Wikidata.

Best regards! Liuxinyu970226 (talk) 04:22, 5 August 2014 (UTC)

Your edit to William Harbord


In this change you claimed that William Harbord was born 25 April 1635 in the Gregorian calendar, and that the date comes from the English Wikipedia. But the English Wikipedia states that he was born 25 April 1635, with no calendar being specified. The normal practice of historians writing about that period in England, and also the practice called for in en:Wikipedia:Manual of Style/Dates and numbers#Julian and Gregorian calendars, is to use the Julian calendar, which was the calendar in force at that time in England. Thus the date should be interpreted as being in the Julian calendar, and converted to the Gregorian calendar before being added to Wikidata. Jc3s5h (talk) 02:23, 23 August 2014 (UTC)
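For reference, Wikidata stores every date together with a calendar model, so errors like this can be checked with a query. A minimal sketch, where wd:Qxxx is a placeholder to replace with the person item (the two calendar models Wikidata uses are proleptic Gregorian, Q1985727, and proleptic Julian, Q1985786):

```sparql
# Sketch: show the calendar model recorded for an item's date of birth.
# wd:Qxxx is a placeholder; replace it with the person item.
SELECT ?time ?calendar
WHERE
{
  wd:Qxxx p:P569 / psv:P569 ?value .   # full value of date of birth
  ?value wikibase:timeValue ?time ;
         wikibase:timeCalendarModel ?calendar .
}
```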

Hi, thank you for pointing out my error. I have corrected it for William Harbord, but I may also have made similar errors while inserting other dates. I will try to be more aware of the calendar used in future edits. Regards, Dipsacus fullonum (talk) 05:55, 23 August 2014 (UTC)
Thanks for attending to this. I find the display of the diff confusing, so I have asked about it at Wikidata:Project chat#Reading a diff about dates. Jc3s5h (talk) 16:17, 25 August 2014 (UTC)

Transliteration


Would you look at my corrections today? I have had to make a number of compromises. Three people, father, son and daughter, ended up with two different surnames: Elena Baltacha (Q232680), Sergei Baltacha (Q4076889) and the father Sergei Baltacha (Q552337). The transliteration changed in, I believe, 1991, and in addition the children have lived in England/Scotland for several years, so they have probably officially changed their names to a transliteration. In the article about da:Elena Baltacha on da wiki, I have put the different names into the body text. For P. Jurgenson (Q20054933) I also had to compromise: the founder of the publishing house was an Estonian Swede who was born in Tallinn, where he probably got the Swedish name. At the same time there is an article about him on sv wiki, so I chose to add the alias Musikforlaget P.I. Jürgenson. He himself is entered as Peter Jürgenson, while his son and grandson have both been given the Danish transliteration of the Russian surname, Jurgenson, as they had probably dropped their Swedish name. I would be glad if you would weigh in on the names. Likewise I would like to hear what you think about "kandidat nauk" = "Ph.D." after you have seen my links. PerV (talk) 09:51, 24 March 2017 (UTC)

Caveat: I am not competent in transliteration, and I do not know how these subjects have previously been referred to in Danish. Most of them look reasonable at first glance. Sergei Baltacha (Q4076889) apparently grew up in Scotland, so I would write his name as in English, with Jr. with a capital J. Higher school of politics (Q2535586) – it is not common for Danish speakers to understand Czech, so I would prefer a literal translation. It should perhaps be settled as a matter of principle when names of institutions etc. are to be translated. Riemann's Music Dictionary (Q27680201) – the parenthesis with the year does not appear to be part of the name, so I would move the year of publication to the description instead. Cairo University (Q194445) – I would write Cairo Universitet with a capital letter, cf. the Danish orthography rules § 12.7, since it appears to be the university's name translated into Danish. I cannot read Arabic, but they have a website in English where they call themselves "Cairo University". Meal of the Amsterdam militia (Q17342532) – why should "Rot L" not be translated when the rest of the title is translated? I am not good at Dutch, but perhaps "Måltid for Atten Amsterdamske Skytter i Enhed L"? The year should go in the description, and what it is also known as should go under aliases. Meal of 14 militia guardsmen (Q17541505) – again I am unsure about the Dutch meaning, but perhaps "Skytter i Enhed G". Best regards, Dipsacus fullonum (talk) 10:57, 24 March 2017 (UTC)

Discussion on da wiki about the use of infoboxes


If you have the energy for it, I think you could enrich this discussion on da wiki: Wikipedia:Landsbybrønden/Kort udgave af infobokse. It is a follow-up to a discussion I raised here. PerV (talk) 08:45, 9 April 2017 (UTC)

Hi PerV. Thanks for the message, but I will not get involved in how infoboxes are made on the Danish Wikipedia. I no longer participate in that project, because I am tired of some users being able to trash other users without any consequences. I saw that Dannebrog Spy gave an apt description of the situation in the post w:da:speciel:diff/9004545 at Landsbybrønden. Best regards, Dipsacus fullonum (talk) 09:20, 9 April 2017 (UTC)
Thanks for your message, and now Rodejong is wreaking havoc with the infoboxes. I cannot be bothered to get involved either. I plan to update all the Russian towns with the latest population figures, add mayors etc. (the information we have collected on da wiki, with sources), here on Wikidata. In connection with that, I imagine, as already mentioned, that it would be smart to get in contact with a Russian user here on Wikidata. Will you help me with that? You certainly know wd far better than I do! PerV (talk) 09:51, 9 April 2017 (UTC)
PerV, I kindly ask you not to speak badly of other users. I see nothing in Rodejong's history on the Danish Wikipedia over the last few days that amounts to "havoc". Best regards, Dipsacus fullonum (talk) 10:46, 9 April 2017 (UTC)
Then you evidently have not seen the latest edit to Infoboks Wikidata person, made entirely without prior discussion. Among other things, the edit meant that Aleksandr Dugin's infobox was emptied of academic degrees. But I suppose one just has to get used to that. You can also have a look at his latest edits to labels here on Wikidata. They are certainly not Danish labels he is inserting. PerV (talk) 11:09, 9 April 2017 (UTC)
It was not without prior discussion. There was a much-discussed change, which among others I strongly advocated, meaning that you can now choose in the infoboxes which pieces of information are fetched from Wikidata. Dugin's academic degrees disappeared because they are not selected in the infobox. I do not always agree with Rodejong's edits, but for this one he deserves praise. Yes, there are some spelling errors etc. in the painting titles inserted here on Wikidata. Let's fix them instead of scolding the man. Best regards, Dipsacus fullonum (talk) 11:34, 9 April 2017 (UTC)

Hi Kartebolle


Hi Kartebolle, I am just dropping in to hear whether you have completely given up on us at the Danish Wikipedia, or whether we might be lucky enough to see you back after Easter ---Zoizit (talk) 12:21, 9 April 2017 (UTC)

Hi Zoizit. Yes, I have given up on the Danish Wikipedia and have no plans to return there for the time being. There is no joy in participating the way things work at the moment. I will reconsider if at some point the practice changes so that serious personal attacks are acted upon. Best regards, Dipsacus fullonum (talk) 19:39, 9 April 2017 (UTC)

Infoboks kunstner


Hi Kartebolle, Infoboks kunstner has the following code:

| above          = {{#if:{{{navn|}}}
                    | {{{navn}}}
                    | {{#if:{{{Navn|}}}
                       | {{{Navn}}}
                       | {{PAGENAME}}
                      }}
                   }}{{Wikidata-emne | P2348 | ikon=ja | adskil=,<br /> |<br /> {{{æra|}}} }}
| æra                  = 

At J.L. Jensen, however, the information from Wikidata does not appear. Can you tell me what is wrong?

Thanks in advance. Kind regards,  Rodejong  💬 ✉️  17:40, 10 April 2017 (UTC)

@Rodejong: It looks like the first unnamed parameter to Wikidata-emne is "<br /> {{{æra|}}}", so a line break is inserted instead of the value of P2348. Best regards, Dipsacus fullonum (talk) 17:58, 10 April 2017 (UTC)
Correction: I meant the second unnamed parameter. The first is "P2348". But the conclusion is still that the "<br />" before "{{{æra|}}}" must go. Best regards, Dipsacus fullonum (talk) 18:16, 10 April 2017 (UTC)
Thanks. I have solved it this way:
| above          = {{#if:{{{navn|}}}
                    | {{{navn}}}
                    | {{#if:{{{Navn|}}}
                       | {{{Navn}}}
                       | {{PAGENAME}}
                      }}
                   }}{{#if:{{{æra|}}}{{#property:P2348}}|<br /> }}{{Wikidata-emne | P2348 | ikon=ja | adskil=,<br /> | {{{æra|}}} }}
And now it works fine. Thanks for that. Kind regards,  Rodejong  💬 ✉️  18:19, 10 April 2017 (UTC)

Fulde navn / fødselsnavn


There is disagreement about the following:

| fulde navn  = <!--P1477 kan hentes fra Wikidata -->
or
| fødselsnavn = <!--P1477 kan hentes fra Wikidata -->

The idea is that one's full name can change (e.g. when getting married) while the birth name stays the same.

See this little discussion.

Can official name be used for a biographical item? Kind regards,  Rodejong  💬 ✉️  19:02, 10 April 2017 (UTC)

birth name (P1477) is one's full name as it was when one was born. In an infobox it should be called "fødenavn" (not "fødselsnavn"). The label "fulde navn" (full name) will be understood as one's current full name, which may be something else.
official name (P1448) is the official name in an official language. It can be used if the official language is written in the Latin alphabet. But for Russians, Chinese, Greeks etc., the official name will be written in Cyrillic letters, Chinese characters, Greek letters etc. There is no Wikidata property that can be used for "full name in Danish".
If I were making an infobox, I would have a "fødenavn" field using birth name (P1477), and a "fulde navn" field not using Wikidata. Best regards, Dipsacus fullonum (talk) 19:20, 10 April 2017 (UTC)
PS. On further reflection, "fulde navn" in a Danish-language infobox could use official name (P1448) for Danes, but not for people of other nationalities. Best regards, Dipsacus fullonum (talk) 19:38, 10 April 2017 (UTC)
PPS. Having thought about it some more: it would be practical here to have a parameter for w:da:Skabelon:Wikidata-tekst that restricts the values to texts in languages using the Latin alphabet. That would not be hard to make. Best regards, Dipsacus fullonum (talk) 21:08, 10 April 2017 (UTC)
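As a side note, whether these properties are filled for a given person can be checked directly with a query. A sketch, with wd:Qxxx as a placeholder for the person item:

```sparql
# Sketch: fetch birth name (P1477) and official name (P1448), if present.
# wd:Qxxx is a placeholder; replace it with a person item.
SELECT ?birthName ?officialName
WHERE
{
  OPTIONAL { wd:Qxxx wdt:P1477 ?birthName . }
  OPTIONAL { wd:Qxxx wdt:P1448 ?officialName . }
}
```

Both properties hold monolingual text values, so LANG(?birthName) would give the language code if a template needed to filter on it.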

Björn J:son Lindh


Please read my comments on https://www.wikidata.org/wiki/Talk:Q1812220 Tapperheten (talk) 06:35, 13 April 2017 (UTC)

I read them and replied on the page. Best regards, Dipsacus fullonum (talk) 07:08, 13 April 2017 (UTC)

Currency


Hi DS, I am wrestling with a problem in da:skabelon:infoboks virksomhed concerning amounts and currencies.

The amount itself is displayed fine with "|usikkerhed=nej", but how do I get the right currency after it?

See this version of Novo Nordisk
See Q818846
See da:Skabelon:Infoboks virksomhed

Can it be done with {{Wikidata-tal}}? Or is a {{Wikidata-valuta}} needed? I hope you can help us with this.

Thanks in advance. Kind regards,  Rodejong  💬 ✉️  19:57, 19 April 2017 (UTC)

It can be done with Wikidata-tal, but all currency units used must be added to a table in w:da:Modul:Brug Wikidata. Go to the module and find this table:
-- Units used for quantity values. For each give name to display, conversion factor, 
local wd_units = {
	-- area units
	Q712226 = { name = 'km2', show_as = 'km<sup>2</sup>', conv = 1e6, type = 'area' },
	Q25343 = { name = 'm2', show_as = 'm<sup>2</sup>', conv = 1, type = 'area' },
	Q232291 = { name = 'mi2', show_as = 'mi<sup>2</sup>', conv_to = 'km2', conv = 2589988.110336, type = 'area'},
	-- length units
	Q11573 = { name = 'm', show_as = 'm', conv = 1, type = 'length' },
	Q828224 = { name = 'km', show_as = 'km', conv = 1e3, type = 'length' },
	Q253276 = { name = 'mile', show_as = 'mi', conv_to = 'km', conv = 1609.344, type = 'length' },
	Q174728 = { name = 'cm', show_as = 'cm', conv = 0.01, type = 'length' },
	-- mass units
	Q11570 = { name = 'kg', show_as = 'kg', conv = 1, type = 'mass' },
	-- time units
	Q11574 = { name = 's', show_as = 's', conv = 1, type = 'time' },
	Q7727 = { name = 'minut', show_as = 'min.', conv = 60, type ='time' },
	Q25235 = { name = 'time', show_as = 't', conv = 3600, type = 'time' },
	-- speed units
	Q180154 = { name = 'km/t', show_as = 'km/t', conv = 0.2777777777777777778, type = 'speed' }
}
For each currency a new line must be added. For Danish kroner and US dollars the lines could, for example, look like this:
        -- currency units
        Q25417 = { name = 'DKK', show_as = "danske kroner", conv = 1, type = 'currency' },
        Q4917 = { name= 'USD', show_as = "amerikanske dollar", conv = 1, type = 'currency' },
  • Q25417 and Q4917 are the Wikidata items for the currencies. The text after "show_as" determines how the currency unit is displayed. You can choose the full name or an abbreviation, as you prefer. You can also insert wiki code, for example a link to the article about the currency.
  • Lines starting with "--" are comments in Lua.
  • If you want automatic conversion between different currencies, you must insert conversion factors (exchange rates) in the "conv" field, and also add lines to the "wanted_units" table. If no conversion is needed, this is not necessary.
  • Important: All unit lines in the table, except the last, must end with a comma. If more lines are added at the end of the table, the line for km/t, which is currently the last, must also get a comma. Forgetting this causes a syntax error, and the module will stop working. I strongly recommend testing all changes to the module in a sandbox.
If currencies or other units are missing from the table, the affected articles will show up in the category w:da:Kategori:Enhed for størrelse på Wikidata ikke genkendt. I see that besides articles with currency units, the article w:da:Roy Williams is currently in that category too. His weight is given in American pounds. To have it shown in kg, the line:
        Q100995 = { name = 'lb', show_as = "lb", conv = 0.45359237, type = 'mass' },
must be inserted.
Best regards, Dipsacus fullonum (talk) 08:44, 20 April 2017 (UTC)
I have done as you described here, and it works fine. However, there are 3 articles that do not disappear from the category. It may of course be a delayed sync, so I will check again tomorrow whether they are still there. I did notice that Microsoft and Google disappeared right away. If not, I will report back. Thank you for your clear description. Kind regards,  Rodejong  💬 ✉️  22:12, 20 April 2017 (UTC)
Your changes look OK. It has been seen before that some articles only disappear from tracking categories many hours after a change, while others disappear immediately. I have no explanation for this. Best regards, Dipsacus fullonum (talk) 22:39, 20 April 2017 (UTC)

Will you have a look here? I think it is something you can fix in no time. PerV (talk) 18:35, 20 May 2017 (UTC)

About choosing an image


Hi. I would like to hear whether you could add a comment to this question about choosing an image for the infobox at da:Hjælp:Teknisk_forum#Valg_af_billede_til_infoboksen. -- Best regards, PHansen (talk) 16:33, 14 June 2017 (UTC)

Dipsacus fullonum bot


Your bot has been listed at Wikidata:Requests for permissions/Removal/Inactive bot accounts as being inactive for over two years. As a housekeeping measure it is proposed to remove the bot flag from inactive bot accounts, unless you expect the bot to be operated again in the near future. If you consent to the removal of the bot flag (or do not reply on the deflag page), you can re-request the bot flag at Wikidata:Requests for permissions/Bot should you need it again. Of course, you may request to retain your bot flag here if you still need it. Regards--GZWDer (talk) 12:34, 26 June 2017 (UTC)

Error in Module:Cycling race – Norway shown with language code in the result lists


Hi! It is really great that you have started the work of improving this module. I just copied the new module over to nowiki to test it, and then I happened to notice a small error in the listings of countries. Where Norway appears in the lists, the name is followed by the language code, so it is displayed as "Norge (nb)". See for example here: no:Tour of Norway 2018. I assume this has to do with the fact that we in Norway have two written language forms, and nowiki uses the form nb (while nnwiki uses nn). Bergenga (talk) 13:16, 18 July 2018 (UTC)

Yes, the module assumes that the wiki name and the language code are the same, so it does not recognize 'nb' as the local language of nowiki. I will fix it. --Dipsacus fullonum (talk) 14:05, 18 July 2018 (UTC)
Great! Good luck with the further work on the module! Bergenga (talk) 14:15, 18 July 2018 (UTC)
@Bergenga: The problem should be fixed with no:Special:Diff/18704463. I will move the change to Wikidata the next time I update here. --Dipsacus fullonum (talk) 17:53, 18 July 2018 (UTC)
I discovered that the jersey names were not shown in Norwegian either, for the same reason. That is fixed with no:Special:Diff/18704520. --Dipsacus fullonum (talk) 18:14, 18 July 2018 (UTC)
Thanks! It looks like it is fixed now. I had seen this error with the tooltip too, but did not think it was important enough to be worth fixing yet. The tooltip error is still present in the infobox for stages, but that is probably not part of the code you have optimized, so there is no hurry to get it fixed before you get that far. Bergenga (talk) 18:31, 18 July 2018 (UTC)
No, I have not changed the code that builds infoboxes yet, so the error with English jersey names in infoboxes has been there all along. You can fix it by searching for "entity_jersey:getLabel(wiki)" and changing it to "entity_jersey:getLabel(wikilang)". --Dipsacus fullonum (talk) 19:04, 18 July 2018 (UTC)
Thanks! I had actually thought the error was so insignificant that there was no reason to fix it now, but since you gave me the whole solution, I just had to do it. :-) Bergenga (talk) 20:52, 18 July 2018 (UTC)

Could you easily fix this problem


Hi friend. There is an old bug in classifications like "generalclassifications": they do not show the classifications if there are repeated positions. For example, in the race da:Amstel Gold Race for kvinder 2017 we have 2 riders at the same position, 3rd, in 2017 Amstel Gold Race (women) (Q27481896), see it at procyclingstats, but due to a bug the function "Cycling race/generalclassifications" truncates the list at this tie. Would it be easy for you to fix this so that the whole list can be shown? Repf72 (talk) 14:51, 26 July 2018 (UTC)

@Repf72: Hello friend. Yes, that is easy to fix, so I have already done it in this edit: Special:Diff/715972646. Thank you for telling me about it. Best regards, Dipsacus fullonum (talk) 15:31, 26 July 2018 (UTC)

Bug with new p.listofstages


Hi friend. The new function lost some functionality for teams as stage winners. Currently you can enter as winner either the team's season item or the team's main item. The name shown should be the official name (present on the season item or on the team's main item), and if there is an article for the team season the link should go to that, or by default to the article for the main item. Please see BMC as winner of stage 3 of the Tour de France in "es" (previous code) es:Tour de Francia 2018 and the Tour de France in "da" (new code) da:Tour de France 2018. Repf72 (talk) 02:33, 31 July 2018 (UTC)

@Repf72: Hi friend, sorry for the bug. I think it is fixed in this edit: Special:Diff/718617509. The fix is for both listofwinners and listofstages, as these two functions now call the same subfunction to get the winner of a stage. --Dipsacus fullonum (talk) 07:29, 31 July 2018 (UTC)
@Dipsacus fullonum: Thank you very much. I have propagated the latest updates to "es". Since you are working on a "subfunction to get the winner of a stage", could you include the special case where a team is an elite national cycling team (Q23726798) or an under-23 national cycling team (Q20738667)? That would fix this old bug, which means the team winner is not shown for da:Tour de l'Avenir 2017. Best regards. Repf72 (talk) 13:02, 31 July 2018 (UTC)
@Repf72: National teams are included in listofwinners if they win the general classification, and they are included in listofstages if they win a stage. I still have not worked on the infobox, but will make sure to include them in the infobox when I do. --Dipsacus fullonum (talk) 13:32, 31 July 2018 (UTC)
@Dipsacus fullonum: Excellent. I noticed your changes, and I could finally replace listofwinners at es:Giro de Italia and es:Tour de los Alpes. Repf72 (talk) 14:09, 31 July 2018 (UTC)

Bug with the Wikidata icon in listofstages


Hi friend @Dipsacus fullonum:. If you click on the Wikidata icon in any table generated with "Cycling race/listofstages", the link to "Qxxx - #P527" does not work. Instead you are taken to the icon image itself: "File:Wikidata-logo_S.svg". Repf72 (talk) 01:58, 1 August 2018 (UTC)

Strange bug with listofstages


Hi friend @Dipsacus fullonum:. I can see a strange bug at da:Vuelta a España 2016. With the previous version I see the full listofstages, but with the new version of the code only stage 10 appears. Repf72 (talk) 02:09, 1 August 2018 (UTC)

@Repf72: Hi. That was because someone had marked stage 10 as the preferred stage of the race. It was done in this edit Special:Diff/361730932 back in 2016, even before the start of the race. I have reverted that edit. If some stages are ranked as "preferred", only those will be used. I could change that to always include the "normal" rank as well, but I cannot think of any reason why a stage should ever be preferred. --Dipsacus fullonum (talk) 05:50, 1 August 2018 (UTC)
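For anyone curious, the ranks behind this behaviour can be inspected directly. Statements fetched with the wdt: prefix only include best-rank values, so a single preferred stage hides all normal-rank stages, while the p:/ps: form shows everything. A sketch, with wd:Qxxx standing in for the race item:

```sparql
# Sketch: list all "has part(s)" (P527) statements of a race with ranks.
# wd:Qxxx is a placeholder; replace it with the race item.
SELECT ?stage ?rank
WHERE
{
  wd:Qxxx p:P527 ?statement .
  ?statement ps:P527 ?stage ;
             wikibase:rank ?rank .
}
```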
@Dipsacus fullonum: Hi. Thank you. I did not know anything about the rank levels normal and preferred. It may have been a compatriot who felt the importance of that stage ;). Repf72 (talk) 12:15, 1 August 2018 (UTC)
@Repf72: That may be. Anyway, I will make sure to include all stages in the next version, even if some of them are marked as preferred. I know what the problem is when the Wikidata logo links to Commons instead of to the Wikidata entity, and will also fix that in the next version. I don't feel it is urgent, so it will not be immediate, as I am working on infoboxes right now. I hope that is OK. Best regards, --Dipsacus fullonum (talk) 12:28, 1 August 2018 (UTC)
@Dipsacus fullonum: Excellent that you are working on infoboxes. Your improvements really have been the best the module has seen since its creation. Some other Wikipedias, like the Italian one, did not want to use the module because of early problems and bugs; maybe we can try to convince them later thanks to your excellent work. Kind regards. Repf72 (talk) 12:37, 1 August 2018 (UTC)

Since you are working on infoboxes: the bug you fixed with repeated positions also affects the infobox, which does not show this case for any classification, e.g. 2017 Amstel Gold Race (women) (Q27481896). Repf72 (talk) 12:46, 1 August 2018 (UTC)

Bug with positions declared deserted


Hi friend. There is a bug with "declared deserted". Some days ago I could see "declared deserted" in listofwinners at positions with criterion used (P1013) set to declared deserted (Q45123627), for example for the Tour de France, but now I cannot see it. Please see: here for the years 1999 to 2004. Regards. Repf72 (talk) 15:41, 15 August 2018 (UTC)

@Repf72: Sorry, that happened when I modified the new function winner so it can be used by both p.listofwinner and p.infobox. It should be fixed now. --Dipsacus fullonum (talk) 22:29, 15 August 2018 (UTC)
Thank you. It is OK now. Repf72 (talk) 02:44, 16 August 2018 (UTC)

Coordination


Hello Dipsacus fullonum, many thanks first of all for the changes you made to the code; they are much appreciated. I just wanted to know if you are still working on it. I have a bit of time in September/October, so I could try to perform similar changes on the calendar function first (I am not sure it will be as good, but you can correct it afterward :D). Psemdel (talk) 20:02, 10 September 2018 (UTC)

@Psemdel: Sorry for the late reply. I have been busy with other things this week. I am not actively working on the cycling module at the moment, but I have plans to do more later. Right now there is a lot of duplicated functionality, because I made new functions working with IDs to replace the existing functions working with entities. But the functions working with entities cannot be removed yet, because they are still in use. So I would like to clean it up by converting the remaining functions at some point. I would also like to move translations and local configuration to a submodule. But I have a lot of things on my todo list and limited time, so it would be fine if you work on the calendar function. You should be able to reuse the functions I made to find and display winners of the races in the calendar. If you have trouble finding out how to use them, I will try to document the new functions better. You are also always welcome to ask me for advice. Regards, Dipsacus fullonum (talk) 17:10, 14 September 2018 (UTC)
No problem, we all have the right to a life outside Wikidata/Wikipedia ;). Of course we first have to convert everything before deleting the old stuff; don't worry about that. And of course it won't happen in one day either. OK, so I will correct calendar with your functions when I can (actually I have already corrected 2 infoboxes on WP:fr with your functions, so I know roughly how they work). We have the whole winter to do it peacefully. Psemdel (talk) 18:19, 14 September 2018 (UTC)

Re-elected members of the Folketing


Hi. I see that you are reusing the previous term in position held (P39) for members of the Folketing. I can see that the usual practice for elected officials is to create a new statement for each electoral term. I have followed that for the other members of the Folketing. It would be nice to use the same practice for the new electoral term as well. --Steenth (talk) 09:22, 6 June 2019 (UTC)

I thought it would be an advantage in the case of re-election, since start time (P580) and end time (P582) can then give one continuous period without having to piece several electoral terms together. That would probably make the values easier to handle if they are used in infoboxes. But the other way can be managed too, so I will follow your suggestion. --Dipsacus fullonum (talk) 14:17, 6 June 2019 (UTC)

Hello!


I see you are also working with Familysearch. I run a query looking for people with only a birth year and no full date (only in the US for now, born between 1880 and 1920), and I use Familysearch to add in the full dates from the two draft registrations. All good stuff; good to see someone else using an amazing free resource! --RAN (talk) 21:35, 22 February 2020 (UTC)

Hello RAN. I guess I have to disappoint you. I'm not using Familysearch; in fact, I don't know what it is. I just helped another user, who does use it, with a related SPARQL query. --Dipsacus fullonum (talk) 09:22, 23 February 2020 (UTC)

Hi


Please refrain from such accusations and attempt to express your difference of opinion in another way. The solution with MINUS that someone else added is still missing and is not what you provided. --- Jura 13:17, 1 March 2020 (UTC)

@Jura1: It doesn't matter whether a query uses MINUS or OPTIONAL. In my eyes it is misinformation to say that a sample is missing when one has been provided. If the technical details matter to you, then change the description from "with MINUS" to "with OPTIONAL and COALESCE" (even though it could also have been made with MINUS). It is also not OK to say, without any form of argument, that it will likely time out. You can argue against the proposed qualifier for next level in hierarchy (you already did), but please don't make unfounded claims about it. I reserve the right to call misleading or wrong information misinformation. --Dipsacus fullonum (talk) 14:36, 1 March 2020 (UTC)
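To illustrate why the distinction does not matter here: for a "things lacking a property" check, MINUS and OPTIONAL with a bound test give the same results. A generic sketch, with the class and property chosen arbitrarily for illustration:

```sparql
# Sketch: paintings (Q3305213) without an inception date (P571), via MINUS.
SELECT ?item
WHERE
{
  ?item wdt:P31 wd:Q3305213 .
  MINUS { ?item wdt:P571 ?date . }
}
# The same result with OPTIONAL:
#   ?item wdt:P31 wd:Q3305213 .
#   OPTIONAL { ?item wdt:P571 ?date . }
#   FILTER(!BOUND(?date))
```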

Regarding "How to deal with time limit constraints?" in Wikidata: Request A Query


Hello! Thank you for your answer to: https://www.wikidata.org/wiki/Wikidata:Request_a_query#How_to_deal_with_time_limit_constraints?

Can I check then: is there any way I can retrieve all the results of a query that hits the timeout?

In theory you could split the query into a series of queries that each give a subset of the result, and then combine the results of all these queries manually. Each query in the series should limit the result using a key that is fast to use (some indexed value), but I doubt that it is possible to find a good key in this case. Another option is to download a database dump from Wikidata and analyze it. --Dipsacus fullonum (talk) 07:53, 13 April 2020 (UTC)
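To sketch the splitting idea (property and date range chosen arbitrarily for illustration): each query in the series restricts an indexed value, here a date of birth, to one slice, and the slices together cover the whole result:

```sparql
# Sketch: one slice of a partitioned query, covering a single decade.
# Run the same query for the other decades and merge the results manually.
SELECT ?item ?dob
WHERE
{
  ?item wdt:P569 ?dob . hint:Prior hint:rangeSafe true .
  FILTER("1880-01-01"^^xsd:dateTime <= ?dob &&
         ?dob < "1890-01-01"^^xsd:dateTime)
}
```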

Add a filter in a query


Hello, how can I make that query work with the filter I put on it? https://w.wiki/VD9 Thank you! Bouzinac (talk) 21:04, 25 June 2020 (UTC)

Hi Bouzinac. I suppose you mean the MINUS part of the query. It has no effect at all, because it shares no variables with the rest of the query and therefore cannot match any of its results. To exclude values of ?time coming from entities that are instances of astronomical transit (Q6888), you need to define ?item in the first part of the query too:
SELECT ?time 
WHERE
{
  ?item p:P585 / psv:P585 ?fullvalue.
  ?fullvalue wikibase:timePrecision 11 . # Precision is date
  ?fullvalue wikibase:timeValue ?time.
   MINUS    { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude transit astronomiques
}
GROUP BY ?time
ORDER BY DESC(?time)
--Dipsacus fullonum (talk) 03:58, 26 June 2020 (UTC)
Thank you, I tried to adapt it this way (it times out): any tip?
SELECT ?item ?itemLabel ?time 
WHERE
{
  ?item p:P585 / psv:P585 ?fullvalue.
  ?fullvalue wikibase:timePrecision 11 . # Precision is date
  ?fullvalue wikibase:timeValue ?time.
  FILTER ((?time > "1001-01-01"^^xsd:dateTime))
   MINUS    { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude transit astronomiques
  SERVICE wikibase:label {
    bd:serviceParam wikibase:language "fr" . 
  }
}
GROUP BY ?time ?item ?itemLabel
ORDER BY DESC(?time)
Thank you again for your SPARQL expertise. Bouzinac (talk) 21:28, 30 June 2020 (UTC)
@Bouzinac: First, I see no reason for "GROUP BY" with no aggregation functions; with very few exceptions there will be only one result in each group anyway. There are also two more reasons for the timeout: 1) The filter adds a comparison for 299,907 results and probably doesn't remove very many of them, so it costs much time with little gain. 2) The label for ?item. It is impossible to get labels for 299,907 items in a query, even if it did nothing else. You need to either drop the labels or limit the number of results, and it must happen in a subquery, as the label service would otherwise be applied before filtering and limiting. A solution with labels but only 10,000 results can be:
SELECT ?item ?itemLabel ?time 
WHERE
{
  {
    SELECT ?item ?time
    WHERE
    {
      ?item p:P585 / psv:P585 ?fullvalue.
      ?fullvalue wikibase:timePrecision 11 . # Precision is date
      ?fullvalue wikibase:timeValue ?time. hint:Prior hint:rangeSafe true.
      MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude astronomical transits
    }
    ORDER BY DESC(?time)
    LIMIT 10000
  }
  SERVICE wikibase:label {
    bd:serviceParam wikibase:language "fr" . 
  }
}
Try it!
But if you omit the labels, no limit is necessary. --Dipsacus fullonum (talk) 23:10, 30 June 2020 (UTC)Reply
Yes, my goal is to find any duplicates, using the date as a hint to spot them... https://w.wiki/Vjd Bouzinac (talk) 05:35, 1 July 2020 (UTC)Reply
@Bouzinac: Then I suggest a query that only gives duplicate dates:
SELECT ?time (COUNT(?time) AS ?count)
WHERE
{
  ?item ps:P585 / psv:P585 ?fullvalue.
  ?fullvalue wikibase:timePrecision 11 . # Precision is date
  ?fullvalue wikibase:timeValue ?time.
  MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude astronomical transits
}
GROUP BY ?time
HAVING (?count > 1)
Try it!
The query had no result when I ran it so it seems there are no duplicates! --Dipsacus fullonum (talk) 06:48, 1 July 2020 (UTC)Reply
Huh? GROUP BY ?time means you only get one row per date, so logically no duplicates…?
@Bouzinac: No, COUNT(?time) counts the number of values for each date. The error was "ps:P585" instead of "p:P585". The query below is better and gives the number of duplicates for each date:
SELECT ?item ?time ?count
WITH
{
  SELECT ?time (COUNT(?time) AS ?count)
  WHERE
  {
    ?item p:P585 /  psv:P585 ?fullvalue.
    ?fullvalue wikibase:timePrecision 11 . # Precision is date
    ?fullvalue wikibase:timeValue ?time.
    MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude astronomical transits
  }
  GROUP BY ?time
  HAVING (?count > 1)
} AS %get_duplicate_times
WHERE
{
  INCLUDE %get_duplicate_times
  ?item p:P585 / psv:P585 ?fullvalue.
  ?fullvalue wikibase:timePrecision 11 . # Precision is date
  ?fullvalue wikibase:timeValue ?time.
  MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude astronomical transits
}
Try it!
--Dipsacus fullonum (talk) 07:03, 1 July 2020 (UTC)Reply
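The GROUP BY ?time ... HAVING (?count > 1) pattern above is the SPARQL equivalent of counting occurrences and keeping only those seen more than once. A small Python analogy (duplicate_dates is a hypothetical helper, shown only to illustrate the grouping logic):

```python
from collections import Counter

def duplicate_dates(dates):
    """Return {date: count} for dates occurring more than once,
    mirroring GROUP BY ?time ... HAVING (?count > 1)."""
    return {d: n for d, n in Counter(dates).items() if n > 1}
```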
@Bouzinac: Here is a version which also excludes calendar-day items from both the count and the list:
SELECT ?item ?time ?count
WITH
{
  SELECT ?time (COUNT(?time) AS ?count)
  WHERE
  {
    ?item p:P585 / psv:P585 ?fullvalue.
    ?fullvalue wikibase:timePrecision 11 . # Precision is date
    ?fullvalue wikibase:timeValue ?time.
    MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude astronomical transits
    MINUS { ?item (wdt:P31/wdt:P279*) wd:Q47150325. } # exclude calendar dates in themselves
  }
  GROUP BY ?time
  HAVING (?count > 1)
} AS %get_duplicate_times
WHERE
{
  INCLUDE %get_duplicate_times
  ?item p:P585 / psv:P585 ?fullvalue.
  ?fullvalue wikibase:timePrecision 11 . # Precision is date
  ?fullvalue wikibase:timeValue ?time.
  MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude astronomical transits
  MINUS { ?item (wdt:P31/wdt:P279*) wd:Q47150325. } # exclude calendar dates in themselves
}
Try it!
--Dipsacus fullonum (talk) 07:12, 1 July 2020 (UTC)Reply
@Dipsacus fullonum: the latter is great, found a duplicate on the first try! Lots of work!

Out of curiosity, why duplicate the MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } and MINUS { ?item (wdt:P31/wdt:P279*) wd:Q47150325. } clauses in both the WITH part and the second WHERE? Bouzinac (talk) 08:50, 1 July 2020 (UTC)Reply
@Bouzinac: The subquery finds and counts duplicate values, and has the MINUS parts so that only duplicates that aren't astronomical transits or calendar dates are counted. The main part of the query finds all items with the duplicate dates, and has the MINUS parts in order to avoid listing items for astronomical transits or calendar dates. E.g. if some date has a duplicate count of, say, 2, it will (most likely) also have a calendar-date item which isn't included in the count. That item is removed by the second set of MINUS. --Dipsacus fullonum (talk) 09:05, 1 July 2020 (UTC)Reply
  • Do you see why I am getting false duplicates here?
    SELECT ?item ?time ?count
    WITH
    {
      SELECT ?time (COUNT(?time) AS ?count)
      WHERE
      {
        ?item p:P585 / psv:P585 ?fullvalue.
        ?fullvalue wikibase:timePrecision 11 . # Precision is date
        ?fullvalue wikibase:timeValue ?time.
    #     FILTER ((?time < "1900-01-01"^^xsd:dateTime))
        MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude astronomical transits
        MINUS { ?item (wdt:P31/wdt:P279*) wd:Q47150325. } # exclude calendar dates in themselves
        MINUS { ?item (wdt:P31/wdt:P279*) wd:Q14795564. } # exclude OTHER calendar dates
        MINUS { ?item (wdt:P31/wdt:P279*) wd:Q2334719. } # exclude trials
        ?item (wdt:P31/wdt:P279*) wd:Q141022. # eclipses
      }
      GROUP BY ?time
      HAVING (?count > 1)
    } AS %get_duplicate_times
    WHERE
    {
      INCLUDE %get_duplicate_times
      ?item p:P585 / psv:P585 ?fullvalue.
      ?fullvalue wikibase:timePrecision 11 . # Precision is date
      ?fullvalue wikibase:timeValue ?time.
    #      FILTER ((?time < "1900-01-01"^^xsd:dateTime))
      MINUS { ?item (wdt:P31/wdt:P279*) wd:Q6888. } # exclude astronomical transits
      MINUS { ?item (wdt:P31/wdt:P279*) wd:Q47150325. } # exclude calendar dates in themselves
      MINUS { ?item (wdt:P31/wdt:P279*) wd:Q14795564. } # exclude OTHER calendar dates
      MINUS { ?item (wdt:P31/wdt:P279*) wd:Q2334719. } # exclude trials
      ?item (wdt:P31/wdt:P279*) wd:Q141022. # eclipses
    } ORDER BY ?time
    
    Try it!
    Thank you Bouzinac (talk) 10:45, 1 July 2020 (UTC)Reply
    @Bouzinac: Yes, it is because there are multiple matches for the same items in the graph pattern ?item (wdt:P31/wdt:P279*) wd:Q141022. # eclipses
    e.g. giving 2 results for Q7556373. You can avoid the duplicate results by changing SELECT to SELECT DISTINCT. --Dipsacus fullonum (talk) 10:58, 1 July 2020 (UTC)Reply

Q229 or Q41

edit

Hello. Sorry for writing on your talk page and not on the Request a query page. It is just a small change: how can this query search for P27 -> either Q229 or Q41?

SELECT ?item ?itemLabel
{
    ?item wdt:P27 wd:Q229 .
    ?item wdt:P31 wd:Q5 .
    ?item wdt:P735 wd:Q87263878 .
    SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
Try it!

Data Gamer play 11:13, 23 July 2020 (UTC)Reply

Hi Data Gamer. By using the VALUES keyword to list allowed values for a variable:
SELECT ?item ?itemLabel
{
    VALUES ?countries { wd:Q229 wd:Q41 }
    ?item wdt:P27 ?countries .
    ?item wdt:P31 wd:Q5 .
    ?item wdt:P735 wd:Q87263878 .
    SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
Try it!
--Dipsacus fullonum (talk) 11:25, 23 July 2020 (UTC)Reply

Thanks! Data Gamer play 11:36, 23 July 2020 (UTC)Reply

Wikipedia articles

edit

Hello. Can you change the query so that the results include a column with the Wikipedia article (if wikibase:language "en", the English Wikipedia; if wikibase:language "el", the Greek Wikipedia)?

SELECT ?item ?itemLabel
{
    VALUES ?countries { wd:Q229 wd:Q41 wd:Q15240466 }
    ?item wdt:P27 ?countries .
    ?item wdt:P31 wd:Q5 .
    ?item wdt:P735 wd:Q87263878 .
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
Try it!

Data Gamer play 15:33, 27 July 2020 (UTC)Reply

@Data Gamer: Sure, but you have to select the Wikipedia language manually. Here is the version for English Wikipedia. Change <https://en.wikipedia.org/> to <https://el.wikipedia.org/> for the Greek Wikipedia.
SELECT ?item ?itemLabel ?wikipedia_article
{
    VALUES ?countries { wd:Q229 wd:Q41 wd:Q15240466 }
    ?item wdt:P27 ?countries .
    ?item wdt:P31 wd:Q5 .
    ?item wdt:P735 wd:Q87263878 .
    OPTIONAL { ?wikipedia_article schema:about ?item ; schema:isPartOf <https://en.wikipedia.org/> . }
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
Try it!
--Dipsacus fullonum (talk) 15:54, 27 July 2020 (UTC)Reply

Thanks! Data Gamer play 16:11, 27 July 2020 (UTC)Reply

Query performance questions

edit

Hello Dipsacus fullonum,
I asked your help some weeks ago about a query to get value, qualifiers, unit, etc.
My main purpose was to solve an issue with the query currently used in the python library WikidataIntegrator.
I tried to implement your query, but I have a performance issue when asking for a widely used property (like P31) that I don't have with the old query. Is it possible to optimize the query? I tried to understand the difference between the two queries and adapt the new one accordingly, without success.
The query currently used in WikidataIntegrator: this one
Your query but adapted to WikidataIntegrator: this one
Thank you for your help,
Best Regards,
Myst (talk) 08:06, 3 October 2020 (UTC)Reply

Hi Myst. I see that there is a problem with ?property being used out of scope in the query I made. Sorry about that.
When the query is generated by a program, there are some optimizations the program can do. The VALUES assignments with a single value are superfluous: the program can just insert the value where needed instead of a variable (which also fixes the scope problem). Triples like ?property wikibase:claim ?claim . and ?property wikibase:statementValue ?statementValue . are superfluous too, because the program can construct the value of ?claim as <http://www.wikidata.org/prop/P31> and the value of ?propertyStatement as <http://www.wikidata.org/prop/statement/P31>, as is already done in the currently used query linked above.
You can also optimize by looking up the property type first and then issuing different queries depending on whether the property type is quantity (having a unit) or something else. Going this way, you can also make specialized queries for other property types, e.g. getting precision and calendar model for time values. --Dipsacus fullonum (talk) 06:35, 5 October 2020 (UTC)Reply
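The program-side URI construction described above can be sketched as follows (the helper name claim_uris is hypothetical; the URIs are the standard expansions of the p:, ps: and psv: prefixes, which a query generator can inline instead of binding them through wikibase:claim etc.):

```python
def claim_uris(pid: str) -> dict:
    """Expand a property ID (e.g. 'P31') into fully qualified predicate
    URIs that a generated SPARQL query can inline directly, instead of
    looking them up via wikibase:claim / wikibase:statementValue."""
    base = "http://www.wikidata.org/prop"
    return {
        "p": f"<{base}/{pid}>",                      # statement node
        "ps": f"<{base}/statement/{pid}>",           # simple statement value
        "psv": f"<{base}/statement/value/{pid}>",    # full value node
    }
```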
Hello Dipsacus fullonum,
Thank you for your answer.
If I understood correctly, I think something like this is a correct query.
Can you confirm me it's correct ?
Thank you
BR, Myst (talk) 19:50, 8 October 2020 (UTC)Reply
@Myst: Yes, I think the query is correct, but I haven't tested it extensively. It may (or may not) be faster to avoid the first UNION and instead use the same construct as in https://w.wiki/evA, like this:
#Tool: wbi_fastrun _query_data
SELECT ?sid ?item ?v ?unit ?pq ?qval ?qunit
WHERE
{
  ?item <http://www.wikidata.org/prop/direct/P699> ?zzP699 .

  # Get statement ID and the simple value for the statements
  ?item <http://www.wikidata.org/prop/P31> ?sid .
  ?sid <http://www.wikidata.org/prop/statement/P31> ?v .

  # Get the optional unit for statements with a quantity value
  OPTIONAL
  {
    ?sid <http://www.wikidata.org/prop/statement/value/P31> / wikibase:quantityUnit ?unit .
  }

  # Get qualifiers
  OPTIONAL
  {
    {
      # Get simple values for qualifiers which are not of type quantity
      ?sid ?propQualifier ?qval .
      ?pq wikibase:qualifier ?propQualifier .
      ?pq wikibase:propertyType ?qualifer_property_type .
      FILTER (?qualifer_property_type != wikibase:Quantity)
    }
    UNION
    {
      # Get amount and unit for qualifiers of type quantity
      ?sid ?pqv [wikibase:quantityAmount ?qval; wikibase:quantityUnit ?qunit] .
      ?pq wikibase:qualifierValue ?pqv .
    }
  }
} ORDER BY ?sid OFFSET 0 LIMIT 10000
Try it!
It may also be faster to move the first 3 triples to a subquery with the LIMIT and OFFSET, and drop them from the main query. But by doing so, you can get more than 10000 results, as there is a result for each qualifier of a statement. (The same is true for https://w.wiki/evA.) If that isn't a problem, the query could be:
#Tool: wbi_fastrun _query_data
SELECT ?sid ?item ?v ?unit ?pq ?qval ?qunit
WHERE
{
  {
    SELECT ?item ?sid ?v
    WHERE
    {
        ?item <http://www.wikidata.org/prop/direct/P699> ?zzP699 .

        # Get statement ID and the simple value for the statements
        ?item <http://www.wikidata.org/prop/P31> ?sid .
        ?sid <http://www.wikidata.org/prop/statement/P31> ?v .
    }
    ORDER BY ?sid OFFSET 0 LIMIT 10000
  }

  # Get the optional unit for statements with a quantity value
  OPTIONAL
  {
    ?sid <http://www.wikidata.org/prop/statement/value/P31> / wikibase:quantityUnit ?unit .
  }

  # Get qualifiers
  OPTIONAL
  {
    {
      # Get simple values for qualifiers which are not of type quantity
      ?sid ?propQualifier ?qval .
      ?pq wikibase:qualifier ?propQualifier .
      ?pq wikibase:propertyType ?qualifer_property_type .
      FILTER (?qualifer_property_type != wikibase:Quantity)
    }
    UNION
    {
      # Get amount and unit for qualifiers of type quantity
      ?sid ?pqv [wikibase:quantityAmount ?qval; wikibase:quantityUnit ?qunit] .
      ?pq wikibase:qualifierValue ?pqv .
    }
  }
}
Try it!
I don't think it is possible to avoid the UNION for getting qualifiers in a similar way, because some qualifiers (depending on property type) don't have nodes for full values, only simple values. --Dipsacus fullonum (talk) 21:09, 8 October 2020 (UTC)Reply

Wikidata common service | Mass editing of local language caption

edit

A copied message from Request page.

Is there any way to add captions for images on Commons which have captions in English but not in Arabic? Can we find those particular images by category and add captions?

There are over 1.5 million captions in English and fewer than 19,000 captions in Arabic, so making a list of all images with an English caption and no Arabic caption will time out, as the list would be too big. You need to somehow limit the group of images you are working with. --Dipsacus fullonum (talk) 20:19, 25 October 2020 (UTC)Reply

If we get the query for that, we could add a limit.--Akbarali (talk) 10:55, 31 October 2020 (UTC)Reply

Answered at Wikidata:Request a query. --Dipsacus fullonum (talk) 15:48, 31 October 2020 (UTC)Reply

Wikimedia Commons SPARQL service federation

edit

Hi, I noticed at Wikidata:Request_a_query/Archive/2020/10#Querry_for_Wikimedia_Commons that you understand Wikimedia Commons + Wikidata federation (at least better than I do). I was trying to do a federated query but failed. The goal was to find uploaders of pictures linked to Czech municipalities via image (P18). The query went like this:

SELECT ?username (count(distinct ?image) as ?count) WITH
{
  SELECT ?item ?image
  WHERE
  {
    SERVICE <https://query.wikidata.org/sparql>
    {
      ?item wdt:P31 wd:Q5153359 .
      ?item wdt:P18 ?image .
    }
  }
} AS %get_items
WHERE
{
  INCLUDE %get_items
  # somehow link ?image to its structured file ... reverse of schema:contentUrl ... sort of like ?image HASFILE ?file .
  ?file p:P170/pq:P4174 ?username . 

} group by ?username
Try it!

Is this somehow possible? Thank you in advance, Vojtěch Dostál (talk) 21:52, 15 November 2020 (UTC)Reply

Hi Vojtěch Dostál. I think that is possible, but the query you outline will not find the uploader. It will find the creator, if they happen to have a Wikimedia username, and that may be someone other than the uploader. You can find the uploader for all images by using the MWAPI service to call the Commons MediaWiki API, although I'm not sure it is possible for all 6328 images in one query without a timeout. So do you want the creator or the uploader of the images? --Dipsacus fullonum (talk) 22:45, 15 November 2020 (UTC)Reply
Hi! Yeah, sorry, I should have made myself clearer, although the uploader will be the same as creator in most of these cases. Anyway, querying for "creator" would be just fine for this :). Thank you very much, Vojtěch Dostál (talk) 06:46, 16 November 2020 (UTC)Reply
@Vojtěch Dostál: I read the documentation at mw:Extension:WikibaseMediaInfo/RDF mapping and thought that I could easily make a query for you. However, it turned out that the documentation is wrong, and it seems there are no triples in WCQS to go from filenames to the media objects (M-numbers). I asked for help yesterday on the documentation talk page at mediawiki.org. If I don't get an answer from the development team today, I will try Phabricator. I will return when I know more. --Dipsacus fullonum (talk) 09:44, 17 November 2020 (UTC)Reply
Thank you - I see the discussion. Happy to see that I might not be completely ignorant of something elementary :-). Thank you for looking into the issue. Vojtěch Dostál (talk) 09:49, 17 November 2020 (UTC)Reply
@Vojtěch Dostál: Sorry for the delay. I found out how to construct the content URL from the filename. It is explained in mw:Manual:$wgHashedUploadDirectory. Using that info I made this query:
SELECT ?username (COUNT(DISTINCT ?file) AS ?count)
WITH
{
  SELECT ?item ?image ?filename ?contentUrl
  WHERE
  {
    SERVICE <https://query.wikidata.org/sparql>
    {
      ?item wdt:P31 wd:Q5153359 .
      ?item wdt:P18 ?image .
    }
    BIND (REPLACE(wikibase:decodeUri(SUBSTR(STR(?image), 52)), " ", "_") AS ?filename)
    BIND (MD5(?filename) AS ?MD5)
    BIND (URI(CONCAT("https://upload.wikimedia.org/wikipedia/commons/", SUBSTR(?MD5, 1, 1), "/", SUBSTR(?MD5, 1, 2), "/", ?filename)) As ?contentUrl)
  }
} AS %get_items
WITH
{
  SELECT ?file
  WHERE
  {
    INCLUDE %get_items
    ?file schema:contentUrl ?contentUrl .
  }
} AS %get_files
WHERE
{
  INCLUDE %get_files
  ?file p:P170 / pq:P4174 ?username .
}
GROUP BY ?username
Try it!
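The BIND chain above follows MediaWiki's hashed upload directory layout (first one and two hex characters of the MD5 of the underscore form of the title). A rough Python equivalent (commons_content_url is an illustrative helper, not a real API; unusual titles may need fuller percent-encoding than shown):

```python
import hashlib
import urllib.parse

def commons_content_url(title: str) -> str:
    """Build the upload.wikimedia.org URL for a Commons file title,
    following MediaWiki's hashed upload directory layout: the path
    contains md5[0] and md5[0:2] of the underscore form of the name."""
    name = title.replace(" ", "_")
    md5 = hashlib.md5(name.encode("utf-8")).hexdigest()
    return ("https://upload.wikimedia.org/wikipedia/commons/"
            f"{md5[0]}/{md5[:2]}/{urllib.parse.quote(name)}")
```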
However another problem is that it seems to me that many images are missing in WCQS. The Wikidata federated call gives 6328 distinct values for ?image, but the query only has 1013 values of ?file. All constructed values of ?contentUrl seems to be good so I don't think these are the problem. An example of a missing file in WCQS is sdc:M47869727. This query has no results:
SELECT *
WHERE
{
  sdc:M47869727 ?pred ?object.
}
Try it!
But http://commons.wikimedia.org/entity/M47869727 exists and structured data claims was added at 30 September 2020. So 5313 out of 6328 files seems to be missing in WCQS and I don't have any explanation for this. --Dipsacus fullonum (talk) 13:14, 2 December 2020 (UTC)Reply
Thank you, that is extremely useful! I can apply the same approach to similar queries in future. The missing files are really weird, I see it too when I drop the aggregation from the query above. Maybe we could try to ask Lucas Werkmeister, who commented at mw:Topic:Vxuqddsgciypooid, for his thoughts on this? Vojtěch Dostál (talk) 13:34, 2 December 2020 (UTC)Reply
@Vojtěch Dostál: I created phab:T269302 for the missing files in WCQS. --Dipsacus fullonum (talk) 00:55, 3 December 2020 (UTC)Reply

So, Zbyszko suggests correcting the query this way:

SELECT ?username (COUNT(DISTINCT ?file) AS ?count)
WITH
{
  SELECT ?item ?image ?filename ?contentUrl
  WHERE
  {
    SERVICE <https://query.wikidata.org/sparql>
    {
      ?item wdt:P31 wd:Q5153359 .
      ?item wdt:P18 ?image .
    }
    BIND (REPLACE(wikibase:decodeUri(SUBSTR(STR(?image), 52)), " ", "_") AS ?filename)
    BIND (REPLACE(SUBSTR(STR(?image), 52), "%20", "_") AS ?filenameUnencoded)
    BIND (MD5(?filename) AS ?MD5)
    BIND (URI(CONCAT("https://upload.wikimedia.org/wikipedia/commons/", SUBSTR(?MD5, 1, 1), "/", SUBSTR(?MD5, 1, 2), "/", ?filenameUnencoded)) As ?contentUrl)
  }
} AS %get_items
WITH
{
  SELECT ?file
  WHERE
  {
    INCLUDE %get_items
    ?file schema:contentUrl ?contentUrl .
  }
} AS %get_files
WHERE
{
  INCLUDE %get_files
  ?file p:P170 / pq:P4174 ?username .
}
GROUP BY ?username
Try it!

This query now gives me 4462 images in total, which is better but still does not seem to be the full number. What do you think? Vojtěch Dostál (talk) 14:32, 4 March 2021 (UTC)Reply

@Vojtěch Dostál: How do you get the number 4462? If I modify the query to count the images, I get 6326 images found at Wikidata and 6317 files found at WCQS. The small difference of 9 could be images without structured data, or new images since the last weekly update of WCQS:
SELECT (COUNT(DISTINCT ?image) AS ?images) (COUNT(DISTINCT ?file) AS ?files)
WITH
{
  SELECT ?item ?image ?filename ?contentUrl
  WHERE
  {
    SERVICE <https://query.wikidata.org/sparql>
    {
      ?item wdt:P31 wd:Q5153359 .
      ?item wdt:P18 ?image .
    }
    BIND (REPLACE(wikibase:decodeUri(SUBSTR(STR(?image), 52)), " ", "_") AS ?filename)
    BIND (REPLACE(SUBSTR(STR(?image), 52), "%20", "_") AS ?filenameUnencoded)
    BIND (MD5(?filename) AS ?MD5)
    BIND (URI(CONCAT("https://upload.wikimedia.org/wikipedia/commons/", SUBSTR(?MD5, 1, 1), "/", SUBSTR(?MD5, 1, 2), "/", ?filenameUnencoded)) As ?contentUrl)
  }
} AS %get_items
WHERE
{
  INCLUDE %get_items
  OPTIONAL { ?file schema:contentUrl ?contentUrl . }
}
Try it!
--Dipsacus fullonum (talk) 16:29, 4 March 2021 (UTC)Reply
I got to the number using this modified query:
SELECT ?username ?file 
WITH
{
  SELECT ?item ?image ?filename ?contentUrl
  WHERE
  {
    SERVICE <https://query.wikidata.org/sparql>
    {
      ?item wdt:P31 wd:Q5153359 .
      ?item wdt:P18 ?image .
    }
    BIND (REPLACE(wikibase:decodeUri(SUBSTR(STR(?image), 52)), " ", "_") AS ?filename)
    BIND (REPLACE(SUBSTR(STR(?image), 52), "%20", "_") AS ?filenameUnencoded)
    BIND (MD5(?filename) AS ?MD5)
    BIND (URI(CONCAT("https://upload.wikimedia.org/wikipedia/commons/", SUBSTR(?MD5, 1, 1), "/", SUBSTR(?MD5, 1, 2), "/", ?filenameUnencoded)) As ?contentUrl)
  }
} AS %get_items
WITH
{
  SELECT ?file
  WHERE
  {
    INCLUDE %get_items
    ?file schema:contentUrl ?contentUrl .
  }
} AS %get_files
WHERE
{
  INCLUDE %get_files
  ?file p:P170 / pq:P4174 ?username .
}
Try it!

One reason for the lower number is definitely missing data (missing username of author). I already asked Multichill about that at User_talk:Multichill#Suggestion_re:author_structured_data. Vojtěch Dostál (talk) 16:56, 4 March 2021 (UTC)Reply

@Vojtěch Dostál: Yes, the difference is caused by not all files having a value for creator (P170) with a Wikimedia username (P4174) qualifier. Some files which should have it may be missing this data, but also many files aren't own works and so were not created by the user who uploaded them. --Dipsacus fullonum (talk) 17:15, 4 March 2021 (UTC)Reply
Yes. Anyway, thank you very much for your help on this. I really appreciated learning from your insights. Vojtěch Dostál (talk) 17:49, 4 March 2021 (UTC)Reply

Airports (again :)

edit

Hello, sorry to bother you again; I hope you'll have a better year in 2021 than in 2020! I'm having a difficulty when airport data has multiple monthly sources. E.g. with Cape Town International Airport (Q854130), and with that query, you'll see that when there are multiple statements for the same month+year, they are added up instead of being sampled. I wonder if changing (SUM(?numberperperiod) AS ?number) to (MAX(?numberperperiod) AS ?number) would have any drawbacks/side effects.

Any thoughts ? Bouzinac💬✒️💛 21:46, 8 January 2021 (UTC)Reply


SELECT ?year ?item ?itemLabel (MAX(?number) AS ?passengers)
  (SAMPLE(COALESCE(?reference_URL, ?monthly_reference_URL2)) AS ?sample_reference_URL)
WITH
{
  SELECT ?item ?statement ?year ?timevalue ?numberperperiod ?reference_URL
  WHERE
  {
    ?item wdt:P238 ?airport_code
    VALUES ?airport_code 
    {
 "CPT"
    }
    ?item p:P3872 ?statement.
    ?statement pqv:P585 ?timevalue;
               ps:P3872 ?numberperperiod.
    ?timevalue wikibase:timeValue ?date.
    OPTIONAL { ?statement pq:P518 ?applies. }
    OPTIONAL { ?statement prov:wasDerivedFrom / (pr:P854|pr:P4656) ?reference_URL. }
    FILTER (BOUND(?applies)=false || ?applies = wd:Q2165236 )
    MINUS { ?statement wikibase:rank wikibase:DeprecatedRank }
    BIND (YEAR(?date) AS ?year)
    FILTER (?year >1949).
    FILTER (?year < YEAR(NOW()))
  }
} AS %airport
WHERE
{
  {
    # Get the sum of monthly values within a year
    SELECT ?item ?year (SUM(?numberperperiod) AS ?number) (SAMPLE(?monthly_reference_URL) AS ?monthly_reference_URL2)
    WHERE
    {
      # Get a sample reference URL for each monthly value
      {
        SELECT ?item ?year ?numberperperiod (SAMPLE(?reference_URL) AS ?monthly_reference_URL)
        WHERE
        {
          INCLUDE %airport
          ?timevalue wikibase:timePrecision ?prec.
          FILTER (?prec > 9)
        }
        GROUP BY ?item ?statement ?year ?numberperperiod
        # Include ?statement in the GROUP BY because ?numberperperiod may not be unique
      }
    }
    GROUP BY ?item ?year
  }
  UNION
  {
    ?timevalue wikibase:timePrecision 9 .
    BIND (?numberperperiod AS ?number)
    BIND (?reference_URL AS ?sample_reference_URL)
    INCLUDE %airport
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr". }
}
GROUP BY ?item ?itemLabel ?year
ORDER BY ?item DESC (?year)
Try it!
@Bouzinac: Hi. Thank you and I wish the same for you.
I looked at Cape Town International Airport (Q854130) and I cannot see any month with more than one value for patronage (P3872). Which month do you think has multiple statements?
But it happens at other airports. This query gives a list of non-deprecated multiple values for months:
SELECT ?item ?itemLabel ?date ?count
{
  {
    SELECT ?item ?date (COUNT(?timevalue ) AS ?count)
    WHERE
    {
      ?item wdt:P238 ?airport_code .
      ?item p:P3872 ?statement .
      ?statement pqv:P585 ?timevalue .
      ?statement ps:P3872 ?numberperperiod .
      VALUES ?rank { wikibase:NormalRank wikibase:PreferredRank } 
      ?statement wikibase:rank ?rank.
      ?timevalue wikibase:timeValue ?date .
      ?timevalue wikibase:timePrecision 10 . # Precision is month
    }
    GROUP BY ?item ?itemLabel ?date
    HAVING (?count > 1)
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr". }
}
Try it!
If you just change SUM(?numberperperiod) to MAX(?numberperperiod), you will not get the sum over 12 months but only the highest monthly value for each year. E.g. for 2019 you would get the value 1072884 for Cape Town International Airport (Q854130) – the value for December 2019, which was the month with most passengers that year. When there are multiple values for some months for an airport, you have to first group by month and year to get just one number per month, and then group by year to get the sum over 12 months. I have modified the query below to do that:
SELECT ?year ?item ?itemLabel (MAX(?number) AS ?passengers)
  (SAMPLE(COALESCE(?reference_URL, ?monthly_reference_URL2)) AS ?sample_reference_URL)
WITH
{
  SELECT ?item ?statement ?date ?year ?timevalue ?numberperperiod ?reference_URL
  WHERE
  {
    ?item wdt:P238 ?airport_code
    VALUES ?airport_code 
    {
 "CPT"
    }
    ?item p:P3872 ?statement.
    ?statement pqv:P585 ?timevalue;
               ps:P3872 ?numberperperiod.
    ?timevalue wikibase:timeValue ?date.
    OPTIONAL { ?statement pq:P518 ?applies. }
    OPTIONAL { ?statement prov:wasDerivedFrom / (pr:P854|pr:P4656) ?reference_URL. }
    FILTER (BOUND(?applies)=false || ?applies = wd:Q2165236 )
    MINUS { ?statement wikibase:rank wikibase:DeprecatedRank }
    BIND (YEAR(?date) AS ?year)
    FILTER (?year >1949).
    FILTER (?year < YEAR(NOW()))
  }
} AS %airport
WHERE
{
  {
    # Get the sum of monthly values within a year
    SELECT ?item ?year (SUM(?max_numberperperiod) AS ?number) (SAMPLE(?monthly_reference_URL) AS ?monthly_reference_URL2)
    WHERE
    {
      # Get the maximal value and a sample reference URL for each month
      {
        SELECT ?item ?year (MAX(?numberperperiod) AS ?max_numberperperiod) (SAMPLE(?reference_URL) AS ?monthly_reference_URL)
        WHERE
        {
          INCLUDE %airport
          ?timevalue wikibase:timePrecision ?prec.
          FILTER (?prec > 9)
        }
        GROUP BY ?item ?year ?date
      }
    }
    GROUP BY ?item ?year
  }
  UNION
  {
    ?timevalue wikibase:timePrecision 9 .
    BIND (?numberperperiod AS ?number)
    BIND (?reference_URL AS ?sample_reference_URL)
    INCLUDE %airport
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr". }
}
GROUP BY ?item ?itemLabel ?year
ORDER BY ?item DESC (?year)
Try it!
--Dipsacus fullonum (talk) 22:46, 9 January 2021 (UTC)Reply
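The two-stage grouping described above can be illustrated in miniature with made-up numbers (a Python sketch of the aggregation logic only, not of the live query): duplicates within a month are collapsed with max() first, and only then are the de-duplicated months summed per year.

```python
from collections import defaultdict

def yearly_totals(monthly_rows):
    """Two-stage aggregation: rows are (year, month, value) tuples.
    Stage 1 groups by (year, month) and keeps max() of duplicates;
    stage 2 sums the de-duplicated monthly values per year."""
    per_month = defaultdict(list)
    for year, month, value in monthly_rows:
        per_month[(year, month)].append(value)
    per_year = defaultdict(int)
    for (year, _month), values in per_month.items():
        per_year[year] += max(values)
    return dict(per_year)
```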

Yes, Matěj Suchánek's bot compressed duplicated identical values, so the CPT problem got solved thanks to his bot. Now your correction will be useful when, say, two different sources state X and X' for the same month. I'll check whether your correction has other side effects with, say, airports having unusual months (for instance separate international and domestic statistics) that need to be summed up. Checking in progress :) Thanks Dipsacus fullonum

Airports (again #2)

edit

Hello dude, thank you again for your help with the queries. Here's another request, if you could help? The need is simply to know the ranks of airports for, say, 2020.

SELECT ?year ?item ?itemLabel (MAX(?number) AS ?passengers)
  (SAMPLE(COALESCE(?reference_URL, ?monthly_reference_URL2)) AS ?sample_reference_URL)
WITH
{
  SELECT ?item ?statement ?date ?year ?timevalue ?numberperperiod ?reference_URL
  WHERE
  {
?item (wdt:P31/wdt:P279*) wd:Q62447.
    ?item p:P3872 ?statement.
    ?statement pqv:P585 ?timevalue;
               ps:P3872 ?numberperperiod.
    ?timevalue wikibase:timeValue ?date.
    OPTIONAL { ?statement pq:P518 ?applies. }
    OPTIONAL { ?statement prov:wasDerivedFrom / (pr:P854|pr:P4656) ?reference_URL. }
    FILTER (BOUND(?applies)=false || ?applies = wd:Q2165236 )
    MINUS { ?statement wikibase:rank wikibase:DeprecatedRank }
    BIND (YEAR(?date) AS ?year)
    FILTER (?year=2020).
  }
} AS %airport
WHERE
{
  {
    # Get the sum of monthly values within a year
    SELECT ?item ?year (SUM(?max_numberperperiod) AS ?number) (SAMPLE(?monthly_reference_URL) AS ?monthly_reference_URL2)
    WHERE
    {
      # Get the maximal value and a sample reference URL for each unique month
      {
        SELECT ?item ?year (MAX(?numberperperiod) AS ?max_numberperperiod) (SAMPLE(?reference_URL) AS ?monthly_reference_URL)
        WHERE
        {
          INCLUDE %airport
          ?timevalue wikibase:timePrecision ?prec.
          FILTER (?prec > 9) # precision of month or finer
        }
        GROUP BY ?item ?year ?date
      }
    }
    GROUP BY ?item ?year
  }
  UNION
  {
    ?timevalue wikibase:timePrecision 9 .
    BIND (?numberperperiod AS ?number)
    BIND (?reference_URL AS ?sample_reference_URL)
    INCLUDE %airport
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr". }
}
GROUP BY ?item ?itemLabel ?year
ORDER BY desc(?passengers)
LIMIT 50
Try it!

The problem is that, for instance, Tokyo International Airport (Q204853) has data only from January to November. Would it be possible to show whether an airport has data for the whole year ("2020") or only up to some month (the max of the monthly values)? That would help show that HND is yet to be updated. Thanks again! Bouzinac💬✒️💛 21:25, 3 February 2021 (UTC)Reply

@Bouzinac: This query lists all airports with some monthly passenger values for 2020, but not for all months. Some of these also have a number for the whole year; these are indicated with the ?have_value_for_year variable. It is not always the last months of the year that are missing; in some cases it is the first months. You can see that in the variables ?number_of_months, ?first_month and ?last_month. Any airport not in the list has either no monthly values or values for all 12 months.
SELECT ?year ?item ?itemLabel ?number_of_months ?first_month ?last_month ?have_value_for_year
WITH
{
  SELECT ?item ?statement ?date ?year ?timevalue ?numberperperiod ?reference_URL
  WHERE
  {
    ?item (wdt:P31/wdt:P279*) wd:Q62447.
    ?item p:P3872 ?statement.
    ?statement pqv:P585 ?timevalue;
               ps:P3872 ?numberperperiod.
    ?timevalue wikibase:timeValue ?date.
    OPTIONAL { ?statement pq:P518 ?applies. }
    OPTIONAL { ?statement prov:wasDerivedFrom / (pr:P854|pr:P4656) ?reference_URL. }
    FILTER (BOUND(?applies)=false || ?applies = wd:Q2165236 )
    MINUS { ?statement wikibase:rank wikibase:DeprecatedRank }
    BIND (YEAR(?date) AS ?year)
    FILTER (?year=2020).
  }
} AS %airport
WHERE
{
  {
    SELECT ?item (COUNT(DISTINCT ?date) AS ?number_of_months) (MAX(?date) AS ?last_month) (MIN(?date) AS ?first_month)
    WHERE
    {
      INCLUDE %airport
      ?timevalue wikibase:timePrecision 10 . # Precision is month
    }
    GROUP BY ?item
    HAVING (?number_of_months != 12)
  }
  OPTIONAL
  {
    {
      SELECT ?item
      WHERE
      {
        INCLUDE %airport
        ?timevalue wikibase:timePrecision 9 . # Precision is year
      }
      GROUP BY ?item
    }
    BIND ("yes" AS ?have_value_for_year)
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr". }
}
Try it!
--Dipsacus fullonum (talk) 06:26, 4 February 2021 (UTC)Reply

List of people with the qualifier date of death?


By now I am fairly comfortable with reasonably simple queries. However, I would now like to know how to find is a list of (P360)human (Q5) with qualifier date of death (P570) (e.g. deaths in 1978 (Q2618431)), but I can't quite find any examples I can make sense of. --Hjart (talk) 16:03, 23 February 2021 (UTC)Reply

@Hjart: The simple version:
SELECT ?item ?itemLabel ?tidspunkt
WHERE
{
  ?item p:P360 ?statement .
  ?statement ps:P360 wd:Q5 .
  ?statement pq:P570 ?tidspunkt .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "da,en" . }
}
Try it!
And with the precision of the time included:
SELECT ?item ?itemLabel ?tidspunkt
 (IF(?præc = 9, "år",
     IF(?præc = 10, "måned", "andet")
     ) AS ?præcision)
WHERE
{
  ?item p:P360 ?statement .
  ?statement ps:P360 wd:Q5 .
  ?statement pqv:P570 ?tidspunkt_v .
  ?tidspunkt_v wikibase:timeValue ?tidspunkt .
  ?tidspunkt_v wikibase:timePrecision ?præc .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "da,en" . }
}
Try it!
--Dipsacus fullonum (talk) 16:57, 23 February 2021 (UTC)Reply
PS. The second query has one result fewer than the first, because list of Dutch military personnel killed in action during peace-keeping missions (Q2797589) is missing: it has the time "unknown value", which has no "full value" (pqv:) with a precision. --Dipsacus fullonum (talk) 17:03, 23 February 2021 (UTC)Reply
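As an untested sketch (assuming such items should be kept in the results), the pqv: lookup could be made OPTIONAL so that statements whose time has no full value are not dropped, with the precision column simply left empty for them:

SELECT ?item ?itemLabel ?tidspunkt ?præc
WHERE
{
  ?item p:P360 ?statement .
  ?statement ps:P360 wd:Q5 ;
             pq:P570 ?tidspunkt .
  # OPTIONAL keeps statements whose time has no full value (e.g. "unknown value")
  OPTIONAL { ?statement pqv:P570 [ wikibase:timePrecision ?præc ] . }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "da,en" . }
}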

Query

SELECT ?cast ?castLabel
WHERE
{
  SERVICE wikibase:mwapi
  {
    bd:serviceParam wikibase:endpoint "en.wikipedia.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "categorymembers" .
    bd:serviceParam mwapi:gcmtitle "Category:Christmas films" .
    bd:serviceParam mwapi:gcmlimit "max" .
    ?cast wikibase:apiOutputItem mwapi:item .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en" . } 
}
Try it!

Any idea how to change the query so it also includes the items found in the subcategories? --Trade (talk) 00:54, 27 February 2021 (UTC)Reply

@Trade: This will work if the number of subcategories doesn't exceed 256. It will only return the film items and not any items for the subcategories. (That can also be done, but with a different query.) As always for MWAPI calls, at most 10,000 results are returned.
SELECT ?cast ?castLabel
WHERE
{
  SERVICE wikibase:mwapi
  {
    bd:serviceParam wikibase:endpoint "en.wikipedia.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "search" .
    bd:serviceParam mwapi:gsrsearch 'deepcat:"Christmas films"' .
    bd:serviceParam mwapi:gsrlimit "max" .
    ?cast wikibase:apiOutputItem mwapi:item .
  }
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en" . } 
}
Try it!
--Dipsacus fullonum (talk) 01:58, 27 February 2021 (UTC)Reply

Call for participation in the interview study with Wikidata editors


Dear Dipsacus fullonum,

I hope you are doing well,

I am Kholoud, a researcher at King’s College London, and I work on a project as part of my PhD research that develops a personalized recommendation system to suggest Wikidata items for the editors based on their interests and preferences. I am collaborating on this project with Elena Simperl and Miaojing Shi.

I would love to talk with you to know about your current ways to choose the items you work on in Wikidata and understand the factors that might influence such a decision. Your cooperation will give us valuable insights into building a recommender system that can help improve your editing experience.

Participation is completely voluntary. You have the option to withdraw at any time. Your data will be processed under the terms of UK data protection law (including the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018). The information and data that you provide will remain confidential; it will only be stored on the password-protected computer of the researchers. We will use the results anonymized to provide insights into the practices of the editors in item selection processes for editing and publish the results of the study to a research venue. If you decide to take part, we will ask you to sign a consent form, and you will be given a copy of this consent form to keep.

If you’re interested in participating and have 15-20 minutes to chat (I promise to keep the time!), please either contact me at kholoudsaa@gmail.com or use this form https://docs.google.com/forms/d/e/1FAIpQLSdmmFHaiB20nK14wrQJgfrA18PtmdagyeRib3xGtvzkdn3Lgw/viewform?usp=sf_link with your choice of the times that work for you.

I’ll follow up with you to figure out what method is the best way for us to connect.

Please contact me using the email mentioned above if you have any questions or require more information about this project.

Thank you for considering taking part in this research.

Regards

Kholoudsaa (talk) 00:41, 1 June 2021 (UTC)Reply

Files with coordinate location in categories not connected to Wikidata


Hello Dipsacus, only you can help me, with your immense experience with WCQS :) The query below is probably crap but somewhat works. Yet I think it only displays results from SOME subcategories of a given category. Where is the mistake? I'd especially love it if it only showed pictures from categories which are NOT connected to Wikidata yet, because that would tip me off about images which can be used to geo-localize the Commons category.


SELECT ?file ?coord WITH {
SELECT ?out ?depth WHERE {
  SERVICE <https://query.wikidata.org/bigdata/namespace/categories/sparql>
  {
    SERVICE mediawiki:categoryTree
     { bd:serviceParam mediawiki:start <https://commons.wikimedia.org/wiki/Category:Schools_in_Liberec_District> .
       bd:serviceParam mediawiki:direction "Reverse" .
       bd:serviceParam mediawiki:depth 3 .} 
  } 
} ORDER BY ASC(?depth) } as %categories WITH {
  
  SELECT ?out ?depth ?title ?member ?ns ?contentUrl WHERE {
    
include %categories
        
SERVICE <https://query.wikidata.org/sparql> {  
?out schema:name ?title .
SERVICE wikibase:mwapi {
     bd:serviceParam wikibase:endpoint "commons.wikipedia.org";
                     wikibase:api "Generator";
                     mwapi:generator "categorymembers";
                     mwapi:gcmtitle ?title;
                     mwapi:gcmlimit "max".
    
     ?member wikibase:apiOutput mwapi:title.
     ?ns wikibase:apiOutput "@ns".
                        }
    BIND (CONCAT("http://commons.wikimedia.org/wiki/Special:FilePath/",REPLACE(SUBSTR(STR(?member),6)," ","_")) as ?image) .
    BIND (REPLACE(wikibase:decodeUri(SUBSTR(STR(?image), 52)), " ", "_") AS ?filename)
    BIND (REPLACE(SUBSTR(STR(?image), 52), "%20", "_") AS ?filenameUnencoded)
    BIND (MD5(?filename) AS ?MD5)
    BIND (URI(CONCAT("https://upload.wikimedia.org/wikipedia/commons/", SUBSTR(?MD5, 1, 1), "/", SUBSTR(?MD5, 1, 2), "/", ?filenameUnencoded)) As ?contentUrl)
FILTER (?ns = "6")
 } } } as %images WITH {
  select * WHERE {
include %images  

?file schema:contentUrl ?contentUrl .

  } } as %files where {
  
include %files
optional {?file wdt:P1259 ?coord1 .}
optional {?file wdt:P9149 ?coord2 .}
bind((coalesce(?coord2,?coord1)) as ?coord ) filter(bound(?coord)) .
}
Try it!

Thank you for your help in advance! Hope you are doing fine. Vojtěch Dostál (talk) 19:32, 5 January 2022 (UTC)Reply

Hi Vojtěch Dostál. The query does the opposite of what you want. The federated query to WDQS in the named subquery %images begins with the triple
?out schema:name ?title .
In other words, the query uses Wikidata to go from the full URI for a category to the title, and that means that all categories not connected to Wikidata are discarded in that step. You have to extract the category title without using Wikidata. --Dipsacus fullonum (talk) 21:01, 5 January 2022 (UTC)Reply
Ah, well. I actually don't need the titles of the categories, I just need all images that are inside those categories :-/. Are all wikibase-prefixed lines going to be a problem? Is there a way to circumvent Wikidata and still obtain a list of images inside a category tree? Thank you. Vojtěch Dostál (talk) 16:18, 6 January 2022 (UTC)Reply
@Vojtěch Dostál: Yes, you do need the titles of the categories for the MWAPI service call to get category members. Fortunately, the title is easy to derive from the URI by using
BIND (wikibase:decodeUri(SUBSTR(STR(?out), 36)) AS ?title)
I made that change, corrected how ?contentUrl is constructed below (the former code did not work for filenames with non-ASCII characters), and made a few other changes:
SELECT DISTINCT ?file ?coord
WITH
{
  SELECT ?out
  WHERE
  {
    SERVICE <https://query.wikidata.org/bigdata/namespace/categories/sparql>
    {
      SERVICE mediawiki:categoryTree
      {
        bd:serviceParam mediawiki:start <https://commons.wikimedia.org/wiki/Category:Schools_in_Liberec_District> .
        bd:serviceParam mediawiki:direction "Reverse" .
        bd:serviceParam mediawiki:depth 3
      }
    }
  }
}
AS %categories
WITH
{
  SELECT ?contentUrl ?filename
  WHERE
  {
    include %categories
    BIND (wikibase:decodeUri(SUBSTR(STR(?out), 36)) AS ?title)
    SERVICE <https://query.wikidata.org/sparql>
    {
      SERVICE wikibase:mwapi
      {
        bd:serviceParam wikibase:endpoint "commons.wikipedia.org" ;
                        wikibase:api "Generator" ;
                        mwapi:generator "categorymembers" ;
                        mwapi:gcmtitle ?title ;
                        mwapi:gcmlimit "max" ;
                        mwapi:gcmnamespace "6" .
        ?member wikibase:apiOutput mwapi:title
      }
      BIND (REPLACE(SUBSTR(STR(?member), 6), " ", "_") AS ?filename)
      BIND (MD5(?filename) AS ?MD5)
      BIND (URI(CONCAT("https://upload.wikimedia.org/wikipedia/commons/", SUBSTR(?MD5, 1, 1), "/",
                       SUBSTR(?MD5, 1, 2), "/", ENCODE_FOR_URI(?filename))) AS ?contentUrl)
    }
  }
} AS %images
WITH
{
  SELECT ?file
  WHERE
  {
    include %images
    ?file schema:contentUrl ?contentUrl
  }
} AS %files
WHERE
{
  include %files
  OPTIONAL { ?file wdt:P1259 ?coord1 }
  OPTIONAL { ?file wdt:P9149 ?coord2 }
  BIND ((COALESCE(?coord2, ?coord1)) as ?coord )
  FILTER BOUND(?coord)
}
Try it!
I tried to exclude categories connected to Wikidata, but my attempts did not work and I don't know why. --Dipsacus fullonum (talk) 19:45, 6 January 2022 (UTC)Reply
Thank you! That works really nicely! I'll be using that for matching items to categories. Vojtěch Dostál (talk) 18:29, 7 January 2022 (UTC)Reply

Thank you!

  The Wikidata Barnstar
Seems (?) no one has given you one of these! It was very kind of you to help me so promptly, mange tak! --Goldsztajn (talk) 09:20, 24 January 2022 (UTC)Reply
Thanks, Goldsztajn. Yes, this is my first barnstar at Wikidata, but I already have some from Danish Wikipedia. --Dipsacus fullonum (talk) 13:21, 24 January 2022 (UTC)Reply

Refine query on adjacent station (P197)


Hello! I once asked for a query and you responded very well: https://w.wiki/4wS7. May I ask why Yeongtong Station (Q218354) (and some others) is a false positive? I don't get it, since I wish to find only stations whose P197/P1192 are in a mess (not clearly stating which line goes with which neighbour). Your help would be much appreciated. Cheers Bouzinac💬✒️💛 13:25, 11 March 2022 (UTC)Reply

Hello Bouzinac. I assume you are asking why this P197 statement is in the results. It is because it has both a connecting line (P81) qualifier and a connecting service (P1192) qualifier, each with different values. Note that query has:
?next_stm ps:P197 ?next;
   pq:P81|pq:P1192 ?line1, ?line2 .
where the | sign means both qualifiers can be used for ?line1 and ?line2. If I understand correctly, that is an error, and you should remove pq:P81| from the query. --Dipsacus fullonum (talk) 17:52, 11 March 2022 (UTC)Reply
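If so, a minimal sketch of the corrected pattern (untested; it assumes the P81 alternative is dropped entirely) would read:

?next_stm ps:P197 ?next;
   pq:P1192 ?line1, ?line2 .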
Hi! Well, I have removed P81 in https://w.wiki/4xTg and there are still false positives such as Quiddestraße (Q479415), where the P197 statements do not mix like spaghetti and look fine. Cheers, Bouzinac💬✒️💛 15:57, 16 March 2022 (UTC)Reply
@Bouzinac: Well, no, it isn't a false positive. Quiddestraße (Q479415) isn't in the results as a value for ?item. But the ?item value Neuperlach Zentrum (Q479494) is found with Q479415 as the value for ?next. That is correct, as I see it, looking at the statement Q479494#Q479494$11c76036-481e-0cd1-cc5b-0ac555813438. --Dipsacus fullonum (talk) 13:44, 17 March 2022 (UTC)Reply
Oh my bad! I was looking at the wrong item :ç thanks for the clarity   Bouzinac💬✒️💛 14:26, 17 March 2022 (UTC)Reply

Last revision


Thanks again for your help! I didn't articulate my problem very clearly. I want to display a list for such a triple:

SELECT ?item ?revid WHERE {

 VALUES ?item_strings { "Q2" "Q5" } # --> to be replaced by: ?item wdt:P1087 ?Elo
 SERVICE wikibase:mwapi
 {
   bd:serviceParam wikibase:endpoint "www.wikidata.org" .
   bd:serviceParam wikibase:api "Generator" .
   bd:serviceParam mwapi:generator "revisions" .
   bd:serviceParam mwapi:prop "revisions" . 
   bd:serviceParam mwapi:titles ?item_strings .
   ?item wikibase:apiOutputItem mwapi:title .
   ?revid wikibase:apiOutput "revisions/rev/@revid" .
 }

}

But no matter how I tried to apply your code to this set, I can't. Help me, please. Игорь Темиров (talk) 19:01, 18 April 2022 (UTC)Reply

Hi Игорь Темиров. I cannot make the query include all items which have statements with P1087 because there are too many. The query below finds revision IDs for 100 items and takes ca. 8 seconds to run, so the limit to avoid timeout is probably below 1000 items. You can change the subquery to select the items you want, but it cannot be too many.
SELECT ?item ?revid
WITH
{
  SELECT DISTINCT ?item
  WHERE
  {
    ?item wdt:P1087 ?Elo .
  }
  LIMIT 100
} AS %items
WHERE
{
  INCLUDE %items
  BIND (STRAFTER(STR(?item), STR(wd:)) AS ?title)
  SERVICE wikibase:mwapi
  {
    bd:serviceParam wikibase:endpoint "www.wikidata.org" .
    bd:serviceParam wikibase:api "Generator" .
    bd:serviceParam mwapi:generator "revisions" .
    bd:serviceParam mwapi:prop "revisions" . 
    bd:serviceParam mwapi:titles ?title .
    ?revid wikibase:apiOutput "revisions/rev/@revid" .
  }
}
Try it!
--Dipsacus fullonum (talk) 21:00, 18 April 2022 (UTC)Reply
I suspected. But it's a good example to study. Thank you for your skill and patience. Thank you very much! Игорь Темиров (talk) 21:11, 18 April 2022 (UTC)Reply

Hello! Subways again ;)


How are you? May I ask you to amend that query so that I can spot the bad items that haven't got correct qualifiers (that is, no ?subwayLine_predLabel or no ?towardsLabel)? Thanks!

SELECT ?station ?stationLabel ?subwayLine_predLabel ?predLabel ?towardsLabel
WHERE
{
  VALUES ?search {
    wd:Q462201 # search criterion: the metro system
  }
  ?search wdt:P527 ?lignes . # the lines of this metro
  ?lignes wdt:P559 ?termini . # the termini of this metro
  ?station wdt:P31/wdt:P279* wd:Q928830 ; # metro station
           wdt:P361|wdt:P16 ?search ; # part of the searched network
           wdt:P81|wdt:P1192 ?subwayLine ;
           wdt:P197 ?pred .
  ?pred wdt:P625 ?coords_pred ;
        wdt:P81|wdt:P1192 ?subwayLine_pred .
  ?station p:P197 _:b1 .
  _:b1 ps:P197 ?pred ;
       pq:P5051 ?towards ;
       pq:P81|pq:P1192 ?line_pq .
  FILTER(?subwayLine_pred = ?lignes) # only keep the LINES if the connection is on the same line
  FILTER(?subwayLine = ?lignes)
  FILTER(?towards = ?termini)
  MINUS { ?station (wdt:P576|wdt:P582|wdt:P3999) ?dispar. } # no closed stations
  MINUS {
    ?station wdt:P5817|wdt:P5816 ?interdit.
    VALUES ?interdit { wd:Q811683 wd:Q63065035 wd:Q12377751 wd:Q97317113 wd:Q55653430 wd:Q30108381 wd:Q55570340 wd:Q11639308 wd:Q104664889 wd:Q110713763 }
  } # no special cases, under construction etc.
  ?pred p:P625 ?node_pred .
  ?node_pred a wikibase:BestRank .
  ?station p:P625 ?node_station .
  ?node_station a wikibase:BestRank .
  SERVICE wikibase:label {
    bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en,fr" .
    ?subwayLine rdfs:label ?layer .
    ?pred rdfs:label ?predLabel .
    ?subwayLine_pred rdfs:label ?subwayLine_predLabel .
    ?towards rdfs:label ?towardsLabel .
    ?station rdfs:label ?stationLabel .
  }
}
GROUP BY ?station ?stationLabel ?subwayLine_predLabel ?predLabel ?towardsLabel
Try it!

Bouzinac💬✒️💛 08:50, 21 April 2022 (UTC)Reply

Hi Bouzinac. I don't understand what the query should find. Can you describe the kind of bad items in more detail? --Dipsacus fullonum (talk) 10:53, 21 April 2022 (UTC)Reply
Sorry if I was unclear. I meant items like this one https://www.wikidata.org/w/index.php?title=Q406349&oldid=1622789490#P197 (just before my correction): having only simple P197 statement(s),
  • without the qualifiers P81|P1192
  • OR without the qualifier P5051
In other words, they should all have a P197 with both qualifiers, and I'd like to find/correct those that don't have both. Bouzinac💬✒️💛 11:37, 21 April 2022 (UTC)Reply
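A minimal sketch of a query that could find such statements (untested; the network wd:Q462201 is just an example value, and the station selection is simplified compared to the query above):

SELECT ?station ?stationLabel ?pred ?predLabel
WHERE
{
  ?station wdt:P361|wdt:P16 wd:Q462201 . # part of the network (example value)
  ?station p:P197 ?stm .
  ?stm ps:P197 ?pred .
  # keep only P197 statements missing at least one of the two expected qualifiers
  FILTER (NOT EXISTS { ?stm pq:P81|pq:P1192 ?line . } ||
          NOT EXISTS { ?stm pq:P5051 ?towards . })
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en,fr" . }
}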

Presidents and their spouses


Good afternoon! Here is a simple query about the wives of presidents:

SELECT ?p ?pLabel ?spouse ?spouseLabel WHERE {
  BIND(wd:Q30 AS ?country)
  ?country (p:P6/ps:P6) ?p.
  ?p wdt:P26 ?spouse.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". }
}
Try it!

How can I make it so that if there are two wives, they are displayed not on separate lines, but in the spouses field separated by commas? With thanks Игорь Темиров (talk) 19:51, 11 May 2022 (UTC)Reply

@Игорь Темиров: By using the GROUP_CONCAT aggregation function for ?spouse and/or ?spouseLabel, and GROUP BY for the other variables in the SELECT. To aggregate over a label, you also need to use the manual version of the label service:
SELECT ?p ?pLabel (GROUP_CONCAT(?spouse; separator=", ") AS ?spouses) (GROUP_CONCAT(?spouseLabel; separator=", ") AS ?spouseLabels) WHERE {
  BIND(wd:Q30 AS ?country)
  ?country (p:P6/ps:P6) ?p.
  ?p wdt:P26 ?spouse.
  SERVICE wikibase:label {
    bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en".
    ?p rdfs:label ?pLabel.
    ?spouse rdfs:label ?spouseLabel.
  }
}
GROUP BY ?p ?pLabel
Try it!

--Dipsacus fullonum (talk) 20:16, 11 May 2022 (UTC)Reply

Thank you very much for your help! Игорь Темиров (talk) 03:19, 12 May 2022 (UTC)Reply

Import of members of the Folketing


Using a home-made Bash/Perl/Gawk script, I have just taken the liberty of importing 35 of the articles about members of the Folketing that you have created recently. I would like to hear whether you have any comments on it. The script is currently not particularly readable, but here it is:

wdpolitikerDK () {
    curl -d "format=tsv" -d "psid=22269697"  "https://petscan.wmflabs.org/" | sed '1d; s/_/ /g' | gawk -F'\t' -v i="1" 'BEGIN{print "qid,P31,P27,P106,P39,Sdawiki,Dda,Lda,Ada,P569,P570"} FNR==NR{m[$1]=i; i++; next;} {cmd = "wikiget.awk -p -l da -w \"" $2 "\" | perl -n -e \47if ($_ =~ /([a-zA-Z æøåÆØÅ]+) \\(født ([0-9.]+ [a-z]+ [0-9]{4}).+, død ([0-9.]+ [a-z]+ [0-9]{4}).+\\) var en ([a-z æøå]+)/) {print \"$1#$2#$3#$4\"};\47"; cmd | getline p;if (p ~ /#/) {n=$2; sub(/ \(.+/,"",n); OFS=","; split(p,a,"#"); split(a[2],f," "); split(a[3],d," "); print ",Q5,Q756617,Q82955,Q12311817,#" $2 "#,#" a[4] sprintf(" \(%4d-%4d\)#,", f[3], d[3]) n, a[1], sprintf("+%4d-%02d-%02dT00:00:00Z/11,+%4d-%02d-%02dT00:00:00Z/11", f[3], m[f[2]], f[1], d[3], m[d[2]], d[1])}; p=""}' <(printf "januar februar marts april maj juni juli august september oktober november december" | tr " " "\n") - | tr '#' '"'
}

--Hjart (talk) 13:35, 12 June 2022 (UTC)Reply

Interesting. I didn't know that PetScan can be used non-interactively. I have an idea of what is going on, but my knowledge of Perl and Awk is too limited for me to follow the details. I assume that wikiget.awk fetches the wikitext of the articles. Why use Awk for that? I would probably use curl to make an API call for it. But I can't see anything that imports the extracted data (birth and death dates) to Wikidata. --Dipsacus fullonum (talk) 15:34, 12 June 2022 (UTC)Reply
@Dipsacus fullonum I have used Awk a lot for various tasks in connection with Wikidata. It is a standard component of Linux, and you can easily experiment with it directly from the command line until you figure out how to get to a desired result; it is actually quite a clever tool for many fairly simple tasks. I have many times downloaded a Wikidata query result and run a lot of iterations of different scripts locally. In the beginning I mainly used it to fix labels. Now I use it for importing buildings and streets etc. and all kinds of different maintenance tasks. I found Wikiget at https://github.com/greencardamom/Wikiget and use it here to fetch article texts minus markup. Perl's regexps are clever for picking specific data out of a text. The script does not import directly to Wikidata, but merely formats the data so they are ready to insert into QuickStatements, where I then take a look at the data before clicking "Run". Hjart (talk) 16:56, 12 June 2022 (UTC)Reply
Awk's great strength is tables and transformations of the data in them. A simple script I have run many times is for example:
::sed -E '1d; s|http.+/entity/||' "query.tsv" | gawk -F'\t' 'BEGIN{print "qid,Lda"} $2 ~ /\(/ {sub(/ \(.+/,"",$2); print $1 "," $2}'
::
It is run on a downloaded query result with 2 columns (items + labels of e.g. persons), finds labels with parentheses and produces change suggestions without the parentheses. Sed is a bit more clever at removing headers and parts of columns that are just in the way. Hjart (talk) 18:39, 12 June 2022 (UTC)Reply
I used the following one-liner to find and merge 437 duplicates of Swedish churches based on 2 different query results:
::wdStrip "query(40).tsv" | gawk -F'\t' 'FNR==NR{a[$2]=$1; b[$2]=$3; next;} $2 ~ /kyrka$/ && $2 in a && $3 == b[$2] {print "MERGE\t" $1 "\t" a[$2]}' <(wdStrip "query(39).tsv") - | xclip
::
The church name and municipality from one file are read into 2 arrays, after which the corresponding data in the other file are looped through and compared.

The next one runs a PetScan query on commons:Category:Listed buildings in Ribe, finds subcategories of the form "street name number", and formats data for creating items for houses that are not already represented in Wikidata. The file "gaderRibe.tsv" is "Qxxx,street name", so the script can find the streets' Q values itself and create located on street (P669). This script is considerably more manageable than the one for the politicians on dawiki, because there is no need to read article texts.
::wdfrededeRibe () {
::wdPetscan 22105285 | gawk -F'\t' 'BEGIN {print "qid,P31,P17,P131,P276,Dda,Den,Lda,Len,Scommonswiki,P669,qal670"}  FNR==NR{a[$2]=$1; next;} $2 ~ /[1-9]/ {hus = $2; sub(/ \(.+/,"",hus); gade = hus; sub(/ [0-9].+/,"",gade); no = hus; sub(/.+ /,"",no); print ",Q41176,Q35,Q645747,Q322361,fredet bygning i Ribe,#listed building in Ribe, Denmark#," hus "," hus ",\"Category:" $2 "\"," a[gade] ",###" no "###"}' gaderRibe.tsv - | tr '#' '"'
::}
::
Hjart (talk) 07:17, 14 June 2022 (UTC)Reply

Too meta?


I think you may have accidentally left Renè hanging (in the request a query forum). I find requests involving the use of the MWAPI extension annoying and thus flat out ignore those; even though they pop up maybe 50% of the time, I'm quite happy to leave them to someone else.

I had an idea the other day: constructing a graph from the metadata available in the washed SQL dumps. This would make dealing with metadata a lot more pleasant, but what are the chances we could push WMDE to do this? Like I said, requests for metadata show up very often. Infrastruktur (talk) 19:42, 22 June 2022 (UTC)Reply

Please


You wrote:
I propose this solution to get headers with gender specific words:

  1. In Module:Cycling race/l10n, add these new tables for the languages where they are needed due to gender-specific wording: startlist_translate_women, startlisttable_translate_women, infobox_translate_women, headoftable_translate_women, headoftableII_translate_women, calendar_translate_women, victories_translate_women.
  2. In Module:Cycling race, in the function translate, if called for a women's race and a _translate_women table exists, then return the translation from that table. Otherwise use the _translate table as now.

Please tell me step-by-step how to do. --Tommes (talk) 15:06, 31 July 2022 (UTC)Reply

It would take as much time to write a complete description as to do the work myself, and it would probably not be useful anyway for someone who has no experience with the programming language Lua and with modules in Wikimedia projects. Wait for discussion/comments from experienced users. This is a very complex program, so bigger changes should be considered carefully. --Dipsacus fullonum (talk) 15:16, 31 July 2022 (UTC)Reply
What a pity! --Tommes (talk) 11:40, 1 August 2022 (UTC)Reply
Please stop nagging! --Dipsacus fullonum (talk) 11:47, 1 August 2022 (UTC)Reply
I second D.F. All of us are volunteers, so either show some respect or expect none in return. Infrastruktur (talk) 19:56, 15 September 2022 (UTC)Reply
And if you think I am pinging @Tommes: just to remind you, you would be correct. Sometimes things do need to be rubbed in, because they would not sink in otherwise. Have an otherwise great day! Infrastruktur (talk) 22:11, 15 September 2022 (UTC)Reply
I guess there is a misunderstanding. I am not a native English speaker. "This is a pity" or "What a pity!" is meant to express that I would like to have the issue solved or something improved, and have found someone who knows how to do it and shows the solution, but cannot implement the solution myself. It is just an expression of regret. Dipsacus fullonum, Infrastruktur --Tommes (talk) 16:25, 20 September 2022 (UTC)Reply

Contests


I figure it would be hard to come up with new contests that are novel. Buuut I have an idea that I think would be appropriate for December. Ever done a puzzle-run as a kid? I figured a fun exercise would be to do a couple of riddles that each tell you where to find the next one. And we'll keep the difficulty down for this one. It might involve a problem where you have to find something specific that is within a certain distance of something else. Not so much challenging as it will be charming, I guess, but for this kind of contest it should not be hard. Infrastruktur (talk) 19:49, 15 September 2022 (UTC)Reply

That sounds interesting. Do you mean something like an advent calendar (julekalender) with one puzzle per day or week? --Dipsacus fullonum (talk) 06:49, 16 September 2022 (UTC)Reply
Good idea stretching it out. Maybe something like an Easter quiz (påskenøtter), where you get say 7 or 10 small tasks in the days leading up to Christmas, each of which gives you a word of X letters, and you combine a single letter from each of those to get the final word which is the solution. The individual things will all be found on Wikidata, but could be found by other means as well; it could be train stations/airports, locations, famous people, something from pop culture etc. Infrastruktur (talk) 11:50, 16 September 2022 (UTC)Reply

Order of results


Hi! I am trying to get countries with this query:

select ?item ?itemLabel (GROUP_CONCAT(?CountryLabel; separator=", ") AS ?Countrys) where {
  ?item wdt:P1440 "900206".
  ?item p:P1532 [ps:P1532 ?Country].

  SERVICE wikibase:label {
    bd:serviceParam wikibase:language "en" .
    ?item rdfs:label ?itemLabel .
    ?Country rdfs:label ?CountryLabel           
  }
}
GROUP BY ?item ?itemLabel
Try it!

But the countries are not listed in the order they appear on Wikidata; instead those with preferred rank come first, then those with normal rank. Please help me make a query with the output in the order they appear on Wikidata. With gratitude Igor Игорь Темиров (talk) 14:04, 5 November 2022 (UTC)Reply

Call for participation in a task-based online experiment


Dear Dipsacus_fullonum,

I hope you are doing well,

I am Kholoud, a researcher at King's College London, and I am working on a project as part of my PhD research, in which I have developed a personalised recommender model that suggests Wikidata items for the editors based on their past edits. I am inviting you to a task-based study that will ask you to provide your judgments about the relevance of the items suggested by our model based on your previous edits. Participation is completely voluntary, and your cooperation will enable us to evaluate the accuracy of the recommender system in suggesting relevant items to you. We will analyse the results anonymised, and they will be published to a research venue.

The study should take no more than 15 minutes.

If you agree to participate in this study, please either contact me at kholoud.alghamdi@kcl.ac.uk or use this form https://docs.google.com/forms/d/e/1FAIpQLSees9WzFXR0Vl3mHLkZCaByeFHRrBy51kBca53euq9nt3XWog/viewform?usp=sf_link

Then, I will contact you with the link to start the study.

For more information about the study, please read this post: https://www.wikidata.org/wiki/User:Kholoudsaa In case you have further questions or require more information, don't hesitate to contact me through my mentioned email.

Thank you for considering taking part in this research.

Regards Kholoudsaa (talk) 21:01, 17 February 2023 (UTC)Reply