Underlying lk
The programmer of Grand Theft Auto V
I wondered why your bot would think Grand Theft Auto V was developed by Develop magazine, and the only idea I got was that the English article cites the magazine in its references, using the proper citation templates, with <code>publisher=''[[Develop (magazine)|Develop]]''</code>. Which is obviously a completely different thing from Template:Infobox video game’s publisher=
. I can only hope your bot did not make the same mistake in too many items… --Mormegil (talk) 08:16, 30 April 2014 (UTC)
- @Mormegil: it's not much of a mystery: the bot extracts the first valid internal link it comes across in the given parameter, which in this case was Develop (Q844746).--Underlying lk (talk) 03:21, 1 May 2014 (UTC)
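For anyone wondering how a harvester arrives at such a value, below is a minimal sketch, assuming the bot simply takes the first wikilink target found in the raw parameter text; the regex and helper name are illustrative, not the bot's actual code.

```python
import re

# First [[wikilink]] target in a template parameter value, e.g.
# "publisher=''[[Develop (magazine)|Develop]]''" -> "Develop (magazine)".
FIRST_LINK = re.compile(r"\[\[([^\]|#]+)")

def first_link_target(param_value):
    match = FIRST_LINK.search(param_value)
    return match.group(1).strip() if match else None

print(first_link_target("''[[Develop (magazine)|Develop]]''"))
# 'Develop (magazine)', which resolves to Develop (Q844746)
```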
UnderlyingBot and BC(E)
In this edit, the bot forgot the BC when setting the date of birth. Thanks Haplology (talk) 02:10, 2 May 2014 (UTC)
- This also happened with some ancient Chinese kingdoms, Wei (Q912052) and Han (Q1574130) at least. --BurritoBazooka (talk) 17:59, 28 September 2015 (UTC)
Would you mind if you make a script for me?
Hello. Thank you for contributing to Wikidata. I'm a member of Wikidata:Anime_and_Manga_task_force, so I want to contribute to it, but there are too many things to do. I'd like to build items like K-On! (Q482482). I think each Infobox animanga sub-template corresponds to one entity.
example : en:K-On! at english wikipedia.
- Infobox animanga/Header : general information ==> Q482482
- Infobox animanga/Print : about original manga ==> Q482482
- Infobox animanga/video : anime season 1 ==> Q15863567
- Infobox animanga/video : anime season 2 ==> Q15863577
- ...
The original work's item (in this case, the manga item) holds the interwiki links.
I'd like to do this work, so I tried to write a script but failed. Would you mind making a script to do this, please?--Konggaru (talk) 10:22, 2 May 2014 (UTC)
- @콩가루: Hi Konggaru, if I understand your request correctly what you need is a script that will add
publisher = [[Houbunsha]]
to K-On! (Q482482), publisher = [[Sega]]
to K-ON! Hōkago Live!! (Q860047), etc. This is possible, but since a script has no way to 'guess' that the item ID for anime season 1 is Q15863567 or that the item ID for the game is Q860047, the ID would have to be added manually each time. I created a similar semi-automated pywikibot script, you will find it here; it depends on a helper script that should be moved to the 'pywikibot' folder.--Underlying lk (talk) 03:01, 3 May 2014 (UTC)
- Thank you for developing it, but when I tried to run it, it failed.
Extended content:
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'Oreimo'
Finding redirects...
Processing No. 0: [[wikidata:Oreimo]]
WARNING: API warning (wbgetentities): Unrecognized value for parameter 'sites': wikidatawiki
ERROR: APIError: param-missing: Either provide the item "ids" or pairs of "sites" and "titles" for corresponding pages
Traceback (most recent call last):
File "harvast_anime.py", line 81, in run
self.procesPage(i, page)
File "harvast_anime.py", line 145, in procesPage
if not item.exists():
File "/home/user/core/pywikibot/page.py", line 2444, in exists
self.get()
File "/home/user/core/pywikibot/page.py", line 2634, in get
super(ItemPage, self).get(force=force, *args)
File "/home/user/core/pywikibot/page.py", line 2457, in get
data = self.repo.loadcontent(self._defined_by(), *args)
File "/home/user/core/pywikibot/site.py", line 3697, in loadcontent
data = req.submit()
File "/home/user/core/pywikibot/data/api.py", line 406, in submit
raise APIError(code, info, **result["error"])
APIError: param-missing: Either provide the item "ids" or pairs of "sites" and "titles" for corresponding pages
And is it possible to make a claim for genre and main subject (P921) for every item? I think it would make querying by genre and media easy. Always thank you.--Konggaru (talk) 04:26, 3 May 2014 (UTC)
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'Oreimo'
Finding redirects...
Processing No. 0: [[en:Oreimo]]
Infobox animanga/Header
File:Ore no imouto novel v1 cover.jpg does not exist on Commons
P136 already exists (-overwrite to change it)
Infobox animanga/Print
Item ID for 'light novel'?
Infobox animanga/Game
Item ID for 'Ore no Imōto ga Konnani Kawaii Wake ga Nai Portable' ('n' to create)? n
Bot: New item
Password for user 콩가루 on wikidata:wikidata (no characters will be shown):
Logging in to wikidata:wikidata as 콩가루
ERROR: TypeError: not all arguments converted during string formatting
Traceback (most recent call last):
File "harvast_anime.py", line 81, in run
self.procesPage(i, page)
File "harvast_anime.py", line 176, in procesPage
itemID = self.newItem(page, item, label)
File "harvast_anime.py", line 103, in newItem
itemID = pywikibot.input("Now enter the Item ID for the newly created item:" % label)
TypeError: not all arguments converted during string formatting
user@debian:~/core$
user@debian:~/core$
user@debian:~/core$
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'K-On!'
Finding redirects...
Processing No. 0: [[en:K-On!]]
Infobox animanga/Header
P18 cannot be added as it was recently removed from the item
P136 already exists (-overwrite to change it)
Infobox animanga/Game
Item ID for 'K-On! Hōkago Live!!' ('n' to create)?
Infobox animanga/Game
Item ID for 'K-On! Hōkago Rhythm Time' ('n' to create)?
Infobox animanga/Footer
Infobox animanga/Print
Item ID for 'manga'?
Infobox animanga/Video
Item ID for 'tv series'?
Infobox animanga/Video
Item ID for ova 'Live House!' ('n' to create)?
Infobox animanga/Video
Item ID for tv series 'K-On!!' ('n' to create)?
Infobox animanga/Video
Item ID for ova 'Plan!!' ('n' to create)?
Infobox animanga/Video
Item ID for 'film'?
I think I did something wrong, because I don't know the meaning of "Item ID for 'light novel'?"...--Konggaru (talk) 14:31, 3 May 2014 (UTC)
I found one problem, but there is one more problem:
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'Oreimo'
Finding redirects...
Processing No. 0: [[en:Oreimo]]
Infobox animanga/Header
File:Ore no imouto novel v1 cover.jpg does not exist on Commons
P136 already exists (-overwrite to change it)
Infobox animanga/Print
Item ID for 'light novel'? 483535
ERROR: APIError: no-such-entity: Invalid id: 483535
Traceback (most recent call last):
File "harvast_anime.py", line 81, in run
self.procesPage(i, page)
File "harvast_anime.py", line 242, in procesPage
if pid in item.get().get('claims'):
File "/home/user/core/pywikibot/page.py", line 2634, in get
super(ItemPage, self).get(force=force, *args)
File "/home/user/core/pywikibot/page.py", line 2457, in get
data = self.repo.loadcontent(self._defined_by(), *args)
File "/home/user/core/pywikibot/site.py", line 3697, in loadcontent
data = req.submit()
File "/home/user/core/pywikibot/data/api.py", line 406, in submit
raise APIError(code, info, **result["error"])
APIError: no-such-entity: Invalid id: 483535
user@debian:~/core$
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'Oreimo'
Finding redirects...
Processing No. 0: [[en:Oreimo]]
Infobox animanga/Header
File:Ore no imouto novel v1 cover.jpg does not exist on Commons
P136 already exists (-overwrite to change it)
Infobox animanga/Print
Item ID for 'light novel'? Q483535
Adding P123 --> [[wikidata:Q297315]]
Password for user 콩가루 on wikidata:wikidata (no characters will be shown):
Logging in to wikidata:wikidata as 콩가루
Sleeping for 5.9 seconds, 2014-05-03 13:54:36
P50 already exists (-overwrite to change it)
Extracting June 7, 2013 --> 2013-6-7
Adding P582 --> {'after': 0, 'precision': 11, 'time': '+00000002013-06-07T00:00:00Z', 'timezone': 0, 'calendarmodel': 'http://www.wikidata.org/entity/Q1985727', 'before': 0}
Sleeping for 5.6 seconds, 2014-05-03 13:54:46
Sleeping for 7.6 seconds, 2014-05-03 13:54:54
Extracting August 10, 2008 --> 2008-8-10
Adding P580 --> {'after': 0, 'precision': 11, 'time': '+00000002008-08-10T00:00:00Z', 'timezone': 0, 'calendarmodel': 'http://www.wikidata.org/entity/Q1985727', 'before': 0}
Sleeping for 6.5 seconds, 2014-05-03 13:55:05
Sleeping for 7.0 seconds, 2014-05-03 13:55:15
Infobox animanga/Game
Item ID for 'Ore no Imōto ga Konnani Kawaii Wake ga Nai Portable' ('n' to create)? n
Bot: New item
ERROR: TypeError: not all arguments converted during string formatting
Traceback (most recent call last):
File "harvast_anime.py", line 81, in run
self.procesPage(i, page)
File "harvast_anime.py", line 176, in procesPage
itemID = self.newItem(page, item, label)
File "harvast_anime.py", line 103, in newItem
itemID = pywikibot.input("Now enter the Item ID for the newly created item:" % label)
TypeError: not all arguments converted during string formatting
I think the reason for this error is that "title" in the template is empty. And I have a question: some items already have instance of (P31) values. Should I delete them manually? Always thank you. --Konggaru (talk) 14:57, 3 May 2014 (UTC)
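The TypeError in the tracebacks above is a classic %-formatting bug: the string passed to pywikibot.input() contains no %s placeholder for label. A minimal sketch of the likely fix, assuming line 103 looks as shown in the traceback:

```python
import pywikibot

label = 'Ore no Imōto ga Konnani Kawaii Wake ga Nai Portable'  # example from the log

# Broken: "%" applied to a format string with no placeholder raises
# "TypeError: not all arguments converted during string formatting".
#   itemID = pywikibot.input("Now enter the Item ID for the newly created item:" % label)

# Fixed: give the label a %s placeholder.
itemID = pywikibot.input("Now enter the Item ID for the newly created item '%s':" % label)
```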
Is the Order a Rabbit? (Q11267313) has two values of P31, and if this script runs, one of them should be deleted. --Konggaru (talk) 15:16, 3 May 2014 (UTC)
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'Oreimo'
Finding redirects...
Processing No. 0: [[en:Oreimo]]
Infobox animanga/Header
File:Ore no imouto novel v1 cover.jpg does not exist on Commons
P136 already exists (-overwrite to change it)
Infobox animanga/Print
Item ID for 'light novel'? Q483535
P123 already exists (-overwrite to change it)
P50 already exists (-overwrite to change it)
P582 already exists (-overwrite to change it)
P580 already exists (-overwrite to change it)
Infobox animanga/Game
Item ID for 'Ore no Imōto ga Konnani Kawaii Wake ga Nai Portable' ('n' to create)? n
Bot: New item
Password for user 콩가루 on wikidata:wikidata (no characters will be shown):
Logging in to wikidata:wikidata as 콩가루
Now enter the Item ID for the newly created item: Q16744859
Adding P144 --> [[wikidata:Q483535]]
Sleeping for 8.0 seconds, 2014-05-03 14:27:58
Adding P31 --> [[wikidata:Q7889]]
Sleeping for 4.3 seconds, 2014-05-03 14:28:12
Sleeping for 8.2 seconds, 2014-05-03 14:28:18
No valid item found for P123
Adding P400 --> [[wikidata:Q170325]]
Sleeping for 4.7 seconds, 2014-05-03 14:28:31
Sleeping for 8.1 seconds, 2014-05-03 14:28:38
Adding P136 --> [[wikidata:Q689445]]
Sleeping for 6.5 seconds, 2014-05-03 14:28:50
Sleeping for 7.7 seconds, 2014-05-03 14:28:58
Extracting January 27, 2011 --> 2011-1-27
Adding P577 --> {'after': 0, 'precision': 11, 'time': '+00000002011-01-27T00:00:00Z', 'timezone': 0, 'calendarmodel': 'http://www.wikidata.org/entity/Q1985727', 'before': 0}
Sleeping for 7.3 seconds, 2014-05-03 14:29:09
ERROR: APIError: failed-save: Edit conflict.
Traceback (most recent call last):
File "harvast_anime.py", line 81, in run
self.procesPage(i, page)
File "harvast_anime.py", line 277, in procesPage
item.addClaim(claim)
File "/home/user/core/pywikibot/page.py", line 2751, in addClaim
self.repo.addClaim(self, claim, bot=bot, **kwargs)
File "/home/user/core/pywikibot/site.py", line 726, in callee
return fn(self, *args, **kwargs)
File "/home/user/core/pywikibot/site.py", line 3795, in addClaim
data = req.submit()
File "/home/user/core/pywikibot/data/api.py", line 406, in submit
raise APIError(code, info, **result["error"])
APIError: failed-save: Edit conflict.
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'Oreimo'
Finding redirects...
Processing No. 0: [[en:Oreimo]]
Infobox animanga/Header
File:Ore no imouto novel v1 cover.jpg does not exist on Commons
P136 already exists (-overwrite to change it)
Infobox animanga/Print
Item ID for 'light novel'? Q483535
P123 already exists (-overwrite to change it)
P50 already exists (-overwrite to change it)
P582 already exists (-overwrite to change it)
P580 already exists (-overwrite to change it)
Infobox animanga/Game
Item ID for 'Ore no Imōto ga Konnani Kawaii Wake ga Nai Portable' ('n' to create)? Q16744859
p144 already set
p31 already set
No valid item found for P123
P400 already exists (-overwrite to change it)
P136 already exists (-overwrite to change it)
Extracting January 27, 2011 --> 2011-1-27
Adding P577 --> {'after': 0, 'precision': 11, 'time': '+00000002011-01-27T00:00:00Z', 'timezone': 0, 'calendarmodel': 'http://www.wikidata.org/entity/Q1985727', 'before': 0}
Password for user 콩가루 on wikidata:wikidata (no characters will be shown):
Logging in to wikidata:wikidata as 콩가루
Sleeping for 7.8 seconds, 2014-05-03 14:30:54
Adding P178 --> [[wikidata:Q1194689]]
Sleeping for 4.7 seconds, 2014-05-03 14:31:07
Sleeping for 6.6 seconds, 2014-05-03 14:31:15
Infobox animanga/Print
Item ID for 'manga'? n
ERROR: APIError: no-such-entity: Invalid id: N
Traceback (most recent call last):
File "harvast_anime.py", line 81, in run
self.procesPage(i, page)
File "harvast_anime.py", line 242, in procesPage
if pid in item.get().get('claims'):
File "/home/user/core/pywikibot/page.py", line 2634, in get
super(ItemPage, self).get(force=force, *args)
File "/home/user/core/pywikibot/page.py", line 2457, in get
data = self.repo.loadcontent(self._defined_by(), *args)
File "/home/user/core/pywikibot/site.py", line 3697, in loadcontent
data = req.submit()
File "/home/user/core/pywikibot/data/api.py", line 406, in submit
raise APIError(code, info, **result["error"])
APIError: no-such-entity: Invalid id: N
I'm really sorry that I'm only giving bug reports. This time the script didn't ask me whether the bot should create a new item... --Konggaru (talk) 15:23, 3 May 2014 (UTC)
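Two of the failures above come from the prompt answer being sent straight to the API: "483535" lacks the Q prefix, and "n" was typed at a prompt that didn't support creating a new item. A small sketch of input handling that avoids both; the helper is hypothetical, not the script's actual code:

```python
import pywikibot

def ask_item_id(label):
    """Prompt until a well-formed item ID is given; None means 'create new'."""
    while True:
        answer = pywikibot.input("Item ID for '%s' ('n' to create)?" % label).strip()
        if answer.lower() == 'n':
            return None                      # caller should create a new item
        if not answer.upper().startswith('Q'):
            answer = 'Q' + answer            # accept a bare number like 483535
        if answer[1:].isdigit():
            return answer.upper()
        pywikibot.output("Not a valid item ID: %s" % answer)
```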
It's 1:00 AM so I need to sleep now. And... I fixed this edit. I'll test your script later. Always thanks. --Konggaru (talk) 15:54, 3 May 2014 (UTC)
new bug
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'Oreimo'
Finding redirects...
Processing No. 0: [[en:Oreimo]]
Infobox animanga/Header
File:Ore no imouto novel v1 cover.jpg does not exist on Commons
P136 already exists (-overwrite to change it)
Infobox animanga/Print
Item ID for 'light novel'? Q483535
Adding P144 --> [[wikidata:Q483535]]
Password for user 콩가루 on wikidata:wikidata (no characters will be shown):
Logging in to wikidata:wikidata as 콩가루
Sleeping for 5.4 seconds, 2014-05-03 14:42:57
p31 already set
P123 already exists (-overwrite to change it)
P50 already exists (-overwrite to change it)
P582 already exists (-overwrite to change it)
P580 already exists (-overwrite to change it)
Infobox animanga/Game
Item ID for 'Ore no Imōto ga Konnani Kawaii Wake ga Nai Portable' ('n' to create)? Q16744859
p144 already set
p31 already set
No valid item found for P123
P400 already exists (-overwrite to change it)
P136 already exists (-overwrite to change it)
P577 already exists (-overwrite to change it)
P178 already exists (-overwrite to change it)
Infobox animanga/Print
Item ID for manga 'Ore no Kōhai ga Konna ni Kawaii Wake ga Nai' ('n' to create)? n
Bot: New item
Now enter the Item ID for the newly created item: Q16747005
Adding P144 --> [[wikidata:Q483535]]
Sleeping for 8.2 seconds, 2014-05-03 14:45:11
Adding P31 --> [[wikidata:Q8274]]
Sleeping for 6.1 seconds, 2014-05-03 14:45:23
Sleeping for 7.9 seconds, 2014-05-03 14:45:31
No valid item found for P123
No valid item found for P50
No valid time for P582
No valid time for P580
Infobox animanga/Game
Item ID for 'Ore no Imōto ga Konna ni Kawaii Wake ga Nai Portable ga Tsuzuku Wake ga Nai' ('n' to create)? n
Bot: New item
Now enter the Item ID for the newly created item: Q16747020
Adding P144 --> [[wikidata:Q483535]]
Sleeping for 8.2 seconds, 2014-05-03 14:47:03
Adding P31 --> [[wikidata:Q7889]]
Sleeping for 6.5 seconds, 2014-05-03 14:47:15
Sleeping for 8.1 seconds, 2014-05-03 14:47:24
No valid item found for P123
No valid item found for P400
No valid item found for P136
Extracting May 17, 2012 --> 2012-5-17
Adding P577 --> {'after': 0, 'precision': 11, 'time': '+00000002012-05-17T00:00:00Z', 'timezone': 0, 'calendarmodel': 'http://www.wikidata.org/entity/Q1985727', 'before': 0}
Sleeping for 6.2 seconds, 2014-05-03 14:47:35
Sleeping for 8.3 seconds, 2014-05-03 14:47:43
No valid item found for P178
Infobox animanga/Video
Item ID for 'ona'? n
ERROR: KeyError: u'title'
Traceback (most recent call last):
File "harvast_anime.py", line 82, in run
self.procesPage(i, page)
File "harvast_anime.py", line 225, in procesPage
label = fielddict['title']
KeyError: u'title'
I found one more bug. Would you mind fixing it, please?--Konggaru (talk) 15:23, 4 May 2014 (UTC) And... one more report:
user@debian:~/core$ python harvast_anime.py -lang:en -family:wikipedia -template:'Infobox animanga/Header' -page:'Is the Order a Rabbit?'
Finding redirects...
Processing No. 0: [[en:Is the Order a Rabbit?]]
Infobox animanga/Header
File:Gochūmon wa Usagi Desu ka? volume 1 cover.jpg does not exist on Commons
P136 already exists (-overwrite to change it)
Infobox animanga/Print
Item ID for 'manga'? Q11267313
Adding P144 --> [[wikidata:Q11267313]]
Password for user 콩가루 on wikidata:wikidata (no characters will be shown):
Logging in to wikidata:wikidata as 콩가루
Sleeping for 7.7 seconds, 2014-05-03 14:55:56
p31 already set
P123 already exists (-overwrite to change it)
No valid time for P582
No valid item found for P50
No valid time for P580
Infobox animanga/Footer
Infobox animanga/Video
Item ID for 'tv series'? Q16747052
ERROR: KeyError: u'title'
Traceback (most recent call last):
File "harvast_anime.py", line 82, in run
self.procesPage(i, page)
File "harvast_anime.py", line 225, in procesPage
label = fielddict['title']
KeyError: u'title'
Item ID for 'tv series'? Q16747052 ==> I made the new item manually and input it, but it doesn't work. And... would you mind adding a new function? I think all items involved with an original work should have the same genre and main subject (P921) (if possible). Always thank you, and sorry for annoying you. --Konggaru (talk) 15:34, 4 May 2014 (UTC)
- @콩가루: Starting with the first error: this happened because 'ona' does not have a title. I have changed the script so that if 'title' is missing it will use the article's name instead (as in Oreimo (Q16747333) for the 'ona'). The second problem had the same cause, it should be fixed if you see Is the Order a Rabbit? (Q16747052). Also, I've made a couple of changes to import genre (P136) and main subject (P921) from the main item, but only for videos since games would have a different genre for example.--Underlying lk (talk) 17:42, 4 May 2014 (UTC)
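A minimal sketch of the fallback described above, assuming the template's parameters live in a dict called fielddict, as in the tracebacks:

```python
def label_for(fielddict, page):
    """Prefer the infobox 'title' field, falling back to the article's name,
    so a missing title no longer raises KeyError: u'title'."""
    return fielddict.get('title') or page.title()
```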
Thank you. But would you mind adding a new function? The "part of" property of the original work should list all of the media involved with the original work, as at K-On! (Q482482). Please add this function. Really, THANK YOU. Your script works very well. --Konggaru (talk) 05:11, 5 May 2014 (UTC)
- @콩가루: I made another change so that the script will add part of (P361) instead of based on (P144).--Underlying lk (talk) 06:26, 5 May 2014 (UTC)
- I think based on (P144) is better than P361. Anyway, thank you. I'm using your script a lot and it works fine, but... would you mind making a new function that adds all adaptations to the original work? E.g., add the Yuruyuri OVA and Yuruyuri anime to the original work like this [1] --Konggaru (talk) 15:30, 10 May 2014 (UTC)
- @콩가루: I might have misunderstood your request at 05:11 of 5 May; weren't you asking to use part of (P361) instead of based on (P144)? I can change it back to based on (P144), but how did you want to use part of (P361)? Also, I've made a change to add has part(s) (P527), the script is here. I haven't tested it extensively so let me know if there are problems, as always.--Underlying lk (talk) 23:20, 10 May 2014 (UTC)
Importing data from a list
Hi Underlying lk, I have a list of values that I'd like to import into Wikidata. Do you know if there is an existing script which can be easily adapted to do it, or should I submit a bot request? Thanks in advance. — Ayack (talk) 14:53, 7 May 2014 (UTC)
- @Ayack: I don't have any scripts for this, but I could probably make one. If you had a page generator or a list of pages instead of a list of item IDs it would be better though, so that imported from Wikimedia project (P143) could be added to all claims instead of importing them without source.--Underlying lk (talk) 07:35, 9 May 2014 (UTC)
- @Ayack: I created a little script for this: sycamore.py. It depends on another script, wdhelper.py, being located in the core/pywikibot folder. I tested it only once so if problems should arise don't hesitate to drop me a message. As I said above, this import will not add any sources, it would be better if one could be found for the data.--Underlying lk (talk) 09:38, 9 May 2014 (UTC)
- Wow, thanks a lot! I can't test it now but I'll try later. In fact, I have a source. If you could add stated in (P248): Sycomore (Q15271528) it would be great! — Ayack (talk) 09:46, 9 May 2014 (UTC)
- @Ayack: Updated to add the source. See this test edit: [2].--Underlying lk (talk) 10:21, 9 May 2014 (UTC)
- Thanks! — Ayack (talk) 10:28, 9 May 2014 (UTC)
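sycamore.py itself is linked above; as a rough idea of what such a sourced import can look like, here is a minimal pywikibot sketch, assuming current API names. The helper and its defaults are illustrative, not the actual script.

```python
import pywikibot

repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

def add_sourced_claim(item_id, prop, target_id, source_id='Q15271528'):
    """Add prop -> target to an item, sourced with stated in (P248)."""
    item = pywikibot.ItemPage(repo, item_id)
    claim = pywikibot.Claim(repo, prop)
    claim.setTarget(pywikibot.ItemPage(repo, target_id))
    item.addClaim(claim)
    stated_in = pywikibot.Claim(repo, 'P248')                 # stated in
    stated_in.setTarget(pywikibot.ItemPage(repo, source_id))  # Sycomore (Q15271528)
    claim.addSources([stated_in])
```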
Human groups
Hi, your bot often makes one error: it adds human-related properties to items that represent groups of humans, examples: James and Oliver Phelps (Q343954), Arkady and Boris Strugatsky (Q153796), Paolo and Vittorio Taviani (Q351697), Auguste and Louis Lumière (Q55965). Could you add an additional check to the bot's code: if an item has the has part(s) (P527) property, then do not touch the item. Or check instance of (P31). — Ivan A. Krestinin (talk) 18:28, 8 May 2014 (UTC)
- @Ivan A. Krestinin: you're right about this, but I also noticed that often enough those items don't have has part(s) (P527) or instance of (P31), or if they do have instance of (P31) it's set to human (Q5). Changing the script would mean that the bot would make all those mistakes once again, so it would probably cause more trouble than it would avoid.--Underlying lk (talk) 07:24, 9 May 2014 (UTC)
- A has part(s) (P527) or instance of (P31) check is a way to prevent edit wars. I have reverted your bot's edits many times, but the bot adds the invalid claims again. I try to improve data quality, but your bot makes this task hard. — Ivan A. Krestinin (talk) 10:58, 9 May 2014 (UTC)
- @Ivan A. Krestinin: Did that happen since your last message?--Underlying lk (talk) 11:05, 9 May 2014 (UTC)
- The latest invalid edit that I saw is [3]. I will notify you if I see it again. — Ivan A. Krestinin (talk) 11:16, 9 May 2014 (UTC)
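A minimal sketch of the guard Ivan suggests, assuming pywikibot; the property checks are the point, not the exact code.

```python
HUMAN = 'Q5'

def safe_to_edit_as_human(item):
    """Only touch items that look like a single person: no has part(s) (P527),
    and every instance of (P31) value, if any, is human (Q5)."""
    claims = item.get().get('claims', {})
    if 'P527' in claims:
        return False
    for claim in claims.get('P31', []):
        target = claim.getTarget()
        if target is None or target.id != HUMAN:
            return False
    return True
```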
Merging multiple numbers
Hi. Wikidata:Database reports/Constraint violations/P1083#"Range" violations (and possibly others) shows a couple of edits where you transformed infobox statements like "Capacity: 18,000, 2,000 seated" into something like "Capacity: 180002000". Please try to take more care when parsing numbers. --YMS (talk) 14:49, 17 May 2014 (UTC) PS: Found the next type immediately in the constraints list for P1092: from "Number built: about 450-500", your bot made "total produced: -500". --YMS (talk) 14:54, 17 May 2014 (UTC)
- @YMS: Hi, and sorry for the late reply. This is the regex I'm currently using to capture numbers:
(?:^|[\s\-–~>:約])((?:(?<!\d)-?)\d[\d,\.\'\s]*)(\smillion)?(?:[\s<+/\}人]|$)
- It requires that the number, if preceded by a minus sign, should not come after a digit, so perhaps the edit was made when I was using an older version of this regex. The other mistake happened because the space was accepted as part of the number, since spaces are often used as thousand separators.--Underlying lk (talk) 18:47, 21 May 2014 (UTC)
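For illustration, the quoted regex can be exercised from Python like this; the space inside the character class is exactly what lets "18,000, 2,000" run together, since spaces are treated as thousand separators.

```python
import re

NUM = re.compile(
    r"(?:^|[\s\-–~>:約])((?:(?<!\d)-?)\d[\d,\.\'\s]*)(\smillion)?(?:[\s<+/\}人]|$)"
)

for text in ("Capacity: 18,000, 2,000 seated", "about 450-500"):
    # print the first capture group of each match
    print(text, '->', [m[0] for m in NUM.findall(text)])
```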
labels bot
Hello, do you have a script which works similarly to claimit.py but adds descriptions? There are many categories which still have the old Czech description "Kategorie Wikipedie" and we would like to change it to "Kategorie Wikimedie". A similar task would apply to templates, project pages, etc.
My idea is:
description.py -pagesgenerator [-overwrite]
with the description(s) hard-coded in the script. JAn Dudík (talk) 19:07, 27 May 2014 (UTC)
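A minimal sketch of the idea above, assuming pywikibot; 'description.py' is hypothetical, and the category used as a page generator and the hard-coded descriptions are illustrative.

```python
import pywikibot
from pywikibot import pagegenerators

NEW_DESCRIPTIONS = {'cs': 'kategorie Wikimedie'}   # hard-coded, as proposed
OVERWRITE = False                                  # the [-overwrite] switch

site = pywikibot.Site('cs', 'wikipedia')
category = pywikibot.Category(site, 'Kategorie:Hudební alba')  # illustrative
for page in pagegenerators.SubCategoriesPageGenerator(category):
    item = pywikibot.ItemPage.fromPage(page)
    if not item.exists():
        continue
    current = item.get().get('descriptions', {})
    updates = {lang: text for lang, text in NEW_DESCRIPTIONS.items()
               if OVERWRITE or lang not in current}
    if updates:
        item.editDescriptions(updates, summary='updating description(s)')
```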
errors in harvest_template
Hello, I am using your older versions of harvest_template and claimit. After some changes in other core scripts I got the following error message:
... Processing [[cs:Kategorie:Clenove Strany pokrokovych socialistu]]
WARNING: Claim.getType is DEPRECATED, use Property.type instead.
[[cs:Strana pokrokovych socialistu]] doesn't exist so I can't link to it ...
I found this change, but after changing I got new errors:
... Processing [[cs:Kategorie:Alba Vanguard Records]]
ERROR: TypeError: 'unicode' object is not callable
Traceback (most recent call last):
  File "D:\Py\rewrite\scripts\ht2.py", line 89, in run
    self.procesPage(page)
  File "D:\Py\rewrite\scripts\ht2.py", line 294, in procesPage
    if claim.type() == 'wikibase-item':
TypeError: 'unicode' object is not callable ...
What should I do to get the scripts working again without errors? I mainly need date/time harvesting and harvesting of items without brackets; the rest can be done by the core scripts. JAn Dudík (talk) 06:23, 12 June 2014 (UTC)
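The "'unicode' object is not callable" error happens because Claim.type is a plain string attribute in newer pywikibot, not a method, so the failing line in ht2.py likely just needs the parentheses dropped; a minimal sketch:

```python
def is_item_claim(claim):
    """True if the claim's datatype is 'wikibase-item'. Claim.type is a
    string attribute; calling it as claim.type() raises the TypeError above."""
    return claim.type == 'wikibase-item'
```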
- Resolved. This bug was in site.py; after applying the gerrit change the scripts work fine again. JAn Dudík (talk) 11:42, 16 June 2014 (UTC)
Using remove_claims.py with a list of "Q items"
Hi, I'm trying to remove some incorrect statements added by my bot. I've got a list of these items in a file but the script doesn't seem to find them:
Processing [[wikidata:Q103767]]
No Wikidata item for this page
It's strange because with harvest_template.py it works without any problem. Could you have a look at it, please? Thanks. — Ayack (talk) 10:32, 13 September 2014 (UTC)
BTW, I use
python pwb.py remove_claims.py -lang:wikidata -family:wikidata -file:Liste2.txt P625
— Ayack (talk) 10:34, 13 September 2014 (UTC)
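One way around the "No Wikidata item for this page" lookup is to build ItemPage objects directly from the Q-ids in the file rather than resolving them as site/title pairs; a minimal sketch, not the actual remove_claims.py:

```python
import pywikibot

repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

def remove_property(item_id, prop):
    """Remove every claim of the given property from one item."""
    item = pywikibot.ItemPage(repo, item_id)
    claims = item.get().get('claims', {}).get(prop, [])
    if claims:
        item.removeClaims(claims, summary='removing incorrect %s' % prop)

with open('Liste2.txt') as listfile:       # one Q-id per line
    for line in listfile:
        qid = line.strip()
        if qid.startswith('Q'):
            remove_property(qid, 'P625')
```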
Spouse: Year 2014
Seems I replicated some of your import. Example: here. --- Jura 19:11, 22 October 2014 (UTC)
Incorrect instance of United Kingdom
This bot added several incorrect instance of (P31) claims with value United Kingdom (Q145). For example on Secretary of State for Health and Social Care (Q3397406) and Chancellor of the Exchequer (Q531471). A bot should be careful about adding instance of (P31) since it has a very specific meaning. See the help.
I'm curious. Do you know why this bot added these claims? Jefft0 (talk) 14:11, 27 October 2014 (UTC)
- @Jefft0: The bot was fetching claims from the post parameter of en:Template:Infobox official post, which in some cases erroneously contained United Kingdom instead of linking to the article about the post. Apologies.--Underlying lk (talk) 09:11, 30 October 2014 (UTC)
Your bot is adding instance of (P31) instead of subclass of (P279) when importing types of artillery pieces from infobox of Russian Wikipedia
Hello. Check, for example, M2A2 Terra-Star (Q4043346). Is it a problem of ru-wiki? Thanks in advance. Ain92 (talk) 09:39, 3 November 2014 (UTC)
- P.S. Noticed the same problem with designed by (P287) instead of developer (P178), see Cañón 155 mm. L 45 CALA 30 (Q5055604). Ain92 (talk) 09:59, 3 November 2014 (UTC)
Charles Borromeo (Q216946)
https://www.wikidata.org/w/index.php?title=Q216946&diff=117949224&oldid=116098764 I reverted this mistake --Oursana (talk) 15:42, 22 January 2015 (UTC)
sources of INE data
editDear Underling lk,
your UnderlyingBot did a change of INE data. In german Wikipedia there are some critics and questions, maybe you can give some answers? If you need translation help, please give a ping. Regards, Conny (talk) 12:50, 5 June 2015 (UTC).
Pywikibot
Hello. Can I ask you some questions about the use of pywikibot? I can add claims from infobox parameters (using harvest) and from categories (using claimit). I wonder if there is a way to add claims from a Wikipedia table or from a list. Xaris333 (talk) 17:31, 29 July 2015 (UTC)
- Hi Xaris, I'm sorry but I haven't contributed anything to Wikidata for a long time, and I have not operated a bot for even longer, so I'm not really in a position to help much!--Underlying lk (talk) 02:46, 14 August 2015 (UTC)
\n in quantity produced by this bot
Looks like this bot (User:UnderlyingBot) produces broken data. E.g. in Indian Wells Tennis Garden (Q1427761), in property maximum capacity (P1083), the data is "+16100\n", which is not a valid number for a quantity. You can see https://www.wikidata.org/wiki/Special:EntityData/Q1427761.json for the raw data to confirm the item is broken. See also: https://phabricator.wikimedia.org/T110728 --Smalyshev (WMF) (talk) 19:57, 28 August 2015 (UTC)
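The stray "\n" suggests the raw infobox text was never stripped before being stored; a minimal sketch of the sanitizing that prevents such values, with the helper purely illustrative:

```python
import re
import pywikibot

repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

def to_quantity(raw):
    """Turn raw infobox text such as '16,100' plus a trailing newline
    into a clean WbQuantity."""
    cleaned = raw.strip().replace(',', '')
    if not re.fullmatch(r'-?\d+', cleaned):
        return None                 # refuse anything that isn't a plain integer
    return pywikibot.WbQuantity(amount=int(cleaned), site=repo)

print(to_quantity('16,100\n'))
```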
Your bot and Property:P18
Special:Diff/119634110 looks like a bug in your bot. The bot copied a file name to Wikidata because there is a file in the article on English Wikipedia, but the file is a local file and not a file on Commons. Commons has a completely unrelated file with the same name, though. I think that your bot should check whether the file used on Wikipedia is a local file or a file on Commons before adding information about the file to Wikidata. --Stefan2 (talk) 21:03, 24 September 2015 (UTC)
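A minimal sketch of the check Stefan2 suggests, assuming pywikibot: if a local file description page exists on enwiki, the name refers to a local upload that may shadow an unrelated Commons file, so it should be skipped.

```python
import pywikibot

enwiki = pywikibot.Site('en', 'wikipedia')
commons = pywikibot.Site('commons', 'commons')

def usable_for_p18(filename):
    """Only harvest files that actually live on Commons."""
    if pywikibot.FilePage(enwiki, filename).exists():
        return False   # local upload; may shadow an unrelated Commons file
    return pywikibot.FilePage(commons, filename).exists()
```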
Broken founding date
See this edit. --Yair rand (talk) 16:17, 17 November 2015 (UTC)
strange "license"s
You may have produced several of these strange statements. While in some cases it may be seen as a (somewhat useful) creative way of using P275, in this case I can't even find that company mentioned in the given source. What's going on there? Can you fix it?--Frysch (talk) 16:16, 4 December 2015 (UTC)
Weird future publication dates
See this edit; same thing here and here. I guess many video games from this list are affected as well. -- LaddΩ chat ;) 03:03, 18 January 2016 (UTC)
UnderlyingBot
Your bot has been listed at Wikidata:Requests for permissions/Removal/Inactive bot accounts as being inactive for over two years. As a housekeeping measure it's proposed to remove the bot flag from inactive bot accounts, unless you expect the bot will be operated again in the near future. If you consent to the removal of the bot flag (or do not reply on the deflag page), you can re-request the bot flag at Wikidata:Requests for permissions/Bot should you need it again. Of course, you may request retaining your bot flag here if you still need it. Regards--GZWDer (talk) 12:44, 26 June 2017 (UTC)
Ranks for historical data
Please don't change them to "deprecated"! That rank is for wrong data. Just mark the current data with "preferred". --Infovarius (talk) 12:21, 22 October 2018 (UTC)
- Okay, will do.--Underlying lk (talk) 17:11, 22 October 2018 (UTC)
year 4305? (your bot's first edit was 4 years ago, maybe you fixed it by now?)
Why did your bot make edits like these, where the supposed year is 4305 and that year is also added to descriptions in multiple languages? In Swedish: "Datorspel från 4305". Before deciding it was wrong I assumed good faith, looked up "4305" in the English Wikipedia article and found no reference at all... oh wait, I found something: there's a Polygon article with this link http://www.polygon.com/2013/5/7/4305926/double-fine-humble-bundle-brutal-legend-psychonauts-mac-linux-pc oh :) ... now I see, the "4305" comes from the article number in the URL. Dbfyinginfo (talk) 17:43, 25 January 2019 (UTC)
Adding population (P1082)
When adding population (P1082) values to entries, please mark the most recent value with preferred rank, for example at Waldenburg (Q20085). Otherwise the item will have two (or more) "current" values. -- VlSergey (трёп) 06:57, 11 March 2019 (UTC)
- Quickstatements cannot change ranks, unfortunately.--Underlying lk (talk) 09:30, 11 March 2019 (UTC)
- Maybe it shouldn't be used for the population property then. -- VlSergey (трёп) 10:20, 11 March 2019 (UTC)
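While QuickStatements could not set ranks at the time, pywikibot can; a minimal sketch, assuming Claim.changeRank is available in the installed version:

```python
import pywikibot

repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

def prefer_latest_population(item_id):
    """Mark the population (P1082) claim with the newest
    point in time (P585) qualifier as preferred."""
    item = pywikibot.ItemPage(repo, item_id)
    claims = item.get().get('claims', {}).get('P1082', [])
    dated = [c for c in claims if c.qualifiers.get('P585')]
    if dated:
        latest = max(dated, key=lambda c:
                     c.qualifiers['P585'][0].getTarget().toTimestr())
        latest.changeRank('preferred')

# prefer_latest_population('Q20085')   # the example from the message above
```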
Wrong date?
You entered "31 December 2017" as the point in time. But the source you cited is titled "Alle politisch selbständigen Gemeinden mit ausgewählten Merkmalen am 31.12.2018 (4. Quartal)" (German). --Eduard47 (talk) 08:39, 11 March 2019 (UTC)
- True, but if you open the xls file it says "Bevölkerung am 31.12.2017".--Underlying lk (talk) 09:30, 11 March 2019 (UTC)
- Sorry, you are right. But often more recent data already exists, see Marne (Q542799), Husum (Q21159). --Eduard47 (talk) 09:56, 11 March 2019 (UTC)
Help:Add localized label to Wikidata
Hey, I'd like to add a label in Kabyle to [4]; how can I do that? There's no "add language" option. Thanks in advance. Sami At Ferḥat (talk) 20:08, 11 March 2019 (UTC)
- @Sami At Ferḥat:: you first need to add this code to your user page: {{#babel: kab-3}} (or 2, or 4, depending on your knowledge of the language). I've done it for you, hope you don't mind.--Underlying lk (talk) 20:16, 11 March 2019 (UTC)
- ty so much! Sami At Ferḥat (talk) 22:28, 11 March 2019 (UTC)
Population Data for North Hassia
editHello Underlying lk,
eventually I found your wonderful work with imported data through this edit[5]. Such great and very helpful work, importing officially available data by bot! Before, I was not even aware that this data could be entered, and BINGO, it is so useful!!! Based on this data you can find automatically created diagrams in Wikipedia articles :-o See the example at https://de.wikipedia.org/wiki/Stormbruch#Demographie
It took me nearly ONE HOUR!!! to enter the plain population data for this ONE little village, without references (which are noted in the wiki article).
You (your bot) could be very helpful for adding some data. In many cases this would be the first statement of population data in the items for municipalities. The key reference in these official statistics is the "Gemeindeschlüssel" = German municipality key in Wikidata: https://www.wikidata.org/wiki/Property:P439 (details see https://de.wikipedia.org/wiki/Amtlicher_Gemeindeschlüssel#Deutschland ). Keep in mind: for all data in these statistics the regional key = "06" https://www.wikidata.org/wiki/Property:P1388 . Thus it might be useful to update missing regional keys.
I don't know which data formats you can handle best. Let's begin with a dedicated example for 2009/2010 data and my request to include this data in Wikidata items:
- Base data 2005 until 2017:
- 2009/2010 population data https://web.archive.org/web/20190509091941/https://www.destatis.de/GPStatistik/receive/HEHeft_heft_00007583;jsessionid=A8D9706EA13DACBF76B9F27A45732971
- Table SP.1-25 Bevölkerung insgesamt am 31. Dezember 2009 (this is the point in time for wikidata)
- Row in this table: Bevölkerung insgesamt (total population in 2009 for all listed municipalities)
All data in these rows should be imported into the existing Wikidata items.
- Target = Population https://www.wikidata.org/wiki/Property:P1082
- Reference key for checking = German municipality key / "Gemeindeschlüssel" = https://www.wikidata.org/wiki/Property:P439
Let's do some examples to see how it can work:
- Hessen (State of Hassia Germany) https://www.wikidata.org/wiki/Q1199
- Data = 6061951 (total inhabitants)
- Point in time = 31. Dezember 2009
- German municipality key /Gemeindeschlüssel= 000000 (special case of "Gemeindeschlüssel" because it is the complete state)
- Regional key = 06
- Reference: Hessisches Statistisches Landesamt: Hessische Gemeindestatistik 2010, webarchive online
- Reg.-Bez. Darmstadt (Darmstadt Government Region) https://www.wikidata.org/wiki/Q7932
- Data = 3792941
- Point in time = 31. Dezember 2009
- German municipality key / Gemeindeschlüssel = 400000
- Regional key = 06
- Reference: Hessisches Statistisches Landesamt: Hessische Gemeindestatistik 2010, webarchive online
- Frankfurt am Main https://www.wikidata.org/wiki/Q1794
- Data= 671927
- Point in time = 31. Dezember 2009 (the entry for 2009 is missing; but attention: some other existing points might need a correction based on this more trustworthy data)
- German municipality key / Gemeindeschlüssel= 412000
- Regional key = 06
- Reference: Hessisches Statistisches Landesamt: Hessische Gemeindestatistik 2010, webarchive online
- Gorxheimertal https://www.wikidata.org/wiki/Q1538518
- Data= 3967
- Point in time = 31. Dezember 2009
- German municipality key / Gemeindeschlüssel= 431008
- Regional key = 06
- Reference: Hessisches Statistisches Landesamt: Hessische Gemeindestatistik 2010, webarchive online
Well buddy, I don't know whether you like my suggestion to let your bot do some work. Surely you might have to solve some problems; nevertheless I think it could be an enormous advantage and a great time saver for our project. Data for one year alone could help... every additional year helps more. :-)
Best --Thombansen (talk) 10:54, 9 May 2019 (UTC)
- @Thombansen: Hi Thom, I looked at the links you provided and they only seem to cover Hessen. Searching around I found this: 12411-01-01-5 Bevölkerung nach Geschlecht - Stichtag 31.12. - regionale Tiefe: Gemeinden, which covers the 2008-2017 period. I also found another page with data going back to 1975. Perhaps we could use those instead?--Underlying lk (talk) 12:48, 9 May 2019 (UTC)
- @Underlying lk: hey, you're so smart :-) Very fine. Whatever fits your needs for getting your bot working best will be fine; sources from https://www.regionalstatistik.de and https://www.destatis.de are official, i.e. trustworthy/reliable. If you like, you can start with this. LOL, for some other (even more difficult) requests I might contact you later, if that is OK for you. Best --Thombansen (talk) 13:26, 9 May 2019 (UTC) PS. I just had a deeper look into the files (1995.xls) from destatis.de. When the bot runs it could probably also check/add, with references, the data for "Fläche in km2" https://www.wikidata.org/wiki/Property:P2046 and "Postleitzahl" = postal code https://www.wikidata.org/wiki/Property:P281
- @Thombansen: Hi again, I started adding populations for German municipalities in 1975.--Underlying lk (talk) 00:54, 14 May 2019 (UTC)
- @Underlying lk: very, very nice!!! More than 20,000 updates :-o I fixed the description for Q63812020. Happy to see more data coming in. Cheers --Thombansen (talk) 11:30, 14 May 2019 (UTC)
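For the record, a minimal pywikibot sketch of one such import step, using the Gorxheimertal example above; it assumes the source rows are already parsed, and the statement shape (population (P1082) with a point in time (P585) qualifier and a reference URL (P854)) mirrors what the messages describe.

```python
import pywikibot

repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

def add_population(item_id, amount, year, month, day, source_url):
    """Add population (P1082) with point in time (P585) and a reference URL."""
    item = pywikibot.ItemPage(repo, item_id)
    claim = pywikibot.Claim(repo, 'P1082')
    claim.setTarget(pywikibot.WbQuantity(amount=amount, site=repo))
    item.addClaim(claim)
    when = pywikibot.Claim(repo, 'P585')
    when.setTarget(pywikibot.WbTime(year=year, month=month, day=day))
    claim.addQualifier(when)
    ref = pywikibot.Claim(repo, 'P854')        # reference URL
    ref.setTarget(source_url)
    claim.addSources([ref])

# Gorxheimertal (Q1538518), 3967 inhabitants on 31 December 2009:
add_population('Q1538518', 3967, 2009, 12, 31,
               'https://web.archive.org/web/20190509091941/'
               'https://www.destatis.de/GPStatistik/receive/HEHeft_heft_00007583')
```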
Outdated population figures
editHello Underlying lk,
When adding historical population figures, could you please either rank them as "deprecated" or rank the current figure as "preferred"? The outdated figures should not be on the same level as the recent ones. Thank you & kind regards, --RJFF (talk) 18:24, 31 August 2019 (UTC)
Lausanne
Hello. I just saw that the page Q22374786 should be merged into Q807. I do not have the rights to do it and would appreciate it if you could. Thanks in advance. 83.228.229.210 21:21, 2 September 2019 (UTC).
Community Insights Survey
editShare your experience in this survey
Hi Underlying lk,
The Wikimedia Foundation is asking for your feedback in a survey about your experience with Wikidata and Wikimedia. The purpose of this survey is to learn how well the Foundation is supporting your work on wiki and how we can change or improve things in the future. The opinions you share will directly affect the current and future work of the Wikimedia Foundation.
Please take 15 to 25 minutes to give your feedback through this survey. It is available in various languages.
This survey is hosted by a third-party and governed by this privacy statement (in English).
Find more information about this project. Email us if you have any questions, or if you don't want to receive future messages about taking this survey.
Sincerely,
Reminder: Community Insights Survey
editShare your experience in this survey
Hi Underlying lk,
A couple of weeks ago, we invited you to take the Community Insights Survey. It is the Wikimedia Foundation’s annual survey of our global communities. We want to learn how well we support your work on wiki. We are 10% towards our goal for participation. If you have not already taken the survey, you can help us reach our goal! Your voice matters to us.
Please take 15 to 25 minutes to give your feedback through this survey. It is available in various languages.
This survey is hosted by a third-party and governed by this privacy statement (in English).
Find more information about this project. Email us if you have any questions, or if you don't want to receive future messages about taking this survey.
Sincerely,
Imported Date of Death
editDear lk in Shepseskare Isi (Q268601) you imported a date of death from Italian site. What hint made you think that this jear might have been given as gregorian? --Vollbracht (talk) 19:59, 3 February 2022 (UTC)
You didn't react! If you enter historic data prior to 1584, the usage of Julian dates instead of proleptic Gregorian is presumed to be agreed. According to ISO 8601, proleptic usage of Gregorian dates needs an agreement. You may get that in astronomy; you won't get it in history. --Vollbracht (talk) 11:20, 19 May 2022 (UTC)
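Wikidata stores the calendar model explicitly with each date, so dates this old should normally carry the Julian model; a minimal pywikibot sketch, with the year and precision purely illustrative:

```python
import pywikibot

repo = pywikibot.Site('wikidata', 'wikidata').data_repository()

# Calendar model items: Q1985727 = proleptic Gregorian, Q1985786 = Julian.
JULIAN = 'http://www.wikidata.org/entity/Q1985786'

# A date of death in the 25th century BCE at year precision (9),
# marked as Julian rather than proleptic Gregorian.
date_of_death = pywikibot.WbTime(year=-2455, precision=9, calendarmodel=JULIAN)

claim = pywikibot.Claim(repo, 'P570')   # date of death
claim.setTarget(date_of_death)
```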
incorrect population data
Hi, I don't know how Wikidata works, but I want to point you to a data error in an import here: https://www.wikidata.org/wiki/Wikidata:Report_a_technical_problem#inhabitant_number_incorrect_for_2016_for_the_village_Conques_in_france Could you please look at it? It looks very wrong in the chart that is now used on the Wikipedia page of the village. 84.241.199.211 16:52, 21 November 2022 (UTC)
Call for participation in a task-based online experiment
Dear Underlying_lk,
I hope you are doing well,
I am Kholoud, a researcher at King's College London, and I am working on a project as part of my PhD research, in which I have developed a personalised recommender model that suggests Wikidata items for editors based on their past edits. I am inviting you to a task-based study that will ask you to provide your judgments about the relevance of the items suggested by our model based on your previous edits. Participation is completely voluntary, and your cooperation will enable us to evaluate the accuracy of the recommender system in suggesting relevant items to you. The results will be analysed anonymously and published at a research venue.
The study should take no more than 15 minutes.
If you agree to participate in this study, please either contact me at kholoud.alghamdi@kcl.ac.uk or use this form https://docs.google.com/forms/d/e/1FAIpQLSees9WzFXR0Vl3mHLkZCaByeFHRrBy51kBca53euq9nt3XWog/viewform?usp=sf_link
Then, I will contact you with the link to start the study.
For more information about the study, please read this post: https://www.wikidata.org/wiki/User:Kholoudsaa In case you have further questions or require more information, don't hesitate to contact me through my mentioned email.
Thank you for considering taking part in this research.
Regards, Kholoudsaa (talk) 20:58, 17 February 2023 (UTC)