User talk:TomT0m

Best rank

Hello, I defer to your expertise on this query, which has false positives: I'm looking for P1619 dates that only have year precision (i.e. the imprecise ones). This code returns both true and false positives. How can I limit it to best ranks? Example of a false positive: Q10275535#P1619

SELECT ?item ?itemLabel ?date ?coords WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr,en". }
  ?item (wdt:P31/(wdt:P279*)) wd:Q928830 ;
    wdt:P5817 wd:Q55654238 ;
    wdt:P1619 ?date ;
    wdt:P625 ?coords .
  VALUES ?precision { 9 } # 9 = year precision, 10 = month precision, 11 = day precision
  ?item p:P1619/psv:P1619 [ wikibase:timePrecision ?precision ; wikibase:timeValue ?time ] .
}
Try it!

Bouzinac💬✒️💛 19:52, 4 January 2025 (UTC)

@Bouzinac: Hi! This should work:
SELECT ?item ?itemLabel ?date ?coords WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr,en". }
  ?item (wdt:P31/(wdt:P279*)) wd:Q928830 ;
    wdt:P5817 wd:Q55654238 ;
    wdt:P625 ?coords . # we don't need the wdt: triple for the date if we fetch the equivalent statement
  VALUES ?precision { 9 } # 9 = year precision, 10 = month precision, 11 = day precision
  ?item p:P1619 [
    ps:P1619 ?date ; # use the "ps:" form on the statement to get the main value
    a wikibase:BestRank ; # keep only the "best rank" statements: they are instances of this RDF/OWL class (a lower-level language notion, nothing to do with Wikidata's own instances and classes)
    psv:P1619 [
      wikibase:timePrecision ?precision
      # wikibase:timeValue ?time  # no longer needed: "?time" is the same as the main value; alternatively, drop "ps:P1619 ?date" and use "wikibase:timeValue ?date" here, which would also work
    ]
  ]
}
Try it!
See the comments for explanation. To force values to be identical in different parts of a query, simply reuse the same variable name. author  TomT0m / talk page 20:57, 4 January 2025 (UTC)
Darn, 1582 true positives, so there's work to do :( Thanks anyway for your query! Bouzinac💬✒️💛 09:20, 5 January 2025 (UTC)
Hi again, as icing on the cake, could you help me eliminate the items where date = year and where I set "circa", as in this example (where the date simply cannot be made more precise): https://www.wikidata.org/wiki/Q10276202#P1619 ? Bouzinac💬✒️💛 17:43, 5 January 2025 (UTC)
@Bouzinac:
SELECT ?item ?itemLabel ?date ?coords WHERE {
  SERVICE wikibase:label { bd:serviceParam wikibase:language "fr,en". }
  ?item (wdt:P31/(wdt:P279*)) wd:Q928830 ;
    wdt:P5817 wd:Q55654238 ;
    wdt:P625 ?coords . # we don't need the wdt: triple for the date if we fetch the equivalent statement
  VALUES ?precision { 9 } # 9 = year precision, 10 = month precision, 11 = day precision
  ?item p:P1619 ?datestatement . # this time we bind the whole statement to a variable so we can use it in an OPTIONAL/MINUS

  ?datestatement ps:P1619 ?date ; # use the "ps:" form on the statement to get the main value
    a wikibase:BestRank ; # keep only the "best rank" statements: they are instances of this RDF/OWL class (a lower-level language notion, nothing to do with Wikidata's own instances and classes)
    psv:P1619 [
      wikibase:timePrecision ?precision
      # wikibase:timeValue ?time  # no longer needed: "?time" is the same as the main value; alternatively, drop "ps:P1619 ?date" and use "wikibase:timeValue ?date" here, which would also work
    ] .

  MINUS {
    ?datestatement pq:P1480 wd:Q5727902 . # pq: to fetch the qualifier
  }
}
Try it!
There you go! author  TomT0m / talk page 21:15, 5 January 2025 (UTC)
Thank you ;) https://public.tableau.com/views/Stationsmtrocumuldesouvertures/Stationsdemtro?:language=fr-FR&publish=yes&:sid=&:redirect=auth&:display_count=n&:origin=viz_share_link Bouzinac💬✒️💛 23:11, 22 January 2025 (UTC)

Programming problem

Hello, how are you? I have a template programming problem: https://en.wikipedia.org/wiki/User:Bouzinac/sandbox contains code with { {#if:|'{{{railcolour}}}' |(concat('#',if(BOUND(?idhexcolor), str(?idhexcolor), if(BOUND(?hexcolor),str(?hexcolor), '#07c63e')) )}} as ?stroke). My problem is that the railcolour colour does not seem to be taken into account when it is set as a parameter in the template call, for example in https://en.wikipedia.org/wiki/User:Bouzinac/doc. Any idea? Good evening, A. Bouzinac💬✒️💛 21:49, 12 May 2025 (UTC)

fictional device used by fictional character: value-type constraint violation

Since you have reverted my edit of imaginary character (Q115537581), I now have to ask you how to satisfy the following value-type constraint for (fictional device, such as a teleporter) used by (P1535) (fictional character):

Values of used by statements should be instances or subclasses of being or object (or of a subclass of them), but (fictional character) currently isn't.

If a fictional character is not considered a being (since they’re fictional), should they be considered objects? Or is there a property akin to used by(P1535) but for fictional characters? Or should we extend the value-type constraint to also allow fictional objects, e.g. fictional entity(Q14897293) and its subclasses/instances?
keepright! ler (talk) 15:48, 17 June 2025 (UTC)

You can do something like adding "fictional agent" to the allowed classes of values for the constraint. I just did it; please tell me if that does not work. It is a legitimate thing to do. author  TomT0m / talk page 16:01, 17 June 2025 (UTC)
Yes, this does work! Thanks for your time, and have a nice day! — keepright! ler (talk) 16:14, 17 June 2025 (UTC)

User:Mateusz Konieczny/failing testcases

You contributed in the past to the cleanup at [[User:Mateusz Konieczny/failing testcases]], and thanks for that!

For some time afterwards I had no availability to update the listings, but recently I have posted some new cases there. Feel free to use them! (Or ignore them, if you are not interested anymore.)

Mateusz Konieczny (talk) 12:14, 30 June 2025 (UTC)

defining formula (P2534) for transitive over (P6609)

I see that you added a defining formula for transitive over. That's an interesting idea. Have you thought about how to do this for more items? I'm thinking of things like defining instances of plumber (Q252924) as items that have occupation (Q12737077) plumber (Q252924). To make this useful the definition could be a SPARQL query against the RDF dump of Wikidata. But then defining formula (P2534) might not be suitable and a new property could be better, perhaps along with a qualifier to state what the formula/query is and how it is to be used so that different definitions could be provided, e.g., SPARQL query on the RDF dump and Horn clause on some other formalization of Wikidata. Peter F. Patel-Schneider (talk) 10:45, 1 July 2025 (UTC)
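The kind of definition described above could be sketched as a SPARQL CONSTRUCT query over the RDF dump, where the WHERE clause is the rule body and the CONSTRUCT template is the inferred conclusion. This is only an illustration of the idea, assuming the usual occupation property (P106), not a concrete modelling proposal:

```sparql
# Sketch of the rule "items with occupation plumber are instances of plumber",
# expressed as a CONSTRUCT query over the Wikidata RDF dump.
# Assumption: occupation is queried through the direct property wdt:P106.
CONSTRUCT {
  ?person wdt:P31 wd:Q252924 .   # inferred: ?person is an instance of plumber
}
WHERE {
  ?person wdt:P106 wd:Q252924 .  # stated: ?person has occupation plumber (Q252924)
}
```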

Searching, there is http://ns.inria.fr/sparql-extension/rule.html as a language which would use raw sparql for the definition of the rules. Maybe a Wikidata "sparql" datatype would help, we could ask the devteam if the ontology/reasoning projects can gain traction on this.
There seem to be other options, like https://spinrdf.org/ and tools to translate inference rules from SWRL into SPIN: https://ideas.repec.org/a/igg/jswis0/v16y2020i1p87-115.html . SHACL seems to be able to do inference, but Wikidata uses ShEx, which does not, if my information is correct.
It seems that a "CONSTRUCT" notation in SPARQL would be appropriate for our needs. author  TomT0m / talk page 11:12, 1 July 2025 (UTC)
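As an illustration, the semantics of transitive over (P6609) could be written as one generic CONSTRUCT rule. This is only a sketch; it assumes the wikibase:directClaim mapping from property entities to their wdt: predicates:

```sparql
# Sketch: "transitive over" (P6609) as a CONSTRUCT rule.
# If property ?p is transitive over ?q, then (?x ?p ?y) and (?y ?q ?z) imply (?x ?p ?z).
CONSTRUCT {
  ?x ?p ?z .                        # the inferred triple
}
WHERE {
  ?prop wdt:P6609 ?qprop .          # ?prop is "transitive over" ?qprop
  ?prop wikibase:directClaim ?p .   # map the property entities to their wdt: predicates
  ?qprop wikibase:directClaim ?q .
  ?x ?p ?y .
  ?y ?q ?z .
}
```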
There are several ways that "defining formulae" could work. One is inside Wikibase, where rules could add new information directly into Wikidata. But there is no RDF there, so RDF-based rule systems don't apply. Another is inside the query service, where the rules could add new triples to the RDF graph. There, RDF-based rule systems could apply, but the changes would only be seen when querying. A third way would be to have SPARQL queries against the RDF dump be available from Wikidata (somewhat as some constraints use the RDF dump, I believe). This is kind of a poor man's approach, as it wouldn't implicitly use any other rules or queries. The advantage is that it would work with the existing WDQS.
Solutions that use RDF or SPARQL rules would depend on getting Wikidata into a system that implements some sort of rules on top of RDF or SPARQL. I am only aware of one such system that can handle the Wikidata RDF dump: RDFox. But that system is main-memory only and proprietary. Even then, I don't know if RDFox can handle Wikidata plus rules.
A problem with forward-chaining rules is that they should keep track of the support for the information they add and remove it when the support is removed. As far as I know, bots that add information to Wikidata don't do this in all cases, which is a problem.
This is something that could be discussed in Project Reasoning. Peter F. Patel-Schneider (talk) 14:09, 2 July 2025 (UTC)
I was just discussing the format in which to store the rules; how the rules are used is a totally different question. We can't really hope for an implementation at the level of the Wikidata software itself; it's already hard to get primitives such as fetching an inverse property value, which would be necessary for such an implementation.
But yes, if we end up implementing an infrastructure on top of these rules, we need to know how to keep track of the support for the inferred information. In the spirit of minimising redundancy, though, what is deducible from these rules does not need to be explicitly added, which solves the problem of revoking the data when it is deprecated or removed altogether, provided the rules work on live data, as SQID does (ping @Markus Krötzsch if he is around). But the rules are not stored on-wiki. author  TomT0m / talk page 14:24, 2 July 2025 (UTC)
I would prefer multiple ways of stating the rules, including at least a generalization of Horn rules directly over Wikidata, as in the article you mention, and SPARQL queries over the RDF dump. Whether the results of the rules are recorded is a separate question, as you say, but there are efficiency concerns here. If the results are not recorded, then interfaces that need the results may be too slow. If the results of the rules are recorded, then there is a need to retract a result when its support is removed, which has its own cost in both time and space. What I do know is that the current situation is much less than ideal.
What I would like is some way of both getting some useful consequences and alerting users when what is stored is not nearly the full picture. Peter F. Patel-Schneider (talk) 16:35, 2 July 2025 (UTC)
Forward chaining tends to make the number of facts explode. Think of the very simple transitivity inferences for "located in" or "instance of": those two rules alone can multiply the number of statements, which is already quite large, by one or two orders of magnitude. This may even have a cost in subsequent queries and slow them down, because the cardinality of the triples for a property gets bigger for all queries, even when many of the involved triples are not relevant. (The class tree of Wikidata is one big tangle; with inferences this would at least become apparent, whether or not we should do anything with it.)
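For comparison, the query service already lets such transitive consequences be computed on demand with property paths, instead of materialising them. A sketch, using Paris (Q90) and located in the administrative territorial entity (P131):

```sparql
# On-demand transitive closure via a property path: no materialised triples needed.
SELECT ?entity WHERE {
  wd:Q90 wdt:P131* ?entity .  # every territorial entity Paris (Q90) is transitively located in
}
```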
A compromise would be caching, as with queries: you're interested in something, you draw some conclusions, and you keep them for a while. The strategy would depend on how some query results are cached, just as now, keeping some results "hot" for a while; this resembles the QLever caching strategy.
Backward chaining can be more efficient, but you need a clear goal. In any case, this is a job for an inference engine: it is given a problem and a query engine, and must be smart in its use of the query engine to reach the conclusion/result set as fast as possible. It would require a program on top of the query service. author  TomT0m / talk page 17:00, 2 July 2025 (UTC)
The "best" is to proactively pre-compute all and only the consequences that are going to be useful soon enough that it is better to pre-compute them than to compute them on demand. But this ideal situation requires knowledge of the future. Without such knowledge there are several ways to try to be efficient. Each of them has problems and each can lead to situations where a simple request can take a very long time and each of them can lead to situations where a very large amount of memory is required.
There are two questions to be answered. 1/ Is there a system that can handle Wikidata plus a significant number of rules that we would like to include and have reasonable performance? 2/ Is there a system that can provide a partial solution that reasonably handles a reasonable number of requests? I don't know of any system for question 1, so I'm looking for a solution for question 2 even if the number of requests it can handle is limited, e.g., to instances of some classes, and the answers it provides are not necessarily complete. Peter F. Patel-Schneider (talk) 18:24, 2 July 2025 (UTC)
There is also this work: https://arxiv.org/pdf/2304.03375 , which writes rules to handle qualifiers in reasoning over Wikidata and compiles its rules to SPARQL. author  TomT0m / talk page 14:39, 2 July 2025 (UTC)

Revision from Property:P31

Hi! There is a mix-up with the image used in the articles on Alfredo Domínguez Batista and Alfredo Domínguez Romero. Those are two different people. I edited the correct image; can you verify, please? Wawitasny7 (talk) 15:37, 2 July 2025 (UTC)

I could, but I don't know anything about them. I just reverted your edit on instance of (P31) because you edited the Help:Property page by mistake, which is about neither of these persons. It happens; I have reverted my own edits more than once. The revert was mainly to let you know that something went wrong and that you might want to check.
If you want to edit the items correctly, one is Alfredo Domínguez Batista (Q4723760) and the other is Alfredo Domínguez Romero (Q16941710). You can reach the Wikidata item from the Wikipedia article es:Alfredo Domínguez Romero via the menu on the right: Herramientas => (the last item of the list) En otros proyectos => Elemento de Wikidata. Then you can add the relevant picture; that is the right place. author  TomT0m / talk page 16:47, 2 July 2025 (UTC)
Oh, now I understand. Thank you very much for the explanation and your patience. Wawitasny7 (talk) 01:24, 3 July 2025 (UTC)