Manual:Domain Events/Outlook
This page provides an outlook on the future of Domain Events in MediaWiki, as of June 2025. Please update it when plans change or things get implemented.
Model events for more entities
So far (MW 1.45), only changes to pages have been modeled as domain events. There is a lot of potential in modeling changes to other kinds of entities as well, such as user accounts, page renderings, and log entries. See Manual:Domain_events/Hierarchy#Outlook for an overview of domain events that have been discussed and conceptualized but not yet implemented.
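For context, consuming an existing page event looks roughly like the sketch below. This is a minimal PHP sketch: the getDomainEventSource() accessor, the PageRevisionUpdatedEvent class, and its TYPE constant reflect the API as of MW 1.44 and may have changed, so treat the exact identifiers as assumptions to be checked against the current code.

    use MediaWiki\MediaWikiServices;
    use MediaWiki\Page\Event\PageRevisionUpdatedEvent;

    // Register a listener for page changes. Listeners are invoked after
    // the transaction that emitted the event has been committed.
    MediaWikiServices::getInstance()->getDomainEventSource()->registerListener(
        PageRevisionUpdatedEvent::TYPE,
        static function ( PageRevisionUpdatedEvent $event ) {
            // React to the change, e.g. update a derived data store.
            wfDebugLog( 'example', 'Page changed: ' . $event->getPage()->getDBkey() );
        }
    );

Modeling changes to user accounts, log entries, and other entities in the same way would give listeners a uniform way to react to them.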
Cross-service integration events
In 2025, the WMF Data Platform Engineering and MediaWiki Interfaces teams explored how Domain Events might be used to better integrate MediaWiki sites with each other and with external services. The result was the Manual:Domain_events/Outlook/Cross-service document; see that document for more detail.
The document discusses the following subjects, among others.
Generalize the broadcasting of events
We have been using the EventBus extension to broadcast changes in MediaWiki to Kafka. EventBus has recently adopted Domain Events for this purpose. However, the mapping from Domain Events to the message schemas used on Kafka is still hard-coded. It would be nice to have a simple, generic mechanism for hooking up event types with Kafka topics and specifying the serializer to use. This would make broadcasting events to services outside MediaWiki a matter of configuration.
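To illustrate, such a mechanism might boil down to a configuration map like the one below. This is purely hypothetical: neither the $wgDomainEventStreams setting nor the PageChangeSerializer class exists; they merely stand in for "event type → topic + serializer" bindings, and the topic names are made up.

    // Hypothetical: bind domain event types to Kafka topics and serializers.
    $wgDomainEventStreams = [
        'PageRevisionUpdated' => [
            'topic' => 'mediawiki.page_change.v1',
            'serializer' => PageChangeSerializer::class,
        ],
        'PageDeleted' => [
            'topic' => 'mediawiki.page_change.v1',
            'serializer' => PageChangeSerializer::class,
        ],
    ];

With such a mapping, a generic relay listener could subscribe to each configured event type, run the configured serializer, and hand the result to EventBus, with no per-event glue code.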
Enable receiving events from other services
Currently, MediaWiki cannot receive events from services outside of MediaWiki over an event bus. It would be useful to add that capability. However, there are a number of open questions to resolve:
- How do extensions specify what external event sources they want to listen to?
- How should events get from Kafka into MediaWiki?
  - Should we rely on a standalone service like ChangePropagation to consume Kafka events and relay them to MediaWiki over an API?
  - Should there be a native PHP Kafka consumer running as a maintenance script? Note that we would need to run one such script per wiki site.
  - Alternatively, the maintenance script could consume events from STDIN, with a dedicated Kafka consumer running in a separate process (see the sketch after this list).
- How and how often should consumers commit their offsets?
- How granular should topics be? One topic per emitter seems obvious, but it is less clear whether events for all wikis should share a single topic or be on separate topics.
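As an illustration of the STDIN variant mentioned above, the sketch below shows a hypothetical maintenance script that reads one JSON-encoded event per line and hands it off. The class name, the dispatch step, and the surrounding pipeline are all assumptions; no such ingress mechanism exists yet.

    require_once __DIR__ . '/Maintenance.php';

    use MediaWiki\Maintenance\Maintenance;

    // Hypothetical: reads one JSON-encoded event per line from STDIN,
    // written to the pipe by a Kafka consumer running in a separate process.
    class ConsumeExternalEvents extends Maintenance {
        public function execute() {
            while ( ( $line = fgets( STDIN ) ) !== false ) {
                $event = json_decode( $line, true );
                if ( !is_array( $event ) ) {
                    $this->error( 'Skipping malformed event: ' . trim( $line ) );
                    continue;
                }
                // Placeholder: route the event to whatever ingress
                // mechanism MediaWiki ends up providing.
                $this->output( 'Received event: ' . ( $event['type'] ?? 'unknown' ) . "\n" );
            }
        }
    }

    $maintClass = ConsumeExternalEvents::class;
    require_once RUN_MAINTENANCE_IF_MAIN;

A per-wiki pipeline could then look like kafka-console-consumer ... | php maintenance/run.php ConsumeExternalEvents --wiki=somewiki, which keeps the Kafka client out of the PHP process at the cost of one pipeline per wiki.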