Event-Driven Programming with Mendix Business Events

Event-Driven Programming

  • Applications may use business events as a signal to let other applications know that something interesting has happened.
  • For instance, if a customer makes a purchase from a web store,
    • This can be communicated by the online store using a business event called New Order Placed.
    • Any application can choose to be notified of this event and handle subsequent actions in real time (for example, to send an invoice or handle packaging and logistics).
  • Open standards ensure seamless integration of the Mendix Data Hub Broker and Business Events with non-Mendix components in your application landscape.
    • CloudEvents is an open standard (supported by the Cloud Native Computing Foundation [CNCF]) that specifies a common technical format for the messages exchanged between applications.
    • Business event services are described using AsyncAPI, an open standard for contracts that describe message-based services.
    • The Mendix Event Broker uses Apache Kafka as its messaging system; Kafka is a proven, scalable event-streaming platform.
  • When we deploy our apps to the free cluster, a free event broker is provided and configured automatically.
  • There is a cap of 1000 events per app per day in the Mendix Free App environment.
  • Events are delivered and distributed by the Data Hub Event Broker.
  • To deliver these events reliably between your applications, a Mendix Event Broker is required.
  • Business event services offer contracts that specify which events are available and how they should be designed for client application developers. The open AsyncAPI format is the foundation of this contract.
  • Events are only sent to other applications if your microflow completes successfully. If the microflow fails, your entity changes are rolled back and your published business events are rolled back with them, so they are never delivered to the subscribed applications.
  • Apps publishing events do not need to know who needs to receive events, and apps receiving events do not have to call the publishing app.
  • However, to be deployed in production environments, the Mendix Business Events module requires an event broker.
  • To deploy an unlimited number of apps to production environments in the Mendix Cloud, we must purchase a licence for the Mendix Event Broker.
  • Although licences for the Mendix Event Broker are available for all regions, once a region has been chosen we can only use that one region (there is no multi-region support).
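To make the CloudEvents point concrete, here is a minimal sketch of what a "New Order Placed" event could look like as a CloudEvents 1.0 envelope. The attribute names follow the CloudEvents specification, but every value (type, source, id, payload) is invented for illustration; the exact envelope Mendix emits may differ.

```python
import json

# A hypothetical CloudEvents 1.0 envelope for a "New Order Placed" event.
# Field names follow the CloudEvents spec; all values are made up.
event = {
    "specversion": "1.0",                         # CloudEvents spec version
    "type": "com.example.store.NewOrderPlaced",   # event type (reverse-DNS style)
    "source": "/webstore/orders",                 # which app/context produced it
    "id": "a1b2c3d4-0001",                        # unique per event
    "time": "2023-03-02T13:58:31Z",               # RFC 3339 timestamp
    "datacontenttype": "application/json",
    "data": {"orderId": "1001", "amount": 49.95},  # business payload
}

print(json.dumps(event, indent=2))
```

Because every producer and consumer agrees on this envelope shape, a non-Mendix service can handle the same event without any Mendix-specific tooling.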

Prerequisites to Build

  • Mendix Studio Pro version 9.18 or above.
  • The Mendix Business Events module, downloaded from the Marketplace.
  • Docker installed on your machine.
  • Visual Studio Code.
  • The Mendix Event Broker Toolkit.


Publish/Subscribe (Pub/Sub)

  • You can build systems of event producers and consumers, known as publishers and subscribers, with the help of pub/sub.
  • Publishers communicate with subscribers asynchronously by broadcasting events.
  • Pub/Sub is an asynchronous and scalable messaging pattern that separates the services responsible for producing messages from those responsible for processing them.
  • With latencies on the order of 100 milliseconds, pub/sub systems enable services to communicate asynchronously in near real time.
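The decoupling described above can be sketched in a few lines: the publisher only knows the broker and a topic name, never the subscribers. This is an in-memory toy broker for illustration only, not the Mendix Event Broker or Kafka.

```python
from collections import defaultdict
from typing import Callable

class ToyBroker:
    """Minimal in-memory pub/sub broker: publishers and subscribers
    share only a topic name, never references to each other."""

    def __init__(self) -> None:
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver to every subscriber of the topic; the publisher
        # does not know (or care) how many there are.
        for handler in self._subscribers[topic]:
            handler(event)

broker = ToyBroker()
received = []
broker.subscribe("new-order-placed", received.append)  # e.g. invoicing app
broker.subscribe("new-order-placed", received.append)  # e.g. logistics app
broker.publish("new-order-placed", {"orderId": "1001"})
print(len(received))  # both subscribers got the same event
```

Note that adding a third subscriber requires no change to the publisher, which is exactly the property that makes event-driven landscapes easy to extend.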

Architecture of Business Events

  • To watch events locally, we need to install Docker, start the local event broker from the event broker tool, and run the following command in the tool's terminal to consume the local topic:

docker exec kafka /bin/kafka-console-consumer --topic local --from-beginning --bootstrap-server kafka:19092



  • The producer application produces a message, and that message is captured and stored by the Data Hub Event Broker.
  • The data in the event broker is in JSON (JavaScript Object Notation) format, which organizes the data in a readable way.
  • As soon as the Event Broker receives an event from the producer application in JSON format, it sends the JSON data to the consumed event services, and the event handler microflow is triggered and executed.
  • The Event Broker data appears as follows:

PS C:\Users\I5583\Downloads\event-broker-tools-main> docker exec kafka /bin/kafka-console-consumer --topic local --from-beginning --bootstrap-server kafka:19092
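On the consuming side, the handler's job essentially amounts to deserializing this JSON payload back into typed fields. The following is a rough Python analogue of what the consumed business event entity gives you in Mendix; the payload mirrors the sample event used in this article, and the field access is purely illustrative.

```python
import json

# Sample payload in the shape the Event Broker delivers (values from the
# article's example; in a real app these come off the Kafka topic).
raw = ('{"Name":"Test","Email":"Test@mail.com","PhoneNumber":1234567890,'
       '"EventTime":"2023-03-02T13:58:31.5462","EventValue":8888,'
       '"DoorNo":"88","City":"Visakhapatnam","State":"Andhra Pradesh",'
       '"Country":"India"}')

# Deserialize the JSON into typed fields, much as the consumed
# business event entity does before the event handler microflow runs.
event = json.loads(raw)
print(event["Name"], event["EventValue"])
```

In Mendix this mapping is generated for you from the AsyncAPI contract, so the handler microflow receives a ready-made entity rather than raw JSON.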



Steps for Implementation:

  • Firstly, we need two applications: a publishing application and a consuming application.
  • In both applications we need to download the BusinessEvents module from the Marketplace.
  • Publishing Application
    • Create an event entity and then add its attribute fields.
    • Specialize that entity from “BusinessEvents.PublishedBusinessEvent”.
  • Create an EventEntity object in a microflow, pass that object to the NewEdit page, and set that microflow as the home page.


  • Call a microflow from the Submit button and publish the event using the Publish Business Event activity.
  • Apply error handling to the Publish Business Event activity and log a message on failure.
  • Create a new Publish Business Event service, select the entity in the business event card, and synchronize the attributes.
  • Then export the service contract (as a YAML file, .yaml) by clicking the Export AsyncAPI Service Contract button.
  • Go to Application Settings → Configuration tab → edit the configuration → set the values of the constants (ChannelName and ServerUrl) and run the application.
  • Event Broker
    • To test on your development workstation, run the Event Broker on your machine using Docker.
    • Download and Install Docker Desktop.
    • The required configuration can be found in the local setup for the event broker tool.
    • Download the event broker tool.
    • Open VS Code → Extensions → Docker (Install)
    • Open event-broker-tools-main downloaded folder in VS Code.
    • Run the following command in the terminal:
    • docker exec kafka /bin/kafka-console-consumer --topic local --from-beginning --bootstrap-server kafka:19092
  • You will see that the server has started and the consumed event is printed, for example:

{"Name":"Test","Email":"Test@mail.com","PhoneNumber":1234567890,"EventTime":"2023-03-02T13:58:31.5462","EventValue":8888,"DoorNo":"88","City":"Visakhapatnam","State":"Andhra Pradesh","Country":"India"}
  • Consuming Application
    • Go to Application Settings → Configuration tab → click Edit the configuration → set the values of the constants (ChannelName and ServerUrl) and run the application.
  • Create a Consumed Business Event service and import the AsyncAPI file (the service contract) that was exported by the publishing application.
  • You will then find the EventEntity, specialized from “BusinessEvents.ConsumedBusinessEvent”.
  • You can now see a microflow that is triggered as soon as the JSON data arrives from the publishing application via the Event Broker.
  • Generate the overview pages for the event entity and add them to the navigation.
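The AsyncAPI contract exchanged between the two apps has a small, well-defined skeleton. Below is an illustrative outline of an AsyncAPI 2.x document, built as a Python dict and printed as JSON for readability; Studio Pro actually exports YAML, and the channel and field names here are invented, not the exact Mendix output.

```python
import json

# Illustrative skeleton of an AsyncAPI 2.x contract, similar in shape to
# the one the publishing app exports. Names are hypothetical.
contract = {
    "asyncapi": "2.0.0",
    "info": {"title": "PublishingApp Business Events", "version": "1.0.0"},
    "channels": {
        "event-entity": {
            "subscribe": {  # what consumers of this channel receive
                "message": {
                    "payload": {
                        "type": "object",
                        "properties": {
                            "Name": {"type": "string"},
                            "EventValue": {"type": "integer"},
                        },
                    }
                }
            }
        }
    },
}

print(json.dumps(contract, indent=2))
```

When the consuming app imports this file, Studio Pro reads the payload schema to generate the consumed event entity and wire up the event handler microflow.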


Know More About Mendix Event Broker

  • The Mendix Event Broker is a highly reliable event broker created to transfer business events between Mendix applications.
  • It enables Mendix applications to be modelled so that they react to events in your organisation almost instantly; for instance, a workflow that begins an approval process when a customer files a claim, or an automation microflow that updates stock.
  • The Mendix Event Broker is used in conjunction with the Studio Pro business event modelling tools and the Marketplace's Business Events module.
  • When deploying apps that use Business Events to licenced nodes in the Mendix Cloud, an Event Broker is always necessary.
  • An Event Broker must be purchased to be used with licenced apps in the Mendix Cloud. To order an Event Broker, get in touch with your account manager, CSM, or DataHub.

For more help with our low-code services and Mendix, get in touch with us today.

Contact us


Mendix Business Events allows developers to create highly responsive and scalable software applications that can adapt to real-time events and user interactions. By designing applications as a series of small, loosely coupled events, developers can build applications that are more modular, easier to maintain, and less prone to errors. This approach also enables applications to be more flexible, as developers can easily add or remove events without disrupting the overall system.

The decoupled nature of event-driven applications means that different parts of the application can operate independently, without waiting for other parts to finish. This can improve the overall speed and performance of the application, as well as the user experience.

Overall, Mendix Business Events provides a powerful way to create software applications that can respond in real time to user actions and system events, and that can adapt to changing business requirements. As the digital landscape continues to evolve, event-driven programming will undoubtedly play an increasingly important role in creating the next generation of intelligent and responsive software applications.