Performance Benchmarking Methodologies for Real-Time Integration Middleware Platforms

Authors

  • Suman Neela, Visvesvaraya Technological University, India

DOI:

https://doi.org/10.63282/3117-5481/AIJCST-V5I5P104

Keywords:

Middleware Benchmarking, Real-Time Integration, Event-Driven Architecture, Map-Bench, Latency Measurement, Throughput Evaluation, Fault-Aware Testing, Enterprise Workload Modeling

Abstract

Real-time integration middleware sits at the core of how modern enterprises move data, trigger actions, and keep distributed systems in sync. Platforms such as Apache Kafka, RabbitMQ, IBM MQ, and Apache Pulsar carry the weight of mission-critical operations, from payment processing and supply chain coordination to IoT data collection and distributed analytics. Yet, despite how heavily enterprises depend on these platforms, the methods used to benchmark their performance remain surprisingly immature. Most current approaches to testing them, such as infrastructure-level checks, application load tests, or vendor-run assessments, fail to capture the behaviors that most affect performance in production: how well they absorb sudden spikes in traffic, how they respond to backpressure, and whether message durability guarantees slow them down too much. This article covers the theoretical basis for middleware performance evaluation, maps out where current practice falls short, and offers a detailed look at the Middleware-Aware Performance Benchmarking Framework (MAP-Bench). MAP-Bench combines a standardized set of measurements, enterprise workload models, fault-aware testing, and a more rigorous approach to evaluating architecture into one complete method. The practical value extends to platform selection, capacity planning, SLA design, and performance tuning. The goal is to move middleware benchmarking away from narrow load testing and toward something that actually reflects how these systems perform when it counts.
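
To make the kind of measurement described above concrete, the sketch below shows a minimal, illustrative latency and throughput probe with a single traffic spike. It is not MAP-Bench and does not use any specific broker's API: the send and receive callables are hypothetical placeholders that a reader would wrap around a real producer and consumer, and the percentile helper is a simplification.

# Illustrative sketch only (not MAP-Bench): a minimal burst-aware latency/throughput
# probe. The send/receive callables are hypothetical placeholders for a real
# middleware client; everything else uses the Python standard library.

import time
import statistics
from typing import Callable, List, Tuple


def burst_schedule(base_rate: float, burst_rate: float, burst_start: float,
                   burst_len: float, duration: float) -> List[float]:
    """Send offsets (seconds) for a steady load with one traffic spike."""
    offsets, t = [], 0.0
    while t < duration:
        rate = burst_rate if burst_start <= t < burst_start + burst_len else base_rate
        offsets.append(t)
        t += 1.0 / rate
    return offsets


def run_probe(send: Callable[[bytes], None],
              receive: Callable[[], Tuple[bytes, float]],
              schedule: List[float]) -> List[float]:
    """Publish timestamped probes on the schedule, then drain and compute latencies."""
    start = time.perf_counter()
    for offset in schedule:
        wait = offset - (time.perf_counter() - start)   # open-loop pacing
        if wait > 0:
            time.sleep(wait)
        send(repr(time.perf_counter()).encode())        # payload carries its send time
    latencies = []
    for _ in schedule:
        payload, received_at = receive()                # assumed to return (payload, recv_time)
        latencies.append(received_at - float(payload.decode()))
    return latencies


def summarize(latencies: List[float], duration: float) -> dict:
    """Report tail percentiles and sustained throughput, not just the mean."""
    ordered = sorted(latencies)

    def pct(p: float) -> float:
        return ordered[min(len(ordered) - 1, int(p * len(ordered)))]

    return {
        "p50_ms": pct(0.50) * 1e3,
        "p95_ms": pct(0.95) * 1e3,
        "p99_ms": pct(0.99) * 1e3,
        "mean_ms": statistics.mean(latencies) * 1e3,
        "throughput_msg_s": len(latencies) / duration,
    }

In practice the two callables would wrap a real client, and the run would be repeated across burst intensities and durability settings so that tail latency under stress, rather than a single average, drives the comparison.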

Published

2023-09-10

Issue

Vol. 5 No. 5 (2023)

Section

Articles

How to Cite

[1] S. Neela, “Performance Benchmarking Methodologies for Real-Time Integration Middleware Platforms”, AIJCST, vol. 5, no. 5, pp. 39–46, Sep. 2023, doi: 10.63282/3117-5481/AIJCST-V5I5P104.
