2nd ObjectWeb Benchmarking Workshop, Prague, July 7th 2004
- Exploring the Performance of Distributed Applications on Clusters via Tracing and Paje Visualization (Jean-Marc Vincent, ID-IMAG)
Abstract: An approach is presented to help programmers understand the behavior of distributed applications
executed on clusters. It is based on execution tracing and trace-based visualization. This approach needs to be applied
coherently at several abstraction levels, from the runtime system to the high-level programming language.
The Paje software tool allows dynamic browsing of traces of distributed applications according to
a programming model. Illustrations of pattern analysis are given in several contexts: cluster supervision, Java distributed
applications, and MPI parallel programs.
- CLIF: work in progress in 2004 (Bruno Dillenseger, France Telecom R&D)
Abstract: Current CLIF developments and perspectives until end of 2004.
- CORBA benchmark (Philippe Merle, LIFL)
Abstract: Evaluation of various ORBs.
- Capturing benchmarking environments and multi-level monitoring (Emmanuel Cecchet, INRIA)
Abstract: In this talk, we will present our current developments to automatically capture the system environment used for benchmarks and how to map this with monitoring information. We will also present a brief status of LeWYS, the ObjectWeb monitoring framework, the available probes, channels and ongoing work on monitoring repositories. Various examples of mixed online/offline and multi-level monitoring will also be given to show the flexibility of the framework.
- Writing System Probes on Various Platforms (Petr Tuma, Charles University)
Abstract: A brief overview of the types of information that can be obtained from various operating system platforms
with regard to resource usage and other useful data that a benchmark should generally monitor. Focus will be put on the Linux,
Solaris and Windows operating system platforms; special attention will be paid to the disruption that gathering this
information can cause to a benchmark.
- Regression Benchmarking Framework Architecture (Tomas Kalibera, Charles University)
Abstract: A first-stage design of a regression benchmarking framework will be presented. The focus of the design is on flexibility, as the framework must support various benchmarks on various platforms, both existing and future ones.
- Get the slides of the different presentations.
1st ObjectWeb Benchmarking Workshop, Grenoble, April 5th 2004
- CLIF: status and plans for 2004 (Bruno Dillenseger, France Telecom R&D)
Abstract: The CLIF project intends to provide a load injection framework based on the Fractal
component model, for scalable performance testing of any kind of system. This presentation will show
the current status and feedback as well as development plans for year 2004, and will outline possible contributions. Slides.
- IDX-Tsunami TCP/IP benchmarking framework (Mickael Remond, Erlang)
Abstract: IDX-Tsunami is a distributed load testing tool. It is protocol-independent
and can currently be used to stress HTTP, SOAP and Jabber servers. This talk will present IDX-Tsunami's
main achievements and explain how the framework could be improved in the next steps of development. Slides
- LeWYS is Watching Your System (Emmanuel Cecchet, INRIA)
Abstract: LeWYS is a monitoring framework based on Fractal components. This talk will present
the design of LeWYS and show how it can be used in benchmarking frameworks or for building supervision consoles. Slides
- JOnAS performance benchmarking (Olivier Gusching, Bull)
Abstract: Presentation of what has been done with JOnAS benchmarking, current work
(testing JOnAS in a cluster environment with Grinder and a sample J2EE application, performance
comparison with JBoss), and the workplan for the coming months. Slides
- Regression Benchmarking (Tomas Kalibera, Charles University)
Abstract: The goal of regression benchmarking is the automatic detection of performance
regressions during application development. This presentation will show the current status and
development plans, which include a results repository and automatic analysis. Slides
Outcome of the workshop discussions.
Workshop on Middleware Benchmarking, OOPSLA 2003
Papers presented in session 1, Middleware Benchmark Construction.
Papers presented in session 2, Workload Generation And Characterization.
- Workshop summary.
Cluster And Performance Meeting On October 15th 2002
- Workplan presentation by François Exertier [pdf]
- Report on the use of Mod_jk for clustering at Web tier
level by Goulven Le Jeune [pdf]
- Sizing and architecture presentation by François [pdf]
- Presentation by Emmanuel Cecchet [pdf]
- RUBiS benchmark results by Emmanuel Cecchet [pdf]