...
- jmeter
- Open Source
- can be run distributed
    - Heavily reliant on its GUI to generate tests
    - Generates XML test plans that are then run
- http://jmeter.apache.org/
- gatling
    - Has a DSL based on Scala
- can be run distributed
- http://gatling.io/#/
- grinder
- Open Source
    - Allows test scripting in Java, Jython, and Clojure
- can be run distributed
- No recent maintainers
    - Would be a great base for our own framework in some future project
- https://sourceforge.net/p/grinder/code/ci/master/tree/
- spf4j
- (simple performance framework for java)
    - This is more of a performance monitoring/metrics library, but it could be useful
- http://zolyfarkas.github.io/spf4j/
- locust.io
- Open Source
- Python for writing tests
- can be run distributed
- http://locust.io/
- tsung
- xml based
- can be run distributed
    - (written in Erlang, just a side note)
> **Note:** A large reason for preferring Locust is the rapid speed at which we can prototype and develop tests. The minimal investment means that down the road, when we either invest in creating our own performance testing framework and/or find a suitable long-term solution, these tests can easily be ported or thrown away.
An example of Locust, which I would personally choose as the prime candidate, in ~30 lines of code:
```python
import locust
from locust import task


class IntrigueTasks(locust.TaskSet):
    def on_start(self):
        self.client.verify = False

    @task
    def index(self):
        self.client.get('/search/catalog/')

    @task
    def sources(self):
        self.client.get('/services/catalog/sources')

    @task
    def catalogid(self):
        self.client.get('/search/catalog/internal/localcatalogid')

    @task
    def workspace(self):
        self.client.get('/search/catalog/internal/enumerations/metacardtype/workspace')


class IntrigueUser(locust.HttpLocust):
    task_set = IntrigueTasks
    min_wait = 5000
    max_wait = 10000
    host = "https://example.com:8993"
```
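For context on the `min_wait`/`max_wait` attributes above: they control simulated user think-time, with each simulated user sleeping a uniformly random interval in that range (in milliseconds) between tasks. A minimal stdlib-only sketch of that behavior (the `pick_wait_ms` helper is my own illustration, not part of Locust's API):

```python
import random


def pick_wait_ms(min_wait=5000, max_wait=10000):
    """Mimic Locust's think-time: a uniformly random wait, in
    milliseconds, between min_wait and max_wait inclusive."""
    return random.randint(min_wait, max_wait)


# Every sampled wait falls within the configured bounds.
samples = [pick_wait_ms() for _ in range(1000)]
print(5000 <= min(samples) and max(samples) <= 10000)  # → True
```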
...