Making Sense of the Numbers - DOs and DON'Ts of Quality Performance Testing

A video recording of Making Sense of the Numbers - DOs and DON'Ts of Quality Performance Testing is available.

A quality performance test consists of three stages: (1) creating representative user scenarios; (2) scripting the actions of the simulated users; and (3) analyzing the results of the test. Unfortunately, most tests are plagued by shortcuts taken at one of these three stages. Whether the mistake is skipping a common user path, choosing the wrong tool, or trusting bad data, a breakdown anywhere in the process leads to unnecessary fallout and likely little improvement in the site's performance. This presentation is a collection of lessons learned from a number of high-concurrency load tests, both effective and failed. Through these lessons, we'll walk through each stage in depth to ensure that every load test is valid and beneficial the very first time.
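The three stages above can be sketched in miniature. This is a hedged illustration, not a real load tool: the scenario names, step timings, and the `time.sleep` standing in for an actual HTTP request are all assumptions made for the example.

```python
import concurrent.futures
import statistics
import time

def simulated_user(scenario):
    """Run one scripted user scenario and return per-step latencies.

    In a real test each step would issue an HTTP request against the
    site under test; here a sleep stands in for the server's work.
    """
    latencies = []
    for step, duration in scenario:
        start = time.perf_counter()
        time.sleep(duration)  # placeholder for the real request
        latencies.append((step, time.perf_counter() - start))
    return latencies

# Stage 1: a representative scenario -- a common "browse then search" path
# (step names and timings are illustrative).
scenario = [("load_front_page", 0.01), ("run_search", 0.02)]

# Stage 2: script N concurrent simulated users.
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(simulated_user, [scenario] * 10))

# Stage 3: analyze -- aggregate latency per step across all users.
per_step = {}
for run in results:
    for step, latency in run:
        per_step.setdefault(step, []).append(latency)

for step, values in per_step.items():
    print(f"{step}: median={statistics.median(values):.3f}s "
          f"max={max(values):.3f}s")
```

The point of the sketch is that each stage produces an artifact the next stage depends on: a shortcut at stage 1 (an unrepresentative scenario) invalidates everything downstream, no matter how careful the analysis.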

While this session is highly technical, the lessons themselves will attempt to address the concerns of everyone in the process, including sysadmins, developers, and even project managers. Performance testing is a critical component of any site launch, and it involves the entire project team. Sysadmins will benefit from knowing how to correlate test data to the different layers of the hosting stack. Developers will learn how to write precise tests and analyze their results. Project managers will gain the knowledge required to ensure the tests are accurate and indicative of real site performance.

Schedule info
Status: Accepted
Time slot: Wednesday, May 22 - 03:45pm-04:45pm
Room: OR 201 - Phase2

Session Info
Speaker(s):
Track: Coding + Development
Experience level: Intermediate

Comments

I will attend this session for sure . . . I think where Drupal is going with the larger projects, performance testing is going to become routine and will be automated and built into CI in a lot of cases.

As much as I want to cover the technical aspects of how the test runs, it's also vitally important that the right things are tested and that the numbers mean something. As DevOps teams become more ubiquitous, I'm seeing a huge dearth of smart testing practices as a whole.