More about Performance Testing
This topic covers important aspects of performance testing and some of the industry technologies.
Performance Testing In The Cloud
Cloud technology has changed the way applications work and the expectations of users in terms of features such as real-time syncing.
The key difference when running a performance test for a cloud application is that the application is evaluated for its non-functional attributes (usability, scalability and reliability) rather than for functional correctness.
Load, stress and performance tests all study the user experience while the application is under load. However, setting up the performance test infrastructure is a key activity before starting a test, and several factors come into play:
- Environment usage must be planned carefully since the resources are costly. Poor planning leads to under-utilized infrastructure.
- Most of the time, the servers are managed by one team and utilized by another. As infrastructure grows more complex, teams become more dependent on each other, causing delays in the core work of software development.
- Software that needs to be installed on the servers can be expensive.
- Along with the financial investment, infrastructure management requires a significant amount of time investment with activities like procurement of servers, software installation, and so on.
Due to the factors listed above, setting up a performance testing environment can be very costly, so optimization is key. It therefore makes sense to outsource this activity. Running the test infrastructure in the cloud provides the following benefits:
- Focus on core performance testing activity
- Pay-per-use
- Virtual services. You can easily set up test instances using virtual services at minimal cost
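Whether run in-house or in the cloud, load generation itself comes down to many virtual users issuing timed requests concurrently. The sketch below illustrates the idea using only the Python standard library; the throwaway local HTTP server, the thread count and the request count are all assumptions for illustration, not a substitute for a real load testing tool:

```python
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Tiny stand-in service so the sketch is self-contained; in a real
# cloud test this would be the deployed application's endpoint.
class EchoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # silence per-request logging
        pass

server = ThreadingHTTPServer(("127.0.0.1", 0), EchoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

def timed_request(_):
    """One virtual-user request; returns its latency in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

# Fire 50 requests from 10 concurrent "virtual users".
with ThreadPoolExecutor(max_workers=10) as pool:
    latencies = list(pool.map(timed_request, range(50)))

server.shutdown()
print(f"requests: {len(latencies)}, max latency: {max(latencies):.3f}s")
```

Cloud platforms scale this same pattern to thousands of geographically distributed load engines, which is exactly the part that is expensive to build in-house.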
BlazeMeter: Load, Performance and Stress Testing in the Cloud
BlazeMeter's Load Testing Platform for Developers is designed for professional use and is equipped with a self-service, on-demand platform and advanced scripting capabilities leveraging JMeter and Selenium (WebDriver). BlazeMeter can run multiple load tests that easily simulate more than 2,000,000 globally distributed virtual users, from either the public cloud or from behind the corporate firewall, enabling its customers to quickly locate and fix performance bottlenecks.
You can create proprietary test scripts and load scenarios using a graphical web environment. BlazeMeter offers web-based test management, archiving, repository, cloud-based monitoring, rich scripting language, and supports HTTP/S, web-services, XML, TCP, SQL, Login (Flash, images, streaming) and more.
BlazeMeter enables you to write load test-scripts using JMeter and user-experience test-scripts using Selenium. BlazeMeter generates a load based on the JMeter script. The Selenium script is used during the load to automate the launch of real browsers to measure the real end-user experience.
The load and monitoring uses a pre-configured distributed load testing environment. The environment is ready to use and available at all times.
If you are not familiar with Selenium and do not wish to create a Selenium test-script, the system can generate one for you programmatically based on landing pages you provide (using our Google Chrome Extension). The same applies to JMeter: BlazeMeter can generate JMeter scripts automatically.
With BlazeMeter, all you need to do is write the test-scripts, choose the number of load engines and run the test. The system takes care of everything else. Load engines are pre-configured and at your disposal. Detailed graphical reports are generated during the load.
BlazeMeter also offers Enterprise services as well as scripting services, scripting support and full turnkey solutions. For more about our service, please Contact Us.
Performance Testing In Agile
Agile development is a software creation methodology that encourages incremental or iterative development, making an application better with each update, over time.
Performance testing is still a necessary component; however, the workflow is different from traditional corporate environments.
Here are a few tactics designed to help you embrace performance testing within an Agile workflow:
- Team work: Agile is about working in teams, but traditionally performance testers were independent contributors. Today, these testers now work in teams. Not only do they have to find performance bugs, but they also have to collaborate with developers and performance engineers in resolving the issues.
- Start early: Unlike in the past, performance testing now starts early in the development process, and not at the end of it.
- Iterations: As iterative development happens, performance testing must be undertaken at each stage of the process. It is easier to fix new bugs as they arise than at the end of an application's development cycle.
- Over-communicate: Agile requires exhaustive communication, and sometimes over-communication. This reduces lost time and rework.
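Testing at each iteration works best when the comparison against the previous iteration is automated. A minimal sketch of such a regression gate, under the assumption that per-transaction average latencies are collected each iteration (the transaction names, numbers and 10% tolerance below are hypothetical):

```python
# Hypothetical per-iteration regression gate: compare this iteration's
# response times against a stored baseline, flag any transaction that
# slowed down beyond the agreed tolerance.

def regression_gate(baseline_ms, current_ms, tolerance=0.10):
    """Return (passed, failures) comparing per-transaction averages.

    baseline_ms / current_ms: dicts of transaction name -> avg latency (ms).
    tolerance: allowed fractional slowdown (0.10 = 10% slower is OK).
    """
    failures = {}
    for name, base in baseline_ms.items():
        current = current_ms.get(name)
        if current is not None and current > base * (1 + tolerance):
            failures[name] = (base, current)
    return len(failures) == 0, failures

# Illustrative data: "search" regressed by ~38%, the other two are fine.
baseline = {"login": 220.0, "search": 450.0, "checkout": 800.0}
iteration_7 = {"login": 230.0, "search": 620.0, "checkout": 790.0}

passed, failures = regression_gate(baseline, iteration_7)
print(passed, failures)
```

Wiring a check like this into the build means a performance regression is caught in the same iteration that introduced it, which is the whole point of testing iteratively.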
Here are some suggested steps to perform in Agile performance testing:
- Goal: It is essential to understand the goal and vision of the project. A high level overview should be the first step in the process.
- Business targets: Along with goals, businesses may have different targets within an application (for example, response time of less than 1 sec for every page). Understanding these targets is essential to a project's success.
- Setting performance SLAs: Performance test managers collaborate with business and technical stakeholders to set the performance targets or SLAs.
- Identifying the tests and planning for specific scenarios: Based on the goals and targets, identify proper tests, and create proper business scenarios to emulate.
- Strategy document: Create a basic strategy document/wiki page that has information about all key test elements and share it with appropriate stakeholders.
- Test environment: Environment creation and documentation both happen in parallel. Create a testing infrastructure (tool and other setup) to meet testing demands. Replicate production environments.
- Tasks: Identify tasks and delegate jobs where appropriate.
- Scripting & test execution: Create scripts and execute tests based on the scenarios, as identified in the test planning phase.
- Analyze results and add them to the test result reports. Document bugs and add them to bug tracking tools. Publish the results and notify stakeholders.
- After completing these changes, re-testing should occur, and notes should be written about possible app updates.
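The "business targets" and "performance SLAs" steps above become concrete once the test results are checked against them numerically. A small sketch, assuming the target from the earlier example (responses under 1 second) applies at the 95th percentile; the sample latencies are invented for illustration:

```python
# Illustrative SLA check: given raw response times from a test run,
# verify the business target agreed in the planning step (here: the
# 95th percentile of response times must stay under 1 second).

def p95(samples):
    """Nearest-rank 95th percentile of a list of samples."""
    ordered = sorted(samples)
    rank = max(0, int(round(0.95 * len(ordered))) - 1)
    return ordered[rank]

response_times_s = [0.31, 0.42, 0.38, 0.55, 0.61, 0.47, 0.93, 0.52,
                    0.44, 0.39, 0.58, 0.49, 0.72, 0.41, 0.36, 0.66,
                    0.51, 0.45, 0.88, 1.20]

sla_seconds = 1.0
observed_p95 = p95(response_times_s)
meets_sla = observed_p95 < sla_seconds
print(f"p95 = {observed_p95:.2f}s, SLA met: {meets_sla}")
```

Expressing the SLA as a percentile rather than an average is a common choice, because a handful of slow outliers (like the 1.20 s sample above) should not be hidden by many fast responses.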
Performance Testing And Mobile Applications
To deal with the increasing demand for high-performing mobile applications, developers now also need to deliver mobile applications with the same robustness and performance that consumers demand on other computing devices.
However, the mobile frontier has also brought with it new challenges. Listed below are some of the more important things we need to consider:
- Network availability at the location where the device is in use plays a very important role, and there is no way to control this environment.
- Mobile phones make more requests (thus create more load) on the servers compared to desktop devices.
- There are many types of operating systems, and we need to make the app compatible with each.
- Innovations happen quickly in the mobility space, and we need to keep up.
- Executing a performance test and simulating the production behavior of mobile users is essential, but difficult, since users keep moving from one location to another while using the app. Their network capabilities change quickly from one place to another.
Performance testing mobile apps:
- Try to bring the test as close to the production environment as possible.
- Don't worry about the factors that are beyond your control.
- Use network simulators while running the tests, and through these simulators, try decreasing and increasing the available bandwidth to reproduce real-life environments and variables.
- Perform tests that replicate clients getting disconnected from a network.
- When such a disconnection occurs, verify that the app recovers without data loss; persisting key details of the user session helps in such situations.
- While using simulators (during your load tests), try performing the same functionalities using a mobile device and check for the app performance in manual mode.
- While simulating the test, focus only on the key end users (operating systems and devices) that cover 70-80% of your business scenarios, and don't chase exceptions (it is not necessary to cover every device) unless they are business-critical. This will help you save on your resources.
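The bandwidth-variation and disconnection points above can be sketched as a toy simulation. Real tests would use a network simulator or device-level throttling; the profile names, latency figures and loss probabilities below are hypothetical placeholders that only illustrate varying conditions between runs:

```python
import random
import time

# Hypothetical network profiles: per-request latency (seconds) and the
# probability of a dropped request. Not measured data.
PROFILES = {
    "wifi": {"latency": 0.005, "loss": 0.00},
    "4g":   {"latency": 0.020, "loss": 0.01},
    "3g":   {"latency": 0.050, "loss": 0.05},
}

def simulated_call(profile, rng):
    """Simulate one app request under the given network profile."""
    if rng.random() < profile["loss"]:
        raise ConnectionError("packet loss / disconnect")
    time.sleep(profile["latency"])  # crude latency injection
    return "ok"

def run_profile(name, requests=20, seed=42):
    """Run a batch of simulated requests; return (successes, errors)."""
    rng = random.Random(seed)  # seeded so runs are reproducible
    profile = PROFILES[name]
    ok = errors = 0
    for _ in range(requests):
        try:
            simulated_call(profile, rng)
            ok += 1
        except ConnectionError:
            errors += 1
    return ok, errors

print({name: run_profile(name) for name in PROFILES})
```

Sweeping the app's key scenarios across each profile (and asserting that errors are handled gracefully rather than losing data) approximates a user moving between networks far more cheaply than field testing on every device.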