== Methodology ==

=== Performance testing web applications ===

According to the Microsoft Developer Network, the performance testing methodology consists of the following activities:

# '''Identify the Test Environment.''' Identify the physical [[test environment]] and the production environment, as well as the tools and resources available to the test team. The physical environment includes hardware, software, and network configurations. Having a thorough understanding of the entire test environment at the outset enables more efficient [[test design]] and planning and helps you identify testing challenges early in the project. In some situations, this process must be revisited periodically throughout the project's [[Systems development life-cycle|life cycle]].
# '''Identify Performance Acceptance Criteria.''' Identify the response-time, throughput, and resource-use goals and constraints. In general, response time is a user concern, throughput is a business concern, and resource use is a system concern. Additionally, identify project success criteria that may not be captured by those goals and constraints; for example, using performance tests to evaluate which combination of configuration settings will result in the most desirable performance characteristics.
# '''Plan and Design Tests.''' Identify key [[scenario]]s, determine variability among representative users and how to [[simulate]] that variability, define test data, and establish the metrics to be collected. Consolidate this information into one or more models of system usage to be implemented, executed, and analyzed.
# '''Configure the Test Environment.''' Prepare the test environment, tools, and resources necessary to execute each strategy as features and components become available for test. Ensure that the test environment is instrumented for resource monitoring as necessary.
# '''Implement the Test Design.''' Develop the performance tests in accordance with the test design (a minimal illustrative sketch follows this list).
# '''Execute the Test.''' Run and monitor the tests. Validate the tests, test data, and [[results collection]]. Execute validated tests for analysis while monitoring the test and the test environment.
# '''Analyze Results, Tune, and Retest.''' Analyze, consolidate, and share the results data. Make a tuning change, retest, and compare the results of both tests. Each change typically yields a smaller improvement than the previous one; tuning stops once a hard limit such as a CPU bottleneck is reached, at which point the options are to improve the code or add more CPU capacity.
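The sketch below illustrates how steps 5 and 6 might look in practice: a small Python script that simulates concurrent users, records per-request response times, and reports throughput and latency percentiles, the metrics named in step 2. It is not part of the MSDN methodology; the target URL, user count, and request count are placeholder assumptions, and a real test would normally use a dedicated load-testing tool.

<syntaxhighlight lang="python">
"""Minimal load-test sketch (illustrative only; values below are assumptions)."""
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://localhost:8080/health"  # assumed test-environment endpoint
VIRTUAL_USERS = 10                           # number of concurrent simulated users (assumption)
REQUESTS_PER_USER = 20                       # requests each simulated user issues (assumption)


def one_request(url: str) -> float:
    """Issue a single GET request and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start


def user_session(url: str, n: int) -> list[float]:
    """Simulate one user issuing n sequential requests."""
    return [one_request(url) for _ in range(n)]


if __name__ == "__main__":
    wall_start = time.perf_counter()
    # Each thread plays the role of one virtual user running a session.
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        sessions = pool.map(
            user_session,
            [TARGET_URL] * VIRTUAL_USERS,
            [REQUESTS_PER_USER] * VIRTUAL_USERS,
        )
        timings = [t for session in sessions for t in session]
    wall_elapsed = time.perf_counter() - wall_start

    timings.sort()
    total = len(timings)
    # Throughput (business concern) and response-time percentiles (user concern).
    print(f"requests:        {total}")
    print(f"throughput:      {total / wall_elapsed:.1f} req/s")
    print(f"median latency:  {statistics.median(timings) * 1000:.1f} ms")
    print(f"95th percentile: {timings[int(total * 0.95)] * 1000:.1f} ms")
</syntaxhighlight>

Rerunning such a script after each tuning change, against the same instrumented test environment, gives the before-and-after comparison described in step 7.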