Sunday, June 14, 2015

Performance Testing with Cloud

What exactly do we mean when we say "Performance Testing with Cloud"? Before we get deep into this, let us first understand some basics about what cloud is and how we can utilize it for our testing needs.

According to Gartner, the cloud is defined as 

"a style of computing in which scalable and elastic IT-enabled capabilities are delivered as a service to external customers using internet technologies."

Quite simple!

This definition refers to the following characteristics of cloud:
  1. Scalability
  2. Larger amount of resources
  3. Offering end user services over internet
A technical translation of the above looks like this:
  1. Service Oriented - Focus is on "What" we need instead of "How"
  2. Elastic - Pay per use concept
  3. Scalable
  4. Internet connected
These are the design concepts behind cloud. Here are the services we, as end users, get out of this design:
  1. Infrastructure as a Service (IaaS) - Here the service focuses on delivery of physical or virtual machines, firewalls, load balancers and network infrastructure.
  2. Platform as a Service (PaaS) - Here service providers deliver a working platform which includes operating systems, development environments, databases, web servers, application containers, etc. Use this service to develop applications without worrying about licensing, buying and maintaining the platform.
  3. Software as a Service (SaaS) - This model provides executable applications in the cloud which are directly accessible to end users. For example, we can utilize performance test tools (BlazeMeter, StormRunner, etc.) in the cloud, where we need not worry about installation and maintenance; we just need access and can start working. Another example is New Relic application monitoring delivered as SaaS.
I believe the above description of cloud and related services should be sufficient to relate cloud, and the use of cloud, to performance testing activities.

Continuing with performance testing in the cloud: what do we really need in order to perform a performance testing activity?
  1. Performance Test environment which matches to production configurations
  2. Platform (Web Server, App Servers, Load Balancers and Database)
  3. Performance Testing tool to simulate real user activities
  4. Performance Monitoring tool
 Now we have two solutions to fulfill these needs:
  1. In-house setup of environment, platform and related licensing aspects, along with license procurement of all related testing and monitoring tools. This leads to huge setup and maintenance cost.
  2. Use cloud for the desired services, where all of the above performance testing needs can easily be satisfied with the IaaS, PaaS and SaaS delivery models.
 Benefits of Performance Testing with Cloud:
  1. Perform large scale tests - In order to simulate traffic of thousands of users on your application, you need to invest a lot in hardware, and configuring such a big test environment is very time consuming. Today we have to meet the demands of fast-paced development models (Agile), and cloud services can help to a great extent, as all of this is ready in a few clicks.
  2. Perform more realistic tests - In order to performance test your application including the complete delivery chain, we should target testing the application from outside our organization's firewall, because testing inside the firewall may fail to reveal all performance issues. With the help of cloud we can test the application the way real users use it, i.e. from outside the firewall, and validate all components in the delivery chain including firewall, DNS, ISP and network equipment.
  3. Save time and reduce cost (Pay per use) - As mentioned earlier, we may not need the entire performance test environment available at all times, hence we can save a lot with the pay-per-use delivery model. We can create instance images and save them to launch new instances later when needed.

 Challenges of Performance testing with Cloud:
  1. Isolation of root cause - When we use cloud for performance testing activities, it becomes difficult to isolate the exact root cause of a discovered bottleneck, especially when we are not equipped with application performance monitoring tools. Cloud-only testing is fine when there is a single source of bottleneck, but consider a situation where the performance bottleneck is related to multiple problems both inside and outside the firewall. For this reason it is advisable to also have an internal performance test environment where we can segregate the root cause of performance problems inside the firewall (if any).
  2. Reproducing tests - Reproducing a performance defect in the cloud is quite difficult (especially when it is linked to infrastructure) because of variation in internet traffic and bandwidth availability at the data center level.
  3. Choosing the right mix of computing needs - Some cloud providers deliver instances based on computing needs; for example, Amazon provides compute-optimized and memory-optimized instances. We should have a clear understanding of the computing needs at each layer of the architecture: app containers most likely need compute-optimized instances, whereas the database may need to be on memory-optimized ones. A wrong decision can impact testing results.
Best Approach - Take a Hybrid Approach (Internal + Cloud):
It is advisable to employ a two-stage process where the application is first performance tested in an internal testing environment under medium load conditions. This way we will be able to identify all design-level issues. Once internal testing is done, we can move on to cloud-based testing to mimic real user behavior with large-scale tests and validate the entire delivery chain. This approach offers the following advantages:
  1. Enables early testing in the development cycle
  2. Isolates design-level issues before moving into large-scale tests
  3. Enables reproducible tests
  4. Provides a better understanding of each major area in the delivery chain
  5. Lowers performance testing cost


Thursday, June 11, 2015

How to Monitor Memcached

Memcached is a high-performance, distributed memory object caching system. It helps speed up dynamic web applications by alleviating database load.

During performance testing of an application which uses Memcached in its cache architecture, it becomes necessary to monitor the health of Memcached. Memcached itself is fast at retrieving data: its commands have O(1) algorithmic complexity, so each command takes roughly the same amount of time every time.

Memcached exposes its statistics, and we can retrieve them using the stats command. To execute the stats command, just telnet to the Memcached port:

telnet <server> <port>

By default Memcached uses port 11211.

Once you are connected to the telnet session, type stats and press enter; this will display Memcached's performance metrics.

Below is the list of key performance metrics that we should monitor in order to collect required performance data:

  1. bytes : Number of bytes currently used for caching items.
  2. limit_maxbytes : Maximum configured cache size.
  3. curr_connections : Number of open connections to this memcached server.
  4. curr_items : Number of items currently in server's cache.
  5. evictions : Number of objects removed from cache to free up memory for new items.
  6. cmd_get : Number of get commands received since server startup, regardless of whether they were successful.
  7. get_hits : Number of successful get operations since startup. Divide this by cmd_get to calculate cache hit rate.
  8. get_misses : Number of failed get requests because key was not there in cache.
  9. listen_disabled_num : Number of denied connection attempts because memcached reached its max connection limit.
  10. threads : Number of threads used by memcached server process.
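As an alternative to a manual telnet session, the same stats can be pulled and parsed programmatically over a raw socket (the text protocol is what telnet speaks anyway). A minimal Python sketch; the host and port are assumptions for a default local install:

```python
import socket

def fetch_stats(host="127.0.0.1", port=11211, timeout=2.0):
    """Send the 'stats' command over a raw socket (same as a telnet
    session) and return the raw response text. The response ends
    with an END line."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(b"stats\r\n")
        data = b""
        while not data.endswith(b"END\r\n"):
            chunk = s.recv(4096)
            if not chunk:
                break
            data += chunk
    return data.decode()

def parse_stats(raw):
    """Parse 'STAT <name> <value>' lines into a dict of strings."""
    stats = {}
    for line in raw.splitlines():
        parts = line.split()
        if len(parts) == 3 and parts[0] == "STAT":
            stats[parts[1]] = parts[2]
    return stats

def hit_rate(stats):
    """Cache hit rate: get_hits / cmd_get, as described above.
    Returns None if no get commands have been issued yet."""
    gets = int(stats.get("cmd_get", 0))
    return int(stats.get("get_hits", 0)) / gets if gets else None
```

Running `parse_stats(fetch_stats())` during a load test at regular intervals gives a simple time series of the key metrics listed above.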

Wednesday, June 10, 2015

Real Browser Performance Testing with JMeter


In case you want more realistic performance test results, browser-based performance testing is the way to go. As we know, Selenium has the capability to perform browser-based testing using WebDriver. This blog describes a step-by-step procedure to configure Selenium WebDriver with JMeter and how to use it for performance testing activities.

Configuring WebDriver Plugin:   

  1. Download the Selenium WebDriver plugin from jmeter-plugins.org
  2. Copy the extracted jar files into /lib and /lib/ext
  3. Delete older / duplicate jar files from /lib
 
Once you have completed the above steps, the configuration is done.
 
Creating a Browser-Based Test Script in JMeter - WebDriver Plugin:
 
  • Open JMeter and add a Thread Group
  • Add 'Firefox Driver Config' from Config Elements
  • Add a WebDriver Sampler
  • Add WebDriver code
  • Add a Listener for debugging
  • Run the test
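For the "Add WebDriver code" step, the WebDriver Sampler script is written in JavaScript against the WDS object the plugin exposes inside JMeter (it is not a standalone script). A minimal sketch, with a placeholder URL:

```javascript
// Runs inside the JMeter WebDriver Sampler, not standalone.
// WDS.sampleResult marks the timed section; WDS.browser is the Selenium driver.
WDS.sampleResult.sampleStart()           // start the response-time clock
WDS.browser.get('http://example.com')    // load the page in the real browser
WDS.log.info('Title: ' + WDS.browser.getTitle())
WDS.sampleResult.sampleEnd()             // stop the clock; elapsed time is reported
```

Everything between sampleStart() and sampleEnd() is measured as one sample, so the reported time includes full page rendering in the real browser, not just the HTTP round trip.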
 

Tuesday, June 9, 2015

Understanding Google Analytics Metrics



Before we start reading about Google Analytics metrics, it is worth having a little background on the basic concepts of dimensions and metrics.

In Google Analytics we have two types of data :
  1. Dimensions : These describe characteristics of users, their sessions and actions. The dimension City describes a characteristic of sessions and indicates the city (for example "Paris") from which a session originated. The dimension Page describes a characteristic of pageview actions and indicates the URL of each page that was viewed.
  2. Metrics : These are simply quantitative measurements of users, sessions and actions. Metrics are numerical data; basically, they are numbers.
When extracting data with City as the primary dimension and Browser as the secondary dimension, we will have the following view:


Google Analytics Metrics:

  1. Visitors or Users : This metric measures the number of unique users that visit your site during a certain period of time. It is the most commonly used metric for measuring the overall size of the audience, and can further be categorized into new visitors and returning visitors. This number is the most accurate in telling you how many individual people visited your website. When performing workload modelling for your performance test, always ask for this metric as part of your requirement gathering.
  2. Visits or Sessions : Visits, also known as sessions, are defined as a period of consecutive activity by the same user. By default in Google Analytics, a session persists until a user stops interacting with the site for 30 minutes.
  3. Pageviews : Within each visit or session, your users will engage in one or more interactions with your web pages. Google Analytics automatically tracks these interactions as "pageviews". The pageviews metric counts every time a page is viewed on your site.
  4. Pages per Session : This is the average number of pages viewed during a session on your website. More pages per session indicates that users are quite engaged with your website.
  5. Average Session Duration : This is the average length of a user's session. A higher number again indicates users are more engaged with your website.
  6. Bounce Rate : This is the percentage of visits that are single-page only, i.e. users who visit a single page and leave.
  7. % New Sessions : This is the average percentage of first-time visitors to your website.
We will get all the above-mentioned metrics under Audience Overview:
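These metrics feed directly into workload modelling for a performance test. A small sketch of the arithmetic (the figures below are invented for illustration, not from a real GA report): sessions per hour and average session duration give a concurrent-user target via Little's Law.

```python
def concurrent_users(sessions_per_hour, avg_session_duration_sec):
    """Little's Law: concurrency = arrival rate * time in system."""
    arrival_rate = sessions_per_hour / 3600.0   # sessions per second
    return arrival_rate * avg_session_duration_sec

def pageviews_per_hour(sessions_per_hour, pages_per_session):
    """Throughput target derived from the Pages per Session metric."""
    return sessions_per_hour * pages_per_session

# Assumed example figures: 9,000 sessions/hour, 180 s average session
# duration, 4 pages per session.
print(concurrent_users(9000, 180))    # -> 450.0 concurrent users
print(pageviews_per_hour(9000, 4))    # -> 36000 pageviews/hour
```

These two derived numbers (concurrent users and pageviews per hour) are what a load test scenario actually needs: thread count and pacing.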



Monday, June 8, 2015

Performance Testing of Microservice-Based Architecture

 

Background:

Microservices are a style of software architecture in which a system is delivered as a small set of granular, independent, collaborating services. It is a technique of applying the single responsibility principle at the architectural level.

Microservices are often integrated using REST over HTTP.

Layered Architecture of Microservices:


Resources handle incoming requests. They validate request format, delegate to services and package the response. For RESTful services this includes deserialization of requests, authentication, serialization of responses and mapping exceptions to HTTP status codes.

Services represent the core business logic. They may collaborate with other services, adapters or repositories to retrieve the data required to fulfill a request. Services only consume and produce domain objects; they don't interact with DTOs from the persistence layer or with transport-layer objects.

Adapters handle outgoing requests to external services. They are responsible for marshalling requests, unmarshalling responses, and mapping them to domain objects. Object mappers are widely used at this layer.

Repositories handle transactions with the persistence layer.

A lightweight microservice may combine one or more of the above layers in a single component.

Performance Testing Challenges:

  1. The whole application is not available from the start; instead we have a set of fully functional modules which later plug into the end product.
  2. Different technologies are used in microservice development.
  3. Interaction among microservices is not readily visible unless there is a single point of contact who has a complete view of the entire solution.
  4. Performance monitoring is a big challenge considering the different technologies involved in microservices development (message brokers, NoSQL, databases, N independently running JVMs, etc.).
  5. We need different benchmarks for capacity planning, since benchmarking activities are specific to technologies (e.g. heap sizing needs different benchmarks for JVM-based services, and allied services like Node.js apps and the .NET CLR will also need their own benchmarks).

Performance Testing of Microservices - Approach

While designing a performance testing approach for a microservice-based architecture, we should consider solutions to the above-mentioned challenges.
  1. Narrow your selection from the whole set of services to focus on those which represent critical business activities.
  2. Try to build a service interaction diagram for all of your performance scenarios.
  3. Always start with performance testing of services in isolation rather than replaying end-to-end business scenarios in an integrated environment. Once you have separate performance benchmarks for each service in isolation, move on to integrated tests / business scenarios.
  4. Performance monitoring is critical during testing activities, and use of commercial agent-based monitoring solutions can increase the cost of testing to a great extent. SaaS-based monitoring is one option, but use of open source tools and an in-house developed API profiling solution will reduce testing cost considerably.
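Testing a single service in isolation can start very simply: hammer one endpoint and summarize the latency distribution. A minimal Python sketch (the endpoint URL is a placeholder for whichever microservice is under test; any load tool would do the same job at scale):

```python
import time
import statistics
from urllib.request import urlopen

def percentile(latencies, pct):
    """Nearest-rank percentile over a list of latencies (seconds)."""
    ranked = sorted(latencies)
    k = max(0, int(round(pct / 100.0 * len(ranked))) - 1)
    return ranked[k]

def hammer(url, requests=50):
    """Call one service endpoint repeatedly, recording wall-clock latency
    for each call. Sequential on purpose: an isolation benchmark first
    establishes single-user response time before adding concurrency."""
    latencies = []
    for _ in range(requests):
        start = time.perf_counter()
        urlopen(url).read()
        latencies.append(time.perf_counter() - start)
    return latencies

# Example usage (placeholder endpoint, not from the original post):
# lat = hammer("http://localhost:8080/orders/health")
# print("mean:", statistics.mean(lat), "p95:", percentile(lat, 95))
```

Running this per service yields the isolated benchmarks that step 3 above calls for, before any integrated scenario is attempted.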


                                                         Testing Phase Pyramid