I don't believe performance test results

Performance testing is one of the most complex and difficult kinds of testing. It's really hard to design fair tests that fit everyone, because many factors can affect the measured results. That's why I don't trust any particular test result I find on the Internet.

If you are a software or hardware vendor, there is a big temptation to test only your strengths. Even if you want fair results, you can miss something during testing or not collect enough data to see a performance impact. For example, if you test your application with only one core/process but it runs with two cores/processes in production, your tests will not show how the application actually performs in production.
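The core/process mismatch is easy to demonstrate. Below is a minimal sketch (not from the original post) that benchmarks a hypothetical I/O-bound request handler with one worker and then with four: the single-worker numbers say almost nothing about throughput under real concurrency.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request():
    # Hypothetical stand-in for an I/O-bound request handler
    # (e.g. waiting on a database or an upstream service).
    time.sleep(0.05)

def throughput(workers, requests=20):
    """Measure requests/second with a given worker count."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Run all requests through the pool and wait for completion.
        list(pool.map(lambda _: fake_request(), range(requests)))
    elapsed = time.perf_counter() - start
    return requests / elapsed

if __name__ == "__main__":
    print(f"1 worker : {throughput(1):.1f} req/s")
    print(f"4 workers: {throughput(4):.1f} req/s")
```

With one worker the requests are serialized, so the measured throughput is several times lower than what the same code delivers with four workers — the benchmark describes the test setup, not the application.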

Web server performance cannot be measured "in general". For example, what affects Apache2 vs Nginx benchmark results? Many things: whether the content is static or dynamic, which application server you use (mod_wsgi or uwsgi), and what your application actually does (e.g. I don't care about WordPress benchmark results if I develop custom apps with Django).

That's why I believe everybody should run their own performance tests for their apps in a production-like environment. Test results found on the web should not be treated as 100% of the truth.
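When you do run your own tests, a single number is still not enough: timings vary between runs. A minimal sketch (my own illustration, with a hypothetical `work` function) of repeating a measurement and reporting the spread instead of one result:

```python
import statistics
import time

def timed(fn, runs=5):
    """Run fn several times and report the spread, not a single number."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return min(samples), statistics.median(samples), max(samples)

def work():
    # Hypothetical workload standing in for the code under test.
    sum(i * i for i in range(100_000))

if __name__ == "__main__":
    lo, med, hi = timed(work)
    print(f"min={lo:.4f}s median={med:.4f}s max={hi:.4f}s")
```

If the minimum and maximum differ substantially, a benchmark that published only one of those numbers would be telling a very different story than the other.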
