We performed a comparison between OpenText LoadRunner Cloud and OpenText LoadRunner Enterprise based on real PeerSpot user reviews.
Find out in this report how the two Performance Testing Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.
"We're keeping up with DevOps, so the best feature of StormRunner is that we don't have to build and maintain infrastructure anymore."
"The TruClient feature is the most valuable for us. The application we test can only be scripted using TruClient: it's partly web-based, but it also has its own proprietary protocol combined with HTTP and HTML, which many other tools do not recognize. Using TruClient, we can still create scripts that cover everything we need to cover."
"The TCO has been optimized along with the total ROI."
"The most valuable feature is having load generators in countries where we don’t have access to them."
"It is feature-rich. It supports most protocols, which is important because I am in charge of a team at the bank, and we do performance testing for all kinds of different applications. We have tons of them. We even do video streams."
"It's fast, easy to use, has a user-friendly UI, and you can split users."
"The most valuable feature is the ability to configure browser settings for different operating systems and on different versions without the need to install every single version on each machine and to manage them."
"The most valuable feature is that we do not have to accommodate the load-testing infrastructure in our own data center."
"The initial setup was straightforward. I was able to download everything myself without any IT support."
"It's a very powerful tool."
"I like how you can make modifications to the script on LoadRunner Enterprise. You don't have to go into the IDE itself."
"We can measure metrics like hits per second and detect deviations or issues through graphs. We can filter out response times based on timings and identify spikes in the database or AWS reports."
"Micro Focus LoadRunner Enterprise is very user-friendly."
"The product is good, and the concept is good as well."
"LoadRunner Enterprise's best feature is the detailed reporting structure."
"The solution is a very user-friendly tool, especially when you compare it to a competitor like BlazeMeter."
"The product price could be more affordable."
"The product must provide agents to monitor servers."
"The support team provides delayed responses."
"I don't know of any features that should be added. The solution isn't lacking anything at this point."
"Reporting and analysis need improvement. Compared to the old-school LoadRunner Windows application, the reporting and analysis are mediocre in LoadRunner Cloud."
"Improvements to the reporting would be good."
"Scriptless automation is an area that can be improved."
"We did have some challenges with the initial implementation."
"An area for improvement in Micro Focus LoadRunner Enterprise is that it should capture multiple executions of a particular scenario and provide automatic trending for them. It would be a very useful feature to let users see how many executions of a scenario have run and how each performed, with that data visible in the Performance Center dashboard. For example, if I run one scenario five times in a month, there's currently no way for me to see the trend and find out how those five executions went. It would be great if Performance Center offered a view of all five executions, transaction by transaction. If Micro Focus LoadRunner Enterprise showed the timing trends from one execution to the next and how each performed, that would be an immense feature, and it should be visible to every user. Reporting should also be simpler: if I run a scenario now and then run it again, I should be able to schedule it, and when a scenario has been executed multiple times, there should be an option to consolidate the runs into a single view showing all the transactions, how they performed, and the trend graph for a particular period."
"I have seen some users report some issues, but I have personally not had any issues."
"The product's scalability must be improved."
"The process of upgrading LoadRunner can be difficult and time-consuming."
"We are expecting more flexibility in using Jenkins for continuous integration going forward."
"OpenText LoadRunner Enterprise doesn't support some mainframe protocols. We had to build scripts to access the interface."
"The cost of the solution is high and can be improved."
"New features have been added in the latest version, and the DevOps integration needs to be improved."
OpenText LoadRunner Cloud is ranked 6th in Performance Testing Tools with 39 reviews while OpenText LoadRunner Enterprise is ranked 5th in Performance Testing Tools with 81 reviews. OpenText LoadRunner Cloud is rated 8.2, while OpenText LoadRunner Enterprise is rated 8.4. The top reviewer of OpenText LoadRunner Cloud writes "Enterprise modeling, server maintenance, and competitive pricing". On the other hand, the top reviewer of OpenText LoadRunner Enterprise writes "Saves time and effort, and makes it easy to set up scenarios and execute tests". OpenText LoadRunner Cloud is most compared with Tricentis NeoLoad, OpenText LoadRunner Professional, BlazeMeter, Apache JMeter and Oracle Application Testing Suite, whereas OpenText LoadRunner Enterprise is most compared with OpenText LoadRunner Professional, OpenText Silk Performer, Tricentis NeoLoad, Apache JMeter and OpenText ALM / Quality Center. See our OpenText LoadRunner Cloud vs. OpenText LoadRunner Enterprise report.
We monitor all Performance Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.