WHO WE ARE
Founded in 1994 by top thought leaders in the software testing industry, LogiGear has completed software testing and development projects for prominent companies across a broad range of industries and technologies.
LogiGear provides leading-edge software testing technologies and expertise, along with software development services that enable our customers to accelerate business growth while having confidence in the software they deliver.
LogiGear is headquartered in the heart of Silicon Valley, with the majority of the software testing and software development staff located in Ho Chi Minh City and Da Nang, Vietnam. We are among the largest employers of software testing and development professionals in Vietnam, and our close partnerships with universities throughout the country allow us to attract and recruit top software engineering talent.
LogiGear continues to grow as companies realize the benefits of outsourcing their software testing and development. We have been listed among the fastest growing privately held companies by Inc. 500|5000 in 2009, 2012, 2013 and 2014.
The senior executive team has co-authored several top-selling books on software testing and test automation, including:
- Testing Computer Software, by Cem Kaner, Jack Falk and Hung Q. Nguyen
- Testing Applications on the Web, by Hung Q. Nguyen, Michael Hackett and Robert Johnston
- Integrated Test Design and Automation, by Hans Buwalda, Dennis Janssen, Iris Pinkster, and Paul Watters
- Global Software Test Automation, by Hung Q. Nguyen, Michael Hackett, and Brent K. Whitlock (foreword by Apple Computer co-founder Steve Wozniak)
LogiGear CTO to Deliver Keynote at ChinaTest Conference in Beijing
Hans Buwalda will also lead a software testing best practices workshop at esteemed four-day event
Foster City, CA, July 18th, 2016 — LogiGear, a world leader in software testing solutions, today announced that its CTO and prominent software testing expert, Hans Buwalda, will deliver a keynote address at ChinaTest Conference, set to take place at the China National Convention Center in Beijing, China, July 17-20, 2016.
How to handle intermittent automated test failures
For automation to be successful in enterprise test management, the scripts need to run reliably every time. Yet, as any test case management pro knows, nothing is ever perfect. Though it's a rare occurrence, sometimes an automated test case management system fails to function properly. This can dramatically slow down operations and prevent teams from releasing code on time.
However, by addressing these three points when faced with an automated test failure, teams can move past the issue and right the ship again.
1) Determine why the failure occurred
If and when problems arise with automation, it's critical to determine what precisely went wrong. Did the test case tool malfunction, were the tests themselves at fault, or was the source of the issue something else entirely? Only by getting to the root of the problem can an effective solution ever be put in place later on.
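One practical way to start this root-cause analysis is to check whether the failure is deterministic or intermittent. The sketch below, a hypothetical helper rather than a feature of any particular test tool, reruns a single failing test several times and classifies the result; the command string and rerun count are illustrative assumptions.

```python
import subprocess

def classify_failure(test_cmd, reruns=5):
    """Rerun a failing test to distinguish an intermittent (flaky)
    failure from a deterministic one.

    test_cmd: a shell command that runs the single failing test
              (e.g. one pytest node ID) -- purely illustrative.
    """
    failures = 0
    for _ in range(reruns):
        result = subprocess.run(test_cmd, shell=True,
                                capture_output=True, text=True)
        if result.returncode != 0:
            failures += 1
    if failures == reruns:
        return "deterministic"   # fails every time: likely a real defect or a broken script
    if failures == 0:
        return "not reproduced"  # may depend on environment, timing, or test order
    return "intermittent"        # mixed results point to timing, shared state, or environment
```

A "deterministic" result points the investigation at the application or the script itself; "intermittent" or "not reproduced" results suggest looking at the environment, test data, or test ordering instead.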
2) Make sure automation is correctly applied
One possible explanation for an automated test failure is that a test was automated even though it should not have been. While automation can bring a lot of benefits, it is not a panacea. In fact, there are many use cases in which automation is the entirely wrong approach to take. For example, while it's often a great idea to automate load testing, user experience testing should be executed manually.
Before assigning blame to a script or test case management tool, first make sure that automation was correctly applied. Sometimes, righting the ship is as simple as no longer applying automation tools to an area where they never belonged.
"It does not make sense to use automated testing tools if, during analysis, it is found that the time needed to create, maintain and run the scripts exceeds the time allotted to conduct quality testing of the application," industry expert John Scarpino once told TechTarget. "Reviewing the rewards of cost, time and quality is again very important to look at for the creation of manual tests."
3) Go back to the drawing board (if need be)
Sometimes, resolving an automated testing failure is a quick fix. But this is not always the case. On rarer occasions, teams may have to rethink the entire development process to address the issue.
When a major automation failure is found, it can be helpful to go all the way back to square one. For example, let's say there is a dramatic shift in end-user expectations for the software. In such an instance, the scenarios the automated tests were built around may no longer apply. But by resetting expectations with users and even establishing new quality assurance metrics, teams can make sure everyone is back on the same page. This will require a lot of work, possibly including the creation of new automated testing scripts, but it may be necessary in certain instances.
While automation can be great for so many test case management tasks, it is still prone to the occasional failure. When faced with this scenario, software engineers need to get to the root of the problem in order to effectively solve it. Sometimes, addressing such a failure will require a total reshaping of the work. But, by taking the time to do this and by adopting a robust enterprise test management solution like Zephyr for JIRA, teams can get things rolling again after an automation failure.