The Way to Quality Software
OnPath Testing helps organizations ensure their software works as expected and as designed, by following rigorous software quality assurance (QA) practices. We’ve tested software on web, mobile, and desktop platforms, on development teams following various methodologies, using a plethora of tools in the QA arena. From our extensive experience we have developed a testing process that delivers consistent, successful results.
Whether your technical team is newly created or fully established, we excel at implementing a QA solution customized to your needs. Our team of trained engineers supports your QA efforts through the complete product cycle of requirements analysis, test planning, test execution, defect management, and ongoing support. We perform functional, automated, and performance testing to provide thorough coverage of every software release.
OnPath recognizes the challenges of successfully integrating offshore resources within a development team, and our unique, proven test management process offers the best of both worlds - local leadership and clear communication paired with affordable offshore engineering. With our software quality expertise and experienced professionals, we provide immediate quality improvement.
Specialties:
• Test project management and team leadership
• Functional test planning and execution
• Automation test framework design and development
• Performance test scripting, execution, and results analysis
• Defect management, metrics tracking, and reporting
• SQA tools selection and administration
1 Language
- English
4 Timezones
- ECT
- IST
- PST
- EST
Custom Software Development for Mattress Company
the project
“They looked at our entire digital ecosystem and what we were doing to recommend the most effective processes.”
the reviewer
the review
A Clutch analyst personally interviewed this client over the phone. Below is an edited transcript.
Introduce your business and what you do there.
I’m the senior product manager for Leesa Sleep, a mattress company.
What challenge were you trying to address with OnPath Testing?
The main challenge we were looking to tackle was building our QA and testing department. When we started, we didn’t have a formalized QA and testing process or automation, which was challenging because we’d sometimes have bugs or have to do a rollback in our releases. In summary, we didn’t have a lot of confidence in the code we were building with the technology team.
What was the scope of their involvement?
OnPath Testing developed the testing automation platform and the different critical scenarios we needed for verification. They also executed those tests for us and gave us the results. They started by identifying different scenarios to test. Then, we launched different phases of testing automation, introduced the developers to it, and started sending out reports with our release notes. We had several stages throughout the project, including building the platform, creating dozens of critical and non-critical testing scenarios, and setting up reporting.
Initially, we gave them documentation on our systems, platforms, software development process, and business information so they’d know which scenarios were the most critical. They first used JavaScript and then shifted to C#. In the end, we utilized a platform called Cypress for testing.
The platform’s main function was to test our shopping journey on our website to ensure customers could add products to the cart, check out, and engage with any critical tasks during the main shopping journey. OnPath Testing also ensured that the images and pages loaded quickly and appropriately. We also did some scenarios around integrations to ensure none of them could break and the code was stable whenever we did a release.
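To give a concrete sense of what a critical shopping-journey scenario might look like in Cypress, here is a minimal, hypothetical sketch; the paths, selectors, and assertions are invented for illustration and are not Leesa's actual tests.

```javascript
// cypress/e2e/shopping-journey.cy.js
// Hypothetical spec: URLs, selectors, and expected text are illustrative only.
describe('Main shopping journey', () => {
  it('lets a customer add a product to the cart and reach checkout', () => {
    cy.visit('/mattresses/original');                // assumed product page path
    cy.get('[data-test="add-to-cart"]').click();     // assumed selector
    cy.get('[data-test="cart-count"]').should('contain', '1');
    cy.get('[data-test="checkout-button"]').click();
    cy.url().should('include', '/checkout');
  });

  it('renders product images on the page', () => {
    cy.visit('/mattresses/original');
    cy.get('img').each(($img) => {
      // naturalWidth > 0 means the browser actually loaded and decoded the image
      expect($img[0].naturalWidth).to.be.greaterThan(0);
    });
  });
});
```

A suite of such specs, split into critical and non-critical scenarios, can then be run headlessly against each release candidate and reported alongside the release notes.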
Additionally, OnPath Testing offered to maintain the platform. Before we finished our engagement, they trained our developers on how to maintain the platform and use it effectively. They showed us how to do everything ourselves.
What is the team composition?
We worked with three teammates from OnPath over the course of the project, including a QA specialist, though we only had up to two resources at any given time.
How did you come to work with OnPath Testing?
My boss came across their company. Lon (Director of Operations) and Brian (President) were the initial contacts.
How much have you invested with them?
We spent roughly $100,000. We didn’t invest much in the platform; most of the cost was for the resources.
What is the status of this engagement?
We worked with them from January–November 2022. We paused our service with them due to budgetary issues.
What evidence can you share that demonstrates the impact of the engagement?
When we started, we had bugs and rollbacks, and we didn’t have any at all by the end of the project. OnPath Testing created around 35 scenarios for our testing automation. In the last few releases, we had 100% success on all the critical and non-critical testing scenarios, so we got to a really good place thanks to them.
How did OnPath Testing perform from a project management standpoint?
They performed really well. We often had sync-ups to check on the process; they’d demonstrate the progress they’d made. Their developer was very involved with our team, and we’d also discuss how we could iterate on the project every month or so.
We used monday.com as our software development platform. We mainly communicated through Microsoft Teams, but we sometimes had Zoom calls for the monthly sync-ups.
What did you find most impressive about them?
OnPath Testing was great at customizing their services based on our individual needs. We're a small company, and they were very in tune with our processes. They looked at our entire digital ecosystem and what we were doing to recommend the most effective processes and strategies to take us from crawling to riding a bike. We're still not a race car, but we're doing so much better than before.
Are there any areas they could improve?
Some of the work had to be done a second time because they didn’t act upon all the information that our developers gave them. However, it was a minor bump in the road; we didn’t have many challenges.
Do you have any advice for potential customers?
I suggest involving them in your company’s processes as much as possible. We struggled internally with it for a bit because we usually keep our contracted resources at an arm’s length. However, if you embrace their resources and treat them as internal employees, they’ll understand things better.
Focus
Portfolio
Leesa Sleep, ClearCaptions, CodeLogic, RAIR Technologies, Orgill Inc, ChronoTrack, LearningObjects, MapQuest, Wyoming Dept of Health, Wrapmate, BioTrust

ClearCaptions Case Study
Read how OnPath helped their client ClearCaptions successfully release the product and grow usage by over 155% within a few months.
https://www.onpathtesting.com/contact

CI/CD Infographic
One of the great aspects of this project was the highly technical nature of the automation work we built and maintained, primarily within the CI/CD pipeline. This diagram is an example of our process, generalized for a wider audience and client base.
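As a rough illustration of how automated checks like these are typically wired into a CI/CD pipeline, a Cypress configuration along the following lines is common. This is a generic sketch under assumed conventions (including the environment variable name), not the actual pipeline shown in the infographic.

```javascript
// cypress.config.js
// Generic CI-oriented configuration sketch; environment variable names are assumptions.
const { defineConfig } = require('cypress');

module.exports = defineConfig({
  e2e: {
    // Point the suite at whichever environment the pipeline just deployed.
    baseUrl: process.env.CYPRESS_BASE_URL || 'http://localhost:3000',
    setupNodeEvents(on, config) {
      // Reporters, tasks, and other node-side hooks would be registered here.
      return config;
    },
  },
  retries: { runMode: 2, openMode: 0 }, // retry flaky specs only in headless CI runs
  video: true,                          // keep run videos as build artifacts for triage
});
```

A CI job would then execute `npx cypress run` after each deployment and publish the results as part of the build output.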

OnPath Intro
Learn more about OnPath Testing and why we are your best choice for outsourced software quality engineering.
Reviews
the project
Integrated QA Team for Nutrition Startup
“OnPath Testing has been very accommodating to us when things have come up unexpectedly.”
the reviewer
the review
A Clutch analyst personally interviewed this client over the phone. Below is an edited transcript.
Please describe your organization.
We're an online e-commerce retailer focused specifically on manufacturing nutritional supplements. We sell everything directly to the consumer through online marketing and direct response marketing.
What is your position and responsibilities?
I am the chief technology officer, so I oversee all of the software and IT [information technology] aspects of the business.
What business challenge were you trying to address with OnPath Testing?
In general, as a small startup company, we use contractors in various forms for surge capacity, such as when I need additional hands for a project or even just to fill a staffing need for an unspecified duration. I need some flexibility there so I can control that spend. We partnered with OnPath to be able to add those contingent resources to the QA [quality assurance] process.
Please describe the scope of their involvement.
The work they did involved manual QA [quality assurance] testing for desktop, tablet, and mobile, covering user experience and functional QA – general web-based testing. The team we were allocated ranged from one full-time person to three at the height. We have two major areas of development work internally. One is focused on marketing initiatives – page presentation, conversion analysis, and split-test work – and the other on more core back-end development work.
Mostly, I have focused OnPath on supporting the marketing end of things because the turnaround time needs to be much tighter with those releases. However, at the height, we had one person also associated with what I consider core development project effort. We didn't use a project manager from OnPath Testing. We managed the resources and assigned their tasks directly. Not myself, but actually the QA manager on my team.
How did you come to work with OnPath Testing?
Brian [Borg], the founder of the company, was actually on staff as a direct, individual contributor and contract resource when I came on board. Through working with him, we hit a point where we needed additional resources and, at that point, he said, "I need to step away. The business is growing well, and I need to step away from the hands-on testing and really do more project and resource management pieces." At that point, we put somebody in place of his work, and when we were looking to broaden the work, it was a natural progression to talk to him about using OnPath across the board for my contingent QA needs.
Could you provide a sense of the size of this initiative in financial terms?
In the last 24 months, most of the time or the whole time, I've had at least one engineer, 40 hours a week. For probably half of that time, I had two engineers. You could call it two people for 24 months, and their billing rate is around $35 an hour.
What is the status of this engagement?
We've been using them for two years in various forms, with different numbers of people.
Could you share any statistics or metrics from this engagement?
I've been doing software engineering for close to 20 years, and in only two of the positions did I not use either contract QA or contract development work. Compared to other collaborations I've had, I actually rank OnPath highly. I would recommend them without hesitation. The nice thing in my experience with Brian is that he's very good at being flexible with how we wanted to manage the team. I've worked with companies where their only model was to have a project manager who we had to work through to assign tasks, and it's just been an inefficient way to manage the small numbers of resources we had.
Brian was very flexible about the model that we were using. He was good about taking feedback. At one point, we had a test engineer who initially was performing well, and then it seemed like there was a focus issue or something was going on. He was good about addressing that, and then eventually helping us find a different person to fill that role and getting a better resource in place. In general, openness to making a difference and moving on the fly has been great. I'm a huge fan.
We develop new products on about a six-month basis, so about two or three times a year, we're launching what we call a new video mashup. It's a new marketing site focused around whatever the new product line is. Historically, prior to putting a focus on it, ramping up QA and putting OnPath on, and creating the structured QA methodology that we have right now, we would put those launches out, and invariably, immediately we found issues with either page content, incorrect presentation for customers, or problems in completing an order. Frankly, I don't have those issues anymore, in part because of process changes, but also just around being able to add the staffing needed to make sure we have good coverage.
The second piece, from my standpoint, is just responsiveness and turnaround time for projects associated with the marketing team in particular. The ability to add contingent staff, and have that resource layer that I can grow as needed when projects come in, has helped me maintain that responsiveness and turnaround time. When I came into the company, it was a significant talking point around how the software engineering team could support the marketing team better, and why we couldn't get stuff done. That conversation has pretty much ended. In most cases, now we really are beholden to the ability of the marketing team to create the copy that actually needs to go up online, versus being able to get the technical work done and test it.
At an execution level, they've been very focused on the test work. Honestly, that's partly me focusing them in that area and really driving the strategy for QA process evolution internally. On the flip side, Brian and I have had several very good, overarching conversations about where the relationship is going, what our focus is longer term, and whether broadening test automation is something that OnPath could support and help with. There's definitely strategic insight there, even though I haven't particularly leveraged it in the engagement with them.
How did OnPath Testing perform from a project management standpoint?
The project management tool we use is JIRA, the cloud-based version. We have a very organized bug-tracking workflow. Brian actually helped establish JIRA as the bug-tracking tool before I came to the company. That was definitely, from my standpoint, one of the nice things that OnPath helped to drive prior to my coming in.
What distinguishes OnPath Testing from other providers?
Like I said, it has to be their management approach and flexibility during project work. OnPath Testing has been very accommodating to us when things have come up unexpectedly. They're never afraid to give us crucial feedback, and they have made some key recommendations around structuring and improving our QA environment. I've been very grateful for that support.
Is there anything OnPath Testing could have improved or done differently?
The only thing is setting expectations with the contingent staff that they bring in and making sure those are aligned with my own expectations. When we do those product launches that I was talking about, we tend to move into what we call our launch schedule, where we essentially have development and QA staff available – if not online and working, then at least on call – about 20 hours a day, seven days a week, to be able to make changes.
We run a lot of A/B split testing at that point in time. At 11 o'clock at night, I need somebody available to rotate into the next test, to make sure that from a testing standpoint everything's functional. We had a couple of bumps with people who didn't understand that expectation upfront, and it was a little bit of a reset. That hasn't happened in the recent past. Brian and I have discussed this, and I think he has kind of taken it to heart.
the project
Integrated QA Team for e-Learning Company
"I liked how independent OnPath Testing was. It made them easy to work with, and my time could be put to better use."
the reviewer
the review
A Clutch analyst personally interviewed this client over the phone. Below is an edited transcript.
Please describe your organization.
I am currently a freelance QA [quality assurance] manager. At the time I was working with OnPath, I was working for an e-learning company.
What is your position?
I was the QA manager.
What business challenge were you trying to address with OnPath Testing?
We did a series of different projects with them. The biggest one was for Evolve, which is a publishing platform and marketplace for medical books and research – a place where students can buy books for classes and teachers can administer quizzes to their students. It's a large platform with a lot of stuff going on. It would take a while to describe it all. That project probably spanned a period of more than two years or so.
We hired OnPath to provide us with engineers able to test the software's functional reliability and usability. At one time, we probably had as many as eight or nine testers from them, working every day, testing a product and whatnot. It was mostly usability, functional, and compatibility testing. We did hire them to start setting up automated testing as well, and that was a whole separate side thing, but it was for the same product. They provided someone to help get things more automated. The QA phase was integrated directly into the development process before we released the software.
Please describe the scope of their involvement in more detail.
The purpose of the software itself was transactional. It dealt with large volumes of data, and there were interactions between multiple third-party systems. It had e-commerce functionality to buy books and examination functionality to quiz students. Teachers could connect and contact students, and it had grading – both the grades for various assignments and weighted grades for the whole class.
As far as the e-commerce side goes, that was connecting the teacher, who would send the book to the student, and the student would buy the book. The teacher would talk to the third-party people to set the book up in the marketplace as well. It was a large, multimillion-dollar product.
They were located in India, and we had a manager of the team that was in India, and I was managing the manager, so to speak. They were working at night, and we were working during the day, so I would talk to the manager and tell him what to do for the day, and he would manage the team there. I would say the manager was more of a senior level, and then all the testers were mid-level. There weren't any pure rookie testers.
How did you come to work with OnPath Testing?
Our vice president of engineering had worked with them in the past through a different company. He already had experience with them and recommended them for this particular project. I believe we were already working with them when we started that project, so they were the only ones we thought of because we were already happy with them.
Could you provide a sense of the size of this initiative in financial terms?
It was during a period of years, and we varied from two or three testers working every day up to 10 working every day. I would say it was somewhere around $60,000 to $100,000 a year for two years.
What is the status of this engagement?
When I left the company, which was about a year and a half ago, we had released the product, but we were still developing it and adding on to it, and still using OnPath to test the add-ons, fixes, and everything. As far as I know, they're still working on it. I'm not with that company anymore, so I don't know exactly, but I know the product is still live.
Could you share any statistics or metrics from this engagement?
We were happy with the way they were able to run tests efficiently, and they were able to manage their own team very well. We had them also writing test cases sometimes, and running test cases. Their test cases were written well. Their communication was good. Like I said, it was overseas and it was overnight, so we had to rely on a lot of emails and stuff. We couldn't talk directly very often. They handled that process well.
If we needed more people, they were quick to jump on and get us some more testers. We could ramp up and ramp down quickly, which is why we really liked using them a lot. Our testing needs would change quite often, and they were able to handle all the changes without any headaches or rigmarole. If we needed less, they'd take a couple people off. They were very flexible.
How did OnPath Testing perform from a project management standpoint?
They were able to deliver all their milestones on time and within budget. We were working with the developers, obviously, and the milestones would change quite a bit. With their ability to ramp up and down, we were able to meet milestones on a consistent basis with no problem. We did use JIRA sometimes for project management but, mostly, we interacted by Skype and email.
What distinguishes OnPath Testing from other providers?
I liked how independent OnPath Testing was. I've used overseas staff a lot, and usually I have to do a lot of hand-holding, a lot of phone calls. With them, I could just send an email. It had to be pretty detailed, but the email was enough. I would go for a couple or three weeks without a verbal conversation, and I was working with eight of them every day. I was impressed by their independence. It made them easy to work with, and my time could be put to better use than a lot of hand holding.
Is there anything OnPath Testing could have improved or done differently?
Some of the junior testers didn't have perfect English, but eight out of 10 of them had totally usable English. I was only really talking with the manager, and his English was perfect. The only time I would see language issues would be in the description of a bug by a junior tester, but those were few and far between. Every issue I can think of was just due to them being located overseas. We had a couple meetings with the CEO, he came to D.C. a couple times, but there wasn't a lot of personal interaction because they were overseas.
the project
Android Device Testing for Computer Software Startup
"OnPath Testing manages themselves extremely well. The testers were always very diligent and always got the work done."
the reviewer
the review
A Clutch analyst personally interviewed this client over the phone. Below is an edited transcript.
Please describe your organization.
WeCam Technologies has built a mobile video chat platform based on WebRTC technology for encrypted end-to-end, peer-to-peer video chat solutions for mobile phones.
What is your position?
I am the founder and president.
What business challenge were you trying to address with OnPath Testing?
We're a startup, and we had our initial seed funding from a number of friends and family. Using offshore developers, we cobbled together our first kick at the can for our proof of concept for WeCam. WeCam chat is actually available on the iTunes Store and Google Play. It's got about 200,000 users, and it's sort of a social media video chat app where you can log in with Facebook, you can video chat with your Twitter friends, and there is a lot of complexity, as you can imagine.
Two-way audio/video for mobile devices was badly disrupted by the discontinuation of Flash technologies on mobile devices, and the two-way portion of audio and video calling was a big issue. You had your big guys, like FaceTime, Google Hangouts, and Skype, that had their own proprietary video codecs they were using. But there was really a huge gap in the marketplace, and WeCam's proposal to our investors and to the audience at large was to provide a low-cost platform for people to develop on and build their own encrypted video chat apps.
We launched at Macworld 2014, and we did that on a limited budget. We had a 10-week development cycle culminating with me presenting it live on stage at Macworld in San Francisco. If you've ever presented anything live using technology, you know that's a leap of faith. My background is all digital and digital engineering. I've been in the interactive field pretty much my whole career, so I know and value QA [quality assurance] in a big way. I also know offshoring development can be great if it's done right, and part of what has to be done right is the third-party quality control.
Please describe the scope of their involvement in more detail.
After launching at Macworld, we came back, learned a bunch of lessons from the marketplace, and decided to revamp and rebuild the platform from the ground up last October [2014]. From the things we learned in the earlier mobile development cycle, we knew we wanted to integrate QA into the process early on. That was always something that I did with my software companies – not letting QA be the thing at the end of the road, as that's asking for trouble. If QA is integrated directly into the development cycle early on, and you have experts in that field who are given enough responsibility, then it's going to make a quality product, and that's what we did.
Part of the challenge was having developers and designers and UI [user interface] people in three countries – Canada, the United States, and India – and that can be challenging. Some of the software development, as I mentioned, is great, but you really need to be able to tell them what to do and what to fix. We learned that the first time because the company we were using had their own QA team, and having a QA team internal to a software company is kind of like giving the fox the keys to the henhouse.
It's not that these guys didn't do their job. We just wanted to step it up a little. In doing that, one of the things that we wanted to do with our technology and with the company was to partner with an expert in mobile QA, or in QA in general. We got in touch with Brian [Borg, the founder of OnPath Testing] and told him what we were looking for, what our challenges were.
Within about a couple of weeks, he had put together a proposal and a team that really fit our schedule, being in North America. We had testers that he provided in India, in the field, as well as his expertise in helping build the plan: "Here, we should be using these tools; we should be making sure the software people are doing this and are going to have daily scrums." He really helped to architect our QA strategy for the development of our new platform. That started probably last October and wrapped up at the beginning of this June [2015].
At one point, we had three of his testers working closely with our tester – our internal company guy – and the testers at the software development house. It's a challenging situation to be in because you've got a third party analyzing code and development. They did a great job. They really went over and above and we couldn't have been happier with what we had decided to do. Of course, having a third-party QA provider costs more. Ultimately, we delivered on time, on budget, and delivered a quality product that really was error free, for the most part, and a lot of that was due to the guys that we were partnered with at OnPath Testing.
That was the extent of our work to date. Our plan was that we develop a platform, they help to build it, and now we develop white-label versions of it that don't need the same level and intensity of QA that we had in doing that. I would certainly use them again when we have another large-scale project.
How did you come to work with OnPath Testing?
We got referred to OnPath from a colleague at our office in San Francisco who had used them before, so it was a referral. We got in touch with Brian Borg shortly after that, started talking to him about what we were doing, the tools we were using, and some of the vision of where this was going.
The vetting process was simple. It was a referral from a colleague at our office in San Francisco. It came up in a meeting, we talked to him about some of the issues we had with quality control in our previous version of the software. He recommended that we get in touch with Brian, highly recommended him, and that was the vetting process.
We had a couple Skype meetings with Brian. We connected right away, we liked his style, and he delivered everything he said he was going to deliver on time, including quotes and revisions. That means a lot, especially when you have a specific delivery date.
Could you provide a sense of the size of this initiative in financial terms?
Overall, it was probably around $60,000.
What is the status of this engagement?
The project was completed June [2015].
Could you share any statistics or metrics from this engagement?
That's a tough one. If you've been in mobile development at all, especially one that has deadlines and launch dates, it's just crazy. I wouldn't wish it on anyone. All I can really say is that they integrated themselves really well into our core development team. They were in a precarious position from that scenario. The software guys didn't report to them, and it was an odd situation, but they handled the whole thing extremely well. At the end of the day, I was able to sleep better. That was really my goal. I didn't want to put myself through what we did the first time, which was just rampant QA on the fly, and staying up all night. I got them to stay up all night.
What distinguishes OnPath Testing from other providers?
Brian's expertise and involvement level with the projects. He wasn't one of our testers, obviously, but he kept in touch with the project, and he kept in touch with us. He was always present, seeing how things were going, following up on things, and calling with any concerns, and that's a good way to run a business. OnPath Testing manages themselves extremely well. The testers were always very diligent and always got the work done.
Is there anything OnPath Testing could have improved or done differently?
At the time, they were really just starting out in mobile – and this is probably an issue with a lot of QA companies that are getting into the mobile space – because it's easy to run QA efforts on web-based products. The issue with mobile QA is being able to test on all the Android devices. Something that really didn't come up at the beginning, and that nobody thought about for the most part, was that we had our own device testers as well in the company, separate from OnPath, but there was some early confusion along the lines of, "Well, we don't have the LG Flex, so I don't know if it works on that."
We worked closely with them to figure it out. They did start to invest in mobile hardware, which is a pain. I mean, you've got your iPhone, and then you have like 20 Android devices, with 20 different versions of Android. I can only speak to their mobile prowess. They certainly came to the table with adding more devices. That was something we didn't really think about, at first. It was kind of like mobile testing is mobile testing, but then we realized it doesn't work that way.
In the Android world, it just becomes quickly mired down. "But what phone are you using? What version of Android are you using? Is there an upgrade?" It's a whole different world. If anything, it's making sure that when you're talking about mobile QA, you're very well defined in the devices you need to test on. Either provide them to OnPath or build it into your agreement to have them purchase them, but don't leave it to the eleventh hour when you're going, "What do you mean you don't have a Galaxy S5?" That would be the only thing, and I think they probably learned that lesson as well, through working with us.