The Way of Quality Software

OnPath Testing is your source for outstanding QA engineering from a global workforce. With our talented resources and project experience, we give our clients the confidence that their testing is performed exactly as expected, within budget, and to the highest standards.

 
  • $5,000+ minimum project size
  • $25 - $49 / hr
  • 2 - 9 employees
  • Founded 2009
Headquarters: Boulder, CO
  • OnPath Testing
    2525 Arapahoe Ave, Ste #E4-150
    Boulder, CO 80302
    United States
    (303) 497-4994

Portfolio

Key clients: 
  • BioTrust
  • ChronoTrack
  • English360
  • LearningObjects
  • WeCam Technologies
  • MapQuest
  • Wyoming Dept of Health
  • Kozio
  • MapCentrix

Reviews


Regression Analysis for Software Product Company

"OnPath Testing worked harder than most of our employees, which was impressive."

Quality: 4.5
Schedule: 4.5
Cost: 4.0
Willing to refer: 4.5
The Project
 
$50,000 to $199,999
Project summary: 

OnPath provided a number of regression testing services, including the construction of a custom test suite and then a complete rebuild to accommodate a new language.

The Reviewer
 
1-10 Employees
 
Louisville, Colorado
Product Owner, Sports & Recreation Startup
 
Verified
The Review
Feedback summary: 

OnPath developed strong testing scripts that saved a considerable amount of time and internal resources. Although successful, the partnership concluded as methodologies changed. 

BACKGROUND

Please describe your organization.

We are a Scrum team developing software products for the online registration of athletic events, such as the New York City Marathon. Registration for these events is typically done online, and we provide the necessary software for it. We also provide scoring systems for these races.

What is your position?

I am the product owner.

OPPORTUNITY / CHALLENGE

What business challenge were you trying to address with OnPath Testing?

The business goal was making sure that we had sufficient regression testing for the code we had developed. We initially used Selenium for the process, trying to automate the testing for our registration and scoring systems.

SOLUTION

Please describe the scope of their involvement.

OnPath Testing wrote test scripts in Selenium and Capybara for us. We moved to the latter because we needed Ruby code support. They became familiar enough with our systems to be able to write test scripts themselves, without needing step-by-step scenarios from us.

OnPath built our entire test suite for Selenium in 2013, over the span of three to six months. They continued to write automated tests as we developed new features, in order to make sure that everything was covered. In late 2014, our automated test suite became too cumbersome, so we decided to switch from Selenium to Cucumber. This involved rewriting all of our application tests. OnPath undertook that task in its entirety.

They had to learn the new technology, as they didn't have past Cucumber experience. They not only did this, but they also learned to write for Capybara, the Ruby library that Cucumber step definitions use to drive the browser. It was a long project, but they stuck to it. The total time was about six months for complete code coverage. They continued to write new parts as we released new features.

OnPath was willing to learn new technologies, which we seriously appreciated. We had a remote relationship. The dev team we worked with is based in India.
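For readers unfamiliar with the stack the reviewer describes: a Cucumber scenario is written in plain language and backed by Ruby step definitions, and Capybara is the library those steps call to drive a browser. Below is a minimal sketch of what one such step-definition file might look like; the route, field labels, and confirmation text are illustrative assumptions, not details from this client's actual suite.

    # features/step_definitions/registration_steps.rb
    # Hypothetical Capybara-backed Cucumber steps for an event registration flow.
    require 'capybara/cucumber'

    Given('an athlete is on the event registration page') do
      visit '/events/sample-race/register'   # illustrative route
    end

    When('they submit a valid registration') do
      fill_in 'First name', with: 'Jane'
      fill_in 'Last name',  with: 'Doe'
      fill_in 'Email',      with: 'jane@example.com'
      click_button 'Register'
    end

    Then('they see a confirmation message') do
      # Capybara's built-in assertion; raises if the text never appears.
      page.assert_text('Registration complete')
    end

In a setup like this, scenarios written in plain language reuse these steps, which is what lets a test team cover new features without scripting every click from scratch.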

How did you come to work with OnPath Testing?

They were referred to us by a collaborator who had worked with them in the past.

Could you provide a sense of the size of this initiative in financial terms?

The total expense for our collaboration was around $50,000.

What is the status of this engagement?

The collaboration ended in December 2015. Our company is moving in a different direction, and the Scrum developers are now responsible for writing tests. Ending our relationship had nothing to do with OnPath Testing’s performance.

We only gave them two weeks’ notice of the termination, and OnPath Testing continued the work up until the last day. It was impressive to see this attitude.

RESULTS & FEEDBACK

Could you share any statistics or metrics from this engagement?

We consider the relationship with OnPath to have been a very good investment. 

How did OnPath Testing perform from a project management standpoint?

We found them very reliable and great collaborators. They don’t need to be micromanaged and they’ve always been able to deliver the specs requested of them.

What distinguishes OnPath Testing from other providers?

We appreciate their desire to learn new technologies and their ability to put in long hours. OnPath Testing worked harder than most of our employees, which was impressive.

Is there anything OnPath Testing could have improved or done differently?

They were very willing to learn new technologies, but it took them a while to obtain the domain knowledge necessary to actually write out the test scenarios themselves. I had to take care of this for longer than I would have wanted.

4.5
Overall Score
  • 4.5 Scheduling
    On time / deadlines
  • 4.0 Cost
    Value / within estimates
  • 4.5 Quality
    Service & deliverables
  • 4.5 NPS
    Willing to refer

Integrated QA Team for Events Company

"OnPath Testing is well versed in agile development. It's worked very well from our perspective."

Quality: 4.0
Schedule: 5.0
Cost: 3.5
Willing to refer: 4.0
The Project
 
$50,000 to $199,999
Project summary: 

OnPath Testing provided two full-time engineers to assist with quality assurance for an event registration platform. It is a large, recurring project, and the work is ongoing.

The Reviewer
 
51-200 Employees
 
Fort Collins, Colorado Area
Former Chief Operating Officer at Events Company
 
Verified
The Review
Feedback summary: 

OnPath's engineers have integrated as nearly a full-time part of the team. They have helped significantly refine and improve the overall testing environment. Site productivity and reliability have increased as a result of their work.

BACKGROUND

Please describe your organization.

We work in the endurance event space, and we were building an online registration tool to facilitate transactions for participants in the events.

What is your position?

I was a project manager when we brought OnPath Testing onboard. 

OPPORTUNITY / CHALLENGE

What business challenge were you trying to address with OnPath Testing?

We were building out our integration tests, trying to improve them. We needed more manpower to help us ramp up the creation of specific quality assurance tests. Then, in time, we changed the technologies that we were using to write the tests. We knew this would add significantly to our time. We leveraged OnPath to help guide us to new systems and help build the foundation of those systems. They restructured our testing environment and kept us on pace with our growth. 

SOLUTION

Please describe the scope of their involvement.

We moved to Cucumber, which is what they're using right now. Before that, we had a couple of changes. We were always running the tests on a Jenkins server; then we migrated the test framework, and now we're using Cucumber tests. OnPath helped us with that migration and took all of our old tests and rewrote them so they could be utilized again with the new system.
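As a rough illustration of how a Cucumber suite is commonly wired into a Jenkins job, the sketch below defines a Rake task that CI could invoke to run regression-tagged scenarios and write a JUnit-style report for the server to parse. The task name, tag, and report directory are assumptions for illustration, not details taken from this engagement.

    # Rakefile (illustrative sketch only)
    require 'cucumber/rake/task'

    Cucumber::Rake::Task.new(:regression) do |t|
      # Run scenarios tagged @regression; emit JUnit XML for the CI server
      # and keep human-readable output on the console.
      t.cucumber_opts = %w[--tags @regression --format junit --out reports --format pretty]
    end

A Jenkins job then only needs to run "bundle exec rake regression" and archive the reports directory after each build.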

Our software developers also write tests. But, we needed a starting point, and we wanted our software developers to focus on writing tests for new code in recently developed features. OnPath helped us catch up on the existing code that was not covered at the time. They became an extension of our team.

We've used just two of their resources, continuously. They were like full-time members of our team. Since we started our relationship, it's always been full-time with those two. We never used anyone else.

We didn't have a choice regarding who joined our project team. That was up to OnPath. At one point, they brought in a project manager who was going to help and add to that, but after working with him for a couple of weeks, it wasn't a good fit, so we just kept the two testing engineers. It was easier for them to work as members of our team than to be managed through an intermediary.

How did you come to work with OnPath Testing?

One of our senior software engineers had worked with Brian [Borg], the founder of OnPath Testing. He vouched for Brian. We spoke to Brian and decided we would give OnPath a shot starting out with just some basics, nothing too major. I'll say we were in our infancy in terms of our understanding as to what we truly needed. We didn't have a well-defined project scope when we first engaged with them. Since then, they have grown with us and gone through some growing pains with us.

Could you provide a sense of the size of this initiative in financial terms?

The two engineers were each billed out at $35,000 to $40,000 a year, which was higher than we would've expected. That's why we relied on them as full-time team members.

RESULTS & FEEDBACK

Could you share any statistics or metrics from this engagement?

We went from weekly to every-other-week releases. We were working on a live web application that handled about $300,000 in transactions a day. We wanted to go to a continuous deployment setup, but we didn't have the testing infrastructure to do so. As a result, we went from very large releases every two weeks that were littered with regressions to a rapid release cadence of anywhere between three and seven deployments to production a day, with a regression rate of less than 3 percent. That was all based on the work they helped facilitate. They also led the charge in training our software developers and bringing them up to speed. Our reliability increased a great deal.

Our own developers are more than impressed with the engineers we have from OnPath Testing. I don't know if we would be as confident if we switched those engineers. Those two engineers became part of the team. They worked more than 40 hours a week, just as if they were members of the team. That was fantastic. I got the sense that we were the first company that brought them in as part of the team. The energy and enthusiasm they brought was awesome. It was a very good relationship. I know this was a higher price point than most people would expect to pay for offshore testing. We brought in a new vice president of technology about a year ago, and one of the first things they said was that IT [information technology] seemed expensive. Our software developers love working with them, though. There's a little bit of a learning curve to start, but once they're familiar with the process and standards, things go very smoothly. Our project managers love working with them, too. They're very consistent.

How did OnPath Testing perform from a project management standpoint?

We managed everything through JIRA. OnPath Testing is well versed in agile development. I don't know if the whole organization is or not, but it's worked very well from our perspective. We never had any issues with delayed work or anything like that.

Is there anything OnPath Testing could have improved or done differently?

I don't have great judgment as to the value of the entire organization. I know these two test engineers are fantastic. One time they brought in a new person, and it wasn't a good fit. That project manager didn't work out. I can't really attest to whether or not other members would have produced the exact same results as the two we have. There were several times in which Brian offered to bring other people on board to help. But, we were satisfied with the people we had and the amount of output we were getting, so we felt like it would have been unnecessary. 

Do you have any suggestions for future clients of theirs?

We use offshore resources a lot, and for us it's always been about making them feel like an integral part of the team. Have them partake in the wins and losses of the team. We've gotten more out of our contractors than I've seen other organizations get, just because they were part of our culture. Just get them as involved as possible so they understand the context around the tasks and can take some degree of ownership of the project's success.

4.5
Overall Score
  • 5.0 Scheduling
    On time / deadlines
  • 3.5 Cost
    Value / within estimates
  • 4.0 Quality
    Service & deliverables
  • 4.0 NPS
    Willing to refer
    Depending on what the projects are.

Integrated QA Team for Education Management Firm

"We needed something out there in the market very quickly, and [OnPath] played a big role in getting that done."

Quality: 5.0
Schedule: 5.0
Cost: 5.0
Willing to refer: 5.0
The Project
 
$200,000 to $999,999
Project summary: 

OnPath Testing provided performance and functional testing for a cloud-based digital education system. They retained full responsibility for quality assurance and control of the platform on a tight project timeline. 

 

The Reviewer
 
1-10 Employees
 
New York City Area
Cleve Miller
Founder and CEO at English360
 
Verified
The Review
Feedback summary: 

OnPath became a crucial element to the successful development and delivery of the product. Their ability to minimize rework and maximize efficiency resulted in a successful rollout on deadline. 

BACKGROUND

Please describe your organization.

We have a web platform that helps language teachers use the web for what's called blended learning, or hybrid learning. This is where part of the traditional language teaching class is the normal face-to-face time in the classroom, and then part of it is online, where the student works on their own. It's not a virtual classroom where you're connecting with a teacher online; rather, it's what they call a learning management system, where the student does different tasks to work with the language that they're learning.

The platform is a learning management system and a content repository. We license content from publishers in the English language teaching field, and then we do some fancy stuff with the content, including disaggregating it so that a teacher can take bits and pieces from different courses and recombine them into a learning sequence that is personalized for each student. There's a learning management aspect to the platform, there is a content repository, there are authoring tools, and then there are some social features, where students can collaborate together on projects, and that type of thing.

We sell student subscriptions to language schools and universities. We don't sell directly to students. This is not a "learn on your own" product like Rosetta Stone would be. Rather, it's for language schools, universities, and even corporate training departments. There is a tremendous amount of corporate language training that goes on as well. It's technology geared to help teachers do a better job as opposed to technology that's supposed to replace teachers. It's a typical software as a service business model. We sell student subscriptions at about $12 per month, per student, and most of our customers are in Europe and Latin America. We're just getting started in Asia, and we have a fair amount in the Middle East as well.

What is your position?

I am the founder and CEO.

OPPORTUNITY / CHALLENGE

What business challenge were you trying to address with OnPath Testing?

We went through quite a dramatic evolution in terms of the coding talent that we were hiring, which might be typical in situations where you have a non-coder as the CEO because I'm not a tech guy. As a result, it was difficult for me to judge talent and manage the project. We went through some ups and downs with different firms that were building our software and then we finally decided to go ahead and just build our own.

We were working with our lead engineer at the time, a gentleman by the name of Aaron Zakowski, a very qualified and competent engineer. He recognized that we needed some good, hardcore quality assurance and brought testing in as an integral part of the processes we use to build our product, and we ended up hiring Brian [Borg, Founder at OnPath Testing] and his firm.

SOLUTION

Please describe the scope of their involvement.

The development of our actual system was done in-house as well as with the help of other people we hired. We didn't get any really decent results until we hired our own. That's not to say it's the only way you can do it. It just worked out that way. I did hire a couple of firms and, in both cases, it did not work out in a satisfactory way. It was our in-house team that ended up doing it, although they inherited some buggy legacy code from the firms that had started on the project.

In a way, a lot of the development ended up being an undoing of what people had done before them. It was through that that we ended up working with Brian. Although he headed up the OnPath firm, at that point, he only had maybe five or six guys. But, Brian was able to fit in as a part of the internal team somehow. I think that was one of the reasons that it worked out so well with him. He was very good at the communication side of things, understanding the dynamic of the software team, and how to fulfill his responsibilities in a way that kept the whole thing moving along nicely.

Reworking some of the buggy legacy code was a nightmare. We used a rapid development framework, and had mixed results with a couple of developers that we hired, and finally decided to hire the firm that invented this rapid development framework, thinking they would be great. All they wanted to do was deliver software that was so abstracted that they could then reuse it for other projects, and it was just nightmarish. There were just layers and layers on top of things. This meant that you had a core element that you could reuse for any project in the future, regardless of context or specific use. Once we cut ties with them and hired our in-house team, they had to pull some of those layers of abstraction out, so that everything ran a lot more efficiently.

OnPath was integrated directly into our development cycle. They would participate in the sprints we would run. We might also take some time and have a period with more comprehensive testing on a specific new piece that we were releasing. Brian was on all the stand-ups we would have each day, and he worked not only with our lead engineer but also with all the other coders on our team.

We did primarily performance and functional testing. We didn't do a lot of load testing. We really didn't do a lot of usability testing, so it was mostly just making sure that things were working the way they should. We had some complicated stuff happening on these web pages with the authoring tools. The authoring tools produce a range of almost 20 different question templates, including some drag-and-drop stuff, picture labeling, and stuff like that. Those are aspects characteristic of language teaching, but tough to actually implement in HTML and JavaScript.

Brian would work directly with our lead engineer, looking at some of the basic architectural questions as well. That's probably less traditional testing, but more of just his feedback on anticipating directions we didn't want to take. He could say, "Hey, I can see that causing problems down the road." That's an example of him being involved directly in the team in a way that was very much not how you would traditionally view a vendor, especially one running their firm.

Anyway, we didn't do a lot of load testing. It wasn't automated. A couple of times we were going to go ahead and put in some automated testing tools, and then, for a variety of reasons, and usually just the immense time pressure we were under, we decided not to actually do that. We were talking about whether we should go with Selenium, but it ended up being manual. Brian was actually the one pushing for that because he said, "Guys, let's do this right." I think I kept overruling him, to his displeasure, but he worked well with everybody, even though sometimes as part of the team he wasn't running the show.

How did you come to work with OnPath Testing?

I'm not sure. My guess is that we put something in Stack Overflow. Or, maybe it wasn't. I don't even know if Stack Overflow was around back then. This would have been 2009, so it was something like that. This was mostly run by our own lead engineer, and I wasn't really involved with the decision myself. Aaron probably interviewed two or three different guys. He's a very tough and demanding interviewer. I remember him being enthusiastic about hiring OnPath.

Could you provide a sense of the size of this initiative in financial terms?

We probably spent around a quarter of a million dollars annually.

What is the status of this engagement?

The last milestone completed was about 18 months ago. At the time, we were half-owned by Cambridge University in the United Kingdom. Just after I launched and got a beta version of my product, they came along and liked the technology, so we set up a joint venture that was 50/50. For a variety of reasons, the university publisher, Cambridge University Press, decided to go in a different direction in terms of technology. We had a very amicable separation about a year and a half ago. It was at that point that we dramatically downsized our team, and we don't have a dedicated testing unit right now. When we get going again, we will absolutely be calling Brian right away.

RESULTS & FEEDBACK

Could you share any statistics or metrics from this engagement?

I don't have anything specific to offer you. There wasn't really a before and after because he was there building it from scratch. I can't tell you anything on page load times, uptime, or anything like that. In terms of specific numbers, I can't really help you out there.

In terms of internal stakeholders, I do know that occasionally we would pull in people from the university, subject matter experts on something and have them be involved in certain projects where they would have actually been in meetings with Brian there. They were external to the company, but Brian has a way of getting to the crux of an issue very calmly in a way that doesn't step on anyone's toes. He is very diplomatic. He comes in very calm, gets right to the essence of the matter, and unravels the knots, so to speak. I do remember our Cambridge partners occasionally saying, "Wow, that guy really contributed to that meeting." These were sometimes technical people at the university as well because we had a couple joint projects with them.

In terms of customers, we have a reputation for doing some complicated stuff in a way that works. The platform is very stable, and we do some fancy stuff that really no other platform does. If you look at other learning management systems and the authoring tools that they come with, ours is really second to none in terms of the complexity of the types of tasks that we let the end-users create by themselves. Many times, if you go in and custom build something, you can make some fantastic learning tasks on the web, but we went beyond that and created a template that allowed any user to do it. We had a bottom-up, user-generated content element that's critical and a big differentiator for us as well.

That stuff just works, so in terms of customer reputation I think you can see Brian's work in that recognition of just the fact that it does work. People really trust English360 a lot because their whole work processes run through English360. If you're a small language school or a university, and you're delivering blended learning to these customers, years' worth of your work is inside the English360 platform in terms of the content that you've created. There's a trustworthiness that's important there and, if we didn't have that, we wouldn't be successful as a business. The fact that we do have it is a testament to the work that Brian did.

What distinguishes OnPath Testing from other providers?

One of the reasons that Brian is so good is he has a very high level of emotional intelligence. There are many good QA people out there, but what sets Brian apart is just fantastic communication skills. He's a great teammate; he's always very steady, and he calms everybody down. That was hugely beneficial for us because we were under a lot of pressure from our partners at the time to release. We needed something out there in the market very quickly, and Brian played a big role in getting that done.

Is there anything OnPath Testing could have improved or done differently?

The only possible thing I could say on that is not anything that Brian needed to do better, or that OnPath could do better; it's just that we got used to working with Brian personally, so that when we would have one of his guys jump in, I would email Brian and say, "Dude, we need you." Sometimes, Brian and I would go back and forth on this; he'd say, "I've got other stuff going on as well. I'm trying to do as much as I can to be dedicated to English360, but I do have other customers." At the end of the day, he has a business to run, too. It's understandable, but it's also a bummer because he was such a huge asset to us.

As it turned out, his guys are fantastic. I forget the other guy we used to work with a lot, who reported to Brian inside OnPath. He was a very high-quality tester. What Brian brought to everything was that emotional intelligence aspect of it. That dedication and expertise that he brings became essential to the whole build, so if he couldn't be in a meeting, we felt the absence. I have no idea about the guys now. Like I said, I haven't worked with Brian or his team in more than a year now. I don't know who he's got in place now.

That really isn't so much an improvement area, it's just recognizing that, as an individual, Brian has something special when it comes to the human side of software development, and when that wasn't there, we missed it. I guess that's saying that he just needs to clone himself to populate his company. Other than that, Brian was great to work with.

5.0
Overall Score
  • 5.0 Scheduling
    On time / deadlines
  • 5.0 Cost
    Value / within estimates
    I don't know what his prices are now, but when I was working with them, they were very competitive. I was surprised at how relatively inexpensive we were getting the quality that we were getting.
  • 5.0 Quality
    Service & deliverables
  • 5.0 NPS
    Willing to refer
    Definitely.

Integrated QA Team for Nutrition Startup

"OnPath Testing has been very accommodating to us when things have come up unexpectedly."

Quality: 4.0
Schedule: 5.0
Cost: 4.0
Willing to refer: 4.0
The Project
 
$50,000 to $199,999
Project summary: 

OnPath Testing assumed responsibility for quality control of several web and mobile solutions. The project included testing of the user experience and functionality of marketing initiatives, as well as back-end testing.

The Reviewer
 
201-500 Employees
 
Greater Denver Area
Chief Technology Officer, Nutrition Startup
 
Verified
The Review
Feedback summary: 

OnPath displayed excellent skills in execution as well as in strategizing how to structure and implement testing environments. When expectations of a particular team member were not met, OnPath was responsive and proactively fixed the problem quickly.

BACKGROUND

Please describe your organization.

We're an online e-commerce retailer that is focused specifically around manufacturing nutritional supplements. We sell everything directly to the consumer through online marketing and direct response marketing.

What is your position and responsibilities?

I am the chief technology officer, so I oversee all of the software and IT [information technology] aspects of the business.

OPPORTUNITY / CHALLENGE

What business challenge were you trying to address with OnPath Testing?

In general, as a small startup company, we use contractors in various forms for surge capacity, such as when I need additional hands for a project or even just to fill a staffing need for an unspecified duration. I need some flexibility with it so that I'm able to control that spend. We partnered with OnPath to be able to add those contingent resources to the QA [quality assurance] process.

SOLUTION

Please describe the scope of their involvement.

The work they did involved manual QA [quality assurance] testing for desktop, tablet, and mobile, covering user experience and functional QA – general web-based testing. The team we were allocated ranged from one full-time person to three at the height. We have two major areas of development work internally. One is focused around marketing initiatives, which is page presentation, conversion analysis, and split-test work, and the other is more core back-end development work.

Mostly, I have focused OnPath on supporting the marketing end of things because the turnaround time needs to be much tighter with those releases. However, at the height, we had one person also associated with what I consider the core development project effort. We didn't use a project manager from OnPath Testing. We managed the resources and assigned their tasks directly. Not myself, but actually the QA manager on my team.

How did you come to work with OnPath Testing?

Brian [Borg], the founder of the company, was actually on staff as a direct, individual contributor and contract resource when I came on board. Through working with him, we hit a point where we needed additional resources and, at that point, he said, "I need to step away. The business is growing well, and I need to step away from the hands-on testing and really do more project and resource management pieces." At that point, we put somebody in place of his work, and when we were looking to broaden the work, it was a natural progression to talk to him about using OnPath across the board for my contingent QA needs.

Could you provide a sense of the size of this initiative in financial terms?

In the last 24 months, most of the time or the whole time, I've had at least one engineer, 40 hours a week. For probably half of that time, I had two engineers. You could call it two people for 24 months, and their billing rate is around $35 an hour.

What is the status of this engagement?

We've been using them for two years in various forms, with different numbers of people.

RESULTS & FEEDBACK

Could you share any statistics or metrics from this engagement?

I've been doing software engineering for close to 20 years, and in only two of the positions did I not use either contract QA or contract development work. Compared to other collaborations I've had, I actually rank OnPath highly. I would recommend them without hesitation. The nice thing in my experience with Brian is that he's very good at being flexible with how we wanted to manage the team. I've worked with companies where their only model was to have a project manager who we had to work through to assign tasks, and it's just been an inefficient way to manage the small numbers of resources we had.

Brian was very flexible about the model that we were using. He was good about taking feedback. At one point, we had a test engineer who initially was performing well, and then it seemed like there was a focus issue or something was going on. He was good about addressing that, and then eventually helping us find a different person to fill that role and getting a better resource in place. In general, their openness to making changes and moving on the fly has been great. I'm a huge fan.

We develop new products on about a six-month basis, so about two or three times a year, we're launching what we call a new video mashup. It's a new marketing site focused around whatever the new product line is. Historically, prior to putting a focus on it, ramping up QA and putting OnPath on, and creating the structured QA methodology that we have right now, we would put those launches out, and invariably, immediately we found issues with either page content, incorrect presentation for customers, or problems in completing an order. Frankly, I don't have those issues anymore, in part because of process changes, but also just around being able to add the staffing needed to make sure we have good coverage.

The second piece, from my standpoint, is just responsiveness and turnaround time for projects associated with the marketing team in particular. The ability to add contingent staff, and have that resource layer that I can grow as needed when projects come in, has helped me keep that responsiveness in turnaround time. When I came in the company, it was a significant talking point around how the software engineering team could support the marketing team better, and why we can't get stuff done. That conversation has pretty much ended. In most cases, now we really are beholden to the ability of the marketing team to create the copy that actually needs to go up online, versus being able to get the technical work done and test it.

At an execution level, they've been very focused on the test work. Honestly, that's partly me focusing them in that area, and really driving the strategy for QA process evolution internally. The flipside being, Brian and I had several very good, overarching conversations about where the relationship is going, what's our focus longer term, looking at broadening test automation, whether that's something that OnPath could support and help with. There's definitely strategic insight there, even though I haven't particularly leveraged it in the engagement with them.

How did OnPath Testing perform from a project management standpoint?

The project management tool we use is JIRA, the cloud-based version. We have a very organized bug-tracking workflow. Brian actually helped establish that as the bug-tracking tool in the company before I came to the company. That was definitely, from my standpoint, one of the nice things that OnPath helped to drive, prior to me coming in.

What distinguishes OnPath Testing from other providers?

Like I said, it has to be their management approach and flexibility during project work. OnPath Testing has been very accommodating to us when things have come up unexpectedly. They're never afraid to give us crucial feedback, and they have made some key recommendations around structuring and improving our QA environment. I've been very grateful for that support.

Is there anything OnPath Testing could have improved or done differently?

The only thing is setting expectations with the contingent staff that they bring in and making sure those are aligned with my own expectations. When we do those product launches that I was talking about, we tend to move into what we call our launch schedule, where we essentially have development and QA staff available – if not online, working, but available – about 20 hours a day, seven days a week to be able to do changes.

We run a lot of A/B split testing at that point in time. At 11 o'clock at night, I need somebody available to rotate into the next test, to make sure that from a testing standpoint everything's functional. We had a couple of bumps with people who didn't understand that expectation upfront, and it was a little bit of a reset. That hasn't happened in the recent past. Brian and I have discussed this, and I think he has kind of taken it to heart.

5.0
Overall Score
  • 5.0 Scheduling
    On time / deadlines
  • 4.0 Cost
    Value / within estimates
  • 4.0 Quality
    Service & deliverables
  • 4.0 NPS
    Willing to refer

Integrated QA Team for e-Learning Company

"I liked how independent OnPath Testing was. It made them easy to work with, and my time could be put to better use."

Quality: 4.5
Schedule: 5.0
Cost: 4.0
Willing to refer: 5.0
The Project
 
$50,000 to $199,999
Project summary: 

OnPath Testing assumed responsibility for all aspects of quality assurance (QA) and control testing for a series of independent projects. These projects included complex platforms that handled large amounts of data and integrated with third-party systems.

The Reviewer
 
51-200 Employees
 
Greater Denver Area
Former QA Manager, e-Learning Company
 
Verified
The Review
Feedback summary: 

OnPath contributed to the successful launch of a near-flawless end product. They displayed outstanding communication skills and integrated into a pre-existing development team without slowing progress. Their independent efficiency and team management skills were particularly impressive. 

BACKGROUND

Please describe your organization.

I am currently a freelance QA [quality assurance] manager. At the time I was working with OnPath, I was working for an e-learning company.

What is your position?

I was the QA manager.

OPPORTUNITY / CHALLENGE

What business challenge were you trying to address with OnPath Testing?

We did a series of different projects with them. The biggest one was for Evolve, which is a publishing platform and marketplace for medical books and research – a place where students can buy books for classes and teachers can administer quizzes to their students. It's a large platform with a lot of stuff going on. It would take a while to describe it all. That project ran for a period of more than two years or so.

We hired OnPath to provide us with engineers able to test the software's functional reliability and usability. At one time, we probably had as many as eight or nine testers from them, working every day, testing a product and whatnot. It was mostly usability, functional, and compatibility testing. We did hire them to start setting up automated testing as well, and that was a whole separate side thing, but it was for the same product. They provided someone to help get things more automated. The QA phase was integrated directly into the development process before we released the software.

SOLUTION

Please describe the scope of their involvement in more detail.

The purpose of the software itself was transactional. It dealt with large volumes of data, and there were interactions between multiple third-party systems. It had e-commerce functionality to buy books and examination functionality to quiz students. Teachers could connect with and contact students, and it had grading – both the grades for various assignments and weighted grades for the whole class.

As far as the e-commerce side goes, that was connecting the teacher, who would send the book to the student, and the student would buy the book. The teacher would talk to the third-party people to set the book up into the marketplace as well. It was a large, multimillion-dollar product.

They were located in India, and we had a manager of the team that was in India, and I was managing the manager, so to speak. They were working at night, and we were working during the day, so I would talk to the manager and tell him what to do for the day, and he would manage the team there. I would say the manager was more of a senior level, and then all the testers were mid-level. There weren't any pure rookie testers.

How did you come to work with OnPath Testing?

Our vice president of engineering had worked with them in the past through a different company. He already had experience with them and recommended them for this particular project. I believe we were already working with them when we started that project, so they were the only ones we thought of because we were already happy with them.

Could you provide a sense of the size of this initiative in financial terms?

It was during a period of years, and we varied from two or three testers working every day up to 10 working every day. I would say it was somewhere around $60,000 to $100,000 a year for two years.

What is the status of this engagement?

When I left the company, which was about a year and a half ago, we had released the product, but we were still developing it and adding on to it, and still using OnPath to test the add-ons, fixes, and everything. As far as I know, they're still working on it. I'm not with that company anymore, so I don't know exactly, but I know the product is still live.

RESULTS & FEEDBACK

Could you share any statistics or metrics from this engagement?

We were happy with the way they were able to run tests efficiently, and they were able to manage their own team very well. We had them also writing test cases sometimes, and running test cases. Their test cases were written well. Their communication was good. Like I said, it was overseas and it was overnight, so we had to rely on a lot of emails and stuff. We couldn't talk directly very often. They handled that process well.

If we needed more people, they were quick to jump on and get us some more testers. We could ramp up and ramp down quickly, which is why we really liked using them a lot. Our testing needs would change quite often, and they were able to handle all the changes without any headaches or rigmarole. If we needed less, they'd take a couple people off. They were very flexible.

How did OnPath Testing perform from a project management standpoint?

They were able to deliver all their milestones on time and within budget. We were working with the developers, obviously, and the milestones would change quite a bit. With their ability to ramp up and down, we were able to meet milestones on a consistent basis with no problem. We did use JIRA sometimes for project management but, mostly, we interacted by Skype and email.

What distinguishes OnPath Testing from other providers?

I liked how independent OnPath Testing was. I've used overseas staff a lot, and usually I have to do a lot of hand-holding, a lot of phone calls. With them, I could just send an email. It had to be pretty detailed, but the email was enough. I would go for a couple or three weeks without a verbal conversation, and I was working with eight of them every day. I was impressed by their independence. It made them easy to work with, and my time could be put to better use than a lot of hand holding.

Is there anything OnPath Testing could have improved or done differently?

Some of the junior testers didn't have perfect English, but eight out of 10 of them had totally usable English. I was only really talking with the manager, and his English was perfect. The only time I would see language issues would be in the description of a bug by a junior tester, but those were few and far between. Every issue I can think of was just due to them being located overseas. We had a couple meetings with the CEO, he came to D.C. a couple times, but there wasn't a lot of personal interaction because they were overseas.

5.0
Overall Score
  • 5.0 Scheduling
    On time / deadlines
  • 4.0 Cost
    Value / within estimates
  • 4.5 Quality
    Service & deliverables
  • 5.0 NPS
    Willing to refer
    I would definitely recommend them. I worked with a lot of overseas companies, and they were one of the best. They were wonderful.

Android Device Testing for Computer Software Startup

"OnPath Testing manages themselves extremely well. The testers were always very diligent and always got the work done."

Quality: 4.0
Schedule: 5.0
Cost: 5.0
Willing to refer: 5.0
The Project
 
$50,000 to $199,999
Project summary: 

OnPath assumed complete responsibility for quality assurance and control of an Android-based video chat application. This included working with the third-party developer from the beginning of the development process in order to ensure a seamless rollout.

The Reviewer
 
11-50 Employees
 
San Francisco Bay Area
Rob Whent
Founder & CEO at WeCam Technologies, Inc.
 
Verified
The Review
Feedback summary: 

OnPath was extremely responsive and meticulous throughout the project. They were an integral part of the end-to-end development process of the application. 

BACKGROUND

Please describe your organization.

WeCam Technologies has built a mobile video chat platform based on WebRTC technology for encrypted end-to-end, peer-to-peer video chat solutions for mobile phones.

What is your position?

I am the founder and president.

OPPORTUNITY / CHALLENGE

What business challenge were you trying to address with OnPath Testing?

We're a startup, and we had our initial seed funding from a number of friends and family. We cobbled together, using developers offshore, our first kick at the can for our proof of concept for WeCam. WeCam chat is actually available on the iTunes Store and Google Play. It's got about 200,000 users, and it's sort of a social media video chat app where you can log in with Facebook, you can video chat with your Twitter friends, and there is a lot of complexity, as you can imagine.

Two-way audio/video for mobile devices was horribly disrupted by the discontinuation of Flash technologies on mobile devices, and the two-way portion of audio and video calling was a big issue. You had your big guys, like FaceTime, Google Hangouts, and Skype, that had their own proprietary video codecs they were using. But, there was really a huge gap in the marketplace, and WeCam's proposal to our investors and to the audience at large was to provide a low-cost platform for people to be able to develop on and build their own encrypted video chat apps.

We launched at Macworld 2014, and we did that on a limited budget. We had a 10-week development cycle culminating with me presenting it live on stage at Macworld in San Francisco. If you've ever presented anything live using technology, you know that's a leap of faith. My background is all digital and digital engineering. I've been in the interactive field pretty much my whole career, so I know and value QA [quality assurance] in a big way. I also know offshoring development can be great if it's done right, and part of what has to be done right is the third-party quality control.

SOLUTION

Please describe the scope of their involvement in more detail.

After launching at Macworld, we came back, learned a bunch of lessons from the marketplace, and decided to revamp and rebuild the platform from the ground up last October [2014]. From the things that we learned from the mobile development cycle earlier on, we knew we wanted to integrate QA into the process early on. That was always something that I did with my software companies: not letting QA be the thing at the end of the road, as that's asking for trouble. If QA is integrated directly into the development cycle early on, and you have experts in that field who are given enough responsibility, then it's going to make for a quality product, and that's what we did.

Part of the challenge was having developers and designers and UI [user interface] people in three countries – Canada, the United States, and India – and that can be challenging. Some of the software development, as I mentioned, is great, but you really need to be able to tell them what to do and what to fix. We learned that the first time because the company we were using had their own QA team, and having your own QA team internal to a software company is kind of like giving the fox the keys to the henhouse.

It's not that these guys didn't do their job. We just wanted to step it up a little. In doing that, one of the things that we wanted to do with our technology and with the company was to partner with an expert in mobile QA, or in QA in general. We got in touch with Brian [Borg, the founder of OnPath Testing] and told him what we were looking for, what our challenges were.

Within about a couple of weeks, he had put together a proposal and a team that really fit our schedule, being in North America. We had testers that he provided in India, in the field, as well as his expertise in helping build this plan, "Here, we should be using these tools, we should be making sure that the software people are doing this, are going to have daily scrums" and really helped to architect our QA strategy for the development of our new platform. That started probably last October and wrapped up at the beginning of this June [2015].

At one point, we had three of his testers working closely with our tester – our internal company guy – and the testers at the software development house. It's a challenging situation to be in because you've got a third party analyzing code and development. They did a great job. They really went over and above and we couldn't have been happier with what we had decided to do. Of course, having a third-party QA provider costs more. Ultimately, we delivered on time, on budget, and delivered a quality product that really was error free, for the most part, and a lot of that was due to the guys that we were partnered with at OnPath Testing.

That was the extent of our work to date. Our plan was that we develop a platform, they help to build the platform, and now we develop white-label versions of it that don't need the same level and intensity of QA that we had in doing that. I would certainly use them again when we have another large-scale project.

How did you come to work with OnPath Testing?

We got referred to OnPath from a colleague at our office in San Francisco who had used them before, so it was a referral. We got in touch with Brian Borg shortly after that, started talking to him about what we were doing, the tools we were using, and some of the vision of where this was going.

The vetting process was simple. It was a referral from a colleague at our office in San Francisco. It came up in a meeting; we talked to him about some of the issues we had with quality control in our previous version of the software. He recommended that we get in touch with Brian, highly recommended him, and that was the vetting process.

We had a couple Skype meetings with Brian. We connected right away, we liked his style, and he delivered everything he said he was going to deliver on time, including quotes and revisions. That means a lot, especially when you have a specific delivery date.

Could you provide a sense of the size of this initiative in financial terms?

Overall, it was probably around $60,000.

What is the status of this engagement?

The project was completed in June [2015].

RESULTS & FEEDBACK

Could you share any statistics or metrics from this engagement?

That's a tough one. If you've been in mobile development at all, especially one that has deadlines and launch dates, it's just crazy. I wouldn't wish it on anyone. All I can really say is that they integrated themselves really well into our core development team. They were in a precarious position from that scenario. The software guys didn't report to them, and it was an odd situation, but they handled the whole thing extremely well. At the end of the day, I was able to sleep better. That was really my goal. I didn't want to put myself through what we did the first time, which was just rampant QA on the fly, and staying up all night. I got them to stay up all night.

What distinguishes OnPath Testing from other providers?

Brian's expertise and involvement level with the projects. He wasn't one of our testers, obviously, but he kept in touch with the project, and he kept in touch with us. He was always present, seeing how things were going, following up on things, and calling with any concerns, and that's a good way to run a business. OnPath Testing manages themselves extremely well. The testers were always very diligent and always got the work done.

Is there anything OnPath Testing could have improved or done differently?

At the time, they were really just getting started in mobile – and this is probably an issue with a lot of QA companies that are getting into the mobile space. It's easy to have QA efforts on web-based products; the issue with mobile QA is being able to test on all the Android devices. Something that really didn't come up at the beginning, something that nobody thought about for the most part, was device coverage. We had our own device testers as well in the company, separate from OnPath, but there was an early confusion regarding, "Well, we don't have the LG Flex, so I don't know if it works on that."

We worked closely with them to figure it out. They did start to invest in mobile hardware, which is a pain. I mean, you've got your iPhone, and then you have like 20 Android devices, with 20 different versions of Android. I can only speak to their mobile prowess. They certainly came to the table with adding more devices. That was something we didn't really think about, at first. It was kind of like mobile testing is mobile testing, but then we realized it doesn't work that way.

In the Android world, it just quickly gets mired down. "But what phone are you using? What version of Android are you using? Is there an upgrade?" It's a whole different world. If anything, it's making sure that when you're talking about mobile QA, you're very well defined in the devices you need to test on – either providing them to OnPath or building it into your agreement to have them purchase them, but not leaving it to the eleventh hour when you're going, "What do you mean you don't have a Galaxy S5?" That would be the only thing, and I think they probably learned that lesson as well, through working with us.

5.0
Overall Score
  • 5.0 Scheduling
    On time / deadlines
  • 5.0 Cost
    Value / within estimates
  • 4.0 Quality
    Service & deliverables
  • 5.0 NPS
    Willing to refer
    Definitely.