Test your JDE E1 implementation, not the application

by Pat Neary

I’ve recently been searching the Internet for some killer facts and figures that I might use to convince prospective customers that they should invest time, energy and money to improve the testing that they do as part of their JD Edwards EnterpriseOne (JDE E1) projects.

What’s interesting, and a little perplexing, is the lack of content relevant to testing enterprise application software. When I say enterprise application software, I mean packaged software products like JDE E1, which, like other products in this category, are designed to support multiple business-critical processes (such as payroll, procurement and order processing).

With packaged software, we are led to believe (and should reasonably expect) that the software we have purchased will have been thoroughly tested by the author. The challenge is that people rarely use these products “out of the box”. ERP software is complex, and most, if not all, implementations will involve a degree of customization.

Owning the QA process

Experience tells us that it is not practical, safe or sensible to rely solely on the author to own the QA process. Things go wrong, and there are many examples of thoroughly tested software failing to meet the specific requirements of a business. Users, therefore, need to take ownership of testing. When planning and executing tests, we need to focus on our implementation of the software and how it supports our business-critical workflows.

The “ownership” of testing is particularly relevant at a time when enterprise software developers are changing the way their customers buy and consume their software. Many applications are moving to the cloud, where software is consumed as-a-service. The JDE team at Oracle has introduced a continuous delivery (adoption/innovation) model that has major implications for the way customers test. Smaller, more frequent change events will require more frequent testing. If the end-users themselves aren’t rethinking their approach to testing, the systems integration partners they rely on should be.

The importance of testing

When we think about testing JDE, it’s not in the way we might think of testing within a traditional Software Development Lifecycle. It’s important to understand that the onus is on testing the implementation of the software, not the software itself. This doesn’t mean you have to do the testing yourself. Just make sure the partner you have chosen to carry out the testing understands your business, how you have implemented the software and how it supports your other business-critical applications.

My search for statistics wasn’t completely fruitless. Here are a couple of facts and figures that I was able to dig out that you might use if you’re building a business case for testing:

  • 23% to 35%. The proportion of total IT budget allocated to QA and testing from 2013 through 2017. (World Quality Report 2017-18, Ninth Edition)
  • 50%. The proportion of time typically spent testing during the development phase alone. The overall figure is higher than 50%, because testing takes place in every phase of the software development cycle. (Institute for Software Research, Carnegie Mellon University)

In our own experience, testing can account for anything from 25% to 65% of the total upgrade effort.

So, there might not be a lot of relevant research out there, but the search itself proved useful in helping me understand the JDE E1 testing challenge. Whether you carry out your own testing or partner with a JDE service provider, the message is the same: don’t test the application, test the implementation.

If you are a JDE E1 customer, or a systems integrator supporting companies using JDE E1, and would like to learn how DWS testing products can support you throughout the JDE lifecycle, please get in touch.
