Petals Junit Framework

Improve the test development

Details

  • Type: Improvement Request
  • Status: New
  • Resolution: Unresolved
  • Affects Version/s: None
  • Fix Version/s: None
  • Component/s: None
  • Security Level: Public
  • Description:

    Currently, the integration environment works as follows:

    • There are some configurations on OW2's forge.
    • There are other ones in the integration project.
    • Requests are defined in a custom XML file, in a distinct project.
    • Requests are all mixed together, and can target any service.

    This mechanism is interesting, but it remains quite heavy.
    I think that with a few updates, we could make it easier to use.
    That would build the reflex: one component feature <=> write the tests.

    Here is what I would suggest:

    • Do not use the OW2 configurations anymore (plus, with the wiki, they will be useless soon).
    • SU projects should have a specific test directory.
      • src/main/test for the usual Java unit tests.
      • src/main/petals-test to put requests to send to this SU.

    Basically, this last directory would contain XML files describing the requests to send to this SU: payload, attachment references, message properties, invocation mode (simple or parallel), etc.
    It would also contain conditions about the response (fault/status/payload and possibly XPath assertions).

    Conventions:
    A set of requests = 1 XML file.
    1 request per file = simple invocation, several requests = parallel invocations.
    1 file name = 1 test name. (That's handy for copy-pasting.)
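
    To make the idea concrete, here is a sketch of what one such test file could look like. The element and attribute names below are purely illustrative assumptions, not a defined schema; only the ideas (request, properties, attachments, response conditions, XPath assertions) come from this proposal.

```xml
<!-- Hypothetical example: src/main/petals-test/say-hello.xml
     (file name = test name). All element and attribute names
     are illustrative assumptions, not a defined schema. -->
<petals-test>
  <!-- One request element = simple invocation;
       several request elements = parallel invocations. -->
  <request service="helloService" operation="sayHello">
    <properties>
      <property name="some.message.property" value="demo"/>
    </properties>
    <payload>
      <say xmlns="http://example.org/hello">world</say>
    </payload>
    <attachment ref="files/picture.png"/>
  </request>
  <!-- Post-conditions checked on the response. -->
  <expect>
    <status>DONE</status>
    <no-fault/>
    <xpath expr="//greeting[text() = 'hello world']"/>
  </expect>
</petals-test>
```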

    This would require two additional things:

    • An editor in the studio, to easily write these requests: automatically select the provides to invoke, generate a message skeleton (SoapUI-like), pick values from pre-defined lists...
    • An upgrade of the Petals Maven plug-in: add a jbi-test goal that takes a configuration file (which Petals instance, which host...), deploys the SA, goes through all the requests and sends them to the SU.
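
    The plug-in upgrade could be wired into a POM roughly like this. The jbi-test goal name comes from this proposal; the artifactId and parameter names shown here are assumptions for illustration only.

```xml
<!-- Hypothetical sketch: binding the proposed jbi-test goal in a pom.xml.
     Goal name from this proposal; artifactId and parameter names are assumed. -->
<plugin>
  <groupId>org.ow2.petals</groupId>
  <artifactId>maven-petals-plugin</artifactId>
  <configuration>
    <!-- Which Petals instance to target. -->
    <host>localhost</host>
    <jmxPort>7700</jmxPort>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>jbi-test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```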

    The algorithm to run the tests would be the following one:

    1. For each SA:
      1. Deploy the SA.
      2. For each SU:
        • For each test request:
          • Send the request.
          • Check the post-conditions.
      3. Undeploy the SA.
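
    The loop above can be sketched in a few lines of Java. This is only a minimal model with in-memory stand-ins: all type and method names are illustrative assumptions, not the actual plug-in API. It returns an ordered action log so the deploy/send/check/undeploy sequence is easy to inspect.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the test-run algorithm, with in-memory stand-ins
// for the deployer and the bus. All names here are assumptions.
public class JbiTestRunner {

    record TestRequest(String name) {}
    record Su(String name, List<TestRequest> requests) {}
    record Sa(String name, List<Su> sus) {}

    // Runs every SA and returns the ordered log of actions.
    static List<String> runAll(List<Sa> sas) {
        List<String> log = new ArrayList<>();
        for (Sa sa : sas) {
            log.add("deploy " + sa.name());            // 1. deploy the SA
            for (Su su : sa.sus()) {                   // 2. for each SU...
                for (TestRequest req : su.requests()) {
                    log.add("send " + req.name());     //    send the request
                    log.add("check " + req.name());    //    check post-conditions
                }
            }
            log.add("undeploy " + sa.name());          // 3. undeploy the SA
        }
        return log;
    }

    public static void main(String[] args) {
        Sa sa = new Sa("sa-hello",
                List.of(new Su("su-hello",
                        List.of(new TestRequest("say-hello")))));
        runAll(List.of(sa)).forEach(System.out::println);
    }
}
```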

    The integration project would then only contain the SA and SU projects, and maybe one configuration file to replace the launcher project.
    The biggest part is the Maven plug-in upgrade.
    As for the test editor, I'm quite confident: it could be done by the first week of October.
    And this could be announced as a new feature for the end of the year, something that clients and customers could use as well.
    That's not the ultimate testing framework, but it would be a beginning for the general public.

    And I have no doubt it will be very useful, first for development, and then for support (people could send their SU and SA projects for testing, combined with mocks...). Once again, it's a first step that will have to be followed by others.

  • Environment:
    -

Activity

People

Dates

  • Created:
    Fri, 17 Sep 2010 - 16:53:32 +0200
    Updated:
    Thu, 8 Oct 2015 - 16:37:59 +0200