Re: CSIT test tools


Moiz Raja <moraja@...>
 

When you push code into the repository, it triggers a verify build. However, if your integration tests are not wired into the Maven build, they will not be run.

Was your concern that pushing the code would automatically trigger a test run? It won't.

I suggest we simply add whatever is in the GitHub repo to the ODL integration repo. I can do it if you like.

-Moiz


On Nov 5, 2013, at 5:13 PM, Luis Gomez <luis.gomez@...> wrote:

Hi Moiz, just a question: shouldn't our repo, as it is configured now, trigger a verify build job whenever someone pushes something to it (i.e. with git push)? Or is there another way to do this?
 
 
 
From: Moiz Raja [mailto:moraja@cisco.com] 
Sent: Tuesday, November 05, 2013 5:04 PM
To: Luis Gomez
Cc: Baohua Yang; huang denghui (huangdenghui@...); integration-dev@...
Subject: Re: [integration-dev] CSIT test tools
 
How about just adding the Python scripts to the integration repo instead of using GitHub? It can't hurt.
 
-Moiz
 
On Nov 5, 2013, at 4:31 PM, Luis Gomez <luis.gomez@...> wrote:


Hi guys,
 
Congratulations, I downloaded the Python scripts to the test tools VMs, changed the controller IP, and ran the system test with no issues. I saw you coded all the REST requests very well, so this should be good input for the Robot framework.
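For anyone trying the scripts: a request like the ones they issue can be built with nothing but the Python standard library. This is just a sketch; the controller address, port 8080, and the admin/admin credentials are assumptions based on a default controller setup, not something taken from the scripts themselves:

```python
import base64
import urllib.request

CONTROLLER_IP = "127.0.0.1"  # assumption: replace with your controller VM's address
BASE = "http://%s:8080/controller/nb/v2" % CONTROLLER_IP

def nb_request(path, user="admin", password="admin"):
    """Build an authenticated GET request for the controller's northbound REST API."""
    req = urllib.request.Request(BASE + path)
    token = base64.b64encode(("%s:%s" % (user, password)).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    req.add_header("Accept", "application/json")
    return req

# Against a live controller you would then do, e.g.:
# urllib.request.urlopen(nb_request("/switchmanager/default/nodes")).read()
```

The commented line at the end shows the call against a running controller; the path is an example of the Switch Manager style of URL, so adjust it to whatever module you are testing.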
 
BR/Luis
 
 
From: Baohua Yang [mailto:yangbaohua@gmail.com] 
Sent: Tuesday, November 05, 2013 1:10 AM
To: Luis Gomez
Cc: Moiz Raja; integration-dev@...
Subject: Re: [integration-dev] System Test Plan discussion
 
Hi Luis
    Thanks for your willingness to help test the code.
    Currently, we've finished the tests on the main functions of all modules in the base edition.
    We believe there's still a lot of work to do and welcome any feedback!
    Denghui and I have discussed the development plan a lot, but we think there will be more momentum community-wide.
    Everyone, please do not hesitate to drop us a line.
    Thanks!

 

On Tue, Nov 5, 2013 at 12:04 PM, Luis Gomez <luis.gomez@...> wrote:
Hi Moiz,
 
See my answers inline:
 
 
From: Moiz Raja [mailto:moraja@...] 
Sent: Monday, November 04, 2013 5:59 PM
To: Luis Gomez
Cc: Gmail; integration-dev@...
Subject: Re: [integration-dev] System Test Plan discussion
 
Hi Guys,
 
A couple of questions on the System test.
 
a. Will the System Test be integrated with the build? The system test will not run with the build, at least not the Robot/Python-based one. The idea is to trigger a job so that the controller VM (separate from the build server) fetches the latest release vehicle from Jenkins and runs it. After that we will trigger the test case execution in Robot.
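The fetch step described above could look roughly like this. Note this is only a sketch: the Jenkins base URL follows the usual lastSuccessfulBuild convention, but the job name and artifact path below are placeholders, not the real ODL job layout:

```python
import urllib.request

JENKINS = "https://jenkins.opendaylight.org"  # assumption
JOB = "controller-merge"                      # placeholder job name
ARTIFACT = "distribution.zip"                 # placeholder artifact path

def artifact_url(jenkins=JENKINS, job=JOB, artifact=ARTIFACT):
    """URL of an artifact from the last successful build of a Jenkins job."""
    return "%s/job/%s/lastSuccessfulBuild/artifact/%s" % (jenkins, job, artifact)

# On the controller VM you would then download and unpack it, e.g.:
# urllib.request.urlretrieve(artifact_url(), "distribution.zip")
```

After the download, the job would unpack the distribution, start the controller, and only then kick off the Robot run.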
 
b. What framework are we going to use to deploy the built artifacts? Is it going to be something like Capistrano or custom bash scripts?
The test code (Robot or Python) does not need to be built, so I do not think we will have release artifacts as we do in Java. Instead, we will store the test code in our git, and Robot will fetch it from there.
c. Will the python/robot tests live in the integration repository? Any tests that I can look at?
Yes, that is the idea, although nothing has been uploaded to the repo yet. So far we have two things: the Python scripts created by the China team and stored in an external repo, and the Robot framework installed in the Open Lab at Ericsson. Both are now described (Carol updated the Robot section today) in https://wiki.opendaylight.org/view/CrossProject:Integration_Group:Test_Tools
 
A reminder: this week is for getting familiar with these tools and seeing how we can best use them, so yes, you and everybody else are invited to take a look.
 
I will personally try to get the Python scripts working in the Open Lab at Ericsson by tomorrow at the latest.
 
BR/Luis
 
 
-Moiz
 
On Nov 3, 2013, at 12:16 AM, Luis Gomez <luis.gomez@...> wrote:

 

OK, I changed the test case order so that we start with the most basic services and end with the most abstracted ones:
 
- Switch Mgr
- Topology Mgr
- FRM
- Statistics Mgr
- Configuration
- Host Tracker & Simple Forwarding
- ARP Handler
- Forward Mgr
- Container Mgr
 
Note that a basic service is not necessarily simple to test, so maybe I was not very precise when I said simple-to-complex. What I really meant was something like basic-to-extra functions.
 
I also added the required steps in every area so that every test case above is self-complete. Please review the test plan and let me know if you agree with it.
 
As for your question, yes, we can categorize the services in different ways, for example: basic network functions (Switch Mgr, Topology Mgr, FRM, Stats Mgr), extra network functions (Host Tracker, Simple Forwarding, ARP Handler, Forward Mgr), basic node functions (Configuration, User Mgr, Connection Mgr) and extra node functions (Container Mgr, Cluster Mgr). This is just an idea and there could be more categories; in any case, beyond the classification, the important thing is that we do not leave any feature/module untested.
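A grouping like this could be kept as plain data next to the test code, so a run can be restricted to one category. The module names are taken from the list above; the data structure itself is just a suggestion:

```python
# Categories as proposed above; the grouping is illustrative, not final.
CATEGORIES = {
    "basic_network": ["Switch Mgr", "Topology Mgr", "FRM", "Stats Mgr"],
    "extra_network": ["Host Tracker", "Simple Forwarding", "ARP Handler", "Forward Mgr"],
    "basic_node":    ["Configuration", "User Mgr", "Connection Mgr"],
    "extra_node":    ["Container Mgr", "Cluster Mgr"],
}

def modules_for(category=None):
    """Modules to test for one category, or every module when no category is given."""
    if category is None:
        return [m for mods in CATEGORIES.values() for m in mods]
    return CATEGORIES[category]
```

A runner could then call `modules_for("basic_network")` for a quick smoke run, or `modules_for()` for the full plan.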
 
BR/Luis
 
 
From: Gmail [mailto:yangbaohua@gmail.com]
Sent: Friday, November 01, 2013 10:22 PM
To: Luis Gomez
Cc: integration-dev@...
Subject: Re: System Test Plan discussion
 
Sure, Luis. This is a valuable question!
IMHO, the simple-to-complex order is good.
However, we should also keep test cases independent from each other, i.e., each test case should be self-complete, because we may sometimes want to test the function of an individual module rather than the entire platform.
Besides, we could even categorize the tested modules based on their functions, test complexity, etc., for example: state collection modules, basic forwarding modules, QoS modules... Each category could then be tested separately.
What do you think?
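The self-complete style Baohua describes is essentially the usual setup/teardown pattern: each case creates the state it needs and removes it afterwards. A minimal sketch, where the FlowClient class and its methods are hypothetical stand-ins for the real REST calls:

```python
import unittest

class FlowClient:
    """Stand-in for a controller REST client; the API here is hypothetical."""
    def __init__(self):
        self.flows = {}

    def add_flow(self, name, match):
        self.flows[name] = match

    def delete_flow(self, name):
        self.flows.pop(name, None)

class StatisticsTest(unittest.TestCase):
    """Self-complete: creates its own flow in setUp, so it can run alone."""
    def setUp(self):
        self.client = FlowClient()
        self.client.add_flow("stat-flow", {"nw_dst": "10.0.0.1"})

    def tearDown(self):
        self.client.delete_flow("stat-flow")

    def test_flow_is_present(self):
        self.assertIn("stat-flow", self.client.flows)
```

Run it with `python -m unittest <file>`. The cost is that the flow is created and deleted for every case that needs it, which is exactly the efficiency trade-off under discussion.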
 


On Nov 2, 2013, at 6:10, Luis Gomez <luis.gomez@...> wrote:

BTW, I think I would prefer some logic in the order (e.g. simple to complex) and independent modules, even if this means more TCs.
 
BR/Luis
 
 
From: integration-dev-bounces@... [mailto:integration-dev-bounces@...] On Behalf Of Luis Gomez
Sent: Friday, November 01, 2013 11:47 AM
To: Baohua Yang
Cc: integration-dev@...
Subject: [integration-dev] System Test Plan discussion
 
Hi Baohua and all,
 
I have also noticed that the list of TCs in the Base Test Plan has been reordered alphabetically. This is fine, but I just want to explain the reasoning behind the previous order:
 
1) I started with the simpler (less abstract) modules like Switch Mgr, Topology Mgr and FRM, and finished with the more complex (more abstract) ones like Host Tracker, ARP Handler, Container Mgr, etc. The reason is that if we start by testing simple things and these fail, there is no point in continuing with the complex ones. Also, if the test starts with a complex case and fails, it is more difficult to debug.
 
2) I also combined some modules. For example, since I need some flows (FRM) in order to check statistics (Stats Mgr), I create the flows (FRM1), then check the statistics (Stats Mgr), and then clear the flows (FRM2). This way I do not need to create flows twice, once for FRM and again for Stats Mgr.
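That combined ordering can be written as one linear sequence that threads shared state through the steps. The helpers below are hypothetical stand-ins for the real REST calls; only the FRM1 -> Stats Mgr -> FRM2 dependency is the point:

```python
# Hypothetical helpers standing in for the real REST calls.
def create_flows(state):
    """FRM1: install the flows the later steps depend on."""
    state["flows"] = ["flow1", "flow2"]

def check_statistics(state):
    """Stats Mgr: reuses the flows installed by FRM1."""
    assert state["flows"], "statistics need at least one installed flow"
    state["checked"] = True

def clear_flows(state):
    """FRM2: clean up the flows created in FRM1."""
    state["flows"] = []

def run_combined():
    """Run the dependent steps in order, sharing one state dict."""
    state = {}
    for step in (create_flows, check_statistics, clear_flows):
        step(state)
    return state
```

The efficiency gain is visible here: flows are created once and reused, but `check_statistics` can no longer run on its own, which is the trade-off raised below.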
 
The thing is, the existing TC flow does not work, because as I wrote it, some modules (e.g. Stats Mgr) need conditions created by previous modules (FRM). Anyway, instead of fixing this I would like to open a discussion (this is the right time) on how we want to write, and later execute, the test plan:
 
1) Module order: do we want to follow a logical order (simple to complex, or something else), or just alphabetical?
 
2) Module dependencies: do we want modules to depend on the results of previous modules (more test-efficient, fewer TCs), or totally independent modules (less efficient but very flexible if, for example, I want to run just one module's tests)?
 
What do you think?
 
 
 
From: Baohua Yang [mailto:yangbaohua@...]
Sent: Friday, November 01, 2013 1:42 AM
To: Luis Gomez
Cc: integration-dev@...; huang denghui
Subject: Re: [integration-dev] Next steps for system test
 
Hi, all
     As discussed this morning, we've shared the information on the CSIT test tool under development at https://wiki.opendaylight.org/view/CrossProject:Integration_Group:Test_Tools#CSIT_Test_Tool.
     Please help verify it and add more details, or suggest a better place to move it to.
     Denghui and I are working hard on the code these days, and we plan to release a workable version for the base edition this week.
     Currently, simple list_nodes tests for the ARP handler, host tracker and switch manager have been provided.
     Anyone is welcome to contribute code, documents or bug fixes.
     The code is free to download via "git clone https://github.com/yeasy/CSIT_Test.git".
     Thanks for the feedback!

 

On Fri, Nov 1, 2013 at 12:47 PM, Luis Gomez <luis.gomez@...> wrote:
Hi all,
 
It has been a busy day, but I finally got some time to summarize our afternoon discussion around the system test:
 

- Python scripts: Baohua will write a guide in the wiki on how to use the scripts. These can be very useful for debugging test cases.

- Robot framework: It is already installed in the Ericsson Lab; Carol will write a quick guide on how to use it. Denghui, you also have Robot experience, so you can help Carol with this.

- TestON framework: We have a meeting next week to learn more about it (especially its APIs and driver support); I will share some documentation I have already received from Swaraj. From today's TSC call, it looks like we are interested in collaborating with the ON.Lab people, so this could be a good opportunity.

- System Test Plan: We need to continue working on this; everybody is invited to contribute in the wiki. I am meeting Madhu next Monday to talk about OVSDB inclusion in the base release, and I will also work on some test cases around the OVSDB plugin. Punal is going to take a look at the VTN project (Virtualization release), as they have everything well documented in the wiki. For the rest of the projects we will need to ask for information as we write the test plan.

So the plan for next week is to get familiar with the Robot framework and the Python scripts created by the China team, evaluate the TestON framework, and continue filling in the test plan.
 
BR/Luis

 

 
 


_______________________________________________
integration-dev mailing list
integration-dev@...
https://lists.opendaylight.org/mailman/listinfo/integration-dev



 
-- 
Best wishes!
Baohua
 


 

