Re: [release] [OpenDaylight][TSC] Fluorine SR3 status - candidate sign off


Luis Gomez
 

On May 30, 2019, at 2:26 PM, Robert Varga <nite@...> wrote:

On 30/05/2019 20:58, Luis Gomez wrote:

2) Some perf regression in controller:

https://jenkins.opendaylight.org/releng/job/controller-csit-3node-clustering-ask-all-neon/247/robot/controller-clustering-ask.txt/Chasing%20The%20Leader/Unregister_Candidates_And_Validate_Criteria/
#244
27-May-2019 09:46

Hmm... what is the usual achieved rate?
Around 50, and now we get less than 5, so we are talking about an order-of-magnitude perf regression.

The test itself does:

1) Start a singleton registration flap on every controller instance with this RPC: /restconf/operations/odl-mdsal-lowlevel-control:register-flapping-singleton

2) Maintain the flap for 60 secs.

3) Stop the flap on every controller instance: /restconf/operations/odl-mdsal-lowlevel-control:unregister-flapping-singleton

4) Get flap count from above RPC response: <output xmlns="tag:opendaylight.org,2017:controller:yang:lowlevel:control"><flap-count>83</flap-count></output>

5) Add all the flaps for the 3 controller instances and divide the total by 60 secs.
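The arithmetic in steps 4 and 5 can be sketched as below. This is only an illustration, not part of the CSIT suite: the helper names are mine, and the second flap count in the example is made up; only the `<output>` payload with flap-count 83 comes from the thread.

```python
import xml.etree.ElementTree as ET

# Namespace used in the RPC reply quoted in the thread.
NS = "tag:opendaylight.org,2017:controller:yang:lowlevel:control"

def parse_flap_count(xml_text):
    """Extract <flap-count> from an unregister-flapping-singleton RPC reply."""
    root = ET.fromstring(xml_text)
    return int(root.findtext("{%s}flap-count" % NS))

def flap_rate(flap_counts, duration_secs=60):
    """Step 5: sum the per-member counts and normalize to flaps/sec."""
    return sum(flap_counts) / duration_secs

# Reply from one member, as quoted in the thread:
reply = ('<output xmlns="tag:opendaylight.org,2017:controller:yang:lowlevel:control">'
         '<flap-count>83</flap-count></output>')

count = parse_flap_count(reply)          # 83
rate = flap_rate([count, 90, 100], 60)   # hypothetical counts for the other two members
```

With three members totalling ~3000 flaps over 60 seconds this yields the ~50 flaps/sec baseline; the regressed runs come out below 5.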

So originally the controller handled ~50 flaps/sec; after the regression it is less than 5 flaps/sec.

I hope this helps.



Jam0, we've been messing with those numbers last, but I do not remember
the specifics...

https://jenkins.opendaylight.org/releng/job/controller-csit-3node-clustering-tell-all-fluorine/220/robot/controller-clustering-tell.txt/Chasing%20The%20Leader/Unregister_Candidates_And_Validate_Criteria/

I am not sure how critical this perf test is.
Well, regressions need to be investigated. On the other hand, we have
seen problems with Nexus, so this could be (I am not saying it is) an
env issue.

Let's see what the usual rates were, but I think these should actually
improve in this release :)

Also, let's make this a blocker until we understand more. It is the last
SR after all...

Regards,
Robert
