
[openstack-dev] 9/4 state of the gate


There are a few things blowing up in the last 24 hours so might as well
make people aware.

  1. gate-tempest-dsvm-large-ops was failing at a decent rate:

https://bugs.launchpad.net/nova/+bug/1491949

Turns out devstack was changed to run multihost=true and that doesn't
work so well with the large-ops job that's creating hundreds of fake
instances on a single node. We reverted the devstack change so things
should be good there now.

  2. gate-tempest-dsvm-cells regressed because nova has an in-tree
    blacklist regex of tests that don't work with cells, and renaming some
    of those tests in tempest broke the regex.

https://bugs.launchpad.net/nova/+bug/1492255

There is a patch in the gate but it's getting bounced on #3. Long-term
we want to bring that blacklist regex down to 0 and instead use feature
toggles in Tempest for the cells job, we just aren't there yet. Help
wanted...
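To see how a rename breaks this kind of exclusion, here is a toy sketch of an in-tree blacklist regex; the test names and the regex itself are hypothetical illustrations, not nova's actual blacklist:

```python
import re

# Hypothetical blacklist regex matching a test by its (old) name.
CELLS_BLACKLIST = re.compile(
    r"tempest\.api\.compute\.servers\.test_attach_interfaces"
)

def is_blacklisted(test_name):
    """Return True if the cells job should skip this test."""
    return bool(CELLS_BLACKLIST.search(test_name))

# The regex matches the old name, so the test is skipped:
assert is_blacklisted(
    "tempest.api.compute.servers.test_attach_interfaces.AttachTest.test_create")
# After a rename in tempest, the regex silently stops matching, so the
# test runs (and fails) in the cells job:
assert not is_blacklisted(
    "tempest.api.compute.servers.test_interfaces.AttachTest.test_create")
```

This is why feature toggles are the long-term plan: a toggle fails loudly when it goes stale, while a name-based regex just stops matching.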

  3. gate-tempest-dsvm-full-ceph is broken with glance-store 0.9.0:

https://bugs.launchpad.net/glance-store/+bug/1492432

It looks like the gate-tempest-dsvm-full-ceph-src-glancestore job was
not actually testing trunk glance-store code because of a problem in the
upper-constraints.txt file in the requirements repo: pip was capping
glance_store at 0.8.0 in the src job, so we haven't actually been
testing the latest glance-store. dhellmann posted a fix:

https://review.openstack.org/#/c/220648/

But I'm assuming glance-store 0.9.0 is still busted. I've posted a
change which I think might be related:

https://review.openstack.org/#/c/220646/

If ^ fixes the issue we'll need to blacklist 0.9.0 from global-requirements.
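As a minimal sketch of why a stale pin defeats a "src" job: when a package appears in the constraints file, pip installs the pinned version regardless of what the job intended to test. The function and the version strings below are illustrative, not pip's actual resolver:

```python
# Sketch: a constraints pin wins over the version the job requested,
# so a "src" job quietly ends up testing the old release.

def effective_version(requested, constraints):
    """Return the version that would actually be installed."""
    # If the package is pinned in the constraints mapping, the pin wins;
    # otherwise the requested version is used.
    return constraints.get(requested["name"], requested["version"])

upper_constraints = {"glance-store": "0.8.0"}  # the stale pin
src_job_request = {"name": "glance-store", "version": "0.9.0.dev1"}  # trunk

# The src job thinks it is testing trunk, but gets 0.8.0:
assert effective_version(src_job_request, upper_constraints) == "0.8.0"
# An unpinned package installs as requested:
assert effective_version({"name": "foo", "version": "1.0"}, {}) == "1.0"
```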

--

As always, it's fun to hit this stuff right before the weekend,
especially a long US holiday weekend. :)

--

Thanks,

Matt Riedemann


OpenStack Development Mailing List (not for usage questions)
Unsubscribe: OpenStack-dev-request@lists.openstack.org?subject:unsubscribe
http://lists.openstack.org/cgi-bin/mailman/listinfo/openstack-dev
asked Sep 4, 2015 in openstack-dev by Matt_Riedemann

3 Responses


On 9/4/2015 3:13 PM, Matt Riedemann wrote:
There are a few things blowing up in the last 24 hours so might as well
make people aware.

  1. gate-tempest-dsvm-large-ops was failing at a decent rate:

https://bugs.launchpad.net/nova/+bug/1491949

Turns out devstack was changed to run multihost=true and that doesn't
work so well with the large-ops job that's creating hundreds of fake
instances on a single node. We reverted the devstack change so things
should be good there now.

  2. gate-tempest-dsvm-cells regressed because nova has an in-tree
    blacklist regex of tests that don't work with cells, and renaming some
    of those tests in tempest broke the regex.

https://bugs.launchpad.net/nova/+bug/1492255

There is a patch in the gate but it's getting bounced on #3. Long-term
we want to bring that blacklist regex down to 0 and instead use feature
toggles in Tempest for the cells job, we just aren't there yet. Help
wanted...

  3. gate-tempest-dsvm-full-ceph is broken with glance-store 0.9.0:

https://bugs.launchpad.net/glance-store/+bug/1492432

It looks like the gate-tempest-dsvm-full-ceph-src-glancestore job was
not actually testing trunk glance-store code because of a problem in the
upper-constraints.txt file in the requirements repo: pip was capping
glance_store at 0.8.0 in the src job, so we haven't actually been
testing the latest glance-store. dhellmann posted a fix:

https://review.openstack.org/#/c/220648/

But I'm assuming glance-store 0.9.0 is still busted. I've posted a
change which I think might be related:

https://review.openstack.org/#/c/220646/

If ^ fixes the issue we'll need to blacklist 0.9.0 from
global-requirements.

--

As always, it's fun to hit this stuff right before the weekend,
especially a long US holiday weekend. :)

I haven't seen the elastic-recheck bot comment on any changes in a while
either, so I'm wondering if it's not running.

Also, here is another new(ish) gate bug I'm just seeing tonight (bumped
a fix for #3 above):

https://bugs.launchpad.net/keystonemiddleware/+bug/1492508

--

Thanks,

Matt Riedemann


responded Sep 5, 2015 by Matt_Riedemann

On Fri, Sep 4, 2015 at 6:43 PM, Matt Riedemann <mriedem@linux.vnet.ibm.com>
wrote:

On 9/4/2015 3:13 PM, Matt Riedemann wrote:

There are a few things blowing up in the last 24 hours so might as well
make people aware.

  1. gate-tempest-dsvm-large-ops was failing at a decent rate:

https://bugs.launchpad.net/nova/+bug/1491949

Turns out devstack was changed to run multihost=true and that doesn't
work so well with the large-ops job that's creating hundreds of fake
instances on a single node. We reverted the devstack change so things
should be good there now.

  2. gate-tempest-dsvm-cells regressed because nova has an in-tree
    blacklist regex of tests that don't work with cells, and renaming some
    of those tests in tempest broke the regex.

https://bugs.launchpad.net/nova/+bug/1492255

There is a patch in the gate but it's getting bounced on #3. Long-term
we want to bring that blacklist regex down to 0 and instead use feature
toggles in Tempest for the cells job, we just aren't there yet. Help
wanted...

  3. gate-tempest-dsvm-full-ceph is broken with glance-store 0.9.0:

https://bugs.launchpad.net/glance-store/+bug/1492432

It looks like the gate-tempest-dsvm-full-ceph-src-glancestore job was
not actually testing trunk glance-store code because of a problem in the
upper-constraints.txt file in the requirements repo: pip was capping
glance_store at 0.8.0 in the src job, so we haven't actually been
testing the latest glance-store. dhellmann posted a fix:

https://review.openstack.org/#/c/220648/

But I'm assuming glance-store 0.9.0 is still busted. I've posted a
change which I think might be related:

https://review.openstack.org/#/c/220646/

If ^ fixes the issue we'll need to blacklist 0.9.0 from
global-requirements.

--

As always, it's fun to hit this stuff right before the weekend,
especially a long US holiday weekend. :)

I haven't seen the elastic-recheck bot comment on any changes in a while
either, so I'm wondering if it's not running.

Looks like there was a suspicious 4-day gap in elastic-recheck, but it
appears to be running again?

$ ./lastcomment.py
Checking name: Elastic Recheck
[0] 2015-09-06 01:12:40 (0:35:54 old) https://review.openstack.org/220386
'Reject the cell name include '!', '.' and '@' for Nova API'
[1] 2015-09-02 00:54:54 (4 days, 0:53:40 old)
https://review.openstack.org/218781 'Remove the unnecassary
volume_api.get(context, volume_id)'
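For reference, an "N days, H:MM:SS old" age like the ones above is just the difference between "now" and the comment's timestamp; a minimal sketch of that arithmetic (the timestamps are taken from the output above, the function name is illustrative):

```python
from datetime import datetime

def comment_age(now, last_comment):
    """Age of the most recent bot comment, as a timedelta."""
    return now - last_comment

# Timestamps from the lastcomment.py output above; "now" is inferred
# from entry [0] being 0:35:54 old at 2015-09-06 01:12:40.
now = datetime(2015, 9, 6, 1, 48, 34)
last = datetime(2015, 9, 2, 0, 54, 54)

print(comment_age(now, last))  # 4 days, 0:53:40
```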

Also, here is another new(ish) gate bug I'm just seeing tonight (bumped a
fix for #3 above):

https://bugs.launchpad.net/keystonemiddleware/+bug/1492508

--

Thanks,

Matt Riedemann


responded Sep 6, 2015 by Joe_Gordon

On 09/05/2015 09:50 PM, Joe Gordon wrote:

On Fri, Sep 4, 2015 at 6:43 PM, Matt Riedemann
<mriedem@linux.vnet.ibm.com> wrote:

I haven't seen the elastic-recheck bot comment on any changes in
awhile either so I'm wondering if that's not running.

Looks like there was a suspicious 4-day gap in elastic-recheck, but it
appears to be running again?

$ ./lastcomment.py
Checking name: Elastic Recheck
[0] 2015-09-06 01:12:40 (0:35:54 old)
https://review.openstack.org/220386 'Reject the cell name include '!',
'.' and '@' for Nova API'
[1] 2015-09-02 00:54:54 (4 days, 0:53:40 old)
https://review.openstack.org/218781 'Remove the unnecassary
volume_api.get(context, volume_id)'

Remember, there is a 15-minute report contract on the bot: if we're more
than 15 minutes late, we assume enough of the environment is backed up
that there is no point in commenting. We had some pretty substantial
backups in Elasticsearch recently.
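That contract can be sketched in a few lines; the threshold value matches the 15 minutes described above, but the function and constant names are illustrative, not elastic-recheck's actual code:

```python
from datetime import datetime, timedelta

# If the bot would be more than 15 minutes late, assume the pipeline is
# backed up and skip the comment rather than post stale results.
REPORT_DEADLINE = timedelta(minutes=15)

def should_report(event_time, now):
    """Return True if the bot is still within its reporting window."""
    return (now - event_time) <= REPORT_DEADLINE

t0 = datetime(2015, 9, 6, 1, 0, 0)
assert should_report(t0, t0 + timedelta(minutes=10))      # on time: comment
assert not should_report(t0, t0 + timedelta(minutes=20))  # too late: skip
```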

-Sean

--
Sean Dague
http://dague.net


responded Sep 8, 2015 by Sean_Dague