Cinder issues with Openstack EPEL

Pádraig Brady P at draigBrady.com
Mon Nov 19 15:13:32 UTC 2012


On 11/17/2012 07:18 PM, Nux! wrote:
> Hello chaps,
>
> I'm having some problems with Openstack Cinder (who doesn't :> )
>
> Some info:
> OS: CentOS 6.3 + EPEL + EPEL-testing, default firewall on, selinux on
> I followed these instructions http://fedoraproject.org/wiki/Getting_started_with_OpenStack_EPEL skipping "Setup volume storage" because I have a physical LVM partition called "cinder-volumes" and also skipping "Nova Network Setup" because I want Quantum.
>
> As I said I do have a VG called cinder-volumes, and I have modified tgtd's config to add the line "include /etc/cinder/volumes/*"
> When I try to create a volume I see this in cinder/api.log:
>
> 2012-11-17 19:11:50 AUDIT cinder.api.openstack.volume.volumes [req-22f1bfdc-67fd-4464-9a4b-1701e3547838 bb48f2ad7a45420ea80f8bdda97e0d54 cc3005e2733e423bbe1bf24931447294] Create volume of 1 GB
> 2012-11-17 19:11:50 AUDIT cinder.api.openstack.volume.volumes [req-22f1bfdc-67fd-4464-9a4b-1701e3547838 bb48f2ad7a45420ea80f8bdda97e0d54 cc3005e2733e423bbe1bf24931447294] vol={'availability_zone': 'nova', 'terminated_at': None, 'updated_at': None, 'provider_auth': None, 'snapshot_id': None, 'ec2_id': None, 'mountpoint': None, 'deleted_at': None, 'id': '0a29a3db-0b25-41bb-81c3-4e3d277e5a4a', 'size': 1, 'user_id': u'bb48f2ad7a45420ea80f8bdda97e0d54', 'attach_time': None, 'display_description': u'', 'project_id': u'cc3005e2733e423bbe1bf24931447294', 'launched_at': None, 'scheduled_at': None, 'status': 'creating', 'volume_type_id': None, 'deleted': False, 'provider_location': None, 'host': None, 'display_name': u'1', 'instance_uuid': None, 'created_at': datetime.datetime(2012, 11, 17, 19, 11, 50, 226521), 'attach_status': 'detached'}
> 2012-11-17 19:11:50 INFO cinder.api.openstack.wsgi [req-22f1bfdc-67fd-4464-9a4b-1701e3547838 bb48f2ad7a45420ea80f8bdda97e0d54 cc3005e2733e423bbe1bf24931447294] http://127.0.0.1:8776/v1/cc3005e2733e423bbe1bf24931447294/volumes returned with HTTP 200
> 2012-11-17 19:11:50 INFO cinder.api.openstack.wsgi [req-071e26ef-f710-4548-bf7a-da0518113dd6 bb48f2ad7a45420ea80f8bdda97e0d54 cc3005e2733e423bbe1bf24931447294] GET http://127.0.0.1:8776/v1/cc3005e2733e423bbe1bf24931447294/volumes/detail
> 2012-11-17 19:11:50 AUDIT cinder.api.openstack.volume.volumes [req-071e26ef-f710-4548-bf7a-da0518113dd6 bb48f2ad7a45420ea80f8bdda97e0d54 cc3005e2733e423bbe1bf24931447294] vol=<cinder.db.sqlalchemy.models.Volume object at 0x427bad0>
> 2012-11-17 19:11:50 INFO cinder.api.openstack.wsgi [req-071e26ef-f710-4548-bf7a-da0518113dd6 bb48f2ad7a45420ea80f8bdda97e0d54 cc3005e2733e423bbe1bf24931447294] http://127.0.0.1:8776/v1/cc3005e2733e423bbe1bf24931447294/volumes/detail returned with HTTP 200
>
> This in cinder/scheduler.log:
>
> 2012-11-17 19:14:48 3689 ERROR cinder.openstack.common.rpc.amqp [-] Exception during message handling
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp Traceback (most recent call last):
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp   File "/usr/lib/python2.6/site-packages/cinder/openstack/common/rpc/amqp.py", line 276, in _process_data
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp rval = self.proxy.dispatch(ctxt, version, method, **args)
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp   File "/usr/lib/python2.6/site-packages/cinder/openstack/common/rpc/dispatcher.py", line 145, in dispatch
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp return getattr(proxyobj, method)(ctxt, **kwargs)
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp   File "/usr/lib/python2.6/site-packages/cinder/scheduler/manager.py", line 98, in _schedule
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp db.volume_update(context, volume_id, {'status': 'error'})
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp   File "/usr/lib64/python2.6/contextlib.py", line 23, in __exit__
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp self.gen.next()
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp   File "/usr/lib/python2.6/site-packages/cinder/scheduler/manager.py", line 94, in _schedule
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp return driver_method(*args, **kwargs)
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp   File "/usr/lib/python2.6/site-packages/cinder/scheduler/simple.py", line 78, in schedule_create_volume
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp raise exception.NoValidHost(reason=msg)
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp NoValidHost: No valid host was found. Is the appropriate service running?
> 2012-11-17 19:14:48 3689 TRACE cinder.openstack.common.rpc.amqp
>
> And this in httpd/error_log:
>
> [Sat Nov 17 19:15:36 2012] [error] unable to retrieve service catalog with token
> [Sat Nov 17 19:15:36 2012] [error] Traceback (most recent call last):
> [Sat Nov 17 19:15:36 2012] [error]   File "/usr/lib/python2.6/site-packages/keystoneclient/v2_0/client.py", line 135, in _extract_service_catalog
> [Sat Nov 17 19:15:36 2012] [error]     endpoint_type='adminURL')
> [Sat Nov 17 19:15:36 2012] [error]   File "/usr/lib/python2.6/site-packages/keystoneclient/service_catalog.py", line 73, in url_for
> [Sat Nov 17 19:15:36 2012] [error]     raise exceptions.EndpointNotFound('Endpoint not found.')
> [Sat Nov 17 19:15:36 2012] [error] EndpointNotFound: Endpoint not found.
>
> Tried all of this with "setenforce 0", but to no avail.
>
> Suggestions?

So following the above link, I assume you've run openstack-demo-install,
which sets up cinder with keystone as follows:
https://github.com/fedora-openstack/openstack-utils/blob/a978b04/utils/openstack-demo-install#L202

Is the output of `keystone endpoint-list` as expected?
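If the volume endpoint turns out to be missing (which would explain the
EndpointNotFound traceback), it can be registered by hand with the keystone
CLI. A minimal sketch, assuming the v2.0 CLI, the default cinder port 8776,
and region RegionOne; the service id is a placeholder you substitute from
the output of the first command:

```shell
# Register the volume service (skip if `keystone service-list` already shows it)
keystone service-create --name=cinder --type=volume \
    --description="Cinder Volume Service"

# Create the endpoint, substituting the id printed by service-create above
keystone endpoint-create \
    --region RegionOne \
    --service-id=<cinder_service_id> \
    --publicurl   'http://127.0.0.1:8776/v1/%(tenant_id)s' \
    --adminurl    'http://127.0.0.1:8776/v1/%(tenant_id)s' \
    --internalurl 'http://127.0.0.1:8776/v1/%(tenant_id)s' \
```

These commands only make sense against a live keystone with admin
credentials exported, so treat them as a reference fragment rather than
something to paste blindly.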

Also probably not related to your issue above,
but in case you missed it, please edit targets.conf as per:
https://fedoraproject.org/wiki/QA:Testcase_Create_Cinder_Volumes#Integration_with_tgtd
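For reference, the change that test case asks for is a one-line include in
tgtd's configuration, followed by a restart so tgtd rereads it. A sketch,
assuming the stock EPEL file layout (run as root):

```shell
# Make tgtd pick up the per-volume target files that cinder writes
echo 'include /etc/cinder/volumes/*' >> /etc/tgt/targets.conf

# Restart so tgtd rereads its configuration
service tgtd restart
```

Since you mention you've already added the include line, only the restart
should matter in your case.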

thanks,
Pádraig.
