Merge lp:~cjwatson/charms/trusty/turnip/build-label into lp:~canonical-launchpad-branches/charms/trusty/turnip/devel
Status: Merged
Merged at revision: 81
Proposed branch: lp:~cjwatson/charms/trusty/turnip/build-label
Merge into: lp:~canonical-launchpad-branches/charms/trusty/turnip/devel
Diff against target: 491 lines (+209/-72), 12 files modified:

- .bzrignore (+1/-2)
- Makefile.common (+24/-21)
- README.md (+0/-2)
- config.yaml (+28/-0)
- deploy-requirements.txt (+0/-2)
- hooks/actions.py (+149/-38)
- hooks/services.py (+2/-2)
- templates/turnip-httpserver.conf.j2 (+1/-1)
- templates/turnip-packbackendserver.conf.j2 (+1/-1)
- templates/turnip-packfrontendserver.conf.j2 (+1/-1)
- templates/turnip-sshserver.conf.j2 (+1/-1)
- templates/turnip-virtserver.conf.j2 (+1/-1)

To merge this branch: bzr merge lp:~cjwatson/charms/trusty/turnip/build-label
Related bugs: (none)
Reviewer: Kit Randel (community): Approve
Review via email: mp+275468@code.launchpad.net
Commit message
Allow updating the code payload separately from the charm using a build label.
Description of the change
Allow updating the code payload separately from the charm using a build label.
The approach used here is based heavily on the software-
The virtualenv moves inside the payload directory so that each payload gets its own. /srv/turnip/code remains as a symlink to the current payload, which is convenient and saves us having to substitute the current build label into the Upstart jobs.
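As an illustration of the layout this describes (the directory names below are invented stand-ins for /srv/turnip), each payload unpacks into its own directory and the stable `code` symlink is simply re-pointed at the current build, which is what lets the Upstart jobs avoid knowing the build label:

```python
import errno
import os
import tempfile


def symlink_force(source, link_name):
    """Create symlink link_name -> source, even if link_name exists."""
    try:
        os.unlink(link_name)
    except OSError as e:
        if e.errno != errno.ENOENT:
            raise
    os.symlink(source, link_name)


# Illustrative stand-in for /srv/turnip; 'build-1' and 'build-2' are
# made-up build labels.
base = tempfile.mkdtemp()
payloads = os.path.join(base, 'payloads')
os.makedirs(os.path.join(payloads, 'build-1'))
os.makedirs(os.path.join(payloads, 'build-2'))
code = os.path.join(base, 'code')

symlink_force(os.path.join(payloads, 'build-1'), code)
# A later rollout just re-points the symlink; jobs keep referring to
# the stable 'code' path throughout.
symlink_force(os.path.join(payloads, 'build-2'), code)
```

The `symlink_force` helper here mirrors the one added in hooks/actions.py below.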
There are no tests directly here, but it'll at least get integration testing by way of the corresponding changes to the Mojo spec.
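For context, the build label is just the bzr revision-id of the payload branch; both the Makefile and the hook code extract it from `bzr log -rlast: --show-ids` output. A minimal sketch of that parse (the sample log output is invented):

```python
def parse_build_label(log_output):
    """Return the revision-id from `bzr log -rlast: --show-ids` output."""
    prefix = 'revision-id: '
    for line in log_output.splitlines():
        if line.startswith(prefix):
            return line[len(prefix):]
    return None


# Invented sample output for illustration only.
sample = '\n'.join([
    'revno: 81',
    'revision-id: someone@example.com-20151102111619-deadbeef',
    'committer: Example Committer',
])
label = parse_build_label(sample)
```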
Kit Randel (blr) wrote:

Although flake8 is in turnip-manifest, so `make lint` will fail here afaict.
Kit Randel (blr) wrote:

> Although flake8 is in turnip-manifest, so `make lint` will fail here afaict.

Apologies, disregard that; the comment was intended for the turnipcake MP.
82. By Colin Watson

Fix over-indentation.
Preview Diff
=== modified file '.bzrignore'
--- .bzrignore	2015-04-27 01:59:44 +0000
+++ .bzrignore	2015-11-02 11:16:19 +0000
@@ -1,5 +1,4 @@
 *.pyc
 .coverage
 .venv
-files/pip-cache
-files/turnip.tar.gz
+files/*
=== modified file 'Makefile.common'
--- Makefile.common	2015-03-29 02:58:13 +0000
+++ Makefile.common	2015-11-02 11:16:19 +0000
@@ -5,13 +5,13 @@
 PWD := $(shell pwd)
 HOOKS_DIR := $(PWD)/hooks
 SOURCE_DIR ?= $(shell dirname $(PWD))/.source/$(APP_NAME)
-PIP_CACHE := $(PWD)/files/pip-cache
+FILES_DIR := $(PWD)/files
 
-ifeq ($(PIP_SOURCE_DIR),)
-PIP_CACHE_ARGS :=
-else
-PIP_CACHE_ARGS := --no-index --find-links=file://$(PIP_SOURCE_DIR)
-endif
+BUILD_LABEL = $(shell bzr log -rlast: --show-ids $(SOURCE_DIR) | sed -n 's/^revision-id: //p')
+TARBALL = $(APP_NAME).tar.gz
+ASSET = $(FILES_DIR)/$(BUILD_LABEL)/$(TARBALL)
+UNIT = $(APP_NAME)/0
+CHARM_UNIT_PATH := /var/lib/juju/agents/unit-$(APP_NAME)-0/charm
 
 all: setup lint test
 
@@ -20,9 +20,22 @@
 	@juju upgrade-charm --repository=../.. $(APP_NAME)
 
 
-deploy: tarball pip-cache
+deploy: payload
 	@echo "Deploying $(APP_NAME)..."
 	@juju deploy --repository=../.. local:trusty/$(APP_NAME)
+	@$(MAKE) rollout SKIP_BUILD=true
+
+
+# deploy a new revision/branch
+rollout: _PATH=$(CHARM_UNIT_PATH)/files/$(BUILD_LABEL)
+rollout:
+ifneq ($(SKIP_BUILD),true)
+	$(MAKE) payload
+endif
+	# manually copy our asset to be in the right place, rather than upgrade-charm
+	juju scp $(ASSET) $(UNIT):$(TARBALL)
+	juju ssh $(UNIT) 'sudo mkdir -p $(_PATH) && sudo mv $(TARBALL) $(_PATH)/'
+	juju set $(APP_NAME) build_label=$(BUILD_LABEL)
 
 
 ifeq ($(NO_FETCH_CODE),)
@@ -39,23 +52,14 @@
 endif
 
 
-pip-cache: fetch-code
-	@echo "Updating python dependency cache..."
-	@mkdir -p $(PIP_CACHE)
-	@pip install $(PIP_CACHE_ARGS) --no-use-wheel --download $(PIP_CACHE) \
-		-r $(SOURCE_DIR)/requirements.txt \
-		-r deploy-requirements.txt
-
-
 check-rev:
 ifndef REV
	$(error Revision number required to fetch source: e.g. $ REV=10 make deploy)
 endif
 
-tarball: fetch-code
-	@echo "Creating tarball for deploy..."
-	@mkdir -p files/
-	@tar czf files/$(APP_NAME).tar.gz -C $(SOURCE_DIR) .
+payload: fetch-code
+	@echo "Building asset for $(BUILD_LABEL)..."
+	@$(MAKE) -C $(SOURCE_DIR) build-tarball TARBALL_BUILDS_DIR=$(FILES_DIR)
 
 
 # The following targets are for charm maintenance.
@@ -65,7 +69,6 @@
 	@find . -depth -name '__pycache__' -exec rm -rf '{}' \;
 	@rm -f .coverage
 	@rm -rf $(SOURCE_DIR)
-	@rm -rf $(PIP_CACHE)
 	@rm -rf .venv
 
 
@@ -94,4 +97,4 @@
 	@bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py > /tmp/charm_helpers_sync.py
 
 
-.PHONY: clean lint setup tarball test upgrade
+.PHONY: clean lint setup payload test upgrade
=== modified file 'README.md'
--- README.md	2015-02-11 21:48:08 +0000
+++ README.md	2015-11-02 11:16:19 +0000
@@ -11,8 +11,6 @@
 
     $ REV=revno make deploy
 
-The deploy target will create a fresh python package cache in files/pip-cache and sets the turnip config variable 'revision' to that specified by REV.
-
 # Todo
 
 * Refactor to use charmhelpers.core.services
=== modified file 'config.yaml'
--- config.yaml	2015-05-22 15:42:11 +0000
+++ config.yaml	2015-11-02 11:16:19 +0000
@@ -3,6 +3,10 @@
     type: string
     default: 'turnip'
     description: Name of this application.
+  build_label:
+    type: string
+    default: ""
+    description: Build label to run.
   nagios_context:
     default: "juju"
     type: string
@@ -120,6 +124,30 @@
     description: |
       Hosts that should be allowed to rsync logs. Note that this relies on
       basenode.
+  swift_username:
+    type: string
+    default: ""
+    description: Username to use when accessing Swift.
+  swift_password:
+    type: string
+    default: ""
+    description: Password to use when accessing Swift.
+  swift_auth_url:
+    type: string
+    default: ""
+    description: URL for authenticating against Keystone.
+  swift_region_name:
+    type: string
+    default: ""
+    description: Swift region.
+  swift_tenant_name:
+    type: string
+    default: ""
+    description: Entity that owns resources.
+  swift_container_name:
+    type: string
+    default: ""
+    description: Container to put objects in.
 
   # apt configuration used by charmhelpers.
   install_sources:
=== removed file 'deploy-requirements.txt'
--- deploy-requirements.txt	2015-03-27 07:43:45 +0000
+++ deploy-requirements.txt	1970-01-01 00:00:00 +0000
@@ -1,2 +0,0 @@
-envdir==0.7
-gunicorn==19.3.0
=== modified file 'hooks/actions.py'
--- hooks/actions.py	2015-05-22 15:42:11 +0000
+++ hooks/actions.py	2015-11-02 11:16:19 +0000
@@ -1,4 +1,5 @@
 import base64
+import errno
 import grp
 import os
 import pwd
@@ -26,19 +27,29 @@
 
 # Globals
 CHARM_FILES_DIR = os.path.join(hookenv.charm_dir(), 'files')
+CHARM_SCRIPTS_DIR = os.path.join(hookenv.charm_dir(), 'scripts')
 REQUIRED_PACKAGES = [
-    'python-virtualenv', 'python-dev', 'python-pygit2', 'git', 'cgit',
+    'python-virtualenv',
+    'python-dev',
+    'libgit2-dev',
+    'libffi-dev',
+    'git',
+    'cgit',
     # Unfortunately we need build-essential to compile some extensions,
     # notably Twisted. Using wheels rather than a pip-cache might let us
     # avoid this in future.
     'build-essential',
+    'python-swiftclient',
 ]
 BASE_DIR = config['base_dir']
+PAYLOADS_DIR = os.path.join(BASE_DIR, 'payloads')
 CODE_DIR = os.path.join(BASE_DIR, 'code')
-VENV_DIR = os.path.join(BASE_DIR, 'venv')
+VENV_DIR = os.path.join(CODE_DIR, 'env')
+OLD_VENV_DIR = os.path.join(BASE_DIR, 'venv')
 LOGS_DIR = os.path.join(BASE_DIR, 'logs')
 DATA_DIR = os.path.join(BASE_DIR, 'data')
 KEY_DIR = os.path.join(BASE_DIR, 'keys')
+CODE_TARBALL = 'turnip.tar.gz'
 
 CODE_USER = config['code_user']
 CODE_GROUP = config['code_group']
@@ -62,7 +73,7 @@
 def make_srv_location():
     hookenv.log('Creating directories...')
 
-    for dir in (BASE_DIR, CODE_DIR):
+    for dir in (BASE_DIR, PAYLOADS_DIR):
         host.mkdir(dir, owner=CODE_USER, group=CODE_GROUP, perms=0o755)
     for dir in (LOGS_DIR, DATA_DIR, KEY_DIR):
         host.mkdir(dir, owner=USER, group=GROUP, perms=0o755)
@@ -87,23 +98,143 @@
     host.add_user_to_group(USER, CGIT_GROUP)
 
 
-def unpack_source(service_name):
-    hookenv.log('Deploying source...')
-
+def get_swift_creds(config):
+    return {
+        'user': config['swift_username'],
+        'project': config['swift_tenant_name'],
+        'password': config['swift_password'],
+        'authurl': config['swift_auth_url'],
+        'region': config['swift_region_name'],
+    }
+
+
+def swift_base_cmd(**swift_creds):
+    return [
+        'swift',
+        '--os-username=' + swift_creds['user'],
+        '--os-tenant-name=' + swift_creds['project'],
+        '--os-password=' + swift_creds['password'],
+        '--os-auth-url=' + swift_creds['authurl'],
+        '--os-region-name=' + swift_creds['region'],
+    ]
+
+
+def swift_get_etag(name, container=None, **swift_creds):
+    cmd = swift_base_cmd(**swift_creds) + ['stat', container, name]
+    file_stat = subprocess.check_output(cmd).splitlines()
+    for line in file_stat:
+        words = line.split()
+        if words[0] == 'ETag:':
+            return words[1]
+
+
+def swift_fetch(source, target, container=None, **swift_creds):
+    cmd = swift_base_cmd(**swift_creds) + [
+        'download', '--output=' + target, container, source]
+    subprocess.check_call(cmd)
+
+
+def unlink_force(path):
+    """Unlink path, without worrying about whether it exists."""
+    try:
+        os.unlink(path)
+    except OSError as e:
+        if e.errno != errno.ENOENT:
+            raise
+
+
+def symlink_force(source, link_name):
+    """Create symlink link_name -> source, even if link_name exists."""
+    unlink_force(link_name)
+    os.symlink(source, link_name)
+
+
+def install_python_packages(target_dir):
+    hookenv.log('Installing Python dependencies...')
+    subprocess.check_call(
+        ['sudo', '-u', CODE_USER, 'make', '-C', target_dir, 'build',
+         'PIP_SOURCE_DIR=%s' % os.path.join(target_dir, 'pip-cache')])
+
+
+def prune_payloads(keep):
+    for entry in os.listdir(PAYLOADS_DIR):
+        if entry in keep:
+            continue
+        entry_path = os.path.join(PAYLOADS_DIR, entry)
+        if os.path.isdir(entry_path):
+            hookenv.log('Purging old build in %s...' % entry_path)
+            shutil.rmtree(entry_path)
+
+
+def deploy_code(service_name):
     make_srv_location()
 
+    current_build_label = None
+    if os.path.islink(CODE_DIR):
+        current_build_label = os.path.basename(os.path.realpath(CODE_DIR))
+    elif os.path.isdir(os.path.join(CODE_DIR, '.bzr')):
+        log_output = subprocess.check_output(
+            ['bzr', 'log', '-rlast:', '--show-ids', CODE_DIR])
+        for line in log_output.splitlines():
+            if line.startswith('revision-id: '):
+                current_build_label = line[len('revision-id: '):]
+    desired_build_label = config['build_label']
+    if not desired_build_label:
+        if current_build_label is not None:
+            hookenv.log(
+                'No desired build label, but build %s is already deployed' %
+                current_build_label)
+            # We're probably upgrading from a charm that used old-style code
+            # assets, so make sure we at least have a virtualenv available
+            # from the current preferred location.
+            if not os.path.isdir(VENV_DIR) and os.path.isdir(OLD_VENV_DIR):
+                os.symlink(OLD_VENV_DIR, VENV_DIR)
+            return
+        else:
+            raise AssertionError('Build label unset, so cannot deploy code')
+    if current_build_label == desired_build_label:
+        hookenv.log('Build %s already deployed' % desired_build_label)
+        return
+    hookenv.log('Deploying build %s...' % desired_build_label)
+
     # Copy source archive
-    archive_path = os.path.join(BASE_DIR, 'turnip.tar.gz')
-
-    with open(os.path.join(CHARM_FILES_DIR, 'turnip.tar.gz')) as file:
-        host.write_file(archive_path, file.read(), perms=0o644)
-
-    # Unpack source
-    archive.extract_tarfile(archive_path, CODE_DIR)
-    os.chown(
-        CODE_DIR,
-        pwd.getpwnam(CODE_USER).pw_uid, grp.getgrnam(CODE_GROUP).gr_gid)
-    host.lchownr(CODE_DIR, CODE_USER, CODE_GROUP)
+    archive_path = os.path.join(PAYLOADS_DIR, desired_build_label + '.tar.gz')
+    object_name = os.path.join(desired_build_label, CODE_TARBALL)
+
+    try:
+        if config['swift_container_name']:
+            swift_creds = get_swift_creds(config)
+            swift_container = config['swift_container_name']
+            swift_fetch(
+                os.path.join('turnip-builds', object_name), archive_path,
+                container=swift_container, **swift_creds)
+        else:
+            with open(os.path.join(CHARM_FILES_DIR, object_name)) as file:
+                host.write_file(archive_path, file.read(), perms=0o644)
+
+        # Unpack source
+        target_dir = os.path.join(PAYLOADS_DIR, desired_build_label)
+        if os.path.isdir(target_dir):
+            shutil.rmtree(target_dir)
+        archive.extract_tarfile(archive_path, target_dir)
+        os.chown(
+            target_dir,
+            pwd.getpwnam(CODE_USER).pw_uid, grp.getgrnam(CODE_GROUP).gr_gid)
+        host.lchownr(target_dir, CODE_USER, CODE_GROUP)
+
+        install_python_packages(target_dir)
+
+        if not os.path.islink(CODE_DIR) and os.path.isdir(CODE_DIR):
+            old_payload_dir = os.path.join(PAYLOADS_DIR, current_build_label)
+            if os.path.exists(old_payload_dir):
+                shutil.rmtree(CODE_DIR)
+            else:
+                os.rename(CODE_DIR, old_payload_dir)
+        symlink_force(
+            os.path.relpath(target_dir, os.path.dirname(CODE_DIR)), CODE_DIR)
+        prune_payloads([desired_build_label, current_build_label])
+    finally:
+        unlink_force(archive_path)
 
 
 def install_packages(service_name):
@@ -112,26 +243,6 @@
     fetch.apt_install(REQUIRED_PACKAGES, fatal=True)
 
 
-def install_python_packages(service_name):
-    hookenv.log('Installing Python dependencies...')
-    pip_cache = os.path.join(CHARM_FILES_DIR, 'pip-cache')
-    code_reqs = os.path.join(CODE_DIR, 'requirements.txt')
-    deploy_reqs = os.path.join(hookenv.charm_dir(), 'deploy-requirements.txt')
-
-    pip_bin = os.path.join(VENV_DIR, 'bin', 'pip')
-
-    subprocess.call([
-        'sudo', '-u', CODE_USER, 'virtualenv', '--system-site-packages',
-        VENV_DIR])
-    subprocess.check_call([
-        'sudo', '-u', CODE_USER, pip_bin, 'install', '--no-index',
-        '--find-links={}'.format(pip_cache), '-r', code_reqs,
-        '-r', deploy_reqs])
-    subprocess.check_call([
-        'sudo', '-u', CODE_USER, pip_bin, 'install', '--no-deps',
-        '-e', CODE_DIR])
-
-
 def write_ssh_keys(service_name):
     if PUBLIC_KEY and PRIVATE_KEY:
         keys = [
@@ -302,7 +413,7 @@
 
 
 def install_nrpe_scripts(service_name):
-    src = os.path.join(CHARM_FILES_DIR, "nrpe")
+    src = os.path.join(CHARM_SCRIPTS_DIR, "nrpe")
     dst = "/usr/local/lib/nagios/plugins"
     if not os.path.exists(dst):
         os.makedirs(dst)
=== modified file 'hooks/services.py'
--- hooks/services.py	2015-05-22 15:42:11 +0000
+++ hooks/services.py	2015-11-02 11:16:19 +0000
@@ -34,8 +34,8 @@
     actions.execd_preinstall('turnip')
     actions.install_packages('turnip')
    actions.create_users('turnip')
-    actions.unpack_source('turnip')
-    actions.install_python_packages('turnip')
+    if hookenv.hook_name() in ('install', 'upgrade-charm', 'config-changed'):
+        actions.deploy_code('turnip')
 
 extra_requirements = []
=== renamed directory 'files/nrpe' => 'scripts/nrpe'
=== modified file 'templates/turnip-httpserver.conf.j2'
--- templates/turnip-httpserver.conf.j2	2015-04-26 22:33:55 +0000
+++ templates/turnip-httpserver.conf.j2	2015-11-02 11:16:19 +0000
@@ -4,7 +4,7 @@
 setuid {{ user }}
 setgid {{ group }}
 
-env PYTHON_HOME={{ base_dir }}/venv
+env PYTHON_HOME={{ base_dir }}/code/env
 
 start on runlevel [2345]
 stop on runlevel [016]

=== modified file 'templates/turnip-packbackendserver.conf.j2'
--- templates/turnip-packbackendserver.conf.j2	2015-04-26 22:33:55 +0000
+++ templates/turnip-packbackendserver.conf.j2	2015-11-02 11:16:19 +0000
@@ -4,7 +4,7 @@
 setuid {{ user }}
 setgid {{ group }}
 
-env PYTHON_HOME={{ base_dir }}/venv
+env PYTHON_HOME={{ base_dir }}/code/env
 
 start on runlevel [2345]
 stop on runlevel [016]

=== modified file 'templates/turnip-packfrontendserver.conf.j2'
--- templates/turnip-packfrontendserver.conf.j2	2015-04-26 22:33:55 +0000
+++ templates/turnip-packfrontendserver.conf.j2	2015-11-02 11:16:19 +0000
@@ -4,7 +4,7 @@
 setuid {{ user }}
 setgid {{ group }}
 
-env PYTHON_HOME={{ base_dir }}/venv
+env PYTHON_HOME={{ base_dir }}/code/env
 
 start on runlevel [2345]
 stop on runlevel [016]

=== modified file 'templates/turnip-sshserver.conf.j2'
--- templates/turnip-sshserver.conf.j2	2015-04-26 22:33:55 +0000
+++ templates/turnip-sshserver.conf.j2	2015-11-02 11:16:19 +0000
@@ -4,7 +4,7 @@
 setuid {{ user }}
 setgid {{ group }}
 
-env PYTHON_HOME={{ base_dir }}/venv
+env PYTHON_HOME={{ base_dir }}/code/env
 
 start on runlevel [2345]
 stop on runlevel [016]

=== modified file 'templates/turnip-virtserver.conf.j2'
--- templates/turnip-virtserver.conf.j2	2015-04-26 22:33:55 +0000
+++ templates/turnip-virtserver.conf.j2	2015-11-02 11:16:19 +0000
@@ -4,7 +4,7 @@
 setuid {{ user }}
 setgid {{ group }}
 
-env PYTHON_HOME={{ base_dir }}/venv
+env PYTHON_HOME={{ base_dir }}/code/env
 
 start on runlevel [2345]
 stop on runlevel [016]
Just a minor formatting issue.
I'm working through updating Rutabaga's charm similarly, as a somewhat circumlocutory form of review. It's no doubt fine, given that the Mojo spec ran, but I thought it might be worth leaving this open in case I come across anything in the process.
Given that our services will be duplicating a reasonable amount of charm code, I did wonder whether we should consider maintaining our own charmhelpers-like library.