I'm trying to use django-pipeline 1.1.27 with s3boto to compress and filter static files and then upload them to an S3 bucket. If I just use:
PIPELINE_STORAGE = 'pipeline.storage.PipelineFinderStorage'
Then it works, and I get a static folder containing the nice versioned file that I configured. But as soon as I switch to:
PIPELINE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
I get:
Traceback (most recent call last):
  File "manage.py", line 15, in <module>
    execute_manager(settings)
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/management/__init__.py", line 438, in execute_manager
    utility.execute()
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/management/__init__.py", line 379, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/management/base.py", line 191, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/management/base.py", line 220, in execute
    output = self.handle(*args, **options)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/management/commands/synccompress.py", line 39, in handle
    packager.pack_stylesheets(package, sync=sync, force=force)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/packager.py", line 52, in pack_stylesheets
    **kwargs)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/packager.py", line 60, in pack
    package['output'], package['paths'])
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/__init__.py", line 45, in need_update
    version = self.version(paths)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/__init__.py", line 20, in version
    return getattr(self.versioner, 'version')(paths)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/hash/__init__.py", line 37, in version
    buf = self.concatenate(paths)
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/hash/__init__.py", line 27, in concatenate
    return '\n'.join([self.read_file(path) for path in paths])
  File "/my/virtual/env/lib/python2.7/site-packages/pipeline/versioning/hash/__init__.py", line 31, in read_file
    file = storage.open(path, 'rb')
  File "/my/virtual/env/lib/python2.7/site-packages/django/core/files/storage.py", line 33, in open
    file = self._open(name, mode)
  File "/my/virtual/env/lib/python2.7/site-packages/storages/backends/s3boto.py", line 177, in _open
    raise IOError('File does not exist: %s' % name)
IOError: File does not exist: css/style.css
which is one of my source files. So why does pipeline no longer want to do the filter/concatenate/compress steps when I switch to s3boto storage?
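From the traceback, it looks like the SHA1 versioner reads each source file through PIPELINE_STORAGE (the storage.open call in read_file), so with S3BotoStorage it goes looking for css/style.css in the bucket before anything has been uploaded. If that's really what's happening, the only workaround I can think of is a hybrid storage that reads sources locally but still writes to S3, something like this completely untested sketch (the class and its name are my own invention, not part of either library):
# custom_storages.py (untested sketch, not part of django-pipeline or
# django-storages): reads fall back to the local finder storage when a
# file isn't in the bucket yet, so the versioner can hash the sources.
from pipeline.storage import PipelineFinderStorage
from storages.backends.s3boto import S3BotoStorage

class LocalReadS3WriteStorage(S3BotoStorage):
    def __init__(self, *args, **kwargs):
        super(LocalReadS3WriteStorage, self).__init__(*args, **kwargs)
        self.local = PipelineFinderStorage()

    def _open(self, name, mode='rb'):
        # Read from the bucket if the file is already there; otherwise
        # fall back to the local file found by the staticfiles finders.
        if self.exists(name):
            return super(LocalReadS3WriteStorage, self)._open(name, mode)
        return self.local.open(name, mode)
Pointing PIPELINE_STORAGE at that (e.g. 'myproject.custom_storages.LocalReadS3WriteStorage', a made-up path) might work, but it feels like fighting the library, so I assume I'm missing something.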
It may be that I'm doing something wrong. Here is the rest of my config in case it helps:
INSTALLED_APPS = (
    ...
    'pipeline',
    'storages',
)
STATICFILES_FINDERS = (
    'pipeline.finders.PipelineFinder',
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
)
STATIC_ROOT = "/some/path/outside/django_project/deploy_static"
STATICFILES_DIRS = () # All statics in this site are in apps
STATICFILES_STORAGE = 'pipeline.storage.PipelineStorage'
PIPELINE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
PIPELINE = True
PIPELINE_AUTO = True
PIPELINE_VERSION = True
PIPELINE_VERSION_PLACEHOLDER = 'VERSION'
PIPELINE_VERSIONING = 'pipeline.versioning.hash.SHA1Versioning'
PIPELINE_CSS = {
    'standard': {
        'source_filenames': (
            'css/style.css',
            ...
        ),
        'output_filename': 'css/all-VERSION.css',
        'extra_context': {
            'media': 'screen,projection',
        },
    }
}
My site is on Django 1.3.1.
The command I'm running is:
python manage.py synccompress --force
The AWS creds are also in settings, but that's moot because it's not even getting to that point.
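For reference, they are declared with the standard django-storages settings, roughly like this (values redacted for the post):
# Standard django-storages s3boto settings; real values redacted.
AWS_ACCESS_KEY_ID = '...'
AWS_SECRET_ACCESS_KEY = '...'
AWS_STORAGE_BUCKET_NAME = 'my-bucket-name'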
UPDATE: Added the full stack trace and the settings requested in the comments.
UPDATE: At the request of the library author, I tried upgrading to the latest beta. Observations so far:
- I don't know how to get versioned compressed files now
- collectstatic leaves me with the compressed files and the originals
- Still getting the same error from django-pipeline when the boto storage is configured: it expects my source files to already be on S3, and I can't even see where it's staging my assets. Nothing gets placed in STATIC_ROOT. (My guess at what the new-style configuration might look like is sketched below.)
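If the new approach really routes everything through staticfiles, then I'd guess the S3 case wants a single storage class that mixes pipeline's post-processing into S3BotoStorage. Purely as a sketch of what I mean, and assuming the beta exposes something like a PipelineMixin in pipeline.storage (I haven't verified that, and the class name and module path below are made up):
# custom_storages.py (guesswork, not working code)
# Assumes the beta provides a mixin that adds pipeline's
# post-processing to any storage backend.
from pipeline.storage import PipelineMixin
from storages.backends.s3boto import S3BotoStorage

class S3PipelineStorage(PipelineMixin, S3BotoStorage):
    pass

# settings.py
# STATICFILES_STORAGE = 'myproject.custom_storages.S3PipelineStorage'
Is that the direction the new version intends, or am I misreading it?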
UPDATE: I've created the simplest project that works with the finder storage and then breaks with S3BotoStorage. I've pushed it to GitHub, along with a capture of the stack trace:
https://github.com/estebistec/simple_pipeline
https://raw.github.com/estebistec/simple_pipeline/master/STACKTRACE
I would be ecstatic to be told that I'm doing something really dumb and this should all just work.