keystone-classic: Field errors in s3 file upload
I have a model with the following code for uploading files to an S3 bucket:
var keystone = require('keystone')
var Types = keystone.Field.Types

var s3_storage = new keystone.Storage({
  adapter: require('keystone-storage-adapter-s3'),
  s3: {
    key: process.env.S3_KEY,
    secret: process.env.S3_SECRET,
    bucket: process.env.S3_BUCKET,
    path: '/assets',
    headers: {
      'x-amz-acl': 'public-read', // add default headers; see below for details
    },
  },
  schema: {
    bucket: true, // optional; store the bucket the file was uploaded to in your db
    etag: true,   // optional; store the etag for the resource
    path: true,   // optional; store the path of the file in your db
    url: true,    // optional; generate & store a public URL
  },
})

var Product = new keystone.List('Product')

Product.add({
  image: { type: Types.File, storage: s3_storage },
})

Product.register()
In the Admin UI I upload the image for the product, then I press save, and I get the error “Field errors”. Is something wrong with my code? I’m using the latest version of master.
I’m working on this. I think the best way is to replace knox with the official aws-sdk for uploading files; when I have it working I will open a pull request.
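For reference, this is roughly what an upload looks like with the official aws-sdk (v2) instead of knox. It's only a sketch; the object key and file path are placeholders, not anything from keystone itself.

var AWS = require('aws-sdk')
var fs = require('fs')

// Client configured from the same env vars the keystone config uses.
var s3 = new AWS.S3({
  accessKeyId: process.env.S3_KEY,
  secretAccessKey: process.env.S3_SECRET,
  region: process.env.S3_REGION, // e.g. 'eu-central-1'
})

s3.putObject({
  Bucket: process.env.S3_BUCKET,
  Key: 'assets/example.jpg',            // placeholder key
  Body: fs.readFileSync('./example.jpg'), // placeholder local file
  ACL: 'public-read',
}, function (err, data) {
  if (err) return console.error('upload failed', err)
  console.log('uploaded, etag:', data.ETag)
})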
@mxstbr yes!
I found the problem! I’m using the Frankfurt region of S3, and it seems like knox (a dependency of keystone) doesn’t support it. Here is more info: Automattic/knox#254
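If I understand that knox issue correctly, eu-central-1 only accepts AWS signature version 4, which knox never implemented. With the official aws-sdk (v2) you can pin both the region and the signature version when constructing the client; this is just a sketch of that, not keystone’s own code:

var AWS = require('aws-sdk')

// Frankfurt (eu-central-1) requires signature version 4,
// which is the part knox reportedly doesn't support.
var s3 = new AWS.S3({
  region: 'eu-central-1',
  signatureVersion: 'v4',
})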
It’s working well for me now.
Is there a PR or issue to follow up on the replacement of knox with the official aws-sdk in KeystoneJS?
I’m getting a 400 error from Amazon. My secret and key are fine, because I used Amazon’s CLI tool and it uploads files without problems.
Maybe it’s because of how I configured keystone?
I just added the variables to the .env file like this:
S3_BUCKET=bucket.tests
S3_KEY=AGFSGHSGHGHH
S3_SECRET=AJKSJHSJK/ADSDFGGFSF/DHJBHJGDJHGDf
S3_REGION=eu-central-1
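In case it helps, this is how I would expect those variables to map into the storage options. The region key is an assumption on my side; whether the adapter accepts it may depend on which version of keystone-storage-adapter-s3 you are running, so check its README.

var keystone = require('keystone')

// Sketch of wiring the .env values into the adapter config.
var s3_storage = new keystone.Storage({
  adapter: require('keystone-storage-adapter-s3'),
  s3: {
    key: process.env.S3_KEY,
    secret: process.env.S3_SECRET,
    bucket: process.env.S3_BUCKET,   // bucket.tests
    region: process.env.S3_REGION,   // eu-central-1 (assumed option name)
    path: '/assets',
  },
})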