

@keystonejs/file-adapters

keystonejs · 910 · MIT · 7.1.2 · TypeScript support: definitely-typed

Adapters for handling storage of the File type

readme

File adapters

This is the last active development release of this package as Keystone 5 is now in a 6 to 12 month active maintenance phase. For more information please read our Keystone 5 and beyond post.

The File field type can support files hosted in a range of different contexts, e.g. in the local filesystem or on a cloud-based file server.

Different contexts are supported by different file adapters. This package contains the built-in file adapters supported by KeystoneJS.

LocalFileAdapter

Usage

const { LocalFileAdapter } = require('@keystonejs/file-adapters');

const fileAdapter = new LocalFileAdapter({...});

Config

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| `src` | `String` | Required | The path where uploaded files will be stored on the server. |
| `path` | `String` | Value of `src` | The path from which requests for files will be served by the server. |
| `getFilename` | `Function` | `null` | Function taking a `{ id, originalFilename }` parameter. Should return a string with the name for the uploaded file on disk. |

Note: src and path may be the same. Note also that you may need to run a static file server to host the uploaded files.

Methods

delete

Takes a file object (such as the one returned in file field hooks) and deletes that file from disk.

const { File } = require('@keystonejs/fields');

const fileAdapter = new LocalFileAdapter({
  src: './files',
  path: '/files',
});

keystone.createList('UploadTest', {
  fields: {
    file: {
      type: File,
      adapter: fileAdapter,
      hooks: {
        beforeChange: async ({ existingItem }) => {
          if (existingItem && existingItem.file) {
            await fileAdapter.delete(existingItem.file);
          }
        },
      },
    },
  },
  hooks: {
    afterDelete: async ({ existingItem }) => {
      if (existingItem.file) {
        await fileAdapter.delete(existingItem.file);
      }
    },
  },
});

GraphQL Usage

You can upload files directly through the GraphQL API. For example, with the above list you can do the following:

// Query
mutation uploadImageQuery ($file: Upload){
  createUploadTest(data: {
    file: $file,
  }) {
    id
    file {
      publicUrl
    }
  }
}

// Variables
variables: {
  file: // File path
},

Note that you'll need support in your GraphQL client to make this work. Two popular options are apollo-upload-client and urql.

If you're not familiar with file uploads in GraphQL, check out Altair Playground and follow the file upload docs to try it out.
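Under the hood, these clients implement the GraphQL multipart request specification: the HTTP request is a multipart form whose operations field carries the mutation with the file variable set to null, and whose map field ties each uploaded file part to a variable path. A minimal sketch of those two JSON payloads (the mutation is the one from the example above; the field names come from the multipart spec, not from Keystone itself):

```javascript
// The JSON fields of a GraphQL multipart request, as built by
// clients such as apollo-upload-client.
const operations = {
  query: `mutation uploadImageQuery($file: Upload) {
    createUploadTest(data: { file: $file }) {
      id
      file { publicUrl }
    }
  }`,
  variables: { file: null }, // placeholder; filled in via "map"
};

// Maps the multipart form part named "0" onto variables.file.
const map = { 0: ['variables.file'] };
```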

CloudinaryFileAdapter

Usage

const { CloudinaryAdapter } = require('@keystonejs/file-adapters');

const fileAdapter = new CloudinaryAdapter({...});

Config

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| `cloudName` | `String` | Required | Your Cloudinary cloud name. |
| `apiKey` | `String` | Required | Cloudinary API key. |
| `apiSecret` | `String` | Required | Cloudinary API secret. |
| `folder` | `String` | `undefined` | Folder within Cloudinary to upload files into. |

Methods

delete

Deletes the provided file from Cloudinary. Takes a file object (such as the one returned in file field hooks) and an optional options argument passed to the Cloudinary destroy method. For available options refer to the Cloudinary destroy API.
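For example (a sketch; invalidate is one of Cloudinary's documented destroy options, asking the CDN to drop cached copies of the deleted asset):

```javascript
// Options forwarded to Cloudinary's destroy method.
const destroyOptions = { invalidate: true };

// e.g. in an afterDelete hook:
// await fileAdapter.delete(existingItem.file, destroyOptions);
```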

S3FileAdapter

Usage

const { S3Adapter } = require('@keystonejs/file-adapters');

const CF_DISTRIBUTION_ID = 'cloudfront-distribution-id';
const S3_PATH = 'uploads';

const fileAdapter = new S3Adapter({
  bucket: 'bucket-name',
  folder: S3_PATH,
  publicUrl: ({ id, filename, _meta }) =>
    `https://${CF_DISTRIBUTION_ID}.cloudfront.net/${S3_PATH}/${filename}`,
  s3Options: {
    // Optional parameters to be supplied directly to the AWS.S3 constructor
    apiVersion: '2006-03-01',
    accessKeyId: 'ACCESS_KEY_ID',
    secretAccessKey: 'SECRET_ACCESS_KEY',
    region: 'us-west-2',
  },
  uploadParams: ({ filename, id, mimetype, encoding }) => ({
    Metadata: {
      keystone_id: `${id}`,
    },
  }),
});

Config

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| `bucket` | `String` | Required | S3 bucket name. |
| `folder` | `String` | `''` | Upload folder from the root of the bucket. By default uploads are sent to the bucket's root folder. |
| `getFilename` | `Function` | `null` | Function taking a `{ id, originalFilename }` parameter. Should return a string with the name for the uploaded file. |
| `publicUrl` | `Function` | | By default publicUrl returns a URL for the S3 bucket in the form `https://{bucket}.s3.amazonaws.com/{key}`, where the key includes the folder and filename. This will only work if the bucket is configured to allow public access. |
| `s3Options` | `Object` | `undefined` | For available options refer to the AWS S3 API. |
| `uploadParams` | `Object \| Function` | `{}` | Extra parameters supplied with each `S3.upload` call. A function receives `{ filename, id, mimetype, encoding }` and should return such an object. |

Note: Authentication can be done in many different ways. One option is to include valid accessKeyId and secretAccessKey properties in the s3Options parameter. Other methods include setting environment variables. See Setting Credentials in Node.js for a complete set of options.
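Building on the constructor example above, uploadParams can also set an ACL so stored objects are publicly readable. A sketch (ACL and Metadata are standard parameters of the AWS S3 upload API; the call shape follows the example above):

```javascript
// uploadParams as a function: returns extra S3.upload parameters
// for each file being stored.
const uploadParams = ({ id }) => ({
  ACL: 'public-read', // make the stored object publicly readable
  Metadata: { keystone_id: `${id}` },
});
```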

S3-compatible object storage providers

You can also use any S3-compatible object storage provider with this adapter. You must set s3Options.endpoint to point at the provider's server. Other options may be required depending on which provider you choose.

// DigitalOcean Spaces
const digitalOceanOptions = {
  accessKeyId: 'YOUR-ACCESSKEYID',
  secretAccessKey: 'YOUR-SECRETACCESSKEY',
  endpoint: 'https://${REGION}.digitaloceanspaces.com', // REGION is the datacenter region, e.g. nyc3, sgp1
};

// MinIO (running locally or at a cloud URL)
const minioOptions = {
  accessKeyId: 'YOUR-ACCESSKEYID',
  secretAccessKey: 'YOUR-SECRETACCESSKEY',
  endpoint: 'http://127.0.0.1:9000',
  s3ForcePathStyle: true, // typically required for MinIO
  signatureVersion: 'v4', // typically required for MinIO
};

Pass whichever object applies as the adapter's s3Options.

For MinIO, the ACL: 'public-read' setting in uploadParams does not work the way it does with AWS or DigitalOcean: to allow anonymous access you must set a public policy on the bucket. If you want to provide authenticated access instead, use an afterChange hook to populate publicUrl. See the policy command in the MinIO client documentation.

Methods

delete

Deletes the provided file from the S3 bucket. Takes a file object (such as the one returned in file field hooks) and an optional options argument for overriding S3.deleteObject options. The Bucket and Key options are set by default. For available options refer to the AWS S3 deleteObject API.

// Optional params
const deleteParams = {
  BypassGovernanceRetention: true,
};

keystone.createList('Document', {
  fields: {
    file: {
      type: File,
      adapter: fileAdapter,
      hooks: {
        beforeChange: async ({ existingItem }) => {
          if (existingItem && existingItem.file) {
            await fileAdapter.delete(existingItem.file, deleteParams);
          }
        },
      },
    },
  },
  hooks: {
    afterDelete: async ({ existingItem }) => {
      if (existingItem.file) {
        await fileAdapter.delete(existingItem.file, deleteParams);
      }
    },
  },
});

changelog

@keystonejs/file-adapters

7.1.2

Patch Changes

7.1.1

Patch Changes

7.1.0

Minor Changes

7.0.9

Patch Changes

7.0.8

Patch Changes

7.0.7

Patch Changes

7.0.6

Patch Changes

7.0.5

Patch Changes

7.0.4

Patch Changes

  • 6cb4476ff #3481 Thanks @Noviny! - Updated dependencies through a major version - this shouldn't require change by consumers.

  • 5935b89f8 #3477 Thanks @Noviny! - Updating dependencies:

    These changes bring the keystone dev experience inline with installing keystone from npm :D

7.0.3

Patch Changes

  • 2f76473ae #3393 Thanks @gautamsi! - Fixed S3 adapter issue on windows where the wrong path character was being used due to path.join

7.0.2

Patch Changes

  • 35335df8e #3268 Thanks @zamkevich! - Fixed a bug in the delete function, due to which it was impossible to delete images in folders.

7.0.1

Patch Changes

  • 25921ebe4 #3220 Thanks @singhArmani! - Fix: delete function not passing required config params

    We are not setting global configuration (api_key, api_secret, and cloud_name) in our Cloudinary SDK. We are relying on passing these mandatory config settings in the options param of the upload API. But in the case of the destroy method, we were omitting them and passing an empty options object by default as an argument. This resulted in a rejected promise. To fix this issue, we now inject these values into the provided options object.

    NOTE: User can still override these values if required.

7.0.0

Major Changes

  • 614164c58 #2967 Thanks @intmainvoid! - Providing an access key, secret access key or region directly to AWS has been deprecated (according to AWS). As such, parameters accessKeyId, secretAccessKey and region are no longer required on the S3Adapter's constructor and - if provided this way - are ignored by the S3Adapter. These parameters can however, still be provided via the optional s3Options parameter object if required like so:

    const fileAdapter = new S3Adapter({
      bucket: 'bucket-name',
      // accessKeyId: 'ACCESS_KEY_ID', // No longer required. Ignored if provided here
      // secretAccessKey: 'SECRET_ACCESS_KEY', // No longer required. Ignored if provided here
      // region: 'us-west-2' // No longer required. Ignored if provided here
      s3Options: {
        accessKeyId: 'ACCESS_KEY_ID',
        secretAccessKey: 'SECRET_ACCESS_KEY',
        region: 'us-west-2',
      }
    });

Patch Changes

  • cced67b8f #3073 Thanks @gautamsi! - Replaced stream.close with stream.destroy() in the s3 file-adapter, as there is no readableStream.close method.
    • Added documentation about using S3-compatible storage providers, with sample config for DigitalOcean and Minio.

6.0.2

Patch Changes

6.0.1

Patch Changes

6.0.0

Major Changes

  • 787eabb3 #2291 Thanks @LiamAttClarke! - Returned promise from file-adapter delete methods, made delete params consistent between adapters.

Patch Changes

5.5.0

Minor Changes

5.4.0

Minor Changes

5.3.2

Patch Changes

5.3.1

Patch Changes

  • f3ea15f8 #2189 - Upgraded prettier to 1.19.1 and formatted code accordingly.

5.3.0

Minor Changes

5.2.0

Minor Changes

5.1.0

Minor Changes

  • ebbcad70 #1833 Thanks @Vultraz! - Added getFilename LocalFileAdapter config parameter to allow configuration of saved filename and saved original filename in database.

5.0.1

Patch Changes

  • 209b7078 #1817 Thanks @Vultraz! - Doc updates and minor functionality improvements for file field/adapters

5.0.0

Major Changes

  • 7b4ed362 #1821 Thanks @jesstelford! - Release @keystonejs/* packages (つ^ ◡ ^)つ

    • This is the first release of @keystonejs/* packages (previously @keystone-alpha/*).
    • All packages in the @keystone-alpha namespace are now available in the @keystonejs namespace, starting at version 5.0.0.
    • To upgrade your project you must update any @keystone-alpha/* dependencies in package.json to point to "@keystonejs/*": "^5.0.0" and update any require/import statements in your code.

@keystone-alpha/file-adapters

2.0.1

Patch Changes

  • 9b532072: Rename Keystone to KeystoneJS in docs where possible

2.0.0

Major Changes

  • d316166e: Change FileAdapter API from: { route, directory } to { path, src } to match other packages.

1.1.1

Patch Changes

  • 19fe6c1b:

    Move frontmatter in docs into comments

1.1.0

Minor Changes

  • af3f31dd:

    Allow passing relative paths to fileAdapter

1.0.2

  • [patch] 302930a4:

    • Minor internal code cleanups
  • [patch] a62b869d:

    • Restructure internal code

1.0.1

  • [patch] 1f0bc236:

    • Update the package.json author field to "The Keystone Development Team"
  • [patch] 9534f98f:

    • Add README.md to package

1.0.0

  • [major] 8b6734ae:

    • This is the first release of keystone-alpha (previously voussoir). All packages in the @voussoir namespace are now available in the @keystone-alpha namespace, starting at version 1.0.0. To upgrade your project you must update any @voussoir/<foo> dependencies in package.json to point to @keystone-alpha/<foo>: "^1.0.0" and update any require/import statements in your code.

@voussoir/file-adapters

0.2.1

  • [patch] fc1a9055:

    • Update dependencies to latest patch versions

0.2.0

  • [minor] 47c7dcf6:

    • Bump all packages with a minor version to set a new baseline

0.1.3

  • [patch] Bump all packages for Babel config fixes d51c833

0.1.2

  • [patch] Rename readme files a8b995e

0.1.1

  • [patch] Remove tests and markdown from npm dc3ee7d