functions-samples: unable to split Firebase Functions into multiple files

I’m working with Firebase Functions and have ended up with hundreds of functions, and it is now very hard to manage them all in a single index.js file, as shown in most of the examples.

I tried to split those functions into multiple files like this:

    --firebase.json
    --functions
      --node_modules
      --index.js
      --package.json
      --app
        --groupFunctions.js
        --authFunctions.js
        --storageFunctions.js

In this structure I divided my functions into three categories and put them in the three files groupFunctions.js, authFunctions.js, and storageFunctions.js, then tried to import those files in index.js, but I don’t know why it is not working for me.

Here is groupFunctions.js

var functions = require('firebase-functions');
module.exports = function(){
    exports.onGroupCreate = functions.database.ref('/groups/{groupId}')
        .onWrite(event => {
            console.log(`A group is created in database named:${event.params.groupId}.`);
            // some logic...
            //...
        })
}

Here is the index.js file:

var functions = require('firebase-functions');
module.exports = require("./app/groupFunctions")();

My editor does not give any warning on this code, but when I deploy it with firebase deploy --only functions it does not deploy any functions (and if some functions already exist in the Firebase console, it removes all of them on deploy).

So I need help with this problem; looking forward to hearing from you guys.

This question is also asked on Stack Overflow.
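
For reference, the wrapper function above returns nothing, so index.js ends up exporting no functions, which is why the deploy removes them. A minimal sketch of the pattern the answers below converge on, where each file exports its triggers directly and index.js re-exports them:

app/groupFunctions.js

const functions = require('firebase-functions');

exports.onGroupCreate = functions.database.ref('/groups/{groupId}')
    .onWrite(event => {
        console.log(`A group is created in database named:${event.params.groupId}.`);
        // some logic...
    });

index.js

// Nested exports are deployed as grouped functions, e.g. groups-onGroupCreate.
exports.groups = require('./app/groupFunctions');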

About this issue

  • State: closed
  • Created 7 years ago
  • Reactions: 7
  • Comments: 35 (6 by maintainers)

Most upvoted comments

Now you could even automate this a bit and register functions based on their file names. For instance, with an index.js like this:

UPDATE, this works:

const fs = require('fs');
const path = require('path');

// Folder where all your individual Cloud Functions files are located.
const FUNCTIONS_FOLDER = './my_functs';

fs.readdirSync(path.resolve(__dirname, FUNCTIONS_FOLDER)).forEach(file => { // list files in the folder.
  if(file.endsWith('.js')) {
    const fileBaseName = file.slice(0, -3); // Remove the '.js' extension
    if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === fileBaseName) {
      exports[fileBaseName] = require(`${FUNCTIONS_FOLDER}/${fileBaseName}`);
    }
  }
});

Where your file structure is like:

    --firebase.json
    --functions
      --node_modules
      --index.js
      --package.json
      --my_functs
        --sendFollowerNotification.js
        --blurOffensiveImages.js
        --renderTemplate.js

EDIT: Edited to reflect new name of environment variable.

I’ve created a slightly modified version of index.js that uses globbing to search for files ending with .function.js. This allows me to organise my directory in any manner.

/** EXPORT ALL FUNCTIONS
 *
 *   Loads all `.function.js` files
 *   Exports a cloud function matching the file name
 * 
 *   Based on this thread:
 *     https://github.com/firebase/functions-samples/issues/170
 */
const glob = require("glob");
const files = glob.sync('./**/*.function.js', { cwd: __dirname });
for(let f=0,fl=files.length; f<fl; f++){
  const file = files[f];
  const functionName = file.split('/').pop().slice(0, -12); // Strip off '.function.js'
  if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === functionName) {
    exports[functionName] = require(file);
  }
}

EDIT: Edited to reflect new name of environment variable.

Here is my take on it:

You can have an index.js file that imports and lists all the other Cloud Functions. One trick to improve performance is to use the process.env.FUNCTION_TARGET environment variable, which holds the name of the function currently being run; during deployment that variable is not set.

Here is an example index.js for 3 functions:

// Sends notifications to users when they get a new follower.
if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === 'sendFollowerNotification') {
  exports.sendFollowerNotification = require('./sendFollowerNotification');
}

// Blur offensive images uploaded on Cloud Storage.
if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === 'blurOffensiveImages') {
  exports.blurOffensiveImages = require('./blurOffensiveImages');
}

// Renders my server side app template.
if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === 'renderTemplate') {
  exports.renderTemplate = require('./renderTemplate');
}

Then for instance, the ./blurOffensiveImages.js would be:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
try {admin.initializeApp(functions.config().firebase);} catch(e) {} // You do that because the admin SDK can only be initialized once.
const mkdirp = require('mkdirp-promise');
// etc... my imports ...

/**
 *  Blur offensive images uploaded on Cloud Storage.
 */
exports = module.exports = functions.storage.object().onChange(event => {
  // Function's code...
});

This will ensure all 3 functions are deployed when doing a firebase deploy but, at runtime, only one function will be imported, so you won’t pollute the functions with each other’s imports.

EDIT: Edited to reflect new name of environment variable.

Hi, please help. I want to write a program on our own HTTPS server to add two numbers without using Firebase, so it can run on Google Home.

@oodavid thanks for the great example! 😄 Here is a little modification that could help; it should remove the problem for @andrewspy:

/** EXPORT ALL FUNCTIONS
 *
 *   Loads all `.function.js` files
 *   Exports a cloud function matching the file name
 *
 *   Based on this thread:
 *     https://github.com/firebase/functions-samples/issues/170
 */
const glob = require("glob");
const files = glob.sync('./**/*.function.js', { cwd: __dirname, ignore: './node_modules/**'});
for(let f=0,fl=files.length; f<fl; f++){
  const file = files[f];
  const functionName = file.split('/').pop().slice(0, -12); // Strip off '.function.js'
  if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === functionName) {
    exports[functionName] = require(file);
  }
}

If you don’t ignore node_modules and you have a lot of modules, the functions deployment could time out. It will still deploy all functions, but there is no need to search through the node_modules folder.

And just for the record, if someone gets an error from the firebase-admin initialization: the way @nicolasgarnier wrote his functions is the one that should work.

const functions = require('firebase-functions');
const admin = require('firebase-admin');
try {admin.initializeApp(functions.config().firebase);} catch(e) {} // You do that because the admin SDK can only be initialized once.
const mkdirp = require('mkdirp-promise');
// etc... my imports ...

/**
 *  Blur offensive images uploaded on Cloud Storage.
 */
exports = module.exports = functions.storage.object().onChange(event => {
  // Function's code...
});

The try catch does the trick 😉

And the camelCase naming works out with this little change:

const functionName = camelCase(file.slice(0, -12).split('/').join('_'));

EDIT: Edited to reflect new name of environment variable.

After trying most of these solutions, only this one worked for me:

index.js

'use strict';

//Declare all your child functions here
const fooFunction = require('./foo');
const barFunction = require('./bar');

// Note: these tasks need to be initialized in index.js and
// NOT in child functions:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
const database = admin.database();
const messaging = admin.messaging();

exports.fooFunction = functions.database.ref('/users/messages_inbox/{sender_id}/message').onWrite(event => {
// Pass whatever tasks to child functions so they have access to it
  fooFunction.handler(database, messaging, event);
});

exports.barFunction = functions.database.ref('/users').onWrite(event => {
// Pass whatever tasks to child functions so they have access to it
  barFunction.handler(database);
});

foo.js

exports.handler = function(database, messaging, event) {
       // some function
}

bar.js

exports.handler = function(database) {
       // some function
}

I edited this script to allow for multiple functions within each file.

Each file may export multiple functions:

const functions = require('firebase-functions');

const query = functions.https.onRequest((req, res) => {
    let query = req.query.q;

    res.send({
        "You Searched For": query
    });
});

const searchTest = functions.https.onRequest((req, res) => {
    res.send({
        "searchTest": "Hi There!"
    });
});

module.exports = {
    query,
    searchTest
}

And the script goes through each function exported from each file:

const fs = require('fs');
const path = require('path');

// Folder where all your individual Cloud Functions files are located.
const FUNCTIONS_FOLDER = './scFunctions';

fs.readdirSync(path.resolve(__dirname, FUNCTIONS_FOLDER)).forEach(file => { // list files in the folder.
  if(file.endsWith('.js')) {
    const fileBaseName = file.slice(0, -3); // Remove the '.js' extension
    const thisFunction = require(`${FUNCTIONS_FOLDER}/${fileBaseName}`);
    for(var i in thisFunction) {
        exports[i] = thisFunction[i];
    }
  }
});

The URLs on Firebase are predictably named:

✔ functions: query: http://localhost:5001/PROJECT-NAME/us-central1/query
✔ functions: helloWorlds: http://localhost:5001/PROJECT-NAME/us-central1/helloWorlds
✔ functions: searchTest: http://localhost:5001/PROJECT-NAME/us-central1/searchTest
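
Since the endpoint names follow the export keys, a quick local sanity check is easy. A sketch, assuming the functions emulator is running and Node 18+ for the built-in fetch (PROJECT-NAME is the placeholder from the output above):

// Call the exported `query` function on the local emulator and print the JSON response.
fetch('http://localhost:5001/PROJECT-NAME/us-central1/query?q=firebase')
  .then(res => res.json())
  .then(body => console.log(body)); // { "You Searched For": "firebase" }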

Following @boon4376’s previous comment, I prefer multiple functions in each file. I have added some “common” files like the following to avoid the Admin SDK being initialized multiple times:

admin.js

// The Firebase Admin SDK to access the Firebase Realtime Database.
const admin = require('firebase-admin');
admin.initializeApp();

module.exports = admin;

The functions are implemented as follows.

realtimedb.js

const functions = require('firebase-functions');
const admin = require('./admin');
// ... more imports

exports.onDeleteItemA = functions.database.ref('/ItemA/{itemId}')
  .onDelete((snapshot, context) => {
// code
// admin.database().ref('/somePath')...
  });

exports.onDeleteItemB = functions.database.ref('/ItemB/{itemId}')
  .onDelete((snapshot, context) => {
// code
  });

And the main index.js file just excludes a few particular files.

index.js

const fs = require('fs');
const path = require('path');

// Folder where all your individual Cloud Functions files are located.
const FUNCTIONS_FOLDER = './';

fs.readdirSync(path.resolve(__dirname, FUNCTIONS_FOLDER))
  .forEach(file => { // list files in the folder.
    if ((file.endsWith('.js')) && (file !== 'index.js') && (file !== 'admin.js')) {
      const fileBaseName = file.slice(0, -3); // Remove the '.js' extension
      const exportedModule = require(`${FUNCTIONS_FOLDER}/${fileBaseName}`);
      for(var functionName in exportedModule) {
        if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === functionName) {
          exports[functionName] = exportedModule[functionName];
        }
      }
    }
  });

EDIT: Edited to reflect new name of environment variable.

I’ve been researching scalable and convenient Firebase Functions project structures and came across this thread. My implementation below keeps the standard Firebase grouped-function naming convention by building nested export objects that follow your folder structure.

import { merge } from 'lodash'
import { sync } from 'glob'

const functionsRoot = 'root'

const files = sync(`./${functionsRoot}/**/*.f.js`, { cwd: __dirname, ignore: './node_modules/**'});

const nameIt = (name, file) => {
  const qname = name.slice(0, -5).split('/').filter(e => e) // Strip off '.f.js'
  const first = qname.shift()
  const fname = [first, ...qname].join('-')
  if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === fname) {
    // lazy loading - require file only when building or running function
    const exported = require(file)
    const subject = qname.reverse().reduce((obj, el) => ({[el]: obj}), exported.default || exported)
    exports[first] = qname.length ? merge({}, exports[first] || {}, subject) : subject
  }
}

for (let f=0,fl=files.length; f<fl; f++) {
  const file = files[f];
  const name = file.replace(new RegExp(`^./${functionsRoot}`), '')
  nameIt(name, file)
}

This will allow you to have a folder structure like:

root
  users
    initUser.f.js
    findUsers.f.js
    admin
      deleteUser.f.js
  upgradePost.f.js

and an export structure like:

exports = {
  users: {
    initUser: fn(),
    findUsers: fn(),
    admin: {
      deleteUser: fn()
    }
  },
  upgradePost: fn()
}

One thing I didn’t like about the original suggestion by @armenr is that it collapses the function name into a single string and bypasses an entire feature that allows structuring functions based on the exported objects.

This is not only a styling issue, making function names really long and hard to use, but also a functional issue, as it disables your ability to deploy a group of functions using something like firebase deploy --only functions:users.admin

EDIT: Edited to reflect new name of environment variable.

@sonovice you could add a library like camelCase and do something like this to generate your function names, e.g.:

const functionName = camelCase(file.slice(0, -12).split('/'));

This would set the name of functions/path/to/my/logic.function.js to pathToMyLogic.

(untested logic)
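
As a quick check of that logic, here is a sketch assuming the camelcase npm package (v6 or earlier for CommonJS require), which accepts an array of strings:

const camelCase = require('camelcase');

const file = 'path/to/my/logic.function.js';
// Strip the '.function.js' suffix (12 chars), then camel-case the remaining path segments.
const functionName = camelCase(file.slice(0, -12).split('/'));
console.log(functionName); // 'pathToMyLogic'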

@malikasinger1 I had a similar issue, so I made a higher-order function that takes my Firebase objects and returns the cloud function I would like to deploy.

cloudFunction.js

export const cloudFunction = (functions, admin) => {
  return functions.https.onRequest((req, res) => {
    const {userID} = req.body;
    admin.database()
      .ref(`users/${userID}`)
      .update(...storylines)
      .then(snapshot => {
        res.send(snapshot.val());
      });
  });
};

index.js

const functions = require('firebase-functions');
const admin = require('firebase-admin');
const src = require('./src');
admin.initializeApp(functions.config().firebase);
exports.cloudFunctions = src.blah.cloudFunction(functions, admin);

Hey everyone! After doing a bit of research and finding this thread among others, I’ve decided to release this solution as a package, better-firebase-functions, which takes into account the majority of use-cases outlined by everyone here.

Your index.js file needs only 2 lines

import exportCloudFunctions from 'better-firebase-functions'
exportCloudFunctions(__dirname, __filename, exports, './', GLOB_PATTERN);

https://www.npmjs.com/package/better-firebase-functions https://github.com/gramstr/better-firebase-functions

This is my first open-source repo, so let me know what you guys think. Any pull requests welcome!

  • any file / dir structure you want, dots and dashes in filenames and folder names auto-handled
  • dots, dashes turned into camelCase
  • any number of file extension chars or segments, automatically handled
  • every file exports one default func (see the sketch after this list)
  • correctly nested exports object
  • lightweight, 3 deps
  • we can all contribute and improve the module to make it more robust and resistant to edge cases, such as auto-detecting conflicts and incorrect exports
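
As a rough illustration of the “every file exports one default func” convention (see the list above), a hypothetical function file might look like this; it is only a sketch, the path and trigger are made up, so check the package docs for the exact expected shape:

// users/initUser.js (hypothetical path): the default export is the function that gets deployed.
import * as functions from 'firebase-functions';

export default functions.auth.user().onCreate((user) => {
  console.log(`Initializing profile for new user ${user.uid}`);
  return null;
});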

Well, this trick is specifically meant for separate Cloud Functions. In your case everything is in the same Cloud Function, but you can still “lazy-load” dependencies the normal way using require() if you want. Like this, for instance:

const functions = require('firebase-functions');
const express = require('express');
const app = express();

app.get("/users/:uid/posts", async (req, res) => {
   require("./users_get.js").default(req, res);
});

app.post("/users/:uid/posts", async (req, res) => {
   require("./users_post.js").default(req, res);
});

exports.api = functions.https.onRequest(app);

And you’d have the two files ./users_get.js and ./users_post.js, each doing only what is needed for one specific handler.
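
For example, ./users_get.js could be something along these lines; this is a sketch that assumes the Admin SDK is initialized in index.js, and the handler body is made up:

// users_get.js: handler-specific imports live here and are only loaded when this route is hit.
const admin = require('firebase-admin');

exports.default = async (req, res) => {
  const snapshot = await admin.database().ref(`/users/${req.params.uid}/posts`).once('value');
  res.json(snapshot.val());
};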

This optimisation is not that useful here, though, because both handlers eventually run on the same Cloud Functions instance, so they will both get called on the same instance after a while. But you could still do it; it would help spread the static initialisation over time a little.

In the case of entirely different Cloud Functions we do this optimisation because we’re sure the code will never run on the same instance, so there is really no point in doing all the static initialisation for the other functions.

The FUNCTION_TARGET trick could be written in a more concise way:

if ((process.env.FUNCTION_TARGET || functionName) === functionName) {
  exports[functionName] = require(`./${functionName}`);
}

EDIT: Edited to reflect new name of environment variable.

This thread is great. Props to @TarikHuber for his Medium post that brought me here.

My working implementation for a single function per file is as follows:

const glob = require("glob");
const camelCase = require("camelcase");
const files = glob.sync("./**/*.f.js", {
  cwd: __dirname,
  ignore: "./node_modules/**"
});

files.forEach(functionFile => {
  let functionName = camelCase(
    functionFile
      .slice(0, -5)
      .split("/")
      .join("_")
  );

  if (!process.env.FUNCTION_TARGET || process.env.FUNCTION_TARGET === functionName) {
    exports[functionName] = require(functionFile);
  }
});

EDIT: Edited to reflect new name of environment variable.

@andrewspy with the index.js file I suggested, I’m globbing for files that match './**/*.function.js', so helloWorld.js won’t get matched. Rename it to helloWorld.function.js and see if that changes anything.

If not, I’ll make an example repo.