node.js - How to implement parallel insert operations to a MongoDB database with Node.js


I am developing a piece of code for a massive import of items into the database. The total number of records in the excel file is 24103. It was taking a total time of 40 minutes to import 700 lines of records because of massive validations. After strategically reducing steps, the time has come down to 24 minutes. There are 4 activities to execute in parallel, independent of each other. I tried executing the activities in parallel with async.parallel (see the sketch after the steps below). When I watched the console while debugging, I came to know they still execute one record at a time; the only difference is a shuffling of the order in which records get inserted. And after execution, it took the same time it took before, when the activities were executing sequentially. Is there a possible way to make the execution (insert) run in parallel?

My steps include:

    flowcontroller.on('start', function () {
        // uploading and storing the binary data of the excel file
    });

    flowcontroller.on('1', function () {
        // validation check whether the file contains the required sheet name, and conversion of the file to json format
    });

    flowcontroller.on('2', function () {
        // replacing keys in the json to match the required keys
    });

    flowcontroller.on('3', function () {
        // massive validation of records & separating records per activity
    });

    // activity 1
    flowcontroller.on('4', function () {
        // insert records of activity 1
    });

    // activity 2
    flowcontroller.on('5', function () {
        // insert records of activity 2
    });

    // activity 3
    flowcontroller.on('6', function () {
        // insert records of activity 3
    });

    flowcontroller.on('end', function () {
        // end
    });
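For reference, the parallel attempt looked roughly like the sketch below; the insertActivityN helpers are placeholders for each activity's insert logic, reconstructed from the description above:

    const async = require('async');

    // hypothetical reconstruction of the attempted wrapper: each task
    // runs one activity's inserts and invokes its callback when done
    async.parallel([
        function (cb) { insertActivity1(cb); },
        function (cb) { insertActivity2(cb); },
        function (cb) { insertActivity3(cb); }
    ], function (err, results) {
        // reached after all three activities finish (or one errors)
    });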

Apparently, the logic in your application takes longer than a few seconds to execute. Consider moving it off the main thread, especially if it runs many times per hour or day.

Because Node is single threaded, long-running processes can block other code from executing and give end users the perception that your application is slow.
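To see the effect, here is a minimal sketch: a synchronous CPU-bound loop keeps even a zero-delay timer from firing until the loop ends.

    // the loop keeps the event loop busy, so the timer callback cannot
    // fire until the loop is over, even with a 0 ms delay
    setTimeout(() => console.log('timer fired'), 0);

    let sum = 0;
    for (let i = 0; i < 1e9; i++) sum += i;   // blocks for a few seconds

    console.log('loop done');   // always prints before 'timer fired'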

You can spin up a child process using Node's child_process module, and child processes can communicate with each other through a messaging system. You can spin up multiple processes at the same time and have them execute in parallel.

Your first move is to put the whole long-computation function into its own file and make it invoke that function when instructed via a message from the main process. Now, instead of doing the long operation in the main process's event loop, you can fork the long-computation.js file and use the messages interface to communicate between the server and the forked process.

When a request for the long computation comes in, you send a message to the forked process to start executing the long operation. The main process's event loop is not blocked.

Once the forked process is done with the long operation, it can send the result back to the parent process using process.send.
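As a minimal sketch of the two pieces (file names and the message shape are just placeholders):

    // long-computation.js — runs in the forked process
    process.on('message', (msg) => {
        // stand-in for the real long-running work
        let total = 0;
        for (let i = 0; i < 1e9; i++) total += i;

        // report the result back to the parent process
        process.send({ result: total });
    });

    // main.js — the main/server process
    const { fork } = require('child_process');

    const child = fork('./long-computation.js');

    child.on('message', (msg) => {
        console.log('result from child:', msg.result);
    });

    // ask the child to start; the main event loop stays free meanwhile
    child.send('start');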


Update:

Here is sample code with 2 functions whose for loops count up to a million. Wrapping the 2 in a parallel wrapper (async.parallel in your case) doesn't mean they magically become parallel operations. They must be async operations.

    const Promise = require('bluebird');

    function one() {
        return new Promise((res, rej) => {
            for (let i = 0; i <= 1000000; i++) {
                console.log('one');
                if (i == 1000000) res('done');
            }
        });
    }

    function two() {
        return new Promise((res, rej) => {
            for (let i = 0; i <= 1000000; i++) {
                console.log('two');
                if (i == 1000000) res('done');
            }
        });
    }

    let a = [one(), two()];

    Promise.all(a).then(r => console.log(r));

Function two is executed only after function one has finished, because both loops are synchronous and block the event loop.
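By contrast, database inserts are real asynchronous I/O, so they can be in flight concurrently. A minimal sketch, assuming the official mongodb driver and placeholder database, collection, and record names:

    const { MongoClient } = require('mongodb');

    async function importAll(activity1Records, activity2Records, activity3Records) {
        const client = await MongoClient.connect('mongodb://localhost:27017');
        const db = client.db('imports');   // placeholder database name

        // each insertMany returns a promise backed by async I/O, so the
        // three batches are sent to MongoDB concurrently
        const [r1, r2, r3] = await Promise.all([
            db.collection('activity1').insertMany(activity1Records),
            db.collection('activity2').insertMany(activity2Records),
            db.collection('activity3').insertMany(activity3Records)
        ]);

        console.log(r1.insertedCount, r2.insertedCount, r3.insertedCount);
        await client.close();
    }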

