
Maximum call stack size exceeded - event-stream

Stack Overflow user
Asked on 2018-02-28 15:04:11
1 answer · 656 views · 0 following · 1 vote

I have this code:

function query(url, dbName, collection, filter, requestId) {
    MongoClient.connect(url, {native_parser:true, authSource:'admin'}, function(err, client) {
        if (err) {
            throw err;
        }
        const db = client.db(dbName);
        var stream = db.collection(collection).find(filter, {fields:{_id: 0}}).stream();

        var fileName = '/opt/' + requestId + '.txt';
        var writer = fs.createWriteStream(fileName);
        writer.write('[\n');

        stream.on('end', function(){
            writer.write('\n]');
        });

        stream.pipe(es.map(function (doc, next) {
            doc = JSON.stringify(doc);
            next(null, doc);
        })).pipe(es.join(',\n')).pipe(writer).on('close', function(){
            sftp.put(fileName, '/opt/' + requestId + '.txt')
                .then(logger.info('Done uploading the file via SFTP'));

            mqttClient.publish('response', 'The CSV for requestId has been uploaded FTP');
        });
    });
}
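
For reference, the modules the snippet relies on are not shown in the question; the require lines below are only a sketch inferred from how the objects are used, and the sftp, mqttClient and logger instances in particular are assumptions:

// Assumed setup (not shown in the question); names inferred from usage
const MongoClient = require('mongodb').MongoClient;
const fs = require('fs');
const es = require('event-stream');   // provides es.map() and es.join()
// sftp, mqttClient and logger are presumably created elsewhere, e.g. (assumptions):
// const sftp = new (require('ssh2-sftp-client'))();
// const mqttClient = require('mqtt').connect('mqtt://broker');
// const logger = console;   // or any logging library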

The problem is that when the query returns a large number of documents, the function fails with:

/node_modules/map-stream/index.js:103
        throw err
        ^

RangeError: Maximum call stack size exceeded
    at Stream.ondata (internal/streams/legacy.js:14:18)
    at emitOne (events.js:116:13)
    at Stream.emit (events.js:211:7)
    at Stream.<anonymous> (/node_modules/event-stream/index.js:298:12)
    at Stream.stream.write (/node_modules/through/index.js:26:11)
    at Stream.ondata (internal/streams/legacy.js:16:26)
    at emitOne (events.js:116:13)
    at Stream.emit (events.js:211:7)
    at queueData (/node_modules/map-stream/index.js:43:21)
    at next (/node_modules/map-stream/index.js:71:7)
    at /node_modules/map-stream/index.js:85:7
    at /opt/subscriber.js:84:7
    at wrappedMapper (/node_modules/map-stream/index.js:84:19)
    at Stream.stream.write (/node_modules/map-stream/index.js:96:21)
    at Cursor.ondata (_stream_readable.js:639:20)
    at emitOne (events.js:116:13)

What this function does is take a filter, run a MongoDB query with that filter, write the resulting documents to a file, and then send that file over FTP.

The function fails at next(null, doc);.

Any suggestions on how to improve the code so it doesn't exceed the call stack size?
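
For context, the stack trace above shows map-stream re-entering its write/next cycle synchronously for every document, which is what exhausts the stack. One generic way to break that kind of synchronous recursion is to defer the callback to the next turn of the event loop; this is only a minimal sketch against the same pipeline, not the accepted answer below:

// Sketch only: defer next() so each document is handed to map-stream
// on a fresh stack instead of recursing through write -> next -> write.
stream.pipe(es.map(function (doc, next) {
    setImmediate(function () {
        next(null, JSON.stringify(doc));
    });
})).pipe(es.join(',\n')).pipe(writer);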


1 Answer

Stack Overflow user

Accepted answer

Posted on 2018-02-28 16:12:26

I have never used this library, although it looks quite popular. Could you try doing the work with the stream's events instead and see if it works?

function query(url, dbName, collection, filter, requestId) {
    MongoClient.connect(url, {native_parser: true, authSource: 'admin'}, function (err, client) {
        if (err) {
            throw err;
        }
        const db = client.db(dbName);
        var stream = db.collection(collection).find(filter, {fields: {_id: 0}}).stream();

        var fileName = '/opt/' + requestId + '.txt';
        var writer = fs.createWriteStream(fileName);
        writer.write('[\n');

        stream.on('data', function (doc) {
            writer.write(`${JSON.stringify(doc)}\n`);
        });

        stream.on('end', function () {
            writer.write('\n]');
            sftp.put(fileName, '/opt/' + requestId + '.txt')
                .then(logger.info('Done uploading the file via SFTP'));

            mqttClient.publish('response', 'The CSV for requestId has been uploaded FTP');
        });
    });
}
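
Two caveats worth noting if you adapt this (observations about the snippet, not part of the accepted answer): the 'data' handler no longer adds the ',\n' separator that es.join() provided, so the output is not the same bracketed, comma-separated list as before, and the return value of writer.write() is ignored, so a very large result set can buffer heavily in memory. A minimal sketch of the same event-based approach with both points addressed, assuming the same stream and writer variables as above:

var first = true;
stream.on('data', function (doc) {
    // Re-add the ',\n' separator between documents.
    var chunk = (first ? '' : ',\n') + JSON.stringify(doc);
    first = false;
    // Respect backpressure: pause the cursor stream until the file
    // writer has flushed its internal buffer.
    if (!writer.write(chunk)) {
        stream.pause();
        writer.once('drain', function () {
            stream.resume();
        });
    }
});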

2 votes
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/49023745
