Automatic sync issue #20

Open · glacierck opened this issue Apr 28, 2019 · 2 comments
glacierck commented Apr 28, 2019

The full sync at service startup does call e_pipeline, but when mongo data is modified during runtime, only the specified m_collectionname document is synced and e_pipeline is not executed.

PS: in my custom e_pipeline I make several copies of specified attributes from m_collectionname and regex-replace them into new fields, yet during automatic sync none of those fields are updated.
The logs from bulkDataAndPip:

-- bulk at startup

[
    {
        "index":{
            "_index":"corpus",
            "_type":"contents",
            "_id":"ImQs6IdHp"
        }
    },
    {
        "title":"doc2019-03-24-2",
        "comments":"11111"
    }
]

-- bulk on update

[
    {
        "update":{
            "_index":"corpus",
            "_type":"contents",
            "_id":"ImQs6IdHp"
        }
    },
    {
        "doc":{
            "title":"doc2019-03-24-2",
            "comments":"22222"
        }
    }
]
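
For illustration only: assuming the custom e_pipeline described above is an Elasticsearch ingest pipeline registered through the legacy elasticsearch Node client, a minimal sketch of one that copies an attribute and regex-replaces the copy might look like this. The pipeline id, field names and pattern are hypothetical, not taken from this project.

var elasticsearch = require('elasticsearch');

// Assumed connection settings; in the project these come from the watcher config.
var client = new elasticsearch.Client({ host: 'localhost:9200' });

// Hypothetical pipeline: copy "title" into "title_copy" via a mustache template,
// then regex-replace "-" with "_" in the copy (set + gsub processors).
client.ingest.putPipeline({
    id: 'corpus_pipeline',
    body: {
        description: 'copy an attribute and regex-replace it into a new field',
        processors: [
            { set: { field: 'title_copy', value: '{{title}}' } },
            { gsub: { field: 'title_copy', pattern: '-', replacement: '_' } }
        ]
    }
}).then(function () {
    console.log('pipeline created');
}, function (err) {
    console.error(err);
});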
glacierck (Author) commented:

Solution: modify getUpdateMasterDocBulk:

return new Promise(function (resolve, reject) {
    var bulk = [];
    // Use an "index" action with the full opDoc instead of an "update" action
    // with a partial { doc: ... }, so the whole document is (re)indexed.
    bulk.push({
        index: {
            _index: watcher.Content.elasticsearch.e_index,
            _type: watcher.Content.elasticsearch.e_type,
            _id: id
        }
    }, opDoc);
    return resolve(bulk);
});
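
A hedged usage sketch of how a bulk built this way might be submitted so that the configured pipeline runs on each document. The client setup and the pipeline name are assumptions, it assumes a client/ES version that exposes the bulk pipeline parameter, and whether the connector itself passes a pipeline parameter at this point is not shown in this thread. Elasticsearch applies ingest pipelines to index/create actions but not to update actions, which would explain why the original update-based bulk skipped e_pipeline.

var elasticsearch = require('elasticsearch');

var client = new elasticsearch.Client({ host: 'localhost:9200' });   // assumed connection

function sendBulk(bulk) {
    // "corpus_pipeline" is a hypothetical ingest pipeline name; it is applied
    // to the "index" actions in the bulk body.
    return client.bulk({ body: bulk, pipeline: 'corpus_pipeline' })
        .then(function (resp) {
            if (resp.errors) {
                console.error('bulk item errors:', JSON.stringify(resp.items));
            }
            return resp;
        });
}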

zhr85210078 (Owner) commented:

This method is mainly there to handle mongodb atomic updates, because the data returned in the oplog is not necessarily the full document. Changing it to the approach above turns every update into a full replacement of the data in elasticsearch. If in your application scenario the oplog returns the full document on every update operation, you can switch to the approach above; otherwise you will find that fields go missing from the documents in elasticsearch.
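
To make this warning concrete, a small illustration using the field names from the logs above; the partial oplog payload is an assumption for the example, not output from the project. With an "update" action elasticsearch merges the partial doc into the stored document, so "title" survives; with an "index" action the partial doc replaces the stored document and "title" is lost unless the oplog carries the full document.

// Partial document as it might arrive from the oplog after editing only "comments".
var opDoc = { comments: '22222' };

// "update" action: merges opDoc into the existing document; "title" is kept.
var updateBulk = [
    { update: { _index: 'corpus', _type: 'contents', _id: 'ImQs6IdHp' } },
    { doc: opDoc }
];

// "index" action: replaces the whole document with opDoc; "title" disappears
// unless opDoc is always the full document.
var indexBulk = [
    { index: { _index: 'corpus', _type: 'contents', _id: 'ImQs6IdHp' } },
    opDoc
];

console.log(JSON.stringify(updateBulk, null, 4));
console.log(JSON.stringify(indexBulk, null, 4));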