I'm trying to develop a plugin that extracts statistical information from the blockchain. I kept the code similar to the market history plugin, and it works well, except that the in-memory database is not rebuilt when the witness node is terminated and then started again; this affects both the market history plugin and my own. Instead, the blockchain state is rolled back by a few blocks, and new blocks start coming in from that point onward. All in-memory information from the earlier blocks is lost.
In detail, the 'applied_block' hook is used to connect to the database during plugin initialization:
database().applied_block.connect( [this]( const signed_block& b ) { my->on_block_applied( b ); } );
When a block is applied, I will consider both applied operations and the current block transactions:
void extractor_plugin_impl::on_block_applied( const signed_block& b )
{
   graphene::chain::database& db = database();

   // Applied operations
   for( const optional< operation_history_object >& o_op : db.get_applied_operations() )
      ...

   // Transactions of the incoming block
   operation_handler handler( _plugin, b.block_num(), b.timestamp );
   for( const auto& t : b.transactions )
      for( const auto& op : t.operations )
         ...
This is the output when the witness node is started:
...
927579ms th_a object_database.cpp:115 open ] Opening object database from /data/blockchain/bitshares/blockchain ...
928582ms th_a object_database.cpp:124 open ] Done opening object database.
928599ms th_a db_management.cpp:64 reindex ] reindexing blockchain
928599ms th_a db_management.cpp:70 reindex ] Replaying blocks, starting at 350003...
...
* Is this the right way to ensure that already processed blocks are replayed when the witness node restarts, or is a hook missing?
* Or do I need to take care of persisting the gathered information in a local database myself, because there is no replay at all?
* Why does the official market history plugin also fail in this case?
Thank you in advance!