chinwagsocial/db
unarist 0129f5eada Optimize FixReblogsInFeeds migration ()
We have changed how we store reblogs in Redis for bigint IDs. The migration works by 1) scanning all entries in each user's feed, and 2) re-storing each reblog with 3 write commands.
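
The traffic pattern of that per-entry rewrite can be sketched as below. The helper name, the particular three write commands, and the key layout are assumptions for illustration (the real migration's writes differ); a tiny in-memory stand-in replaces Redis so the command count is visible without a server.

```ruby
# In-memory stand-in for redis-rb that records every command it receives.
class FakeRedis
  attr_reader :commands

  def initialize
    @commands = []
  end

  # Pretend the feed holds one plain status ("100") and one reblog ("200",
  # whose score differs from its ID).
  def zrange(key, start, stop, with_scores: false)
    @commands << [:zrange, key]
    [["100", 100.0], ["200", 150.0]]
  end

  def zrem(key, member)
    @commands << [:zrem, key, member]
  end

  def zadd(key, score, member)
    @commands << [:zadd, key, score, member]
  end

  def setex(key, ttl, value)
    @commands << [:setex, key, ttl, value]
  end
end

# One read per feed, then three writes per reblog found -- each of which is
# a separate round trip when issued naively.
def rebuild_reblogs(redis, feed_key)
  redis.zrange(feed_key, 0, -1, with_scores: true).each do |entry|
    next if entry[0].to_i == entry[1] # member == score: a plain status, skip

    # Hypothetical trio of writes re-storing the reblog; the point is the
    # shape of the traffic (3 write packets per reblog), not these commands.
    redis.zrem(feed_key, entry[0])
    redis.zadd(feed_key, entry[1], entry[0])
    redis.setex("#{feed_key}:reblog:#{entry[0]}", 86_400, entry[1].to_i)
  end
end
```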

However, this operation is really slow for large instances, e.g. about 1 hour on friends.nico (~50k users). So I have tried the tweaks below.

* It checked for non-reblogs with `entry[0] == entry[1]`, but this condition never matches because `entry[0]` is a String while `entry[1]` is a Float. Changing it to `entry[0].to_i == entry[1]` works.
  -> about 4-20x faster (feeds with fewer reblogs benefit more)
* Write operations can be batched with a pipeline
  -> about 6x faster
* Wrap the whole operation in a Lua script and execute it with the EVALSHA command. This greatly reduces the packets exchanged between Ruby and Redis.
  -> about 3x faster
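
The first tweak comes down to Ruby's comparison semantics: a String never `==` a Float, while an Integer compares with a Float numerically. A quick sketch (the sample values are made up):

```ruby
# `zrange ... WITHSCORES` in redis-rb yields [member, score] pairs where the
# member is a String and the score is a Float.
entry = ["1234", 1234.0]

entry[0] == entry[1]      # String vs Float: always false
entry[0].to_i == entry[1] # Integer vs Float: numeric comparison, true here
```

With the broken check, presumably every entry looked like a reblog and paid the three-write cost, which would explain why feeds with fewer reblogs gain the most from the fix.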

I've taken the Lua script approach, though the other optimizations alone might have been enough.
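
For reference, the EVALSHA mechanics look roughly like this. The Lua body below is a stand-in, not the migration's actual script; the point is that the whole loop runs server-side, so each feed costs one round trip instead of one per command.

```ruby
require "digest"

# Hypothetical stand-in for the migration's Lua script: scans the feed
# sorted set and counts entries whose member ID differs from its score
# (the real script would re-store each such reblog with redis.call writes).
SCRIPT = <<~LUA
  local entries = redis.call('zrange', KEYS[1], 0, -1, 'withscores')
  local fixed = 0
  for i = 1, #entries, 2 do
    if tonumber(entries[i]) ~= tonumber(entries[i + 1]) then
      fixed = fixed + 1
    end
  end
  return fixed
LUA

# SCRIPT LOAD is issued once (redis.script(:load, SCRIPT) returns this same
# SHA-1 digest); EVALSHA then refers to the cached script by hash, so every
# subsequent feed costs a single small request instead of the full script body.
sha = Digest::SHA1.hexdigest(SCRIPT)
# redis.evalsha(sha, keys: [feed_key])
```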
2017-10-27 16:10:22 +02:00
migrate Optimize FixReblogsInFeeds migration () 2017-10-27 16:10:22 +02:00
schema.rb foreign_key, non-nullable, dependent: destroy in account_moderation_notes () 2017-10-10 13:12:17 +02:00
seeds.rb Fix db:seed - only run some validations when the field was changed () 2017-06-08 09:22:01 -04:00