Put an upper limit on update batches

When using update_column_in_batches, the upper limit on the batch size is
now 1000 rows. This ensures that for very large tables we don't lock tens
of thousands of rows during a single update, which in turn should reduce
the likelihood of running into deadlocks.
Parent 2923a0c8
@@ -233,6 +233,12 @@ module Gitlab
       # Update in batches of 5% until we run out of any rows to update.
       batch_size = ((total / 100.0) * 5.0).ceil
+      max_size = 1000
+
+      # The upper limit is 1000 to ensure we don't lock too many rows. For
+      # example, for "merge_requests" even 1% of the table is around 35 000
+      # rows for GitLab.com.
+      batch_size = max_size if batch_size > max_size
       start_arel = table.project(table[:id]).order(table[:id].asc).take(1)
       start_arel = yield table, start_arel if block_given?
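
A minimal sketch of how a migration might call this helper after the change. The migration class name, table, column, and value below are hypothetical and only illustrate the call; the helper itself comes from Gitlab::Database::MigrationHelpers, and each batch is now at most 1000 rows regardless of table size.

# Hypothetical migration; class name, column, and value are made up for
# illustration only.
class MarkMergeRequestsStaleCheckFalse < ActiveRecord::Migration
  include Gitlab::Database::MigrationHelpers

  # Batched updates run outside a single DDL transaction.
  disable_ddl_transaction!

  def up
    # Updates rows in batches of 5% of the table, now capped at 1000 rows
    # per batch. The optional block narrows down which rows are updated.
    update_column_in_batches(:merge_requests, :stale_checked, false) do |table, query|
      query.where(table[:stale_checked].eq(nil))
    end
  end

  def down
    # No-op: the default value is only meaningful going forward.
  end
end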