This is probably a rare occurrence, but I've noticed that data validation for an article with 10 billion or more rows fails due to an arithmetic overflow error in sp_MSarticle_validation.
Data validation for a specific article in a transactional publication is performed by sp_article_validation, which in turn calls sp_MSarticle_validation. In the definition of sp_MSarticle_validation, a local variable named @actual_rowcount is declared as bigint. Later in the procedure, a command is built to execute sp_table_validation, and the @expected_rowcount parameter is derived by converting @actual_rowcount to a varchar(10).
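The mismatch is easy to reproduce in isolation. This sketch (the variable name mirrors the one in the procedure; the literal row count is just an illustration) shows that converting an 11-digit bigint value to varchar(10) raises error 8115:

    declare @actual_rowcount bigint = 10000000000  -- 10 billion: 11 digits, one more than varchar(10) holds

    -- Fails with Msg 8115: Arithmetic overflow error converting expression to data type varchar.
    select convert(varchar(10), @actual_rowcount)

    -- Succeeds: 19 characters covers every non-negative bigint value
    select convert(varchar(19), @actual_rowcount)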
The offending statement can be observed in the following code snippet from sp_MSarticle_validation:
    select @command = 'exec dbo.sp_table_validation @table = '''
        + replace(@destination_table, '''', '''''')
        + ''', @expected_rowcount = '
        + convert(varchar(10), @actual_rowcount)
        + ', @expected_checksum = '
        + convert(varchar(100), @actual_checksum)
        + ', @rowcount_only = '
        + convert(varchar(5), @rowcount_only)
        + ', @full_or_fast = '
        + convert(varchar(10), @full_or_fast)
        + ', @shutdown_agent = '
        + convert(varchar(10), @shutdown_agent)
Because @actual_rowcount is a bigint, it should be converted to a varchar(19) (enough for the 19 digits of the largest bigint value; a row count is never negative, so no sign character is needed) rather than a varchar(10). This conversion is where the arithmetic overflow error occurs when validating an article that has 10 billion or more rows, causing validation to fail.
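Assuming the rest of the command build is left untouched, the proposed fix amounts to widening that one conversion:

    -- before (overflows once the article reaches 10,000,000,000 rows):
    convert(varchar(10), @actual_rowcount)

    -- after (handles the full non-negative bigint range):
    convert(varchar(19), @actual_rowcount)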
If you find yourself needing to validate an article with 10 billion or more rows, please vote for this item as important to help expedite a fix.