I would think that most DBAs and SQL developers realize they need to understand some basic statistics and mathematics to write meaningful reports for their customers. Since many of us write our reports in T-SQL, if we correctly write the formula for one row, it works the same way for every row returned from the database. That should hold true in Excel if you cut and paste the formulas across all the rows or columns. However, since each cell could have a different formula, you can't always be sure that the calculations for all rows are the same.
This was brought to light in a blog post that looked at the problems with a public policy analysis based on an Excel worksheet. One issue was the auditability of the results: to validate them, you would have to manually verify the formula in every cell to be sure it was correct. I'm not sure how many people want to do that, though I like the transparency of providing the formulas along with the data. Someone will verify them.
The other part of the blog post briefly covers some deeper issues. One is that to accurately represent the meaning of data, we need to show more than simple aggregates like range or average. Reducing a data set to a single representative value, or a small group of them, may not capture the information in the data. An average, without the variance or standard deviation behind it, may not be something you want to base a business decision on.
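To illustrate that last point, here is a small Python sketch using made-up numbers: two samples with the identical average, where only the standard deviation reveals that one is far more volatile than the other.

```python
import statistics

# Two hypothetical daily-sales samples (illustrative numbers, not real data)
steady = [98, 100, 102, 99, 101, 100]
volatile = [40, 160, 10, 190, 70, 130]

# Both samples have the same average of 100...
print(statistics.mean(steady))    # 100
print(statistics.mean(volatile))  # 100

# ...but very different spread: roughly 1.4 versus 71
print(round(statistics.stdev(steady), 1))
print(round(statistics.stdev(volatile), 1))
```

A report that showed only the average would describe these two businesses identically; the standard deviation is what tells you one of them is wildly unpredictable.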
As we seek to provide more accurate information to our customers and clients, we need to better understand the data we are querying. At some point, knowing the basic aggregates won't be enough to report on data well. Big data can present many false patterns, and tiny data suffers from a similar problem. We need to learn more about statistics and analysis to ensure that, as we work with business people to query data, we understand what information we are extracting from our data sets.