July 10, 2019 at 4:04 pm
We have multiple SQL DBs on AZURE - and the server has a virtual master DB. I have the correct rights to connect to it and can look at sys.resource_stats.
I have a requirement to save this data to make calculations for internal usage and costing purposes. I've used scoped credentials with sp_execute_remote on other DBs on the same server, and they all work well. I'd rather not have something that runs every hour to retrieve sys.dm_db_resource_stats from each DB, as it's a lot of data and seems excessive given that I'm going to aggregate it per day anyway.
What I've not managed to do, or find an example of, is how to do the same sort of thing to save sys.resource_stats from master into RandomNameDB on the same server. Anybody have any ideas? It has to be T-SQL based.
July 10, 2019 at 5:29 pm
You can't, because you can't just insert into another database. There is a way to do cross-database queries, though it can be complex. I realized I should have written something on this, but this might help: https://blogs.msdn.microsoft.com/azuresqldbsupport/2018/11/14/cross-database-query-in-azure-sql-database/
July 10, 2019 at 5:45 pm
A side note: there is also sys.dm_db_resource_stats in each db - https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-db-resource-stats-azure-sql-database?view=azuresqldb-current
July 11, 2019 at 8:29 am
Thanks - I've already done cross-database querying (scoped credentials etc.). I tried that on master, but it doesn't work there - I may have got the credentials wrong.
Looks like I'll be importing from sys.dm_db_resource_stats every hour.
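If it does come to pulling sys.dm_db_resource_stats hourly, a minimal capture sketch could look like the following. The local table name dbo.dm_resource_stats_log is an assumption for illustration, not something from this thread:

```sql
-- Run inside each user database roughly once an hour; the DMV keeps about one
-- hour of history at 15-second granularity, so hourly pulls should not miss rows.
-- Table name dbo.dm_resource_stats_log is illustrative.
IF OBJECT_ID('dbo.dm_resource_stats_log') IS NULL
    CREATE TABLE dbo.dm_resource_stats_log
    (
        end_time              datetime2     NOT NULL PRIMARY KEY,
        avg_cpu_percent       decimal(5,2)  NULL,
        avg_data_io_percent   decimal(5,2)  NULL,
        avg_log_write_percent decimal(5,2)  NULL
    );

-- Only pull rows newer than what has already been captured.
INSERT INTO dbo.dm_resource_stats_log
    (end_time, avg_cpu_percent, avg_data_io_percent, avg_log_write_percent)
SELECT s.end_time, s.avg_cpu_percent, s.avg_data_io_percent, s.avg_log_write_percent
FROM sys.dm_db_resource_stats AS s
WHERE s.end_time > COALESCE(
        (SELECT MAX(l.end_time) FROM dbo.dm_resource_stats_log AS l), '19000101');
```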
July 11, 2019 at 9:18 am
Do you really need such granular data (every 15 seconds for the last hour)?
In the master db, the sys.resource_stats DMV has data aggregated per 5 minutes for the last 2 weeks. I collect it on a regular basis (scheduled batch file) and store it in a standalone SQL Express instance for workload analysis.
To visualize the data, a custom SSMS report was created - see pic attached.
July 11, 2019 at 10:43 am
No, I don't need such granular data, but I can use a procedure to read the data for all our clients and store it in a table. I'll aggregate it on a daily basis and then delete the detail. The intention is to have this data available long term for internal reporting.
The original question was how to store the 5-minute data.
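For what it's worth, a daily rollup of the 5-minute rows might look like the sketch below. The staging table name dbo.resource_stats_history and the choice of columns are assumptions for illustration only:

```sql
-- Hypothetical daily rollup of 5-minute sys.resource_stats rows held in a
-- local staging table dbo.resource_stats_history; names are illustrative.
SELECT
    database_name,
    CAST(start_time AS date)   AS usage_date,
    AVG(avg_cpu_percent)       AS avg_cpu_percent,
    MAX(avg_cpu_percent)       AS max_cpu_percent,
    AVG(avg_data_io_percent)   AS avg_data_io_percent,
    MAX(storage_in_megabytes)  AS max_storage_in_megabytes
FROM dbo.resource_stats_history
GROUP BY database_name, CAST(start_time AS date);
```

Once the daily rows are saved, the 5-minute detail can be deleted as described above.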
July 11, 2019 at 12:09 pm
The original question was how to store the 5 minute data.
The best approach is to save it into a local table.
How to get the 5-minute data remotely:
the following script will give you access to the content of master.sys.resource_stats on the server xyz.database.windows.net from your own repository db (Azure SQL DB).
-- in your repository db, create a credential to connect to the master database using username sql_test and an appropriate password
CREATE DATABASE SCOPED CREDENTIAL collect_test WITH IDENTITY = 'sql_test', SECRET = '********';
-- create external datasource with the servername you want to collect data from
CREATE EXTERNAL DATA SOURCE master_db WITH
(TYPE = RDBMS,
LOCATION = 'xyz.database.windows.net',
DATABASE_NAME = 'master',
CREDENTIAL = collect_test
);
--check if it is created
select * from sys.external_data_sources;
-- create external table in your db which will map the remote dmv sys.resource_stats
CREATE EXTERNAL TABLE [dbo].[master_data](
start_time datetime2
,end_time datetime2
,database_name nvarchar(128)
,sku nvarchar(128)
,storage_in_megabytes float
,avg_cpu_percent decimal(5,2)
,avg_data_io_percent decimal(5,2)
,avg_log_write_percent decimal(5,2)
,max_worker_percent decimal(5,2)
,max_session_percent decimal(5,2)
,dtu_limit int
,xtp_storage_percent decimal(5,2)
,avg_login_rate_percent decimal(5,2)
,avg_instance_cpu_percent decimal(5,2)
,avg_instance_memory_percent decimal(5,2)
,cpu_limit decimal(5,2)
,allocated_storage_in_megabytes float
)
WITH
(
DATA_SOURCE = master_db,
SCHEMA_NAME = 'sys', --schema name of remote table
OBJECT_NAME = 'resource_stats' --table name of remote table
);
-- get the data you need
select top 100 * from [dbo].[master_data];
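As a sketch of the follow-on step (the history table name and column subset here are assumptions, not part of the script above): once the external table works, you can persist new rows into a local table on a schedule, pulling only what you don't already have:

```sql
-- Hypothetical local history table plus incremental load; names are
-- illustrative. Run on a schedule (e.g. an Elastic Job).
IF OBJECT_ID('dbo.resource_stats_history') IS NULL
    CREATE TABLE dbo.resource_stats_history
    (
        start_time           datetime2     NOT NULL,
        database_name        nvarchar(128) NOT NULL,
        avg_cpu_percent      decimal(5,2)  NULL,
        avg_data_io_percent  decimal(5,2)  NULL,
        storage_in_megabytes float         NULL,
        CONSTRAINT PK_resource_stats_history
            PRIMARY KEY (database_name, start_time)
    );

-- Only pull rows newer than the latest one already stored.
INSERT INTO dbo.resource_stats_history
    (start_time, database_name, avg_cpu_percent, avg_data_io_percent, storage_in_megabytes)
SELECT m.start_time, m.database_name, m.avg_cpu_percent, m.avg_data_io_percent, m.storage_in_megabytes
FROM dbo.master_data AS m
WHERE m.start_time > COALESCE(
        (SELECT MAX(h.start_time) FROM dbo.resource_stats_history AS h), '19000101');
```

Since sys.resource_stats keeps about two weeks of history, this pull can run as infrequently as daily without losing rows.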
July 11, 2019 at 12:54 pm
F1Baron wrote:The original question was how to store the 5 minute data.
best approach is to save it into a local table. how to get the 5 minute data remotely: the following script will give you access to the content of master.sys.resource_stats on the server xyz.database.windows.net from your own repository db (Azure SQL db)
Thank you so much - I'd entirely forgotten about external tables, and was fixated on sp_execute_remote.
July 11, 2019 at 4:36 pm
I'd always save locally and then pull from there for reporting. That way, if the pull process has issues, at least you have some aggregated data in the db.
I tried the cross-db thing, but permissions in master are strange, and since you can't grant VIEW SERVER STATE or create objects in the sys schema, I'm not sure you can query sys.resource_stats without admin privileges.