August 23, 2002 at 8:03 pm
We have a number of "power" users of our SQL 2000 data warehouse who use SQL and some Access to build their own queries.
A common problem is that they often need to load related data of their own into the database. For example, they may have a long list of items or customers for which they want to pull sales data, usually kept in an Access database or spreadsheet. They want to load it into a table on the SQL Server so they can do joins. Doing joins against Access (via a linked table to the DW) is awful in terms of performance (really big tables on the DW).
The solution we are leaning toward (and would appreciate advice on) is to create a database they can use for such temporary uploads. Note that "temporary" may be weeks. #temp tables are inadequate, especially when trying to use Access, plus they really need them longer.
What I want to do is create a database and give everyone access to create their own tables, restrict its size of course, but otherwise let the users do as they need. HOWEVER, I realize this will quickly fill up and never be maintained.
So what I really want is to track ownership, but most importantly last use and last update, so I can run reports on what is stale, large, or infrequently used.
Other than owner (and I'm not even quite sure about that with Windows authentication), I do not see how I can get this information from any native metadata.
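To illustrate, about all the native metadata gives me is owner and create date, something along these lines (and nothing at all about last use):

SELECT o.name AS table_name,
       USER_NAME(o.uid) AS owner_name,   -- database user that owns the object
       o.crdate AS created               -- creation date only, no last-used date
FROM sysobjects o
WHERE o.xtype = 'U'                      -- user tables only
ORDER BY o.crdate DESC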
Running continual trace procedures to a file (loaded to a database) might do it, but that seems like a lot of overhead.
Anyone tried to give users create access and then manage it? Ideas?
August 23, 2002 at 8:22 pm
Putting users in the db_ddladmin role will give them permissions to create objects. So long as they maintain their ownership, they'll have full permissions on their own objects. So that will solve how to give 'em access.
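Something along these lines would do it (the database and login names here are just placeholders):

USE Sandbox
GO
-- add the Windows login as a user in the sandbox database
EXEC sp_grantdbaccess 'DOMAIN\SomePowerUser'
-- db_ddladmin lets them create, alter, and drop their own objects
EXEC sp_addrolemember 'db_ddladmin', 'DOMAIN\SomePowerUser'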
Auditing is a whole different issue. You could build a job that checks the tables each night and, if it finds a new one, creates triggers for insert, update, and delete. The trigger could simply record the table and when it last had data modified in a separate table the users don't have access to. You'd also want the job to check to see if any of the triggers are missing for whatever reason. Keep in mind that the triggers will add extra operations per query, so if you're going to use them, make them fairly simple.
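To give you an idea, the audit table and the trigger the job would generate for each user table could be as simple as this (all object names are just examples):

-- audit table the users don't have access to
CREATE TABLE dbo.TableActivity (
    TableName sysname NOT NULL PRIMARY KEY,
    LastModified datetime NOT NULL
)
GO
-- generated per user table; SomeUserTable is a placeholder
CREATE TRIGGER trg_audit_SomeUserTable
ON dbo.SomeUserTable
FOR INSERT, UPDATE, DELETE
AS
SET NOCOUNT ON
UPDATE dbo.TableActivity
   SET LastModified = GETDATE()
 WHERE TableName = 'SomeUserTable'
IF @@ROWCOUNT = 0
   INSERT dbo.TableActivity (TableName, LastModified)
   VALUES ('SomeUserTable', GETDATE())
GO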
If you are also concerned about SELECT, you have a substantially more difficult task on your hands. Unless anyone knows anything different, I believe only Profiler traces will reveal the SELECT information, which means you'll have to trace and parse in order to create your reports.
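If you do go down that road, the trace itself can at least be scoped to the one database so you aren't parsing everything on the server. A rough sketch, with the database name, file path, and event choices all placeholders:

DECLARE @traceid int, @maxsize bigint, @dbid int, @on bit
SET @maxsize = 50                -- roll over to a new file every 50 MB
SET @dbid = DB_ID('Sandbox')     -- the sandbox database
SET @on = 1

-- option 2 = file rollover; SQL Server appends the .trc extension itself
EXEC sp_trace_create @traceid OUTPUT, 2, N'C:\Traces\sandbox', @maxsize
EXEC sp_trace_setevent @traceid, 12, 1, @on      -- SQL:BatchCompleted, TextData
EXEC sp_trace_setevent @traceid, 12, 3, @on      -- SQL:BatchCompleted, DatabaseID
EXEC sp_trace_setevent @traceid, 12, 11, @on     -- SQL:BatchCompleted, LoginName
EXEC sp_trace_setevent @traceid, 12, 14, @on     -- SQL:BatchCompleted, StartTime
EXEC sp_trace_setfilter @traceid, 3, 0, 0, @dbid -- keep only activity in the sandbox db
EXEC sp_trace_setstatus @traceid, 1              -- start the trace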
As far as size, you can track that via a job each night. You could even record the info to a separate table to keep track of trends.
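The nightly size check can come straight out of sysindexes; a rough sketch, where the history table is just an example name:

-- snapshot row counts and reserved space for every user table
INSERT dbo.TableSizeHistory (TableName, RowCnt, ReservedKB, SnapshotDate)
SELECT o.name,
       MAX(CASE WHEN i.indid IN (0, 1) THEN i.rows ELSE 0 END),  -- heap or clustered index row count
       SUM(i.reserved) * 8,                                      -- reserved pages are 8 KB each
       GETDATE()
FROM sysobjects o
JOIN sysindexes i ON i.id = o.id
WHERE o.xtype = 'U'                -- user tables only
  AND i.indid IN (0, 1, 255)       -- data plus text/image; nonclustered index space ignored in this sketch
GROUP BY o.name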
K. Brian Kelley
http://www.truthsolutions.com/
Author: Start to Finish Guide to SQL Server Performance Monitoring
http://www.netimpress.com/shop/product.asp?ProductID=NI-SQL1
August 24, 2002 at 6:09 am
RE: creates triggers for insert, update, and delete.
Thanks. Had not thought of triggers to track changes. But...
Re: If you are also concerned about SELECT, you have a substantially more difficult task on ...
It's really usage that I'm interested in. Or I think so.
My goal is to give them a mechanism to upload data they need to use. But as with file servers, once something is saved no one is likely to clean it up. As we accumulate hundreds of tables (many large), I want to deal with the ones that are no longer needed without hassling the ones in regular use.
We run a continuous Profiler trace that loads long-running queries into a table (to address "the server was really slow this afternoon, what was it?" questions). Maybe I just need to bite the bullet and capture all access to this database, then filter through the objects and update a list of last-access dates. I don't think it's hard to do; I was more worried about overhead.
Has anyone done this, who might comment on overhead?
Incidentally, I'd like to one day meet the people who decided to remove the functionality that let traces log to a table instead of a file. What a pain it is in SQL 2000: log to a file, stop the trace, read the file, start the trace.... It even opens a window of opportunity to lose items. If you wait for files to roll over based on size, then you have a long delay before you can "see" things. Am I missing something, or was there a big loss of functionality there for automated tracking? Is there some way to read it in real time that I missed? I have it all automated, but it's not pretty.
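The read-the-file step I'm describing looks roughly like this (the path and target table are just placeholders):

-- pull the completed trace file into a table for reporting
INSERT dbo.TraceHistory (TextData, DatabaseID, LoginName, StartTime)
SELECT TextData, DatabaseID, LoginName, StartTime
FROM ::fn_trace_gettable(N'E:\Traces\sandbox.trc', DEFAULT)
WHERE DatabaseID = DB_ID('Sandbox')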
August 24, 2002 at 8:07 am
Have to agree about the Profiler thing, logging to a table can hurt performance, but let ME decide!
On the sandbox thing, we have a similar situation where I work. We have a team that does a lot of data prep/cleaning in Access that requires them to query against SQL tables, and you're right, sometimes that does hurt performance, both on their side and the server side. Made worse because they access it across a T1.
Resolved part of it by building a process where they do steps 1-x locally, append records to a main table, we do some processing closer to the server, they pick up the results, and they do the remainder of the processing.
Even still, they need access to live data for other parts; in some cases they will pull a copy of the table into Access and index it there. The data gradually ages and they refresh it if needed. Considered building a nightly process to build those files for them, but it's not workable because we have 200+ dbs, just too much.
Have considered the sandbox but not implemented it; haven't gotten them to invest in Query Analyzer training, and without it... not sure how much difference it would make.
No easy answers, though I think it's a lot of work to track all the access and do the cleanup; perhaps it's just as easy to say the db gets dropped and rebuilt on the 1st of each month. The alternative is to have them load Personal Edition and do it there, at least then it's their disk space.
Andy
August 26, 2002 at 5:23 am
Re: The alternative is to have them load Personal Edition and do it there, at least then it's their disk space.
In terms of logistics this is not a bad alternative; the number of people is probably under a dozen and it's included in the CAL price. However...
If I understand what's required to have the main SQL Server do the joins (it's got the big tables), someone has to link these desktop databases into the main server. I'm sure this is something of a pain to keep straight, and I'm not at all sure how big a security risk it represents.
Is there a way to make the user's database tables visible by the main server without introducing significant security issues on the main server?
August 26, 2002 at 6:28 am
Linked servers are the usual way; you could set up a script to add them to the sandbox, either using a SQL login or the NT user (requires delegation, I think?).
Of course that gives them access to the tables but doesn't make them visible as objects on the main server. I suppose you could then add a view over each table via the linked server in the same script.
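A rough idea of what that script might look like (server, login, and object names are all placeholders):

-- register the user's Personal Edition instance on the main server
EXEC sp_addlinkedserver @server = 'USERPC', @srvproduct = 'SQL Server'
-- map connections to a SQL login on the user's instance (or use self-mapping/delegation for NT users)
EXEC sp_addlinkedsrvlogin @rmtsrvname = 'USERPC',
                          @useself = 'false',
                          @rmtuser = 'sandbox_reader',
                          @rmtpassword = 'StrongPasswordHere'
GO
-- one view per remote table so it shows up on the main server
CREATE VIEW dbo.v_UserItems
AS
SELECT * FROM USERPC.UserDb.dbo.Items
GO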
One other option might be Access 2000 in project mode.
Andy