October 17, 2016 at 11:27 pm
Comments posted to this topic are about the item Archaic Commands
October 18, 2016 at 4:10 am
My basic feeling is that GUIs are great if you have fewer than about five of something to do. More than that and it's usually worth figuring out a way to script it. I just cannot reliably do the same thing that many times without error.
October 18, 2016 at 5:29 am
It feels like things have come full circle.
At one time you scripted because that was all there was. Now you script because it's the best way of having a reliable and repeatable process.
Tools like SPSS used to (and probably still do) echo any GUI commands to a log file so you could focus on what you were trying to achieve first and then work out how best to code it later.
Office macros work on the same principle. Quite a few DB IDEs have a "script to window" function, and it's a great way of learning the underlying script while also emphasising the additional options that are available with a command. I've found it a handy aide-mémoire, and a prompt to Google a setting or two that I hadn't heard of, particularly when dealing with a DB platform that was new to me.
October 18, 2016 at 6:57 am
GUIs are great to use, but sometimes one has to use the shell to get things done. I had to export users from an Active Directory system and create a CSV file for import into a new system.
I wrote an awk program that reads the LDIF file and converts the users to a CSV file. I installed Cygwin on my Windows 7 system to have access to Unix commands.
October 18, 2016 at 8:22 am
Command lines are also very handy for documentation purposes. Not only are they typically more precise, compact, and clear, they can also be documented in plain text, which is more portable and easier to distribute.
412-977-3526 call/text
October 18, 2016 at 9:01 am
Heh... being one that seriously exploits the capabilities of the command line through the use of xp_CmdShell, you've just got to know that I'm loving this discussion. :w00t:
--Jeff Moden
Change is inevitable... Change for the better is not.
October 18, 2016 at 9:58 am
I'm one who's almost always in favor of writing what you need done instead of clicking my way through. It's typically done with greater precision and control and it has the benefit of being repeatable when it needs to be done again. I also like to save many of the things I run. If there's ever a question about what I did or how I did it, I know exactly what I did or how I overcame a problem, which is missing if I use the GUI.
October 18, 2016 at 10:56 am
I still do little things here and there at the command line for automation purposes. I remember back in college, though (early 90's), using Unix, I had one class that used the vi editor. I found it so horrific that it was easier for me to FTP the files down to my PC, edit them in DOS Edit, and then FTP them back up to the Unix system.
October 18, 2016 at 4:05 pm
I find that I rarely do things just once, so scripting is my normal approach.
You can share a script and learn from one. A well-written one has a strange beauty that is almost poetic.
October 18, 2016 at 4:58 pm
David.Poole (10/18/2016)
I find that I rarely do things just once, so scripting is my normal approach. You can share a script and learn from one. A well-written one has a strange beauty that is almost poetic.
Same here but I do a lot of it through xp_CmdShell via stored procedures, which can easily be scheduled as jobs in SQL Server, provides nearly automatic monitoring of any of the output, puts logging all in one place (a table), keeps the functionality backed up, etc, etc. Works great for ETL and a whole bunch of other things, as well. It's also wicked easy to simply copy the functionality between servers, if I need to.
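For anyone curious what that pattern looks like, here's a minimal sketch (not the production code): a wrapper proc that runs a command through xp_CmdShell and writes every line of output to a logging table, so a SQL Agent job can schedule it and the history stays queryable. It assumes xp_CmdShell is enabled, and all of the object names are made up for illustration.
-- Table that keeps the command and its output together in one place
CREATE TABLE dbo.CmdShellLog
(
    LogID   int IDENTITY(1,1) PRIMARY KEY,
    RunDate datetime       NOT NULL DEFAULT GETDATE(),
    Command nvarchar(4000) NOT NULL,
    Output  nvarchar(4000) NULL
);
GO
CREATE PROCEDURE dbo.RunAndLogCmd
    @Command nvarchar(4000)
AS
BEGIN
    SET NOCOUNT ON;
    -- Capture every line the command writes to the console
    DECLARE @Output TABLE (Line nvarchar(4000));
    INSERT INTO @Output (Line)
    EXEC master.dbo.xp_cmdshell @Command;
    -- Log the run so there's a queryable history of what was done
    INSERT INTO dbo.CmdShellLog (Command, Output)
    SELECT @Command, Line
      FROM @Output;
END;
GO
-- Example call, e.g. from a scheduled job step:
-- EXEC dbo.RunAndLogCmd @Command = N'dir C:\ETL\Inbound';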
--Jeff Moden
Change is inevitable... Change for the better is not.
October 19, 2016 at 7:05 am
I personally don't like complex tasks that require many steps in a GUI application. It's probably why I don't care much for SSIS and its quirks. Or the fact that Microsoft decides to rearrange Windows with every iteration. Or the new "improved" Cisco VPN clients that make it darn near impossible to have multiple profiles.
The CLI should be second nature to a Windows or Linux pro. There are tasks that can't be done efficiently in a GUI. Ask a newbie how to get a list of files in a folder through the Windows GUI. Or how to automate a daily system process with several steps. Or how to quickly get a list of processes so you can kill the web browser that's eating resources.
October 19, 2016 at 8:25 am
chrisn-585491 (10/19/2016)
I personally don't like complex tasks that require many steps in a GUI application. It's probably why I don't care much for SSIS and its quirks. Or the fact that Microsoft decides to rearrange Windows with every iteration. Or the new "improved" Cisco VPN clients that make it darn near impossible to have multiple profiles. The CLI should be second nature to a Windows or Linux pro. There are tasks that can't be done efficiently in a GUI. Ask a newbie how to get a list of files in a folder through the Windows GUI. Or how to automate a daily system process with several steps. Or how to quickly get a list of processes so you can kill the web browser that's eating resources.
Heh... you and I would get along very well.
One of my favorite "Black Arts" techniques in a T-SQL-based ETL system is determining whether an incoming file is still open or has been completed. I've seen folks go through all sorts of gyrations in SSIS and other places only to fail in one way or another. One of the more subtle failures is when people use techniques that end up changing the "Modified Date" on the file if the file isn't open, which is frequently a critical piece of information that shouldn't be changed. The archaic CLI command REName is all they really need. Just try to rename the file to the same name it currently has. If it can't, it'll tell you. If it can, the name stays the same and the original "Modified Date" stays the same. Add that check to a BULK INSERT procedure through xp_CmdShell, and life gets real easy.
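In case it helps anyone picture it, here's a minimal sketch of that "rename it to itself" check, assuming xp_CmdShell is enabled; the folder and file name are hypothetical:
DECLARE @Output TABLE (Line nvarchar(4000));
-- REN fails if another process still has the file open. The 2>&1 redirect makes
-- sure any error text comes back through xp_CmdShell instead of being lost.
INSERT INTO @Output (Line)
EXEC master.dbo.xp_cmdshell 'REN "C:\ETL\Inbound\DailyFeed.csv" DailyFeed.csv 2>&1';
-- A successful rename returns no output (just a NULL row); any text means trouble.
IF EXISTS (SELECT 1 FROM @Output WHERE Line IS NOT NULL)
    PRINT 'File is still open (or missing) - skip it this cycle.';
ELSE
    PRINT 'File is closed and the Modified Date is untouched - safe to BULK INSERT.';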
Then there's the additional functionality provided by WMIC at the command line. I wrote an enterprise-wide "disk status" proc that provides a daily report on every disk on every server, including drives that point to the SAN, covering free space and the ISDIRTY bit. It even finds and reports on memory sticks and CDs that folks may have left in a machine and can't find. It's all done through SQL Server and xp_CmdShell/CLI, and it works great.
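As a rough illustration (not the actual proc, and assuming xp_CmdShell is enabled), something like this pulls the raw disk data for one machine; the parsing and reporting are left out:
IF OBJECT_ID('tempdb..#DiskRaw') IS NOT NULL DROP TABLE #DiskRaw;
CREATE TABLE #DiskRaw (Line nvarchar(4000));
-- Caption = drive letter, VolumeDirty = the "is dirty" bit. Add /node:"SomeServer"
-- (a made-up name here) to reach a remote machine instead of the local one.
INSERT INTO #DiskRaw (Line)
EXEC master.dbo.xp_cmdshell
     'wmic logicaldisk get Caption,FreeSpace,Size,VolumeDirty /format:csv';
SELECT Line AS RawWmicOutput
  FROM #DiskRaw
 WHERE Line IS NOT NULL;   -- parse the CSV columns from here as needed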
Heh... and then there's stuff like the relatively new FORFILES command. Amazing stuff there.
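For example (the path and file age are made up, and this again assumes xp_CmdShell is enabled), FORFILES turns "find every backup older than 30 days" into a one-liner:
-- Lists the full path of every .bak file under C:\Backups older than 30 days;
-- swap the echo for a del to turn it into a cleanup step.
EXEC master.dbo.xp_cmdshell
     'FORFILES /P "C:\Backups" /S /M *.bak /D -30 /C "cmd /c echo @path"';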
Longer story shorter, I absolutely agree. If folks don't know what's available at the CLI, they don't know what they're missing. And, yeah... you can buy software that does that type of stuff... and then watch it break on the next release of Windows Server (which we just went through with a simple monthly Windows Update that we had to roll back because of the break).
--Jeff Moden
Change is inevitable... Change for the better is not.
October 19, 2016 at 9:58 am
Glenn Berry helped me come up with this one to see whether a system is virtual:
one line:
if (object_id('tempdb..#ServerInfo') is not null) drop table #ServerInfo create table #ServerInfo (Logdate varchar(50), ProcessInfo varchar(256), Model varchar(1000)) insert into #ServerInfo(LogDate,ProcessInfo,Model) EXEC ('sys.xp_readerrorlog 0, 1, N''Manufacturer''') select model 'System Manufacturer and System Model (virtual machines are VMWare)' from #ServerInfo -- EXEC ('sys.xp_readerrorlog 0, 1, N''Manufacturer''')
block:
-- Drop the temp table if it's left over from a previous run
if (object_id('tempdb..#ServerInfo') is not null)
    drop table #ServerInfo
-- Holds the raw rows returned by xp_readerrorlog
create table #ServerInfo
    (Logdate varchar(50), ProcessInfo varchar(256), Model varchar(1000))
-- Pull the "Manufacturer" line from the current SQL Server error log
insert into #ServerInfo(LogDate,ProcessInfo,Model)
EXEC ('sys.xp_readerrorlog 0, 1, N''Manufacturer''')
-- Physical boxes report the hardware vendor; virtual machines report the hypervisor
select model 'System Manufacturer and System Model (virtual machines are VMWare)' from #ServerInfo
-- EXEC ('sys.xp_readerrorlog 0, 1, N''Manufacturer''')
412-977-3526 call/text
October 19, 2016 at 10:04 am
I've appreciated MS putting "Script" buttons into a lot of their tools over the last couple of years. There are a lot of times I'll build something up in the GUI, then take that script with me to tweak and repeat for re-use. I haven't played with the Windows admin side as much lately, but I think they've been doing something similar with PowerShell scripts for their admin tasks. That gives me the ability to look at a GUI to better understand options I may not remember well for things like Extended Events, but easily port them into something I can re-use and adjust.
October 20, 2016 at 7:40 am
Peter Schott (10/19/2016)
I've appreciated MS putting "Script" buttons into a lot of their tools over the last couple of years...
The first Microsoft server product released with PowerShell support was Exchange, and it was also the first to add a PowerShell script window to its GUI. In fact, the Exchange team built the GUI on top of their own PowerShell commands to ensure consistency between the GUI and the command line interface (CLI).
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!