January 26, 2016 at 11:42 pm
Comments posted to this topic are about the item Configuring R Services in SQL Server 2016
January 27, 2016 at 5:49 am
Thanks, Nick, for the informative article.
I think that for jobs with small data loads, like analyzing SQL Server performance information, using good old-fashioned ODBC to bring the data into a workstation instance of R will remain the better choice for many users. As you mention, the primary value of R on the server is that it eliminates the need to send large data volumes over the network.
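Here is a minimal sketch of the workstation approach, assuming a DSN called "SQLPerf" and using the wait-stats DMV as an illustrative example (both are just placeholders for your own environment):

# Pull a small performance data set into a local R session over ODBC.
library(RODBC)

ch <- odbcConnect("SQLPerf")                      # hypothetical DSN
waits <- sqlQuery(ch, "SELECT wait_type, wait_time_ms
                       FROM sys.dm_os_wait_stats
                       ORDER BY wait_time_ms DESC")
odbcClose(ch)

head(waits)                                       # analyze locally, no in-database R required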
Some readers might be interested to know that the enterprise R node installed for the R server is built on the message-passing interface (MPI), a tried-and-true high performance computing protocol that has been around for a long time. For computationally intensive tasks it is possible to achieve parallel processing by distributing the R calculations to multiple nodes. Basically, the Revolution R system allows us to add "Big Math" to "Big Data".
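As a rough illustration of that distribution idea (a sketch only; the RevoScaleR compute-context and function names below are the standard ones, but treat the details as assumptions):

# Distribute an embarrassingly parallel job across workers with RevoScaleR.
library(RevoScaleR)

rxSetComputeContext(RxLocalParallel())   # swap in an HPC/MPI cluster context on a real grid

# Run the same simulation on 8 workers and collect the results as a list.
results <- rxExec(function(n) mean(rnorm(n)), n = 1e6, timesToRun = 8)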
Dan Buskirk https://www.linkedin.com/in/sqlanalytics
February 1, 2016 at 12:54 pm
Nick, great post!! It is good to know that we have to allocate more memory to utilize R functionality in 2016.
February 2, 2016 at 1:14 am
daniel.buskirk (1/27/2016)
I think that for jobs with small data loads, like analyzing SQL Server performance information, using good old-fashioned ODBC to bring the data into a workstation instance of R will remain the better choice for many users. As you mention, the primary value of R on the server is that it eliminates the need to send large data volumes over the network.
Some readers might be interested to know that the enterprise R node installed for the R server is built on the message-passing interface (MPI), a tried-and-true high performance computing protocol that has been around for a long time. For computationally intensive tasks it is possible to achieve parallel processing by distributing the R calculations to multiple nodes. Basically, the Revolution R system allows us to add "Big Math" to "Big Data".
Dan Buskirk https://www.linkedin.com/in/sqlanalytics
Hi Dan,
Thank you for your comments - and my apologies for taking so long to respond. I 100% agree with you: when the data is a reasonable size and you have a one-off analysis to do, you are far better off pulling it out of SQL Server. I don't think I am totally comfortable with the idea of running non-production, or non-essential, analysis in a production database anyway. You could pin down the privileges, but it would still be a performance drain.
But there are some applications where running analyses in-database would be beneficial, even on VERY small units of data. For example, online fraud detection, real-time customer-behaviour modelling, or real-time performance monitoring. I think these are really interesting areas!
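To make that concrete, here is a rough sketch of what pushing the computation into SQL Server can look like from R, using a RevoScaleR compute context. The connection string, table, and column names are placeholders I have invented for illustration:

# Push the computation to SQL Server instead of pulling the data out.
library(RevoScaleR)

conStr <- "Driver=SQL Server;Server=MyServer;Database=MyDB;Trusted_Connection=True"
rxSetComputeContext(RxInSqlServer(connectionString = conStr))

# Hypothetical transactions table for a fraud-scoring model.
txns <- RxSqlServerData(sqlQuery = "SELECT amount, is_fraud FROM dbo.Transactions",
                        connectionString = conStr)

model <- rxLogit(is_fraud ~ amount, data = txns)  # the model fit runs next to the data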
Cheers,
Nick
February 2, 2016 at 1:17 am
G H G (2/1/2016)
Nick, Great Post !! It is good to know that we have to allocate more memory to utilize R functionality in 2016.
I am soooo interested to see which companies start using R in their production databases, and what the potential performance impact will be! This could be a whole new world of pain. I don't know!?!
Thanks so much for your comment,
Nick
February 2, 2016 at 10:18 am
I am trying to replicate the example you posted. Can you provide the script for the view?
February 10, 2016 at 12:39 am
Has anyone here tried to run a stored proc that returned a result set to the local R environment?
It seems that the sqlQuery parameter of RxSqlServerData can't accept something like 'EXEC MySPReturnsTable'. When I try, I get mysterious error messages referring to 0 columns.
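The closest I have come to a workaround is either falling back to plain RODBC, or wrapping the proc's final SELECT in a view; a sketch with invented names ("MyDSN", "dbo.MyView") below. My guess is that the rx functions need a query whose column metadata they can inspect, which an EXEC does not give them.

# Workaround 1: run the proc over plain ODBC and get an ordinary data frame.
# (SET NOCOUNT ON inside the proc helps avoid stray rowcount messages.)
library(RODBC)
ch <- odbcConnect("MyDSN")                        # placeholder DSN
df <- sqlQuery(ch, "EXEC dbo.MySPReturnsTable")
odbcClose(ch)

# Workaround 2: re-express the proc's final SELECT as a view, so that
# RxSqlServerData gets a plain query it can describe:
# src <- RxSqlServerData(sqlQuery = "SELECT * FROM dbo.MyView",
#                        connectionString = "...")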
Suggestions?
Thanks.
February 25, 2016 at 4:47 pm
Thanks for the introduction.