March 3, 2015 at 11:31 am
I could also see Express edition used for small departmental data marts that aggregate summary data for reporting, something that can be shared between a handful of users.
One of the few nice things I could say about using the Express edition - as opposed to a fully-featured FOSS RDBMS like MySQL - is that it comes bundled with SSRS.
March 5, 2015 at 10:09 am
Our main app is an Access front end with a SQL Server back end, connected via ODBC. This can occasionally be prickly. Many of our customers use the Express version; larger customers use the Standard version. It's quite a jump from zero to whatever the Standard version costs now.
We install an archive database which is more or less a mirror of the production database. Once a month we archive historical data into the archive database and purge from the "OLTP" database. That helps to keep the OLTP database under 10 GB. The OLTP database typically has two years of history.
Express has been rock solid. I don't believe there is any reason why it would not be as reliable as the Standard version or higher. We use Ola Hallengren's scripts for backups and index optimization and trigger them via scheduled tasks.
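For anyone curious what that monthly archive-then-purge looks like, here is a minimal sketch of the pattern. It uses Python's built-in sqlite3 as a stand-in for the two SQL Server databases, and the table and column names (orders, order_date) are made up for illustration; in T-SQL the equivalent would be an INSERT ... SELECT into the archive database followed by a DELETE against the OLTP database, ideally batched inside a transaction.

```python
import sqlite3

# Two stand-in databases: "oltp" plays the production database,
# "archive" plays the mirror we move old history into.
oltp = sqlite3.connect(":memory:")
oltp.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, order_date TEXT, amount REAL)")
oltp.executemany(
    "INSERT INTO orders (order_date, amount) VALUES (?, ?)",
    [("2012-06-01", 10.0), ("2013-01-15", 20.0), ("2014-11-30", 30.0), ("2015-02-01", 40.0)],
)

archive = sqlite3.connect(":memory:")
archive.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, order_date TEXT, amount REAL)")

# Keep roughly two years of history in the OLTP database.
cutoff = "2013-03-01"

with oltp:  # connection as context manager: commits on success
    old_rows = oltp.execute(
        "SELECT id, order_date, amount FROM orders WHERE order_date < ?", (cutoff,)
    ).fetchall()
    with archive:
        # Copy the historical rows into the archive first...
        archive.executemany(
            "INSERT INTO orders (id, order_date, amount) VALUES (?, ?, ?)", old_rows
        )
    # ...then purge them from the OLTP database.
    oltp.execute("DELETE FROM orders WHERE order_date < ?", (cutoff,))

print(oltp.execute("SELECT COUNT(*) FROM orders").fetchone()[0])     # rows kept in OLTP
print(archive.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # rows archived
```

Archiving before purging, in that order, is what keeps the pattern safe: if the job dies between the two steps you have a duplicate row in the archive, not a lost one.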
March 10, 2015 at 7:45 pm
I ran a 4700+ store retail chain on 9400 instances of SQL express. Microsoft loved me 😎
March 10, 2015 at 8:18 pm
jonesboy21 (3/10/2015)
I ran a 4700+ store retail chain on 9400 instances of SQL express. Microsoft loved me 😎
Why would you do that? Surely the cost of maintenance and any development would exceed the money saved by not going to a paid version, or have I missed something?
March 10, 2015 at 8:26 pm
Two instances in every store running on a physical server. Cloud wasn't an option at the time. No extra development costs.
March 11, 2015 at 1:43 am
ShineBoy (3/10/2015)
Why would you do that?
Let's see.
The POS system (cash registers) might have its own data store and handle all the credit card processing.
H/R and payroll might be outsourced so only timesheet data would be collected locally.
The inventory is not lot controlled. Right?
I could see it happening. Your data retention policy (as in you only keep what you need) must be outstanding. If you are not a tight bastard and passed the savings back to the customers and on to the employees then great. Development costs? Why that is nothing more than job creation.
There has to be at least an article in this. Publish the story of how you did it.
Do I feel something tugging on my one leg? Naw. April is next month.
ATB,
Charles Kincaid
March 11, 2015 at 7:35 am
jonesboy21 (3/10/2015)
Two instances in every store running on a physical server. Cloud wasn't an option at the time. No extra development costs.
I'm guessing this retail giant has SQL Server Enterprise running back at corporate headquarters, and daily sales from each store are periodically replicated or ETL'd back. This sounds like a classic distributed architecture, only on a grand scale.
With 4,700 stores, you can bet that at least 100 are having internet connectivity issues at any given moment, so running SQL Server Express locally, rather than hitting a central cloud database directly, keeps the POS registers running non-stop, even if it does require a little more IT support for the data transfers.
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
March 11, 2015 at 8:39 am
I could write an article about it; I'm not sure how much information I could give out, though, due to confidentiality agreements. We did have large SQL Server Enterprise servers running back in the data center. Data was ETL'd back, but not using traditional ETL tools - all custom Java and .NET applications. 100 having problems at a given time is a little high unless there was severe weather or some type of major provider outage; usually it was around 30-50 at any given time. The application that made it all possible was a custom .NET app that we ran from the data center. It was a lightweight, multi-threaded C# app that could query and pull data back from all the instances across the chain in under 5 minutes. Pretty neat stuff.
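The fan-out pattern that app used can be sketched in a few lines. The real thing was a custom C# service; this Python sketch just shows the shape of it. query_store is a made-up stand-in for the real per-store query over the wire, returning a fake daily-sales figure so the example is self-contained; the worker count and timeout are illustrative, not the values the original app used.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def query_store(store_id):
    """Stand-in for querying one store's local Express instance."""
    return store_id, store_id * 100.0  # pretend this came back over the wire

def collect_sales(store_ids, max_workers=50, timeout=300):
    """Fan out queries to every store in parallel and aggregate the results."""
    results, failed = {}, []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(query_store, s): s for s in store_ids}
        for fut in as_completed(futures, timeout=timeout):
            store = futures[fut]
            try:
                sid, total = fut.result()
                results[sid] = total
            except Exception:
                failed.append(store)  # e.g. a store that is offline right now
    return results, failed

results, failed = collect_sales(range(1, 11))
print(len(results), len(failed))  # 10 0
```

The key design point is that one slow or offline store lands in the failed list instead of stalling the whole sweep, which is how thousands of instances can be polled in minutes.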
March 11, 2015 at 9:02 am
jonesboy21 (3/11/2015)
...It was a lightweight, multi-threaded C# app that could query and pull data back from all the instances across the chain in under 5 minutes. Pretty neat stuff.
One of the first large applications I worked on, back in the early 1990s, was for a publishing company with about 20 remote call centers located on five continents. At each remote location, a nightly batch process FTP'd a copy of the MS Access database back to headquarters for import into SQL Server 6.5. The only connectivity available was dial-up internet, which was very slow and spotty, especially between the US and Asia at the time. Circumstances would routinely arise where that nightly batch process was deferred for a week or more, so having a disconnected, self-contained application with a local database was a given.
Today in 2015, always-on internet should still not be taken for granted, even in locations with the most reliable and cutting-edge infrastructure. It's inconvenient for accounting when the corporate data warehouse updates are a couple of days late, but it's a disaster for everyone if one or more remote offices are entirely non-operational because they can't reach the cloud.
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
March 11, 2015 at 9:48 am
Eric M Russell (3/11/2015)
...Today in 2015, always-on internet should still not be taken for granted...
Microsoft wanted to make that assumption for the Xbox One and had to do a U-Turn.
PS I don't think U-Turns are a bad thing, per se, as they illustrate flexibility in thinking and action. Multiple U-Turns on the same subject do not.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
March 11, 2015 at 2:04 pm
Wow, really glad I asked "why would you do that?" Enjoying the answers. Thanks, guys.
March 11, 2015 at 2:37 pm
Gary Varga (3/11/2015)
Eric M Russell (3/11/2015)
...Today in 2015, always-on internet should still not be taken for granted...
Microsoft wanted to make that assumption for the Xbox One and had to do a U-Turn.
PS I don't think U-Turns are a bad thing, per se, as they illustrate flexibility in thinking and action. Multiple U-Turns on the same subject do not.
I live in such an area. I have the same complaint about device makers assuming that you have a persistent wireless connection: I lose phone signal about 5 minutes from my house.
Aside from wishing all application developers were denied admin access on their development computers, I wish all wireless device makers and developers were required to have offices in areas with poor or broken coverage. We don't all live in areas that are saturated in cellular and WiFi.
-----
[font="Arial"]Knowledge is of two kinds. We know a subject ourselves or we know where we can find information upon it. --Samuel Johnson[/font]
March 11, 2015 at 5:37 pm
Too true Wayne.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
March 11, 2015 at 5:39 pm
Gaz! Up so late?
March 11, 2015 at 5:43 pm
My son is working and I have to pick him up. It's the Rock'n'Roll lifestyle here 😉
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!