March 7, 2017 at 9:26 am
Hi All,
Very much a newbie to the whole SQL/Cloud/Web scene so apologies if anything below appears a bit 'dim'!
To cut to the chase: we provide an application to clients up and down the country. The application stores all of its data in a SQL Server database; by default, and where possible, we install SQL Express on the client's computer or server, and then install our application on however many of the client's machines need access to it. That would be a typical installation for a client!
Now what we are looking to achieve (and have managed to test successfully) is eliminating the need to install SQL on the client's site and hosting the instance elsewhere instead (our server / cloud / virtual server / datacentre, etc.), which is the bit I'm requesting a bit of guidance with!
In the successful testing I have done in-house, I port-forwarded port 1433 and connected to the database on my own PC from an external computer (using my 'publicaddress\instance-name,1433').
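A minimal sketch of that kind of connection over an explicit TCP port is below (this assumes pyodbc and the Microsoft ODBC driver are installed; the host name, database, and credentials are placeholders, not the actual setup described above):

```python
# Minimal sketch: connect to a SQL Server instance over a forwarded TCP port.
# Host, database, and credentials below are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=public.example.com,1433;"   # "host,port" -- no instance name needed once you connect by port
    "DATABASE=ClientDb;"
    "UID=app_user;PWD=<password>;"
    "Encrypt=yes;"                      # encrypt traffic crossing the public internet
)
print(conn.execute("SELECT @@SERVERNAME;").fetchval())
conn.close()
```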
I guess my query is how best to proceed. Would it be wise to rent a Windows server, install SQL Server, and create an instance for each customer?
What precautions do I need to take in terms of security measures and backup procedures?
Any advice or input will be greatly appreciated!
March 7, 2017 at 10:23 am
That is a big chunk to bite off.
First, I would determine whether you want to host SQL Server yourself or use cloud hosting. Given the uptime you will probably need (which I suspect in your case is 99.99999% or higher), I would be looking at cloud hosting.
Microsoft Azure has offerings that are pre-configured and ready to go, but it is not a free solution; I imagine you will have a hard time finding free cloud hosting for SQL Server.
I am assuming you want free/cheap, since you are using SQL Express. Are you aware of the limitations of SQL Express, though? If you are moving from client-hosted to cloud-hosted, I think you will want to upgrade to at least SQL Standard, if not Enterprise.
As for security measures, that is something that you, your company, and your customers must agree on. With this many different clients, AD authentication will be tricky, so I imagine you will be using SQL authentication. You might get by with one instance holding multiple databases (one per customer), but there is risk in that. The alternative is one instance per customer, but then you are limited in the number of customers you can support, as each instance would need its own TCP/IP port.
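As a rough illustration of the "one database and one SQL login per customer" model (all names and passwords below are placeholders, and it assumes pyodbc plus an admin-level login; a sketch, not a hardened provisioning script):

```python
# Sketch: provision one database and one dedicated SQL login per customer,
# so each login can only reach its own customer's data. Names are placeholders.
import pyodbc

admin = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=host,1433;"
    "UID=provisioning_admin;PWD=<admin password>;",
    autocommit=True,                 # CREATE DATABASE cannot run inside a transaction
)
cur = admin.cursor()
customer = "AcmeLtd"                 # hypothetical customer name

cur.execute(f"CREATE DATABASE [{customer}];")
cur.execute(f"CREATE LOGIN [{customer}_login] WITH PASSWORD = '<strong unique password>';")
cur.execute(f"USE [{customer}]; CREATE USER [{customer}_user] FOR LOGIN [{customer}_login];")
cur.execute(f"USE [{customer}]; ALTER ROLE db_datareader ADD MEMBER [{customer}_user];")
cur.execute(f"USE [{customer}]; ALTER ROLE db_datawriter ADD MEMBER [{customer}_user];")
admin.close()
```

The point of the pattern is that the per-customer login is never granted anything at the server level, so a compromised customer credential only exposes that customer's own database.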
For backups, that is more up to you. How much data are you willing to lose in an emergency, such as the cloud host going offline permanently without notice? I would recommend keeping offsite copies of your backups with one or two different providers. How much data you are willing to lose will determine how frequently you take backups.
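A minimal sketch of that idea, taking a backup and immediately copying it off the database host (paths, database name, and credentials are placeholders; a real job would also verify the backup and rotate old files):

```python
# Sketch: back up a customer database, then copy the .bak to a second location
# so losing the host does not also mean losing the backups. Paths are placeholders.
import shutil
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=host,1433;"
    "UID=backup_user;PWD=<password>;",
    autocommit=True,                 # BACKUP cannot run inside a user transaction
)
cur = conn.cursor()
cur.execute(
    "BACKUP DATABASE [AcmeLtd] "
    "TO DISK = N'D:\\Backups\\AcmeLtd.bak' WITH INIT, CHECKSUM;"
)
while cur.nextset():                 # drain informational messages so the backup finishes
    pass
conn.close()

shutil.copy2(r"D:\Backups\AcmeLtd.bak", r"\\offsite-store\sqlbackups\AcmeLtd.bak")
```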
Hosting things locally puts the control in your hands, but at higher cost, since you need to set up Windows failover clustering and some flavour of HA/DR in SQL Server. If you aren't running SQL Server Enterprise, you are very limited in your HA/DR options. DH2i offers one solution that I have used and can recommend, but I am sure there are others.
Something to keep in mind is that some companies who use your software might not like their data living in the cloud, and it adds a lot of responsibility onto you, on both the technical side and the legal side. Hypothetically, what would you do if somebody hacked your database, got all of your customers' data, and leaked it publicly? What if something happens to your cloud server and you have unexpected downtime? Would all of your customers accept 24 hours of downtime during which you can do nothing but sit and wait to hear back from the hosting company?
If this is the path you wish to go down, I would first figure out your budget, do a bit of research into what you can afford with it, and then figure out what liability you would be accountable for. Then I would double the budget (budgets can explode unexpectedly), and I would highly recommend hiring a consultant to help with this.
I think the biggest takeaway from this is: get a consultant. That is a HUGE project, and if you have no subject-matter experts on site, a consultant would be the best use of the money. I expect this to be VERY expensive and VERY time-consuming to complete.
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum - you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
March 7, 2017 at 1:11 pm
andrew 55751 - Tuesday, March 7, 2017 9:26 AM (original post quoted above)
First, I would agree with bmg002 on getting a consultant in; you're looking at a very large project.
However, I would also strongly, strongly, strongly recommend against just forwarding port 1433 through your firewalls. Truthfully, I wouldn't forward anything through your firewall if you can help it. From a security standpoint, you'd be better served by configuring some sort of VPN between the clients and you. If you forward 1433, you're basically opening up your SQL Server to anyone on the Internet who wants to take a stab at logging into it, and rest assured, someone *will* succeed.
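To make that concrete: a forwarded port is trivially discoverable. Anyone can run a check like the sketch below against your public address (the host name here is a placeholder), and automated scanners do exactly this across the whole internet all day long:

```python
# Sketch: check whether TCP 1433 is reachable on a public address.
# Attackers' scanners do the same thing, constantly and at scale.
import socket

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(3)
    result = s.connect_ex(("public.example.com", 1433))   # placeholder address

print("1433 is open to the internet" if result == 0 else "1433 is filtered or closed")
```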
March 8, 2017 at 2:58 am
bmg002 - Tuesday, March 7, 2017 10:23 AM (quoted above)
Hi bmg002,
Thank you for the quick response. I can confirm you are correct that we would require 99.99% uptime, so cloud hosting is definitely something we can consider.
I have briefly looked around at suppliers and we are prepared to pay for hosting! In terms of SQL Express limitations: yes, I was aware it had limitations, but I never delved into what they were specifically. When it comes to separating out customers and instances, I agree it would be best to have an instance per customer, as it would get extremely messy having multiple ports!
When it comes to backups, we are not talking massive amounts of data (although the data is crucial). A typical customer backup will be anywhere from 5 MB to 100 MB, depending on the size of the client and how long the system has been running. I believe the biggest backup to date is around 900 MB, but that system has been running for a good 5-6 years and we do not have many of those; ideally, for those customers we would push the client to keep the data on their own servers. So the biggest backup we are likely to come across is around 100 MB (and even that's far-fetched!) for maybe up to five customers at most.
For the client in mind, hosting data in the cloud will not be an issue (provided the data is secure). When it comes to the legal side of things, I'm currently stumped - would that not sit with the cloud provider we choose to go with?
In terms of budget, I can't say we have anything allocated yet. We have spoken with a few providers, both local to us and online; what they lay out in their quotes differs, and the pricing varies dramatically!
jasona.work - Tuesday, March 7, 2017 1:11 PM (quoted above)
Thanks for the reply, Jasona. I guess the reason for forwarding through the firewalls is to keep things as simple for our client(s) as we can (we all know how tedious it can be to get them to action simple tasks!). The idea is that our application sits on their desktop, they double-click it, and away they go!
March 8, 2017 at 7:36 am
Is there a reason you want the databases in the cloud, though? Is this something customers are requesting, or something your company thinks is a good idea? I am just not sure that putting the database in the cloud while the software lives locally is the best solution.
I think there are going to be a lot of headaches getting the software to work with each company's individual requirements and ensuring the data is secured. As the provider of the service, I believe you could be considered liable if the data got hacked because of your security settings. Think of the cloud provider as just another disk. They do their best to secure it, but let's say your admin password is "Passw0rd!". It meets most complexity requirements (an upper-case letter, a lower-case letter, more than 6 characters, a symbol and a number), so a lot of sites will say it is OK. A hacker gets in, steals all of the data, and puts it out in the open; your customers will not be looking to the cloud provider for help, they will be coming to you. You could complain to the cloud host, but there is only so much they can do, and if it is your account password that gets compromised rather than a breach on their end, it is very unlikely they will help you much.
And with any breach that results in private data your company holds being released publicly, you will lose a lot of customers.
Legal questions aside, I think you meant one database per customer, not one instance per customer. One instance per customer means you will use more resources and need a different port for every customer.
And my last concern is backups. Do you really want any of your customers handling those? That feels like a huge security risk; if I owned the instance, I wouldn't want customers having any elevated access like that. But I am a paranoid DBA... I wouldn't want a customer coming to me to say their database is corrupt and needs restoring, only to find I have no backups. Also, if I were a potential customer, one of the benefits in my mind of having the database in the cloud is that it would be backed up for me and I wouldn't need to allocate space for backups. For that particular customer, I do not see what benefit they would gain from the cloud system you are proposing.
EDIT - I also forgot to mention restores. If the database needs to be restored by the customer, that user needs high-level permissions (dbcreator, or sysadmin). Making your customers sysadmin is a HUGE security issue, as they would then be able to see everything in every database.
I would first evaluate whether the cloud is the correct solution at all. If the application requires near-100% uptime (or the five nines most quality cloud services offer), on-premises might still be a better and a LOT cheaper solution.
March 8, 2017 at 8:23 am
bmg002 - Wednesday, March 8, 2017 7:36 AM (quoted above)
Thanks for the response again, and apologies - I may have been a bit vague when trying to give you a good scenario!
So, to give an overview of the situation: we supply our software and, for the purpose of this example, we also supply what I'll label 'Terminal A'.
Once 'Terminal A' is configured on site, we connect it to the customer's network and install our software and SQL Server locally, either on a standalone machine or, where possible, on the customer's server (when allowed). 'Terminal A' sends data across the network to our software, and all the data is stored in the SQL instance/database we created. For 95% of customers that is exactly what happens, with no issue!
What we are now attempting is similar: we go to the customer site and install 'Terminal A', but in this case the customer is based away from the site itself and we are not allowed near their server, so there is no direct hard-wired connection between the customer and 'Terminal A'. What we have done in the past is port-forward 'Terminal A' so that this customer can see it from their standalone workstation, where we have installed the software and SQL Server, and they can collect the information as if they were physically on site (provided they have an internet connection).
Which brings us to the hosting aspect: we have been asked to move the database off this customer's machine and store it elsewhere (i.e. the cloud?), which, as you said, puts the onus on us to back up the data and so on.
So the customer would still have our software on their machine; they click the icon and, by magic, as long as they have an internet connection, they find the database and all of the data collected by 'Terminal A'.
I hope this makes a bit more sense? I could be over-complicating things by looking at this particular approach, but it's the path we have gone down so far (any alternative ways to resolve this would be appreciated?).
March 8, 2017 at 8:44 am
Dt_Andy - Wednesday, March 8, 2017 8:23 AM (quoted above)
That does make sense.
My understanding is that Terminal A collects data and dumps it into a database, and your application reads from that database and presents information to the end user. Since Terminal A and the customer's network would both need internet access to see and work with each other, is there any reason Terminal A can't just have a VPN connection back to the customer site? That way you could host the SQL instance on Terminal A, the end users could run your software on their own machines, and as long as everybody had internet access you'd be fine.
This would remove the reliance on the cloud and should meet all of your requirements, no? Even if you hosted the SQL instance in the cloud you would still need to open ports in a firewall, whereas with a VPN it would all be internal to their network. There are free VPN solutions as well as paid ones, all offering different things.
One issue I do see with the cloud: let's say you pick a cloud company called "Clouds" (a made-up example; I don't know if they actually exist). The solution works well for a few years and you gain a few more customers. Suddenly the company closes without notifying anybody, and all of your customers are left hanging.
This type of scenario happened to me with two different cloud file-hosting services I had signed up to. Think of companies like Megaupload: their service ran fine for a number of years and then one day it was shut down. It came back under a different name, but a lot of the old content was lost. Mind you, they aren't the sort of service you would be looking at - it's just an example, and the two companies I'm referring to were not Megaupload.
Also, in your scenario, what happens if the internet cuts out? Does Terminal A stop collecting data while waiting for the connection to come back, does it cache the data locally, or does the data get lost?
March 8, 2017 at 8:56 am
bmg002 - Wednesday, March 8, 2017 8:44 AM (quoted above)
'Terminal A' isn't what you would label a computer (sorry, my incorrect terminology - I should probably have used 'Device A'!). It simply collects times and is then downloaded by the host (wherever that might be).
In the scenario where the internet cuts out, 'Device A' simply stores all the data; it has a big enough reserve to keep storing new data for a few weeks. Once the internet connection is restored, everything held on 'Device A' is downloaded at the next download point.
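In other words, the device behaves like a store-and-forward buffer. A hypothetical sketch of that pattern is below; the table, column names, and connection details are invented for illustration and are not how 'Device A' actually works:

```python
# Hypothetical store-and-forward sketch: buffer readings locally and flush them
# to the database whenever a connection succeeds. All names are placeholders.
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};SERVER=host,1433;"
            "DATABASE=ClientDb;UID=device;PWD=<password>;")
local_buffer = []          # the real device would use persistent on-board storage


def record(device_id, read_at, value):
    local_buffer.append((device_id, read_at, value))


def try_flush():
    if not local_buffer:
        return
    try:
        conn = pyodbc.connect(CONN_STR, timeout=5)
    except pyodbc.Error:
        return             # still offline; keep buffering until the next download point
    try:
        cur = conn.cursor()
        cur.executemany(
            "INSERT INTO dbo.DeviceReadings (DeviceId, ReadAt, Value) VALUES (?, ?, ?)",
            local_buffer,
        )
        conn.commit()
        local_buffer.clear()
    finally:
        conn.close()
```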
March 8, 2017 at 9:15 am
Dt_Andy - Wednesday, March 8, 2017 8:56 AM (quoted above)
Ah... if you are unable to set up a VPN on the device, then the cloud does sound like a good solution, but I'd still be concerned about the cloud host being hacked or shutting down.
Is it just the one customer who wants the cloud solution? If so, can you push it back onto them and have them figure out where they want it hosted, rather than taking that on yourself?
Where I work, we have several pieces of software that require a database, and every one of them puts it on us to install the database software. One of them specifically requires SQL Server Standard or higher and does not include a license for it.
In my opinion, it should be up to the customer to provide the database, not you. In the event that a customer collects data for many, many years and breaks the 10 GB limit of SQL Express, what is your plan? Would you provide the SQL Standard license, or would it be up to the customer to purchase it?
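If it helps, how close a database is to that Express cap can be checked with a simple query; a sketch (connection details and database name are placeholders):

```python
# Sketch: report how close a database's data files are to the SQL Express 10 GB cap.
import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};SERVER=host,1433;"
                      "DATABASE=AcmeLtd;UID=app_user;PWD=<password>;")
row = conn.execute(
    "SELECT SUM(size) * 8 / 1024.0 AS data_mb "     # size is counted in 8 KB pages
    "FROM sys.database_files WHERE type_desc = 'ROWS';"
).fetchone()
print(f"Data files: {row.data_mb:.0f} MB of the 10240 MB Express limit")
conn.close()
```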
The other issue with SQL Express is that you don't get SQL Agent, so backups have to be run either manually or scheduled through the Windows Task Scheduler. I'd look at pushing this back onto the customers instead of managing it all yourself. If they want it in the cloud, they set that up with whatever edition of SQL Server they require (they may even already have a SQL Standard or Enterprise license they could use), and your application just needs to know where to send the data.
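For completeness, scheduling a backup script without SQL Agent could look something like this (task name, time, and script path are placeholders; schtasks usually needs an elevated prompt):

```python
# Sketch: register a nightly Windows scheduled task to run a backup script,
# since SQL Express has no SQL Agent. Task name, time, and path are placeholders.
import subprocess

subprocess.run(
    [
        "schtasks", "/Create",
        "/TN", "NightlyClientBackup",
        "/TR", r'python "C:\scripts\backup_clients.py"',
        "/SC", "DAILY",
        "/ST", "02:00",
    ],
    check=True,
)
```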