May 3, 2018 at 10:54 am
Hi,
I am trying to execute a package and I am getting the following error messages:
Error: A buffer failed while allocating 10475488 bytes.
Error: The system reports 95 percent memory load. There are 17179262976 bytes of physical memory with 687333376 bytes free. There are 4294836224 bytes of virtual memory with 420585472 bytes free. The paging file has 21897904128 bytes with 2934136832 bytes free.
I am in the process of testing these packages as part of a migration. In the current existing environment (Server A Visual Studio 2008), I have no issues and everything runs fine.
On the new server (Server B Visual Studio 2017), I am getting this error message.
Any thoughts as to why this is happening?
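For anyone skimming the numbers in that error, converting the byte counts to GiB is a useful sanity check (this arithmetic is mine, not part of the original post): physical memory is roughly 16 GiB and nearly exhausted, while total virtual memory is only about 4 GiB — which matches the address-space limit of a 32-bit process, a hint worth checking when an SSIS package runs out of buffer memory.

```python
# Byte counts copied from the SSIS error message above, converted to GiB.
GIB = 2 ** 30

physical_total = 17_179_262_976   # ~16.0 GiB of physical memory
physical_free  = 687_333_376      # ~0.64 GiB free -> the reported 95% memory load
virtual_total  = 4_294_836_224    # ~4.0 GiB -> the address space of a 32-bit process
virtual_free   = 420_585_472      # ~0.39 GiB of that address space left

for name, value in [("physical total", physical_total),
                    ("physical free", physical_free),
                    ("virtual total", virtual_total),
                    ("virtual free", virtual_free)]:
    print(f"{name}: {value / GIB:.2f} GiB")
```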
May 3, 2018 at 11:08 am
skaggs.andrew - Thursday, May 3, 2018 10:54 AM
What does the package do?
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
May 3, 2018 at 11:14 am
skaggs.andrew - Thursday, May 3, 2018 10:54 AM
What does the package do?
The package says the execution completed. It acts as though the data flow completed without error. However, when I look at the links between the sources and targets within the data flow task, I can see that they have not all completed. It's as if everything stopped and didn't finish the data transfer from source to target.
May 3, 2018 at 11:25 am
skaggs.andrew - Thursday, May 3, 2018 11:14 AM
Sure, but that's not quite what I meant 🙂
Is this just a simple data flow from A to B, or is there more complexity than that (eg, lookups, aggregations, joins)?
May 3, 2018 at 11:32 am
Phil Parkin - Thursday, May 3, 2018 11:25 AM
Sorry. This is a very straightforward source-to-target flow, no transformations involved. I have a data flow task with probably 10 source-to-target paths set up inside it. This has been working for years, and now that I have migrated the package, I am getting this error.
May 3, 2018 at 12:40 pm
skaggs.andrew - Thursday, May 3, 2018 11:32 AM
Do those data flows execute in parallel, or one at a time? Does the physical configuration of the new server (RAM, CPU, DISK SPACE, DRIVE LETTERS) match or exceed the old server's values? Is there a change in the SQL Server version?
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)
May 3, 2018 at 12:45 pm
skaggs.andrew - Thursday, May 3, 2018 11:32 AM
No problem.
Was this a straight migration, or were any changes made to the packages? Specifically, values like Rows Per Batch and Maximum Insert Commit Size on the data flow destinations?
Is the RAM on the 2017 machine the same, or larger, than what you have on the 2008 instance?
May 3, 2018 at 12:46 pm
sgmunson - Thursday, May 3, 2018 12:40 PM
The flows DO execute in parallel. I checked the physical configuration and they appear to be the same. In this process, I have also upgraded to SQL Server 2017. The previous version is 2008.
May 3, 2018 at 12:48 pm
Phil Parkin - Thursday, May 3, 2018 12:45 PM
I essentially just copied the packages over from the old environment. Maybe 10 packages in total. I have already tested about 6 of them and no issues until now. I did not make any changes to the package other than configure the connection manager. I have the same amount of RAM on both machines.
May 3, 2018 at 1:06 pm
skaggs.andrew - Thursday, May 3, 2018 12:48 PM
OK. What Edition of SQL Server is the 2008 instance (Standard, Enterprise, etc)? Same question for the new server.
May 3, 2018 at 1:07 pm
skaggs.andrew - Thursday, May 3, 2018 12:48 PM
And how much RAM is that? Any chance you were on the edge of capacity in the older environment but just didn't know it? Is the new server under load yet, or is this still pre-production work? While there are old boxes with 4 GB of RAM that have run SQL 2008 for years, boxes that small are often way under-powered, and even a new box with the same amount of RAM isn't likely to avoid issues that may have stayed hidden because they're edge cases. Sometimes, just having much faster processors causes the RAM requirements to jump considerably, simply because the server can now do things so much faster. I rarely spec a SQL Server with much less than 32 GB of RAM, but that doesn't mean there aren't cases for slightly less...
May 3, 2018 at 1:46 pm
Phil Parkin - Thursday, May 3, 2018 1:06 PM
SQL Server 2008
Microsoft SQL Server Management Studio 10.0.5538.0
SQL Server 2017
Microsoft SQL Server Management Studio 14.0.17224.0
May 3, 2018 at 1:49 pm
skaggs.andrew - Thursday, May 3, 2018 1:46 PM
SQL Server EDITION, not the version of Management Studio. You can run SELECT @@VERSION to find out.
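For example, either of these run against each instance will show the edition (SERVERPROPERTY is an alternative to @@VERSION and returns just the edition string):

```sql
-- Full version string, which includes the edition near the end:
SELECT @@VERSION;

-- Or just the edition on its own (e.g. Standard Edition, Enterprise Edition):
SELECT SERVERPROPERTY('Edition');
```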
May 3, 2018 at 1:53 pm
sgmunson - Thursday, May 3, 2018 1:07 PM
The current (old) server has 16 GB of RAM, and the new one also has 16 GB. Should I ask my admin to have it doubled?
May 3, 2018 at 2:07 pm
skaggs.andrew - Thursday, May 3, 2018 1:53 PM
the current(old) has 16 GB ram, the new has 16 gb of ram. should I ask my admin to have it doubled?
Maybe... but it might mean that you'd need to justify it. Do you have any stats hanging around that document RAM usage by SQL Server? It might be hard to justify without them. Alternatively, do you have any kind of stats documenting a capacity limit being reached on the old server? It could be RAM, CPU, disk space, or even I/O waits.
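If nothing is being captured already, a point-in-time look at memory from SQL Server's perspective can come from the OS memory DMV (available on both SQL 2008 and 2017). This is just a starting sketch, not a substitute for proper capacity stats collected over time:

```sql
-- Snapshot of server memory as SQL Server sees it.
-- system_memory_state_desc will report whether available memory is low.
SELECT  total_physical_memory_kb / 1024     AS total_physical_mb,
        available_physical_memory_kb / 1024 AS available_physical_mb,
        system_memory_state_desc
FROM    sys.dm_os_sys_memory;
```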