SSIS failing to save packages and rebooting Visual Studio
This is my first experience with SSIS, so bear with me. I am using SSIS to migrate tables from Oracle to SQL Server, and some of the tables I am transferring are very large (50+ million rows). SSIS is now completely freezing up and restarting Visual Studio when I simply try to save the package (not even run it). It keeps returning insufficient-memory errors, even though I am working on a remote server that has far more RAM than this package needs.

[Screenshot: error message when trying to save]

The only thing I can think of is that when this package attempts to run, my Ethernet Kbps goes through the roof right as the package starts. Maybe I need to update my pipeline?

[Screenshot: Ethernet utilization graph]

Also, my largest table fails on import due to BYTE sizes (again, nowhere near using all the memory on the server). We are using an ODBC Source, as this was the only way we could get the other large tables to load more than 1 million rows.

I have tried creating a temporary buffer file to relieve memory pressure, but it made no difference. I have set AutoAdjustBufferSize to True with no change in results, and I have also changed DefaultBufferMaxRows and DefaultBufferSize, again with no change.

ERRORS WHEN RUNNING LARGE TABLE:

Information: 0x4004300C at SRC_STG_TABLENAME, SSIS.Pipeline: Execute phase is beginning.

Information: 0x4004800D at SRC_STG_TABLENAME: The buffer manager failed a memory allocation call for 810400000 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked.

Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

Information: 0x4004800F at SRC_STG_TABLENAME: Buffer manager allocated 1548 megabyte(s) in 2 physical buffer(s).

Information: 0x40048010 at SRC_STG_TABLENAME: Component "ODBC Source" (60) owns 775 megabyte(s) physical buffer.

Information: 0x4004800D at SRC_STG_TABLENAME: The buffer manager failed a memory allocation call for 810400000 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked.

Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

Information: 0x4004800F at SRC_STG_TABLENAME: Buffer manager allocated 1548 megabyte(s) in 2 physical buffer(s).

Information: 0x40048010 at SRC_STG_TABLENAME: Component "ODBC Source" (60) owns 775 megabyte(s) physical buffer.

Information: 0x4004800D at SRC_STG_TABLENAME: The buffer manager failed a memory allocation call for 810400000 bytes, but was unable to swap out any buffers to relieve memory pressure. 2 buffers were considered and 2 were locked.

Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.

Information: 0x4004800F at SRC_STG_TABLENAME: Buffer manager allocated 1548 megabyte(s) in 2 physical buffer(s).

Information: 0x40048010 at SRC_STG_TABLENAME: Component "ODBC Source" (60) owns 775 megabyte(s) physical buffer.

Error: 0xC0047012 at SRC_STG_TABLENAME: A buffer failed while allocating 810400000 bytes.

Error: 0xC0047011 at SRC_STG_TABLENAME: The system reports 26 percent memory load. There are 68718940160 bytes of physical memory with 50752466944 bytes free. There are 4294836224 bytes of virtual memory with 914223104 bytes free. The paging file has 84825067520 bytes with 61915041792 bytes free.

Information: 0x4004800F at SRC_STG_TABLENAME: Buffer manager allocated 1548 megabyte(s) in 2 physical buffer(s).

Information: 0x40048010 at SRC_STG_TABLENAME: Component "ODBC Source" (60) owns 775 megabyte(s) physical buffer.

Error: 0x279 at SRC_STG_TABLENAME, ODBC Source [60]: Failed to add row to output buffer.

Error: 0x384 at SRC_STG_TABLENAME, ODBC Source [60]: Open Database Connectivity (ODBC) error occurred.

Error: 0xC0047038 at SRC_STG_TABLENAME, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on ODBC Source returned error code 0x80004005. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
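As a quick sanity check on the numbers in the 0xC0047011 error above (the conversions are mine, not part of the log), the byte counts can be turned into gibibytes:

```python
# Arithmetic on the byte counts reported in the 0xC0047011 error.
# The raw values are copied from the log; the GiB conversions are mine.
GIB = 2 ** 30

physical_total = 68718940160  # "bytes of physical memory"
virtual_total = 4294836224    # "bytes of virtual memory"
virtual_free = 914223104     # "bytes of virtual memory ... free"
failed_alloc = 810400000     # size of the allocation that failed

print(f"physical: {physical_total / GIB:.1f} GiB")         # ~64.0 GiB
print(f"virtual:  {virtual_total / GIB:.1f} GiB")          # ~4.0 GiB
print(f"virtual free: {virtual_free / GIB:.2f} GiB")       # ~0.85 GiB
print(f"failed allocation: {failed_alloc / GIB:.2f} GiB")  # ~0.75 GiB
```

A roughly 4 GiB virtual address space with under 1 GiB free, on a machine with about 64 GiB of physical RAM, would be consistent with the package running in a 32-bit host process, which would explain why the server's physical memory seems irrelevant here.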

This is really holding up my work. HELP!

Swoon answered 22/3, 2019 at 15:46 Comment(4)
Have you tried working with the Attunity driver for Oracle/Teradata? That driver is far more helpful and easier to work with than the native ODBC driver. As for the error, there's not much anyone can suggest when VS can't find enough memory. Have you tried running this from some other machine, maybe a local laptop/desktop? - Lindell
I haven't worked with the Attunity driver; I'll look into that. We have run it on another machine, still on the remote server, and it seemed to work slightly better (not shutting down completely as often). We haven't tested it extensively yet. - Swoon
@Swoon I updated my answer to add some other possible workarounds; check it out. - Ibbie
@Swoon Regarding not being able to save the package: it looks like it was executed and memory is still full, so you have to kill the process running the SELECT query on the Oracle engine. As for how to import a huge amount of data, you got an amazing answer below. - Remodel

I suggest reading data in chunks:

Instead of loading the whole table, try splitting the data into chunks and importing them to SQL Server one chunk at a time. A while ago I wrote a similar answer for SQLite; I will try to adapt it to Oracle syntax:


Step by Step guide

In this example each chunk contains 10000 rows.

  1. Declare two variables of type Int32 (@[User::RowCount] and @[User::IncrementValue])
  2. Add an Execute SQL Task that executes a `SELECT COUNT(*)` command and stores the result set in the variable @[User::RowCount]


  3. Add a For Loop container with the following preferences:


  4. Inside the For Loop container, add a Data Flow Task
  5. Inside the Data Flow Task, add an ODBC Source and an OLE DB Destination
  6. In the ODBC Source, select the SQL Command option and write a `SELECT * FROM MYTABLE` query (to retrieve metadata only)
  7. Map the columns between source and destination
  8. Go back to the Control Flow, click on the Data Flow Task, and hit F4 to view the Properties window
  9. In the Properties window, go to Expressions and assign the following expression to the [ODBC Source].[SQLCommand] property (for more info, refer to How to pass SSIS variables in ODBC SQLCommand expression?):

    "SELECT * FROM MYTABLE ORDER BY ID_COLUMN
    OFFSET " + (DT_WSTR,50)@[User::IncrementValue] + "FETCH NEXT 10000 ROWS ONLY;"
    

Where MYTABLE is the source table name and ID_COLUMN is your primary key or identity column.
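The For Loop screenshots are missing here; a typical configuration for this pattern (my reconstruction, not the original screenshots) would use expressions like these, with 10000 matching the chunk size in the query:

```
InitExpression:   @[User::IncrementValue] = 0
EvalExpression:   @[User::IncrementValue] < @[User::RowCount]
AssignExpression: @[User::IncrementValue] = @[User::IncrementValue] + 10000
```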

Control Flow Screenshot

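The chunked loop above can also be sketched outside SSIS. This is a hypothetical Python sketch that mimics the For Loop / OFFSET-FETCH logic against an in-memory list standing in for the Oracle table (all names are placeholders):

```python
def read_in_chunks(rows, chunk_size=10000):
    """Mimic the SSIS For Loop: fetch chunk_size rows at a time,
    advancing an offset until all rows have been read."""
    increment_value = 0       # plays the role of @[User::IncrementValue]
    row_count = len(rows)     # plays the role of @[User::RowCount]
    chunks = []
    while increment_value < row_count:
        # Corresponds to: SELECT * FROM MYTABLE ORDER BY ID_COLUMN
        #   OFFSET <increment> ROWS FETCH NEXT <chunk_size> ROWS ONLY
        chunks.append(rows[increment_value:increment_value + chunk_size])
        increment_value += chunk_size
    return chunks

# Stand-in for a 25,000-row source table:
chunks = read_in_chunks(list(range(25000)), chunk_size=10000)
print([len(c) for c in chunks])  # [10000, 10000, 5000]
```

Each iteration only ever materializes one chunk, which is the whole point: the pipeline's buffers never need to hold the full 50-million-row table at once.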

Update 1 - Other possible workarounds

While searching for similar issues, I found some additional workarounds that you can try:

(1) Change the SQL Server max memory

(2) Enable Named pipes

(3) If using SQL Server 2008 install hotfixes
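For workaround (1), max server memory can be lowered with `sp_configure`; the 8192 MB value below is only an example, not a recommendation:

```sql
-- Example only: cap the local SQL Server instance at 8 GB so the
-- SSIS pipeline buffers have room. Pick a value that fits your workload.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;
```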


Update 2 - Understanding the error

In the following MSDN link, the cause of the error is described as follows:

Virtual memory is a superset of physical memory. Processes in Windows typically do not specify which they are to use, as that would (greatly) inhibit how Windows can multitask. SSIS allocates virtual memory. If Windows is able to, all of these allocations are held in physical memory, where access is faster. However, if SSIS requests more memory than is physically available, then that virtual memory spills to disk, making the package operate orders of magnitude slower. And in worst cases, if there is not enough virtual memory in the system, then the package will fail.

Ibbie answered 22/3, 2019 at 20:34 Comment(2)
This helped out a lot, I really appreciate it Hadi. We got it to run by changing the pipes and changing the machine's virtual memory allotment. For some reason the machine was not recognizing how much virtual memory it had. This may be a temporary fix for now, but it's working. I'm sure as we get into another 90 tables, your answer will come in handy. Thanks again! - Swoon
@Swoon You are always welcome. Reading data in chunks is very effective when handling huge amounts of data. - Ibbie

Are you running your packages in parallel? If yes, change to serial.

You can also try to divide this big table into subsets using an operation like modulo. See that example :

http://henkvandervalk.com/reading-as-fast-as-possible-from-a-table-with-ssis-part-ii

(In the example he runs them in parallel, but you can run them serially.)
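In Oracle syntax, the modulo split from that article would look roughly like this (the table name, column name, and number of partitions are placeholders):

```sql
-- One data flow (or one loop iteration) per remainder value 0..3:
-- MOD(ID_COLUMN, 4) splits MYTABLE into 4 roughly equal subsets.
SELECT * FROM MYTABLE WHERE MOD(ID_COLUMN, 4) = 0;
SELECT * FROM MYTABLE WHERE MOD(ID_COLUMN, 4) = 1;
SELECT * FROM MYTABLE WHERE MOD(ID_COLUMN, 4) = 2;
SELECT * FROM MYTABLE WHERE MOD(ID_COLUMN, 4) = 3;
```

Run serially, each subset keeps the working set roughly a quarter of the full table's size.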

Also, if you are running the SSIS package on a computer that is also running an instance of SQL Server, set the Maximum server memory option for that SQL Server instance to a smaller value when you run the package. That increases the memory available to the package.

Dy answered 22/3, 2019 at 18:21 Comment(1)
I am running one table at a time. I have tried changing the Max Server Memory option in SQL Server to less than half of the original, and that didn't change anything. Looking at the SQL Server performance counters, it barely makes a blip when I attempt to run the table. Do you think there could be a batch/buffer limit in Oracle, since this error happens as I'm trying to pull from Oracle? - Swoon
