Excel > SQL column size query

  • Hi all,

    I am using the Import wizard to get data from an Excel Spreadsheet into SQL Server 2000.

The Excel sheet only has 3 columns, but one of them contains a lot of text. When I import it, the text in this column in the SQL table is cut to about 30 words. The column's data type is "char", length 255.

    What can I do to allow the whole excel field to be imported?

    Thanks very much.

    Paul.

  • Hello,

Let's make sure you actually have an issue. If you are using Query Analyzer to run a SELECT statement, make sure that QA is configured to show the full width of the column. From the QA menus, select "Tools", then "Options". Select the "Results" tab, and make sure that "Maximum characters per column" is 255 or larger.

Run the query again. If this does not fix the problem, we can go deeper into the import process.
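If you want to rule out a display issue entirely, you can check the stored length directly instead of eyeballing the QA grid. A minimal sketch, assuming a table and column named YourTable and Description (hypothetical names, substitute your own):

```sql
-- Returns the byte length of the longest value actually stored,
-- regardless of Query Analyzer's display settings.
-- "YourTable" and "Description" are placeholder names.
SELECT MAX(DATALENGTH(Description)) AS MaxBytes
FROM YourTable
```

If this reports 255, the data really is truncated in the table; if it reports more, only the display was being clipped.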

    Hope this helps

    Wayne

another thing could be the definition of the import table, or a CONVERT function:

In a CAST or CONVERT, a varchar with no size given defaults to 30 characters.

select convert(varchar,'This value is actually 42 characters long.')

results:

This value is actually 42 char

Check whether you are using a CONVERT function, and also check the definition of the table you are importing into; alter the column to varchar(255).
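A sketch of both fixes, assuming an import table called ImportTable with the oversized column named Description (both names hypothetical):

```sql
-- Widen the column so the import no longer truncates.
ALTER TABLE ImportTable
ALTER COLUMN Description varchar(255)

-- When you do use CONVERT, always give an explicit length;
-- without one, varchar defaults to 30 characters in a CONVERT.
SELECT CONVERT(varchar(255), Description)
FROM ImportTable
```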


    Lowell


--Help us help you! If you post a question, include a CREATE TABLE... statement and an INSERT INTO... statement with representative data, along with your description of the problem. That way the volunteers here can provide a tested, verifiable solution. Asking the question the right way gets you a tested answer the fastest way possible!

  • Paul,

When you import the Excel spreadsheet in DTS, on the wizard page that shows a table with your Excel spreadsheet and the name of the new SQL table, press the command button in the 3rd column (...). You then get a screen in which you can change the column definitions of your (new) table.

    If this doesn't work, just create a new table manually in Enterprise Manager, set the right column definitions and import the data into the existing table.
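Pre-creating the target table might look like the sketch below; the table and column names, and the generous sizes, are assumptions to adapt to your own data:

```sql
-- Pre-create the target table in Enterprise Manager or QA,
-- then point DTS at this existing table instead of a new one.
CREATE TABLE ExcelImport (
    FileNumber  varchar(50)   NOT NULL,
    IDNumber    varchar(50)   NULL,
    Description varchar(8000) NULL  -- 8000 is the varchar maximum in SQL 2000
)
```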

    JP

  • Wayne, Lowell and JP,

All good ideas, thanks for the responses, but I still have the problem. I do have a bit more info now to try to fix it; apologies for any inaccurate descriptions earlier.

    The excel sheet is set up as follows :

Column1: FileNumber (15 chars max)

Column2: ID Number (20 chars max)

Column3: Description (up to 25,000 chars)

OK, so the problem is the Description column; the number of characters in it varies from 20 to 25,000. I have tried a registry hack which I read about elsewhere, and I also copied one of the larger rows to the top of the sheet to make sure it was being checked; details below:

    "The Jet OLE DB provider reads a registry key to determine how many rows are to be read to guess the type of the source column. The registry setting is:

    HKLM\Software\Microsoft\Jet\4.0\Engines\Excel\TypeGuessRows

    By default, the value for this key is 8. Hence, the provider scans the first 8 rows of the source data to determine the data types for the columns. If any field looks like text and the length of data is more than 255 characters, the column is typed as a memo field. So, if there is no data with a length greater than 255 characters in the first 8 rows of the source, Jet cannot accurately determine the nature of the data type."
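For reference, the change described above can be scripted as a .reg file. Setting TypeGuessRows to 0 makes Jet scan all rows (up to a maximum of 16,384) rather than only the first 8:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\Software\Microsoft\Jet\4.0\Engines\Excel]
; 0 = scan every row (up to 16,384) when guessing column types
"TypeGuessRows"=dword:00000000
```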

    I have tried various data types for that column in SQL but can't get the import to run. It fails all the time at either row 13 or 14 with the following error :

    "Error at source for Row Number 14. Errors encountered so far in this task: 1.

    Data for source column 3('Goods') is too large for the specified buffer size."

Is there any way I can get this data into SQL?

Update: if I use ntext it only pulls in the first 8,000-odd characters, but it runs without errors. This is better, but I still need all the data.

    Thanks,

    Paul.

  • Paul

Yes you can, just enter 'text' as the field type for column 3 in SQL.

    JP

Sorry JP, I updated my post; I found a cell with about 25,000 characters in it, so that needs to be my limit.

I'm trying to avoid pointing a file location at a Word doc or text file. Should TEXT work with this many characters? In QA under Tools/Options/Results I can only set it to show 8,192 characters. Is that a limit?

    Thanks again.

  • Paul,

    Text is limited:

    Books on Line:

    text

Variable-length non-Unicode data in the code page of the server and with a maximum length of 2^31 - 1 (2,147,483,647) characters. When the server code page uses double-byte characters, the storage is still 2,147,483,647 bytes. Depending on the character string, the storage size may be less than 2,147,483,647 bytes.

    That's big enough for your column !

The max size of a varchar column is 8,000 bytes (the total row size limit is 8,060 bytes), which makes it the largest 'ordinary' data type that gets stored in a SQL data page. The text and image data types are stored on another type of data page, with a reference from the table data page.

There are some other limits on this type of column: you can't compare it with = in a WHERE clause or use it in an ORDER BY. It's really meant for storing BLOBs or large amounts of text.
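A minimal sketch of what does and doesn't work against a text column in SQL Server 2000 (table and column names are placeholders from earlier in this thread):

```sql
-- LIKE and PATINDEX do work against text columns...
SELECT FileNumber
FROM ExcelImport
WHERE Description LIKE '%some phrase%'

-- ...but direct comparison and sorting do not:
-- WHERE Description = 'some phrase'   -- fails: text is not comparable with =
-- ORDER BY Description                -- fails: text cannot be sorted
```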

    JP


Just found out the same thing from "Microsoft SQL Server 2000 for Dummies". Quality read!

Got it working now; it was a combination of things, including my own stupidity. QA was only displaying 8,192 characters, so I didn't realise I was pulling in more than that. I actually wanted the data in the SQL DB so I could search it using LotusScript, and when I tested it that way instead of in QA, it had pulled in the whole lot.

Thanks for persevering with it, JP, and cheers to the rest for your help and suggestions.

    Regards,

    Paul.
