December 12, 2013 at 8:40 pm
Hi,
Will SQL Server 2014 support reading BLOBs?
In Oracle there is a package called DBMS_LOB.
The byte-array data is stored in a BLOB data type, and based on the size we can read the data and convert it to a number, float, or whatever data type is needed.
Oracle also has utilities such as utl_raw.cast_from to read or write the byte array.
Does SQL Server have those functions?
If not, does SQL Server have plans to add them?
Thanks!
Wishing you all a Happy Christmas in advance!
December 12, 2013 at 9:46 pm
You can do some conversion from varbinary(max). But I guess the question is, what are you trying to do? What is your goal?
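For example, a minimal sketch (it assumes the bytes were written big-endian, which is how SQL Server's binary-to-integer conversion reads them):

-- a 4-byte binary value read back as an integer
SELECT CAST(0x000001F4 AS int) AS IntValue;   -- returns 500
-- if the source system wrote little-endian bytes, they would need to be reversed first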
CEWII
December 13, 2013 at 3:51 am
I'm not an Oracle person, so my understanding here may be grossly inadequate, but the short answer is No.
The longer answer is, I think, that you're doing lots and lots of CLOB and BLOB storage in Oracle to get around limitations in the VARCHAR and VARBINARY fields. We don't have those same limitations within SQL Server. So, as was noted, you can use VARBINARY(MAX) or VARCHAR(MAX) to store large binary or text objects, but still have the full functions associated with those data types. There wouldn't then be a need for all the functions associated with DBMS_LOB.
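For example, the ordinary string and binary functions work directly against the MAX types (a minimal sketch; the literal value is just an illustration):

DECLARE @doc varbinary(max) = 0x48656C6C6F20776F726C64;  -- the bytes of 'Hello world'

SELECT DATALENGTH(@doc)           AS ByteCount,       -- 11 bytes
       SUBSTRING(@doc, 1, 5)      AS FirstFiveBytes,  -- 0x48656C6C6F
       CAST(@doc AS varchar(max)) AS AsText;          -- 'Hello world'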
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
December 14, 2013 at 8:44 am
As I said, I need to split the byte array based on size.
Say there are 3 columns in a table.
I have the information in byte-array format.
I need to extract those values and insert them into the table.
The first column is 4 bytes, the second column 2 bytes, and the third 8 bytes.
December 14, 2013 at 7:45 pm
yuvipoy (12/14/2013)
As I said, I need to split the byte array based on size. Say there are 3 columns in a table.
I have the information in byte-array format.
I need to extract those values and insert them into the table.
The first column is 4 bytes, the second column 2 bytes, and the third 8 bytes.
What is the ultimate datatype of those 3 columns?
--Jeff Moden
Change is inevitable... Change for the better is not.
December 15, 2013 at 9:24 am
Jeff Moden (12/14/2013)
yuvipoy (12/14/2013)
As I said, I need to split the byte array based on size. Say there are 3 columns in a table.
I have the information in byte-array format.
I need to extract those values and insert them into the table.
The first column is 4 bytes, the second column 2 bytes, and the third 8 bytes.
What is the ultimate datatype of those 3 columns?
Whatever it may be: int, float, bigint,
or even varchar
(string data is converted to a varchar data type in the DB).
The byte array will contain a float, a smallint, and a bigint,
or
the byte array will contain a float, a smallint, and an 8-byte string.
Is there an option to split based on the size?
December 15, 2013 at 10:27 am
Can you provide a test scenario so we can better understand what you're trying to do?
December 15, 2013 at 1:08 pm
yuvipoy (12/15/2013)
Jeff Moden (12/14/2013)
yuvipoy (12/14/2013)
As I said, I need to split the byte array based on size. Say there are 3 columns in a table.
I have the information in byte-array format.
I need to extract those values and insert them into the table.
The first column is 4 bytes, the second column 2 bytes, and the third 8 bytes.
What is the ultimate datatype of those 3 columns?
Whatever it may be: int, float, bigint,
or even varchar
(string data is converted to a varchar data type in the DB).
The byte array will contain a float, a smallint, and a bigint,
or
the byte array will contain a float, a smallint, and an 8-byte string.
Is there an option to split based on the size?
There's a reason why these things are called "BLOBS". Splitting fixed length, fixed field blobs is nearly trivial if you know the "record layout" but when you add VARCHAR or any other variable length datatype to the mix, it isn't going to work unless there's an embedded delimiter in the data or something that identifies the start of each field for every row.
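For a purely fixed layout, it would look something like this (a sketch assuming a 4-byte int, a 2-byte smallint, and an 8-byte bigint stored big-endian, which is how SQL Server's binary casts interpret the bytes):

DECLARE @blob varbinary(max) =
      CAST(CAST(12345 AS int)      AS binary(4))   -- 4-byte field
    + CAST(CAST(321   AS smallint) AS binary(2))   -- 2-byte field
    + CAST(CAST(98765 AS bigint)   AS binary(8));  -- 8-byte field

SELECT CAST(SUBSTRING(@blob, 1, 4) AS int)      AS Col1,  -- 12345
       CAST(SUBSTRING(@blob, 5, 2) AS smallint) AS Col2,  -- 321
       CAST(SUBSTRING(@blob, 7, 8) AS bigint)   AS Col3;  -- 98765

An 8-byte float could be pulled out the same way, but conversions between binary and float depend on SQL Server's internal representation and aren't guaranteed to be identical across versions.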
--Jeff Moden
Change is inevitable... Change for the better is not.
December 15, 2013 at 9:04 pm
Jeff Moden (12/15/2013)
There's a reason why these things are called "BLOBS". Splitting fixed length, fixed field blobs is nearly trivial if you know the "record layout" but when you add VARCHAR or any other variable length datatype to the mix, it isn't going to work unless there's an embedded delimiter in the data or something that identifies the start of each field for every row.
There are no embedded delimiters; only the byte size is there.
Only the byte length gives the starting point of every column.
Based on the byte size, I need to split the BLOB.
December 15, 2013 at 10:31 pm
Something like
http://www.java2s.com/Tutorial/Oracle/0660__Large-Objects/0080__CLOB.htm
Byte array to number
Thanks!
December 16, 2013 at 12:51 am
yuvipoy (12/15/2013)
Jeff Moden (12/15/2013)
There's a reason why these things are called "BLOBS". Splitting fixed length, fixed field blobs is nearly trivial if you know the "record layout" but when you add VARCHAR or any other variable length datatype to the mix, it isn't going to work unless there's an embedded delimiter in the data or something that identifies the start of each field for every row.
There are no embedded delimiters; only the byte size is there.
Only the byte length gives the starting point of every column.
Based on the byte size, I need to split the BLOB.
Ok... so where is that information stored and is it stored for every row so you can do variable length columns?
Shifting gears, it would really be nice if you took a look at the first link under "Helpful Links" in my signature line below
--Jeff Moden
Change is inevitable... Change for the better is not.
December 16, 2013 at 12:54 am
yuvipoy (12/15/2013)
Something like http://www.java2s.com/Tutorial/Oracle/0660__Large-Objects/0080__CLOB.htm
Byte array to number
Thanks!
I don't actually see anything helpful in there for this... especially since I'm doing my damnedest to forget everything I ever learned about Oracle. 🙂
Like I said, see the first link in my signature line below
--Jeff Moden
Change is inevitable... Change for the better is not.
December 17, 2013 at 9:29 pm
Ok... so where is that information stored and is it stored for every row so you can do variable length columns?
The information is stored in a BLOB column as byte arrays (junk characters), and I need to convert it back to the original format.
If you run something like this in Oracle:
Select utl_raw.cast_to_varchar2(utl_raw.CAST_FROM_BINARY_INTEGER(87987)) from dual;
you will get output something like:
W'
That output is my input. I need to read it and convert it back
to the original value, 87987.
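In SQL Server, I am guessing the round trip would look something like this (assuming CAST between int and binary(4) is the right tool; 87987 is the value from the example above):

SELECT CAST(87987 AS binary(4)) AS AsBytes;    -- 0x000157B3
SELECT CAST(0x000157B3 AS int)  AS AsNumber;   -- 87987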
December 17, 2013 at 10:49 pm
yuvipoy (12/17/2013)
Ok... so where is that information stored and is it stored for every row so you can do variable length columns?
The information is stored in a BLOB column as byte arrays (junk characters), and I need to convert it back to the original format.
If you run something like this in Oracle:
Select utl_raw.cast_to_varchar2(utl_raw.CAST_FROM_BINARY_INTEGER(87987)) from dual;
you will get output something like:
W'
That output is my input. I need to read it and convert it back
to the original value, 87987.
Perhaps this is due to a language barrier problem. Maybe giving Oracle examples will help someone else but it doesn't help me at all. And I also can't help at all unless I know what the field lengths of the data are and what datatypes there will be. And that's only for the fixed fields. There has to be something for each and every row that identifies the starting position for all fields if any variable-length data is present.
Do you have such data? If so, would you mind sharing it with us so we can try to help you with your problem?
--Jeff Moden
Change is inevitable... Change for the better is not.