October 19, 2010 at 7:58 am
Our infrastructure guys currently provide basic disk volumes from our SAN, and these support up to 2 TB.
Our data warehousing solution is rapidly reaching the point where some schemas are going to breach that limit. The server guys are talking about using GPT (GUID Partition Table) disks.
Are there any specific considerations or setup activities I need to be aware of to enable this?
October 19, 2010 at 11:04 am
Windows Support for Logical Units Larger than 2 TB
With Windows Server 2003 Service Pack 1 (SP1) and Windows XP 64-bit Edition (x64), the old 2 TB limits changed.
Microsoft added support for 64-bit block numbers in the disk/class layer, using the new commands included in the SCSI Block Commands-2 (SBC-2) command set. Microsoft also enabled GPT support for all Windows Server 2003 SP1 platforms. With this change, for example, a snapshot of a GPT partition on an Itanium-based machine can now be transported to a 32-bit machine for data mining or archiving purposes.
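In practice there isn't much to "enable" on the Windows side beyond initialising the new LUN as GPT instead of MBR before you format it. As a rough sketch only (the disk number, drive letter, label and 64K allocation unit size below are placeholders for your environment, and "clean" wipes the disk, so this is strictly for a brand-new, empty LUN), you could script it against diskpart, for example from Python:

import os
import subprocess
import tempfile

# Diskpart script: initialise an EMPTY SAN LUN as GPT and format it NTFS.
# "clean" wipes the disk, so never point this at a disk holding data.
# Disk number, label, drive letter and 64K unit size are placeholders.
DISKPART_SCRIPT = """\
select disk 5
clean
convert gpt
create partition primary
format fs=ntfs label="DW_DATA" unit=64K quick
assign letter=S
"""

def run_diskpart(script_text):
    """Run diskpart /s with the given script (needs an elevated prompt)."""
    with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
        f.write(script_text)
        path = f.name
    try:
        subprocess.run(["diskpart", "/s", path], check=True)
    finally:
        os.remove(path)

if __name__ == "__main__":
    run_diskpart(DISKPART_SCRIPT)

The only GPT-specific step in there is "convert gpt"; everything else is the same as provisioning any other data volume, and you can just as easily type the same commands interactively in diskpart.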
The new limits are as follows:
• Basic or dynamic volume size: 2^64 blocks = 2^73 bytes (too big to pronounce)
• Maximum NTFS file system size that can be realized on Windows: 256 TB
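To put those figures in context, here is a quick sanity check in Python (assuming 512-byte blocks, and taking the 256 TB NTFS figure as 2^32 clusters at the maximum 64 KB cluster size, which is where that limit comes from):

# Quick sanity check of the limits quoted above.
# Assumes 512-byte blocks and NTFS's maximum 64 KB cluster size.

BLOCK = 512                    # bytes per block (sector)
CLUSTER = 64 * 1024            # maximum NTFS cluster size in bytes

max_volume = 2**64 * BLOCK     # 64-bit block numbers -> 2^73 bytes
max_ntfs = 2**32 * CLUSTER     # 2^32 clusters of 64 KB -> 2^48 bytes

print(max_volume == 2**73)     # True
print(max_volume / 10**21)     # ~9.44, i.e. roughly 9.4 zettabytes
print(max_ntfs // 2**40)       # 256 (TB, in binary terabytes)

So the 2 TB ceiling really does disappear: the volume limit moves far beyond anything a SAN will hand you, and the practical ceiling becomes the NTFS 256 TB figure quoted above.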
I have personally seen/administered a SharePoint database that was over 25 TB on a single GPT partition spread over several SAN disks... the biggest issue was backups 😉