July 21, 2005 at 9:04 pm
Microsoft Research is apparently working on P2P technology, something that might rival BitTorrent. They are claiming 20-30% faster downloads, and the technique is interesting.
They basically re-encode the chunks of the file and then distribute those, but each re-encoded chunk describes a mix of different parts of the file, so a client can regenerate parts it does not yet have. It supposedly solves the BitTorrent problem where the last few pieces of a file are sometimes held by only a couple of people.
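For the technically inclined, my rough reading of the re-encoding idea is something like the Python toy below: every block a peer sends is a random XOR combination of the original chunks, tagged with the coefficient vector that was used, and a downloader that collects enough independent combinations can solve for the originals, so no single "last piece" can go missing. The actual paper works over a much larger field than this XOR-only version, and every name in the sketch is made up for illustration, so treat it as a sketch of the idea rather than what Microsoft actually implemented.

import random

def encode_block(chunks):
    """One encoded block: a random XOR combination of the original chunks,
    tagged with the coefficient vector that was used."""
    coeffs = [random.randint(0, 1) for _ in chunks]
    if not any(coeffs):                        # skip the useless all-zero combination
        coeffs[random.randrange(len(chunks))] = 1
    data = bytearray(len(chunks[0]))
    for c, chunk in zip(coeffs, chunks):
        if c:
            data = bytearray(a ^ b for a, b in zip(data, chunk))
    return coeffs, bytes(data)

def decode(blocks, n_chunks):
    """Incremental Gaussian elimination over GF(2): keep accepted rows fully
    reduced, and once n_chunks independent blocks arrive, each row holds
    exactly one original chunk."""
    rows = []                                  # (pivot column, coefficients, data)
    for coeffs, data in blocks:
        c, d = list(coeffs), bytearray(data)
        for p, rc, rd in rows:                 # knock out the pivots we already hold
            if c[p]:
                c = [a ^ b for a, b in zip(c, rc)]
                d = bytearray(a ^ b for a, b in zip(d, rd))
        if not any(c):
            continue                           # this block carried nothing new
        pivot = c.index(1)
        for i, (p, rc, rd) in enumerate(rows): # clear the new pivot from older rows
            if rc[pivot]:
                rows[i] = (p,
                           [a ^ b for a, b in zip(rc, c)],
                           bytearray(a ^ b for a, b in zip(rd, d)))
        rows.append((pivot, c, d))
        if len(rows) == n_chunks:
            break
    rows.sort()                                # order rows by the chunk they describe
    return [bytes(d) for _, _, d in rows]

# Toy run: split a "file" into 4 chunks and rebuild it from encoded blocks.
chunks = [b"AAAA", b"BBBB", b"CCCC", b"DDDD"]
blocks = (encode_block(chunks) for _ in range(100))   # a generous supply of blocks
print(decode(blocks, len(chunks)) == chunks)          # True

The point is that almost any block one peer holds is useful to almost any other peer, so the "rare last piece" problem largely disappears; the price is the extra encode/decode work on every node.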
I don't understand the entire thing, and if you're the sort of person who can digest technical research, read the entire paper from Microsoft Research. I got lost after three pages, and might have been lost before that.
This is interesting to me, though not because I'd use it. I don't use BitTorrent, mostly because I don't have a need for it. However, I can see the advantages for patches, service packs, betas, etc. It might not only help Microsoft with bandwidth, but also make everyone's lives easier, because they won't be competing with others to get to the downloads or Betaplace software. They could actually grab the bits from other users, assuming it's legitimate traffic.
The more interesting thing is that many large corporations, especially media corporations, have come out against P2P technologies. There is a case before the US Supreme Court, MGM v. Grokster, plus suits by the RIAA, and more, where big corporations are seeking greater control over the use of the bits themselves. Even Microsoft has worked hard on DRM technologies, which are built into Media Player.
So to have them working on this technology is interesting. Maybe they feel they'll have sufficient control over the end use of their products through other tools. Maybe they'll watermark all their files from now on so they know if the files are being distributed illegally. It's hard to know, but I do think there are some great uses for this technology, and I'm glad to see them working on it.
Steve Jones
July 22, 2005 at 8:01 am
After finishing the report, I have a few things to say about it.
1) Bandwidth may be less of an issue, for a few reasons:
a) The pipe is apparently about to jump up a couple of orders of magnitude shortly; can you say fiber for all?
b) Packets may be heading from a 1500-byte MTU to 9000-byte jumbo frames, creating less per-packet overhead (rough numbers in the sketch after this list).
2) Those stats would have to come from a 'forced help' situation among file hosters/traders. That might skew the data, since a significant percentage below and above the curve might not use it, or it might not work for them, and so on, which would reduce the overall effectiveness of the sharing and of the data.
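To put rough numbers on the jumbo-frame point, here's a quick back-of-the-envelope sketch (assuming plain Ethernet, IPv4, and TCP headers with no options; real traffic varies, so these are ballpark figures only):

# Rough share of on-the-wire bytes that are actual payload, per frame size.
# Assumes Ethernet (14 B header + 4 B FCS), IPv4 (20 B), TCP (20 B), no options.
ETH, IP, TCP = 14 + 4, 20, 20

def payload_efficiency(mtu):
    """Fraction of on-the-wire bytes that carry TCP payload."""
    payload = mtu - IP - TCP          # the MTU covers the IP packet, not Ethernet
    return payload / (mtu + ETH)

for mtu in (1500, 9000):
    print(f"MTU {mtu}: {payload_efficiency(mtu):.1%} payload")
# MTU 1500: 96.2% payload
# MTU 9000: 99.4% payload

So jumbo frames win back only a few percent of header overhead; the bigger gain is that routers and hosts handle roughly one sixth as many packets for the same data.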
Now, from a managed-server standpoint, if this were enabled for traffic such as AD replication, clustering, file updates, and general network traffic, there might be an increase in scalability. Again, though, the first two points might lessen its impact.
I think I read somewhere about enhanced packet handling in the newer Cisco routers, switches, etc.; that might be another place to find bigger gains in traffic mitigation.
And finally, as long as traffic is going over DSL, the upload limitations placed on those connections may not allow for a noticeable improvement, beyond 'fixing' BitTorrent, which from my perspective is just an awful style of file transfer.
Give me FTP any day.
cheers
July 22, 2005 at 1:48 pm
P2P and grid computing could be the file systems of the future, in that your actual PC is just a shard of the total data available to you (think Google's grid computing, but without the computers in the same room). This isn't revolutionary or anything; the idea has been around for ages.
Sure, there'll be a lot of overhead to produce this sort of parallelism. But don't tell me there's not a lot of overhead in running most common OSes. History has proven that people will accept slower speeds for convenience, within certain (reasonable?) limits. And as long as it's decentralized, it may be possible to keep people from "breaking" it with DRM, required proprietary software, backdoors, and other malware. I'm thinking of something with a very strong immune system built into the very fabric of the OS.
The client and the pipe are getting much fatter, so maybe we do need to re-examine some paradigms.
Signature is NULL
July 22, 2005 at 2:25 pm
Any lock will be picked.
Fat? Nah. Greater features... faster... kinder... better. You can have a thin OS; its 'code' is then found in hardware.
We can't have a tool that does most things well and quickly without a bit of investment from the client's side. Java apps were supposed to kill the local clients; I don't see it. The new dumb terminals are the phones of today and probably the 'tricorders' of the near future. Unless Apple gets SMRT and turns the iPod into a phone. That would be pretty cool with sunglasses showing 3D Google sushilia mash-up maps while one drives.
http://www.sqlservercentral.com/forums/shwmessage.aspx?forumid=61&messageid=203668
I personally like having my own home network, with all the pluses and minuses that go along with it. There is little need for an internet app if you have them all at home already.
Happy weekend.