Apple could save a bundle on bandwidth by tapping into the unused cable/DSL bandwidth of its users. Macosrumors claims to have information pointing to the planned inclusion of a P2P system in OS X 10.5 (Leopard). Users who elected to turn on the “Reward-Sharing system” would receive Apple credits, redeemable for iTMS downloads or other goodies.
Based on some rough math behind the proposal, the team pushing this concept believes it could cut Apple’s bandwidth costs by hundreds of thousands, if not millions, of dollars per year. And by always finding the closest peer-sharing hosts, the system would also save terabytes of Internet backbone bandwidth now used for Software Updates, QuickTime Movie Trailers, and iTunes Store downloads, among other things.
Integrating P2P into the operating system at this level would be a sort of acknowledgment that P2P isn’t an activity users do on top of a network stack, but an emergent feature of the network itself, increasingly integral to everyday computing.
In the midst of the net neutrality debate, this has additional implications: users with lots of idle last-mile capacity would suddenly be using a lot more of their Comcast (etc.) bandwidth. Apple would essentially be making the internet healthier by distributing the load … but ultimately at the expense of the carriers.
thanks dsandler.org
Technorati Tags: bittorrent, itms, mac, p2p
I have often cursed at Apple for bizarre download situations, the most recent being Xcode.
I want to get Xcode for myself. I do not want the >600MB download consuming all my bandwidth while it runs. curl --limit-rate 50K -O http://some.url/and/path/to/xcode.dmg will do it, right?
Also, I want to connect to a friend’s machine remotely and get it for him, also using curl.
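In concrete terms, this is all I’m after (just a sketch, reusing the same placeholder URL as above, with a made-up hostname standing in for my friend’s machine):

    # throttled download into the current directory; -C - resumes a partial file
    curl --limit-rate 50K -C - -O http://some.url/and/path/to/xcode.dmg

    # same thing, kicked off on a friend's machine over ssh
    # (friend@friends-mac.example.com is just a stand-in hostname)
    ssh friend@friends-mac.example.com \
      'curl --limit-rate 50K -C - -O http://some.url/and/path/to/xcode.dmg'

If the URL were stable, that would be the whole story.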
But starting the download with a browser and then passing the Akamai-generated URL to curl fails, as the URL has “expired.”
Uhhh … OK. Screw it. I won’t port my app to Intel. It’s freeware, so I lose no money. The only loser here is the OSX software ecosystem.
And all because Apple wants to track downloads?
Make. It. Easy.
Mnep, I’m not sure whether you’re saying that Apple has modified curl somehow, or that the Akamai URLs are per-browser-session and won’t work with any version of curl. If the latter, that’s hardly a knock against Apple. Akamai and similar load-balancing systems are there to make things better for users. If you’re in Bolivia, why should you have to grab bits all the way from Cupertino? If you don’t like that Akamai URLs expire outside the browser, blame Akamai, not Apple.
“If you don’t like that Akamai URLs expire outside the browser, blame Akamai, not Apple.”
No offense, but that’s apologist talk. I blame Apple. They have a choice. They don’t *have* to use Akamai. They *choose* to. And that means that download URLs for >600MB files are not persistent and only work from the feature-limited context of a browser.
When I get an Ubuntu ISO, I have the choice of downloading via http (and a static URL) or using BitTorrent.
Canonical provides multiple vectors for getting large files, all of which can be easily throttled, and in the case of BitTorrent, are easily resumed for those on dialup.
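To make that concrete, here is roughly what it looks like (a sketch only; the mirror paths are placeholders, and the .torrent can be handed to whatever BitTorrent client you prefer):

    # static http URL: throttle with --limit-rate, resume with -C -
    curl --limit-rate 50K -C - -O http://some.mirror/path/to/ubuntu-desktop.iso

    # or grab the tiny .torrent and let a BitTorrent client handle the rest
    curl -O http://some.mirror/path/to/ubuntu-desktop.iso.torrent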
That’s what you’re talking about Apple doing. And I’m saying, “Welcome to the party, Apple. What held you up?”
So if I understand correctly, you’re saying that when selecting a network distribution partner, Apple should have noticed that Akamai generated URLs containing session IDs. And not only should they have noticed, but the presence of those session URLs should have been a deal-killer, and Apple should have gone with another partner.
I would venture that they never noticed this, since grabbing a URL from the browser and pasting it into the command line is something that, what, 1% of users are going to do 1% of the time? I would hope, for Apple’s, Canonical’s, or any other business’s sake, that they don’t make big decisions based on the desires of a tiny fraction of the use cases.
Don’t get me wrong: I agree that URLs should be permanent identifiers in general. But there are cases where session information is critical to the functioning of an application. C’est la vie. That fact should not be a deal killer for anyone. If Akamai is the right solution, then it’s the right solution, and the .01% use case is not a serious factor.
It may be that LimeLight or other distribution networks don’t use session URLs — I don’t know. But there are a dozen other factors more important than this in choosing a distribution partner.
I’m not being an apologist, because I don’t think Apple is “failing to make things easy” for users, or that they went with the wrong partner, etc.
What I’m saying is that when Apple is distributing >600MB files to an audience of developers, they should take into account that many of these highly technical users may well not choose to download those files over an unthrottled http session. And thus, URLs to these enormous files that carry expiring session information may well make retrieving them a PITA for that audience.
I cannot say for certain, but my impression is that Canonical realized that the files it would be distributing would be very large. Hence: people would need multiple vectors for retrieving those files, some of those vectors would need to make resuming a download easy, and tapping a distributed sharing network like BitTorrent would let them offload a good chunk of bandwidth while solving the first two points.
And if the Apple rumor is true, it seems Apple has figured out the same thing.
This discussion makes me wonder why the download managers in GUI browsers don’t let you right-click | “Throttle” any given d/l. Would be a nice plugin.
While I share Mnep’s annoyance, I’m still gonna have to side with Scot. I’m one of the <1% who would care about such things. And while I might not like the decision Apple made, I’m not their target audience (heck, I’m not even one of the “minorities” they should care about – there’s not enough people like me out there to even be a blip on their radar ;)
That’s what free, open-source systems are for. If you’re a propeller head who actually cares about something like this, there’s probably some Linux/*BSD solution, or at least the framework to roll yer own :) (see the Canonical example above)
So I just don’t see getting my knickers in a twist about Apple in this case. It’s like going to In-N-Out Burger and gettin’ all worked up that they don’t offer cruelty-free, vegan, stir-fried noodles or something…