A Day In The Cloud – Unlimited Can Work

Matt Smollinger

Unlimited. It’s a word that instills calm, peace, and serenity in many a technology consumer. Unlimited equates to “I don’t have to think”. It’s a term that grew out of the heady days of the Internet ISP craze. Everyone could be an ISP. All you needed were some telephone numbers, some modems, some routers, and bandwidth behind them.

When Tim and I originally discussed the idea behind this article, I was going to limit it to storage. After all, a great number of the reviews on SCB are storage and backup reviews. But as I started thinking and making some initial drafts, I realized that what I’m really talking about is the actual concept of “unlimited” in technology. The entire argument for and against revolves around limiting the Internet.

The Issue

A quick background on the Internet: if you were to draw it, the Internet would look like a small galaxy. There are a couple hundred global interconnect points, usually located near major cities that have the infrastructure to support them. These global interconnects are essentially giant routers that route traffic either through undersea cables or between regional interconnects.

The regional interconnects in turn connect to your ISP's backbone, which connects to its local offices, which eventually reach consumers. I'm simplifying a bit, but those are the basics of how the Internet works.

Like I mentioned, the term "unlimited" dates from the era when a million ISPs were running around trying to woo customers: AOL, Prodigy, CompuServe (which apparently is in the spellchecker on my Mac), and so on. AOL was the first to introduce "unlimited" hours, in order to distinguish itself from its competitors. I'm sure many of you remember receiving $100+ bills from your ISP because you went over your allotted hours. I know I certainly do.

With AOL’s move to unlimited, everyone else quickly followed suit, and soon every ISP that could afford it was offering unlimited plans. Eventually, in order to distinguish themselves from AOL, newer ISPs dropped their lower-rate plans altogether and focused on making unlimited cheaper.

Fast forward to today, and we face new challenges. I’ll use myself as an example. For a couple of months now I’ve had a 25 Mbps symmetrical connection to the Internet, thanks to Verizon trying to one-up the cable providers by pulling fiber to my apartment. 25 Mbps is about 446 times faster than the 56k modem I had growing up.
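That 446× figure is easy to sanity-check yourself; a quick back-of-the-envelope calculation:

```python
# Rough comparison of a 25 Mbps fiber link against a 56 kbps dial-up modem.
fiber_bps = 25_000_000   # 25 Mbps
modem_bps = 56_000       # 56 kbps

speedup = fiber_bps / modem_bps
print(round(speedup))    # 446
```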

Now, my 25 Mbps has to travel through all those interconnect points, where it merges with other data, usually onto larger fiber (a process called multiplexing). As you might imagine, there are limits, both physical and technological, to the amount of light you can shine down a fiber. The Internet is thus a series of bottlenecks, culminating in huge bottlenecks at the core routers.
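The "series of bottlenecks" idea can be sketched in a few lines: whatever your last-mile speed, your end-to-end throughput is capped by the slowest hop along the path. The hop capacities below are invented numbers for illustration, not measurements from any real network:

```python
def path_throughput(link_capacities_mbps):
    """End-to-end throughput is capped by the slowest link on the path."""
    return min(link_capacities_mbps)

# Hypothetical path: a 25 Mbps last mile, a roomy regional interconnect,
# and a congested core router sharing capacity across many users.
hops = [25, 40, 8]  # effective per-user capacity at each hop, in Mbps
print(path_throughput(hops))  # 8 -- the congested core hop dominates
```

This is why a fat fiber connection at home doesn't guarantee fat transfers across the country: the core, not the last mile, sets the ceiling.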

Wireless networks have the same problem, especially in densely populated areas. Imagine New York City or Los Angeles, where you have 10 million or more people being served by a couple hundred cellphone towers. Moving to digital cellphones helped alleviate the problem, but only temporarily.

Storage providers have similar issues. My laptop now holds more data than my desktop did 6 years ago. Desktop drives have also subsequently gotten much larger, with 3 TB being the current largest drives on the market.

Pioneered by Mozy, unlimited cloud storage was a way to get people to start using cloud storage. By not having to think about the amount of storage used, it made setup for the end-user much simpler (Just back up everything!), and it made billing simpler as well.

Now though, people have started using the services, and are backing up everything. In my previous article you can read about a somewhat extreme case: a small video studio backing up just its original RAW footage and finished products, currently about 10 TB of data (and that’s very small for video).

The Current (Wrong) Solutions

So what we have is a basic problem of supply and demand. In all three industries I listed, demand either has begun to outstrip supply or already has. Now we begin to see the arguments against unlimited coming to light, along with some proposed (and implemented) “solutions”.

The traditional ISPs have had this problem far longer than the other two industries, so they have had more time to tune their solution. Any residential traffic that reaches the core routers is deprioritized below other traffic, and so it gets queued or dropped.

This slows your traffic down significantly. My 25 Mbps is throttled to about 8 Mbps going to LA; to the UK, I get throttled to about 3 Mbps. This is all upload, mind you, since queuing and prioritization only work on traffic leaving a router, not arriving at it.

This hasn’t stopped the ISPs though from further throttling selected network protocols like BitTorrent, or instituting bandwidth caps. Even the regional networks have now become overloaded as demand continues to grow. It’s a problem that’s probably never going to go away, either.

Wireless vendors are having a more difficult time. Their problems are more extreme because their bandwidth is more limited, so they’ve taken more extreme measures. First AT&T, and now Verizon, have removed unlimited data plans altogether and instituted tiered plans with very low data allowances. This has many people worried it’s going to stifle innovation on mobile devices.

Storage vendors are fighting the good fight, but I think they are eventually going to cave too, just like Mozy. If you offer unlimited storage plans for cheap, people are going to eventually use the storage. Storage companies have been banking on slow Internet connections keeping people from uploading oodles of data, but the ISPs are trying very hard to remove that bottleneck. That will then move the bottleneck to the storage vendors.

Better Solutions

You’re probably wondering what I’m trying to get at. Well, I see two solutions that could solve these problems and give consumers a lot more freedom, but that almost no one has tried yet (well, there’s one, but I’ll get to that).

The first is what European wireless providers have already been doing: completely metered service. The only big storage vendor who is also doing this is Amazon Web Services. Does it place some responsibility on the user? Absolutely. But I think it’s time people started taking responsibility for their data, and understood just what they are doing on their computers.

This could have far-reaching implications. Just imagine if your ISP offered only metered plans. It’s toughest for traditional ISPs, because they have the largest expense to bring a single customer online, but think about it: there wouldn’t have to be this “Do I want the 5 Mbps plan or the 10 Mbps plan?” decision. You would just get the absolute fastest connection the ISP has to offer, because that would make them the most money.

People would then take spyware and malware protection very seriously. Just imagine losing a GB of transfer you paid for because you didn’t have an updated spyware scanner. It would be a computer support person’s dream, because suddenly there is a direct cost correlation: without antivirus and antimalware protection, you could blow through your entire bandwidth allotment in an instant.

The second option comes from economics class: if demand is outstripping supply, increase prices. Raising prices will cost you some customers, but there are ways to avoid that. The easiest is to make the unlimited plans expensive and offer less expensive tiers below them.

Let’s put together a scenario: I use around 40 GB of data a month, which is probably higher than the average consumer but low compared to many. In a tiered scenario, I could see an ISP offering 10, 25, 50, and 100 GB of data a month, with a possible 250 GB tier.

Additionally, they could offer an unlimited plan priced at least twice as high as the 250 GB tier. This forces users to think about their usage while still offering the convenience to those who want it. Just price it high enough to cover worst-case costs and adjust plan pricing as needed, but don’t remove the choice.
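To make the scenario concrete, here is a small sketch of how a customer (or a billing system) might pick the cheapest plan for a given monthly usage. The dollar amounts are purely hypothetical; the only constraint taken from the scenario above is that unlimited costs at least twice the 250 GB tier:

```python
# Hypothetical tiers as (monthly cap in GB, price in dollars).
TIERS = [(10, 20), (25, 30), (50, 40), (100, 55), (250, 75)]
UNLIMITED_PRICE = 150  # at least twice the 250 GB tier, per the scenario

def cheapest_plan(monthly_gb):
    """Return the name and price of the cheapest plan covering the usage."""
    for cap, price in TIERS:
        if monthly_gb <= cap:
            return (f"{cap} GB", price)
    return ("unlimited", UNLIMITED_PRICE)

print(cheapest_plan(40))   # ('50 GB', 40) -- my ~40 GB/month fits here
print(cheapest_plan(400))  # ('unlimited', 150) -- heavy users pay for it
```

Under this kind of pricing, a moderate user like me lands comfortably in a mid tier, while the heaviest users fund the unlimited option instead of being subsidized by everyone else.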

Conclusion

Honestly, I think either option could work. I know roughly how much data I use a month, and the average user could too; the concepts aren’t really that difficult to explain. The issue is that ISPs, storage vendors, and wireless carriers are very shortsighted: they have a problem, they patch it, and they don’t think about it again until it’s a problem again.

If that attitude prevailed across the entire tech industry, things like the transistor, the integrated circuit, and heck, the Internet would never have been invented. The transistor alone took years of incubation at Bell Labs to get working correctly. The best solutions come from well-laid plans, and until these companies stop implementing stop-gaps and actually assess the problem, we are all going to suffer.

I’m sure there are other ideas and possible solutions, so let’s hear them! We recently added comments to SCB, so go ahead and post your thoughts below. I look forward to reading them and seeing what you have to say.
