
Monday, April 6, 2020

The new reality of home Internet usage




It occurred to me that I have not checked my home Internet usage in quite some time. In my last post on the topic of home Internet bandwidth usage, https://kaplowtech.blogspot.com/2011/09/more-usage-same-cost-boom.html, it appeared that my bandwidth usage was heading toward the then-current Comcast cap of 250 Gbytes per month. My projection was accurate, as was Comcast's change of policy to remove the "cap" before the majority of their customers bombarded them with hate and discontent.

So, today I took a look at my home's Internet usage, and remarkably the increase was less than 100% over the period from the third quarter of 2011 to the first quarter of 2020. We have Netflix, Prime, and do a fair amount of YouTube streaming, so I would think that our usage pattern is typical. Of course, starting in September one of my sons was off to college, so I am pretty sure usage was a bit higher in August 2019 before he moved to his dormitory.


Of course, now he is back, and our family of four, confined (mostly) to our home, is banging on the Internet pretty hard. We are clearly streaming up a storm and Teams'ing, Zoom'ing, and Webex'ing so much of our work that our usage increased by over 200% in one month.

So far, the performance of these services has been good. It appears that the Internet is much more resilient than some people would have you believe, a credit to the companies and the staff that build and operate the hundreds of networks that make it possible. Of course, there was this BGP hijack, and a proliferation of cyber attacks, but that is another story…

Finally, I am not paying a proportionately higher Internet service bill than nearly 10 years ago. What the future holds for pricing I do not know. Perhaps new services, including wireless and new satellite systems, may enable enough competition to keep prices down.



Sunday, February 10, 2019

Google Fiber Fail





I wrote way back in 2012 about Google Fiber. Although that post focused on the marketing hype associated with Google providing 1Gbps Internet service, it did hint that Google was taking on physical infrastructure issues that could have major impacts on service.

The new installation techniques employed by Google, untested by decades of real-world installations, have come back to haunt Google and, more importantly, their complete customer base in Louisville, Kentucky. Google attempted to take a shortcut to the installation of fiber with a technique that directly exposes the fiber to the harsh realities of the environment. This new method uses shallow slit trenching of the fiber directly into the asphalt street pavement, which is then covered with an epoxy - what could possibly go wrong?



Unlike concrete (which has its own issues), asphalt is not solid or stable. Asphalt moves and cracks. Years of layered repaving capture water, freeze, and become the potholes that are the bane of car tires everywhere. The situation becomes worse depending on the combination of the type of ground under the pavement, the weather, and, very importantly, the types and frequency of truck traffic. What was the thought process that led anyone to believe this approach was going to last a year, let alone the 10 to 20 year set-and-forget typically required for a fiber or cable plant to be cost effective? You can actually see cracks and repairs in the pavement in the picture of Google installing fiber in Louisville. Again, what were they thinking?

Although the cable companies (the MSOs) use the shortcut technique of directly burying coaxial cable in the ground, that approach has the benefit of years of improvements and well-known maintenance needs. In general, although segments may fail, there are no region-wide systematic plant failures that require the complete re-installation of cable or fiber.

Google may have a fail-fast (or relatively fast, if you are tracking Google's set of messaging applications) approach based on "Internet Time", but in this case, not only did they fail, there is no recovery enabled by the upgrade or download of an Android or iPhone app. They created a false narrative that there was a shortcut to conventional installation approaches without performing the long-term testing that is the general hallmark of stable and reliable telecommunications systems.

Finally, unlike other companies that plan for long-term commitments to their customers, Google Fiber is apparently leaving virtually all of its Louisville customers forever. Maybe Google will finally figure out that, although they can fix or abandon applications without significant damage to their main advertising-based revenue, failing a customer at the physical layer is something that customer will not soon forget.

Sunday, July 29, 2012

Google Fiber - Less Filling (cost) Tastes Great (more bandwidth)?


There is much excitement in the news and on the Web about the Google Fiber rollout planned for Kansas City. With the promise of 1Gbps to the home at a good price point, this sounds great. There have been posts that talk about whether Google understands what they are getting themselves into.


The bottom line is that the installation of physical facilities into people’s homes means Google now has to take responsibility for prompt repair of everything from minor problems and a failed home unit (a GPON ONT?) to major damage caused by acts of nature.

However, my bent is a bit different.  First, Verizon, which uses GPON technology for their FiOS service, could also provide a “1Gbps” service.  But let’s take a bit of a look at the reality here.  Any service, virtually no matter the technology, has aggregation points.  With GPON technology the concentration point is the Optical Line Termination equipment (OLT).  Each traditional GPON port provides 2.4Gbps downstream and 1.2Gbps upstream to support the Optical Network Terminals (ONTs) at the customer locations.  Even if you provide a service template that enables a 1Gbps peak to each customer, there are generally 32 to 64 customers per GPON segment.  Also at the OLT is the uplink bandwidth from the OLT to the Internet.  In general there are one to four 10Gbps uplink connections.  So, in the best case there is 40Gbps to spread over the hundreds of customers connected to the OLT.  Moving to DWDM-GPON or 10G-GPON reduces contention on the segment to a customer, but there are still limitations from the OLT to the Internet.
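To make the segment contention concrete, here is a quick back-of-the-envelope sketch in Python using the GPON numbers above. The per-customer figures assume every subscriber on a segment bursts at the same time, which real traffic never does, so treat them as a floor rather than a prediction.

# Worst-case per-customer share of a traditional GPON segment
# (illustrative sketch; assumes all subscribers burst simultaneously).

GPON_DOWNSTREAM_MBPS = 2400   # 2.4 Gbps downstream per GPON port
GPON_UPSTREAM_MBPS = 1200     # 1.2 Gbps upstream per GPON port

for customers_per_segment in (32, 64):
    down_share = GPON_DOWNSTREAM_MBPS / customers_per_segment
    up_share = GPON_UPSTREAM_MBPS / customers_per_segment
    print(f"{customers_per_segment} customers/segment: "
          f"{down_share:.0f} Mbps down, {up_share:.0f} Mbps up each")

# 32 customers/segment: 75 Mbps down, 38 Mbps up each
# 64 customers/segment: 38 Mbps down, 19 Mbps up each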

Of course, you would say that people only need 1Gbps for a very short time, so there is great packet statistical gain on the system.  And of course you would be correct.  So, let’s look at the sustained traffic that could be demanded by a customer.  Unless the typical consumer has a video production company putting together 1080p contribution-quality video for Hollywood, most likely the home’s bandwidth is dominated by video consumption.  Let’s say the home has four HD TVs, each able to record two streams while displaying a third.  In addition, there are three mobile devices that will watch video at the same time.

So, the math works out as:

4 HD TVs x 3 HD Streams + 3 Mobile Devices x 1 HD Stream = 15 HD Streams


Wikipedia lists the bandwidths required by the popular streaming services.  The largest, the 1920x1080 video from Sony, is 8Mbps.  For our purposes, let’s round up to 10Mbps.  With that in mind, the sustained bandwidth from a customer would be:

15 HD Streams x 10Mbps = 150Mbps

Even this peak-consumption fantasy is approximately an order of magnitude less than 1Gbps.
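For anyone who wants to play with the assumptions, here is the same arithmetic as a small Python sketch; the stream counts and the 10Mbps per-stream figure are simply the ones used above.

# Sustained household demand under the (generous) assumptions above.
hd_tvs = 4                 # each recording two streams while showing a third
streams_per_tv = 3
mobile_devices = 3         # one HD stream each
mbps_per_hd_stream = 10    # ~8 Mbps 1080p figure, rounded up

total_streams = hd_tvs * streams_per_tv + mobile_devices   # 15 HD streams
sustained_mbps = total_streams * mbps_per_hd_stream        # 150 Mbps

print(f"{total_streams} HD streams -> {sustained_mbps} Mbps sustained "
      f"({sustained_mbps / 1000:.0%} of a 1 Gbps connection)")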

The math from the OLT to the Internet is interesting as well.  Assuming that you only put 20 customers per GPON segment (so that each can get their 150Mbps for their HD streams) and the typical 40Gbps uplink on the OLT, you get a maximum of 20 x 40 = 800 customers per OLT.  And of course, you have to find a way to get the 40Gbps from the Internet.  A content delivery system located close to the OLT helps dramatically, but again drives up cost.  Google has implemented their own fiber-based nationwide backbone network; is this something they plan to leverage to become a new Tier 1 ISP?
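As a rough check on that uplink math, here is a sketch of the OLT side. I am reading the 20 x 40 figure as 20 customers on each of 40 GPON ports per OLT, which is my assumption rather than anything from a vendor data sheet, and the ratio at the end simply compares peak sustained demand against the 40Gbps of uplink.

# OLT-to-Internet sketch. The 40-port-per-OLT figure is an assumption
# inferred from the 20 x 40 arithmetic above, not a vendor specification.
customers_per_segment = 20
gpon_ports_per_olt = 40            # assumption
uplink_gbps = 40                   # best case: four 10 Gbps uplinks
sustained_mbps_per_customer = 150  # from the HD-stream math above

customers_per_olt = customers_per_segment * gpon_ports_per_olt        # 800
peak_demand_gbps = customers_per_olt * sustained_mbps_per_customer / 1000

print(f"{customers_per_olt} customers, {peak_demand_gbps:.0f} Gbps peak demand, "
      f"{uplink_gbps} Gbps uplink -> "
      f"{peak_demand_gbps / uplink_gbps:.0f}:1 uplink oversubscription")
# -> 800 customers, 120 Gbps peak demand, 40 Gbps uplink -> 3:1 uplink oversubscription

Even at that 3:1 ratio, the statistical-gain argument above still applies, since not every household is pulling 15 HD streams at once.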

The bottom line is that for the vast majority of consumers, the practical limit on consumption has nothing to do with the limits of the access system to the home and much more to do with the limits of their ability to consume content (which of course will change, although 15 HD streams seems pretty generous at the moment).

This becomes a marketing game, as there is no significant near-term, and probably no medium-term, benefit to a 1Gbps connection, or to anything above around 100Mbps.  Will local providers start removing the limits on local access (where they can, for example if they use GPON or DOCSIS 3.0), moving the service bottleneck elsewhere?

So that I don't sound like a Luddite: there are likely to be new applications in the future that may change my analysis, and new devices that consume even more.

Of course, if you can only get DSL service at a few Mbps, it's time to call Google (is that even possible?) and petition for your community to go Google Fiber.