
Sunday, February 10, 2019

Google Fiber Fail





I wrote way back in 2012 about Google Fiber.  Although that post focused on the marketing hype associated with Google providing 1Gbps Internet service, it did hint that Google was taking on physical infrastructure issues that could have major impacts on service.

The new installation techniques employed by Google, not tested by decades of real-world installations, have come back to haunt Google and, more importantly, their entire customer base in Louisville, Kentucky.  Google attempted to take a shortcut in the installation of fiber with a technique that directly exposes the fiber to the harsh realities of the environment.  This new method cuts a shallow slit trench directly into the asphalt street pavement, lays the fiber in it, and covers it with an epoxy - what could possibly go wrong?



[Photo: Google installing fiber in a Louisville street]

Unlike concrete (which has its own issues), asphalt is not solid or stable.  Asphalt moves and cracks.  Years of repaving build up layers that capture water, freeze, and become the potholes that are the bane of car tires everywhere.  The situation gets worse depending on the combination of the ground under the pavement, the weather, and, very importantly, the types and frequency of truck traffic.  What was the thought process that said this approach would last a year, let alone the 10 to 20 year set-and-forget lifetime typically required for a fiber or cable plant to be cost effective?  You can actually see cracks and repairs in the pavement in the picture of Google installing fiber in Louisville.  Again, what were they thinking?

Although the cable companies (the MSOs) use the shortcut technique of direct-burying coaxial cable in the ground, their plant has the benefit of years of improvements and well-known maintenance needs.  Although individual segments may fail, there are generally no region-wide systematic plant failures that require the complete re-installation of cable or fiber.

Google may have a fail-fast (or relatively fast, if you are tracking Google's set of messaging applications) approach based on "Internet Time", but in this case not only did they fail, there is no recovery enabled by upgrading or downloading an Android or iPhone app.  They created a false narrative that there was a shortcut around conventional installation approaches, without performing the long-term testing that is the general hallmark of stable and reliable telecommunications systems.

Finally, unlike other companies that plan for long-term commitments to their customers, Google Fiber is apparently leaving virtually all of their Louisville customers forever.  Maybe Google will finally figure out that although they can fix or abandon applications without significant damage to their main advertising-based revenue, failing a customer at the physical layer is something the customer will not soon forget.

Tuesday, July 31, 2012

Some additional comments on Google Fiber

A few more comments on Google Fiber:

  • The big deal here is the set of services Google is going to provide: 1TByte of storage for data and a 2TByte DVR.  These have value.  However, are they really long-term discriminators against current services?  Traditional DVR storage can be expanded easily, units can be upgraded to record more streams at a time, and cloud storage services (already part of at least some providers' offerings) can also be expanded.  Competition is good.
  • The $120/month looks good (for a two-year contract that waives the $300 install fee), but it is not that much different than current service deals from Verizon for FiOS.  Current two-year pricing with a multi-room DVR, 75Mbps of Internet, and 285 channels (75 in HD) is $130/month.  Since FiOS is GPON-based, providing a "1Gbps" access service to the Internet is possible there as well.  This now descends into feature and marketing games.  Again, competition is good.
  • Content is the key.  Much of the cost of cable service is the content, not getting the wire to the house. Just look at the jockeying between content owners and cable and satellite providers.  For a compelling offer, Google has to deal with this issue.
  • Apparently, Google has designed their own equipment.  It is not clear if this is true for the optical transport equipment or the home video termination equipment.  It is also not clear if the optical equipment is a clean-sheet design or a derivative of existing technology (e.g., GPON or Active Ethernet).
  • People are focusing on the 1Gbps access rate, which certainly is not needed for the eight simultaneous DVR sessions; even in 3D HD that is only around 10Mbps x 2 x 8 = 160Mbps.  (Are the recordings done on the home unit or in the Google Cloud?  If in the Cloud, why does the bandwidth to the home matter?)  A quick back-of-envelope sketch of that arithmetic follows this list.
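
A minimal sketch of that last bullet's arithmetic, using the post's assumed bitrates (none of these are published Google Fiber numbers):

```python
# Back-of-envelope check of the DVR bullet above.
# All values are assumptions from the post, not published Google Fiber specs.
HD_STREAM_MBPS = 10   # assumed bitrate of one HD stream
EYES_PER_3D = 2       # 3D roughly doubles the bitrate
DVR_SESSIONS = 8      # simultaneous recordings

peak_dvr_mbps = HD_STREAM_MBPS * EYES_PER_3D * DVR_SESSIONS
print(f"Peak DVR demand: {peak_dvr_mbps} Mbps out of a 1000 Mbps access link")
# -> Peak DVR demand: 160 Mbps out of a 1000 Mbps access link
```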

Sunday, July 29, 2012

Google Fiber - Less Filling (cost) Tastes Great (more bandwidth)?


There is much excitement in the news and on the Web about the Google Fiber rollout planned for Kansas City.  With the promise of 1Gbps to the home at a good price point, this sounds great.  There have been posts that talk about whether Google understands what they are getting themselves into.


The bottom line is that the installation of physical facilities into people's homes means Google now has to take responsibility for prompt repair of everything from minor problems and a failed home unit (a GPON ONT?) to major damage caused by acts of nature.

However, my bent is a bit different.  First, Verizon, which uses GPON technology for their FiOS service, could also provide a "1Gbps" service.  But let's take a look at the reality here.  Any service, virtually no matter the technology, has aggregation points.  With GPON technology, the concentration point is the Optical Line Termination equipment (OLT).  Each traditional GPON port provides 2.4Gbps downstream and 1.2Gbps upstream to support the Optical Network Terminals (ONTs) at the customer locations, and even if you provide a service template that enables a 1Gbps peak to each customer, there are generally 32 to 64 customers per GPON segment.  The other constraint at the OLT is the uplink bandwidth from the OLT to the Internet; in general there are one to four 10Gbps uplink connections.  So, in the best case there is 40Gbps to spread over the hundreds of customers connected to the OLT.  Moving to DWDM-GPON or 10G-GPON reduces contention on the segment to a customer, but the limitation from the OLT to the Internet remains.
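
A rough sketch of where that contention shows up, using the nominal numbers above; the 40-port OLT is a hypothetical chassis size chosen only to illustrate "hundreds of customers" per OLT:

```python
# Nominal GPON/OLT numbers from the discussion above (assumptions, not a
# vendor data sheet).
GPON_DOWNSTREAM_MBPS = 2400   # shared by every ONT on one GPON segment
CUSTOMERS_PER_SEGMENT = 32    # 32 to 64 is typical; use the optimistic end
OLT_UPLINK_MBPS = 40_000      # best case: four 10Gbps uplinks
SEGMENTS_PER_OLT = 40         # hypothetical port count ("hundreds of customers")

segment_share = GPON_DOWNSTREAM_MBPS / CUSTOMERS_PER_SEGMENT
uplink_share = OLT_UPLINK_MBPS / (CUSTOMERS_PER_SEGMENT * SEGMENTS_PER_OLT)

print(f"Fair share per home on the GPON segment: {segment_share:.0f} Mbps")   # 75
print(f"Fair share per home on the OLT uplink:   {uplink_share:.2f} Mbps")    # 31.25
# Neither number is anywhere near 1Gbps once every home is busy at the same time.
```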

Of course, you would say that people only need 1Gbps for very short bursts, so there is great statistical gain on the system.  And of course you would be correct.  So, let's look at the sustained traffic that could be demanded by a customer.  Unless the typical consumer has a video production company putting together 1080p contribution-quality video for Hollywood, the home's bandwidth is most likely dominated by video consumption.  Let's say the home has four HD TVs, each watching one stream while recording two more, plus three mobile devices watching video at the same time.

So, the math works out as:

4 HD TVs x 3 HD streams + 3 mobile devices x 1 HD stream = 15 HD streams


Wikipedia lists the bandwidths required by the popular streaming services.  The largest, the 1920x1080 video from Sony, is 8Mbps.  For our purposes, let's round up to 10Mbps.  With that in mind, the sustained bandwidth demanded by a customer would be:

15 HD Streams x 10Mbps = 150Mbps

Even this fantasy-level peak demand is approximately an order of magnitude less than 1Gbps.
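
Written out as a small script, with the stream counts and the rounded 10Mbps bitrate as the only assumptions:

```python
# Worst-case sustained demand for the hypothetical household above.
HD_STREAM_MBPS = 10               # the 8 Mbps Sony 1080p figure, rounded up

hd_tvs, streams_per_tv = 4, 3     # watch one stream, record two more
mobiles, streams_per_mobile = 3, 1

total_streams = hd_tvs * streams_per_tv + mobiles * streams_per_mobile
sustained_mbps = total_streams * HD_STREAM_MBPS

print(f"{total_streams} HD streams -> {sustained_mbps} Mbps sustained")
print(f"A 1Gbps access link is {1000 / sustained_mbps:.1f}x larger than that")
# -> 15 HD streams -> 150 Mbps sustained
# -> A 1Gbps access link is 6.7x larger than that
```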

The math from the OLT to the Internet is interesting as well.  Assuming you put only 20 customers on each GPON segment (so that each can get their 150Mbps of HD streams), an OLT with 40 GPON ports tops out at 20 x 40 = 800 customers, and they all share the typical 40Gbps of uplink.  And of course, you have to find a way to get that 40Gbps from the Internet in the first place.  A content delivery system located close to the OLT helps dramatically, but again drives up cost.  Google has implemented their own fiber-based nationwide backbone network; is this something they plan on leveraging to become a new Tier 1 ISP?
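
The same style of sketch for the OLT-level math; the 40 GPON ports per OLT and the 40Gbps uplink are assumptions for illustration, not a description of Google's actual build:

```python
# OLT-level back-of-envelope, using the assumptions above.
CUSTOMERS_PER_SEGMENT = 20     # throttled so each can sustain 150 Mbps on the PON
SEGMENTS_PER_OLT = 40          # assumed fully loaded chassis
OLT_UPLINK_MBPS = 40_000       # typical best-case uplink
SUSTAINED_PER_HOME_MBPS = 150  # the 15-HD-stream estimate above

customers_per_olt = CUSTOMERS_PER_SEGMENT * SEGMENTS_PER_OLT
uplink_needed_mbps = customers_per_olt * SUSTAINED_PER_HOME_MBPS

print(f"Customers per OLT:           {customers_per_olt}")         # 800
print(f"Uplink needed if all stream: {uplink_needed_mbps} Mbps")   # 120000
print(f"Uplink available:            {OLT_UPLINK_MBPS} Mbps")      # 40000
print(f"Oversubscription:            {uplink_needed_mbps / OLT_UPLINK_MBPS:.0f}:1")
# A CDN cache near the OLT relieves exactly this uplink bottleneck.
```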

The bottom line is that for the vast majority of consumers, the practical limit has less to do with the capacity of the access system to the home and more to do with the household's ability to consume content (which of course will change, although 15 HD streams seems pretty generous at the moment).

This becomes a marketing game, as there is no significant near-term, and probably no medium-term, benefit to a 1Gbps connection, or to anything above around 100Mbps.  Will local providers start removing the limits on local access (where they can, for example if they use GPON or DOCSIS 3.0), moving the service bottleneck elsewhere?

So that I don't sound like a Luddite: there are likely to be new applications in the future that may change this analysis, and new devices that consume even more.

Of course, if you can only get DSL service at a few Mbps, it's time to call Google (is that possible?) and petition for your community to go Google Fiber.