Friday, December 25, 2009

Yikes, the Internet for Enterprise Services

Since the first days of using telecommunications to support an enterprise, the technical solution space has opted to maintain some semblance of a “private” network. Initially, this meant using versions of private line communications to connect switching and routing devices owned directly by an enterprise. Over time, cost efficiencies and the need to scale helped create commercial service providers that offered additional flexibility. This enabled resource sharing and helped enterprises move away from completely dedicated services toward shared network services.

However, one of the key features of these early shared network services, for example ATM and Frame Relay, was, and essentially remains, that an enterprise needs dedicated access (private line service) to reach them. This is because, with some limited exceptions, service providers’ ATM and Frame Relay networks are not “peered” at the data layer with other service providers. The top two reasons for this are simple. First, customers wanted the security and performance of knowing exactly who carried their traffic and how it was switched or routed, and second, the service provider wanted to keep 100% control of, and share of, their customers’ network needs.

These two factors continue to this day. Service providers’ IP/MPLS-based VPN services have replaced traditional ATM and Frame Relay with a much more flexible technology. However, even with additional capabilities, the goal remained the same: the network would continue to provide a service that ensured a level of performance and privacy. Dedicated access (that is, private lines) is still de rigueur for users of service providers’ MPLS/VPN networks. However, several technical and business changes are altering this perspective, and both enable and drive the use of Internet services as the basis for an enterprise network:

  • Dedicated access is expensive. The requirement for dedicated access stems from the need to maintain the end-to-end performance guaranteed by an MPLS/VPN service provider. However, since access is a significant driver of total network costs, the smallest practical link is generally used. Low rates mean that Quality of Service mechanisms are critical to ensure that voice and video services work effectively over pipes as small as T1s (1.5 Mbps). There are rays of hope in the cost of dedicated (or what appears to be dedicated) access with the rapid expansion of Ethernet service delivery, which enables access to the service provider’s network via Layer 2 Ethernet (and therefore does not interfere with IP addressing between the customer and the MPLS/VPN network). In general, Ethernet technology equals reduced costs.

  • Rapid deployment of broadband Internet services. These services are being deployed to provide Internet services, not to enable access from a customer location to another service provider’s MPLS/VPN network. Competition for Internet services between the incumbent local providers (e.g., Verizon’s FiOS and DSL) and cable companies (e.g., Cox and Comcast) has driven the deployment of broadband that provides tens of megabits of capability at extremely attractive prices. In fact, in many cases these finished services cost less than just the access service (whether Ethernet, T1, DS3, OC-x, etc.) alone.

  • Simplified end-to-end security products. Enterprises that use the Internet for connectivity use encryption to ensure the security and integrity of their data. Virtually all of these solutions are based on the IPSec protocol. Unfortunately, although reasonably easy to set up for a few locations, IPSec itself is a “tunnel” protocol that adds significant complexity when deployed at more than a handful of locations where a “full mesh” of connectivity (the native connectivity provided by the underlying network) is desired (see the scaling sketch after this list). Newer technologies that enable centralized control not only provide the security of IPSec between all locations of an enterprise, but also provide the ability to control, at Layer 3 and Layer 4 (the network and transport layers), the data that can flow between locations on the network.

  • Lack of end-to-end Quality of Service (QoS). Overcoming this perception requires a bit of faith. With the increasing use of Voice over IP (VoIP) services to replace traditional phone services, end-to-end QoS would seem to be needed. Virtually all MPLS/VPN providers use two mechanisms for QoS. First, they ensure that their core networks have virtually zero packet loss, and second, they enable prioritization of packets delivered from their network to a customer location. This prioritization is necessary to ensure that time-critical VoIP packets are delivered before other packets, and it is essential on the typical T1 (1.5 Mbps) MPLS/VPN customer connection to get decent utilization and good voice quality (the arithmetic sketch after this list shows why). But if you can buy 20 Mbps or more of Internet service for the same price as, or less than, the MPLS/VPN T1, does this QoS really matter anymore? Tens of millions of consumer VoIP customers in the USA alone suggest that not only does it not matter, but the quality of the Internet has improved dramatically, with packet loss rates that look more like those of MPLS/VPN services.
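
To make the full-mesh scaling problem concrete, here is a minimal sketch of how quickly the tunnel count grows with the number of locations; the site counts are hypothetical examples, not from any particular deployment.

```python
# Rough sketch: how many point-to-point IPSec tunnels a full mesh of N sites needs.
# The site counts below are hypothetical examples, not from any real deployment.

def full_mesh_tunnels(sites: int) -> int:
    """Each pair of sites needs its own tunnel: N * (N - 1) / 2."""
    return sites * (sites - 1) // 2

for sites in (5, 25, 100, 500):
    print(f"{sites:4d} sites -> {full_mesh_tunnels(sites):7d} tunnels to configure and manage")
```

Five sites need only 10 tunnels, but 100 sites need 4,950 and 500 sites nearly 125,000, which is why centralized control matters so much once the site count grows.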
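
And to see why prioritization matters so much on a T1, a little back-of-the-envelope arithmetic helps. The figures below assume standard G.711 voice in 20 ms packets with IP/UDP/RTP overhead and ignore Layer 2 framing, so real consumption is slightly higher.

```python
# Back-of-the-envelope: how much of a T1 do G.711 VoIP calls consume?
# Assumes 20 ms packets (50 per second) carrying 160 bytes of audio plus
# 40 bytes of IP/UDP/RTP headers; Layer 2 framing overhead is ignored here.

T1_BPS = 1_536_000          # usable T1 payload rate (24 x 64 kbps channels)
PAYLOAD_BYTES = 160         # 64 kbps G.711 audio per 20 ms packet
HEADER_BYTES = 40           # IP + UDP + RTP headers
PACKETS_PER_SECOND = 50

per_call_bps = (PAYLOAD_BYTES + HEADER_BYTES) * 8 * PACKETS_PER_SECOND
calls = 10
share = calls * per_call_bps / T1_BPS

print(f"One G.711 call: about {per_call_bps / 1000:.0f} kbps at the IP layer")
print(f"{calls} simultaneous calls use {share:.0%} of a T1, leaving little room for data")
```

At roughly 80 kbps per call, ten simultaneous calls already consume about half the T1, so without prioritization a single file transfer can ruin voice quality.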

What is the bottom line? The rapid deployment of low-cost, high-quality, and high-capacity Internet services, combined with more flexible and easier-to-deploy-and-manage security devices, enables enterprises to create a lower-cost, higher-capability network environment. In fact, it is not uncommon for employees to remark that their Internet-based remote VPN access from home to their corporate network is better than the access to corporate network resources they have sitting in their office!

As with all good things, this comes at a price. Using Internet services:

  • Leaves sites vulnerable to Denial of Service (DoS) attacks. Service providers have to take action to mitigate the effects, generally after the attack begins

  • Relies on a more complicated set of cooperating companies to provide the end-to-end service

  • Makes service issues hard to rectify. Although a site is served by a single ISP, reachability to other sites in general involves many Internet providers, which complicates trouble resolution. In fact, on the Internet, trouble may not be local but at the Tier 1 level of the net. The good news is that hundreds of billions of dollars of commerce and consumer services are so dependent on the Internet that problems are reported immediately and resolved as soon as possible

  • Makes enterprises dependent on an Internet that is a target for economic blackmail for profit, so any vulnerabilities that exist could be used by network “hijackers” to blackmail service providers and even nations for cash to prevent a “crash”. Because of this, service providers and vendors are constantly working to test and improve equipment, network architecture, and operating procedures

So, enterprises should now seriously consider:

  • Using low-cost, high-bandwidth Internet services for virtually all administrative network needs. This includes remote office email, internal resource sharing, and phone services (e.g., hosted VoIP). Security is provided using encryption, with control over Layer 3 and 4 traffic.

  • Using more expensive private lines and MPLS/VPN services for mission critical communications such as communication between data centers or mission critical sites that have significant impact on business operations.

From a communications service provider industry perspective, if this shift to using Internet services as the main method for building enterprise networks takes hold, what happens to:

  • The cost of dedicated access services when there are fewer customers to share costs?

  • The cost of MPLS/VPN services if the enterprise customers that would normally use this service turn to the Internet?

Friday, October 30, 2009

The Net 2020 and Whither the Post Office?

It is pretty straightforward to understand why we have a postal system: the United States Constitution, in Article I, Section 8, in the enumeration of the Powers of Congress, states “To establish Post Offices and Post Roads”. This first produced the Post Office Department, which was eventually reorganized into the Postal Service we have today. For over 220 years, America has benefited from a well-developed system of postal delivery, which was critical for the development and expansion of the Republic.

More than providing an initial communications cohesion for a new country, the development of the Postal system became the foundation for several important legal processes and concepts:

  • The use of the Postal system was universal. It was never limited just to citizens, or to economic groups, or bounded by race. If you could write and afford a stamp, the mail would go, and it would get delivered even if you were a minority
  • The Postal system provides First Class Mail, which is a mechanism for communication that is officially recognized by law. You can officially communicate with the government using mail, and optionally use Certified Mail for proof of delivery in legal proceedings
  • The Postal system is protected by law and has wide powers of enforcement. Tampering with a mailbox, whether at a residence or a pick-up box on the street, is a serious Federal crime that will land you in jail. Moreover, mail fraud will also send you to a Federal penitentiary. There are Postal Police and Postal Inspectors to make all of this happen, all well founded on the government executing its Constitutional mandate

So you say, what is going to happen to the Post Office in 2020? Will it still be around in a form that we recognize today? Email and social networking services have all but eliminated personal first-class mail use, except when someone wants to send a message with special impact. Web-enabled bill pay is available from virtually every bank (and from all the virtual banks!). The real issue is outside of the social communications arena, but squarely in the legal one. Will our method of communicating legal issues with the government or for business become completely divorced from government-run Postal services? If so, how do we keep the official and legal aspects of the current system to support official government and legal system operations?

Clearly, much communication happens outside of the Postal system today, and its impact on companies is dramatic. Due to Sarbanes-Oxley, as well as other requirements, companies keep and track essentially every communication, both internal to the company and external, in case of audits by government regulators or for discovery related to a legal proceeding.

This may work well, at a cost, for businesses, but what does it mean for citizens and residents communicating with the government or dealing with legal matters? How does such a person ensure that delivery was made? How does this person know that the email message was not tampered with during delivery? What is the person’s recourse if it gets lost in the email?

As we contemplate issues like Net Neutrality and subsidized broadband access, have we forgotten to update our concepts of mail fraud and wire fraud to cover today’s inbox-stuffing spam and intentional sending of viruses and spyware? Are these not the analogues of fraudulent offers and items sent in the mail that could damage your home and cost money and time to repair? If we use email to communicate with our government, is causing the outage of an email service provider on the same level as hijacking a mail delivery truck?

I know that these are all questions and I have not provided any significant answers. However, the transition to electronic commerce and government continues to accelerate, and we need to help our government find the right approach to replace some of the critical government and legal functions that our Postal Service provides today. Moreover, are these issues that the Postmaster General should be addressing, or will the Postal Service continue to wither away?

Monday, April 20, 2009

Getting Telecom Companies to Green

Normally, improvements in the infrastructure of a company are made to reduce the cost of providing a good or service in the marketplace. The Industrial Revolution is replete with stories of companies that would essentially lay off their entire workforce while re-tooling their factories to be more competitive. When a new and more efficient steel-making process became available, Andrew Carnegie furloughed his entire workforce while his steel mills were overhauled to ensure long-term competitiveness.


Are these economic considerations the only reasons to implement new and more efficient - that is, in this case, energy-efficient - technologies? The normal economic forces that drive equipment replacement in many industries do not always provide enough benefit to warrant new capital investment. Let's focus here on the telecommunications industry.


Today, most nationwide telecommunications carriers have depreciated virtually their entire infrastructure to zero. In short, the finance folks see no cost beyond maintenance, so spending millions in capital merely to keep the revenue they already have does not make sense to them. This has led to telecom facilities that are chock full of older equipment providing less capability and capacity than systems that can be bought today – especially in terms of functionality per Watt. If companies held onto computer systems the way telecom companies hold onto transport equipment and routers, corporate data centers would have 100 MHz Pentiums floor to ceiling covering acres. Even the government replaces, on a regular schedule, supercomputers worth tens of millions of dollars to improve its ability to predict the weather and solve other critical modeling problems.


When it comes down to it, nationwide telecom companies are real estate companies. A typical carrier will have hundreds of points-of-presence (POPs) adding up to millions of square feet of space. So, would it not make sense to get the most use out of a limited space? Would it not make sense to make the most efficient use of the available power and cooling? Of course it would. Another way of putting this: why are telecom companies not starting in earnest to become as efficient as possible, to reduce costs and to reduce their environmental impact?


So what is the hold-up that keeps telecom companies from making the same rip-and-replace decisions that Andrew Carnegie made and that virtually every person who uses information technology makes? It is certainly not the technologists at telecom companies, nor is it the line operations staff that has to deal with older equipment and its attendant maintenance requirements and issues.

There are probably several aspects that have restrained changes:

  • It is hard to justify replacing equipment that has no cost basis with new equipment only to keep the same revenue
  • It is hard to quantify the dollar value of the power and space cost savings from replacing equipment (a rough sketch of this kind of calculation follows this list)
  • It is hard to quantify the reduction in operations staff and the improvement in service deployment time that reduce costs and increase revenue
  • There appears to be a brain drain in the application of sciences, such as Operations Research, to the operation of telecom companies, so the real savings from technology changes cannot be estimated with accuracy
  • Executives are not measured on ensuring that their company is in good shape three years from now (because their network is more efficient and there is power and space to continue growth) but on whether revenue or margin increased this year or even this quarter
  • There are few external pressures, such as industry recognition or government regulation, to make these investments
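
For the quantification point above, the arithmetic itself is simple even if the inputs are uncertain. Here is a minimal, purely illustrative sketch; the shelf counts, wattages, electricity price, and cooling overhead (PUE) are hypothetical placeholders, not vendor or carrier figures.

```python
# Illustrative only: annual power-cost savings from replacing older equipment with
# more efficient gear. Every input below is a hypothetical placeholder, not a
# vendor, carrier, or utility figure.

HOURS_PER_YEAR = 8760

def annual_power_cost(total_watts: float, price_per_kwh: float, pue: float) -> float:
    """Yearly electricity cost, with cooling and facility overhead folded in via PUE."""
    return total_watts / 1000 * HOURS_PER_YEAR * price_per_kwh * pue

SHELVES = 300                     # assumed number of equipment shelves across the POPs
old_watts = SHELVES * 4_000       # assume older shelves draw 4 kW each
new_watts = SHELVES * 1_500       # assume replacements deliver the same capacity at 1.5 kW

savings = (annual_power_cost(old_watts, price_per_kwh=0.10, pue=1.8)
           - annual_power_cost(new_watts, price_per_kwh=0.10, pue=1.8))
print(f"Estimated annual power savings: ${savings:,.0f}")
```

With these made-up numbers the savings run to seven figures a year; the hard part, as the list says, is defending the inputs, not doing the math.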

So, what can be done? Somehow we need to create incentives to fix this problem. One possible solution is to extend the EPA Energy Star rating to services as well as products. Will consumers preferentially buy from a company that produces less pollution than a competitor per megabit of data transported? Is this also an ethical issue?

Clearly, bandwidth demands are going up, and they are enabling applications that people demand and expect. It is hard to believe that reducing bandwidth demand (like reducing energy consumption) is a viable option, especially if you believe that telecommunications services, enabling for example Cloud Computing, are the key technology to improve productivity and grow the economy in an environmentally sound manner.

Wednesday, March 11, 2009

Enterprise Services - At the speed of Internet innovation

Not too long ago, virtually every customized application required significant programming by a professional staff to implement the needs of an enterprise.  Then came Object Oriented Programming (OOP) as the saviour: the average Joe or Jane would be able to pick and choose from a world of programming objects, do a few clicks of the mouse, and voila, instant application.

Well, it did not really turn out that way, but that does not mean that the thought was in vain.  The rise of customizable collaboration tools has hit the Web with a vengeance, making new business empires in the process.  Today, information organization has changed from filing systems to search systems.  Can't find the zip code for a city?  You don't go to the US Postal Service site; you hit a search engine, and within seconds you have your answer.

Want to organize your life and communicate with friends?  There's Facebook and MySpace.  Need to share documents?  There are offerings from Google and Microsoft (and plenty of others).  We are quickly becoming used to the availability of applications that are a click away and that can serve multiple business purposes.  When we are deprived of these services, our productivity is significantly impacted.

Unfortunately, in many cases these essential tools are not available on the inside of a corporate enterprise network, being blocked by network access policy or limited by corporate proprietary information policy.  So, as many organizations have found out, as much as Internet access and good internal mission-specific applications are critical for business operations, self-service collaboration applications on the inside of the corporate network can be just as critical.

Tired of creating volumes of unorganized, non-searchable documents for customer service information, business processes and procedures, competitive intelligence, product descriptions, and a myriad of other similar information?  These can all be developed, managed, and supported by the appropriate functional or cross-functional organization within a business.  Wiki software and authoring tools are commercially available, at very attractive pricing I might add, that can be placed directly in the hands of those that are responsible.  This gets IT system administrative support out of the way for document changes and user access changes.  By assigning moderators to the Wiki environment, enterprises are finding that they can eliminate these barriers and ensure that extremely high-quality and up-to-date information is shared across the entire business, not just within a single group.  Combined with a high-quality search engine, staff from across the company can search, view, and comment on information without having to drive through a myriad of hyperlinks that always seem to lead either nowhere or in a circle.

It is clear that an Enterprise Services environment, one that addresses not only access, remote access, network performance, and raw computing services, for example, but also collaboration, is and should be a critical component of every company's or government agency's plan.  Instant messaging, self-service websites that enable business process flow (e.g., Microsoft SharePoint), Wiki tools, and search engines are the tools that enable virtually instantaneous creativity by staff to organize information, get it shared, and let it be found.

As Michael Kennedy, Assistant Deputy Associate Director for Intelligence Community Enterprise Solutions for the Office of the Director of National Intelligence, characterized it, his organization finds outside tools that can be rapidly deployed at low cost and can scale, and lets them loose on his community.  If a tool takes off, then it's a hit and an organizational success.  If there are few users, then he throws it away.  With no two-year requirement and two-year development cycle, finding the right tools is nearly risk-free and just plain nearly free.

In addition, there is always the next thing, being developed by thousands of large companies and entrepreneurs.  As these tools hit the Internet, it's like having your own research lab and beta test environment at no cost.  Why not put them to work for your organization?

Yes, you can have common Enterprise Services that adapt to users' needs at the speed of the Internet.

Thursday, February 19, 2009

What to do with Government Telecom Money?

Now that the United States has apparently allocated several billion dollars to improve the telecommunications infrastructure of the country, the real question is what to do with the money (assuming we do not want it to go to waste).  First, where could we spend the money?

  • Spreading WiMAX around the country, especially in rural areas
  • Increasing 3G wireless deployment to places where coverage is currently spotty
  • Improving wireline access capabilities

Each of the above, or a mix, would have a positive impact on network accessibility and should therefore enable current applications as well as novel applications to drive economic value.   However, there is only a limited amount of money, so trying to do too much in too many areas may spread the money too thinly and have very limited impact.  To avoid this, more concrete goals need to be established.

For any program, there has to be some guiding goals for success and principles on which to determine how to spend the money to achieve those goals.  Some possible goals are:

  • Increase the capability and reduce the cost of telecommunication services for businesses, with bandwidth between 10 and 100 Mbps to the Internet
  • Increase the capability and reduce the cost of telecommunication services to the home, with bandwidth between 5 and 20 Mbps to the Internet
  • Increase Internet mobility (i.e., wireless), with bandwidth around 1 to 3 Mbps

The first goal is to ensure that businesses can get affordable Internet service so that they can run their business applications and bandwidth does not limit the capabilities of business-to-business and business-to-consumer applications.  Novel business applications could include voice and web integration, online video conferencing for customer support, and web-based services.  The second goal feeds what is the largest segment of the Internet population – the home user.  Applications here abound, including voice, downloading and streaming video, social networking, and consumer purchases of goods.  Internet mobility enables business to work on the move and provides the foundation for applications that are currently only in dreams.  However, these applications will only work if we ensure the quality of the services being provided.

To ensure quality, we may need to specify other important requirements over and above raw bandwidth:

  • All traffic must be treated the same, unless the customer or a service pays more for better service for a particular type of traffic
  • Customers must be able to buy, at a reasonable rate, 10 Mbps or more of sustained download speed to enable streaming of High Definition video channels (the sketch after this list checks that these numbers fit together)
  • Voice over IP traffic gets higher priority service, at least for three voice channels at no additional cost
  • Basic infrastructure must meet reasonable reliability goals that are close to traditional telco standards
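
As a quick sanity check on the numbers above, the sketch below adds up one HD stream and three voice calls against the proposed home tier. The HD bit rate and the per-call rate are rough assumptions for illustration only, not figures from any standard.

```python
# Sanity check: one HD video stream plus three voice calls on the proposed home tier.
# The HD bit rate and per-call rate are rough assumptions for illustration only.

HOME_TIER_MBPS = 10.0      # the 10 Mbps sustained download target from the list above
HD_STREAM_MBPS = 8.0       # assumed bit rate for a single High Definition stream
VOIP_CALL_MBPS = 0.08      # one G.711 call with IP/UDP/RTP overhead (~80 kbps)
VOICE_CHANNELS = 3         # the three prioritized voice channels from the list above

used = HD_STREAM_MBPS + VOICE_CHANNELS * VOIP_CALL_MBPS
print(f"HD stream + {VOICE_CHANNELS} voice channels: {used:.2f} Mbps "
      f"({used / HOME_TIER_MBPS:.0%} of a {HOME_TIER_MBPS:.0f} Mbps tier)")
```

Under these assumptions the combination fits with a little headroom, which is the point of pairing the bandwidth requirement with voice prioritization.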

If the government is going to subsidize network capital, then equality must be provided for traffic types, but we should not limit the ability of customers or suppliers to buy more capability.  The second requirement is stated this way for two reasons.  First, providing 10 Mbps guaranteed to every home may cost too much, and second, there must be an incentive for companies to provide additional services.  The next requirement is to ensure that consumers and businesses have a real choice in voice service providers by leveraging Voice over IP (VoIP).  This is especially important if, as discussed below, the majority of the money goes to ILECs and cable companies.

With these goals and requirements, how do we determine where to spend money?  Since we still have a commercial marketplace, the government has to decide if it is going to play favorites.  This is especially important if the government wants to get the most bang for its buck (well, really the people’s bucks).  What does this mean?

  • Fund Sprint, Verizon Wireless, and AT&T to roll out 3G (and 4G) services to locations not currently served and more rapidly upgrade existing systems
  • Fund AT&T, Verizon, Qwest, and the other Incumbent Local Exchange (ILEC) providers (in many cases independent rural companies) to increase capability to every home
  • Fund the Cable companies to increase their capabilities to business and the home

Oops, where are the CLECs?  For business services, the typical CLEC may have dozens or a hundred or so buildings on-net in a particular city, and a tiny fraction of consumers.  What this essentially means is that sending money to the CLECs will likely have less impact on providing a large number of businesses or consumers with new services.  You may believe that this is unfortunate, but we could spend a million dollars on increasing capabilities for 100,000 consumers to 10 Mbps, or spend the same million dollars and improve service for only 100 or so businesses.  The bottom line: the ILECs and cable companies have much more substantial fiber, copper, and rights-of-way than anyone else.

For wireless, the picture is more complicated.  There are many wireless companies, and incenting the development of new towers and systems means that again, you have to play favorites.  Approaches include:

  • Creation of municipal systems that then lease service to the major wireless companies
  • Map poorly served locations and create a bidding environment with the winner getting a regulated franchise for service in that area

How do we reconcile this against current law and regulation?  Since the government is making wholesale changes in the banking and healthcare markets, why not toss the Telecom Act too?  It is the Telecom Act that created the idea that the ILECs had to share their access lines and infrastructure.  Good for the CLECs, but it made the business cases to upgrade ILEC facilities more difficult.  Companies and the government have spent much effort and expense to ensure that the ILECs play fair with CLECs.  In fact, the ILECs could claim that what the CLECs really accomplished was to cherry-pick the larger, higher-revenue customers away, leaving lower-margin, smaller customers to the ILECs.  Because of this, one could argue that the current regulatory environment is not productive and in fact has led us to the current state of affairs, where we are behind other nations in terms of affordable broadband penetration to the home and business.

One approach to change is to remove the cost of regulation and let the ILECs have their access lines back to do what they will – with a little old-style monopoly regulation to tame the beast.  In doing this, there are some consequences.  First, we are essentially abandoning competition in the wired home and the small business marketplace.  By subsidizing the incumbent providers we are going to hurt competitive access providers and simply reinforce the current duopoly, especially at the home (this being the cable company and the ILEC).  To make this work, we could go back to rate-of-return regulation with the government subsidizing the initial capital costs to enable the providers to get a fixed percentage return on operations.

Another approach, one that does not play favorites, is to establish a hyper marketplace:

  1. Create a government clearing house for all  requests for Internet services
  2. Service providers provide bids to the government that represent their capital costs to serve the location or area and the recurring costs for the service
  3. The government selects the service provider with the lowest evaluated lifecycle cost, with the government subsidizing the capital cost (a rough sketch of this evaluation step follows the list)
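
A minimal sketch of what that evaluation might look like follows.  The bids, the ten-year service life, and the simple undiscounted sum are hypothetical assumptions for illustration, not figures from any actual program.

```python
# Hypothetical sketch of step 3: pick the bid with the lowest evaluated lifecycle cost.
# The bids, the ten-year service life, and the undiscounted sum are illustrative
# assumptions, not figures from any actual program.

from dataclasses import dataclass

SERVICE_LIFE_YEARS = 10

@dataclass
class Bid:
    provider: str
    capital_cost: float           # one-time build cost the government would subsidize
    annual_recurring_cost: float  # yearly cost to operate the service

    def lifecycle_cost(self) -> float:
        return self.capital_cost + self.annual_recurring_cost * SERVICE_LIFE_YEARS

bids = [
    Bid("Provider A", capital_cost=2_000_000, annual_recurring_cost=150_000),
    Bid("Provider B", capital_cost=1_200_000, annual_recurring_cost=300_000),
    Bid("Provider C", capital_cost=3_500_000, annual_recurring_cost=50_000),
]

winner = min(bids, key=Bid.lifecycle_cost)
print(f"Lowest evaluated lifecycle cost: {winner.provider} at "
      f"${winner.lifecycle_cost():,.0f} over {SERVICE_LIFE_YEARS} years")
```

Note that a cheap-to-build bid with high recurring costs can lose to a pricier build that is cheaper to run, which is exactly why the evaluation should be over the lifecycle rather than the up-front subsidy alone.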

So, what works best?  Do we trust that a regulated duopoly will work, or that a government-run marketplace will do the trick?  In either case, the government now becomes the decider on what gets done or not, who wins and who loses.  There are many questions raised by this type of intervention, including how to create mechanisms that ensure technology upgrades.  Simply making winners and losers does not mean that the country is well served in the long term.  Unfortunately, the break-up of the Bell System that enabled the creation of new companies that brought new services, and yes, the very Internet, to the world was long ago enough (1984) that many of us do not remember the previous technology stagnation.

Finally, there is an impact on service providers when we are able to provide high-quality Internet services that enable the streaming of High Definition video and Voice over IP services.  What is happening, and what will continue to happen, is the decoupling of the physical service provider from the actual content itself.  A reasonable analogy has already played out in the music industry, where the effect of Apple’s iTunes and similar services has been the essential destruction of the previous distribution chain of music stores.  This could happen to the video game industry, but that industry has worked for years to make it extremely difficult to duplicate its software by keeping the creation and duplication of software for gaming consoles tightly controlled.

For the cable providers and ILECs providing cable and on-demand television services, this decoupling could be very frightening and may cause a re-evaluation of their business models for services such as FiOS and U-verse.  The reason is that these service providers make assumptions about Average Revenue Per User (a.k.a. ARPU) that include traditional cable TV subscriptions and premium and on-demand channel purchases.  With a decoupling of the service provider from the content and adequate bandwidth for streaming and downloadable content, consumers will be able to go closer to the source of the content and pick and choose what they want.  Services like Hulu, and offerings from the content providers themselves (e.g., USA Networks), are starting to make large inroads into how people will watch television, especially as a new generation of TVs comes Internet-enabled out of the box.

Oops, I think that I stumbled back onto the real question: What is the value chain in Internet services?  More on this in another post.