Google recently announced its goal of delivering 10 gigabits of bandwidth to each and every customer of its Google Fiber service. This is an incredibly ambitious and forward-thinking initiative, but at the same time extremely Google-y. 10 gigabits is a lot. Think about it: that’s 1,250 megabytes per second. Anyone who lived through the days of Napster realizes how much bandwidth this truly is. With its revenue tied to widespread adoption of search and services, Google is doing its best to make the Internet better. It also does a good job of putting pressure on other ISPs and backbone providers to step up their game at a time when the FCC can no longer enforce net neutrality. But it’s something else, too: it’s a call to arms. Connectivity and speed are increasing, and that is something many Internet innovators, content creators, entrepreneurs, and developers often forget when thinking of the next big thing.
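The arithmetic is easy to check (a quick sketch; the 700 MB file size is just a Napster-era illustration):

```python
# Convert Google Fiber's promised 10 gigabits per second into megabytes,
# using decimal units: 1 gigabit = 1,000 megabits and 8 bits = 1 byte.
gigabits_per_second = 10
megabytes_per_second = gigabits_per_second * 1000 / 8
print(megabytes_per_second)  # 1250.0

# A 700 MB CD image, a classic download of the Napster era, arrives in:
print(700 / megabytes_per_second)  # 0.56 seconds
```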
For all the amazing developments in online media, bandwidth is still a prohibitive barrier to entry for many entrepreneurs. The performance of sites like Hulu, Netflix, Steam, and YouTube requires cooperation from ISPs, IXPs, backbone providers, transit providers, and content distribution networks. There are very few ways to deliver content at scale at a low cost. However, this, like all things, is relative. As time passes, more bandwidth becomes available at lower costs. Bandwidth enables innovation, and the abundance of bandwidth will change the way we think about applications, media, and the Internet.
Jakob Nielsen proposed a law not too dissimilar to Moore’s law or the more modern Zuck’s law. Nielsen’s law has to do with bandwidth: he predicted that a high-end user’s connection speed would grow by 50% per year. This is slightly slower than the 60% annual growth rate associated with Moore’s law, but still noteworthy. For those who remember acoustic modems, ISDN lines, and the coveted T1-T3 connections of the 1990s, the amount of bandwidth we have available in the connected world is staggering. This increase in available bandwidth has created an enormous amount of value for both innovators and end users. Our expectations of Internet services become more demanding year over year. We expect Netflix to stream in 1080p. We expect YouTube to play any video from its immense library immediately. Online video games should have zero lag and millions of concurrent players. Our expectations today were actually fairly predictable 15 years ago; the question is, where do we go from here? We are still delivering familiar experiences in a new box and experimenting with subscription or ad-supported business models.
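Nielsen’s 50% figure compounds quickly. A small sketch (the 1.5 Mbps starting point is a hypothetical illustration, not a figure from the law itself):

```python
def grow(initial, annual_rate, years):
    """Compound an initial value at a fixed annual growth rate."""
    return initial * (1 + annual_rate) ** years

# Nielsen's law: high-end user bandwidth grows ~50% per year.
# Starting from a hypothetical 1.5 Mbps connection and compounding 15 years:
mbps_after_15_years = grow(1.5, 0.50, 15)
print(round(mbps_after_15_years))  # 657
```

Fifteen years of 50% growth turns a single-digit-megabit connection into hundreds of megabits, which is roughly the leap we have lived through.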
The next generation of connected experiences will defy our expectations of what media and computer technology are. We have done a fair job of connecting and digitizing our analog experiences, but is the future just more media, or is something new going to emerge? How will we use this newfound bandwidth?
What is a Thin Client? The Thin Client concept dates back to mainframe time-sharing. The idea is to build a terminal at a fraction of the cost that acts as a conduit to a central processing center. Offloading much of the hard work to large computer clusters is already what we do for advanced software-as-a-service offerings. This even extends to devices we love today. Things like Apple TV and Chromecast are modestly powered platforms that allow for the delivery of specifically formatted content. The real power is in the data delivery, not the small H.264 decoder on board.
As system-on-a-chip architectures continue to grow in power and drop in cost, most modern mobile and living room experiences are being powered by offsite processing. Google Now and push notifications driven by intense predictive server-side processing are early manifestations of these applications. This server-thin client relationship is going to power wearables and provide us with the contextual information supplied by our array of new sensors. The Thin Client is back, just not in a form we envisioned.
There has always been a strong tie between computing resources and the sophistication of video games. The latest generation of home consoles are always-on connected media devices that sit underneath your HDTV. Currently, these systems use online stores with on-device storage and physical media to distribute their content, and for now that is necessary, but in a low-latency, bandwidth-abundant future, the console will become a conduit. Streaming video games is in its early days, and the technological barriers are making themselves very clear: latency, the last mile, and raw bandwidth. A combination of incredibly low latency and reliable bandwidth will be required to make it a reality. Today’s solutions depend on proximity to data centers, but as bandwidth becomes more reliable the distance issue will become a thing of the past. Streaming video games represent today’s most complex interactive media needs, and solving the problems associated with their real-time delivery may power the future of computing and help move us towards a true Thin Client solution.
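A back-of-the-envelope sketch shows why distance matters so much for streamed games. Assuming signal propagation in fiber at roughly 200 km per millisecond (about two-thirds the speed of light) and a 60 fps target, propagation alone eats into the per-frame budget before any encoding, decoding, or routing overhead is counted:

```python
FRAME_BUDGET_MS = 1000 / 60   # ~16.7 ms available per frame at 60 fps
FIBER_KM_PER_MS = 200         # rough propagation speed of light in fiber

def round_trip_ms(distance_km):
    """Propagation delay alone for a round trip to a data center."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (100, 500, 1000, 2000):
    print(f"{km} km away: {round_trip_ms(km):.1f} ms of the "
          f"{FRAME_BUDGET_MS:.1f} ms frame budget")
```

At 2,000 km the round trip alone exceeds a 60 fps frame budget, which is why today’s services cluster players near data centers.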
We still have very large telecommunications companies controlling the distribution of content to most televisions. Services like Hulu and Netflix have started to cut into their stranglehold, and the availability of more bandwidth will create more viable alternatives to the cable monopolies. These alternatives will not just serve the consumer, but also the production side of the media business. Netflix, YouTube, and Hulu have all subsidized or created their own content for online-only distribution. YouTube is proving that when you combine independently produced video content of all types with programmatic advertising you can enable a whole new generation of content creators. Netflix proved that if you create a great online service, people will pay for it. New ideas will emerge from access to bandwidth at lower and lower costs. Cloud-based on-demand computing services like EC2 have given developers access to scalable infrastructure, and the cost of delivering data is going to mirror our ability to process it. New media experiments will be more common and far less risky. Innovation from the incumbents will be accelerated by the flood of new entrants. With cable companies, Apple, Google, Sony, Microsoft, and Samsung all fighting for the living room, the future of TV appears to be temporarily fragmented. When there is fragmentation, there is room for disruption.
I personally rely on a variety of online applications to get my work done. I trust applications running in my browser to handle my calendar, email, social, word processing, and voice and text tasks. The robustness of browser-based applications will likely follow an acceleration curve similar to Moore’s or Nielsen’s law. The Chromebook is an early attempt at computing exclusively through browser-based Internet services, and it is already providing a viable alternative to traditional heavy computing environments. New additions to the HTML specification are implementing some really interesting things, such as WebRTC, a set of real-time communication protocols. These enhancements are going to make commercial-grade technologies available to every open source developer in the world. In the future, full-featured applications will use a combination of server- and client-side processing to create seamless, native-like web experiences.
The website as we know it will become more of an application or service. Services that run online will have far more bandwidth and transfer available to create incredibly personalized experiences. Websites will use a vast array of processing technologies to serve users in a variety of ways. Personal data assistants will interact with web services to deliver data to the consumer in real time and in multiple formats. Users will opt in, as with Facebook Connect, to get vastly more personal experiences. Big portals like Amazon already make use of extreme personalization for merchandising; that level of segmentation and customer identification will become commonplace. Predictive technologies that store or cache websites will eliminate loading times for non-bandwidth-intensive sites. The infrastructure costs of most web services are already becoming manageable at scale; soon the infrastructure to deliver an experience like Netflix will be deployable with a click. Search engines will have a real understanding of the information on pages. The promise of knowledge graphs and answer engines will become accessible to the average person, and these services will be available on demand for developers to build on. The website will become amorphous: an endpoint, a combination of experience and information that is useful in a traditional context and as a store of actionable information for other connected systems.
The opportunities for creating value in a big-bandwidth environment are vast, but they are not as foreseeable as what we have done with the web today. Look back at the roadshow IPO presentation for Broadcast.com and you will see that we are really just taking traditional forms of content and finding a new pipe to deliver them through. Obviously, formats for content will continue to get richer, but the true driver of innovation will not be squeezing more pixels into a frame, but new types of experiences that are yet to be invented.
3D printing will become much more like the replicator or transporter from Star Trek, minus the teleportation. Reproducing physical goods from a molecular-level description is going to require an extreme amount of data. Delivering that data is a big-bandwidth problem.
Connected transportation is going to rely on bandwidth reliability. Real-time communication with a network of AI-controlled vehicles is going to require extremely fast and dependable network infrastructure. Current implementations of self-driving vehicles process their environment locally, but the benefits of a truly connected grid will define the future of transportation. New Ford cars generate data at a rate of 25 gigabytes an hour. Today only a small fraction is made actionable and the rest is disregarded, but in the future all of this data will be uploaded, processed, stored, and put to use.
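To get a feel for the scale, here is a rough sketch using that 25 GB/hour figure; the one-million-car fleet size is a hypothetical assumption:

```python
GB_PER_HOUR_PER_CAR = 25   # data generation rate cited for new Ford cars
FLEET_SIZE = 1_000_000     # hypothetical number of connected cars uploading

# Sustained aggregate upload needed if the whole fleet streamed everything:
gb_per_second = FLEET_SIZE * GB_PER_HOUR_PER_CAR / 3600
print(round(gb_per_second))  # ~6944 GB/s
```

Even a modest fleet streaming everything would demand thousands of gigabytes per second of sustained upload capacity, which is why today only a small fraction of that data goes anywhere.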
The quantified self is going to take off in a big way. Wearable technology has become an emerging category in consumer electronics, and advancements in sensors are going to accelerate its adoption. New sensors that are small, embeddable, and permanently carried with you are going to transform the personal data category. Eventually, as with cars, our bodies will become the hub of our connected lives. Personal data assistants, context-aware computing, electronic medical assistance, and a wide array of new services will grow on the back of an increasingly large stream of data. The rate at which we generate personal data is increasing rapidly; phones now tie together data from gyroscopes, GPS units, and wearables to give us a glimpse of our personal data. This is just a start. Soon this data will be the essential context that computing interprets to deliver new and predictive experiences.
Do we have enough bandwidth and transfer to support six billion constant streams of real-time biological information to servers? What data will we be collecting? What won’t be worth collecting and analyzing as bandwidth, storage, and processing power become more abundant? What services can we build on top of that information? Big bandwidth enables a lot of new big data applications.
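One way to frame the first question: pick a per-person data rate and see what the aggregate looks like. The 10 kbps per stream is purely an assumption for illustration:

```python
PEOPLE = 6_000_000_000
KBPS_PER_STREAM = 10   # assumed rate for a continuous biometric feed

# kilobits/s -> terabits/s: divide by 10**9
total_tbps = PEOPLE * KBPS_PER_STREAM / 10**9
print(total_tbps)  # 60.0 Tbps of sustained aggregate upload
```

Even at a trickle per person, the aggregate is tens of terabits per second, continuously, forever.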
The machines are starting to talk. They are talking to us, talking to each other, and talking to services we know and love. As more appliances, machines, links in the supply chain, and other objects come online, the volume of data is going to soar. That data will only become useful if it can move.
A problem that many people are trying to solve is the handling of big data. With every company, person, and device generating a ton of data, how can we make it useful? In order to exploit any of the mass-personalization opportunities, these gigantic data sets will have to be merged, mashed, and compared. Transferring that amount of data in real time, processing it, and delivering something out the other end is going to be a challenge. Ten years ago, the number of companies sharding databases across multiple servers was small; today it is commonplace. Eventually, we will all have those needs. Querying large data sets, or moving them from one place to another for processing, is going to require bandwidth, and lots of it. APIs are going to deliver data in huge packages that will make the Twitter firehose seem easy to deal with.
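For readers unfamiliar with sharding: the basic idea is to route each record to one of N servers by hashing its key. A minimal sketch (real systems typically use consistent hashing so that adding servers doesn’t reshuffle every key):

```python
import hashlib

NUM_SHARDS = 4  # hypothetical number of database servers

def shard_for(key: str) -> int:
    """Map a record key to a shard index deterministically."""
    digest = hashlib.md5(key.encode()).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# The same key always lands on the same server:
print(shard_for("user:42") == shard_for("user:42"))  # True
```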
Connected virtual reality is emerging as a hot topic these days. With the recent news from Sony about its VR headset and the acquisition of Oculus by Facebook, this space is quickly heating up. Early pioneers of virtual reality long predicted this, but real-time 3D broadcasts of environments are going to require bandwidth, and lots of it. Visiting the doctor’s office in VR space will require new cameras with extreme levels of fidelity, environmental mapping, and new interactive video formats that build upon the early work of VRML.
It is a brain-twisting exercise to imagine what the website looks like in a bandwidth-abundant future. Bandwidth availability’s influence on application and design trends is easy to spot: images have gotten bigger, music is higher fidelity, and video is everywhere. The palette of tools available to developers has enabled web experiences that integrate different types of content into true multimedia. The near future will bring iterations of these experiences that are better, clearer, faster, and smoother. True innovation in big bandwidth will combine the best of real-time connections, interactive media, personalization, and big data to create something truly unimagined.