Category: Design


As the amount of data in the world rapidly increases, so does the time required for machines to process it. Augmented Reality, Virtual Reality, Artificial Intelligence, Robotics, Real-Time Analytics, and Machine Learning algorithms demand a cloud that is ever faster, with seemingly unlimited computing power and endless storage. Interestingly, this is happening on the heels of the slowdown of Moore's Law. Chip maker Intel has signaled a slowing of Moore's Law, a technological phenomenon that has played a role in almost every significant advancement in engineering and technology for decades. We can no longer cram transistors into circuits at the pace we have been.

By 2025, the need for traditional compute capacity in the cloud will be so large that it may never be met by conventional means.

Quantum computing's arrival promises to revolutionize the cloud. What quantum computing provides is massively parallel processing, atomic-level storage, and security based on the laws of physics rather than external cryptographic methods. If you have not begun looking at it, the time is now. The cloud will soon be powered by quantum computing, and software will be written in a fundamentally different way.

IBM, Microsoft, Google, Intel, and D-Wave have made tremendous advances this year. Quantum computing is now here to push the bounds of computer performance further forward.

What is Quantum Computing?

Quantum computing makes use of the quantum states of subatomic particles to perform memory and processing tasks. Classical computers use switching transistors to encode information as bits, each representing either a ONE or a ZERO. In contrast, quantum computers use the fundamental building blocks of atoms (such as electrons, protons, and photons) themselves. These subatomic particles have spin, so if the spin is in one direction (up, for example), that could be the equivalent of the ONE in a conventional computer, while a particle with a down spin could be a ZERO.

As per the laws of quantum physics, it may not be clear whether a particle has an up spin, a down spin, or perhaps something in between; a subatomic particle can possess all of those properties at the same time. This is called superposition. One qubit (short for quantum bit, as distinct from a classical bit) can exist simultaneously as a ZERO and a ONE. Two qubits can exist simultaneously as the four possible two-bit numbers (00, 01, 10, and 11). These superpositions allow qubits to perform multiple calculations at once rather than in sequence like a traditional machine. For example, you can compute four calculations with two qubits.
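To make the bookkeeping concrete, here is a toy state-vector sketch in plain Python (no quantum SDK, just a list of amplitudes). It shows that putting n qubits into equal superposition yields 2^n basis states, each equally likely to be measured:

```python
import math

def uniform_superposition(n_qubits):
    """Return the amplitude vector for n qubits in equal superposition,
    i.e. the result of applying a Hadamard gate to every qubit of |00...0>."""
    dim = 2 ** n_qubits            # one amplitude per basis state
    amp = 1 / math.sqrt(dim)       # equal amplitudes, normalized so probabilities sum to 1
    return [amp] * dim

state = uniform_superposition(2)        # 2 qubits -> 4 basis states
probabilities = [a * a for a in state]  # |00>, |01>, |10>, |11>
print(probabilities)  # [0.25, 0.25, 0.25, 0.25]
```

This is only classical bookkeeping, of course; the vector doubles in size with every qubit, which is exactly why classical simulation runs out of steam and real qubits become interesting.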


What quantum computing gives you is massively parallel processing!

An understandable example is Grover's search algorithm. Think about a game where a prize is hidden behind one of four doors and you have to find the prize while opening as few doors as possible. A traditional computer needs, on average, a little over two operations to find the prize, because it has to open each door in succession. A quantum computer, however, can locate the prize in one operation because it can effectively open all four doors at once! You can perform eight calculations with three qubits. The number of such computations doubles with each additional qubit, leading to an exponential speed-up. A quantum computer of 500 qubits would have the potential to do 2^500 calculations (a number much larger than the Shannon Number) in one operation.
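The four-door game can be simulated classically for intuition. The sketch below (plain Python, no quantum SDK) runs one Grover iteration over four basis states: an oracle flips the sign of the prize state, then a diffusion step (inversion about the mean) concentrates the amplitude there. For N = 4 doors, a single iteration finds the prize with certainty:

```python
import math

def grover_one_iteration(n_doors, prize_door):
    """Classically simulate one Grover iteration over n_doors basis states
    (n_doors must be a power of two). Returns the probability of measuring
    the prize door afterwards."""
    amps = [1 / math.sqrt(n_doors)] * n_doors  # start in uniform superposition
    amps[prize_door] = -amps[prize_door]       # oracle: flip the sign of the prize state
    mean = sum(amps) / n_doors                 # diffusion: invert every amplitude about the mean
    amps = [2 * mean - a for a in amps]
    return amps[prize_door] ** 2

print(grover_one_iteration(4, prize_door=2))  # 1.0: one query finds the prize among 4 doors
```

For larger N, Grover needs roughly the square root of N iterations rather than N/2 classical door openings, which is where the quadratic speed-up for search comes from.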


Top five things you should know about it

  • We will write programs differently: new programming paradigms and languages, new algorithms, and a new way of writing logic!
  • A quantum computer can be thousands of times faster than a conventional computer for certain problems. Google announced it has a quantum computer that is 100 million times faster than any classical computer in its lab.
  • Quantum computing revolutionizes the way we approach machine learning and artificial intelligence. It will accelerate machine learning remarkably. Quantum computers may also reduce power consumption anywhere from 100 to 1,000 times because they use quantum tunneling.
  • Quantum computing will destroy internet security as we know it today. It would be able to crack several of today's encryption techniques, such as RSA and ECC, within days if not hours. In this regard, quantum computing is like a deja vu of discovering the enormous energy locked in the atom. Nuclear fission was discovered in 1938, nine months before the beginning of the Second World War, and it changed the world. Quantum computing could be the IT equivalent of the atomic bomb. We are now in a race against time to prepare modern cryptographic techniques before they get broken: new security methods that will allow us to secure data using the laws of physics rather than external cryptographic methods.
  • Quantum computing is not for every problem. Classical computers are still better than quantum computers at some tasks, such as spreadsheets or desktop publishing. However, quantum computing will be incredibly useful for solving notable chemistry problems, self-driving car coordination, financial modeling, weather forecasting, particle physics, and more.

Have you written your first quantum program yet? In the next few articles, I will go over how to program using quantum computing, how to determine which problems are best addressed by quantum versus classical computing, how it impacts machine learning, and how you will develop killer apps such as self-driving car coordination, as well as the challenges and solutions where cryptography and quantum computing intersect. Quantum computing revolutionizes the way we approach computer science and logic. A lot of algorithms will need to be redesigned and rewritten using quantum computing paradigms. Looking forward to it!

PS: Isn't this a remarkable pic? The heading picture is from the October 1927 Fifth Solvay International Conference on Electrons and Photons, where the world's most notable physicists met to discuss the newly formulated quantum theory. Seventeen of the 29 attendees were or became Nobel Prize winners!


Throughout humanity's history, people have explored ways to stretch the biological limitations of the human body. From the beginning, we have created means and devices to augment our physical beings and overcome our genetic and environmental weaknesses: wearing shoes, using eyeglasses for better vision, prosthetic limbs, artificial organs, and even extra physical power through performance-enhancing drugs, to name a few.

"Three important revolutions have shaped the course of the history of mankind," Yuval Noah Harari writes in his book Sapiens: A Brief History of Humankind. If you have not read this book yet, I highly recommend it. The cognitive revolution kick-started history about 70,000 years ago. The agricultural revolution sped it up about 12,000 years ago, and the scientific revolution got underway about 500 years ago. And, I believe, a new revolution has just begun: the Augmentation Revolution.

In the Augmentation Revolution, the aim is to develop technologies and techniques for overcoming current limitations of human cognitive and physical abilities. No one knows where this revolution will end, but by the end of next decade, I believe that every individual on earth will be using one or more augmented capabilities in their daily lives. We’ll be relying more and more on these enhanced abilities. Our actions will depend upon our “level of augmentation.” We will no longer be able to distinguish between the natural and the augmented.

Superhuman, supernatural, or paranormal abilities have always fascinated us. Religion, history, the Guinness book, and comic books list many such powers. We admire Zeus's power to hurl his thunderbolts, or Sanjaya's ability to see events from about 50 miles away and share the war details live with the blind King Dhritarashtra in the Mahabharata. We love to talk about Hanuman flying across the sea between India and Lanka, or Jesus suggesting the fishermen throw the net on the right side of the boat to find more fish. It is no accident that Superman, Batman, or Iron Man movies draw huge crowds. We find comfort and hope in our superheroes.

And, in the last few decades, we all got additional powers with the invention of computers, mobile devices, the internet, Wi-Fi, and cellular technologies. We gained instant access to vast knowledge with a brain extension called Google, the ability to instantly connect near or far with mobile phones, and many more capabilities.

And now, breakthroughs in Artificial Intelligence (AI) and Machine Learning (ML), the Internet of Things (IoT), cloud, nanotechnologies, sensing technologies, genetic engineering, pharmacology, and bioengineering, combined with our understanding of how the brain and body function, are creating paths for this augmentation revolution. The augmented human will be smarter, happier, healthier, stronger, and more efficient. And, for the first phase of this revolution, you don't need a chip implanted in your body or brain, or any biological tricks or chemicals. The IoT, AI, ML, and cloud are enough.

For example, specialized sensory receptors are at play when we see colors, hear sounds, feel textures, smell flowers, or taste sugar. There is no specific receptor for time, and yet we are consciously aware of its passage. Although invisible and ephemeral, time is an essential variable in the perception of user experience, all the more so in the digital era. People are looking for network performance aligned with their needs, abilities, and expectations, at that moment, on that device. Every minute of network slowness is magnified when you are doing something time-critical. As simple and measurable as the concept of slowness might seem, we construct the experience of time in our minds. Perceived time is not isomorphic to physical time. And this is where there is an opportunity to augment using IoT.

Our brainwaves are "signatures" of what is happening inside the brain, i.e., how we are feeling. For example, brainwaves are different when we are relaxed versus when we are time-pressed. IoT sensors can be developed to capture these brainwaves. External IoT sensors (a passive antenna with high impedance, at least 10^12 ohms, that can be integrated into Wi-Fi access points) or neural electrodes (embedded in wearables such as in-ear, outer-ear, or eyeglass arms) can be used to capture the brainwaves. Once captured, the brainwaves can be analyzed in the cloud to fine-tune network availability and performance not just based on "metrics" such as Wi-Fi statistics, but according to the user's psychological needs. I will share more details on how brainwaves can be used to fine-tune Wi-Fi waves, i.e., how Wi-Fi performance can be aligned with expectations at the user's psychological level, in another article. In this example, I am focusing on Wi-Fi, but once the brainwaves have been captured, the possibilities are endless!
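As a purely illustrative sketch of the "analyzed in the cloud" step, one could classify a relaxed versus time-pressed state from EEG band powers. Everything here is a hypothetical assumption: the function name, the beta/alpha ratio rule, the 1.2 threshold, and the sample values are not clinical or product facts, just a way to show the shape of such a classifier:

```python
def classify_state(alpha_power, beta_power, ratio_threshold=1.2):
    """Toy rule: a high beta/alpha band-power ratio is associated with mental
    load ("time-pressed"), a low ratio with relaxation. The 1.2 threshold and
    the band powers below are illustrative assumptions, not clinical values."""
    ratio = beta_power / alpha_power
    return "time-pressed" if ratio > ratio_threshold else "relaxed"

# Simulated band powers (arbitrary units) from a hypothetical in-ear sensor
print(classify_state(alpha_power=10.0, beta_power=18.0))  # time-pressed
print(classify_state(alpha_power=10.0, beta_power=8.0))   # relaxed
```

A real system would replace this hand-set rule with a model trained per user, since brainwave signatures vary from person to person; the output label is what the network tuning logic would consume.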

Although we are talking about using artificial intelligence, cloud, and IoT, the Augmented Human is not to be confused with all the work going on in AI and Robotics. It is at the other end of the spectrum. Augmented Human is about using technology to transform the way we live, interact, and perform, UNLIKE the use of technology to make robots and computers think and act like humans. The Augmented Human is, I believe, the next evolutionary step for humankind. In fact, the fear that computers and robots will take over our jobs and our lives is NOT real, because, in reality, we will evolve into much more advanced and augmented human beings. Technology, AI, and robotics will support everything we do at a whole new level. The only ask I have of you is this: please don't remain naive; adapt to the augmentation. I am sure you have heard 'It is not the most intellectual of the species that survives; it is not the strongest that survives; but the species that survives is the one that is able to adapt to and to adjust best to the changing environment in which it finds itself.' I am not sure Charles Darwin said these words as is, but I agree.



Wi-Fi is crucial to the way we work today. Fast, reliable, and consistent wireless coverage in an enterprise is business-critical. Many day-to-day operations in the enterprise depend on it. And yet, most of the time, IT teams are flying blind when it comes to individual experience. This springs from two main challenges.

The first challenge is data collection.

We want to know the state of every user at every given time. But these states change constantly as network conditions and user locations change. With tens of thousands of devices being tracked, there is a huge amount of information to be collected. This volume of data simply cannot be handled in an access point or a controller running on an appliance with fixed memory and CPU.

The second challenge is data analysis.

It takes considerable time and effort to sort through event logs and data dumps to get meaningful insights. And significant Wi-Fi intelligence is required to actually make heads or tails out of the data.

Someday soon, I believe, big data and machine learning will solve the above hurdles. It will allow me to ask my network how it is feeling, it will tell me where it hurts, and it will provide detailed prescriptions for fixing the problem (or will automatically fix it for me). While this seems to be a futuristic vision, the foundation to achieve it is already being laid out through big data tools and machine learning techniques like unsupervised training algorithms.

Using these technologies, we can now continuously update models that measure and enforce the experience for our wireless users. For example, we can ensure specific internet speeds (i.e., throughput) in real time with a high level of accuracy. This allows IT staff to know a wireless user is suffering before the user even realizes it, and thus before they have to log a call with the help desk.

Once a user problem is detected, machine learning classification algorithms can isolate the root cause of the problem. For example, is the throughput issue due to interference, capacity, or LAN/WAN issues? After isolating the problem, machine learning can then automatically reconfigure resources to mediate the issue. This minimizes the time and effort IT teams spend on troubleshooting, while delivering the best possible wireless experience.
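To illustrate the shape of such a root-cause triage, here is a minimal sketch. The metric names and thresholds are assumptions for illustration only; a production system would learn the decision boundaries from labeled data with a trained classifier rather than hand-set rules:

```python
def diagnose_low_throughput(metrics):
    """Toy rule-based triage for a low-throughput alert. The thresholds and
    metric names are illustrative assumptions; a real system would learn
    these boundaries from labeled historical data."""
    if metrics["channel_utilization"] > 0.8 and metrics["retry_rate"] > 0.3:
        return "co-channel interference"
    if metrics["clients_per_ap"] > 40:
        return "capacity (too many clients per AP)"
    if metrics["wan_latency_ms"] > 150:
        return "LAN/WAN issue"
    return "unknown - escalate"

# A hypothetical snapshot for one suffering client
sample = {"channel_utilization": 0.9, "retry_rate": 0.45,
          "clients_per_ap": 22, "wan_latency_ms": 30}
print(diagnose_low_throughput(sample))  # co-channel interference
```

The returned label is what would drive the automatic remediation step, e.g. a channel change for interference versus load balancing for capacity.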

I’ve written before how artificial intelligence will revolutionize Wi-Fi. I would love to be able to just unleash IT teams on sifting through hordes of data so they can glean meaningful information. But it is like finding a needle in a haystack. Machine learning is key to automating mundane operational tasks like packet captures, event correlation, and root cause analysis. In addition, it can provide predictive recommendations to keep our wireless network out of trouble.

Also key to this vision is the elastic scale and programmability that modern cloud elements bring to the table. The cloud is the only medium suitable for treating Wi-Fi like a big data problem. It has the capacity to store tremendous amounts of data, with a distributed architecture that can analyze this data at tremendous speed.

Wi-Fi isn’t new. But how we use Wi-Fi has evolved. And now more than ever, Wi-Fi needs to perform flawlessly. We are in an era where wireless needs to be managed like a service, with all the flexibility, agility, and reliability of other business-critical platforms. With machine learning, big data, and the cloud, this new paradigm is quickly becoming a reality.



Wi-Fi is 18 now!

The journey to now has been a tremendous one.

It started in 1999, yet it seems just like yesterday. Six companies (3Com, Aironet, Harris Semiconductor, Lucent, Nokia, and Symbol Technologies) came together to form a global non-profit association, the Wi-Fi Alliance. The objective of the Wi-Fi Alliance was to develop and validate multi-vendor interoperability using a new wireless networking technology (the 802.11 protocol, created in 1997). Shortly after, in 2000, the Wi-Fi Alliance delivered the first standard, 802.11b (11 Mbps in the 2.4 GHz band). Wi-Fi was gaining attention among retailers especially, as they needed wireless bar code scanners, price verifiers, etc.

I remember the days at WalMart (I was at Symbol Technologies back then, which was later acquired by Motorola, then by Zebra, and recently by Extreme Networks), when we were building and deploying wireless LAN controllers and experiencing the challenges facing Wi-Fi. In those days, it was not just about performance, interference, or coverage; the security of wireless itself was a huge issue too. First came WEP as the security algorithm for wireless networks, but it had serious flaws. Three researchers at Berkeley produced a paper named "(In)Security of the WEP algorithm," and then the U.S. FBI demonstrated how to break it using publicly available tools. The second attempt was WPA. WPA was originally meant as a wrapper around WEP to tackle its insecurities. It was never meant as a security standard, just a quick fix until WPA2 became available. It had flaws, and the hacker community demonstrated them! In fact, we ran VPNs on top of the wireless security protocols! The much stronger WPA2 (making use of CCMP for cryptographic encapsulation) became available in 2004 as the protocol for secure wireless, and it is still good.

While the security challenge was being tackled, the Wi-Fi Alliance continued to work on performance. It delivered 802.11a in 2002, with the potential to provide 54 Mbps in the 5 GHz band. 802.11a did not take off, but not to worry: 802.11g came in 2003, providing 54 Mbps in the 2.4 GHz band, and it was loved. 802.11g gained popularity, and mass deployments began shortly after. Many other companies were working on wireless networking equipment, such as Aironet (later acquired by Cisco), Aruba (later acquired by HP), Ruckus Wireless (later acquired by Brocade, and recently by Broadcom), and Meru Networks (later acquired by Fortinet). The world was changing. The excitement of wireless drove cities to become Wi-Fi enabled. Jerusalem, Israel became the first city to provide city Wi-Fi in 2004, followed shortly after by Mysore, India and Sunnyvale, California.

The Merriam-Webster dictionary even added Wi-Fi as a word in 2005. But although it was in the dictionary, many businesses had deployed it, and some cities had city-wide coverage, Wi-Fi gained true momentum only after the Wi-Fi-only iPod, the iPhone, Android, and the iPad were introduced. The 2010 launch of the iPad changed Wi-Fi from something needed for internet access to something needed for life!

Today, it could be argued that Maslow's perception of what constituted basic physical needs has been surpassed by something even more fundamental: Wi-Fi.

Over the years, along with the standards, Wi-Fi architecture has come a long way too. Many great innovations have transformed the Wi-Fi landscape. In addition to the autonomous (also known as thick access point) and controller-based (also known as split-MAC or thin access point) architectures, innovation happened in the form of controller-less architecture (Aerohive Networks), single-channel architecture (Meru Networks), and beamforming for a better experience (Ruckus Wireless). The cloud controller (Meraki, acquired by Cisco) came soon after, and then multiple variations of cloud-managed architectures. Even SDN (software-defined networking) touched Wi-Fi architecture. OpenFlow-enabled access points came onto the market, and Meru Networks (later acquired by Fortinet) created the world's first OpenFlow-enabled Wi-Fi controller in 2014. Not only did the standards and architecture evolve, the focus also shifted from mere connectivity to enabling value-added services such as Wi-Fi Calling, location-based services, etc.

Having gone through consolidation in the industry, cloudification of the architecture, connectivity becoming a basic need, and with Gigabit speeds available today, one can say that it has been an incredibly fulfilling journey. I personally feel blessed to be part of it. And now, where do we go from here? What does the future hold?

In the next two parts, I share my thoughts on what users and businesses are expecting from Wi-Fi in future as well as the direction that Wi-Fi standards or competing standards may take.



I've never hidden my love for wireless networking. Wi-Fi has moved from a nascent technology to one that is widely accepted, so commonplace that we wonder how we ever functioned without it.

It started with autonomous access points, followed by controller-based architecture (a centralized controller with thin access points). And, as we learned from the challenges of deploying Wi-Fi and the environment's ability to impact user experience, companies have constantly tried to innovate. Some focused on building dynamic channel or power planning, some went controller-less, and others tried to make it work on a single channel (don't deploy single channel until you have read about the challenges here).

Of course, the discussion around the evolution of Wi-Fi architectures is not complete without talking about adaptive beamforming antenna technologies. This technology automatically adjusts to changes in the radio-frequency environment, providing stronger signals to the client. Wi-Fi standards have also been evolving constantly to enable a great user experience (read about 802.11ax).

A lot of development has also happened around computing the real-time location of Wi-Fi clients, as well as business applications using indoor location. And, of course, with everything moving into the cloud, Wi-Fi controllers and management moved into the cloud too.

And yet, even after years of evolution and innovation, vendors avoid conversations centered on guaranteeing the quality of experience for wireless users. Wireless networks today still require hands and heads. The effort and time that go into their support is significant. There is a need for an autonomic wireless management system that "watches" the network, "understands" normal functioning, "analyzes" real-time performance against norms, and then "acts" to automatically solve known problems.

The problem is that the data generated by a wireless network is very big. The data varies at every transmission level. There is a "data rate" for each message transmitted. There are "retries" for each message transmitted. The reasons a received message could not be reconstructed are specific to each message. The manual classification and analysis of this data is infeasible and uneconomic. And hence, all the data made available by different vendors is plagued by averages. This is where, I believe, Artificial Intelligence has a role to play.
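A tiny sketch of why averages are a problem (the frame counts and data rates below are made-up illustrative values): a fleet of frames can have a healthy-looking mean data rate while a tail of frames crawls at the lowest rate, which is exactly the suffering that per-message analysis would surface:

```python
def summarize(rates_mbps):
    """Show why per-frame detail matters: the average can look healthy while
    a tail of frames crawls at the lowest data rate."""
    rates = sorted(rates_mbps)
    mean = sum(rates) / len(rates)
    p10 = rates[len(rates) // 10 - 1]   # crude lower 10th percentile
    return mean, p10

# 90 frames at 54 Mbps, but a struggling tail stuck at 6 Mbps
rates = [54.0] * 90 + [6.0] * 10
mean, p10 = summarize(rates)
print(mean, p10)  # 49.2 6.0: the average hides the suffering tail
```

A dashboard reporting only the 49.2 Mbps mean would declare this network fine; the 10th percentile tells the real story, and machine learning operates on exactly this kind of per-message distribution rather than the average.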

Only the use of AI can shift the focus from evolving the wireless network, or adding value to it, to automatically ensuring the experience.

Deep neural nets can automate the analysis and make it possible to examine every trend in the wireless network. Machine learning algorithms can ensure the end-user experience. SDN will play a key role in enabling the programmability of the Wi-Fi network; however, SDN is not the end goal. The end goal of the programmability (or SDNization) of the wireless network is still the enablement of a self-managed wireless network, a network that proactively ensures the end user's mobility experience.

Of late, AI has been graduating from science fiction to reality. Cheap computer processing power has made it real. I am convinced that by 2020, enterprise Wi-Fi will be significantly based on artificial intelligence.

I invite Wi-Fi engineers to supplement their domain expertise with artificial intelligence tools so that we can free systems administrators from sitting at a console or waiting for alerts. That is the only way to drive down IT support and administration costs over time. The fully autonomic systems that I envision will take on more of the characteristics of human intelligence for managing Wi-Fi. Until we see that product, we can use this vision as a yardstick for measuring the evolution of Wi-Fi management products.
