Latest Technology

Network Infrastructure is Changing

- Posted in Latest Technology

Last year's switches are very appliance-like. They're a narrow mixture of layer-2, layer-3 and Power-over-Ethernet features--just as they have been for about fifteen years. Switches you might buy this year will behave like last year's when you need them to; you can still program them individually at layer-2 or layer-3 using the command line. But they're different.

As with phones, cars, televisions, and other electronics, network switches (and access points) have benefited from great increases in computing power. New devices run general purpose operating systems in lieu of embedded firmware. They're built upon supercomputer-speed hardware. So, in addition to switching and routing, they can do general-purpose computer-like stuff, and they can be extended in their capabilities.

Like what? The best-hyped new capability in network equipment is AI. This is the ability to leverage observations taken in the past and apply them to your network in the present. Moreover, it is the ability to pool and organize observations across many organizations' networks to increase the network's resilience for everyone. This includes giving the network the power to recognize issues and heal itself.

Switches are smarter in non-AI ways, too. We've had NAC (network access control) for a long time (and barely used it). Now, you can program the switches to recognize and classify a device, such as a printer or AP, and assign it the appropriate VLANs and permissions--all by themselves. Switches can often apply ACLs and bandwidth restrictions without the help of a central authentication server.
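
Here's a minimal sketch of what that classification logic amounts to, in Python. The OUI, the profile names, and the `on_link_up` hook are hypothetical illustrations, not any vendor's actual API:

```python
# Hypothetical sketch: rule-based device classification, the kind of
# logic a smart edge switch applies on its own. The MAC OUI and LLDP
# strings below are illustrative, not from any vendor's database.

# Device profiles: VLAN, ACL, and bandwidth cap per device class.
PROFILES = {
    "printer":      {"vlan": 30, "acl": "printers-only", "rate_mbps": 100},
    "access-point": {"vlan": 10, "acl": "trunk-aps",     "rate_mbps": 2500},
    "default":      {"vlan": 99, "acl": "quarantine",    "rate_mbps": 10},
}

def classify(mac: str, lldp_sysdesc: str) -> str:
    """Guess a device class from its MAC OUI and LLDP system description."""
    oui = mac.upper().replace(":", "")[:6]
    if "printer" in lldp_sysdesc.lower() or oui in {"00156B"}:  # example OUI
        return "printer"
    if "access point" in lldp_sysdesc.lower():
        return "access-point"
    return "default"

def on_link_up(port: int, mac: str, lldp_sysdesc: str) -> None:
    """Apply VLAN, ACL and rate limit for the device seen on a port."""
    profile = PROFILES[classify(mac, lldp_sysdesc)]
    print(f"port {port}: vlan {profile['vlan']}, "
          f"acl {profile['acl']}, cap {profile['rate_mbps']} Mbit/s")

on_link_up(12, "00:15:6b:aa:bb:cc", "LaserJet printer, JetDirect")
on_link_up(13, "44:55:66:dd:ee:ff", "Enterprise access point, 802.11ax")
```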

Switches needn't be configured onesy-twosy any longer. Consistency in configuration, visibility, and resiliency can be managed for the whole LAN or WAN, all at once, from local or cloud-based managers. Software-defined networking--once the stuff of expensive data center switches--is right beneath the surface on many of these devices. The ability to create arbitrary logical network architectures, and to extend encrypted VLANs safely from far-away cores, is available if you want to use it.
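
As a sketch of what fleet-wide management looks like under the hood, the snippet below pushes one shared template to every switch through a central manager's REST API. The URL, endpoint paths, token, and switch names are hypothetical stand-ins; real cloud dashboards and on-prem controllers expose similar calls:

```python
# Minimal sketch of fleet-wide configuration from a central manager.
# The manager URL, endpoints, and token are hypothetical, not a real
# vendor API.
import json
from urllib import request

MANAGER = "https://manager.example.net/api/v1"   # hypothetical endpoint
TOKEN = "changeme"                               # hypothetical API token

# One template applied to every switch: the same VLANs exist everywhere,
# so a device profile behaves the same on any port in the LAN or WAN.
TEMPLATE = {
    "vlans": [
        {"id": 10, "name": "aps"},
        {"id": 30, "name": "printers"},
        {"id": 99, "name": "quarantine"},
    ],
    "spanning_tree": "rstp",
}

def push_template(switch_id: str) -> None:
    """PUT the shared template to one switch via the manager."""
    req = request.Request(
        f"{MANAGER}/switches/{switch_id}/config",
        data=json.dumps(TEMPLATE).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
    with request.urlopen(req) as resp:
        print(switch_id, resp.status)

for switch in ["core-1", "idf-2a", "idf-2b"]:   # hypothetical inventory
    push_template(switch)
```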

Wireless has advanced remarkably, too. Controller-based and hive (virtual controller) wireless is being supplanted by WiFi networks in which every AP is its own controller. Consider that if every access point has a 2.5 or 5 Gbit/s interface of its own, then centralized operation and switching become a liability for larger deployments: a hundred such APs could, in principle, funnel hundreds of gigabits through one controller. In the latest paradigm, AI and central configuration services manage and update the network, but each access point bridges traffic onto the LAN at wire speed, cooperating with its neighbors while acting on its own behalf.

Three features of WiFi6 are going to make today's wireless networks unrecognizably fast tomorrow--especially in dense deployments. The first is modulation techniques, particularly 1024-QAM, that push wireless into the gigabit+ range with a reasonable number of antenna chains. BSS Coloring greatly increases the efficient use of crowded radio space. OFDMA, which allows a single transmission cycle to be shared among clients, can provide greatly improved transmission efficiency. If that's not enough, WiFi6E is out with new spectrum in the 6 GHz range, including very wide channels (up to 160 MHz) for huge performance. Look back at some of our earlier communications to learn more about these capabilities.
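
To see where the gigabit+ claim comes from, here's the back-of-the-envelope PHY-rate arithmetic for an 80 MHz WiFi6 channel at the top modulation rate, using constants from the 802.11ax spec:

```python
# Back-of-the-envelope Wi-Fi 6 PHY rate. An 80 MHz channel carries 980
# data subcarriers, and an OFDM symbol lasts 12.8 us plus a 0.8 us
# guard interval.
data_subcarriers = 980              # 80 MHz channel
bits_per_symbol = 10                # 1024-QAM: 10 bits per subcarrier
coding_rate = 5 / 6                 # MCS 11 forward-error-correction rate
symbol_time = (12.8 + 0.8) * 1e-6   # seconds, with short guard interval

per_stream = data_subcarriers * bits_per_symbol * coding_rate / symbol_time
print(f"one spatial stream:  {per_stream / 1e6:.0f} Mbit/s")      # ~600
print(f"two spatial streams: {2 * per_stream / 1e6:.0f} Mbit/s")  # ~1201
```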

Bitcoin in 2021

- Posted in Latest Technology

There is a limit to the number of bitcoins--21 million in total, with about 2 million left to be mined. Presently, there are about 900 new bitcoins per day. The block reward halves roughly every four years, so the supply approaches the cap asymptotically; it will be around the year 2140 before the last fraction is finally mined. There's investment and reward in the effort: Russia just took delivery of a 70 MW crypto mining farm. It is said that almost 65% of the cryptomining resources are in China. Bitcoin broke $50,000 on Tuesday. Tesla bought $1.5 billion in bitcoin a couple of weeks ago, when it was already over $40,000. Some projections run as high as $500,000 per bitcoin.
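
The 21 million cap and the 2140 date both fall out of the halving schedule. A quick sketch (ignoring the protocol's integer satoshi rounding):

```python
# A sketch of Bitcoin's emission schedule. The block reward started at
# 50 BTC and halves every 210,000 blocks -- roughly every four years at
# one block per ten minutes.
reward = 50.0
total = 0.0
year = 2009
while reward >= 1e-8:              # rewards below one satoshi round away
    total += 210_000 * reward      # coins minted during this reward era
    reward /= 2
    year += 4                      # ~4 years per era
print(f"total supply: ~{total:,.0f} BTC; last era ends around {year}")
```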

The US dollar has been the world's reserve currency for a century, even as the US has heaped on debt for wars and social programs. The alternatives--the euro and the yuan--each have problems that continue to favor the dollar: the euro zone is crumbly and has debts, too, and China's currency is tightly controlled by a government that carries a trust deficit. This past year, with the United States' COVID-19 response, the US debt became larger than its economy. That makes the dollar just a little bit more risky as a reserve currency.

In January, it became illegal to personally trade in cryptocurrencies if you are a resident of Nigeria. The ban expands a list that includes China, Iran, Bolivia, Nepal, Bangladesh, Ecuador and Morocco. Cryptocurrency trading is still allowed in the US, but you are now required to indicate on your tax return whether you have traded in cryptocurrencies. The public argument against cryptocurrencies is that crypto is tender for criminal activities, and that it is difficult to track and tax. Central bank digital currencies are likely to be introduced and sanctioned instead. They can be tracked, and they will be centrally managed, unlike Bitcoin. They're digital fiat money; Bitcoin isn't.

Why do China, Russia, North Korea and (possibly) the United States want to be dabbling in Bitcoin? As is the case for you and me, there's a speculative upside. But there is also a possibility that Bitcoin will become a bona fide reserve currency--a place to park value. Prices will stabilize. Fiat currencies will weaken as money is converted into crypto. Like a game of musical chairs, when the music stops, some countries and individuals will hold larger shares of the reserve currency. And if Bitcoin becomes that important, the effort spent gathering up bitcoins today will affect the wealth of nations tomorrow.

Crypto trading rides on top of digital networks, and these are subject to tampering. I will address the network vulnerabilities for crypto trading in a future blog entry.

-Kevin Dowd

AI in your Network

- Posted in Latest Technology

There's a scene in "Mars Attacks!" where Pierce Brosnan, playing the scientist, dissects one of the dead Martians. He pulls some red jelly from the brain and says, "Curious." It captures one of the problems with what we're calling AI today: like the jelly, the components of an AI can do amazing things, but you really can't look at them and say why.

The term Artificial Intelligence has changed its meaning many times over the last 50 years. It currently refers to systems that can do feature correlation and extraction from training data sets--often very large ones. For example, training a system to recognize a face (like your phone does) is an exercise in presenting exemplar data to the system, along with reinforcing feedback when a face is displayed. This is called supervised training. After seeing enough faces and getting the green light for each one, the system learns the correlation and can provide the green light on its own when a new face is presented.
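
A toy version of that feedback loop, with made-up two-number "face" feature vectors standing in for real image data:

```python
# Toy supervised training: show labeled examples and nudge the weights
# whenever the "green light" answer is wrong. Features are invented.
import random

# (features, label): label 1 means "face", 0 means "not a face"
examples = [([0.9, 0.8], 1), ([0.8, 0.9], 1),
            ([0.1, 0.2], 0), ([0.2, 0.1], 0)]

w = [0.0, 0.0]
bias = 0.0
for _ in range(20):                        # training epochs
    random.shuffle(examples)
    for x, label in examples:
        guess = 1 if w[0]*x[0] + w[1]*x[1] + bias > 0 else 0
        err = label - guess                # the reinforcing feedback
        w = [w[0] + 0.1*err*x[0], w[1] + 0.1*err*x[1]]
        bias += 0.1 * err

x_new = [0.85, 0.75]                       # a face it has never seen
lit = w[0]*x_new[0] + w[1]*x_new[1] + bias > 0
print("green light" if lit else "no light")
```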

The systems and theory for this kind of AI have been around since the 1960s--even the 1950s. A well-known example is called a multi-layer perceptron, or neural network. The original objective was to imitate the way neurons interconnect and to reinforce pathways in the presence of specific stimuli. The neural network would be made of two or often three layers--an input layer, a hidden layer and an output layer. Inputs to a given layer would add or subtract from one another in accordance with weights (multipliers). The weights would be "learned" during the training process. They were the jelly in the Martian's brain.

The neural network is an analog model of how neurons might work, and some analog implementations have been created. On a digital computer, however, it is represented by matrix arithmetic. Matrix arithmetic can often be parallelized so that multiple operations occur at the same time. This makes it fast--very fast--given suitable hardware.
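
Here's what that looks like concretely: a minimal three-layer forward pass written as matrix arithmetic, with random weights standing in for ones that would be learned during training:

```python
# A minimal three-layer perceptron forward pass as matrix arithmetic.
# The random weights are stand-ins for the "jelly": in a real system
# they would be learned from training data.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))     # input layer (4 features) -> hidden (8)
W2 = rng.normal(size=(8, 2))     # hidden layer -> output (2 classes)

def forward(x):
    """One pass through the network: two matrix products plus a squash."""
    hidden = np.tanh(x @ W1)     # weighted sums, then nonlinearity
    return np.tanh(hidden @ W2)

# A batch of inputs is just more matrix rows; the same operations run on
# all of them in parallel, which is why GPUs are such a good fit.
batch = rng.normal(size=(32, 4))
print(forward(batch).shape)      # (32, 2)
```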

In the last ten years, supercomputers have been pieced together, not from liquid-cooled Crays or hypercubes, but from very small-featured computing devices like field-programmable gate arrays and graphics cards. In fact, Nvidia--the company that makes some of the best graphics hardware--is also a world leader in supercomputing. That's because Nvidia graphics cards are hyper-parallel, and they can be programmed to do AI just as well as pixel-based ray tracing. Nvidia is currently building what will be Britain's most powerful supercomputer.

In the network business, you see "AI" being applied to network analytics, intrusion detection and diagnostics. The systems are the product of supervised and unsupervised training that looks to correlate events and to recognize unusual patterns--such as a network intruder. Part of the reason vendors are pushing the cloud so vigorously is that, in addition to being the customer, you are part of the product: your network's experiences contribute to the training sets for "AI" analytics. They need you to participate in the cloud, too.
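
On the unsupervised side, even a crude sketch shows the idea: learn what "normal" looks like, then flag what isn't. The traffic numbers below are invented for illustration; real systems correlate many more signals than one counter:

```python
# Crude unsupervised anomaly detection: flag traffic that doesn't look
# like the past. The "feature" here is connections per minute from a
# host; the history values are invented.
import numpy as np

history = np.array([12, 9, 14, 11, 10, 13, 12, 8, 11, 10])  # normal rates
mean, std = history.mean(), history.std()

def is_anomalous(rate, threshold=3.0):
    """Flag rates more than `threshold` standard deviations from normal."""
    return abs(rate - mean) / std > threshold

for rate in [11, 15, 94]:          # the last one looks like a port scan
    print(rate, "->", "ALERT" if is_anomalous(rate) else "ok")
```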

Given copious compute power, the quest to make AIs more capable is correlated with making them deeper--adding more layers, more sophisticated back-propagation and weight adjustment. "Deep learning" makes the AI more powerful, but it also makes it subject to pitfalls that are endemic to higher-order curve fitting. That is, when the input is similar to the training data, the results can be excellent. In the face of unfamiliar input, deep AI can be wildly unpredictable. "Curious," as Pierce Brosnan would say. And it's coming to your network.
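
The curve-fitting pitfall is easy to demonstrate in miniature: a high-degree polynomial fits its training points almost perfectly, then goes haywire beyond them:

```python
# Higher-order curve fitting in miniature. A degree-9 polynomial nails
# the ten training points, then behaves wildly beyond them -- the same
# pathology that makes deep models unpredictable on unfamiliar input.
import numpy as np

x_train = np.linspace(-1, 1, 10)
y_train = np.sin(np.pi * x_train)            # the "true" pattern

coeffs = np.polyfit(x_train, y_train, deg=9)

inside = np.polyval(coeffs, 0.15)            # between training points
outside = np.polyval(coeffs, 2.0)            # beyond the training range
print(f"familiar input:   model {inside:+.3f}, truth {np.sin(np.pi*0.15):+.3f}")
print(f"unfamiliar input: model {outside:+.3f}, truth {np.sin(np.pi*2.0):+.3f}")
```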

-Kevin Dowd