Why the Future of AI & Computers Will Be Analog
Summary
TLDR: The video discusses the resurgence and potential of analog computing in a world dominated by digital technology. It highlights the energy efficiency of analog systems, which can be as much as 1,000 times more efficient than their digital counterparts, and how this could be part of the solution to the climate crisis. It also touches on the limitations of digital computing, such as physical boundaries and energy consumption, and introduces companies like Mythic and Aspinity that are developing analog chips for modern applications. The potential for hybrid computers that combine the best of both worlds is also explored, hinting at a future where analog computing could play a significant role in our daily lives.
Takeaways
- Analog computing, once overshadowed by digital, is experiencing a resurgence due to its potential energy efficiency and unique problem-solving capabilities.
- Analog systems have an infinite number of states, whereas digital systems rely on a finite number of states determined by bits or transistors.
- Since the Space Age and the advent of personal computers, computing devices have steadily shrunk, but digital computing may now be approaching physical limits to further miniaturization.
- Digital computing, particularly in areas like AI and cryptocurrencies, is increasingly energy-intensive, prompting interest in more efficient alternatives like analog computing.
- A return to analog computing could significantly reduce energy consumption, with analog processes sometimes being 1,000 times more efficient than digital ones.
- Analog computers operate based on physical models that correspond to the values of the problem being solved, as opposed to digital computers that follow algorithms and discrete data.
- The limitations of digital computing are being recognized, with experts like Bernd Ulmann suggesting that we are approaching the fundamental physical boundaries of digital elements.
- Analog computing's continuous data processing allows for real-time problem-solving and efficient parallel processing, and new analog materials may not need cooling facilities at all.
- Hybrid computers that combine the energy efficiency of analog with the precision of digital are being explored for future technology development.
- Everyday applications of analog computing could include low-power sensors for voice-enabled devices, environmental monitoring, and wearable technology.
Q & A
What is the fundamental difference between analog and digital computing?
-Analog systems have an infinite number of states and can represent a continuous range of values, while digital systems rely on a finite number of states determined by the number of bits or transistors that can be switched on or off.
How has the advancement of digital computing impacted the size of computing devices?
-Digital computing has led to a significant reduction in the size of computing devices, from large machines to personal computers and smartphones, following the predictions of Moore's Law which suggests a doubling of transistors on integrated circuits approximately every two years.
What are some of the environmental concerns associated with digital computing?
-Digital computing, especially in data centers and power-hungry applications like cryptocurrencies and AI, is becoming increasingly energy-intensive, contributing to global energy consumption and carbon emissions. It also requires substantial cooling systems, which can strain water resources.
Why is analog computing considered more energy-efficient than digital computing?
-Analog computing can perform the same tasks as digital computing with a fraction of the energy because it operates on a physical model corresponding to the problem being solved, which doesn't require the switching of transistors and can handle continuous data in real time.
What is the significance of the MONIAC computer in the history of analog computing?
-The MONIAC (Monetary National Income Analogue Computer), created by economist Bill Phillips in 1949, is a classic example of analog computing. It was designed to simulate the British economy at a macro level using water to represent the flow of money, and it could function with an approximate accuracy of ±2%.
What are some practical applications of analog computing today?
-Practical applications of analog computing today include flight computers used by pilots for manual calculations, as well as emerging technologies like low-power sensors for voice-enabled wearables, sound detection systems, and heart rate monitors.
How does the concept of Amdahl's law relate to the limitations of digital computing?
-Amdahl's law suggests that the speedup of a system is limited by its sequential operations that cannot be parallelized. As a result, adding more processors does not always lead to proportional improvements in speed, which is a challenge for digital computers when trying to handle increasingly complex tasks efficiently.
What are some of the challenges in integrating analog and digital systems?
-Integrating analog and digital systems requires seamless connectivity and synchronization between the two paradigms, which can be technically challenging. It also involves developing hybrid computers that combine the energy efficiency of analog with the precision and flexibility of digital computing.
What is the potential impact of analog computing on machine learning and AI?
-Analog computing has the potential to significantly reduce the power consumption of machine learning and AI applications by offering a more energy-efficient computing method. Companies like Mythic are developing analog matrix processors that aim to deliver the compute resources of a GPU at a fraction of the power consumption.
How might analog computing change the devices we use in our daily lives?
-As analog computing becomes more integrated with digital systems, we could see devices that are always on, like voice-enabled wearables and smart home sensors, consuming much less power. This could lead to longer battery life and reduced environmental impact without sacrificing functionality.
What are some ways for individuals to explore analog computing at home?
-Individuals can explore analog computing at home through models like the Analog Paradigm Model-1, which is designed for experienced users to assemble themselves, or The Analog Thing (THAT), which is sold fully assembled and can be used for a variety of applications from simulating natural sciences to creating music.
Outlines
The Resurgence of Analog Computing
This paragraph introduces the concept of analog computing and its resurgence in modern technology. It discusses the shift from analog to digital computing and the potential of analog computing to impact daily life. The speaker, Matt Ferrell, shares his curiosity about analog computing sparked by a Veritasium video and his subsequent exploration of the topic. The contrast between analog and digital systems is highlighted, emphasizing the infinite states of analog versus the finite states of digital, represented by bits. The energy efficiency of analog computing is also mentioned as a potential solution to the growing energy demands of digital computing, particularly in the context of cryptocurrencies and AI.
Historical Analog Computers and Their Applications
This paragraph delves into the history and practical applications of analog computers. It mentions the Monetary National Income Analogue Computer (MONIAC) as a prime example, which was designed to simulate the British economy. The paragraph also discusses the accuracy of analog computers and their continued relevance, such as pilots using slide rules for calculations. The contrast between the convenience of digital devices and the specialized applications of analog computers is explored, highlighting the limitations of digital computing and the potential for analog computing to break through these barriers.
Pushing the Limits of Digital Computing
This paragraph examines the limitations of digital computing, referencing the predictions made by Gordon Moore, known as Moore's Law, and the physical boundaries that digital elements are reaching. It discusses the challenges of miniaturizing computer chips further and the heat generation and cooling requirements of dense components. The paragraph also touches on Amdahl's law and its implications for the efficiency of digital computers, especially when considering sequential operations and the diminishing returns of adding more processors. The potential of analog computing to offer a more parallel and energy-efficient approach is contrasted with the sequential nature of digital computing.
Future of Analog Computing in Everyday Life
The final paragraph explores the future possibilities of analog computing in everyday life, discussing the development of hybrid computers that combine the energy efficiency of analog with the precision of digital. It mentions companies like Mythic and Aspinity that are working on analog chips for machine learning and low-power sensors. The potential applications of analog computing in household devices are highlighted, such as voice-enabled wearables and heart rate monitoring. The paragraph also addresses the challenges of making analog programming more accessible and the need for seamless connectivity between analog and digital systems. It concludes with a call to action for the audience to consider the potential of analog computing and engage in further discussion.
Keywords
Analog computing
Digital computing
Energy efficiency
Moore's Law
Amdahl's Law
Hybrid computing
Machine learning
Data centers
Differential equations
Surfshark
Climate crisis
Highlights
Analog computing is making a comeback and is also something that never really left.
Analog systems have an infinite number of states, unlike digital systems which rely on a fixed number of states.
Digital computing is becoming increasingly energy intensive, with significant implications for global energy consumption.
Analog computing could be part of the solution to energy efficiency, as it can accomplish tasks for a fraction of the energy.
The MONIAC, created in 1949, is an example of an analog computer used to simulate the economy.
Pilots still use flight computers, a form of slide rule, for calculations without the need for electricity.
Digital devices provide convenience, but analog computing has its own strengths, such as energy efficiency.
Digital computers are hitting basic physical boundaries, limiting how much further they can be shrunk.
Moore's Law, which predicts the doubling of transistors on a chip, is nearing its limits.
The more components on a chip, the harder it is to cool, leading to significant energy and resource use.
Research on new approaches to analog computing has led to the development of materials that don't need cooling facilities.
Amdahl's law suggests that there will always be operations that must be performed sequentially in digital computing.
Analog computers can work in parallel, allowing for more efficient problem-solving without the need for sequential operations.
Hybrid computers that combine the best features of both digital and analog computing may be the future.
Mythic's Analog Matrix Processor chip aims to deliver significant compute resources at a fraction of the power consumption.
Aspinity's AML100 chip can act as a low-power sensor for various applications, with potential energy savings of up to 95%.
Analog computing, with its potential for energy efficiency and real-time processing, could become more approachable and accessible.
Transcripts
If your taste in TV is anything like mine, then most of your familiarity with what analog computing looks like probably comes from the backdrops of something like Columbo. Since digital took over the world, analog has been sidelined into what seems like a niche interest at best. But this retro approach to computing, much like space operas, is both making a comeback, and also something that never really left in the first place.
I found this out for myself about a year ago, when a video from Veritasium sparked my curiosity about analog computing. After that, I started to read a few articles here and there, and I gotta say... it broke my brain a bit. What I really wanted to know, though, was this: How can analog computing impact our daily lives? And what will that look like? Because I definitely don't have room in my house for this.
I'm Matt Ferrell... welcome to Undecided.
This video is brought to you by Surfshark and all of my patrons on Patreon, but more on that later.
Depending on how old you are, you may remember when it was the norm for a single computer to take up more square footage than your average New York City apartment. But after the end of the Space Age and the advent of personal computers, our devices have only gotten smaller and smaller. Some proponents of analog computing argue that we might just be reaching our limits when it comes to how much further we can shrink. We'll get to that in a bit, though. Emphasis on bits.
Speaking of bits, this brings us to the fundamental difference between analog and digital. Analog systems have an infinite number of states. If I were to heat this room from 68 F to 72 F, the temperature would pass through an infinite set of numbers, including 68.0000001 F and so on. Digital systems are reliant on the number of "bits," or the number of transistors that are switched either on or off. As an example, an 8-bit system has 2^8, or 256 states. That means it can only represent 256 different numbers.
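To make that difference concrete, here's a minimal Python sketch (an illustrative addition, with an arbitrary 0-100 F range): an 8-bit encoding can only land on one of 256 evenly spaced values, so nearby analog readings collapse onto the same code.

```python
# Quantize a continuous temperature reading into an 8-bit code.
# The 0-100 F range and 8-bit depth are arbitrary example choices.
def quantize(temp_f: float, lo: float = 0.0, hi: float = 100.0, bits: int = 8) -> int:
    levels = 2 ** bits                  # 256 distinct states for 8 bits
    step = (hi - lo) / (levels - 1)     # size of one discrete step
    code = round((temp_f - lo) / step)  # nearest representable level
    return max(0, min(levels - 1, code))

for t in (68.0, 68.0000001, 72.0):
    code = quantize(t)
    print(f"{t} F -> code {code} (stored as {code * 100 / 255:.4f} F)")
# 68 F and 68.0000001 F collapse to the same 8-bit code; the analog
# value in between is simply unrepresentable.
```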
So, size isn't the only aspect of the technological zeitgeist that's changed. Digital computers solve problems in a fundamentally different way from analog ones. That's led to some pretty amazing stuff in the modern day... at a cost. Immensely energy-intensive computing is becoming increasingly popular. Just look at cryptocurrencies and AI. According to a report released last year by Swedish telecommunications company Ericsson, the information and communication technology sector accounted for roughly 4% of global energy consumption in 2020.
Plus, a significant amount of digital computing is not the kind you can take to go. Just among the thousands of data centers located across the globe, the average campus size is approximately 100,000 square feet (or just over 9,000 square meters). That's more than 2 acres of land! Data scientist Alex de Vries estimates that a single interaction with an LLM is equivalent to "leaving a low-brightness LED lightbulb on for one hour."
But as the especially power-hungry data centers, neural networks, and cryptocurrencies of the world continue to grow in scale and complexity... we still have to reckon with the climate crisis. Energy efficiency isn't just good for the planet, it's good for the wallet. A return to analog computing could be part of the solution. The reason why is simple: you can accomplish the same tasks as you would on a digital setup for a fraction of the energy. In some cases, analog computing is as much as 1,000 times more efficient than its digital counterparts.
Before we get into exactly how it works and why we're starting to see more interest in analog computers again, I need to talk about another piece of tech that can really help in your daily digital life, and that's today's sponsor, Surfshark. Surfshark is a fast, easy-to-use VPN full of incredible features that you can install on an unlimited number of devices with one account. Most of the time when we talk about VPNs we're focused on giving yourself security as you travel around the world, but it can do way more than that. Since you can make it look like your IP address is coming from somewhere else in the world, it unlocks geofencing blocks on content, like streaming services. But... that's not all. Even shopping services will sometimes gate prices based on your location, so you can change your location to make sure you're getting the best prices. They also have add-ons to their VPN service to unlock things like Surfshark Alert, which will let you know if your email or personal details, like passwords, have been leaked online in a data breach. Right now they're running a special deal... use my code UNDECIDED to get up to 3 additional months for free. Surfshark offers a 30-day money-back guarantee, so there's no risk to try it out for yourself. I've been using Surfshark for years and love it. Link is in the description below. Thanks to Surfshark for supporting the channel. And thanks to all of you, as well as my patrons, who get early, ad-free versions of my videos. So back to how much more energy efficient analog computing is than its digital counterparts.
To understand how that works, exactly, we first need to establish what makes analog computing... analog. The same way you would make a comparison with words using an analogy, analog computers operate using a physical model that corresponds to the values of the problem being solved. And yeah, I did just make up an analog analogy.
A classic example of analog computing is the Monetary National Income Analogue Computer, or MONIAC – which sounds like a long-forgotten car brand – that economist Bill Phillips created in 1949. MONIAC has a single purpose: to simulate the Great British economy on a macro level. Within the machine, water represented money as it literally flowed in and out of the treasury. Phillips determined alongside his colleague Walter Newlyn that the computer could function with an approximate accuracy of ±2%. And of the 14 or so machines that were made, you can still find the first churning away at the Reserve Bank Museum in New Zealand.
It's safe to say that the MONIAC worked (and continues to work) well. The same goes for other types of analog computers, from those on the simpler end of the spectrum, like the pocket-sized mechanical calculators known as slide rules, to the behemoth tide-predicting machines invented by Lord Kelvin.
In general, it was never that analog computing didn't do its job – quite the opposite. Pilots still use flight computers, a form of slide rule, to perform calculations by hand, no juice necessary. But for more generalized applications, digital devices just provide a level of convenience that analog couldn't. Incredible computing power has effectively become mundane.
To put things into perspective, an iPhone 14 contains a processor that runs somewhere above 3 GHz, depending on the model. The Apollo Guidance Computer, itself a digital device onboard the spacecraft that first graced the moon's surface, ran at... 0.043 MHz. As computer science professor Graham Kendall once wrote, "the iPhone in your pocket has over 100,000 times the processing power of the computer that landed man on the moon 50 years ago." ... and we use it to look at cat videos and argue with strangers.
In any case, that ease of use is one of the reasons why the likes of slide rules and abacuses were relegated to museum displays while electronic calculators reigned king. So much for "ruling." But, while digital has a lot to offer, like anything else, it has its limits. And mathematician and self-described "analog computer evangelist" Bernd Ulmann argues that we can't push those limits much further. In his words:
"Digital computers are hitting basic physical boundaries by now. Computing elements cannot be shrunk much more than today, and there is no way to spend even more energy on energy-hungry CPU chips today."
It's worth noting here that Ulmann said this in 2021, years ahead of the explosion of improvements in generative AI we've witnessed in just the past few months, like OpenAI's text-prompt-to-video model, Sora. Which really disturbs me and excites me all at the same time – I need to make a video about that.
But what did he mean by "physical boundaries"? Well... digital computing is starting to bump up against the law. No, not that kind... the scientific kind. There are actually a few that are at play here. We've already started talking about the relationship between digital computing and size, so let's continue down that track.
In a 1965 paper, Gordon Moore, co-founder of Intel, made a prediction that would come to be known as "Moore's Law." He foresaw that the number of transistors on an integrated circuit would double every year for the next 10 years, with a negligible rise in cost. And 10 years later, Moore changed his prediction to a doubling every two years.
As Intel clarifies, Moore's Law isn't a scientific observation, and Moore actually isn't too keen on his work being referred to as a "law." However, the prediction has more or less stayed true, as Intel (and other semiconductor companies) have hailed it as a goal to strive for: more and more transistors on smaller and smaller chips, for less and less money.
Here's the problem. What happens when we can't make a computer chip any smaller? According to Intel, despite the warnings of experts in the past few decades, we've yet to hit that wall. We can take it straight from Moore himself, though, that an end to the standard set by his law is inevitable. When asked about the longevity of his prediction during a 2005 interview, he said this:
"The fact that materials are made of atoms is the fundamental limitation and it's not that far away. You can take an electron micrograph from some of these pictures of some of these devices, and you can see the individual atoms of the layers. The gate insulator in the most advanced transistors is only about three molecular layers thick... We're pushing up against some fairly fundamental limits, so one of these days we're going to have to stop making things smaller."
Not to mention, the more components you cram onto a chip, the hotter it becomes during use, and the more difficult it is to cool down. It's simply not possible to use all the transistors on a chip simultaneously without risking a meltdown. This is also a critical problem in data centers, because it's not only electricity use that represents a huge resource sink. Larger sites that use liquid as coolant rely on massive amounts of water a day – think upwards of millions of gallons. In fact, Google's data centers in The Dalles, Oregon, account for over a quarter of the city's water use.
Meanwhile, emerging research on new approaches to analog computing has led to the development of materials that don't need cooling facilities at all.
Then there's another law that stymies the design of digital computers: Amdahl's law. And you might be able to get a sense of why it's relevant just by looking at your wrist. Or your wall. Analog clocks, the kind with faces, can easily show us more advantages of analog computing. When the hands move forward on a clock, they do so in one continuous movement, the same way analog computing occurs in real time, with mathematically continuous data. But when you look at a digital clock, you'll notice that it updates its display in steps. That's because, unlike with analog devices, digital information is discrete. It's something that you count, rather than measure, hence the binary format of 0s and 1s.
When a digital computer tackles a problem, it follows an algorithm, a finite number of steps that eventually lead to an answer. Presenting a problem to an analog computer is a completely different procedure, and this cute diagram from the '60s still holds true today:
First, you take note of the physical laws that form the context of the problem you're solving. Then, you create a differential equation that models the problem. If your blood just ran cold at the mention of math, don't worry. All you need to know is that differential equations model dynamic problems, or problems that involve an element of change. Differential equations can be used to simulate anything from heat flow in a cable to the progression of zombie apocalypses. And analog computers are fantastic at solving them.
Once you've written a differential equation, you program the analog computer by translating each part of the equation into a physical part of the computer setup. And then you get your answer, which doesn't even necessarily require a monitor to display!
All of that might be tough to envision, so here's another analog analogy that hopefully is less convoluted than the labyrinth of wires that make up a patch panel. Imagine a playground. Let's say two kids want to race to the same spot, but each one takes a different path. One decides to skip along the hopscotch court, and the other rushes to the slide. Who will win?
These two areas of the playground are like different paradigms of computing. You count the hopscotch spaces outlined on the ground and move between them one by one, but you measure the length of a slide, and reach its end in one smooth move. And between these two methods of reaching the same goal, one is definitely a much quicker process than the other... and also takes a lot less energy.
There are, of course, caveats to analog. If you asked the children in our playground example to repeat their race exactly the same way they did the first time, who do you think would be more accurate? Probably the one whose careful steps were marked with neat squares, and whose outcomes will be the same – landing right within that final little perimeter of chalk. With discrete data, you can make perfect copies. It's much harder to create copies with the messier nature of continuous data. The question is: do we even need 100% accurate calculations? Some researchers are proposing that we don't, at least not all the time.
That said, what does this have to do with Amdahl's law? Well, we can extend our existing scenario a little further. It takes time to remember the rules of hopscotch and then follow them accordingly. But you don't need to remember any rules to use a slide – other than maybe "wait until there isn't anybody else on it." Comment below with your favorite playground accidents!
In any case, because digital computers 1. reference their memories and 2. solve problems algorithmically, there will always be operations (like remembering hopscotch rules) that must be performed sequentially. As computer science professor Mike Bailey puts it, "this includes reading data, setting up calculations, control logic, storing results, etc." And because you can't get rid of these sequential operations, you run into diminishing returns as you add more and more processors in attempts to speed up your computing. You can't decrease the size of components forever, and you can't increase the number of processors forever, either.
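Amdahl's law puts a number on those diminishing returns: if a fraction p of a job can run in parallel, the best possible speedup on N processors is 1 / ((1 - p) + p/N). Here's a quick Python sketch; the 90% parallel figure is just an example, not from the video.

```python
# Amdahl's law: speedup is capped by the fraction of work that stays sequential.
def amdahl_speedup(p: float, n: int) -> float:
    """Best-case speedup when a fraction p of the work parallelizes across n processors."""
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # example: 90% of the workload can run in parallel
for n in (1, 2, 8, 64, 1024):
    print(f"{n:5d} processors -> {amdahl_speedup(p, n):.2f}x speedup")
# Even with 1024 processors the speedup stalls near 1 / (1 - p) = 10x,
# because the 10% sequential portion never gets faster.
```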
On the other hand, analog computers don't typically have memories they need to take time to access. This allows them more flexibility to work in parallel, meaning they can easily break down problems into smaller, more manageable chunks and divide them between processing units without delays.
Here's how Bernd Ulmann explains it in his 2023 textbook, Analog and Hybrid Computer Programming, which contributed a considerable amount of research to this video:
"Further, without any memory there is nothing like a critical section, no need to synchronize things, no communications overhead, nothing of the many trifles that haunt traditional parallel digital computers."
So, you might be thinking: speedier, more energy-efficient computing sounds great, but what does it have to do with me? Am I going to have to learn how to write differential equations? Will I need to knock down a wall in my office to make room for a retrofuturist analog rig?
Probably not. Instead, hybrid computers that marry the best features of both digital and analog are what might someday be in vogue. There are already whisperings of Silicon Valley companies secretly chipping away at... analog chips. Why? To conserve electricity... and cost. The idea is to combine the energy efficiency of analog with the precision of digital. This is especially important for continued development of the power-hungry machine learning that makes generative AI possible. With any luck, that means products that are far less environmentally and financially costly to maintain.
And that's exactly what Mythic, headquartered in the U.S., is aiming for. Mythic claims that its Analog Matrix Processor chip can "deliver the compute resources of a GPU at 1/10th the power consumption." Basically, as opposed to storing data in static RAM, which needs an uninterrupted supply of power, the analog chip stores data in flash memory, which doesn't need power to keep information intact. Rather than 1s and 0s, the data is retained in the form of voltages.
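As a rough conceptual sketch (not Mythic's actual architecture or toolchain), the workload such chips target is the multiply-accumulate at the heart of neural networks: stored weights act like conductances, so an entire matrix-vector product can happen as currents summing on wires. Done digitally, the same operation looks like this, with made-up example values:

```python
# Conceptual sketch (not Mythic's architecture or API): the matrix-vector
# multiply that analog in-memory compute performs as one physical step.
weights = [            # e.g. one small neural-network layer (example values)
    [0.2, -0.5, 0.1],
    [0.7,  0.3, -0.2],
]
inputs = [1.0, 0.5, -1.0]  # input activations, applied as voltages on an analog chip

outputs = []
for row in weights:
    # Each output is a sum of products -- a digital chip performs every
    # multiply and add explicitly, one after another.
    acc = sum(w * x for w, x in zip(row, inputs))
    outputs.append(acc)

print(outputs)  # [-0.15, 1.05]
```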
Where could we someday see analog computing around the house, though? U.S.-based company Aspinity has an answer to that. What it calls the "world's first fully analog machine learning chip," the AML100, can act as a low-power sensor for a bunch of applications, according to its website. It can detect a wake word for use in voice-enabled wearables like wireless earbuds or a smartwatch, listen for the sound of broken glass or smoke alarms, and monitor heart rates, just to name a few.
For those devices that always need to be on, this means energy savings that are nothing to sneeze at (although I guess you could program an AML100 to detect sneezes). Aspinity claims that its chip can enable a reduction in power use of 95%.
So, the potential of maximizing efficiency through analog computing is clear, and the world we interact with every day is itself analog. Why shouldn't our devices be, too? But to say that analog programming appears intimidating (and dated) is... somewhat of an understatement.
It'll definitely need an image upgrade to make it approachable and accessible to the public – though there are already models out there that you can fiddle with yourself at home, if you're brave enough. German company Anabrid, which was founded by Ulmann in 2020, currently offers two: the Analog Paradigm Model-1, and The Analog Thing (or THAT).
The Model-1 is intended for more experienced users who are willing to assemble the machine themselves. Each one is produced on demand based on the parts ordered, so you can tailor the modules to your needs.
THAT, on the other hand... and by THAT I mean THAT: The Analog Thing, is sold fully assembled. You could also build your own from scratch – the components and schematics are open source.
So what do you actually do with the thing? Y'know... THAT? I'll let the official wiki's FAQ answer that:
"You can use it to predict in the natural sciences, to control in engineering, to explain in educational settings, to imitate in gaming, or you can use it for the pure joy of it."
The THAT model, like any analog computer, solves whatever you can express in a differential equation. As a reminder, that's basically any scenario involving change, from simulating air flow to solving heat equations. You can also make music!
But as analog computing becomes more readily available, there's still a lot of work to be done. For one thing, it'll take effort to engineer seamless connectivity between analog and digital systems, as Ulmann himself points out.
Until then, what do you think? Should we take the word of analog evangelists as gospel? Or are we better off waiting for flying cars? Jump into the comments and let me know. Be sure to check out my follow-up podcast, Still To Be Determined, where we'll be discussing some of your feedback. Before I go, I'd like to welcome new Supporter+ patrons Charles Bevitt and Tanner. Thanks so much for your support. I'll see you in the next one.