The statement is due to a lack of vision. If you can't take in all the abstract information and apply it to a vision, you will never come up with a solution.
Finally! Of course there are problems we don't need deterministic computing for -- within some confidence interval, it's likely we have solution X. This is great for heuristics for NP problems - right?
We discussed this about a decade ago, except instead of supercooling we were going to exploit the characteristics of tunnel diodes, which could be manufactured on silicon substrates and work at room temperature.
Cool!!❤❤ A semiclassical approach to computing, I guess? But somehow it sounds a lot like D-Wave's quantum annealing to me. But I'm probably just confused. Anyway, great video 👍👍👍👍
The nice thing about photons is that they don't decay (because time is frozen at the speed of light) unless they hit something, and then, if it's a mirror...
@thesnare100 Decay in a metaphoric sense. If you cut power to a superconducting q-bit, it will cease to exist. A photon can travel forever if there are no obstacles.
@@marcbjorg4823 It makes me wonder what there is to stop you from going forever if YOU could travel at the speed of light, since there is an "end of space" so to speak: a point space is still expanding toward / hasn't yet expanded to. But it recedes faster than the speed of light, and has been doing so since the big bang, so you could never catch up with it. I don't know if there's a name for it, "the space wall" or something.
This is so cool! I was imagining this as a quasi-analog probabilistic system ("continuous" states), but this makes sense given it harnesses quantum effects of thermal noise in CMOS. I imagine the feedback "slides" the state toward a final state/equilibrium for each p-fet (probabilistically weighted), and for the system as a whole? I'm quite curious. This seems like a "once in a generation" discovery. I was about to say "this is like the happy space between classical and quantum" and "this is closer to how a brain might work" - we came to the same conclusion on that, 100%. Perhaps not a surprise, considering we both come from an analog design background (I worked in high-speed mixed signal, RF/uW, mainly with CMOS/BiCMOS). I now work in game dev and a bit of CV and AI, so it makes a tonne of sense how useful this can be. The potential applications for things such as non-deterministic state-based/programmatic AI behavior, optimization, etc. are huge. This is very cool - ty for sharing!
I thought of this technique independently. If you want to add a list of numbers, instead of using each number, use a probability derived from it with a random number generator. The sum is probably more accurate than the precision would suggest. I tested it and it was more accurate.
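That trick has a name nowadays: stochastic rounding. A minimal sketch of the idea in plain Python (my formulation, not necessarily the commenter's exact method):

```python
import math
import random

def stochastic_round(x: float) -> int:
    """Round x to floor(x) or floor(x) + 1, picking the upper value with
    probability equal to the fractional part. The expected value of the
    result is exactly x, so the errors cancel on average instead of
    accumulating the way deterministic rounding errors can."""
    lo = math.floor(x)
    return lo + (random.random() < x - lo)

values = [0.3] * 1000                                   # true sum: 300.0
naive = sum(round(v) for v in values)                   # round(0.3) == 0 -> 0
stochastic = sum(stochastic_round(v) for v in values)   # ~300 on average
print(naive, stochastic)
```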
Wow, you just hurt my head! But thinking about it, clearly neural networks would have absolute advantages to solving real-time problems if supremacy is realized. I would have to go back to school to understand what they are doing. I wish them luck in their pursuits. I could use the help in my technical analysis of market movements. James E Anderson MBA, MS, JD.
It makes me think of the fuzzy logic that I used nearly 30 years ago in rule-based expert systems, and also tried to apply to neural networks; but with the computing power and training data available at the time, it was not very practical.
Question: if you have random bits, don't you still have to figure out an efficient way to read them out for computational use? Are the "computations" random and embedded into the memory somehow? This part was confusing to me.
As someone studying active inference in AI, which is fundamentally probabilistic, this is quite exciting. As are the free energy principle and Bayesian inference.
Thank you once again for a marvelous video! I had no idea this was out there! I want to hear more about it. One question I have is: are probability distributions key in this new technology? Can p-bits operate with different probability distributions? Can they be forced into particular probability distributions? Are such distributions the key to how a probabilistic algorithm works? Another thing I'm wondering is, what would be an example of a probabilistic algorithm, a basic one that might let us see how these machines work? I'm also wondering about interconnections between p-bits. Is there such a thing as probability gates, and how do they work? You also mentioned that information flows in both directions between p-bits, but your example was in a single direction. I want to know more about this bidirectional characteristic. How is it achieved in the circuit itself? And finally, I'm wondering about the mathematics used to represent probabilistic circuits.
Whenever something new is a big improvement, it is normally a combination of the two things that came before it, taking the pros from both and limiting the cons of each.
I had no idea that research money was being spent for this.....seems random but then again to come up with something new you sometimes must think outside the box. Interesting video thank you!
Could you put digital and probabilistic computation on the same board? GPUs already have several kinds of specialized cores. Could you get bits and p-bits to work in concert, or are the differences too fundamental?
I think that in the future we'll have different kinds of technologies in the same computer: a deterministic core, a probabilistic core, a quantum core and so on. And we'll use whichever core best fits the problem.
Check out my new course on Technology and Investing in Silicon:
www.anastasiintech.com/course
The first 50 people to sign up get 25% off with the code “EARLY25”.
Love your content.
Do I need to study Electrical Engineering to do a Probabilistic Computer Start-up company?
@TheBlueMahoe No man. Delegate! Make someone else do their strong suit and you stick to yours. Use collaboration and delegation to make a real team and fill in the gaps in each other's competence.
If you're gonna harvest noise, what about random number generators??
I guess that sounds like synthetic data for stochastic parrots, kinda.... ;*[}
I thought years ago about a comparator neural net using arrays of D/A converters and comparators for near-instantaneous results. Basically, a DLN has to perform a huge array of comparisons to determine a match, but this could all be done with an analog array, using a D/A to create a comparison node value that feeds into one side of each comparator. This would be extremely energy-efficient as well as fast. Imagine a chip with 100M analog comparator nodes using a couple of watts of power.
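For intuition, here is a rough software model of what such a comparator array would compute (a hypothetical numpy sketch; the agreement-count matching rule is my illustration, not a real circuit):

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes = 1024

# Stored patterns: one reference level per comparator node per template.
templates = rng.integers(0, 2, size=(10, n_nodes))

def best_match(analog_signal: np.ndarray) -> int:
    """Each node's comparator digitizes the input against a DAC-set
    threshold; the match score is simply how many comparator outputs
    agree with each stored pattern. In hardware every node switches in
    parallel, which is where the speed and energy win would come from."""
    bits = (analog_signal > 0.5).astype(int)         # comparator outputs
    agreements = (bits == templates).sum(axis=1)     # per-template score
    return int(agreements.argmax())

noisy_input = templates[3] + rng.normal(0, 0.4, n_nodes)  # pattern 3 plus noise
print(best_match(noisy_input))  # -> 3
```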
As a computer scientist and software engineer, I see great niche use cases for p-bits, or stochastic bits. But 100% of software is 99% deterministic, even when stochastic elements are put inside "deterministic cages". So I want to stress that nondeterministic computing has only niche uses within a deterministic framework, for the great majority of tasks for which humans want to program digital solutions.
AI isn’t 100% deterministic.
@@chriskiwi9833 Agree, I think he missed this point and the associated demand. Niche? Yeah, right.
That's true, but what is taking a toll on performance and power consumption _is_ the non-deterministic part. There will always be a mix of the two, if only for the very deterministic communications part.
@@tikabass This doesn't actually have good power consumption. They only stated that it functions at room temperature; they did not state that it functions at all efficiently at room temperature. It only functions as efficiently as stated when it is supercooled, and there is no inclusion of the cooling as part of the power budget in this piece. Instead, to a consumer, it reads as if they are trying to pass this component off as "passive", when it _is not passive_. People who build servers will know this is the sort of chip that needs a significant portion of a server building dedicated to cooling to run at all efficiently.
This piece steps around HPC by calling it "the old way", even though HPC accounts for the majority of the computing that hardware needs to do right now. If you are looking at what is actually going to be relevant in the future, it's still deterministic, good old-fashioned high-performance compute, just with loads of added extras for developers. Look at NextSilicon: they're on the yellow brick road, in contrast to this Extropic landfill, waste of sand.
@@quibster They stated this new tech has 100,000,000× (that's 1e8) better power efficiency than the current tech, which mostly uses GPU cards. Take the max consumption of your GPU card and divide by 100,000,000 to get an idea of the power consumption of the new tech. Mine, a lowly GTX 1060, is rated at 120 W; divided by 100,000,000 that is 1.2 µW... In terms of heat dissipation, that's close to nil, and it's no surprise that this tech runs at room temperature. For comparison, the power consumption of the small LED that tells you your computer is on is on the order of 10 mW, or roughly 10,000 times more.
The idea of p-bits acting as a bridge between classical and quantum computing is mind-blowing! Could this be the practical ‘quantum’ tech we need before full quantum computers are ready? cool
Most likely. I've been waiting for this for ~8 years, IIRC.
No. q-bits without quantum entanglement are the same as p-bits, but to be faster than classical computing, entanglement is required. We already had p-bits.
Very AI-bot comment. Genuinely asking: are you a human?
@@Nandarion Yes, so it seems that p-bits can solve for the equilibria where the variance converges, and then q-bits can also solve the cases where it diverges.
Sounds like more vapor-ware marketing speak.
This is not new. Doing it in silicon is new, but this is what has been done since the industrial revolution. Before there were active controls, you set up an equilibrium equation. On one side you put things you could determine, and on the other side you put things you couldn't. They met on something you wanted, like the temperature of a room or the number of bags of flour you wanted to grind every day. You turned the system on and it settled at its equilibrium point. You fiddled with the things you could determine until the equilibrium point was what you wanted. The same is true for any analog computer: you dial in the properties of the equilibrium and you let it go. It's a beautiful implementation of a very old engineering method.
Is there a good youtube video that explains this in detail?
@@TesterBoy Search for PID
@@TesterBoy First principles of engineering. It's actually Greek philosophy. But more intrinsically, like neurons.
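For anyone searching: the core of a PID loop is only a few lines. A generic textbook sketch with made-up gains and a toy room-temperature plant, nothing specific to the video:

```python
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt):
        """Push the output so the measurement settles on the setpoint,
        i.e. the 'equilibrium point' described in the comment above."""
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy plant: a room that drifts toward 15 degrees unless heated.
pid = PID(kp=2.0, ki=0.5, kd=0.1, setpoint=21.0)
temp = 15.0
for _ in range(100):
    heat = pid.update(temp, dt=1.0)
    temp += 0.1 * (15.0 - temp) + 0.05 * heat   # simple first-order room model
print(round(temp, 2))  # settles near the 21.0 setpoint
```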
Connecting energy probabilities within equilibrium while maintaining synaptic connections is basically what the mind does?
Putting that with a powerful AI…
Because putting quantum computing with AI is what we have now with computers/bots?
There is "Monte Carlo Integration", which is a method of computing by random sampling, that has been known for several decades. To approximate the area of a shape in a rectangle, you could try covering just the interior of the shape with tiny boxes and then the total area of all the boxes is the approximate area of the shape; this is the classical computation. Alternatively, randomly choose a thousand points in the rectangle and count how many are inside the shape. This gives a statistical estimate of the area of the shape. As the number of dimensions of the rectangle increases (i.e., shapes in N-dimensional boxes), the numerical error associated with the classical computation tends to grow more quickly than the numerical error associated with random sampling, I recall. The "probabilistic computing" discussed in this video reminded me of these "random sampling" methods.
Fascinating stuff. Stanislaw Ulam back in 1946 is generally credited for Monte Carlo method.
Basically, that's what I also mentioned above with regards to random number generators using noise or other parameters from nature. Monte Carlo integration, and Monte Carlo methods in general, use the central limit theorem and the law of large numbers for obtaining the right answer. This of course is more accurate if we have an "unbiased" random number generator.
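The random-sampling estimate described above fits in a few lines. A minimal sketch estimating the area of the unit circle (true answer: pi):

```python
import random

def mc_area(n_samples: int) -> float:
    """Estimate the area of the unit circle (true value: pi) by throwing
    random points into the enclosing 2x2 square and counting the hits.
    The error shrinks like 1/sqrt(n) regardless of dimension, which is
    why random sampling beats grid methods on high-dimensional problems."""
    hits = sum(random.uniform(-1, 1) ** 2 + random.uniform(-1, 1) ** 2 <= 1
               for _ in range(n_samples))
    return 4.0 * hits / n_samples

print(mc_area(1_000_000))  # ~3.14
```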
The problem is how to make those p-bits produce the needed distribution with given parameters. They should be strongly influenced by temperature. Will thermostatic solutions be enough?
You didn’t really address how p-bits and algorithms and data work together to produce an output or solution.
You have to buy the course. Can't give away all the secrets for free 🤪
@ I don’t need a course. Just a very high level description of the way you program with p-bits. Hardware advances are great. But the value of hardware is achieved through software. So it’s a great concept, but software will define its success.
In all fairness you’d probably need an entire video for that
@@roch145 "very high level description" sounds like a course; but, yes, the software is required. But isn't mathematical literature already being implemented as software for these quantum computers? To my recollection, the "software" is what enabled the hardware to begin development in the first place (seeing as the 'soft' is the structured thought and the 'hard' is the physical body) - but it's definitely something that needs to be more publicly enticing...
ruclips.net/video/VQjmO77wyQo/видео.html
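To give a flavor of what "programming" p-bits looks like: in the research literature (e.g. Camsari et al.'s p-bit work), the usual software model is a binary unit that keeps re-sampling itself with a tanh-shaped bias, and the "program" lives entirely in the couplings and biases. A toy sketch under those assumptions (the 3-bit problem here is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)

# The "program" lives in the couplings J and biases h. This toy choice
# (a 3-p-bit ferromagnet) just rewards all three bits for agreeing.
J = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
h = np.zeros(3)
m = rng.choice([-1.0, 1.0], size=3)   # p-bit states fluctuate between -1 and +1
beta = 1.0                            # inverse "temperature": higher = less noisy

counts = {}
for _ in range(10_000):
    for i in range(3):                # each p-bit re-samples given its neighbors
        drive = beta * (J[i] @ m + h[i])
        p_up = (1 + np.tanh(drive)) / 2
        m[i] = 1.0 if rng.random() < p_up else -1.0
    state = tuple(int(x) for x in m)
    counts[state] = counts.get(state, 0) + 1

# The two low-energy states, (1, 1, 1) and (-1, -1, -1), dominate the samples.
print(sorted(counts.items(), key=lambda kv: -kv[1])[:2])
```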
It has been fun watching you mature from a bright young student into a powerful expert.
Was not what I thought the video would be about, learned a TON ...loads of cool info, can't wait to see this deployed ! :)
I solved this years ago when I hooked up my stereo to my computer, put on some Zappa, turned it all the way up, and then pulled the knob off...
Saw Frank and The Mothers back in 1971. What a show (Portland, Oregon).
@rodorr saw them in Ft Worth, I think it was 1976 or 77.
but did you dial the volume up to 11? ^_^
@@dwaynestomp5462 hi Dad, must’ve been a bad hangover because you probably didn’t remember what happened that night but, here I am. 😂
@phoenixfireclusterbomb awesome! Ready for me to move in yet?
Sounds very similar to adiabatic quantum computing. Useful for solving optimisation problems, but not universal computing.
Would you reckon it could be used for even more realistic ray tracing? Where these methods are used to cast a whole different order of magnitude of rays in random directions, cast from the light sources, reacting with materials (maybe based on actual physics and photon interaction), where only a very tiny fraction reaches the camera. Just like IRL.
I hope I'm making sense here.
Yep. I foresee these analog systems working in tandem with classical von Neumann/Turing machines, all in the same box. The CPU offloads a task to the analog chip, then takes the result back to classical land.
@@DeltaNovum It's hard to foresee exactly how things would come to be, but one thing is for sure, once we humans are able to abstract functionality behind a layer, we find all kinds of novel ways to use it. Just look at what we were able to do with data and arithmetic logic gates.
rebranded "quantum annealing"
Fascinating! I've never heard of using noise for computation. I've worked with equipment that uses noise to hide in, but never for computing.
Good work Anastasi.
Thanks Anastasi for this video. I saw the video from Jensen Huang and had trouble following it. You made it clear, and I also appreciate the included graphics and videos you added.
A computer's memory is limited to the number of transistors it can use when computing (constants, variables, coefficients). When it is computing, information is fed to it sequentially. The result of an algorithm occurs as combinational logic. An analogy would be that computing is like making a bag of microwave popcorn, where each kernel is data (constants, variables, coefficients). Assume that (in this analogy) the kernels pop randomly, but once they are popped they are no longer kernels; they are popcorn. So they are moved to a different part of memory (called the result). This frees up the initial memory so it may be used by the algorithm. The algorithm can speed up to finish the job faster because it can use more memory and therefore do parallel processing, as in combinational logic. This saves time and energy.
@@garyrust9055 I think she understands conventional chip architecture. Plus, I was under the impression it's propositional, sequential and combinational logic used in low-level architecture.
@garyrust9055 who wrote that poem?
@@mhamadkamel6891 ChatGPT
Thanks Anastasi for the informative content as always!
Also, what is your bet that Graphene Processors could accelerate this further and shorten the time from conceptual phase to first hardware testing setup? Cheers and keep up with the amazing content!
This is amazing. Great work. Thank you for curating such interesting and important knowledge.
If Byte magazine was still around today, you would be doing the digital video version of it. That’s exactly what your discussions remind me of.
You should rename the channel to Byte Anastasia. LOL
I miss Byte magazine so much...
@@JVerstry I also miss Computer Shopper, in the sense that you only had to go through it once to find what you wanted, and didn't have to doomscroll all day and look at videos about stuff that misinforms you.
@@JVerstry I just bought a $2000 VR headset that I'm waiting on until February or March, and after the fact I found out that they've never shipped a product yet, even though they've announced two other products.
Byte and Shopper. Good times !
Very interesting finds. Hope they manage to iron out the drawbacks before another tech breaks daylight. I've read about analog computers from the early computing age, and if this technology arrives, it will have come full circle. Great video. Thanks!
great video as always, thank you for posting Anastasi!!!
Excellent video. 👍 I remember proposing a similar idea for a lidar project. 👍
Very glad to see you talking about this topic!
Also, we can get faster classical c-bit computing with optical computing for precise algorithms. This p-bit tech seems especially suitable for AI. At some point we will have q-bits for similar purposes and for some advanced stuff.
And with the same problems the carbon-based computers in our heads have, i.e. they are not as good as their owners think they are.
Wow, you really know your stuff....fluency even! Thanks for sharing!
Here comes the next layer of the simulation.
There is no need to use anything fancy like superconducting Josephson junctions in order to generate quantum randomness. It is quite sufficient to use reverse-biased diodes and amplify the resulting "shot noise". If digital random numbers are desired, the interval between the "shots" provides that.
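One standard way to turn the interval between "shots" into unbiased bits is to compare consecutive inter-arrival times, a von Neumann-style debiasing. A sketch, with an exponential distribution standing in for real shot-noise intervals:

```python
import random

def noise_bits(n: int) -> list[int]:
    """Emit one bit per pair of consecutive inter-arrival intervals:
    1 if the first is longer, 0 if shorter, discard ties. Because the
    intervals are i.i.d., P(t1 > t2) = P(t2 > t1), so the bits come out
    unbiased even if the noise source's average rate drifts slowly."""
    bits = []
    while len(bits) < n:
        t1 = random.expovariate(1.0)   # stand-in for a measured interval
        t2 = random.expovariate(1.0)
        if t1 != t2:
            bits.append(1 if t1 > t2 else 0)
    return bits

print(sum(noise_bits(10_000)))  # ~5000 ones
```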
Epic thanks for sharing your knowledge in this exciting future field of computer science! ❤
00:10 A new computing method embraces noise for vastly superior performance.
02:41 Probabilistic computing bridges classical and quantum concepts using environmental noise.
05:14 Introduction to p-bits as a bridge between classical and quantum computing.
07:32 Probabilistic machine achieves 100 trillion parameters with low power consumption.
09:38 Noise-based computing uses thermodynamics to enhance computational performance.
12:03 Thermodynamic computers drastically improve efficiency over classical GPUs.
14:18 Extropic's groundbreaking thermodynamic computer utilizes superconductivity for probabilistic computing.
16:35 Advancements in thermodynamics technology enhance CMOS-based probabilistic computing.
32K fully ray traced minecraft coming 😌
But can it run Crysis?
no thanks
With atomic sized voxels 🥵
Oh hell yeah 😎
99 googolplex fps
The only way forward, is by seeing what happens when you try. It is an obvious thing to say, but it is the only way. Good luck to you for this stream!
I read about Josephson junctions in the 1980s, proposed as a means of low-noise superconducting switches. Seeing them used to bridge p-bits makes a lot of sense, although I wonder about the scalability of bridging the stochastic behavior with low-temperature junctions.
Neurons use relatively slow but programmable activation potentials, and they work at body temperature.
Just some random thoughts, but this was a great topic and I really appreciate it, so thank you!
Fascinating.
Room temp superconductors will make a huge difference.
SO LONG AND THANKS FOR ALL THE FISH.
Happy 2025! 🎉 ✨ Anastasi see you in the future!
Just had a crazy neural-network-related idea; I wonder if it has been tried. Basically, instead of having one weight per neuron we would have two: one being the normal activation weight, the other a "functional" weight, and this particular "weight" would decide which function is performed by this neuron, instead of having all neurons in a layer perform the same computation.
Makes sense to me. A typical neuron speaks to others with chemical signals as well as electrical ones. Not knowledgeable enough to know how it would work, though.
Interesting idea. If you can figure out how to train such a system, it could be used as an optimization (to improve latency or energy efficiency). If I remember correctly, it has been shown mathematically that using just a single non-linear function for every neuron is enough to have the same computational abilities (the AI counterpart of a Turing machine). However, the proof is about what's possible, not about what's easy/fast to compute.
Well ... build a proof-of-concept !
I am not sure if this has been done exactly as you describe, but learnable activation functions are a relatively well-explored area.
@@isleepy6801 Wouldn't be surprised, that said, have yet to see anything describing the node functions that way.
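A minimal proof-of-concept of the two-weight idea (hypothetical PyTorch-style sketch; making the "functional weight" a softmax mixture over a small bank of activations is one common way to keep the choice differentiable, and is my choice for illustration):

```python
import torch
import torch.nn as nn

class MixedActivationLayer(nn.Module):
    """Linear layer where each neuron also learns *which* function it
    applies: a per-neuron softmax over a small bank of activations."""
    def __init__(self, d_in: int, d_out: int):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.bank = [torch.relu, torch.tanh, torch.sin]
        # The "functional weight": one logit per (neuron, candidate function).
        self.func_logits = nn.Parameter(torch.zeros(d_out, len(self.bank)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.linear(x)                                  # (batch, d_out)
        choice = torch.softmax(self.func_logits, dim=-1)    # (d_out, n_funcs)
        outs = torch.stack([f(z) for f in self.bank], -1)   # (batch, d_out, n_funcs)
        return (outs * choice).sum(-1)   # soft, differentiable "selection"

layer = MixedActivationLayer(8, 4)
print(layer(torch.randn(2, 8)).shape)  # torch.Size([2, 4])
```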
As an HDR monitor owner, I enjoyed getting spooked by the transition at 5:05 😁💛
Basically, you are using the universe, and its randomness, as part of your system. It reminds me of DNA, which doesn't just make things ex nihilo. It plugs into the environment to make things in a cooperative manner.
I love this. DNA was my goto analogy when describing the difference between code and software to a team of scientists (DNA being code and the phenotype/animal being its software)
The initial part of your comment "...using the universe and its randomness" reminded me of Stephen Wolfram and his ruliad idea. He would say "using the universe and its computation".
keep selling, salesman.
@@kakistocracyusa your comment went over my head. Who's the salesman and why?
@@trudyandgeorge "using the universe" ? By that glitzy narrative , so is asphalt cooling at night and heating the next day. Thermodynamics was always a cerebral subject.
@@kakistocracyusa I see now, thanks. You know, entropy and the second law is most certainly universal.
Thanks for presenting all this information on these new platforms - this has cleared up many things I didn't understand about them.
This sounds no different from fuzzy logic from the 1980s? Also using random seeds to determine outcomes, with adjustable weights.
Excellent analysis and explanation! Thank you.
"Any sufficiently advanced technology is indistinguishable from magic." - Arthur C. Clarke
I'm 66yrs old and a lifelong "Electronics Junkie". I understand the science. Nevertheless, I'm still amazed at my microwave oven. Lol⚡
In my understanding, Thermodynamic Computing involves the use of two thermodynamic quantities - energy and entropy. You mentioned the second law of thermodynamics. Classical computers have traditionally only accounted for information related to energy in the form of work. Work is very directed, whereas heat is entirely undirected. In classical computers, nearly all information must be removed as heat, and they need to be cooled to prevent damage. In Thermodynamic Computing, instead, as much information as possible is extracted and processed from this heat. However, it is not easy to measure and evaluate information such as time-dependent temperature values (the noise you mentioned) so precisely and at such a localized level. The use of this technology could be great. And thank you for your report from Vienna
Super interesting & super well presented. As I understand it, the solutions obtained by a Boltzmann or restricted Boltzmann machine are minima in the parameter space defined by the energy of each state & the total energy of the system. Boltzmann showed that the probability of a given state is proportional to the exponential of minus its energy divided by the temperature (times Boltzmann's constant, which converts temperature into energy units). It is a brilliantly simple model that works with physical systems & has been adopted by the two winners of this year's physics Nobel prize to create AI systems that find solutions as minima in the model space using Boltzmann's equation. It is sad to recall that Boltzmann took his own life a little before his ideas became accepted. Thank you for sharing!
Vaguely, but in a sense, this is similar to cooking (e.g., poached or scrambled eggs) or metal annealing (blacksmithing and sword blades, etc.). Harnessing the random effects of heating and cooling to achieve some equilibrium-defined result of interest.
@ Yes, but the difference is that the weights of each component are variables that can be adjusted, as well as self-adjusted, to create minima in the parameter space & unlike, say, eggs there are many more possible outcomes, some of which were never before explored, as in the way AI became world champion at Go.
@@springwoodcottage4248 Yes, you are right about that. But for us to get to cookbooks and recipes (and the chemistry of food), our ancestors had to search manually (exploring Eric Drexler's "timeless potential landscape of technology"). Finding the right initial conditions, ingredients and treatments is the difference between me and a chef in the kitchen in search of a good "equilibrium" state! 🙂
@ The difference between our ancestors searching & finding some solutions is that AI can search at least a million times faster & across a much larger range of potential ingredients. The success of AlphaGo, AlphaChip, AlphaFold, etc. indicates that the AI approach can find minima that humans have failed to find, and can do the searches at such speed that many decades of human searching can be done in hours. AI is an extraordinarily powerful technology that takes its origin from the studies of Boltzmann over 100 years ago.
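Following up on the Boltzmann machine discussion above, the Boltzmann factor is concrete enough to compute: p(s) ∝ exp(-E(s)/kT), and sampling from it takes only a few lines. A toy sketch with made-up state energies:

```python
import math
import random

k_T = 1.0                                   # temperature times Boltzmann's constant
energies = {"A": 0.0, "B": 1.0, "C": 3.0}   # made-up state energies

# Boltzmann: p(s) = exp(-E(s)/kT) / Z, so low-energy states dominate.
weights = {s: math.exp(-e / k_T) for s, e in energies.items()}
Z = sum(weights.values())                   # the partition function
probs = {s: w / Z for s, w in weights.items()}

samples = random.choices(list(probs), weights=list(probs.values()), k=10_000)
print(probs, samples.count("A"))  # "A" is drawn ~71% of the time
```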
Amazing video, many thanks! 🙏🏻
Thank you!
This reminds me of the infinite improbability drive for the Starship in The Hitchhikers Guide to the Galaxy. Maybe it was more of a prediction than a fantasy by Douglas Adams.
Not the same thing.
A q-bit can actually be in many states. The shortest path of the noise to equilibrium determines the resulting probability.
That's how I understand it.
Analog computers have two serious disadvantages: there is no error correction for calculations, and their hardware can only solve a single problem quickly. How can error correction be done in analog? Through correlation? How can you build an analog universal computer? Neural networks seem to be able to do it. It would be interesting to apply the inverse Z-transform to digital data models to build analog solutions and see what comes out of it.
There are already FPGAs that can calculate much faster than software solutions. Digital signal processing is also becoming faster with special hardware, e.g. in signal processors.
Overall, a middle way between digital and analog data processing will develop. E.g.: the sum of each neuron's inputs, with a Schmitt trigger for the output spike, is analog; the multiplicands in the synapses are digital; trained data is loaded digitally, as with FPGAs; the calculation itself is analog.
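A toy simulation of the hybrid neuron described above, under my own reading of it: digitally loaded weights, an "analog" summation, and a Schmitt-trigger output with hysteresis:

```python
import numpy as np

class SchmittNeuron:
    """The weighted sum is 'analog'; the output spikes via a Schmitt
    trigger, i.e. two thresholds with hysteresis, so noise near a single
    threshold can't make the output chatter."""
    def __init__(self, weights, high=1.0, low=-1.0):
        self.w = np.asarray(weights, dtype=float)   # digitally loaded weights
        self.high, self.low = high, low
        self.state = 0                              # current output (0 or 1)

    def step(self, inputs) -> int:
        s = float(self.w @ np.asarray(inputs, dtype=float))  # analog summation
        if s >= self.high:
            self.state = 1
        elif s <= self.low:
            self.state = 0
        # between the thresholds the previous state is held (hysteresis)
        return self.state

n = SchmittNeuron([0.5, 0.5, 0.5])
for x in ([1, 1, 1], [0, 1, 0], [-1, -1, -1]):   # drive up, hold, drive down
    print(n.step(x))   # -> 1, then 1 (held), then 0
```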
The issue probabilistic computation will run into is local minima when parallel calculations are running. This issue is equivalent to the physical phenomenon of density fluctuations near a critical point. So local solutions (maximum entropy) are strongly influenced by nearby minima in entropy.
I commented about this kind of leap forward about a month ago, that there would be some advancement that would GREATLY enhance efficiency of AI compute. And here it is! Thank you Nastya! 😂
For almost two years now I've been saying that doing AI digitally is completely broken. A neuron is an op-amp, and what we're doing, using SIMD to multiply, accumulate, then apply an activation function, is just a super-expensive emulation of the op-amp. I just don't know how fast we could make op-amps work at the process nodes used by CPUs; it might be that digital remains faster, but I strongly doubt it. I'm still waiting for an analog AI chip.
For two years now, apparently.
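The op-amp analogy is quite literal: an ideal inverting summing amplifier computes a weighted sum in one shot, with each weight set by a resistor ratio. A sketch of the idealized transfer function (real circuits add offset, noise, and bandwidth limits):

```python
# Ideal inverting summing amplifier: V_out = -R_f * sum(V_i / R_i).
# Each input resistor R_i sets one synaptic weight w_i = R_f / R_i, so a
# single op-amp performs a whole multiply-accumulate in the analog domain.

def summing_amp(v_in: list[float], r_in: list[float], r_f: float) -> float:
    return -r_f * sum(v / r for v, r in zip(v_in, r_in))

voltages = [0.2, -0.5, 0.1]     # input activations, in volts
resistors = [10e3, 20e3, 5e3]   # chosen so the weights are 1.0, 0.5, 2.0
print(summing_amp(voltages, resistors, r_f=10e3))  # -(0.2 - 0.25 + 0.2) = -0.15
```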
I have learned so much from your videos. Thanks Anastasi!!!
The universe is probably deterministic. Our lack of ability to see all the variables means it looks random. Although we might be able to build machines to see more variables, to see them all we would probably need more material than the universe has.
You first sentence said it all😊
This was essentially Einstein's position in the face of indeterminate wavefunction collapse upon measurement, yes?
@@ip6289 I know you are probably ;) joking. But in case you aren't: there is a difference between using a word in an epistemological context versus in an ontological sense. His use of the word "probably" in the first sentence is in the former sense.
@@aclearlight I think so. I am also pretty sure (i.e., IIRC) that he also thought everything was really fundamentally discrete, thus essentially making the equations of physics what is approximated by nature (ideally, i.e. if the theory is good and/or the system under consideration is simple enough), the exact opposite of how we are taught to think.
You cannot build such a machine, because it would also have to include itself and thus recursively swell ad infinitum. Impossible in principle.
Yeah, so a p-bit is a feedback machine set to automatically find the "least obstructive" path forward. A simple solution, and common sense, like water flowing downhill, or choosing the best time and point to cross a road. If a "roadblock" problem occurs, the system (computation) backs up until it spatially (computationally) recognises an overflow into a new pathway. Elegant.
Beff Jezos is watching
Probably
Yes, and he will undoubtedly figure out how to implement it with something less than vaguely phallic.
It's been a while since I've watched this channel, but every single day there is a breakthrough...
As a computer scientist and software engineer for 58 years now, I would point out that the majority of computer CPU time is wasted and not relevant to the discussion of AI problem solving. A P-computer, if developed as presented, will be a game changer, and not exactly a niche item. Yes, you can get excited about this!
OK, I might be shooting backwards on this, but... if you built the CPU based on thermodynamics and took steps to influence the noise by shielding the hardware in a layer of synthetic skin temperature-regulated to 98 degrees (like the human body), then perhaps tried to simulate movement or slight pressure changes around the hardware and applied the AI programs and algorithms to it. Then build a second one, connect it to the first, and create a quantum/thermal bridge between the two halves; in theory it should act like a hyper-evolved intelligence, right? Based on the human brain structure. Of course, given time it will discover how to improve itself.
AutoML could spawn superintelligence with this one.
The best thing about it all is that the startups working on these computational units do not know how to produce them: they get an "object" with a detailed description of its behavior, but they do not know how the supplier produces it. In this way, the knowledge of the "alchemists" remains at the disposal of the alchemists alone. The scientists get to order their "gold" at the appropriate places on the silicon wafer, and they can use it, but they have no idea how it is created.
WONDERFUL! This is EXACTLY what we need! My god. What I could do with that and an ML weighting table! Gosh!
And I love to see you having "your world turned upside down". This is _exactly_ what I have seen over and over every time I teach a coder to be a prompter. "NO! Stop trying to make it act like a computer! SURF THE NON-DETERMINISM! Make it _work_ for you." Fantastic.
I've been following your channel for quite a while now. Today was the day I absolutely couldn't follow you anymore. Only after watching a second time and researching on the internet what you even meant did I manage to understand at least some of what you're talking about.
If I've understood correctly, the results the computer produces are not always the same for the same inputs. But that is exactly what is so important in digital computing: the same inputs, the same result.
Yes, it is a different operating principle, one that is applied to different kinds of problems.
So is this computer actually going to work?
Tech engineer pushes his glasses back up with one finger: Probably...
no
But maybe yes. It depends.
It's already working, just like how they have quantum computers working. It's just a matter of refining the technology to make it better and competitive with current solutions.
@@iceshadow487 quantum computers is vaporware
@@SoloRenegade That's a strange statement, considering that 'they' have already demonstrated nonclassical performance with them in some areas of computation.
In music production, harnessing noise can be destructive, but it can also make your songs sound so much bigger and more interesting.
100,000,000 times as efficient. Is that considering the total system, including cooling? Any comparison should take into account the total power of the system.
No need, it runs at room temp
The question is: being a gazillion times faster at computing 1% of the job isn't much help.
This almost sounds like "Hitch-hikers Guide to the Galaxy" stuff!
The "Infinite Improbability Drive"!
That's a very fuzzy almost!
The point for me is availability. Talking about these breakthroughs when they are just discovered is not really helpful. It's only hope, not a promise, e.g., wireless power transmission, neuromorphic computing, chiplets, 3D stacking, and optical computing. Sure, it's fun to dream, but getting excited about something that will take years to develop, and will most likely change beyond recognition of today's descriptions, is not appealing. However, I enjoy your show, even if sometimes it's too good to be true. Rob
Chiplets are already a widely used thing, what are you talking about?
But dreams are becoming reality sooner and sooner. Everything is speeding up. But alas, the bad is also speeding up.
@@JasminUwU, I'm sorry. I was misinformed by Microsoft Co-Pilot. I thought that was the case, but I had assumed the ones in use were in different configurations or forms. Rob
This idea of P Bits is so cool and Mind Blowing. Thanks❤
Not necessarily true. Reality may be totally deterministic actually.
Agreed, and that's how we operate, by some predictive measure.
@@devilsolution9781 Well we must be careful to distinguish between ontological and epistemological determinism. Technically speaking, "determinism" only refers to the former in that the past determines the future (if so).
@@djayjp What's the difference? The only thing I can think of that's non-deterministic is the aspect of quantum mechanics that describes the collapse of the probabilistic waveform to a particle.
@@devilsolution9781 Ontological is what is actual in reality. Epistemic is what we know (or can know, in principle). Actually, regarding QM, there are various interpretations, equally valid as the Copenhagen interpretation, that posit determinism (such as pilot wave, many worlds, etc.).
Superdeterminism is unfalsifiable.
Using Noise to DeNoise an image. Awesome!
I keep hearing about advancements, and have for decades! But nothing changes for me! I kinda get sick and tired of this crap!
Because they're grifters. They talk big to grift. I came up with a concept that'll do this stuff, but the academia grifters are all off using their grifting powers to take it away from me :(
There's a delay between proof of concept, then 'profitable' manufacturing, and finally you, the average consumer (5-15 years).
99% of the time, these great new ideas don't pan out when you take them out of the lab and into reality.
Maybe you should put more effort into understanding what the limitations of these new ideas are. Real life is complex.
That statement comes from a lack of vision. If you can't take all the abstract information in and apply it to a vision, you will never come up with a solution.
Finally! Of course there are problems we don't need deterministic computing for: within some confidence interval, it's likely we have a solution X. This is great for heuristics for NP problems, right?
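Right, and randomized heuristics for NP-hard problems already work this way in software: trade the deterministic guarantee for a probably-good answer within a budget. A toy Python sketch of a Schoening-style random-walk SAT heuristic; the clause set at the bottom is made up for illustration:

```python
import random

def random_walk_sat(clauses, n_vars, max_flips=100_000):
    """Schoening-style randomized SAT heuristic: repeatedly pick an
    unsatisfied clause and flip a random variable in it. No deterministic
    guarantee: just a good probability of success within the budget."""
    assign = [random.random() < 0.5 for _ in range(n_vars)]
    sat = lambda lit: assign[abs(lit) - 1] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                    # satisfying assignment found
        var = abs(random.choice(random.choice(unsat))) - 1
        assign[var] = not assign[var]        # flip it and try again
    return None                              # give up (restart in practice)

# Tiny instance: (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
clauses = [(1, 2), (-1, 3), (-2, -3)]
print(random_walk_sat(clauses, n_vars=3))
```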
Does this mean we can produce delusional AI?
DAI, still smarter than some people I know.
I've seen AI output that's pretty delusional...
Okay, just a minor nitpick: the vacuum tube computers were in fact digital as well; they simply used tubes rather than solid-state components.
Wish my colon was 100 million times better at digesting some of the latest frankenstein food ingredients slopped into our food.
First time it made sense to me. Thanks!
I got lost as soon as she said zeroes and ones. lol
This video is awesome and groundbreaking!
Basically sorting probabilities?
Sounds like day dreaming to me.
Thanks! Awesome job, Anastasi!!
We discussed this about a decade ago, except instead of supercooling we were going to exploit the characteristics of tunnel diodes, which can be manufactured on silicon substrates and work at room temperature.
Cool!!❤❤ A semiclassical approach to computing, I guess? But somehow it sounds a lot like D-Wave's quantum annealing to me. But I'm probably just confused. Anyway, great video 👍👍👍👍
The nice thing about photons is that they don't decay (because time is frozen at the speed of light) unless they hit something, and even then, if it's a mirror...
How can anything made of energy "decay"? It depends what you mean by decay: not decay like a corpse, or a neutron becoming a proton.
@thesnare100 Decay in a metaphoric sense. If you cut power to a superconducting qubit, it will cease to exist. A photon can travel forever if there are no obstacles.
@@marcbjorg4823 It makes me wonder what there is to stop you from going forever if YOU could travel at the speed of light, since there is an "end of space", so to speak: a point space is still expanding toward / hasn't yet expanded to, but it recedes faster than the speed of light, and has been doing so since the big bang, so you could never catch up with it. I don't know if there's a name for it, "the space wall" or something.
This is so cool!
I was imagining this as a quasi-analog probabilistic system ("continuous" states), but this makes sense given that it harnesses quantum effects of thermal noise in CMOS. I imagine the feedback "slides" the state toward a final state/equilibrium for each p-fet (probabilistically weighted), and for the system as a whole? I'm quite curious. This seems like a "once in a generation" discovery.
I was about to say "this is like the happy space between classical and quantum" and "this is closer to how a brain might work"; we came to the same conclusion on that, 100%. Perhaps not a surprise, considering we both come from an analog design background (I worked in high-speed mixed signal, RF/uW, mainly with CMOS/BiCMOS). I now work in game dev and a bit of CV and AI. So it makes a tonne of sense how useful this can be. The potential applications for things such as non-deterministic state-based/programmatic AI behavior, optimization, etc., are huge.
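That "feedback slides the state toward equilibrium" picture matches how p-bit networks are usually modeled in the research literature: each bit resamples its state from a sigmoid of the weighted feedback from its neighbors, and the network as a whole relaxes toward a Boltzmann distribution. A toy Python sketch, with weights and biases invented for illustration (this models the math, not the actual hardware):

```python
import math
import random

def sweep(state, W, b, beta=1.0):
    """One asynchronous update pass over a p-bit network: each bit is
    redrawn from a sigmoid of the feedback field from its neighbors,
    nudging the whole network toward its equilibrium distribution."""
    n = len(state)
    for i in range(n):
        field = b[i] + sum(W[i][j] * state[j] for j in range(n))
        p_on = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
        state[i] = 1 if random.random() < p_on else -1
    return state

# Two p-bits with a positive coupling: at equilibrium they spend most
# of their time aligned (both +1 or both -1).
W = [[0.0, 1.5], [1.5, 0.0]]
b = [0.0, 0.0]
state = [random.choice([-1, 1]) for _ in range(2)]
counts = {}
for _ in range(5000):
    s = tuple(sweep(state, W, b))
    counts[s] = counts.get(s, 0) + 1
print(counts)  # the aligned states should dominate
```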
This is very cool - ty for sharing!
Great and very informative videos, but can you tell me: are you using auto-tune (or a pitch shifter) on your vocals?
I thought of this technique independently. If you want to add a list of numbers, instead of using each number directly, use a probabilistic version of it generated with a random number generator. The sum can end up more accurate than the precision of the individual values. I tested it and it was more accurate.
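What this comment describes sounds like stochastic rounding: round each value up or down at random, with probability given by its fractional part, so the rounding errors cancel in expectation instead of piling up. A minimal Python sketch:

```python
import math
import random

def stochastic_round(x):
    """Round x down with probability (1 - frac) and up with probability
    frac, so that E[stochastic_round(x)] == x and errors cancel on average."""
    lo = math.floor(x)
    return lo + (random.random() < x - lo)

values = [0.3] * 1000                        # true sum is 300.0
naive = sum(round(v) for v in values)        # round(0.3) == 0, so this is 0
stoch = sum(stochastic_round(v) for v in values)
print(naive, stoch)                          # the stochastic sum lands near 300
```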
Bravo Anastasia thank you.
❤ 3:13 probabilistically yes 🎉
Wow, you just hurt my head! But thinking about it, clearly neural networks would have an absolute advantage in solving real-time problems if supremacy is realized. I would have to go back to school to understand what they are doing. I wish them luck in their pursuits. I could use the help in my technical analysis of market movements. James E Anderson MBA, MS, JD.
We're already doing this with modern random number generators, which use noise or other parameters from nature to generate random numbers.
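True: operating systems already harvest physical noise (interrupt timing jitter, hardware RNG instructions) into an entropy pool, and standard libraries expose it. For example, in Python:

```python
import secrets

# Draw randomness from the OS entropy pool, which is itself seeded by
# physical noise sources (interrupt timing jitter, hardware RNGs, etc.).
print(secrets.randbits(32))          # 32 bits of OS-sourced randomness
print(secrets.token_bytes(8).hex())  # 8 raw random bytes, hex-encoded
```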
Great, can you give me the next largest prime number, please?
Great work.
It makes me think of the fuzzy logic I used nearly 30 years ago in rule-based expert systems, and also tried to apply to neural networks; but with the computing power and training data available at the time, it was not very practical.
Question: if you have random bits, don't you still have to figure out an efficient way to read them out for computational use? Are the "computations" random and somehow embedded into the memory? This part was confusing to me.
As someone studying active inference in AI, which is fundamentally probabilistic, I find this quite exciting, along with the free energy principle and Bayesian inference.
Thank you once again for a marvelous video! I had no idea this was out there! I want to hear more about it.

One question I have is: are probability distributions key in this new technology? Can p-bits operate with different probability distributions? Can they be forced into particular probability distributions? Are such distributions the key to how a probability algorithm works? Another thing I'm wondering is: what would be an example of a probability algorithm, a basic one that might let us see how these machines work?

I'm also wondering about interconnections between p-bits. Is there such a thing as probability gates, and how do they work? And you mentioned that information flows in both directions between p-bits, but your example was in a single direction. I want to know more about this bi-directional characteristic. How is it achieved in the circuit itself? And finally, I'm wondering about the mathematics used to represent probability circuits.
Yes, the PDFs are configurable, and it's beautiful!
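For anyone wondering what a "configurable PDF" means concretely: in the research literature a p-bit is usually modeled as a bit that resamples to 1 with a probability set by a sigmoid of a tunable bias input. A toy Python sketch; the bias values are arbitrary, and this models the math, not the actual chip interface:

```python
import math
import random

def p_bit(bias, beta=1.0):
    """A single tunable p-bit: outputs 1 with probability sigmoid(beta * bias),
    else 0. Sweeping the bias sweeps the output distribution continuously
    from 'almost always 0' to 'almost always 1'."""
    return 1 if random.random() < 1.0 / (1.0 + math.exp(-beta * bias)) else 0

# Empirically check the distribution at a few bias settings.
for bias in (-2.0, 0.0, 2.0):
    frac = sum(p_bit(bias) for _ in range(10_000)) / 10_000
    print(f"bias {bias:+.1f} -> fraction of 1s ~ {frac:.2f}")
```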
Whenever something new is a big improvement, it is normally a combination of the two things that came before it, taking the pros of both and limiting the cons of each.
I had no idea that research money was being spent on this... seems random, but then again, to come up with something new you sometimes must think outside the box. Interesting video, thank you!
Could you put digital and probabilistic computation on the same board? GPUs already have several kinds of specialized cores. Could you get bits and p-bits to work in concert, or are the differences too fundamental?
I always learn something new with your videos. ❤💻
I think that in the future we'll have different kinds of technology in the same computer: a deterministic core, a probabilistic core, a quantum core, and so on. And we'll use whichever core best fits the task at hand.
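If machines like that arrive, the programming model could resemble today's heterogeneous CPU/GPU dispatch, with a sampler as just another device. A purely speculative Python sketch; `DeterministicCore`, `ProbabilisticCore`, and the `sample` method are invented names rather than a real API, and the "hardware" is faked with a software random search:

```python
import random

class DeterministicCore:
    """Stand-in for an ordinary CPU core: exact, repeatable computation."""
    def run(self, fn, *args):
        return fn(*args)

class ProbabilisticCore:
    """Hypothetical p-bit accelerator: hand it an energy function and it
    hands back a low-energy bit string. Faked here with random search."""
    def sample(self, energy, n_bits, n_samples=1000):
        candidates = (tuple(random.randint(0, 1) for _ in range(n_bits))
                      for _ in range(n_samples))
        return min(candidates, key=energy)

# Dispatch each task to the core that fits it.
cpu, pcore = DeterministicCore(), ProbabilisticCore()
total = cpu.run(sum, [1, 2, 3])                        # exact arithmetic
guess = pcore.sample(lambda s: (sum(s) - 2) ** 2, 4)   # noisy optimization
print(total, guess)   # 6, and a 4-bit tuple with (probably) exactly two 1s
```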
subscribed for the investing stuff