The big win of Signals in the spec is not payload size or performance, it is interoperability. If this happens, it will be as impactful as Promises. It is not only about sharing code and state between frontend frameworks, but also replacing any kind of set+onchange interface out there, just like Promises replaced callback hell.
It's a signal because it's a two-way thing: push dirty flags toward consumers when the root data changes, and push recompute requests toward the root when the new data is needed for consumption (and skip the recompute when inputs don't change). So it's a lazy, two-way observer pattern.
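A minimal sketch of that lazy, push-then-pull flow (hand-rolled; `source`/`derived` are my own toy names, not the proposal's API):

```javascript
// Sketch of the "lazy, two-way" flow: set() only pushes a dirty flag
// down to dependents (cheap); the actual recompute happens on demand
// when a consumer pulls with get().
function source(value) {
  const dependents = new Set();
  return {
    dependents,
    get: () => value,
    set(next) {
      value = next;
      dependents.forEach(d => d.markDirty()); // push phase: flag only
    },
  };
}

function derived(src, fn) {
  let cached;
  let dirty = true;
  const node = {
    markDirty() { dirty = true; },
    get() {
      if (dirty) {               // pull phase: recompute only if needed
        cached = fn(src.get());
        dirty = false;
      }
      return cached;
    },
  };
  src.dependents.add(node);
  return node;
}

let recomputes = 0;
const price = source(10);
const total = derived(price, p => { recomputes++; return p * 2; });

price.set(20); // marks total dirty, but does NOT recompute it
price.set(30); // still no recompute
total.get();   // recomputes once, from the latest value: 60
```

Note that two writes in a row cost only two flag updates; the expensive function runs once, against the latest value.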
It's the front-end ecosystem. Everything must be named differently. Module? No, component. Decorator? No, Higher-Order Component. Middleware? No, Interceptor.
7:20 It's weird to me that no one in chat pointed out that the explanation of why this is valuable isn't complete. You could do the same with lambdas or function references that you call when you want the value, but the main difference with signals is the amount of computation: signals are computed only once per update, not recomputed every time you read the value, which is what's so powerful about them.
The main benefit is in effects though. The effect will be re-run only if its sources change. If your effect is a React component, that means it will be rerendered only when necessary, without rerendering its parent or child components.
??? I mean, that is the point: it's not an event implementation, you ask for an update manually. This is exactly what you often do in all kinds of systems with lambdas or function references. And in the end, similar to lambdas, it makes it extremely hard to find out where things are happening.
Before standardizing this, just build a common library with a feature set similar to this: a polyfill without patching global objects, if you will. Migrate all frameworks to use this library. Ship it. Make them use what this proposal intends to ship. See if it works for all of them in production, not just listing the frameworks in a proposal. We don't want to end up with an implementation that half of the frameworks listed effectively cannot use. Take your time iterating. Once that's done, let's see if there are still benefits to bringing it into the browser that justify adding something to the language that will stay there forever.
Signals have been around for a long damn time, and are demonstrably good for performance already. The reason to get it in the language is that then the DOM can use it and the implementation can be even more performant. It would be super nice to be able to assign a function to a label and it Just Works™.
I don’t hate Signals, but putting things into the language this keenly is exactly how JavaScript ended up this way. Having said that, the performance advantage of lowering this work to a language primitive (think executed in C) would be huge. I would like to see what syntactic sugar, like they used for async/await, might look like, because the library API in the proposal is ugly to use.
I feel like we're getting closer and closer to "framework singularity" because most people agreed that signals are something worth implementing and using. Maybe at some point the difference between frameworks will come down only to syntax.
JS, and by extension TS, are the fastest moving languages in history. Love it or hate it, JS is the social programming language. That gives it the highest adoption, thus the most iterations/collaborators, thus the highest likelihood of becoming ubiquitous to the point of absorbing all competition. See: AssemblyScript (TS->Assembly); Hermes Static (TS->C); Frameworks are no exception. Resistance is futile. All will be assimilated.
19:26 I'm not familiar with other signal implementations, but Preact signals have a peek() method that lets you read the value of the signal without subscribing to changes. This is similar to passing a function to setState() to use the current value, since you shouldn't be using the current value of "state" directly. e.g. signal.set(signal.peek() + 1) vs setState(prevState => prevState + 1)
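A toy sketch of what `peek()` buys you (not Preact's actual code; `signal`/`effect` here are minimal hand-rolled stand-ins): reading with `peek()` inside an effect doesn't subscribe that effect, so later writes won't re-run it.

```javascript
// Toy sketch of peek(): an untracked read.
let activeEffect = null;

function signal(value) {
  const subs = new Set();
  return {
    get() { if (activeEffect) subs.add(activeEffect); return value; },
    peek() { return value; }, // read without subscribing
    set(next) { value = next; subs.forEach(fn => fn()); },
  };
}

function effect(fn) {
  activeEffect = fn;
  try { fn(); } finally { activeEffect = null; }
}

const count = signal(0);
let runs = 0;
// This effect only peeks, so it never re-runs when count changes.
effect(() => { runs++; count.peek(); });
count.set(1);
count.set(2);
count.set(count.peek() + 1); // increment without self-subscribing
// runs is still 1; count.peek() is now 3
```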
Honestly it's ridiculous that this hasn't been a part of every language itself. I mean, you have this code:
```
x = 1
y = 2
z = x + y
print(z) // 3
```
then you assign x = 8, but somehow you still get print(z) // 3, which is a false statement of equality, since x + y = 8 + 2 = 10, so z should be 10. This is an obvious problem, and it's the root of all element rendering problems, as I see it.
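For what it's worth, this is exactly what a derived/computed value gives you. A toy sketch (hand-rolled, not the proposal's implementation):

```javascript
// z recomputes from the current x and y whenever it's read,
// so the stale-value problem above can't happen. (Toy sketch: the
// real thing would also cache and track dependencies.)
function state(value) {
  return { get: () => value, set: v => { value = v; } };
}
function computed(fn) {
  return { get: fn };
}

const x = state(1);
const y = state(2);
const z = computed(() => x.get() + y.get());

console.log(z.get()); // 3
x.set(8);
console.log(z.get()); // 10, no stale value
```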
I used CanJS and also its predecessor, JavaScript MVC. It's weird how we had all these great frameworks and they all disappeared and were replaced by React. Also, what happened to Polymer framework??? IMO, Polymer was superior to React but very few companies used it at the time. I guess at least now there is Lit... But you don't really even need a framework at all nowadays. Web Components is powerful stuff.
@@Remindor WC’s are super cool. Lit does at least purportedly help take some of the boilerplate out of writing WC’s. I haven’t used it yet but considering experimenting with it.
Not just Haskell; it seems people are starting to understand that declarative is the future (and I'm 100% sure imperative, like spaghetti nonsense, will die in the long run), so instead of fully committing, some salad gets invented. Which is still progress, but still. As the polyfill shows, it was doable all along :) I guess the "environment" (read: people) just needs to grow.
The concept of computed is the same as in Vue. As someone who has worked with Vue 2 since 2019, I can say these people are way behind Vue, and this is nothing too fancy or to be excited about.
@@MrDragos360 Vue has done a terrible job at marketing. I had a job interview recently where a senior frontend dev told me their company moved from Vue to React for, WAIT FOR IT, performance reasons 🤣
Getting signals into JS itself would help Vue performance-wise as well. Being able to share Vue ref state with other JS frameworks/apps is also nice. But yeah, Vue already has most (or all?) of this, and it's one of the reasons why I use Vue when I get the choice.
You know I had to pinch myself because someone was proposing a change to the spec instead of just creating yet another framework. I thought methods like these were forbidden and everything new had to be done via a new framework in JS land. These brave guys really challenging the status quo and I hope it works for them. /s
I loved KnockoutJS and still have a fairly large codebase using it. I find it hilarious that the world is coming back around to what Knockout did in the first place. Unfortunately I also have a huge codebase using RxJS and React, so swings and roundabouts.
I do think it's cool how in almost every video you talk about the people, what they did, and what they are doing, as part of the context for what you're talking about. Bringing us up to speed on their "reputation" if you will, beyond just github stars. Of course the downside is that we're not evaluating things solely based on their own merits, but the world really doesn't work like that, does it?
We definitely need a more primitive-driven approach to web application engineering, based on the main themes such as reactivity, rendering, and networking. Exciting proposal.
7:20 I’m the one who asked that question, and I’m not convinced that the “implementation performance” is a good enough reason. I’d argue a JS implementation of such a high-level concept is comparable to the browser-native one; I don’t particularly see optimisation opportunities that the browser sees where JS doesn’t. I could accept an argument that it is done for interoperability between signals libraries, but debatably this could’ve been achieved by a common lib/adapters if desired. I don’t think there is much demand for interop; for it to be there, the signals ecosystem has to mature a bit. I might have missed the argument given by Theo live. I was hella eepy at 3am (EU gang rise up). Might’ve been chatting in the chat at that moment.
You would be surprised by how much Signals have matured. This effort to standardize stems from the fact that all frameworks have derived pretty much the same Signals implementation independently, the push-pull tainted graph. They do differ on how effects run and other details and the proposal accounts for that.
And one point about interop: even if they all agree on a single base signal lib, you need a single instance of the lib running for it to work (since it relies on a single call-stack context to track subscriptions). This is challenging, as it requires npm/yarn/pnpm trickery with peerDependencies, bundler configuration, and/or manual setup.
My argument rests on interop not being a desired-enough feature. I don’t think it is, and I can’t see many reasons why it would be. If it were, it is something the community can solve without TC39 standardization. Additionally, signals can be adapted between implementations, even if there is a perf penalty; I doubt that cross-signal-implementation calls are frequent enough to manifest perf issues. As to other arguments I’ve encountered:
- Perf gains are likely insignificant
- Optimization opportunities are lacking
- Signals being in the DOM doesn’t achieve much. Maybe observability of attributes, but that can be done with MutationObserver (I’ll give you that it isn’t a particularly nice API)
Yeah, it is a doomer-like mentality and the “ooooh, scary change”, but maybe let’s not bloat JS even larger than it is. Look at where that has led C++. But I guess either way, acceptance of the proposal won’t really affect me that much. I’ll gladly use JS signals if they are part of a library interface or if they suit my needs (as opposed to reaching for a 3rd-party library). You have every right to be excited by this addition. I am just not. Not sure it sits right with me.
There is a whole lot of optimization opportunities here. Implementing stuff like this in V8 or another JS runtime will drastically improve performance when you start to natively optimize dependency graphs and things like (de)serialization.
@@yapet The interop goes beyond frontend frameworks. Signals being a first-class citizen could replace every kind of set+subscribe interface. I find Promises to be a good parallel here. Assuming you have been in this space long enough: would you rather still be using Bluebird.js or Q today? Did you imagine what the ecosystem would look like with native Promises back then? Could you imagine a JS world without native Promises today?
So much about Signals reminds me of how Jotai is designed: the whole atomic state that each signal represents, and having derived state similar to computed signals. However, Signals take it a step further by not re-computing the graph when the root node changes, but only when a specific node is consumed. Clever!
This is LITERALLY how KnockoutJS has worked since 2010. Variables are all basically functions which you can subscribe to, including computed fields, which makes data-model binding to the UI just insanely easy. I moved on to React, which meant a mindset shift to one-way binding; looks like I need to dust off those old parts of my brain once again.
LMAO, I know web components aren't ideal in an environment where you can use frameworks. That said, I build and use web components daily, and they work extremely well in a template-based language system like Shopify Liquid themes. I see them as little islands of logic 😄. Although, behind the scenes I'm using SolidJS, which makes the whole process much better xD. I've actually built an extremely complex set of web components used to render a dynamic JS sidecart. The components handle rendering the data while giving you full control over styling the sidecart however you want. The best part is that, because we use SolidJS signals for both the cart and the web components graph, we can ensure fine-grained updates within the sidecart's elements. Sorry for the mini rant, I just think they work really well under the right circumstances 😄.
Exactly, I'm also very fond of web components and use them a lot in a lot of projects. Also I've seen some libraries for React like the Mux Client SDK and LiveKit that use web components under the hood and simply wrap their components in connectors to easily distribute their components across React, Svelte and others without having to rewrite their entire product.
Thanks for pointing this out. Been following various signal based systems (including early elm and purescript) for a bunch of years. It's neat to see that they're tackling rendering timeliness, out-of-date-ness and effectfulness. It's easy to write a signal graph but very hard to get the consumption API right. Don't feel great about the amount of work left to the consumer in the blog post, but we'll see how it falls out. There's nothing worse than writing something nice in a reactive system, adding one more component or relationship and suddenly being without a paddle trying to track down a lost or retrograde ui update.
18:00 MVC is a powerful pattern for decoupling business logic from UI. It shouldn't be frowned upon. It's sad that 99% of React devs will just build UI that's strictly coupled with business logic.
There are so many state-management libs that use MVC, so I'm curious why he frowns upon it, but it is funny to see him react to things he feels averse to. I feel like the deeper you get into building stuff over a long period of time, the more you develop your own style and aversions from badly written, unperformant, or buggy code. Kind of like PTSD.
not doing MVC is the biggest reason JS FE (especially React but not only React) is such a hellscape. Signals won't fix anything, until this attitude changes.
Theo gives me big ego junior superstar vibes. I wouldn't be surprised if he had a lot of friction with his seniors at his previous jobs, and that a lot of his naive takes on certain concepts like MVC and separation of concerns is him projecting his frustrations with past colleagues forcing him to adhere to concepts he didn't fully understand at the time, and due to his large ego he's written them off as pointless, instead of learning their values. I think he's been stuck in that mindset ever since. I've seen this personality countless times, and I've been there too when I was younger.
@@csy897 Can you recommend an MVC library? New frontend juniors never learn MVC; they might only learn it in a server-side language through the traditional university route, while the new frontend route is React. Basically, that's what happened to me. I love writing server-side code, but my favourite is pure PWA. It makes me scratch my head every project; I pick a different style depending on complexity.
@@kasper_573 He also feels like he's got a big ego from doing actual valuable work. I believe that if he meets a problem that is better solved with one of the patterns he doesn't like, I'm sure he's the type that will cave. It's only human to have likes and dislikes, and therefore it is important to surround ourselves with different ideas. As long as we collectively are able to consider what is best for the application we're handling, I think our differences in inclination are what makes app dev fun.
2 way reactivity can be a challenge when you're first hit with the circumstance, especially on the frontend where you come across large data sets and performance bottlenecks, but frankly it's something I've only found difficult in some frontend framework implementations. Vue and solid implement it well but a native implementation is welcome. This will only negatively impact those who have rigid mental models of unidirectional binding
First of all, I've subscribed because I really enjoy your manner of delivering content. However, for the longest time, I was trying to figure out where I know you from and then it hit me. Has anyone ever told you you look like Lalo's (from Better Call Saul / Breaking Bad) healthier younger brother? I can't unsee it now! Thank you for the content, your channel is awesome!
Two thoughts: 1. We already had Observable as a proposal, which is essentially this. That flopped, how will this be different? 2. DOM values should be assignable to signals without an effect function, such that the value is auto-subscribed to the signal value.
This seems like the refined version of Observable: a smaller, more minimal API based on proven usage within libraries. Seeing key members of the libs all working together is awesome. Looking forward to React adopting this (better late than never!)
Super excited for this, but it's interesting that the rendering/effect aspect of signals hasn't been included in the proposal (see 14:17). While I get why, it does kind of make this proposal feel like an incomplete implementation rather than true "signals" the way we've come to understand them. I wonder if effects will be added at some point in the future?
I guess my main question here is: how does it know? I mean, without React-style [dependency, arrays]? I get that a new language primitive could have the compiler examine the passed-in function to see what's closured in and check if any of those are Signals, but a polyfill could never do that, so how could that work? The only way I can think of is to just evaluate everything every time and, if the value hasn't changed, not fire any of the events. But that won't really work for effects like x => [x], because the new array won't equal the old one. I've had a look at the polyfill code, and there is some tree management in there, but I didn't find anything that could really tell when something changes like this.
It works the same way as it has in Vue for at least 7 years, and it's surprisingly simple. You run the function in the initial setup, and each call to `get()` registers the signal with the function. So for example `effect(() => console.log(myvalue.get()))` would initially log the value, and the call to `get()` would set up this lambda to rerun the next time any call to `myvalue.set()` is made (more or less). When it reruns the function, it erases the tracking and registers all the `get()`s again; that's why things like if-conditions still work. However, there are ways to also set up effects without running them first, in which case you would need to provide the signals as dependencies, just like in React's useEffect.
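In sketch form, that auto-tracking trick looks roughly like this (a toy version, not Vue's or the proposal's actual code):

```javascript
// Reading a signal inside a running effect registers that effect as a
// subscriber, so no dependency array is ever needed.
let activeEffect = null;

function signal(value) {
  const subscribers = new Set();
  return {
    get() {
      // Whatever effect is currently running becomes a dependent.
      if (activeEffect) subscribers.add(activeEffect);
      return value;
    },
    set(next) {
      value = next;
      // Rerun every effect that read this signal.
      for (const fn of [...subscribers]) fn();
    },
  };
}

function effect(fn) {
  const runner = () => {
    activeEffect = runner;
    try { fn(); } finally { activeEffect = null; }
  };
  runner(); // the initial run wires up dependencies via get()
}

// Usage: get() does all the wiring.
const count = signal(1);
const log = [];
effect(() => log.push(count.get() * 2));
count.set(5);
// log is now [2, 10]
```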
Not sure if this is a good idea or not but if we need another data wrapper implementation I would prefer it to be native. I'm sick and tired of expanding proxy objects in the debugger to get the value of simple variables.
Please don't. There once was a time where we thought that observables were a good idea. They tried to bring it into JS. It went nowhere. In hindsight, good that it did not happen. Keep that in mind.
There are a lot of people much smarter and more accomplished than you (and me!) working on this. Not saying that automatically has merit, but we could at least see what they come up with before trying to dismiss it.
The problem is that it didn't solve the issue that programmers were having. They had an idea for the use case and it was great for that use case but it was terrible for what programmers wanted to use them for.
```
import { ref, computed } from "vue";
const counter = ref(0);
const isEven = computed(() => (counter.value & 1) == 0);
const parity = computed(() => isEven.value ? "even" : "odd");
```
but vanilla, and apparently fairly intelligent :D I need to see a good example of a server app, though.
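For comparison, here is roughly what that looks like against the proposed `Signal.State`/`Signal.Computed` shape. Since the polyfill may not be available, this stubs a minimal, non-caching stand-in so it runs standalone; the real Computed also memoizes and tracks dependencies.

```javascript
// Minimal stand-in for the proposed API shape (sketch only).
class State {
  #value;
  constructor(value) { this.#value = value; }
  get() { return this.#value; }
  set(value) { this.#value = value; }
}
class Computed {
  #fn;
  constructor(fn) { this.#fn = fn; }
  get() { return this.#fn(); } // real version: cached + dependency-tracked
}
const Signal = { State, Computed };

// The same counter/parity example, vanilla style.
const counter = new Signal.State(0);
const isEven = new Signal.Computed(() => (counter.get() & 1) === 0);
const parity = new Signal.Computed(() => isEven.get() ? "even" : "odd");

counter.set(3);
// parity.get() is now "odd"
```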
All of the same seems achievable to me via RxJS. With RxJS you can do push via subject.next, and you can do pull via observable.subscribe... a UI component can subscribe/unsubscribe to an observable, and it will cause the computation it builds on to run only while that observable is subscribed to. With RxJS you also get teardown support and a bunch of powerful operators. Composability is amazing with RxJS. Unfortunately it's too complicated for people who are used to imperative code. Signals seem easier to use.
Technically you can achieve with Rx Subjects what Signals are trying to achieve but I think you will have to do all the optimizations yourself: manage subscriptions between streams, use distinctUntilChanged, combineLatest, avoid unnecessary computations by unsubscribing from upstream dependencies when a computed Subject does not have any subscribers.
Huh, didn't even know this work was happening. Very cool. I think what would go really nicely in hand here would be a "query" library for asking to "subscribe" to certain signals. Every signal registers their name to a global namespace, then you can "listen" to signals from throughout the network by querying that namespace and getting a signal in return.
If there’s anything I want in a codebase, it’s readability, and I feel like signals go against this. There’s nothing that tells you what updates when, and thus it’s waaaay too magic.
Wow, this looks like Knockout. There they were called Observables. I loved those functions; they are very intuitive. Great to see them as a tool in JS and thus in other frameworks.
I've been writing "reactive" code especially for TCP/RPC servers for almost a decade using libs like rxjs and kefirjs. Nice of them to finally make it a first class citizen :P
The one thing that worries me is lists. Quite often I pass around lists of things, and arrays are, in JS land, quite often *weird*. If I use a signal, and my computed value is [...someSignal.value.map(a => !a)] (inverting some booleans for whatever reason), when is it considered equal? Is it considered equal if the object reference is the same? But that might be dangerous, because then I can push to an array and, well, nothing has changed. This is a problem reactive libraries like MobX try to solve, and it's actually not trivial. I can foresee quite a few footguns when working with arrays in either direction.
Regular arrays would presumably only be compared referentially. There is another proposal for structural equality. It adds records (which are essentially immutable objects) and tuples (immutable arrays). Look up "JavaScript Records & Tuples Proposal" if you're interested.
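The proposal also allows passing a custom `equals` when creating a signal, which is one answer to the array question above. A hand-rolled sketch of the idea (my own toy `state`, not the actual polyfill):

```javascript
// A signal with a pluggable equality check: if equals() says the new
// value is the same, subscribers are not notified.
function state(value, { equals = Object.is } = {}) {
  const subscribers = new Set();
  return {
    get: () => value,
    set(next) {
      if (equals(value, next)) return; // no change, no notification
      value = next;
      subscribers.forEach(fn => fn(next));
    },
    subscribe: fn => subscribers.add(fn),
  };
}

// Shallow array equality, so [true, false] equals [true, false] by value.
const sameItems = (a, b) =>
  a.length === b.length && a.every((x, i) => Object.is(x, b[i]));

const flags = state([true, false], { equals: sameItems });
let notifications = 0;
flags.subscribe(() => notifications++);

flags.set([true, false]);  // new reference, same contents: skipped
flags.set([false, false]); // actually different: notifies
// notifications is 1
```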
I'm impressed to see that even "Bubble", which is a no-code tool, is working on this proposal together with Svelte, Solid, Angular, etc. to push this idea forward!
This looks like an excellent functional solution to private state. It's the same problem behind the advice OO languages always give: don't pick values off your data objects, pass the data objects around.
Question: how does isEven know that counter is its source? By that I mean, how does it know to recalculate itself when counter becomes dirty? In the definition I see no explicit declaration of counter as a source for isEven: `isEven = new Signal.Computed(() => !(counter.get() & 1))`
Thanks for covering this topic. This might finally replace those all too complicated state managers that one is bound to use just to keep data flow safe and smooth.
RXJS has everything it needs for the flexibility it requires for the complexity of the websites that use it. In most sites you will only use map, and mergeMap to transform the values inside of Observables, fairly easy ideas to grasp. But the core concept is very simple. Think of it like a Promise that can be triggered multiple times. It should have been added a long time ago, while leaving users/library creators to create operators for more complex uses.
RxJS is glitching by default, and that is terrible and leads to many bugs (at least in our production app it did), which are hard to debug and tedious to fix. Non-glitching signals are a *significant* improvement on Rx observables.
@@nUrnxvmhTEuU Can you please tell us more about that "glitching"? I've been using RxJS for ~5 years now and I've had many problems with it... but EVERY time it was my own mistake or a misunderstanding of some concept/operator. I must agree that debugging it is very hard, though... and the documentation is... you know. :D
I’ve worked in an environment where this type of acyclic graph dependency triggering behavior led to serious performance issues and unmaintainable spaghetti code where it’s so hard to debug due to having to figure out what triggered something when you’re 10 levels of indirection away from the trigger and a million sources to wade through.
This is great, and I can think of a clear analogy to node-based CG software, but on a much higher interaction level. In Gaea for example, when connecting noise and erosion nodes to generate terrain, there's a variation in compute time i.e. 1-20 seconds. So once a change is made, all downstream nodes are marked as dirty, and then only when browsing to that node will the new result be computed and displayed. If an upstream node has to be recomputed, it's done so automatically. This ensures that the minimum amount of computation is done to give the artist fresh results.
Push-then-pull looks like the mobile carriers' email service in the olden days: they had special protocols to push the info that something had happened, but when you actually wanted to view the message you had to pull it on demand (due to the costs back then). I don't think anyone outside of Japan really used it, but it was specced at least.
yep the rate this got adopted gave me pause. i did recently realize mithril.js has had a hacked-together unintentional construction of half of this (in the style of every program eventually growing a bug-riddled half-implementation of common lisp) since its first release 10 years ago (2014 march 17), so that makes me feel much more like it's truly on to something. (i'm also a former maintainer of mithril.js, so...)
7:04 Can't you just define doubled to be
```
let x = 2;
console.log(x);
let doubled = function () { return x * 2; };
console.log(doubled());
x = x + 4;
console.log(x);
console.log(doubled());
```
The only difference is when the value is computed, but there's probably things for that without using signals, like caching the value. Am I missing something? What do signals add that functions like the one I described above don't add?
Your solution wouldn't automatically update anything... You can get the latest value for x through doubled(), but how would you know when to rerun that function to get the new value? This is where signals become useful; they help solve this problem in a simple way.
@@baka_baca But the video said signals are only recomputed when they're used (if the dirty flag is set), so wouldn't you have the same issue with signals? What's the difference there between using doubled.get() for signals vs doubled() with my solution?
This is pretty fantastic. I hope you'll get us more updates if/when this progresses through the stages. Although I doubt it'll affect react, even if there's full browser support etc. But who knows.
A closure isn't reactive. Yes, if you reevaluate the function you'll get the new result, but there's no mechanism to tell that you _should_ reeval because x changed. And there's no automatic memoization, avoiding the recompute when it isn't needed (which wouldn't be an issue in your example but would be massive if the computation involved i/o, for example).
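In code, the memoization half of that difference looks something like this (hand-rolled illustration; in a real signal graph the dirty flag is set automatically by the sources, not by hand):

```javascript
// A computed caches its value and only recomputes when marked dirty;
// a plain closure recomputes on every single call.
let computeCount = 0;

function computed(fn) {
  let cached;
  let dirty = true;
  return {
    get() {
      if (dirty) { cached = fn(); dirty = false; }
      return cached;
    },
    markDirty() { dirty = true; }, // real signals do this via sources
  };
}

let x = 2;
const doubled = computed(() => { computeCount++; return x * 2; });

doubled.get();       // computes: 4
doubled.get();       // cached: no recompute
x = 6;
doubled.markDirty(); // a real signal graph flags this automatically
doubled.get();       // recomputes: 12
// computeCount is 2; a plain closure would have computed 3 times
```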
You can use the initial/current polyfill in production by using Angular :P Jokes aside, the current polyfill is based on the signal primitive that was written in Angular to be shared with WIZ.
Theo, your reaction to the warning "Do not use this in production", if mirrored by the majority of the Javascript development community, fully explains how the language has become such a dumpster fire.
I used signals to link up imperative side with react recently and it’s a dream (: it’s a little hard to follow them around but I predict we will get a tool that visualizes signal connections
It's nice that one of the best features from QML (Qt's alternative frontend language, a mixture of JS and JSON) will come to JS. Signals make stuff so easy.
I wonder if it is actually going to be composable with higher-order functions, letting you add your own abstractions on top of it, or if it is going to use some kind of chaining syntax that makes that really difficult, leaving you stuck working only at the lower level of abstraction.
Signals seemed cool until I used them in solidjs and did a production app. What I found out though is we had so many stale state issues and it really felt like working in backbonejs again with zombie views. This feels wayyyyyyyyy too overhyped. it's also created by people that did projects like mobx that nobody ever used either.
One thing I'm trying to wrap my brain around is using a const for those derived/computed signals. Feels weird, since you're indicating it probably will change at some point 🤔
To me it makes as much sense as this: `const foo = { bar: 'baz' }; foo['bar'] = 'foobar';` foo never changes here, as it always points to the same reference in memory. A signal might point to an address in memory, and the value at that address can change constantly without ever changing the address itself.
(Edit: I switched up signal and event in my understanding of this, so I mean events, not signals.) React is seriously missing good signals. If you have components that are not connected to each other, like in completely different branches of the component tree, and you need to pass data, you don't want the complete branch to be rerendered based on one small change. Thus there is a need for a communication possibility beyond your branch.
A nice and declarative way to optimise data trees, I like it! It probably still needs some iteration, but the concept sounds great. Something this doesn't seem to support is parameterised signals. I don't know _how_ you'd support that in a performant way, but it would be nice to be able to call something like `const a_or_b = (x, y) => a(x, y) || b(x, y)` performantly. Memoising something like that seems like a pain though, so I don't really want to open that can of worms, especially since I haven't yet run into a use case complex enough where performance is a major issue for chains of computations like that.
No, it's pretty simple: you just use an object or a tuple (array?), I guess. It's been a while since I used JavaScript, but in Rust I just use tuples and structs as the parameter type; it's just like having multiple parameters.
@@Luxalpa You mean a map keyed by arrays representing the provided arguments? That doesn't work, because Array isn't a value type, so two argument arrays will never compare as equal and you'll never be able to find anything in the map.
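One workaround, if you did want to memoise over multiple arguments: key a nested Map by each argument value instead of by an arguments array. A sketch (my own illustration, not anything from the proposal):

```javascript
// Arrays make bad cache keys (compared by identity, not by value),
// so walk a nested Map, one level per argument value.
const RESULT = Symbol("result"); // avoids colliding with argument keys

function memoize(fn) {
  const root = new Map();
  return (...args) => {
    let node = root;
    for (const arg of args) {
      if (!node.has(arg)) node.set(arg, new Map());
      node = node.get(arg);
    }
    if (!node.has(RESULT)) node.set(RESULT, fn(...args));
    return node.get(RESULT);
  };
}

let calls = 0;
const aOrB = memoize((x, y) => { calls++; return x || y; });

aOrB(0, "fallback"); // computes
aOrB(0, "fallback"); // cache hit: same argument values
aOrB(1, "fallback"); // different args, computes again
// calls is 2; a Map keyed by [0, "fallback"] arrays would never hit
```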
Interoperability would be SO GOOD for the web. I would have already switched to Solid if not for the fact that so many of the react libraries I use don't have a good equivalent in the Solid/Start ecosystem (e.g. clerk, trpc, react-pdf, react-markdown, sonner, etc)
That overview was really good. I always assumed signals were just RxJS sugar and were going to run into the same problems we had with Knockout.js; now I see how awesome signals are compared to that. Cool!
I do think that the web needs a standard way to deal with reactivity, but it must be done with caution: if the API is poor and not extensible, it can become something people choose to replace with their own rather than using the default. Btw, it's funny that I too have a UI/components library that uses an implementation of signals. If someone wants to see, it's lithen-fns on npm and @lithen/fns on jsr.
Fine... I'll continue my Haskell journey tonight. Seriously how many more iterations of JS will it take just to solve 1 issue of STATE MANAGEMENT and PERFORMANT RENDERING.. It makes me sick.
You know what else is coming to JS? Native Observables. We'll have RxJS in JS; the proposal is already on its way to being merged, and what's better is that it's the same creator of RxJS who made this possible. Another win for Angular.
Sounds quite bad. I remember working with rxjs in angular 2-4 and it was a lot of unnecessary and complex boilerplate. Happy that nobody tried to force rxjs into react.
@@HyuLilium jQuery lit the fire for browsers to support querySelector API. RxJS as a concept with streaming is awesome. The Proxy object was neat in what it allowed RxJS and other libraries to do with watching updates. The ability that some languages have with being able to watch and trigger updates when values change is intriguing and could be useful. It is crazy with how much JS is event-driven that something like RxJS isn't just native.
Imagine the implicit mutations that aren't trackable, where you can mutate a signal in a different file and it triggers a bunch of changes in many others.
As someone who predominantly uses C for my work and fun, I was a bit confused by the term `signal` here, as that word has a completely different meaning for me. As you went along I thought maybe it was actually message passing by another name, but this is quite frankly weird. It sounds like maybe it'll involve some sort of pointer with a reference-counting scheme, possibly some mutexes for locking, and perhaps something similar to copy-on-write to check for changes in values. Although I'm not sure how they'll be able to accomplish that efficiently, because it would seem to require tracking changes at multiple points. Perhaps it's just because I've written my own compilers before and am currently writing one for my own language, but I'm curious how this will actually be implemented. It really sounds like an attempt to shoehorn in a common compiled-language idiom for synchronization of data exchange, but with extra steps. However, the most surprising thing about this video is you saying you don't know the word `elide`. I'll assume that was a brain fart, because the article did misspell it in a few places.
Sounds really exciting. Having built a lot of React code with Redux for state management (a pattern I love, partly because I can separate out the complexity where it's easy to test), I also know there are some pitfalls, e.g. making sure you write your selectors in a way that doesn't cause unnecessary rerendering or expensive recomputations when the relevant part of the state didn't change. The latter often involves the use of `createSelector`, but that is frequently used incorrectly too. It appears that Signals will solve these issues, letting you create the state selection more declaratively without worrying about the pitfalls, thanks to the built-in memoization. Imagine if we could bind to a signal in the JSX code directly, so a signal update wouldn't need to trigger rerendering the entire component but could just imperatively set that attribute.
Web-Components plus signals will be lots of fun. Can't wait to teleport templates around. Though, in my opinion, the way we write Web-Components needs to improve. It feels cumbersome.
I was using my own weird signal-type implementation on a reactive UI that I had to write in vanilla JS. It's so weird to me that they didn't give examples of how signals can be used to build pretty much JSX/React itself: functions that return templates and, depending on the signal values, get rerendered when the results change. That's huge: an easy way to create reactive UIs like React in plain, non-compiled JavaScript.
What would make this truly awesome would be support for transactions, or more specifically the possibility to defer triggering dirtying/reevaluation/triggering of effects until a set of changes is complete (a transaction is committed). Especially if it properly works with nested transactions and configurable isolation. If that would be supported with good performance, it would make me want to go back to web development 🙂
When it comes to recomputing everything that depends on the given entity after it's changed... um, we already have this functionality in the form of React memo, right? If I'm interpreting this code correctly, signals do the exact same thing with a different syntax. Instead of including Foo, which is a state value, in the dependency array of memoized Bar, you turn both entities into signals, and use the "Computed" function to create a dependency between Foo and Bar. If this is how it works, then it isn't THAT different from what we already have. I'm probably misinterpreting it, though, since I've never dealt with the concept of signals.
Are signals actually a good idea? To me it seems like they temporarily make things a little easier but making debugging painful when something goes wrong. Similar to debugging C manual memory management where things could've gone wrong in a dozen places and you have to look at everything because you don't know where the bad signal came from.
The big win of Signals in the spec is not payload size or performance, it is interoperability. If this happens, it will be as impactful as Promises. It is not only about sharing code and state between frontend frameworks, but also replacing any kind of set+onchange interface out there, just like Promises replaced callback hell.
I feel like a baby because I have used Promises from day 1 and do not know a world without it. But I can see why signals would be as impactful
Its okay @csy897
Yeah looking at old axios code in large apps is just... Mind-blowing honestly 😂
> "Like Promises replaced callback hell"
Out of the frying pan and into the fire, e.g. async/await "two-color" function hell! 😢
@@gearboxworks to clarify, Promises are independent from async functions. But yeah, I feel you.
Javascript: The new javascript framework
we will eventually get there lol
Predictable and good
JavaScript is the new JavaScript
😮
Meta 😮
It's funny that we need the term "signals" when it's pretty much just a particular version of that good old observer pattern
Right? "Signals" is confusing the hell out of me because this is nothing to do with signals.
It's a signal because it's a two-way thing: push dirty flags towards consumers when root data changes, and push recompute requests towards the root when the new data is needed for consumption (and skip recomputation when inputs haven't changed). So it's a lazy, two-way observer pattern.
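A minimal sketch of that two-phase flow in plain JavaScript. This is hypothetical toy code, not the proposal's API, and dependencies are listed explicitly here for brevity, whereas real signal libraries auto-track them:

```javascript
function createState(value) {
  const subscribers = new Set();
  return {
    subscribers,
    get() { return value; },
    set(next) {
      value = next;
      subscribers.forEach((node) => node.markDirty()); // push phase: flag consumers
    },
  };
}

function createComputed(fn, sources) {
  let dirty = true;
  let cached;
  const node = {
    markDirty() { dirty = true; },
    get() {
      if (dirty) { // pull phase: recompute lazily, only when read
        cached = fn();
        dirty = false;
      }
      return cached;
    },
  };
  sources.forEach((s) => s.subscribers.add(node));
  return node;
}

const count = createState(1);
const double = createComputed(() => count.get() * 2, [count]);

count.set(5); // pushes a dirty flag; nothing is recomputed yet
console.log(double.get()); // 10, computed on demand
```

The key point is that `set()` does no computation at all; work only happens when a consumer actually pulls the value.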
@@W1ngSMC exactly
currently playing with LegendApp state and even syntax for basic stuff is almost identical
It's the front-end ecosystem. Everything must be named differently. Module? No, component. Decorator? No, Higher-Order Component. Middleware? No, Interceptor.
7:20 It's weird to me that no one in chat pointed out that the explanation of why this is valuable isn't complete. You could do the same with lambdas or function references that you call when you want to get the value, but the main difference with signals is the amount of computation: signals are only computed once per update, not recomputed every time you read the value, which is what's so powerful about them.
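The difference in computation cost can be shown with an evaluation counter. A plain getter recomputes on every read; a dirty-flag cache (a hypothetical minimal sketch of what computed signals do, not a real signals API) evaluates at most once per update:

```javascript
let evals = 0;
let x = 2;

// A plain getter: recomputes on every read.
const plain = () => { evals++; return x * x; };
plain(); plain(); plain();
console.log(evals); // 3

// A dirty-flag cache, the idea behind computed signals:
evals = 0;
let dirty = true;
let cached;
const square = () => {
  if (dirty) { evals++; cached = x * x; dirty = false; }
  return cached;
};
square(); square(); square();
console.log(evals); // 1: computed once, then served from cache
x = 3; dirty = true; // an update just marks it dirty
console.log(square(), evals); // 9 2
```

With many readers per update (the common case in UIs), the cached version does a fraction of the work.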
This explanation actually clarified something about signals for me. Thanks.
He literally said that shortly after.
@@tvujtatata Yes, it was in the article. It wouldn't have escaped you that I was talking about chat at a timestamp in the video, right?
The main benefit is in effects though. The effect will be re-run only if its sources change. If your effect is a React component, that means it will be rerendered only when necessary, without rerendering its parent or child components.
??? I mean, that is the point: it's not an event implementation, you ask for an update manually. This is exactly what you often do in all kinds of systems with lambdas or function references. In the end, similar to lambdas, it makes it extremely hard to find out where things are happening.
Before standardizing this, just build a common library with a feature set similar to this. A polyfill without patching global objects, if you will. Migrate all frameworks to use this library. Ship it. Make them use what this proposal intends to ship. See if it works for all of them in production, instead of just listing the frameworks in a proposal. We don't want to end up with an implementation that half of the listed frameworks effectively cannot use. Take your time iterating. Once done, let's see if there are still benefits to bringing it into the browser that justify adding something to the language that will stay there forever.
Signals have been around for a long damn time, and are demonstrably good for performance already. The reason to get it in the language is that then the DOM can use it and the implementation can be even more performant. It would be super nice to be able to assign a function to a label and it Just Works™.
The most useful features have come from jQuery, which was used basically everywhere. Nowadays we have a bunch of frameworks, each with its own set of features.
@@saiv46 exactly, signals are already in many frameworks and have proven their value. This just promotes them to the language level.
It looks like that's what they're doing, right?
What are you, old?
Have u seen all the blahblah.JS c0ntribootor nnicknames at allz?
I don’t hate Signals but putting things into the language this keenly is exactly how JavaScript ended up this way.
Having said that, the performance advantage of lowering this work to a language primitive (think: executed in C) would be huge. I would like to see what syntactic sugar, like what they did for async/await, might look like, because the library in the proposal is ugly to use.
i think it makes a lot of sense to put it in the language since almost all popular frameworks use it, but i do agree that we need to be careful!
I feel like we're getting closer and closer to "framework singularity" because most people agreed that signals are something worth implementing and using. Maybe at some point the difference between frameworks will come down only to syntax.
And some edge case performance optimizations, and framework interoperability... Probably
I think 15 years from now best features of every framework will be native and we'll just write vanilla JS.
State management is indeed core and central to every framework, but there's much more to it.
JS, and by extension TS, are the fastest moving languages in history. Love it or hate it, JS is the social programming language. That gives it the highest adoption, thus the most iterations/collaborators, thus the highest likelihood of becoming ubiquitous to the point of absorbing all competition. See: AssemblyScript (TS->Assembly); Hermes Static (TS->C); Frameworks are no exception. Resistance is futile. All will be assimilated.
It all returns to nothing...
Vue has had signals baked in for many years but with a nicer syntax. It was based on KnockoutJS that's ancient.
Yea, _very_ disappointing to see how Theo glosses over the OG and state of the art works cause it just ain't give clickz.
You can also define computed get and *set* abstractions in Vue. Very useful when binding.
They use proxy objects to attain this functionality
19:26 I'm not familiar with other signal implementations, but Preact signals have a peek() method that lets you read the value of the signal without subscribing to changes. This is similar to passing a function to setState() to use the current value, since you shouldn't be using the current value of "state" directly.
e.g.
signal.set(signal.peek() + 1)
vs
setState(prevState => prevState + 1)
You can do this in solid's signals too, through a couple of options, but it's a bit uglier IMHO.
It's untrack(() => ...) in Solid. Solid's signals are just two functions that you can call, not an object with methods like .get() .set()
@@reoseah Preact has untracked(() => ...) as well, but I haven't used it for basic state updates since its much more verbose
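A hedged sketch of how an untracked read like peek() can work: reads during an effect normally auto-subscribe that effect, while peek() skips the registration. This is toy code mirroring the idea, not Preact's or Solid's actual implementation:

```javascript
let activeEffect = null;

function signal(value) {
  const subs = new Set();
  return {
    get() {
      if (activeEffect) subs.add(activeEffect); // tracked read subscribes
      return value;
    },
    peek() { return value; }, // untracked read: no subscription
    set(next) {
      value = next;
      [...subs].forEach((run) => run());
    },
  };
}

function effect(fn) {
  const run = () => {
    activeEffect = run;
    try { fn(); } finally { activeEffect = null; }
  };
  run();
}

const count = signal(0);
const total = signal(100);
let runs = 0;
effect(() => { runs++; count.get(); total.peek(); });

total.set(200); // only peeked: effect does not rerun
count.set(1);   // tracked: effect reruns
console.log(runs); // 2
```

So peek() gives you the current value inside an effect without making that value a dependency, which is exactly why it works for `signal.set(signal.peek() + 1)` style updates.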
Something so convenient seems to go in the opposite direction to what the use of the subtle namespace is trying to imply.
So like Vue state (ref, computed, etc...) but in javascript itself?
Seems like it
No, it’s like React state (ref, memo, etc…) but in javascript itself.
@@fullfungo no
honestly it's ridiculous that this hasn't been a part of every language itself. I mean you have this code
x = 1
y = 2
z = x + y
print(z) // 3
then you assign
x = 8
but somehow you still get
print(z) // 3
which is a false statement of equality, since x + y = 8 + 2 = 10, and z should be 10
I mean this is an obvious problem, and it's the root of all element rendering problems, as I see it.
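The stale-z problem above is exactly what deriving instead of snapshotting fixes. Even without signals, expressing z as a function keeps it current; signals then add change notification and caching on top of this idea:

```javascript
let x = 1;
let y = 2;
const z = () => x + y; // derive the value, don't snapshot it

console.log(z()); // 3
x = 8;
console.log(z()); // 10, no stale value
```

The trade-off is that a bare function recomputes on every call and can't notify anyone; that's the gap computed signals fill.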
10:30 wasn't sold until this point. this is actually super useful (from an optimization standpoint).
CanJS has had this since 2013 but so few people know about the framework that all their innovations went largely unnoticed
Knockout js also had since 2010..
I used CanJS and also its predecessor, JavaScript MVC.
It's weird how we had all these great frameworks and they all disappeared and were replaced by React.
Also, what happened to Polymer framework???
IMO, Polymer was superior to React but very few companies used it at the time. I guess at least now there is Lit... But you don't really even need a framework at all nowadays. Web Components is powerful stuff.
@@Remindor Vue is actually the most popular one that was inspired by KnockoutJs but the video doesn't mention that somehow.
@@ziad_jkhan … Vue is _explicitly_ called out at 13:15 as one of many frameworks that have provided input on the proposed spec.
@@Remindor WC’s are super cool. Lit does at least purportedly help take some of the boilerplate out of writing WC’s. I haven’t used it yet but considering experimenting with it.
Meanwhile Haskell devs: "Look what they need to mimic a fraction of our power!"
At least JS and TS ship once in a while
And that’s the point: they want a minimal base that provides the most useful parts without forcing laziness as default.
Not just Haskell. It seems people are starting to understand that declarative is the future (and I'm 100% sure imperative will die in the long run, as the spaghetti nonsense it is), so instead of fully committing, some salad gets invented. Which is still progress, but still. As shown by the polyfill, it was doable all along :) I guess the "environment" (read: people) just needs to grow.
@@morphles isn’t that what the prolog people used to say?
@@ArneBab Only a couple of days later; they need to wait for their proof of concept to finish evaluating.
Looks just like Vue. But I'm happy for all of you who are stuck with React because you might finally get a proper reactivity system lol
bro redux selectors predate vue
@@klausburgersten yeah they probably do... whatever that is
Dude bro
@@vincentv8991 dude
Lol
A number of concepts here that are very vue.js. Would love to see this in core js
the concept of computed is the same as in Vue. As someone who has worked with Vue 2 since 2019, I can say these people are way behind Vue, and this is nothing too fancy or to be excited about.
@@MrDragos360 Vue has done a terrible job at marketing. I had a job interview recently where a senior frontend dev told me their company moved from Vue to React for, WAIT FOR IT, performance reasons 🤣
Evan You is also part of this proposal and tweeted about it recently.
getting signals into JS itself would help Vue performance-wise as well. Being able to share Vue ref state with other JS frameworks/apps is also nice. But yeah, Vue already has most (or all?) of this, and it's one of the reasons I use Vue when I get the choice.
You know I had to pinch myself because someone was proposing a change to the spec instead of just creating yet another framework.
I thought methods like these were forbidden and everything new had to be done via a new framework in JS land. These brave guys really challenging the status quo and I hope it works for them.
/s
I loved KnockoutJS and still have a fairly large codebase using it. I find it hilarious that the world is coming back round to what Knockout did in the first place.
Unfortunately I also have a huge codebase using RxJS and React, so swings and roundabouts.
Rob Eisenberg is the one who created DurandalJS a decade back, which extensively used KnockoutJS for reactivity. Definitely inspired by it.
I do think it's cool how in almost every video you talk about the people, what they did, and what they are doing, as part of the context for what you're talking about. Bringing us up to speed on their "reputation" if you will, beyond just github stars.
Of course the downside is that we're not evaluating things solely based on their own merits, but the world really doesn't work like that, does it?
We definitely need a more primitive-driven approach to web engineering application based on the main themes such as reactivity, rendering and networking.
Exciting proposal.
23:44
elide verb
/ɪˈlaɪd/
elide something - to leave out the sound of part of a word when you are pronouncing it
“The ‘t’ in ‘often’ may be elided.”
🤓
Yes, this! Thank you
7:20 I’m the one who asked that question, and I’m not convinced that the “implementation performance” is a good enough reason. I’d argue JS implementation of such a high-level concept is comparable to the browser native one. I don’t particularly see opportunities for optimisations that the browser sees, where JS doesn’t.
I could accept an argument that it is done for interoperability between signals libraries, but debatably this could’ve been achieved by the common lib/adapters if desired. Don’t think there is much demand for interop. For it to be there, the signals ecosystem has to mature a bit.
I might have missed the argument given by theo live. I was hella eepy at 3am (eu gang rise up). Might’ve been chatting in the chat at that moment.
You would be surprised by how much Signals have matured. This effort to standardize stems from the fact that all frameworks have derived pretty much the same Signals implementation independently, the push-pull tainted graph. They do differ on how effects run and other details and the proposal accounts for that.
And one point about interop: even if they all agree on a single base signal lib, you need a single instance of the lib running for it to work (since it relies on a single call-stack context to track subscriptions). This is challenging, as it requires npm/yarn/pnpm trickery with peerDependencies, bundler configuration, and/or manual setup.
My argument rests on interop not being desired enough feature. Don’t think it is. Can’t see many reasons why it would be. If it were, it is something that community can solve without tc39 standardization. Additionally, signals can be adapted between implementation, even if there is a perf penalty. I doubt that cross-signal-implementation calls are frequent enough to manifest perf issues.
As to other arguments I’ve encountered:
- Perf gains are likely insignificant
- Optimization opportunities are lacking
- Signals being in DOM doesn’t achieve much. Maybe observability of attributes. Can be done with MutationObserver (I’ll give you that it isn’t a particularly nice API)
Yea, it is a doomer-like mentality and the "ooooh, scared of change" thing, but maybe let's not bloat JS even larger than it is. Look at where that has led C++. But I guess either way, acceptance of the proposal won't really affect me that much. I'll gladly use JS signals if they are part of a library interface or they suit my needs (as opposed to reaching for a 3rd-party library).
You have every right to be excited by this addition. I am just not. Not sure it sits right with me.
There is a whole lot of optimization opportunities here. Implementing stuff like this in V8 or another JS runtime will drastically improve performance when you start to natively optimize dependency graphs and things like (de)serialization.
@@yapet the interop goes beyond frontend frameworks. Signals being a first-class citizen could replace every kind of set+subscribe interface. I find Promises to be a good parallel here. Assuming you have been in this space long enough, would you rather still be using Bluebird.js or Q today? Did you imagine what the ecosystem would have looked like with native Promises back then? Could you imagine a JS world without native Promises today?
So much about Signals reminds me of how Jotai is designed: the atomic state each signal represents, and derived state similar to Signal.Computed. However, Signals take it a step further by not re-computing the graph when the root node changes, but only when a specific node is consumed. Clever!
This is LITERALLY how KnockoutJS has worked since 2010. Variables are all basically functions you can subscribe to, including computed fields, which makes data-model binding to the UI just insanely easy.
I moved on to React, which meant a mindset shift to one-way binding; looks like I need to dust off those old parts of my brain once again.
LMAO, I know web components aren't ideal in an environment where you can use frameworks. That said, I build and use web components daily, they work extremely well in a template based language system like shopify liquid themes. I see them as little islands of logic 😄. Although, behind the scenes I'm using solidjs which makes the whole process much better xD.
I've actually built an extremely complex set of web components used to render a dynamic JS sidecart. The components handle rendering the data while giving you full control over styling the sidecart however you want. The best part is that because we use SolidJS signals for both the cart and the web components graph, we can ensure fine-grained updates within the sidecart's elements.
Sorry for the mini rant, I just think they work really well under the right circumstances 😄.
Exactly, I'm also very fond of web components and use them a lot in a lot of projects.
Also I've seen some libraries for React like the Mux Client SDK and LiveKit that use web components under the hood and simply wrap their components in connectors to easily distribute their components across React, Svelte and others without having to rewrite their entire product.
Ayyy solid!!
Thanks for pointing this out. Been following various signal based systems (including early elm and purescript) for a bunch of years. It's neat to see that they're tackling rendering timeliness, out-of-date-ness and effectfulness. It's easy to write a signal graph but very hard to get the consumption API right. Don't feel great about the amount of work left to the consumer in the blog post, but we'll see how it falls out. There's nothing worse than writing something nice in a reactive system, adding one more component or relationship and suddenly being without a paddle trying to track down a lost or retrograde ui update.
18:00 MVC is a powerful pattern for decoupling business logic from UI. It shouldn't be frowned upon. It's sad that 99% of React devs will just build UI that's strictly coupled with business logic.
There are so many state management libs that use MVC; I'm curious why he frowns upon it, but it is funny to see him react to things he feels averse to. I feel like the deeper you get into building stuff over a long period of time, the more you develop your own style and aversions from badly written, unperformant, or buggy code. Kind of like PTSD.
not doing MVC is the biggest reason JS FE (especially React but not only React) is such a hellscape. Signals won't fix anything, until this attitude changes.
Theo gives me big ego junior superstar vibes. I wouldn't be surprised if he had a lot of friction with his seniors at his previous jobs, and that a lot of his naive takes on certain concepts like MVC and separation of concerns is him projecting his frustrations with past colleagues forcing him to adhere to concepts he didn't fully understand at the time, and due to his large ego he's written them off as pointless, instead of learning their values. I think he's been stuck in that mindset ever since. I've seen this personality countless times, and I've been there too when I was younger.
@@csy897 Can you suggest an MVC library? New frontend juniors never learn MVC; they might only learn MVC for a server-side language through the traditional university route, and the new frontend route is React. That's basically what happened to me. I love writing server-side code, but my favourite is pure PWAs. It makes me scratch my head every project; I pick a different style depending on complexity.
@@kasper_573 He also feels like he's got a big ego from doing actual valuable work. I believe that if he meets a problem that is better solved with one of the patterns he doesn't like, I'm sure he's the type that will cave.
It's only human to have likes and dislikes, and therefore it is important to surround ourselves with different ideas. As long as we collectively are able to consider what is best for the application we're handling, I think our differences in inclination are what makes app dev fun.
2 way reactivity can be a challenge when you're first hit with the circumstance, especially on the frontend where you come across large data sets and performance bottlenecks, but frankly it's something I've only found difficult in some frontend framework implementations. Vue and solid implement it well but a native implementation is welcome. This will only negatively impact those who have rigid mental models of unidirectional binding
It's unfortunate that "a sink" and "async" are homophones.
Maybe an intended pun
Really really close. They sound different in my head but I can see the source of confusion in non native speakers
I say “a sink” as uh sink and “async” as ay sink
But usually the context in which the two will be used will be disambiguating. It’s probably only ambiguous without context.
yeah i woulda preferred consumer and producer but i'd also just call tree-shaking dead code removal :shrug:
MobX at the time was revolutionary. Bloated and quirky in early versions, but it still cut my app's complexity down to a third of what it was before.
Exactly, I always loved MobX, used it from first public versions
First of all, I've subscribed because I really enjoy your manner of delivering content. However, for the longest time, I was trying to figure out where I know you from and then it hit me. Has anyone ever told you you look like Lalo's (from Better Call Saul / Breaking Bad) healthier younger brother? I can't unsee it now!
Thank you for the content, your channel is awesome!
lalo lite
Two thoughts:
1. We already had Observable as a proposal, which is essentially this. That flopped, how will this be different?
2. DOM values should be assignable to signals without an effect function, such that the value is auto-subscribed to the signal value.
This seems like the refined version of observable. Smaller and more minimal api based on proven usage within libraries. Seeing key members of the libs all working together is awesome. Looking forward to react adopting this (better late than never!)
Super excited for this, but it's interesting that the rendering/effect aspect of signals hasn't been included in the proposal (see 14:17). While I get why, it does kind of make this proposal feel like an incomplete implementation rather than true "signals" the way we've come to understand them. I wonder if effects will be added at some point in the future?
I guess my main question here is: how does it know? I mean, without React-style [dependency, arrays]? I get that a new language primitive could have the compiler examine the passed-in function to see what's closed over and check whether any of those are Signals, but a polyfill could never do that, so how could it work? The only way I can think of is to just evaluate everything every time, and if the value hasn't changed, don't fire any of the events. But that won't really work for computeds like x => [x], because the new array won't equal the old one. I've had a look at the polyfill code and there is some tree management in there, but I didn't find anything that could really tell when something changes like this.
It works the same way it has in VueJS for at least 7 years, and it's surprisingly simple. You run the function in the initial setup, and each call to `get()` registers the signal with the function. So for example `effect(() => console.log(myvalue.get()))` would initially log the value, and the call to `get()` would set this lambda up to rerun the next time any call to `myvalue.set()` is made (more or less). When it reruns the function, it erases the tracking and registers all the `get()`s again; that's why things like if-conditions still work. However, there are ways to set up effects without running them first, in which case you would need to provide the signals as dependencies, just like in React's useEffect.
@@Luxalpa oh man that's clever 😄
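The registration trick described in this thread fits in a few lines. A hypothetical minimal version (re-tracking cleanup between runs is omitted for brevity, so conditional dependencies aren't handled here):

```javascript
let activeEffect = null;

function signal(value) {
  const subs = new Set();
  return {
    get() {
      // Any read performed while an effect is running subscribes that effect.
      if (activeEffect) subs.add(activeEffect);
      return value;
    },
    set(next) {
      value = next;
      [...subs].forEach((run) => run()); // rerun every dependent effect
    },
  };
}

function effect(fn) {
  const run = () => {
    activeEffect = run; // reads inside fn now register against `run`
    try { fn(); } finally { activeEffect = null; }
  };
  run(); // the initial run does the dependency discovery
}

const name = signal("world");
const log = [];
effect(() => log.push(`hello ${name.get()}`));
name.set("signals");
console.log(log); // ["hello world", "hello signals"]
```

No dependency arrays needed: the dependencies are simply whatever the effect happened to read on its last run.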
So signals work the same way as a spreadsheet.
Not sure if this is a good idea or not but if we need another data wrapper implementation I would prefer it to be native. I'm sick and tired of expanding proxy objects in the debugger to get the value of simple variables.
Please don't. There once was a time where we thought that observables were a good idea. They tried to bring it into JS. It went nowhere. In hindsight, good that it did not happen. Keep that in mind.
There are a lot of people much smarter and more accomplished than you (and me!) working on this. Not saying that automatically has merit, but we could at least see what they come up with before trying to dismiss it.
@@PraiseYeezus This doesn't help as an argument because one could have said that about observables at that time, too.
@@_nikeee "wait and see" isn't an argument for or against observables or signals or anything else. it's an argument against your dismissive statement.
Observables was shit. The current mutation observer is not great. If the implementation is like RxJS, then it could be good.
The problem is that it didn't solve the issue that programmers were having. They had an idea for the use case and it was great for that use case but it was terrible for what programmers wanted to use them for.
import { ref, computed } from "vue";
const counter = ref(0);
const isEven = computed(() => (counter.value & 1) == 0);
const parity = computed(() => isEven.value ? "even" : "odd");
but vanilla, and apparently fairly intelligent :D I need to see a good example of a server app, though.
All of the same seems achievable to me via RxJS. With RxJS you can push via subject.next and pull via observable.subscribe ... a UI component can subscribe/unsubscribe to an observable, and the computation it builds on only runs while that observable is subscribed to. With RxJS you also get teardown support and a bunch of powerful operators. Composability is amazing with RxJS. Unfortunately it's too complicated for people who are used to imperative code. Signals seem easier to use.
Technically you can achieve with Rx Subjects what Signals are trying to achieve, but I think you will have to do all the optimizations yourself: manage subscriptions between streams, use distinctUntilChanged and combineLatest, and avoid unnecessary computations by unsubscribing from upstream dependencies when a computed Subject doesn't have any subscribers.
@@andrei-ovi sure, and that's okay with me
Huh, didn't even know this work was happening. Very cool. I think what would go really nicely in hand here would be a "query" library for asking to "subscribe" to certain signals. Every signal registers their name to a global namespace, then you can "listen" to signals from throughout the network by querying that namespace and getting a signal in return.
If there's anything I want in a codebase, it's readability. I feel like signals go against this. There's nothing that tells you what updates when, and thus it's waaaay too magic.
Do a find all references on the variable. It's trivial.
Sideeffects all over the place…
It's very intuitive, once you get used to it.
Wow, looks like Knockout. There they were called Observables. I loved these functions; they are very intuitive. Great to see them as a tool in JS and thus in other frameworks.
Right! Anyway, subscribing to ko observables could cause cyclic deps. I wonder how they resolve that, or at least protect us from it.
I've been writing "reactive" code especially for TCP/RPC servers for almost a decade using libs like rxjs and kefirjs. Nice of them to finally make it a first class citizen :P
The one thing that worries me is lists. Quite often, I do pass around lists of things, and arrays are - in JS land - quite often *weird*.
If I use a signal, and my computed value is [...someSignal.value.map(a => !a)] (Inverting some booleans for whatever reason), when is it considered equal? Is it considered equal if the object reference is the same? But that might be dangerous, because then I can push to an array, and, well, nothing has changed.
This is a problem reactive libraries like mobx try to solve, but it's actually not trivial. I can foresee quite a few footguns when working with arrays in either direction.
Regular arrays would presumably only be compared referentially. There is another proposal for structural equality. It adds records (which are essentially immutable objects) and tuples (immutable arrays). Look up "JavaScript Records & Tuples Proposal" if you're interested.
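To make the footgun concrete, here's a tiny plain-JS sketch (no signals library needed) of why reference equality and mutable arrays interact badly; the same logic would apply to any signal that compares old and new values with `===`:

```javascript
// Arrays compare by reference, not by contents.
const a = [1, 2, 3];
const b = [1, 2, 3];
const sameReference = a === b;   // false: equal contents, different objects

a.push(4);
const stillSameObject = a === a; // true: contents changed, reference didn't

console.log(sameReference, stillSameObject); // false true
```

So a reference-comparing signal would treat a freshly mapped copy as "changed" even when the contents are identical, yet treat an in-place push as "unchanged".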
I'm impressed to see even "Bubble" that is a no-code tool working on this proposal together with Svelte, Solid, Angular and etc to push this idea forward!
This looks like an excellent functional solution to private state. It's the same problem that leads OO languages to the classic advice: don't pick values off your data objects, pass the data objects around.
Question: how does isEven know that counter is its source? By that I mean how does it know to recalculate itself when counter becomes dirty?
In the definition I see no explicit declaration of counter as a source for isEven:
isEven = new Signal.Computed(() => !(counter.get() & 1));
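Not a spec author, but the usual answer is auto-tracking: while a computed's callback runs, every `.get()` it touches registers that computed as a dependent. A toy sketch of the idea (my own simplified version, not the proposal's actual machinery — the real thing also handles nesting, cleanup, equality checks, etc.):

```javascript
// Toy auto-tracking sketch -- not the real Signal API, just the idea.
let currentComputation = null; // whichever computed is evaluating right now

function state(value) {
  const dependents = new Set();
  return {
    get() {
      // Reading inside a computed's callback registers that computed.
      if (currentComputation) dependents.add(currentComputation);
      return value;
    },
    set(v) {
      value = v;
      for (const d of dependents) d.dirty = true; // push dirty flags downstream
    },
  };
}

function computed(fn) {
  const c = {
    dirty: true,
    value: undefined,
    get() {
      if (c.dirty) {
        const prev = currentComputation;
        currentComputation = c; // everything fn reads now records c as a dependent
        c.value = fn();
        currentComputation = prev;
        c.dirty = false;
      }
      return c.value;
    },
  };
  return c;
}

const counter = state(0);
const isEven = computed(() => !(counter.get() & 1));
console.log(isEven.get()); // true
counter.set(1);
console.log(isEven.get()); // false
```

So counter learns about isEven at the moment `counter.get()` runs inside the callback — no explicit declaration needed.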
Thanks for covering this topic. This might finally replace those all too complicated state managers that one is bound to use just to keep data flow safe and smooth.
I did. I just use rxjs as a state manager
Signals is just a rebranding of reactive programming to make rxjs seem more friendly and approachable to devs.
how?
Rxjs is loaded with crap complexity, because it couldn't come up with something js-native. Signals fix that to some extent.
RxJS has all the flexibility it needs for the complexity of the websites that use it.
On most sites you will only use map and mergeMap to transform the values inside Observables, fairly easy ideas to grasp.
But the core concept is very simple. Think of it like a Promise that can be triggered multiple times. It should have been added a long time ago, while leaving users/library creators to create operators for more complex uses.
RxJS is glitching by default, and that is terrible and leads to many bugs (at least in our production app it did), which are hard to debug and tedious to fix. Non-glitching signals are a *significant* improvement on Rx observables.
@@nUrnxvmhTEuU can you please tell us more about that "glitching"? I've been using RxJS for ~5 years now and I've had many problems with it... but EVERY time it was my own mistake or a misunderstanding of some concept/operator.
But I must agree that debugging it is very hard.. And also the documentation is... you know. :D
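Not the OP, but "glitching" usually means the diamond problem: with eager push-based propagation, a combined observer can fire while only some of its upstream values have updated, briefly observing a state that never logically existed. A contrived plain-JS sketch of the effect (no Rx involved — plain eager callbacks stand in for subscriptions):

```javascript
// Two derived values (first word, total length) both depend on `name`.
// With eager propagation, a combined observer fires after EACH derived
// update, so it briefly sees a mixed old/new state that never existed.
const observed = [];
let name = "Bob";
let first = name.split(" ")[0]; // "Bob"
let length = name.length;       // 3

function report() {
  observed.push(`${first}/${length}`);
}

function setName(n) {
  name = n;
  first = name.split(" ")[0];
  report(); // GLITCH: `length` is still the stale 3 here
  length = name.length;
  report(); // consistent again
}

setName("Grace Hopper");
console.log(observed); // ["Grace/3", "Grace/12"] -- the first snapshot is bogus
```

With pull-based, non-glitching signals, the combined value is only computed on demand after all writes have settled, so the inconsistent "Grace/3" snapshot never exists.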
I’ve worked in an environment where this type of acyclic-graph dependency triggering led to serious performance issues and unmaintainable spaghetti code, where debugging was so hard because you have to figure out what triggered something when you're 10 levels of indirection away from the trigger, with a million sources to wade through.
This is great, and I can think of a clear analogy to node-based CG software, but on a much higher interaction level. In Gaea for example, when connecting noise and erosion nodes to generate terrain, there's a variation in compute time i.e. 1-20 seconds. So once a change is made, all downstream nodes are marked as dirty, and then only when browsing to that node will the new result be computed and displayed. If an upstream node has to be recomputed, it's done so automatically. This ensures that the minimum amount of computation is done to give the artist fresh results.
Fokcenk pipe operator, dear committee, pretty please.
Is it too much to ask? How much was it, 10 years?
Push-then-pull looks like the mobile carriers' email service in the olden days: they had special protocols to push the notification that something had happened, but when you actually wanted to view the message you had to pull it on demand (due to the costs back then)
don't think anyone outside of Japan really used it but it was specced at least
I always kind of figured this was the job of the Proxy object that JS already has and Vue uses as part of its reactivity. Still, I'll take it.
yup, redux also uses it. when I roll some vanilla state management solution, I usually go with a proxy.
After 2 years of adopting this: "this feature has been deprecated"
YUP YUP! The profiles of the proponents worry me... I don't see their future. I don't see the future of their work.
yep the rate this got adopted gave me pause. i did recently realize mithril.js has had a hacked-together unintentional construction of half of this (in the style of every program eventually growing a bug-riddled half-implementation of common lisp) since its first release 10 years ago (2014 march 17), so that makes me feel much more like it's truly on to something. (i'm also a former maintainer of mithril.js, so...)
7:04 Can't you just define doubled to be
```
let x = 2;
console.log(x);
let doubled = function () {
  return x * 2;
};
console.log(doubled());
x = x + 4;
console.log(x);
console.log(doubled());
```
The only difference is when the value is computed, but there are probably ways to handle that without signals, like caching the value.
Am I missing something? What do signals add that functions like the one I described above don't add?
Your solution wouldn't automatically update anything... You can get the latest value for x through doubled(), but how would you know when to rerun that function to get the new value? This is where signals become useful; they help solve this problem in a simple way
@@baka_baca But the video said signals are only recomputed when they're used (if the dirty flag is set), so wouldn't you have the same issue with signals? What's the difference there between using doubled.get() for signals vs doubled() with my solution?
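The difference shows up when reads outnumber writes: a plain `doubled()` re-runs the multiplication on every single call, while a computed signal caches the result and only recomputes after a dependency flagged it dirty. A hand-rolled sketch of that caching (my own illustration of the idea, not the proposal's API):

```javascript
// Hand-rolled "computed with a dirty flag": the multiplication runs at
// most once per change to x, no matter how many times you read it.
let computeCount = 0;
let x = 2;
let dirty = true; // flipped whenever x changes
let cached;

function setX(v) {
  x = v;
  dirty = true; // push: just a cheap flag, no recomputation yet
}

function doubled() {
  if (dirty) { // pull: recompute only if an input actually changed
    cached = x * 2;
    computeCount++;
    dirty = false;
  }
  return cached;
}

doubled();
doubled();
console.log(doubled(), computeCount); // 4 1 -- three reads, one computation
setX(5);
console.log(doubled(), computeCount); // 10 2
```

With a trivial `x * 2` this hardly matters, but when the computation is expensive (or fans out across a graph of dependents), computing once per update instead of once per read is the whole point.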
This is pretty fantastic. I hope you'll get us more updates if/when this progresses through the stages. Although I doubt it'll affect react, even if there's full browser support etc. But who knows.
6:53 what's wrong with this approach?
```
let x = 2;
const doubled = () => x * 2;
x = 4;
```
doubled is recomputed every time you call it, even when it doesn't need to be
A closure isn't reactive. Yes, if you reevaluate the function you'll get the new result, but there's no mechanism to tell that you _should_ reeval because x changed. And there's no automatic memoization, avoiding the recompute when it isn't needed (which wouldn't be an issue in your example but would be massive if the computation involved i/o, for example).
@@PeteC62 well we can implement caching if the computation gets expensive.
You can use the initial/current polyfill in production by using Angular :P - jokes aside, the current polyfill is based on the signal primitive that was written in Angular to be shared with WIZ
Theo, your reaction to the warning "Do not use this in production", if mirrored by the majority of the Javascript development community, fully explains how the language has become such a dumpster fire.
I used signals to link up the imperative side with React recently and it's a dream (: it's a little hard to follow them around, but I predict we will get a tool that visualizes signal connections
Having a sink that changes its value... isn't this like the computed refs in Vue?
Can anyone explain in two words how it builds the dependency graph from just a lambda function? It looks like magic to me
Definitely OVERHYPED MADNESS. thanks Theo.
This is such an interesting proposal, especially since I am currently working on my own form library that is built using the Preact team's Signals
It's nice that one of the best features from QML (Qt's alternative frontend language, a mixture of JS and JSON) will come to JS. Signals make stuff so easy.
I wonder if it is actually going to be composable with higher-order functions, letting you add your own abstractions on top of it, or if it is going to use some kind of chaining syntax that makes that really difficult, leaving you stuck working only at the lower level of abstraction.
Signals seemed cool until I used them in solidjs and did a production app. What I found out though is we had so many stale state issues and it really felt like working in backbonejs again with zombie views. This feels wayyyyyyyyy too overhyped. it's also created by people that did projects like mobx that nobody ever used either.
Reinventing getters and setters
That article was the cleanest explanation you can think of
One thing I'm trying to wrap my brain around is using a const for those derived/computed signals. Feels weird, since you're indicating it probably will change at some point 🤔
```
const foo = { bar: 'baz' };
foo['bar'] = 'foobar';
```
To me it makes as much sense as this. foo never changes here as it always points to the same reference in memory. A signal might point to an address in memory and the value at that address can change constantly without ever changing the address itself.
(Edit: I switched up Signal and event in my understanding of this, so i mean events, not signals.)
React is seriously missing good signals.
If you have components that are not connected to each other, like in completely different branches of the component tree, and you need to pass data, you don't want the complete branch to be rerendered based on one small change.
Thus there is a need to have a communication possibility beyond your branch.
5:53 isn't isEven sourcing parity?
A nice and declarative way to optimise data trees, I like it! It probably does still need some iteration, but the concept sounds great.
Something this doesn't seem to support is parameterised signals. I don't know _how_ you'd support that in a performant way, but it would be nice to be able to call something like `const a_or_b = (x, y) => a(x, y) || b(x, y)` in a performant way. Memoising something like that seems like a pain though, so I don't really want to open that can of worms, especially since I haven't yet run into a use case complex enough where performance is a major issue for chains of computations like that.
no, it's pretty simple, you just use an object or a tuple (array?) I guess. It's been a while since I used JavaScript, but in Rust I just use tuples and structs as the parameter type; it's just like having multiple parameters.
@@Luxalpa You mean a map of arrays representing the provided arguments? That doesn't work because Array isn't a value type, so they'll never compare as equal and you'll never be able to find anything in the map.
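One common workaround (just a sketch, not something the proposal provides) is to memoise per argument with nested Maps, so each argument is compared individually by SameValueZero instead of bundling them into an array key that would never compare equal:

```javascript
// Memoising a two-argument function without relying on array keys:
// an outer Map keyed by x holds an inner Map keyed by y.
function memo2(fn) {
  const cache = new Map(); // x -> Map(y -> result)
  return (x, y) => {
    let inner = cache.get(x);
    if (!inner) cache.set(x, (inner = new Map()));
    if (!inner.has(y)) inner.set(y, fn(x, y)); // compute once per (x, y) pair
    return inner.get(y);
  };
}

let calls = 0;
const aOrB = memo2((x, y) => { calls++; return x || y; });
console.log(aOrB(0, 3), aOrB(0, 3), calls); // 3 3 1 -- second call hit the cache
```

This only works for arguments that are primitives or stable object references; it doesn't solve deep structural equality, which is where the Records & Tuples proposal would help.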
As a non-native English speaker, I truly appreciate your sharing. It has been extremely helpful!
So, it's language support for knockout.js.
Finally!
Somebody on the design team likes knockoutJS 😂
Exactly. Pretty much the same thing.
Having worked on a KnockoutJS project myself for many years (2014 - 2018), this was exactly my thought 😄
One of the best things about signals is how it can completely simplify application state management.
I'm with you, it's exactly what's needed.
Interoperability would be SO GOOD for the web. I would have already switched to Solid if not for the fact that so many of the react libraries I use don't have a good equivalent in the Solid/Start ecosystem (e.g. clerk, trpc, react-pdf, react-markdown, sonner, etc)
That overview was really good. I always assumed signals were just rxjs sugar and were going to run into the same problems we had with knockout.js; now I see how awesome signals are compared to that. Cool!
I do think that the web needs a standard way to deal with reactivity, but it must be done with caution; if the API is poor and not extensible, it can become something people choose to replace with their own rather than using the default.
Btw, it is funny that I too have a UI/components library that uses an implementation of signals; if someone wants to see it, it's lithen-fns on npm and @lithen/fns on jsr
Fine... I'll continue my Haskell journey tonight. Seriously how many more iterations of JS will it take just to solve 1 issue of STATE MANAGEMENT and PERFORMANT RENDERING.. It makes me sick.
After the fiasco of the Observable proposal, I don't have a lot of confidence that this will ever land. TC39 is a mess.
You know what else is coming to js? Native observables. We'll have rxjs in js; the proposal is already on its way to being merged, and what's better is that the same creator of rxjs made this possible. Another win for Angular.
That is good news. I would use the hell out of RxJS if it was native.
Only another 9 years to go.
Sounds quite bad. I remember working with rxjs in angular 2-4 and it was a lot of unnecessary and complex boilerplate. Happy that nobody tried to force rxjs into react.
@@HyuLilium jQuery lit the fire for browsers to support querySelector API. RxJS as a concept with streaming is awesome. The Proxy object was neat in what it allowed RxJS and other libraries to do with watching updates.
The ability that some languages have with being able to watch and trigger updates when values change is intriguing and could be useful. It is crazy with how much JS is event-driven that something like RxJS isn't just native.
@@HyuLilium what boilerplate?
Imagine the implicit mutations that aren't trackable, where you can mutate a signal in a different file and it triggers a bunch of changes in many others.
As someone who predominately uses C for my work and fun, I was a bit confused at the term `signal` here as that word has a completely different meaning for me. As you went along I thought maybe it was actually message passing by another name, but this is quite frankly weird. It sounds like maybe it'll involve some sort of pointer with reference counting scheme, possibly some mutexes for locking and perhaps doing something similar to copy-on-write to check for changes in values. Although, I'm not sure how they'll be able to efficiently accomplish that because it would seem that it would require tracking changes at multiple points. Perhaps it's just because I've written my own compilers before and am currently writing one for my own language, but I'm curious as to how this will actually be implemented. It really sounds like an attempt to shoehorn in a common compiled language idiom for synchronization of data exchange but with extra steps. However, the most surprising thing about this video is you saying you don't know the word `elide`. I'll assume that was a brain fart because the article did misspell it in a few places.
Just seems like the Observer pattern, really.
Sounds really exciting. Having built a lot of React code with Redux for state management (a pattern I love, partly because I can separate out the complexity where it's easy to test), I also know that there are some pitfalls, e.g. making sure you write your selectors in a way that doesn't cause unnecessary rerendering, or expensive recomputations, when the relevant part of the state didn't change. The latter often involves the use of `createSelector`, but that is often also used incorrectly.
It appears that Signals will solve these issues, allowing you to create the state selection more declaratively without worrying about the pitfalls, due to the built-in memoization.
Imagine if we could bind to a signal directly in the JSX code, so a signal update wouldn't need to trigger rerendering the entire component, but could just imperatively set that attribute.
That is exactly what SolidJS is doing, check it out!
Check out legend-state
6:17 isn’t parity a sink of isEven??? The article mentions the exact opposite
So they want to bring Vue’s reactivity model to JS, sounds good to me 😅
Web-Components plus signals will be lots of fun. Can't wait to teleport templates around. Though, in my opinion, the way we write Web-Components needs to improve. It feels cumbersome.
Also looking forward to the pipe operator some day…
I was using my own weird signal-type implementation on a reactive UI that I had to write in vanilla JS. So weird to me that they didn't give examples of how it can be used to create pretty much JSX/React itself: functions that return templates and, depending on the signal values, get rerendered when the results change. That's a huge and easy way to create reactive UIs like React in plain, non-compiled JavaScript.
I don’t know about the actual implementation, but that sounds like vue.js
What would make this truly awesome would be support for transactions - or more specifically, the possibility to defer dirtying/reevaluation/triggering of effects until a set of changes is complete (a transaction is committed). Especially if it properly works with nested transactions and configurable isolation. If that were supported with good performance, it would make me want to go back to web development 🙂
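For what it's worth, the effect-deferral half of this can be approximated in userland today with a batch/commit wrapper. A toy sketch (my own, assuming effect scheduling is under your control — which, as far as I can tell, the proposal leaves to userland watchers anyway):

```javascript
// Toy batching: effects scheduled during a batch run once, at commit time,
// after all writes have happened. Nested batches flush only at the outermost
// commit, roughly like nested transactions joining their parent.
let batchDepth = 0;
const pending = new Set();

function batch(fn) {
  batchDepth++;
  try {
    fn();
  } finally {
    batchDepth--;
    if (batchDepth === 0) {
      const effects = [...pending];
      pending.clear();
      effects.forEach((e) => e()); // flush each distinct effect exactly once
    }
  }
}

function schedule(effect) {
  if (batchDepth > 0) pending.add(effect); // dedup within the transaction
  else effect();                           // outside a batch: run immediately
}

// Usage: three schedules of the same effect inside one batch -> one run.
let runs = 0;
const rerender = () => runs++;
batch(() => {
  schedule(rerender);
  schedule(rerender); // same effect object: deduped by the Set
  schedule(rerender);
});
console.log(runs); // 1
```

Configurable isolation is a much taller order, though — that would genuinely need support from the core dependency graph.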
The return of Knockout.js (2010) and Bacon.js (2012)
When it comes to recomputing everything that depends on the given entity after it's changed... um, we already have this functionality in the form of React memo, right? If I'm interpreting this code correctly, signals do the exact same thing with a different syntax. Instead of including Foo, which is a state value, in the dependency array of memoized Bar, you turn both entities into signals, and use the "Computed" function to create a dependency between Foo and Bar. If this is how it works, then it isn't THAT different from what we already have. I'm probably misinterpreting it, though, since I've never dealt with the concept of signals.
Are signals actually a good idea? To me it seems like they temporarily make things a little easier but making debugging painful when something goes wrong. Similar to debugging C manual memory management where things could've gone wrong in a dozen places and you have to look at everything because you don't know where the bad signal came from.
Soo... we already have this in Vue :D
Rob Eisenberg is awesome. DurandalJS and AureliaJS are his babies (along with the community) and are / were awesome frameworks.