@@angelmendez-rivera351 I generally dislike inaccurate or imprecise use of = myself. But here I was talking about what the common practice is, rather than what I think it should be.
How about using A⟞B for A−B and A⟝B for B−A? Then the inclusive differences can be A¬B=1+A−B and A⌐B=1+B−A. Why? Because then it fits with logic and probability: ¬B would mean 0¬B just as −B means 0−B, the probability of ¬p is exactly 1−p, and if you represent true and false as 1 and 0, NOT(x)=1−x.
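For what it's worth, the 1−x reading of negation is easy to check numerically. A minimal Python sketch (the function names are made up purely for illustration):

def logical_not(x):
    # Truth values encoded as 0/1: NOT is literally "subtract from 1".
    return 1 - x

def complement_probability(p):
    # Same pattern in probability: P(not A) = 1 - P(A).
    return 1.0 - p

assert logical_not(0) == 1 and logical_not(1) == 0
assert complement_probability(0.25) == 0.75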
This is really cool, and I like how the arrow is used as a binary operator. I might even start using it myself. However, I have to wonder how practical it is to use in more complicated expressions. That 3x3 determinant equation, for example. How do we pick what the arrow points to, and how do we arrange the terms? This is the same sort of issue 3b1b faced with the Triangle of Power. I’m curious what your ideas are for this, because I really want this notation to work
Part of what makes arithmetic in the current system convenient is that it’s easy to see that things being added and things being subtracted are “on opposing sides”. The unary operator looking the same as the binary operator makes this connection flow well in practice, and both work together with associativity without the need for parentheses.
Maybe a good way to keep both the intuition and the function is simply to replace the minus sign in its current usage with a left arrow, both binary and unary, and not put anything on top.
Great remark. Personally I wouldn't mind mixing the 2 notations (the existing one and the arrow). For example, if I want to move an object from one position to another, I might want to write something like 'move(x1 -> x2-5)'. The arrow in the middle is the "from-to" part of the motion. The minus symbol on the right tells me that I'm moving 5 units to the left of x2. The operators have a clearly different meaning even though they calculate the same way.
@@jakobr_ *...and both work together with associativity without the need for parentheses.* Not really. Subtraction is not an associative operation, and even with associativity, people still get confused with addition, since infix notation inherently requires some concept of order of operations, which suffix or prefix notation does not.
@@angelmendez-rivera351 It’s not associative, but it “works well with” it because we can very easily (in practice, on paper) interpret the binary operation as a combination of negation and addition, and addition is associative. This easy conversion makes computation simple and the path to the answer flexible.
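A tiny Python check of that conversion (purely illustrative numbers):

a, b, c = 10, 3, 4
# Rewrite a - b - c as a sum of negatives; the terms can now be regrouped freely.
assert (a - b) - c == a + (-b) + (-c) == (a + (-c)) + (-b) == 3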
14:15 - 14:21 As a physicist, I strongly disagree with this claim. I definitely do not think velocities are more central to physics than positions, and I do not know any physicist who thinks so. In fact, both are equally important, but neither is central to physics. Gauge fields are what is central to modern physics. All of modern physics is studied in terms of gauge fields and their symmetry groups (which is something that was stated earlier in this series, I believe).
One example I had in mind: The speed of light is a universal constant, and this single observation leads to the entire theory of relativity. There is no similar statement about a special "universal position" in space. So it seems that in relativity, velocity is more central than position.
@@AllAnglesMath You are correct: the speed of light, in the special theory of relativity, is a universal constant. However, this does not mean velocity is more central than position, since all four-velocities achievable by systems with mass are not constants, and they undergo the same Lorentz transformations that four-positions in spacetime undergo. You are also ignoring that there are various extensions of the theory, such as doubly special relativity, where the Planck length plays a role similar to that of the speed of light. Furthermore, if we attempt to quantize the special theory of relativity, to get an (incomplete) quantum theory of relativity, then we learn that the constant we call "the speed of light" is not truly a speed, but something even more fundamental, which pertains to the structure of spacetime and its interactions with massless systems. This becomes clearer when you delve into quantum field theory, which is the most empirically accurate framework we have in physics, even more so than the special theory of relativity. This is why I disagree with your claim. Your claim assumes we are still working with a theory which we have already improved upon decades ago.
Yes, commutative subtraction can be done with multisets of mark-antimark pairs which cancel each other out. A horizontal overbar over the symbol is a pain in ASCII. For various reasons I prefer the chiral symbol pair of relational operators < and >. One among many reasons is that this way we can connect with the Dyck language and then expand from the outward pairs < > of the Dyck restriction and also allow palindromic strings inwards: > and < form a pair which cancels out ("Del>
Also one might notice:
Subtraction:
0 - 0 = 0
0 - 1 = -1
1 - 0 = 1
1 - 1 = 0
Exclusive Or:
0 XOR 0 = 0
0 XOR 1 = 1
1 XOR 0 = 1
1 XOR 1 = 0
Subtraction is a difference operation, which is like an exclusive OR operation. Difference is like a measure of dissimilarity, where zero is "full similarity". One could interpret negatives as "opposite"; in this sense, "adding an opposite" is like measuring the difference of the same thing (which would have no difference): a + (-a) = a - a = 0
The XOR connective in Boolean logic is isomorphic to the addition operation in a group of order 2. As such, the analogy to subtraction is not appropriate. It only works as a matter of coincidence, because a group of order 2 is the only group where (-1) = 1.
@@angelmendez-rivera351 Interesting. I am not a mathematician so I don't really follow, but I've been using this way of thinking to implement novel learning algorithms, and it served its purpose. It's a shame the analogy doesn't work, because the algorithms do. Thanks for your explanation, even though I don't really follow.
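For anyone curious, the "coincidence" mentioned above is easy to check numerically. A quick Python sketch (an illustration, not a formal argument):

# Over {0, 1}, XOR agrees with both addition and subtraction mod 2,
# because every element there is its own additive inverse.
for a in (0, 1):
    for b in (0, 1):
        assert (a ^ b) == (a + b) % 2 == (a - b) % 2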
I wouldn't say the area of a wedge product is given by a determinant; they express the same thing, but determinants are so often taught as just some esoteric operation to memorize that using one as the foundation for a concept makes that concept less intuitive by proxy.
Actually, calculating the area of a parallelogram is exactly what the determinant is for. In linear algebra, it tells you how much areas are stretched by a linear transformation.
@@AllAnglesMath I'm not saying you're wrong--you are correct, after all. I'm saying that, for the uninitiated, defaulting to the determinant is unhelpful as a conceptual foundation, since very few people have had determinants properly or completely explained to them to begin with. As a side note: in higher dimensions than just the 2D case, with determinants representing scale factors for (hyper-)volumes rather than areas, the wedge product of two vectors is no longer calculated by a determinant unless you're very specific about how you choose which matrix to take the determinant of (you'd need to find the 2x2 matrix representing how the full nxn transformation affects points in the span of the two vectors to be wedged, which, while calculable, is hard to visualize or reason with).
In defense of the wedge product symbol, it's actually very related to the AND operator with which it shares its symbol. (This relation is much less obvious when viewing the wedge product as giving a plane segment from two line segments.) There's also a duality operation usually written as *, ★, or if you're a programmer, !. *a ∧ *b = *(a ∨ b) Wait... That's just De Morgan's law!
@@AllAnglesMath ¬a ∧ ¬b = ¬(a ∨ b), A' ∩ B' = (A ∪ B)', ★v⃗ ∧ ★u⃗ = ★(v⃗ ∨ u⃗). What is v⃗ ∧ u⃗?, Well, reading v⃗ and u⃗ as mirrors, v⃗ ∧ u⃗ is the subspace of points changed by neither reflection, also known as _their intersection!_ {p ∣ (p ∨ v⃗ = 0) ∧ (p ∨ u⃗ = 0)}. Note the ∧ used in this definition of ∧. There are also connections to traditional addition and subtraction. Saturating subtraction for booleans is the same as the set difference, but _modular_ subtraction for booleans is identical to addition. Saturating subtraction is the a
@@adiaphoros6842 As far as I currently understand it, the "and" & "or" symbols are used for the meet & join operations in GA (although I can never remember which is which).
@@AllAnglesMath I was really just going for deliberate abuse of notation, because I thought the directional interpretation was interesting and it was fun to break it. Hence why I then add the 0 in, to make it actually sensible
4:02 - 4:17 I wish you had not said this, since it is factually incorrect. In Boolean logic, the logical connective which corresponds to + is not OR, but XOR, what most people in English call "exclusive or," or more generally, "exclusive disjunction." In fact, the Boolean symbol for XOR is ⊕. This is very different from the OR connective. x ⊕ y := (x or y) and not(x and y), and here, the AND connective corresponds to multiplication. I think this premise alone constitutes a problem for the rest of your video.

For example, if we want to represent x - y as y -> x, then 0 -> 1 = 1, which denotes the fact 1 - 0 = 1, but one has 0 -> 0 = 1 in Boolean logic, even though 0 - 0 = 0. Similarly, 1 -> 0 = 0, but 0 - 1 = -1 = 1, and 1 -> 1 = 1, even though 1 - 1 = 0. Essentially, in almost none of the scenarios does the IF-THEN material implication from Boolean logic correspond to subtraction.

Furthermore, you said earlier in the video that the unary - operator, which is the additive inverse operator, corresponds to the logical connective NOT, but this cannot be the case, because NOT(1) = 0 and NOT(0) = 1, whereas -1 = 1 and -0 = 0 in a group of order 2. Otherwise, -1 is never equal to 0, and -0 = 0 still holds. The problem is, while the analogy you are trying to illustrate with the comparison is visually appealing, it is not conceptually sound. Mathematically speaking, the 'not' operator is a unary complementation operator in the category of Boolean lattices, whereas the - operator is a unary operator in the category of groups, and by extension, the category of rings. As such, we are talking about completely different and unrelated topics.
First of all: I really appreciate your thorough analysis and feedback. I went over your comments a few times to understand them better. I think the confusion stems from your assumption that I'm trying to use numbers as truth values. I am not claiming that 0 and 1 are both numbers and booleans at the same time, or that logical implication is like subtracting 1s and 0s from each other. I am also not trying to draw a formal isomorphism between addition modulo 2 and propositional logic. I'm just showing that there are similarities. Those run deeper than mere notation, but they're not so deep that they become formal identities.

The analogy between logical disjunction and addition is commonly made. They both obey a number of similar rules, after all. It's true that XOR also obeys many (all?) of those rules. I would say that XOR is more like addition modulo 2, while OR is more like addition proper. But again: don't think of it as adding the numbers 0 and 1. You mention the symbol for XOR, but there is also a convention to use '+' for logical disjunction. So that's an argument that can go both ways.

Either way, I find these deep technical discussions very engaging, so feel free to shoot!
Commutative means a+b = b+a. Anti-commutative means a+b = -(b+a), so you pick up a minus sign. Non-commutative simply means "not commutative". This can be anything where a+b does not equal b+a, including anti-commutative. I hope this clarifies it. Thanks for the question!
Geometric algebra is indeed the domain of math where anti-commutativity shines and sparkles. We have videos about that topic in the pipeline, though it will be a few months before we can publish them.
7:33 - 7:42 I seriously doubt it. The earliest operation we are ever introduced to as literal children is a commutative operation. The next operation introduced is not commutative. This causes teeth grinding for children, simply because it breaks the expectation which had been established for them of "operations are commutative." It has very little to do with the actual visual symbols used. If there is anything which does cause real confusion visually, then it is the usage of - as both a binary quasigroup operation, and as a unary operation. However, using -> does not solve this problem, as others have pointed out already, especially since the analogy you started the video with is completely broken anyway.
9:32 - 9:40 Yes. However, even in this framework, the lattice-ring distinction is still explicitly maintained, in a way which your analogy in the video does not agree with, since your video instead insists on an equivalence between lattices and rings, which does not work, given the counterexamples I gave earlier. This is why I think using -> to replace the symbol for subtraction is a terrible idea. I think essentially any other symbol (aside from trivial nonchoices like the addition symbol) is actually a better choice. The problem is, in your approach, 'not' is being made to correspond with additive inverses, and 'or' is being made to correspond with addition, neither of which is true. In the set of truth-values, {0, 1}, addition is such that every element is its own additive inverse. Therefore, addition and subtraction are the same operation.
It's nice how −+ and +− also look like arrows, and they point in the right direction. So a +− b = a ← b, and - b + a = b → a.
Lots of great ideas in the comments.
I was thinking the same, but I wouldn't have commented it.
Unless you get rid of the old minus of course, then there would be less of a problem.
I'll teach my children to subtract base six numbers with an arrow. You are a genius bro
And then they fail math class
@Tata-ps4gy Homemade playing is the best way to go~!
@memehamsterr Understanding the same number in different objects comes before learning notations.
If you're worried someone might end up linguistically alienated, it can be worthwhile to check how they're being taught, offer tips for improvement, or suggest ways to test their understanding.
The other person has already openly shown that they care about learning.
The Egyptian symbol for subtraction was asymmetric: it was a pair of legs walking away from the number: 𓂻 or 𓂽. It's similar to the arrows that you suggest. Egyptians usually wrote from right to left, in which case 𓂽 would mean "subtraction". But sometimes they wrote from left to right, and in that case 𓂻 would mean subtraction.
What an interesting idea! That the appearance of a symbol should share traits of its meaning! Once again, a unique video!
Thanks ;-) Very glad you liked it.
It's good to hear that I'm not the only person who is obsessed with mathematical notation. I found your arrow notation for subtraction really intuitive! I always found it a bit weird that the notation for the formula of a vector from A to B first mentions B and then A.
I agree. In some situations (but not all), it makes more sense to put A before B. Good to know that there are plenty of people who care about notation!
I would love to see a video from you about Eric Hehner's system!
Thanks for the feedback. We'll see what we can do.
I'd second that!
@@timovandrey ruclips.net/video/niqqm1DRTkE/видео.html He did!!
Yes, please do a video on the unified framework
I very likely will, maybe later this year. Stay tuned! Thanks for the feedback.
This is so good! I always struggled with remembering that the difference between two vectors is final minus initial. With subtraction having an arrow notation this would never have been an issue!
Isn't the name for a - b literally just their "difference"? It seems hard to believe that one can forget that the difference between two things is their difference.
Then again, "difference" also seems to imply an idea of symmetry and it might be more natural to think of |a - b| as the difference, which _is_ symmetric.
@@angeldude101 No. The difference is the magnitude of the subtraction. E.g. the difference of {3, 5} is 2. But only one of 5-3 and 3-5 is equal to 2.
@@angeldude101 Remembering the word "difference" still means that half the time you guess initial minus final instead of final minus initial.
to calculate the result you still need to do the regular procedure of flipping the signs and adding, I think
very nice! representing -a as (a -> 0) feels correct, as a and (a -> 0) cancelling out is greatly announced by the notation:
a + (a -> 0) = 0
also, for vectors one can also have the classic way of subtracting vectors purely with the notation:
A - B = A + (B -> 0)
-B = B -> 0 also gives notational insight on why -B flips the vector around.
also, a + (a -> b) = b looks a lot like (a and (a implies b)) => b, pretty cool :)
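If it helps, the whole idea fits in a one-line helper. A small Python sketch (the name "arrow" is just a placeholder):

def arrow(a, b):
    # (a -> b): what you must add to a in order to reach b.
    return b - a

a, b = 3, 10
assert a + arrow(a, b) == b    # a + (a -> b) = b
assert arrow(a, 0) == -a       # (a -> 0) is the negation of a
assert a + arrow(a, 0) == 0    # a + (a -> 0) = 0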
I already loved vector subtraction before and you made it even more appealing. Thanks!
Fantastic. This is the kind of meta math content that lets you see deeper into the fundamentals. I will definitely watch this again.
ah, a operator to designate Final - initial. I like this.
I would love to see a video on Eric Hehner's system
It will very likely happen, but not right away.
what if we used an arrow with a bar through it? then
( A -> B ) in general gives you the mathematical object that when applied to A, gives you B.
then you can put the operation that you are involving, in the middle. so subtraction would be arrow with a plus sign in the middle.
but if you did an arrow with a multiplication symbol in it, then ( A -x-> B ) gives you B/A = A^-1 * B
in geometry you could have the arrow operation give you objects that are not the same type as the original objects. you could have points P, Q, and have the arrow with a ‘v’ return the vector that takes P to Q.
P -v-> Q = u, then u(P) = P + u = Q.
you could have u -r-> v = R give you a rotor-scalar that takes the vector u to v. if you don't have complex numbers, then it doesn't make sense to talk about multiplying vectors and it doesn't make sense to talk of the 'inverse' of u, but you can say that R(u)=v. you could have u -f-> v which returns a unique reflection transformation that takes u to v, and you could do the same with two points,
P-f->Q giving a reflection transformation.
you could have this work with tuples as well, provided there is a unique object.
you could take two quadrilaterals of points (A,B,C,D) -p-> (F,G,H,J) = P and return a unique projection operation.
this also avoids looking as much like a logical implication symbol.
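Here's a rough Python sketch of that generalized "arrow with an operation in the middle" idea (the function name and the two supported operations are my own choices, just to make it concrete):

def arrow(a, b, op="+"):
    # (A -op-> B): the object that takes A to B under the given operation.
    if op == "+":
        return b - a      # A + result == B
    if op == "*":
        return b / a      # A * result == B, i.e. A^-1 * B for plain numbers
    raise ValueError("unsupported operation")

assert 2 + arrow(2, 9) == 9
assert 2 * arrow(2, 9, op="*") == 9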
I think we're actually better off without the associative property; we should represent subtracting a single number as the addition of an inverse.
This is a very creative and interesting alternative. It's flexible because it can be applied to so many examples. Cool!
The arrow notation clicks very well for vectors
Really like the concept! But I think that the simple arrow isn't a good operator to use (it's already used too much). Instead, I would prefer the -+ "minus plus" arrow operator that neatly reverses into the +- "plus minus" arrow. And what is really cool about it is that it (almost) doesn't change the original meaning of the binary plus (A+B) and unary minus (-B) operators:
A - B = A + -B = A +- B = B -+ A = B- + A
So we get that the binary + combines with the unary - to form +-, and then we can reverse it into -+, but when decomposing it again the unary - ends up on the right side of the symbol, B- (so maybe it would imply that we can put the unary minus on both sides? It would only be (very) confusing if we kept the binary -. Probably best not to use B- on its own, but B-+A is ok).
For the chaining, the best way would probably then be:
A +- B +- C = A + -B + -C = A - B - C
C -+ B -+ A = C- + B- + A = A - B - C (is consistent)
It is the way that makes the most sense considering the decomposition, but it would be different from logic, where C => B => A = (C => B) ∧ (B => A) and gives C => A (but then the B doesn't contribute anything to the arithmetic operation). But if we consider:
A +- B -+ C = A + -B- +C = A + B + C ???
Whatever the case, the chaining must only go in one direction, just like the regular binary minus forces you to.
So maybe just use the unary left minus -B (and no binary minus), and combine it with a binary plus if that makes sense? (A + -B -> A +- B and -B + A -> B -+ A)
edit note: apparently -text- -makes strike-through text- , zero-width space to the rescue.
This is a really cool alternative idea. I like the way the unary minus sign shows up on the right of its input. That's a real novelty. If you think of the '+' part of your operator as a unary plus, it also shows up on the right sometimes. Very consistent and elegant, I like it!
@@AllAnglesMath Which you should! Its function is symmetric to the function of "-" in this context.
I'm stealing this amazing notation
I'm using the relational operators as arrow heads. I think it's a semantically and foundationally coherent way. See my comment for more details. :)
Using an arrow for subtraction is just so obvious; now that I've seen this notation I will try to use it for vectors. Great video, this should be shown in schools ❤
17:01 fundamental theorem of calculus on a curve from a to b
Yes, exactly. The endpoints of the interval have a minus and a plus on them.
This generalizes very nicely to higher dimensions too, as I hinted at in this video: ruclips.net/video/6ywt_rhxJfY/видео.html . You walk around the boundary of a shape in a specific orientation. The endpoints of an interval are its "boundary", and the opposite signs provide the orientation.
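In symbols, for an interval [a, b] (a one-line LaTeX restatement of the same point, nothing new):

$\int_a^b f'(x)\,dx = f(b) - f(a) = (+1)\,f(b) + (-1)\,f(a)$

with the +1 at b and the -1 at a recording the orientation of the boundary.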
I love those videos changing my perspective!
That's great to hear.
Why not take from logic again and use ¬?
The single arrow is already used everywhere in math, possibly more than the minus sign itself if you take notation such as "f: A -> B" into account.
The main downside of my proposal would be that you can't really flip this symbol when typing, but you can't really flip a minus sign either, so it would still be a net legibility improvement with very minimal adjusting.
The macro in LaTeX for the symbol is also pretty short: \neg
The symbol also kind of looks like an arrow anyway
⌐ is ¬ flipped.
Well put. Thank you!
honey wake up new all angles video
Another Dope Submission! 👏
I was so convinced you had the arrow pointing the wrong way, I almost complained about it. The T-shirt example really worked for me to understand that it's subtraction that is unintuitive, and this is largely because the operands are actually on the wrong sides of the operator.
I sympathize. While producing the video I also caught myself thinking several times: "oh no, this is in the wrong direction." 😆
Here’s a vote for the Hehner video! I’d love to see that.
I’d love to see a video about Eric Hehner’s framework!
Probably somewhere early next year. Stay tuned 😉
you convinced me bro they should change it. every math class is taught with this from now on
I really like this idea and especially the symbol you chose, but I think there could be some confusion with respect to "chaining" the operation, in the sense that a -> b -> c is not c - b - a, but c - (b - a). I guess it's just a matter of getting used to it though
I think you are right. It needs a symbol that doesn't change the order of the symbols in a chain of operations: x*a-b*y... is easier to read than x*b[new minus sign]a*y with the same meaning. ("*" stands for other algebraic operations.)
That's a really good point. Associativity can work differently with different symbols. I hadn't considered that.
But that's only an issue if you need to convert between the two notations.
This is also a problem because, to me, a -> b -> c should, just by looking at it, be the overall path from a to c through b. But you actually can't get that from just subtraction. The path from a to c through b is a->b + b->c, which simplifies to a->c.
In logic it's different. When you have a chain of logical implications, a implies b implies c, there's an implied "and" there, and if you know a is true then you do get that b is true, so c is true. This is also worrying to me because "or" is a better replacement for addition than "and", but here we have "and" between the arrows.
@@adamkapilow That's a great analysis. The expression "a->b + b->c" has 2 versions of b in it, with opposite signs. That's why they drop out from the middle. Nice!
By the way, the implied "and" in logic is equivalent to addition in arithmetic. So we might be able to get away with the chained arrow notation as long as we agree that it contains an implied "plus".
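Spelled out (reading a -> b as b − a), the chained sum telescopes:

$(a \to b) + (b \to c) = (b - a) + (c - b) = c - a = (a \to c)$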
11:43 - I'm going to be honest, I liked that backward arrow more than the forward arrow; it allows you to write the terms in the same order.
True. You can insist on always writing it right-to-left and this makes it much easier to adopt.
@@AllAnglesMath it's fair, if I had not already decided to read "to A from B" when I see subtraction with vectors xD
ps.: that was decided when I also decided to see everything as arrows...
Oooo that is a lovely idea. I do enjoy little notational tweaks, not because they are revolutionary but just because of how they give an insight into how we think.
I really thought you would cover adding by ones complement as a way to subtract binary strings! It's exactly what you said by using the inverse to make commutativity more evident but in a more concrete setting
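For anyone who wants to see that trick concretely, here is a small Python sketch of ones'-complement subtraction (the 8-bit width and the function name are arbitrary choices for illustration):

def ones_complement_sub(a, b, bits=8):
    # Subtract b from a by adding the bitwise (ones') complement of b,
    # then folding the carry back in ("end-around carry").
    mask = (1 << bits) - 1
    total = a + ((~b) & mask)
    total = (total & mask) + (total >> bits)
    return total & mask

assert ones_complement_sub(5, 3) == 2            # 5 - 3
assert ones_complement_sub(3, 5) == 0b11111101   # the ones'-complement pattern for -2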
I love good notation
I enjoyed this. I think it demonstrated your concept clearly.
Thank you so much!
The arrow operator normally represents a _function_ from something to something else. I guess you could use it to represent the difference as a curried function, but the problem is it doesn't clarify which operator the function is using, so the expression is kind of underspecified. Does 2->4 represent addition? Multiplication? Exponentiation? All of them are perfectly valid, so limiting it to only a single operator kind of breaks existing notation.
Great video!
What symbol would you use for quaternion multiplication, which is neither commutative nor anticommutative?
I have no idea 🤔
Maybe just use no symbol at all, like the geometric product in GA?
Wow, I have seen a lot of videos with wonderful new concepts but this one gave me a wonderful new perspective on an old concept! Great content 🎉
Great take on the arising of invariance from anticommutativity! You got yourself a new subscriber.
I would suggest that getting rid of the subtraction operation in favor of adding negatives is merely a trick. The results may be the same but the operations are different.
In particular, subtracting is essential to planning operations. If I am in position B and I want to get to A, then what I lack is the subtraction B -> A.
So even if it is easier, IMO, to teach kids to use the bar notation (and perhaps that should be a first step), it could hamper the formation of a deeper, more geometrical intuition of the operation.
Excellent points.
So that's why in Subleq the order is opposite, we write A B if we want to do [B] := [B] - [A]... If we rewrite it A->B, then it becomes easier to understand: we set [B] to be the difference, it even shows where we put the new value! Thanks for the video!
I never heard about subleq before. Interesting.
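For readers who haven't met it: Subleq is a one-instruction machine, and a single step works roughly like this Python sketch (memory layout and function name simplified for illustration):

def subleq_step(mem, pc):
    # One instruction: subtract [A] from [B], then branch to C if the result <= 0.
    a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
    mem[b] -= mem[a]                      # [B] := [B] - [A], i.e. A -> B in arrow notation
    return c if mem[b] <= 0 else pc + 3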
Very cool video!
But I've never seen the overline negation notation; every paper/lecturer I can remember used the \neg sign (this sign: ¬).
And for future videos I suggest a good rule of thumb: the corresponding LaTeX symbol (or the most intuitive LaTeX macro) is with high probability the most popular form.
I've seen the overline notation used by Norman Wildberger, e.g. here: ruclips.net/video/lqH4BLHGsFw/видео.html
I was taught the overline negation notation in (high?) school. (germany)
@@Tumbolisu Wow, I had no idea that it was actually being taught. That's cool.
@@koenvandamme9409 To be fair, I haven't seen it again since I entered university. Everything is written in more "international" (english) notation, including using dots for decimals despite germany using commas for that instead.
@@koenvandamme9409Norman Wildberger has a penchant for using extravagant notation no other mathematician uses, so, while it is great to provide examples, everyone should keep in mind nothing he does or writes is representative of common practices among mathematicians.
I never flipped the vector, then moved it head to tail, and then walked both vectors to get the result. In the case of A-B I simply walked B backwards and then forward to A. The B -> A notation already kind of exists, denoting the vector from B to A, but the arrow is usually written on top, above both letters. Denoting non-commutative operations with asymmetric operators is already kind of an idea that is around, but many operators do not follow this rule. Connecting subtraction with invariants is the new thing I was introduced to here.
3:48 logical negation is written with a subtraction sign but with a little perpendicular bit at the bottom right placed in front of the statement.
I believe that the bar notation has been used for negatives historically, but I’ve never seen it used for logical negation before.
Yeah, I've typically seen logical negation as the "¬" symbol, and furthermore, the horizontal bar symbol above a character already usually means the complex conjugate to my knowledge.
It’s used in probability theory sometimes if you want to denote the complement of an event which is funny because the complement of an event is the whole probability space minus that event. So it’s related to the set difference in this case.
@@jorex6816 I also haven’t seen the complement denoted by a bar. I’ve only seen it as a superscript c or X\A. How often would you say you see this?
@@biblebot3947 I will say I have also seen the bar above for negation used frequently. I have a degree in Statistics.
@@jorex6816 Set difference is defined in terms of logical negation, though.
Please do! I'd love to hear about Eric Hehner's work!!!!
On the logic and invariance front: the truth of p->q does not depend on prior belief, because two honest persons can disagree on q, but they should agree on whether q logically follows from p.
I found this notation useful. I recently saw a video on a simple algorithm for aimbotting in FPS games. In there, we assume we can check the memory for the position of the player and the position of the opponent, and we need to find the vector from the player to the opponent. With this video, the arrow notation is super intuitive and explains why we get that vector from the difference between the position of the opponent and the position of the player. Otherwise it's hard to think about what subtracting a position from another position in a frame of reference gives us. Thank you for this "new" perspective.
Yeah, I understand exactly what you mean. When programming the animations for our videos, I also often think how weird it is that I have to move an object *from* A *to* B by subtracting B-A.
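Both examples in this exchange boil down to the same one-liner. A tiny Python sketch (the names and sample values are mine):

def displacement(start, target):
    # In arrow notation this is (start -> target): what to add to start to reach target.
    return target - start

player, opponent = 4.0, 10.0
aim_direction = displacement(player, opponent)   # points from the player toward the opponent
assert player + aim_direction == opponent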
wow, this is a new addition to interesting math videos, and it's about subtraction!
Interesting video. I definitely think it would be nice if we could use anti-commutative notation for anti-commutative operations. Being a logician, I especially like how the logical notation carries over to the arithmetic. Unfortunately, such a programme won't work out in all cases, because we mathematicians like to be able to generalise. Presumably we'd want a common notation for things like groups, and yet some groups are anti-commutative, and others are commutative.
Actually I think it would be even better in that situation, because the arrow notation can be used to distinguish between left subtraction and right subtraction, which has always been very fiddly with non-commutative groups. Better yet, directional subtraction symbols are compatible with cancellative monoids, where the concept of "negation" is incoherent, despite cancellation being well defined. The symmetrical notation for subtraction has been especially frustrating for me, since I work with ordinal numbers a lot, and ordinal numbers *only* have left subtraction. There's no accepted way to notate ordinal subtraction, because the usual subtraction symbol is assumed to be right-subtraction, which is invalid for ordinals; very inconvenient.
Another comment I saw used a modified arrow notation, conjoining "-+" into a cross-like arrow for left subtraction, and likewise "+-" for right subtraction. Incidentally, the right subtraction notation is compatible with the usual notation for right subtraction, where A-B = A+-B. In the same way, the two conventions combined is also roughly equivalent to making - no longer represent subtraction, and instead exclusively represent negation, while allowing it to be both a prefix and a postfix unary operation.
incredible video!
Nice video! I honestly thought you would go down the -1 = i^(2n+2) route and wipe out the symbol that way. Your approach with the arrow fits incredibly well with vector calculus and tensor calculus and most importantly, geometric algebra.
It's great to see that geometric algebra is gaining traction. We are planning a series of videos on that subject, so stay tuned 😉
Wouldn't that be a circular reference? `i` is defined as the solution to `x^2 = -1` (AKA √-1). The proper way to define `-1` is as the solution to `x+1 = 0`, which we can now call "the negative unity".
Wow, I will never write the standard minus sign again.
Yes you will.
@@HoSza1 Didn't ask tho.
@@arseniikaurov1686 Wow that escalated quickly. Ahahahahaha.
@@HoSza1 Yeah, I'm kinda straight with my ideas.
NEVER SAY NEVERRRRRRRRRRR…
Well, maybe sometimes.
The point about velocities being more common than position reminds me of the preference for cohomology over homology. It has an antisymmetry too with pullback as a contravariant op.
I will have to take your word for it, since I know nothing about (co)homology. Is there an easy to follow explanation that you can recommend?
And also functions (movement) over objects (points).
This is very interesting as a teacher
2 -> 10 = 8, feels so natural
Wow, I am definitely using this symbol for my conlang’s mathematics
You're developing a conlang? Cool! If you have some videos about it, feel free to share.
Adopt the logic-style bar over a value for negatives and nothing for positives, put a mini summation sigma between them, and chain however you like.
It almost creates a ledger-type division: positive values here, negative values there, total the difference, done.
I'm not sure I understand why the implication arrow from logic is connected to the subtraction symbol in arithmetic. And why did you choose "or opposite" to correspond to "add negative" instead of "and opposite"?
The connection between subtraction and implication is merely based on the way they're defined in terms of opposites.
The "or" operator has much in common with addition, while the "and" operation has more in common with multiplication.
@@AllAnglesMath The OR connective does *not* correspond to the addition operator, though. However, the XOR connective does.
Your solution makes perfect sense. However, I think using the arrow will be resisted by many. Perhaps a horizontal line with a less-than or greater-than sign ( -- ) would be more palatable. It works well with your description of acting along the vector from A to B, as in A ->- B.
If I ever make a non-esoteric prog-lang, I'll definitely implement that notation like this: `->` & `<-`.
Cool! I'm also contemplating adding the arrow operators to a programming language at some point. That's how I got the idea in the first place.
Incredible video!
Thanks! Glad you liked it.
Would we need a new symbol for ±?
Very interesting video
Never thought of it, great work sir ❤
What you are doing with the minus sign is exactly how computers do arithmetic.
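Indeed: a two's-complement machine negates by "flip the bits and add one" and subtracts by adding that negative. A minimal Python sketch (8-bit width chosen arbitrarily, names are illustrative):

BITS = 8
MASK = (1 << BITS) - 1

def negate(x):
    # Two's-complement negation: flip the bits and add one.
    return (~x + 1) & MASK

def subtract(a, b):
    # a - b computed as a + (-b), which is all the hardware really does.
    return (a + negate(b)) & MASK

assert subtract(7, 3) == 4
assert subtract(3, 7) == (-4) & MASK   # the 8-bit pattern for -4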
I didn't realize the bar-over-the-character notation for inversion was from logic. I know it from physics: the "anti" particle, or a vector with its direction reversed.
Minus is just easy to write, I always hated writing arrows during my physics class
The natural next step in this argument would be to go into Control Theory / Negative Feedback / Stability.
I had a stroke when you introduced the fact that -> signifies going from B to A. I'm just like: wtf, why is it inverted? I mean, I know why it's inverted, but that doesn't feel intu-... oh shit... Wait, this is the same line of reasoning as vector subtraction. This is literally vector subtraction. Turns out subtraction is unintuitive, not the other way around.
Yup, I agree.
Sorry for giving you a stroke
😄
So with the arrow notation you introduced another subtraction that goes from right to left. That can lead to problems, though, when you look at a term with arrows in different directions, like 5 −+ 2 +− 7. There is not a single path defined. Would that mean that you would first subtract 2 from 5 and then subtract the result, 3, from 7?
Good point. The suggested notation definitely isn't perfect.
Can you make the video regarding Eric Hehner's work
I'm quite sure it will happen. Plenty of people have already asked for it. It may take some time though, so stay tuned.
The problem you need to solve, and which I need to double-check every time, is: is this difference inclusive or exclusive?
e.g. In an array, a region begins at index 2 and ends at index 4; how many elements does it have? Easy: 4-2 = 2... Wrong! It's 3 elements: 2, 3, 4.
Another e.g.: In a row of cookies, the cookies 3 to 5 need padding on either side to grow more. How many paddings do you add? Easy: 5-3 = 2... Wrong! It's 4 paddings: before 3, 3-4, 4-5, after 5.
These are just pure pain to deal with when using just '-'. Your arrow is better, but it could be upgraded to something like { ->, |->, ->|, |->| } to handle all cases, but I fear the consequences of 4 subtraction operators :))
Yeah, that's another really tricky one that I also often get wrong. No idea how to cleanly solve it though.
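One way to see how the flavours differ is to give each its own helper; the names and the exact mapping to the four proposed arrows are my own guess at the intent:

```python
# Counting helpers for the different "difference" flavours.
def count_open(a, b):        # endpoints excluded
    return b - a - 1

def count_half(a, b):        # one endpoint included (the plain minus sign)
    return b - a

def count_closed(a, b):      # both endpoints included
    return b - a + 1

print(count_closed(2, 4))        # 3: the array example, indices 2, 3, 4
print(count_closed(3, 5) + 1)    # 4: the cookie example, one more gap than cookies
```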
Oh man, oh man, when I saw that (not p or q) is like p - q, that made me go crazy and laugh out loud! Thank you so much. It's been 5 years of not knowing why the hell (not p or q) is what it is, and man, I tried so so so so hard to understand it. THANKS A LOOOOT
Now my task is to figure out whether this has something to do with abelian groups.
This one was so interesting that I didn't feel the 18 minutes go by 😅 Curious: where did you first hear about / think of a new minus symbol yourself?
That's a great question. I can't remember the actual moment, but this idea has been with me for a long time. As a SW developer and animator, I sometimes have to move an object from one position to another. This has always felt "backwards" to me because I have to specify the target position *before* the initial position. It always made much more sense to me the other way around.
Mathematics, or in this case algebra, is a programming language. A very old one, and that's why we don't get any revisions anymore. Just like a new version of, say, Python or C++ adds new language features, it would be nice if some kind of math body would add things like the ones shown in this video every now and then.
Well shit, I'm now convinced that we should use an arrow for subtraction.
I absolutely love your philosophy on this, recently I was trying to come up with a nonsymmetric divisibility symbol to replace "|".
I don't think your arrow is a very good choice because arrows like this already have a definition (e.g. in function definitions and limits). What about an arrow with only the top part? (Like, take the arrow and rub off the pointy bit below the line). Now this symbol is new and unused, still nonsymmetric, can be drawn with a single pen stroke, and is closer to the original subtraction sign because it only adds one additional line
Good idea. It's just that kids learn subtraction and not propositional logic (they should tho, it's my fav branch of math)
There's also a problem in logic: IIUC it's quite common for basic math operators and logical operators to appear in the same expression or in nearby expressions, and the single right arrow is a widely used, maybe the most widely used, symbol for "if" (i.e. material implication). There's also at least one variant right arrow used for non-material implication in non-classical logics (⥽, the 'fish-hook'), so I suspect that people would generally want a non-standard right arrow to be used to signify a variant form of implication in a logical context.
Then the single left arrow has other meanings too: for example in CS it's often used to refer to mutating a variable, when abusing = feels too grubby.
@@leocomerford I do not find the "abusing notation feels too grubby" argument to be compelling at all. The trivial solution to the problem is to not abuse notation.
@@angelmendez-rivera351 I generally dislike inaccurate or imprecise use of = myself. But here I was talking about what the common practice is, rather than what I think it should be.
I would also like to point out that the negative on top notation is already used in math with complex numbers
There are many notations that serve double duty. It's not a big problem in practice as long as you make your notational choices clear and explicit.
How about using A⟞B for A−B and A⟝B for B−A? Then the inclusive differences can be A¬B=1+A−B and A⌐B=A+B−A. Why? Because then it fits with logic and probability: ¬B would mean 0¬B just as −B means 0−B, and the probability of ¬p is exactly 0−p, just as if you represent true and false as 1 and 0, NOT(x)=1−x.
There are so many great ideas in the comments!
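The logic/probability tail end of that idea is easy to sanity-check on the two truth values; this only covers the standard identities, not the proposed ⟞/⟝ symbols themselves:

```python
# Sanity check on {0, 1}: NOT(x) = 1 - x, and material implication
# "p -> q" can be written arithmetically as max(1 - p, q).
def negate(x):
    return 1 - x

def implies(p, q):
    return max(1 - p, q)

for p in (0, 1):
    for q in (0, 1):
        print(p, q, negate(p), implies(p, q))  # matches the NOT and IF-THEN truth tables
```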
This is really cool, and I like how the arrow is used as a binary operator. I might even start using it myself.
However, I have to wonder how practical it is to use in more complicated expressions. That 3x3 determinant equation, for example. How do we pick what the arrow points to, and how do we arrange the terms? This is the same sort of issue 3b1b faced with the Triangle of Power. I’m curious what your ideas are for this, because I really want this notation to work
Part of what makes arithmetic in the current system convenient is that it’s easy to see that things being added and things being subtracted are “on opposing sides”. The unary operator looking the same as the binary operator makes this connection flow well in practice, and both work together with associativity without the need for parentheses.
Maybe a good way to keep both the intuition and the function is simply to replace the minus sign in its current usage with a left arrow, both binary and unary, and not put anything on top.
Great remark.
Personally I wouldn't mind mixing the 2 notations (the existing one and the arrow). For example, if I want to move an object from one position to another, I might want to write smth like 'move(x1 -> x2-5)'. The arrow in the middle is the "from-to" part of the motion. The minus symbol on the right tells me that I'm moving 5 units to the left of x2. The operators have a clearly different meaning even though they calculate the same way.
@@jakobr_*...and both work together with associativity without the need for parentheses.*
Not really. Subtraction is not an associative operation. And even with associativity, people still get confused with addition, since infix notation inherently requires some concept of order of operations, which is not required with suffix or prefix notation, for example.
@@angelmendez-rivera351 It’s not associative, but it “works well with” it because we can very easily (in practice, on paper) interpret the binary operation as a combination of negation and addition, and addition is associative. This easy conversion makes computation simple and the path to the answer flexible.
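To spell the point out with concrete numbers (an example of mine, just to illustrate the conversion):
(5 − 3) − 2 = 0 but 5 − (3 − 2) = 4, so the minus sign itself doesn't associate. Rewritten as 5 + (−3) + (−2), every grouping agrees: (5 + (−3)) + (−2) = 2 + (−2) = 0 and 5 + ((−3) + (−2)) = 5 + (−5) = 0.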
New video dropping the evening before my algebra exam
sus 🤨
I hope your exam went well?
@@AllAnglesMath it was manageable 🫡
since we already have the expression "take 2 from 5" for "5 - 2" I think we should have the arrow pointing to the subtrahend
Interesting, but I think that will make things more confusing.
14:15 - 14:21 As a physicist, I strongly disagree with this claim. I definitely do not think velocities are more central to physics than positions, and I do not know any physicist who thinks so. In fact, both are equally as important, but neither are central to physics. Gauge fields are what is central to modern physics. All of modern physics is studied in terms of gauge fields and their symmetry groups (which is something which was stated earlier in this series, I believe).
One example I had in mind: The speed of light is a universal constant, and this single observation leads to the entire theory of relativity. There is no similar statement about a special "universal position" in space. So it seems that in relativity, velocity is more central than position.
@@AllAnglesMath You are correct: the speed of light, in the special theory of relativity, is a universal constant. However, this does not mean velocity is more central than position, since the four-velocities achievable by systems with mass are not constants, and they undergo the same Lorentz transformations that four-positions in spacetime undergo. You are also ignoring that there are various extensions of the theory, such as doubly special relativity, where the Planck length plays a role similar to the speed of light. Furthermore, if we attempt to quantize the special theory of relativity, to get an (incomplete) quantum theory of relativity, then we learn that the constant we call "the speed of light" is not truly a speed, but something even more fundamental, which pertains to the structure of spacetime and its interactions with massless systems. This becomes clearer when you delve into quantum field theory, which is the most empirically accurate framework we have in physics, even more so than the special theory of relativity. This is why I disagree with your claim. Your claim assumes we are still working with a theory which we improved upon decades ago.
I would love to see a video on the diagram by Hehner
We're working on the video already. It will probably be released early next year. Thanks for your feedback!
Yes, commutative subtraction can be done with multisets of mark-antimark pairs which cancel out each other. Horizontal overbar over the symbol is a bitch in ASCII. For various reasons I prefer the chiral symbol pair of relational operators < and >.
One among many reasons is that this way we can connect with Dyck language and then expand from outwards pairs < > of Dyck restriction and allow also palindromic strings inwards: > and < form a pair which cancels out ("Del>
hey santeris..., i came here from reading ur reply in another top-level comment. but i couldnt understand a thing here
@@yash1152 Would you like to understand?
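For anyone else who got lost: here is one possible reading of the mark/antimark idea as a quick sketch. This is my own interpretation, not necessarily what was meant:

```python
# Represent a quantity as a pair (marks, antimarks); equal numbers of each cancel.
# Combining pairs is component-wise addition, so this "subtraction" is
# commutative by construction.
def cancel(marks, antimarks):
    common = min(marks, antimarks)
    return (marks - common, antimarks - common)

def combine(p, q):
    return cancel(p[0] + q[0], p[1] + q[1])

five      = (5, 0)   # five marks
minus_two = (0, 2)   # two antimarks

print(combine(five, minus_two))   # (3, 0), i.e. 5 - 2
print(combine(minus_two, five))   # (3, 0), same result in the other order
```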
Also one might notice;
Subtraction:
0 - 0 = 0
0 - 1 = -1
1 - 0 = 1
1 - 1 = 0
Exclusive Or:
0 XOR 0 = 0
0 XOR 1 = 1
1 XOR 0 = 1
1 XOR 1 = 0
Subtraction is a difference operation which is like an exclusive OR operation.
Difference is like a measure of dis-similarity, where zero is "full similarity". One could interpret negatives as "opposite", in this sense "adding an opposite" is like measuring the difference of the same thing (which would have no difference).
a + (-a) = a - a
The XOR connective in Boolean logic is isomorphic to the addition operation in a group of order 2. As such, the analogy to subtraction is not appropriate. It only works as a matter of coincidence, because a group of order 2 is the only group where (-1) = 1.
@@angelmendez-rivera351 Interesting. I am not a mathematician, so I don't really follow. I've been using this way of thinking to implement novel learning algorithms, and it served its purpose. It's a shame the analogy doesn't work, because the algorithms do. Thanks for your explanation, even though I don't really follow.
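A quick check of both the table above and the reply's point that everything collapses mod 2:

```python
# Over {0, 1} with arithmetic mod 2, addition, subtraction and XOR all agree,
# because every element is its own additive inverse (-1 = 1 mod 2).
for a in (0, 1):
    for b in (0, 1):
        print(a, b, (a - b) % 2, (a + b) % 2, a ^ b)  # the last three columns match
```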
i wouldn't say the area of a wedge product is given by a determinant; they express the same thing, but determinants are so often taught as just some esoteric operation to memorize that using it as the foundation for a concept makes that concept less intuitive by proxy.
Actually, calculating the area of a parallelogram is exactly what the determinant is for. In linear algebra, it tells you how much areas are stretched by a linear transformation.
@@AllAnglesMath i'm not saying you're wrong--you are correct, after all. i'm saying that, for the uninitiated, defaulting to the determinant is unhelpful as a conceptual foundation, since very few people have had determinants properly or completely explained to them to begin with.
as a sidenote; in higher dimensions than just the 2d case, with determinants representing scale factors for (hyper-)volumes rather than areas, the wedge product of two vectors is no longer calculated by a determinant unless you're very specific about how you choose which matrix to take the determinant of (you'd need to find the 2x2 matrix representing how the full nxn transformation affects points in the span of the two vectors to be wedged, which, while calculable, is hard to visualize or reason with)
@@rarebeeph1783 Great analysis for higher dimensions. I love the attention to details in the comments, it's really top notch.
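For the 2D case under discussion, the link can be written out explicitly (a standard fact, included here for readers who haven't seen the determinant presented this way): for a = (a1, a2) and b = (b1, b2), the wedge product is a ∧ b = (a1·b2 − a2·b1) e1∧e2, and a1·b2 − a2·b1 is exactly the 2×2 determinant with a and b as its columns; its absolute value is the area of the parallelogram they span. For example, a = (2, 0) and b = (1, 3) give 2·3 − 0·1 = 6.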
In defense of the wedge product symbol, it's actually very related to the AND operator with which it shares its symbol. (This relation is much less obvious when viewing the wedge product as giving a plane segment from two line segments.) There's also a duality operation, usually written as *, ★, or, if you're a programmer, !. *a ∧ *b = *(a ∨ b)
Wait... That's just De Morgan's law!
These connections between De Morgan, geometric algebra, set theory, ... are always really interesting and insightful.
@@AllAnglesMath ¬a ∧ ¬b = ¬(a ∨ b), A' ∩ B' = (A ∪ B)', ★v⃗ ∧ ★u⃗ = ★(v⃗ ∨ u⃗).
What is v⃗ ∧ u⃗?, Well, reading v⃗ and u⃗ as mirrors, v⃗ ∧ u⃗ is the subspace of points changed by neither reflection, also known as _their intersection!_ {p ∣ (p ∨ v⃗ = 0) ∧ (p ∨ u⃗ = 0)}. Note the ∧ used in this definition of ∧.
There are also connections to traditional addition and subtraction. Saturating subtraction for booleans is the same as the set difference, but _modular_ subtraction for booleans is identical to addition. Saturating subtraction is the a
What’s the OR symbol in geometric algebra?
@@adiaphoros6842 As far as I currently understand it, the "and" & "or" symbols are used for the meet & join operations in GA (although I can never remember which is which).
An asymmetric symbol for matrix multiplication would be very useful, because I always think about it in the opposite way from how it's usually written.
That would indeed be useful. Any suggestions?
@@AllAnglesMath \ or a flipped L, to indicate that the left one goes on the bottom-left and the right one goes on the top right
@@Tordek Nice one!
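For anyone wondering why the order would deserve its own visual cue, here is a tiny pure-Python check with arbitrarily chosen matrices:

```python
# Matrix multiplication is not commutative: the two orders can disagree.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2],
     [0, 1]]
B = [[1, 0],
     [3, 1]]
print(matmul(A, B))   # [[7, 2], [3, 1]]
print(matmul(B, A))   # [[1, 2], [3, 7]]
```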
What does "SoME3" mean?
This video is my entry for this year's Summer of Math Exposition contest. The tag for the contest this year is SoME3.
A
In your first expression, you seem to be using the arrow both as a binary *and* a unary operator. Was that what you were going for?
@@AllAnglesMath I was really just going for deliberate abuse of notation, because I thought the directional interpretation was interesting and it was fun to break it. Hence why I then add the 0 in, to make it actually sensible
4:02 - 4:17 I wish you had not said this, since this is factually incorrect. In Boolean logic, the logical connective which corresponds to + is not OR, but XOR, what most people in English call "exclusive or," or more generally, "exclusive disjunction." In fact, the Boolean symbol for XOR is ⊕. This is very different from the OR connective. x ⊕ y := (x or y) and not(x and y), and here, the AND connective corresponds to multiplication. I think this premise alone constitutes a problem for the rest of your video.
For example, if we want to represent x - y as y -> x, then 0 -> 1 = 1, which denotes the fact 1 - 0 = 1, but one has 0 -> 0 = 1 in Boolean logic, even though 0 - 0 = 0. Similarly, 1 -> 0 = 0, but 0 - 1 = -1 = 1, and 1 -> 1 = 1, even though 1 - 1 = 0. Essentially, in almost none of the scenarios does the IF-THEN material implication from Boolean logic correspond to subtraction.
Furthermore, you said earlier in the video that the unary - operator, which is the additive inverse operator, corresponds to the logical connective NOT, but this cannot be the case, because NOT(1) = 0 and NOT(0) = 1, whereas -1 = 1 and -0 = 0 in a group of order 2. Otherwise, -1 is never equal to 0, and -0 = 0 still holds.
The problem is, while the analogy you are trying to illustrate with the comparison is visually appealing, it is not conceptually sound. Mathematically speaking, the 'not' operator is a unary complementation operator in the category of Boolean lattices, whereas the - operator is a unary operator in the category of groups, and by extension, the category of rings. As such, we are talking about completely different and unrelated topics.
First of all: I really appreciate your thorough analysis and feedback.
I went over your comments a few times to understand them better. I think the confusion stems from your assumption that I'm trying to use numbers as truth values. I am not claiming that 0 and 1 are both numbers and booleans at the same time, or that logical implication is like subtracting 1s and 0s from each other.
I am also not trying to draw a formal isomorphism between addition modulo 2 and propositional logic. I'm just showing that there are similarities. Those run deeper than mere notation, but they're not so deep that they become formal identities.
The analogy between logical disjunction and addition is commonly made. They both obey a number of similar rules after all. It's true that XOR also obeys many (all?) of those rules. I would say that XOR is more like addition modulo 2, while OR is more like addition proper. But again: don't think of it as adding the numbers 0 and 1.
You mention the symbol for XOR, but there is also a convention to use '+' for logical disjunction. So that's an argument that can go both ways.
Either way, I find these deep technical discussions very engaging, so feel free to shoot!
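For readers skimming the thread, here is the single input where the three candidates actually come apart (just restating the disagreement, not settling it):

```python
# OR, XOR and + only disagree when both inputs are 1:
a, b = 1, 1
print(a | b)   # 1 -> OR saturates ("true or true is still true")
print(a ^ b)   # 0 -> XOR behaves like addition mod 2
print(a + b)   # 2 -> ordinary addition keeps counting
```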
Wtf is this masterpiece
explain anticommutativity vs non-commutativity
Commutative means a+b = b+a.
Anti-commutative means a+b = -(b+a), so you pick up a minus sign.
Non-commutative simply means "not commutative". This can be anything where a+b does not equal b+a, including anti-commutative.
I hope this clarifies it. Thanks for the question!
@@AllAnglesMath I thought anticommutativity is a×a=1 and noncommutativity is a+b != b+a
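To make the distinction concrete, here are two examples of my own (not from the video):

```python
# Cross product: anti-commutative (swapping the arguments flips the sign).
def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

print(cross((1, 0, 0), (0, 1, 0)))   # (0, 0, 1)
print(cross((0, 1, 0), (1, 0, 0)))   # (0, 0, -1): exactly the negative

# String concatenation: non-commutative but not anti-commutative
# ("ab" and "ba" differ, yet neither is a "negative" of the other).
print("a" + "b", "b" + "a")          # ab ba
```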
Cold take: Anti commutativity is actually fine (see geometric algebra)
Geometric algebra is indeed the domain of math where anti-commutativity shines and sparkles. We have videos about that topic in the pipeline, though it will be a few months before we can publish them.
@@AllAnglesMath Very excited for that +1
7:33 - 7:42 I seriously doubt it. The earliest operation we are ever introduced to as literal children is a commutative operation. The next operation introduced is not commutative. This causes teeth grinding for children, simply because it breaks the expectation which had been established for them of "operations are commutative." It has very little to do with the actual visual symbols used. If there is anything which does cause real confusion visually, then it is the usage of - as both a binary quasigroup operation, and as a unary operation. However, using -> does not solve this problem, as others have pointed out already, especially since the analogy you started the video with is completely broken anyway.
I will vote against the "line on top" notation since it already means complex conjugation. Also, "not" has a different symbol in logic.
5:49 except under mod 4
Nice one!
Reject subtraction! Reject inverses! Embrace semirings!!! (Also, arrows are obviously exponentials, not sums)
You haven't commented on the fact that the wedge symbol itself is not antisymmetric. Ironic how math notation makes it harder, though.
An arrow?! Why not take the + and make it not symmetric like a sideways T
9:32 - 9:40 Yes. However, even in this framework, the lattice-ring distinction is still explicitly maintained, in a way which your analogy in the video does not agree with, since your video instead insists on an equivalence between lattices and rings, which does not work, given the counterexamples I gave earlier. This is why I think using -> to replace the symbol for subtraction is a terrible idea. I think essentially any other symbol (aside from trivial nonchoices like the addition symbol) is actually a better choice.
The problem is, in your approach, 'not' is being made to correspond with additive inverses, and 'or' is being made to correspond with addition, neither of which is true. In the set of truth-values, {0, 1}, addition is such that every element is its own additive inverse. Therefore, addition and subtraction are the same operation.
i went from: «ok, but why» to «ohh», so 🤨→🤔
😄