The first 500 people to use my link https://skl.sh/zachstar02251 will get a 1 month free trial of Skillshare!
@numbersix8919 Says:
Thank yew mister.
@benfoxcroft2252 Says:
It is so bizarre hearing the same voice as in the comedy skits, because I keep expecting punchlines then remembering... this is just an educational video. Lol
@Deejay_quest Says:
I don't get this skit, I'm not gonna lie it wasn't very funny. And I got lost in the plot.
@mcalebc Says:
How does this work if the matrix sigma has complex numbers in it, or if sigma is in Jordan canonical form?
@AntonRellik1123 Says:
Can you do this in reverse? What about scaling an image up: adding information to the matrix as it expands, filling the space with the closest values from the original matrix?
@Wolkenphoenix Says:
Such a cool video, thank you <3
@mightymcphee Says:
wtf when did you go from funny shorts a few years ago to explaining compression better than anyone I've seen. 2:53 this here is a good example of why this video and the script is so good. wow most impressed I've been with a presentation in years.
@TheBestDog Says:
WTF? 🤬
@MichaelKolczynski Says:
Only on the internet can you take 25 bits, turn them into at least 9 bits per component for a storage size of 180 bits and call it compression
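(A quick back-of-the-envelope sketch of the storage trade-off this comment is poking at; the example sizes and rank are my own assumptions, not taken from the video.)

```python
# Under my own assumptions: a rank-k truncated SVD of an m-by-n image stores
# about k*(m + n + 1) real numbers, each needing far more bits than a raw
# pixel, so it only counts as compression once k*(m + n + 1) is much smaller
# than m*n. Tiny demo matrices lose; large images win.
def svd_numbers_stored(m, n, k):
    return k * (m + n + 1)

print(svd_numbers_stored(5, 5, 2), 5 * 5)                # 22 floats vs 25 pixels: no win for a tiny demo
print(svd_numbers_stored(1000, 1000, 50), 1000 * 1000)   # 100,050 floats vs 1,000,000 pixels
```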
@dermasmid Says:
Thank you skillshare
@aurelia8028 Says:
Bruh, your timing is almost a bit spooky. I'm taking a course in machine learning this semester and a couple of weeks ago the lecture was about SVD. But there it was in the context of principal component analysis, not data compression
@GuyMichaely Says:
Would've appreciated a more in-depth explanation
@SamratDuttabdn Says:
He is both WILLING and ABLE to teach us how to compress an image with (basic) linear algebra
@limikeli9761 Says:
Hey Zack, what app or software did you use to create the pixel shading?
@Al-Hebri Says:
Inspiring, Thank you
@pawejerzyna5674 Says:
lovely video
@Impatient_Ape Says:
OMG, thank you for talking about "rank" correctly. Physics and engineering often use the word "rank" when they actually mean "degree". Rank-1 matrices are representations of "pure" tensors. And all tensors of higher rank can be written as linear combinations of pure tensors.
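(A small worked illustration of the "pure tensor" point, using my own example numbers:)

```latex
% A rank-1 matrix is an outer product of two vectors (a "pure" tensor):
\begin{pmatrix} 2 & 4 \\ 3 & 6 \end{pmatrix}
  = \begin{pmatrix} 2 \\ 3 \end{pmatrix}\begin{pmatrix} 1 & 2 \end{pmatrix}
  = u\,v^{\mathsf T},
\qquad
% and the SVD writes any matrix as a linear combination of such terms:
A = \sum_{i=1}^{r} \sigma_i\, u_i v_i^{\mathsf T}.
```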
@Fereydoon.Shekofte Says:
Thank you very much
Greatest explanation on YouTube ❤❤🎉🎉😊😊
@eswyatt Says:
Seems so similar to PCA or principal component analysis.
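(A minimal numpy sketch of the connection this comment points at, my own illustration: PCA is essentially an SVD of the mean-centered data matrix.)

```python
import numpy as np

# Toy data; the shapes and seed are arbitrary choices for the sketch.
rng = np.random.default_rng(2)
X = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 5))

Xc = X - X.mean(axis=0)                       # center each feature
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

components = Vt                               # principal directions (rows)
explained_variance = s**2 / (len(X) - 1)      # variance along each direction
print(explained_variance)
```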
@LuisLascanoValarezo Says:
Thanks Zack for making these videos!!
You know, a comedy video (and yours are great) I will watch once, but a great engineering video I'll return to like a prayer whenever I need to feel connected with Mother Nature
@50secs Says:
Nice video Zach, I wrote a blog post on this topic a few years ago as well, demonstrating how the matrices take the form of the data on the MNIST dataset, to explain the generalization process.
@besthero12345 Says:
Hey Zach, are you still pursuing applied math as a degree?
@user-pw5do6tu7i Says:
Getting all sorts of ideas around multi sampling windows here
@user-pw5do6tu7i Says:
Bro is like finding the 'prime factors' but of a matrix. Freaking nerd
@seedmole Says:
Nice, basically Fourier stuff but on images.
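(A hedged sketch of the analogy, my own illustration: like truncated SVD, Fourier-based compression keeps only the largest transform coefficients and discards the rest.)

```python
import numpy as np

rng = np.random.default_rng(3)
img = rng.standard_normal((64, 64))   # stand-in for an image

F = np.fft.fft2(img)
keep = np.abs(F) >= np.quantile(np.abs(F), 0.90)     # keep the top 10% of coefficients
img_approx = np.real(np.fft.ifft2(F * keep))

print(np.linalg.norm(img - img_approx) / np.linalg.norm(img))  # relative error
```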
@timmygilbert4102 Says:
That's just generative AI 😂
@featureboxx Says:
Interesting, I was just going through a linear algebra course.
I recognize column spaces
@littlestewart Says:
The guy who makes dumb skits uploaded a STEM video
@nnm35 Says:
I wish you had been my math professor. Super clear explanations! Tnx!
@awertyuiop8711 Says:
Now the real question is “How to compress an image with (basic) Geometric Algebra?”
@delec9665 Says:
Clearest video on svd gg
@jonathan3488 Says:
Thanks! I couldn't understand why we sort the eigenvalues by descending order in order to do low-rank approximations, but when I know that SVD leads to a linear combination of rank1 matrices, it became very clear. How cool!
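(A minimal numpy sketch of that realization, my own illustration: the truncated SVD is literally the sum of the rank-1 matrices with the largest singular values, which numpy already returns in descending order. The matrix size and k are arbitrary.)

```python
import numpy as np

A = np.random.rand(100, 80)                   # stand-in for a grayscale image
U, s, Vt = np.linalg.svd(A, full_matrices=False)   # s is sorted in descending order

k = 10
# Sum of the k rank-1 matrices with the largest singular values
A_k = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(k))

# Equivalent, vectorized form of the same rank-k approximation
A_k_fast = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
print(np.allclose(A_k, A_k_fast))             # True
```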
@Vytor_01 Says:
Having this video recommended after making a 3D renderer wasn't a smart idea, YouTube. I'm way too tired of matrices
@rb8049 Says:
As soon as you said linear algebra I knew the technique, but I never thought of this application.
@paridhaxholli Says:
H
@derekpowersblight Says:
DLSS is wild
@sachininthetube Says:
Thanks!
@MasterHigure Says:
I did an SVD compression of a black-and-white image back in my intro linear algebra class. The fact that I still remember working on it almost 20 years later means I think it was cool.
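(Roughly that exercise as I'd sketch it today, my own assumptions throughout: 'photo.png' is a placeholder filename and k=30 is an arbitrary rank.)

```python
import numpy as np
from PIL import Image

# Load the image as a grayscale matrix of floats
img = np.asarray(Image.open('photo.png').convert('L'), dtype=float)
U, s, Vt = np.linalg.svd(img, full_matrices=False)

k = 30
img_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]          # rank-k approximation
img_k = np.clip(img_k, 0, 255).astype(np.uint8)        # back to valid pixel values

Image.fromarray(img_k).save('photo_rank30.png')
```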
@proboiz_50 Says:
I came here to know something BASIC
But it's ADVANCED for me
@shekarlakshmipathi Says:
Wish you and Grant (3Blue1Brown) were my teachers in high school. My heart starts beating fast when I start watching your videos. It is so exciting. I learn something new EVERYTIME.
@vaidphysics Says:
Exactly this method is used in tensor networks for compressing information about a very complicated many body quantum state into a smaller set of variables. Great video 👏👏👏
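(My own illustrative sketch of what I take the comment to mean: reshaping a bipartite quantum state into a matrix and truncating its SVD, i.e. keeping only the largest Schmidt coefficients, is the same low-rank idea. The dimensions and bond dimension are arbitrary.)

```python
import numpy as np

rng = np.random.default_rng(1)
# Random state of two 8-level systems, normalized
psi = rng.standard_normal(8 * 8) + 1j * rng.standard_normal(8 * 8)
psi /= np.linalg.norm(psi)

M = psi.reshape(8, 8)                  # matrix whose SVD gives the Schmidt decomposition
U, s, Vt = np.linalg.svd(M)

chi = 4                                # bond dimension: number of Schmidt values kept
M_trunc = U[:, :chi] @ np.diag(s[:chi]) @ Vt[:chi, :]
print(np.linalg.norm(M - M_trunc))     # truncation error
```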
@ready1fire1aim1 Says:
It's super strange how reality is 3D but our logic, math and physics formalisms are 2D (bivalent, binary and third-person).
We even explicitly reject 3D with the Law of Excluded Middle.
It appears that Einstein should have squared Leibniz's first-person 0D-3D rather than Newton's third-person 3D-1D.
Leibniz started from 0D and built up but Newton started from 3D and reduced down. If we start from non-zero dimensions and try to reduce down to 0D then we only get to 1D (also we get to all our current major/minor open problems, contradictions and paradoxes).
The difference between 0D and non-zero dimensions is equivalent to the difference between P and NP. Leibniz's metaphysical monads and CS' mathematical monads have equivalencies, too, and I'd imagine that's how we get AGI.
Einstein should have done 0D-3D² for 3 sets of 3 dimensions: space, time and energy (stars are observable 3D energy).
Newton's 3D-1D can't reach 0D since that's not how reality works. 0D quarks form 3D protons and neutrons, not vice versa.
Also we probably shouldn't think of our quarks as a 2D disc with a triangle of quarks but rather as a 3D sphere with a tetrahedron of quarks (we just can't see the 4th quark since it's on the other side of 0D).
@SystemsMedicine Says:
Very clear, very lovely…
@wintutorials2282 Says:
This is so awesome
Thanks for the awesome video
@sadeepweerasinghe Says:
you sound just like the guy from the funny videos
@parzh Says:
I wonder if the first frame being an overly compressed image of SkillShare’s logo is intentional or not.
@BlackDevilSTi Says:
4:00 What image would we get if we normalized this matrix to values between 0 and 1, so that there are no negative values or values greater than 1?
@Polecam-FABI-ECURepair_channel Says:
Let anyone who favors decomposing a 2000x1000x3 image array instead of using the FFT be declared a blasphemer.
@broccoloodle Says:
Two very important notes: (1) this assumes natural data is approximately low rank; (2) the truncated SVD is the optimal method for minimizing the Frobenius norm of the approximation error (Eckart–Young)
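(A quick numerical check of point (2), my own sketch with arbitrary sizes: the Frobenius error of the rank-k truncation equals the square root of the sum of the squared discarded singular values, which Eckart–Young says is the best any rank-k matrix can do.)

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 40))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 5
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

err = np.linalg.norm(A - A_k, 'fro')
bound = np.sqrt(np.sum(s[k:] ** 2))     # Eckart-Young value of the minimal error
print(err, bound)                       # the two numbers agree
```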