<<@zachstar
says :
The first 500 people to use my link https://skl.sh/zachstar02251 will get a 1 month free trial of Skillshare!
>>
<<@numbersix8919
says :
Thank yew mister.
>>
<<@benfoxcroft2252
says :
It is so bizarre hearing the same voice as in the comedy skits, because I keep expecting punchlines then remembering... this is just an educational video. Lol
>>
<<@Deejay_quest
says :
I don't get this skit. I'm not gonna lie, it wasn't very funny, and I got lost in the plot.
>>
<<@mcalebc
says :
How does this work if the matrix sigma has complex numbers in it, or if sigma is in Jordan canonical form?
>>
<<@AntonRellik1123
says :
Can you do this in reverse? What about scaling an image up: adding information to the matrix as it expands, filling the space with values closest to those of the original matrix?
>>
<<@Wolkenphoenix
says :
Such a cool video, thank you <3
>>
<<@mightymcphee
says :
WTF, when did you go from funny shorts a few years ago to explaining compression better than anyone I've seen? 2:53 here is a good example of why this video and its script are so good. Wow, this is the most impressed I've been with a presentation in years.
>>
<<@TheBestDog
says :
WTF? 🤬
>>
<<@MichaelKolczynski
says :
Only on the internet can you take 25 bits, turn them into at least 9 bits per component for a storage size of 180 bits and call it compression
>>
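For context, the storage arithmetic behind this point can be written out generically (an m-by-n count of my own, not the commenter's exact numbers): keeping k rank-1 components of an m x n image means storing k(m + n + 1) numbers instead of the original mn values.
\[
\text{rank-}k\ \text{storage} = k(m + n + 1)
\quad\text{vs.}\quad
\text{original storage} = mn,
\qquad
\text{savings} \iff k < \frac{mn}{m + n + 1}.
\]
For a tiny 5x5 binary image that bound allows at most k = 2, and each stored number needs far more bits than the original 1-bit pixels, which is the overhead being pointed out.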
<<@dermasmid
says :
Thank you skillshare
>>
<<@aurelia8028
says :
Bruh, your timing is almost a bit spooky. I'm taking a course in machine learning this semester, and a couple of weeks ago the lecture was about SVD. But there it was in the context of principal component analysis, not data compression.
>>
<<@GuyMichaely
says :
Would've appreciated a more in-depth explanation.
>>
<<@SamratDuttabdn
says :
He is both WILLING and ABLE to teach us how to compress an image with (basic) linear algebra
>>
<<@limikeli9761
says :
Hey Zack, what app or software did you use to create the pixel shading?
>>
<<@Al-Hebri
says :
Inspiring, thank you.
>>
<<@pawejerzyna5674
says :
lovely video
>>
<<@Impatient_Ape
says :
OMG, thank you for talking about "rank" correctly. Physics and engineering often use the word "rank" when they actually mean "degree". Rank-1 matrices are representations of "pure" tensors. And all tensors of higher rank can be written as linear combinations of pure tensors.
>>
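To make the rank-1 / pure-tensor point concrete (standard notation, not part of the original comment): a rank-1 matrix is exactly an outer product of two vectors, and a rank-r matrix is a sum of r such outer products.
\[
A_1 = u v^{\mathsf{T}},
\qquad
A = \sum_{i=1}^{r} u_i v_i^{\mathsf{T}} \quad\text{when } \operatorname{rank}(A) = r .
\]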
<<@Fereydoon.Shekofte
says :
Thank you very much. Greatest explanation on YouTube ❤❤🎉🎉😊😊
>>
<<@eswyatt
says :
Seems so similar to PCA or principal component analysis.
>>
<<@LuisLascanoValarezo
says :
Thanks Zach for making these videos!! You know, a comedy video (and yours are great) I will watch once, but a great engineering video I'll return to like a prayer every time I need a connection with Mother Nature.
>>
<<@50secs
says :
Nice video Zach, I wrote a blog post on this topic as well a few years ago, demonstrating how the matrices take the form of the data on the MNIST dataset, to explain the generalisation process.
>>
<<@besthero12345
says :
Hey Zach, are you still pursuing applied math as a degree?
>>
<<@user-pw5do6tu7i
says :
Getting all sorts of ideas around multi-sampling windows here.
>>
<<@user-pw5do6tu7i
says :
Bro is like finding the 'prime factors' but of a matrix. Freaking nerd
>>
<<@seedmole
says :
Nice, basically Fourier stuff but on images.
>>
<<@timmygilbert4102
says :
That's just generative AI 😂
>>
<<@featureboxx
says :
Interesting, I was just going through a linear algebra course; I recognize column spaces.
>>
<<@littlestewart
says :
The guy who makes dumb skits uploaded a STEM video
>>
<<@nnm35
says :
I wish you had been my math professor. Super clear explanations! Tnx!
>>
<<@awertyuiop8711
says :
Now the real question is “How to compress an image with (basic) Geometric Algebra?”
>>
<<@delec9665
says :
Clearest video on SVD, gg.
>>
<<@jonathan3488
says :
Thanks! I couldn't understand why we sort the eigenvalues in descending order to do low-rank approximations, but once I knew that SVD leads to a linear combination of rank-1 matrices, it became very clear. How cool!
>>
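Written out in the usual notation (an illustration, not a quote from the video), the ordering the commenter mentions is exactly what makes truncation sensible: the singular values are sorted so that the first k rank-1 terms carry the most weight.
\[
A = U \Sigma V^{\mathsf{T}} = \sum_{i=1}^{r} \sigma_i\, u_i v_i^{\mathsf{T}},
\qquad \sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_r > 0,
\qquad
A_k = \sum_{i=1}^{k} \sigma_i\, u_i v_i^{\mathsf{T}} .
\]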
<<@Vytor_01
says :
Having this video recommended after making a 3D renderer wasn't a smart idea, YouTube. I'm way too tired of matrices.
>>
<<@rb8049
says :
As soon as you said linear algebra, I knew the technique, but I never thought of this application.
>>
<<@paridhaxholli
says :
H
>>
<<@derekpowersblight
says :
DLSS is wild
>>
<<@sachininthetube
says :
Thanks!
>>
<<@MasterHigure
says :
I did an SVD compression of a black-and-white image back in my intro linear algebra class. The fact that I still remember working on it almost 20 years later means I think it was cool.
>>
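For anyone who wants to try the same exercise, here is a minimal NumPy sketch; the array img, the helper name svd_compress, and the rank k=20 are illustrative choices of mine, not anything from the video or the comment:

```python
import numpy as np

def svd_compress(img: np.ndarray, k: int) -> np.ndarray:
    """Best rank-k approximation of a 2-D grayscale image (in the Frobenius-norm sense)."""
    U, s, Vt = np.linalg.svd(img.astype(float), full_matrices=False)
    # Keep only the k largest singular values and their singular vectors.
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Stand-in "image" for demonstration; replace with real pixel data in [0, 255].
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(256, 256)).astype(float)

approx = svd_compress(img, k=20)
# The approximation can overshoot the original range, so clip before displaying.
approx = np.clip(approx, 0, 255)
print(approx.shape, np.linalg.norm(img - approx))
```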
<<@proboiz_50
says :
I came here to learn something BASIC, but it's ADVANCED for me.
>>
<<@shekarlakshmipathi
says :
Wish you and Grant (3Blue1Brown) were my teachers in high school. My heart starts beating fast when I start watching your videos. It is so exciting. I learn something new EVERY TIME.
>>
<<@vaidphysics
says :
Exactly this method is used in tensor networks for compressing information about a very complicated many-body quantum state into a smaller set of variables. Great video 👏👏👏
>>
<<@ready1fire1aim1
says :
It's super strange how reality is 3D but our logic, math and physics formalisms are 2D (bivalent, binary and third-person). We even explicitly reject 3D with the Law of Excluded Middle. It appears that Einstein should have squared Leibniz's first-person 0D-3D rather than Newton's third-person 3D-1D. Leibniz started from 0D and built up but Newton started from 3D and reduced down. If we start from non-zero dimensions and try to reduce down to 0D then we only get to 1D (also we get to all our current major/minor open problems, contradictions and paradoxes). The difference between 0D and non-zero dimensions is equivalent to the difference between P and NP. Leibniz's metaphysical monads and CS' mathematical monads have equivalencies, too, and I'd imagine that's how we get AGI. Einstein should have done 0D-3D² for 3 sets of 3 dimensions: space, time and energy (stars are observable 3D energy). Newton's 3D-1D can't reach 0D since that's not how reality works. 0D quarks form 3D protons and neutrons, not vice versa. Also we probably shouldn't think of our quarks as a 2D disc with a triangle of quarks but rather as a 3D sphere with a tetrahedron of quarks (we just can't see the 4th quark since it's on the other side of 0D).
>>
<<@SystemsMedicine
says :
Very clear, very lovely…
>>
<<@wintutorials2282
says :
This is so awesome. Thanks for the awesome video!
>>
<<@sadeepweerasinghe
says :
you sound just like the guy from the funny videos
>>
<<@parzh
says :
I wonder if the first frame being an overly compressed image of SkillShare’s logo is intentional or not.
>>
<<@BlackDevilSTi
says :
4:00 What image would we get if we normalized this matrix to values between 0 and 1, so that we have no negative values and no values greater than 1?
>>
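One way to see the answer experimentally (a sketch under my own assumptions; approx stands for a reconstructed low-rank matrix like the one at 4:00, not the video's actual data): a min-max rescale maps the most negative entry to 0 and the largest entry to 1, so formerly negative pixels show up as dark grays instead of being clipped to black.

```python
import numpy as np

def minmax_normalize(m: np.ndarray) -> np.ndarray:
    """Linearly rescale a matrix so its values span [0, 1]."""
    lo, hi = m.min(), m.max()
    return (m - lo) / (hi - lo) if hi > lo else np.zeros_like(m)

# e.g. plt.imshow(minmax_normalize(approx), cmap="gray") with a low-rank approx
```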
<<@Polecam-FABI-ECURepair_channel
says :
Let anyone who favors decomposing a 2000x1000 3-dimensional image matrix instead of using the FFT be called a blasphemer.
>>
<<@broccoloodle
says :
Two very important notes: (1) this assumes natural data is approximately low-rank; (2) SVD is the optimal method for minimizing the Frobenius norm of the approximation error.
>>
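The second note is the Eckart-Young(-Mirsky) theorem; stated in the usual form (my wording, standard notation): truncating the SVD after the k largest singular values gives the closest rank-k matrix in Frobenius norm, with error
\[
\|A - A_k\|_F \;=\; \min_{\operatorname{rank}(B) \le k} \|A - B\|_F
\;=\; \sqrt{\sigma_{k+1}^2 + \cdots + \sigma_r^2}.
\]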