X/0.5 easy
Today the internet learns that dividing by a fraction is multiplication
And the other times that this exact meme was reposted elsewhere.
if you call a cell X
I think it is being divided like X/2, not X/0.5
They multiply by two by dividing in half (2x = x/0.5, since 0.5 = ¹/₂, the inverse of two).
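If it helps to see it as plain arithmetic, here's a throwaway sanity check in Python (my own toy numbers, nothing official):

```python
# dividing by 0.5 is the same as multiplying by 2,
# because 0.5 is the reciprocal of 2
x = 7.0
print(x / 0.5)  # 14.0
print(x * 2)    # 14.0
```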
Don't be too hard on them. They're convinced that buying 12 2/4 melons on a train that left New York at 8:15 going west, and dividing 45 apples on another train that left San Francisco at 10:45 going north, are completely normal activities.
As long as you show your work
My dog… Well… Did that “dividing to multiply” thing. It was a field study. Turns out, homework doesn’t multiply.
Just wait until they learn that computers subtract by adding, and multiply by adding, and divide by adding, and do exponents by adding, and do logarithms by adding.
It multiplies using a complex set of gate arrays that do some adding; otherwise, hardware multipliers are essentially multiplication tables built up from logic gates. Early CPUs did multiplication by adding (multiplication is essentially just adding the same number to itself over and over), and if you were lucky it was optimized to use bit-shifts.
Division is a lot more complicated, though. I did some optimization by multiplying with reciprocals instead, but the speed gain was negligible due to memory bandwidth limitations.
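If anyone wants to poke at the "multiply by adding and shifting" idea, here's a rough Python sketch of the textbook shift-and-add approach (not how any specific CPU actually wires it, and non-negative integers only), plus the reciprocal trick:

```python
def shift_and_add_multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only adds and bit-shifts."""
    result = 0
    while b:
        if b & 1:          # lowest bit of b is set:
            result += a    # add the current shifted copy of a
        a <<= 1            # shift a left (i.e. double it)
        b >>= 1            # move on to the next bit of b
    return result

print(shift_and_add_multiply(13, 11))  # 143

# and the "multiply by the reciprocal instead of dividing" trick:
inv = 1.0 / 7.0
print([v * inv for v in (1.0, 2.0, 3.0)])  # same as dividing each by 7, up to rounding
```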
There must be add-vantages to this design.
And don’t get me started on De Morgan’s laws!
Wait…is it All…just adding? ಠ_ಠ
Always has been
They divide by 0.5.
1÷0.5 = 2.
from a formal perspective, division is an “abbreviation” for multiplying by a reciprocal. for example, you first define what 1/3 is, and then 2/3 is shorthand for 2 * (1/3). so in this sense, multiplication and division are extremely similar.
same thing goes for subtraction, but now the analogy is even stronger since you can subtract any two numbers (whereas you “can’t” divide by 0). so x - y is shorthand for x + (-y). and -y is defined “to be the number such that y + (-y) = 0”.
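to make that concrete, here's a tiny Python example with exact fractions (my own illustration, not anything formal):

```python
from fractions import Fraction

x = Fraction(2)
y = Fraction(7, 4)

# "2/3" is really 2 * (1/3): division is multiplication by the reciprocal
assert x / 3 == x * Fraction(1, 3)

# and "x - y" is really x + (-y), where -y is the number with y + (-y) == 0
assert x - y == x + (-y)
assert y + (-y) == 0

print(x / 3, x - y)  # 2/3 1/4
```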
The way I think of it, there is no subtraction, and there is no division. Or square roots.
There are just layers of operations (the adding/subtracting layer, which I think of as counting; the multiplying/dividing layer, which I think of as grouping; etc.).
Everything within that layer is fundamentally the same thing. But we just have multiple ways of saying it.
Partly because teaching kids negative numbers is harder than teaching subtraction, and thinking about fractions is hard enough without also thinking of them as representing relationships via multiplication.
Again, just how my brain does things. I’m not a mathematician or anything, but I’m pretty decent at regular math.
i think this a really nice way of thinking of things, especially for regular everyday life.
as a mathematician though, i wanted to mention how utterly and terribly cursed square roots are. (mainly just to share some of the horrors that lurk beneath the surface.) they’ve been a problem for quite some time. even in ancient greece, people were running into trouble with √2. it was only fairly recently (around the 17th century) that they started looking at complex numbers in order to get a handle on √-1. square roots led to the invention of two different “extensions” of the standard number systems: the real numbers (e.g. for √2), and later, the complex numbers (e.g. for √-1).
at the heart of it, the problem is that there’s a fairly straightforward way to define exponentiation by whole numbers: 3^n just means multiply 3 by itself a bunch of times. but square roots want us to exponentiate things by a fraction, and it’s not really clear what 3^(1/2) is supposed to mean. it ends up being that 3^(1/2) is just defined as 3^(1/2) = x, where x is “the number that satisfies x^2 = 3”. and so we’re in this weird situation where exponentiating by a fraction is somehow defined differently than exponentiating by a whole number.
but this is similar to how multiplication is defined: when you multiply something by a whole number, you just add a number to itself a bunch of times; but if you want to multiply by a fraction, then you have to get a bit creative. and in a very real sense, multiplication “is the exponentiation of addition”.
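here's that ladder written out as toy Python (whole-number cases only; the fractional cases are exactly the part that needs the extra definitions above):

```python
def times(x: float, n: int) -> float:
    """multiply x by a whole number n by adding x to itself n times"""
    total = 0.0
    for _ in range(n):
        total += x
    return total

def power(x: float, n: int) -> float:
    """raise x to a whole-number power n by multiplying by x n times"""
    result = 1.0
    for _ in range(n):
        result *= x
    return result

print(times(3.0, 4))  # 12.0
print(power(3.0, 4))  # 81.0

# power(3.0, 0.5) makes no sense in this scheme: 3**0.5 has to be *defined*
# separately, as "the number x with x * x == 3"
```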
Just look at it as a mass calculation. The original mass is X and after dividing it’s X/2, so you have two halves. Simple as that :)
All the people trying to explain why division and multiplication are the same and dividing by fractions bla bla bla…
But they’re missing the point that a cell dividing is nothing like algebraic division, so the analogy just doesn’t make sense.
Saying it’s “dividing in half”, so it’s actually “x/0.5 = 2x”, doesn’t make any sense, because the phrase “divide in half” in every other context means “x/2”…
Anyway, if you want to model a cell dividing you should use an exponential.
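Something like this (toy numbers of my own, not a real biology model):

```python
def cells(t_hours: float, n0: float = 1.0, doubling_time: float = 24.0) -> float:
    """Population after t_hours if every cell splits once per doubling_time."""
    return n0 * 2 ** (t_hours / doubling_time)

print(cells(0))   # 1.0
print(cells(24))  # 2.0
print(cells(72))  # 8.0
```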
Mathematicians: they don’t think biology be like that, but it do
Biology is just when molecules try too hard.
“Given enough time, hydrogen will begin to wonder where it came from.”
No-good nucleic acids making everyone else look bad. Seriously, who TF thinks it’s healthy to work for a billion years straight? 🙄
Multiplication and division are inverse operations; you can express one as the other, like expressing addition as the subtraction of a negative.
IIRC you can do this with integration and differentiation too, but I’ve never tried it, and I’m honestly not sure whether it’s true or just something someone tried to convince me was good enough for taxpayer-funded work when applying integrals and derivatives in an engineering project.
from a practical perspective, you can mostly think of integration and differentiation as inverse operations. (it works fine for most functions that come up in most applications.)
but this doesn’t really hold true in general. a famous example is the gaussian function (the one used to make bell curves): its integral can’t be found by using differentiation to “undo” integration, because it has no elementary antiderivative. the general problem is that it’s a lot easier for a function to be integrable than it is for a function to be differentiable. (all continuous functions are integrable, but not all continuous functions are differentiable. even more troubling, there are integrable functions that aren’t even continuous.)
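a quick numerical illustration of the gaussian example, assuming scipy is available (just a sanity check, nothing rigorous):

```python
import math
from scipy.integrate import quad  # numerical integration

# integral of exp(-x^2) from 0 to 1: there is no elementary antiderivative,
# so you can't "differentiate backwards" to get a closed form
numeric, _err = quad(lambda x: math.exp(-x * x), 0.0, 1.0)

# the exact value has to be written with the special function erf,
# which exists precisely because no elementary formula does
exact = math.sqrt(math.pi) / 2 * math.erf(1.0)

print(numeric, exact)  # both ≈ 0.7468
```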
Always bothered me that this feels unintuitive in maths, even though this is precisely what maths tries to model with division.
But yeah, being able to divide by fractions of 1 and negative numbers and whatnot, that really does not make it feel like you’re cutting cake.
If you divide two cakes by half a cake you get four half cakes.
And negative cakes just make you more hungry. So if you’re 2 cakes hungry and I give you 5 you’ll only have 3 left because you ate them.
Yeah, I guess, lots of maths being done without units is the culprit here.
2 / 0.5 = 4
just makes it sound like you’ve magically applied some transformation to 2, which has cloned it.
2 cakes / 0.5 cakes = 4 half-cakes
rather makes it clear why it’s suddenly double the amount, without any cloning involved.
x * y = x / ( 1/y )
It’s Feynman’s multiplication technique. What you do is make a slightly more complicated multiplication, then divide everything by infinity so it all goes to zero, then pull out of a hat a magician pulling himself out of a hat.
Multiply by the reciprocal to divide. Or judiciously divide in a modulus group to make the number bigger. Mathematicians have seen some shit.