The latest Numberphile video on the Dehn invariant has me wondering about the motivation behind Hilbert’s 3rd Problem.

According to the Wikipedia page on it, the motivation behind the question was that the derivation of the formula for the volume of a tetrahedron was the only result in Euclid’s Elements that needed the method of exhaustion, if restricted to polytope geometry (only points, straight lines, and flat planes allowed). But thinking about it more, I wondered if there was a different way to prove that the volume equals 1/3 its base area times its height: use the fact that one can divide any tetrahedron into collections of similar, smaller tetrahedra and related octahedra in various ways, set different subdivisions of the same arbitrary tetrahedron equal to each other in volume, and solve the resulting equations involving different ratios of tetrahedra to octahedra (I’m typing this on my phone quickly, but I did a lot of drawing in GeoGebra and such that helped me visualize it).

Anyways, to my surprise, I was actually able to do it (I’ll post my work tomorrow), which supposedly wasn’t possible. After going through it, I realized my mistake: I assumed that the volume of a tetrahedron with edges of length a, b, c, d, e, f is to the volume of a tetrahedron with all edges of length a/2, b/2, c/2, d/2, e/2, f/2 as that tetrahedron’s volume is to one with edge lengths a/4, b/4, c/4, d/4, e/4, f/4. It turns out that Euclid used the method of exhaustion to prove a tetrahedron’s volume first, and THEN used that to prove that the ratio of the volumes of similar sections of similar polyhedra equals the ratio of the volumes of the whole similar polyhedra.

But then this got me wondering: all this basically amounts to saying that the edge lengths of similar polyhedra are to each other as the square roots of the areas of their faces and the cube roots of their volumes. If one could prove that scaling all the edges of a polytope by a factor of n, while keeping all angles the same, scales the volume by a factor of n³, then you could use that to prove that the volume of a tetrahedron equals 1/3 its base area times its height. Anyways, I have more to write about this involving inscribing similar tetrahedra in similar cuboids, but I apologize because I gotta run; I will return to this post later. Basically, I’m wondering if there is still a way to derive the tetrahedron volume formula, without a method of exhaustion, using ratios of similar polyhedra, in a way that wouldn’t rely on cutting and rearranging parts into a cuboid (which is obviously impossible due to the difference in Dehn invariants).
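A quick sanity check on how far that cubic-scaling assumption gets you, sketched for the regular case (this is a standard computation, not a full proof of the volume formula): connecting edge midpoints cuts a regular tetrahedron of edge a into four tetrahedra of edge a/2 plus one regular octahedron of edge a/2. Writing V_T(a) and V_O(a) for the two volumes and assuming volume scales as the cube of the edge factor:

```latex
V_T(a) = 4\,V_T\!\left(\tfrac{a}{2}\right) + V_O\!\left(\tfrac{a}{2}\right)
       = 4 \cdot \tfrac{1}{8}\,V_T(a) + \tfrac{1}{8}\,V_O(a)
\quad\Longrightarrow\quad
V_O(a) = 4\,V_T(a).
```

So cubic scaling alone already pins the octahedron-to-tetrahedron volume ratio at 4:1, which matches the known values V_O(a) = (√2/3)a³ and V_T(a) = (√2/12)a³.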

submitted by /u/HexNash

How do I find correlation between categorical features and target variable?

I have a linear regression model. I have done one-hot encoding for the nominal categorical features and used the pandas category dtype for the ordinal features. Now, how do I find the correlation between these features and the target variable, so I can see if there is any correlation?
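One minimal sketch (with a made-up toy dataset; the column names here are hypothetical): one-hot encode the nominal feature, ordinal-encode the ordered one via its category codes, then use `DataFrame.corrwith` to get each encoded column's Pearson correlation with the target.

```python
import pandas as pd

# Hypothetical toy data: 'color' is nominal, 'size' is ordinal, 'price' is the target.
df = pd.DataFrame({
    "color": ["red", "blue", "red", "green", "blue", "green"],
    "size":  ["S", "M", "L", "M", "S", "L"],
    "price": [10.0, 12.5, 11.0, 14.0, 9.5, 15.0],
})

# One-hot encode the nominal feature.
encoded = pd.get_dummies(df["color"], prefix="color", dtype=int)

# Ordinal-encode the ordered feature via an ordered categorical dtype.
size_order = pd.CategoricalDtype(categories=["S", "M", "L"], ordered=True)
encoded["size"] = df["size"].astype(size_order).cat.codes

# Pearson correlation of each encoded column with the target.
correlations = encoded.corrwith(df["price"])
print(correlations)
```

For a 0/1 dummy column against a continuous target, Pearson correlation coincides with the point-biserial correlation, so this gives a reasonable first look at which categories move with the target.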

submitted by /u/Ghjjj4433

How do YOU do mathematical research?

I’m an aspiring theoretical computer scientist and mathematician, and I’m beginning what could become a long career of research. I would therefore like to pick up good habits for conducting research, with respect to daily routine, work ethic, particular methods for learning new things or discovering something new, etc. Today what I do is relatively ad hoc, and I don’t have a particular “process,” so to speak, that I follow. To that end, I would very much like to hear what you all do when you sit down to grind out some original work (and maybe that’s an oversimplification, and I’d love to hear why!). Thanks!

P.S. I did not put this in the career and education thread because this post isn’t necessarily about boosting my career. This is more along the lines of having mathematical researchers share their best practices with each other to foster better habits for us all.

submitted by /u/skmchosen1

Does anyone else not like to do exercises when reading a math book independently?

I like to read math books for fun. I find many of the big ideas fascinating, and I love trying to get an intuitive/visual understanding of the various concepts. But I don’t particularly care for the details of how to conduct a given proof, how to make a certain calculation, etc., so this often means I skip the exercises when reading a book independently (i.e., not for a course).

I realize the exercises are important if you’re looking to do this professionally, but I am not — I just enjoy reading and learning about math.

Does anyone else share this tendency?

submitted by /u/derpderp235

Proper REPEATED statement for SAS PROC GENMOD

I am trying to implement Poisson regression with log link and with robust error variance for survey data.

Here is working code for non-survey data that I tested, and it works as intended:

proc genmod data = eyestudy;
  class carrot id;
  model lenses = carrot / dist = poisson link = log;
  repeated subject = id / type = unstr;
  estimate 'Beta' carrot 1 -1 / exp;
run;

The code above, plus more information about Poisson regression with a log link and robust error variance for non-survey data, is here:

Below is an example of how to use PROC GENMOD for survey analysis (but with dist=binomial link=identity, and I think without robust error variance):

proc genmod data=nis10;
  class seqnumt estiapt10;
  model r_tet_not_utd = / dist=binomial link=identity;
  weight provwt;
  repeated subject=seqnumt(estiapt10);
  where sex = 2;
run;

Here the strata variable is estiapt10, the cluster variable is seqnumt, and the weight variable is provwt.

The code above and more information about survey data analysis are here:

My strata variable is CSTRATM, my cluster variable is CPSUM, and my weight variable is PATWT. The dependent variable is DIETNUTR, and the independent variable is age_group_var. My data is in sas_stata. So I tried this code:

proc genmod data=sas_stata;
  class age_group_var id CPSUM CSTRATM;
  model DIETNUTR = age_group_var / dist = poisson link = log;
  weight PATWT;
  repeated subject = id / type = unstr;
  repeated subject = CPSUM(CSTRATM);
  estimate 'Beta' age_group_var 1 -1 / exp;
run;

but it gave me warning:

WARNING: Only the last REPEATED statement is used. 

As I understand it, after reading the articles above and some other material, I am doing everything right except for the REPEATED statement. For Poisson regression with a log link and robust error variance on survey data, I assume the two REPEATED statements should somehow be combined into one. I tried several ways of combining them, but without any luck.

So my question is: What is the code for Poisson regression with log link and with robust error variance for survey data?

submitted by /u/vasili111

Interesting Math books to read that aren’t completely droning

Hi everybody,

I got my bachelor’s degree in math and work in the gaming industry. I was never super interested in completing a PhD because I wasn’t as excited about high-level math as I was about using it to create things in a non-abstract way.

With that being said, I still have some interest in high-level math, linear algebra among other areas. Does anybody know of any good, popular books you could find at the library that are interesting and won’t put me to sleep?

submitted by /u/T0mSellecksMoustache

Stata not detecting multicollinearity?

I have a study in which I investigate seizures vs a lot of variables, two of which are presence of headache and age.

Both have been found to correlate significantly with seizures.

However, when I regress these two variables against each other, headache vs. age, I find a statistically significant relationship there as well: the presence of headache is significantly more common at lower ages.

Shouldn’t Stata detect this as multicollinearity? It seems to me that age has nothing to do with seizures directly; what matters is whether people have a headache or not (or the other way around, or younger people have a higher risk of headache, whatever it is).
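For what it's worth, regression software generally only flags (and drops) perfectly collinear predictors automatically; predictors that are merely correlated produce no warning, they just inflate standard errors. A diagnostic such as the variance inflation factor (VIF) has to be requested explicitly. Here is a minimal sketch of that computation in Python, on simulated, purely illustrative data where headache is more likely at lower ages:

```python
import numpy as np

def vif(X, j):
    """Variance inflation factor of column j of X: 1 / (1 - R^2),
    where R^2 comes from regressing column j on the remaining
    columns (with an intercept)."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(len(y)), others])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    r2 = 1.0 - resid.var() / y.var()
    return 1.0 / (1.0 - r2)

# Simulated data: headache probability decreases with age, so the
# two predictors are correlated but not perfectly collinear.
rng = np.random.default_rng(0)
age = rng.uniform(20, 80, size=500)
headache = (rng.random(500) < (80 - age) / 60).astype(float)

X = np.column_stack([age, headache])
print(vif(X, 0), vif(X, 1))  # moderately above 1, well below the usual >10 alarm level
```

In Stata itself, `estat vif` after `regress` reports the same quantity; correlation of the strength described here typically yields a modest VIF rather than anything the software would treat as collinearity.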

submitted by /u/OssToYouGoodSir