Intersection of Geometric Measure Theory and PDE?

Hey guys, I’m currently in a seminar class on topics in geometric measure theory and I’ve been tasked with preparing a talk on recent-ish research (past 15 years or so) on a topic of my choice. My interests lie in PDE (maybe), and I was wondering if anyone had any insights on recent interesting problems in the intersection of GMT and PDE.


submitted by Nevin Manimala /u/UnorthodoxGentleman

Are some people just inherently better at math?

In college right now. Have taken multivariable calc, linear algebra. Taking analysis/discrete.

It seems like I get extremely diminishing returns when studying for an exam. I never seem to be able to break the 1 SD above average mark, even if I put like 50 hours into studying for an exam.

Contrast this with the only two other math majors I know. We live on the same floor, and all they do is play League and Overwatch all day. They maybe study like 5 hours for an exam, but they consistently get more than 2 SDs above average. To be fair, one of them did do USAMO in HS, which probably helps. But even for entirely new concepts, things that take them like an hour to understand take me 5. I feel really dumb when we study together.

All this might sound petty, but as someone seriously considering doing math for a living, I’m starting to doubt how good I am at it. I’m starting to think that I’m not cut out for grad school and academia.

Any thoughts? Is it time to quit and do something “softer” like econ/stats?

submitted by Nevin Manimala /u/ranjeetkuma

Is there a relationship between the Gaussian integral and Fourier series?

The way Fourier series are typically taught, you take the integral of f(x)cos(nx) or f(x)sin(nx) from -pi to pi, then divide by pi, and this works because cos(nx) and sin(mx) are orthogonal on the interval [-pi, pi], and so are sin(nx) and sin(mx) for n ≠ m, and cos(nx) and cos(mx) for n ≠ m.

So I learned about inner product spaces and I realized that cos(nx) and sin(nx) are used as a basis for the function space C[-pi, pi] when you develop a Fourier series. This makes sense because these functions are orthogonal to each other — but they’re not orthonormal, because the integral of (sin(nx))^2 from -pi to pi is pi, not 1, and the same goes for cos(nx).

To make them orthonormal, all you have to do is divide them by sqrt(pi). Then the 1/pi factor in the Fourier coefficients is no longer necessary, and you can treat it like a vector space, where we represent a vector as v = ∑⟨v, e(n)⟩e(n).

My question is whether that sqrt(pi) term is purely coincidental or if there’s something more going on here, because I know that that’s the value of the Gaussian integral. What’s weird to me is that you can normalize trig functions over [-pi, pi] by dividing by sqrt(pi), and you can normalize e^(-x^2) over (-inf, +inf) by doing the same thing. I know that the generalized Gaussian integral is important in probability, and that its limit is the Dirac delta, which is again very important. Am I on to something, or am I misguided?
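As a quick numerical sanity check of the sqrt(pi) normalization, here is a small plain-Python sketch (the helper name `inner` and the midpoint-rule step count are ad-hoc choices) that verifies sin(2x)/sqrt(pi) has unit norm on [-pi, pi] and stays orthogonal to cos(3x)/sqrt(pi):

```python
import math

def inner(f, g, steps=20000):
    # approximate the inner product <f, g> = integral of f(x)g(x) dx
    # over [-pi, pi] with a midpoint rule
    a, b = -math.pi, math.pi
    h = (b - a) / steps
    return h * sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h)
                   for i in range(steps))

s = math.sqrt(math.pi)
e2 = lambda x: math.sin(2 * x) / s  # sin(2x) rescaled by 1/sqrt(pi)
e3 = lambda x: math.cos(3 * x) / s  # cos(3x) rescaled by 1/sqrt(pi)

print(round(inner(e2, e2), 6))  # ≈ 1.0: unit norm after dividing by sqrt(pi)
print(round(inner(e2, e3), 6))  # ≈ 0.0: still orthogonal
```

The same check with the unscaled sin(2x) gives norm-squared pi, which is exactly the 1/pi factor the textbook formulas carry around.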

And lastly, the reason Fourier series were invented was to solve the heat equation and other PDEs. Is there something relating function spaces, PDEs, and the Gaussian integral? What branch/theory of math is this all a part of?

PS: Reddit should really implement support for subscripts, or some kind of basic text editor.

submitted by Nevin Manimala /u/doom_chicken_chicken

Do 4 dimensional objects have 3 dimensional shadows when rotated at certain angles?

Like, a 2-dimensional square has a 1-dimensional shadow when rotated 90 degrees, and a 1-dimensional line has a 0-dimensional shadow when you stand it up, so I guess that could translate into 4 dimensions?
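To make the analogy concrete, here is a small sketch (the function name and the choice of the x–w rotation plane are just for illustration) that rotates the 16 vertices of a unit tesseract and "casts a shadow" into 3D by dropping the w coordinate:

```python
import math
from itertools import product

def shadow(theta):
    # vertices of the unit tesseract, rotated by theta in the x-w plane,
    # then orthographically projected to 3D by dropping the w coordinate
    pts = set()
    for x, y, z, w in product((0.0, 1.0), repeat=4):
        x_rot = x * math.cos(theta) - w * math.sin(theta)
        pts.add((round(x_rot, 9), y, z))
    return pts

print(len(shadow(0.0)))          # 8: vertex pairs line up, the shadow is an ordinary cube
print(len(shadow(math.pi / 6)))  # 16: at a generic angle all 16 vertices cast distinct points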

submitted by Nevin Manimala /u/champyheteromer

Several questions about “theoretical”/”applied” Linear Algebra & its applications to Machine Learning.

I am a beginner student. I often hear how Linear Algebra is very important to know for Machine Learning. I found there are two sides to Linear Algebra: “applied” (along the lines of Strang’s book) and “theoretical” (along the lines of Axler’s book).

My questions are as follows:

1) Which version of LA do people mean to learn for applications in ML? The “applied” or “theoretical” version?

2) What parts of LA are used in ML? What specific topics/theorems do people have in mind when they say LA is applied to ML?

3) Is theoretical LA useful to know for ML? If so, how? What applications does theoretical LA have in ML? What are the important aspects of theoretical LA in ML?

4) What sort of research is being done in Computer Science/Machine Learning that utilizes Linear Algebra? Are there any fields I can google to find papers on these sorts of research problems? Googling “Linear Algebra Research and Machine Learning” is a bit vague.


submitted by Nevin Manimala /u/retrofit_

Question about permutations and average distance

Credit to /u/Sarcon5673 for the original question!

For any permutation f of the set {1, 2, … , n}, it can be shown that the “ergodic average” E(f),

E(f) := lim (m → ∞) (1/m) ∑(k = 1 to m) ∑(j = 1 to n) |f^k(j) − j|

exists. What permutation maximizes E? Is the permutation unique?
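Since f is a permutation, the sequence f^1, f^2, … is periodic with period equal to the order of f (the lcm of its cycle lengths), so the Cesàro limit above is just the average over one period. A brute-force sketch along those lines (0-indexed labels instead of 1..n; the helper name is hypothetical):

```python
from itertools import permutations
from math import lcm

def ergodic_average(f):
    # f is a 0-indexed permutation given as a tuple: f[j] is the image of j
    n = len(f)
    # the order of f is the lcm of its cycle lengths
    seen, order = [False] * n, 1
    for i in range(n):
        if not seen[i]:
            length, j = 0, i
            while not seen[j]:
                seen[j] = True
                j = f[j]
                length += 1
            order = lcm(order, length)
    # f^k is periodic in k with period `order`, so average over one period
    total, g = 0, list(range(n))
    for _ in range(order):
        g = [f[j] for j in g]  # g becomes f^k on the k-th pass
        total += sum(abs(g[j] - j) for j in range(n))
    return total / order

# brute-force search for a maximizer over S_4
best = max(permutations(range(4)), key=ergodic_average)
print(best, ergodic_average(best))
```

For instance, the identity has E = 0 and a single transposition of adjacent elements has E = 1; exhaustive search over small n is a cheap way to form a conjecture about the maximizer before trying to prove anything.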

submitted by Nevin Manimala /u/WaltWhit3

To what degree are you supposed to understand lecture material as it is presented to you?

Presuming the lecturer is above average, how many of you are able to understand at least what is going on, or even come up with further questions or conclusions, while the material is being presented? The reason I ask is that in my analysis class I no longer have any intuition for the material (we are covering metric spaces and norms), so I feel as though I am just writing things down to learn later during homework and studying, while other students in the class seem to know enough to ask decent questions or answer the professor’s questions.

submitted by Nevin Manimala /u/samason97