ProdigyInspired
Tafe Advocate
- Joined
- Oct 25, 2014
- Messages
- 643
- Gender
- Male
- HSC
- 2016
So, as most Ext 2 students know, the last part of Integration involves proving the definite-integral properties of odd and even functions.
So for the odd proof (<s>I'll just do it below </s>, it's completely wrong, don't follow it)
<s>
![](https://latex.codecogs.com/png.latex?\bg_white Show\quad \int _{ -a }^{ a }{ f(x) } dx\quad =\quad 0)
![](https://latex.codecogs.com/png.latex?\bg_white Let\quad u\quad =\quad -x\\ \\ \begin{matrix} When\quad x\quad =\quad a,\quad u\quad =\quad -a \\ when\quad x\quad =\quad -a,\quad u\quad =\quad a \end{matrix}\\ \begin{matrix} \therefore \quad I\quad =\quad \int _{ -a }^{ a }{ f(-u) } (-du) \\ =\quad -\int _{ -a }^{ a }{ f(u) } (du) \end{matrix})
Then I get stuck here.
</s>
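For reference, here is a sketch of how the substitution usually plays out; the step my struck-out attempt misses is that du = -dx also reverses the limits:

```latex
Let $I = \int_{-a}^{a} f(x)\,dx$ where $f$ is odd, so $f(-x) = -f(x)$.

Substitute $u = -x$, $du = -dx$.
When $x = -a$, $u = a$; when $x = a$, $u = -a$. Then
\[
I = \int_{a}^{-a} f(-u)\,(-du)
  = \int_{-a}^{a} f(-u)\,du
  = \int_{-a}^{a} -f(u)\,du
  = -I,
\]
so $2I = 0$, giving $I = 0$.
```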
I've done it before, but I still can't wrap my head around the concept of the dummy variable.
If the area is bounded by a curve defined in terms of x, i.e. f(x), why is integrating f(u) the same thing?
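Not the HSC proof itself, but a quick numerical sketch of the dummy-variable idea (the midpoint-rule helper and the example odd function are my own, just for illustration): the integral only depends on the function and the limits, so the letter used for the variable changes nothing.

```python
def integrate(f, lo, hi, n=100_000):
    """Midpoint-rule approximation of the definite integral of f on [lo, hi]."""
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

# An example odd function: f(-x) = -f(x)
f = lambda x: x**3 - 4*x

# Same function, same limits -- only the name of the dummy variable differs.
area_x = integrate(lambda x: f(x), -2, 2)
area_u = integrate(lambda u: f(u), -2, 2)

print(area_x == area_u)          # True: the variable name is "integrated away"
print(abs(area_x) < 1e-6)        # True: odd function over symmetric limits gives 0
```

The variable is just a placeholder that disappears once the limits are substituted in, which is why swapping x for u (or anything else) leaves the area unchanged.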