First Year Mathematics B (Integration, Series, Discrete Maths & Modelling)

leehuan

Well-Known Member

Can you recall whether there have been any contradiction proofs in the HSC more recently than 2003? Surely there should've been (though I can't remember any off the top of my head)?
Couldn't really recall any off the top of my head either.

Flop21

Well-Known Member

1. Will I lose marks if I just write down the limit in questions asking 'does this series converge, and if so, find its limit'?

2. Will I lose marks if I don't show full working when row reducing where it's just an intermediate step? E.g. when finding eigenvectors I can sometimes do it in my head, and will just write the final form.

leehuan

Well-Known Member

1. Will I lose marks if I just write down the limit in questions asking 'does this series converge, and if so, find its limit'?

2. Will I lose marks if I don't show full working when row reducing where it's just an intermediate step? E.g. when finding eigenvectors I can sometimes do it in my head, and will just write the final form.
Well, for finding the limit of a series: unless it's really obvious, how do you plan to find it by inspection?
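As an illustration of the "really obvious" case (a made-up example, not one from the thread): for the geometric series sum_{n=0}^inf (1/2)^n, whose limit is 1/(1 - 1/2) = 2, you can sanity-check the claimed limit against partial sums.

```python
# Hypothetical sanity check: partial sums of the geometric series
# sum_{n=0}^inf (1/2)^n, whose limit is 1/(1 - 1/2) = 2.
def partial_sum(N):
    """Sum of (1/2)^n for n = 0, ..., N."""
    return sum(0.5 ** n for n in range(N + 1))

print(partial_sum(5))   # 1.96875
print(partial_sum(50))  # very close to 2
```

Writing down the limit alone is fine when you can name the series (geometric, telescoping, etc.); otherwise you'd normally be expected to justify convergence first.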

But with 2, I also skip row reductions when they're obvious enough. I think my tutor does as well.
I work around it by writing (matrix)x = 0, therefore x = (my answer).
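For what it's worth, the skipped step can be spelled out mechanically. A minimal sketch (the matrix A = [[2, 1], [1, 2]] and its eigenvalue 3 are made up for illustration): row-reduce (A - 3I) with exact fractions to expose the eigenspace.

```python
from fractions import Fraction

# Hypothetical example: A = [[2, 1], [1, 2]] has eigenvalue 3, so we
# row-reduce (A - 3I) -- the step being done "in my head" above.
m = [[Fraction(-1), Fraction(1)],
     [Fraction(1), Fraction(-1)]]

# R2 -> R2 - (m[1][0]/m[0][0]) * R1 clears the first column of row 2.
factor = m[1][0] / m[0][0]
m[1] = [m[1][j] - factor * m[0][j] for j in range(2)]

assert m[1] == [0, 0]  # the second row vanishes
# The surviving row says -x1 + x2 = 0, i.e. x1 = x2, so (1, 1)
# spans the eigenspace for eigenvalue 3.
```

Whether a marker accepts the skipped version presumably depends on how routine the reduction is; this just shows what the omitted working would look like.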

leehuan

Well-Known Member

The good thing is that they've said they WANT to give you marks. That's also why they don't publish mark allocations like the HSC does: they don't want to limit themselves.

So I reckon if you skipped some steps, but not too many, and got the right answer, you'd get the marks. (Of course, if you skipped too many steps and got the wrong answer, that's another story.)

leehuan

Well-Known Member

(for the same linear independence qn)

Another friend wants to know if there's a fault in what he did, and I can't really pinpoint anything.

InteGrand

Well-Known Member

(for the same linear independence qn)

Another friend wants to know if there's a fault in what he did, and I can't really pinpoint anything.
Yeah that's correct.

leehuan

Well-Known Member

I could probably look it up, but I feel as though InteGrand's answers are more comprehensible...

Prove that $i^i = \exp\left(-\frac{\pi}{2}\right)$.

InteGrand

Well-Known Member

I could probably look it up, but I feel as though InteGrand's answers are more comprehensible...

Prove that $i^i = \exp\left(-\frac{\pi}{2}\right)$.
To make sense of something like $i^i$, we need to define complex powers and logarithms. I'll be slightly brief here (search for things like "complex logarithm" and "complex powers" online for more info). We generally define, for $z$ and $c$ complex with $z \neq 0$, $z^c = e^{c \log z}$, where $\log$ is the complex logarithm, which is multivalued (so powers are generally multivalued). For complex numbers $z = re^{i\theta}$ ($r > 0$, $\theta \in \mathbb{R}$), we define $\log z = \ln r + i\theta + 2\pi i n$ ($n \in \mathbb{Z}$), where $\ln$ is the usual real logarithm.

So $i^i = e^{i \log i}$, and $\log i = \ln 1 + i\frac{\pi}{2} + 2\pi i n = i\frac{\pi}{2} + 2\pi i n$, since $|i| = 1$ and $\mathrm{Arg}(i) = \frac{\pi}{2}$. Hence $i^i = e^{i\left(i\frac{\pi}{2} + 2\pi i n\right)} = e^{-\frac{\pi}{2} - 2\pi n}$, $n \in \mathbb{Z}$. Taking $n = 0$ gives the principal value (corresponding to the principal logarithm being used; you can search online for more info on this): $e^{-\frac{\pi}{2}}$.

So the key thing is that, in general, complex powers and logarithms are multivalued. Also pretty cool: $i^i$ is real!
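For the curious, Python's complex exponentiation uses the principal branch of the logarithm, so a quick numeric check of the principal value above (just an illustration, not part of the proof):

```python
import math

# Python evaluates 1j**1j via the principal logarithm, so it should
# agree with exp(-pi/2) ~ 0.20788, with zero imaginary part.
principal = (1j) ** (1j)
print(principal)
print(math.exp(-math.pi / 2))
```

The two printed values match, and the imaginary part of `1j**1j` is zero to machine precision, consistent with $i^i$ being real.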


Drsoccerball

Well-Known Member

I could probably look it up, but I feel as though InteGrand's answers are more comprehensible...

Prove that $i^i = \exp\left(-\frac{\pi}{2}\right)$.
Spending too much time on mathematical memes?

FritzTheCat

New Member
Is everyone still doing this? I'd very much like to participate.

Here's a problem I've been stuck on for a little while. Any takers?


sida1049

Well-Known Member
Consider the equation $\lambda_1 v_1 + \cdots + \lambda_m v_m = 0$. We need to show that all coefficients are zero. Take the dot product of both sides with $v_i$: by orthogonality, every term except the $i$-th vanishes, leaving $\lambda_i \|v_i\|^2 = 0$. Since $v_i \neq 0$, we have $\|v_i\|^2 \neq 0$, so $\lambda_i = 0$ for each $i = 1, 2, \ldots, m$. And we're done.
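A quick numeric illustration of that dot-product step (the vectors and coefficients here are made up): dotting a combination with $v_1$ leaves only $\lambda_1 \|v_1\|^2$, because the cross terms vanish by orthogonality.

```python
# Hypothetical pairwise-orthogonal nonzero vectors in R^3.
v1, v2, v3 = (1, 0, 0), (0, 2, 0), (0, 0, 3)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Form lam1*v1 + lam2*v2 + lam3*v3 and dot it with v1.
lams = (2, -1, 5)
combo = tuple(sum(l * v[i] for l, v in zip(lams, (v1, v2, v3)))
              for i in range(3))

# All cross terms vanish, so only lam1 * |v1|^2 survives.
assert dot(combo, v1) == lams[0] * dot(v1, v1)
# So if combo were the zero vector, lam1 * |v1|^2 = 0 would force lam1 = 0.
```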
