This question is more of a "why does this particular code work this way?" than a "how can I make this code work this way?"
I'm going through the Codecademy tutorials for JavaScript and came across a lesson that I can make use of in my own code, because I can see the pattern in this particular code, but it doesn't make sense to me why it works this way.
Below is an example:
let myArray = ['First','Second','Third'];
var last = myArray[myArray.length - 1];
console.log(last);
The console displays "Third" when running the above code. I know JavaScript is a zero-indexed language, and I know that 'First' in this array would be at position 0, counting from left to right. But my question is: even if we start at 0 and count every item (0, then 1, then 2), shouldn't the console log "Second" instead of "Third"? Or does length in JavaScript ignore the zero-based indexing system and actually start counting at 1, in which case the answer should STILL be "Second" and not "Third"? No matter how my brain adds this up, I keep getting "Second" instead of "Third", even though the output on the console is "Third".
Can anyone explain what I'm missing?
myArray.length - 1 == 2, and myArray[2] is 'Third'. The length is the number of elements, which is 3, not 2, because there are three elements. The indexes are zero-based, but the length is simply a count: the indexes 0, 1, 2 still add up to a length of 3. That is why the last element always lives at index length - 1.
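Here is a minimal sketch that prints the length and each index side by side, using the same array from the question, so you can see the two numbering systems next to each other:

let myArray = ['First', 'Second', 'Third'];

// length counts every element, so it is 3
console.log(myArray.length); // 3

// indexes are zero-based, so they run 0, 1, 2
console.log(myArray[0]); // 'First'
console.log(myArray[1]); // 'Second'
console.log(myArray[2]); // 'Third'

// the last index is therefore length - 1
console.log(myArray[myArray.length - 1]); // 'Third'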