How do I find the average of an array in JavaScript?
I’m working on calculating the average of an array in JavaScript, but I’m running into some issues.
Here’s the array I’m using:
const grades = [80, 77, 88, 95, 68];
I first tried:
let avg = (grades / grades.length) * grades.length;
console.log(avg);
But that gave me NaN.
Next, I tried:
for (let grade of grades)
avg = (grade / grades.length) * grades.length;
console.log(avg);
And this resulted in 68, which I don’t quite understand.
My questions are:
- Why did I get 68 as the output?
- How can I properly calculate the average of an array in JavaScript?
Hey @Punamhans, I get your point. There's an old-school method I use; maybe you can try this too:
let sum = 0;
for (let i = 0; i < grades.length; i++) {
sum += grades[i];
}
let avg = sum / grades.length;
console.log(avg);
You were getting 68 earlier because your loop overwrote avg on every iteration instead of accumulating a total: each pass computed (grade / grades.length) * grades.length, and the division and multiplication cancel out, so avg just ended up equal to the last element, which is 68. (Your very first attempt gave NaN because grades / grades.length tries to divide the whole array by a number, which coerces to NaN.) This loop fixes that by summing everything first and dividing once, giving you the true average of all the elements.
Modern and clean: for…of. This one feels a bit more natural if you're looping over values directly:
let total = 0;
for (let grade of grades) {
total += grade;
}
let avg = total / grades.length;
console.log(avg);
Same idea, just simpler to read. It's one of the most popular ways devs calculate an average of an array in JavaScript.
Now if you’re into functional programming or just like a compact one-liner:
const avg = grades.reduce((sum, grade) => sum + grade, 0) / grades.length;
console.log(avg);
This method uses .reduce() to total up all values in one go.
It’s elegant, efficient, and very “modern JavaScript.” Perfect for when you want that JavaScript average without writing multiple lines.