The sum of the squares of the first ten natural numbers is 1^2+2^2+...+10^2=385. The square of the sum of the first ten natural numbers is (1+2+...+10)^2=55^2=3025. Hence the difference between the sum of the squares of the first ten natural numbers and the square of the sum is 3025−385=2640. Find the difference between the sum of the squares of the first one hundred natural numbers and the square of the sum.

This one is easy to do wrong, and easy to do right - which means I can mess around with a goofy language! Let's try Rockstar, an esoteric language which allows coders to write programs that look like rock lyrics:

```
Tommy was a lovestruck ladykiller
Jimmy was nothing,
Billy was nothing
while Tommy ain't nothing,
Put Tommy of Tommy with Jimmy into Jimmy
Put Tommy with Billy into Billy
knock Tommy down

Put Billy of Billy without Jimmy into Jimmy
shout Jimmy
```
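(Note the blank line after "knock Tommy down" — Rockstar ends a loop body at an empty line, so without it the last two lines would run inside the loop.) For readers who don't speak Rockstar, here's a rough Python translation of the lyrics, keeping the band's variable names:

```python
# Rough Python equivalent of the Rockstar program above.
# Tommy counts down from 100; Jimmy accumulates the sum of squares,
# Billy the plain sum. At the end Jimmy is reused to hold the answer.
tommy = 100
jimmy = 0  # sum of squares: 1^2 + 2^2 + ... + 100^2
billy = 0  # plain sum: 1 + 2 + ... + 100
while tommy != 0:
    jimmy = tommy * tommy + jimmy
    billy = tommy + billy
    tommy -= 1  # "knock Tommy down"
jimmy = billy * billy - jimmy
print(jimmy)
```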

Of course, this isn't a particularly lyrical block of code, but it solves the problem! But we can do better than just loop through 1-100, performing operations for each number. We already know how to compute the sum of integers in a range, using Gauss's method. But luckily there's a way to compute the sum of the squares of numbers in a series as well!
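As a refresher, Gauss's method pairs 1 with n, 2 with n−1, and so on — n/2 pairs, each summing to n+1 — giving the closed form n(n+1)/2. A quick sketch:

```python
def gauss_sum(n):
    # Sum of 1..n via Gauss's pairing trick: n/2 pairs, each totaling n + 1.
    return n * (n + 1) // 2

print(gauss_sum(10))   # 55
print(gauss_sum(100))  # 5050
```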

I won't pretend to have come up with this myself - I use Google like anybody else. As it turns out, we can figure out this problem by thinking three dimensionally, something I'm rarely brave enough to do. I don't understand a lot of that page, but I know how to copy-paste a formula! Let's use this help from Wikipedia to write a one-line solution, this time in ... Python‽:

`print(((100*101)//2)**2 - (2*100**3 + 3*100**2 + 100)//6)`

OK, that "one line" solution wasn't as zany as they generally are. But that's mostly because this problem, when solved the right way, is as simple as plugging numbers into a calculator.
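Just to be safe, here's a quick sanity check that the closed forms agree with a brute-force loop — the sum-of-squares formula expands to (2n³ + 3n² + n)/6, which is what the one-liner uses:

```python
n = 100
# Brute force: square of the sum, minus the sum of the squares.
brute = sum(range(1, n + 1)) ** 2 - sum(i * i for i in range(1, n + 1))
# Closed forms: Gauss's sum squared, minus n(n+1)(2n+1)/6 expanded.
closed = (n * (n + 1) // 2) ** 2 - (2 * n**3 + 3 * n**2 + n) // 6
print(brute, closed)  # both 25164150
```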