Answer:
When electricity flows through a wire, electrical resistance turns some of the energy into heat. The heat grows with the square of the current, so doubling the amperage quadruples the energy lost as heat. Too much heat and the wires melt.
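To make that squared relationship concrete, here's a minimal Python sketch; the 0.5-ohm wire resistance is an invented number just for illustration:

```python
def heat_watts(current_amps: float, resistance_ohms: float) -> float:
    """Power dissipated as heat in a conductor: P = I^2 * R."""
    return current_amps ** 2 * resistance_ohms

R = 0.5  # ohms, a hypothetical stretch of wire
for amps in (10, 20, 40):
    print(f"{amps:>3} A -> {heat_watts(amps, R):6.1f} W of heat")

# 10 A ->   50.0 W of heat
# 20 A ->  200.0 W of heat  (double the current, four times the heat)
# 40 A ->  800.0 W of heat
```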
Luckily, we can use transformers at the power plant to trade amperage for voltage: step the voltage way up and the current drops by the same factor, since power is voltage times current. That lets electricity travel long distances without too much energy loss, and without burning up the wires. For the main transmission lines, the voltage can be more than 500,000 volts. Substations have transformers that step the voltage back down in stages, increasing the amperage, and then the pole transformer steps it down one last time, to either 120 volts or 240 volts, depending on where you live. That increases the amperage again, giving our appliances the voltage and current they need.
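Here's a short sketch of why the high-voltage step matters so much; the 100 MW load and 10-ohm line resistance are invented numbers, but the I = P / V and I-squared-R relationships are the real ones:

```python
def line_loss(power_watts: float, volts: float, line_ohms: float) -> float:
    current = power_watts / volts    # I = P / V
    return current ** 2 * line_ohms  # heat lost in the line itself

P = 100e6      # 100 MW to deliver (hypothetical)
R_LINE = 10.0  # line resistance in ohms (hypothetical)

for volts in (50_000, 500_000):
    loss = line_loss(P, volts, R_LINE)
    print(f"{volts:>7,} V: {loss / 1e6:5.2f} MW lost ({loss / P:.2%})")

#  50,000 V: 40.00 MW lost (40.00%)
# 500,000 V:  0.40 MW lost (0.40%)  -> 10x the voltage, 100x less loss
```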
As for the 120 volts in the US, it comes down to economics. When electricity first reached homes, it was used only for lighting, and roughly 110 to 120 volts was the sweet spot for the light bulbs of the day, so that became the standard. Sixty cycles per second worked well for powering electric motors in factories, so that was adopted too.
As use of electricity spread to other countries, it was found that higher voltages let you use smaller wires, so Europe adopted 240 volts, saving money on power lines. One price they pay is that their incandescent bulbs need thinner filaments, and burn out more easily. AEG in Germany felt that 50 cycles per second was better, and built its system around that.
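The smaller-wire point falls out of the same arithmetic as before; here's a quick sketch using an invented 1,500-watt appliance:

```python
P = 1500.0  # watts, a hypothetical electric kettle

for volts in (120, 240):
    amps = P / volts  # I = P / V
    print(f"{volts} V: draws {amps:5.2f} A")

# 120 V: draws 12.50 A
# 240 V: draws  6.25 A  -> half the current, a quarter of the I^2*R heat,
#                          so thinner (cheaper) wire can carry the load
```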
Once the systems were in place, it was just too expensive to change them. Imagine if the government suddenly told everyone that they had to replace every electrical device in their house because they were switching systems.