For many of us, powering a lightbulb with a potato was our first encounter with the idea of self-generated electricity.
Some wire, a copper electrode, a zinc electrode and a potato were all that was needed to illuminate an LED, with the potato’s acidic, watery interior acting as an electrolyte that lets the two metals react and drive a small electrical current.
From an early age we are presented with the idea that electricity can be generated from renewable sources, yet outside of the classroom in the ‘real world’ our energy markets are focussed on fossil fuels.
We’ve lived this way for more than a century, and it has degraded the quality of the air we breathe, so much so that poor air quality has become the driver for changing energy sources in highly populated cities.
This detrimental environmental impact has, in part, fuelled the preconception that in order to be greener we need to be conservative: turn off the lights and use appliances sparingly.
But I think it’s time we stop demonising the air-conditioner and take a leaf out of our school science books.
Because ‘energy’ itself isn’t bad – it’s simply the sources we’ve traditionally used to produce electricity that can be harmful. In fact it is energy that powers our economies and raises our standards of living.
Making The Right Connections
Ever since Thomas Edison’s incandescent lightbulb began to power civilisation in the late 1800s, fossil fuels have been the main source of electricity supply.
The process is relatively straightforward: we burn fossil fuels, like coal and gas, to drive large steam turbines that produce electricity. This electricity is then passed through transmission and distribution infrastructure, i.e. power lines, until it eventually reaches your home or business.
In Australia, the National Electricity Market (NEM) is one of the largest interconnected electricity systems in the world, with more than 24,000 miles of transmission lines and cables supplying about nine million customers.
But is this really effective?
This was one of the questions I asked myself…