Measuring Earthquakes

Earthquakes aren’t measured linearly, but in orders of magnitude, which means a 6.1 magnitude quake like the one that recently shook Northern California is about twice as big – in terms of the ground motion recorded on a seismogram – as the 5.8 earthquake that rattled Washington, D.C., in 2011, and nearly three times as strong in terms of the amount of energy it released. Some more context: the 7.0 earthquake that devastated Haiti in 2010 was eight times bigger than the Northern California quake, and released 22 times more energy. A 6.0 quake releases about 31,623 times as much energy as a 3.0 quake, and a 7.0 releases about 31.6 times as much energy as a 6.0.
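
Those ratios follow from two standard rules of thumb: each whole unit of magnitude corresponds to a tenfold increase in recorded wave amplitude and roughly a 10^1.5 (about 32-fold) increase in released energy. A minimal sketch in Python, reproducing the comparisons above under those assumptions:

```python
def amplitude_ratio(m1: float, m2: float) -> float:
    """Ratio of recorded wave amplitudes: each whole magnitude unit is a factor of 10."""
    return 10 ** (m1 - m2)

def energy_ratio(m1: float, m2: float) -> float:
    """Ratio of released energy: each whole magnitude unit is a factor of 10**1.5 (~31.6)."""
    return 10 ** (1.5 * (m1 - m2))

print(amplitude_ratio(6.1, 5.8))  # ~2.0  -> about twice as big
print(energy_ratio(6.1, 5.8))     # ~2.8  -> nearly three times the energy
print(amplitude_ratio(7.0, 6.1))  # ~7.9  -> eight times bigger (Haiti vs. Northern California)
print(energy_ratio(7.0, 6.1))     # ~22.4 -> 22 times more energy
print(energy_ratio(6.0, 3.0))     # ~31,623
print(energy_ratio(7.0, 6.0))     # ~31.6
```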

So why do geologists talk about earthquakes this way? Why not use a scale that operates more like the linear ones used to measure weight, or length, or temperature, or any number of other natural phenomena?

The Richter magnitude scale was the method of earthquake measurement most widely used in the United States during the last century. According to govexec.com, Richter’s idea was to track the amount of energy released by a quake the way an astronomer would measure the brightness of a star. Each whole number on the magnitude scale indicated an earthquake ten times bigger than the last in terms of the wave amplitude recorded on a seismograph.
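
A minimal sketch of that logarithmic idea (a simplification: the real local-magnitude formula includes an empirical distance correction, folded here into a single assumed reference amplitude of 0.001 mm, a conventional value for a standard seismograph at 100 km):

```python
import math

def local_magnitude(amplitude_mm: float, reference_mm: float = 0.001) -> float:
    """Simplified Richter-style local magnitude: the base-10 log of the measured
    trace amplitude relative to a reference amplitude. A tenfold jump in
    amplitude raises the magnitude by exactly 1.0."""
    return math.log10(amplitude_mm / reference_mm)

print(local_magnitude(1.0))   # 3.0
print(local_magnitude(10.0))  # 4.0 -> ten times the amplitude, one unit higher
```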

The moment magnitude scale is used by seismologists to measure the size of earthquakes in terms of the energy released. The magnitude is based on the seismic moment of the earthquake, which is equal to the rigidity of the Earth multiplied by the average amount of slip on the fault and the size of the area that slipped. The scale was developed in the late 1970s by seismologists Thomas C. Hanks and Hiroo Kanamori of Caltech’s seismic lab to replace Richter-style magnitude scales. Even though the formulae are different, the new scale retains the familiar continuum of magnitude values defined by the Richter scale, which is still used for earthquakes that do not exceed magnitude 7.
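
A hedged sketch of that definition, using the Hanks–Kanamori conversion M_w = (2/3)·log10(M0) − 10.7 with the seismic moment M0 expressed in dyne·cm; the fault dimensions below are made-up numbers purely for illustration:

```python
import math

def seismic_moment(rigidity_pa: float, slip_m: float, area_m2: float) -> float:
    """Seismic moment M0 = rigidity * average slip * rupture area, in newton-metres."""
    return rigidity_pa * slip_m * area_m2

def moment_magnitude(m0_newton_metres: float) -> float:
    """Hanks-Kanamori moment magnitude; M0 is converted to dyne-cm (1 N*m = 1e7 dyne*cm)."""
    return (2.0 / 3.0) * math.log10(m0_newton_metres * 1e7) - 10.7

# Hypothetical rupture for illustration: 30 GPa crustal rigidity, 1 m of average slip
# over a 20 km x 10 km fault patch.
m0 = seismic_moment(30e9, 1.0, 20_000 * 10_000)
print(moment_magnitude(m0))  # roughly Mw 6.5
```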