Urban Microgrid: Fact or Fiction?

by Visitor Siggy62 on 10-02-2017 12:41 PM

"Microgrid" as a term has been around for decades. Its popularity seems to ebb and flow with how local generation costs compare with central utility costs energy costs; which only tells me there is something viable to the concept that just needs to be refined.


Let's start with a definition of "microgrid." At its core, a microgrid is any defined area that contains power generation capability that allows it to operate separately from the utility. It's important to note that the definition itself does not specify what makes up the power generation resource. Most often, people think of alternative energies such as solar and wind coupled to some kind of energy storage. But are these the only options? (I think not.)


When we think of industries whose businesses place a high value on "uptime", data centers, hospitals, broadcasting, and process manufacturing come to mind. These are large consumers of power that are often located in or near large metropolitan areas. It was recently reported that data centers consumed 70 billion kilowatt-hours of electricity in 2014 in the USA alone. For all of these "mission critical" facilities, energy costs are a sensitive issue. Most organizations are constantly looking for ways to reduce their energy costs. Lower costs yield higher shareholder returns.


Now, let's shift to an urban environment where space is at a premium. What is the likelihood that PV or wind resources can be utilized in these crowded metro areas? It takes acres of land covered with PV panels to yield 1 MW.
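For a rough sense of the arithmetic behind that claim, here is a minimal back-of-envelope sketch in Python. The module efficiency, peak irradiance, and ground-coverage figures are illustrative assumptions, not numbers from this post; real installations will vary.

```python
# Rough estimate of land needed for 1 MW (peak DC) of PV.
# Assumed figures for illustration: ~20% module efficiency at
# 1,000 W/m^2 standard test-condition irradiance, and a ~40%
# ground-coverage ratio for row spacing, access, and equipment.

MODULE_EFFICIENCY = 0.20      # fraction of irradiance converted to DC power
PEAK_IRRADIANCE_W_M2 = 1000   # standard test-condition irradiance, W/m^2
GROUND_COVERAGE_RATIO = 0.40  # panel area divided by total land area
SQ_M_PER_ACRE = 4046.86

target_w = 1_000_000  # 1 MW peak DC output

panel_area_m2 = target_w / (MODULE_EFFICIENCY * PEAK_IRRADIANCE_W_M2)
land_area_m2 = panel_area_m2 / GROUND_COVERAGE_RATIO

print(f"Panel area: {panel_area_m2:,.0f} m^2")
print(f"Land area:  {land_area_m2:,.0f} m^2 (~{land_area_m2 / SQ_M_PER_ACRE:.1f} acres)")
```

Under these assumptions, the answer works out to roughly three acres of land for a single megawatt, which is a lot of contiguous space to find in a dense downtown.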


What, if any, other strategies are you using in metropolitan areas to offset utility costs?
