Ecorithm uses big data analytics to help optimize the efficiency of large commercial building HVAC systems. They define big data as datasets too large to analyze with traditional techniques. Ecorithm collects over 100 million data points per month from a single medium office tower to identify inefficiencies. Their software cuts through noise in the data to provide simple answers and insights. Initial analysis typically finds 10% electricity savings, and ongoing monitoring maintains savings and discovers new opportunities beyond the initial fixes. Ecorithm works with some of the largest building owners and their software is expanding to major real estate markets.
Zondits Interview with Ecorithm
Ecorithm, Inc. created software that uses big data to help identify and improve the inefficiencies in large
buildings’ HVAC operation. Zondits asked Chris Tagge, Ecorithm’s Chief Operating Officer, and Igor
Mezic, co-founder and inventor of Ecorithm’s technology, to talk about their experiences with software
systems that use big data.
To make sure we’re all on the same page, how do you define big data?
Ecorithm: Big Data is often defined as a data set so large or unwieldy that it is virtually impossible to
manage and extract meaning using traditional techniques. To give you an idea of how much data is
involved in our analysis, a single medium-sized office tower trending 6,000 points would add about 100
million rows to our database every month. We also track unstructured data, which adds even greater
complexity, but is often critically important in the analysis and development of the historical record for
the building. Factor in the natural fluctuations due to weather and occupancy and the human element of
operations, then multiply that by a portfolio of buildings and you’re quickly confronted with an
overwhelming amount of data. Ecorithm provides the tools to cut through all the noise and distill the
immense database of building information into simple answers and valuable intelligence.
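As a rough sanity check on that scale, the arithmetic can be sketched in a few lines. The sampling intervals below are our own illustrative assumptions, not figures from the interview; they show that ~100 million rows per month is consistent with a few thousand points trending every couple of minutes.

```python
# Back-of-the-envelope trend-data volume (a sketch; the sampling
# intervals are illustrative assumptions, not Ecorithm's figures).

def rows_per_month(points, interval_minutes, days=30):
    """Rows logged per month if every point records one value per interval."""
    samples_per_day = 24 * 60 // interval_minutes
    return points * samples_per_day * days

# 6,000 trended points, as in the interview:
print(rows_per_month(6000, 5))  # 51840000 rows/month at 5-minute intervals
print(rows_per_month(6000, 2))  # 129600000 rows/month at 2-minute intervals
```

Even at a conservative 5-minute interval, a single 6,000-point building generates tens of millions of rows per month, which is why traditional spreadsheet-style analysis breaks down.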
Some people fear that big data will start reducing the demand for engineers, forgetting that big
data alone is not useful unless it is analyzed. How do your experiences support this statement?
E: The development of better medical diagnostic tools certainly hasn’t decreased the need for doctors.
Rather, it’s led to quicker and more certain diagnoses, improved treatments and outcomes, and better
communication among doctors, between doctor and patient, and between the medical community and
the public.
Similarly, we see an opportunity to add value to every segment of the building engineering, energy, and
real estate ecosystem. Our goal is to provide the software that enables rapid and more conclusive
diagnosis of issues within the building and to definitively demonstrate the value of the implemented
solutions.
For example, the lack of definitive data and analytics has led to a vast underappreciation of building
complexity and created an enormous challenge for building owners to understand the risk and return of
investments in the building infrastructure. From an engineer’s perspective, their contributions have
often been undervalued because the comfort and energy metrics were too coarse and many external
factors (weather, tenant behavior, equipment, etc.) obscured the real value of implemented solutions.
Advanced analytics can alleviate those concerns and facilitate communication between engineers and
building owners to validate results and justify budgets for new projects. Similarly, building managers
and engineers can proactively use analytics to communicate the impact of the improvements in comfort
and energy usage to retain tenants and increase occupancy.
What are the biggest obstacles you’ve run into in getting the technology off the ground? Were
they associated with the nature of the emerging technology, or do you see them persisting in the
future?
E: The challenges we faced were primarily on the data acquisition side. Traditionally, trends were only
used to spot-check equipment, so systems weren’t easily configurable to trend thousands of points on a
continuous basis. In some cases, the proprietary nature of building management systems and the lack of
a standard nomenclature added an extra degree of difficulty.
There is a general recognition of these issues in the industry and solutions are evolving. Recently,
controls OEMs and upstart appliance manufacturers have embraced the opportunity and have responded
with upgrades for building management systems. In the last 12 months, we’ve seen a dramatic increase
in the availability of full building trend data. Additionally, initiatives like Project Haystack are defining a
new standard for point naming conventions.
What are common limitations of these systems?
E: Just as our Internet connections have migrated from dial-up, to DSL, to cable modem, to 4G wireless, a
similar trend is underway for building systems. When networks first came to building systems in the
mid-90s, they were designed to serve local ‘communities of interest’: room temperature sensors
communicated with the dampers feeding air into that same space, with the goal of bringing the
temperature to the setpoint programmed into the system. There was no need to share that data globally
throughout the system, so sub-100 kbps networks were no issue. The trend we’re seeing is that BMS
manufacturers are recognizing the need to convert these ‘control networks’ into ‘information networks’
that make the data generated by these devices available, as a matter of course, to applications like ours.
Describe the main benefits of working with big data.
E: On a building level, we analyze interactions on a whole-systems level, which enables us to uncover
systemic and pervasive faults that lead to large scale energy waste and occupant discomfort. We also
capture the unstructured data, e.g., the notes written on a clipboard next to the boiler, and put it into
context in a historical record. Finally, we can dust off the energy model that was rendered obsolete as
soon as the building was completed and continuously calibrate it with that massive amount of actual data
for predictive modeling and greatly improved measurement and verification.
Extension to the portfolio level provides the opportunity to see the definitive impact of specific measures,
equipment, and architecture on building comfort and energy use.
In what types of buildings is big data most helpful?
E: Our focus is primarily large commercial buildings for a few different reasons. While large commercial
buildings account for only a small percentage of the total number of buildings in the US, they represent
the majority of the energy use. Here’s a nice quote from an article on the APS (American Physical Society)
website (where they also discuss the underperformance of LEED buildings – another concern we
address): “Roughly 5% of the nation’s commercial buildings account for half of the gsf of the building
stock–and an even larger fraction of primary energy consumption.”
(http://www.aps.org/publications/apsnews/201307/backpage.cfm)
Additionally, data availability, instrumentation, and control are important factors. Larger buildings are
most likely to have building management systems with sufficient sophistication to facilitate data
acquisition and provide enough elements of control to be able to tune the building. Effectively, more
complexity means more opportunities for optimization and savings.
Regarding your projects, are they based on a per-square-foot cost?
E: Sensor density varies widely from building to building and we feel that a square footage pricing
model won’t yield a consistent ROI to the customer. Ecorithm’s model is predominantly based on type
and quantity of equipment in the building.
How many data points and of what kind and frequency do you need to be able to reach a
justifiable conclusion?
E: Ecorithm takes a whole-systems approach, and our definition of whole system is uniquely broad and
detailed – from the central plant all the way through the terminal units, and the more points the better.
The quantity of data points is based entirely on the equipment in the building.
From a frequency standpoint, we prefer measurements on 5-minute intervals, which allows us to see the
intensity of fluctuations that would otherwise be masked by sampling at longer intervals, e.g., 15
minutes.
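The masking effect of longer sampling intervals can be seen in a toy example. This is our own construction, not Ecorithm’s data or method: a zone temperature “hunting” ±2°F on a 10-minute cycle, logged as interval averages at 5- and 15-minute trend intervals.

```python
# Toy illustration (our own construction, not Ecorithm's data): a zone
# temperature hunting +/-2 degF on a 10-minute cycle, trended as
# interval averages at two different logging intervals.

def hunting_deviation(minute):
    """Square-wave oscillation: +2 degF for 5 minutes, then -2, repeating."""
    return 2.0 if (minute // 5) % 2 == 0 else -2.0

def interval_averages(interval, total_minutes=60):
    """Average the minute-level signal over consecutive trend intervals."""
    return [
        sum(hunting_deviation(m) for m in range(start, start + interval)) / interval
        for start in range(0, total_minutes, interval)
    ]

five = interval_averages(5)      # alternating +2.0 / -2.0
fifteen = interval_averages(15)  # alternating ~ +0.67 / -0.67

print(max(five) - min(five))        # 4.0 : 5-minute data shows the full swing
print(max(fifteen) - min(fifteen))  # ~1.33 : 15-minute data averages most of it away
```

The 5-minute trend preserves the full 4°F peak-to-peak swing, while the 15-minute trend reports barely a third of it, so a hunting valve or damper could look perfectly stable in the coarser data.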
Do you use inputs other than standard points, like temperature? Perhaps occupancy and loads on
electrical panels?
E: We definitely want temperatures, flow rates, setpoints, energy use data, floor plans, mechanicals, and
sequences of operation. It’s also great to have humidity, occupancy, and lighting when they’re available.
While more data is definitely better, we can also do a lot with only a little data.
How long do you have to wait to collect enough data points to perform a useful analysis,
implement solutions, and see results?
E: We’re not an alarm system. Modern BMSs often have alarms and we often find that building engineers
are dealing with too many alarms as it is – they don’t need more. Rather, they need a system to cut
through the noise of those alarms to prioritize the issues that are persistent and important.
In order to see the natural fluctuations of the building, we’ll typically collect 30 days of data before
providing the first analysis. At this point, we’ve found that our partners and customers prefer to receive
in-depth analysis on a monthly basis, which provides sufficient time to work through a checklist of newly
identified issues with their in-house engineers or through their maintenance contracts. The nice thing
about the way we monitor is that we’re often able to see the effect of implemented changes immediately
in the data to make sure it’s having the desired effect without adversely impacting other elements
throughout the system.
In the future, as data availability increases and our software evolves, we will offer real-time analysis for
optimization and fine tuning.
We imagine savings typically play out in two phases: an initial savings after the first analysis,
and then some smaller savings later on as the building is monitored. Do savings typically play out
in this manner? Can you give us a range of typical savings results you’ve seen?
E: There’s some great information available from Lawrence Berkeley National Labs that describes the
relative impact of initial measures, how those savings deteriorate over time if they’re not monitored
(about 18 months), and the impact of finding many additional opportunities. [Source: Building
Commissioning study conducted by the Lawrence Berkeley National Laboratory for the California
Energy Commission (http://cx.lbl.gov/2009-assessment.html)]. In short, ongoing analysis is critically
important to maintain savings and find new opportunities. Without monitoring, savings evaporate over
time – often a very short time.
The impact of the issues that we identify is highly dependent on the nature of the building and the
equipment, and the building manager’s priorities for implementation. One of our customers reported a
10% savings in electricity in the first 6 months while implementing only simple controls changes. We’ll
obviously see new faults evolve during changes in seasons and occupancy, but more importantly, fixing
initial faults often uncovers issues that were previously masked, which provides many additional
opportunities for tuning the building – often far beyond the design intent.
A simple example: a building had tightly controlled temperature setpoints, on the theory that a
narrow deadband was needed to keep the temperatures throughout the zones within the tenant-specified
range. However, as more and more faults were identified and fixed, it became clear that the system was
fully capable of maintaining a very tight tolerance, which created the opportunity to widen the
deadband by a couple of degrees while still maintaining comfort. This small change alone will result in an
additional 6% savings on an annual basis.
About Ecorithm
Ecorithm’s power over energy has resonated with renowned thought leaders from the real estate, building
engineering, and energy industries, many of whom have invested their time and resources to accelerate Ecorithm’s
product development and commercial advances. Ecorithm’s advisory board includes: Dan Tishman, Chairman and
CEO of Tishman Construction; Robert Fox, pre-eminent green architect of Cook+Fox Architects and head of the
GSA Green Team; Scott Frank, Senior Partner of JB&B, and board member of the Urban Green Council; Robert
Kantor, President and COO of Time Equities, Inc.; and Kristina Johnson, former Under Secretary of Energy for the
US Department of Energy.
Ecorithm’s customers include some of the most sophisticated and recognizable buildings in the country as well as
some of the largest REITs and privately held commercial building owners. Initially focused on New York City,
Ecorithm has since expanded its footprint on the East Coast. Over the next 12 months Ecorithm will establish a
powerful channel in America’s top commercial real estate markets to enable countrywide deployment before
turning its attention to overseas markets.