How do companies currently handle and store their environmental information?
Managing an environmental project (contaminated site, emission source, or GHG inventory) is similar to making a Hollywood movie, with one difference: duration. A movie is usually made in a few months, whereas an environmental project typically spans years or decades.
The work involved in investigating, remediating, or monitoring contaminated or emission sites is almost universally performed by outside consulting firms. Large companies rarely “put all their eggs in one basket,” choosing instead to apportion their environmental work among anywhere from a handful to 10, 20, or even more consulting firms. The actual work at a particular site is generally managed and performed by the nearest local office of the firm assigned to that site.
At larger production facilities, such as a refinery or a Superfund site, the environmental work is likely to span 10, 20, or 30 years, and monitoring may continue even longer. Over this period, investigations are planned, samples collected, reports written, and remedial designs created; following agency approval, one or more remedies may be implemented. Not only is turnover in personnel commonplace, but owing to the rebidding of national contracts, the firm assigned to do the work typically changes multiple times over the life span of a remedial project.
The investigation of a single large, potentially contaminated site often requires the collection of hundreds or even thousands of samples. A typical sample may be tested for the presence of several hundred chemicals, and many locations may be sampled multiple times per year over the course of many years. The end result is an extraordinary amount of information. Keep in mind that this is just for one site. Large companies with manufacturing and/or production facilities often have anywhere from a few to several hundred sites. Those that also have a retail component to their operations (e.g., oil companies) can have thousands of sites. Add to this list compliance and reporting data, engineering studies, and real-time emission monitoring, and the amount of data becomes staggering and unmanageable by conventional spreadsheets and desktop databases.

Given the magnitude and importance of this information, one would expect environmental data management to be a high-priority item in the overall strategy of any company subject to environmental laws and regulations. But this is not so; instead, our surveys of the industry reveal that a large portion of this information sits in spreadsheets and home-built databases. In short, you have an entire industry with billions in liability making decisions using tools that are not up to the task. Robust databases are standard tools in other industries, but for whatever reason, the environmental business has failed to fully embrace them.
As a result, many organizations and governmental agencies are simply “flying blind” when it comes to managing their environmental information.
The lack of standards and the inconsistencies in information management practices among the firms performing environmental work for a company impose a significant cost on the company’s overall environmental budget. The fact that some firms use spreadsheets, others their own databases, and still others various commercial applications may appear on the surface to be a benign practice, as each firm’s office uses the tools it is most comfortable with. In fact, the overall cost to the customer is enormous.
A Better Way
Is there a better approach that companies (both consultants and owners of environmental liability) can adopt to manage their environmental data? The solution seems obvious: get all the information about sites out of paper files, spreadsheets, and stand-alone or inaccessible databases and into an electronic repository in a structured, consistent format that (and this is the crucial point) any project participant can access, preferably from the web, at any time and from any place. In other words, the solution is not merely to use computers, but to use the web to link the parties involved in emission management or site cleanup. This includes not only site owners and their consultants but also regulators, laboratories, and insurers, thus making them, in current jargon, “interoperable.” This may be obvious, but today it is also a very distant goal.
What would the ideal IT architecture of the environmental industry look like in the future? It would start with wireless data entry by technicians in the field using mobile devices, and with wireless sensors where feasible. Labs would upload the results of analytical testing directly from their instrumentation and LIMS into the web-based database. During the upload process, any necessary error checking and data validation would take place automatically. Consultants would review these uploads and put their stamp of approval on the data before it becomes part of the permanent database. Air monitoring devices and sensors would automatically upload their measurements into the same system; ditto for any water or air treatment systems installed at facilities, metering devices for consumption of energy, water, or fuel, and so on. Anything with an IP address and an internet connection that produces data relevant to environmental or sustainability monitoring should feed data into the same system. (In today’s world there is a term for this: the Internet of Things, or IoT.)
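The automated error checking mentioned above can be sketched in a few lines. The field names and rules below are purely illustrative assumptions, not taken from any particular lab deliverable standard:

```python
from datetime import datetime

# Hypothetical required fields for one uploaded lab record (illustrative only).
REQUIRED_FIELDS = {"sample_id", "sample_date", "analyte", "result", "units"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
        return errors
    try:
        # Require an ISO-style date, e.g. 2024-05-14.
        datetime.strptime(record["sample_date"], "%Y-%m-%d")
    except ValueError:
        errors.append("bad date: %r" % record["sample_date"])
    try:
        if float(record["result"]) < 0:
            errors.append("negative result")
    except ValueError:
        errors.append("non-numeric result: %r" % record["result"])
    return errors

good = {"sample_id": "MW-01", "sample_date": "2024-05-14",
        "analyte": "benzene", "result": "0.005", "units": "mg/L"}
bad = {"sample_id": "MW-02", "sample_date": "14/05/2024",
       "analyte": "toluene", "result": "n/a", "units": "mg/L"}

print(validate_record(good))  # []
print(validate_record(bad))   # two errors: bad date, non-numeric result
```

In a real system, checks like these would run server-side at upload time, so bad records are flagged before a consultant ever reviews them.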
Behind the scenes, all data would be formatted and stored according to recognized standard protocols. Contrary to widespread concerns, this does not require a single central repository for all data or any particular hardware architecture. Instead, it relies on common software protocols and formats so that individual computer applications can find and talk to one another across the Internet. The good news is that most of these standards, such as XML, SOAP, AJAX, REST, and WSDL, already exist and are used by many industries. Others, such as DMR, SEDD, GRI, CDP, EDF, CROMERR, or EDD (spelling them out makes them sound no less obscure), are unique to the environmental industry and govern data interchange among laboratories, consultants, clients, and regulatory agencies. On top of these, there need to be hacker-proof layers of authentication and password protection so that only the right people can access critical or sensitive information.
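To make the idea of a common format concrete, a single lab result can be serialized as a small XML fragment that any participant can parse with standard tools. The element names here are hypothetical and do not follow any specific agency or EDD schema:

```python
import xml.etree.ElementTree as ET

def result_to_xml(sample_id, analyte, value, units):
    """Serialize one lab result into an XML fragment (hypothetical schema)."""
    root = ET.Element("SampleResult")
    ET.SubElement(root, "SampleID").text = sample_id
    ET.SubElement(root, "Analyte").text = analyte
    ET.SubElement(root, "Value").text = str(value)
    ET.SubElement(root, "Units").text = units
    return ET.tostring(root, encoding="unicode")

xml_doc = result_to_xml("MW-01", "benzene", 0.005, "mg/L")
print(xml_doc)

# Any other application can read it back with the same standard library:
parsed = ET.fromstring(xml_doc)
print(parsed.findtext("Analyte"))  # benzene
```

The point is not this particular layout but that, once a shared schema exists, a lab, a consultant, and a regulator can all exchange the same record without manual re-entry.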
There is still some work to do to refine these technologies, but the basic building blocks are already available and have been implemented by a few progressive companies and regulatory agencies. The problems that this changed approach would address are many. First, data would be entered or uploaded just once, preferably electronically. Second, data transfer costs would drop and data quality would improve: no longer would the need exist to transfer data whenever one consulting firm is replaced by another, or to maintain multiple databases that must be kept in sync. Third, the significant amount of time that engineers, managers, and scientists now spend determining whether a particular report is correct, or looking up information on a site, would dramatically decline. Fourth, by having their data in a consistent electronic format, companies would be in a better position to comply with the emerging demand to upload information on their sites to state or federal agencies and organizations. Several progressive states have already imposed electronic deliverable standards (e.g., California and New Jersey), and US EPA is working on its own standards based on XML technology. Last, and most significantly, site owners would take possession of their data and finally gain ready access to information about their own sites. This would be particularly beneficial to public companies attempting to comply with the Sarbanes-Oxley Act (SOX).
The good news is that a system like the one described above already exists.
We would love to discuss your environmental data situation with you. We can be reached at (650) 960-1640 or email@example.com.