Using fiber-optic cables for seismic sensing

 
 

The report for this post can also be downloaded by clicking this link.

Our understanding of activity beneath the earth’s surface is largely limited by our ability to collect and process data. Until now, seismic measurements have mainly been carried out with broadband seismometers, which report seismic activity accurately only where they happen to be installed. Comprehensive knowledge of the movements of the geological plates is therefore limited by the unrealistic prospect of covering the earth’s surface with broadband seismometers, an extremely expensive task in terms of both time and capital. Because of these gaps in coverage, natural disasters can arrive with short notice and little warning. Given these limitations of modern seismic sensors, there has been a strong demand for innovative and progressive ways to increase humanity’s knowledge and awareness of seismic activity.

In recent years, it has become realistic to use Distributed Acoustic Sensing (DAS) to gather information on seismic activity through fiber-optic cables. With DAS technology, ordinary fiber-optic cables can be used to compile extensive datasets from which seismic activity is assessed through strain-rate calculations. Data collection of this nature creates a continuous stream of data that belongs to the realm of big data. The complexity of gathering, storing, and utilizing big data of this kind has so far been one of the main challenges associated with this type of measurement.

DAS and its technology

The DAS sensor, commonly called an “interrogator”, is connected to one end of the fiber-optic cable, where it fires several hundred laser pulses per second through the fiber. When a laser pulse is launched into an optical fiber, a fraction of the light is elastically scattered (Rayleigh backscattering) off internal impurities in the core of the fiber, and the interrogator detects and records these backscattered photons. For any section of the fiber, the phase difference between photons scattered at the two ends of that section is linearly related to the length of the section. As long as a section of the sensing fiber is undisturbed, its length, and consequently the phase difference, remains unchanged; any external strain on the fiber therefore changes that difference. The strain rate can thus be mapped along the fiber by examining the changes in the phase of the elastically backscattered photons between successive measurements.
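To make that relationship concrete, the sketch below converts the phase change measured over one fiber section (the gauge length) between two successive pulses into a strain rate, using a commonly cited linear phase-to-strain relation for DAS. The constants (wavelength, refractive index, gauge length, photoelastic factor) and the phase values are illustrative assumptions, not measurements; a real interrogator applies its own calibration.

```python
import math

# Illustrative constants for a typical silica telecom fiber (assumptions, not calibration values)
WAVELENGTH = 1550e-9      # laser wavelength in meters
REFRACTIVE_INDEX = 1.468  # refractive index of the fiber core
GAUGE_LENGTH = 10.0       # length of the fiber section in meters
XI = 0.78                 # photoelastic scaling factor
PULSE_INTERVAL = 2e-3     # time between successive pulses (500 pulses per second)

def phase_to_strain(delta_phi: float) -> float:
    """Strain over one gauge length implied by a phase difference (in radians)."""
    return delta_phi * WAVELENGTH / (4 * math.pi * REFRACTIVE_INDEX * GAUGE_LENGTH * XI)

# Phase measured over the same section for two successive pulses (hypothetical values)
phase_pulse_1 = 0.000  # radians
phase_pulse_2 = 0.045  # radians

strain_change = phase_to_strain(phase_pulse_2 - phase_pulse_1)
strain_rate = strain_change / PULSE_INTERVAL
print(f"strain rate ≈ {strain_rate:.3e} strain/s")
```

Repeating this calculation for every gauge length along the fiber, pulse after pulse, is what turns a single cable into thousands of virtual sensors, and also what produces the continuous data stream discussed below.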


NetApp

NetApp is a leader in data warehousing and offers world-renowned cloud solutions in collaboration with many of the world’s largest service providers, such as AWS, Azure, and Google. NetApp is one of the most progressive companies in the world when it comes to processing, storing, and utilizing large databases in the cloud. With advances in algorithms and the mastery of big data, it has become possible to collect and utilize data streams for analytics and insight. Such a data stream acts as a continuously updating real-time database, which is difficult to construct with conventional means. The Ripples case study aims to combine historical, static seismic-sensing data with a streaming application to provide enriched live information.
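One way to picture that combination is a stream-static join in Spark Structured Streaming, sketched below: live DAS readings are enriched with a static, historical per-channel baseline so that anomalies can be flagged against past behaviour. The paths, schema, and column names are hypothetical placeholders, not the actual Ripples implementation.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ripples-enrichment-sketch").getOrCreate()

# Static, historical reference data, e.g. a per-channel baseline strain rate (hypothetical path/columns)
baselines = spark.read.parquet("/data/das/channel_baselines.parquet")

# Live DAS readings landing as files in an incoming directory (one possible streaming source)
live = (spark.readStream
        .schema("channel INT, timestamp TIMESTAMP, strain_rate DOUBLE")
        .parquet("/data/das/incoming/"))

# Stream-static join: every incoming reading is enriched with its channel's historical baseline
enriched = (live.join(baselines, on="channel", how="left")
                .withColumn("anomaly",
                            F.col("strain_rate") > 3 * F.col("baseline_strain_rate")))

query = (enriched.writeStream
         .outputMode("append")
         .format("console")
         .start())
query.awaitTermination()
```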

NetApp’s expertise in Kubernetes makes this project feasible. Kubernetes automates the deployment, scaling, and management of containerized applications such as the Ripples project. Ocean adds on top of that an optimally utilized and cost-efficient way of automating the cloud infrastructure behind those containers. The Wave environment automates infrastructure provisioning and scaling for big data cloud environments, namely Spark applications. By running applications like these on Wave, you eliminate manual infrastructure management, save up to 90% of infrastructure costs, and gain rich observability into resource utilization.
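For context, the snippet below shows the kind of Spark-on-Kubernetes configuration that Ocean and Wave automate and optimize; the API-server address, container image, and executor count are placeholders, and in practice Wave provisions and tunes this infrastructure for you.

```python
from pyspark.sql import SparkSession

# A Spark session pointed directly at a Kubernetes cluster (placeholder address and image)
spark = (SparkSession.builder
         .appName("ripples-on-k8s")
         .master("k8s://https://kubernetes.example.com:6443")
         .config("spark.kubernetes.container.image", "registry.example.com/ripples-spark:latest")
         .config("spark.executor.instances", "4")
         .getOrCreate())
```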

The Ripples Case Study at NetApp

The idea behind a project of this type is to build on the Apache Spark programming environment, in which NetApp is well-versed. At its core, the project is a Spark Streaming app (or app suite) which can run on many systems and fits naturally into NetApp’s solutions. The benefits of a project like this include much more efficient spatial information about seismic activity and thus better knowledge of it. There are also numerous benefits to moving information and data from conventional databases into the Spark stream. Furthermore, NetApp’s software, Ocean and Wave, creates an efficient and cost-saving cloud environment in which to manage these extensive datasets in an almost futuristic way.
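As a minimal sketch of what such a streaming app could look like, the following reads DAS strain-rate records from a message queue and computes a windowed RMS strain rate per channel as a simple live activity indicator. The Kafka topic, broker address, schema, and window sizes are assumptions for illustration, not the actual Ripples design.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Requires the spark-sql-kafka package on the classpath for the Kafka source
spark = SparkSession.builder.appName("ripples-das-stream").getOrCreate()

schema = "channel INT, timestamp TIMESTAMP, strain_rate DOUBLE"  # hypothetical record layout

# Continuous stream of DAS records from a (placeholder) Kafka topic
das = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker.example.com:9092")
       .option("subscribe", "das-strain-rate")
       .load()
       .select(F.from_json(F.col("value").cast("string"), schema).alias("r"))
       .select("r.*"))

# Windowed RMS strain rate per channel: a simple live indicator of seismic activity
activity = (das
            .withWatermark("timestamp", "10 seconds")
            .groupBy(F.window("timestamp", "5 seconds"), "channel")
            .agg(F.sqrt(F.avg(F.col("strain_rate") ** 2)).alias("rms_strain_rate")))

query = (activity.writeStream
         .outputMode("append")
         .format("console")
         .start())
query.awaitTermination()
```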

NetApp looks at the Ripples case study as a service to the community in which it operates, as well as a way to enhance the company’s image. With NetApp’s dominant market position in big data and cloud solutions, Iceland’s unique location, and the Icelandic Meteorological Office’s expertise, a project like this will have meaningful benefits.

Want to test it out yourself?

If you would like to test out a sample deployment of this process, contact me here at nokkvidan.com or via email at nokkvidan@gmail.edu.

 