HeliosAgile

by Anonymous

Posted on November 20, 2018 at 12:04 PM



SDS (Superior Data Science) is a data analytics provider founded by data scientists and engineers to apply modern information processing technology to Big Data across a variety of applications.

What is Pipeline Integrity?

The industrial growth that began in the United States in the early 1800s continued even after the Civil War, and after that American industry changed dramatically. Machines took over much of the manual labor, which led to faster production and increased competition. To keep up with the industry's rapidly growing demands, more advanced machinery was brought in, which increased fuel consumption. The country's population also grew over time, raising fuel demand further. Covering the wide geographic area of a country like the US required an efficient means of transmission, and this was accomplished through metallic pipelines.

A large proportion of America's pipelines run next to factories, farmland, schools, and public areas, and nearly two-thirds of Americans live within 600 feet of a pipeline carrying natural gas or crude oil. Factors such as the surrounding environment and extreme temperatures affect the condition of these pipelines, resulting in corrosion. Pipeline integrity is about using intelligent, efficient tools built on Machine Learning models and AI to detect and locate pipeline defects. Techniques such as Magnetic Flux Leakage (MFL) signals and tools such as the PIG (Pipeline Inspection Gauge) use AI to inspect pipelines in order to prevent leaks, which can cause explosions and serious damage to the environment.
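As a rough illustration of how signal-based inspection of this kind can be automated, the sketch below flags positions where a purely synthetic MFL amplitude trace deviates sharply from its local baseline. The data, thresholds, and function name are hypothetical, a simplified stand-in for the defect detection an ML-driven inspection tool performs, not a description of any specific product.

```python
import numpy as np

def flag_mfl_anomalies(signal, window=50, z_thresh=3.0):
    """Flag indices where the MFL amplitude deviates strongly from its
    local baseline -- a crude proxy for metal-loss (corrosion) defects."""
    signal = np.asarray(signal, dtype=float)
    flagged = []
    for i in range(len(signal)):
        lo, hi = max(0, i - window), min(len(signal), i + window + 1)
        local = signal[lo:hi]
        mu, sigma = local.mean(), local.std()
        if sigma > 0 and abs(signal[i] - mu) > z_thresh * sigma:
            flagged.append(i)
    return flagged

# Entirely synthetic trace: a smooth baseline with one injected "defect" signature
distance = np.linspace(0, 100, 2000)       # position along the pipe (arbitrary units)
trace = 1.0 + 0.05 * np.sin(distance)      # normal MFL response
trace[1200:1205] += 1.0                    # simulated metal-loss spike
print(flag_mfl_anomalies(trace)[:5])       # first flagged indices, near the injected defect
```

In practice the thresholding would be replaced by a model trained on labeled inspection runs, but the idea of comparing each reading against its local context is the same.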

What are the costs of pipeline spills?

There are various direct and indirect costs involved in pipeline spills. These include the cost of product lost to leaks, the cost of adverse effects on the surrounding environment, and sometimes hazardous situations that threaten human lives. One such incident happened in January 2018, when a company applied Machine Learning and AI solutions to detect pipeline failures and ended up identifying a severe point of failure. The key point in this case study is that the pipeline ran through an almond grove where each tree is valued at more than $100,000. The Pipeline and Hazardous Materials Safety Administration (PHMSA) reported that significant pipeline incidents grew 26.8% from 2006 to 2015. These incidents involved deaths, serious injuries, property damage, and fires and explosions. Pipeline spills also carry indirect costs, such as the millions of hours of manpower spent ensuring pipeline integrity.

Benefits of Machine Learning

Machine Learning brings intelligence to a system that, without proper monitoring, puts lives, the environment, and property at heavy risk. Using Machine Learning to monitor pipelines takes the process very close to zero failures. It also reduces the manpower required for monitoring, resulting in lower costs, increased safety, and greater confidence in a company's reputation in the market. A failure of a pipeline running through critical environments such as costly groves, densely populated areas, or water bodies could create disasters. A Machine Learning enabled system learns the behavior of a functioning pipeline from the historical data provided to its model and uses that knowledge to assess a situation. For example, factors such as the pipeline's wall thickness over time, the magnitude of external corrosion, and external and internal temperatures allow an ML-enabled monitoring device to learn and predict the likelihood of ruptures or damage to the pipeline. Because these devices work 24 hours a day, 365 days a year, an enormous amount of data is captured to assess the pipeline system and make accurate predictions about failure well ahead of time.
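A minimal sketch of this idea is shown below, assuming scikit-learn and entirely synthetic per-segment features (wall thickness, corrosion depth, temperature difference). The feature names, values, and model choice are illustrative assumptions, not the system described in the post.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical per-segment inspection features (all values synthetic)
wall_thickness  = rng.normal(9.0, 1.5, n)   # remaining wall thickness, mm
corrosion_depth = rng.gamma(2.0, 0.4, n)    # external metal loss, mm
temp_delta      = rng.normal(15.0, 8.0, n)  # internal minus external temperature, degC

# Synthetic ground truth: thinner walls and deeper corrosion raise failure risk
score = 2.0 * (corrosion_depth - 0.8) - (wall_thickness - 9.0) + 0.05 * (temp_delta - 15.0) - 2.0
failed = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-score))).astype(int)

X = np.column_stack([wall_thickness, corrosion_depth, temp_delta])
X_train, X_test, y_train, y_test = train_test_split(X, failed, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```

The same pattern, features from inspection runs in, failure probability out, is what allows a monitoring system to raise an alert before a rupture rather than after.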

What improvements do Machine learning models bring to current pipeline integrity systems?

Machine Learning requires an enormous amount of historical data to learn the patterns and statistics behind a functioning system and predict failures before they happen. This data collection is often accomplished through advanced devices and tools that inspect pipelines and feed parameters into the system, capturing the conditions under which failures have occurred in the past. This approach is effective because these devices monitor pipelines continuously, generating data all the time. As a result, failure situations and their combinations are broadly covered, which strengthens the model's ability to come very close to perfection when making predictions.
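To make the data-collection step concrete, here is a small hypothetical example of joining inline-inspection records with a historical failure log to produce labeled training rows. The column names and values are invented for illustration and do not reflect any particular operator's data.

```python
import pandas as pd

# Hypothetical inline-inspection records: one row per pipeline segment per run
ili_runs = pd.DataFrame({
    "segment_id":     [101, 102, 103, 101, 102, 103],
    "run_year":       [2014, 2014, 2014, 2016, 2016, 2016],
    "wall_thickness": [9.2, 8.7, 9.5, 9.0, 8.1, 9.4],   # mm
    "max_metal_loss": [0.4, 1.1, 0.2, 0.6, 1.8, 0.3],   # mm
})

# Historical failure log used to label segments
failures = pd.DataFrame({"segment_id": [102], "failed_by_2018": [1]})

training = ili_runs.merge(failures, on="segment_id", how="left")
training["failed_by_2018"] = training["failed_by_2018"].fillna(0).astype(int)
print(training)
```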

Following are the improvements Machine Learning models bring:

  • Companies understand a failure or high-risk event well ahead of time.
  • It becomes easier to identify the most influential variables that adversely affect the system, so special prevention measures can be taken to monitor these critical parameters (see the sketch after this list).
  • The expenses of failure are reduced, freeing funds for other business investments.
  • As more data accumulates, prediction accuracy keeps improving because more failure scenarios are covered.
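As a rough illustration of the second point, the sketch below ranks synthetic inspection variables by how much a trained failure classifier relies on them, using scikit-learn's permutation importance. The feature names, data, and model are assumptions made for demonstration, not a description of any production system.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(1)
n = 2000
features = {
    "wall_thickness":  rng.normal(9.0, 1.5, n),
    "corrosion_depth": rng.gamma(2.0, 0.4, n),
    "soil_moisture":   rng.uniform(0.0, 1.0, n),   # deliberately irrelevant here
}
X = np.column_stack(list(features.values()))
# Synthetic label driven only by corrosion depth and wall thickness
y = ((features["corrosion_depth"] > 1.0) & (features["wall_thickness"] < 9.0)).astype(int)

model = RandomForestClassifier(n_estimators=100, random_state=1).fit(X, y)
result = permutation_importance(model, X, y, n_repeats=10, random_state=1)
for name, score in sorted(zip(features, result.importances_mean), key=lambda p: -p[1]):
    print(f"{name:16s} {score:.3f}")
```

Ranking variables this way is what lets an operator focus extra monitoring on the parameters that actually drive risk.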

Superior Data Science Pipeline Integrity Services

