Event-Study Method, Similarity, Susceptibility Ranking


Details
Three speakers will present at our September meeting. Doors open at 5:30 pm with pizza and soda for networking, and the meeting starts at 6:00 pm. Summaries of the three talks follow (abridged due to word limits); the full abstracts are posted at "https://imstatsbee.github.io/calgaryr/meetings1920.html"
- Burning out of Time: Power-Plant Decommissions and Mine Closures in the Appalachians, by Reinaldo Viccini
He investigates the claim that power-plant decommissions cause closures and production declines in Appalachian mines. He tests the assumption that power plants buy local coal and examines observed differences in power-plant efficiencies. He then estimates the effects of aggregate electricity generation and coal consumption at nearby power plants on each mine's production (intensive margin) and probability of closure (extensive margin). Finally, he estimates the causal effect of a decommission among the 10 plants nearest to a mine using an event-study method.
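The core of an event-study design is to express outcomes in event time (periods relative to the treatment, here a decommission) and normalize effects to a pre-event baseline. A minimal sketch with toy, made-up mine-production numbers (not the speaker's data or model):

```python
# Event-study sketch: average mine production by event time (years relative
# to a nearby plant's decommission at t = 0), expressed as a difference
# from the baseline pre-event period t = -1. All numbers are hypothetical.
from collections import defaultdict

# (mine_id, event_time, production) -- toy observations
observations = [
    ("A", -2, 100), ("A", -1, 98), ("A", 0, 80), ("A", 1, 70),
    ("B", -2, 120), ("B", -1, 121), ("B", 0, 105), ("B", 1, 90),
]

by_time = defaultdict(list)
for _, t, y in observations:
    by_time[t].append(y)

means = {t: sum(ys) / len(ys) for t, ys in by_time.items()}
baseline = means[-1]  # effects are normalized to the last pre-event period
effects = {t: means[t] - baseline for t in sorted(means)}
print(effects)  # event-time coefficients; t >= 0 shows the post-decommission drop
```

In practice these event-time coefficients would come from a panel regression with mine and year fixed effects; the sketch above only illustrates the normalization to the pre-event period.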
- Using unsupervised machine learning to tag oil and gas pressure drop methods used in commercial flow simulators, by Pablo Adames
Oil and gas engineers rely on flow simulators to design and troubleshoot pipelines, wells, and the facilities needed to achieve production and operation targets within safety and economic constraints. Commercial simulators offer a wide choice of calculation methods for the pressure drop and liquid holdup at every point of the system; for historical reasons these are usually referred to as flow correlations. The numerical results of a simulation can vary significantly with the flow correlation selected, all else held constant. In his study, Schlumberger's PIPESIM was used to assess 35 different methods on a model built from field data in the public domain, and a similarity metric was defined and analyzed with an unsupervised machine learning method. The machine learning results were compared to empirical knowledge, and consistent results were identified for this specific production scenario.
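One simple way to assess similarity between flow correlations is to compare the pressure-drop profiles each method predicts along the same pipeline model, then group methods whose profiles lie close together. A stdlib sketch with hypothetical values (the correlation names are real, commonly available methods, but the numbers are invented, and the talk's actual metric and algorithm may differ):

```python
# Sketch: group flow correlations by the similarity of their predicted
# pressure-drop profiles along the same pipeline model. Values are toy data.
import math

# Predicted pressure drops (kPa) at three points, one row per method.
profiles = {
    "Beggs-Brill":    [120.0, 240.0, 360.0],
    "Hagedorn-Brown": [122.0, 243.0, 365.0],
    "Duns-Ros":       [150.0, 300.0, 452.0],
}

def dist(a, b):
    # Euclidean distance between two pressure-drop profiles
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Naive single-linkage grouping: a method joins a group if it is within
# `tol` of any existing member; otherwise it starts a new group.
tol = 20.0
groups = []
for name, prof in profiles.items():
    for g in groups:
        if any(dist(prof, profiles[m]) <= tol for m in g):
            g.append(name)
            break
    else:
        groups.append([name])
print(groups)
```

With these toy numbers, Beggs-Brill and Hagedorn-Brown end up in one group and Duns-Ros in its own, mirroring the idea of identifying clusters of correlations that behave consistently for a given production scenario.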
- Asset Failure Susceptibility Ranking using LambdaMART, by Busayo Akinloye
The electric distribution system is one of the most diverse systems in the electrical grid, consisting of both overhead and underground assets. Growing power quality and reliability expectations from regulatory authorities and customers demand minimal equipment downtime. Metrics such as the System Average Interruption Frequency Index (SAIFI) and the System Average Interruption Duration Index (SAIDI) are closely monitored by electric utilities and form a major part of the business' performance indices. These growing expectations, coupled with aging assets and budget constraints, require innovative and cost-effective ways to realize actionable intelligence, in order to optimize spending while improving or maintaining the quality and reliability of the electric grid. Data analysis offers a solution that is reproducible across all asset infrastructure of an electric grid: it employs machine learning and statistical algorithms to extract actionable insights from historical data. These insights help utilities allocate both financial and human resources to the most failure-susceptible assets, making truly data-driven decisions. I will discuss the development of an asset failure susceptibility ranking system for the Calgary-area Underground Residential Distribution (URD) system. This system employs the supervised ranking methods used by information retrieval systems. The framework of this ranking system can be applied to all distribution system assets (equipment), due to the reproducible nature of the statistical algorithms it employs.
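LambdaMART is a learning-to-rank method that trains gradient-boosted trees to optimize listwise ranking metrics such as NDCG. As a minimal illustration of the objective (not the speaker's implementation), here is a stdlib sketch of NDCG@k evaluated on a hypothetical asset susceptibility ranking:

```python
# NDCG@k sketch: score a model's ranked list of assets against observed
# failure severities. LambdaMART-style rankers optimize this kind of metric.
import math

def dcg(relevances):
    # Discounted cumulative gain: graded relevance with a log2 position discount.
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg(ranked_relevances, k):
    # Normalize by the DCG of the ideal (best possible) ordering.
    ideal = sorted(ranked_relevances, reverse=True)
    denom = dcg(ideal[:k])
    return dcg(ranked_relevances[:k]) / denom if denom else 0.0

# Relevance = observed failure severity of each asset, listed in the order
# the model ranked them (toy numbers, hypothetical assets).
predicted_order = [3, 2, 3, 0, 1, 2]
score = ndcg(predicted_order, k=4)
print(round(score, 3))
```

In practice a library implementation such as LightGBM's `LGBMRanker` (with the `lambdarank` objective) would be used to train the ranker; the sketch above only shows the metric that ordering is judged by.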
This event is supported by PIMS and Cenovus.
