As we wrote about earlier this month, the Texas Railroad Commission (RRC) made a treasure trove of data freely available to the public on its site. It was like Christmas in the Mi4 office. After we sang some carols and drank some hot chocolate, we realized there was so much data that we didn’t know where to start.
Christmas in September
As my colleague @Talal wrote last week, we decided to get Lat/Long coordinates for every Texas well. As he explained in his post, there are many use cases for this data, so it seemed like an excellent place to start.
In this post I will go over my contribution to the exercise: creating a serverless function to process data in blob storage. Continue reading “Processing the RRC Data in the Cloud with Azure Functions”
In this series of articles we will take the RRC data from the SQL database, serve it via an OData REST API, and show how we did it.
The series will consist of 3 parts:
- Part 1: Creating the OData REST API in Visual Studio using Entity Framework Core and C#
- Part 2: Setup Continuous Integration and Delivery for automated build and release pipelines to publish to Azure using Azure DevOps
- Part 3: Offering the API to the public via a secured Azure API Management gateway and developer portal
Sounds like a lot…I know. So if you want to fast-forward to the end and start testing the API and the data, just continue reading…you should be up and running in under 5 minutes.
Continue reading “TX RRC OData Feed – Intro”
Part 1: Well Locations
Last week I wrote a blog post about the TX RRC publishing data to the public that was previously available only via a paid subscription.
After downloading the different data sets and examining the various types of data and formats, I decided to take a closer look at the data that might prove to be useful to us and to our Productioneer clients.
It is not uncommon for location and depth data, as well as other well-header metadata, to be incomplete or non-existent during a software migration; after all, this data might not be considered crucial for the daily gauge sheets. This is especially true in the case of Excel gauge sheets, where additional columns waste “prime real estate” and might be seen as cluttering that particular production report.
It would be nice to have a quick way to pull the Lat/Long data in bulk to speed up the on-boarding process.
Continue reading “Visualizing the RRC Data in the Cloud”
In case you missed the announcement yesterday, the Railroad Commission of Texas (RRC) has released data sets on its website. Below is the bulk of the email:
Continue reading “TX RRC Announces: Data Sets now available for free”
Welcome to the second post in our miniseries: “Are You Developing Power BI Reports the Right Way?” In this two-part series we are designing a sample Power BI report that visualizes the weather on Mars using some real-world techniques.
The highlights of our first post were:
- Getting data from Mars
- Using a JSON file as a data source
- Performing operations in the “Get Data” phase using M
- Implementing a Dynamic Slicer
- Using Chiclet Slicer and Dummyimage to create a Legend Slicer
- Making design decisions
This post will focus on:
- Switching an Axis Between Logarithmic and Linear Scales via a Slicer
- Adding a Date Slicer
- Using a Dark Theme
Continue reading “Power BI Switching between Logarithmic and Linear Scales”
Data is king, or queen, depending on your household dynamics. How you communicate that data and its impact to your clients can help or hurt your business. Both your short- and long-term relationships can hang in the balance, which is why the quality and delivery of your Power BI reports are everything.
Case Study: Power BI Report Development
In this blog post mini-series, we will be taking you through the process of creating a Power BI report. The demo report draws from one we created for a client as part of a larger dashboard project. It implements a line chart to visualize the weather on Mars.
The actual report we developed had nothing to do with Mars, space, or weather, but you should find it useful to understand how real-life issues can be resolved and optimizations can be employed. The first post in the series focuses on data-prep and implementation of a legend slicer.
Continue reading “Power BI Legend Slicer from a JSON File with M and DAX”
It’s time to spring forward for most of us in the US (not forgetting about you in Arizona and Hawaii), but what about the rest of the world? Well, I compiled some data from Time and Date and put together a quick Power BI dashboard on my lunch break…ok, a lunch break and a coffee break…ok, 2 coffee breaks.
Continue reading “Springing Forward”
Here is a quick tip for analyzing daily data in Power BI. If you have a query or dataset that contains a date, a category, and a value measure for that category and you want to create a static measure that always returns the total value for all categories, here is the DAX expression you would use:
DailyTotalAllCategories = CALCULATE(SUM(Query1[value]), ALL(Query1[categoryname]))
You could then use this daily total DAX measure to calculate each category’s percentage of the total with this formula:
PercentDailyTotal = DIVIDE(SUM(Query1[value]), [DailyTotalAllCategories])
If you are only interested in the percent daily total, you could bypass the DailyTotalAllCategories measure altogether, and your formula would look like this:
PercentDailyTotal = DIVIDE(SUM(Query1[value]), CALCULATE(SUM(Query1[value]), ALL(Query1[categoryname])))
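If you want to sanity-check these measures outside of Power BI, the same daily-total and percent-of-total logic can be sketched in Python with pandas. This is a minimal illustration, not part of the original post; the column names (`date`, `categoryname`, `value`) and the sample numbers are assumptions chosen to mirror the DAX above:

```python
import pandas as pd

# Sample daily data: one row per date/category pair with a value measure
df = pd.DataFrame({
    "date": ["2020-01-01"] * 3 + ["2020-01-02"] * 3,
    "categoryname": ["oil", "gas", "water"] * 2,
    "value": [100.0, 50.0, 25.0, 80.0, 40.0, 20.0],
})

# Daily total across all categories, broadcast back to each row.
# This plays the role of CALCULATE(SUM(...), ALL(categoryname)):
# the category filter is removed while the date grouping remains.
df["daily_total"] = df.groupby("date")["value"].transform("sum")

# Each category's share of its day's total, like DIVIDE(SUM(value), daily total)
df["pct_daily_total"] = df["value"] / df["daily_total"]

print(df)
```

For the first day in the sample data the total is 175, so the oil row works out to 100 / 175, or roughly 57%, just as the PercentDailyTotal measure would report with a single-category, single-day filter applied.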
Speaking in generalities can be a little hard to follow. A simple oil and gas implementation of these concepts is after the break. Continue reading “Power BI Tip: Daily Total and Percent of Daily Total DAX Expressions”