Step In, Step Up, Or Step Off!

Stephen King, Director Of Technology, Irby Construction Company
I started my career hitting missiles with missiles for President Reagan’s “Star Wars” program, long before Space Force. We took data from several disparate sensors, from RADAR systems to optical systems. Then we’d crunch that data together to get a “best guess” at the target missile’s location and feed that information into the next set of calculations, along with new sensor data, to refine the guess. An Intercontinental Ballistic Missile (ICBM) travels about 14,500 MPH. That’s 4 miles every second; an ICBM flies from LA to NYC in about 10 minutes. If there is an error in the estimate, you get to guess again a second later, all while predicting where a missile flying 3 times the speed of sound would be in the next few milliseconds to provide a shooting solution to a launch platform and missile. There was a LOT of data processing.
That data was not exact. Imagine a 3D bell curve. A sensor’s error estimate may look like a massive football for a RADAR system or a massive hotdog for an optical system.
You take the data and create a virtual 3D Venn diagram where those error volumes intersect. One sensor may update every 500 milliseconds and the other every 5 seconds. Each has its own time domain and relative coordinate system that you must account for in your calculations.
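The fusion described above can be sketched as inverse-variance weighting of two independent Gaussian estimates, the standard building block behind Kalman-style tracking. This is a minimal sketch: the sensor values are hypothetical, and a real system would first align the two sensors’ time domains and coordinate frames.

```python
import numpy as np

# Hypothetical per-axis position estimates (miles) and error variances.
# The RADAR's error volume is a "football" (long along range, axis 2);
# the optical sensor's is a "hotdog" (long along its line of sight).
radar_pos = np.array([10.0, 5.0, 100.0])
radar_var = np.array([4.0, 4.0, 25.0])
optic_pos = np.array([10.2, 5.1, 98.0])
optic_var = np.array([0.5, 0.5, 40.0])

# Inverse-variance weighting: trust each sensor more on the axes
# where its error volume is narrow.
w_radar = 1.0 / radar_var
w_optic = 1.0 / optic_var
fused_pos = (w_radar * radar_pos + w_optic * optic_pos) / (w_radar + w_optic)
fused_var = 1.0 / (w_radar + w_optic)
```

The payoff is the “Venn diagram” intersection in numeric form: on every axis, the fused variance is smaller than either sensor’s alone.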
Why go through this description? Because I had to be very careful about how I placed my thumb on the “data scales.” There was no time for error, yet I had to make assumptions.
The construction industry does not run data in a cadence of milliseconds. We operate in days, weeks, and months. In many ways, that “infrequency” of data is more challenging than the volume of data in the milliseconds. For an ICBM flight of 10 minutes, we have a 5-minute window of processing time to find a shooting solution. Even if we could only process once a second, we would have 300 processing cycles of data. Compare that to a typical construction project. If you are looking at the “man-hours spent” for a power line build, your data may be updated once a week or once a month. An 18-month contract may only give you 18 to 78 full sets of data to work with.
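The cadence gap can be made concrete with a little arithmetic. This is a minimal sketch, assuming a conservative one processing cycle per second on the missile side and the weekly/monthly update rates mentioned above:

```python
# Missile side: a 5-minute processing window, one cycle per second.
window_seconds = 5 * 60
missile_cycles = window_seconds // 1       # hundreds of chances to refine the guess

# Construction side: an 18-month contract with monthly or weekly
# man-hour updates.
contract_months = 18
monthly_sets = contract_months             # one data set per month
weekly_sets = contract_months * 52 // 12   # roughly 78 data sets total
```

Hundreds of refinement cycles versus a few dozen data sets: on a construction project, every data point has to carry far more weight.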
In data analytics, we must often approach the process with a preconceived notion. Many times, that bias is vital to a fast, accurate solution. You rely on historical patterns in your data. You use the knowledge of the project leadership. You make some educated “gut” calls. You must place your thumb on the scales; you just have to do it wisely.
Look at your past projects of a similar duration, customer, crews, project leadership, and other important factors. These can help you make a better guess at where you are and where you are headed. If you don’t have that historical data, you need to start building it. Go back to past projects and see what you can learn from the data.
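One way to lean on that history is a similarity-weighted estimate: score each past project on how closely it matches the current one, then weight its actuals accordingly. This is a hedged sketch; the scoring rule, field names, and numbers are all hypothetical placeholders, not a production model.

```python
# Hypothetical history of power line builds.
past_projects = [
    {"duration_months": 18, "same_customer": True,  "hours_per_mile": 120},
    {"duration_months": 6,  "same_customer": False, "hours_per_mile": 95},
    {"duration_months": 20, "same_customer": True,  "hours_per_mile": 130},
]
current = {"duration_months": 18, "same_customer": True}

def similarity(past, cur):
    # Closer duration and a matching customer earn a higher weight.
    score = 1.0 / (1 + abs(past["duration_months"] - cur["duration_months"]))
    if past["same_customer"] and cur["same_customer"]:
        score += 1.0
    return score

weights = [similarity(p, current) for p in past_projects]
estimate = sum(w * p["hours_per_mile"]
               for w, p in zip(weights, past_projects)) / sum(weights)
```

The short dissimilar job barely moves the estimate, while the two long same-customer jobs dominate it: that is the “thumb on the scales,” applied deliberately and in the open.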
Be careful! If you torture the data long enough, it will confess to anything. It’s easy to convince yourself you know more than you really do. Or that you are in a better position this time around. Or that you have better actuals. Or . . . whatever. Where do you draw the line between a light thumb on the scales and too heavy a hand?
“The construction industry does not run data in a cadence of milliseconds. We operate in days, weeks, and months”
First, do you believe the number you are seeing? Remember the old data analyst’s commandment: “Thou shalt not kid thyself!” Many times, we look at our numbers and just know they are off. As much as we would like to believe them, we know they are not right.
Next, if you applied the same factors to the last set of data, would you be close to the current dataset? In a missile system, there are several feedback loops in the data to ensure you are getting the most accurate result possible. Do you have those feedback loops in your data? If you applied your weights to previous projects, would you see the same results?
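That feedback loop can be sketched as a simple backtest: replay today’s forecasting rule against completed projects, where the final actuals are known, and measure how far off it would have been. The rule and all figures below are hypothetical placeholders for whatever method you actually use.

```python
def forecast_final_hours(hours_so_far, pct_complete):
    # Hypothetical naive rule: linear extrapolation of man-hours.
    return hours_so_far / pct_complete

# Completed projects: (man-hours at the 50% mark, final actual man-hours).
history = [
    (4_000, 8_400),
    (10_000, 19_500),
    (2_500, 5_600),
]

# Replay the rule on each finished project and score the miss.
errors = []
for midpoint_hours, final_actual in history:
    predicted = forecast_final_hours(midpoint_hours, 0.5)
    errors.append(abs(predicted - final_actual) / final_actual)

mean_abs_pct_error = sum(errors) / len(errors)
```

If the rule that “feels right” today would have badly missed on projects you already finished, that is the data telling you your thumb is pressing too hard.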
Can you defend your numbers? I think one of the best exercises is the “code review,” or in this case, the “data review.” Sit down with leadership and peers to explain your numbers AND the process you used to generate them. Show how your processing would have projected past projects. Be VERY open to questions. Don’t wear your feelings on your sleeve.
In the end, remember: Thou Shalt Not Kid Thyself!