
Data Governance, Data Quality – Space, a Race and the Base

October 7, 2021 | By Joe Mendel

Captain's Log, Stardate: 10.21.2021

Captain Kirk of the Starship "Enterprise" will be rocketed into space for a 15-minute flight aboard the Blue Origin rocket "New Shepard," launching from Blue Origin's Launch Site One spaceport in West Texas.  William Shatner [1], who played the famous Captain Kirk on the TV show Star Trek, will at 90 be the oldest person ever launched into space.  What a cool and wondrous opportunity for Shatner, as well as for the millions of Star Trek fans across the globe.  But I couldn't help thinking of the role that data has played in the space race since Ham (the chimpanzee), Russian Yuri Gagarin, and American Alan Shepard (after whom the Blue Origin rocket is named) took flight.

It has been said, and it is hard to believe, that the historic Apollo flights and the moon landing by Neil Armstrong, Buzz Aldrin, and Michael Collins on Apollo 11 were completed successfully with primitive computers and slide rules.  It has been reported many times that each of our cell phones has more computing power than was available to the astronauts on the Apollo flights.  But none of this would have been possible without the ability to capture and analyze the data surrounding the unknowns of flight outside the atmosphere.  I'm sure there were mistakes along the way that resulted in catastrophic events, most of which, I can assume, stemmed from incomplete data or bad interpretation of data.  Data comes at us in many forms and through many methods of delivery, and from that input we make both good and bad decisions.

Consequences of Bad Data

Bad data, as we instinctively know, can have disastrous results if it is not captured, analyzed, and interpreted correctly.  The first caveman and his buddy who wondered whether the sabretooth tiger would make a good cave pet understood this: caveman 1 decided to pet the tiger while caveman 2 catalogued and interpreted the disastrous results, leading to the well-known "I don't have to run faster than the sabretooth, just faster than you" theory.

Back to Star Trek.  Star Trek: The Next Generation featured Captain Jean-Luc Picard (Patrick Stewart) and a character named Data.  Data, portrayed by actor Brent Spiner, is a sentient android who serves as second officer and chief operations officer aboard the starship USS Enterprise, and his positronic brain gives him impressive computational capabilities.  However, Data had an evil (bad) "brother" named Lore.  As you might expect, Lore was the opposite of the good Data and wreaked havoc wherever he went until the good Data confronted him in a typical "good triumphs over evil" episode.  My point, as silly as this example seems, is that bad data is never good and never acceptable.

Data Provides an Edge in Formula 1 Racing

Formula 1 racing is the pinnacle of auto competition, with the most complex and intricate machines ever to grace a racetrack.  It is a billion-dollar industry with a global audience approaching 2 billion each race season, big business and big competition with great rewards for the winning teams.  Not only are the machines engineered to sub-millimeter precision and operated by drivers performing at near-superhuman levels, but on-track performance is closely monitored, with changing track, environmental, driver, and car conditions analyzed in real time from thousands of sensors.

“Formula 1 is a data-based industry and it has always been. However, it’s only in the last decade that simulations have reached such levels of sophistication and speed that they can drastically influence the results of a test.

Today, each car houses between 150 and 300 sensors, which generate millions of data points every weekend. In terms of numbers, around 300GB of data per car is generated each Grand Prix weekend. If we add this to the data generated by the rest of the team, it can amount to 40–50TB of data per week."

— MAPFRE Innovation (28.02.2020), "Data analysis in Formula 1: the difference between victory and defeat"

The car's ECU processes over 1,000 input parameters in real time and transmits over 1.5 gigabytes of information over a 250-mile race.  Over the two-hour race, it will receive and send over 750 million data points.

Race: Good Data Makes the Difference with Seconds to Spare

Of the 20 cars gridded for any Grand Prix, the difference between the car on pole position (first place on the grid, earned by speed in qualifying) and the last car on the grid is 4.29 seconds.  The margin between the "best" and "worst" car is remarkably compressed when you consider that a car traveling at 233 mph covers 341 feet (more than a football field) every second.  Think about that when you factor in all the variables that good data is able to provide the teams.  A typical pit stop to change tires or adjust the aerodynamics takes roughly 20 to 30 seconds in total, depending on the racetrack, while the actual stationary adjustment time (after the car enters "pit-in," the changes are made, and it exits at "pit-out") is less than 3 seconds.  A physical error or miscalculation during that pit stop alone that adds 1 or 2 seconds allows the cars still on track to gain an advantage of almost 700 feet, which could be made worse by an incorrect adjustment based on bad data.  Races are won and lost not only on the racetrack but also by the data scientists trackside and at headquarters.
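
To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python.  The speed and time-loss figures are the ones quoted above; everything else is just unit conversion:

    # Back-of-the-envelope check of the figures quoted above
    MPH_TO_FPS = 5280 / 3600              # feet per second per mile per hour

    top_speed_mph = 233                   # speed quoted in the article
    feet_per_second = top_speed_mph * MPH_TO_FPS
    print(f"{top_speed_mph} mph is about {feet_per_second:.0f} ft per second")  # ~342 ft/s

    # Distance conceded to cars still on track for each extra second lost in the pits
    for seconds_lost in (1, 2):
        print(f"{seconds_lost} s lost in the pits is about "
              f"{seconds_lost * feet_per_second:.0f} ft conceded on track")

At two seconds lost, the script works out to roughly 683 feet, which is where the "almost 700 feet" above comes from.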

Formula E, the electric racing grand prix series, is no different.  Sensors measure just about everything on the car that can be measured: pressures, speeds, temperatures, displacements. A Formula E car can have 200 data channels providing information that must be collected and logged, creating gigabytes of data. The data is used in three ways. Some is directed to the driver's display.  Other data is sent by telemetry to the race team for real-time analysis that allows adjustments to be made during pit stops.  Finally, all the data is analyzed after the race. The data forms a treasure trove of information about the car's performance, allowing adjustments to the mechanical and aerodynamic setup.
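
As a rough illustration of that three-way split, here is a minimal Python sketch, not any team's actual software; the channel names and routing rules are invented for the example:

    from dataclasses import dataclass

    @dataclass
    class Sample:
        channel: str          # e.g. "battery_temp" (hypothetical channel name)
        value: float
        timestamp_ms: int

    # Invented routing rules: a few channels go to the driver's display, a few are
    # streamed to the pit wall, and everything is archived for post-race analysis.
    DRIVER_DISPLAY_CHANNELS = {"battery_soc", "lap_delta"}
    PIT_WALL_CHANNELS = {"battery_temp", "tire_pressure_fl", "motor_current"}

    def route(sample: Sample, display: list, pit_wall: list, archive: list) -> None:
        """Send one logged sample to each consumer that needs it."""
        archive.append(sample)                          # everything kept for after the race
        if sample.channel in DRIVER_DISPLAY_CHANNELS:
            display.append(sample)                      # shown on the driver's display
        if sample.channel in PIT_WALL_CHANNELS:
            pit_wall.append(sample)                     # real-time trackside analysis

    display, pit_wall, archive = [], [], []
    route(Sample("battery_temp", 58.2, 1000), display, pit_wall, archive)
    route(Sample("battery_soc", 0.81, 1000), display, pit_wall, archive)
    print(len(display), len(pit_wall), len(archive))    # 1 1 2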

Checkered Flag

Analytics at Jaguar Racing Formula E Speed

For those in the Chicago area, please join us at the RCG-hosted event "Analytics at Jaguar Racing Formula E Speed" on October 14th.

Base: Market share is won by decisions based on good data

So what does all of this mean for financial services organizations?  Simply put, market share today is won by millimeters and seconds, with similarly compressed competitive margins for those racing to deliver the unique customer experiences demanded today.  Those making informed decisions based on good-quality data under strong governance rules will leave the competition seconds, minutes, or even laps behind.  Good data is the base, the foundation for everything that comes after.

As Rick Skriletz, Global Managing Principal, Data, Analytics, & Innovative Technologies for RCG Global Services says in his recently published e-Book, 7 Steps to Data-Driven Success: Transform Data Management to Accelerate Digital Business Operations, “Data – the element that fuels digital business – is often taken for granted. Bad data, like bad fuel, can clog or block distribution and leave a business stranded. Make data good and fuel your digital business.”

Javeria Khan writes in IT Chronicles [2] that there are five ways bad data is harmful to an organization:

  1. It creates flawed insights
  2. It causes failed migration projects
  3. It affects organizational efficiency
  4. It is a bottleneck in digital transformation
  5. It results in unnecessary expenses.

And I would add that the consequences include, but aren't limited to:

  1. Extra costs
  2. Duplication
  3. Lost sales and revenue
  4. Delays in deploying new systems
  5. Reduced customer satisfaction
  6. Damaged brand reputation
  7. Inaccurate business decisions
  8. Difficulty analyzing specific trends
  9. Errors in forecasting sales
  10. Inaccurate reporting

Gartner also notes in its 2020 Marketing Data and Analytics Survey [3] that analytics influence only 54% of marketing decisions, primarily because poor data quality produces findings that are not actionable and recommendations that are unclear.

IBM estimated in 2016 that bad data costs the U.S. economy approximately $3.1 trillion per year.  Imagine what that number is in 2021, and what it might be in 3 to 5 years when IoT establishes itself as a factor for financial services.  Harvard Business Review attributes losses due to bad data [4] to:

  • 50% — the amount of time that knowledge workers waste in hidden data factories, hunting for data, finding and correcting errors, and searching for confirmatory sources for data they don’t trust.
  • 60% — the estimated fraction of time that data scientists spend cleaning and organizing data, according to CrowdFlower.
  • 75% — an estimate of the fraction of total cost associated with hidden data factories in simple operations, based on two simple tools: the so-called Friday Afternoon Measurement and the "rule of ten."
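
To see how those two simple tools combine, here is a hedged back-of-the-envelope sketch in Python.  The sample size follows the Friday Afternoon Measurement (review the last 100 records); the error count and unit cost are invented for illustration:

    # Hypothetical figures for illustration only
    records_sampled = 100        # Friday Afternoon Measurement: review the last 100 records
    records_with_errors = 27     # records the team flagged as flawed (made up)
    unit_cost = 10.0             # cost to process one clean record (made-up dollars)
    RULE_OF_TEN = 10             # a flawed record costs roughly 10x as much to work with

    clean_records = records_sampled - records_with_errors
    actual_cost = clean_records * unit_cost + records_with_errors * unit_cost * RULE_OF_TEN
    ideal_cost = records_sampled * unit_cost

    print(f"Data quality score: {clean_records}% of records error-free")
    print(f"Processing cost: ${actual_cost:,.0f} versus ${ideal_cost:,.0f} if all records were clean")
    print(f"Share of cost attributable to bad data: {(actual_cost - ideal_cost) / actual_cost:.0%}")

With these made-up numbers, roughly 71% of the processing cost traces back to bad data, which is in the same neighborhood as the 75% estimate above.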

Get ready for the coming data explosion

The point is that if financial services organizations are to remain competitive, they need to prepare now to minimize the bad data already on the books as "data debt" before the explosion of data coming with the Internet of Things (IoT), 5G (and 6G), and Wi-Fi 6.

We at RCG call this a step into the hyper-business age, where the goal at the macro company level is to get from strategy on paper to execution to impact in the shortest time possible, with the potential for the largest returns, increased transparency and simplicity, and improved competitive advantage.  5G (and the already-under-discussion 6G), combined with Wi-Fi 6, will create broader bandwidth and speed via larger transmission "pipes."  Both 5G and 6G take advantage of higher frequencies on the wireless spectrum to transmit more data, faster: 6G is expected to deliver speeds 1,000 times faster than 5G (which itself is only four to five times faster than 4G).  5G makes the Internet of Things possible; 6G speeds it up.

The Internet of Things is the mechanism that will supply the information, the data points, to expand the types and amount of data collected to create the ultimate customer experience, allowing financial services organizations to better understand customer behaviors and to enable predictive and prescriptive marketing campaigns.

7 Steps to Data-Driven Success

Transforming an organization to be data-driven requires changing the way people use data in their work and changing operational processes by introducing digital processes and practices. Nowhere is this more important than in how a company handles its data, including within its applications. At its core, it requires changing application-centric business operations into data-driven ones.

Takeaways: Make sure your data is good data

Converting data from bad to good is not an easy task, but it will save money in current dollars and position your organization to take advantage of the impending data explosion.  Whether it is Space, Race, or Base (financial services), having access to good data is foundational.

Some thoughts to consider:

  • Admit that your organization has a data problem and act now to remediate it.
  • Focus on controls placed on data exposed to the public, your customers, and regulators.
  • Define and administer an advanced data excellence program; consider the Carnegie Mellon CMMI Data Management Maturity (DMM) model.
  • Put a microscope on how you treat data in general, and don't forget to focus on data lineage.
  • Institute a system to continuously monitor and improve data quality.
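
On that last point, here is a minimal Python sketch of what an automated quality check might look like; the field names, rules, and alert threshold are invented and not tied to any particular tool:

    from datetime import datetime

    # Invented validation rules for a customer record; a real program would draw
    # these from a governed business glossary or data quality platform.
    RULES = {
        "customer_id": lambda v: isinstance(v, str) and len(v) == 10,
        "email":       lambda v: isinstance(v, str) and "@" in v,
        "opened_on":   lambda v: isinstance(v, datetime) and v <= datetime.now(),
        "balance":     lambda v: isinstance(v, (int, float)),
    }

    def score_batch(records: list) -> tuple:
        """Return the share of records passing every rule and per-field failure counts."""
        failures = {field: 0 for field in RULES}
        clean = 0
        for record in records:
            passed = True
            for field, rule in RULES.items():
                if not rule(record.get(field)):
                    failures[field] += 1
                    passed = False
            clean += passed
        return clean / max(len(records), 1), failures

    records = [
        {"customer_id": "CUST000001", "email": "a@example.com",
         "opened_on": datetime(2020, 5, 1), "balance": 1200.50},
        {"customer_id": "42", "email": "not-an-email",
         "opened_on": datetime(2020, 5, 1), "balance": None},
    ]
    score, failures = score_batch(records)
    print(f"{score:.0%} of records clean; failures by field: {failures}")
    if score < 0.95:                     # alert threshold, made up for the example
        print("Data quality below threshold -- investigate before downstream use")

Run continuously against new batches, a check like this turns data quality from a one-time cleanup into an ongoing, measurable discipline.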

Confirm accurate insight into customer behaviors

Financial services organizations are losing, on average, over $15 million per year due to bad data alone.  The only remedy is to proactively internalize an enterprise-wide initiative to identify and minimize bad data and its effects. This is the only way to ensure accurate insight into customer behaviors, enable predictive and prescriptive selling to the base, and prepare for the new ways of marketing that will be uncovered by the integration of IoT into the financial arena.

As an ancient Chinese proverb says, "A journey of a thousand miles begins with a single step."  My advice is to heed a more current tidbit of wisdom from Captain Picard: "Make it so."  Otherwise, financial services organizations will not enjoy the ability to "live long and prosper," as Spock would say.

Make your business a Hyper Business

At RCG Global, we help financial services organizations "boldly go" at warp speed into the hyper-business age and understand the complexity of remaining viable and competitive in the on-demand economy.  We have been successfully solving complex business problems for financial services companies globally since 1974.  Our clients rely on us to help them harness the speed of now that drives evolution to the hyper-business age.  We know companies struggle to exist in an ecosystem of three simultaneously coexisting states, where the adoption of an evolutionary, mitosis-like process constantly transforms and evolves operational and engagement models.  Our engagement model has also evolved to provide the expertise necessary to identify evolutionary solutions and get you from strategy to impact efficiently and with minimal disruption, allowing you to off-load, mitigate, and de-risk your most critical initiatives.

We ask questions to solve business problems.  We engineer the velocity necessary to create competitive advantage through impactful outcomes.  We make your strategy a reality by compressing the time from strategy on paper to execution to impact, maximizing returns on investment while creating competitive advantage.

Contact me at Joseph.Mendel@rcggs.com

 

Works Cited

1. William Shatner, official site. Retrieved from https://williamshatner.com/ws/bills-appearances/

2. IT Chronicles (2022, Apr 4) "5 Reasons Bad Data is Harmful to Your Business" Retrieved from https://itchronicles.com/big-data/reasons-bad-data-is-harmful-to-your-business/

3. Gartner (2020, Oct 21) "Marketing Data and Analytics Survey: Analytics Teams Must Upskill to Adapt to Automation" Retrieved from https://www.gartner.com/en/marketing/insights/articles/gartner-marketing-d-a-survey-2020-analytics-teams-must-upskill

4. Harvard Business Review (2016, Sept 22) "Bad Data Costs the U.S. $3 Trillion Per Year" Retrieved from https://hbr.org/2016/09/bad-data-costs-the-u-s-3-trillion-per-year