The Rear View Mirror (RVM) Effect in HR Decision Support Systems

Are things in HR analytics closer than they appear?


I was driving the other day and noticed this message on the rear view mirror: "Objects in mirror are closer than they appear". This is due to the convexity of the mirror, which increases the field of vision but reduces depth perception.

Two questions -

a. Do Decision Support Systems, at times, have this one-dimensional focus as well? That is, in the quest to increase the quantity of information being provided to decision makers (the width of vision), is there a compromise on the quality of information (the depth of vision)?

b. Is there too much focus on lagging metrics (the rear view) as opposed to leading metrics and predictive capabilities?

The volume / variety / velocity of data being generated today can quite easily overwhelm decision makers unless we improve the quality of the information by answering these fundamental questions.

1) In the initial phases of designing a Decision Support System, do we ensure direct inputs from the end user (who could be a Trainee or a Vice President), rather than being limited by the capability of the reporting platform or the suggestions provided by the development team? In other words, how are we ensuring Relevance?

Way Forward: To ensure relevance, in addition to the tried and tested method of keeping an open channel with your stakeholders, one perspective that is fast gaining credence is Design Thinking (DT). The key element of DT is a structured process of understanding what the customer wants. The Design School at Stanford is doing a lot of work in this area, and Coursera has a good course on the topic. At the risk of oversimplifying the concept, here are a few key pointers:

  • Voice of Customer Trumps All: The design of the DSS is to be driven almost exclusively by the end user, which means a truly effective design workshop is one with representation across all stakeholders.
  • Fail Fast / Fail Early: Given the number of stakeholders involved and the complexity of the systems, it may not be possible to get it right the first time. So the key is to run small and frequent design iterations, which keep the end user engaged and ensure a Minimum Viable Product is up and running quickly.
  • Never Sell Your Solution: Seems counterintuitive? Developers and analysts have a tendency to sell a DSS solution to the end user once it has been created. While this is a natural tendency born of a sense of ownership, it can put off the end user if the system does not meet "unstated needs". So, as the solution evolves, it is important to treat end-user feedback as sacred and, at most, provide clarifications and context on the reasons for the current design. That way, the customer will continue to have a truly engaged conversation with the designers.
  • Last but not the least, iterate / iterate / iterate!

2) How can we help decision makers quickly assimilate and prioritize large volumes of information? This is especially important for senior and executive leadership, who don't have the luxury of being in contact with the "doers". This means the only way they can do a pulse check of a process is by looking at the data. At that level, their span of control covers multiple processes and multiple functions, which in turn means data overload. In other words, how are we ensuring data assimilation?

Way Forward: A simple and effective solution to this problem is Data Visualization. Why does this work? The brain is naturally wired to process colors and shapes far better than numbers. If we can convert information into a visual format, it opens up a high-bandwidth transmission channel between the DSS solution and the decision maker. Tools like Tableau and Microstrategy offer free editions that are a good way to get started on your DV journey.
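To make the "shapes beat numbers" point concrete, here is a minimal Python sketch that turns a table of figures into a horizontal bar chart (text-based, so it needs no plotting library). The departments and attrition rates are invented purely for illustration:

```python
def text_bar_chart(metrics, width=40):
    """Render a dict of {label: value} as a horizontal text bar chart."""
    peak = max(metrics.values())  # longest bar is scaled to `width` chars
    lines = []
    for label, value in metrics.items():
        bar = "#" * round(width * value / peak)
        lines.append(f"{label:<22}|{bar} {value:.1%}")
    return "\n".join(lines)

# Hypothetical attrition rates by department (illustrative numbers only).
attrition = {
    "Sales": 0.18,
    "Engineering": 0.07,
    "Human Resources": 0.05,
    "Customer Support": 0.22,
}

print(text_bar_chart(attrition))
```

Scanning the bar lengths, the outlier (Customer Support) jumps out instantly; the same insight takes noticeably longer when reading four raw percentages.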

3) In addition to providing information, are we enabling end users with the context and the instrumentation to convert Insights into Action? In other words, how are we ensuring actionability?

Way Forward: Ideally, a good DSS should enable the end user to act on a particular data point from within the application. This could involve anything from forwarding the data to initiating a conversation with process owners, all without leaving the application.
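One way to picture this is to let each data point carry the actions a user can trigger on it. The sketch below is a hypothetical design, not the API of any specific DSS product; the action names and email address are invented:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class DataPoint:
    """A metric value plus the in-app actions a user can take on it."""
    metric: str
    value: float
    owner: str
    actions: Dict[str, Callable[["DataPoint"], str]] = field(default_factory=dict)

    def act(self, name: str) -> str:
        # Dispatch to the registered action, keeping the user "inside" the app.
        return self.actions[name](self)

def forward(dp: DataPoint) -> str:
    return f"Forwarded {dp.metric}={dp.value} to {dp.owner}"

def start_conversation(dp: DataPoint) -> str:
    return f"Opened a thread with {dp.owner} about {dp.metric}"

dp = DataPoint("attrition_rate", 0.22, "ops.lead@example.com",
               actions={"forward": forward, "discuss": start_conversation})
print(dp.act("forward"))
```

The design choice being illustrated: the insight and the follow-up action live on the same object, so the user never has to leave the report to act on it.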

4) Do DSS frameworks factor in metrics that are relevant across the industry? While current frameworks like the People CMM (PCMM) use a Maturity Level > Process Area > Goals > Practices outline to structure conversations around processes, they don't provide a dictionary of industry-accepted metrics that can measure the effectiveness of these practices. Also, one of the key requirements of maintaining a competitive advantage is the ability to compare performance with competitors in an objective and quantifiable manner. In other words, do all metrics have an outside-in approach?

Way Forward: Organizations like SHRM and the Center For Talent Reporting (CTR) are doing critical and foundational work on setting up reporting standards that are applicable across industries. As administrator of the U.S. Technical Advisory Group to ISO/TC 260, SHRM continues to publish key global standards for the HR fraternity, and these standards evolve in collaboration with firms across industries. HR practitioners need to ensure they are plugged into these conversations so that their metrics are in line with the industry.

5) As the Balanced Scorecard framework asks: are the current crop of metrics focused on after-the-fact reporting, or do our reports help decision makers look ahead of the curve? In other words, how can we ensure insights are based on leading factors?

Way Forward: We can take two corrective actions here:

  • Critical Review of Factors: Is it better to report downstream metrics like, say, overall Learning Scores, or should we focus on upstream metrics like which channels employees use most to get training? The idea is that if DSS solutions can provide upstream clarity, decision makers can draw informed inferences.
  • Predictive Capability: A much more evolved way of leveraging leading factors is to use Predictive Models. The right combination of programming, statistical, and functional capabilities can help create models that predict key metrics, such as the chances of a candidate accepting an offer or of an employee leaving the organization. While this might sound like an attractive option, these programs tend to lose steam quickly since they involve distinct skill sets and therefore multiple groups. Each group will have its own perspective, so consistent leadership involvement is key to keeping all stakeholders engaged and ensuring no one loses sight of the actual objective.
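As a toy illustration of the predictive idea, here is a logistic-regression model trained with plain gradient descent to score attrition risk. The two features and the six training rows are entirely invented; a real model would need far richer data, validation, and careful attention to fairness:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.5, epochs=2000):
    """Fit logistic-regression weights (plus bias) via stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the linear score
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Probability (0..1) that the employee leaves."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b)

# Invented features: [months_since_last_promotion / 24, engagement_score / 5]
X = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.2], [1.0, 0.3], [0.3, 0.7], [0.8, 0.1]]
y = [0, 0, 1, 1, 0, 1]  # 1 = employee left

w, b = train(X, y)
print(f"Risk (stale promotion, low engagement):  {predict(w, b, [0.95, 0.15]):.2f}")
print(f"Risk (recent promotion, high engagement): {predict(w, b, [0.15, 0.85]):.2f}")
```

Even this toy version shows why such programs need multiple skill sets: feature choice is a functional (HR) question, the training loop is a statistical one, and wiring it into a DSS is an engineering one.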

In conclusion: while hindsight is certainly important for introspection, our "Rear View Mirror" DSS solutions may not be giving us a truly contextualized and nuanced view of the organization, since the focus may be on the volume of reporting rather than the quality of reporting. Forward-looking, over-the-horizon reporting is becoming a key requirement for providing a competitive advantage and keeping our decision makers ahead of the curve.

Hope you found this useful; do share your thoughts!

