HRExaminer v3.18 May 4, 2012
Big Data Isn’t Analytics
In the 20th Century, progress was driven by the principles of automation. The more you could streamline and control a process, the cheaper you could make it. Repeatability was the key.
Monitoring, in the form of process control charts, spreadsheet-generated graphics and staff-meeting gotcha reports, was the way an enlightened executive ran his business. Progress was a matter of making the line move up and to the right. Cost cutting involved moving it down and to the right. Statistical process control (SPC) minimized variation.
Here’s a reasonable definition of SPC.
Statistical process control (SPC) is the application of statistical methods to the monitoring and control of a process to ensure that it operates at its full potential to produce conforming product. Under SPC, a process behaves predictably to produce as much conforming product as possible with the least possible waste. While SPC has been applied most frequently to controlling manufacturing lines, it applies equally well to any process with a measurable output.
Key tools in SPC are control charts, a focus on continuous improvement and designed experiments. Much of the power of SPC lies in the ability to examine a process and the sources of variation in that process using tools that give weight to objective analysis over subjective opinions and that allow the strength of each source to be determined numerically. Variations in the process that may affect the quality of the end product or service can be detected and corrected, thus reducing waste as well as the likelihood that problems will be passed on to the customer.
With its emphasis on early detection and prevention of problems, SPC has a distinct advantage over other quality methods, such as inspection, that apply resources to detecting and correcting problems after they have occurred. In addition to reducing waste, SPC can lead to a reduction in the time required to produce the product or service from end to end. This is partially due to a diminished likelihood that the final product will have to be reworked, but it may also result from using SPC data to identify bottlenecks, wait times, and other sources of delays within the process.
Process cycle time reductions coupled with improvements in yield have made SPC a valuable tool from both a cost reduction and a customer satisfaction standpoint.
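If you haven't seen SPC in practice, here is a minimal sketch of the control-chart logic behind it, in Python with invented defect counts. Real implementations use subgrouped samples and chart-specific constants; this only shows the basic three-sigma idea.

```python
# A minimal sketch (hypothetical data) of the control-chart logic at the core
# of SPC: limits are set from a known in-control baseline, then new output is
# judged against them. Points beyond three sigma signal special-cause variation.
from statistics import mean, stdev

def control_limits(baseline):
    """Three-sigma limits (lower, center, upper) from in-control history."""
    center = mean(baseline)
    sigma = stdev(baseline)
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(points, limits):
    """Return the (index, value) pairs that fall outside the control limits."""
    lower, _, upper = limits
    return [(i, x) for i, x in enumerate(points) if x < lower or x > upper]

baseline = [4, 5, 3, 6, 4, 5, 4, 5, 6, 4, 5, 4]   # historical defect counts
today = [5, 4, 9, 5]                               # new production run
limits = control_limits(baseline)
print(out_of_control(today, limits))               # [(2, 9)] -- investigate
```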
Big data is a different animal. It's usually described along three dimensions:
- Volume (amount of data)
Generally speaking, the first big data problem is dealing with the sheer amount of data. The important point about volume is that it's bigger than current toolsets can handle.
- Velocity (speed of data in/out)
Historically, data has been processed when possible, not in real time. Companies like Google, Facebook and Twitter (and the rest of the high-volume, real-time data processors) are perfecting the art of handling information as it emerges.
- Variety (range of data types, sources)
In some ways, this is the biggest piece of the puzzle. Big data makes it possible to integrate disparate (and what used to be seen as unrelated) data sets, producing novel correlations and insights that weren't possible before. A toy example of this kind of integration follows the list.
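Here is that toy sketch. The data sets and the correlation are entirely invented; the point is only the mechanics of joining records that were never designed to be combined.

```python
# A toy illustration (all data invented) of the "variety" point: join two
# data sets that were never meant to be combined and look for a pattern
# neither reveals alone.
commute = {  # employee_id -> commute distance in miles (say, from HR records)
    101: 4, 102: 27, 103: 3, 104: 31, 105: 6,
}
quit_within_year = {  # employee_id -> left within a year (say, exit records)
    101: False, 102: True, 103: False, 104: True, 105: False,
}

# Integrate the disparate sets on the shared key...
joined = [(commute[e], quit_within_year[e]) for e in commute]

# ...and a correlation neither data set could show on its own falls out.
for label, test in [("long commute (>20 mi)", lambda m: m > 20),
                    ("short commute", lambda m: m <= 20)]:
    group = [quit for miles, quit in joined if test(miles)]
    print(f"{label}: {sum(group)}/{len(group)} quit")
```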
In the end, though, what matters is the ability to see patterns in the data. Big Data is really a way of talking about the challenges and opportunities that emerge from the data that is drowning us all. It's not about building new and larger process management tools; it's about mining the insights that can explode productivity in systems that have fewer and fewer repeatable processes.
Talx and eThority Set Big Data Standards for HR
I spent much of this week getting to know the team behind Talx. Five years into being owned by Equifax, the St. Louis stalwart is beginning a metamorphosis. Being inside the Equifax firewall gives Talx (and Equifax) some very interesting market opportunities.
If you're not familiar with Talx, they are a friendly, midwestern company with headquarters in St. Louis. The company offers a basket of Talent Acquisition and Payroll services, including:
- Assessments (a huge library of tools)
- I-9 Management
- Tax Credits and Incentives
- Paperless Pay
- W-2 Management
- Employment Verifications
- Unemployment Cost Management
- Reemployment Services
- Data Breach Solutions
- Employment Tax Services
It’s pretty dry stuff. Talx is really good at moving, handling, processing and managing great big hunks of data, particularly the stuff associated with tax compliance, tax incentives, and employment verification. You know, the sort of stuff that would drive any sensible person crazy. The stuff that is at the heart of the transactional parts of HR.
Talx helps streamline the processes that are at the heart of HR.
The beating heart of the Talx ecosystem is "The Work Number." 50,000 organizations use The Work Number to verify employment information, and there are nearly 50 million employee files in the system, about a third of the US workforce. As the company moves forward, new products will be tied to this core data asset.
By itself, Talx is an extraordinary data asset. Imagine all of the information associated with the services listed above collected in a single database. If you could figure out how to sort and sift it, all sorts of things would be possible.
Now, think about Equifax. You know, the credit reporting people. They hold credit data for 85% of Americans, plus a variety of other monstrous databases of the stuff that isn't reported in credit files: phone bills, utility bills and so on. Historically, the Equifax customer was a financial institution. Equifax knows so much about the financial behavior of the people who make up our economy that the CEO is a regular in the halls of financial power.
In order to grow significantly, however, Equifax needed to stretch beyond the boundaries of banking and into the rest of the commercial universe. The enterprise is tightly regulated by the Fair Credit Reporting Act (FCRA), but the act limits only what can be done with data that discloses the identity of individuals. Equifax is perfectly free (and able) to mine its data treasure trove for aggregate information. (In some of the examples I was given, an aggregate group could be as small as the members of a nine-digit ZIP code group.)
Now think about the Talx database again.
The first amazing thing is that Equifax has bits and pieces of credit behavioral data for virtually every person who has a file in the Talx system. The FCRA restricts the use of that data in very significant ways, including the requirement that it be anonymized except under very specific circumstances, so Equifax can't simply sell businesses credit data about individual employees.
But, they can sell the aggregate data.
For instance, Talx uses the data to help customers understand the leverage they have in a variety of settings. Talx can tell you the total indebtedness of your workforce. They can give you averages and data on subsets. One emerging service offers insight into student loan debt as a way of identifying retention leverage. Another project shows employers the car loan volume in the workforce, creating the opportunity for the company to cut a money-saving deal with a financial institution for lower car loan rates. (A nice, inexpensive new benefit.)
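As a rough illustration, here is what aggregate-only reporting looks like in miniature. The data, field names, and suppression threshold are all invented; nothing here reflects Equifax's or Talx's actual systems.

```python
# A toy sketch (invented data; not Equifax's or Talx's real systems) of
# aggregate-only reporting: individual records go in, only group-level
# statistics come out.
from statistics import mean

MIN_GROUP = 5  # suppress aggregates for groups too small to stay anonymous

def workforce_debt_summary(records, kind):
    """Average balance for one debt type, with a small-group suppression rule."""
    balances = [r["balance"] for r in records if r["type"] == kind]
    if len(balances) < MIN_GROUP:
        return None  # too few people: releasing a number could identify them
    return {"count": len(balances), "avg_balance": round(mean(balances), 2)}

employees = [
    {"type": "student_loan", "balance": 24_000},
    {"type": "student_loan", "balance": 31_500},
    {"type": "student_loan", "balance": 12_250},
    {"type": "student_loan", "balance": 47_800},
    {"type": "student_loan", "balance": 18_900},
    {"type": "auto_loan",    "balance": 9_400},
]
print(workforce_debt_summary(employees, "student_loan"))
print(workforce_debt_summary(employees, "auto_loan"))   # None: group too small
```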
All of the data integration and analysis is being channeled through the eThority technology that Talx bought last fall. This week, the company rolled out “Elements” which is the talxified version of eThority.
If you want to understand how big data will transform HR, start by watching Talx and Equifax.
Employment Branding: Where's the Limit?
A recent Mother Jones article (“Gangbang Interviews” and “Bikini Shots”: Silicon Valley’s Brogrammer Problem) takes some Silicon Valley startups to task for the way they are building their cultures. Citing a presentation by a Path executive (creatively titled “Adding Value as a Non-Technical No Talent Ass-Clown”), the notoriously liberal rag bashed the 80% male culture and its sexist predilections.
It’s worth taking a moment to read the article.
In a nutshell, Mother Jones suggests that building a 'brogrammer' culture (the term for company cultures that resemble frat houses) is simultaneously self-defeating and bad for the industry. It works against companies like Path by building a culture that can't effectively include women, and it perpetuates the 80% male culture in the industry, sez Mother Jones. Following the media scrutiny, there has been enormous hand-wringing and finger-pointing.
The more interesting question for this audience is about the boundaries of propriety in employment branding. It’s the essence of the question of whether the employment brand even belongs in HR at all.
The brogrammer idea and the image of testosterone-driven programmers relentlessly serving the cause is the essence of one approach to building a high tech startup. Maniacs who work excruciating hours produce agile operations. The 80% male ratio already lays the foundation for frat house climates. Nerf gun wars and marathon foosball tourneys are predominantly male things anyhow.
(It's worth noting that companies like Work4Labs, Branchout, UpMo and other networking companies don't use this model. But finding women in leadership in most valley companies is a challenge, and their cultures trend to stereotypical maleness. That is to say, not all high tech companies operate this way, but there may be an underlying thing going on here.)
So, here’s the question.
Should an employment brand be focused on the heart of the competition or should it be designed to solve larger social issues?
The former makes it a marketing question, the latter makes it an HR question. My sense is that differentiation is critical in employment branding, particularly when scarcity is an issue. While Mother Jones and a whole host of social forces would prefer that diversity be the primary value in the conversation, employment branding is about attracting the available people.
From my perspective, the future of HR depends on being able to integrate the marketing perspective with the diversity perspective. They are not mutually exclusive and their priority in a given decision varies. This is exactly where regulation, social agenda and competition come into conflict.
Thoughtful answers to questions like this are rarely binary.
Any new technology offers competitive differentiation that is inversely proportional to its adoption in the market. The newer the idea (and the fewer people using it), the greater the return. The more widespread the approach, the lower the return. In other words, when an idea is a real winner, the first people to use it receive extraordinary benefit.
The problem, of course, is that the high payoff is accompanied by higher risk. Many seemingly powerful innovations fall flat, leaving their investors and early customers in the lurch.
New technologies, by definition, begin their lives as untested hypotheses. The ideas make logical sense, but have not been tested in any way. With social media, the rush to deploy has included a number of idealistic ventures that did not work out.
A technology’s effectiveness changes as it moves through the marketplace. If a new technology will have a lasting impact, the first several years of its life always deliver the maximum return for adopters.
Technology moves through the HR Market in phases. As the process matures, both the risk and the reward for adopting the new approach decrease. People who adopt a new approach very early can get enormous benefit. They can also look pretty silly.
Here are the phases through which new ideas pass as they move from novelty to broad acceptance. (A toy numeric sketch of the risk/reward tradeoff follows the list.)
- Discovery: As stories of early successes move through the market (via trade shows, publications, blogs and professional associations), the addressable market grows dramatically. The first people to buy into the new idea look silly to their more conventional brethren. If the idea gains real traction, they look like geniuses. If it fails, they look like spoiled rich kids.
- Popularization: As more companies adopt the technology and the number of users talking about it grows, enthusiasm and excitement accelerate, which leads to even more users. After the early adopters bear the risk, there is still a tremendous store of competitive advantage to be had. The optimal time for an organization to start using a new technology is here.
- Best Practice: As the market begins to embrace the tool, it becomes a so-called Best Practice. By this time, much of the competitive advantage has already been delivered. Even so, the mantle of 'best practice' further accelerates both adoption and the decline in effectiveness. Once new tools have been used long enough for there to be ROI calculations, industry analysts start referring to them as 'best practices'. Best practices are not innovations. Rather, they are ideas that will provide a modest return and carry a modest risk.
- Standard Practice: By the time a new idea has become Standard Practice, prices are falling and the new idea is simply a part of the cost of doing business. Organizations rely on standard practices to meet the minimum requirements of operating in their markets. Very little competitive advantage is gained by this point in the evolution of an idea.
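Here is that toy model. The functional form and every figure in it are mine, purely for illustration; the article's claim is qualitative.

```python
# A toy model of the tradeoff described above. The numbers and the functional
# form are invented: return shrinks as adoption spreads, and the odds of
# backing a flop shrink as the market validates the idea.
def expected_edge(adoption, failure_risk, max_return=100.0):
    """Expected competitive return for adopting at a given market stage."""
    raw_return = max_return * (1.0 - adoption)   # crowded ideas pay less
    return raw_return * (1.0 - failure_risk)     # discounted by risk of a flop

stages = [                       # (phase, share of market using it, risk)
    ("Discovery",         0.02, 0.70),
    ("Popularization",    0.15, 0.30),
    ("Best Practice",     0.50, 0.10),
    ("Standard Practice", 0.90, 0.02),
]
for name, adoption, risk in stages:
    print(f"{name:<17} expected edge: {expected_edge(adoption, risk):5.1f}")
# With these made-up numbers, Popularization wins on a risk-adjusted basis,
# which matches the point above about optimal timing.
```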
Two paradoxical things are true about the new generation of referral tools:
- The addition of data from the social graph tends to reduce the level of intimacy (and therefore the effectiveness) of individual referrals; and
- Empowered with data from the social graph, an individual referral can be a good deal more effective than an old-fashioned referral.
The variations in referral effectiveness can be simply understood.
The temptation to 'spam' is one of the inherent weaknesses of integrating Internet technology into society. One way of thinking about spam is to notice that every good idea has an embedded limit to its effectiveness. All ideas do.
Where a little bit of pepper makes a dish, a pound of pepper can ruin it. Where a small fire heats a house, a large one burns it down. A glass of wine a day reduces stress; a bottle a day increases liver damage.
The trick in any problem solution is to find the balance point, the place between ineffectiveness and disaster, where the tool has optimal benefit.
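Expressed as a toy optimization (my own curve, not anything from the referral vendors), the balance point is just the peak of a benefit curve in which more input eventually does harm:

```python
# A toy version of the balance-point idea (hypothetical curve): benefit grows
# with "dose," harm grows faster, and the optimum sits between
# ineffectiveness and disaster.
def net_benefit(dose):
    """Hypothetical dose-response: linear benefit minus quadratic harm."""
    return 10.0 * dose - 1.5 * dose ** 2

doses = [d / 10 for d in range(0, 71)]   # sweep doses from 0.0 to 7.0
best = max(doses, key=net_benefit)
print(f"optimal dose: {best:.1f}, net benefit: {net_benefit(best):.1f}")
print(f"at dose 7.0 the net benefit is {net_benefit(7.0):.1f}")  # disaster
```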
When I vouch for your ability to do a job, it's both a favor to you and a favor from the hiring manager, and the three of us are bound into a relationship that depends on the others for success. By investing our reputations in each other, we leverage performance. The risk for each of the three of us is that the deal doesn't produce results.
In volume referrals, risk (the fundamental facet of the programs that CEOs love) goes missing. By eliminating personal accountability, the highest-volume referral programs lose all of the 'oomph' that makes the basic approach work.
As long as the referral simply gets the referred candidate into the same pool as everyone else, the hiring manager never gets the chance to invest. If the referrer (the employee pushing jobs to the network) is not held accountable for the performance of the referred person, the tool is simply a sourcing mechanism.
Referral programs work in direct proportion to the investment required of the individual participants. Most of the current offerings are simply new interfaces for volume sourcing.
Another important variable in referral programs is the actual time investment required by the members of the organization. Most people don’t know their friends as potential coworkers. Their relationships are friendships, not business development opportunities.
This means that any effort required to reframe the friendship as a potential employment opportunity requires time, thought and interest. That’s a lot to ask from someone whose attention is focused on doing their job. The reason that even the best referral programs have limited participation is that the referrers are usually busy working.