So, how do we feel about this? You're a guest at a hotel. As you approach the front desk, a discreetly placed video camera records your interactions with staff. The image of your face is scanned by facial recognition software, which analyzes your mood.
If it determines that you're sad, angry, frustrated, stressed or unhappy, the hotel manager is notified and can intervene -- perhaps by emailing you a discount offer, in the hope of improving your perception of the hotel so that you won't, upon checkout, write a negative review on TripAdvisor.
Such a system actually exists, and it has a name: Face Snapper. Since it was first deployed three months ago, it has found its way into eight hotels, ranging in size from 30 to 200 rooms.
So, again, how do we feel about this? Taking into consideration our often contradictory beliefs and actions, I think the answer is, it depends.
Big Data advertising models currently run surveillance on our online behavior, looking for clues about our buying intentions. We're subsequently cyberstalked and served ads and promotions tailored to match our presumed desires.
And we, for the most part, accept this as the cost of entry to explore content and conduct trade on the Web. (Or, at any rate, at some point we clicked "I agree" and granted permission for this activity, even if we never bothered to read what we were agreeing to.)
But when it comes to having our physical behavior watched and dissected, we tend to get queasy. Barnes & Noble decided to analyze customer behavior in its brick-and-mortar stores by recording patrons to determine which displays attracted their eyes, how customers moved through stores and whether they actually made a purchase. The company attempted to mimic, in a retail setting, what Big Data does, in stealth mode, on the Web.
B&N posted signs disclosing what it was doing. Customer reaction was swift and negative, and the program was abruptly canceled.
But the reality is, there's some video surveillance we accept, and some we object to.
"Most hotels are already recording video and audio right now, for security," Face Snapper CEO J.P. Gagne told me. "They're just not using it for customer service."
Gagne said he is sensitive to Big Brother concerns.
"We saw it as a potential obstacle and researched and thought about it carefully," he said. "And we realized some people may want to not use the system for that very reason, or [might decide to] provide additional notice. [Most states require signage where video recordings are made.] We don't want to make anyone uncomfortable."
But, he said, attitudes are changing, and they're moving toward acceptance of the marriage of video surveillance and marketing.
"Right now, if you walk into Macy's, there are cameras that are analyzing your face and will send coupons to your cellphone, even if you haven't signed up for anything," Gagne said. "They are connecting databases that associate your face with your customer profile. The world is getting used to it."
Gagne said his company is readying a second version of the software. The first merely identifies facial expressions and creates a daily report of pictures of faces with a brief analysis -- sad, mad, frustrated -- along with time stamps.
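That daily report can be pictured as a simple filter-and-sort over the day's detections. The sketch below is an illustration of the idea only, not Face Snapper's actual code; the `Detection` type, the mood labels and the classifier that produces them are all assumptions based on the description above.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

# Mood labels the article says the system flags; the real label set is unknown.
NEGATIVE_MOODS = {"sad", "angry", "mad", "frustrated", "stressed", "unhappy"}

@dataclass
class Detection:
    timestamp: datetime   # when the face was captured
    image_path: str       # captured face frame (hypothetical storage scheme)
    mood: str             # label from some unspecified emotion classifier

def daily_report(detections: List[Detection]) -> List[Detection]:
    """Return only the negative-mood detections, ordered by time stamp,
    mirroring the daily report of flagged faces described above."""
    flagged = [d for d in detections if d.mood in NEGATIVE_MOODS]
    return sorted(flagged, key=lambda d: d.timestamp)
```

A manager scanning such a report sees only the interactions worth a second look, which is the whole pitch: triage, not total review.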
"The feature that we added two weeks ago, and that we're completing work on, records a video," he said. "Either the manager can, or we can, view the video and hear exactly what the customer and staff said and can do customer outreach before the guest leaves the hotel. Or, management can go back to the employee and use the video as a training tool. In fact, a manager told us he's compiling [Face Snapper] videos to make a training video for new staff."
Gagne believes the potential for training is huge: "The software can be used to assess whether certain employees trigger a high rate of dissatisfaction, or a high rate of happy customers for that matter."
A study of police body cameras, he said, indicated that wearing a camera changed officers' behavior for the better.
"I think our system could have a similar effect," he said. "If staff knows their interactions are being recorded and observed, it could help them be better employees."
I asked how, in a large hotel with hundreds of guests, the system could identify the name of the guest with whom staff was speaking. "When we transcribe the video, most of the time the room number or customer name is mentioned," he said.
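If the room number is usually spoken aloud, pulling it out of a transcript is a plain pattern match. The regex below is an illustrative guess at how that extraction might look; Gagne did not describe his company's actual method, and the pattern and function name are my own.

```python
import re

# Hypothetical helper: find a spoken room number in a transcript line.
ROOM_RE = re.compile(r"\broom\s+(\d{1,4})\b", re.IGNORECASE)

def find_room(transcript: str):
    """Return the first room number mentioned, or None if none appears."""
    m = ROOM_RE.search(transcript)
    return m.group(1) if m else None
```

A real system would also have to match spoken names against the guest database and cope with transcription errors, which is presumably why "most of the time" is the operative phrase.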
While Gagne would not characterize his system as a Big Data play, he believes that at some point "we could use this information in a Big Data way, because we store the information and make it available to customers. And we can report, by day, the number of customers that are especially happy or unhappy. But the system will primarily be used to review video footage in a way that would otherwise be time-consuming for humans to do, and identify situations where management attention is needed."
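The per-day happy/unhappy reporting Gagne describes is a basic group-and-count. This is a minimal sketch of that tally under my own assumed input format (date, mood) -- again, not the product's actual implementation.

```python
from collections import Counter

def moods_by_day(records):
    """records: iterable of (date, mood_label) pairs.
    Returns a dict mapping each date to a Counter of mood counts,
    e.g. the number of especially happy or unhappy guests that day."""
    tallies = {}
    for day, mood in records:
        tallies.setdefault(day, Counter())[mood] += 1
    return tallies
```

Stored over months, counts like these are the raw material for the "Big Data way" he alludes to: trends by day, by shift or by property.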
The analysis of interactions flagged by the system as problematic is not likely to ever become fully human-free, because "human review can eliminate false positives," Gagne said. "As we grow, we can work toward greater automation, but I don't think we'll ever eliminate human review. All this computer stuff, it's great, but getting to 100% accuracy is very expensive or impossible."
More and more, we seem to be accepting a benign version of Big Brother -- or perhaps a benevolent Big Brother, one who hands out discount coupons.
But another perspective was expressed by a colleague who, when told about Face Snapper, sighed, "If only half of those resources were dedicated to providing good service and engaging with the customer, a good reputation would naturally follow the hotel. Not the other way around."