
March 17, 2021

Technology & Society, Artificial Intelligence

New Data Fuels AI Opportunities in a Remote World

Sarah Hoffman


Now that so many of us are doing almost everything online at home (shopping, work, doctor appointments, school, financial check-ins, parent-teacher conferences), you may have noticed some not-so-subtle behavior changes among those around you. Maybe a colleague has suddenly started blocking her video feed during Zoom calls. Perhaps a customer has started speaking more slowly, making a lot of spelling errors, or typing at a different pace. Maybe you’ve noticed a change in the tone of a colleague’s MS Teams or Yammer posts, or a change in a customer’s chatbot message style. All of this could be indicative of something important, and AI is an ideal tool to pick up on these changes.
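To make this concrete, the kind of behavioral drift described above can be framed as a simple anomaly-detection problem: what matters is the change relative to an individual's own baseline, not any absolute number. The sketch below is a minimal illustration, not any vendor's actual method; the words-per-minute metric is just one assumed example of a per-session measurement.

```python
from statistics import mean, stdev

def drift_score(baseline, current):
    """Z-score of a new observation against a person's historical baseline.

    `baseline` is a list of past per-session measurements (e.g. typing
    speed in words per minute, or spelling errors per 100 words);
    `current` is the latest observation.
    """
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return 0.0
    return (current - mu) / sigma

# A typing speed far below this user's norm yields a large negative score,
# which a monitoring system might surface as a signal worth reviewing.
baseline_wpm = [62, 58, 65, 60, 63, 61]
print(drift_score(baseline_wpm, 40))
```

A production system would use richer features and a learned model, but even a z-score against a personal baseline captures the core idea.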

Of course, when we think about AI monitoring this kind of behavior, we call it corporate surveillance and a slew of dystopian scenarios come to mind: businesses snooping to find out who’s been working and who’s been playing hooky; data gathered to determine who gets fired, gets a raise, or gets promoted. But it doesn’t have to be this way. With all the new data and behaviors these systems now have access to, a far more beneficial application of AI is feasible, one that could vastly improve:

Employee wellbeing. Managers know well the concern raised when an employee suddenly starts sending a lot of emails at 3 in the morning. AI systems can not only flag worrisome off-hours activity, but also indicate when Zoom fatigue is likely or when a much-deserved break from a given task is a good idea. Changes in facial expressions can indicate even subtle mood changes, and AI systems could suggest taking a short walk, getting a cup of coffee, or even turning the next meeting into an audio-only or walking meeting. Some services, like Receptiviti’s, plug into email and messaging systems like Slack to search for signals that employees are depressed or burned out. Other companies, like Cornerstone OnDemand, take things further and are experimenting with heart-rate data from wearable devices like watches. They’re exploring ways to tie this data to entries from a person’s calendar or project management software to determine whether certain meetings, projects, or even people could correlate with elevated stress levels.1
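The off-hours signal above can be sketched in a few lines. This is a hypothetical illustration, not how any of the named services work; the midnight-to-6 a.m. window and the 25% threshold are assumptions chosen for the example.

```python
from datetime import datetime

# Assumed "off hours" window: midnight through 5:59 a.m.
OFF_HOURS = range(0, 6)

def off_hours_share(timestamps):
    """Fraction of a user's sent messages that fall in the off-hours window.

    `timestamps` are datetime objects for messages sent over some period,
    e.g. the past two weeks.
    """
    if not timestamps:
        return 0.0
    late = sum(1 for t in timestamps if t.hour in OFF_HOURS)
    return late / len(timestamps)

def should_check_in(timestamps, threshold=0.25):
    """Suggest a wellbeing check-in when off-hours activity is heavy."""
    return off_hours_share(timestamps) >= threshold
```

Framing the output as a nudge to check in, rather than a performance metric, is what separates the wellbeing use case from the surveillance scenarios described earlier.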

Customer experience. As more of our conversations with customers go digital, we can look at calls, video meetings, and chatbot conversations in new ways, looking for changes in customer behaviors and emotions to better understand their context and needs. We can detect in real time over a video call if we’re losing their attention and can prompt representatives to reengage the customer when needed. Automated summaries of video meetings could be useful in prioritizing product enhancements based on unmet needs. Transcripts of customer video calls can be used to learn more about what customers desire and what they find annoying or confusing. Changes in customers’ behaviors, whether in their voice, expressions, or keystrokes, can also be used to help customers who may be aging into certain disabilities. Researchers are studying whether AI tools that analyze typing speed and speech could be used to better identify people with early-stage dementia.2
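One lightweight way to surface such shifts in a conversation is to track per-message sentiment and compare the recent window against the earlier history. The sketch below assumes sentiment scores in [-1, 1] already exist for each message, produced by whatever model a team prefers; the scoring step itself is outside the scope of this illustration.

```python
def sentiment_shift(scores, window=5):
    """Difference between the mean sentiment of the last `window` messages
    and the mean sentiment of everything before them.

    A strongly negative result suggests the conversation is turning sour
    and a representative may need to step in.
    """
    if len(scores) <= window:
        return 0.0
    recent = scores[-window:]
    earlier = scores[:-window]
    return sum(recent) / len(recent) - sum(earlier) / len(earlier)

# A conversation that starts positive and turns negative:
history = [0.6, 0.5, 0.7, 0.6, 0.5, 0.4, -0.2, -0.4, -0.3, -0.5, -0.1]
```

Comparing against the conversation's own earlier tone, rather than a fixed cutoff, keeps the signal meaningful for customers whose baseline style is naturally blunt or naturally effusive.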

Fraud detection. These digital conversations with customers can also be used to find other shifts in customers’ behaviors. Changes in word choices or unusual syntax by customers over chat could potentially indicate that a customer’s identity is being used without their consent. ProctorU, an online exam monitoring service, uses facial recognition software to match students to the image on their ID and verifies their identities with a typing test that confirms the speed and rhythm of a student’s keystrokes.3 Some insurance companies are using voice analytics to detect whether a customer is telling the truth when submitting a claim.4
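Keystroke-rhythm verification of the kind described above boils down to comparing a fresh timing sample against an enrolled profile. The following is only a sketch (real systems use more robust statistical models, and the 50 ms threshold here is an illustrative assumption, not a published figure):

```python
from math import sqrt

def rhythm_distance(enrolled, sample):
    """Euclidean distance between two keystroke-timing profiles.

    Each profile is a list of mean inter-key intervals (in milliseconds)
    for the same fixed set of key pairs, captured during a typing test.
    """
    if len(enrolled) != len(sample):
        raise ValueError("profiles must cover the same key pairs")
    return sqrt(sum((a - b) ** 2 for a, b in zip(enrolled, sample)))

def matches(enrolled, sample, threshold=50.0):
    """Accept the sample when it is close enough to the enrolled profile."""
    return rhythm_distance(enrolled, sample) <= threshold
```

The same distance-to-profile idea extends to the chat scenario: a sudden jump in distance from a customer's established typing or phrasing profile is what would trigger a closer look.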

Inclusive design. There may also be ways to use this technology to increase inclusion for employees and customers. For those with difficulty focusing, an AI prompt could nudge meeting participants to repeat points that may have been missed. When people have difficulty hearing, they often make a facial expression that communicates their confusion. AI can read this expression and nudge a contact center agent to either speak louder or shift to another interface, like text chat. Other facial expressions by a listener could be a sign that the speaker needs to slow down or explain things in simpler terms. All of this could be done through AI nudges sent in real time.

Questions to Consider

Even with the best of intentions, deploying this kind of technology can be tricky. Companies need to carefully consider their approach to using the abundance of data we now have, especially in regard to privacy.

Do people want this? Different cultures and age groups may have different tolerances for this kind of relationship with an employer. Many employees are feeling more stressed and exhausted due to expanded corporate surveillance but are hesitant to speak up.5 In April 2020, Zoom removed an “attention tracking” setting, which alerted a call host when a participant was focused elsewhere, following a public uproar about its invasiveness.6 Schools faced a major backlash after using cheating-detection software that tracked eye and head movements while students took their exams remotely due to the coronavirus. Some students were so afraid that the testing system would brand them as cheating that they cried from the stress, threw up in trash cans, and even urinated at their desks; students with dark skin shined bright lights at their faces, worrying the systems wouldn’t recognize them.7

How do we monitor this? Clearly, for any level of buy-in, communication on this front is essential; we need transparent descriptions of what is being collected and how it’s used. But beyond that, how do we oversee performance? Using facial expressions to identify emotions sounds promising, but some argue that expressions don’t map to emotions consistently across cultures and contexts.8 One person may scowl when angry; another might smile politely. Someone can look downward as a sign of respect; another person may do that out of shyness. In December 2018, a study showed that emotion detection technology assigned more negative emotions to Black men’s faces than white men’s.9 In 2016, an algorithm rejected an Asian man’s passport photo for having his eyes “closed”.10 Video analytics features may well make similar mistakes. Do companies need an AI ethics board to oversee this?

Can we trust this data? As these technologies become more mainstream, people are getting more adept at faking data. A recent analysis found that companies are adapting the language in their forecasts, SEC regulatory filings, and earnings calls due to the rise of AI that analyzes and derives signals from their words.11 And this isn’t stopping with earnings calls. Before Zoom removed its “attention tracking” setting, people were easily able to fool the system by using a second device or making sure to head back to the Zoom window before 30 seconds passed. Presence Scheduler, which can set your Slack status as permanently active, doubled in sales and traffic in the beginning of the pandemic (until Slack closed the coding loophole).12 To make people appear more engaged, Microsoft added a feature to its Surface Pro X that adjusts your eye position so it looks like you’re looking at your camera when you’re actually looking at on-screen faces.13 Photoshop’s Neural Filters let users change facial expressions, strengthening or reducing feelings like joy, surprise, or anger.14 It’s not hard to imagine this technology being incorporated into videoconferencing.

References & Disclaimers

1 Cutter, C., & Feintzeig, R. (2020). Smile! Your boss is tracking your happiness. The Wall Street Journal.
https://www.wsj.com/articles/smile-your-boss-is-tracking-your-happiness-11583255617
2 Wang, S. (2020). AI May Help Identify Patients with Early-Stage Dementia. The Wall Street Journal.
https://www.wsj.com/articles/ai-may-help-identify-patients-with-early-stage-dementia-11604329922
3 Harwell, D. (2020). Mass school closures in the wake of the coronavirus are driving a new wave of student surveillance. Washington Post.
https://www.washingtonpost.com/technology/2020/04/01/online-proctoring-college-exams-coronavirus/
4 McCormick, J. (2019). What AI Can Tell from Listening to You. The Wall Street Journal.
https://www.wsj.com/articles/what-ai-can-tell-from-listening-to-you-11554169408
5 Harwell, D. (2020). Managers Turn to Surveillance Software, Always-on Webcams to Ensure Employees are (Really) Working From Home. Washington Post.
https://www.washingtonpost.com/technology/2020/04/30/work-from-home-surveillance/
6 Ibid.
7 Harwell, D. (2020). Cheating-Detection Companies Made Millions During the Pandemic. Now Students are Fighting Back. Washington Post. https://www.washingtonpost.com/technology/2020/11/12/test-monitoring-student-revolt/
8 Schwartz, O. (2019). Don’t look now: why you should be worried about machines reading your emotions. The Guardian.
https://www.theguardian.com/technology/2019/mar/06/facial-recognition-software-emotional-science
9 Rhue, L. (2019). Emotion-reading tech fails the racial bias test. The Conversation.
https://theconversation.com/emotion-reading-tech-fails-the-racial-bias-test-108404
10 Cheng, S. (2016). An algorithm rejected an Asian man’s passport photo for having ‘closed eyes’. Quartz.
https://qz.com/857122/an-algorithm-rejected-an-asian-mans-passport-photo-for-having-closed-eyes/
11 Cao, S., Jiang, W., Yang, B., & Zhang, A. L. (2020). How to Talk When a Machine is Listening: Corporate Disclosure in the Age of AI. National Bureau of Economic Research.
https://www.nber.org/papers/w27950
12 Christian, A. (2020). Bosses started spying on remote workers. Now they're fighting back. Wired.
https://www.wired.co.uk/article/work-from-home-surveillance-software
13 Protalinski, E. (2019). Microsoft’s AI-powered eye gaze tech is exclusive to the Surface Pro X. VentureBeat.
https://venturebeat.com/2019/10/03/microsofts-ai-powered-eye-gaze-tech-is-exclusive-to-the-surface-pro-x/
14 Horwitz, J. (2020). Adobe’s Photoshop Neural Filters use AI to change faces, recolor photos. VentureBeat.
https://venturebeat.com/2020/10/20/adobes-photoshop-neural-filters-use-ai-to-change-faces-recolor-photos/

